
fix fa2 and fa4 conflict #241

Merged

akaitsuki-ii merged 1 commit into main from dev/qz/fix_fa2_fa4_conflict on Mar 27, 2026

Conversation

@qzzz95 (Contributor) commented Mar 27, 2026

No description provided.

@gemini-code-assist (bot) left a comment

Code Review

This pull request updates the Flash Attention 2 availability check in diffsynth_engine/utils/flag.py to target the specific flash_attn.flash_attn_func module. A review comment suggests implementing a more robust try-except block to ensure the check accurately reflects how the module is imported elsewhere in the codebase, preventing potential runtime errors.

```diff
 FLASH_ATTN_4_AVAILABLE = check_module_available("flash_attn.cute.interface", "Flash attention 4")
 FLASH_ATTN_3_AVAILABLE = check_module_available("flash_attn_interface", "Flash attention 3")
-FLASH_ATTN_2_AVAILABLE = check_module_available("flash_attn", "Flash attention 2")
+FLASH_ATTN_2_AVAILABLE = check_module_available("flash_attn.flash_attn_func", "Flash attention 2")
```
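The helper being called here is not shown in the diff; its actual implementation in `diffsynth_engine/utils/flag.py` may differ, but a minimal sketch of what a `check_module_available(module_name, display_name)` function typically looks like (assuming it is built on `importlib.util.find_spec` and logs the result) is:

```python
import importlib.util
import logging

logger = logging.getLogger(__name__)

def check_module_available(module_name: str, display_name: str) -> bool:
    """Return True if `module_name` resolves to an importable module.

    Hypothetical sketch of the helper referenced in flag.py; the real
    implementation may differ.
    """
    try:
        available = importlib.util.find_spec(module_name) is not None
    except ModuleNotFoundError:
        # Raised when a parent in the dotted path exists but is not a package
        # (or a parent is missing entirely).
        available = False
    if available:
        logger.info(f"{display_name} is available")
    else:
        logger.info(f"{display_name} is not available")
    return available
```

Note that `find_spec` only locates *modules*; it says nothing about attributes such as functions, which is exactly the gap the review comment below the diff points out.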

Severity: medium

The check for Flash Attention 2 availability is not fully robust. It verifies that the flash_attn.flash_attn_func module can be imported, but the code in diffsynth_engine/models/basic/attention.py uses from flash_attn import flash_attn_func. This requires flash_attn_func to be an attribute of the top-level flash_attn package.

These two conditions are not always equivalent. A more robust approach would be to verify the attribute's existence directly by attempting the import. This prevents potential ImportError at runtime if the flash-attn package structure changes.

Suggested change

```diff
-FLASH_ATTN_2_AVAILABLE = check_module_available("flash_attn.flash_attn_func", "Flash attention 2")
+try:
+    from flash_attn import flash_attn_func
+
+    logger.info("Flash attention 2 is available")
+    FLASH_ATTN_2_AVAILABLE = True
+except (ImportError, AttributeError):
+    logger.info("Flash attention 2 is not available")
+    FLASH_ATTN_2_AVAILABLE = False
```
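The bot's point, that a module-spec check and a `from`-import can disagree, is easy to demonstrate with the standard library alone. `os.path.join` is importable as an attribute of `os.path`, yet a module-availability check on the dotted name reports it as unavailable, because `join` is a function, not a submodule:

```python
import importlib.util

def module_spec_exists(name: str) -> bool:
    """True if `name` resolves to an importable *module* (not an attribute)."""
    try:
        return importlib.util.find_spec(name) is not None
    except ModuleNotFoundError:
        # Raised when a parent of the dotted path is not a package.
        return False

# `join` imports fine as an attribute of os.path...
from os.path import join

# ...but it is not a module, so a module-spec check reports False.
print(module_spec_exists("os.path"))       # True
print(module_spec_exists("os.path.join"))  # False
```

The reverse mismatch is also possible: a submodule can exist on disk without being re-exported from the package's `__init__.py`, so the try/except around the actual `from flash_attn import flash_attn_func` statement is the only check guaranteed to match runtime behavior.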

akaitsuki-ii merged commit ed7b4ec into main on Mar 27, 2026
akaitsuki-ii deleted the dev/qz/fix_fa2_fa4_conflict branch on March 27, 2026 at 06:03