sbsa/cu130/: flash-attn versions
Because this project isn't in the mirror_whitelist, no releases from root/pypi are included.
Latest version on stage is: 2.8.4
Flash Attention: Fast and Memory-Efficient Exact Attention
| Index | Version | Documentation |
|---|---|---|
| sbsa/cu130 | 2.8.4 | |
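
A minimal usage sketch for the package listed above, assuming flash-attn 2.8.4 is installed from this index and a CUDA device is available; shapes follow the package's (batch, seqlen, nheads, headdim) convention for `flash_attn_func`.

```python
# Minimal sketch: exact attention via flash-attn, assuming version 2.8.4
# from this index is installed and a CUDA GPU is present.
import torch
from flash_attn import flash_attn_func

batch, seqlen, nheads, headdim = 2, 1024, 8, 64
q = torch.randn(batch, seqlen, nheads, headdim, dtype=torch.float16, device="cuda")
k = torch.randn_like(q)
v = torch.randn_like(q)

# Causal exact attention computed without materializing the full attention matrix.
out = flash_attn_func(q, k, v, causal=True)
print(out.shape)  # (batch, seqlen, nheads, headdim)
```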