flash-attention 1.0.0
Fast and memory-efficient exact attention algorithm.
License: Apache-2.0
Home: https://github.com/Dao-AILab/flash-attention
Development: https://github.com/Dao-AILab/flash-attention
Documentation: https://github.com/Dao-AILab/flash-attention/blob/main/README.md
34 total downloads
Last upload: 8 months and 28 days ago
Installers
linux-64 v1.0.0
To install this package, run:
conda install services::flash-attention
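After installation, the library is used from Python. The snippet below is a minimal sketch assuming this package provides the upstream flash_attn module with the current flash_attn_func interface; older releases such as 1.0.0 expose different entry points under flash_attn.flash_attn_interface, so consult the linked README for the exact API of this build.

# Minimal sketch, assuming the package installs the upstream `flash_attn` module
# and its `flash_attn_func` entry point (an assumption for this particular build).
import torch
from flash_attn import flash_attn_func

# q, k, v have shape (batch, seqlen, num_heads, head_dim), fp16 or bf16, on a CUDA device.
q = torch.randn(2, 1024, 8, 64, dtype=torch.float16, device="cuda")
k = torch.randn(2, 1024, 8, 64, dtype=torch.float16, device="cuda")
v = torch.randn(2, 1024, 8, 64, dtype=torch.float16, device="cuda")

# Exact attention computed without materializing the full seqlen x seqlen score matrix.
out = flash_attn_func(q, k, v, causal=True)  # -> (2, 1024, 8, 64)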