conda-forge / flash-attn 2.8.3
Flash Attention: Fast and Memory-Efficient Exact Attention
copied from cf-post-staging / flash-attn
License: BSD-3-Clause
Home: https://github.com/Dao-AILab/flash-attention
134133 total downloads
Last upload: 1 month and 3 days ago
Installers
linux-64 v2.8.3
linux-aarch64 v2.8.3
conda install
To install this package, run:
conda install conda-forge::flash-attn
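After installing, a quick sanity check is to import the package and run the fused attention kernel on small random tensors. The sketch below assumes the upstream flash_attn Python API (flash_attn_func), which expects fp16 or bf16 tensors of shape (batch, seqlen, nheads, headdim) on a CUDA device; see the home repository above for authoritative usage.

import torch
from flash_attn import flash_attn_func  # upstream API; assumed available after install

# Small random inputs: (batch, seqlen, nheads, headdim), fp16, on a CUDA device.
q = torch.randn(2, 128, 8, 64, dtype=torch.float16, device="cuda")
k = torch.randn(2, 128, 8, 64, dtype=torch.float16, device="cuda")
v = torch.randn(2, 128, 8, 64, dtype=torch.float16, device="cuda")

# Exact attention computed without materializing the full seqlen x seqlen score matrix.
out = flash_attn_func(q, k, v, causal=True)
print(out.shape)  # expected: torch.Size([2, 128, 8, 64])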
Description
Flash Attention: Fast and Memory-Efficient Exact Attention. See the upstream repository at https://github.com/Dao-AILab/flash-attention for documentation and supported hardware.