flash-attention 1.0.0
Fast and memory-efficient exact attention algorithm.
License: Apache-2.0
Home: https://github.com/Dao-AILab/flash-attention
Development: https://github.com/Dao-AILab/flash-attention
Documentation: https://github.com/Dao-AILab/flash-attention/blob/main/README.md
21 total downloads
Last upload: 4 months and 27 days ago
Installers
linux-64  v1.0.0
To install this package, run:
conda install services::flash-attention
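After installation, the library is called from Python. The following is a minimal usage sketch, not taken from this page: it assumes the flash_attn_func interface described in the upstream README for recent releases, and the exact import path and argument names may differ for v1.0.0.

# Minimal usage sketch (assumption, not from this page): calling the
# FlashAttention kernel through the flash_attn Python package.
import torch
from flash_attn import flash_attn_func  # assumed import path; check the README for your version

batch, seqlen, nheads, headdim = 2, 1024, 8, 64

# FlashAttention expects fp16/bf16 CUDA tensors shaped (batch, seqlen, nheads, headdim).
q = torch.randn(batch, seqlen, nheads, headdim, device="cuda", dtype=torch.float16)
k = torch.randn(batch, seqlen, nheads, headdim, device="cuda", dtype=torch.float16)
v = torch.randn(batch, seqlen, nheads, headdim, device="cuda", dtype=torch.float16)

# Exact (non-approximate) attention, computed without materializing the full
# seqlen x seqlen score matrix, which is where the memory savings come from.
out = flash_attn_func(q, k, v, dropout_p=0.0, causal=True)
print(out.shape)  # torch.Size([2, 1024, 8, 64])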