flash-attention
Fast and memory-efficient exact attention algorithm.
| Name | Type | Version | Platform | Labels | Updated | Size | Downloads |
|---|---|---|---|---|---|---|---|
| linux-64/flash-attention-1.0.0-py39hdb19cb5_0.tar.bz2 | conda | 1.0.0 | linux-64 | main | Mar 26, 2025, 05:07 PM | 48.17 KB | 2 |
| linux-64/flash-attention-1.0.0-py39hdb19cb5_0.conda | conda | 1.0.0 | linux-64 | main | Mar 26, 2025, 05:07 PM | 50.38 KB | 4 |
| linux-64/flash-attention-1.0.0-py312hdb19cb5_0.tar.bz2 | conda | 1.0.0 | linux-64 | main | Mar 26, 2025, 05:07 PM | 64.67 KB | 3 |
| linux-64/flash-attention-1.0.0-py312hdb19cb5_0.conda | conda | 1.0.0 | linux-64 | main | Mar 26, 2025, 05:07 PM | 68.3 KB | 6 |
| linux-64/flash-attention-1.0.0-py311hdb19cb5_0.tar.bz2 | conda | 1.0.0 | linux-64 | main | Mar 26, 2025, 05:07 PM | 68.18 KB | 3 |
| linux-64/flash-attention-1.0.0-py311hdb19cb5_0.conda | conda | 1.0.0 | linux-64 | main | Mar 26, 2025, 05:07 PM | 71.51 KB | 8 |
| linux-64/flash-attention-1.0.0-py310hdb19cb5_0.tar.bz2 | conda | 1.0.0 | linux-64 | main | Mar 26, 2025, 05:07 PM | 48.61 KB | 2 |
| linux-64/flash-attention-1.0.0-py310hdb19cb5_0.conda | conda | 1.0.0 | linux-64 | main | Mar 26, 2025, 05:07 PM | 50.99 KB | 6 |
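
The builds above target CPython 3.9 through 3.12 on linux-64. As an illustration only, the sketch below assumes this package exposes the upstream `flash_attn` Python module with the `flash_attn_func` entry point (the 2.x-style interface; the 1.0.0 builds listed here may instead expose the older `flash_attn_unpadded_*` functions), and that a CUDA-capable GPU with fp16 support is available.

```python
# Hedged usage sketch: assumes the upstream flash_attn API (flash_attn_func)
# and a CUDA GPU; the 1.0.0 builds above may use a different module layout.
import torch
from flash_attn import flash_attn_func  # assumption: 2.x-style entry point

batch, seqlen, nheads, headdim = 2, 1024, 8, 64

# FlashAttention expects half-precision (fp16/bf16) tensors on the GPU,
# laid out as (batch, seqlen, nheads, headdim).
q = torch.randn(batch, seqlen, nheads, headdim, device="cuda", dtype=torch.float16)
k = torch.randn(batch, seqlen, nheads, headdim, device="cuda", dtype=torch.float16)
v = torch.randn(batch, seqlen, nheads, headdim, device="cuda", dtype=torch.float16)

# Exact (non-approximate) attention computed without materializing the full
# seqlen x seqlen score matrix; causal=True applies a causal mask.
out = flash_attn_func(q, k, v, dropout_p=0.0, causal=True)
print(out.shape)  # torch.Size([2, 1024, 8, 64])
```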