# Flash Attention Windows Wheel
- Switch to a tag branch, such as `v2.7.0.post2` (you can get the latest tag with `git describe --tags`, or list all available tags with `git tag -l`)
- Download `WindowsWhlBuildercuda.bat` into the `flash-attention` directory
- To build with MSVC, open the "Native Tools Command Prompt for Visual Studio". The exact name may depend on your version of Windows, Visual Studio, and CPU architecture (in my case it was "x64 Native Tools Command Prompt for VS 2022")
- Switch to your Python environment and make sure the matching Torch CUDA version is installed
- The wheel file will be placed in the `dist` directory
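The steps above can be sketched as the following command sequence. This is a sketch, not a definitive script: it assumes the upstream `Dao-AILab/flash-attention` repository as the clone source and that `WindowsWhlBuildercuda.bat` has already been downloaded into the checkout; run it inside the Native Tools Command Prompt with a CUDA-enabled PyTorch already installed in the active Python environment.

```shell
# Run inside "x64 Native Tools Command Prompt for VS 2022" (name varies).
git clone https://github.com/Dao-AILab/flash-attention.git
cd flash-attention

git tag -l                   # list all available tags
git checkout v2.7.0.post2    # switch to the desired tag branch

# WindowsWhlBuildercuda.bat must already be in this directory.
WindowsWhlBuildercuda.bat

# The built wheel is placed in dist\
dir dist
```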
# NATTEN-windows
Original doc (install only, no wheel): build with MSVC.

Build wheel steps:
- First clone NATTEN, and make sure to fetch all submodules
- Switch to a tag branch, such as `v0.17.3` (you can get the latest tag with `git describe --tags`, or list all available tags with `git tag -l`)
- To build with MSVC, open the "Native Tools Command Prompt for Visual Studio". The exact name may depend on your version of Windows, Visual Studio, and CPU architecture (in my case it was "x64 Native Tools Command Prompt for VS 2022")
- Switch to your Python environment and make sure the matching Torch CUDA version is installed
- The wheel file will be placed in the `dist` directory
- Test
- Install natten
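As with flash-attention, the build steps can be sketched as shell commands. This assumes the upstream `SHI-Labs/NATTEN` repository as the clone source, and `python setup.py bdist_wheel` as the build entry point (an assumption; the doc does not name the exact build command). Run it inside the Native Tools Command Prompt with the matching Torch CUDA version installed.

```shell
# Run inside "x64 Native Tools Command Prompt for VS 2022" (name varies).
# --recursive fetches all submodules, as the steps require.
git clone --recursive https://github.com/SHI-Labs/NATTEN.git
cd NATTEN

git tag -l                 # list all available tags
git checkout v0.17.3       # switch to the desired tag branch

# Hypothetical build command; the wheel lands in dist\
python setup.py bdist_wheel

# Install the built wheel, then test the install.
pip install dist\natten-*.whl
python -c "import natten; print(natten.__version__)"
```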