CS348Project / llama.cpp
mirror of https://github.com/ggml-org/llama.cpp.git, synced 2025-11-14 11:07:10 +00:00
llama.cpp/ggml/src/ggml-vulkan/vulkan-shaders/flash_attn_cm2.comp
at commit 8dd19a48129d578972ea3d98896edbbf492891a9
10 KiB

Latest commit: Jeff Bolz, c9c6e01dae — vulkan: Add VK_NV_cooperative_matrix2 support for mul_mat and flash attention (#10206), 2024-12-05 20:15:05 +01:00