Default Branch

945501f5ea · llama: fix leaked buffers for mmap + split files (#16765) · Updated 2025-10-27 08:17:31 +00:00

Branches

Each entry lists the branch's latest commit, its update time, and its commit count behind / ahead of the default branch.

eb594c0f7d · alloc : fix build with debug · Updated 2023-12-01 08:46:05 +00:00 · 5276 behind / 14 ahead
5b74310e6e · build : enable libstdc++ assertions for debug builds · Updated 2023-11-30 23:18:24 +00:00 · 5261 behind / 1 ahead
bb39b87964 · ggml : restore abort() in GGML_ASSERT · Updated 2023-11-28 00:27:09 +00:00 · 5280 behind / 1 ahead
87f4102a70 · llama : revert n_threads_batch logic · Updated 2023-11-27 19:47:35 +00:00 · 5281 behind / 3 ahead
6272b6764a · use stride=128 if built for tensor cores · Updated 2023-11-27 18:09:14 +00:00 · 5284 behind / 3 ahead
8d8b76d469 · lookahead : add comments · Updated 2023-11-26 09:26:55 +00:00 · 5221 behind / 9 ahead
21b70babf7 · straightforward /v1/models endpoint · Updated 2023-11-24 16:22:39 +00:00 · 5222 behind / 12 ahead
f8e9f11428 · common : add -dkvc arg for enabling kv cache dumps · Updated 2023-11-23 16:47:56 +00:00 · 5228 behind / 4 ahead
f824902623 · YaRN : correction to GPT-NeoX implementation · Updated 2023-11-15 22:10:52 +00:00 · 5260 behind / 1 ahead
d0445a2eff · better documentation · Updated 2023-11-10 00:38:20 +00:00 · 5277 behind / 3 ahead
47d604fa2d · fix issues · Updated 2023-11-05 12:20:22 +00:00 · 5291 behind / 3 ahead
3ef358fffd · Revert "cuda : use CUDA memory pool with async memory allocation/deallocation when available (#3903)" · Updated 2023-11-04 20:26:51 +00:00 · 5295 behind / 2 ahead
46868a499e · metal : multi-simd softmax · Updated 2023-11-01 19:16:34 +00:00 · 5320 behind / 1 ahead
a8796f9609 · llm : cleanup + comments · Updated 2023-11-01 18:08:02 +00:00 · 5329 behind / 4 ahead
7420bef83e · wip wip wip · Updated 2023-11-01 06:51:43 +00:00 · 5329 behind / 1 ahead
afb3929279 · Merge branch 'master' into llama-refactor · Updated 2023-10-31 18:35:31 +00:00 · 5331 behind / 21 ahead
29fe516913 · wip · Updated 2023-10-31 16:36:37 +00:00 · 5332 behind / 1 ahead
dab42893c9 · scripts : working curl pipe · Updated 2023-10-31 15:03:56 +00:00 · 5332 behind / 3 ahead
7923b70cb8 · llama : add llm_build_inp_embd helper · Updated 2023-10-31 14:43:08 +00:00 · 5337 behind / 37 ahead
4b3cb98d46 · ggml-impl : move extern "C" to start of file · Updated 2023-10-30 17:05:58 +00:00 · 5333 behind / 7 ahead