Commit Graph

67 Commits

Author SHA1 Message Date
Georgi Gerganov
be58e30017 enc-dec : compose wip
ggml-ci
2025-02-24 18:12:24 +02:00
Georgi Gerganov
9cd78f11a1 context : explicit llama_context_i abstract interface
ggml-ci
2025-02-24 13:38:11 +02:00
Georgi Gerganov
4a1054b552 context : reuse built_attn_mha
ggml-ci
2025-02-24 11:29:52 +02:00
Georgi Gerganov
a5a85a3bc0 context : fix recurrent reserve
ggml-ci
2025-02-24 08:59:12 +02:00
Georgi Gerganov
0699a44c83 context : remove redundant virtual, protected -> private
ggml-ci
2025-02-23 20:02:11 +02:00
Georgi Gerganov
6378112cb5 graph : remove the build_kv_... API from llama_graph_i
ggml-ci
2025-02-23 19:39:22 +02:00
Georgi Gerganov
372fa3a894 cont : enc should work now, next is dec
ggml-ci
2025-02-23 12:20:23 +02:00
Georgi Gerganov
f5e80208c5 wip enc-dec
2025-02-21 19:17:47 +02:00
Georgi Gerganov
3753b30d65 context : fix n_outputs init
ggml-ci
2025-02-21 15:53:26 +02:00
Georgi Gerganov
f588a70da3 context : wrap input tensors in struct
ggml-ci
2025-02-21 15:09:28 +02:00
Georgi Gerganov
ebf1bdf97b context : add logs
ggml-ci
2025-02-21 14:35:23 +02:00
Georgi Gerganov
548c230dff graph : remove worst_case from the API
ggml-ci
2025-02-21 13:29:25 +02:00
Georgi Gerganov
2645a7d9a9 context : add save/load for recurrent context
ggml-ci
2025-02-21 10:28:42 +02:00
Georgi Gerganov
08011c2ca1 context : add llama_kv_cache_recurrent prototype
ggml-ci
2025-02-20 20:55:13 +02:00
Georgi Gerganov
ad870c49f4 context : fix causal input for cache-less case
ggml-ci
2025-02-20 20:01:02 +02:00
Georgi Gerganov
b1554be1d7 context : add cache-less llama_context
ggml-ci
2025-02-20 18:30:04 +02:00
Georgi Gerganov
f95b04a21c model : fix order kvq -> qkv
ggml-ci
2025-02-19 18:52:20 +02:00
Georgi Gerganov
2eacb4c1bf graph : simplify attention api
ggml-ci
2025-02-19 18:43:49 +02:00
Georgi Gerganov
e17e4b72d1 context : add llama_context_recurrent
ggml-ci
2025-02-19 16:07:27 +02:00
Georgi Gerganov
5f11a5502a kv-cache : remove llama_kv_cache_i
2025-02-19 14:36:27 +02:00
Georgi Gerganov
f5cedbcaaa kv-cache : prepare for abstraction
ggml-ci
2025-02-18 21:28:58 +02:00
Georgi Gerganov
2bffc2d514 model : pass llama_graph_i as ptr
ggml-ci
2025-02-18 14:57:26 +02:00
Georgi Gerganov
9e50456e19 context : minor simplify
ggml-ci
2025-02-18 14:53:02 +02:00
Georgi Gerganov
befe14f06f llama : reorder encode/decode in sources
2025-02-18 14:47:53 +02:00
Georgi Gerganov
bc6f187e9c cont : use returend tensors from the graph build
ggml-ci
2025-02-18 14:24:17 +02:00
Georgi Gerganov
172f61690c cont : return important tensors
ggml-ci
2025-02-18 13:48:43 +02:00
Georgi Gerganov
c23590319a graph : add llama_graph_result
ggml-ci
2025-02-18 13:48:21 +02:00
Georgi Gerganov
1d801d27b9 graph : update attn/kv_self names
2025-02-14 17:22:55 +02:00
Georgi Gerganov
828064564c context : move common inputs to base class
ggml-ci
2025-02-14 16:48:21 +02:00
Georgi Gerganov
d5e8e1a2ba context : remove batch_manager
ggml-ci
2025-02-14 16:10:55 +02:00
Georgi Gerganov
131743ff4f context : abstract constructor and init
ggml-ci
2025-02-13 17:17:51 +02:00
Georgi Gerganov
ed3cb55abe context : abstract input
ggml-ci
2025-02-13 15:53:15 +02:00
Georgi Gerganov
107d1e2c32 context : move output functionality to base class
ggml-ci
2025-02-13 15:42:14 +02:00
Georgi Gerganov
e08f38df69 context : minor cleanup
ggml-ci
2025-02-13 12:50:53 +02:00
Georgi Gerganov
f7c7757bab context : abstract state read/write
ggml-ci
2025-02-13 12:37:28 +02:00
Georgi Gerganov
3a504d9a0b llama : introduce llama_io interfaces
ggml-ci
2025-02-13 12:25:54 +02:00
Georgi Gerganov
fbe6a07256 context : rename to llama_context_kv_self
2025-02-12 17:16:44 +02:00
Georgi Gerganov
6ee86e5e0f graph : restore ubatch in build_cb
ggml-ci
2025-02-12 16:29:15 +02:00
Georgi Gerganov
f63aeecce6 llama : models now build their graphs using llama_graph_i
ggml-ci
2025-02-12 15:08:40 +02:00
Georgi Gerganov
5eae8e5183 context : move build_rope_factors to base class
ggml-ci
2025-02-12 13:32:02 +02:00
Georgi Gerganov
d146a14f77 context : minor naming fix
2025-02-12 12:41:36 +02:00
Georgi Gerganov
8da7f612b7 context : improve llama_context encapsulation
ggml-ci
2025-02-12 12:15:04 +02:00
Georgi Gerganov
b52b79b048 context : move encode/decode to llama-context.cpp
2025-02-12 11:23:38 +02:00
Georgi Gerganov
02ef4be975 context : initial abstraction
ggml-ci
2025-02-11 22:27:21 +02:00
Georgi Gerganov
2cd8a903c8 context : make output functions members
ggml-ci
2025-02-10 17:01:27 +02:00
Georgi Gerganov
d1d8d53008 bman : remove ubatch member
ggml-ci
2025-02-10 16:50:14 +02:00
Georgi Gerganov
ef358ee78f context : add decode/encode
ggml-ci
2025-02-10 16:14:13 +02:00
Georgi Gerganov
972f91c7d7 Merge branch 'master' into gg/llama-kv-cache
ggml-ci
2025-02-10 14:45:54 +02:00
Georgi Gerganov
b15fede7a9 kv-cache : fix defrag condition
ggml-ci
2025-02-06 14:35:19 +02:00
Molly Sophia
1eca8916b5 llama : fix rwkv inference (#11618)
Signed-off-by: Molly Sophia <mollysophia379@gmail.com>
2025-02-03 14:17:50 +02:00
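
The commit subjects above are terse, so here is a rough, illustrative C++ sketch of the kind of abstraction they describe: an explicit llama_context_i interface (per "context : explicit llama_context_i abstract interface") with the concrete contexts named in the log behind it. The member functions and their signatures are assumptions for illustration, not the actual llama.cpp API.

```cpp
// Hedged sketch only -- method names and signatures are illustrative
// assumptions, not the real llama.cpp declarations.
struct llama_batch;        // batch of tokens to process (opaque here)
struct llama_io_write_i;   // state-write interface, sketched further below
struct llama_io_read_i;    // state-read interface, sketched further below

class llama_context_i {
public:
    virtual ~llama_context_i() = default;

    // process a batch; concrete contexts decide what (if anything) to cache
    virtual int encode(const llama_batch & batch) = 0;
    virtual int decode(const llama_batch & batch) = 0;

    // state save/load is routed through small I/O interfaces
    virtual void state_write(llama_io_write_i & io) const = 0;
    virtual void state_read (llama_io_read_i  & io)       = 0;
};

// concrete contexts named in the commit messages above
class llama_context_kv_self   : public llama_context_i { /* self-attention KV cache */ };
class llama_context_recurrent : public llama_context_i { /* recurrent (e.g. RWKV-style) state */ };
```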
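Likewise, a minimal sketch of what "graph : add llama_graph_result" together with "cont : return important tensors" could amount to: the graph build hands back handles to the tensors the context cares about instead of the context fishing them out of the graph afterwards. Field names here are guesses, not the upstream struct.

```cpp
#include <vector>

struct ggml_tensor;  // opaque ggml type, forward-declared for the sketch

// Illustrative only -- member names are assumptions.
struct llama_graph_result {
    ggml_tensor * t_logits = nullptr;   // final logits tensor
    ggml_tensor * t_embd   = nullptr;   // output embeddings tensor
    std::vector<ggml_tensor *> inputs;  // input tensors the context fills with data later
};
```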
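Finally, for "llama : introduce llama_io interfaces" and "context : abstract state read/write", a plausible (assumed, not actual) shape for the read/write interfaces that state serialization targets, so the same code can write to a file, a buffer, or simply count bytes:

```cpp
#include <cstddef>
#include <cstdint>

// Hedged sketch -- names and signatures are assumptions for illustration.
struct llama_io_write_i {
    virtual ~llama_io_write_i() = default;
    virtual void   write(const void * src, size_t size) = 0;  // append size bytes
    virtual size_t n_bytes() const = 0;                       // total bytes written so far
};

struct llama_io_read_i {
    virtual ~llama_io_read_i() = default;
    virtual const uint8_t * read(size_t size) = 0;  // view of the next size bytes
    virtual size_t          n_bytes() const = 0;    // total bytes read so far
};
```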