
# MIND - Modular Inference & Node Database

**Version:** Milestone-1

## Setup

Starting the backend will create and set up a SQLite database if `mind.db` doesn't exist under `backend/` (see "Running the backend layer" below).

Also see `backend/README.md`.

## Getting a llama.cpp instance

llama.cpp is included as a third-party git submodule under `third_party`.

A script that downloads a Qwen 3 1.7B LLM model from HuggingFace is provided at `backend/get-qwen3-1.7b.sh`.

By default, the Go layer of the backend expects the LLM service at `localhost:8081`.

Please consider adding the `-n <num-of-tokens-to-predict>` option when starting `llama-server`, to avoid infinite generation on smaller models.
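
A minimal sketch of the download-and-serve steps, assuming llama.cpp has already been built so that the `llama-server` binary exists (the build path below is the usual CMake output location and may differ; `-m`, `--port`, and `-n` are standard llama.cpp flags):

```bash
# Fetch the Qwen 3 1.7B model using the script provided in this repo
./backend/get-qwen3-1.7b.sh

# Serve the model where the Go backend expects it (localhost:8081);
# -n caps tokens per completion to avoid infinite generation.
./third_party/llama.cpp/build/bin/llama-server \
    -m <path-to-downloaded-model>.gguf \
    --port 8081 \
    -n 512
```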

## Running the backend layer

Run `go run ./backend/main.go`, which will start a backend instance at `localhost:8080`.

Note that if `mind.db` doesn't exist in the working directory, it will be created and initialized by the code in `backend/db/migrations`.

All db-related operations (e.g. queries) live under `backend/db`.
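
For example, from the repository root:

```bash
# Start the backend on localhost:8080; on first run this also creates
# and migrates mind.db in the current working directory.
go run ./backend/main.go
```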

## Running tests

For testing, we provide a simple CLI-based frontend client at `backend/cli.py`.

It exercises all of the features that are supported or under development.
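
A typical invocation, assuming a Python 3 interpreter is available (check `backend/cli.py` itself for any required arguments):

```bash
# Launch the CLI client; it talks to the backend at localhost:8080
python3 backend/cli.py
```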

## Current features (and features under development)

A short video demo of basic functionality on the CLI can be found at `milestone-1/demo.mp4`.

The specific APIs used are documented in `backend/design.md`.

| Action | Command / Description |
| --- | --- |
| Start conversation | `\new <title> <owner_id>` |
| Commit (prompt/answer or node) | Free text → `/completion`, or `\node <user\|assistant> <content> [parent_id]` |
| Fork branch | `\branch <name> [head_node_id]` |
| Delete branch or node | `\delbranch <name>` or `\delnode <node_id>` |
| Extract subset as new tree | `\extract <title> <node_id ...>` |
| Merge two branches | `\merge <left> <right> <left-first\|right-first>` |
| Detach branch to new conversation | `\detach <branch>` |
| Append / copy tree to another | `\append-tree <src_conv> <src_branch> [dst_conv [dst_branch]]` |
| List conversations | `\listconvs <owner_id>` |
| List branches | `\listbranches <conversation_id>` |
| Linearize current branch | `\linearize` |
| Debug llama-server directly | `\llama <prompt>` |
| Switch conversation / branch | `\useconv <id>` and `\usebranch <name>` |
| Show current user | `\who` |

For a more detailed experience, we strongly recommend running the backend locally and trying out the CLI.
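
A minimal session might look like the following (everything after `#` is an annotation, not CLI input; the IDs, title, and branch name are hypothetical):

```
\new demo 1            # start a conversation titled "demo" for owner 1
Hello, what is MIND?   # free text is committed via /completion
\branch side 2         # fork a branch named "side" at node 2
\listbranches 1        # list branches of conversation 1
\linearize             # print the current branch as a linear transcript
\who                   # show the current user
```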