A Scheme-like Lisp where completions, tool use, and agentic loops are native forms — not string templates bolted onto a scripting language. Implemented in Rust. 450+ builtins. 11 providers. Bytecode VM.
;; Define a tool the LLM can call
(deftool get-weather
  "Get weather for a city"
  {:city {:type :string}}
  (lambda (city)
    (format "~a: 22°C, sunny" city)))

;; Build an agent with tools
(defagent weather-bot
  {:system "You answer weather questions."
   :tools [get-weather]
   :model "claude-haiku-4-5-20251001"})

(agent/run weather-bot
  "What's the weather in Tokyo?")
; => "The weather in Tokyo is 22°C and sunny."

Conversations are persistent values. Prompts compose like any other s-expression. Completions, chat, structured extraction, classification, tool use, and agentic loops are all native forms.
11 providers auto-configured from environment variables. Response caching, cost budgets, rate limiting, fallback chains, and batch processing built in.
;; Simple completion
(llm/complete "Say hello in 5 words"
  {:max-tokens 50})

;; Chat with roles
(llm/chat
  [(message :system "You are helpful.")
   (message :user "What is Lisp?")]
  {:max-tokens 100})

;; Structured extraction
(llm/extract
  {:vendor {:type :string}
   :amount {:type :number}}
  "Coffee $4.50 at Blue Bottle")
; => {:amount 4.5 :vendor "Blue Bottle"}

;; Classification
(llm/classify [:positive :negative :neutral]
  "This product is amazing!")
; => :positive

Define tools with deftool: the LLM sees the schema, calls your Lisp function, and uses the result. Parameters are converted from JSON to Sema values automatically.
defagent combines a system prompt, tools, and a multi-turn loop. The agent calls tools and reasons until it has an answer or hits :max-turns.
;; Define a tool
(deftool lookup-capital
  "Look up the capital of a country"
  {:country {:type :string
             :description "Country name"}}
  (lambda (country)
    (cond
      ((= country "Norway") "Oslo")
      ((= country "France") "Paris")
      (else "Unknown"))))
;; Use tools in chat
(llm/chat
  [(message :user "Capital of Norway?")]
  {:tools [lookup-capital]})

;; Agent with multi-turn loop
(defagent geography-bot
  {:system "You answer geography questions."
   :tools [lookup-capital]
   :max-turns 3})
(agent/run geography-bot "Capital of France?")

A full coding agent in 25 lines. Tools are just lambdas with a schema. The agent loop handles tool dispatch, retries, and conversation management automatically.
Or extract data from PDFs, build semantic search, analyze images—all with builtins. No external databases or SDKs required.
;; A coding agent in 25 lines
(deftool read-file
  "Read a file's contents"
  {:path {:type :string}}
  (lambda (path) (file/read path)))

(deftool run-command
  "Run a shell command"
  {:command {:type :string}}
  (lambda (command)
    (define r (shell "sh" "-c" command))
    (string-append (:stdout r) (:stderr r))))

(defagent coder
  {:system "You are a coding assistant.
Read files before editing.
Run tests after changes."
   :tools [read-file run-command]
   :max-turns 10})
(agent/run coder
  "Find all TODO comments in src/")

Ingest a PDF, embed each page into a vector store, and answer questions with retrieval: no external database, no LangChain, no boilerplate.
Built-in PDF extraction with pdf/extract-text-pages, embeddings via llm/embed, and an in-memory vector store with cosine similarity search and disk persistence.
;; PDF → vector store → answer
(vector-store/create "manual")

(define page-num 0)
(for-each
  (lambda (page)
    (set! page-num (+ page-num 1))
    (vector-store/add "manual"
      (format "p~a" page-num)
      (llm/embed page) {:text page}))
  (pdf/extract-text-pages "manual.pdf"))

;; Ask a question
(define hits
  (vector-store/search "manual"
    (llm/embed "How do I configure providers?") 3))

(llm/complete
  (prompt
    (system "Answer using only this context.")
    (user (string/join
            (map (fn (h) (:text (:metadata h))) hits)
            "\n---\n"))
    (user "How do I configure providers?")))

Extract structured data from images with llm/extract-from-image: same schema syntax as llm/extract, but for receipts, screenshots, forms, or any visual input.
Works with OpenAI, Anthropic, and Ollama vision models. Images can also be attached to conversations with message/with-image.
;; Extract typed data from an image
(llm/extract-from-image
  {:vendor {:type :string}
   :total {:type :number}
   :date {:type :string}}
  "receipt.jpg")
; => {:vendor "Blue Bottle" :total 4.5 :date "2025-02-18"}

;; Vision in conversations
(define img (file/read-bytes "screenshot.png"))
(llm/chat
  [(message/with-image :user
     "What error is shown?" img)]
  {:max-tokens 200})

Immutable conversation values that accumulate message history. conversation/say sends a message, gets a reply, and returns a new conversation with both appended.
Process collections in parallel with llm/pmap and llm/batch.
;; Persistent conversations
(define c (conversation/new {}))
(define c (conversation/say c
  "Remember: the secret is 7"))
(define c (conversation/say c
  "What is the secret?"))
(conversation/last-reply c)
; => "The secret is 7."

;; Parallel batch processing
(llm/pmap
  (fn (word) (format "Define: ~a" word))
  '("serendipity" "ephemeral")
  {:max-tokens 50})

;; Budget-scoped calls
(llm/with-budget {:max-cost-usd 1.00}
  (lambda ()
    (llm/complete "Summarize this")))

;; Cache repeated calls
(llm/with-cache {:ttl 3600}
  (lambda ()
    (llm/complete "Same prompt")))

Generate embeddings with llm/embed and compute cosine similarity with llm/similarity. Supports Jina, Voyage, Cohere, and OpenAI embedding models.
Built-in vector store with vector-store/create, vector-store/add, and vector-store/search for similarity search. Save to disk with vector-store/save and reload with vector-store/open.
;; Create a store and add documents
(vector-store/create "knowledge")
(for-each
  (fn (doc)
    (vector-store/add "knowledge"
      (car doc) (llm/embed (cadr doc))
      {:text (cadr doc)}))
  '(("lisp" "Lisp is a family of programming languages")
    ("rust" "Rust is a systems language focused on safety")
    ("cooking" "Italian cooking uses fresh ingredients")))

;; Similarity search
(vector-store/search "knowledge"
  (llm/embed "writing code") 2)
; => ({:id "rust" :score 0.82 :metadata {...}}
;     {:id "lisp" :score 0.78 :metadata {...}})

;; Persist to disk and reload
(vector-store/save "knowledge" "knowledge.json")
(vector-store/open "reloaded" "knowledge.json")

50+ LLM builtins: completion, chat, streaming, tools, agents, embeddings, vector stores, caching, and more.
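Embeddings can also be compared directly, without a store. A minimal sketch using llm/embed and llm/similarity; the argument order and return shape of llm/similarity are assumptions:

```lisp
;; Sketch: assumes llm/similarity takes two embedding
;; vectors and returns a cosine-similarity score
(define code-vec (llm/embed "writing Rust code"))
(define lisp-vec (llm/embed "writing Lisp code"))
(define food-vec (llm/embed "Italian cooking"))

;; Related texts should score higher than unrelated ones
(llm/similarity code-vec lisp-vec)
(llm/similarity code-vec food-vec)
```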
Browse LLM Reference →

A Scheme-like core with Clojure-style keywords (:foo), map literals ({:key val}), and vector literals ([1 2 3]).
Tail-call optimized via trampoline. Closures, macros, higher-order functions, and a module system—all in a single-threaded evaluator small enough to read in an afternoon.
;; Recursion
(define (factorial n)
  (if (<= n 1) 1 (* n (factorial (- n 1)))))
(factorial 10) ; => 3628800

;; Higher-order functions
(map (lambda (x) (* x x)) (range 1 6))
; => (1 4 9 16 25)
(filter even? (range 1 11))
; => (2 4 6 8 10)
(foldl + 0 (range 1 11))
; => 55

;; Maps — keywords are functions
(define person {:name "Ada" :age 36})
(:name person) ; => "Ada"

;; Closures and composition
(define (compose f g)
  (lambda (x) (f (g x))))
(define inc-then-double
  (compose (lambda (x) (* x 2))
           (lambda (x) (+ x 1))))
(inc-then-double 5) ; => 12

Structured error handling with typed error maps. catch binds an error map with :type, :message, and :stack-trace keys.
defmacro with quasiquote, unquote, and splicing. eval and read for runtime code generation. Inspect expansions with macroexpand.
;; Error handling
(try
  (/ 1 0)
  (catch e
    (println (:message e))
    (:type e))) ; => :eval

(throw {:code 404 :reason "not found"})

;; Macros
(defmacro unless (test . body)
  `(if ,test nil (begin ,@body)))
(unless #f
  (println "this runs!"))
;; Runtime eval
(eval (read "(+ 1 2 3)")) ; => 6

Linked lists, vectors, and ordered maps with a full suite of higher-order operations. Slash-namespaced string functions, file I/O, HTTP client, JSON, regex, shell access, and more.
Keywords in function position act as map accessors. Map bodies auto-serialize as JSON in HTTP requests.
;; Collections
(map + '(1 2 3) '(10 20 30)) ; => (11 22 33)
(filter even? (range 1 11)) ; => (2 4 6 8 10)
(define m {:a 1 :b 2 :c 3})
(assoc m :d 4) ; => {:a 1 :b 2 :c 3 :d 4}
(map/select-keys m '(:a :c)) ; => {:a 1 :c 3}
;; Strings
(string/split "a,b,c" ",") ; => ("a" "b" "c")
(string/join '("a" "b") ", ") ; => "a, b"
(string/upper "hello") ; => "HELLO"
;; Files, HTTP & JSON
(file/write "out.txt" "hello")
(file/read "out.txt") ; => "hello"
(define resp (http/get "https://api.example.com/data"))
(json/decode (:body resp)) ; => {:key "val"}
(shell "ls -la") ; => {:exit-code 0 :stdout "..."}

450+ builtins across 19 modules: math, strings, lists, maps, I/O, HTTP, regex, text processing, and more.
Browse Standard Library Reference →

Trampoline-based evaluator. Deep recursion without stack overflow.
I/O, HTTP, regex, JSON, crypto, CSV, datetime, math, and more.
deftool and defagent as native special forms with multi-turn loops.
Define a schema as a map, get typed data back. llm/extract + llm/classify.
Real-time token streaming with llm/stream. Parallel batch with llm/pmap.
Conversations are immutable values. Fork, extend, inspect message history as data.
File-based modules with import and export. Paths resolve relative to current file.
try / catch / throw with typed error maps and full stack traces.
Optional bytecode compiler and stack-based VM via --vm. 1.7x faster than tree-walking.
sema build compiles programs into self-contained binaries with auto-traced imports and bundled assets.
Response caching, cost budgets, rate limiting, fallback chains, and retry with exponential backoff.
In-memory vector store with similarity search, cosine distance, and disk persistence.
--sandbox restricts shell, filesystem, network, and LLM access per capability group.
Anthropic, OpenAI, Gemini, Ollama, Groq, xAI, Mistral, Moonshot, Jina, Voyage, Cohere.
defmacro with quasiquote/unquote/splicing. macroexpand for inspection.
Keywords (:foo), maps ({:k v}), vectors ([1 2]). Keywords as functions.
Chunking, sentence splitting, HTML stripping, prompt templates, and document abstractions.
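The file-based module system can be sketched like this; the exact shapes of import and export are assumptions based on the forms named above:

```lisp
;; math-utils.sema
(define (square x) (* x x))
(export square)

;; main.sema (the import path resolves relative to this file)
(import "math-utils.sema")
(square 4) ; => 16
```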
Eight Rust crates, one directed dependency graph. No circular dependencies. Single-threaded with Rc, deterministic ordering with BTreeMap.
sema-core: value types, environment, errors
sema-reader: lexer and s-expression parser
sema-vm: bytecode compiler, resolver, stack-based VM
sema-eval: trampoline evaluator, special forms, modules
sema-stdlib: 450+ builtins across 19 modules
sema-llm: LLM providers, vector store, caching, resilience
sema: REPL, CLI, file runner
sema-wasm: WebAssembly bindings for the browser playground

                sema-core
               /    |    \
    sema-reader     |     sema-stdlib
        |     \     |     /
    sema-vm    sema-eval    sema-llm
          \        |       /    \
             sema            sema-wasm

Get running in one command.
$ brew install helgesverre/tap/sema-lang
$ curl -fsSL https://sema-lang.com/install.sh | sh
$ cargo install sema-lang
$ git clone https://github.com/HelgeSverre/sema
$ cd sema && cargo build --release

$ sema # Start the REPL
$ sema script.sema # Run a file
$ sema -e '(+ 1 2)' # Eval expression
$ sema -p '(filter even? (range 10))' # Eval & print
$ sema -l prelude.sema script.sema # Load then run
$ sema --vm script.sema # Use the bytecode VM
$ sema compile script.sema # Compile to .semac bytecode
$ sema script.semac # Run compiled bytecode
$ sema build app.sema -o myapp # Build standalone executable
$ sema disasm script.semac # Disassemble bytecode
$ sema --sandbox=strict script.sema # Sandboxed execution
$ sema --no-llm script.sema # No LLM features