# ocaml-ai-sdk

Type-safe, provider-agnostic AI model abstraction for OCaml, inspired by the Vercel AI SDK. Targets wire compatibility with AI SDK v6, so you can pair an OCaml backend with `@ai-sdk/react` frontends.
## Libraries

| Library | opam lib | Description |
|---|---|---|
| `ai_provider` | | Provider abstraction — language model module types, tool definitions, prompt types, GADT-based provider options |
| `ai_provider_anthropic` | | Anthropic Messages API — streaming SSE, thinking, cache control, full Claude model catalog |
| `ai_provider_openai` | | OpenAI Chat Completions API — streaming SSE, tool calling with strict mode, GPT-4o/o1/o3/o4-mini catalog |
| `ai_core` | | Core SDK — generate, stream, UIMessage protocol |
| | | Melange bindings for `@ai-sdk/react` frontends |
## Quick start

```
opam install ocaml-ai-sdk
```

### One-shot generation

```ocaml
open Ai_core
open Ai_provider_anthropic

let () =
  Lwt_main.run @@
  let model = Anthropic.create_model "claude-sonnet-4-20250514" in
  let%lwt result = Generate_text.generate ~model ~prompt:"Say hello" () in
  Lwt_io.printl result.text
```

### Streaming

```ocaml
let () =
  Lwt_main.run @@
  let model = Anthropic.create_model "claude-sonnet-4-20250514" in
  let%lwt result = Stream_text.stream ~model ~prompt:"Tell me a joke" () in
  Lwt_stream.iter_s Lwt_io.printl result.text_stream
```

### Chat server (with UIMessage protocol)

```ocaml
let handler = Server_handler.create ~model ()
(* Serves SSE responses compatible with useChat() from @ai-sdk/react *)
```

See `examples/` for complete runnable demos including tool use, thinking, structured output, and full-stack Melange apps.
## Architecture

```
ai_provider                  Provider abstraction (module types, GADT options)
├── ai_provider_anthropic    Anthropic implementation
├── ai_provider_openai       OpenAI implementation
└── ai_core                  Core SDK (generate, stream, UIMessage protocol)
```

Key design choices:
- Provider options use an extensible GADT (`type _ key = ..`) for compile-time type-safe provider-specific settings (e.g. thinking budget, cache control)
- Prompt types are role-constrained variants — `System` accepts only strings, `User` accepts text + files, etc.
- Streaming uses `Lwt_stream.t` — `stream_text` returns synchronously, with streams populated by a background Lwt task
- The UIMessage protocol emits SSE chunks matching the `ai@6` Zod schemas exactly, so `useChat()` works without adaptation
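The extensible-GADT pattern behind provider options can be sketched in isolation. The key names below (`Thinking_budget`, `Cache_control`) are illustrative stand-ins, not the library's actual keys:

```ocaml
(* An extensible GADT: each provider module can add its own typed keys. *)
type _ key = ..

(* Hypothetical provider-specific keys; in the real library these would
   live in modules such as ai_provider_anthropic. *)
type _ key += Thinking_budget : int key
type _ key += Cache_control : bool key

(* A setting pairs a key with a value of the matching type, so
   Setting (Thinking_budget, "high") is rejected at compile time. *)
type setting = Setting : 'a key * 'a -> setting

let describe : setting -> string = function
  | Setting (Thinking_budget, budget) ->
      Printf.sprintf "thinking budget = %d tokens" budget
  | Setting (Cache_control, enabled) ->
      Printf.sprintf "cache control = %b" enabled
  | Setting (_, _) -> "unknown provider option"

let () = print_endline (describe (Setting (Thinking_budget, 2048)))
```

Matching on the GADT constructor refines the existential type, so each branch sees the value at its precise type.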
## AI SDK v6 compatibility

The UIMessage stream protocol (`x-vercel-ai-ui-message-stream: v1`) is wire-compatible with:

- `@ai-sdk/react` 3.x (`useChat`, `useCompletion`)
- `ai` 6.x (core SDK)

All chunk types are supported: text, reasoning, tool call (start/delta/result), source, file, data, error, and finish message/step.
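On the OCaml side, chunk kinds like these map naturally onto a variant type. A simplified sketch follows; the constructor names are illustrative, not the library's actual type, which mirrors the `ai@6` schemas in full:

```ocaml
(* Hypothetical, simplified model of UIMessage stream chunks. *)
type chunk =
  | Text_delta of string        (* incremental assistant text *)
  | Reasoning_delta of string   (* incremental reasoning text *)
  | Tool_call_start of string   (* tool name *)
  | Tool_call_delta of string   (* partial JSON arguments *)
  | Tool_result of string       (* JSON-encoded result *)
  | Error of string
  | Finish

(* Group a chunk under the broad categories listed above. *)
let kind = function
  | Text_delta _ -> "text"
  | Reasoning_delta _ -> "reasoning"
  | Tool_call_start _ | Tool_call_delta _ | Tool_result _ -> "tool"
  | Error _ -> "error"
  | Finish -> "finish"

let () = print_endline (kind (Text_delta "Hi"))
```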
## Requirements

- OCaml >= 4.14
- For Melange bindings: `melange` >= 4.0.0
## Development

```
make build   # Build all libraries
make test    # Run test suites
make dev     # Watch mode
```

## License

MIT