package ai-sdk-react
Melange bindings for @ai-sdk/react
Sources
ocaml-ai-sdk-0.2.tbz
sha256=17063971a74ccd72619a43ecfcb29d28cdf08f90d406a960bd475f65fee1a0d9
sha512=1f3f2d1d7fa4f53843cc1b303cbf9d6dcc85780e7c768ced723f3de83ea897748fa231b48e729216abcc069dee818edc6751a45271541e8a007369bf6e9d23e3
Changelog
All notable changes to this project will be documented in this file.
0.2 — 2026-04-14
Core SDK (ai_core)
- `Smooth_stream` — stream transformer that buffers `Text_delta` and `Reasoning_delta` chunks and re-emits them in controlled pieces with configurable inter-chunk delays. Five chunking modes: `Word` (default), `Line`, `Regex` (custom Re2 pattern), `Segmenter` (Unicode UAX#29 word boundaries via uuseg, recommended for CJK), and `Custom` (user function). Matches the upstream AI SDK's `smoothStream` transform.
- `?transform` parameter on `stream_text` and `server_handler.handle_chat` — generic stream transformer (`Text_stream_part.t Lwt_stream.t -> Text_stream_part.t Lwt_stream.t`) applied between the raw event stream and consumer-facing streams. Both `full_stream` and `text_stream` reflect the transformed output.
- Retry with exponential backoff — `Retry` module with jitter, configurable initial delay and backoff factor, and parameter validation. `?max_retries` threaded through `generate_text`, `stream_text`, and `server_handler.handle_chat`. Retries only on errors marked retryable.
- Telemetry / observability — `Telemetry` module with OpenTelemetry-compatible span instrumentation via the `trace` library (ocaml-trace). Configurable `Telemetry.t` settings control enable/disable, input/output recording privacy, function ID, custom metadata, and lifecycle integration callbacks (`on_start`, `on_step_finish`, `on_tool_call_start`, `on_tool_call_finish`, `on_finish`). Span hierarchy matches the upstream AI SDK: `ai.generateText`/`ai.streamText` root spans, `*.doGenerate`/`*.doStream` step spans, and `ai.toolCall` tool execution spans. `?telemetry` parameter threaded through `generate_text`, `stream_text`, and `server_handler.handle_chat`.
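Conceptually, the `Word` chunking mode buffers incoming deltas and flushes only completed words, keeping any trailing partial word for the next delta. A minimal sketch in plain OCaml — the `feed` helper is invented for illustration and is not the library's API; the real transformer also handles the other modes, inter-chunk delays, and `Reasoning_delta` chunks:

```ocaml
(* Hypothetical sketch of Word-mode buffering: given the leftover buffer
   and a new delta, return the complete words to emit (each with its
   trailing space) and the new leftover partial word. *)
let feed buffer delta =
  let s = buffer ^ delta in
  match String.rindex_opt s ' ' with
  | None -> ([], s) (* no complete word yet: keep buffering *)
  | Some i ->
      (* Everything up to and including the last space is complete. *)
      let complete = String.sub s 0 (i + 1) in
      let rest = String.sub s (i + 1) (String.length s - i - 1) in
      let words =
        String.split_on_char ' ' complete
        |> List.filter (fun w -> w <> "")
        |> List.map (fun w -> w ^ " ")
      in
      (words, rest)
```

For example, feeding `"Hello wo"` emits `["Hello "]` and retains `"wo"` until a later delta completes the word.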
Provider Abstraction Layer (ai_provider)
- `is_retryable` field on `Provider_error.t` — defaults from the HTTP status code (429 and 5xx are retryable). The Anthropic and OpenAI providers set it explicitly based on error classification.
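The documented default fits in a one-liner; the function name below is illustrative rather than the module's actual interface:

```ocaml
(* Default retryability from an HTTP status: 429 (rate limit) and any
   5xx (server error) are considered retryable. *)
let default_is_retryable ~status = status = 429 || (status >= 500 && status < 600)
```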
Examples
- `smooth_streaming` — demonstrates all five chunking modes
- `telemetry_logging` — demonstrates integration callbacks for lifecycle logging
Dependencies
- Added `re2` (>= 0.16) and `uuseg` (>= 17.0) to `ai_core`
- Added `trace` (>= 0.12) to `ai_core`
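In a dune-based project these constraints would typically land in the `depends` field of the `dune-project` package stanza, roughly as follows (package name and field placement are assumptions for the sketch; other fields elided):

```
(package
 (name ai_core) ; illustrative; the actual opam package name may differ
 (depends
  (re2 (>= 0.16))
  (uuseg (>= 17.0))
  (trace (>= 0.12))))
```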
0.1 — 2026-04-06
Initial release of the OCaml AI SDK — a type-safe, provider-agnostic AI model abstraction inspired by the Vercel AI SDK, targeting AI SDK v6 wire compatibility.
Provider Abstraction Layer (ai_provider)
- Extensible GADT-based `Provider_options` for compile-time type-safe provider-specific settings
- Role-constrained `Prompt` types (System = string only, User = text + files, etc.)
- `Language_model.S` module type with a first-class module wrapper
- `Tool`, `Tool_choice`, `Mode`, `Content` foundation types
- `Finish_reason`, `Usage`, `Warning`, `Provider_error` types
- `Provider.S` and `Middleware.S` module type signatures
- `Call_options`, `Generate_result`, `Stream_part`, `Stream_result` types
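As a rough illustration of the extensible-GADT pattern behind `Provider_options` — the constructor and function names below are invented for the sketch, not the library's definitions — each provider can add its own typed constructors to an open type, and consumers match only the cases they understand:

```ocaml
(* An open (extensible) GADT: providers extend it with their own
   typed settings without touching the core definition. *)
type _ provider_options = ..

(* A hypothetical Anthropic-style extension carrying an int payload. *)
type _ provider_options += Thinking_budget : int -> int provider_options

(* Consumers pattern-match on the constructors they know; unknown
   extensions fall through to the wildcard. *)
let budget : type a. a provider_options -> int option = function
  | Thinking_budget n -> Some n
  | _ -> None
```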
Anthropic Provider (ai_provider_anthropic)
- Full Anthropic Messages API implementation with streaming (SSE)
- `Thinking` support with a `budget_tokens` smart constructor (>= 1024)
- `Cache_control` for prompt caching
- `Anthropic_options` via the extensible GADT system
- Model catalog with all Claude models (Opus, Sonnet, Haiku families)
- Beta header management and model-aware `max_tokens`
- Prompt conversion with message grouping, tool conversion, response parsing
- Provider factory and public API
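A smart constructor enforcing the documented >= 1024 minimum for `budget_tokens` might look like the following sketch (the real constructor's name, signature, and error handling may differ):

```ocaml
(* Reject thinking-token budgets below the documented 1024 minimum. *)
let budget_tokens n = if n >= 1024 then Some n else None
```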
OpenAI Provider (ai_provider_openai)
- OpenAI Chat Completions API implementation with streaming (SSE)
- Model catalog with GPT-4o, GPT-4o-mini, o1, o3, o4-mini families
- Tool calling with strict mode support
- Prompt conversion, response parsing, and provider factory
Core SDK (ai_core)
- `generate_text` — synchronous text generation with a multi-step tool loop
- `stream_text` — streaming text generation with a multi-step tool loop; returns synchronously, with streams filled by a background Lwt task
- Output API — `Output.text`, `Output.object_`, `Output.enum`, `Output.array`, `Output.choice` with JSON Schema validation
- UIMessage stream protocol — SSE `data: {json}\n\n` encoding with the `x-vercel-ai-ui-message-stream: v1` header, all v6 chunk types
- `Ui_message_stream_writer` — composable stream builder with `write` (synchronous) and `merge` (non-blocking via `Lwt.async`), lifecycle management, ref-counted in-flight merge tracking, `on_finish` callback
- Server handler — cohttp endpoint for chat with CORS support, v6-only request parsing with full part type support (text, file, reasoning, tool invocations with all states)
- Tool approval workflow — `needs_approval` predicate on `Core_tool.t`, step loop partitioning, `Tool_approval_request` chunk type, stateless re-submission with `approved_tool_call_ids`
- `Stop_condition` — step loop termination predicates matching the upstream `stopWhen`: `step_count_is`, `has_tool_call`, `is_met` (OR semantics with short-circuit); wired through `generate_text`, `stream_text`, and `server_handler`; `max_steps` remains as an independent hard safety cap
- Partial JSON parser for streaming structured output
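The OR semantics of `Stop_condition` can be modeled in a few lines. The types below are simplified stand-ins for the sketch, not the library's actual signatures:

```ocaml
(* A step records which tools were called; a stop condition is a
   predicate over the steps run so far. *)
type step = { tool_calls : string list }
type stop_condition = step list -> bool

(* Stop once at least n steps have run. *)
let step_count_is n : stop_condition = fun steps -> List.length steps >= n

(* Stop once a given tool has been called in any step. *)
let has_tool_call name : stop_condition =
 fun steps -> List.exists (fun s -> List.mem name s.tool_calls) steps

(* OR over all conditions; List.exists short-circuits on the first hit. *)
let is_met conditions steps = List.exists (fun c -> c steps) conditions
```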
Melange Bindings (ai-sdk-react)
- `useChat` and `useCompletion` hook bindings for `@ai-sdk/react`
- All v6 message part types, including `data_ui_part`
- `classify` function for part type dispatch
- Module-scoped accessors for ergonomic use from OCaml/Reason
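A `classify`-style dispatch can be pictured as an ordinary variant match. The constructors below are invented for illustration and do not mirror the bindings' actual types:

```ocaml
(* Hypothetical message-part variant; real bindings wrap JS objects. *)
type part =
  | Text of string
  | Reasoning of string
  | File of { url : string }
  | Data_ui of { name : string }

(* Map each part to a tag suitable for rendering dispatch. *)
let classify = function
  | Text _ -> "text"
  | Reasoning _ -> "reasoning"
  | File _ -> "file"
  | Data_ui _ -> "data"
```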
Examples
- `one_shot`, `streaming`, `tool_use`, `thinking`, `generate`, `stream_chat`, `agent_loop` — standalone CLI examples
- `chat_server` — cohttp chat server with React frontend, tool approval, structured output
- `custom_stream` — custom data streaming with a Melange frontend
- `ai-e2e` — end-to-end Melange app with 11 demos (basic chat, reasoning, tool use, tool approval, client tools, file attachments, structured output, completion, web search, retry/regenerate)
Infrastructure
- Dune build with `generate_opam_files` for automated opam file generation
- mlx dialect support (OCaml + JSX via `mlx-pp`/`ocamlformat-mlx`)
- Alcotest test suites for all three libraries
- SSE wire format snapshot tests