llm_ollama 0.1.9
Ollama backend implementation for LLM interactions. Provides streaming chat, embeddings, tool calling, vision support, and model management.
# Changelog
All notable changes to this project will be documented in this file.
The format is based on Keep a Changelog, and this project adheres to Semantic Versioning.
## [Unreleased]
## 0.1.9 - 2026-02-28

### Added

- Integration regression coverage for stream boundary resilience in tool-call loops (`read_file` -> `write_file` -> final assistant response)

### Changed

- `OllamaStreamConverter.toLLMStream()` now uses boundary-safe NDJSON line framing across transport chunks
- Stream parsing now uses bounded malformed-line retries (3 consecutive malformed non-empty lines) and throws an explicit `LLMApiException` after the budget is exhausted, instead of silently dropping lines indefinitely
- `maxToolAttempts` default increased from 25 to 90
- Bumped `llm_core` dependency to ^0.1.9
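To make the boundary-safe framing and malformed-line budget concrete, here is a language-agnostic sketch in Python (the package itself is Dart, so names like `LLMApiException` are stand-ins and the real implementation may differ): NDJSON lines are reassembled across arbitrary transport chunk boundaries, blank lines are ignored, and three consecutive malformed non-empty lines exhaust the retry budget.

```python
import json

class LLMApiException(Exception):
    """Stand-in for the package's API exception type."""

def parse_ndjson_chunks(chunks, malformed_budget=3):
    """Yield parsed JSON objects from NDJSON split arbitrarily across chunks."""
    buffer = ""
    consecutive_malformed = 0
    for chunk in chunks:
        buffer += chunk
        # Only newline-terminated lines are complete; the tail stays buffered
        # so an object split across two chunks is never parsed prematurely.
        *complete, buffer = buffer.split("\n")
        for line in complete:
            if not line.strip():
                continue  # blank lines do not count against the budget
            try:
                yield json.loads(line)
                consecutive_malformed = 0  # any good line resets the counter
            except json.JSONDecodeError:
                consecutive_malformed += 1
                if consecutive_malformed >= malformed_budget:
                    raise LLMApiException(
                        f"{malformed_budget} consecutive malformed NDJSON lines")
    if buffer.strip():
        yield json.loads(buffer)  # trailing line without a final newline
```

A chunk boundary falling mid-object, e.g. `['{"a"', ': 1}\n{"b": 2}\n']`, is handled by the buffered tail rather than producing a malformed-line error.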
## 0.1.8 - 2026-02-26

### Added

- `OllamaMessageConverter.messagesToOllamaJson()` for list-aware conversion; derives `tool_name` from `toolCallId` via the preceding assistant message's `tool_calls` or synthetic ID parsing
- Fallback chain for `tool_name`: look up by id, parse a synthetic `tool_N_name` id, or send `tool_call_id` only (Ollama supports both)
- Thorough tool-response integration tests (15 cases) validating the stream contract, deterministic results, error handling, tool chains, and Ollama-specific behavior

### Changed

- Tool messages now converted via `messagesToOllamaJson()`; `tool_name` is encapsulated in the Ollama layer and derived from `toolCallId`
- Replaced per-message `toJson()` with the list-aware `messagesToOllamaJson()` for correct tool message conversion
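The fallback chain above can be sketched as follows. This is a Python illustration of the described logic, not the Dart source: the JSON field names (`id`, `function`, `name`) and the exact synthetic ID format `tool_<index>_<name>` are assumptions inferred from the changelog.

```python
import re

def resolve_tool_name(tool_call_id, assistant_tool_calls):
    """Resolve tool_name for a tool message using the documented fallback chain."""
    # 1. Look up the id among the preceding assistant message's tool_calls.
    for call in assistant_tool_calls:
        if call.get("id") == tool_call_id:
            return call["function"]["name"]
    # 2. Parse synthetic ids of the (assumed) form tool_<index>_<name>.
    match = re.fullmatch(r"tool_\d+_(.+)", tool_call_id)
    if match:
        return match.group(1)
    # 3. No name recoverable; the caller sends tool_call_id only.
    return None

def tool_message_to_json(tool_call_id, content, assistant_tool_calls):
    """Build a tool-role message, attaching tool_name only when derivable."""
    msg = {"role": "tool", "content": content, "tool_call_id": tool_call_id}
    name = resolve_tool_name(tool_call_id, assistant_tool_calls)
    if name is not None:
        msg["tool_name"] = name
    return msg
```

Keeping this derivation inside the converter is what the "encapsulated in the Ollama layer" entry refers to: callers pass only `toolCallId`, and the Ollama-specific `tool_name` field never leaks into the shared message model.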
## 0.1.7 - 2026-02-10

### Added

- `batchEmbed()` implementation: delegates to the existing batch-capable `embed()` (Ollama's `/api/embed` accepts an array of inputs).
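The delegation is worth a minimal sketch because it explains why no extra batching logic is needed. This is a hypothetical Python shape, not the package's Dart API; the `transport` callable and `embeddings` key mirror Ollama's `/api/embed` request/response but class and method names here are invented for illustration.

```python
class OllamaEmbeddingRepository:
    """Sketch: batch_embed delegates to embed, because the underlying
    /api/embed endpoint already accepts a list of inputs."""

    def __init__(self, transport):
        self._transport = transport  # callable that POSTs a body to /api/embed

    def embed(self, model, inputs):
        # Ollama's /api/embed accepts either a single string or a list,
        # and returns one embedding per input under "embeddings".
        return self._transport({"model": model, "input": inputs})["embeddings"]

    def batch_embed(self, model, texts):
        # No chunking or per-item loop needed: delegate straight through.
        return self.embed(model, texts)
```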
## 0.1.6 - 2026-02-10

### Fixed

- Ensured Ollama tool calls always produce `LLMToolCall` instances with non-null, non-empty `id` values, synthesizing IDs when Ollama does not provide them.
- Aligned tool-calling behavior with `llm_core`'s `toolCallId` validation so that tool execution no longer fails with `Tool message must have toolCallId` when used together with `llm_core`.
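A minimal sketch of the ID-synthesis step, in Python rather than Dart: the `tool_<index>_<name>` format is an assumption inferred from the 0.1.8 entry's mention of parsing synthetic `tool_N_name` ids, and the dict shape stands in for the real `LLMToolCall` type.

```python
def ensure_tool_call_ids(tool_calls):
    """Guarantee each tool call carries a non-null, non-empty id,
    synthesizing a deterministic one from position and function name."""
    result = []
    for i, call in enumerate(tool_calls):
        call = dict(call)  # don't mutate the caller's data
        if not call.get("id"):  # covers missing, None, and empty string
            call["id"] = f"tool_{i}_{call['function']['name']}"
        result.append(call)
    return result
```

Deterministic, position-based ids have the useful property that the original function name can later be recovered by parsing the id, which is exactly what the 0.1.8 fallback chain relies on.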
## 0.1.5 - 2026-01-26

### Added

- Builder pattern for `OllamaChatRepository` via `OllamaChatRepositoryBuilder` for complex configurations
- Support for `StreamChatOptions` in the `streamChat()` method
- Support for a `chatResponse()` method for non-streaming complete responses
- Support for `RetryConfig` and `TimeoutConfig` for advanced request configuration
- Input validation for model names and messages
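To illustrate how the builder, configuration objects, and input validation fit together, here is a hypothetical Python sketch; the class name matches the changelog, but every method name, field, and the validation rule shown are assumptions rather than the package's real Dart API.

```python
class OllamaChatRepositoryBuilder:
    """Hypothetical fluent builder: each setter returns self so calls chain,
    and build() validates required inputs before producing a configuration."""

    def __init__(self):
        self._cfg = {}

    def base_url(self, url):
        self._cfg["base_url"] = url
        return self

    def model(self, name):
        if not name or not name.strip():
            raise ValueError("model name must be non-empty")  # input validation
        self._cfg["model"] = name
        return self

    def retry_config(self, max_attempts):
        self._cfg["retry"] = {"max_attempts": max_attempts}
        return self

    def timeout_config(self, seconds):
        self._cfg["timeout"] = {"seconds": seconds}
        return self

    def build(self):
        if "model" not in self._cfg:
            raise ValueError("model is required")
        return dict(self._cfg)
```

The design choice a builder buys here is that optional concerns (retries, timeouts, base URL) stay out of the constructor signature, so adding a new configuration knob later does not break existing call sites.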
Changed #
streamChat()now accepts optionalStreamChatOptionsparameter- Improved error handling and retry logic
- Enhanced documentation