llm_ollama 0.1.5
Ollama backend implementation for LLM interactions. Provides streaming chat, embeddings, tool calling, vision support, and model management.
# Changelog
All notable changes to this project will be documented in this file.
The format is based on Keep a Changelog, and this project adheres to Semantic Versioning.
## [Unreleased]

## [0.1.5] - 2026-01-26

### Added
- Builder pattern for `OllamaChatRepository` via `OllamaChatRepositoryBuilder` for complex configurations
- Support for `StreamChatOptions` in the `streamChat()` method
- Support for `chatResponse()` method for non-streaming complete responses
- Support for `RetryConfig` and `TimeoutConfig` for advanced request configuration
- Input validation for model names and messages
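As a rough illustration of how the new builder and configuration types might fit together, here is a minimal sketch. Only the class and method names listed in this changelog (`OllamaChatRepositoryBuilder`, `RetryConfig`, `TimeoutConfig`, `chatResponse()`) come from the source; the specific builder methods, constructor parameters, and message type are assumptions and may not match the actual API.

```dart
import 'package:llm_ollama/llm_ollama.dart';

Future<void> main() async {
  // Hypothetical builder usage; `model(...)`, `retryConfig(...)`,
  // `timeoutConfig(...)`, and `build()` are assumed method names.
  final repository = OllamaChatRepositoryBuilder()
      .model('llama3.2')
      .retryConfig(RetryConfig(maxAttempts: 3))
      .timeoutConfig(TimeoutConfig(requestTimeout: Duration(seconds: 60)))
      .build();

  // `chatResponse()` returns a complete, non-streaming response
  // (argument shape here is a guess).
  final response = await repository.chatResponse([
    ChatMessage.user('Why is the sky blue?'),
  ]);
  print(response);
}
```

Consult the package's API reference for the actual builder methods and parameter names before relying on this shape.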
### Changed
- `streamChat()` now accepts an optional `StreamChatOptions` parameter
- Improved error handling and retry logic
- Enhanced documentation
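Since `streamChat()` now takes an optional `StreamChatOptions`, existing call sites should keep working unchanged, while new code can opt in. A hedged sketch, assuming `streamChat()` returns a `Stream` of text chunks and that the parameter is named `options` (neither detail is stated in the changelog):

```dart
import 'package:llm_ollama/llm_ollama.dart';

Future<void> chat(OllamaChatRepository repository) async {
  // Existing call sites: no options argument, behavior unchanged.
  // New call sites: pass StreamChatOptions (field names here are
  // hypothetical) to tune the streaming request.
  final stream = repository.streamChat(
    [ChatMessage.user('Summarize Keep a Changelog in one sentence.')],
    options: StreamChatOptions(),
  );

  await for (final chunk in stream) {
    // Assumes each emitted chunk is printable text.
    print(chunk);
  }
}
```

Because the parameter is optional, this change should be backward compatible, consistent with the patch-level version bump under Semantic Versioning.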