llm_ollama 0.1.5

Ollama backend implementation for LLM interactions. Provides streaming chat, embeddings, tool calling, vision support, and model management.

Changelog #

All notable changes to this project will be documented in this file.

The format is based on Keep a Changelog, and this project adheres to Semantic Versioning.

[Unreleased] #

0.1.5 - 2026-01-26 #

Added #

  • Builder pattern for OllamaChatRepository via OllamaChatRepositoryBuilder for complex configurations
  • Support for StreamChatOptions in streamChat() method
  • Support for a chatResponse() method that returns a complete, non-streaming response
  • Support for RetryConfig and TimeoutConfig for advanced request configuration
  • Input validation for model names and messages
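A minimal sketch of what configuring a repository through the new builder might look like. Only the type names (OllamaChatRepositoryBuilder, RetryConfig, TimeoutConfig) come from this changelog; the setter names, constructor parameters, and base URL shown here are assumptions for illustration, not the package's confirmed API.

```dart
// Hypothetical usage sketch; setter and parameter names are assumptions.
import 'package:llm_ollama/llm_ollama.dart';

void main() {
  final repository = OllamaChatRepositoryBuilder()
      // Illustrative configuration calls; the real builder API may differ.
      .baseUrl('http://localhost:11434')
      .retryConfig(RetryConfig(maxRetries: 3))
      .timeoutConfig(TimeoutConfig(requestTimeout: Duration(seconds: 60)))
      .build();
}
```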

Changed #

  • streamChat() now accepts optional StreamChatOptions parameter
  • Improved error handling and retry logic
  • Enhanced documentation
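The changed streamChat() signature and the new chatResponse() method might be used side by side as sketched below. The StreamChatOptions fields, the Message type (presumably from llm_core), and the chunk handling are all assumptions; only the method and option type names appear in this changelog.

```dart
// Hypothetical sketch of streaming vs. non-streaming calls;
// field names and message types are assumptions.
import 'package:llm_ollama/llm_ollama.dart';

Future<void> chat(OllamaChatRepository repo, List<Message> messages) async {
  // Streaming: consume partial responses as they arrive, with the
  // new optional StreamChatOptions parameter.
  await for (final chunk in repo.streamChat(
    messages,
    options: StreamChatOptions(temperature: 0.7),
  )) {
    print(chunk);
  }

  // Non-streaming: await one complete response.
  final response = await repo.chatResponse(messages);
  print(response);
}
```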

0.1.0 - 2026-01-19 #

Added #

  • Initial release
  • Ollama backend implementation for LLM interactions:
    • Streaming chat responses
    • Tool/function calling support
    • Vision (image) support
    • Embeddings
    • Thinking mode support
    • Model management (list, pull, show, version)
  • Full compatibility with Ollama API
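As a rough sketch of the initial-release surface, embeddings and model management might be exercised as below. Every identifier here except the package name is hypothetical; the changelog only states that listing, pulling, showing, version queries, and embeddings are supported, not how they are named.

```dart
// Hypothetical sketch; repository type and method names are assumptions.
import 'package:llm_ollama/llm_ollama.dart';

Future<void> explore(OllamaRepository ollama) async {
  final version = await ollama.version();   // model management: server version
  final models = await ollama.listModels(); // model management: list
  final vector = await ollama.embed('hello world'); // embeddings
  print('Ollama $version: ${models.length} models, '
      '${vector.length}-dimensional embedding');
}
```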


Topics: #ollama #llm #ai #chat #embeddings

License: unknown

Dependencies: http, llm_core
