llm_ollama 0.1.0

llm_ollama: ^0.1.0

Ollama backend implementation for LLM interactions. Provides streaming chat, embeddings, tool calling, vision support, and model management.
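The package's own Dart API is not shown on this page, but the streaming-chat feature it wraps is Ollama's `/api/chat` endpoint, which streams one JSON object per line. A minimal sketch of that wire format (default local endpoint assumed; the helper names here are illustrative, not part of the package):

```python
import json

OLLAMA_CHAT_URL = "http://localhost:11434/api/chat"  # Ollama's default local endpoint

def build_chat_request(model, messages, stream=True):
    """Build the JSON body for Ollama's /api/chat endpoint."""
    return json.dumps({"model": model, "messages": messages, "stream": stream})

def collect_stream(ndjson_lines):
    """Ollama streams one JSON object per line; concatenate the content deltas
    until a chunk reports "done": true."""
    parts = []
    for line in ndjson_lines:
        chunk = json.loads(line)
        parts.append(chunk.get("message", {}).get("content", ""))
        if chunk.get("done"):
            break
    return "".join(parts)

# Two chunks in the shape Ollama streams them:
chunks = [
    '{"message": {"role": "assistant", "content": "Hel"}, "done": false}',
    '{"message": {"role": "assistant", "content": "lo"}, "done": true}',
]
print(collect_stream(chunks))  # -> Hello
```

A consumer of the package would presumably see this stream surfaced as a Dart `Stream` of message deltas rather than raw NDJSON.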

Changelog

All notable changes to this project will be documented in this file.

The format is based on Keep a Changelog, and this project adheres to Semantic Versioning.

0.1.0 - 2025-01-19

Added

  • Initial release
  • Ollama backend implementation for LLM interactions:
    • Streaming chat responses
    • Tool/function calling support
    • Vision (image) support
    • Embeddings
    • Thinking mode support
    • Model management (list, pull, show, version)
  • Full compatibility with Ollama API
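For the tool/function-calling feature listed above, Ollama's `/api/chat` accepts OpenAI-style function schemas under a `tools` key. A hedged sketch of the request shape (the `get_weather` tool and model name are hypothetical examples, not part of this package):

```python
import json

def build_tool_call_request(model, messages, tools):
    """Build a non-streaming /api/chat body with tool definitions attached."""
    return {"model": model, "messages": messages, "tools": tools, "stream": False}

# Hypothetical tool definition; the schema shape follows Ollama's API.
weather_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}

req = build_tool_call_request(
    "llama3.2",  # assumed model name for illustration
    [{"role": "user", "content": "Weather in Oslo?"}],
    [weather_tool],
)
print(req["tools"][0]["function"]["name"])  # -> get_weather
```

When the model decides to call a tool, the response's `message` carries a `tool_calls` array instead of plain content; the package presumably maps that back to Dart callbacks.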


Topics

#ollama #llm #ai #chat #embeddings

License

unknown

Dependencies

http, llm_core
