Access Ollama API from Dart
Ollama for Dart #

This library provides an interface for interacting with Ollama, a tool that allows you to run LLMs (Large Language Models) locally. With this library, you can create an Ollama instance and use it to generate responses.

Usage #

If you want to generate a response from a model, use the ask method. It takes a prompt and a model name, and returns a CompletionChunk object.

```dart
import 'package:ollama/ollama.dart';

void main() async {
  // Create an Ollama instance
  final ollama = Ollama();

  // Generate a response from a model
  final response = await ollama.ask('Tell me about llamas', model: 'llama2');

  // Print the response
  print(response.text);
}
```

Streamed responses can be generated using the generate method. This method takes a prompt and a model name, and returns a Stream&lt;CompletionChunk&gt;.

```dart
import 'dart:io'; // for stdout

import 'package:ollama/ollama.dart';

void main() async {
  // Create an Ollama instance
  final ollama = Ollama();

  // Generate a streamed response from a model
  final response = ollama.generate('Tell me about llamas', model: 'llama2');

  // Print each chunk as it arrives
  await for (final chunk in response) {
    stdout.write(chunk.text);
  }
}
```
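Both examples above assume an Ollama server listening on its default local address (Ollama's standard port is 11434). If your server runs on a different host or port, the client presumably needs to be pointed at that endpoint; the `baseUrl` parameter in the sketch below is an assumption for illustration and may differ from the package's actual constructor signature, so check the API docs before relying on it.

```dart
import 'package:ollama/ollama.dart';

void main() async {
  // Hypothetical: point the client at a non-default Ollama endpoint.
  // The `baseUrl` named parameter is an assumption, not a confirmed
  // part of this package's API — verify against the package docs.
  final ollama = Ollama(baseUrl: Uri.parse('http://192.168.1.10:11434'));

  // The rest of the usage is unchanged from the examples above.
  final response = await ollama.ask('Tell me about llamas', model: 'llama2');
  print(response.text);
}
```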