llm_llamacpp 0.1.0
llama.cpp backend implementation for LLM interactions. Enables local on-device inference with GGUF models on Android, iOS, macOS, Windows, and Linux.
Use this package as a library
Depend on it
Run this command:
With Flutter:
$ flutter pub add llm_llamacpp

This will add a line like this to your package's pubspec.yaml (and run an implicit flutter pub get):
dependencies:
  llm_llamacpp: ^0.1.0

Alternatively, your editor might support flutter pub get. Check the docs for your editor to learn more.
Import it
Now in your Dart code, you can use:
import 'package:llm_llamacpp/llm_llamacpp.dart';