llm_llamacpp 0.1.0

llama.cpp backend implementation for LLM interactions. Enables local on-device inference with GGUF models on Android, iOS, macOS, Windows, and Linux.

Use this package as a library

Depend on it

Run this command:

With Flutter:

 $ flutter pub add llm_llamacpp

This will add a line like this to your package's pubspec.yaml (and run an implicit flutter pub get):

dependencies:
  llm_llamacpp: ^0.1.0

Alternatively, your editor might support flutter pub get. Check the docs for your editor to learn more.

Import it

Now in your Dart code, you can use:

import 'package:llm_llamacpp/llm_llamacpp.dart';
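
As a sketch of what local inference might look like with the package imported, consider the following. Note that every identifier below other than the import (`LlamaCpp`, `loadModel`, `generate`, and the model path) is a hypothetical illustration, not the package's confirmed API; consult the API reference for the real surface.

```dart
import 'package:llm_llamacpp/llm_llamacpp.dart';

Future<void> main() async {
  // Hypothetical backend handle — name assumed for illustration.
  final backend = LlamaCpp();

  // Load a GGUF model from local storage (example path, not bundled
  // with the package). Inference runs fully on-device; no network call.
  final model = await backend.loadModel('models/llama-3.2-1b-q4_k_m.gguf');

  // Run a single completion — method name assumed for illustration.
  final reply = await model.generate('Explain Dart FFI in one sentence.');
  print(reply);
}
```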

Repository (GitHub)
View/report issues
Contributing

Topics

#llamacpp #llama #llm #flutter #ffi

Documentation

API reference

License

MIT

Dependencies

code_assets, ffi, flutter, hooks, http, llm_core, logging, path

More

Packages that depend on llm_llamacpp

Packages that implement llm_llamacpp