tflite_plus 1.0.0
A comprehensive Flutter plugin for Google AI's LiteRT (TensorFlow Lite) with advanced machine learning capabilities for both Android and iOS platforms.
TensorFlow Lite Plus #
A comprehensive Flutter plugin for TensorFlow Lite with advanced ML capabilities
Bring the power of AI to your Flutter apps with ease
Table of Contents #
- Features
- Quick Start
- Installation
- Platform Setup
- Available Functions
- Usage Examples
- Parameter Tables
- Advanced Configuration
- Performance Tips
- Troubleshooting
- Complete Examples
- Contributing
- Support
- License
Features #
| Feature | Description |
|---|---|
| Image Classification | Classify images using pre-trained or custom models |
| Object Detection | Detect and locate objects with bounding boxes |
| Pose Estimation | Detect human poses and keypoints using PoseNet |
| Semantic Segmentation | Pixel-level image segmentation |
| Hardware Acceleration | GPU, NNAPI, Metal, and CoreML delegate support |
| Cross-Platform | Works seamlessly on Android and iOS |
| Flexible Input | Support for file paths and binary data |
| Asynchronous | Non-blocking inference with async/await |
Quick Start #
import 'package:tflite_plus/tflite_plus.dart';
// 1. Load your model
await TflitePlus.loadModel(
model: 'assets/models/mobilenet.tflite',
labels: 'assets/models/labels.txt',
);
// 2. Run inference
final results = await TflitePlus.runModelOnImage(
path: imagePath,
numResults: 5,
threshold: 0.1,
);
// 3. Use results
print('Prediction: ${results?[0]['label']}');
print('Confidence: ${results?[0]['confidence']}');
Installation #
1. Add Dependency #
dependencies:
tflite_plus: ^1.0.0
2. Install #
flutter pub get
3. Import #
import 'package:tflite_plus/tflite_plus.dart';
Platform Setup #
Android Configuration #
Add to android/app/build.gradle:
android {
defaultConfig {
minSdkVersion 21
}
}
iOS Configuration #
Add to ios/Runner/Info.plist:
<key>NSCameraUsageDescription</key>
<string>This app needs camera access for ML inference.</string>
<key>NSPhotoLibraryUsageDescription</key>
<string>This app needs photo library access for ML inference.</string>
Update ios/Podfile:
platform :ios, '12.0'
Available Functions #
Core Functions #
| Function | Description | Return Type |
|---|---|---|
| `loadModel()` | Load a TensorFlow Lite model | `Future<String?>` |
| `runModelOnImage()` | Run inference on an image file | `Future<List<dynamic>?>` |
| `runModelOnBinary()` | Run inference on binary data | `Future<List<dynamic>?>` |
| `detectObjectOnImage()` | Detect objects in an image | `Future<List<dynamic>?>` |
| `detectObjectOnBinary()` | Detect objects in binary data | `Future<List<dynamic>?>` |
| `runPoseNetOnImage()` | Detect poses in an image | `Future<List<dynamic>?>` |
| `runSegmentationOnImage()` | Perform segmentation | `Future<dynamic>` |
| `close()` | Release model resources | `Future<void>` |
Utility Functions #
| Function | Description | Return Type |
|---|---|---|
| `isModelLoaded()` | Check if a model is loaded | `Future<bool>` |
| `getModelInputShape()` | Get model input dimensions | `Future<List<int>?>` |
| `getModelOutputShape()` | Get model output dimensions | `Future<List<int>?>` |
| `getAvailableDelegates()` | Get available hardware delegates | `Future<List<String>?>` |
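The utility functions can be combined to inspect a model before running inference. A minimal sketch (the model path and printed shapes are illustrative):
// Inspect a loaded model before running inference
await TflitePlus.loadModel(model: 'assets/models/mobilenet.tflite');
if (await TflitePlus.isModelLoaded()) {
  final inputShape = await TflitePlus.getModelInputShape();
  final outputShape = await TflitePlus.getModelOutputShape();
  final delegates = await TflitePlus.getAvailableDelegates();
  print('Input shape: $inputShape');   // e.g. [1, 224, 224, 3] for MobileNet
  print('Output shape: $outputShape');
  print('Available delegates: $delegates');
}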
Usage Examples #
1. Image Classification #
// Load classification model
await TflitePlus.loadModel(
model: 'assets/models/mobilenet_v1_1.0_224.tflite',
labels: 'assets/models/mobilenet_v1_1.0_224_labels.txt',
numThreads: 1,
useGpuDelegate: true,
);
// Classify image
final results = await TflitePlus.runModelOnImage(
path: imagePath,
numResults: 5,
threshold: 0.1,
imageMean: 117.0,
imageStd: 1.0,
);
// Process results
for (var result in results ?? []) {
print('${result['label']}: ${result['confidence']}');
}
2. Object Detection #
// Load detection model
await TflitePlus.loadModel(
model: 'assets/models/ssd_mobilenet.tflite',
labels: 'assets/models/ssd_mobilenet_labels.txt',
useGpuDelegate: true,
);
// Detect objects
final detections = await TflitePlus.detectObjectOnImage(
path: imagePath,
numResultsPerClass: 5,
threshold: 0.3,
imageMean: 127.5,
imageStd: 127.5,
);
// Process detections
for (var detection in detections ?? []) {
final rect = detection['rect'];
print('Found ${detection['label']} at (${rect['x']}, ${rect['y']})');
}
3. Pose Estimation #
// Load pose model
await TflitePlus.loadModel(
model: 'assets/models/posenet.tflite',
useGpuDelegate: true,
);
// Detect poses
final poses = await TflitePlus.runPoseNetOnImage(
path: imagePath,
numResults: 5,
threshold: 0.1,
imageMean: 127.5,
imageStd: 127.5,
);
// Process keypoints
for (var pose in poses ?? []) {
for (var keypoint in pose['keypoints']) {
print('${keypoint['part']}: (${keypoint['x']}, ${keypoint['y']})');
}
}
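4. Semantic Segmentation #
The plugin also exposes runSegmentationOnImage() (see Available Functions), but no official example is shown above. The snippet below is a minimal sketch that assumes the parameters mirror the other run* calls and that the model path is illustrative; check the API reference for the exact signature and output format.
// Load a segmentation model (path is illustrative)
await TflitePlus.loadModel(
  model: 'assets/models/deeplabv3.tflite',
  useGpuDelegate: true,
);
// Run segmentation (parameter names assumed to match the other run* methods)
final segmentation = await TflitePlus.runSegmentationOnImage(
  path: imagePath,
  imageMean: 127.5,
  imageStd: 127.5,
);
// Inspect the returned structure (typically a per-pixel class map or an encoded mask)
print(segmentation);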
Parameter Tables #
loadModel Parameters #
| Parameter | Type | Default | Description |
|---|---|---|---|
| `model` | `String` | Required | Path to the `.tflite` model file |
| `labels` | `String?` | `null` | Path to the labels file |
| `numThreads` | `int?` | `1` | Number of CPU threads |
| `useGpuDelegate` | `bool?` | `false` | Enable GPU acceleration |
| `useNnApiDelegate` | `bool?` | `false` | Enable NNAPI (Android) / CoreML (iOS) |
runModelOnImage Parameters #
| Parameter | Type | Default | Description |
|---|---|---|---|
| `path` | `String` | Required | Image file path |
| `numResults` | `int?` | `5` | Maximum results to return |
| `threshold` | `double?` | `0.1` | Confidence threshold |
| `imageMean` | `double?` | `117.0` | Image normalization mean |
| `imageStd` | `double?` | `1.0` | Image normalization std |
| `asynch` | `bool?` | `true` | Run asynchronously |
detectObjectOnImage Parameters #
| Parameter | Type | Default | Description |
|---|---|---|---|
| `path` | `String` | Required | Image file path |
| `numResultsPerClass` | `int?` | `5` | Max results per class |
| `threshold` | `double?` | `0.1` | Detection threshold |
| `imageMean` | `double?` | `127.5` | Image normalization mean |
| `imageStd` | `double?` | `127.5` | Image normalization std |
| `asynch` | `bool?` | `true` | Run asynchronously |
runPoseNetOnImage Parameters #
| Parameter | Type | Default | Description |
|---|---|---|---|
| `path` | `String` | Required | Image file path |
| `numResults` | `int?` | `5` | Maximum poses to detect |
| `threshold` | `double?` | `0.1` | Keypoint threshold |
| `imageMean` | `double?` | `127.5` | Image normalization mean |
| `imageStd` | `double?` | `127.5` | Image normalization std |
| `asynch` | `bool?` | `true` | Run asynchronously (see the sketch below) |
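Every inference method above accepts the asynch flag from these tables. A minimal sketch of turning it off; the behaviour beyond "run synchronously" is not documented here, so treat this as illustrative:
// Run classification with asynch disabled (the default is true)
final syncResults = await TflitePlus.runModelOnImage(
  path: imagePath,
  numResults: 3,
  threshold: 0.2,
  asynch: false,
);
print(syncResults);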
Advanced Configuration #
GPU Acceleration #
// Enable GPU acceleration
await TflitePlus.loadModel(
model: 'assets/models/model.tflite',
useGpuDelegate: true, // Android: GPU, iOS: Metal
numThreads: 1,
);
// Check GPU availability
final delegates = await TflitePlus.getAvailableDelegates();
final hasGpu = delegates?.contains('GPU') ?? false;
NNAPI/CoreML Acceleration #
// Enable NNAPI (Android) / CoreML (iOS)
await TflitePlus.loadModel(
model: 'assets/models/model.tflite',
useNnApiDelegate: true,
numThreads: 1,
);
Thread Configuration #
import 'dart:io';
import 'dart:math' as math;

// Optimize for different devices
final numCores = Platform.numberOfProcessors;
await TflitePlus.loadModel(
  model: 'assets/models/model.tflite',
  numThreads: math.min(numCores, 4), // Use up to 4 threads
);
Binary Data Processing #
// Process image bytes directly
final imageBytes = await file.readAsBytes();
final results = await TflitePlus.runModelOnBinary(
bytesList: imageBytes,
imageHeight: 224,
imageWidth: 224,
numResults: 5,
threshold: 0.1,
);
Performance Tips #
Model Optimization #
# Optimize your TensorFlow Lite model
import tensorflow as tf
converter = tf.lite.TFLiteConverter.from_saved_model('model')
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.target_spec.supported_types = [tf.float16]
tflite_model = converter.convert()
Best Practices #
- Use GPU Acceleration: 2-4x faster inference on supported devices
- Quantize Models: Reduce size and improve speed
- Batch Processing: Process multiple images together
- Image Preprocessing: Resize images to the model input size before inference (see the sketch below)
- Resource Management: Always call close() when done
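A minimal preprocessing sketch, assuming the popular package:image package is added to your pubspec (it is not bundled with tflite_plus), for resizing an image to the model input size before inference:
import 'dart:io';
import 'package:image/image.dart' as img;

// Resize an image file to the model input size (e.g. 224x224) before inference
Future<File> resizeForModel(String path, {int size = 224}) async {
  final original = img.decodeImage(await File(path).readAsBytes())!;
  final resized = img.copyResize(original, width: size, height: size);
  final resizedFile = File('${path}_resized.jpg');
  await resizedFile.writeAsBytes(img.encodeJpg(resized));
  return resizedFile;
}
The resized file's path can then be passed to runModelOnImage() as usual.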
Performance Benchmarks #
| Device | Model | CPU (ms) | GPU (ms) | Speedup |
|---|---|---|---|---|
| Pixel 6 | MobileNet | 45 | 12 | 3.75x |
| iPhone 13 | MobileNet | 38 | 8 | 4.75x |
| Galaxy S21 | EfficientNet | 120 | 28 | 4.28x |
Troubleshooting #
Common Issues & Solutions #
Model Loading Fails
// Problem: Model not found
// Solution: Check the assets configuration in pubspec.yaml
flutter:
  assets:
    - assets/models/
GPU Delegate Not Available
// Problem: GPU acceleration fails
// Solution: Check device compatibility
final delegates = await TflitePlus.getAvailableDelegates();
if (delegates?.contains('GPU') == true) {
// GPU available
} else {
// Use CPU fallback
}
Memory Issues
// Problem: Out of memory
// Solution: Resource management
await TflitePlus.close(); // Always clean up
// Process smaller batches
// Use quantized models
// Reduce image resolution
Inference Too Slow
// Problem: Slow inference
// Solution: Optimization strategies
await TflitePlus.loadModel(
model: 'assets/models/model_quantized.tflite', // Use quantized model
useGpuDelegate: true, // Enable GPU
numThreads: 4, // Use multiple threads
);
Error Codes #
| Error | Cause | Solution |
|---|---|---|
| `Model not loaded` | No model loaded | Call `loadModel()` first (see the sketch below) |
| `Invalid image path` | File doesn't exist | Check the file path |
| `GPU delegate failed` | GPU not available | Use CPU fallback |
| `Out of memory` | Insufficient RAM | Use smaller models/images |
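One way to avoid the Model not loaded error is to guard inference behind isModelLoaded(); a minimal sketch (the model and label paths are illustrative):
// Guard inference: load the model on demand if it is not loaded yet
Future<List<dynamic>?> classifySafely(String imagePath) async {
  if (!await TflitePlus.isModelLoaded()) {
    await TflitePlus.loadModel(
      model: 'assets/models/mobilenet.tflite',
      labels: 'assets/models/labels.txt',
    );
  }
  return TflitePlus.runModelOnImage(path: imagePath, numResults: 5);
}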
Complete Examples #
Real-time Camera Classification #
// Requires the camera plugin (package:camera) in pubspec.yaml
import 'package:flutter/material.dart';
import 'package:camera/camera.dart';
import 'package:tflite_plus/tflite_plus.dart';

class CameraClassifier extends StatefulWidget {
@override
_CameraClassifierState createState() => _CameraClassifierState();
}
class _CameraClassifierState extends State<CameraClassifier> {
CameraController? _controller;
List<dynamic>? _results;
bool _isDetecting = false;
@override
void initState() {
super.initState();
_initializeCamera();
_loadModel();
}
Future<void> _loadModel() async {
await TflitePlus.loadModel(
model: 'assets/models/mobilenet.tflite',
labels: 'assets/models/labels.txt',
useGpuDelegate: true,
);
}
Future<void> _initializeCamera() async {
final cameras = await availableCameras();
_controller = CameraController(cameras[0], ResolutionPreset.medium);
await _controller!.initialize();
_controller!.startImageStream((image) {
if (!_isDetecting) {
_isDetecting = true;
_runInference(image);
}
});
setState(() {});
}
Future<void> _runInference(CameraImage image) async {
// Convert the CameraImage to bytes (_imageToByteList is a helper you implement for your camera format; not shown here)
final results = await TflitePlus.runModelOnBinary(
bytesList: _imageToByteList(image),
imageHeight: image.height,
imageWidth: image.width,
);
setState(() {
_results = results;
_isDetecting = false;
});
}
@override
Widget build(BuildContext context) {
if (_controller?.value.isInitialized != true) {
return CircularProgressIndicator();
}
return Scaffold(
body: Stack(
children: [
CameraPreview(_controller!),
Positioned(
bottom: 100,
left: 20,
right: 20,
child: Container(
padding: EdgeInsets.all(16),
decoration: BoxDecoration(
color: Colors.black.withOpacity(0.7),
borderRadius: BorderRadius.circular(8),
),
child: Column(
children: _results?.map((result) => Text(
'${result['label']}: ${(result['confidence'] * 100).toInt()}%',
style: TextStyle(color: Colors.white, fontSize: 16),
)).toList() ?? [Text('No results', style: TextStyle(color: Colors.white))],
),
),
),
],
),
);
}
@override
void dispose() {
_controller?.dispose();
TflitePlus.close();
super.dispose();
}
}
Batch Image Processing #
class BatchProcessor {
static Future<List<Map<String, dynamic>>> processImages(
List<String> imagePaths,
) async {
await TflitePlus.loadModel(
model: 'assets/models/classifier.tflite',
labels: 'assets/models/labels.txt',
useGpuDelegate: true,
);
final results = <Map<String, dynamic>>[];
for (int i = 0; i < imagePaths.length; i++) {
try {
final result = await TflitePlus.runModelOnImage(
path: imagePaths[i],
numResults: 1,
threshold: 0.1,
);
results.add({
'path': imagePaths[i],
'predictions': result,
'status': 'success',
});
// Progress callback
print('Processed ${i + 1}/${imagePaths.length} images');
} catch (e) {
results.add({
'path': imagePaths[i],
'error': e.toString(),
'status': 'error',
});
}
}
await TflitePlus.close();
return results;
}
}
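Calling the batch processor might look like this (the file paths are illustrative):
// Classify a list of images and summarise the outcome
final batchResults = await BatchProcessor.processImages([
  '/storage/photos/cat.jpg',
  '/storage/photos/dog.jpg',
]);
final succeeded = batchResults.where((r) => r['status'] == 'success').length;
print('Classified $succeeded of ${batchResults.length} images');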
Contributing #
We welcome contributions from the community!
Contributors #
Shakil Ahmed (Creator & Maintainer)
Want to see your profile here? Contribute to the project!
How to Contribute #
Quick Start
1. Fork & Clone
   git clone https://github.com/yourusername/tflite_plus.git
   cd tflite_plus
2. Create Branch
   git checkout -b feature/amazing-feature
3. Make Changes
   - Add your awesome code
   - Write tests
   - Update documentation
4. Test Your Changes
   flutter test
   flutter analyze
5. Submit PR
   git push origin feature/amazing-feature
Contribution Types
| Type | Description | Label |
|---|---|---|
| Bug Fix | Fix existing issues | bug |
| Feature | Add new functionality | enhancement |
| Documentation | Improve docs | documentation |
| UI/UX | Design improvements | design |
| Performance | Speed optimizations | performance |
| Tests | Add or improve tests | tests |
Contribution Guidelines
- Code Style: Follow Dart Style Guide
- Testing: Add tests for new features
- Documentation: Update README and code comments
- Commits: Use Conventional Commits
Recognition
Contributors get:
- Profile picture in README
- Contributor badge on GitHub
- Mention in release notes
- Special Discord role (coming soon)
Support #
Get Help & Connect #
Support Channels #
| Channel | Purpose | Response Time |
|---|---|---|
| GitHub Issues | Bug reports, feature requests | 24-48 hours |
| GitHub Discussions | Questions, community help | 1-3 days |
| Email | Private support, partnerships | 2-5 days |
| Website | Documentation, tutorials | Always available |
Before Asking for Help #
- Check Documentation: Read this README thoroughly
- Search Issues: Look for existing solutions
- Provide Details: Include code, error messages, device info
- Minimal Example: Create a minimal reproducible example
License #
MIT License
Copyright (c) 2024 CodeBumble
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
Acknowledgments #
Special Thanks To:
- Google AI Team for TensorFlow Lite
- Flutter Team for the amazing framework
- Open Source Community for continuous support
- All contributors who make this project better
Made with ❤️ by CodeBumble
If this project helped you, please consider giving it a ⭐ on GitHub!