tflite_plus 1.0.0

A comprehensive Flutter plugin for Google AI's LiteRT (TensorFlow Lite) with advanced machine learning capabilities for both Android and iOS platforms.

🔥 TensorFlow Lite Plus #

A comprehensive Flutter plugin for TensorFlow Lite with advanced ML capabilities


Bring the power of AI to your Flutter apps with ease 🚀


📋 Table of Contents #

  • ✨ Features
  • 🚀 Quick Start
  • 📦 Installation
  • ⚙️ Platform Setup
  • 📚 Available Functions
  • 🎯 Usage Examples
  • 📊 Parameter Tables
  • 🔧 Advanced Configuration
  • ⚡ Performance Tips
  • 🛠️ Troubleshooting
  • 🧪 Complete Examples
  • 🤝 Contributing
  • 💬 Support
  • 📄 License
  • 🎉 Acknowledgments

✨ Features #

  • 🔥 Image Classification: Classify images using pre-trained or custom models
  • 🎯 Object Detection: Detect and locate objects with bounding boxes
  • 🏃 Pose Estimation: Detect human poses and keypoints using PoseNet
  • 🎨 Semantic Segmentation: Pixel-level image segmentation
  • ⚡ Hardware Acceleration: GPU, NNAPI, Metal, and CoreML delegate support
  • 📱 Cross-Platform: Works seamlessly on Android and iOS
  • 🔧 Flexible Input: Support for file paths and binary data
  • 🚀 Asynchronous: Non-blocking inference with async/await

🚀 Quick Start #

import 'package:tflite_plus/tflite_plus.dart';

// 1. Load your model
await TflitePlus.loadModel(
  model: 'assets/models/mobilenet.tflite',
  labels: 'assets/models/labels.txt',
);

// 2. Run inference
final results = await TflitePlus.runModelOnImage(
  path: imagePath,
  numResults: 5,
  threshold: 0.1,
);

// 3. Use results
print('Prediction: ${results?[0]['label']}');
print('Confidence: ${results?[0]['confidence']}');

📦 Installation #

1. Add Dependency #

dependencies:
  tflite_plus: ^1.0.0

2. Install #

flutter pub get

3. Import #

import 'package:tflite_plus/tflite_plus.dart';

⚙️ Platform Setup #

Android Configuration #

Add to android/app/build.gradle:

android {
    defaultConfig {
        minSdkVersion 21
    }
}

iOS Configuration #

Add to ios/Runner/Info.plist:

<key>NSCameraUsageDescription</key>
<string>This app needs camera access for ML inference.</string>
<key>NSPhotoLibraryUsageDescription</key>
<string>This app needs photo library access for ML inference.</string>

Update ios/Podfile:

platform :ios, '12.0'

📚 Available Functions #

Core Functions #

Function | Description | Return Type
loadModel() | Load TensorFlow Lite model | Future<String?>
runModelOnImage() | Run inference on image file | Future<List<dynamic>?>
runModelOnBinary() | Run inference on binary data | Future<List<dynamic>?>
detectObjectOnImage() | Detect objects in image | Future<List<dynamic>?>
detectObjectOnBinary() | Detect objects in binary data | Future<List<dynamic>?>
runPoseNetOnImage() | Detect poses in image | Future<List<dynamic>?>
runSegmentationOnImage() | Perform segmentation | Future<dynamic>
close() | Release model resources | Future<void>

Utility Functions #

Function | Description | Return Type
isModelLoaded() | Check if model is loaded | Future<bool>
getModelInputShape() | Get model input dimensions | Future<List<int>?>
getModelOutputShape() | Get model output dimensions | Future<List<int>?>
getAvailableDelegates() | Get available hardware delegates | Future<List<String>?>
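
These utilities are handy for sanity checks before running inference. A minimal sketch (the asset paths are examples and the values in comments are illustrative):

// Inspect the loaded model and the available hardware delegates
Future<void> inspectModel() async {
  await TflitePlus.loadModel(
    model: 'assets/models/mobilenet.tflite',
    labels: 'assets/models/labels.txt',
  );

  final loaded = await TflitePlus.isModelLoaded();
  final inputShape = await TflitePlus.getModelInputShape();   // e.g. [1, 224, 224, 3]
  final outputShape = await TflitePlus.getModelOutputShape(); // e.g. [1, 1001]
  final delegates = await TflitePlus.getAvailableDelegates(); // e.g. ['GPU', ...]

  print('Loaded: $loaded, input: $inputShape, output: $outputShape');
  print('Delegates: $delegates');
}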

🎯 Usage Examples #

1. Image Classification #

// Load classification model
await TflitePlus.loadModel(
  model: 'assets/models/mobilenet_v1_1.0_224.tflite',
  labels: 'assets/models/mobilenet_v1_1.0_224_labels.txt',
  numThreads: 1,
  useGpuDelegate: true,
);

// Classify image
final results = await TflitePlus.runModelOnImage(
  path: imagePath,
  numResults: 5,
  threshold: 0.1,
  imageMean: 117.0,
  imageStd: 1.0,
);

// Process results
for (var result in results ?? []) {
  print('${result['label']}: ${result['confidence']}');
}

2. Object Detection #

// Load detection model
await TflitePlus.loadModel(
  model: 'assets/models/ssd_mobilenet.tflite',
  labels: 'assets/models/ssd_mobilenet_labels.txt',
  useGpuDelegate: true,
);

// Detect objects
final detections = await TflitePlus.detectObjectOnImage(
  path: imagePath,
  numResultsPerClass: 5,
  threshold: 0.3,
  imageMean: 127.5,
  imageStd: 127.5,
);

// Process detections
for (var detection in detections ?? []) {
  final rect = detection['rect'];
  print('Found ${detection['label']} at (${rect['x']}, ${rect['y']})');
}

3. Pose Estimation #

// Load pose model
await TflitePlus.loadModel(
  model: 'assets/models/posenet.tflite',
  useGpuDelegate: true,
);

// Detect poses
final poses = await TflitePlus.runPoseNetOnImage(
  path: imagePath,
  numResults: 5,
  threshold: 0.1,
  imageMean: 127.5,
  imageStd: 127.5,
);

// Process keypoints
for (var pose in poses ?? []) {
  for (var keypoint in pose['keypoints']) {
    print('${keypoint['part']}: (${keypoint['x']}, ${keypoint['y']})');
  }
}
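
4. Semantic Segmentation #

runSegmentationOnImage() is listed above, but its parameters are not documented in this README. The sketch below assumes it follows the same path/imageMean/imageStd pattern as the other image functions; treat the parameter names as assumptions and check the API reference before relying on them.

// Load a segmentation model (file names are placeholders)
await TflitePlus.loadModel(
  model: 'assets/models/deeplabv3.tflite',
  labels: 'assets/models/deeplabv3_labels.txt',
);

// Run segmentation (parameter names assumed, see note above)
final segmentation = await TflitePlus.runSegmentationOnImage(
  path: imagePath,
  imageMean: 127.5,
  imageStd: 127.5,
);

// The return type is Future<dynamic>, so inspect the result at runtime
print(segmentation.runtimeType);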

📊 Parameter Tables #

loadModel Parameters #

Parameter | Type | Default | Description
model | String | Required | Path to .tflite model file
labels | String? | null | Path to labels file
numThreads | int? | 1 | Number of CPU threads
useGpuDelegate | bool? | false | Enable GPU acceleration
useNnApiDelegate | bool? | false | Enable NNAPI (Android) / CoreML (iOS)

runModelOnImage Parameters #

Parameter | Type | Default | Description
path | String | Required | Image file path
numResults | int? | 5 | Maximum results to return
threshold | double? | 0.1 | Confidence threshold
imageMean | double? | 117.0 | Image normalization mean
imageStd | double? | 1.0 | Image normalization std
asynch | bool? | true | Run asynchronously

detectObjectOnImage Parameters #

Parameter | Type | Default | Description
path | String | Required | Image file path
numResultsPerClass | int? | 5 | Max results per class
threshold | double? | 0.1 | Detection threshold
imageMean | double? | 127.5 | Image normalization mean
imageStd | double? | 127.5 | Image normalization std
asynch | bool? | true | Run asynchronously

runPoseNetOnImage Parameters #

Parameter | Type | Default | Description
path | String | Required | Image file path
numResults | int? | 5 | Maximum poses to detect
threshold | double? | 0.1 | Keypoint threshold
imageMean | double? | 127.5 | Image normalization mean
imageStd | double? | 127.5 | Image normalization std
asynch | bool? | true | Run asynchronously

🔧 Advanced Configuration #

GPU Acceleration #

// Enable GPU acceleration
await TflitePlus.loadModel(
  model: 'assets/models/model.tflite',
  useGpuDelegate: true,  // Android: GPU, iOS: Metal
  numThreads: 1,
);

// Check GPU availability
final delegates = await TflitePlus.getAvailableDelegates();
final hasGpu = delegates?.contains('GPU') ?? false;

NNAPI/CoreML Acceleration #

// Enable NNAPI (Android) / CoreML (iOS)
await TflitePlus.loadModel(
  model: 'assets/models/model.tflite',
  useNnApiDelegate: true,
  numThreads: 1,
);

Thread Configuration #

// Optimize for different devices
// (requires: import 'dart:io'; and import 'dart:math' as math;)
final numCores = Platform.numberOfProcessors;
await TflitePlus.loadModel(
  model: 'assets/models/model.tflite',
  numThreads: math.min(numCores, 4), // Use up to 4 threads
);

Binary Data Processing #

// Process image bytes directly
final imageBytes = await file.readAsBytes();
final results = await TflitePlus.runModelOnBinary(
  bytesList: imageBytes,
  imageHeight: 224,
  imageWidth: 224,
  numResults: 5,
  threshold: 0.1,
);
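
detectObjectOnBinary() follows the same pattern for object detection on raw bytes. The sketch below assumes it combines the bytesList/imageHeight/imageWidth parameters above with the detection parameters of detectObjectOnImage; verify the names against the API reference.

// Object detection on raw bytes (parameter names assumed, see note above)
final detections = await TflitePlus.detectObjectOnBinary(
  bytesList: imageBytes,   // reusing the bytes read above
  imageHeight: 300,        // example input size for an SSD-style model
  imageWidth: 300,
  numResultsPerClass: 5,
  threshold: 0.3,
);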

⚡ Performance Tips #

🎯 Model Optimization #

# Optimize your TensorFlow Lite model
import tensorflow as tf

converter = tf.lite.TFLiteConverter.from_saved_model('model')
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.target_spec.supported_types = [tf.float16]
tflite_model = converter.convert()

# Write the converted model so it can be bundled as a Flutter asset
with open('model_quantized.tflite', 'wb') as f:
    f.write(tflite_model)

📱 Best Practices #

  1. Use GPU Acceleration: 2-4x faster inference on supported devices
  2. Quantize Models: Reduce size and improve speed
  3. Batch Processing: Process multiple images together
  4. Image Preprocessing: Resize to model input size
  5. Resource Management: Always call close() when done (see the sketch below)
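
Putting practices 4 and 5 together: a minimal sketch that resizes an image to the model's input size with the image package (which this plugin already depends on) and releases the interpreter in a finally block. The resize-to-temp-file flow is an assumption for illustration, not part of the plugin API.

import 'dart:io';
import 'package:image/image.dart' as img;
import 'package:tflite_plus/tflite_plus.dart';

Future<List<dynamic>?> classifyResized(String path) async {
  await TflitePlus.loadModel(
    model: 'assets/models/mobilenet.tflite',
    labels: 'assets/models/labels.txt',
  );
  try {
    // Resize to the model's expected input size (falls back to 224)
    final shape = await TflitePlus.getModelInputShape(); // e.g. [1, 224, 224, 3]
    final size = (shape != null && shape.length >= 3) ? shape[1] : 224;

    final original = img.decodeImage(await File(path).readAsBytes())!;
    final resized = img.copyResize(original, width: size, height: size);
    final tmp = File('${Directory.systemTemp.path}/resized.jpg')
      ..writeAsBytesSync(img.encodeJpg(resized));

    return await TflitePlus.runModelOnImage(path: tmp.path, numResults: 5);
  } finally {
    await TflitePlus.close(); // always release model resources
  }
}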

⚙️ Performance Benchmarks #

Device | Model | CPU (ms) | GPU (ms) | Speedup
Pixel 6 | MobileNet | 45 | 12 | 3.75x
iPhone 13 | MobileNet | 38 | 8 | 4.75x
Galaxy S21 | EfficientNet | 120 | 28 | 4.28x

🛠️ Troubleshooting #

Common Issues & Solutions #

Model Loading Fails

# ❌ Problem: Model not found
# ✅ Solution: Declare the model directory in pubspec.yaml
flutter:
  assets:
    - assets/models/

GPU Delegate Not Available

// ❌ Problem: GPU acceleration fails
// ✅ Solution: Check device compatibility
final delegates = await TflitePlus.getAvailableDelegates();
if (delegates?.contains('GPU') == true) {
  // GPU available
} else {
  // Use CPU fallback
}

Memory Issues

// ❌ Problem: Out of memory
// ✅ Solution: Resource management
await TflitePlus.close(); // Always clean up

// Process smaller batches
// Use quantized models
// Reduce image resolution

Inference Too Slow

// ❌ Problem: Slow inference
// ✅ Solution: Optimization strategies
await TflitePlus.loadModel(
  model: 'assets/models/model_quantized.tflite', // Use quantized model
  useGpuDelegate: true,      // Enable GPU
  numThreads: 4,            // Use multiple threads
);

Error Codes #

Error | Cause | Solution
Model not loaded | No model loaded | Call loadModel() first
Invalid image path | File doesn't exist | Check file path
GPU delegate failed | GPU not available | Use CPU fallback
Out of memory | Insufficient RAM | Use smaller models/images
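
Most of these failures can be handled defensively with the functions documented above; a minimal sketch:

import 'dart:io';
import 'package:tflite_plus/tflite_plus.dart';

Future<List<dynamic>?> safeClassify(String imagePath) async {
  // "Model not loaded": make sure loadModel() has been called
  if (!(await TflitePlus.isModelLoaded())) {
    await TflitePlus.loadModel(
      model: 'assets/models/mobilenet.tflite',
      labels: 'assets/models/labels.txt',
    );
  }

  // "Invalid image path": verify the file exists before inference
  if (!File(imagePath).existsSync()) {
    throw ArgumentError('Image not found: $imagePath');
  }

  try {
    return await TflitePlus.runModelOnImage(path: imagePath, numResults: 5);
  } catch (e) {
    // "GPU delegate failed" / "Out of memory": log and let the caller decide
    print('Inference failed: $e');
    rethrow;
  }
}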

🧪 Complete Examples #

Real-time Camera Classification #

class CameraClassifier extends StatefulWidget {
  @override
  _CameraClassifierState createState() => _CameraClassifierState();
}

class _CameraClassifierState extends State<CameraClassifier> {
  CameraController? _controller;
  List<dynamic>? _results;
  bool _isDetecting = false;

  @override
  void initState() {
    super.initState();
    _initializeCamera();
    _loadModel();
  }

  Future<void> _loadModel() async {
    await TflitePlus.loadModel(
      model: 'assets/models/mobilenet.tflite',
      labels: 'assets/models/labels.txt',
      useGpuDelegate: true,
    );
  }

  Future<void> _initializeCamera() async {
    final cameras = await availableCameras();
    _controller = CameraController(cameras[0], ResolutionPreset.medium);
    await _controller!.initialize();
    
    _controller!.startImageStream((image) {
      if (!_isDetecting) {
        _isDetecting = true;
        _runInference(image);
      }
    });
    
    setState(() {});
  }

  Future<void> _runInference(CameraImage image) async {
    // _imageToByteList() is a helper you implement to convert the
    // CameraImage frame into the byte layout your model expects
    final results = await TflitePlus.runModelOnBinary(
      bytesList: _imageToByteList(image),
      imageHeight: image.height,
      imageWidth: image.width,
    );
    
    if (!mounted) return;
    setState(() {
      _results = results;
      _isDetecting = false;
    });
  }

  @override
  Widget build(BuildContext context) {
    if (_controller?.value.isInitialized != true) {
      return CircularProgressIndicator();
    }

    return Scaffold(
      body: Stack(
        children: [
          CameraPreview(_controller!),
          Positioned(
            bottom: 100,
            left: 20,
            right: 20,
            child: Container(
              padding: EdgeInsets.all(16),
              decoration: BoxDecoration(
                color: Colors.black.withOpacity(0.7),
                borderRadius: BorderRadius.circular(8),
              ),
              child: Column(
                children: _results?.map((result) => Text(
                  '${result['label']}: ${(result['confidence'] * 100).toInt()}%',
                  style: TextStyle(color: Colors.white, fontSize: 16),
                )).toList() ?? [Text('No results', style: TextStyle(color: Colors.white))],
              ),
            ),
          ),
        ],
      ),
    );
  }

  @override
  void dispose() {
    _controller?.dispose();
    TflitePlus.close();
    super.dispose();
  }
}

Batch Image Processing #

class BatchProcessor {
  static Future<List<Map<String, dynamic>>> processImages(
    List<String> imagePaths,
  ) async {
    await TflitePlus.loadModel(
      model: 'assets/models/classifier.tflite',
      labels: 'assets/models/labels.txt',
      useGpuDelegate: true,
    );

    final results = <Map<String, dynamic>>[];
    
    for (int i = 0; i < imagePaths.length; i++) {
      try {
        final result = await TflitePlus.runModelOnImage(
          path: imagePaths[i],
          numResults: 1,
          threshold: 0.1,
        );
        
        results.add({
          'path': imagePaths[i],
          'predictions': result,
          'status': 'success',
        });
        
        // Progress callback
        print('Processed ${i + 1}/${imagePaths.length} images');
        
      } catch (e) {
        results.add({
          'path': imagePaths[i],
          'error': e.toString(),
          'status': 'error',
        });
      }
    }
    
    await TflitePlus.close();
    return results;
  }
}

🤝 Contributing #

We welcome contributions from the community! 🎉

Contributors #

Shakil Ahmed

🚀 Creator & Maintainer

Want to see your profile here? Contribute to the project!

How to Contribute #

🚀 Quick Start

  1. Fork & Clone

    git clone https://github.com/yourusername/tflite_plus.git
    cd tflite_plus
    
  2. Create Branch

    git checkout -b feature/amazing-feature
    
  3. Make Changes

    • Add your awesome code
    • Write tests
    • Update documentation
  4. Test Your Changes

    flutter test
    flutter analyze
    
  5. Submit PR

    git push origin feature/amazing-feature
    

🎯 Contribution Types

Type | Description | Label
🐛 Bug Fix | Fix existing issues | bug
✨ Feature | Add new functionality | enhancement
📚 Documentation | Improve docs | documentation
🎨 UI/UX | Design improvements | design
⚡ Performance | Speed optimizations | performance
🧪 Tests | Add or improve tests | tests

📋 Contribution Guidelines

🏆 Recognition

Contributors get:

  • 🌟 Profile picture in README
  • 🎖️ Contributor badge on GitHub
  • 📢 Mention in release notes
  • 🎁 Special Discord role (coming soon)

💬 Support #

Get Help & Connect #


📞 Support Channels #

Channel | Purpose | Response Time
🐛 GitHub Issues | Bug reports, feature requests | 24-48 hours
💬 GitHub Discussions | Questions, community help | 1-3 days
📧 Email | Private support, partnerships | 2-5 days
🌐 Website | Documentation, tutorials | Always available

🆘 Before Asking for Help #

  1. Check Documentation: Read this README thoroughly
  2. Search Issues: Look for existing solutions
  3. Provide Details: Include code, error messages, device info
  4. Minimal Example: Create a minimal reproducible example

📄 License #

MIT License

Copyright (c) 2024 CodeBumble

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

🎉 Acknowledgments #

Special Thanks To:

  • 🤖 Google AI Team for TensorFlow Lite
  • 🐦 Flutter Team for the amazing framework
  • 🌟 Open Source Community for continuous support
  • 💻 All contributors who make this project better

Made with ❤️ by CodeBumble

If this project helped you, please consider giving it a ⭐ on GitHub!

