ultralytics_yolo 0.1.3

Flutter plugin for YOLO (You Only Look Once) models, supporting object detection, segmentation, classification, pose estimation and oriented bounding boxes (OBB) on both Android and iOS.

Ultralytics logo

Ultralytics YOLO Flutter App #

Ultralytics Actions .github/workflows/ci.yml codecov

Ultralytics Discord Ultralytics Forums Ultralytics Reddit License: AGPL v3


✨ Features #

| Feature         | Android | iOS |
| --------------- | ------- | --- |
| Detection       | ✅      | ✅  |
| Classification  | ✅      | ✅  |
| Segmentation    | ✅      | ✅  |
| Pose Estimation | ✅      | ✅  |
| OBB Detection   | ✅      | ✅  |
  • Real-time Processing: Optimized for real-time inference on mobile devices
  • Camera Integration: Easy integration with device cameras
  • Cross-Platform: Works on both Android and iOS

🚀 Installation #

Add this to your package's pubspec.yaml file:

dependencies:
  ultralytics_yolo: ^0.1.3

Then run:

flutter pub get

Platform-Specific Setup #

Android #

Add the following permissions to your AndroidManifest.xml file:

<!-- For camera access -->
<uses-permission android:name="android.permission.CAMERA" />

<!-- For accessing images from storage -->
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />

Set minimum SDK version in your android/app/build.gradle:

minSdkVersion 21
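In the older Groovy DSL this setting lives inside the defaultConfig block, as sketched below; newer Flutter templates may reference flutter.minSdkVersion or use the Kotlin DSL instead, so adapt to your template:

```groovy
android {
    defaultConfig {
        // This plugin requires Android 5.0 (API level 21) or higher
        minSdkVersion 21
    }
}
```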

iOS #

Add these entries to your Info.plist:

<key>NSCameraUsageDescription</key>
<string>This app needs camera access to detect objects</string>
<key>NSPhotoLibraryUsageDescription</key>
<string>This app needs photos access to get images for object detection</string>

Additionally, modify your Podfile (located at ios/Podfile) to include permission configurations:

post_install do |installer|
  installer.pods_project.targets.each do |target|
    flutter_additional_ios_build_settings(target)

    # Start of the permission_handler configuration
    target.build_configurations.each do |config|
      config.build_settings['GCC_PREPROCESSOR_DEFINITIONS'] ||= [
        '$(inherited)',

        ## dart: PermissionGroup.camera
        'PERMISSION_CAMERA=1',

        ## dart: PermissionGroup.photos
        'PERMISSION_PHOTOS=1',
      ]
    end
    # End of the permission_handler configuration
  end
end

✅ Prerequisites #

Export Ultralytics YOLO Models #

Before integrating Ultralytics YOLO into your app, you must export the necessary models. The export process generates .tflite (for Android) and .mlpackage (for iOS) files, which you'll include in your app. Use the Ultralytics YOLO Command Line Interface (CLI) for exporting.

IMPORTANT: The parameters specified in the commands below are mandatory. This Flutter plugin currently only supports models exported using these exact commands. Using different parameters may cause the plugin to malfunction. We are actively working on expanding support for more models and parameters.

Use the following commands to export the required YOLO models:

from ultralytics import YOLO
from ultralytics.utils.downloads import zip_directory


def export_and_zip_yolo_models(
    model_types=("", "-seg", "-cls", "-pose", "-obb"),
    model_sizes=("n",),  #  optional additional sizes are "s", "m", "l", "x"
):
    """Exports YOLO11 models to CoreML format and optionally zips the output packages."""
    for model_type in model_types:
        imgsz = [224, 224] if "cls" in model_type else [640, 384]  # default input image sizes
        nms = True if model_type == "" else False  # only apply NMS to Detect models
        for size in model_sizes:
            model_name = f"yolo11{size}{model_type}"
            model = YOLO(f"{model_name}.pt")

            # iOS Export
            model.export(format="coreml", int8=True, imgsz=imgsz, nms=nms)
            zip_directory(f"{model_name}.mlpackage").rename(f"{model_name}.mlpackage.zip")

            # TFLite Export
            model.export(format="tflite", int8=True, imgsz=[320, 320], nms=False)


# Execute with default parameters
export_and_zip_yolo_models()
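The CLI mentioned above can perform equivalent single-model exports; a sketch for the nano detection model (argument syntax assumes a current ultralytics release and may differ between versions):

```shell
# iOS (CoreML) export
yolo export model=yolo11n.pt format=coreml int8=True imgsz=640,384 nms=True

# Android (TFLite) export
yolo export model=yolo11n.pt format=tflite int8=True imgsz=320 nms=False
```

These commands download the .pt weights if not already present, so they require network access on first run.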

👨‍💻 Usage #

Basic Example #

import 'package:flutter/material.dart';
import 'package:ultralytics_yolo/yolo.dart';
import 'package:ultralytics_yolo/yolo_view.dart';
import 'package:ultralytics_yolo/yolo_task.dart';

class YoloDemo extends StatefulWidget {
  const YoloDemo({super.key});

  @override
  State<YoloDemo> createState() => _YoloDemoState();
}

class _YoloDemoState extends State<YoloDemo> {
  // Create a controller to interact with the YoloView
  final controller = YoloViewController();

  // Current confidence threshold shown by the slider
  double _confidence = 0.5;

  @override
  void initState() {
    super.initState();

    // Set initial detection parameters
    controller.setThresholds(
      confidenceThreshold: 0.5,
      iouThreshold: 0.45,
    );
  }

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      appBar: AppBar(title: const Text('YOLO Object Detection')),
      body: Column(
        children: [
          // Controls for adjusting detection parameters
          Padding(
            padding: const EdgeInsets.all(8.0),
            child: Row(
              children: [
                const Text('Confidence: '),
                Slider(
                  value: _confidence,
                  min: 0.1,
                  max: 0.9,
                  onChanged: (value) {
                    // Update the confidence threshold and refresh the slider
                    setState(() => _confidence = value);
                    controller.setConfidenceThreshold(value);
                  },
                ),
              ],
            ),
          ),

          // YoloView with controller
          Expanded(
            child: YoloView(
              controller: controller,
              task: YOLOTask.detect,
              // Use model name only - recommended approach for cross-platform compatibility
              modelPath: 'yolo11n',
              onResult: (results) {
                // Handle detection results
                print('Detected ${results.length} objects');
              },
            ),
          ),
        ],
      ),
    );
  }
}

Object Detection with Camera Feed #

There are three ways to control YoloView's detection parameters:

Method 1: Using a YoloViewController (Recommended)

// Create a controller outside build method
final controller = YoloViewController();

// In your build method:
YoloView(
  controller: controller,  // Provide the controller
  task: YOLOTask.detect,
  modelPath: 'yolo11n',  // Just the model name - most reliable approach
  onResult: (results) {
    for (var result in results) {
      print('Detected: ${result.className}, Confidence: ${result.confidence}');
    }
  },
)

// Set detection parameters anywhere in your code
controller.setConfidenceThreshold(0.5);
controller.setIoUThreshold(0.45);

// Or set both at once
controller.setThresholds(
  confidenceThreshold: 0.5,
  iouThreshold: 0.45,
);

Method 2: Using GlobalKey Direct Access (Simpler)

// Create a GlobalKey to access the YoloView
final yoloViewKey = GlobalKey<YoloViewState>();

// In your build method:
YoloView(
  key: yoloViewKey,  // Important: Provide the key
  task: YOLOTask.detect,
  modelPath: 'yolo11n',  // Just the model name without extension
  onResult: (results) {
    for (var result in results) {
      print('Detected: ${result.className}, Confidence: ${result.confidence}');
    }
  },
)

// Set detection parameters directly through the key
yoloViewKey.currentState?.setConfidenceThreshold(0.6);
yoloViewKey.currentState?.setIoUThreshold(0.5);

// Or set both at once
yoloViewKey.currentState?.setThresholds(
  confidenceThreshold: 0.6,
  iouThreshold: 0.5,
);

Method 3: Automatic Controller (Simplest)

// No controller needed - just create the view
YoloView(
  task: YOLOTask.detect,
  modelPath: 'yolo11n',  // Simple model name works best across platforms
  onResult: (results) {
    for (var result in results) {
      print('Detected: ${result.className}, Confidence: ${result.confidence}');
    }
  },
)

// A controller is automatically created internally
// with default threshold values (0.5 for confidence, 0.45 for IoU)

Model Loading #

For the most reliable cross-platform experience, the simplest approach is to:

  1. Use the model name without an extension (modelPath: 'yolo11n')
  2. Place platform-specific model files in the correct locations:
    • Android: android/app/src/main/assets/yolo11n.tflite
    • iOS: Add yolo11n.mlmodel or yolo11n.mlpackage to your Xcode project

This approach avoids path resolution issues across platforms and lets each platform automatically find the appropriate model file without complicated path handling.
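Before wiring up a YoloView, you can confirm the platform can locate the model using YOLO.checkModelExists (mentioned under Troubleshooting below). A sketch, assuming the method returns a future whose value describes the lookup result, since its exact return type is not documented here:

```dart
// Sketch: verify the bundled model is discoverable before building the UI.
// The shape of the returned value is an assumption; we simply print it.
Future<void> verifyModel() async {
  final result = await YOLO.checkModelExists('yolo11n');
  debugPrint('yolo11n lookup result: $result');
}
```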

API Reference #

Classes #

YOLO

Main class for YOLO operations.

YOLO({
  required String modelPath,
  required YOLOTask task,
});

YoloViewController

Controller for interacting with a YoloView, managing settings like thresholds.

// Create a controller
final controller = YoloViewController();

// Get current values
double confidence = controller.confidenceThreshold;
double iou = controller.iouThreshold;
int numItems = controller.numItemsThreshold;

// Set confidence threshold (0.0-1.0)
await controller.setConfidenceThreshold(0.6);

// Set IoU threshold (0.0-1.0)
await controller.setIoUThreshold(0.5);

// Set maximum number of detection items (1-100)
await controller.setNumItemsThreshold(20);

// Set multiple thresholds at once
await controller.setThresholds(
  confidenceThreshold: 0.6,
  iouThreshold: 0.5,
  numItemsThreshold: 20,
);

// Switch between front and back camera
await controller.switchCamera();

YoloView

Flutter widget to display YOLO detection results.

YoloView({
  required YOLOTask task,
  required String modelPath,
  YoloViewController? controller,  // Optional: Controller for managing view settings
  Function(List<YOLOResult>)? onResult,
});

YOLOResult

Contains detection results.

class YOLOResult {
  final int classIndex;
  final String className;
  final double confidence;
  final Rect boundingBox;
  // For segmentation
  final List<List<double>>? mask;
  // For pose estimation
  final List<Point>? keypoints;
  // Performance metrics
  final double? processingTimeMs; // Processing time in milliseconds for the frame
  final double? fps;              // Frames Per Second (available on Android, and iOS for real-time)
}

Enums #

YOLOTask

enum YOLOTask {
  detect,   // Object detection
  segment,  // Image segmentation
  classify, // Image classification
  pose,     // Pose estimation
  obb,      // Oriented bounding boxes
}

Troubleshooting #

Common Issues #

  1. Model loading fails

    • Make sure your model file is correctly placed as described above
    • Verify that the model path is correctly specified
    • For iOS, ensure .mlpackage files are added directly to the Xcode project and properly included in target's "Build Phases" → "Copy Bundle Resources"
    • Check that the model format is compatible with TFLite (Android) or Core ML (iOS)
    • Use YOLO.checkModelExists(modelPath) to verify if your model can be found
  2. Low performance on older devices

    • Try using smaller models (e.g., yolo11n instead of yolo11l)
    • Reduce input image resolution
    • Increase confidence threshold to reduce the number of detections
    • Adjust IoU threshold to control overlapping detections
    • Limit the maximum number of detection items
  3. Camera permission issues

    • Ensure that your app has the proper permissions in the manifest or Info.plist
    • Handle runtime permissions properly in your app
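Since the Podfile above is configured for the permission_handler package, a minimal runtime check before opening the camera might look like this (a sketch; it assumes permission_handler is listed in your pubspec.yaml):

```dart
import 'package:permission_handler/permission_handler.dart';

/// Requests camera access and reports whether it was granted.
Future<bool> ensureCameraPermission() async {
  final status = await Permission.camera.request();
  return status.isGranted;
}
```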

💡 Contribute #

Ultralytics thrives on community collaboration, and we deeply value your contributions! Whether it's bug fixes, feature enhancements, or documentation improvements, your involvement is crucial. Please review our Contributing Guide for detailed insights on how to participate. We also encourage you to share your feedback through our Survey. A heartfelt thank you 🙏 goes out to all our contributors!

Ultralytics open-source contributors

📄 License #

Ultralytics offers two licensing options to accommodate diverse needs:

  • AGPL-3.0 License: Ideal for students, researchers, and enthusiasts passionate about open-source collaboration. This OSI-approved license promotes knowledge sharing and open contribution. See the LICENSE file for details.
  • Enterprise License: Designed for commercial applications, this license permits seamless integration of Ultralytics software and AI models into commercial products and services, bypassing the open-source requirements of AGPL-3.0. For commercial use cases, please inquire about an Enterprise License.

📮 Contact #

Encountering issues or have feature requests related to Ultralytics YOLO? Please report them via GitHub Issues. For broader discussions, questions, and community support, join our Discord server!


Ultralytics GitHub · LinkedIn · Twitter · YouTube · TikTok · BiliBili · Discord