flutter_pose_detection 0.1.0
flutter_pose_detection: ^0.1.0
Hardware-accelerated pose detection using native ML frameworks (Apple Vision on iOS, TensorFlow Lite on Android). Detects 33 body landmarks in MediaPipe-compatible format.
# Changelog
All notable changes to this project will be documented in this file.
The format is based on Keep a Changelog, and this project adheres to Semantic Versioning.
## 0.1.0 - 2024-12-30
### Added
- Initial release of the NPU Pose Detection plugin
- iOS Support: Apple Vision framework with Neural Engine acceleration
  - Automatic NPU/GPU acceleration on A12+ devices
  - VNDetectHumanBodyPoseRequest for 17 body landmarks
  - Landmark mapping to the MediaPipe 33-point format
- Android Support: TensorFlow Lite with the MoveNet Lightning model
  - GPU Delegate for hardware acceleration
  - NNAPI support on devices running API levels 27-34
  - CPU fallback with XNNPACK optimization
- Core Features (usage sketches follow this list):
  - Static image pose detection (`detectPose`, `detectPoseFromFile`)
  - Real-time camera frame processing (`processFrame`, `startCameraDetection`)
  - Video file analysis (`analyzeVideo` with progress tracking)
  - Configurable detection parameters (`PoseDetectorConfig`)
- Data Models:
  - `Pose` with 33 MediaPipe-compatible landmarks
  - `PoseLandmark` with normalized coordinates (0-1) and confidence scores
  - `BoundingBox` for the detected person region
  - `LandmarkType` enum for easy landmark access
- Error Handling:
  - Typed `DetectionError` with error codes
  - Graceful fallback from NPU to GPU to CPU
- Example App:
  - Image detection demo with gallery picker
  - Real-time camera detection with pose overlay
  - Video analysis with progress UI
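A minimal sketch of static image detection with the typed error handling described above. Only the names `detectPoseFromFile`, `PoseDetectorConfig`, `Pose`, and `DetectionError` come from this changelog; the `PoseDetector` entry-point class, the `minConfidence` field, the nullable return value, and the `code` field are assumptions and may differ from the published API.

```dart
import 'dart:io';

import 'package:flutter_pose_detection/flutter_pose_detection.dart';

/// Detects a pose in a single image file (e.g. picked from the gallery).
/// `PoseDetector`, `minConfidence`, the nullable `Pose?` return type, and
/// `e.code` are assumptions; the other names are from the changelog.
Future<void> detectFromImageFile(File imageFile) async {
  final detector = PoseDetector(
    config: PoseDetectorConfig(minConfidence: 0.5), // assumed field name
  );

  try {
    final Pose? pose = await detector.detectPoseFromFile(imageFile.path);
    if (pose == null) {
      print('No person detected.');
      return;
    }
    // Landmarks are normalized to the 0-1 range and carry confidence scores.
    print('Detected ${pose.landmarks.length} landmarks '
        'inside ${pose.boundingBox}');
  } on DetectionError catch (e) {
    // Typed errors expose an error code; the field name is assumed.
    print('Detection failed (${e.code}): $e');
  }
}
```

Per the changelog, the NPU → GPU → CPU fallback happens in the native layers, so the Dart side in this sketch does not try to select a backend explicitly.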
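A sketch of reading the data models. `Pose`, `PoseLandmark`, `BoundingBox`, and `LandmarkType` are named in this changelog; the field names used below (`landmarks`, `boundingBox`, `x`, `y`, `confidence`) and the map-style landmark lookup are assumptions.

```dart
import 'package:flutter_pose_detection/flutter_pose_detection.dart';

/// Prints every landmark above a confidence threshold.
/// Field names and the map-style lookup are assumptions.
void printPose(Pose pose) {
  final BoundingBox box = pose.boundingBox;
  print('Person region: $box');

  for (final LandmarkType type in LandmarkType.values) {
    final PoseLandmark? landmark = pose.landmarks[type];
    if (landmark != null && landmark.confidence > 0.5) {
      // Coordinates are normalized to 0-1 relative to the input image.
      print('$type: (${landmark.x}, ${landmark.y})');
    }
  }
}
```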
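A sketch of the real-time and video paths, assuming `startCameraDetection` returns a `Stream<Pose>` and `analyzeVideo` takes a file path plus an `onProgress` callback returning a `List<Pose>`; only the method names are taken from this changelog, so the actual return and parameter types may differ.

```dart
import 'dart:async';

import 'package:flutter_pose_detection/flutter_pose_detection.dart';

/// Runs live camera detection, then analyzes a video file.
/// The Stream<Pose> return type, the onProgress callback, and the
/// List<Pose> result are assumptions.
Future<void> runLiveAndVideo(PoseDetector detector, String videoPath) async {
  // Assumed: startCameraDetection streams one Pose per processed camera frame.
  final StreamSubscription<Pose> subscription =
      detector.startCameraDetection().listen((pose) {
    // Feed a pose-overlay painter here.
  });

  // Assumed: analyzeVideo walks the video frame by frame and reports
  // progress in the 0-1 range.
  final List<Pose> poses = await detector.analyzeVideo(
    videoPath,
    onProgress: (double progress) =>
        print('Analyzed ${(progress * 100).toStringAsFixed(0)}%'),
  );
  print('Found poses in ${poses.length} frames.');

  await subscription.cancel();
}
```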