# Synheart Emotion

On-device emotion inference from biosignals (HR/RR) for Flutter applications.
## Features

- **Cross-Platform**: Works on iOS and Android
- **Real-Time Inference**: Live emotion detection from heart rate and RR intervals
- **On-Device Processing**: All computations happen locally for privacy
- **Unified Output**: Consistent emotion labels with confidence scores
- **Privacy-First**: No raw biometric data leaves your device
- **High Performance**: < 5 ms inference latency on mid-range devices
## Installation

Add `synheart_emotion` to your `pubspec.yaml`:

```yaml
dependencies:
  synheart_emotion: ^0.2.3
```

Then run:

```bash
flutter pub get
```
## Quick Start

### Basic Usage
```dart
import 'package:synheart_emotion/synheart_emotion.dart';

void main() {
  // Initialize the emotion engine
  final engine = EmotionEngine.fromPretrained(
    const EmotionConfig(
      window: Duration(seconds: 60),
      step: Duration(seconds: 5),
    ),
  );

  // Push biometric data
  engine.push(
    hr: 72.0,
    rrIntervalsMs: [823, 810, 798, 815, 820],
    timestamp: DateTime.now().toUtc(),
  );

  // Get emotion results (synchronous - no await needed)
  final results = engine.consumeReady();
  for (final result in results) {
    print('Emotion: ${result.emotion} '
        '(${(result.confidence * 100).toStringAsFixed(1)}%)');
  }
}
```
### Real-Time Streaming

```dart
// Stream emotion results
final emotionStream = EmotionStream.emotionStream(
  engine,
  tickStream, // Your biometric data stream
);

await for (final result in emotionStream) {
  print('Current emotion: ${result.emotion}');
  print('Probabilities: ${result.probabilities}');
}
```
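If you would rather drive the engine yourself than use `EmotionStream`, the same loop can be written with `push` and `consumeReady` directly. A minimal sketch; `BiometricSample` and `hrSource` are hypothetical stand-ins for whatever your sensor layer emits:

```dart
import 'dart:async';

import 'package:synheart_emotion/synheart_emotion.dart';

/// Hypothetical sample type standing in for your sensor layer's events.
class BiometricSample {
  final double hr;
  final List<double> rrIntervalsMs;
  BiometricSample(this.hr, this.rrIntervalsMs);
}

StreamSubscription<BiometricSample> pipeToEngine(
  Stream<BiometricSample> hrSource, // your sensor stream (assumed)
  EmotionEngine engine,
  void Function(EmotionResult result) onEmotion,
) {
  return hrSource.listen((sample) {
    // Feed each sample into the rolling window.
    engine.push(
      hr: sample.hr,
      rrIntervalsMs: sample.rrIntervalsMs,
      timestamp: DateTime.now().toUtc(),
    );
    // Drain whatever results the window/step cadence has made ready.
    for (final result in engine.consumeReady()) {
      onEmotion(result);
    }
  });
}
```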
### Integration with synheart-wear

`synheart_emotion` works independently but integrates seamlessly with `synheart-wear` for real wearable data.
First, add both to your `pubspec.yaml`:

```yaml
dependencies:
  synheart_wear: ^0.1.0     # For wearable data
  synheart_emotion: ^0.2.3  # For emotion inference
```
Then integrate in your app:
```dart
import 'package:synheart_wear/synheart_wear.dart';
import 'package:synheart_emotion/synheart_emotion.dart';

// Initialize both SDKs
final wear = SynheartWear();
final emotionEngine = EmotionEngine.fromPretrained(
  const EmotionConfig(window: Duration(seconds: 60)),
);

await wear.initialize();

// Stream wearable data to the emotion engine
wear.streamHR(interval: Duration(seconds: 1)).listen((metrics) {
  emotionEngine.push(
    hr: metrics.getMetric(MetricType.hr),
    rrIntervalsMs: metrics.getMetric(MetricType.rrIntervals),
    timestamp: DateTime.now().toUtc(),
  );

  // Get emotion results (synchronous - no await needed)
  final emotions = emotionEngine.consumeReady();
  for (final emotion in emotions) {
    // Use emotion data in your app
    updateUI(emotion);
  }
});
```
See `examples/lib/integration_example.dart` for complete integration examples.
## Supported Emotions

The library currently supports three emotion categories:

- **Amused**: Positive, engaged emotional state
- **Calm**: Relaxed, peaceful emotional state
- **Stressed**: Anxious, tense emotional state
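A small sketch of mapping these labels to UI copy. The lowercase label strings are an assumption; check `EmotionResult.emotion` in your build for the exact spelling:

```dart
// Hypothetical label strings: 'amused', 'calm', 'stressed'.
String describe(String emotion) {
  switch (emotion) {
    case 'amused':
      return 'Positive and engaged';
    case 'calm':
      return 'Relaxed and peaceful';
    case 'stressed':
      return 'Anxious or tense';
    default:
      return 'Unknown state: $emotion';
  }
}
```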
## API Reference

### EmotionEngine
The main class for emotion inference:
```dart
class EmotionEngine {
  // Create an engine with a pretrained model
  factory EmotionEngine.fromPretrained(
    EmotionConfig config, {
    LinearSvmModel? model,
    void Function(String level, String message, {Map<String, Object?>? context})? onLog,
  });

  // Push new biometric data
  void push({
    required double hr,
    required List<double> rrIntervalsMs,
    required DateTime timestamp,
    Map<String, double>? motion,
  });

  // Get ready emotion results
  List<EmotionResult> consumeReady();

  // Get buffer statistics
  Map<String, dynamic> getBufferStats();

  // Clear all buffered data
  void clear();
}
```
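A short sketch of the housekeeping surface, assuming the signatures above: wiring `onLog` into your own logger, inspecting buffer occupancy, and resetting the engine:

```dart
import 'package:synheart_emotion/synheart_emotion.dart';

void main() {
  final engine = EmotionEngine.fromPretrained(
    const EmotionConfig(window: Duration(seconds: 60)),
    onLog: (level, message, {context}) {
      // Route SDK log lines into your own logging setup.
      print('[$level] $message ${context ?? ''}');
    },
  );

  // Inspect buffer occupancy (the exact keys are implementation-defined).
  print(engine.getBufferStats());

  // Drop all buffered samples, e.g. after the user removes the wearable.
  engine.clear();
}
```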
### EmotionConfig
Configuration for the emotion engine:
```dart
class EmotionConfig {
  final String modelId;              // Model identifier
  final Duration window;             // Rolling window size (default: 60 s)
  final Duration step;               // Emission cadence (default: 5 s)
  final int minRrCount;              // Min RR intervals needed (default: 30)
  final bool returnAllProbas;        // Return all probabilities (default: true)
  final double? hrBaseline;          // Optional HR personalization
  final Map<String, double>? priors; // Optional label priors
}
```
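A sketch of constructing a config explicitly, assuming `modelId` has a default (the Quick Start omits it). All values shown are the documented defaults except `hrBaseline`, which is illustrative:

```dart
import 'package:synheart_emotion/synheart_emotion.dart';

const config = EmotionConfig(
  window: Duration(seconds: 60), // features computed over the last 60 s
  step: Duration(seconds: 5),    // emit at most one result every 5 s
  minRrCount: 30,                // skip windows with too few RR intervals
  returnAllProbas: true,         // include the full probability map
  hrBaseline: 62.0,              // illustrative resting HR for personalization
);
```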
### EmotionResult
Result of emotion inference:
```dart
class EmotionResult {
  final DateTime timestamp;                // When inference was performed
  final String emotion;                    // Predicted emotion (top-1)
  final double confidence;                 // Confidence score (0.0-1.0)
  final Map<String, double> probabilities; // All label probabilities
  final Map<String, double> features;      // Extracted features
  final Map<String, dynamic> model;        // Model metadata
}
```
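A sketch of consuming a result, assuming the fields above. The key names in `features` are not documented here, so `'rmssd'` is a hypothetical example:

```dart
import 'package:synheart_emotion/synheart_emotion.dart';

void handle(EmotionResult result) {
  // Top-1 label with its confidence.
  print('${result.emotion} '
      '(${(result.confidence * 100).toStringAsFixed(1)}%)');

  // Full distribution over all labels.
  result.probabilities.forEach((label, p) {
    print('  $label: ${p.toStringAsFixed(3)}');
  });

  // Extracted HRV features that fed the model ('rmssd' is hypothetical).
  final rmssd = result.features['rmssd'];
  if (rmssd != null) print('RMSSD: $rmssd ms');
}
```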
## Privacy & Security

- **On-Device Processing**: All emotion inference happens locally
- **No Data Retention**: Raw biometric data is not retained after processing
- **No Network Calls**: No data is sent to external servers
- **Privacy-First Design**: No built-in storage; you control what gets persisted
- **Real Trained Models**: Uses WESAD-trained models with 78% accuracy
## Example App

Check out the complete examples in the synheart-emotion repository:

```bash
# Clone the main repository for examples
git clone https://github.com/synheart-ai/synheart-emotion.git
cd synheart-emotion/examples
flutter pub get
flutter run
```
The example demonstrates:
- Real-time emotion detection
- Probability visualization
- Buffer management
- Logging system
## Testing

Run the test suite:

```bash
flutter test
```

Run benchmarks:

```bash
flutter test test/benchmarks_test.dart
```
Tests cover:
- Feature extraction accuracy
- Model inference performance
- Edge case handling
- Memory usage patterns
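For your own code, a minimal test sketch in the same spirit, assuming the engine keys windows off the pushed timestamps and emits once a full window is buffered:

```dart
import 'package:flutter_test/flutter_test.dart';
import 'package:synheart_emotion/synheart_emotion.dart';

void main() {
  test('emits a result once a full window of data has been pushed', () {
    final engine = EmotionEngine.fromPretrained(
      const EmotionConfig(
        window: Duration(seconds: 60),
        step: Duration(seconds: 5),
      ),
    );

    // ~70 s of steady synthetic beats: HR 72 bpm, RR ~833 ms.
    var t = DateTime.utc(2024, 1, 1);
    for (var i = 0; i < 70; i++) {
      engine.push(hr: 72.0, rrIntervalsMs: [833.0], timestamp: t);
      t = t.add(const Duration(seconds: 1));
    }

    final results = engine.consumeReady();
    expect(results, isNotEmpty);
    expect(results.first.confidence, inInclusiveRange(0.0, 1.0));
  });
}
```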
## Performance

Target performance (mid-range phone):

- **Latency**: < 5 ms per inference
- **Model Size**: < 100 KB
- **CPU Usage**: < 2% during active streaming
- **Memory**: < 3 MB (engine + buffers)
- **Accuracy**: 78% on the WESAD dataset (3-class emotion recognition)

Benchmarks:

- HR mean calculation: < 1 ms
- SDNN/RMSSD calculation: < 2 ms
- Model inference: < 1 ms
- Full pipeline: < 5 ms
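To sanity-check these numbers on a specific device, a rough `Stopwatch` sketch (debug-mode timings will be slower than release builds):

```dart
import 'package:synheart_emotion/synheart_emotion.dart';

void main() {
  final engine = EmotionEngine.fromPretrained(
    const EmotionConfig(window: Duration(seconds: 60)),
  );

  // Warm-up push so buffers are allocated before timing.
  engine.push(
    hr: 72.0,
    rrIntervalsMs: [823, 810, 798, 815, 820],
    timestamp: DateTime.now().toUtc(),
  );

  final sw = Stopwatch()..start();
  engine.push(
    hr: 71.0,
    rrIntervalsMs: [820, 812, 805],
    timestamp: DateTime.now().toUtc(),
  );
  final results = engine.consumeReady();
  sw.stop();

  print('push + consumeReady: ${sw.elapsedMicroseconds} us '
      '(${results.length} result(s))');
}
```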
## Architecture

```text
Biometric Data (HR, RR)
          │
          ▼
┌─────────────────────┐
│    EmotionEngine    │
│    [RingBuffer]     │
│ [FeatureExtractor]  │
│  [Model Inference]  │
└─────────────────────┘
          │
          ▼
   EmotionResult
          │
          ▼
      Your App
```
## Integration

### With synheart-wear
Perfect integration with the Synheart Wear SDK for real wearable data:
```dart
// Stream from Apple Watch, Fitbit, etc.
final wearStream = synheartWear.streamHR();
final emotionStream = EmotionStream.emotionStream(engine, wearStream);
```
### With swip-core
Feed emotion results into the SWIP impact measurement system:
```dart
for (final emotion in emotionResults) {
  swipCore.ingestEmotion(emotion);
}
```
## License
Apache 2.0 License
## Contributing
We welcome contributions! See our Contributing Guidelines for details.
## Links
- Main Repository: synheart-emotion (Source of Truth)
- Documentation: RFC E1.1
- Model Card: Model Card
- Examples: Examples
- Models: Pre-trained Models
- Tools: Development Tools
- Synheart Wear: synheart-wear
- Synheart AI: synheart.ai
- Issues: GitHub Issues
## Authors
- Synheart AI Team - Initial work, RFC Design & Architecture
Made with ❤️ by the Synheart AI Team
Technology with a heartbeat.
**Patent Pending Notice**
This project is provided under an open-source license. Certain underlying systems, methods, and architectures described or implemented herein may be covered by one or more pending patent applications.
Nothing in this repository grants any license, express or implied, to any patents or patent applications, except as provided by the applicable open-source license.