Synheart Focus #
On-device cognitive concentration inference from biosignals and behavioral patterns for Flutter applications
Features #
- Cross-Platform: Works on iOS and Android
- Real-Time Inference: Live focus state detection from biosignals and behavior
- On-Device Processing: All computation happens locally for privacy
- Unified Output: Consistent focus scores (0-100) with state labels
- Privacy-First: No raw biometric data leaves your device
- High Performance: Optimized for real-time processing on mobile devices
Installation #
Add synheart_focus to your pubspec.yaml:
dependencies:
  synheart_focus: ^0.0.1
Then run:
flutter pub get
Quick Start #
Basic Usage #
import 'package:synheart_focus/synheart_focus.dart';

void main() async {
  // Initialize the focus engine
  final engine = await FocusEngine.initialize(
    config: FocusConfig(),
  );

  // Subscribe to focus updates
  engine.onUpdate.listen((result) {
    print('Focus Score: ${result.focusScore}');
    print('Focus State: ${result.focusState}');
    print('Confidence: ${result.confidence}');
  });

  // Provide biosignal data
  final hsiData = HSIData(
    hr: 72.0,
    hrvRmssd: 45.0,
    stressIndex: 0.3,
    motionIntensity: 0.1,
  );

  // Provide behavioral data
  final behaviorData = BehaviorData(
    taskSwitchRate: 0.2,
    interactionBurstiness: 0.15,
    idleRatio: 0.1,
  );

  // Run inference
  final result = await engine.infer(hsiData, behaviorData);
  print('Focus Score: ${result.focusScore}');
}
Integration with synheart-wear #
synheart_focus works independently but integrates seamlessly with synheart-wear for real wearable data.
First, add both to your pubspec.yaml:
dependencies:
  synheart_wear: ^0.1.0   # For wearable data
  synheart_focus: ^0.0.1  # For focus inference
Then integrate in your app:
import 'package:synheart_wear/synheart_wear.dart';
import 'package:synheart_focus/synheart_focus.dart';

// Initialize both SDKs
final wear = SynheartWear();
final focusEngine = await FocusEngine.initialize(
  config: FocusConfig(),
);
await wear.initialize();

// Stream wearable data to the focus engine.
// calculateRMSSD and calculateStress are helpers you implement in your app.
wear.streamHR(interval: Duration(seconds: 1)).listen((metrics) {
  final hsiData = HSIData(
    hr: metrics.getMetric(MetricType.hr),
    hrvRmssd: calculateRMSSD(metrics.getMetric(MetricType.rrIntervals)),
    stressIndex: calculateStress(metrics),
    motionIntensity: metrics.getMetric(MetricType.motion),
  );

  // Add behavioral data collected by your app (appMetrics is app-defined)
  final behaviorData = BehaviorData(
    taskSwitchRate: appMetrics.taskSwitchRate,
    interactionBurstiness: appMetrics.interactionBurstiness,
    idleRatio: appMetrics.idleRatio,
  );

  // Get the focus state and update your UI
  focusEngine.infer(hsiData, behaviorData).then((result) {
    updateUI(result);
  });
});
Supported Focus States #
The library currently supports three focus state categories:
- Focused: High concentration, productive state
- Time Pressure: Moderate focus with elevated stress
- Distracted: Low concentration, fragmented attention
Focus Scores:
- 70-100: Focused (optimal concentration)
- 40-70: Time Pressure (stressed but engaged)
- 0-40: Distracted (low concentration)
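For apps that want to render these bands themselves, here is a minimal sketch of mapping the reported focusScore to a label. The boundary handling at 40 and 70 is an assumption; the engine's own focusState field is the authoritative label.
String labelForScore(double focusScore) {
  // Band boundaries follow the table above; inclusive lower bounds are an assumption.
  if (focusScore >= 70) return 'Focused';
  if (focusScore >= 40) return 'Time Pressure';
  return 'Distracted';
}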
API Reference #
FocusEngine #
The main class for focus inference:
class FocusEngine {
  // Initialize engine with config
  static Future<FocusEngine> initialize({
    required FocusConfig config,
  });

  // Stream of focus updates
  Stream<FocusResult> get onUpdate;

  // Run inference on current data
  Future<FocusResult> infer(HSIData hsiData, BehaviorData behaviorData);

  // Dispose resources
  Future<void> dispose();
}
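In a Flutter app, the engine is typically created in initState and torn down in dispose. The widget below is an illustrative sketch built only on the documented API; the widget and field names (FocusMonitor, _score, and so on) are placeholders, not part of the SDK.
import 'dart:async';
import 'package:flutter/widgets.dart';
import 'package:synheart_focus/synheart_focus.dart';

class FocusMonitor extends StatefulWidget {
  const FocusMonitor({super.key});

  @override
  State<FocusMonitor> createState() => _FocusMonitorState();
}

class _FocusMonitorState extends State<FocusMonitor> {
  FocusEngine? _engine;
  StreamSubscription<FocusResult>? _sub;
  double? _score;

  @override
  void initState() {
    super.initState();
    _start();
  }

  Future<void> _start() async {
    final engine = await FocusEngine.initialize(config: FocusConfig());
    _sub = engine.onUpdate.listen((result) {
      setState(() => _score = result.focusScore);
    });
    _engine = engine;
  }

  @override
  void dispose() {
    // Cancel the subscription before releasing engine resources.
    _sub?.cancel();
    _engine?.dispose();
    super.dispose();
  }

  @override
  Widget build(BuildContext context) {
    return Text(_score == null ? 'Warming up…' : 'Focus: ${_score!.toStringAsFixed(0)}');
  }
}
Cancelling the subscription before calling dispose avoids callbacks firing against a released engine.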
FocusConfig #
Configuration for the focus engine:
class FocusConfig {
  final Duration window;              // Rolling window size
  final Duration step;                // Emission cadence
  final bool enableAdaptiveBaseline;  // Adaptive personalization
  final double smoothingFactor;       // Result smoothing (0.0-1.0)
}
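As a usage sketch, a configuration that evaluates a 60-second rolling window and emits a smoothed result every 5 seconds might look like the following. The constructor is assumed to take named parameters matching the fields above, and the values shown are illustrative, not documented defaults.
final config = FocusConfig(
  window: Duration(seconds: 60),   // size of the rolling feature window
  step: Duration(seconds: 5),      // how often onUpdate emits a result
  enableAdaptiveBaseline: true,    // personalize scores against the user's baseline
  smoothingFactor: 0.3,            // heavier smoothing = steadier, slower-moving scores
);
final engine = await FocusEngine.initialize(config: config);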
FocusResult #
Result of focus inference:
class FocusResult {
  final DateTime timestamp;                 // When inference was performed
  final String focusState;                  // "Focused", "Time Pressure", or "Distracted"
  final double focusScore;                  // 0-100 focus score
  final double confidence;                  // Confidence score (0.0-1.0)
  final Map<String, double> probabilities;  // All state probabilities
  final Map<String, double> features;       // Extracted features
  final Map<String, dynamic> model;         // Model metadata
}
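A short sketch of consuming these fields from the update stream (engine comes from the Quick Start above; the probability keys are assumed to match the state labels listed earlier):
engine.onUpdate.listen((result) {
  // Headline score and label
  print('${result.timestamp}: ${result.focusState} '
      '(${result.focusScore.toStringAsFixed(0)}/100)');

  // Ignore low-confidence results; 0.5 is an arbitrary app-side threshold
  if (result.confidence < 0.5) return;

  // Per-state probabilities, e.g. for a stacked bar or a debug overlay
  result.probabilities.forEach((state, p) {
    print('  $state: ${(p * 100).toStringAsFixed(1)}%');
  });
});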
HSIData #
Heart Signal Intelligence data:
class HSIData {
  final double hr;               // Heart rate (BPM)
  final double hrvRmssd;         // HRV RMSSD (ms)
  final double stressIndex;      // Stress index (0.0-1.0)
  final double motionIntensity;  // Motion intensity (0.0-1.0)
}
BehaviorData #
Behavioral pattern data:
class BehaviorData {
  final double taskSwitchRate;         // Task switching frequency
  final double interactionBurstiness;  // Interaction pattern metric
  final double idleRatio;              // Idle time ratio
}
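The SDK does not collect these behavioral metrics for you; deriving them from UI events is app code. The function below is a minimal sketch, assuming you log interaction timestamps and task switches over the same rolling window as FocusConfig.window. The thresholds and normalizations are illustrative choices, not part of the library.
import 'dart:math' as math;
import 'package:synheart_focus/synheart_focus.dart';

BehaviorData behaviorFromEvents({
  required List<DateTime> interactionTimes, // taps, keystrokes, scrolls in the window
  required int taskSwitches,                // screen/task changes in the window
  required Duration window,                 // same rolling window as FocusConfig.window
}) {
  final seconds = window.inSeconds.toDouble();

  // Switches per second; rescale to whatever range your model inputs expect.
  final taskSwitchRate = taskSwitches / seconds;

  // Idle ratio: fraction of the window spent in gaps of 5 s or more (threshold is arbitrary).
  var idle = Duration.zero;
  for (var i = 1; i < interactionTimes.length; i++) {
    final gap = interactionTimes[i].difference(interactionTimes[i - 1]);
    if (gap >= const Duration(seconds: 5)) idle += gap;
  }
  final idleRatio = (idle.inSeconds / seconds).clamp(0.0, 1.0).toDouble();

  // Burstiness: spread of inter-event gaps squashed into 0.0-1.0 (0 = regular, near 1 = bursty).
  var burstiness = 0.0;
  if (interactionTimes.length > 2) {
    final gaps = [
      for (var i = 1; i < interactionTimes.length; i++)
        interactionTimes[i].difference(interactionTimes[i - 1]).inMilliseconds.toDouble(),
    ];
    final mean = gaps.reduce((a, b) => a + b) / gaps.length;
    final variance =
        gaps.map((g) => (g - mean) * (g - mean)).reduce((a, b) => a + b) / gaps.length;
    final std = math.sqrt(variance);
    if (mean > 0) burstiness = std / (std + mean);
  }

  return BehaviorData(
    taskSwitchRate: taskSwitchRate,
    interactionBurstiness: burstiness,
    idleRatio: idleRatio,
  );
}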
Privacy & Security #
- On-Device Processing: All focus inference happens locally
- No Data Retention: Raw biometric data is not retained after processing
- No Network Calls: No data is sent to external servers
- Privacy-First Design: No built-in storage - you control what gets persisted
- Real Trained Models: Uses SWELL-trained models with validated accuracy
Example App #
Check out the complete examples in the synheart-focus repository:
# Clone the main repository for examples
git clone https://github.com/synheart-ai/synheart-focus.git
cd synheart-focus/examples
flutter pub get
flutter run
The example demonstrates:
- Real-time focus detection
- Probability visualization
- Multi-modal data integration
- Adaptive baseline tracking
Testing #
Run the test suite:
flutter test
Tests cover:
- Feature extraction accuracy
- Model inference performance
- Edge case handling
- Multi-modal fusion
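If you add your own checks on top of the bundled suite, a minimal unit-test sketch against the public API could look like this. It assumes FocusEngine.initialize can run in a plain Dart test environment; if the model is loaded via platform channels or assets, a widget or integration test may be required instead.
import 'package:flutter_test/flutter_test.dart';
import 'package:synheart_focus/synheart_focus.dart';

void main() {
  test('infer returns bounded score and confidence', () async {
    final engine = await FocusEngine.initialize(config: FocusConfig());

    final result = await engine.infer(
      HSIData(hr: 72.0, hrvRmssd: 45.0, stressIndex: 0.3, motionIntensity: 0.1),
      BehaviorData(taskSwitchRate: 0.2, interactionBurstiness: 0.15, idleRatio: 0.1),
    );

    expect(result.focusScore, inInclusiveRange(0, 100));
    expect(result.confidence, inInclusiveRange(0.0, 1.0));

    await engine.dispose();
  });
}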
Performance #
Target Performance (mid-range phone):
- Latency: < 10ms per inference
- Model Size: < 2 MB (ONNX model)
- CPU Usage: < 3% during active streaming
- Memory: < 5 MB (engine + buffers)
- Accuracy: Validated on SWELL dataset
Architecture #
Biosignals (HR, HRV) + Behavior Data
                  │
                  ▼
       ┌──────────────────────┐
       │      FocusEngine     │
       │  [Feature Extract]   │
       │  [Adaptive Baseline] │
       │  [Model Inference]   │
       └──────────────────────┘
                  │
                  ▼
             FocusResult
                  │
                  ▼
              Your App
Integration #
With synheart-wear #
Perfect integration with the Synheart Wear SDK for real wearable data:
// Stream from Apple Watch, Fitbit, etc.
final wearStream = synheartWear.streamHR();
final focusStream = focusEngine.onUpdate;
With synheart-emotion #
Combine with emotion detection for comprehensive mental state tracking:
// Use both emotion and focus together
final emotionResult = emotionEngine.consumeReady();
final focusResult = await focusEngine.infer(hsiData, behaviorData);
// Combined mental state
print('Emotion: ${emotionResult.emotion}, Focus: ${focusResult.focusState}');
License #
This project is licensed under the Apache License 2.0 - see the LICENSE file for details.
Contributing #
We welcome contributions! See our Contributing Guidelines for details.
Links #
- Main Repository: synheart-focus (Source of Truth)
- Documentation: Docs
- Examples: Examples
- Models: Pre-trained Models
- Synheart Wear: synheart-wear
- Synheart Emotion: synheart-emotion
- Synheart AI: synheart.ai
- Issues: GitHub Issues
Authors #
- Synheart AI Team - Initial work, Architecture & Design
Made with ❤️ by the Synheart AI Team
Technology with a heartbeat.
Patent Pending Notice #
This project is provided under an open-source license. Certain underlying systems, methods, and architectures described or implemented herein may be covered by one or more pending patent applications.
Nothing in this repository grants any license, express or implied, to any patents or patent applications, except as provided by the applicable open-source license.