fft_recorder_ui 1.0.4
fft_recorder_ui #

A beginner-friendly audio recorder + FFT bar visualizer Flutter package.
Built on top of flutter_recorder, it renders bars with plain Container heights, with no CustomPainter required.
📋 Table of Contents #
- Package Overview
- Key Features
- Installation
- Permissions Setup
- Quick Start
- Step-by-Step Tutorial
- API Documentation
- Troubleshooting Guide
- Platform-Specific Guide
- Advanced Usage
- Example App
🎯 Package Overview #
fft_recorder_ui is a Flutter package that makes it easy to implement audio recording and real-time FFT (Fast Fourier Transform) visualization in your Flutter apps.
When to Use? #
- 🎤 Voice recording app development
- 🎵 Real-time visualization for music players
- 🔊 Voice analysis tools
- 📊 Audio waveform visualization
- 🎙️ Podcast/recording apps
Highlights #
- ✅ Simple API: Implement recording and visualization with just a few lines of code
- ✅ Real-time FFT: Provides frequency data in real-time during recording
- ✅ No CustomPainter Required: Visualize bars using only Container (performance optimized)
- ✅ Platform Support: Android, iOS, and Web support
- ✅ State Management: Easy state management with ValueNotifier and Stream
✨ Key Features #
- 🎙️ Recording Control: Start, pause, resume, and stop recording
- 💾 File Saving: Save recordings as WAV files
- 📊 Real-time FFT Streaming: Real-time frequency analysis during recording
- 📈 Bar Visualization: Customizable FFT bar widget
- 🔊 Volume Measurement: Real-time decibel (dB) measurement
📦 Installation #
Step 1: Add Dependency to pubspec.yaml #
Open your project's pubspec.yaml file and add the following to the dependencies section:
dependencies:
  flutter:
    sdk: flutter
  fft_recorder_ui: ^1.0.4
  # Optional: required for file saving
  path_provider: ^2.1.5
Step 2: Install Packages #
Run the following command in your terminal:
flutter pub get
Step 3: Version Compatibility #
- Flutter SDK: >=3.10.0
- Dart SDK: >=3.0.0 <4.0.0
🔐 Permissions Setup #
Mobile apps require platform-specific permission settings to use the microphone.
Android Permissions #
Open android/app/src/main/AndroidManifest.xml and add the following permission inside the <manifest> tag:
<manifest xmlns:android="http://schemas.android.com/apk/res/android">
    <!-- Add microphone permission -->
    <uses-permission android:name="android.permission.RECORD_AUDIO" />
    <application
        android:label="your_app_name"
        android:name="${applicationName}"
        android:icon="@mipmap/ic_launcher">
        <!-- ... rest of configuration ... -->
    </application>
</manifest>
File Location: android/app/src/main/AndroidManifest.xml
iOS Permissions #
Open ios/Runner/Info.plist and add the following key:
<key>NSMicrophoneUsageDescription</key>
<string>Microphone access is required for recording.</string>
File Location: ios/Runner/Info.plist
Complete Example:
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <!-- ... existing configuration ... -->
    <key>NSMicrophoneUsageDescription</key>
    <string>Microphone access is required for recording.</string>
    <!-- ... rest of configuration ... -->
</dict>
</plist>
Web Platform #
On web, the browser automatically requests permissions. No additional setup is required.
🚀 Quick Start #
Basic Example (StatefulWidget) #
The simplest usage example:
import 'package:flutter/material.dart';
import 'package:fft_recorder_ui/fft_recorder_ui.dart';

class SimpleRecorderPage extends StatefulWidget {
  const SimpleRecorderPage({super.key});

  @override
  State<SimpleRecorderPage> createState() => _SimpleRecorderPageState();
}

class _SimpleRecorderPageState extends State<SimpleRecorderPage> {
  late final FftRecorderController _controller;
  List<double> _fftData = [];

  @override
  void initState() {
    super.initState();
    _controller = FftRecorderController();
    // Request permission
    _controller.requestMicPermission();
    // Subscribe to FFT data stream
    _controller.fftStream.listen((data) {
      if (mounted) {
        setState(() {
          _fftData = data;
        });
      }
    });
  }

  @override
  void dispose() {
    _controller.dispose();
    super.dispose();
  }

  Future<void> _startRecording() async {
    await _controller.startRecording(filePath: '/path/to/recording.wav');
  }

  void _stopRecording() {
    _controller.stopRecording();
  }

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      appBar: AppBar(title: const Text('Simple Recorder')),
      body: Column(
        children: [
          // Recording control buttons
          Row(
            mainAxisAlignment: MainAxisAlignment.center,
            children: [
              ElevatedButton(
                onPressed: _startRecording,
                child: const Text('Start Recording'),
              ),
              const SizedBox(width: 16),
              ElevatedButton(
                onPressed: _stopRecording,
                child: const Text('Stop Recording'),
              ),
            ],
          ),
          const SizedBox(height: 32),
          // FFT bar visualization
          BarVisualizer(
            data: _fftData,
            barCount: 32,
            barColor: Colors.blue,
            barWidth: 4,
            maxHeight: 100,
            spacing: 4,
            emptyText: 'Waiting for FFT data...',
          ),
        ],
      ),
    );
  }
}
GetX Pattern Example #
When using GetX:
import 'package:flutter/material.dart';
import 'package:get/get.dart';
import 'package:fft_recorder_ui/fft_recorder_ui.dart';

// Controller
class RecorderController extends GetxController {
  late final FftRecorderController _fftController;
  final RxList<double> fftData = <double>[].obs;
  final Rx<RecordingStatus> recordingStatus = RecordingStatus.idle.obs;

  @override
  void onInit() {
    super.onInit();
    _fftController = FftRecorderController();
    _fftController.requestMicPermission();
    // Subscribe to FFT stream
    _fftController.fftStream.listen((data) {
      fftData.value = data;
    });
    // Subscribe to recording status
    _fftController.recordingStatus.addListener(() {
      recordingStatus.value = _fftController.recordingStatus.value;
    });
  }

  Future<void> startRecording() async {
    await _fftController.startRecording(filePath: '/path/to/recording.wav');
  }

  void stopRecording() {
    _fftController.stopRecording();
  }

  @override
  void onClose() {
    _fftController.dispose();
    super.onClose();
  }
}

// View
class RecorderView extends GetView<RecorderController> {
  const RecorderView({super.key});

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      appBar: AppBar(title: const Text('GetX Recorder')),
      body: Column(
        children: [
          Obx(() => Row(
                mainAxisAlignment: MainAxisAlignment.center,
                children: [
                  ElevatedButton(
                    onPressed: controller.recordingStatus.value ==
                            RecordingStatus.recording
                        ? null
                        : controller.startRecording,
                    child: const Text('Start Recording'),
                  ),
                  const SizedBox(width: 16),
                  ElevatedButton(
                    onPressed: controller.recordingStatus.value ==
                            RecordingStatus.idle
                        ? null
                        : controller.stopRecording,
                    child: const Text('Stop Recording'),
                  ),
                ],
              )),
          const SizedBox(height: 32),
          Obx(() => BarVisualizer(
                data: controller.fftData,
                barCount: 32,
                barColor: Colors.green,
                barWidth: 4,
                maxHeight: 100,
                spacing: 4,
                emptyText: 'Waiting for FFT data...',
              )),
        ],
      ),
    );
  }
}
Complete Example (with File Saving and Playback) #
Complete example with file saving and playback functionality:
import 'dart:async';
import 'package:flutter/material.dart';
import 'package:fft_recorder_ui/fft_recorder_ui.dart';
import 'package:path_provider/path_provider.dart';
import 'package:audioplayers/audioplayers.dart';

class CompleteRecorderPage extends StatefulWidget {
  const CompleteRecorderPage({super.key});

  @override
  State<CompleteRecorderPage> createState() => _CompleteRecorderPageState();
}

class _CompleteRecorderPageState extends State<CompleteRecorderPage> {
  late final FftRecorderController _controller;
  late final AudioPlayer _player;
  StreamSubscription<List<double>>? _fftSubscription;
  List<double> _fftData = [];
  String? _savedFilePath;

  @override
  void initState() {
    super.initState();
    _controller = FftRecorderController();
    _player = AudioPlayer();
    // Request permission
    _controller.requestMicPermission();
    // Subscribe to FFT stream
    _fftSubscription = _controller.fftStream.listen((data) {
      if (mounted) {
        setState(() {
          _fftData = data;
        });
      }
    });
  }

  @override
  void dispose() {
    _fftSubscription?.cancel();
    _player.dispose();
    _controller.dispose();
    super.dispose();
  }

  Future<void> _startRecording() async {
    // Get app documents directory
    final dir = await getApplicationDocumentsDirectory();
    final timestamp = DateTime.now().millisecondsSinceEpoch;
    final filePath = '${dir.path}/recording_$timestamp.wav';
    await _controller.startRecording(filePath: filePath);
    setState(() {
      _savedFilePath = filePath;
    });
  }

  void _pauseRecording() {
    _controller.pauseRecording();
  }

  void _resumeRecording() {
    _controller.resumeRecording();
  }

  void _stopRecording() {
    final path = _controller.stopRecording();
    if (path != null) {
      setState(() {
        _savedFilePath = path;
      });
    }
  }

  Future<void> _playRecording() async {
    if (_savedFilePath != null) {
      await _player.stop();
      await _player.play(DeviceFileSource(_savedFilePath!));
    }
  }

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      appBar: AppBar(title: const Text('Complete Recorder')),
      body: Padding(
        padding: const EdgeInsets.all(16),
        child: Column(
          children: [
            // Recording status display
            ValueListenableBuilder<RecordingStatus>(
              valueListenable: _controller.recordingStatus,
              builder: (context, status, _) {
                String statusText;
                Color statusColor;
                switch (status) {
                  case RecordingStatus.idle:
                    statusText = 'Idle';
                    statusColor = Colors.grey;
                    break;
                  case RecordingStatus.recording:
                    statusText = 'Recording...';
                    statusColor = Colors.red;
                    break;
                  case RecordingStatus.paused:
                    statusText = 'Paused';
                    statusColor = Colors.orange;
                    break;
                }
                return Container(
                  padding: const EdgeInsets.all(12),
                  decoration: BoxDecoration(
                    color: statusColor.withOpacity(0.1),
                    borderRadius: BorderRadius.circular(8),
                  ),
                  child: Row(
                    mainAxisAlignment: MainAxisAlignment.center,
                    children: [
                      Icon(Icons.fiber_manual_record, color: statusColor),
                      const SizedBox(width: 8),
                      Text(
                        statusText,
                        style: TextStyle(
                          fontSize: 18,
                          fontWeight: FontWeight.bold,
                          color: statusColor,
                        ),
                      ),
                    ],
                  ),
                );
              },
            ),
            const SizedBox(height: 24),
            // Recording control buttons
            Wrap(
              spacing: 8,
              runSpacing: 8,
              children: [
                ElevatedButton.icon(
                  onPressed: _controller.recordingStatus.value ==
                          RecordingStatus.recording
                      ? null
                      : _startRecording,
                  icon: const Icon(Icons.play_arrow),
                  label: const Text('Start'),
                ),
                ElevatedButton.icon(
                  onPressed: _controller.recordingStatus.value ==
                          RecordingStatus.recording
                      ? _pauseRecording
                      : null,
                  icon: const Icon(Icons.pause),
                  label: const Text('Pause'),
                ),
                ElevatedButton.icon(
                  onPressed: _controller.recordingStatus.value ==
                          RecordingStatus.paused
                      ? _resumeRecording
                      : null,
                  icon: const Icon(Icons.play_arrow),
                  label: const Text('Resume'),
                ),
                ElevatedButton.icon(
                  onPressed: _controller.recordingStatus.value ==
                          RecordingStatus.idle
                      ? null
                      : _stopRecording,
                  icon: const Icon(Icons.stop),
                  label: const Text('Stop'),
                ),
                ElevatedButton.icon(
                  onPressed: _savedFilePath == null ? null : _playRecording,
                  icon: const Icon(Icons.volume_up),
                  label: const Text('Play'),
                ),
              ],
            ),
            const SizedBox(height: 24),
            // Saved file path display
            if (_savedFilePath != null)
              Container(
                padding: const EdgeInsets.all(12),
                decoration: BoxDecoration(
                  color: Colors.grey[200],
                  borderRadius: BorderRadius.circular(8),
                ),
                child: Column(
                  crossAxisAlignment: CrossAxisAlignment.start,
                  children: [
                    const Text(
                      'Saved File:',
                      style: TextStyle(fontWeight: FontWeight.bold),
                    ),
                    const SizedBox(height: 4),
                    Text(
                      _savedFilePath!,
                      style: const TextStyle(fontSize: 12),
                    ),
                  ],
                ),
              ),
            const SizedBox(height: 24),
            // FFT bar visualization
            Container(
              width: double.infinity,
              height: 200,
              padding: const EdgeInsets.all(16),
              decoration: BoxDecoration(
                color: Colors.grey[900],
                borderRadius: BorderRadius.circular(12),
              ),
              child: BarVisualizer(
                data: _fftData,
                barCount: 64,
                barColor: Colors.cyan,
                barWidth: 4,
                maxHeight: 150,
                spacing: 4,
                emptyText: 'Waiting for FFT data...',
              ),
            ),
            const SizedBox(height: 16),
            // Volume display
            ValueListenableBuilder<double>(
              valueListenable: _controller.volumeDb,
              builder: (context, volume, _) {
                return Column(
                  children: [
                    Text('Volume: ${volume.toStringAsFixed(1)} dB'),
                    const SizedBox(height: 8),
                    LinearProgressIndicator(
                      value: (volume + 60) / 60, // Map -60 dB..0 dB to 0..1
                      minHeight: 8,
                    ),
                  ],
                );
              },
            ),
          ],
        ),
      ),
    );
  }
}
📚 Step-by-Step Tutorial #
Step 1: Project Setup and Package Installation #
- Create a Flutter project (or use an existing one)
- Add the fft_recorder_ui dependency to pubspec.yaml
- Run flutter pub get in your terminal
flutter pub get
Step 2: Permissions Setup #
Android
Add microphone permission to android/app/src/main/AndroidManifest.xml:
<uses-permission android:name="android.permission.RECORD_AUDIO" />
iOS
Add microphone usage description to ios/Runner/Info.plist:
<key>NSMicrophoneUsageDescription</key>
<string>Microphone access is required for recording.</string>
Step 3: Controller Creation and Initialization #
class _MyRecorderPageState extends State<MyRecorderPage> {
  late final FftRecorderController _controller;
  List<double> _fftData = [];

  @override
  void initState() {
    super.initState();
    _controller = FftRecorderController();
    // Request permission
    _controller.requestMicPermission();
    // Subscribe to FFT data stream
    _controller.fftStream.listen((data) {
      setState(() {
        _fftData = data;
      });
    });
  }

  @override
  void dispose() {
    _controller.dispose();
    super.dispose();
  }
}
Step 4: UI Setup (BarVisualizer Widget) #
BarVisualizer(
  data: _fftData,        // FFT data list (0.0 ~ 1.0)
  barCount: 32,          // Number of bars
  barColor: Colors.blue, // Bar color
  barWidth: 4,           // Bar width
  maxHeight: 100,        // Maximum height
  spacing: 4,            // Spacing between bars
  emptyText: 'Waiting for FFT data...', // Text shown when data is empty
)
Step 5: Recording Functionality Implementation #
// Start recording
Future<void> _startRecording() async {
  await _controller.startRecording(filePath: '/path/to/file.wav');
}

// Pause
void _pauseRecording() {
  _controller.pauseRecording();
}

// Resume
void _resumeRecording() {
  _controller.resumeRecording();
}

// Stop
void _stopRecording() {
  final savedPath = _controller.stopRecording();
  print('Saved path: $savedPath');
}
📖 API Documentation #
FftRecorderController #
The main controller for recording and FFT streaming.
Constructor
FftRecorderController({
  this.sampleRate = 22050,               // Sampling rate (Hz)
  this.channels = RecorderChannels.mono, // Channels (mono/stereo)
  this.format = PCMFormat.f32le,         // PCM format
})
Parameters:
- sampleRate (int): Audio sampling rate. Default: 22050 Hz
- channels (RecorderChannels): Audio channels. RecorderChannels.mono or RecorderChannels.stereo
- format (PCMFormat): PCM format. FFT visualization requires PCMFormat.f32le
State and Data (ValueNotifier)
All states are provided as ValueNotifier and can be subscribed using ValueListenableBuilder or Obx.
recordingStatus (ValueNotifier&lt;RecordingStatus&gt;)
Represents the recording status.
Values:
- RecordingStatus.idle: Idle
- RecordingStatus.recording: Recording
- RecordingStatus.paused: Paused
Usage Example:
ValueListenableBuilder<RecordingStatus>(
  valueListenable: controller.recordingStatus,
  builder: (context, status, _) {
    return Text('Status: $status');
  },
)
isStreaming (ValueNotifier&lt;bool&gt;)
Whether FFT streaming is active.
Usage Example:
ValueListenableBuilder<bool>(
  valueListenable: controller.isStreaming,
  builder: (context, isStreaming, _) {
    return Icon(isStreaming ? Icons.wifi : Icons.wifi_off);
  },
)
fftData (ValueNotifier&lt;List&lt;double&gt;&gt;)
Current FFT data list. Each value is a normalized value between 0.0 and 1.0.
Usage Example:
ValueListenableBuilder<List<double>>(
  valueListenable: controller.fftData,
  builder: (context, data, _) {
    return BarVisualizer(data: data);
  },
)
volumeDb (ValueNotifier&lt;double&gt;)
Current volume in decibels (dB). Typically ranges from -60dB to 0dB.
Usage Example:
ValueListenableBuilder<double>(
  valueListenable: controller.volumeDb,
  builder: (context, volume, _) {
    return Text('Volume: ${volume.toStringAsFixed(1)} dB');
  },
)
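The volumeDb reading can be related back to linear amplitude with the standard 20·log10 formula. Here is a minimal pure-Dart sketch; amplitudeToDb and dbToMeter are illustrative helpers, not part of the package API, and the -60 dB floor simply matches the typical range quoted above:

```dart
import 'dart:math' as math;

/// Converts a linear amplitude (0.0–1.0) to decibels,
/// clamped to the -60 dB..0 dB range mentioned above.
double amplitudeToDb(double amplitude) {
  if (amplitude <= 0) return -60.0;
  return (20 * math.log(amplitude) / math.ln10) // 20 * log10(a)
      .clamp(-60.0, 0.0)
      .toDouble();
}

/// Maps a dB reading to a 0.0–1.0 meter value (e.g. for a progress bar).
double dbToMeter(double db) => ((db + 60) / 60).clamp(0.0, 1.0).toDouble();

void main() {
  print(amplitudeToDb(1.0)); // full scale -> 0 dB
  print(amplitudeToDb(0.1)); // -> -20 dB
  print(dbToMeter(-30.0));   // -> 0.5
}
```

The same mapping is what drives the LinearProgressIndicator in the complete example above.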
Methods
requestMicPermission()
Requests microphone permission.
Returns: Future<bool> - Returns true if permission is granted, false if denied
Usage Example:
final granted = await controller.requestMicPermission();
if (!granted) {
  // Handle permission denied
  print('Microphone permission is required');
}
Notes:
- Always returns true on web (the browser handles permissions automatically)
- Performs an actual permission request only on Android/iOS
init()
Initializes the recorder. Automatically called when startRecording() or startStreaming() is invoked, so you generally don't need to call this directly.
Returns: Future<void>
Usage Example:
await controller.init();
startRecording({String? filePath, bool startStreamingIfNeeded = true})
Starts recording. If a file path is provided, saves as a WAV file.
Parameters:
- filePath (String?, optional): File path to save to. If null, records without saving a file
- startStreamingIfNeeded (bool, default: true): Whether to start FFT streaming automatically
Returns: Future<void>
Usage Example:
// Recording only, no file saving
await controller.startRecording();

// Recording with file saving
final dir = await getApplicationDocumentsDirectory();
final filePath = '${dir.path}/recording.wav';
await controller.startRecording(filePath: filePath);
Notes:
- If filePath is null or an empty string, recording runs without saving a file
- The file path must be valid if provided
- On web, the browser manages file paths
- FFT streaming starts automatically when recording begins (default)
- If filePath is null, stopRecording() also returns null
pauseRecording()
Pauses recording. FFT streaming is also stopped.
Returns: void
Usage Example:
controller.pauseRecording();
resumeRecording()
Resumes paused recording. FFT streaming is automatically restarted.
Returns: void
Usage Example:
controller.resumeRecording();
Notes:
- Streaming is automatically restarted if it was stopped
- Calls startStreaming() asynchronously without awaiting completion (fire-and-forget)
stopRecording()
Stops recording. FFT streaming is also stopped.
Returns: String? - Saved file path. Returns null if file saving was not performed
Usage Example:
final savedPath = controller.stopRecording();
if (savedPath != null) {
  print('File saved at: $savedPath');
}
startStreaming({Duration interval = const Duration(milliseconds: 16)})
Manually starts FFT streaming. Use only when not recording.
Parameters:
- interval (Duration, default: 16 ms): FFT data update interval
Returns: Future<void>
Usage Example:
// Default update interval (16ms, ~60fps)
await controller.startStreaming();
// Custom update interval (33ms, ~30fps)
await controller.startStreaming(interval: const Duration(milliseconds: 33));
Notes:
- Streaming automatically starts during recording, so manual call is not needed
- Use when you need FFT visualization only when not recording
stopStreaming({bool force = false})
Stops FFT streaming.
Parameters:
- force (bool, default: false): Whether to force a stop even during recording
Returns: void
Usage Example:
// Normal stop (ignored if recording)
controller.stopStreaming();
// Force stop (stops even during recording)
controller.stopStreaming(force: true);
Notes:
- With force: false, streaming won't stop while recording is in progress
- Streaming stops automatically when recording stops
dispose()
Releases the controller and cleans up all resources. Must be called when the widget is disposed.
Returns: void
Usage Example:
@override
void dispose() {
  controller.dispose();
  super.dispose();
}
Stream
fftStream (Stream&lt;List&lt;double&gt;&gt;)
Real-time FFT data stream. Can be subscribed using listen().
Usage Example:
// Basic usage
controller.fftStream.listen((data) {
  setState(() {
    _fftData = data;
  });
});

// With GetX
controller.fftStream.listen((data) {
  fftData.value = data;
});
Notes:
- Cancel the subscription (cancel()) when the widget is disposed
- The stream is a broadcast stream, so multiple listeners can be registered
BarVisualizer Widget #
A widget that visualizes FFT data as bars.
Constructor
const BarVisualizer({
  required this.data,                           // Required: FFT data list
  this.barCount = 9,                            // Number of bars (default: 9)
  this.barWidth = 5.33,                         // Bar width (default: 5.33)
  this.maxHeight = 48,                          // Maximum height (default: 48)
  this.barColor = Colors.amberAccent,           // Bar color (default: amberAccent)
  this.emptyText = 'Waiting for audio data...', // Empty-data text
  this.spacing = 8,                             // Spacing between bars (default: 8)
})
Parameters
data (List&lt;double&gt;, required)
FFT data list. Each value should be a normalized value between 0.0 and 1.0.
Example:
BarVisualizer(
  data: [0.1, 0.5, 0.8, 0.3, 0.9], // 5 FFT values
)
barCount (int, default: 9)
Number of bars to display. If barCount is smaller than the length of data, the data is automatically downsampled.
Example:
BarVisualizer(
  data: fftData, // 128 FFT values
  barCount: 32,  // Display 32 bars (auto downsampling)
)
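One plausible way such downsampling can work is to average each contiguous chunk of FFT bins into one bar. The pure-Dart sketch below is illustrative only; the downsample helper is hypothetical and BarVisualizer's internal strategy may differ:

```dart
import 'dart:math' as math;

/// Reduces [data] to [barCount] values by averaging contiguous chunks.
/// If barCount >= data.length, the data is returned unchanged (as a copy).
List<double> downsample(List<double> data, int barCount) {
  if (data.isEmpty || barCount >= data.length) return List.of(data);
  final chunk = data.length / barCount; // bins per bar (may be fractional)
  return List.generate(barCount, (i) {
    final start = (i * chunk).floor();
    final end =
        math.min(data.length, math.max(start + 1, ((i + 1) * chunk).ceil()));
    final slice = data.sublist(start, end);
    return slice.reduce((a, b) => a + b) / slice.length; // chunk average
  });
}

void main() {
  final bins = List<double>.generate(128, (i) => i / 127); // fake FFT frame
  final bars = downsample(bins, 32);
  print(bars.length); // 32 bars from 128 bins
}
```

Averaging (rather than picking every Nth bin) keeps narrow spectral peaks from flickering in and out between frames.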
barWidth (double, default: 5.33)
Width of each bar.
Example:
BarVisualizer(
  barWidth: 4, // 4 pixels wide
)
maxHeight (double, default: 48)
Maximum height of bars. Each bar's height is calculated proportionally to data values.
Example:
BarVisualizer(
  maxHeight: 100, // Maximum height of 100 pixels
)
barColor (Color, default: Colors.amberAccent)
Color of bars.
Example:
BarVisualizer(
  barColor: Colors.blue, // Material color
)

BarVisualizer(
  barColor: const Color(0xFF00FF00), // Custom ARGB color
)
spacing (double, default: 8)
Spacing between bars.
Example:
BarVisualizer(
  spacing: 4, // 4 pixels between bars
)
emptyText (String, default: 'Waiting for audio data...')
Text to display when data is empty.
Example:
BarVisualizer(
  emptyText: 'Waiting for FFT data...',
)
Usage Examples
// Basic usage
BarVisualizer(
  data: controller.fftData.value,
)

// Customized
BarVisualizer(
  data: controller.fftData.value,
  barCount: 64,
  barColor: Colors.cyan,
  barWidth: 4,
  maxHeight: 150,
  spacing: 4,
  emptyText: 'Waiting for FFT data...',
)

// With ValueListenableBuilder
ValueListenableBuilder<List<double>>(
  valueListenable: controller.fftData,
  builder: (context, data, _) {
    return BarVisualizer(
      data: data,
      barCount: 32,
    );
  },
)
🔧 Troubleshooting Guide #
Permission Issues #
When Permission is Denied
Symptoms: requestMicPermission() returns false or recording doesn't start
Solutions:
- Android: Check that the permission is added in AndroidManifest.xml
- iOS: Check that NSMicrophoneUsageDescription is added in Info.plist
- Completely close and restart the app
- Check app permissions in device settings and manually allow
Code Example:
final granted = await controller.requestMicPermission();
if (!granted) {
  // Notify user that permission is needed
  showDialog(
    context: context,
    builder: (context) => AlertDialog(
      title: const Text('Microphone Permission Required'),
      content: const Text(
          'Microphone permission is required for recording. Please allow it in settings.'),
      actions: [
        TextButton(
          onPressed: () => Navigator.pop(context),
          child: const Text('OK'),
        ),
      ],
    ),
  );
}
Recording Doesn't Start #
Checklist:
- ✅ Check that permission is granted (the requestMicPermission() result)
- ✅ Check that init() or startRecording() has been called
- ✅ Check that the file path is valid (when saving a file)
- ✅ Check if another app is using the microphone
- ✅ Check if device microphone is working
Debugging Code:
Future<void> _startRecording() async {
  // Check permission
  final hasPermission = await controller.requestMicPermission();
  print('Permission status: $hasPermission');
  if (!hasPermission) {
    print('Permission not granted');
    return;
  }

  // Start recording
  try {
    await controller.startRecording(filePath: '/path/to/file.wav');
    print('Recording started successfully');
  } catch (e) {
    print('Recording start failed: $e');
  }
}
FFT Data Not Displaying #
Symptoms: BarVisualizer shows no data or always shows empty text
Solutions:
- Check if stream subscription is properly set up
- Check that recording has started or startStreaming() has been called (FFT data is generated only while streaming is active)
- Check that fftData.value is not empty
- Check that you subscribe via ValueListenableBuilder or Obx
- Check that isStreaming.value is true
Debugging Code:
// Check FFT stream subscription
controller.fftStream.listen((data) {
  print('FFT data received: ${data.length} items, '
      'first value: ${data.isNotEmpty ? data[0] : 'none'}');
  setState(() {
    _fftData = data;
  });
});

// Check the ValueNotifier directly
controller.fftData.addListener(() {
  print('fftData changed: ${controller.fftData.value.length} items');
});
Correct Usage Example:
// ✅ Correct: use ValueListenableBuilder
ValueListenableBuilder<List<double>>(
  valueListenable: controller.fftData,
  builder: (context, data, _) {
    return BarVisualizer(data: data);
  },
)

// ❌ Incorrect: reads the value only once
BarVisualizer(data: controller.fftData.value) // Won't update!
File Save Path Issues #
Symptoms: File not saved or path not found
Solutions:
- Use the path_provider package to get a correct path
- Check that the directory in the file path exists
- On web, browser manages file paths, so handle differently
Correct File Path Example:
import 'package:path_provider/path_provider.dart';

Future<String> getRecordingPath() async {
  final dir = await getApplicationDocumentsDirectory();
  final timestamp = DateTime.now().millisecondsSinceEpoch;
  return '${dir.path}/recording_$timestamp.wav';
}

// Usage
final filePath = await getRecordingPath();
await controller.startRecording(filePath: filePath);
Platform-Specific Differences #
Android
- If permission is denied, the user must be directed to the system settings screen
- Runtime permission request required on Android 6.0+
iOS
- The app may be rejected from review if the permission description (NSMicrophoneUsageDescription) is missing
- The microphone may not work in the simulator
Web
- Browser automatically requests permission
- File paths are managed by browser
- Microphone access may be unavailable on some browsers without HTTPS
🌐 Platform-Specific Guide #
Mobile (Android/iOS) #
Common
- Microphone permission setup required
- File system access available (use path_provider)
- Real-time FFT streaming supported
Android
- Permission: RECORD_AUDIO required
- File Path: getApplicationDocumentsDirectory() recommended
- Notes: Runtime permission request required on Android 6.0+
iOS
- Permission: NSMicrophoneUsageDescription required
- File Path: getApplicationDocumentsDirectory() recommended
- Notes: The microphone may not work in the simulator
Web #
Features
- Browser automatically requests permission
- File paths are managed by browser
- HTTPS required (some browsers)
Limitations
- Cannot directly specify file save path
- Behavior may differ by browser
- Microphone access unavailable without HTTPS
Web Usage Example
import 'package:flutter/foundation.dart';

// On web, omit filePath (or pass null)
if (kIsWeb) {
  await controller.startRecording(); // Recording only, no file saving
} else {
  final dir = await getApplicationDocumentsDirectory();
  await controller.startRecording(filePath: '${dir.path}/recording.wav');
}
🚀 Advanced Usage #
Adjusting FFT Update Interval #
Default update interval is 16ms (~60fps). Can be adjusted for performance or battery.
// Faster updates (8ms, ~120fps) - increased battery consumption
await controller.startStreaming(interval: const Duration(milliseconds: 8));
// Slower updates (33ms, ~30fps) - battery saving
await controller.startStreaming(interval: const Duration(milliseconds: 33));
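The frame rates quoted in the comments follow directly from fps = 1000 / intervalMs. A tiny helper makes the trade-off explicit (fpsForInterval is an illustrative helper, not part of the package API):

```dart
/// Converts an FFT update interval into the resulting frame rate.
double fpsForInterval(Duration interval) =>
    Duration.millisecondsPerSecond / interval.inMilliseconds;

void main() {
  print(fpsForInterval(const Duration(milliseconds: 16))); // 62.5 (~60 fps)
  print(fpsForInterval(const Duration(milliseconds: 33))); // ~30 fps
}
```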
Custom Sampling Rate #
Default sampling rate is 22050 Hz. Can be changed as needed.
// High quality recording (44100 Hz)
final controller = FftRecorderController(sampleRate: 44100);
// Low quality recording (16000 Hz) - save file size
final controller = FftRecorderController(sampleRate: 16000);
Note: Higher sampling rates increase file size and CPU usage.
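The size impact is easy to estimate: bytes ≈ sampleRate × bytesPerSample × channels × seconds, plus a 44-byte canonical WAV header. The sketch below assumes 4 bytes per sample to match the default PCMFormat.f32le; the actual on-disk sample format used when saving is not documented here, so treat these as rough estimates (wavBytes is a hypothetical helper, not package API):

```dart
/// Back-of-envelope WAV file size estimate.
/// Assumes f32le samples (4 bytes each) and a 44-byte canonical header.
int wavBytes({
  required int sampleRate,
  required int seconds,
  int channels = 1,
  int bytesPerSample = 4, // PCMFormat.f32le assumption
}) =>
    44 + sampleRate * bytesPerSample * channels * seconds;

void main() {
  // One minute of mono audio at each sampling rate:
  print(wavBytes(sampleRate: 22050, seconds: 60)); // ~5.3 MB (default)
  print(wavBytes(sampleRate: 44100, seconds: 60)); // ~10.6 MB (double)
  print(wavBytes(sampleRate: 16000, seconds: 60)); // ~3.8 MB
}
```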
Stereo Recording #
Default is mono channel. To record in stereo:
final controller = FftRecorderController(
  channels: RecorderChannels.stereo,
);
Memory Management Tips #
- Cancel Stream Subscription: Always cancel subscription when widget is disposed
- Controller Dispose: Dispose controller when widget is disposed
- Prevent Unnecessary Updates: Use a mounted check to avoid updating disposed widgets
StreamSubscription<List<double>>? _subscription;

@override
void initState() {
  super.initState();
  _subscription = controller.fftStream.listen((data) {
    if (mounted) { // ✅ mounted check required!
      setState(() {
        _fftData = data;
      });
    }
  });
}

@override
void dispose() {
  _subscription?.cancel(); // ✅ Cancel the subscription!
  controller.dispose();    // ✅ Dispose the controller!
  super.dispose();
}
Battery Optimization #
- Increase FFT Update Interval: 16ms → 33ms or 50ms
- Stream Only When Needed: Stop streaming when not recording
- Use Lower Sampling Rate: 22050 Hz → 16000 Hz
// Battery saving settings
final controller = FftRecorderController(
  sampleRate: 16000, // Lower sampling rate
);

// Slower update interval
await controller.startStreaming(
  interval: const Duration(milliseconds: 50), // 20 fps
);
Architecture Patterns #
Controller Pattern
FftRecorderController uses ValueNotifier for state management, which fits well with Flutter's reactive programming pattern.
Advantages:
- Automatic UI updates on state changes
- Multiple widgets can subscribe to same state
- Easy to test
Usage Pattern:
// With ValueListenableBuilder
ValueListenableBuilder<RecordingStatus>(
  valueListenable: controller.recordingStatus,
  builder: (context, status, _) {
    return Text('Status: $status');
  },
)

// With GetX Obx
Obx(() => Text('Status: ${controller.recordingStatus.value}'))
Stream Pattern
FFT data is provided as a Stream for asynchronous processing.
Advantages:
- Real-time data processing
- Multiple listeners can be registered
- Easy asynchronous processing
Usage Pattern:
// Basic usage
controller.fftStream.listen((data) {
  // Process data
});

// Register multiple listeners
controller.fftStream.listen((data) {
  // UI update
});
controller.fftStream.listen((data) {
  // Analysis logic
});
📱 Example App #
This package includes a complete example app.
Running the Example App #
- Clone or download the project
- Navigate to the example folder
- Run flutter pub get
- Run flutter run
cd example
flutter pub get
flutter run
Example App Features #
- ✅ Start/pause/resume/stop recording
- ✅ Real-time FFT bar visualization
- ✅ Play recorded files
- ✅ Display file path
- ✅ Volume display
Example Code Location #
- Example App: example/lib/main.dart
- Test Code: test/fft_recorder_ui_test.dart
📝 Notes #
- FFT visualization works only with the PCMFormat.f32le format
- A valid file path is required for file saving
- On web, browser manages file paths
- FFT streaming automatically starts during recording
- Must dispose Controller when widget is disposed
🔗 Related Packages #
- flutter_recorder: Audio recording engine
- permission_handler: Permission management
- path_provider: File path management (used in examples)
📄 License #
This package is distributed under the MIT license.
🤝 Contributing #
For bug reports or feature suggestions, please contact the maintainer:
Email: dlgodlf11 [at] naeileun [dot] dev
Happy Coding! 🎉