facetagr 0.0.18

At FaceTagr, we are pioneers in advanced face recognition technology, delivering solutions that are accurate, reliable, and scalable. Our NIST-tested algorithms, with 99.91% accuracy, ensure that our technology meets the highest global standards for identity verification and security.

FaceTagr Flutter Package #

The FaceTagr Flutter package allows third-party teams to integrate face recognition capabilities into their applications. This package provides two primary functions: initialization (init) and face matching (fnFaceMatch).

📦 Installation #

> ❗ **Platform Support:** Android ✔ | iOS ✔ | Web ❌

Add the SDK to your pubspec.yaml:

dependencies:
  facetagr: ^0.0.18
  camera: ^0.10.5+9
  wakelock_plus: 1.3.3
  uuid: ^4.5.1
  

Add the FaceTagr Tools CLI as a dev-only dependency:

dev_dependencies:
  facetagr_tools: ^0.0.18

Install:


flutter pub get

⚙️ Setup (FaceTagr Tools CLI) #

Before running the app, initialize your FaceTagr environment:


dart run facetagr_tools:facetagr_init --clientID <clientID> --clientKey <clientKey> --apiURL <apiURL> --path <path>

Required Parameters #

--clientID Your FaceTagr client ID

--clientKey Your FaceTagr client key (used to generate secure hash)

--apiURL Your FaceTagr backend API base URL

--path Local Pub cache folder path where models will be installed

📁 Path Examples #

FaceTagr Tools requires the pub cache directory as --path.

Windows:

C:\Users\<USERNAME>\AppData\Local\Pub\Cache\hosted\pub.dev

Mac/Linux:

/Users/<USERNAME>/.pub-cache/hosted/pub.dev

📱 Required Permissions for FaceTagr (Android & iOS) #

The FaceTagr Flutter package requires certain platform permissions to access the device camera and perform secure face recognition.

Include the following configurations in AndroidManifest.xml and iOS Info.plist.

🟩 Android — Required Permissions #

Add the following inside:

android/app/src/main/AndroidManifest.xml

(Place the uses-permission entries directly under the manifest tag, outside the application tag.)

✅ Required AndroidManifest.xml Section for FaceTagr


<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="com.example.yourapp">

    <!-- Permissions -->
    <uses-permission android:name="android.permission.CAMERA" />
    <uses-permission android:name="android.permission.INTERNET" />
    <uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
    <uses-permission android:name="android.permission.WAKE_LOCK" />

    <!-- Required Hardware -->
    <uses-feature android:name="android.hardware.camera" android:required="false" />
    <uses-feature android:name="android.hardware.camera.front" android:required="false" />
    <uses-feature android:name="android.hardware.camera.autofocus" android:required="false" />

    <application
        android:label="yourapp"
        android:icon="@mipmap/ic_launcher"
        android:requestLegacyExternalStorage="true"
        android:usesCleartextTraffic="true">

        <!-- (Other Flutter auto-generated metadata remains unchanged) -->

    </application>

</manifest>

🟦 iOS — Required Permissions #

Add the following inside:

ios/Runner/Info.plist (Place the keys inside the main dict element.)

✅ Required iOS Info.plist Section for FaceTagr

<key>NSCameraUsageDescription</key>
<string>This app requires camera access to use the FaceTagr SDK.</string>
<key>NSMicrophoneUsageDescription</key>
<string>This app may require microphone access to support features in the FaceTagr SDK.</string>

📥 Import Package #

import 'dart:async';
import 'dart:convert';
import 'package:facetagr/facetagr.dart';
import 'package:camera/camera.dart';
import 'package:crypto/crypto.dart';
import 'package:uuid/uuid.dart';

🛠️ Initialization #

FaceTagr provides two ways to initialize the SDK. Choose the one that fits your app flow.

Option 1 — Await-based Initialization (initializeAndAwait) #

Use this when you want a clean async call and then navigate.

class _HomePageState extends State<HomePage> {
  bool _isProcessing = false;

  @override
  void initState() {
    super.initState();
  }

  @override
  void dispose() {
    super.dispose();
  }

  String fn_get_hash(String clientID, String utctime, String requestID, String clientKey) {
    String input = clientID + utctime + requestID + clientKey;
    var bytes = utf8.encode(input);
    var hash = sha512.convert(bytes);
    return hash.toString();
  }
  void _facetagr_initialize() {
    String clientKey = "clientKey";
    String apiURL = "https://yourapiurl.com";
    String clientID = "yourClientID";
    String externalID = "yourExternalID";
    String requestID = const Uuid().v4();
    String utcTime = DateTime.now().toUtc().toString();
    String hashcode = fn_get_hash(clientID, utcTime, requestID, clientKey);
    Facetagr.initializeAndAwait(
      apiURL: apiURL,
      clientID: clientID,
      externalID: externalID,
      hashcode: hashcode,
      utcTime: utcTime,
      requestID: requestID,

      // Optional – enables user face registration flow  
      // Default value is "false".
      allowUserRegistration: false,
    ).then((message) {
      if (!mounted) return;
      if (message['StatusCode'] == "1001") {
        setState(() => _isProcessing = true);
        Navigator.of(context).push(
          MaterialPageRoute(builder: (_) => const FaceTagrLivePreview()),
        ).then((_) => mounted ? setState(() => _isProcessing = false) : null);
      } else {
        ScaffoldMessenger.of(context).showSnackBar(
          SnackBar(content: Text(message['StatusMessage'])),
        );
        setState(() => _isProcessing = false);
      }
    });
  }

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      appBar: AppBar(title: const Text('FaceTagr')),
      body: Center(
        child: Padding(
          padding: const EdgeInsets.all(20),
          child: Column(
            crossAxisAlignment: CrossAxisAlignment.center,
            mainAxisAlignment: MainAxisAlignment.center,

            children: [
              ElevatedButton.icon(
                icon: const Icon(Icons.face),
                label: const Text('Start'),
                onPressed: () {
                  _facetagr_initialize();
                },
              ),
              const SizedBox(height: 30),
              if (_isProcessing) const CircularProgressIndicator(),
            ],
          ),
        ),
      ),
    );
  }
}

✔ Direct result handling (no streams)

Option 2 — Event-based initialization (Stream Listener) #

Use when your app listens for init events globally.

class _HomePageState extends State<HomePage> {
  StreamSubscription<String>? _initSub;
  final Facetagr _faceTagr = Facetagr();
  bool _isProcessing = false;

  @override
  void initState() {
    super.initState();
    _listenToBroadcast(); // subscribe before init so no event is missed
    _facetagr_initialize();
  }

  void _listenToBroadcast() {
    _initSub = Facetagr.initStream.listen((message) {
      final decoded = jsonDecode(message);
      if (!mounted) return;
      // StatusCode "1001" means success; anything else is an error.
      // Either way, surface the StatusMessage to the user.
      ScaffoldMessenger.of(context).showSnackBar(
        SnackBar(content: Text(decoded['StatusMessage'])),
      );
      setState(() => _isProcessing = false);
    });
  }

  void _facetagr_initialize() {
    String clientKey = "clientKey";
    String apiURL = "https://yourapiurl.com";
    String clientID = "yourClientID";
    String externalID = "yourExternalID";
    String requestID = const Uuid().v4();
    String utcTime = DateTime.now().toUtc().toString();
    // Generate the request hash (see the Hash Logic section below).
    String hashcode = fn_get_hash(clientID, utcTime, requestID, clientKey);
    _faceTagr.init(apiURL, clientID, externalID, hashcode, utcTime, requestID);
  }

  @override
  void dispose() {
    _initSub?.cancel();
    super.dispose();
  }
}

✔ Automatically receives events

✔ Matches native FaceTagr behavior

🔑 Hash Logic #

FaceTagr uses a SHA-512 hash for request signing.


import 'dart:convert';
import 'package:crypto/crypto.dart';

String fn_get_hash(String clientID, String utcTime, String requestID, String clientKey) {
  String input = clientID + utcTime + requestID + clientKey;
  var bytes = utf8.encode(input);
  var hash = sha512.convert(bytes);
  return hash.toString();
}

Example Flow

String requestID = const Uuid().v4();
String utcTime   = DateTime.now().toUtc().toString();
String hash      = fn_get_hash(clientID, utcTime, requestID, clientKey);

_faceTagr.init(apiURL, clientID, externalID, hash, utcTime, requestID);

🔒 Best practice: Generate the hash server-side (so the clientKey never sits inside the app).
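As a sketch of that practice, the same hash can be produced on your backend. The example below is in Python and the function name mirrors the Dart helper for clarity; only the concatenation order (clientID + utcTime + requestID + clientKey) and the SHA-512 hex encoding are fixed by the SDK, everything else here is up to your server implementation.

```python
import hashlib
import uuid
from datetime import datetime, timezone

def fn_get_hash(client_id: str, utc_time: str, request_id: str, client_key: str) -> str:
    """SHA-512 over clientID + utcTime + requestID + clientKey, hex-encoded,
    matching the Dart fn_get_hash above. The timestamp string must be the
    exact string the app later passes to init()."""
    payload = (client_id + utc_time + request_id + client_key).encode("utf-8")
    return hashlib.sha512(payload).hexdigest()

# Server generates requestID and utcTime, computes the hash, and returns all
# three to the app; clientKey never leaves the server.
request_id = str(uuid.uuid4())
utc_time = datetime.now(timezone.utc).isoformat()
hash_value = fn_get_hash("yourClientID", utc_time, request_id, "server-held-clientKey")
```

The app then calls init() with the returned hash, requestID, and utcTime instead of computing the hash locally.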

🎧 Listening to Events #

• Initialization → Facetagr.initStream.listen(...)

• Face Match → Facetagr.faceMatchStream.listen(...)

Events are returned as JSON:


{
  "StatusCode": "1001",
  "StatusMessage": "Success"
}

🖼️ Live Preview Widget #

Navigator.push(
  context,
  MaterialPageRoute(builder: (_) => const FaceTagrLivePreview()),
);

The widget provides: #

• Front camera stream

• Face bounding box overlays

• Spinner while matching

• Dialogs on success/failure

🎥 FaceTagr Camera Widget (FaceTagrLivePreview.dart) #

Create a new file named FaceTagrLivePreview.dart in your Flutter app and add the widget code below. This widget provides the camera preview and integrates with the FaceTagr SDK.

import 'dart:async';
import 'dart:convert';
import 'dart:io';
import 'dart:typed_data';
import 'package:flutter/foundation.dart';
import 'package:flutter/material.dart';
import 'package:camera/camera.dart';
import 'package:facetagr/facetagr.dart';
import 'package:wakelock_plus/wakelock_plus.dart';
import 'main.dart';

const bool kDetectorMirrorsFront = true;
const bool kDetectorReturnsPreviewOrientedBoxes = true;
class FaceTagrLivePreview extends StatefulWidget {
  const FaceTagrLivePreview({Key? key}) : super(key: key);

  @override
  State<FaceTagrLivePreview> createState() => _FaceTagrLivePreviewState();
}

class _FaceTagrLivePreviewState extends State<FaceTagrLivePreview> {
  StreamSubscription<String>? _matchSub;
  CameraController? _controller;
  bool _isDetecting = false;
  Rect? _faceBox;
  String _status = "Initializing camera...";
  late bool _isFrontCamera;
  bool _showSpinner = false;
  String _deviceType = "";

  @override
  void initState() {
    super.initState();
    _deviceType = Platform.isIOS ? "ios" : "android";
    WakelockPlus.enable();
    _listenToBroadcast();
    initializeCamera();
  }

  Future<void> _listenToBroadcast() async {
    _matchSub = Facetagr.faceMatchStream.listen((message) {
      try {
        final decoded = jsonDecode(message);
        final int statusCode =  int.tryParse(decoded['StatusCode'].toString()) ?? -1;
        final String statusMessage = decoded['StatusMessage'];
        if (!mounted) return;
        if(statusCode < 5000) {
          _showPopup(statusCode, statusMessage);
        }else{
          setState(() {
            _status = statusMessage;
            _faceBox = null;
          });
        }
      } catch (_) {
        // not JSON; ignore
      }
    });
  }
  void _showPopup(int statusCode, String message) {
    showDialog(
      context: context,
      barrierDismissible: false, // prevent closing by tapping outside
      builder: (BuildContext context) {
        return AlertDialog(
          shape: RoundedRectangleBorder(
            borderRadius: BorderRadius.circular(12),
          ),
          backgroundColor: Colors.white,
          title: const Text(
            "FaceTagr",
            style: TextStyle(color: Colors.blue),
          ),
          content: Text(
            message,
            style: const TextStyle(color: Colors.lightBlue),
          ),
          actions: [
            TextButton(
              child: const Text("OK", style: TextStyle(color: Colors.blue)),
              onPressed: () {
                Navigator.of(context).pop();
                if (statusCode == 1001) {
                  Navigator.of(context).pushAndRemoveUntil(
                    MaterialPageRoute(builder: (_) => const HomePage()),
                    (route) => false,
                  );
                } else {
                  setState(() {
                    _showSpinner = false;
                  });
                  initializeCamera();
                }
              },
            ),
          ],
        );
      },
    );
  }
  Future<void> initializeCamera() async {
    try {
      final cameras = await availableCameras();
      final camera = cameras.firstWhere(
            (c) => c.lensDirection == CameraLensDirection.front,
        orElse: () => cameras.first,
      );
      _isFrontCamera = camera.lensDirection == CameraLensDirection.front;

      _controller = CameraController(
        camera,
        ResolutionPreset.medium,
        // NV21 on Android; iOS frames arrive as BGRA8888 and are converted below.
        imageFormatGroup:
            Platform.isIOS ? ImageFormatGroup.bgra8888 : ImageFormatGroup.nv21,
        enableAudio: false,
      );

      await _controller!.initialize();
      if (!mounted) return;

      int frameCount = 1;
      const int frameSkip = 5;

      await _controller!.startImageStream((CameraImage image) async {
        if (!mounted) return;
        if (_isDetecting || (frameCount++ % frameSkip != 0)) return;

        _isDetecting = true;
        try {
          final int width = image.width;
          final int height = image.height;
          Map<String, dynamic>? result;
          if (_deviceType == "android") {
            final yuv = _concatenatePlanes(image.planes);
            result = await Facetagr.detectFace(yuv, width, height, 8);
          }else if (_deviceType == "ios"){
            final yuv = _bgraToYUV420(image); // BGRA to YUV420
            result = await Facetagr.detectFace(yuv, width, height, 1);
          }

          if (!mounted) return;

          if (result is Map && result?["status"] != null) {
            final int status = result?["status"];
            final String msg = (result?["message"] ?? "").toString();
            final double left = (result?['x1'] ?? 0).toDouble();
            final double top = (result?['y1'] ?? 0).toDouble();
            final double w = (result?['width'] ?? 0).toDouble();
            final double h = (result?['height'] ?? 0).toDouble();
           
            if (status == 1001 || status == 1002) {
              setState(() {
                _showSpinner = true;
                _status = ""; // hide the status banner while matching
                _faceBox = Rect.fromLTWH(left, top, w, h);
              });
              await _controller?.stopImageStream();
            } else if (status == 1000) {
              setState(() {
                _status = msg;
                _faceBox = Rect.fromLTWH(left, top, w, h);
              });
            } else {
              setState(() {
                _faceBox = null;
                _status = msg;
              });
            }
          } else {
            setState(() {
              _status = "Error";
              _faceBox = null;
            });
          }
        } catch (e) {
          if (mounted) {
            setState(() {
              _status = "Error: $e";
              _faceBox = null;
            });
          }
        } finally {
          _isDetecting = false;
        }
      });
    } catch (e) {
      if (!mounted) return;
      setState(() => _status = "Camera error: $e");
    }
  }

  // Put this inside your State class (_FaceTagrLivePreviewState)
  Uint8List _concatenatePlanes(List<Plane> planes) {
    final WriteBuffer allBytes = WriteBuffer();
    for (Plane plane in planes) {
      allBytes.putUint8List(plane.bytes);
    }
    return allBytes.done().buffer.asUint8List();
  }

  // Converts a BGRA8888 frame (iOS) to NV21 layout: a full Y plane followed
  // by interleaved V/U samples at quarter resolution.
  Uint8List _bgraToYUV420(CameraImage image) {
    final int width = image.width;
    final int height = image.height;
    final int frameSize = width * height;
    final Uint8List yuv = Uint8List(frameSize + (frameSize ~/ 2));
    final Uint8List bgra = image.planes[0].bytes;

    int yIndex = 0;
    int uvIndex = frameSize;

    for (int j = 0; j < height; j++) {
      for (int i = 0; i < width; i++) {
        final int index = (j * width + i) * 4;

        final int b = bgra[index];
        final int g = bgra[index + 1];
        final int r = bgra[index + 2];

        final int y = ((((66 * r + 129 * g + 25 * b + 128) >> 8) + 16).clamp(0, 255)).toInt();
        final int u = ((((-38 * r - 74 * g + 112 * b + 128) >> 8) + 128).clamp(0, 255)).toInt();
        final int v = ((((112 * r - 94 * g - 18 * b + 128) >> 8) + 128).clamp(0, 255)).toInt();

        yuv[yIndex++] = y;

        if (j % 2 == 0 && i % 2 == 0) {
          yuv[uvIndex++] = v;
          yuv[uvIndex++] = u;
        }
      }
    }

    return yuv;
  }

  @override
  void dispose() {
    WakelockPlus.disable();
    _controller?.dispose();
    _matchSub?.cancel();
    super.dispose();
  }

  @override
  Widget build(BuildContext context) {
    final controller = _controller;
    if (controller == null || !controller.value.isInitialized) {
      return const Scaffold(body: Center(child: CircularProgressIndicator()));
    }

    final bool previewMirrored = _isFrontCamera;
    final Widget basePreview = CameraPreview(controller);
    
    final Widget previewWidget = previewMirrored
        ? Transform(
      alignment: Alignment.center,
      transform: Matrix4.identity()..rotateY(3.1415926535),
      child: basePreview,
    )
        : basePreview;

    final Size sensorSize = controller.value.previewSize!;
    final Size screenSize = MediaQuery.of(context).size;

    final bool isPortrait = screenSize.height > screenSize.width;
    final Size orientedSensor =
    isPortrait ? Size(sensorSize.height, sensorSize.width) : sensorSize;

    final int overlayRotation = kDetectorReturnsPreviewOrientedBoxes
        ? 0
        : controller.description.sensorOrientation;

    final Size imageSpaceSize = kDetectorReturnsPreviewOrientedBoxes
        ? orientedSensor
        : Size(sensorSize.width, sensorSize.height);
    
    final bool overlayMirror = _deviceType == "ios"
        ? true
        : previewMirrored ^ kDetectorMirrorsFront;

    return Scaffold(
      appBar: AppBar(title: const Text('FaceTagr')),
      body: Stack(
        children: [
          Positioned.fill(
            child: FittedBox(
              fit: BoxFit.cover,
              child: SizedBox(
                width: orientedSensor.width,
                height: orientedSensor.height,
                child: Stack(
                  fit: StackFit.passthrough,
                  children: [
                    previewWidget,

                    if (_faceBox != null)
                      CustomPaint(
                        size: orientedSensor,
                        painter: FaceBoxPainter(
                          faceBoxImageSpace: _faceBox!,
                          imageSize: imageSpaceSize,
                          mirrorHorizontally: overlayMirror,
                          rotationDegrees: overlayRotation,
                          label: "",
                        ),
                      ),
                  ],
                ),
              ),
            ),
          ),
          
          if (_showSpinner)
            Positioned.fill(
              child: Container(
                color: Colors.blue,
                child: const Center(
                  child: CircularProgressIndicator(),
                ),
              ),
            ),
          
          if (_status != "")
            Positioned(
              left: 16,
              right: 16,
              bottom: 16,
              child: Container(
                padding: const EdgeInsets.symmetric(horizontal: 12, vertical: 8),
                decoration: BoxDecoration(
                  color: Colors.white,
                  borderRadius: BorderRadius.circular(8),
                ),
                child: Text(_status, style: const TextStyle(color: Colors.blue)),
              ),
            ),
        ],
      ),
    );
  }
}

class FaceBoxPainter extends CustomPainter {
  final Rect faceBoxImageSpace; 
  final Size imageSize; 
  final bool mirrorHorizontally;
  final int rotationDegrees;
  final String? label;

  FaceBoxPainter({
    required this.faceBoxImageSpace,
    required this.imageSize,
    required this.mirrorHorizontally,
    required this.rotationDegrees,
    this.label,
  });

  @override
  void paint(Canvas canvas, Size size) {
    final _Rotated r = _rotateRect(faceBoxImageSpace, imageSize, rotationDegrees);
    final double sx = size.width / r.rotatedImageSize.width;
    final double sy = size.height / r.rotatedImageSize.height;

    Rect box = Rect.fromLTWH(
      r.rect.left * sx,
      r.rect.top * sy,
      r.rect.width * sx,
      r.rect.height * sy,
    );

    if (mirrorHorizontally) {
      box = Rect.fromLTWH(size.width - (box.left + box.width), box.top, box.width, box.height);
    }

    box = Rect.fromLTWH(
      box.left,
      box.top,
      box.width,
      box.height,
    );
    _drawCornerTicks(canvas, box, color: Colors.green, length: 28, thickness: 4);
    
    if ((label ?? '').isNotEmpty) {
      _drawLabel(canvas, size, box, label!);
    }
  }

  void _drawCornerTicks(Canvas canvas, Rect box,
      {required Color color, double length = 22, double thickness = 3}) {
    final Paint p = Paint()
      ..color = color
      ..strokeWidth = thickness
      ..strokeCap = StrokeCap.round;

    final tl = box.topLeft;
    final tr = box.topRight;
    final bl = box.bottomLeft;
    final br = box.bottomRight;
    
    canvas.drawLine(tl, tl + Offset(length, 0), p);
    canvas.drawLine(tl, tl + Offset(0, length), p);
    canvas.drawLine(tr, tr + Offset(-length, 0), p);
    canvas.drawLine(tr, tr + Offset(0, length), p);
    canvas.drawLine(bl, bl + Offset(length, 0), p);
    canvas.drawLine(bl, bl + Offset(0, -length), p);
    canvas.drawLine(br, br + Offset(-length, 0), p);
    canvas.drawLine(br, br + Offset(0, -length), p);
  }

  void _drawLabel(Canvas canvas, Size screenSize, Rect box, String text) {
    final TextPainter tp = TextPainter(
      text: TextSpan(
        text: text,
        style: const TextStyle(
          color: Colors.white,
          fontSize: 14,
          fontWeight: FontWeight.bold,
        ),
      ),
      textDirection: TextDirection.ltr,
    )..layout(maxWidth: screenSize.width * 0.8);
    
    Offset textOffset = Offset(box.left, box.top - tp.height - 6);
    if (textOffset.dy < 0) {
      textOffset = Offset(box.left, box.bottom + 6);
    }

    tp.paint(canvas, textOffset);
  }

  @override
  bool shouldRepaint(covariant FaceBoxPainter old) =>
      old.faceBoxImageSpace != faceBoxImageSpace ||
          old.imageSize != imageSize ||
          old.mirrorHorizontally != mirrorHorizontally ||
          old.rotationDegrees != rotationDegrees ||
          old.label != label;
}

class _Rotated {
  final Rect rect;
  final Size rotatedImageSize;
  _Rotated(this.rect, this.rotatedImageSize);
}

_Rotated _rotateRect(Rect r, Size img, int deg) {
  switch (deg % 360) {
    case 0:
      return _Rotated(r, img);
    case 90:
      return _Rotated(
        Rect.fromLTWH(
          img.height - (r.top + r.height),
          r.left,
          r.height,
          r.width,
        ),
        Size(img.height, img.width),
      );
    case 180:
      return _Rotated(
        Rect.fromLTWH(
          img.width - (r.left + r.width),
          img.height - (r.top + r.height),
          r.width,
          r.height,
        ),
        img,
      );
    case 270:
      return _Rotated(
        Rect.fromLTWH(
          r.top,
          img.width - (r.left + r.width),
          r.height,
          r.width,
        ),
        Size(img.height, img.width),
      );
    default:
      return _Rotated(r, img);
  }
}
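The coordinate mapping in _rotateRect can be sanity-checked outside Flutter. Below is a Python transcription of the same arithmetic (illustrative only, not part of the SDK), confirming that two 90-degree rotations compose to one 180-degree rotation:

```python
def rotate_rect(rect, img, deg):
    """Map rect=(left, top, w, h) in image space img=(w, h) into the space of
    the image rotated deg degrees clockwise; transcribes the Dart _rotateRect."""
    l, t, w, h = rect
    iw, ih = img
    deg %= 360
    if deg == 90:
        return (ih - (t + h), l, h, w), (ih, iw)
    if deg == 180:
        return (iw - (l + w), ih - (t + h), w, h), img
    if deg == 270:
        return (t, iw - (l + w), h, w), (ih, iw)
    return rect, img  # 0 degrees and unsupported angles pass through

# Two 90-degree rotations equal one 180-degree rotation:
r0, img0 = (10, 20, 30, 40), (640, 480)
r1, img1 = rotate_rect(r0, img0, 90)
r2, img2 = rotate_rect(r1, img1, 90)
assert (r2, img2) == rotate_rect(r0, img0, 180)
```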

📷 Open FaceTagr Camera #

Navigator.of(context).push(
  MaterialPageRoute(builder: (_) => const FaceTagrLivePreview()),
).then((_) => mounted ? setState(() => _isProcessing = false) : null);

This launches the built-in FaceTagrLivePreview widget with face recognition.

🔐 Logout #

await _faceTagr.fnLogout();

This clears local tokens and resets the session.

📡 Event Payload Reference #

✅ Initialization Events (both init and initializeAndAwait) #

These events are sent after calling:

Facetagr.init(apiURL, clientID, externalID, hashcode, utcTime, requestID);
Facetagr.initializeAndAwait(
      apiURL: apiURL,
      clientID: clientID,
      externalID: externalID,
      hashcode: hashcode,
      utcTime: utcTime,
      requestID: requestID,
    ).then((message) {
      if (message['StatusCode'] == "1001") {
        if (!mounted) return;
        setState(() => _isProcessing = true);
        Navigator.of(context).push(
          MaterialPageRoute(builder: (_) => const FaceTagrLivePreview()),
        ).then((_) => mounted ? setState(() => _isProcessing = false) : null);
      } else {
        if (!mounted) return;
        ScaffoldMessenger.of(context).showSnackBar(
          SnackBar(content: Text(message['StatusMessage'])),
        );
      }
      setState(() => _isProcessing = false);
    });

Possible Payloads

| StatusCode | StatusMessage                                                          |
|------------|------------------------------------------------------------------------|
| 1001       | Connected successfully.                                                |
| 4001       | Mandatory inputs can not be empty. Please try again with valid values. |
| 4002       | Input JSON is not valid.                                               |
| 4003       | Given ClientID is not valid.                                           |
| 4004       | Authentication failed.                                                 |
| 4005       | Given ExternalID is not valid.                                         |
| 4006       | Licensing limits exceeded.                                             |
| 4007       | Failed to connect.                                                     |
| 5001       | Oops! Something went wrong! Please try again!                          |
| 5002       | Unable to connect to the server. Please try again later.                                             |
| 5003       | Internal server error. Please try again.                               |
| 5004       | Oops! Something went wrong! Please try again!                          |

🎯 Face Match Events (faceMatchStream) #

These events are sent every time the camera detects a face and FaceTagr completes the verification.

Possible Payloads

| StatusCode | StatusMessage                                                    |
|------------|------------------------------------------------------------------|
| 1001       | Face verified successfully.                                      |
| 1002       | Face is not matching.                                            |
| 4001       | No face found.                                                   |
| 4002       | Face size is less than the minimum required size.                |
| 4003       | Face should be facing the camera straight.                       |
| 4004       | Face is blurred and/or not clear.                                |
| 4005       | Face is not live. Spoofing detected.                             |
| 4006       | Image format error.                                              |
| 4007       | Input JSON is not valid.                                         |
| 4008       | Given ClientID is not valid.                                     |
| 4009       | Authentication failed.                                           |
| 4010       | Given ExternalID is not valid.                                   |
| 5001       | Oops! Something went wrong! Please try again!                    |
| 5002       | Unable to connect to the server. Please try again later.               |
| 5003       | Face verification failed. Server error.                          |
| 5004       | Oops! Something went wrong! Please try again!                    |
| 5005       | Oops! Something went wrong! Please try again!                    |
| 5006       | Oops! Something went wrong! Please try again!                    |

🧩 Event Payload Reference (Registration Mode – allowUserRegistration = true) #

When allowUserRegistration: true is passed in initialization, the FaceTagr SDK may return the following Face Registration Events:

✅ Face Registration Events

| StatusCode | StatusMessage                                                              |
|------------|----------------------------------------------------------------------------|
| 201        | JSON format error. Given input not in the correct format.                  |
| 200        | Authentication failed.                                                     |
| 300        | This person already exists in your database: [ExternalID]                  |
| 301        | Collection name does not exist.                                            |
| 302        | DisplayName is mandatory.                                                  |
| 303        | ExternalID is mandatory.                                                   |
| 304        | ExternalID exists. ExternalID should be unique.                            |
| 310        | The captured photo is not correct. Please try again !!                     |
| 401        | Face registration failed. No face found.                                   |
| 402        | Face registration failed. Face box size is less than minimum required size.|
| 403        | Face registration failed. Face is blurred and/or not clear.                |
| 405        | Face registration failed. Face should be facing the camera straight.       |
| 501        | Face registration failed. Server error.                                    |
| 503        | Oops something went wrong. Please try again !!                             |

🔄 Flow Diagram #

sequenceDiagram
    participant App
    participant SDK as FaceTagr SDK
    participant API as Backend API

    App->>SDK: init(apiURL, clientID, externalID, hash, time, reqID)
    SDK->>API: Validate credentials
    API-->>SDK: Auth success
    SDK-->>App: Init success (1001)

    App->>SDK: Open Camera
    SDK->>API: Stream frames
    API-->>SDK: Match success (1001)
    SDK-->>App: FaceMatch event

✅ Quick Recap #

  1. Add facetagr to dependencies.
  2. Add facetagr_tools to dev_dependencies.
  3. Run the facetagr_init setup command.
  4. Generate the SHA-512 hash.
  5. Call init() with credentials.
  6. Open the camera (FaceTagrLivePreview).
  7. Listen to faceMatchStream.

License #

This package is part of the FaceTagr ecosystem.

© 2025 NotionTag Technologies Pvt Ltd. All rights reserved.

💬 Support #

For integration support, please contact:

📧 [email protected]

🌐 https://www.facetagr.com
