Capture SDK Guide 

The purpose of the Integration Guide is to give developers everything they need to set up and work with a minimally viable application using the Capture SDK.

Introduction 

The CaptureSDK is targeted at developers who want to use IDEMIA technologies within their mobile apps.

The main features are:

  • Biometric captures
  • Biometric coding
  • Fingerprint capture and matching
  • Biometric authentication and identification
  • Identity documents reading

Adding the SDK to your project 

Gradle

Configure repository:

Groovy
buildscript {
    repositories {
        maven {
            url "$repositoryUrlMI"
            credentials {
                username "$artifactoryUserMI"
                password "$artifactoryPasswordMI"
            }
        }
        ...
    }
    ...
}

repositoryUrlMI: Mobile Identity artifactory repository url

artifactoryUserMI: Mobile Identity artifactory username

artifactoryPasswordMI: Mobile Identity artifactory password

These properties can be obtained through the Experience Portal (My Identity Proofing -> Access) and should be stored in your local gradle.properties file, so that the credentials are not included in source code. Example configuration:

Properties
artifactoryUserMI=artifactory_user
artifactoryPasswordMI=artifactory_credentials
repositoryUrlMI=https://mi-artifactory.otlabs.fr/artifactory/smartsdk-android-local

More about gradle properties can be found here.

For biometric features the dependency is:

Groovy
implementation("morpho.mph_bio_sdk.android:SmartBio:version")

For document features the dependency is:

Groovy
implementation("morpho.mph_bio_sdk.android:SmartDoc:version")

For all features the dependency is:

Groovy
implementation("morpho.mph_bio_sdk.android:SmartSDK:version")

version: the artifact version

Components

The SDK comprises six distinct components:

  1. BioCaptureHandler: Handles the capture of the biometrics through the camera of the device.
  2. BioMatcherHandler: Handles the biometric coding and matching.
  3. DocumentCaptureHandler (see: DocumentCaptureHandler): Handles the document reading features (like reading MRZ documents).
  4. BioStoreDB: Repository to store biometric templates. (This component is optional, in case you don't want to implement your own database.)
  5. ImageUtils: Handles the image format conversion, in case the integrator must change the image format or import an image.
  6. LicenseManager: Handles the license management. Refer to License Manager for more details.

Access to BioCaptureHandler, BioMatcherHandler and DocumentCaptureHandler is through the Biometric Capture SDK entry points.

Design considerations 

  • User permissions must be handled by the integrator. You must check that the app permissions are granted by the user on Android 6.0 (API level 23) and higher (as detailed here).

  • Remember: You must always have a valid license before using any method of this SDK. You can activate it through LicenseManager. Refer to License Manager for more details.

  • Note: If your app is to run on low-memory devices, you must add android:largeHeap="true" to the <application> element of your manifest.

  • If you find that your project requires other native libraries, you must add the following flag to your gradle.properties file:

Properties
android.useDeprecatedNdk=true

And in your build.gradle add filters for the desired ABI. For now, the SDK supports armeabi-v7a and arm64-v8a:

Groovy
defaultConfig {
    ....
    ndk.abiFilters 'armeabi-v7a','arm64-v8a'
}

Prerequisites 

Skills required

The integration tasks should be done by developers with knowledge of:

  • Android Studio
  • Java for Android
  • Android OS

Resources required

Integration may be performed on computers running Windows, Linux, or macOS.

The tools required are:

  • Android Studio
  • Android SDK tools: preferred latest version
  • JDK: preferred latest version
  • Android device (emulator is not supported)
  • Minimum SDK version is 21
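The minimum SDK requirement above maps to the module-level build.gradle. A minimal sketch (the targetSdkVersion value is illustrative; use your project's own target):

```groovy
android {
    defaultConfig {
        minSdkVersion 21    // minimum supported by the Capture SDK
        targetSdkVersion 33 // illustrative; set your project's target
    }
}
```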

Biometric capture SDK structure 

The SDK's structure is displayed below.

Intro_BioSDKStructure

Tips 

App size optimization

After adding the SDK to your project, you will observe that the size of the application has grown significantly. This is because the SDK includes native libraries for two ABIs: armeabi-v7a and arm64-v8a. A standard build generates an .apk file for deployment to Google Play, and your application will contain both application binary interfaces even if one is never used.

Android App Bundle is the solution to this issue. Instead of generating an .apk, it is possible to generate a bundle (.aab). When a user installs the application from the store using the bundle, only the components required for the user's specific device are downloaded.

Additionally, the maximum size of the bundle increases to 150 MB (100 MB is still the maximum size for .apk files).

No changes on Google Play are required - just upload the .aab instead of the .apk. No changes in the application project are required either.

It is recommended that the bundle options be declared inside the Gradle file, for example:

Groovy
android {
    ...
    bundle {
        density {
            enableSplit true
        }
        abi {
            enableSplit true
        }
        language {
            enableSplit false
        }
    }
}

More about app bundles can be found here.
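Assuming the standard Android Gradle plugin tasks, the bundle can be produced from the command line (a sketch; the module and variant names depend on your project):

```shell
# Build an .aab instead of an .apk (release variant shown)
./gradlew bundleRelease

# The bundle is typically written under the module's build directory:
# app/build/outputs/bundle/release/
```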

License manager 

The purpose of this section is to show the API of the license management portion of the SDK, and expose the objects involved.

License manager 

The License manager is the main entry point to use the SDK. You can manage licenses through LicenseManager.

Note: A valid license is required before using any feature of the SDK.

provideLicenseManager

This method provides an instance of LicenseManager with a predefined LKMS profile. LicenseManager operations should be completed before starting a capture.

Kotlin
val manager = LicenseManager.provideLicenseManager(LkmsProfileId, LkmsApiKey, lkmsUrl)

Activating license

This function ensures that a valid license is stored on the device. This process is crucial and must occur each time before any SDK usage. In most cases it does not require any effort from the integrator's side; however, it might fail in some corner cases, listed below.

The method handles license management on the calling thread.

Callback solution:

Kotlin
manager.activate(
    object : LicenseActivationListener {
        override fun onLicenseActivated() {
            // License fetched and activated with success.
        }

        override fun onLicenseActivationFailed(licenseActivationError: LicenseActivationError) {
            // Failed to fetch or activate the license.
        }
    }, applicationContext)

Coroutine solution, returning a LicenseActivationResult:

Kotlin
val activationResult = manager.activate(applicationContext)
when (activationResult) {
    is LicenseActivationSuccess -> {
        // License fetched and activated with success.
    }
    is LicenseActivationError -> {
        // Failed to fetch or activate the license.
    }
}

LicenseActivationResult

The result of license activation when using the coroutine solution. The instance is one of the types below.

LicenseActivationError

Describes why the license could not be activated.

  • type (ActivationErrorType): The type of error for which license activation failed.
  • message (String): The activation failure reason.

ActivationErrorType

  • PROFILE_EXPIRED: The profile expired; licenses will no longer work. (Contact support.)
  • ACTIVATION_COUNT_EXCEEDED: No more licenses can be consumed. (Contact support.)
  • AUTHENTICATION_ISSUE: Credentials and/or profile information are wrong.
  • CONNECTION_ISSUE: Connection issue. Make sure that your internet connection is stable.
  • UNKNOWN: Unknown issue.
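One common integration pattern is translating these error types into user-facing hints. The sketch below uses a stand-in enum that mirrors the values in the table above (the real enum ships with the SDK); the messages are illustrative:

```java
// Sketch only: ActivationErrorType here is a stand-in mirroring the table
// above; in a real app you would switch on the SDK-provided enum instead.
public class ActivationErrorMessages {

    enum ActivationErrorType {
        PROFILE_EXPIRED, ACTIVATION_COUNT_EXCEEDED,
        AUTHENTICATION_ISSUE, CONNECTION_ISSUE, UNKNOWN
    }

    // Maps an activation error to a short hint for the end user or a log line.
    public static String hintFor(ActivationErrorType type) {
        switch (type) {
            case PROFILE_EXPIRED:
            case ACTIVATION_COUNT_EXCEEDED:
                return "License problem: contact support.";
            case AUTHENTICATION_ISSUE:
                return "Verify the LKMS profile ID, API key, and server URL.";
            case CONNECTION_ISSUE:
                return "Check your internet connection and retry.";
            default:
                return "Unknown activation error: retry or contact support.";
        }
    }

    public static void main(String[] args) {
        System.out.println(hintFor(ActivationErrorType.CONNECTION_ISSUE));
    }
}
```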

Getting started 

This guide illustrates the required steps to configure a minimally viable project for capturing biometrics using the Biometric Capture SDK.

Downloadable sample apps are here:

Creating your app 

  1. Add the SDK library to your app's build.gradle:
Groovy
implementation("morpho.mph_bio_sdk.android:SmartBio:version")

If you have not yet configured the repository for the SDK, see the introduction, which explains how to do that.

  2. Add the correct plugin dependency if you use face capture.

Plugins are special extensions to the SDK that can add or change its features. This way, integrators can save memory and increase performance by picking only the plugins they need.

For face capture there are three plugins to choose from. Only one plugin should be selected during the build; selecting more than one for a specific flavor causes a MultipleFaceInitPluginsException.

Available plugins

  • plugin-face-normal should be used when WebBioServer is not used and there is a need for strong security during local liveness challenges.

  • plugin-face-lite should be used when WebBioServer is used because it can reduce the size of an application significantly.

  • plugin-face-cr2dmatching should be used for local usage with additional security feature for FaceLiveness.ACTIVE mode.

Example plugin dependency for face capture:

Groovy
implementation 'com.idemia.smartsdk:plugin-face-normal:version'

Plugins for face matching 

If you use the finger-only variant, you can skip this section because the proper plugin is already attached to that version.

For face matching there are several algorithm plugins to choose from. Keep in mind that these algorithms are not compatible with each other: stored templates will not be successfully matched against templates from another algorithm.

  • plugin-algorithm-f5-4-low75: This has been improved to perform better with default compression. If a previous SDK has been used before and there is a user base with stored templates already, then full migration will be required. All templates must be generated again with the new plugin in use.

  • plugin-algorithm-f5-0-vid81: This is the default algorithm that is compatible with previous SDK versions.

  • plugin-algorithm-fingerv9: This provides only finger matching.

  • plugin-algorithm-f6-5-low70: Recommended algorithm for face matching, introduced in SDK version 4.44.0. There is no template-level compatibility with other plugins.

Remember to attach only one matching plugin per flavor; otherwise a MultipleInitBlockPluginsException will occur.
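For projects that build several flavors, one way to keep exactly one plugin per flavor is flavor-scoped dependency configurations (a sketch; the dimension and flavor names `liveness`, `local`, and `remote` are hypothetical):

```groovy
android {
    flavorDimensions "liveness"
    productFlavors {
        local { dimension "liveness" }   // local liveness checks
        remote { dimension "liveness" }  // liveness delegated to WebBioServer
    }
}

dependencies {
    // Exactly one face plugin per flavor avoids MultipleFaceInitPluginsException
    localImplementation 'com.idemia.smartsdk:plugin-face-normal:version'
    remoteImplementation 'com.idemia.smartsdk:plugin-face-lite:version'
}
```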

More about plugins

  3. Add the CaptureView to the layout where you handle the biometric capture:
XML
<com.idemia.smartsdk.preview.CaptureView
    android:id="@+id/captureView"
    android:layout_width="match_parent"
    android:layout_height="match_parent" />
  4. In your activity or fragment, get a reference to this view:
Java
CaptureView cameraPreview = (CaptureView) findViewById(R.id.captureView);
  5. Activate your license. This can be done in onCreate(Bundle savedInstanceState) or at an earlier stage of your app. It must be done only once.
Kotlin
val manager = LicenseManager.provideLicenseManager(LkmsProfileId, LkmsApiKey, lkmsUrl)
val activationResult = manager.activate(applicationContext)
when (activationResult) {
    is LicenseActivationSuccess -> {
        // License fetched and activated with success.
    }
    is LicenseActivationError -> {
        // Failed to fetch or activate the license.
    }
}

For security reasons, consider storing the LKMS credentials outside the source code (for example, in Gradle properties).
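One way to keep the LKMS credentials out of source code is to read them from gradle.properties and expose them via BuildConfig (a sketch; the property names and URL are hypothetical):

```groovy
// gradle.properties (kept out of version control):
// lkmsProfileId=your_profile_id
// lkmsApiKey=your_api_key
// lkmsUrl=https://your-lkms-server

// build.gradle (module level):
android {
    defaultConfig {
        buildConfigField "String", "LKMS_PROFILE_ID", "\"${lkmsProfileId}\""
        buildConfigField "String", "LKMS_API_KEY", "\"${lkmsApiKey}\""
        buildConfigField "String", "LKMS_URL", "\"${lkmsUrl}\""
    }
}
```

The activation call can then use BuildConfig.LKMS_PROFILE_ID and friends instead of hard-coded strings.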

  6. Prepare the capture settings. For face capture, use FaceCaptureOptions:
Java
FaceCaptureOptions captureOptions = new FaceCaptureOptions(FaceLiveness.PASSIVE);
captureOptions.setCamera(Camera.FRONT);
captureOptions.setCaptureTimeout(120);
captureOptions.setOverlay(Overlay.OFF);
  7. In the onResume() method of your activity or fragment, obtain a valid reference to the IFaceCaptureHandler using the previously created capture options.
Java
protected void onResume() {
    // Create handler
    BioSdk.createFaceCaptureHandler(this, captureOptions, new MscAsyncCallbacks<IFaceCaptureHandler>() {
        @Override
        public void onPreExecute() {
            // Optional hook on the built-in Android AsyncTask callback onPreExecute
        }

        @Override
        public void onSuccess(IFaceCaptureHandler result) {
            // Initialization succeeded; the returned handler can be used to start the capture
            faceCaptureHandler = result;
        }

        @Override
        public void onError(BioCaptureHandlerError e) {
            // An error occurred during initialization
        }
    });
    super.onResume();
}
  8. Add the listeners for the events to the handler:
Java
faceCaptureHandler.setFaceCaptureResultListener(new FaceCaptureResultListener() {
    @Override
    public void onCaptureSuccess(@NotNull FaceImage image) {
        // Successfully captured image
    }

    @Override
    public void onCaptureFailure(@NotNull CaptureError captureError,
                                 @NotNull IBiometricInfo biometricInfo,
                                 @NotNull Bundle extraInfo) {
        // Capture failure
    }
});
faceCaptureHandler.setFaceCaptureFeedbackListener(new FaceCaptureFeedbackListener() {
    @Override
    public void onCaptureInfo(FaceCaptureInfo captureInfo) {
        // Face capture feedback info, like "move your face to the right"
    }
});
faceCaptureHandler.setFaceTrackingListener(new FaceCaptureTrackingListener() {
    @Override
    public void onTracking(List<FaceTracking> trackingInfo) {
        // Tracking info to know where the face is
    }
});
  9. Initialize the preview and capture to start receiving events. This should happen after creating the capture handler; the most common place is onResume:
Java
faceCaptureHandler.startPreview(new PreviewStatusListener() {
    @Override
    public void onStarted() {
        try {
            faceCaptureHandler.startCapture();
        } catch (MSCException e) {
            // Handle exception
        }
    }

    @Override
    public void onError(PreviewError error) {
        // Preview initialization failed and the capture cannot be started
    }
});
  10. Destroy the handler when onPause() is invoked:
Java
@Override
protected void onPause() {
    if (faceCaptureHandler != null) {
        faceCaptureHandler.stopCapture();
        faceCaptureHandler.stopPreview();
        faceCaptureHandler.destroy();
    }
    super.onPause();
}
  11. In your manifest, you must add:
XML
<!-- Declare new permissions -->
<permission
    android:name="your.new.permission.NEW_READ_MPH_BIO_SDK_PROVIDER"
    android:protectionLevel="signature" /> <!-- unless otherwise required, set the maximum security permission -->
<permission
    android:name="your.new.permission.NEW_WRITE_MPH_BIO_SDK_PROVIDER"
    android:protectionLevel="signature" /> <!-- unless otherwise required, set the maximum security permission -->
XML
<!-- The provider must be defined by the implementing app so as to allow multiple apps -->
<!-- Bio store provider -->
<provider
    android:name="com.morpho.mph_bio_sdk.android.sdk.content_provider.BioStoreProvider"
    android:authorities="your.new.authority"
    android:readPermission="your.new.permission.NEW_READ_MPH_BIO_SDK_PROVIDER"
    android:writePermission="your.new.permission.NEW_WRITE_MPH_BIO_SDK_PROVIDER"
    tools:replace="android:authorities, android:readPermission, android:writePermission">
</provider>

Analytics 

Capture SDK offers a logging mechanism that collects analytics data about SDK usage and sends this data to IDEMIA's server. This data helps IDEMIA improve the Capture SDK and the integrator's likelihood of success within the app. It is strongly recommended to activate the analytics mechanism.

  • You can enable or disable sending analytics data.
  • You can choose to send analytics data only when connected to a Wi-Fi network, so as not to use your cellular connection.
  • Analytics data that IDEMIA collects contains only technical data.
  • No sensitive personal data is collected.
  • IDEMIA does not collect any images.

The analytics data that we collect includes the following information:

  • Application name, bundle id, version
  • Capture SDK and RemoteLogger libraries versions
  • Device model and operating system version
  • Technical information about performed face, finger, and document capture (such as: capture mode used; timestamp; reason of error; time needed to perform a capture; quality of captured image; and light condition)
  • Technical information about performed authentication and identification events (such as: used threshold, duration, and obtained score)
  • Other technical information (such as: image compression, occurred errors, and SDK performance) that does not contain personal data

You can disable analytics reporting using the appropriate SDK method.

Capture SDK plugins 

Plugins have been introduced to give even more flexibility than the variants of the SDK. Every integrator might have different needs and size requirements, and the plugin mechanism allows for greater flexibility. Plugins are split into two groups: feature and algorithm.

Feature plugins 

Feature plugins provide various SDK functionalities such as face capture, document capture, and optical character recognition (OCR).

Algorithm plugins 

Algorithm plugins provide for extracting biometric data from images, matching this data, and storing it as templates.

How it works 

Capture SDK still offers the previous variants, each with predefined plugins in its dependency list that are still required. However, some features might differ, like the matching algorithm or the face capture challenge behavior. In such cases, these features can be configured by adding specific plugins.

All that must be done to add a plugin is to add the proper dependency to the project module that will use the plugin (see How to use them).

Benefits 

The obvious benefit is reducing the number of SDK variants, which makes it easier to pick the proper SDK dependency. It also brings flexibility to the product: the ability to mix or replace features in the future, or even to extend the SDK's possibilities by implementing your own plugins.

How to use them 

Plugins are just ordinary dependencies. All that must be done is to add the proper dependency for the plugins that are needed. Read carefully about allowed combinations and predefined plugins in SDK variants.

Here is a snippet with all available plugins:

Gradle
//Feature plugins
implementation 'com.idemia.smartsdk:plugin-finger:$version'
implementation 'com.idemia.smartsdk:plugin-face:$version'
implementation 'com.idemia.smartsdk:plugin-face-normal:$version'
implementation 'com.idemia.smartsdk:plugin-face-lite:$version'
implementation 'com.idemia.smartsdk:plugin-face-cr2dmatching:$version'
implementation 'com.idemia.smartsdk:plugin-improved-pdf417-detection:$version'

//Algorithm plugins
implementation 'com.idemia.smartsdk:plugin-algorithm-f5-0-vid81:$version'
implementation 'com.idemia.smartsdk:plugin-algorithm-f5-4-low75:$version'
implementation 'com.idemia.smartsdk:plugin-algorithm-f6-0-idd80:$version'
implementation 'com.idemia.smartsdk:plugin-algorithm-f6-5-low70:$version'
implementation 'com.idemia.smartsdk:plugin-algorithm-fingerv9:$version'

Allowed combinations 

Here are all possible combinations of plugins for specific use cases.

As mentioned above, the SDK variants have predefined plugin dependencies, so only a few plugins must be added explicitly. See SDK variants and their plugins for which predefined plugins come with the variant you use.

Face capture:

  • plugin-face
  • plugin-face-lite
  • plugin-face-normal
  • plugin-face-cr2dmatching

Available algorithm plugins:

  • plugin-algorithm-f5-4-low75
  • plugin-algorithm-f5-0-vid81
  • plugin-algorithm-f6-5-low70
  • plugin-algorithm-f6-0-idd80

Finger capture:

  • plugin-finger

Available algorithm plugins:

  • plugin-algorithm-f5-4-low75
  • plugin-algorithm-f5-0-vid81
  • plugin-algorithm-f6-0-idd80
  • plugin-algorithm-f6-5-low70
  • plugin-algorithm-fingerv9

Document capture:

  • plugin-improved-pdf417-detection

Warning: Only one of plugin-face-lite, plugin-face-normal, and plugin-face-cr2dmatching can be used at a time. The integrator must pick one of them; a MultipleFaceInitPluginsException will occur if more than one has been picked.

SDK variants and their plugins 

Each SDK variant delivers a different set of predefined plugins, so check carefully what each variant contains. Plugins that are already predefined in a variant should not be added again in a module that uses that variant. As can be seen below, no document-related plugins need to be added for variants that already deliver this feature. In other words, variants contain all required plugins that have no alternatives.

Capture SDK (predefined plugins):

  • plugin-face
  • plugin-finger

Plugins that might be added for the Capture SDK variant:

  • One of: plugin-face-normal, plugin-face-lite, plugin-face-cr2dmatching
  • One of: plugin-algorithm-f5-4-low75, plugin-algorithm-f5-0-vid81, plugin-algorithm-f6-0-idd80, plugin-algorithm-f6-5-low70, plugin-algorithm-fingerv9 (the last is not recommended if face matching will be performed)
Biometric Capture SDK (predefined plugins):

  • plugin-face
  • plugin-finger

Plugins that can be added for the Biometric Capture SDK variant:

  • One of: plugin-face-normal, plugin-face-lite, plugin-face-cr2dmatching
  • One of: plugin-algorithm-f5-4-low75, plugin-algorithm-f5-0-vid81, plugin-algorithm-f6-0-idd80, plugin-algorithm-f6-5-low70, plugin-algorithm-fingerv9 (the last is not recommended if face matching is going to be performed)
SmartFinger (predefined plugins):

  • plugin-finger
  • plugin-algorithm-fingerv9

There are no additional plugins for the SmartFinger variant.

SmartFace (predefined plugins):

  • plugin-face

Plugins that can be added for the SmartFace variant:

  • One of: plugin-face-normal, plugin-face-lite, plugin-face-cr2dmatching
  • One of: plugin-algorithm-f5-4-low75, plugin-algorithm-f5-0-vid81, plugin-algorithm-f6-0-idd80, plugin-algorithm-f6-5-low70
SmartFaceDoc (predefined plugins):

  • plugin-face
  • plugin-face-lite

Plugins that can be added for the SmartFaceDoc variant:

  • One of: plugin-algorithm-f5-4-low75, plugin-algorithm-f5-0-vid81, plugin-algorithm-f6-0-idd80, plugin-algorithm-f6-5-low70

However, this variant is meant to be used with WebBioServer which performs matching operations (no need to do that locally).

For SDK variants with document features, plugin-improved-pdf417-detection may be added in order to improve the capture of barcodes.

Feature plugins descriptions 

plugin-face

Basic plugin needed for face capture. Usually it is predefined in every SDK variant that delivers face capture functionality.

plugin-face-normal

Should be used for face capture when WebBioServer is not used and there is a need for strong security during local liveness challenges.

plugin-face-lite

Should be used when WebBioServer is used for liveness check during face capture, because it can reduce the size of application significantly.

plugin-face-cr2dmatching

Should be used for local usage (without WebBioServer) when additional security feature for FaceLiveness.ACTIVE mode is needed.

plugin-finger

Plugin needed for finger capture. Usually it is predefined in every SDK variant that delivers finger capture functionality.

plugin-improved-pdf417-detection

Plugin that can be used to speed up barcode capture.

Algorithm plugins descriptions 

plugin-algorithm-f6-0-idd80

It is more accurate than f5-4-low75 and much smaller than f5-0-vid81.

plugin-algorithm-f5-4-low75

Improved to perform better with default compression. If a previous SDK has been used before and there is a user base with stored templates already, then full migration of the user's biometrics will be required. All templates must be generated again with the new plugin in use.

plugin-algorithm-f5-0-vid81

This is the default algorithm that is compatible with previous SDK versions.

plugin-algorithm-f6-5-low70

Recommended algorithm for face matching. It is more accurate than f6-0-idd80. If a previous SDK has been used and there is a user base with stored templates already, then full migration of the users' biometrics will be required: all templates must be generated again with the new plugin in use.

plugin-algorithm-fingerv9

Provides only finger matching feature. It is best to pick this one when only finger matching will be performed.

WARNING

The algorithms are NOT compatible with each other. The templates generated by one of the algorithms cannot be processed with the other one; that is, it is not possible to match a template generated with F6_0_IDD80 against a template generated with F5_4_LOW75 or F5_0_VID81. If an integrator wants to change the algorithm in their solution, all the stored templates must be recreated with the new algorithm.
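The incompatibility rule above can be enforced at the application level. The sketch below is illustrative only: the SDK's templates are opaque, so it assumes a hypothetical application-side wrapper that tags each stored template with the name of the algorithm plugin that produced it:

```java
// Illustrative only: StoredTemplate is a hypothetical application-level
// wrapper, not an SDK type. The algorithm names mirror the warning above.
public class TemplateGuard {

    static final class StoredTemplate {
        final String algorithm;
        final byte[] data;

        StoredTemplate(String algorithm, byte[] data) {
            this.algorithm = algorithm;
            this.data = data;
        }
    }

    // Matching is only meaningful when both templates were produced by the
    // same algorithm; cross-algorithm matching is never valid.
    public static boolean canMatch(StoredTemplate reference, StoredTemplate candidate) {
        return reference.algorithm.equals(candidate.algorithm);
    }

    public static void main(String[] args) {
        StoredTemplate ref = new StoredTemplate("F6_0_IDD80", new byte[0]);
        StoredTemplate sameAlgo = new StoredTemplate("F6_0_IDD80", new byte[0]);
        StoredTemplate otherAlgo = new StoredTemplate("F5_0_VID81", new byte[0]);
        System.out.println(canMatch(ref, sameAlgo));  // true
        System.out.println(canMatch(ref, otherAlgo)); // false
    }
}
```

Rejecting the mismatch up front (and scheduling re-enrollment) is cheaper than attempting a match that is guaranteed to fail.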

SDK size 

This is the estimated size of an SDK variant with all its dependencies, like predefined plugins (see the Plugins section). The UI extension is not included in the size, as it is not a predefined dependency.

  • CaptureFace: 24.29 MB
  • CaptureDocument: 16.94 MB
  • CaptureFinger: 17.24 MB
  • CaptureBiometry: 28.78 MB
  • CaptureBiometry_document: 43.48 MB
  • CaptureFace_document: 38.99 MB

Plugins size 

  • plugin-face: 7.59 KB
  • plugin-face-normal: 6.75 MB
  • plugin-face-lite: 4.79 MB
  • plugin-face-cr2dmatching: 6.75 MB
  • plugin-finger: 794.64 KB
  • plugin-algorithm-f5-4-low75: 12.30 MB
  • plugin-algorithm-f5-0-vid81: 4.08 MB
  • plugin-algorithm-f6-5-low70: 7.45 MB
  • plugin-algorithm-fingerv9: 1.51 KB
  • plugin-improved-pdf417-detection: 8.81 MB

Integration guide 

The purpose of this document is to show the API of the SDK and expose all of its involved objects.

Use cases 

Capture biometrics

Below is the generic execution flow to perform a biometric capture (Get Picture), and get information about the biometry. For example, getting a picture and moving your head to the left.

capture_images

Capture timeout

Below is the generic execution flow to be followed when a capture timeout occurs.

capture_timeout

Capture enroll

Below is the generic execution flow to perform a biometric capture (Get Picture). After that, the biometrics template is extracted from the image returned by the capture component. The biometric template is linked to one user using the userUUID. The UUID of this template and the userUUID are stored in a database.

capture_enrol

Capture authenticate

Below is the generic execution flow to perform a biometric capture (Get Picture). The biometrics template is then extracted from the image and returned by the capture component. These are the candidate templates that you must use to create an IBiometricCandidate.

After the IBiometricCandidate is created, a list of reference templates must be extracted. These will then be used to create an IBiometricReference object with which to match against the IBiometricCandidate and authenticate that the candidate templates belong to the user.

There are two ways to extract a list of template references: the first is to retrieve them from the database used during the enrollment process; the second is to extract the templates from another image with detectBiometrics(...).

capture_verify

Capture identify

Below is the generic execution flow to perform a biometric capture (Get Picture). The biometrics template is then extracted from the image and returned by the capture component. These are the candidate templates which you must use to create an IBiometricCandidate.

After the IBiometricCandidate is created, a list of reference templates must be extracted. These will then be used to create an IBiometricReference object with which to match against the IBiometricCandidate and authenticate that the candidate templates belong to the user.

capture_identify

Creating BioMatcherHandler

Below is the generic execution flow to retrieve and release a BioMatcherHandler.

create

Authenticating

Below is the generic execution flow to perform a generic verification process which involves extracting the biometrics template from an image. These are the candidate templates which you must use to create an IBiometricCandidate.

After the IBiometricCandidate is created, a list of reference templates must be extracted. These will then be used to create an IBiometricReference object with which to match against the IBiometricCandidate and authenticate that the candidate templates belong to the user.

There are two ways to extract a list of template references: the first is to retrieve them from the database used during the enrollment process; the second is to extract the templates from another image with detectBiometrics(...).

verify

Identifying

Below is the generic execution flow to perform a generic identification process which involves extracting the biometrics template from an image. These are the candidate templates which you must use to create an IBiometricCandidate.

After the IBiometricCandidate is created, a list of reference templates must be extracted. These will then be used to create an IBiometricReference object with which to match against the IBiometricCandidate and authenticate that the candidate templates belong to the user.

identify

Detect biometrics

This describes detecting the biometrics in an IImage. This function is intended to be used to extract all the biometric templates contained in an image; for example, all the faces that are in an image.

detect_biometric

Introduction 

To make integration of the SDK easier and more intuitive, a new API for Face Capture has been delivered. It is based on self-explanatory use cases that provide specific information depending on the given use case. This lets the integrator focus on working with the data provided by the SDK rather than on SDK configuration.

The old API is still available for backward compatibility for already-integrated users. Its description can be found here.

NOTE: The new API supports both remote and local liveness use cases.

Integration 

License activation

The first mandatory step in using the SDK is to activate the license and grant the camera permission for the application. This part is common to the old API and the new one. License handling is described here.

Adding FaceCaptureView

FaceCaptureView is a key component of the SDK. It not only provides the preview for the capture, but is also an entry point to the SDK's API: on this component the integrator sets up the capture and orchestrates its flow.

FaceCaptureView should be added to the layout of the capture Activity, like any other Android view. It must be visible to the end user.

XML
<androidx.constraintlayout.widget.ConstraintLayout
    xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:app="http://schemas.android.com/apk/res-auto"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:orientation="horizontal">

    <com.idemia.capture.face.api.FaceCaptureView
        android:id="@+id/captureView"
        android:layout_width="0dp"
        android:layout_height="0dp"
        app:layout_constraintBottom_toBottomOf="parent"
        app:layout_constraintEnd_toEndOf="parent"
        app:layout_constraintStart_toStartOf="parent"
        app:layout_constraintTop_toTopOf="parent" />

</androidx.constraintlayout.widget.ConstraintLayout>

Since FaceCaptureView is also the entry point to the SDK, it must be accessible from the application logic. This can be done the old way, with findViewById:

Kotlin
private var captureView: FaceCaptureView? = null

override fun onCreate(savedInstanceState: Bundle?) {
    super.onCreate(savedInstanceState)
    setContentView(R.layout.your_activity_layout)

    captureView = findViewById(R.id.captureView)
}

or, more commonly, through view binding:

Kotlin
lateinit var binding: YourActivityCaptureBinding

override fun onCreate(savedInstanceState: Bundle?) {
    super.onCreate(savedInstanceState)
    binding = YourActivityCaptureBinding.inflate(layoutInflater)
    setContentView(binding.root)

    val captureView = binding.captureView
}

Creating use case

To perform a capture, the next step is to create the use case we are interested in. As mentioned above, the new API focuses on what we want to do, not on how to do it. To achieve this, use cases have been introduced. They define what will be done and require a set of listeners (at least for the result) to provide information about the capture.

Kotlin
val remoteUseCase =
    RemoteUseCase(
        sessionId,
        RemoteCaptureListeners(
            faceTrackingInfo = faceTrackingInfoListener,
            captureFeedback = feedbackListener,
            captureLivenessListener = captureLivenessListener,
            stepInfoListener = stepInfoListener,
            passiveVideoListener = passiveVideoListener,
            captureResultListener = captureResultListener
        ),
        environmentInfo
    )

More about use cases and their properties can be found in the dedicated section.

Setting up capture

Once the license is activated, camera permission is granted, and the use case has been created, it is time to set up and perform the capture. To do that, use the setUp method on FaceCaptureView:

Kotlin
1fun setUp(useCase: UseCase, lifecycle: Lifecycle?, uiSettings: UISettings?)

Each function argument is explained below:

  • useCase (UseCase) - The use case instance; it defines the type of capture and lets the integrator receive data from it.
  • lifecycle (Lifecycle) - Android component that makes the SDK lifecycle-aware. This argument is optional: if it is not provided, the integrator has to manage the flow explicitly; if it is provided, there is no need to start/cancel/destroy the flow manually.
  • uiSettings (UISettings) - Settings passed to the UI-Extensions library. If not provided, the integrator has to display the proper UI to the end user on their own. More information about it can be found here.

If the Lifecycle component is not provided, the following methods have to be called explicitly by the integrator to ensure a smooth and stable user experience:

  • start() - Starts the capture and liveness verification flow. Recommended to invoke in onResume or onStart.
  • cancel() - Cancels the flow. Recommended to invoke in onPause or onStop, depending on the desired effect.
  • destroy() - Cleans up the capture view and its data. Recommended to invoke in onDestroy.
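The call contract above can be sketched as a small state machine. This is an illustrative stand-in (ManualCaptureFlow and FlowState are our names, not SDK types) showing the order in which the integrator is expected to drive the view when no Lifecycle is passed:

```kotlin
// Stand-in model of the manual flow contract when no Lifecycle is passed to setUp.
// The real calls are made on FaceCaptureView; this only illustrates the ordering.
enum class FlowState { IDLE, RUNNING, CANCELED, DESTROYED }

class ManualCaptureFlow {
    var state: FlowState = FlowState.IDLE
        private set

    // Call from onResume/onStart of the hosting Activity.
    fun start() {
        check(state != FlowState.DESTROYED) { "cannot start a destroyed flow" }
        state = FlowState.RUNNING
    }

    // Call from onPause/onStop, depending on the desired effect.
    fun cancel() {
        if (state == FlowState.RUNNING) state = FlowState.CANCELED
    }

    // Call from onDestroy to release the view and its data.
    fun destroy() {
        state = FlowState.DESTROYED
    }
}
```

Wiring these calls into the matching Activity callbacks reproduces the behavior the SDK performs automatically when a Lifecycle is supplied.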

Once the above steps are done, the capture needs to be set up, as in the following example:

Kotlin
1[code in Activity]
2
3fun setupView(useCase: UseCase) {
4 binding.captureView.setUp(
5 useCase,
6 lifecycle,
7 UISettings(passiveVideoSettings, passiveSettings, joinThePointsCaptureSettings)
8 )
9}

Here is an example of a UISettings setup. Keep in mind that this class holds configuration for every FaceLiveness mode except NO_LIVENESS:

Kotlin
1val joinThePointsCaptureSettings = joinThePointsChallengeSettings {
2 useInterpolation = true
3 scene {
4 overlay {
5 showOverlay = true
6 imageRes = R.drawable.ic_face_overlay
7 marginVertical = R.dimen.default_face_overlay_vertical_padding
8 marginHorizontal = R.dimen.default_face_overlay_horizontal_padding
9 text {
10 text = R.string.default_overlay_text
11 textSize = R.dimen.default_overlay_text_size
12 textColor = Color.parseColor(Colors.text_black)
13 }
14 }
15 capturedLineOpacity = 0.5f
16 pointer {
17 type = PointerType.PULSING
18 collisionWithTargetAction = PointerCollisionAction.NONE
19 }
20 target {
21 pulseAnimation {
22 waves = 3
23 }
24 showMarkOnCurrentTarget = true
25 }
26 verticalTilt {
27 enabled = false
28 }
29 tapping {
30 enabled = false
31 }
32 result {
33 failureImageResId = R.drawable.ic_challenge_failed
34 successImageResId = R.drawable.ic_challenge_success
35 }
36 }
37}
38
39val passiveSettings = passiveCaptureSettings {
40 scene {
41 background {
42 colorEnd = Color.parseColor("#189482")
43 colorStart = Color.parseColor("#38ddb8")
44 }
45 previewScale {
46 scaleX = 1.0f
47 scaleY = 1.0f
48 }
49 feedback {
50 colorText = Color.parseColor(Colors.white)
51 }
52 overlay {
53 showOverlay = true
54 }
55 tapping {
56 colorBackground = Color.parseColor("#FAFAFA")
57 colorImage = Color.parseColor(Colors.black)
58 colorText = Color.parseColor(Colors.black)
59 textResId = "Use your head to interact"
60 textH1ResId = "No tapping needed"
61 enabled = true
62 }
63 verticalTilt {
64 colorBackground = Color.parseColor("#FAFAFA")
65 colorImage = Color.parseColor("#000000")
66 colorText = Color.parseColor("#000000")
67 textResId = "Please hold your phone vertically."
68 enabled = true
69 }
70 countdown {
71 countdownSeconds = 3
72 }
73 delay {
74 isEnabled = true
75 message = "Authentication locked.\nPlease wait for:\n%1$s"
76 }
77 }
78}
79
80val passiveVideoSettings = passiveVideoCaptureSettings {
81 scene {
82 preparationScene {
83 backgroundColor = Color.WHITE
84 }
85 faceOverlay {
86 progressBar {
87 progressFill = Color.GREEN
88 }
89 }
90 background {
91 colorEnd = Color.parseColor("#189482")
92 colorStart = Color.parseColor("#38ddb8")
93 }
94 previewScale {
95 scaleX = 1.0f
96 scaleY = 1.0f
97 }
98 feedback {
99 videoBackground { }
100 }
101 tapping {
102 colorBackground = Color.parseColor("#FAFAFA")
103 colorImage = Color.parseColor("#000000")
104 colorText = Color.parseColor("#000000")
105 textResId = "Use your head to interact"
106 textH1ResId = "No tapping needed"
107 enabled = true
108 }
109 verticalTilt {
110 colorBackground = Color.parseColor("#FAFAFA")
111 colorImage = Color.parseColor("#000000")
112 colorText = Color.parseColor("#000000")
113 textResId = "Please hold your phone vertically."
114 enabled = true
115 }
116 delay {
117 isEnabled = true
118 message = "Authentication locked.\nPlease wait for:\n%1$s"
119 }
120 }
121}

Use cases 

As mentioned in the sections above, the new API is meant to be easier to integrate and more intuitive overall. Use cases were introduced to achieve that: every use case is dedicated to one particular job. The available use cases are listed below.

RemoteUseCase 

This use case performs face capture with backend liveness verification. It provides end-to-end integration, so the integrator does not have to integrate with backend services directly. However, a few things need to be provided:

  • Session id for given capture
  • RemoteCaptureListeners
  • EnvironmentInfo
Kotlin
1RemoteUseCase(sessionId: String, listeners: RemoteCaptureListeners, environmentInfo: EnvironmentInfo)
  • sessionId (String) - Session id correlated with the face capture. The most common approach is to create the session outside of the application (on the integrator's backend) and pass it in. It can be created via the backend components GIPS or directly via WebBio; see the session creation instructions below.
  • listeners (RemoteCaptureListeners) - Group of listeners related to the remote use case. They gather capture data and inform about the flow state and result. See the listeners section for more details.
  • environmentInfo (EnvironmentInfo) - Information about the Proofing Platform environment and the authentication method.

RemoteCaptureListeners - a detailed description of each listener can be found in the listeners section.

  • livenessActiveListener (LivenessActiveListener) - Provides information about the ACTIVE liveness mode. Useful when UISettings are not provided to FaceCaptureView.
  • faceTrackingInfo (FaceTrackingInfoListener) - Provides face coordinates.
  • captureFeedback (CaptureFeedbackListener) - Provides feedback that should be presented to the end user to improve the capture process. Handled automatically when UISettings are used.
  • captureLivenessListener (CaptureLivenessListener) - Provides information about the liveness mode of the current capture.
  • stepInfoListener (StepInfoListener) - Provides information about the capture state.
  • passiveVideoListener (PassiveVideoListener) - Provides information about the PASSIVE_VIDEO liveness mode. Useful when UISettings are not provided to FaceCaptureView.
  • captureResultListener (RemoteCaptureResultListener) - Provides the result of the whole flow.
  • livenessProcessingListener (LivenessProcessingListener) - Provides the progress of uploading user image metadata to the server, as a value from 0.0 to 1.0.

EnvironmentInfo

FaceCapture is compatible with two types of authorization: API key and OAuth.

  • With token-based authorization, an access token is generated by the authorization server using the provided secrets. Wrap it in an AccessToken instance and use the corresponding initializer of the EnvironmentInfo class: init(accessToken: AccessToken, baseUrl: URL).
  • The AccessToken class holds the secret and the token type issued by the OAuth authorization server.

Secrets can be found on the webpage: https://experience.idemia.com/dashboard/my-identity-proofing/access/environments/.

Access token solution constructor:

  • baseUrl (String) - URL to the Proofing services. For example, the production URL is: https://proofing.app.eu.identity-prod.idemia.io:443/
  • accessToken (AccessToken) - The access token used to authenticate on the Proofing services. Do not share it, and avoid storing it in the application's repository.

ApiKey solution constructor:

  • baseUrl (String) - URL to the Proofing services. For example, the production URL is: https://proofing.app.eu.identity-prod.idemia.io:443/
  • apiKey (String) - Dedicated key used to authenticate on the Proofing services. Do not share it, and avoid storing it in the application's repository.

AccessToken

  • secret (String) - Dedicated token used to authenticate on the Proofing services. Do not share it, and avoid storing it in the application's repository.
  • tokenType (String) - Token type indicating how the token should be used in the authorization request.

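As an illustration of how the two AccessToken fields are typically combined, an OAuth client sends them as `<tokenType> <secret>` in the Authorization header. The helper below is an assumption about the wire format, not SDK code; the actual request assembly happens inside the SDK:

```kotlin
// Hypothetical helper: builds the Authorization header value from the two
// AccessToken fields, e.g. tokenType "Bearer" + secret -> "Bearer eyJhbGci...".
fun authorizationHeader(tokenType: String, secret: String): String =
    "${tokenType.trim()} ${secret.trim()}"
```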
Creating capture session

RemoteUseCase handles liveness verification on the backend side and requires one session per capture. The common approach is to create the session on the integrator's backend and provide it to the application; the capture flow can then be triggered. Good pages to start with:

  • General description of liveness remote capture - here
  • GIPS API description - here
  • WebBio API description - here

For example, creating session via GIPS requires:

  1. Create an identity by calling POST /v1/identities. An identity is returned as a result.

  2. Submit confirmation that the user has consented to the specific evidence verifications: POST /v1/identities/{id}/consents

  3. Start the liveness session by calling POST /v1/identities/{id}/attributes/portrait/live-capture-session?mode=nativeSDK. The response contains the session id used by the SDK.

Proceeding with WebBio requires:

  1. Create a session by calling POST /bioserver-app/v2/bio-sessions. The call must contain session data in the body.

  2. Retrieve the session path from the response:

Kotlin
1val bioSessionPath = response.headers()["Location"]

  3. Get the BioSession: GET /bioserver-app{bioSessionPath}. The response contains the session id used by the SDK.

  4. Initialize the session with the id from the previous step and the liveness parameters passed in the body: POST /bioserver-app/v2/bio-sessions/{bioSessionId}/init-liveness-parameters
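The bookkeeping around these WebBio steps can be sketched with two small helpers (the helper names are ours, not part of WebBio): one extracts the bio-session path from the Location header, the other builds the init-liveness URL from the session id:

```kotlin
// Extracts the bio-session path (the part after "/bioserver-app") from the
// Location header returned by POST /bioserver-app/v2/bio-sessions.
fun bioSessionPathFrom(locationHeader: String): String =
    locationHeader.substringAfter("/bioserver-app")

// Builds the URL for POST .../bio-sessions/{bioSessionId}/init-liveness-parameters.
fun initLivenessUrl(baseUrl: String, bioSessionId: String): String =
    baseUrl.trimEnd('/') +
        "/bioserver-app/v2/bio-sessions/$bioSessionId/init-liveness-parameters"
```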

The snippet below shows the use case creation:

Kotlin
1val environmentInfo = EnvironmentInfo(
2 "https://proofing.app.eu.identity-prod.idemia.io:443/",
3 "YourApiKey"
4)
5val sessionInfo = sessionHandler.createSession(
6 readFaceLivenessModeFromSettings(),
7 readFaceSecurityLevelFromSettings()
8)
9val remoteUseCase = RemoteUseCase(
10 sessionInfo.sessionId,
11 RemoteCaptureListeners(
12 faceTrackingInfo = faceTrackingInfoLoggingListener,
13 captureFeedback = feedbackListener,
14 captureLivenessListener = captureLivenessLoggingListener,
15 stepInfoListener = stepInfoListener,
16 passiveVideoListener = passiveVideoLoggingListener,
17 captureResultListener = captureResultListener
18 ),
19 environmentInfo
20)

If this guide is not enough, the FaceSampleAppLite source code is available on our Artifactory repository. Feel free to download the latest package with the GIPS implementation from here, or the WBS implementation from here, and see the full integration with session creation included.

Keep in mind that, on the SDK side, the result of the flow is the upload of the required data to the backend service. The captured image might still be needed by the application; to acquire it, an additional request to WebBioServer has to be made. The related API description can be found here.


Listeners

The new API introduces multiple listeners that let the integrator acquire capture-related data. All listeners are called on the UI thread, so it is safe to manipulate UI components directly from them.

CaptureLivenessListener

Returns information about the current liveness capture mode. Useful for RemoteUseCase, where the mode comes from the backend side.

Kotlin
1fun captureLiveness(liveness: Liveness)

Liveness

  • ACTIVE - The current liveness mode is the active one. The user needs to connect points using their face.
  • PASSIVE - The current liveness mode is the passive one. There are no challenges for the user.
  • PASSIVE_VIDEO - A more advanced variant of the passive mode. It requires backend integration (it can be used with RemoteUseCase), as it uses more restrictive liveness algorithms.

StepInfoListener

This listener provides information about the capture flow state within a StepInfo object.

Kotlin
1fun stepInfo(stepInfo: StepInfo)

StepInfo

  • PREPARING_LIVENESS - The liveness challenge is being prepared.
  • CAPTURE_STARTED - The capture has started. The preview should actively show frames from the camera.
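A stepInfo handler usually just maps these states to UI text. The sketch below uses a stand-in StepInfo enum with the two values listed above (the real enum ships with the SDK); the messages are only examples:

```kotlin
// Stand-in for the SDK's StepInfo, for illustration only.
enum class StepInfo { PREPARING_LIVENESS, CAPTURE_STARTED }

// Maps a capture flow state to a message shown above the preview.
fun statusMessage(stepInfo: StepInfo): String = when (stepInfo) {
    StepInfo.PREPARING_LIVENESS -> "Preparing liveness challenge..."
    StepInfo.CAPTURE_STARTED -> "Capture in progress"
}
```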

LivenessActiveListener

This listener provides information about the active face capture. This mode requires the user to connect dots in the correct order by moving their face. The callbacks tell the integrator the current status of the challenge and what to display. Keep in mind that when UISettings are provided to FaceCaptureView, there is no need to handle this, because the SDK will draw the challenge with the provided style.

Kotlin
1fun onPointerUpdate(pointInfo: PointerInfo)

PointerInfo contains the position of the user's "viewfinder": the point that the user needs to place on a target in order to mark it as "captured".

Kotlin
1fun onTargetUpdate(targetInfo: TargetInfo)

TargetInfo contains information about targets to capture.

  • x (Int) - X coordinate of the target.
  • y (Int) - Y coordinate of the target.
  • show (Boolean) - Indicates whether the target should be visible to the user.
  • radius (Int) - Radius of the target, relative to the capture frame size.
  • number (Int) - Number of the target.
  • completeness (Float) - Value ranging from 0.0 to 1.0, where 1.0 means the target is fully captured.
  • current (Boolean) - Indicates whether the given target is the currently active one (to be captured).
Kotlin
1fun onNumberTargets(numberOfTargets: Int)

This callback provides the number of points that need to be captured to pass the challenge.
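Because target coordinates and radius are relative to the capture frame, they typically need rescaling before being drawn on a preview of a different size. A minimal sketch (ScaledTarget and scaleTarget are illustrative names, not SDK API):

```kotlin
data class ScaledTarget(val x: Float, val y: Float, val radius: Float)

// Maps TargetInfo coordinates from capture-frame space to view space.
// The radius follows the horizontal scale so targets stay round.
fun scaleTarget(
    x: Int, y: Int, radius: Int,
    frameWidth: Int, frameHeight: Int,
    viewWidth: Int, viewHeight: Int
): ScaledTarget {
    val sx = viewWidth.toFloat() / frameWidth
    val sy = viewHeight.toFloat() / frameHeight
    return ScaledTarget(x * sx, y * sy, radius * sx)
}
```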

PassiveVideoListener

This listener helps to handle the passive video liveness capture mode. Before the capture starts there is a preparation phase, and the capture itself has its own progress. The information provided through this listener should be presented to the end user. Keep in mind that when UISettings are provided to FaceCaptureView, there is no need to handle this listener, because the SDK will draw the challenge with the provided style.

Kotlin
1fun onPreparationStarted()

Tells the integrator that the capture preparation has started.

Kotlin
1fun onPreparationFinished()

Tells the integrator that the preparation phase has finished. The capture will now be performed.

Kotlin
1fun overlayUpdated(overlay: OvalOverlay)

To make the capture easier, a UI oval is displayed to the end user. OvalOverlay carries the coordinates and size of that oval.

Kotlin
1fun progressUpdated(progress: Float)

Progress of capture.

CaptureResultListener

Used for all use cases. Provides information about flow result.

Kotlin
1fun onFinish(result: CaptureResult)

The CaptureResult instance is either Success, which indicates a successful flow, or Failure, which contains an Error instance. For more details, see the errors section.

CaptureFeedbackListener

This listener carries crucial information for the user about the capture and helps them find the optimal position in front of the camera.

Kotlin
1fun onFeedback(captureFeedback: CaptureFeedback)

CaptureFeedback is an enum with self-explanatory instructions. When UISettings are used, it is covered automatically by mapping these values to text instructions for the user.
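When UISettings are not used, the integrator performs that mapping manually. The sketch below uses a stand-in enum with hypothetical value names (only the FACE_INFO_ prefix is confirmed elsewhere in this guide; these specific values and messages are placeholders):

```kotlin
// Stand-in subset of CaptureFeedback; value names here are illustrative.
enum class CaptureFeedback { FACE_INFO_CENTER_FACE, FACE_INFO_TOO_CLOSE, FACE_INFO_TOO_FAR }

// Maps a feedback value to the instruction shown to the end user.
fun instructionFor(feedback: CaptureFeedback): String = when (feedback) {
    CaptureFeedback.FACE_INFO_CENTER_FACE -> "Center your face in the frame"
    CaptureFeedback.FACE_INFO_TOO_CLOSE -> "Move the phone farther away"
    CaptureFeedback.FACE_INFO_TOO_FAR -> "Move the phone closer"
}
```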

FaceTrackingInfoListener

This listener provides, in real time during the capture, the coordinates and size of a rectangle with the face position (relative to the preview frame size). It is helpful when there is a need to draw an overlay on the preview showing the detected face.

Kotlin
1fun faceTrackingInfo(trackingInfo: FaceTrackingInfo)

FaceTrackingInfo provides the "face rectangle" information.

LivenessProcessingListener

This listener provides information about current progress of uploading metadata to the server.

Kotlin
1fun onLivenessMetadataUploadProgressUpdated(@FloatRange(from = 0.0, to = 1.0) progress: Float)
MlcListener

This listener provides information related to the MultidimensionalLivenessCheck capture.

It contains the following methods:

Kotlin
1fun onSmileStabilityChange(@FloatRange(from = 0.0, to = 1.0) value: Float)

Returns the percentage of smile step completion.

Kotlin
1fun onSmileSizeChange(@FloatRange(from = 0.0, to = 1.0) value: Float)

Returns current smile size.

Kotlin
1fun onSmileFinished()

Called when smile acquisition process has ended.

Kotlin
1fun onIlluminationPrepared(@FloatRange(from = 0.0, to = 1.0) scale: Float)

Indicates that the illumination process is ready to start. The scale parameter is a value between 0.0 and 1.0 and should be used to rescale the preview component before triggering illumination.

Kotlin
1fun onIlluminationProgressChange(@FloatRange(from = 0.0, to = 1.0) progress: Float)

Returns current illumination progress value in the range between 0.0 and 1.0.

Kotlin
1fun onColorToDisplay(red: Int, green: Int, blue: Int)

Returns the colors to be used as part of the illumination process. The preview background should change according to these colors.
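If the preview background is tinted manually (rather than via UISettings), the three channel values have to be packed into a color int. The helper below mirrors android.graphics.Color.rgb without the Android dependency:

```kotlin
// Packs the channels from onColorToDisplay into an opaque ARGB color int,
// equivalent to android.graphics.Color.rgb(red, green, blue).
fun rgbToColorInt(red: Int, green: Int, blue: Int): Int =
    (0xFF shl 24) or (red shl 16) or (green shl 8) or blue
```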

Kotlin
1fun onIlluminationFinished()

Called after end of illumination process.

Kotlin
1fun onIlluminationDemand(request: IlluminationRequest)

Returns IlluminationRequest which is used to start illumination.

Additional classes

IlluminationRequest

A simple interface responsible for starting the illumination process. It is returned by the onIlluminationDemand method of MlcListener.

Kotlin
1interface IlluminationRequest {
2 fun start()
3}

ActiveLivenessUseCase 

This use case performs face capture with on-device liveness verification, meaning no external requests are made to verify the user. "Active" in the use case name indicates that the end user has to complete an action to pass verification: points on the screen need to be joined in a given order, using the face as a pointer.

Use case should be created using its constructor:

Kotlin
1ActiveLivenessUseCase (
2 val listeners: ActiveCaptureListeners = ActiveCaptureListeners(),
3 val numberOfPoints: Int,
4 val timeoutInSeconds: Long = DEFAULT_TIMEOUT_SECONDS,
5 val securityLevel: SecurityLevel = SecurityLevel.HIGH,
6)
  • listeners (ActiveCaptureListeners) - Aggregated listeners used during this use case. See the listeners section for more details.
  • numberOfPoints (Int) - The number of points to be connected during the capture.
  • timeoutInSeconds (Long) - After the given number of seconds, the capture finishes with a timeout error. Default value: 120.
  • securityLevel (SecurityLevel) - Determines how restrictive the liveness algorithms are. Default value: SecurityLevel.HIGH.

Listeners

LivenessActiveListener

This listener provides information about the active face capture. This mode requires the user to connect dots in the correct order by moving their face. The callbacks tell the integrator the current status of the challenge and what to display. Keep in mind that when UISettings are provided to FaceCaptureView, there is no need to handle this, because the SDK will draw the challenge with the provided style.

Kotlin
1fun onPointerUpdate(pointInfo: PointerInfo)

PointerInfo contains the position of the user's "viewfinder": the point that the user needs to place on a target in order to mark it as "captured".

Kotlin
1fun onTargetUpdate(targetInfo: TargetInfo)

TargetInfo contains information about targets to capture.

  • x (Int) - X coordinate of the target.
  • y (Int) - Y coordinate of the target.
  • show (Boolean) - Indicates whether the target should be visible to the user.
  • radius (Int) - Radius of the target, relative to the capture frame size.
  • number (Int) - Number of the target.
  • completeness (Float) - Value ranging from 0.0 to 1.0, where 1.0 means the target is fully captured.
  • current (Boolean) - Indicates whether the given target is the currently active one (to be captured).
Kotlin
1fun onNumberTargets(numberOfTargets: Int)

This callback provides the number of points that need to be captured to pass the challenge.

CaptureResultListener

Provides information about flow result.

Kotlin
1fun onFinish(result: CaptureResult)

The CaptureResult instance is either Success, which indicates a successful flow, or Failure, which contains an Error instance. For more details, see the errors section.

FaceTrackingInfoListener

This listener provides, in real time during the capture, the coordinates and size of a rectangle with the face position (relative to the preview frame size). It is helpful when there is a need to draw an overlay on the preview showing the detected face.

Kotlin
1fun faceTrackingInfo(trackingInfo: FaceTrackingInfo)
CaptureFeedbackListener

This listener carries crucial information for the user about the capture and helps them find the optimal position in front of the camera.

Kotlin
1fun onFeedback(captureFeedback: CaptureFeedback)

CaptureFeedback is an enum with self-explanatory instructions. When UISettings are used, it is covered automatically by mapping these values to text instructions for the user.

PassiveLivenessUseCase 

This use case performs face capture with on-device liveness verification, meaning no external requests are made to verify the user. "Passive" in the use case name indicates that the end user does not have to do anything to pass verification - just place their face in front of the camera.

Use case should be created using its constructor:

Kotlin
1PassiveLivenessUseCase (
2 val listeners: ActiveCaptureListeners = ActiveCaptureListeners(),
3 val timeoutInSeconds: Long = DEFAULT_TIMEOUT_SECONDS,
4 val securityLevel: SecurityLevel = SecurityLevel.HIGH,
5)
  • listeners (ActiveCaptureListeners) - Aggregated listeners used during this use case. See the listeners section for more details.
  • timeoutInSeconds (Long) - After the given number of seconds, the capture finishes with a timeout error. Default value: 120.
  • securityLevel (SecurityLevel) - Determines how restrictive the liveness algorithms are. Default value: SecurityLevel.HIGH.

Listeners

CaptureResultListener

Provides information about flow result.

Kotlin
1fun onFinish(result: CaptureResult)

The CaptureResult instance is either Success, which indicates a successful flow, or Failure, which contains an Error instance. For more details, see the errors section.

FaceTrackingInfoListener

This listener provides, in real time during the capture, the coordinates and size of a rectangle with the face position (relative to the preview frame size). It is helpful when there is a need to draw an overlay on the preview showing the detected face.

Kotlin
1fun faceTrackingInfo(trackingInfo: FaceTrackingInfo)
CaptureFeedbackListener

This listener carries crucial information for the user about the capture and helps them find the optimal position in front of the camera.

Kotlin
1fun onFeedback(captureFeedback: CaptureFeedback)

CaptureFeedback is an enum with self-explanatory instructions. When UISettings are used, it is covered automatically by mapping these values to text instructions for the user.

Errors 

For every flow there is a possibility of receiving a result of type Error, meaning that something went wrong during the capture or the backend communication. Fortunately, the Error object contains a lot of useful information that helps to handle the failed flow.

Error

  • type (ErrorType) - Type of the error: high-level information about what went wrong. The types are described below.
  • code (Int) - Special code dedicated to the particular case. Very helpful in L2/L3 troubleshooting.
  • message (String) - Message with the error description.
  • unlockDateTime (Long?) - Time, in the UTC time zone, when the capture will be unblocked. This field has a value when ErrorType is DEVICE_BLOCKED.
  • failureReasons (List?) - List of enums containing the reasons for failure.

ErrorType

  • CONNECTION_ISSUE - General connection issue. See the message and error code for more information.
  • AUTHENTICATION - Backend authentication failed. Probably wrong credentials have been used for the given environment.
  • INVALID_SESSION - The session id is not correct. Most probably the session has expired or has been finished.
  • TIMEOUT - A timeout occurred during the flow.
  • BAD_CAPTURE - The capture failed. The face was not detected or the liveness check did not pass.
  • UNKNOWN - Unknown type of exception. Also used as the default type in a few cases.
  • CANCELED - The flow has been canceled, either by the integrator or automatically when a Lifecycle has been passed to the setUp method.
  • VERIFICATION - Device signature verification failed.
  • INVALID_LICENSE - License validation failed. Make sure the license has been activated with LicenseManager.
  • DEVICE_BLOCKED - Capture on this device is blocked for a period of time because of many failures.
  • LIVENESS_CHECK - Liveness verification failed during the capture. This can happen during offline capture use cases such as ActiveLivenessUseCase or PassiveLivenessUseCase.
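A typical onFinish handler branches on the error type to decide whether a retry makes sense. The sketch below uses a stand-in subset of ErrorType and an illustrative policy (the retry decisions are an example, not a recommendation from the SDK):

```kotlin
// Stand-in subset of the SDK's ErrorType, for illustration only.
enum class ErrorType { CONNECTION_ISSUE, TIMEOUT, BAD_CAPTURE, INVALID_SESSION, DEVICE_BLOCKED, INVALID_LICENSE }

// Example policy: transient failures retry, configuration problems do not.
fun shouldRetryAutomatically(type: ErrorType): Boolean = when (type) {
    ErrorType.CONNECTION_ISSUE, ErrorType.TIMEOUT, ErrorType.BAD_CAPTURE -> true
    ErrorType.INVALID_SESSION, ErrorType.DEVICE_BLOCKED, ErrorType.INVALID_LICENSE -> false
}
```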

FailureReason

  • FACE_NOT_DETECTED - The face was not detected.
  • INSUFFICIENT_SMILE - The smile was not detected (or was not genuine enough) during the MLC capture.
  • INCORRECT_FACE_POSITION - The face was not correctly positioned within the frame.
  • TOO_DARK - It is too dark to perform the capture.
  • TOO_BRIGHT - It is too bright to perform the capture.

Integration of MLC capture 

MultidimensionalLivenessCheck is the recommended mode for liveness capture. This capture consists of three phases:

  1. Framing - the user has to correctly align the face within the frame.
  2. Smile - the user needs to smile genuinely.
  3. Illumination - the face is verified with a sequence of color flashes.

There are two ways of integrating this capture: by providing UiSettings, or by handling each step of the capture on your own.

Integration with using UiSettings

Using UiSettings is the easiest way to integrate the MLC capture. It requires the following steps:

  1. Create session with Liveness of type MLC. See creating capture session section for more details.
  2. Create RemoteUseCase with sessionId, EnvironmentInfo and listeners.
Kotlin
1val remoteUseCase = RemoteUseCase(
2 sessionId,
3 RemoteCaptureListeners(
4 stepInfoListener = stepInfoListener,
5 captureResultListener = captureResultListener,
6 ),
7 environmentInfo
8)
  3. Create mlcCaptureSettings with feedbacks specific to MLC and pass it to a UiSettings object.
  4. Call setUp on FaceCaptureView with the previously created objects:
Kotlin
1binding.captureView.setUp(remoteUseCase,
2 lifecycle, //If you don't want to start and stop capture on your own
3 UISettings(mlcCaptureSettings = mlcCaptureSettings),
4 )

After that, the capture will start.

Integration based on MlcListener

The second way of handling the MLC capture is to implement your own UI and logic for each phase of the capture. It is more complex, but allows better customization than the UiSettings approach. This integration is based on correctly handling the information coming from MlcListener. As mentioned before, the MLC capture consists of three phases:

  1. Framing - requires correctly placing the face within the preview. To proceed, the user has to follow the feedback coming from CaptureFeedbackListener.
  2. Smile - starts after receiving the feedback CaptureFeedback.FACE_INFO_MAKE_A_SMILE. There are three methods in MlcListener related to this phase: onSmileStabilityChange, which can be used to track the progress of the whole phase; onSmileSizeChange, to show the smile progress to the user (for example with a SmileIndicatorBar); and onSmileFinished, which signals that the smile phase has ended.
  3. Illumination - during this phase the preview background has to change colors according to values coming from the SDK. To properly handle this phase, you have to:
    • Rescale FaceCaptureView with the value received from the onIlluminationPrepared method.
    • Show additional instructions to the user, for example about the incoming color changes or the need to move the face closer to the screen due to the changed preview size.
    • Trigger the illumination process by calling start() on the IlluminationRequest received from the onIlluminationDemand method.
    • Adjust the color of the preview background to the values coming from onColorToDisplay. The progress of the illumination phase is reported in onIlluminationProgressChange as a value in the range 0.0 to 1.0. The process ends when onIlluminationFinished is called.
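The rescaling step from onIlluminationPrepared can be sketched as plain arithmetic (ViewSize and rescaledPreview are illustrative names; in a real app the result would be applied to the FaceCaptureView layout params):

```kotlin
data class ViewSize(val width: Int, val height: Int)

// Shrinks the preview by the scale delivered in onIlluminationPrepared,
// which is a value in (0.0, 1.0].
fun rescaledPreview(original: ViewSize, scale: Float): ViewSize {
    require(scale > 0f && scale <= 1f) { "scale must be in (0.0, 1.0]" }
    return ViewSize(
        (original.width * scale).toInt(),
        (original.height * scale).toInt()
    )
}
```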

Once MlcListener is implemented according to the instructions above, you must:

  1. Create a screen with a FaceCaptureView that does not cover all the available space, because during the illumination phase the user's face is verified with a sequence of color flashes and the FaceCaptureView background has to be adjusted to the colors from the onColorToDisplay callback.
  2. Create session with Liveness of type MLC. See creating capture session section for more details.
  3. Create RemoteUseCase with sessionId, EnvironmentInfo and listeners.
Kotlin
1val remoteUseCase = RemoteUseCase(
2 sessionId,
3 RemoteCaptureListeners(
4 stepInfoListener = stepInfoListener,
5 captureFeedback = captureFeedbackListener,
6 captureResultListener = captureResultListener,
7 mlcListener = mlcListener
8 ),
9 environmentInfo
10)
  4. Call setUp on FaceCaptureView with the previously created objects:
Kotlin
1binding.captureView.setUp(remoteUseCase,
2 lifecycle, //If you don't want to start and stop capture on your own
3 null,
4 )

Face capture - local 

Creating a FaceCaptureHandler 

These instructions will help you create a BioCapture handler.

  1. Retrieve a capture handler to perform all the biometric capture operations. You must first configure the capture options. Projects that use Kotlin can use the handlers from com.idemia.smartsdk.capture, which support initializing the capture preview with a suspend method.

BioCaptureMode is now deprecated and should not be used during capture configuration, because it will be removed in a future release.

  • Review the use cases named Capture biometrics.

  • Review all the features provided by the BioCaptureHandler here.

1 val captureOptions = FaceCaptureOptions(FaceLiveness.PASSIVE).apply {
2 camera = Camera.FRONT
3 captureTimeout = 120
4 overlay = Overlay.OFF
5 }
6 val captureHandler = FaceCaptureHandler(context, captureOptions)
  • activity (Activity) - The Android activity.
  • options (IFaceCaptureOptions) - The capture options to configure the bio capture handler.
  • callbacks (BioCaptureAsyncCallbacks) - Callbacks to be executed depending on the result.

Errors

  • MSC_ERR_APPLINOTAVAILABLE - The application parameter is not available.
  • MSC_ERR_GRAPH_INITIALISATION_FAILED - The graph initialization failed.
  • MSC_ERR_INIT - Initialization failed.
  • MSC_ERR_PARAMETERS - Parameters are invalid.
  • MSC_ERR_PARAMETER_NOT_FOUND - A parameter is missing.
  • MSC_ERR_PARAMETER_SIZE - A parameter size is incorrect.
  • MSC_ERR_PARAMETER_UNKNOWN - One of the parameters is unknown.
  • MSC_ERR_INVALID_HANDLE - The handle is invalid.
  • LIBS_NOT_FOUND - Java libraries are not found.
  • NO_CONTEXT_SET - The Java context is not set.
  • NOT_EXECUTED - Java is unable to execute.
  • MSC_ERR_LICENSE - The license is invalid.
  • MSC_ERR_MEMALLOC - Memory allocation issue.
  • MSC_ERR_PROFILENOTAVAILABLE - The BioCapture profile is not available.
  • MSC_ERR_SUBPROFILENOTAVAILABLE - The BioCapture sub-profile is not available.
  • MSC_ERR_TYPE_MISMATCH - BioCapture type mismatch.
  • UNKNOWN - Unknown error.

Handlers 

This section discusses the BioCapture handler, FaceCapture handler and BioMatcher handler.

BioCapture handler

You must retrieve the capture handler through the Biometric Capture SDK entry point.

Face listener

This sets the listener to receive feedback (such as when a user moves their face to the right) as shown in the snippet:

Java
1captureHandler.setFaceTrackingListener(new FaceCaptureTrackingListener() {
2 @Override
3 public void onTracking(List<FaceTracking> trackingInfo) {
4 //Tracking info to know where the face is.
5 }
6 });

Start preview

This asynchronously starts the camera preview. It is recommended to start the capture once the preview has been initialized, as shown in the snippet:

Java
1handler.startPreview(new PreviewStatusListener() {
2 @Override
3 public void onStarted() {
4 try {
5 captureHandler.startCapture();
6 } catch (MSCException e) {
7 // handle exception
8 }
9 }
10
11 @Override
12 public void onError(PreviewError error) {
13 // Preview initialization failed and can not be started
14 }
15 });
Kotlin
1coroutineScope.launch {
2 handler.startPreview()
3 handler.startCapture()
4}

Stop preview

This stops the camera preview as shown in the snippet:

Java
1handler.stopPreview();

Start capture

This starts the biometric capture as shown in the snippet.

Java
1handler.startCapture();

Stop capture

This stops the biometric capture as shown in the snippet:

Java
1handler.stopCapture();

Switch camera

This switches between different cameras as shown in the snippet:

Java
1handler.switchCamera(Camera.FRONT); // Use front camera
2handler.switchCamera(Camera.REAR); // Use rear camera

Destroy

This releases all the handler resources as shown in the snippet:

Java
1handler.destroy();

Overlay

This sets the overlay option.

Java
1handler.setOverlay(Overlay.OFF); // Disable preview's overlay
2handler.setOverlay(Overlay.ON); // Enable preview's overlay

CaptureOptions

This retrieves the capture options used in this handler as shown in the snippet:

Java
1ICaptureOptions options = handler.getCaptureOptions();

Force capture

This forces a capture as shown in the snippet:

Java
1handler.forceCapture();

Capture handler status

Note: Check CaptureHandlerStatus.

This retrieves the status of the capture handler as shown in the snippet:

Java
1CaptureHandlerStatus captureHandlerStatus = handler.getCaptureStatus();

FaceCapture handler

Note: It extends from BioCaptureHandler.

You must retrieve the face capture handler through the Biometric Capture SDK entry point, as shown in the snippet:

Java
1// Get activity from application
2Activity activity = ...
3// Populate a CaptureOptions object
4IFaceCaptureOptions captureOptions = new FaceCaptureOptions(FaceLiveness.PASSIVE);
5captureOptions.setFaceLivenessSecurityLevel(FaceLivenessSecurityLevel.HIGH);
6captureOptions.setCamera(Camera.FRONT);
7captureOptions.setCaptureTimeout(120);
8captureOptions.setOverlay(Overlay.OFF);
9BioSdk.createFaceCaptureHandler(activity, captureOptions, new MscAsyncCallbacks<IFaceCaptureHandler>() {
10 @Override
11 public void onPreExecute() {
12 // Optional hook on the builtin Android AsyncTask call-back `onPreExecute`
13 }
14
15 @Override
16 public void onSuccess(IFaceCaptureHandler faceCaptureHandler) {
17 // Indicates that initialization succeeded, the returned handler can be used to start the capture.
18 handler = faceCaptureHandler;
19 //handler.setTotalNumberOfCapturesBeforeDelay(-1); to disable delays between face capture failures.
20 }
21
22 @Override
23 public void onError(BioCaptureHandlerError e) {
24 // An error has occurred during the initialization
25 }
26});

Capture result listener

This sets the listener to receive the face captures. The face image callback will be fired whenever the capture is finished, as shown in the snippet:

Java
1handler.setFaceCaptureResultListener(new FaceCaptureResultListener() {
2 @Override
3 public void onCaptureSuccess(@NotNull FaceImage image) {
4 }
5
6 @Override
7 public void onCaptureFailure(@NotNull CaptureError captureError,
8 @NotNull IBiometricInfo biometricInfo,
9 @NotNull Bundle extraInfo) {
10 }
11 });
onCaptureSuccess

Called when the capture finishes successfully.

  • image (FaceImage): The captured face image.

FaceImage

  • getLivenessResult (FaceLivenessResult): Resolution of the capture liveness: LIVE, FAKE, or NO_DECISION.
  • getMetadata (Metadata): Low-level data needed for verification or debugging.

FaceImage extends IImage, so it is possible to call getImageQuality() on it. However, this is not recommended: imageQuality is only available for finger capture, and this method will always return -1 for FaceImage.

onCaptureFailure

Called when the capture fails.

  • captureError (CaptureError): The reason for the capture failure.
  • biometricInfo (IBiometricInfo): Biometric information about location and classification.
  • extraInfo (Bundle): Holds extra capture info, such as the capture delay date.

Use CR2D challenges

CR2D is another type of challenge; it contains target points and a point controlled by the user. To use it, pass FaceLiveness.ACTIVE to the FaceCaptureOptions constructor.

This example sets CR2D in capture options as shown in the snippet:

Java
1FaceCaptureOptions options = new FaceCaptureOptions(FaceLiveness.ACTIVE);

Use passive liveness challenge

Passive mode checks for liveness without user interaction (no head movement required). As in the default mode, the user only needs to show their face in front of the camera so that an image can be acquired. Special algorithms estimate whether the user is a real person. To use this mode, pass FaceLiveness.PASSIVE to the FaceCaptureOptions constructor.

This example sets passive liveness in the capture options as shown in the snippet:

Java
1FaceCaptureOptions options = new FaceCaptureOptions(FaceLiveness.PASSIVE);

Get debug data

You can save some capture data on the user device's memory. In some cases, keeping those files might help to solve issues.

Below is an example of how to configure the debug data options. The data can be found on the SD card in the SmartSDK_debug_data directory:

Java
1[...]
2 DebugSettingsBuilder debugSettingsBuilder = new DebugSettingsBuilder();
3 debugSettingsBuilder.logLevel(logLevel)
4 .storingType(DataStoringType.LAST_SESSION_ONLY)
5 .recordRtv(DebugOption.DISABLED)
6 .recordPostMortemRtv(DebugOption.DISABLED)
7 .saveCapturedImages(DebugOption.DISABLED);
8 captureOptions.setDebugDataSettings(debugSettingsBuilder.build());

Note: DataStoringType might have two values: LAST_SESSION_ONLY or MULTIPLE_SESSIONS. The first overwrites data in a single directory. The second makes a separate directory per capture.

An option exists to store special .rtv files that help you understand what is happening during a capture.

Note: Storing these files takes a lot of space. LogLevel describes what part of the logs will be saved to a file. If needed, the integrator can also save captured images by enabling saveCapturedImages option.

Set maximum captures before delay

This sets the maximum number of failed capture attempts allowed before further captures are temporarily blocked.

Values less than or equal to 0 disable the functionality. Values greater than 0 set the number of attempts before blocking.

The default value is 5, as shown in the snippet:

Java
1((FaceCaptureHandler)handler).setMaxCapturesBeforeDelay(5);

There is also a getter for this value as shown in the snippet:

Java
1((FaceCaptureHandler)handler).getMaxCapturesBeforeDelay();

Set capture delay time array

This sets the list of capture delays applied to failed attempts that occur after maxCapturesBeforeDelay.

The delay for the next attempt after n failed attempts is taken from the array as timeCaptureDelayArray[n - maxCapturesBeforeDelay].

For all attempts beyond the array length, the last item is used, as shown in the snippet:

Java
1List<Long> delayTimes = Arrays.asList(1L, 5L, 10L, 30L, 60L);
2((FaceCaptureHandler)captureHandler).setTimeCaptureDelayArray(delayTimes);
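The indexing rule described above can be sketched as a standalone helper. This is illustrative only (the real logic lives inside the SDK), and it assumes n counts the failed captures so far:

```java
class CaptureDelayPolicy {
    /**
     * Returns the delay before the next attempt after n failed captures,
     * following the rule timeCaptureDelayArray[n - maxCapturesBeforeDelay],
     * with the last entry reused once the index runs past the array end.
     */
    static long delayFor(int n, int maxCapturesBeforeDelay, long[] delays) {
        if (maxCapturesBeforeDelay <= 0) return 0L; // delays disabled
        int index = n - maxCapturesBeforeDelay;
        if (index < 0) return 0L;                   // not blocked yet
        return delays[Math.min(index, delays.length - 1)];
    }
}
```

For example, with the delays {1, 5, 10, 30, 60} and maxCapturesBeforeDelay of 5, the first blocked attempt waits 1, later ones walk through the array, and every attempt past its end waits 60.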

Get time to unlock capture

This provides information about the delay time before the user can retry (specified in seconds). A return value of 0 means that the capture is not blocked.

An example request is shown in the snippet:

Java
1((FaceCaptureHandler)handler).timeToUnlock();

Get liveness captures attempts left before delay

This provides the number of capture attempts that remain before the delay is applied.

It returns the number of attempts before the capture will be blocked, as shown in the snippet:

Java
1handler.captureAttemptsLeft();

It returns 0 if the capture is blocked, and Int.MAX_VALUE if the capture delays are turned off.
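The return contract can be summarized in a small sketch (a hypothetical helper, not SDK code):

```java
class AttemptsCounter {
    /**
     * Mirrors the captureAttemptsLeft() contract: Integer.MAX_VALUE when
     * delays are turned off, 0 when the capture is blocked, otherwise the
     * remaining attempts before blocking.
     */
    static int attemptsLeft(int failedAttempts, int maxCapturesBeforeDelay) {
        if (maxCapturesBeforeDelay <= 0) return Integer.MAX_VALUE; // delays off
        return Math.max(0, maxCapturesBeforeDelay - failedAttempts);
    }
}
```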

Liveness security levels

In IFaceCaptureOptions you can set the liveness security strength. It is configured via the FaceLivenessSecurityLevel enum.

The liveness security levels are:

  • LOW
  • MEDIUM
  • HIGH (recommended)

Compression recommendations 

Selfie images
  • Recommended compression is JPEG90
  • The size of the image will be about 100 KB
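The SDK performs the capture itself, but the recommendation can be reproduced with the standard javax.imageio API. The sketch below, assuming JPEG90 denotes JPEG at 90% quality, compresses any BufferedImage accordingly (on Android you would typically use Bitmap.compress(Bitmap.CompressFormat.JPEG, 90, stream) instead):

```java
import javax.imageio.IIOImage;
import javax.imageio.ImageIO;
import javax.imageio.ImageWriteParam;
import javax.imageio.ImageWriter;
import javax.imageio.stream.MemoryCacheImageOutputStream;
import java.awt.image.BufferedImage;
import java.io.ByteArrayOutputStream;
import java.io.IOException;

class Jpeg90 {
    /** Encodes the image as JPEG at 90% quality and returns the bytes. */
    static byte[] compress(BufferedImage image) {
        try {
            ImageWriter writer = ImageIO.getImageWritersByFormatName("jpg").next();
            ImageWriteParam param = writer.getDefaultWriteParam();
            param.setCompressionMode(ImageWriteParam.MODE_EXPLICIT);
            param.setCompressionQuality(0.9f); // "JPEG90"
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            writer.setOutput(new MemoryCacheImageOutputStream(out));
            writer.write(null, new IIOImage(image, null, null), param);
            writer.dispose();
            return out.toByteArray();
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
    }
}
```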

BioStore 

The use of this component is optional. Its purpose is to allow the integrator to easily persist templates.

Query templates by userUUID 

This lists the templates stored in the repository filtering by user.

Java
1BioStoreDB.listTemplates(context, userId, new DataBaseAsyncCallbacks<List<IMorphoTemplate>>() {
2 @Override
3 public void onPreExecute() {
4 // Optional hook on the builtin Android AsyncTask callback `onPreExecute`
5 }
6
7 @Override
8 public void onSuccess(List<IMorphoTemplate> result) {
9 //The list of templates that match the criteria.
10 }
11
12 @Override
13 public void onError(Exception e) {
14 // An error has occurred.
15 }
16 });

Function

Java
1public static void listTemplates(final Context context, final UUID userId, DataBaseAsyncCallbacks<List<IMorphoTemplate>> callbacks);
  • context (Context): The Android context.
  • userId (UUID): The user identifier.
  • callbacks (DataBaseAsyncCallbacks): Callbacks to be executed depending on the result.

Errors

You will receive an exception reporting the error.

Query templates by userUUID and modality 

This lists the templates stored in the repository filtering by User and BiometricModality.

Java
1BioStoreDB.listTemplates(context, userId, biometricModality, new DataBaseAsyncCallbacks<List<IMorphoTemplate>>() {
2 @Override
3 public void onPreExecute() {
4 // Optional hook on the builtin Android AsyncTask callback `onPreExecute`
5 }
6
7 @Override
8 public void onSuccess(List<IMorphoTemplate> result) {
9 //The list of templates that match the criteria.
10 }
11
12 @Override
13 public void onError(Exception e) {
14 // An error has occurred.
15 }
16 });

Function

Java
1void listTemplates(final Context context, final UUID userId, final BiometricModality biometricModality, DataBaseAsyncCallbacks<List<IMorphoTemplate>> callbacks);
  • context (Context): The Android context.
  • userId (UUID): The user id.
  • biometricModality (BiometricModality): The BiometricModality enum option.
  • callbacks (DataBaseAsyncCallbacks): Callbacks to be executed depending on the result.

Errors

You will receive an exception reporting the error.

Add template 

This stores a template in the repository. If there is a previous template with the same user UUID, Biometric location, and Biometric modality, it will be updated and the UUID returned.

Note: You cannot have two templates with the same configuration.

Java
1BioStoreDB.addTemplate(context, morphoTemplate, new DataBaseAsyncCallbacks<UUID>() {
2 @Override
3 public void onPreExecute() {
4 // Optional hook on the builtin Android AsyncTask callback `onPreExecute`
5 }
6
7 @Override
8 public void onSuccess(UUID result) {
9 //The template has been added and the template's uuid is returned as a parameter.
10 }
11
12 @Override
13 public void onError(Exception e) {
14 // An error has occurred.
15 }
16 });

Function

Java
1public static void addTemplate(final Context context, final IMorphoTemplate template, DataBaseAsyncCallbacks<UUID> callbacks);
  • context (Context): The Android context.
  • template (IMorphoTemplate): The template to be stored.
  • callbacks (DataBaseAsyncCallbacks): Callbacks to be executed depending on the result.

Errors

You will receive an exception reporting the error.

Update template 

This updates a template in the repository.

Java
1BioStoreDB.updateTemplate(context, morphoTemplate, new DataBaseAsyncCallbacks<Void>() {
2 @Override
3 public void onPreExecute() {
4 // Optional hook on the builtin Android AsyncTask callback `onPreExecute`
5 }
6
7 @Override
8 public void onSuccess(Void result) {
9 //updated.
10 }
11
12 @Override
13 public void onError(Exception e) {
14 // An error has occurred.
15 }
16});

Function

Java
1void updateTemplate(final Context context, final IMorphoTemplate template, DataBaseAsyncCallbacks<Void> callbacks);
  • context (Context): The Android context.
  • template (IMorphoTemplate): The template to be updated.
  • callbacks (DataBaseAsyncCallbacks): Callbacks to be executed depending on the result.

Errors

You will receive an exception reporting the error.

Remove template 

This removes a template from the repository.

Java
1BioStoreDB.removeTemplate(context, templateId, new DataBaseAsyncCallbacks<Void>() {
2 @Override
3 public void onPreExecute() {
4 // Optional hook on the builtin Android AsyncTask callback `onPreExecute`
5 }
6
7 @Override
8 public void onSuccess(Void result) {
9 //The template was removed.
10 }
11
12 @Override
13 public void onError(Exception e) {
14 // An error has occurred.
15 }
16 });

Function

Java
1void removeTemplate(final Context context, final UUID templateId, DataBaseAsyncCallbacks<Void> callbacks);
  • context (Context): The Android context.
  • templateId (UUID): The template id to be removed.
  • callbacks (DataBaseAsyncCallbacks): Callbacks to be executed depending on the result.

Errors

You will receive an exception reporting the error.

Remove templates associated to one userUUID 

This removes the templates associated to the user identifier from the repository.

Java
1BioStoreDB.removeTemplateByUserId(context, userId, new DataBaseAsyncCallbacks<Integer>() {
2 @Override
3 public void onPreExecute() {
4 // Optional hook on the builtin Android AsyncTask callback `onPreExecute`
5 }
6
7 @Override
8 public void onSuccess(Integer result) {
9 //The number of templates removed.
10 }
11
12 @Override
13 public void onError(Exception e) {
14 // An error has occurred.
15 }
16 });

Function

Java
1void removeTemplateByUserId(final Context context, final UUID userId, DataBaseAsyncCallbacks<Integer> callbacks);
  • context (Context): The Android context.
  • userId (UUID): The user id.
  • callbacks (DataBaseAsyncCallbacks): Callbacks to be executed depending on the result.

Errors

You will receive an exception reporting the error.

Retrieve template 

This retrieves a template from the database.

Java
1BioStoreDB.getTemplate(context, templateId, new DataBaseAsyncCallbacks<IMorphoTemplate>() {
2 @Override
3 public void onPreExecute() {
4 // Optional hook on the builtin Android AsyncTask callback `onPreExecute`
5 }
6
7 @Override
8 public void onSuccess(IMorphoTemplate result) {
9 //The template if exists.
10 }
11
12 @Override
13 public void onError(Exception e) {
14 // An error has occurred.
15 }
16 });

Function

Java
1void getTemplate(final Context context, final UUID templateId, DataBaseAsyncCallbacks<IMorphoTemplate> callbacks);
  • context (Context): The Android context.
  • templateId (UUID): The template id.
  • callbacks (DataBaseAsyncCallbacks): Callbacks to be executed depending on the result.

Errors

You will receive an exception reporting the error.

Clear database 

This clears all the data stored in the database.

Java
1BioStoreDB.clear(context, new DataBaseAsyncCallbacks<Void>() {
2 @Override
3 public void onPreExecute() {
4 // Optional hook on the builtin Android AsyncTask callback `onPreExecute`
5 }
6
7 @Override
8 public void onSuccess(Void result) {
9 //Data has been cleared
10 }
11
12 @Override
13 public void onError(Exception e) {
14 // An error has occurred
15 }
16 });

Function

Java
1void clear(final Context context, DataBaseAsyncCallbacks<Void> callbacks);
  • context (Context): The Android context.
  • callbacks (DataBaseAsyncCallbacks): Callbacks to be executed depending on the result.

Errors

You will receive an exception reporting the error.

Add user 

This adds a new user to the database.

Java
1IUser user = new User();
2 user.setName("Jose");
3 BioStoreDB.addUser(context, user, new DataBaseAsyncCallbacks<UUID>() {
4 @Override
5 public void onPreExecute() {
6 // Optional hook on the builtin Android AsyncTask callback `onPreExecute`
7 }
8
9 @Override
10 public void onSuccess(UUID result) {
11 //User saved
12 }
13
14 @Override
15 public void onError(Exception e) {
16 // An error has occurred
17 }
18 });

Function

Java
1void addUser(final Context context, final IUser user, DataBaseAsyncCallbacks<UUID> callbacks);
  • context (Context): The Android context.
  • user (IUser): The user.
  • callbacks (DataBaseAsyncCallbacks): Callbacks to be executed depending on the result.

Errors

You will receive an exception reporting the error.

Update user 

This updates a user in the database.

Java
1IUser user = ... //retrieve old user
2 BioStoreDB.updateUser(context, user, new DataBaseAsyncCallbacks<Void>() {
3 @Override
4 public void onPreExecute() {
5 // Optional hook on the builtin Android AsyncTask callback `onPreExecute`
6 }
7
8 @Override
9 public void onSuccess(Void result) {
10 //User updated.
11 }
12
13 @Override
14 public void onError(Exception e) {
15 // An error has occurred.
16 }
17 });

Function

Java
1void updateUser(final Context context, final IUser user, DataBaseAsyncCallbacks<Void> callbacks);
  • context (Context): The Android context.
  • user (IUser): The user.
  • callbacks (DataBaseAsyncCallbacks): Callbacks to be executed depending on the result.

Errors

You will receive an exception reporting the error.

Remove user 

This removes a user from the database.

Java
1BioStoreDB.removeUser(context, uuid, new DataBaseAsyncCallbacks<Void>() {
2 @Override
3 public void onPreExecute() {
4 // Optional hook on the builtin Android AsyncTask callback `onPreExecute`
5 }
6
7 @Override
8 public void onSuccess(Void result) {
9 //User removed
10 }
11
12 @Override
13 public void onError(Exception e) {
14 // An error has occurred
15 }
16 });

Function

Java
1void removeUser(final Context context, final UUID uuid, DataBaseAsyncCallbacks<Void> callbacks);
  • context (Context): The Android context.
  • uuid (UUID): The user uuid.
  • callbacks (DataBaseAsyncCallbacks): Callbacks to be executed depending on the result.

Errors

You will receive an exception reporting the error.

Get user 

This retrieves a user from the database.

Java
1BioStoreDB.getUser(context, uuid, new DataBaseAsyncCallbacks<IUser>() {
2 @Override
3 public void onPreExecute() {
4 // Optional hook on the builtin Android AsyncTask callback `onPreExecute`
5 }
6
7 @Override
8 public void onSuccess(IUser result) {
9 //User
10 }
11
12 @Override
13 public void onError(Exception e) {
14 // An error has occurred
15 }
16 });

Function

Java
1void getUser(final Context context, final UUID uuid, DataBaseAsyncCallbacks<IUser> callbacks);
  • context (Context): The Android context.
  • uuid (UUID): The user uuid.
  • callbacks (DataBaseAsyncCallbacks): Callbacks to be executed depending on the result.

Errors

You will receive an exception reporting the error.

List users 

List the users stored in the repository.

Java
1BioStoreDB.listUsers(context, new DataBaseAsyncCallbacks<List<IUser>>() {
2 @Override
3 public void onPreExecute() {
4 // Optional hook on the builtin Android AsyncTask callback `onPreExecute`
5 }
6
7 @Override
8 public void onSuccess(List<IUser> result) {
9 //Users
10 }
11
12 @Override
13 public void onError(Exception e) {
14 // An error has occurred
15 }
16 });

Function

Java
1void listUsers(final Context context, DataBaseAsyncCallbacks<List<IUser>> callbacks);
  • context (Context): The Android context.
  • callbacks (DataBaseAsyncCallbacks): Callbacks to be executed depending on the result.

Errors

You will receive an exception reporting the error.

Helper objects 

BioSdk class - enableAnalytics

This method turns on collecting and sending analytics reports. It also allows changing the analytics server and its API key.

NOTE: The server is set to Europe by default.

  • network (Network): The preferred network type that will be used to send the report.
  • analyticsConfigurationData (AnalyticsConfigurationData): Class that allows setting the server URL and API key.

BioSdk class - disableAnalytics

This turns off collecting and sending analytics reports.

Note: By default, the analytics mechanism is turned on and the server is set to Europe.

This section describes the helper objects that are necessary to use the Biometric Capture SDK.

MscAsyncCallbacks

These are generic callbacks for tasks executed asynchronously.

Function
Kotlin
1fun onPreExecute() {
2
3}
Function
Kotlin
1fun onSuccess(result: T) {
2 //on success
3}
Arguments
  • result (T): Returned on success, with the result type set when the callback was created.
Function
Kotlin
1fun onError(e: BioCaptureHandlerError) {
2 //on error
3}
Arguments
  • e (BioCaptureHandlerError): The error.

BioSdk class - IBioSdkInfo

This object exposes information about the SDK. An example snippet is shown:

Java
1IBioSdkInfo sdkInfo = BioSdk.getInfo();
  • version (String): The version of the SDK.

IBioMatcherSettings

This object is used to configure the behavior of BioMatcher.

  • debugDataSettings (DebugDataSettings): Sets options used for preparing the debug data dump.
  • fingerTemplateFormat (MorphoFingerTemplateFormat): The finger template format, only used for fingerprints. The default format is PKLITE.

ICaptureOptions

This object is used to configure the behavior of Capture.

  • camera (Camera): The app camera option to configure BioCapture.
  • overlay (Overlay): Sets the overlay value.
  • captureTimeout (Long): The capture timeout in seconds (default value 120).
  • logLevel (LogLevel): Sets the log level.
  • debugDataSettings (DebugDataSettings): Sets the debug data options that store key information about the capture in the device's memory.

FaceCaptureOptions

This object is used to configure the behavior of FaceCapture. It extends CaptureOptions.

  • seed (int): Sets the dots seed for CR2D.
  • liveness (FaceLiveness): Sets the liveness challenge for the face.
  • securityLevel (FaceLivenessSecurityLevel): Sets the liveness security level.
  • videoRecordingOptions (VideoRecordingOptions): When enabled, CaptureSDK returns a VideoRecording to generate a video from the capture. (Video recording is turned off by default.)

IBiometricSet

This is a common interface that all the candidates and references that perform authentication and identification operations extend.

  • templates (List<IMorphoTemplate>): The biometric templates; refer to IMorphoTemplate.
  • biometricModality (BiometricModality): The BiometricModality enum option.

IBiometricCandidate

This is a common interface that all the candidates extend. It extends IBiometricSet.

IBiometricReference

This is a common interface that all the references used to perform an authentication or identification extend. It extends IBiometricSet.

  • userUuid (UUID): The user uuid.

IBiometricInfo

This is a common interface that all the different Biometrics implement.

  • biometricLocation (BiometricLocation): The BiometricLocation enum option.
  • biometricModality (BiometricModality): The BiometricModality enum option.

BiometricInfo

This is a common object that all the different Biometrics extend. It implements the interface IBiometricInfo.

IImage

This is the image interface that the SDK image objects extend.

  • buffer (byte[]): The image.
  • stride (int): The stride of the biometric.
  • width (long): The width of the image.
  • height (long): The height of the image.
  • colorSpace (ColorSpace): The ColorSpace of the image.
  • resolution (float): The resolution of the image.
  • imageQuality (int): The image quality if available, otherwise -1. Currently only available for fingerprint images.
  • label: The label associated with this image, if any. It can be null.
  • toJPEG (byte[]): Retrieves the image as a JPEG image. The default quality is 70%. The created JPEG contains a capture maker note inside its EXIF metadata, with information such as the SDK version used for capturing the image.
  • toJPEG(float quality): Retrieves the image as a JPEG image with quality valued from 0 to 1.

MorphoImage

This is the image object returned by the SDK. It extends BiometricInfo and implements IImage.

Metadata

This is low level data information about capture for verification.

  • getData() (byte[]): The information about the capture.
  • encrypt(String random, List<String> certificates) (EncryptedData): The encrypted data.

EncryptedData

This is low level encrypted data information about Capture for verification.

  • data (byte[]): The information about the capture.
  • encryptedMaster (byte[]): The encrypted master key.

EncryptedMetadata

This is low level encrypted data information about Capture for verification.

  • getEncryptedData() (byte[]): The information about the capture.

FaceTracking

This is a face tracking object returned by the FaceCaptureTrackingListener.

  • rect (Rect): The position of the biometric.
  • previewRect (Rect): The original preview size to which the coordinates refer.

OvalOverlay

Information about the oval overlay that can help the user position their face correctly in front of the camera. Keep in mind these numbers are relative to the preview image size. If you want to display UI elements helping the user place their face, using the UIExtensions library is recommended. Otherwise, you have to rescale these coordinates yourself to fit your view.

  • width (float): The width of the oval.
  • height (float): The height of the oval.
  • centerX (float): The x position of the oval center.
  • centerY (float): The y position of the oval center.
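The rescaling mentioned above amounts to mapping preview-space coordinates into view space. A minimal sketch, assuming the preview is stretched to fill the view independently on each axis (adjust the factors for aspect-preserving layouts):

```java
class OvalScaler {
    /**
     * Maps oval parameters given in preview-image space into view space.
     * Returns {centerX, centerY, width, height} in view coordinates.
     */
    static float[] toViewSpace(float centerX, float centerY,
                               float width, float height,
                               float previewWidth, float previewHeight,
                               float viewWidth, float viewHeight) {
        float sx = viewWidth / previewWidth;   // horizontal scale factor
        float sy = viewHeight / previewHeight; // vertical scale factor
        return new float[] { centerX * sx, centerY * sy, width * sx, height * sy };
    }
}
```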

IMorphoTemplate

This is the biometric template object returned by the SDK. It extends IBiometricInfo.

  • buffer (byte[]): The template.
  • uuid (UUID): The template uuid in the database (can be null).
  • uuidUser (UUID): The user uuid (can be null).

IMorphoFaceTemplate

This is the biometric face template object returned by the SDK. It extends IMorphoTemplate.

  • eyesPosition (List<IEyePosition>): The eyes position; refer to IEyePosition.
  • templateFormat (MorphoFaceTemplateFormat): The template format; refer to MorphoFaceTemplateFormat.
  • qualityRawValue (short)
  • quality (FaceTemplateQuality)

IEyePosition

This is the position of the eyes on the face.

  • getPosition (RectF): The eye position.

CaptureListener

This is a generic capture listener.

onCaptureFinish

This is invoked by BioCapture when the capture finishes.

Function

An example snippet is shown:

Java
1void onCaptureFinish();

BioCaptureCR2DListener

This listener receives information about CR2D challenge objects.

onCurrentUpdated

This is called every time the current point changes. Cr2dCurrentPoint is an object with information about the user's position during the challenge.

Java
1void onCurrentUpdated(Cr2dCurrentPoint currentPoint);

onTargetUpdated

This is called every time the target changes. Cr2dTargetPoint is the object with information about a specific target in a challenge. Each target has a unique number. This method is called once per specific target update.

Java
1void onTargetUpdated(Cr2dTargetPoint targetPoint);

onTargetsConditionUpdated

This is called every time the condition of the targets changes. targetCount is the total number of targets in the challenge; targetStability is the stability of the current target (possible values range from 0 to 100).

Java
1void onTargetsConditionUpdated(int targetCount, int targetStability);

FaceCaptureFeedbackListener

This is the capture feedback listener. It enables the app to receive feedback about the biometric capture, such as an indication that the user should move their head to the left.

onCaptureInfo

This is invoked multiple times by BioCapture to send feedback about the capture process to the app.

Function

An example snippet is shown.

Java
1void onCaptureInfo(FaceCaptureInfo captureInfo);

Arguments

  • faceCaptureInfo (FaceCaptureInfo): The feedback.

IDetectBiometricOptions

This interface represents the verification options. This interface extends IBiometricInfo.

  • isTemplateCompressionEnabled (boolean): Enables or disables template compression. For the moment, this feature is only available for face.

IMatchingOptions

This interface represents the basic matching options.

  • biometricModality (BiometricModality): The BiometricModality enum option.

IAuthenticationOptions

This is the interface that represents the authentication options. This interface extends IMatchingOptions.

The matching result is a score that reflects the similarity of two biometrics acquisitions. The threshold is the score value that is used to differentiate a HIT from a NOHIT.

Threshold choice is a compromise between FAR (False Acceptance Rate) and FRR (False Reject Rate).

FAR is the proportion of requests that generate an unexpected HIT with two biometrics acquisitions of two different persons.

FRR is the proportion of requests that generate an unexpected NOHIT with two biometrics acquisitions of the same person.
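Both rates are plain proportions over impostor and genuine comparison attempts; a sketch with hypothetical counters:

```java
class MatcherRates {
    /** FAR: unexpected HITs among comparisons of two different persons. */
    static double far(int impostorHits, int impostorAttempts) {
        return (double) impostorHits / impostorAttempts;
    }

    /** FRR: unexpected NOHITs among comparisons of the same person. */
    static double frr(int genuineMisses, int genuineAttempts) {
        return (double) genuineMisses / genuineAttempts;
    }
}
```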

IDEMIA algorithms

The recognition algorithm similarity matching score is linked with the FAR (as previously defined):

  • FAR 1%: score 2500
  • FAR 0.1%: score 3000
  • FAR 0.01%: score 3500
  • FAR 0.001%: score 4000
  • FAR 0.0001%: score 4500
  • FAR 0.00001%: score 5000

The IAuthenticationOptions parameter is:

  • threshold (long): The authentication threshold to be considered valid.
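The table above can be used to pick a threshold for a target FAR. The score values below come from this guide; the lookup helper itself is a hypothetical sketch:

```java
import java.util.Map;
import java.util.TreeMap;

class FarThreshold {
    // FAR percentage -> score threshold, as tabulated in this guide.
    private static final TreeMap<Double, Long> TABLE = new TreeMap<>();
    static {
        TABLE.put(1.0, 2500L);
        TABLE.put(0.1, 3000L);
        TABLE.put(0.01, 3500L);
        TABLE.put(0.001, 4000L);
        TABLE.put(0.0001, 4500L);
        TABLE.put(0.00001, 5000L);
    }

    /** Threshold of the largest tabulated FAR not exceeding the target. */
    static long thresholdFor(double targetFarPercent) {
        Map.Entry<Double, Long> entry = TABLE.floorEntry(targetFarPercent);
        // Stricter than anything tabulated: fall back to the strongest score.
        return entry != null ? entry.getValue() : TABLE.firstEntry().getValue();
    }
}
```

For instance, requesting a FAR of at most 0.05% selects the 0.01% row, i.e. a threshold of 3500.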

IAuthenticationResult

This is the interface that represents an authentication result.

  • score (long): The authentication score (between 0 and 50000).
  • authenticationStatus (AuthenticationStatus): The authentication status.

IIdentificationOptions

This is the interface that represents the identification options. This interface extends IMatchingOptions.

IIdentificationResult

This is the interface that represents an identification result.

  • candidateList (List<IIdentificationCandidate>): The identification candidates; refer to IIdentificationCandidate.

IIdentificationCandidate

This is the Interface that represents a candidate result.

  • uuid (UUID): The candidate uuid.
  • score (long): The identification score result.

VideoRecordingReadyForGenerationListener (Only for Face capture and RemoteFace capture)

This sets the listener used to get a VideoRecording. The listener is called when recording is enabled, on both successful and failed captures.

Setting callback

Java
1captureHandler.setVideoRecordingReadyForGenerationListener(new VideoRecordingReadyForGenerationListener() {
2 @Override
3 public void videoRecordingReadyForGeneration (VideoRecording videoRecording){
4 //Video recording object to generate video
5 }
6});

The callback returns a VideoRecording object that can be used to generate a video of the capture.

VideoRecordingOptions

This object is used to configure the behavior of the VideoRecording. You cannot generate two or more videos at the same time.

  • recordingEnable (boolean): Enables video recording.
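Since two videos cannot be generated at the same time, an app-side guard can serialize generation requests. A minimal sketch (a hypothetical helper, not part of the SDK):

```java
import java.util.concurrent.atomic.AtomicBoolean;

class VideoGenerationGuard {
    private final AtomicBoolean running = new AtomicBoolean(false);

    /** Returns true if generation may start now; false if one is in flight. */
    boolean tryStart() {
        return running.compareAndSet(false, true);
    }

    /** Call from onFinish or onError to allow the next generation. */
    void finish() {
        running.set(false);
    }
}
```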

VideoRecording

This object is used to generate a video in MP4 format. On success, it returns the path to the video. Each VideoRecording can generate only one video.

Java
1videoRecording.generateVideo(new VideoProgressListener() {
2
3 @Override
4 public void onFinish(String path) {
5 //Called when the video has been created successfully; returns the path to the video
6 }
7
8 @Override
9 public void progress(int progress) {
10 //Reports the progress of video generation, from 0 to 100
11 }
12
13 @Override
14 public void onError(VideoError error) {
15 //Called when video generation fails or another video is currently being generated
16 }
17});

BioMatcher handler 

This interface provides all the necessary helper methods to perform all the matching, identifying, and template coding operations.

Authenticate

This verifies a list of candidate templates against a list of reference templates. This method can be used to authenticate users.

Note: Review the use cases named Authenticate.

An example snippet is shown:

Java
1//Authentication options
2IAuthenticationOptions authenticationOptions = new AuthenticationOptions();
3authenticationOptions.setThreshold(3500);
4
5//Biometric candidate
6IBiometricCandidate biometricCandidate = new BiometricCandidate(BiometricModality.FACE);
7//We add all the templates for this candidate
8biometricCandidate.addTemplates(candidates);
9
10//Biometric references
11IBiometricReference biometricReference = new BiometricReference(user.getUuid(), BiometricModality.FACE);
12//We add all the templates for this user
13biometricReference.addTemplates(references);
14
15matcherHandler.authenticate(authenticationOptions, biometricCandidate, biometricReference, new BioMatcherAsyncCallbacks<IAuthenticationResult>() {
16 @Override
17 public void onPreExecute() {
18 // Optional hook on the builtin Android AsyncTask callback `onPreExecute`
19 }
20
21 @Override
22 public void onSuccess(IAuthenticationResult result) {
23 //The result of the authentication
24 long resultScore = result.getScore();
25 //authentication status (FAILURE, SUCCESS...)
26 AuthenticationStatus authenticationStatus = result.getStatus();
27 }
28
29 @Override
30 public void onError(Exception e) {
31 // An error has occurred
32 }
33});

Function

Java
1void authenticate(IAuthenticationOptions authenticationOptions, IBiometricCandidate biometricCandidate, IBiometricReference biometricReference, BioMatcherAsyncCallbacks<IAuthenticationResult> callbacks);
Parameter | Description
authenticationOptions IAuthenticationOptions | The options used to perform the authentication.
biometricCandidate IBiometricCandidate | Contains the list of templates that you want to match.
biometricReference IBiometricReference | Contains the list of reference templates; each carries the userUUID of the user it belongs to.
callbacks BioMatcherAsyncCallbacks | Callbacks to be executed depending on the result; refer to IAuthenticationResult.

Errors

You will receive an exception reporting the error.

Authenticate synchronous

This verifies a list of candidate templates against a list of reference templates. This method can be used to authenticate users.

Note: This function must be executed in a different thread than the UI. Check the use cases named Authenticate.

An example snippet is shown:

Java
1//Authentication options
2IAuthenticationOptions authenticationOptions = new AuthenticationOptions();
3authenticationOptions.setThreshold(3500);
4
5//Biometric candidate
6IBiometricCandidate biometricCandidate = new BiometricCandidate(BiometricModality.FACE);
7//We add all the templates for this candidate
8biometricCandidate.addTemplates(candidates);
9
10//Biometric references
11IBiometricReference biometricReference = new BiometricReference(user.getUuid(), BiometricModality.FACE);
12//We add all the templates for this user
13biometricReference.addTemplates(references);
14
15IAuthenticationResult result = matcherHandler.authenticate(authenticationOptions, biometricCandidate, biometricReference);
16//The result of the authentication
17long resultScore = result.getScore();
18//authentication status (FAILURE, SUCCESS...)
19AuthenticationStatus authenticationStatus = result.getStatus();

Function

An example snippet is shown.

Java
1IAuthenticationResult authenticate(IAuthenticationOptions authenticationOptions, IBiometricCandidate biometricCandidate, IBiometricReference biometricReference);
Parameter | Description
authenticationOptions IAuthenticationOptions | The options used to perform the authentication.
biometricCandidate IBiometricCandidate | Contains the list of templates that you want to match.
biometricReference IBiometricReference | Contains the list of reference templates; each carries the userUUID of the user it belongs to.

Errors

You will receive an exception reporting the error.

Identify

This method can be used to identify users. It identifies the user from the list of candidate templates that are matched against the list of reference templates.

Note: Check the use case named Identify.

An example snippet is shown:

Java
1//Identification options
2IIdentificationOptions identificationOptions = new IdentificationOptions();
3
4//Biometric candidate
5IBiometricCandidate biometricCandidate = new BiometricCandidate(BiometricModality.FACE);
6//We add all the templates for this candidate
7biometricCandidate.addTemplates(candidates);
8
9//We create the list of references
10ArrayList<IBiometricReference> biometricReferences = new ArrayList<IBiometricReference>();
11//Biometric reference for one user
12IBiometricReference biometricReference = new BiometricReference(user.getUuid(), BiometricModality.FACE);
13//We add all the templates for this user
14biometricReference.addTemplates(references);
15
16//We add the user to the list
17biometricReferences.add(biometricReference);
18
19matcherHandler.identify(identificationOptions, biometricCandidate, biometricReferences, new BioMatcherAsyncCallbacks<IIdentificationResult>() {
20 @Override
21 public void onPreExecute() {
22 // Optional hook on the builtin Android AsyncTask callback `onPreExecute`
23 }
24
25 @Override
26 public void onSuccess(IIdentificationResult result) {
27 //The identification result
28 List<IIdentificationCandidate> candidates = result.getIdentificationCandidateList();
29 if(candidates.size()>0){
30 IIdentificationCandidate candidate = candidates.get(0);
31 UUID userUUID = candidate.getUuid();
32 long candidateScore = candidate.getScore();
33 }
34
35 }
36
37 @Override
38 public void onError (Exception e) {
39 // An error has occurred
40 }
41});

Function

An example snippet is shown.

Java
1void identify(IIdentificationOptions identificationOptions, IBiometricCandidate biometricCandidate, List<IBiometricReference> biometricReferences, BioMatcherAsyncCallbacks<IIdentificationResult> callbacks);
Parameter | Description
identificationOptions IIdentificationOptions | The options used to perform the identification.
biometricCandidate IBiometricCandidate | Contains the list of templates that you want to match.
biometricReferences List<IBiometricReference> | Contains the list of references against which you will identify your candidate; refer to IBiometricReference.
callbacks BioMatcherAsyncCallbacks | Callbacks to be executed depending on the result; refer to IIdentificationResult.

Errors

You will receive an exception reporting the error.

Identify synchronous

This method can be used to identify users.

It identifies the user from the list of candidate templates that are matched against a list of reference templates.

Note: This function must be executed in a different thread than UI. Check the use case named Identify.

An example snippet is shown.

Java
1//Identification options
2IIdentificationOptions identificationOptions = new IdentificationOptions();
3
4//Biometric candidate
5IBiometricCandidate biometricCandidate = new BiometricCandidate(BiometricModality.FACE);
6//We add all the templates for this candidate
7biometricCandidate.addTemplates(candidates);
8
9//We create the list of references
10ArrayList<IBiometricReference> biometricReferences = new ArrayList<IBiometricReference>();
11//Biometric reference for one user
12IBiometricReference biometricReference = new BiometricReference(user.getUuid(), BiometricModality.FACE);
13//We add all the templates for this user
14biometricReference.addTemplates(references);
15
16//We add the user to the list
17biometricReferences.add(biometricReference);
18
19IIdentificationResult result = matcherHandler.identify(identificationOptions, biometricCandidate, biometricReferences);
20//The identification result
21List<IIdentificationCandidate> candidates = result.getIdentificationCandidateList();
22if (candidates.size() > 0) {
23 IIdentificationCandidate candidate = candidates.get(0);
24 UUID userUUID = candidate.getUuid();
25 long candidateScore = candidate.getScore();
26}

Function

An example snippet is shown.

Java
1IIdentificationResult identify(IIdentificationOptions identificationOptions, IBiometricCandidate biometricCandidate, List<IBiometricReference> biometricReferences);
Parameter | Description
identificationOptions IIdentificationOptions | The options used to perform the identification.
biometricCandidate IBiometricCandidate | Contains the list of templates that you want to match.
biometricReferences List<IBiometricReference> | Contains the list of references against which you will identify your candidate; refer to IBiometricReference.

Errors

You will receive an exception reporting the error.

Detect biometrics

This allows you to detect the biometrics in a MorphoImage.

This function extracts all the biometric templates contained in an image (for example, all the faces in an image).

Note: Check the use case named Detect Biometric.

Java
1//Create and populate the options
2 IDetectBiometricOptions detectBiometricsOptions = new DetectBiometricsOptions();
3 detectBiometricsOptions.setBiometricLocation(BiometricLocation.FACE_FRONTAL);
4 detectBiometricsOptions.setBiometricModality(BiometricModality.FACE);
5
6 bioMatcherHandler.detectBiometric(detectBiometricsOptions, image, new BioMatcherAsyncCallbacks<List<IMorphoTemplate>>() {
7 @Override
8 public void onPreExecute() {
9 // Optional hook on the builtin Android AsyncTask callback `onPreExecute`
10 }
11
12 @Override
13 public void onSuccess(List<IMorphoTemplate> result) {
14 //A List of templates extracted from the image
15 }
16
17 @Override
18 public void onError(Exception e) {
19 // An error has occurred
20 }
21 });

Function

An example snippet is shown.

Java
1public void detectBiometric(final IDetectBiometricOptions detectBiometricsOptions, final IImage image, BioMatcherAsyncCallbacks<List<IMorphoTemplate>> callbacks)
Parameter | Description
detectBiometricsOptions IDetectBiometricOptions | The options used during the detection process.
image IImage | The image to analyze.
callbacks BioMatcherAsyncCallbacks | Callbacks to be executed depending on the result.

Errors

You will receive an exception reporting the error.

Detect biometric synchronous

This allows you to detect the biometrics in a MorphoImage.

This function extracts all the biometric templates contained in an image (for example, all the faces in an image).

Note: This function must be executed in a different thread than the UI. Check the use case named Detect Biometric.

An example snippet is shown.

Java
1//Create and populate the options
2IDetectBiometricOptions detectBiometricsOptions = new DetectBiometricsOptions();
3detectBiometricsOptions.setBiometricLocation(BiometricLocation.FACE_FRONTAL);
4detectBiometricsOptions.setBiometricModality(BiometricModality.FACE);
5
6List<IMorphoTemplate> templates = bioMatcherHandler.detectBiometric(detectBiometricsOptions, image);
7//A list of templates extracted from the image

Function

An example snippet is shown.

Java
1public List<IMorphoTemplate> detectBiometric(final IDetectBiometricOptions detectBiometricsOptions, final IImage image)
Parameter | Description
detectBiometricsOptions IDetectBiometricOptions | The options to use during the detection process.
image IImage | The image to analyze.

Errors

You will receive an exception reporting the error.

Destroy

This releases all the handler resources as shown in the snippet:

Java
1handler.destroy();

Creating a BioMatcher Handler 

This allows you to retrieve a handler to perform all the matching, identifying, and template coding operations.

Java
1IBioMatcherSettings bioMatcherSettings = new BioMatcherSettings();
2 bioMatcherSettings.setLogLevel(LogLevel.DISABLE);
3 bioMatcherSettings.setDumpFileEnable(false);
4 bioMatcherSettings.setDumpFileFolder(null);
5 //To configure the fingerprint template format
6 bioMatcherSettings.setFingerprintTemplate(MorphoFingerTemplateFormat.PKCOMPV2);
7 BioSdk.createBioMatcherHandler(this, bioMatcherSettings, new BioMatcherAsyncCallbacks<IBioMatcherHandler>() {
8 @Override
9 public void onPreExecute() {
10 // Optional hook on the builtin Android AsyncTask callback `onPreExecute`
11 }
12
13 @Override
14 public void onSuccess(IBioMatcherHandler result) {
15 // Indicates that initialization succeeded. The returned handler can be used to perform the matching and identify operations.
16 }
17
18 @Override
19 public void onError(Exception e) {
20 // An error has occurred.
21 }
22 });
Parameter | Description
context Context | The Android context.
settings IBioMatcherSettings | The settings to configure the matcher.
callbacks BioMatcherAsyncCallbacks | Callbacks to be executed depending on the result.

Errors

You will receive an exception reporting the error.

Enums 

ColorSpace

This is the colorspace enum.

Attribute | Description
Y8 | Grayscale 8bpp image.
Y16LE | Grayscale 16bpp image (Little Endian).
BGR24 | Colour 24bpp BGR image (BMP-like memory layout).
RGB24 | Colour 24bpp RGB image (reversed memory layout compared to RT_COLORSPACE_BGR24).
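The bit depths in the descriptions above determine how large a raw image buffer must be. The sketch below illustrates this with a local enum that mirrors the values in the table; the class and method names are illustrative, not part of the SDK.

```java
// Illustrative helper: derives buffer sizes from the ColorSpace table above.
public class ColorSpaceInfo {

    // Local mirror of the ColorSpace values described above (not the SDK's class)
    public enum ColorSpace { Y8, Y16LE, BGR24, RGB24 }

    // Bytes per pixel implied by each description: 8bpp = 1 byte, 16bpp = 2, 24bpp = 3
    public static int bytesPerPixel(ColorSpace cs) {
        switch (cs) {
            case Y8:    return 1;
            case Y16LE: return 2;
            case BGR24:
            case RGB24: return 3;
            default: throw new IllegalArgumentException("Unknown colorspace: " + cs);
        }
    }

    // Expected size in bytes of an uncompressed image in the given colorspace
    public static int bufferSize(ColorSpace cs, int width, int height) {
        return width * height * bytesPerPixel(cs);
    }

    public static void main(String[] args) {
        // A 640x480 BGR24 frame occupies 640 * 480 * 3 = 921600 bytes
        System.out.println(bufferSize(ColorSpace.BGR24, 640, 480));
    }
}
```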

Camera

This is the enum used to configure the camera for the capture.

Attribute | Description
FRONT | Front camera
REAR | Rear camera

CameraFlash enum

This enum is used to configure the camera flash of the capture.

Attribute | Description
OFF | Camera flash off
ON | Camera flash on

Overlay

This is the enum used to configure the overlay for the capture.

Attribute | Description
OFF | Overlay off
ON | Overlay on

LogLevel

This enum controls the log level.

Attribute | Description
ERROR | Error log level or above
DEBUG | Debug log level or above
WARNING | Warning log level or above
INFO | Info log level or above
DISABLE | Disables logs

CaptureHandlerStatus enum

This enum represents the status of the capture handler.

Attribute | Description
STOP | The handler is stopped.
PREVIEW | The handler is in preview mode.
CAPTURE | The handler is in capture mode.

BioCaptureHandlerError

This enum gives information about why the MSC asynchronous operation failed.

Attribute | Description
MSC_ERR_PARAMETERS | Parameters are invalid
MSC_ERR_PARAMETER_UNKNOWN | Parameter is missing
MSC_ERR_MEMALLOC | Memory allocation issue
MSC_ERR_INIT | Initialization failed
MSC_ERR_GRAPH_INITIALISATION_FAILED | The graph initialization failed
MSC_ERR_PARAMETER_NOT_FOUND | Parameter is missing
MSC_ERR_PARAMETER_SIZE | Parameter size is incorrect
MSC_ERR_TYPE_MISMATCH | MSC type mismatch
MSC_ERR_INVALID_HANDLE | Handle is invalid
MSC_ERR_LICENSE | License is invalid
MSC_ERR_APPLINOTAVAILABLE | The application parameter is not available
MSC_ERR_PROFILENOTAVAILABLE | MSC profile is not available
NOT_EXECUTED | Java is unable to execute
LIBS_NOT_FOUND | Java libraries are not found
NO_CONTEXT_SET | Java context is not set
MSC_ERR_SUBPROFILENOTAVAILABLE | MSC sub-profile is not available
MSC_ERR_UNKNOWN | An unknown error occurred
MSC_ERR_INVALID_OPERATION | The operation is invalid
MSC_ERR_INCOMPATIBLE_API_VERSION | The API version is incompatible; your application must be recompiled
MSC_ERR_PARAMETER_WRONG_TYPE | Parameter is not the right type
MSC_ERR_PARAMETER_NOT_SET | Parameter is not set in the current scope
UNKNOWN | Unknown error

FaceLiveness enum

This enum describes the liveness verification mode.

Attribute | Description
NO_LIVENESS | No liveness detection is performed during capture.
ACTIVE | Triggers a more complex challenge to detect liveness.
PASSIVE | Liveness is detected without a user challenge; the user is unaware that liveness detection is being employed.
PASSIVE_VIDEO | Face is detected without any challenge or effort by the user. Liveness detection is done on the server side. Works only with RemoteCaptureHandler (DEPRECATED) and RemoteLiveness.

FaceLivenessSecurityLevel enum

Security level for face liveness capture. This defines how restrictive the liveness verification is: the higher the level, the more restrictive the verification.

Attribute | Description
LOW |
MEDIUM |
HIGH | Recommended level

FaceLivenessResult enum

This enum represents the result of a face liveness check.

Attribute | Description
UNKNOWN | Unable to decide, or liveness is turned off
LIVE | Liveness success: a living person is detected
FAKE | Liveness check failure: not a living person
NO_DECISION | WebBioServer is needed to make a decision
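The outcomes above lead to three distinct application behaviors: accept only LIVE, reject FAKE, and defer NO_DECISION to the server. A minimal sketch, using a local enum that mirrors the table (the class and helper names are illustrative, not SDK API):

```java
// Illustrative handler for the FaceLivenessResult outcomes listed above.
public class LivenessResultHandler {

    // Local mirror of the FaceLivenessResult values (not the SDK's enum)
    public enum FaceLivenessResult { UNKNOWN, LIVE, FAKE, NO_DECISION }

    // Only LIVE can be accepted locally without further checks
    public static boolean isAccepted(FaceLivenessResult r) {
        return r == FaceLivenessResult.LIVE;
    }

    // NO_DECISION means the WebBioServer must make the final decision
    public static boolean needsServerDecision(FaceLivenessResult r) {
        return r == FaceLivenessResult.NO_DECISION;
    }

    // Short user-facing description of each outcome
    public static String describe(FaceLivenessResult r) {
        switch (r) {
            case LIVE:        return "Living person detected";
            case FAKE:        return "Not a living person";
            case NO_DECISION: return "Waiting for WebBioServer decision";
            default:          return "No result (liveness off or undetermined)";
        }
    }
}
```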

FaceCaptureInfo enum

Attribute | Description
INFO_GET_OUT_FIELD | User must move out of the camera field
INFO_COME_BACK_FIELD | User must move back into the camera field
INFO_TURN_LEFT | User must turn head left
INFO_TURN_RIGHT | User must turn head right
INFO_CENTER_TURN_LEFT | User must face center but turn head left
INFO_CENTER_TURN_RIGHT | User must face center but turn head right
INFO_CENTER_ROTATE_DOWN | User must face center but rotate head down
INFO_CENTER_ROTATE_UP | User must face center but rotate head up
INFO_CENTER_TILT_LEFT | User must face center but tilt head left
INFO_CENTER_TILT_RIGHT | User must face center but tilt head right
INFO_CENTER_MOVE_FORWARDS | User must move forwards
INFO_CENTER_MOVE_BACKWARDS | User must move backwards
INFO_CENTER_LOOK_FRONT_OF_CAMERA | User must look in front of the camera
INFO_CENTER_LOOK_CAMERA_WITH_LESS_MOVEMENT | User must look at the camera with less movement
INFO_TURN_LEFTRIGHT | User must turn left then right, or right then left
INFO_TURN_DOWN | User must turn head down
INFO_TOO_FAST | User is moving their head too fast
INFO_NOT_MOVING | Face movement not detected
DEVICE_MOVEMENT_ROTATION | Smartphone movement detected (the user is moving the smartphone, not their face)

BiometricLocation enum

Attribute | Description
FACE_FRONTAL | Face
FINGER_RIGHT_INDEX | Right index finger
FINGER_RIGHT_MIDDLE | Right middle finger
FINGER_RIGHT_RING | Right ring finger
FINGER_RIGHT_LITTLE | Right little finger
FINGER_RIGHT_THUMB | Right thumb
FINGER_RIGHT_FOUR | Right four fingers
FINGER_LEFT_INDEX | Left index finger
FINGER_LEFT_MIDDLE | Left middle finger
FINGER_LEFT_RING | Left ring finger
FINGER_LEFT_LITTLE | Left little finger
FINGER_LEFT_THUMB | Left thumb
FINGER_LEFT_FOUR | Left four fingers
FINGER_UNKNOWN | Unknown finger
HAND_LEFT | Left hand
HAND_RIGHT | Right hand
HAND_UNKNOWN | Unknown hand
UNKNOWN | Unknown
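The constant names above encode the modality in their prefix, which makes it easy to group locations by kind. A small sketch, again using a local mirror of the enum (the utility class is illustrative, not SDK API):

```java
// Illustrative utility over the BiometricLocation values listed above.
public class BiometricLocationUtil {

    // Local mirror of the BiometricLocation values (not the SDK's enum)
    public enum BiometricLocation {
        FACE_FRONTAL,
        FINGER_RIGHT_INDEX, FINGER_RIGHT_MIDDLE, FINGER_RIGHT_RING, FINGER_RIGHT_LITTLE,
        FINGER_RIGHT_THUMB, FINGER_RIGHT_FOUR,
        FINGER_LEFT_INDEX, FINGER_LEFT_MIDDLE, FINGER_LEFT_RING, FINGER_LEFT_LITTLE,
        FINGER_LEFT_THUMB, FINGER_LEFT_FOUR,
        FINGER_UNKNOWN,
        HAND_LEFT, HAND_RIGHT, HAND_UNKNOWN,
        UNKNOWN
    }

    // The naming convention encodes the modality in the constant's prefix
    public static boolean isFinger(BiometricLocation loc) {
        return loc.name().startsWith("FINGER_");
    }

    public static boolean isHand(BiometricLocation loc) {
        return loc.name().startsWith("HAND_");
    }
}
```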

BiometricModality enum

Attribute | Description
UNKNOWN | Unknown
FACE | Face
FRICTION_RIDGE | Friction ridge (fingers)

Cr2dMode enum

Attribute | Description
RANDOM | Targets are fully random
PATH | Targets are defined with a path
FIXED | Fixed position for the target

MorphoFaceTemplateFormat enum

This enum describes the face template format.

Attribute | Description
MIMA | MIMA format
MOC | MOC format

FaceTemplateQuality enum

This enum describes the quality of the face saved in the template.

Attribute | Description
LOW | The quality of the face is low (not recommended for matching).
MEDIUM | The quality of the face is medium; good enough to perform matching.
HIGH | The quality of the face is high.
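Since LOW-quality templates are not recommended for matching, an application may want to gate matching on the template quality. A minimal sketch, with a local mirror of the enum (class and method names are illustrative, not SDK API):

```java
// Illustrative quality gate based on the FaceTemplateQuality table above.
public class TemplateQualityGate {

    // Local mirror of the FaceTemplateQuality values (not the SDK's enum)
    public enum FaceTemplateQuality { LOW, MEDIUM, HIGH }

    // Per the table above, matching is not recommended for LOW-quality templates
    public static boolean suitableForMatching(FaceTemplateQuality quality) {
        return quality != FaceTemplateQuality.LOW;
    }
}
```

An app might use this check to trigger a re-capture instead of attempting a match with a low-quality template.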

CaptureError enum

This enum reports the reason that a capture attempt failed.

Attribute | Description
UNKNOWN | Unknown error
LOW_RESOLUTION | Resolution too low
NOT_ENOUGH_MOVEMENT | Not enough movement
TOO_FAST | Too fast
HINT_UNKNOWN | Hint value is unknown
CAPTURE_TIMEOUT | Capture timeout
CAPTURE_DELAYED | Capture delayed due to liveness failures
BAD_CAPTURE | Capture went wrong
BAD_CAPTURE_FINGERS | Capture of the fingers went wrong
BAD_CAPTURE_FACE | Capture of the face went wrong
BAD_CAPTURE_HAND | Capture of the hand went wrong
LIVENESS_CHECK | Liveness check failed

AuthenticationStatus enum

This enum contains the authentication status.

Attribute | Description
SUCCESS | Authentication success (score above the threshold used for the authentication process)
FAILURE | Authentication failure (score below the threshold used for the authentication process)
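The earlier authenticate examples set a threshold of 3500 and read back both a score and a status. The relationship between the two can be sketched as below. This is a local illustration of the rule stated in the table, not the SDK's implementation; whether a score exactly equal to the threshold counts as success is an assumption here.

```java
// Illustrative sketch of how a score relates to AuthenticationStatus.
public class AuthenticationDemo {

    // Local mirror of the AuthenticationStatus values (not the SDK's enum)
    public enum AuthenticationStatus { SUCCESS, FAILURE }

    // SUCCESS when the matching score reaches the threshold, FAILURE otherwise.
    // The SDK computes this internally; treating "equal" as success is an assumption.
    public static AuthenticationStatus statusFor(long score, long threshold) {
        return score >= threshold ? AuthenticationStatus.SUCCESS : AuthenticationStatus.FAILURE;
    }
}
```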

Example code

Kotlin
1private const val IMAGE_SIZE = 400
2private const val JPEG_COMPRESSION_LEVEL = 90
3
4private fun prepareImage(image: ByteArray): ByteArray {
5 val imageBitmap = BitmapFactory.decodeByteArray(image, 0, image.size)
6 val scaledBitmap = Bitmap.createScaledBitmap(imageBitmap, IMAGE_SIZE, IMAGE_SIZE, true)
7 imageBitmap.recycle()
8 val byteArrayOutStream = ByteArrayOutputStream()
9 scaledBitmap.compress(Bitmap.CompressFormat.JPEG, JPEG_COMPRESSION_LEVEL, byteArrayOutStream)
10 scaledBitmap.recycle()
11 return byteArrayOutStream.toByteArray()
12}