FaceLivenessDetector

Get started with the Azure AI Vision Face UI Web SDK

In this sample, you will learn how to build and run the face liveness detection application.

Introduction

The Azure AI Vision Face UI Web SDK is a client library for integrating the face liveness feature into web applications. It works seamlessly with Azure AI Face APIs to determine the authenticity of a face in a video stream.

Prerequisites

  1. An Azure Face API resource subscription.
  2. Node.js, installed from https://nodejs.org/en/download/prebuilt-installer

Installation

  1. Create a .npmrc file in the root of the app folder to pull packages from the https://pkgs.dev.azure.com/msface/SDK/_packaging/AzureAIVision/npm/registry/ registry; a sketch of a typical file appears after this list. An example .npmrc file is available at https://github.com/Azure-Samples/azure-ai-vision-sdk/blob/main/samples/web/angularjs/.npmrc.

  2. Fetch the base64-encoded access token required in the .npmrc file using the Liveness Session Operations - Get Client Assets Access Token API.

  3. To install the SDK via NPM, run the following command in the root of the app folder:

    npm install @azure/ai-vision-face-ui@latest
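
For reference, a .npmrc for an Azure Artifacts feed conventionally pairs the registry URL with credential lines that carry the base64 token from step 2. The sketch below follows that convention; treat the exact fields as assumptions and defer to the linked example file.

    ; Sketch of a typical .npmrc for this feed; see the linked example for the authoritative version
    registry=https://pkgs.dev.azure.com/msface/SDK/_packaging/AzureAIVision/npm/registry/
    always-auth=true
    ; _password carries the base64-encoded access token fetched in step 2
    //pkgs.dev.azure.com/msface/SDK/_packaging/AzureAIVision/npm/registry/:username=msface
    //pkgs.dev.azure.com/msface/SDK/_packaging/AzureAIVision/npm/registry/:_password=***BASE64_ACCESS_TOKEN***
    //pkgs.dev.azure.com/msface/SDK/_packaging/AzureAIVision/npm/registry/:email=npm@example.com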
    

Integrate face liveness detector into your own application

First, ensure you have installed the npm package as described in the Installation section.

Obtaining a session token

The session-authorization-token is required to start a liveness session. See the fetchTokenOnServer method in the server.js file for a demo. For more information on how to orchestrate the liveness flow by utilizing the Azure AI Vision Face service, visit: https://aka.ms/azure-ai-vision-face-liveness-tutorial
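
In practice the token is minted by your backend, which holds the Face API credentials, and handed to the client. A minimal client-side sketch, assuming a hypothetical /api/getSessionToken endpoint and response shape on your own server:

    // Minimal sketch: fetch the session-authorization-token from your backend.
    // The /api/getSessionToken endpoint and { authToken } response shape are
    // assumptions; see fetchTokenOnServer in server.js for the demo version.
    async function getSessionToken() {
      const response = await fetch("/api/getSessionToken", { method: "POST" });
      const { authToken } = await response.json();
      return authToken;
    }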

Injecting the web component

After obtaining a valid session-authorization-token, you can integrate the web component, the <azure-ai-vision-face-ui> element, using JavaScript.

const azureAIVisionFaceUI = document.createElement("azure-ai-vision-face-ui");
document.getElementById("your-container-id").appendChild(azureAIVisionFaceUI);
azureAIVisionFaceUI.start("***FACE_API_SESSION_TOKEN***")
  .then(resultData => {
    // resultData implements the LivenessDetectionSuccess interface.
    // The result of the analysis is queryable from the service using the session result API:
    // https://learn.microsoft.com/rest/api/face/liveness-session-operations/get-liveness-session-result?view=rest-face-v1.2-preview.1&tabs=HTTP
  })
  .catch(errorData => {
    // In case of failure, the promise is rejected. errorData implements the
    // LivenessDetectionError interface and contains the reason for the failure.
  });

Deployment

Essential assets such as WebAssembly (wasm) files and localization files are packaged within the NPM distribution, and they must be included when you deploy to a production environment. For example, after npm installation you can copy the facelivenessdetector-assets folder from node_modules/azure-ai-vision-face-ui to your root assets directory (such as the public folder) to ensure the assets are served.
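
For a one-off deployment, the copy can be done from the command line; the sketch below assumes a POSIX shell and a public folder at the project root (see the FAQ below for automating this in a build):

    cp -r node_modules/azure-ai-vision-face-ui/facelivenessdetector-assets public/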

🌍 Localization

The Azure AI Vision Face UI SDK embraces global diversity by supporting multiple languages. The complete list of supported locales and the language dictionary are available here.

🌐 Setting a Locale

To use a specific locale, set the locale attribute on the azure-ai-vision-face-ui component. If translations are available for that locale, they will be used; otherwise, the SDK defaults to English.

  • Example - Enabling Portuguese
    const azureAIVisionFaceUI = document.createElement("azure-ai-vision-face-ui");
    azureAIVisionFaceUI.locale = "pt-PT"; // Setting Portuguese locale
    document.getElementById("your-container-id").appendChild(azureAIVisionFaceUI);

UX Customization

You can customize the layout of the page using the following options:

"Increase your screen brightness" image

Customize the default "Increase your screen brightness" image by providing your own. Ensure the image is correctly deployed for production.

azureAIVisionFaceUI.brightnessImagePath = newImagePath;

Font size

Customize the default font size for all the text. The default is 1.5rem.

azureAIVisionFaceUI.fontSize = newSize;

Font family

Customize the default font family for all the text. The default value is: system-ui, -apple-system, BlinkMacSystemFont, 'Segoe UI', Roboto, Oxygen, Ubuntu, Cantarell, 'Open Sans', 'Helvetica Neue', sans-serif.

azureAIVisionFaceUI.fontFamily = newFontFamily;

Continue button

Customize the look and feel of the "Continue" button by providing your own CSS styles. To change the text, use languageDictionary attribute and override the "Continue" key.

azureAIVisionFaceUI.continueButtonStyles = newCSS;
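
A minimal sketch of that text override, assuming languageDictionary accepts a plain key-to-string map:

    // Assumption: languageDictionary takes a key-to-string map; overriding
    // the "Continue" key changes the button text.
    azureAIVisionFaceUI.languageDictionary = { "Continue": "Proceed" };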

Feedback messages

Customize the look and feel of the feedback messages by providing your own CSS styles.

azureAIVisionFaceUI.feedbackMessageStyles = newCSS;
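
Putting the options above together, a minimal sketch; every value shown is a placeholder, not a default:

    const faceUI = document.createElement("azure-ai-vision-face-ui");
    faceUI.brightnessImagePath = "assets/brightness.png"; // placeholder path; deploy the image yourself
    faceUI.fontSize = "1.25rem";
    faceUI.fontFamily = "'Segoe UI', sans-serif";
    faceUI.continueButtonStyles = "background: #107c10; color: white;"; // placeholder CSS
    faceUI.feedbackMessageStyles = "font-weight: 600;"; // placeholder CSS
    document.getElementById("your-container-id").appendChild(faceUI);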

FAQ

Q: How can I get the results of the liveness session?

Once the session is completed and the promise is fulfilled, for security reasons the client does not receive the outcome of whether the face is live or a spoof.

You can query the result from your backend service by calling the session result API to get the outcome: https://aka.ms/face/liveness-session/get-liveness-session-result
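
A minimal backend sketch of that query, assuming Node.js 18+ with global fetch; the route, api-version, and environment variable names are assumptions, so defer to the linked reference for the exact request:

    // Sketch only: FACE_ENDPOINT / FACE_APIKEY and the route below are
    // assumptions; consult the linked session result API reference.
    const resultUrl = `${process.env.FACE_ENDPOINT}/face/v1.2-preview.1/detectLiveness-sessions/${sessionId}/result`;
    const response = await fetch(resultUrl, {
      headers: { "Ocp-Apim-Subscription-Key": process.env.FACE_APIKEY },
    });
    const livenessResult = await response.json(); // contains the live/spoof outcome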

Q: How can I automate deployment of the assets?

  • React

    For deployment, you can add a postbuild script to your package.json that copies facelivenessdetector-assets to public (the cpy command is provided by the cpy-cli dev dependency):

    "scripts": {
    "postbuild": "cpy node_modules/azure-ai-vision-face-ui/facelivenessdetector-assets/**/* public/facelivenessdetector-assets --parents"
    }
  • Angular

    Please see the Angular integration example at samples/angularjs/src/face/face.component.ts.

    For deployment, you can add a section to deploy facelivenessdetector-assets in the build section of your project's configuration file:

    "build": {
    "options": {
    "assets": [
    { "glob": "**/*", "input": "./node_modules/azure-ai-vision-face-ui/facelivenessdetector-assets", "output": "/facelivenessdetector-assets" }
    ],
    }
    }
