In this sample, you will learn how to build and run the face liveness detection application.
The Azure AI Vision Face UI Web SDK is a client library intended to enable the integration of the face liveness feature into web applications. It works seamlessly with Azure AI Face APIs to determine the authenticity of a face in a video stream.
Create a .npmrc file in the root of the app folder to pull packages from the https://pkgs.dev.azure.com/msface/SDK/_packaging/AzureAIVision/npm/registry/ registry.
An example .npmrc file is available at https://github.com/Azure-Samples/azure-ai-vision-sdk/blob/main/samples/web/angularjs/.npmrc.
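As a sketch of what such a file typically contains (the layout below follows the standard Azure DevOps npm feed format; the username and email values are placeholders, and `[BASE64_ACCESS_TOKEN]` must be replaced with the token returned by the access-token API):

```ini
; Route installs for this app through the msface Azure DevOps feed
registry=https://pkgs.dev.azure.com/msface/SDK/_packaging/AzureAIVision/npm/registry/
always-auth=true

; Feed credentials -- replace [BASE64_ACCESS_TOKEN] with the fetched token
//pkgs.dev.azure.com/msface/SDK/_packaging/AzureAIVision/npm/registry/:username=msface
//pkgs.dev.azure.com/msface/SDK/_packaging/AzureAIVision/npm/registry/:_password=[BASE64_ACCESS_TOKEN]
//pkgs.dev.azure.com/msface/SDK/_packaging/AzureAIVision/npm/registry/:email=not-used@example.com
```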
Fetch the base64 access token required by the .npmrc file using the Liveness Session Operations - Get Client Assets Access Token API.
To install the SDK via NPM, run the following command in the root of the app folder:
npm install @azure/ai-vision-face-ui@latest
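Once installed, the SDK registers a custom element that runs the liveness check. The following is a minimal sketch, not the full sample: `container` and `sessionAuthToken` are placeholders for your own markup and for the session-authorization token your server obtains from the Face API.

```javascript
// Importing the package registers the <azure-ai-vision-face-ui> custom element.
import "@azure/ai-vision-face-ui";

// Mount the detector into a host element of your page (placeholder id).
const faceLivenessDetector = document.createElement("azure-ai-vision-face-ui");
document.getElementById("container").appendChild(faceLivenessDetector);

// Start the liveness session with a server-issued token (placeholder variable).
faceLivenessDetector.start(sessionAuthToken)
  .then((resultMessage) => console.log("Liveness check finished:", resultMessage))
  .catch((errorMessage) => console.error("Liveness check failed:", errorMessage));
```

Because the element drives the camera and UI itself, the page only needs to mount it and await the promise returned by `start`.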
Follow these steps to quickly run a sample app built with Next.js, Angular, or Vue.js.
Follow the steps in the Installation section to install the npm package.
Copy the facelivenessdetector-assets/ folder from node_modules/@azure/ai-vision-face-ui to public/.
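The copy step can be sketched as below, assuming the assets ship inside the @azure/ai-vision-face-ui package installed earlier. The mkdir lines only simulate the installed package layout so the snippet runs standalone; in a real app those folders come from npm install.

```shell
# Simulate the npm package layout (in a real app this comes from `npm install`)
mkdir -p node_modules/@azure/ai-vision-face-ui/facelivenessdetector-assets
mkdir -p public

# Copy the SDK's runtime assets into the statically served public/ folder
cp -R node_modules/@azure/ai-vision-face-ui/facelivenessdetector-assets public/
```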
Update the variables in .env.local with your own Face API key and endpoint.
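The shape of the file is roughly as follows; the variable names here are illustrative, so check the .env.local template shipped with each sample for the exact keys it expects:

```ini
# Hypothetical variable names -- consult the sample's .env.local template
FACE_API_ENDPOINT=https://<your-resource-name>.cognitiveservices.azure.com
FACE_API_KEY=<your-face-api-key>
```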
Run the app with npm run dev. On the first run, the development server may take a few minutes to initialize.
Note: the samples/web/javascript folder contains a fully featured vanilla JavaScript sample.
Generated using TypeDoc