In this sample, you will learn how to build and run the face liveness detection application. The Azure AI Vision Face UI SDK for iOS is currently in preview. The APIs are subject to change.
- Swift API reference documents: Azure SDK for iOS Docs, AzureAIVisionCore, AzureAIVisionFaceUI
- An Azure Face API resource subscription.
- A Mac (with iOS development environment including Xcode 13+) and an iPhone (with iOS version 14 or above) to test the AzureAIVision SDK.
- An Apple developer account to install and run development apps on the iPhone.
- If this is your first time using your Mac to develop, you should build a sample app from About Me — Sample Apps Tutorials | Apple Developer Documentation and run it on your phone before you attempt to build the app here. This will help ensure that your developer environment has been set up properly.
- If you have a valid Azure subscription that has been provisioned for Face API Liveness Detection, you can get the access token to access the release artifacts. More details can be found in GET_FACE_ARTIFACTS_ACCESS.
- To install the SDK packages for your iOS application in Xcode, use either CocoaPods or Swift Package Manager:
Prerequisite
- You may encounter errors if your Xcode has never been configured to use Git LFS. If Git LFS has never been installed on your machine, refer to the Git LFS official site for instructions on how to install it.
- An example installation command on macOS is:
brew install git-lfs
- Make sure you initialize Git LFS after installation is done:
git lfs install
- You can verify your Git LFS installation by checking the version:
git lfs --version
Swift Package Manager (Swift.org - Package Manager)
- This approach requires that Xcode can recognize and execute the `git-lfs` command. However, the execution of `git-lfs` may be impacted by macOS System Integrity Protection (SIP). First, check whether SIP is enabled by running the following in your macOS Terminal:
csrutil status
- If SIP is not enabled, or you have temporarily disabled it, you can run the following command to create a symbolic link and make Xcode recognize the `git-lfs` command:
sudo ln -s $(which git-lfs) $(xcode-select -p)/usr/bin/git-lfs
- After you have configured Git LFS successfully, use the following repository URLs in Swift Package Manager, one at a time:
https://msface.visualstudio.com/SDK/_git/AzureAIVisionCore.xcframework
https://msface.visualstudio.com/SDK/_git/AzureAIVisionFaceUI.xcframework
- You will see a pop-up window asking for a username and password. Enter any username and use the accessToken from the previous step as the password.
Note: the username is just a placeholder; it can be any random string.
- If your Xcode's Git cannot be configured to enable Git LFS, switch to using your system Git, as follows:
- When you encounter an error that says "Package Resolution Failed", dismiss it with "Add Anyway". This will add a partially configured package dependency.
- If the Xcode Command Line Tools have never been installed on your machine, install them first by following the instructions on the Apple Developer website.
- Run the following command from Terminal, from the directory where your .xcodeproj is located, as appropriate for your project. It will resolve the package through your system Git, which should already have Git LFS configured, as mentioned in the Prerequisite section.
xcodebuild -scmProvider system -resolvePackageDependencies
Note: You can also add `-project`, `-workspace`, and `-scheme` arguments as appropriate to target your development setup. Refer to the xcodebuild (Command Line Tools) FAQ and the `-help` output.
- Once the command above has run successfully, your project should be able to resolve the dependency from Xcode as well.
- Alternatively, if you still cannot resolve the Git LFS issue for Swift Package Manager, you can download the packages by cloning the source Git repositories directly. This should be the least preferred method, as it requires manual dependency version management and may lead to staleness or version conflicts.
- Use the access token from GET_FACE_ARTIFACTS_ACCESS as the "password" to clone the following repositories, then manually copy the files to your project.
git clone https://username:{accessToken}@msface.visualstudio.com/SDK/_git/AzureAIVisionCore.xcframework
git clone https://username:{accessToken}@msface.visualstudio.com/SDK/_git/AzureAIVisionFaceUI.xcframework
Note: the accessToken is the only required value here; the username is just a placeholder.
- After cloning the repositories, you should see 'AzureAIVisionCore.xcframework' and 'AzureAIVisionFaceUI.xcframework' as two separate folders in your local path. The frameworks you should use are located under the parent folders, like:
AzureAIVisionCore.xcframework/AzureAIVisionCore.xcframework
AzureAIVisionFaceUI.xcframework/AzureAIVisionFaceUI.xcframework
Ensure their disk size is larger than 100 MB. If not, check your Git LFS installation and initialization, then run the following command in each repository directory:
git lfs pull
- Open your Xcode project and navigate to Target -> General -> Frameworks, Libraries, and Embedded Content. Remove any existing Swift Package Manager dependencies for 'AzureAIVisionCore.xcframework' and 'AzureAIVisionFaceUI.xcframework' if they are defined that way. Choose "Add Other", then "Add Files", and add both frameworks from your cloned repositories path:
localPath/AzureAIVisionCore.xcframework/AzureAIVisionCore.xcframework
localPath/AzureAIVisionFaceUI.xcframework/AzureAIVisionFaceUI.xcframework
Mark them as "Do Not Embed".
CocoaPods (CocoaPods Guides - Getting Started)
- Add the following lines to your project's Podfile. `'YourBuildTargetNameHere'` is an example target, and you should use your actual target name instead.
# add repos as source
source 'https://msface.visualstudio.com/SDK/_git/AzureAIVisionCore.podspec'
source 'https://msface.visualstudio.com/SDK/_git/AzureAIVisionFaceUI.podspec'

target 'YourBuildTargetNameHere' do
  # add the pods here, optionally with version specification as needed
  pod 'AzureAIVisionCore', '0.17.1-beta.1'
  pod 'AzureAIVisionFaceUI', '0.17.1-beta.1'
end
- For access authorization to the repos, the steps depend on your system Git and your security preferences.
- If you are using Git Credential Manager, you will be prompted for a username and password.
Note: the username is just a placeholder; it can be any random string.
- To use the `http.extraHeader` approach of `git-config`, you need to convert the token to base64 format. Refer to the "Use a PAT" section of the Azure DevOps documentation. Note that instead of using the git clone invocation shown in that example, you should call:
MY_PAT=accessToken
B64_PAT=$(printf ":%s" "$MY_PAT" | base64)
git config --global http.https://msface.visualstudio.com/SDK.extraHeader "Authorization: Basic ${B64_PAT}"
- For other types of Git installations, refer to the Credentials section of the Git FAQ.
Refer to the API reference documentation to learn more about our SDK APIs.
Now that you have set up your environment, follow these steps:
- Download the sample app folder and double-click the .xcodeproj file to open the project in Xcode.
- Add the package dependencies through Swift Package Manager, as described earlier. Add both AzureAIVisionFaceUI.xcframework and AzureAIVisionCore.xcframework to the project. If Swift Package Manager fails to add the frameworks, consider the alternatives in Set up the environment, such as cloning the source Git repositories or using CocoaPods.
- Set the app bundle identifier and developer team in "Xcode -> Targets -> Signing & Capabilities" using your Apple developer account information.
- Now attach your iPhone to the Mac. You should get a prompt on the phone asking you to "Trust" the Mac. Enable the trust.
- The phone should now show up in the Xcode top bar, with your iPhone's name visible.
- Now build and run the app.
The first time the app runs, it asks for camera permission; allow the camera access. The app starts with the launch page. Some of the buttons are disabled until you configure the settings. Open the settings page, which has the following fields from top to bottom; enter them correctly.
- API endpoint: The Azure endpoint to which the application makes the Face API calls.
- Subscription: The secret key used to access the Azure endpoint.
The application supports two scenarios.
Liveness: this mode checks whether the person in the camera view is a live person or not.
- Click on the "Liveness" button, following the guidance on the screen.
- Click "Start" and show your face to the front-facing camera. As it processes your images, the screen will display user feedback on image quality. The screen will also flash black and white. This is needed for liveness analysis.
- The result is displayed as Liveness status (Real/Spoof).
- You can return to the main page by clicking "Continue".
LivenessWithVerify: this mode checks the person's liveness along with verification against a provided face image.
- Click on the "LivenessWithVerify" button and it will prompt you to select an image of a face to verify against.
- Click next and show your face to the front-facing camera. As it processes your images, the screen will display user feedback on image quality. The screen will also flash black and white. This is needed for liveness analysis.
- The result is displayed as Liveness status (Real/Spoof), verification status (Recognized/NotRecognized), and verification confidence score.
- You can return to the main page by clicking "Continue".
Based on the provided sample app, please refer to `MainView.swift` for an example usage of `FaceLivenessDetectorView`. Here are the steps to integrate face liveness detection into your own application:
- In "Xcode -> Targets -> Build Settings -> Swift Compiler - Language", set "C++ and Objective-C Interoperability" to "C++ / Objective-C++".
- In "Xcode -> Targets -> Info -> Custom iOS Target Properties", add the key "Privacy - Camera Usage Description" with your description, such as "This app requires camera usage."
- In "Xcode -> File -> Add Package Dependencies", add AzureAIVisionCore.xcframework and AzureAIVisionFaceUI.xcframework as mentioned in Set up the environment.
Please refer to `MainView.swift` for an example usage of `FaceLivenessDetectorView`.
Here are more details about the `FaceLivenessDetectorView` parameters.
For more information on how to orchestrate the liveness flow by utilizing the Azure AI Vision Face service, visit: https://aka.ms/azure-ai-vision-face-liveness-tutorial.
- `result: Binding<LivenessDetectionResult?>`
Required parameter. The result of liveness detection; it is `nil` only while the analysis has not completed. Define a `@Binding` of type `LivenessDetectionResult?` in your `View` to receive the result of the liveness detection, and provide it to `FaceLivenessDetectorView`. In the `MainView.swift` example, this `View` holds the source `@State` variable of the passed binding directly.
- `sessionAuthorizationToken: String`
Required parameter. Used to authorize the client and allow it to establish the session connection to the server.
-
verifyImageFileContent: Data?
Optional parameter, default as
nil
. This refers to the reference image, if provided in client. For most production scenario, you should provide this in the App server when creating the session. A non-nil
value here requires that the providedsessionAuthorizationToken
allows setting this value on client-side. Else, re-specifying them here will result in an error. -
- `deviceCorrelationId: String?`
Optional parameter, defaults to `nil`. This is the device correlation identifier, if provided by the client. For most production scenarios, you should provide this from the app server when creating the session. A non-`nil` value here requires that the provided `sessionAuthorizationToken` allows setting this value on the client side; otherwise, re-specifying it here will result in an error.
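Putting these parameters together, here is a minimal sketch of a host view that supplies the optional values from the client side. It assumes the module is imported as `AzureAIVisionFaceUI`, that the initializer accepts `verifyImageFileContent` and `deviceCorrelationId` as labeled parameters alongside the two required ones, and that the session token was created to permit client-side overrides; `VerifiedLivenessView` and its properties are illustrative names, not part of the sample.

```swift
import Foundation
import SwiftUI
import AzureAIVisionFaceUI

// Illustrative host view; the names here are not part of the sample app.
struct VerifiedLivenessView: View {
    // Filled in by FaceLivenessDetectorView once the analysis completes.
    @State private var result: LivenessDetectionResult? = nil

    // Session token obtained from your app server.
    let sessionAuthorizationToken: String

    // Reference face image loaded on the client; in production, prefer
    // providing this to the service when the session is created server-side.
    let verifyImage: Data?

    var body: some View {
        FaceLivenessDetectorView(
            result: $result,
            sessionAuthorizationToken: sessionAuthorizationToken,
            // Optional values; passing non-nil requires a session token that
            // allows client-side configuration, otherwise an error is returned.
            verifyImageFileContent: verifyImage,
            deviceCorrelationId: UUID().uuidString
        )
    }
}
```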
Next, respond to the update of the passed binding in your `View`. In the `MainView.swift` example, the `View` uses `onChange(of:perform:)` to demonstrate a more imperative way of handling the result, but you can also handle it in a more declarative, SwiftUI-style way, like:
struct HostView: View {
@State var livenessDetectionResult: LivenessDetectionResult? = nil
var token: String
var body: some View {
if livenessDetectionResult == nil {
FaceLivenessDetectorView(result: $livenessDetectionResult,
sessionAuthorizationToken: token)
} else if let result = livenessDetectionResult {
VStack {
switch result {
case .success(let result):
/// <#handle success#>
case .failure(let error):
/// <#handle failure#>
}
}
}
}
}
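For comparison, the imperative `onChange(of:perform:)` style mentioned above could look roughly like the sketch below; the `ImperativeHostView` name and the print statements are placeholders rather than the actual `MainView.swift` logic.

```swift
import SwiftUI
import AzureAIVisionFaceUI

struct ImperativeHostView: View {
    @State var livenessDetectionResult: LivenessDetectionResult? = nil
    var token: String

    var body: some View {
        FaceLivenessDetectorView(result: $livenessDetectionResult,
                                 sessionAuthorizationToken: token)
            // React whenever FaceLivenessDetectorView updates the binding.
            .onChange(of: livenessDetectionResult) { newValue in
                // The binding stays nil while the analysis is still in progress.
                guard let newValue = newValue else { return }
                switch newValue {
                case .success(let success):
                    // e.g. persist the digest and navigate to a result screen.
                    print("Liveness completed, digest: \(success.digest)")
                case .failure(let error):
                    print("Liveness failed: \(error)")
                }
            }
    }
}
```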
With the first four steps, you should be able to run liveness detection in your own project. Here are more advanced details to help you understand the API usage. The file `LaunchView.swift` contains the method showing how the token is obtained.
- Configuring the Face API service to obtain the required session-authorization-token
// this is for demo purposes only; the session-authorization-token should be obtained in the app server directly
sessionData.token = obtainToken(...)
Note:
- A demo version of obtaining the token is in `AppUtility.swift` so that the demo app can be built as a standalone solution, but this is not recommended. The "session-authorization-token" is required to start a liveness session. For more information on how to orchestrate the liveness flow by utilizing the Azure AI Vision Face service, visit: https://aka.ms/azure-ai-vision-face-liveness-tutorial.
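In a production setup, the token is typically fetched from your own app server, which holds the Face API key and creates the session on the client's behalf. Below is a minimal, hypothetical sketch of that client-side call; the endpoint URL and the `token` JSON field are assumptions about your server's contract, not part of this SDK.

```swift
import Foundation

// Hypothetical shape of your app server's response; adjust to your actual contract.
private struct LivenessSessionResponse: Decodable {
    let token: String
}

/// Asks your app server to create a liveness session and return the
/// session-authorization-token to pass into FaceLivenessDetectorView.
/// Uses the async URLSession API (iOS 15+).
func fetchSessionAuthorizationToken() async throws -> String {
    // Hypothetical endpoint on your backend; it calls the Azure AI Vision Face
    // service with the subscription key and returns only the short-lived token.
    guard let url = URL(string: "https://your-app-server.example.com/api/liveness-session") else {
        throw URLError(.badURL)
    }
    var request = URLRequest(url: url)
    request.httpMethod = "POST"

    let (data, response) = try await URLSession.shared.data(for: request)
    guard let http = response as? HTTPURLResponse, (200..<300).contains(http.statusCode) else {
        throw URLError(.badServerResponse)
    }
    return try JSONDecoder().decode(LivenessSessionResponse.self, from: data).token
}
```

The string returned here is what `sessionData.token` would hold in place of the demo `obtainToken(...)` call.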
All the on-screen prompts are defined with English as the default language. We provide Chinese (Simplified) as an example of how to add your own localization.
(1) Go to "Xcode -> Targets -> Info -> Custom iOS Target Properties -> Localizations" to add all the languages you want to support.
(2) Refer to Core/en.lproj and Core/zh-Hans.lproj to add the corresponding translations for all the languages added in your localizations.
We highly recommend leveraging the "digest" generated within the solution to validate the integrity of the communication between your application and the Azure AI Vision Face service. This is necessary to ensure that the final liveness detection result is trustworthy. "Digest" is provided in the following two locations:
- The `FaceLivenessDetectorView` running in your application. In the resulting `LivenessDetectionResult`:
if case let .success(success) = result {
sessionData.resultDigest = success.digest
}
- The Azure AI Vision Face service.
The "digest" will be contained within the liveness detection result when calling the detectliveness/singlemodal/sessions/ REST call. Look for an example of the "digest" in the tutorial where the liveness detection result is shown.
Digests must match between the application and the service. We recommend using these digests in conjunction with iOS integrity APIs to perform the final validation. For more information on the iOS Integrity APIs, please refer to:
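As one way to combine the digest with an iOS integrity API, the sketch below hashes the client-side digest and uses it as the `clientDataHash` of an App Attest assertion from the DeviceCheck framework; your app server would then verify both the digest match and the assertion. The `keyId` handling (generating and attesting the key beforehand) and the server-side verification are assumed and omitted here.

```swift
import CryptoKit
import DeviceCheck

enum IntegrityError: Error {
    case appAttestUnsupported
}

/// Binds the liveness "digest" to this device with App Attest, so the app server
/// can check that the digest was produced by an untampered app instance.
func attestLivenessDigest(_ digest: String, keyId: String) async throws -> Data {
    let service = DCAppAttestService.shared
    guard service.isSupported else {
        throw IntegrityError.appAttestUnsupported
    }
    // App Attest expects a SHA-256 hash of the client data being asserted.
    let clientDataHash = Data(SHA256.hash(data: Data(digest.utf8)))
    // keyId must refer to a key previously created with generateKey() and
    // attested with attestKey(_:clientDataHash:); send the result to your server.
    return try await service.generateAssertion(keyId, clientDataHash: clientDataHash)
}
```

Together with the matching digest from the service response, this gives the server a stronger basis for trusting the final liveness decision.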