articles/ai-services/computer-vision/Tutorials/liveness.md (10 additions, 10 deletions)
@@ -37,7 +37,7 @@ The liveness solution integration involves two different components: a mobile ap
### Integrate liveness into mobile application
Once you have access to the SDK, follow the instructions in the [azure-ai-vision-sdk](https://github.com/Azure-Samples/azure-ai-vision-sdk) GitHub repository to integrate the UI and the code into your native mobile application. The liveness SDK supports both Java/Kotlin for Android and Swift for iOS mobile applications:
- - For Swift iOS, follow the instructions in the [iOS sample](https://aka.ms/liveness-sample-ios)
+ - For Swift iOS, follow the instructions in the [iOS sample](https://aka.ms/azure-ai-vision-face-liveness-client-sdk-ios-readme)
- For Kotlin/Java Android, follow the instructions in the [Android sample](https://aka.ms/liveness-sample-java)
Once you've added the code into your application, the SDK will handle starting the camera, guiding the end-user to adjust their position, composing the liveness payload, and calling the Azure AI Face cloud service to process it.
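
For orientation only, here is a minimal Kotlin (Jetpack Compose) sketch of what that hand-off can look like on Android. The entry-point name `FaceLivenessDetector`, its parameters (`sessionAuthorizationToken`, `verifyImageFileContent`, `deviceCorrelationId`, `onSuccess`, `onError`), and the callback shapes are assumptions modeled on the sample repository, not a definitive API reference; the linked Android sample is authoritative.

```kotlin
// Hypothetical sketch: the SDK entry point, its import path, and the
// parameter names below are assumptions; follow the linked Android sample
// for the actual, current API.
import androidx.compose.runtime.Composable

@Composable
fun LivenessCheckScreen(
    // Short-lived session token that your app server obtains by creating a
    // liveness session against the Azure AI Face service.
    sessionAuthorizationToken: String,
    onResult: (succeeded: Boolean, detail: String) -> Unit
) {
    // The SDK component drives the whole flow described above: it starts the
    // camera, guides the end-user into position, composes the liveness
    // payload, and calls the Azure AI Face cloud service to process it.
    FaceLivenessDetector(
        sessionAuthorizationToken = sessionAuthorizationToken,
        verifyImageFileContent = null, // assumed optional: verify-with-face reference image
        deviceCorrelationId = null,    // assumed optional: client correlation ID
        onSuccess = { onResult(true, "Liveness session completed") },
        onError = { error -> onResult(false, "Liveness session failed: $error") }
    )
}
```

Note the design point this reflects: the mobile client never holds the Face API key. It only receives a short-lived session authorization token from your own app server, which is the component that creates the liveness session against the Azure AI Face service.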
@@ -54,7 +54,7 @@ The high-level steps involved in liveness orchestration are illustrated below: