Commit 6c38dc6

Merge pull request #258744 from dem108/patch-28
Clarify on limitation for local deployment
2 parents 3499440 + cfef112 commit 6c38dc6

1 file changed

Lines changed: 6 additions & 5 deletions

File tree

articles/machine-learning/how-to-deploy-online-endpoints.md

```diff
@@ -8,7 +8,7 @@ ms.subservice: inferencing
 author: dem108
 ms.author: sehan
 ms.reviewer: mopeakande
-ms.date: 10/18/2023
+ms.date: 11/15/2023
 reviewer: msakande
 ms.topic: how-to
 ms.custom: how-to, devplatv2, ignite-fall-2021, cliv2, event-tier1-build-2022, sdkv2
```
```diff
@@ -555,10 +555,11 @@ To deploy locally, [Docker Engine](https://docs.docker.com/engine/install/) must
 > [!TIP]
 > You can use [Azure Machine Learning inference HTTP server Python package](how-to-inference-server-http.md) to debug your scoring script locally **without Docker Engine**. Debugging with the inference server helps you to debug the scoring script before deploying to local endpoints so that you can debug without being affected by the deployment container configurations.
 
-Local endpoints have the following limitations:
-- They do *not* support traffic rules, authentication, or probe settings.
-- They support only one deployment per endpoint.
-- They support local model files only. If you want to test registered models, first download them using [CLI](/cli/azure/ml/model#az-ml-model-download) or [SDK](/python/api/azure-ai-ml/azure.ai.ml.operations.modeloperations#azure-ai-ml-operations-modeloperations-download), then use `path` in the deployment definition to refer to the parent folder.
+> [!NOTE]
+> Local endpoints have the following limitations:
+> - They do *not* support traffic rules, authentication, or probe settings.
+> - They support only one deployment per endpoint.
+> - They support only local model files and environments with a local conda file. If you want to test registered models, first download them using the [CLI](/cli/azure/ml/model#az-ml-model-download) or [SDK](/python/api/azure-ai-ml/azure.ai.ml.operations.modeloperations#azure-ai-ml-operations-modeloperations-download), then use `path` in the deployment definition to refer to the parent folder. If you want to test registered environments, check the context of the environment in Azure Machine Learning studio and prepare a local conda file to use. The example in this article demonstrates using a local model and an environment with a local conda file, which supports local deployment.
 
 For more information on debugging online endpoints locally before deploying to Azure, see [Debug online endpoints locally in Visual Studio Code](how-to-debug-managed-online-endpoints-visual-studio-code.md).
 
```
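The note added in this commit says registered models must be downloaded before they can be tested with a local endpoint. As a sketch of that workflow (the model name `my-model`, version `1`, resource group `my-rg`, and workspace `my-ws` are placeholders, not values from this commit), the download step and the resulting `path` reference might look like:

```shell
# Sketch only: model, workspace, and resource-group names are placeholders.
# 1. Download the registered model into a local parent folder:
az ml model download --name my-model --version 1 \
  --download-path ./downloaded-model \
  --resource-group my-rg --workspace-name my-ws

# 2. In the local deployment YAML, point `path` at that parent folder,
#    roughly like:
#      model:
#        path: ./downloaded-model/my-model
```

The same download is available in the Python SDK via `ModelOperations.download`, linked from the note itself; either way, the deployment definition then references the local folder rather than the registered asset.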
