4 changes: 2 additions & 2 deletions docs/new-windows-ml/api-reference.md

@@ -7,7 +7,7 @@ ms.topic: article
 
 # Windows ML APIs
 
-For conceptual guidance, see [Run ONNX models with Windows ML)](./run-onnx-models.md).
+For conceptual guidance, see [Run ONNX models with Windows ML](./run-onnx-models.md).
 
 You can think of the APIs in the *Microsoft.WindowsAppSDK.ML* NuGet package as being the superset of these two sets:
 
@@ -73,7 +73,7 @@ The ONNX runtime is designed in a way where the Python and native environments a
 
 **Remove pywinrt's packed vcruntime**
 
-The pywinrt project includes a msvcp140.dll in the winrt-runtime package. This may conflict with other packages. Please remove this dll to avoid this problem and install the missing vcruntime libraries with the [vc redistributable](/cpp/windows/latest-supported-vc-redist)
+The pywinrt project includes a msvcp140.dll in the winrt-runtime package. This may conflict with other packages. Please remove this dll to avoid this problem and install the missing vcruntime libraries with the [vc redistributable](/cpp/windows/latest-supported-vc-redist).
 
 ## See also
 
4 changes: 3 additions & 1 deletion docs/new-windows-ml/initialize-execution-providers.md

@@ -436,7 +436,8 @@ std::vector<ExecutionProvider> targetProviders;
 for (auto const& p : allProviders)
 {
     auto name = p.Name();
-    if (name == L"VitisAIExecutionProvider" ||
+    if (name == L"MIGraphXExecutionProvider" ||
+        name == L"VitisAIExecutionProvider" ||
         name == L"OpenVINOExecutionProvider" ||
         name == L"QNNExecutionProvider" ||
         name == L"NvTensorRtRtxExecutionProvider")
@@ -497,6 +498,7 @@ else
 
 // List of provider names our app supports
 const char* targetProviderNames[] = {
+    "MIGraphXExecutionProvider",
     "VitisAIExecutionProvider",
     "OpenVINOExecutionProvider",
     "QNNExecutionProvider",
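Both hunks above add MIGraphXExecutionProvider to the app's supported-provider list, which the surrounding docs code then uses to filter the providers the runtime reports. As a standalone sketch of that filtering idiom (plain STL only; the function name `FilterProviders` is illustrative, and the real code reads names from `ExecutionProvider` objects rather than plain strings):

```cpp
#include <algorithm>
#include <iterator>
#include <string>
#include <vector>

// Standalone sketch of the selection step shown in the diff: given every
// provider name the runtime reports, keep only the names this app supports.
std::vector<std::wstring> FilterProviders(const std::vector<std::wstring>& allProviders)
{
    const std::vector<std::wstring> supported = {
        L"MIGraphXExecutionProvider",
        L"VitisAIExecutionProvider",
        L"OpenVINOExecutionProvider",
        L"QNNExecutionProvider",
        L"NvTensorRtRtxExecutionProvider",
    };

    std::vector<std::wstring> target;
    std::copy_if(allProviders.begin(), allProviders.end(),
                 std::back_inserter(target),
                 [&supported](const std::wstring& name) {
                     // Keep the provider only if its name is in the supported set.
                     return std::find(supported.begin(), supported.end(), name)
                            != supported.end();
                 });
    return target;
}
```

Unrecognized providers (for example, a plain CPU provider) simply fall out of the target list, mirroring how the docs sample skips providers it does not name.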
2 changes: 1 addition & 1 deletion docs/new-windows-ml/logs.md

@@ -145,7 +145,7 @@ WindowsAppSDK.ML Version (2024/01/15 10:30:45.100):
     Version = 2.0.0
 ```
 
-Shows the Windows App SDK Windows ML version used by the application. Duplicate entries with the same version are suppressed
+Shows the Windows App SDK Windows ML version used by the application. Duplicate entries with the same version are suppressed.
 
 #### Driver info
 
2 changes: 1 addition & 1 deletion docs/new-windows-ml/model-catalog/model-catalog-source.md

@@ -326,7 +326,7 @@ The Windows ML Model Catalog follows the JSON Schema specification (draft 2020-1
 
 1. **Use semantic versioning**: Follow semantic versioning (e.g., "1.2.3") for the `version` field
 2. **Provide accurate SHA256 hashes**: Always include correct SHA256 hashes for integrity verification
-5. **Consistent naming**: Use consistent naming conventions for IDs and names across model versions
+3. **Consistent naming**: Use consistent naming conventions for IDs and names across model versions
 4. **Clear descriptions**: Write helpful descriptions that explain model capabilities and use cases
 5. **Proper licensing**: Always include complete license information (type, URI, and text)
 6. **Test accessibility**: Ensure all URIs are accessible and return the expected content
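The best-practices list above mentions a `version` field, SHA256 hashes, IDs/names, descriptions, and license information (type, URI, and text). A hedged illustration of how those pieces might combine in a single catalog entry; every field name and value below is invented for illustration only and should be checked against the actual catalog schema:

```json
{
  "id": "contoso.image-classifier",
  "name": "Contoso Image Classifier",
  "version": "1.2.3",
  "description": "Classifies images into common categories; suited to on-device tagging scenarios.",
  "license": {
    "type": "MIT",
    "uri": "https://example.com/license",
    "text": "Permission is hereby granted..."
  },
  "uri": "https://example.com/models/image-classifier-1.2.3.onnx",
  "sha256": "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
}
```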
2 changes: 1 addition & 1 deletion docs/new-windows-ml/multiple-onnx-versions.md

@@ -11,7 +11,7 @@ Some apps might need to use multiple different versions of ONNX Runtime. For exa
 
 In these cases, you can run Windows ML's ONNX Runtime alongside another ONNX Runtime by running Windows ML in a separate process.
 
-As an example of how to do this, say you wanted to take advantage of Windows ML's ability to run SqueezeNet. You would first build [this sample](https://github.com/microsoft/WindowsAppSDK-Samples/tree/main/Samples/WindowsML/cs/CSharpConsoleDesktop).
+As an example of how to do this, say you wanted to take advantage of Windows ML's ability to run SqueezeNet. You would first build the [CSharpConsoleDesktop Windows ML sample](https://github.com/microsoft/WindowsAppSDK-Samples/tree/main/Samples/WindowsML/cs/CSharpConsoleDesktop).
 
 This sample accepts an image file as a command-line argument and outputs its interpretation of the image contents.
 
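The separate-process pattern this file describes reduces to: launch the other executable with its command-line argument and capture its stdout. A minimal native sketch under stated assumptions, not the sample's own code; `RunInSeparateProcess` and the executable/image names in the usage comment are illustrative:

```cpp
#include <cstdio>
#include <stdexcept>
#include <string>

// Sketch of the separate-process pattern: launch an executable (for example,
// the built sample) with one quoted argument and capture everything it prints.
std::string RunInSeparateProcess(const std::string& exe, const std::string& arg)
{
    const std::string cmd = exe + " \"" + arg + "\"";
#ifdef _WIN32
    FILE* pipe = _popen(cmd.c_str(), "r");
#else
    FILE* pipe = popen(cmd.c_str(), "r");
#endif
    if (!pipe) {
        throw std::runtime_error("failed to start: " + exe);
    }

    // Read the child process's stdout to completion.
    std::string output;
    char buffer[256];
    while (fgets(buffer, sizeof(buffer), pipe)) {
        output += buffer;
    }

#ifdef _WIN32
    _pclose(pipe);
#else
    pclose(pipe);
#endif
    return output;
}

// Usage (hypothetical paths):
//   std::string result = RunInSeparateProcess("CSharpConsoleDesktop.exe", "kitten.png");
```

Because the child process hosts its own copy of ONNX Runtime, its version cannot collide with whichever ONNX Runtime the parent process loads.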