hub/apps/get-started/ai-build.md (2 additions & 2 deletions)
@@ -132,7 +132,7 @@ You've built a complete WinUI 3 notes app using:
## Optional: Add on-device AI to your app
- The notes app is fully functional — but you can take it further by adding an AI feature that runs entirely on the user's device. [Foundry Local](https://learn.microsoft.com/windows/ai/foundry-local/get-started) makes this straightforward: it runs a language model locally and exposes an OpenAI-compatible API.
+ The notes app is fully functional — but you can take it further by adding an AI feature that runs entirely on the user's device. [Foundry Local](/windows/ai/foundry-local/get-started) makes this straightforward: it runs a language model locally and exposes an OpenAI-compatible API.
### Install Foundry Local and download a model
@@ -171,7 +171,7 @@ var completion = await client.GetChatClient("phi-4-mini")
No internet connection required. No API key. The model runs on their PC — fast, private, and free.
> [!TIP]
- > For apps targeting Copilot+ PCs, you can swap Foundry Local for [Phi Silica](https://learn.microsoft.com/windows/ai/apis/phi-silica) to use the NPU directly for even faster inference. The API surface is different (Windows AI APIs rather than OpenAI-compatible), but Copilot can help you make the switch.
+ > For apps targeting Copilot+ PCs, you can swap Foundry Local for [Phi Silica](/windows/ai/apis/phi-silica) to use the NPU directly for even faster inference. The API surface is different (Windows AI APIs rather than OpenAI-compatible), but Copilot can help you make the switch.
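The hunks above describe Foundry Local's OpenAI-compatible API but the diff only touches link targets. As a rough illustration of what "OpenAI-compatible" means in practice (not part of this PR), a chat-completions request against a locally running model might be built like this; the port, path, and the use of Python's stdlib are assumptions, since Foundry Local reports its actual base URL when the service starts:

```python
import json
from urllib import request

# Hypothetical local endpoint: Foundry Local prints the real
# base URL and port when the service starts.
BASE_URL = "http://localhost:5273/v1"

def build_chat_request(model: str, prompt: str) -> request.Request:
    """Build an OpenAI-compatible chat-completions request for a
    locally hosted model. No API key is needed for a local endpoint."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("phi-4-mini", "Summarize my note in one sentence.")
# request.urlopen(req) would return a standard chat-completion JSON body.
```

Because the wire format matches the OpenAI REST API, existing client code (such as the `client.GetChatClient("phi-4-mini")` call referenced in the hunk header above) only needs its base URL pointed at the local service.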
hub/apps/get-started/ai-for-windows-developers.md (4 additions & 4 deletions)
@@ -16,7 +16,7 @@ Windows is where AI development is happening — both for developers writing app
This article covers both: the AI coding tools that help you build Windows apps faster, and the Windows AI stack that lets you put intelligence directly into your app. When you're ready, follow the links to set up your environment and start building.
> [!TIP]
- > **New to Windows development?** Windows has the deepest local AI stack of any platform: [Foundry Local](https://learn.microsoft.com/windows/ai/foundry-local/get-started) runs state-of-the-art models on any hardware, [Phi Silica](https://learn.microsoft.com/windows/ai/apis/phi-silica) uses the NPU on Copilot+ PCs for near-instant inference, and the full [Windows AI API surface](/windows/ai/) is available to any packaged app. If you're coming from Linux or macOS, Windows Subsystem for Linux (WSL) and the GitHub Copilot CLI Terminal mean you don't have to give up your existing workflow to get started.
+ > **New to Windows development?** Windows has the deepest local AI stack of any platform: [Foundry Local](/windows/ai/foundry-local/get-started) runs state-of-the-art models on any hardware, [Phi Silica](/windows/ai/apis/phi-silica) uses the NPU on Copilot+ PCs for near-instant inference, and the full [Windows AI API surface](/windows/ai/) is available to any packaged app. If you're coming from Linux or macOS, Windows Subsystem for Linux (WSL) and the GitHub Copilot CLI Terminal mean you don't have to give up your existing workflow to get started.
## Two ways AI changes Windows development
@@ -95,7 +95,7 @@ The Windows AI stack lets you ship AI features directly in your app — with har
### Foundry Local
- [Foundry Local](https://learn.microsoft.com/windows/ai/foundry-local/get-started) runs large language models locally on any Windows PC. It exposes an OpenAI-compatible REST API, so you can use your existing AI code against local models with no rewrite. Foundry Local is the recommended starting point for adding AI to a Windows app — it works on any hardware, requires no Azure subscription, and keeps user data on-device.
+ [Foundry Local](/windows/ai/foundry-local/get-started) runs large language models locally on any Windows PC. It exposes an OpenAI-compatible REST API, so you can use your existing AI code against local models with no rewrite. Foundry Local is the recommended starting point for adding AI to a Windows app — it works on any hardware, requires no Azure subscription, and keeps user data on-device.
```bash
winget install Microsoft.AIFoundry.Local
@@ -106,7 +106,7 @@ After the model starts, call it from your app using the OpenAI-compatible endpoi
### Phi Silica
- [Phi Silica](https://learn.microsoft.com/windows/ai/apis/phi-silica) is a compact, highly capable model built into Windows 11 on Copilot+ PCs. It runs entirely on the NPU — no GPU, no cloud, near-instant inference. If your app targets Copilot+ PCs, Phi Silica is the fastest local AI option available.
+ [Phi Silica](/windows/ai/apis/phi-silica) is a compact, highly capable model built into Windows 11 on Copilot+ PCs. It runs entirely on the NPU — no GPU, no cloud, near-instant inference. If your app targets Copilot+ PCs, Phi Silica is the fastest local AI option available.
> [!NOTE]
> Phi Silica requires a Copilot+ PC (with NPU, 40+ TOPS). For apps targeting all Windows hardware, use Foundry Local with a fallback to cloud APIs.
@@ -132,7 +132,7 @@ All of these run on-device, require no cloud subscription, and become available
Or jump straight to:
-[Tutorial: Build a Windows app with GitHub Copilot](ai-build.md)
- -[Get started with Foundry Local](https://learn.microsoft.com/windows/ai/foundry-local/get-started)
+ -[Get started with Foundry Local](/windows/ai/foundry-local/get-started)
-[Modernize or port a Windows app with Copilot](../windows-app-sdk/migrate-to-windows-app-sdk/ai-modernize.md)
-[Agentic AI tools for Windows development](../dev-tools/agentic-tools.md)
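One hunk in this file notes that Phi Silica requires a Copilot+ PC and recommends Foundry Local with a cloud fallback for apps targeting all hardware. As an illustrative sketch of that fallback pattern (not code from this PR; the local port and cloud URL are placeholder assumptions), an app might probe for the local service before choosing a base URL:

```python
import socket

# Hypothetical endpoints: the local port and the cloud base URL below
# are illustrative placeholders, not values from these docs.
LOCAL_HOST, LOCAL_PORT = "localhost", 5273
CLOUD_BASE_URL = "https://api.example.com/v1"

def pick_base_url(timeout: float = 0.25) -> str:
    """Prefer the local OpenAI-compatible endpoint when something is
    listening on it; otherwise fall back to a cloud API base URL."""
    try:
        with socket.create_connection((LOCAL_HOST, LOCAL_PORT), timeout=timeout):
            return f"http://{LOCAL_HOST}:{LOCAL_PORT}/v1"
    except OSError:
        # Nothing listening locally (or the probe timed out): use the cloud.
        return CLOUD_BASE_URL
```

Because both endpoints speak the same OpenAI-compatible protocol, only the base URL changes; the rest of the client code stays identical.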
hub/apps/publish/recent-data-usage-reports.md (1 addition & 1 deletion)
@@ -9,7 +9,7 @@ ms.localizationpriority: medium
# What’s New - Recent data view in Usage reports
- The [**Recent data** view in Usage reports](https://partner.microsoft.com/dashboard/insights/analytics/store/usage?viewSelected=48h) provides near real‑time visibility into how users are engaging with your app. With this launch, data is now refreshed in about 3 hours instead of 30 hours, delivering a faster and more responsive reporting experience. This view is designed to complement the [existing daily and longer‑term Usage insights](https://learn.microsoft.com/windows/apps/publish/usage) by focusing on very recent activity, helping you react quickly to changes without waiting for daily aggregation.
+ The [**Recent data** view in Usage reports](https://partner.microsoft.com/dashboard/insights/analytics/store/usage?viewSelected=48h) provides near real‑time visibility into how users are engaging with your app. With this launch, data is now refreshed in about 3 hours instead of 30 hours, delivering a faster and more responsive reporting experience. This view is designed to complement the [existing daily and longer‑term Usage insights](/windows/apps/publish/usage) by focusing on very recent activity, helping you react quickly to changes without waiting for daily aggregation.