From ba65930a83daad421f0ed51451ba7ff5d3dfaff1 Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?S=C3=A9bastien=20Conejo?=
Date: Fri, 1 May 2026 22:56:51 -0700
Subject: [PATCH] docs: add Manifest provider section

---
 packages/web/src/content/docs/providers.mdx | 55 +++++++++++++++++++++
 1 file changed, 55 insertions(+)

diff --git a/packages/web/src/content/docs/providers.mdx b/packages/web/src/content/docs/providers.mdx
index 7c395022c14a..4aaf252004cb 100644
--- a/packages/web/src/content/docs/providers.mdx
+++ b/packages/web/src/content/docs/providers.mdx
@@ -1642,6 +1642,61 @@ OpenCode Zen is a list of tested and verified models provided by the OpenCode te
 
 ---
 
+### Manifest
+
+[Manifest](https://manifest.build) is an open-source LLM router that cuts inference costs through smart routing across 16+ providers. You get full control over which model handles each request: route by complexity tier, by task type (coding, web browsing, etc.), or by custom tiers.
+
+1. Head over to [manifest.build](https://manifest.build), create an account, and copy your API key (starts with `mnfst_`).
+
+2. Run the `/connect` command and search for Manifest.
+
+   ```txt
+   /connect
+   ```
+
+3. Enter the API key for the provider.
+
+   ```txt
+   ┌ API key
+   │
+   │
+   └ enter
+   ```
+
+4. Run the `/models` command and select `auto`.
+
+   ```txt
+   /models
+   ```
+
+5. You can also configure Manifest through your opencode config.
+
+   **Cloud** (default, no base URL needed):
+
+   ```json title="opencode.json"
+   {
+     "$schema": "https://opencode.ai/config.json",
+     "provider": {
+       "manifest": {}
+     }
+   }
+   ```
+
+   **Self-hosted** (Manifest is open-source and can run locally with Docker for fully private inference):
+
+   ```json title="opencode.json"
+   {
+     "$schema": "https://opencode.ai/config.json",
+     "provider": {
+       "manifest": {
+         "api": "http://localhost:2099/v1"
+       }
+     }
+   }
+   ```
+
+---
+
 ### LLM Gateway
 
 1. Head over to the [LLM Gateway dashboard](https://llmgateway.io/dashboard), click **Create API Key**, and copy the key.
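
Reviewer note, outside the patch itself: a minimal sketch of what the self-hosted `api` base URL implies, assuming Manifest's `/v1` endpoint is OpenAI-compatible (the `http://localhost:2099/v1` value in the config suggests this, but the patch does not confirm it). The function name and the `mnfst_example` key are illustrative only; no network call is made.

```python
# Sketch (assumption): how a client would target the "api" base URL from the
# self-hosted config above, if Manifest's /v1 is OpenAI-compatible.
# Only builds the request; performs no network I/O.
import json


def chat_completion_request(base_url: str, api_key: str, model: str, prompt: str):
    """Return (url, headers, body) for a chat-completions call."""
    url = base_url.rstrip("/") + "/chat/completions"
    headers = {
        "Authorization": f"Bearer {api_key}",  # mnfst_... key from step 1
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "model": model,  # "auto" would let Manifest's router pick the model
        "messages": [{"role": "user", "content": prompt}],
    })
    return url, headers, body


url, headers, body = chat_completion_request(
    "http://localhost:2099/v1", "mnfst_example", "auto", "Hello"
)
```

The same helper works for the cloud setup by swapping in Manifest's hosted base URL, since only `base_url` changes between the two configs.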