
---

### Manifest

[Manifest](https://manifest.build) is an open-source LLM router that cuts inference costs through smart routing across 16+ providers. You keep full control over which model handles each request: route by complexity tier, by task specificity (coding, web browsing, etc.), or by custom tiers you define.

1. Head over to [manifest.build](https://manifest.build), create an account, and copy your API key (starts with `mnfst_`).

2. Run the `/connect` command and search for Manifest.

```txt
/connect
```

3. Enter the API key for the provider.

```txt
┌ API key
└ enter
```

4. Run the `/models` command and select `auto`.

```txt
/models
```

5. You can also configure Manifest through your opencode config.

**Cloud** (default, no base URL needed):

```json title="opencode.json"
{
"$schema": "https://opencode.ai/config.json",
"provider": {
"manifest": {}
}
}
```

**Self-hosted** (Manifest is open source and can run locally with Docker for fully private inference):

```json title="opencode.json"
{
"$schema": "https://opencode.ai/config.json",
"provider": {
"manifest": {
"api": "http://localhost:2099/v1"
}
}
}
```
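If you manage machine configs with scripts, the self-hosted setup above can also be generated programmatically. A minimal sketch in Python; `write_manifest_config` is a hypothetical helper, not part of opencode, and it simply emits the same `opencode.json` shape shown above:

```python
import json
from pathlib import Path


def write_manifest_config(path: str, base_url: str = "http://localhost:2099/v1") -> dict:
    """Write an opencode.json pointing opencode at a self-hosted Manifest instance.

    Hypothetical helper for illustration only. The default base_url matches the
    example above; adjust it if your Docker container maps a different port.
    """
    config = {
        "$schema": "https://opencode.ai/config.json",
        "provider": {
            "manifest": {"api": base_url},
        },
    }
    Path(path).write_text(json.dumps(config, indent=2) + "\n")
    return config


cfg = write_manifest_config("opencode.json")
print(cfg["provider"]["manifest"]["api"])  # http://localhost:2099/v1
```

Writing the file through `json.dumps` rather than string templating keeps the output valid JSON even if the base URL contains characters that would need escaping.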

---

### LLM Gateway

1. Head over to the [LLM Gateway dashboard](https://llmgateway.io/dashboard), click **Create API Key**, and copy the key.