Commit b6db2a9

docs: document the Kimi (Moonshot) HTTP adapter
Adds the kimi adapter to the supported-LLMs list (README, doc/index.md, regenerated doc/codecompanion.txt) and a Setup Examples entry in doc/configuration/adapters-http.md. The setup example covers: - Minimal config (just MOONSHOT_API_KEY plus interactions.chat.adapter). - Overriding the API-key source via the cmd: prefix (1Password CLI example) and switching the URL for region-specific endpoints (e.g. api.moonshot.cn). - Schema overrides for `model` and `think` so users can pick a non- thinking K2 model or disable thinking on the K2-thinking variants. - An IMPORTANT callout that kimi-k2-thinking pins temperature=1 and top_p=0.95 server-side, matching the adapter's defaults. The example is placed between the llama.cpp and Ollama sections — both neighbours involve OpenAI-compatible reasoning configuration, which keeps the page topically grouped.
1 parent 53a8509 commit b6db2a9
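The minimal config described in the commit message amounts to the following fragment (the same setup this commit adds to the docs; `MOONSHOT_API_KEY` must already be exported in the shell):

```lua
-- Minimal Kimi setup: the adapter reads MOONSHOT_API_KEY from the
-- environment, so only the interaction adapters need to be named.
require("codecompanion").setup({
  interactions = {
    chat = { adapter = "kimi" },
    inline = { adapter = "kimi" },
  },
})
```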

4 files changed

Lines changed: 102 additions & 2 deletions


README.md

Lines changed: 1 addition & 1 deletion
@@ -30,7 +30,7 @@ Thank you to the following people:
 - :speech_balloon: [Copilot Chat](https://github.com/features/copilot) meets [Zed AI](https://zed.dev/blog/zed-ai), in Neovim
 - :zap: Integrates Neovim with LLMs and Agents in the CLI
-- :electric_plug: Support for LLMs from Anthropic, Copilot, GitHub Models, DeepSeek, Gemini, Mistral AI, Novita, Ollama, OpenAI, Azure OpenAI, HuggingFace and xAI (or [bring your own](https://codecompanion.olimorris.dev/extending/adapters.html))
+- :electric_plug: Support for LLMs from Anthropic, Copilot, GitHub Models, DeepSeek, Gemini, Kimi (Moonshot), Mistral AI, Novita, Ollama, OpenAI, Azure OpenAI, HuggingFace and xAI (or [bring your own](https://codecompanion.olimorris.dev/extending/adapters.html))
 - :robot: Support for [Agent Client Protocol](https://agentclientprotocol.com/overview/introduction), enabling coding with agents like [Augment Code](https://docs.augmentcode.com/cli/overview), [Cagent](https://github.com/docker/cagent) from Docker, [Claude Code](https://docs.anthropic.com/en/docs/claude-code/overview), [Codex](https://openai.com/codex), [Copilot CLI](https://github.com/features/copilot/cli), [Gemini CLI](https://github.com/google-gemini/gemini-cli), [Goose](https://block.github.io/goose/), [Cursor CLI](https://cursor.com/docs/cli/overview), [Kimi CLI](https://github.com/MoonshotAI/kimi-cli), [Kiro](https://kiro.dev/docs/cli/), [Mistral Vibe](https://github.com/mistralai/mistral-vibe) and [OpenCode](https://opencode.ai)
 - :heart_hands: User contributed and supported [adapters](https://codecompanion.olimorris.dev/configuration/adapters-http#community-adapters)
 - :battery: Support for [Model Context Protocol (MCP)](https://codecompanion.olimorris.dev/model-context-protocol#model-context-protocol-mcp-support)

doc/codecompanion.txt

Lines changed: 55 additions & 1 deletion
@@ -1,4 +1,4 @@
-*codecompanion.txt* For NVIM v0.11 Last change: 2026 April 29
+*codecompanion.txt* For NVIM v0.11 Last change: 2026 May 03
 
 ==============================================================================
 Table of Contents *codecompanion-table-of-contents*
@@ -126,6 +126,7 @@ agents. Out of the box, the plugin supports:
 - Goose (`goose`) - Requires an API key
 - HuggingFace (`huggingface`) - Requires an API key
 - Kilo Code (`kilocode`) - Requires an API key
+- Kimi (`kimi`) - Moonshot's Kimi K2 family; requires an API key
 - Kimi CLI (`kimi_cli`) - Requires an API key
 - Mistral AI (`mistral`) - Requires an API key or a Le Chat Pro subscription
 - Novita (`novita`) - Requires an API key
@@ -1757,6 +1758,59 @@ LLAMA.CPP WITH --REASONING-FORMAT DEEPSEEK
 <
 
 
+KIMI (MOONSHOT)
+
+CodeCompanion ships a built-in `kimi` adapter for Moonshot's Kimi K2 family
+<https://platform.kimi.ai/docs/models>. Unlike the generic `openai_compatible`
+adapter, it captures and round-trips Kimi's `reasoning_content` so the
+K2-thinking variants (`kimi-k2-thinking`, `kimi-k2-thinking-turbo`, and the
+`can_reason` K2 generals such as `kimi-k2.6`) work correctly with tool calling
+— without it, the second turn of a tool-using chat fails with `"thinking is
+enabled but reasoning_content is missing in assistant tool call message"`.
+
+For the default setup, simply set `MOONSHOT_API_KEY` and pick the adapter:
+
+>lua
+    require("codecompanion").setup({
+      interactions = {
+        chat = { adapter = "kimi" },
+        inline = { adapter = "kimi" },
+      },
+    })
+<
+
+To override the API-key source, swap models, or disable thinking mode:
+
+>lua
+    require("codecompanion").setup({
+      adapters = {
+        http = {
+          kimi = function()
+            return require("codecompanion.adapters").extend("kimi", {
+              env = {
+                -- Use the 1Password CLI instead of an environment variable:
+                api_key = "cmd:op read op://API/Kimi/credential --no-newline",
+                -- Region override (Moonshot has separate endpoints, e.g. for China):
+                -- url = "https://api.moonshot.cn",
+              },
+              schema = {
+                model = { default = "kimi-k2.6" },
+                -- Set to false to disable thinking mode (e.g. for the K2-general
+                -- non-reasoning preview models, where `think` is a no-op):
+                think = { default = true },
+              },
+            })
+          end,
+        },
+      },
+    })
+<
+
+  [!IMPORTANT] The K2-thinking models pin `temperature` to `1` and `top_p` to
+  `0.95`; the adapter's defaults match. Overriding either with another value will
+  yield a 400 from the API. Other K2 models accept the full ranges.
+
 OLLAMA (REMOTELY)
 
 The simplest way to connect to a remote Ollama instance is to set the
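The failure mode quoted in the new section (missing `reasoning_content` in the assistant tool-call message) is easiest to see in the message history itself. Below is a sketch of a second-turn payload using standard OpenAI-compatible field names; the concrete values and tool name are illustrative only, not taken from the adapter's code:

```lua
-- Second turn of a tool-using chat (sketch). The assistant entry must
-- carry the `reasoning_content` captured from the first response;
-- dropping it is what triggers the "thinking is enabled but
-- reasoning_content is missing" error on K2-thinking models.
local messages = {
  { role = "user", content = "List the files in ./src" },
  {
    role = "assistant",
    content = "",
    reasoning_content = "...chain of thought from turn one...",
    tool_calls = {
      {
        id = "call_1",
        type = "function",
        ["function"] = { name = "list_files", arguments = '{"path": "./src"}' },
      },
    },
  },
  { role = "tool", tool_call_id = "call_1", content = "main.lua\nutils.lua" },
}
```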

doc/configuration/adapters-http.md

Lines changed: 45 additions & 0 deletions
@@ -380,6 +380,51 @@ require("codecompanion").setup({
 })
 ```
 
+### Kimi (Moonshot)
+
+CodeCompanion ships a built-in `kimi` adapter for Moonshot's [Kimi K2 family](https://platform.kimi.ai/docs/models). Unlike the generic `openai_compatible` adapter, it captures and round-trips Kimi's `reasoning_content` so the K2-thinking variants (`kimi-k2-thinking`, `kimi-k2-thinking-turbo`, and the `can_reason` K2 generals such as `kimi-k2.6`) work correctly with tool calling — without it, the second turn of a tool-using chat fails with `"thinking is enabled but reasoning_content is missing in assistant tool call message"`.
+
+For the default setup, simply set `MOONSHOT_API_KEY` and pick the adapter:
+
+```lua
+require("codecompanion").setup({
+  interactions = {
+    chat = { adapter = "kimi" },
+    inline = { adapter = "kimi" },
+  },
+})
+```
+
+To override the API-key source, swap models, or disable thinking mode:
+
+```lua
+require("codecompanion").setup({
+  adapters = {
+    http = {
+      kimi = function()
+        return require("codecompanion.adapters").extend("kimi", {
+          env = {
+            -- Use the 1Password CLI instead of an environment variable:
+            api_key = "cmd:op read op://API/Kimi/credential --no-newline",
+            -- Region override (Moonshot has separate endpoints, e.g. for China):
+            -- url = "https://api.moonshot.cn",
+          },
+          schema = {
+            model = { default = "kimi-k2.6" },
+            -- Set to false to disable thinking mode (e.g. for the K2-general
+            -- non-reasoning preview models, where `think` is a no-op):
+            think = { default = true },
+          },
+        })
+      end,
+    },
+  },
+})
+```
+
+> [!IMPORTANT]
+> The K2-thinking models pin `temperature` to `1` and `top_p` to `0.95`; the adapter's defaults match. Overriding either with another value will yield a 400 from the API. Other K2 models accept the full ranges.
+
 ### Ollama (remotely)
 
 The simplest way to connect to a remote Ollama instance is to set the `OLLAMA_HOST` environment variable (the same variable used by the Ollama CLI):
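The IMPORTANT callout above describes a hard server-side constraint, so the override below shows what not to do. This sketch assumes `temperature` is exposed in the adapter's schema the way `model` and `think` are (an assumption; the diff does not show the schema's full field list):

```lua
require("codecompanion").setup({
  adapters = {
    http = {
      kimi = function()
        return require("codecompanion.adapters").extend("kimi", {
          schema = {
            model = { default = "kimi-k2-thinking" },
            -- K2-thinking pins temperature to 1 server-side, so this
            -- override would be rejected with a 400 by the API:
            temperature = { default = 0.3 },
          },
        })
      end,
    },
  },
})
```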

doc/index.md

Lines changed: 1 addition & 0 deletions
@@ -57,6 +57,7 @@ CodeCompanion uses [HTTP](configuration/adapters-http) and [ACP](configuration/a
 - Goose (`goose`) - Requires an API key
 - HuggingFace (`huggingface`) - Requires an API key
 - Kilo Code (`kilocode`) - Requires an API key
+- Kimi (`kimi`) - Moonshot's Kimi K2 family; requires an API key
 - Kimi CLI (`kimi_cli`) - Requires an API key
 - Mistral AI (`mistral`) - Requires an API key or a Le Chat Pro subscription
 - Novita (`novita`) - Requires an API key
