Commit 8115794

docs: Add local installation instructions
This release introduces detailed instructions for building and installing the LLM Commit Message VS Code extension locally.

- Provides prerequisites for development and usage.
- Includes step-by-step instructions for building the extension using npm.
- Offers guidelines for running the extension locally.
1 parent: f019841

5 files changed: 64 additions & 23 deletions

README.md (11 additions & 3 deletions)

```diff
@@ -6,6 +6,10 @@
 
 This extensions adds a button to the Git extension sidebar - clicking it sends a Git diff to a local LLM (default Ollama) and uses a generated commit message in the box.
 
+## Note
+
+This extension is not available in the extensions library. You can build and install it locally though.
+
 ## Benefits
 
 - Free (does not require ChatGPT account and subscription)
@@ -14,11 +18,15 @@ This extensions adds a button to the Git extension sidebar - clicking it sends a
 
 ## Documentation
 
-See [docs](/docs/) for more details.
+[Docs website](https://michaelcurrin.github.io/llm-commit-msg-vs-code/)
 
 ## Sample
 
-![SCM action button](./sample.png)
+<div align="center">
+    <img src="https://github.com/MichaelCurrin/llm-commit-msg-vs-code/raw/docs/sample.png"
+        alt="sample screenshot"
+        width="300" />
+</div>
 
 ## Related projects
 
@@ -27,4 +35,4 @@ See [docs](/docs/) for more details.
 
 ## License
 
-Licensed under [MIT](/LICENSE).
+Released under [MIT](/LICENSE) by [@MichaelCurrin](https://github.com/MichaelCurrin).
```
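For context on what the button in the README does, a rough sketch of the equivalent manual call is below. It assumes Ollama's default OpenAI-compatible endpoint and the `gemma3` model from the docs; the prompt wording and payload shape are illustrative, not the extension's actual implementation.

```bash
# Sketch of the kind of request the extension makes: staged diff in,
# commit message suggestion out. Prompt text here is an assumption.
DIFF=$(git diff --staged)

jq -n --arg diff "$DIFF" '{
  model: "gemma3",
  messages: [{role: "user", content: ("Write a concise Git commit message for this diff:\n\n" + $diff)}]
}' | curl -s http://localhost:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d @-
```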

docs/README.md (3 additions & 3 deletions)

```diff
@@ -2,9 +2,9 @@ LLM Commit Message – Documentation
 
 This VS Code extension generates high‑quality Git commit messages using a local LLM exposed via an OpenAI‑compatible API (e.g. Ollama).
 
-- [Installation](docs/installation.md)
-- [Usage](docs/usage.md)
-
+- [Installation](installation.md)
+- [Usage](usage.md)
+- [Development](development/)
 
 ## Quickstart
 
```
docs/development/README.md (4 additions & 0 deletions)

```diff
@@ -0,0 +1,4 @@
+# Development
+
+- [Back to user docs](../)
+- [Installation](installation.md)
```

docs/development/installation.md (27 additions & 0 deletions)

```diff
@@ -0,0 +1,27 @@
+# Installation
+
+# Prerequisites
+
+- **VS Code** `^1.104.0`.
+- **Git** installed and a Git repository opened in VS Code.
+- **Ollama** (or another OpenAI‑compatible server) running and reachable at `http://localhost:11434/v1`.
+
+## Ollama setup (example)
+
+- Install Ollama and ensure the OpenAI‑compatible endpoint is running.
+- Pull or run a model that matches your settings, e.g. `gemma3`:
+  - `ollama pull gemma3`
+  - `ollama run gemma3`
+
+## Configure the extension
+
+- Open VS Code Settings and search for `LLM Commit Message`.
+- Set:
+  - `llmCommitMsg.endpoint` (default `http://localhost:11434/v1`).
+  - `llmCommitMsg.model` (default `gemma3`).
+
+Install from source (development)
+
+- Clone this repository.
+- Open it in VS Code and press `F5` to launch an Extension Development Host.
+- In the Extension Development Host, open a Git repository and use the command from the Source Control view (see Usage).
```
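Before pressing `F5`, it can help to confirm the model server is actually reachable. Both checks below go through Ollama's OpenAI-compatible API, assuming the default endpoint and model names from the settings above.

```bash
# List the models the endpoint exposes; gemma3 should appear if the pull worked.
curl -s http://localhost:11434/v1/models

# Smoke-test a completion against the configured model.
curl -s http://localhost:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "gemma3", "messages": [{"role": "user", "content": "Say OK"}]}'
```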

docs/installation.md (19 additions & 17 deletions)

````diff
@@ -1,27 +1,29 @@
 # Installation
+> Guide for end users to build and install the extension locally.
 
-# Prerequisites
+## Prerequisites
 
 - **VS Code** `^1.104.0`.
-- **Git** installed and a Git repository opened in VS Code.
-- **Ollama** (or another OpenAI‑compatible server) running and reachable at `http://localhost:11434/v1`.
+- **Node.js** and **npm**.
+- Optional: **Ollama** (or any OpenAI‑compatible server) if you plan to use the extension.
 
-## Ollama setup (example)
+## Steps
 
-- Install Ollama and ensure the OpenAI‑compatible endpoint is running.
-- Pull or run a model that matches your settings, e.g. `gemma3`:
-  - `ollama pull gemma3`
-  - `ollama run gemma3`
+Clone the repo.
 
-## Configure the extension
+Install dependencies:
 
-- Open VS Code Settings and search for `LLM Commit Message`.
-- Set:
-  - `llmCommitMsg.endpoint` (default `http://localhost:11434/v1`).
-  - `llmCommitMsg.model` (default `gemma3`).
+```bash
+npm ci
+```
 
-Install from source (development)
+Package the extension (creates a `.vsix` in the current directory):
 
-- Clone this repository.
-- Open it in VS Code and press `F5` to launch an Extension Development Host.
-- In the Extension Development Host, open a Git repository and use the command from the Source Control view (see Usage).
+```bash
+npm run ext
+```
+
+## Run for development (no install)
+
+- Open the repo in VS Code and press `F5` to launch an Extension Development Host.
+- Use the command palette or Source Control view to trigger the extension command.
````
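Once packaging succeeds, the generated `.vsix` can be installed into a regular VS Code instance with the standard CLI. A minimal sketch, assuming a hypothetical output filename (the actual name comes from `package.json`):

```bash
# Install the .vsix that `npm run ext` produced (filename is illustrative).
code --install-extension llm-commit-msg-0.1.0.vsix

# Verify it shows up.
code --list-extensions | grep -i commit
```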
