# CLAUDE.md

This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.

## Project Overview

`localcode` — a single CLI for managing a fully offline local AI coding environment on macOS Apple Silicon. Uses Ollama to serve Qwen 2.5 Coder models via OpenAI-compatible APIs, with switchable terminal coding agents (Aider, OpenCode, Pi).

## Commands

- `npm run dev -- <args>` — Run via tsx (development)
- `npm run build` — Compile TypeScript to `dist/`
- `npx tsc --noEmit` — Type-check without emitting

After `localcode setup`, the `localcode` binary is available in `~/.local/bin/`.

## CLI

```
localcode                        Launch active TUI in current directory
localcode status                 Show current config + server health
localcode start                  Start Ollama + pull models
localcode stop                   Stop Ollama
localcode model                  List available models
localcode set model <id>         Switch the chat model
localcode set autocomplete <id>  Switch the autocomplete model
localcode tui                    List available TUIs
localcode set tui <id>           Switch the active TUI
localcode bench                  Benchmark running chat model
localcode bench history          Show past benchmark results
localcode pipe "prompt"          Pipe stdin through the model
localcode setup                  Full install
```

## Architecture

```
src/
  main.ts            — CLI dispatcher (switch on process.argv[2])
  config.ts          — Ollama URL/port constants, TUI config paths
  log.ts             — log/warn/err with ANSI colors
  util.ts            — Shell exec helpers, file writers
  runtime-config.ts  — Read/write ~/.config/localcode/config.json
  registry/
    models.ts        — ModelDef interface + MODELS array (Ollama tags)
    tuis.ts          — TuiDef interface + TUIS array
  commands/
    run.ts           — Default action: ensure Ollama, init git, exec TUI
    status.ts        — Show config + Ollama health
    server.ts        — Start/stop Ollama, pull models
    setup.ts         — Full install pipeline
    models.ts        — List/switch models, auto-pull + regen configs
    tuis.ts          — List/switch TUIs, auto-install + regen configs
    bench.ts         — Benchmark against running Ollama
    pipe.ts          — Pipe stdin through the model
  steps/             — Individual setup phases (preflight, homebrew, ollama, etc.)
  templates/
    scripts.ts       — localcode wrapper script
    aider.ts         — Aider config template
    opencode.ts      — OpenCode config template
    pi.ts            — Pi models.json + settings.json templates
```
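
The dispatch style in `main.ts` can be sketched as follows. The command-to-module mapping below is inferred from the tree above, not taken verbatim from the source:

```typescript
// Illustrative sketch of the main.ts dispatch pattern: switch on process.argv[2].
// The mapping of commands to modules is an assumption based on the file tree.
function dispatch(cmd: string | undefined): string {
  switch (cmd) {
    case undefined: return "run";    // bare `localcode` launches the active TUI
    case "status":  return "status";
    case "start":
    case "stop":    return "server";
    case "model":   return "models";
    case "tui":     return "tuis";
    case "bench":   return "bench";
    case "pipe":    return "pipe";
    case "setup":   return "setup";
    default:        return "help";   // hypothetical fallback for unknown commands
  }
}
```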

### Key patterns

**Ollama backend**: A single Ollama server on port 11434 serves all models. Models are identified by Ollama tags (e.g., `qwen2.5-coder:32b`). There are no separate chat/autocomplete server processes — Ollama loads/unloads models on demand.

**Runtime config** (`~/.config/localcode/config.json`): Stores the active chatModel, autocompleteModel, and tui IDs. Read by `runtime-config.ts`, which falls back to defaults for missing values.
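
The read-with-defaults pattern can be sketched like this. The default values and exact field shapes are assumptions for illustration; the real ones live in `runtime-config.ts`:

```typescript
import { readFileSync } from "fs";

interface RuntimeConfig {
  chatModel: string;
  autocompleteModel: string;
  tui: string;
}

// Hypothetical defaults, chosen only to make the sketch concrete.
const DEFAULTS: RuntimeConfig = {
  chatModel: "qwen2.5-coder:32b",
  autocompleteModel: "qwen2.5-coder:1.5b",
  tui: "aider",
};

function loadConfig(path: string): RuntimeConfig {
  try {
    // Spread the file over the defaults so missing keys still resolve.
    return { ...DEFAULTS, ...JSON.parse(readFileSync(path, "utf8")) };
  } catch {
    return { ...DEFAULTS }; // file missing or unparsable: fall back entirely
  }
}
```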

**Registries**: `registry/models.ts` and `registry/tuis.ts` define the available options as typed arrays. Add new models/TUIs by appending to these arrays. Each model has an `ollamaTag` field holding its Ollama model identifier.
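
A minimal sketch of the registry shape. Only `ollamaTag` is confirmed by this file; the other fields and the `findModel` helper are assumptions:

```typescript
// Sketch of registry/models.ts; fields other than ollamaTag are hypothetical.
interface ModelDef {
  id: string;        // localcode-facing identifier (assumed)
  ollamaTag: string; // tag used for `ollama pull` and API requests
  name: string;      // human-readable label (assumed)
}

const MODELS: ModelDef[] = [
  { id: "qwen-32b", ollamaTag: "qwen2.5-coder:32b", name: "Qwen 2.5 Coder 32B" },
  // Adding a model means appending another entry here.
];

// Hypothetical lookup helper.
function findModel(id: string): ModelDef | undefined {
  return MODELS.find((m) => m.id === id);
}
```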

**Config regeneration**: When models or the TUI are switched, TUI configs are regenerated automatically.

**Generated scripts**: Only one bash script is generated in `~/.local/bin/`: `localcode`, a thin wrapper that calls `node dist/main.js`. All other functionality lives in TypeScript commands.
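
The template in `templates/scripts.ts` presumably boils down to something like the following; the function name and parameter are assumptions, and only the `node dist/main.js` call is stated in this file:

```typescript
// Hypothetical sketch of the wrapper-script template: a bash shim that
// forwards all arguments to the compiled CLI entry point.
function wrapperScript(repoDir: string): string {
  return [
    "#!/bin/bash",
    `exec node "${repoDir}/dist/main.js" "$@"`,
    "",
  ].join("\n");
}
```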

**Benchmark**: Hits Ollama's `/v1/chat/completions` with 3 hardcoded prompts, measuring wall-clock time and token counts. Results are saved to `~/.config/localcode/benchmarks.json`.
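
One bench iteration can be sketched as below: time a request against the OpenAI-compatible endpoint and derive tokens/second from the response's `usage` field. The helper names and the exact metric are illustrative, not the actual `bench.ts` implementation:

```typescript
// Derive tokens/second from a completion's token count and elapsed wall-clock time.
function tokensPerSecond(completionTokens: number, elapsedMs: number): number {
  return completionTokens / (elapsedMs / 1000);
}

// Hypothetical single bench run against the local Ollama server on port 11434.
async function benchOnce(prompt: string, model: string): Promise<number> {
  const t0 = Date.now();
  const res = await fetch("http://localhost:11434/v1/chat/completions", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model, messages: [{ role: "user", content: prompt }] }),
  });
  const json = await res.json();
  // OpenAI-compatible responses report completion_tokens under `usage`.
  return tokensPerSecond(json.usage.completion_tokens, Date.now() - t0);
}
```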

## Key paths on the user's system

- `~/.local/bin/localcode` — CLI wrapper script
- `~/.config/localcode/config.json` — Active model/TUI selection
- `~/.config/localcode/benchmarks.json` — Benchmark history
- `~/.aider/` — Aider config
- `~/.config/opencode/opencode.json` — OpenCode config
- `~/.pi/agent/models.json` — Pi config
- `~/.pi/agent/settings.json` — Pi settings (packages)
- Ollama port: **11434**

## Important: after changing TypeScript

The `localcode` wrapper in `~/.local/bin/` calls `node dist/main.js`. After modifying TypeScript source, run `npm run build` to recompile, or the wrapper will run stale code.

## Dead files to clean up

- `src/commands/proxy.ts` — Was the llama.cpp tool-call rewriting proxy, now unused (Ollama handles tool calling natively)
- `templates/qwen-tool-call.jinja` — Was the Qwen tool-use Jinja template for llama.cpp, now unused