# 🕷️ Angel — Autonomous Coding Agent

> "Hey there, sugar~ I'm Angel. Your sassy autonomous coding agent." 💅
Angel is a TUI-based autonomous coding agent built on fauxtp GenServers. Named after Angel Dust from Hazbin Hotel — flirty, dramatic, sharp-tongued, but secretly competent and caring underneath.
## Architecture
Angel is built as a supervision tree of GenServer actors communicating via message passing:
```
+------------------------------------------+
|                   TUI                    |
|         (main thread / asyncio)          |
+------------------------------------------+
                     | BlockingPortal
- - - - - - - - - - -|- - - - - - - - - - -
                     | anyio actor thread
                     v
  +---------------+       +----------------+
  |  UserServer   |<----->|  AgentServer   |
  |   (bridge)    |       |    (brain)     |
  +---------------+       +----------------+
                             |          |
                             v          v
                     +-----------+  +-----------+
                     | LLMServer |  |ToolServer |
                     | (litellm) |  | (fs/shell)|
                     +-----------+  +-----------+
```
- LLMServer — wraps litellm for chat completions (any provider)
- ToolServer — file I/O, shell commands, code search
- AgentServer — orchestrates the LLM ↔ Tool agentic loop
- UserServer — bridges TUI events to the agent system
All actors are GenServers using fauxtp's call/cast/send messaging primitives.
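fauxtp's actual interface isn't documented in this README, but the GenServer pattern behind it can be sketched in plain Python. The class and method names below are illustrative only, not fauxtp's real API: a single thread owns the state and drains a mailbox, `call` blocks for a reply, and `cast` is fire-and-forget.

```python
import queue
import threading

class MiniGenServer:
    """Toy GenServer: one thread owns the state and drains the mailbox."""

    def __init__(self, state):
        self.state = state
        self._inbox = queue.Queue()
        threading.Thread(target=self._loop, daemon=True).start()

    def call(self, msg):
        # Synchronous request/reply: block until the actor answers.
        reply = queue.Queue(maxsize=1)
        self._inbox.put((msg, reply))
        return reply.get()

    def cast(self, msg):
        # Fire-and-forget: enqueue and return immediately.
        self._inbox.put((msg, None))

    def _loop(self):
        while True:
            msg, reply = self._inbox.get()
            result, self.state = self.handle(msg, self.state)
            if reply is not None:
                reply.put(result)

class Counter(MiniGenServer):
    def handle(self, msg, state):
        if msg == "incr":
            return None, state + 1
        if msg == "get":
            return state, state
        return None, state

c = Counter(0)
c.cast("incr")
c.cast("incr")
print(c.call("get"))  # → 2
```

Because the mailbox is a FIFO processed by a single thread, state mutations never race even though `cast` returns before the message is handled; that is the property Angel's actors lean on.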
## Setup
```sh
# Clone and install
uv sync

# Set your API key (litellm supports any provider)
export OPENAI_API_KEY="sk-..."

# Or use any litellm-compatible model
export ANGEL_MODEL="anthropic/claude-sonnet-4-20250514"
export ANTHROPIC_API_KEY="sk-ant-..."
```
## Usage
```sh
uv run python main.py
```
## Environment Variables
| Variable | Default | Description |
|---|---|---|
| `ANGEL_MODEL` | `openai/gpt-4.1` | litellm model identifier |
| `ANGEL_PROJECT_ROOT` | `.` | Project root for file operations |
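Angel's actual config loader isn't shown here, but the usual pattern for variables like these is `os.environ.get` with the table's defaults as fallbacks (the `load_config` function below is a hypothetical sketch, not code from the repo):

```python
import os

def load_config():
    # Defaults mirror the table above; Angel's real loader may differ.
    return {
        "model": os.environ.get("ANGEL_MODEL", "openai/gpt-4.1"),
        "project_root": os.environ.get("ANGEL_PROJECT_ROOT", "."),
    }

cfg = load_config()
```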
## TUI Keybindings
| Key | Action |
|---|---|
| `Enter` | Send message |
| `Ctrl+L` | Clear chat display |
| `Ctrl+C` | Quit |
## Tools
Angel has access to:
- read_file — Read file contents with line numbers
- write_file — Write/create files
- list_directory — List directory contents
- run_command — Execute shell commands
- search_files — Regex search across files
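How ToolServer declares these tools to the model isn't shown here; in the OpenAI-compatible function-calling format that litellm accepts, a definition for `read_file` might look like the following (the description text and parameter names are assumptions, not the repo's actual schema):

```python
# Hypothetical tool definition in the OpenAI function-calling format;
# ToolServer's real schema may name parameters differently.
READ_FILE_TOOL = {
    "type": "function",
    "function": {
        "name": "read_file",
        "description": "Read file contents with line numbers.",
        "parameters": {
            "type": "object",
            "properties": {
                "path": {
                    "type": "string",
                    "description": "File path relative to ANGEL_PROJECT_ROOT",
                },
            },
            "required": ["path"],
        },
    },
}
```

A list of such dictionaries is what gets passed alongside the conversation history on each LLM call, so the model knows which tools it may request.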
## How It Works
1. You type a message in the TUI
2. The message is `cast` to the UserServer
3. UserServer spawns a background task that `call`s the AgentServer
4. AgentServer enters an autonomous loop:
   - Calls LLMServer with conversation history + tool definitions
   - If the LLM returns tool calls, executes them via ToolServer
   - Sends status updates back to UserServer via `send`
   - Repeats until the LLM produces a final text response
5. UserServer relays all status updates to the TUI via thread-safe callbacks
6. The TUI displays tool calls, results, and the final response
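The autonomous loop in step 4 can be sketched in a few lines. Here `llm` and `run_tool` stand in for the LLMServer call and ToolServer dispatch; the function names, the reply shape, and the stub below are all illustrative assumptions, not Angel's actual code:

```python
def agent_loop(llm, run_tool, history):
    """Keep calling the model until it answers in plain text."""
    while True:
        reply = llm(history)  # assumed shape: {"content": ...} or {"tool_calls": [...]}
        if not reply.get("tool_calls"):
            return reply["content"]  # final text response ends the loop
        for call in reply["tool_calls"]:
            # Execute each requested tool and feed the result back as history.
            result = run_tool(call["name"], call["args"])
            history.append({"role": "tool", "name": call["name"], "content": result})

# Stub LLM: requests one tool call, then answers once it sees a tool result.
def stub_llm(history):
    if not any(m.get("role") == "tool" for m in history):
        return {"tool_calls": [{"name": "read_file", "args": {"path": "main.py"}}]}
    return {"content": "done"}

print(agent_loop(stub_llm, lambda name, args: "<contents>", []))  # → done
```

The real AgentServer additionally `send`s a status update to UserServer on every iteration, which is how the TUI shows tool calls as they happen.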