
# 🕷️ Angel — Autonomous Coding Agent

> "Hey there, sugar~ I'm Angel. Your sassy autonomous coding agent." 💅

Angel is a TUI-based autonomous coding agent built on fauxtp GenServers. Named after Angel Dust from Hazbin Hotel — flirty, dramatic, sharp-tongued, but secretly competent and caring underneath.

## Architecture

Angel is built as a supervision tree of GenServer actors communicating via message passing:

```
+------------------------------------------+
|                  TUI                     |
|        (main thread / asyncio)           |
+------------------------------------------+
          |          BlockingPortal
- - - - - | - - - - - - - - - - - - - - - -
          |      anyio actor thread
          v
+---------------+       +----------------+
|  UserServer   |<----->|  AgentServer   |
|   (bridge)    |       |    (brain)     |
+---------------+       +----------------+
                           |          |
                           v          v
                    +-----------+ +-----------+
                    | LLMServer | |ToolServer |
                    | (litellm) | | (fs/shell)|
                    +-----------+ +-----------+
```

- LLMServer — wraps litellm for chat completions (any provider)
- ToolServer — file I/O, shell commands, code search
- AgentServer — orchestrates the LLM ↔ Tool agentic loop
- UserServer — bridges TUI events to the agent system

All actors are GenServers using fauxtp's call/cast/send messaging primitives.
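The call/cast distinction can be illustrated with a minimal asyncio actor. This is a sketch of the pattern only, not fauxtp's actual API: `EchoServer`, its mailbox, and the shutdown sentinel are invented for illustration.

```python
import asyncio

# Minimal GenServer-style actor sketch (NOT fauxtp's real API).
# Each actor owns a mailbox queue; "call" awaits a reply future
# (synchronous request/response), while "cast" is fire-and-forget.
class EchoServer:
    def __init__(self):
        self.mailbox: asyncio.Queue = asyncio.Queue()

    async def run(self):
        while True:
            msg, reply = await self.mailbox.get()
            if msg is None:          # hypothetical shutdown sentinel
                break
            if reply is not None:    # a "call": resolve the caller's future
                reply.set_result(f"echo: {msg}")

    async def call(self, msg):
        # block until the actor replies
        reply = asyncio.get_running_loop().create_future()
        await self.mailbox.put((msg, reply))
        return await reply

    async def cast(self, msg):
        # fire-and-forget: no reply future
        await self.mailbox.put((msg, None))

async def main():
    server = EchoServer()
    task = asyncio.create_task(server.run())
    result = await server.call("hi")
    await server.cast(None)  # stop the actor loop
    await task
    return result

print(asyncio.run(main()))  # → echo: hi
```

The same request/reply shape is what lets the AgentServer block on an LLM response while casts and sends stream status updates without blocking.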

## Setup

```sh
# Clone and install
uv sync

# Set your API key (litellm supports any provider)
export OPENAI_API_KEY="sk-..."

# Or use any litellm-compatible model
export ANGEL_MODEL="anthropic/claude-sonnet-4-20250514"
export ANTHROPIC_API_KEY="sk-ant-..."
```

## Usage

```sh
uv run python main.py
```

## Environment Variables

| Variable | Default | Description |
| --- | --- | --- |
| `ANGEL_MODEL` | `openai/gpt-4.1` | litellm model identifier |
| `ANGEL_PROJECT_ROOT` | `.` | Project root for file operations |
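A quick sketch of how code typically consumes variables like these, with the defaults from the table above; the variable names come from the README, but the lookup code itself is illustrative, not copied from Angel's source.

```python
import os

# Illustrative config lookup (not Angel's actual code):
# fall back to the documented defaults when the variables are unset.
MODEL = os.environ.get("ANGEL_MODEL", "openai/gpt-4.1")
PROJECT_ROOT = os.environ.get("ANGEL_PROJECT_ROOT", ".")
```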

## TUI Keybindings

| Key | Action |
| --- | --- |
| Enter | Send message |
| Ctrl+L | Clear chat display |
| Ctrl+C | Quit |

## Tools

Angel has access to:

- read_file — Read file contents with line numbers
- write_file — Write/create files
- list_directory — List directory contents
- run_command — Execute shell commands
- search_files — Regex search across files
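For a sense of how such tools are exposed to the model: litellm forwards tool definitions in the OpenAI function-calling schema, so a tool like `read_file` might be described roughly as below. The field contents here are illustrative, not copied from Angel's source.

```python
# Hypothetical tool definition in OpenAI function-calling format,
# the schema style litellm passes through to any provider.
READ_FILE_TOOL = {
    "type": "function",
    "function": {
        "name": "read_file",
        "description": "Read file contents with line numbers",
        "parameters": {
            "type": "object",
            "properties": {
                # parameter name is an assumption for this sketch
                "path": {
                    "type": "string",
                    "description": "File path relative to the project root",
                },
            },
            "required": ["path"],
        },
    },
}
```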

## How It Works

1. You type a message in the TUI
2. The message is cast to the UserServer
3. UserServer spawns a background task that calls the AgentServer
4. AgentServer enters an autonomous loop:
   - Calls LLMServer with conversation history + tool definitions
   - If the LLM returns tool calls, executes them via ToolServer
   - Sends status updates back to UserServer via send
   - Repeats until the LLM produces a final text response
5. UserServer relays all status updates to the TUI via thread-safe callbacks
6. The TUI displays tool calls, results, and the final response
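The autonomous loop in step 4 can be sketched with the LLM stubbed out so the example runs offline. `agent_loop`, `stub_llm`, and `run_tool` are illustrative names standing in for AgentServer, LLMServer, and ToolServer; they are not Angel's actual functions.

```python
# Offline sketch of the agentic loop: call the LLM, execute any tool
# calls it requests, feed results back, repeat until it answers in text.

def stub_llm(history):
    # Stand-in for LLMServer: first turn requests a tool call,
    # second turn (after seeing a tool result) gives the final answer.
    if not any(m["role"] == "tool" for m in history):
        return {"tool_calls": [{"name": "run_command", "args": {"cmd": "ls"}}]}
    return {"content": "Done, sugar~ the directory has 2 files."}

def run_tool(call):
    # Stand-in for ToolServer: pretend to execute and return a result string.
    return f"ran {call['name']} with {call['args']}"

def agent_loop(user_message, llm, tools):
    history = [{"role": "user", "content": user_message}]
    while True:
        response = llm(history)
        if "tool_calls" not in response:
            return response["content"]          # final text response
        for call in response["tool_calls"]:     # execute requested tools
            history.append({"role": "tool", "content": tools(call)})

print(agent_loop("list the files", stub_llm, run_tool))
```

In the real system each of these calls crosses an actor boundary (call/cast messages), which is what lets status updates stream to the TUI while the loop runs.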