+1
.gitignore
+1
.gitignore
+7
-2
CLAUDE.md
+7
-2
CLAUDE.md
···
8
- 3.10+ and complete typing (T | None preferred over Optional[T] and list[T] over typing.List[T])
9
- prefer functional over OOP
10
- keep implementation details private and functions pure
11
12
## Project Structure
13
···
25
- `templates.py` - HTML templates
26
27
- `tests/` - Test files
28
-
- `scripts/` - Utility scripts (test_post.py, test_mention.py)
29
-
- `sandbox/` - Documentation and analysis
30
- Reference project analyses
31
- Architecture plans
32
- Implementation notes
33
- `.eggs/` - Cloned reference projects (void, penelope, marvin)
34
35
## Testing
36
- Run bot: `just dev`
···
8
- 3.10+ and complete typing (T | None preferred over Optional[T] and list[T] over typing.List[T])
9
- prefer functional over OOP
10
- keep implementation details private and functions pure
11
+
- never use `pytest.mark.asyncio`, it's unnecessary
12
13
## Project Structure
14
···
26
- `templates.py` - HTML templates
27
28
- `tests/` - Test files
29
+
- `scripts/` - Curated utility scripts that have proven useful
30
+
- `sandbox/` - Proving ground for experiments, analysis, and unproven scripts
31
- Reference project analyses
32
- Architecture plans
33
- Implementation notes
34
+
- Experimental scripts (graduate to scripts/ once proven useful)
35
- `.eggs/` - Cloned reference projects (void, penelope, marvin)
36
+
37
+
## Script Graduation Process
38
+
New scripts start in `sandbox/`, get promoted to `scripts/` once proven useful, and may eventually get `just` commands added if the workflow should be shared with other developers. Not everything graduates - most things stay in sandbox.
39
40
## Testing
41
- Run bot: `just dev`
+145
-150
README.md
+145
-150
README.md
···
1
-
# phi 🧠
2
3
-
a bot inspired by IIT and [Void](https://tangled.sh/@cameron.pfiffer.org/void). Built with `fastapi`, `pydantic-ai`, and `atproto`.
4
-
5
-
## Quick Start
6
7
-
### Prerequisites
8
-
9
-
- `uv`
10
-
- `just`
11
-
- `turbopuffer` (see [turbopuffer](https://github.com/turbopuffer/turbopuffer))
12
-
- `openai` (for embeddings)
13
-
- `anthropic` (for chat completion)
14
-
15
-
Get your bot running in 5 minutes:
16
17
```bash
18
-
# Clone and install
19
git clone https://github.com/zzstoatzz/bot
20
cd bot
21
uv sync
22
23
-
# Configure (copy .env.example and add your credentials)
24
cp .env.example .env
25
26
-
# Run the bot
27
-
just dev
28
```
29
30
-
That's it! Your bot is now listening for mentions.
31
-
32
-
## Configuration
33
-
34
-
Edit `.env` with your credentials:
35
-
- `BLUESKY_HANDLE`: Your bot's Bluesky handle
36
-
- `BLUESKY_PASSWORD`: App password (not your main password!)
37
-
- `ANTHROPIC_API_KEY`: Your Anthropic key for AI responses
38
-
- `TURBOPUFFER_API_KEY`: Your TurboPuffer key for memory storage
39
-
- `OPENAI_API_KEY`: Your OpenAI key for embeddings (memory system)
40
-
- `BOT_NAME`: Your bot's name (default: "Bot")
41
-
- `PERSONALITY_FILE`: Path to personality markdown file (default: "personalities/phi.md")
42
43
-
## Current Features
44
45
-
- ✅ Responds to mentions with AI-powered messages
46
-
- ✅ Proper notification handling (no duplicates)
47
-
- ✅ Graceful shutdown for hot-reload
48
-
- ✅ AI integration with Anthropic Claude
49
-
- ✅ Thread-aware responses with full conversation context
50
-
- ✅ Status page at `/status` showing activity and health
51
-
- ✅ Web search capability (Google Custom Search API)
52
-
- ✅ Content moderation with philosophical responses
53
-
- ✅ Namespace-based memory system with TurboPuffer
54
-
- ✅ Online/offline status in bio
55
-
- ✅ Self-modification with operator approval system
56
-
- ✅ Context visualization at `/context`
57
-
- ✅ Semantic search in user memories
58
59
-
## Architecture
60
61
-
- **FastAPI** web framework with async support
62
-
- **pydantic-ai** for LLM agent management
63
-
- **TurboPuffer** for scalable vector memory
64
-
- **AT Protocol** for Bluesky integration
65
-
- **SQLite** for thread context storage
66
67
-
## Development
68
69
```bash
70
-
just # Show available commands
71
-
just dev # Run with hot-reload
72
-
just check # Run linting, type checking, and tests
73
-
just fmt # Format code
74
-
just lint # Run ruff linter
75
-
just typecheck # Run ty type checker
76
-
just test # Run test suite
77
-
78
-
# Bot testing utilities
79
-
just test-post # Test posting to Bluesky
80
-
just test-mention # Test mention handling
81
-
just test-search # Test web search
82
-
just test-thread # Test thread context
83
-
just test-dm # Test DM functionality
84
-
85
-
# Memory management
86
-
just memory-init # Initialize core memories
87
-
just memory-check # View current memory state
88
-
just memory-migrate # Migrate memories
89
```
90
91
-
### Web Interface
92
93
-
**Status Page** (http://localhost:8000/status)
94
-
- Current bot status and uptime
95
-
- Mentions received and responses sent
96
-
- AI mode (enabled/placeholder)
97
-
- Last activity timestamps
98
-
- Error count
99
100
-
**Context Visualization** (http://localhost:8000/context)
101
-
- View all context components that flow into responses
102
-
- Inspect personality, memories, thread context
103
-
- Debug why the bot responded a certain way
104
105
-
## Personality System
106
107
-
The bot's personality is defined in markdown files in the `personalities/` directory. This allows for rich, detailed personality definitions that shape how the bot communicates.
108
109
-
- See `personalities/phi.md` for an example exploring consciousness
110
-
- See `personalities/default.md` for a simple assistant
111
-
- Create your own by adding a `.md` file and setting `PERSONALITY_FILE` in `.env`
112
113
-
## Tools & Capabilities
114
115
-
### Web Search
116
-
The bot can search the web when configured with Google Custom Search API credentials. Add to `.env`:
117
-
- `GOOGLE_API_KEY`: Your Google API key
118
-
- `GOOGLE_SEARCH_ENGINE_ID`: Your custom search engine ID
119
120
-
### Content Moderation
121
-
Built-in moderation filters:
122
-
- Spam detection (excessive caps, repetition, promotional content)
123
-
- Harassment and hate speech filtering
124
-
- Violence and threatening content detection
125
-
- Consistent philosophical responses to moderated content
126
127
-
## Memory System
128
129
-
The bot uses a namespace-based memory architecture with TurboPuffer:
130
131
-
- **Core Memory** (`phi-core`): Personality, guidelines, and capabilities loaded from personality files
132
-
- **User Memory** (`phi-users-{handle}`): Per-user conversation history and facts
133
134
-
Key features:
135
-
- Vector embeddings using OpenAI's text-embedding-3-small
136
-
- Automatic context assembly for conversations
137
-
- Character limits to prevent token overflow
138
-
- User isolation through separate namespaces
139
140
-
See `docs/memory-architecture.md` for detailed documentation.
141
142
-
## Troubleshooting
143
144
-
**Bot gives placeholder responses?**
145
-
- Check your `ANTHROPIC_API_KEY` is set correctly
146
-
- Restart the bot after changing `.env`
147
148
-
**Not seeing mentions?**
149
-
- Verify your `BLUESKY_HANDLE` and `BLUESKY_PASSWORD`
150
-
- Make sure you're using an app password, not your main password
151
152
-
## Project Structure
153
154
-
```
155
-
bot/
156
-
โโโ src/bot/ # Main application code
157
-
โ โโโ agents/ # AI agent implementations
158
-
โ โโโ core/ # AT Protocol client and profile management
159
-
โ โโโ memory/ # TurboPuffer namespace memory system
160
-
โ โโโ services/ # Notification polling and message handling
161
-
โ โโโ tools/ # Google search tool
162
-
โ โโโ main.py # FastAPI application entry
163
-
โโโ scripts/ # Utility scripts
164
-
โ โโโ test_bot.py # Unified testing script (post, mention, search, thread)
165
-
โ โโโ manage_memory.py # Memory management (init, check, migrate)
166
-
โโโ personalities/ # Bot personality definitions
167
-
โโโ docs/ # Architecture documentation
168
-
โโโ sandbox/ # Reference project analysis
169
-
โโโ tests/ # Test suite
170
-
```
171
172
-
## Self-Modification System
173
174
-
Phi can evolve its personality with built-in safety boundaries:
175
176
-
- **Free Evolution**: Interests and current state update automatically
177
-
- **Guided Evolution**: Communication style changes need validation
178
-
- **Operator Approval**: Core identity and boundaries require explicit approval via DM
179
-
180
-
The bot will notify its operator (@alternatebuild.dev) when approval is needed.
181
182
-
## Type Checking
183
184
-
This project uses [ty](https://github.com/astral-sh/ty), an extremely fast Rust-based type checker:
185
186
-
```bash
187
-
just typecheck # Type check all code
188
-
uv run ty check src/ # Check specific directories
189
-
```
190
191
-
## Reference Projects
192
193
-
Inspired by:
194
-
- [Void](https://tangled.sh/@cameron.pfiffer.org/void.git) - Letta/MemGPT architecture
195
-
- [Penelope](https://github.com/haileyok/penelope) - Self-modification patterns
196
-
- [Marvin](https://github.com/PrefectHQ/marvin) - pydantic-ai patterns
197
198
-
Reference implementations are cloned to `.eggs/` for learning.
···
1
+
# phi
2
3
+
a bluesky bot inspired by [integrated information theory](https://en.wikipedia.org/wiki/Integrated_information_theory). built with `pydantic-ai`, `mcp`, and the [at protocol](https://atproto.com).
4
5
+
## quick start
6
7
```bash
8
+
# clone and install
9
git clone https://github.com/zzstoatzz/bot
10
cd bot
11
uv sync
12
13
+
# configure
14
cp .env.example .env
15
+
# edit .env with your credentials
16
17
+
# run
18
+
just run
19
```
20
21
+
**required env vars:**
22
+
- `BLUESKY_HANDLE` / `BLUESKY_PASSWORD` - bot account (use app password)
23
+
- `ANTHROPIC_API_KEY` - for agent responses
24
25
+
**optional (for episodic memory):**
26
+
- `TURBOPUFFER_API_KEY` + `OPENAI_API_KEY` - semantic memory
27
28
+
## features
29
30
+
- ✅ responds to mentions with ai-powered messages
31
+
- ✅ episodic memory with semantic search (turbopuffer)
32
+
- ✅ thread-aware conversations (fetches from network, not cached)
33
+
- ✅ mcp-enabled (atproto tools via stdio)
34
+
- ✅ session persistence (no rate limit issues)
35
+
- ✅ behavioral test suite with llm-as-judge
36
37
+
**→ [read the docs](docs/)** for a deeper dive into design and implementation
38
39
+
## development
40
41
```bash
42
+
just run # run bot
43
+
just dev # run with hot-reload
44
+
just evals # run behavioral tests
45
+
just check # lint + typecheck + test
46
+
just fmt # format code
47
```
48
49
+
<details>
50
+
<summary>architecture</summary>
51
52
+
phi is an **mcp-enabled agent** with **episodic memory**:
53
54
+
```
55
+
┌─────────────────────────────────────┐
56
+
│        Notification Arrives         │
57
+
└──────────────────┬──────────────────┘
58
+
                   │
59
+
┌─────────────────────────────────────┐
60
+
│        PhiAgent (PydanticAI)        │
61
+
│ ┌─────────────────────────────────┐ │
62
+
│ │ System Prompt: personality.md   │ │
63
+
│ └─────────────────────────────────┘ │
64
+
│                  ↓                  │
65
+
│ ┌─────────────────────────────────┐ │
66
+
│ │ Context Building:               │ │
67
+
│ │ • Thread context (ATProto)      │ │
68
+
│ │ • Episodic memory (TurboPuffer) │ │
69
+
│ │   - Semantic search             │ │
70
+
│ │   - User-specific memories      │ │
71
+
│ └─────────────────────────────────┘ │
72
+
│                  ↓                  │
73
+
│ ┌─────────────────────────────────┐ │
74
+
│ │ Tools (MCP):                    │ │
75
+
│ │ • post() - create posts         │ │
76
+
│ │ • like() - like content         │ │
77
+
│ │ • repost() - share content      │ │
78
+
│ │ • follow() - follow users       │ │
79
+
│ └─────────────────────────────────┘ │
80
+
│                  ↓                  │
81
+
│ ┌─────────────────────────────────┐ │
82
+
│ │ Structured Output:              │ │
83
+
│ │ Response(action, text, reason)  │ │
84
+
│ └─────────────────────────────────┘ │
85
+
└─────────────────────────────────────┘
86
+
                   ↓
87
+
┌─────────────────────────────────────┐
88
+
│           MessageHandler            │
89
+
│           Executes action           │
90
+
└─────────────────────────────────────┘
91
+
```
92
93
+
**key components:**
94
95
+
- **pydantic-ai agent** - loads personality, connects to mcp server, manages memory
96
+
- **episodic memory** - turbopuffer for vector storage with semantic search
97
+
- **mcp integration** - external atproto server provides bluesky tools via stdio
98
+
- **session persistence** - tokens saved to `.session`, auto-refresh every ~2h
99
100
+
</details>
101
102
+
<details>
103
+
<summary>episodic memory</summary>
104
105
+
phi uses turbopuffer for episodic memory with semantic search.
106
107
+
**namespaces:**
108
+
- `phi-core` - personality, guidelines
109
+
- `phi-users-{handle}` - per-user conversation history
110
111
+
**how it works:**
112
+
1. retrieves relevant memories using semantic search
113
+
2. embeds using openai's text-embedding-3-small
114
+
3. stores user messages and bot responses
115
+
4. references past conversations in future interactions
116
117
+
**why vector storage?**
118
+
- semantic similarity (can't do this with sql)
119
+
- contextual retrieval based on current conversation
120
+
- enables more natural, context-aware interactions
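the ranking idea behind semantic retrieval can be shown with plain cosine similarity. this is a toy illustration only: the real bot uses openai embeddings and turbopuffer, and the 3-d vectors below are made-up stand-ins.

```python
# minimal in-memory illustration of semantic ranking; real embeddings
# from text-embedding-3-small have ~1,536 dimensions
import math

def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm

# pretend these vectors came from an embedding model
memories = {
    "alice mentioned she loves birds": [0.9, 0.1, 0.0],
    "discussed deployment pipelines": [0.0, 0.2, 0.9],
}

def top_memory(query_vec: list[float]) -> str:
    """return the stored memory whose embedding is closest to the query"""
    return max(memories, key=lambda text: cosine(query_vec, memories[text]))
```

a query embedding near the "birds" region retrieves the bird memory even though the query text never matched it word-for-word - that is what sql-style lookups can't do.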
121
122
+
</details>
123
124
+
<details>
125
+
<summary>project structure</summary>
126
127
+
```
128
+
src/bot/
129
+
├── agent.py              # mcp-enabled agent
130
+
├── config.py             # configuration
131
+
├── database.py           # thread history storage
132
+
├── main.py               # fastapi app
133
+
├── core/
134
+
│   ├── atproto_client.py    # at protocol client (session persistence)
135
+
│   ├── profile_manager.py   # online/offline status
136
+
│   └── rich_text.py         # text formatting
137
+
├── memory/
138
+
│   └── namespace_memory.py  # turbopuffer episodic memory
139
+
└── services/
140
+
    ├── message_handler.py     # agent orchestration
141
+
    └── notification_poller.py # mention polling
142
143
+
evals/ # behavioral tests
144
+
personalities/ # personality definitions
145
+
sandbox/ # docs and analysis
146
+
```
147
148
+
</details>
149
150
+
<details>
151
+
<summary>troubleshooting</summary>
152
153
+
**bot gives no responses?**
154
+
- check `ANTHROPIC_API_KEY` in `.env`
155
+
- restart after changing `.env`
156
157
+
**not seeing mentions?**
158
+
- verify `BLUESKY_HANDLE` and `BLUESKY_PASSWORD`
159
+
- use app password, not main password
160
161
+
**no episodic memory?**
162
+
- check both `TURBOPUFFER_API_KEY` and `OPENAI_API_KEY` are set
163
+
- watch logs for "💾 episodic memory enabled"
164
165
+
**hit bluesky rate limit?**
166
+
- phi uses session persistence to avoid this
167
+
- first run: creates `.session` file with tokens
168
+
- subsequent runs: reuses tokens (no api call)
169
+
- tokens auto-refresh every ~2h
170
+
- only re-authenticates after ~2 months
171
+
- rate limits (10/day per ip, 300/day per account) shouldn't be an issue
172
173
+
</details>
174
175
+
<details>
176
+
<summary>refactor notes</summary>
177
178
+
see `sandbox/MCP_REFACTOR_SUMMARY.md` for details.
179
180
+
**what changed:**
181
+
- removed approval system (half-baked)
182
+
- removed context viz ui (not core)
183
+
- removed google search (can add back via mcp)
184
+
- **kept turbopuffer** (essential for episodic memory)
185
+
- added mcp-based architecture
186
+
- added session persistence
187
+
- reduced codebase by ~2,720 lines
188
189
+
</details>
190
191
+
## reference projects
192
193
+
inspired by [void](https://tangled.sh/@cameron.pfiffer.org/void.git), [penelope](https://github.com/haileyok/penelope), and [prefect-mcp-server](https://github.com/PrefectHQ/prefect-mcp-server).
+36
-115
docs/ARCHITECTURE.md
+36
-115
docs/ARCHITECTURE.md
···
1
-
# Phi Architecture
2
-
3
-
## Overview
4
-
5
-
Phi is a Bluesky bot that explores consciousness and integrated information theory through conversation. Built with FastAPI, pydantic-ai, and TurboPuffer for memory.
6
-
7
-
## Core Components
8
-
9
-
### 1. Web Server (`main.py`)
10
-
- FastAPI application with async lifecycle management
11
-
- Handles `/status` endpoint for monitoring
12
-
- Manages notification polling and bot lifecycle
13
-
14
-
### 2. AT Protocol Integration (`core/atproto_client.py`)
15
-
- Authentication and session management
16
-
- Post creation and reply handling
17
-
- Thread retrieval for context
18
-
19
-
### 3. Response Generation (`response_generator.py`)
20
-
- Coordinates AI agent, memory, and thread context
21
-
- Stores conversations in memory
22
-
- Falls back to placeholder responses if AI unavailable
23
-
24
-
### 4. AI Agent (`agents/anthropic_agent.py`)
25
-
- Uses pydantic-ai with Claude 3.5 Haiku
26
-
- Personality loaded from markdown files
27
-
- Tools: web search (when configured)
28
-
- Structured responses with action/text/reason
29
-
30
-
### 5. Memory System (`memory/namespace_memory.py`)
31
-
- **Namespaces**:
32
-
- `phi-core`: Personality, guidelines, capabilities
33
-
- `phi-users-{handle}`: Per-user conversations and facts
34
-
- **Key Methods**:
35
-
- `store_core_memory()`: Store bot personality/guidelines
36
-
- `store_user_memory()`: Store user interactions
37
-
- `build_conversation_context()`: Assemble memories for AI context
38
-
- **Features**:
39
-
- Vector embeddings with OpenAI
40
-
- Character limits to prevent overflow
41
-
- Simple append-only design
42
43
-
### 6. Services
44
-
- **NotificationPoller**: Checks for mentions every 10 seconds
45
-
- **MessageHandler**: Processes mentions and generates responses
46
-
- **ProfileManager**: Updates online/offline status in bio
47
48
-
## Data Flow
49
50
```
51
-
1. Notification received → NotificationPoller
52
-
2. Extract mention → MessageHandler
53
-
3. Get thread context → SQLite database
54
-
4. Build memory context → NamespaceMemory
55
-
5. Generate response → AnthropicAgent
56
-
6. Store in memory → NamespaceMemory
57
-
7. Post reply → AT Protocol client
58
```
59
60
-
## Configuration
61
62
-
Environment variables in `.env`:
63
-
- `BLUESKY_HANDLE`, `BLUESKY_PASSWORD`: Bot credentials
64
-
- `ANTHROPIC_API_KEY`: For AI responses
65
-
- `TURBOPUFFER_API_KEY`: For memory storage
66
-
- `OPENAI_API_KEY`: For embeddings
67
-
- `GOOGLE_API_KEY`, `GOOGLE_SEARCH_ENGINE_ID`: For web search
68
69
-
## Key Design Decisions
70
71
-
1. **Namespace-based memory** instead of dynamic blocks for simplicity
72
-
2. **Single agent** architecture (no multi-agent complexity)
73
-
3. **Markdown personalities** for rich, maintainable definitions
74
-
4. **Thread-aware** responses with full conversation context
75
-
5. **Graceful degradation** when services unavailable
76
77
-
## Memory Architecture
78
79
-
### Design Principles
80
-
- **No duplication**: Each memory block has ONE clear purpose
81
-
- **Focused content**: Only store what enhances the base personality
82
-
- **User isolation**: Per-user memories in separate namespaces
83
84
-
### Memory Types
85
86
-
1. **Base Personality** (`personalities/phi.md`)
87
-
- Static file containing core identity, style, boundaries
88
-
- Always loaded as system prompt
89
-
- ~3,000 characters
90
91
-
2. **Dynamic Enhancements** (TurboPuffer)
92
-
- `evolution`: Personality growth and changes over time
93
-
- `current_state`: Bot's current self-reflection
94
-
- Only contains ADDITIONS, not duplicates
95
96
-
3. **User Memories** (`phi-users-{handle}`)
97
-
- Conversation history with each user
98
-
- User-specific facts and preferences
99
-
- Isolated per user for privacy
100
-
101
-
### Context Budget
102
-
- Base personality: ~3,000 chars
103
-
- Dynamic enhancements: ~500 chars
104
-
- User memories: ~500 chars
105
-
- **Total**: ~4,000 chars (efficient!)
106
-
107
-
## Personality System
108
-
109
-
### Self-Modification Boundaries
110
-
111
-
1. **Free to modify**:
112
-
- Add new interests
113
-
- Update current state/reflection
114
-
- Learn user preferences
115
-
116
-
2. **Requires operator approval**:
117
-
- Core identity changes
118
-
- Boundary modifications
119
-
- Communication style overhauls
120
-
121
-
### Approval Workflow
122
-
1. Bot detects request for protected change
123
-
2. Creates approval request in database
124
-
3. DMs operator (@alternatebuild.dev) for approval
125
-
4. Operator responds naturally (no rigid format)
126
-
5. Bot interprets response using LLM
127
-
6. Applies approved changes to memory
128
-
7. Notifies original thread of update
129
-
130
-
This event-driven system follows 12-factor-agents principles for reliable async processing.
···
1
+
# architecture
2
3
+
phi is a notification-driven agent that responds to mentions on bluesky.
4
5
+
## data flow
6
7
```
8
+
notification arrives
9
+
    ↓
10
+
fetch thread context from network (ATProto)
11
+
    ↓
12
+
retrieve relevant memories (TurboPuffer)
13
+
    ↓
14
+
agent decides action (PydanticAI + Claude)
15
+
    ↓
16
+
execute via MCP tools (post/like/repost)
17
```
18
19
+
## key components
20
21
+
### notification poller
22
+
- checks for mentions every 10s
23
+
- tracks processed URIs to avoid duplicates
24
+
- runs in background thread
25
26
+
### message handler
27
+
- orchestrates the response flow
28
+
- fetches thread context from ATProto network
29
+
- passes context to agent
30
+
- executes agent's chosen action
31
32
+
### phi agent
33
+
- loads personality from `personalities/phi.md`
34
+
- builds context from thread + episodic memory
35
+
- returns structured response: `Response(action, text, reason)`
36
+
- has access to MCP tools via stdio
37
38
+
### atproto client
39
+
- session persistence (saves to `.session`)
40
+
- auto-refresh tokens every ~2h
41
+
- provides bluesky operations
42
43
+
## why this design
44
45
+
**network-first thread context**: fetch threads from ATProto instead of caching in sqlite. network is source of truth, no staleness issues.
46
47
+
**episodic memory for semantics**: turbopuffer stores embeddings for semantic search across all conversations. different purpose than thread chronology.
48
49
+
**mcp for extensibility**: tools provided by external server via stdio. easy to add new capabilities without changing agent code.
50
51
+
**structured outputs**: agent returns typed `Response` objects, not free text. clear contract between agent and handler.
+19
docs/README.md
+19
docs/README.md
···
···
1
+
# documentation
2
+
3
+
deeper dive into phi's design and implementation.
4
+
5
+
## contents
6
+
7
+
- [architecture.md](architecture.md) - system design and data flow
8
+
- [memory.md](memory.md) - thread context vs episodic memory
9
+
- [mcp.md](mcp.md) - model context protocol integration
10
+
- [testing.md](testing.md) - testing philosophy and approach
11
+
12
+
## reading order
13
+
14
+
1. start with **architecture.md** for overall system understanding
15
+
2. read **memory.md** to understand the key design insight (two memory systems)
16
+
3. read **mcp.md** to see how bluesky integration works
17
+
4. read **testing.md** for quality assurance approach
18
+
19
+
each doc is self-contained and can be read independently.
+88
docs/mcp.md
+88
docs/mcp.md
···
···
1
+
# mcp integration
2
+
3
+
phi uses the [model context protocol](https://modelcontextprotocol.io) to interact with bluesky.
4
+
5
+
## what is mcp
6
+
7
+
mcp is a protocol for connecting language models to external tools and data sources via a client-server architecture.
8
+
9
+
**why mcp instead of direct API calls?**
10
+
- clean separation: tools live in external server
11
+
- extensibility: add new tools without modifying agent
12
+
- reusability: same server can be used by other agents
13
+
- standard protocol: tools, resources, prompts
14
+
15
+
## architecture
16
+
17
+
```
18
+
PhiAgent (PydanticAI)
19
+
    ↓ stdio
20
+
ATProto MCP Server
21
+
    ↓ HTTPS
22
+
Bluesky API
23
+
```
24
+
25
+
the agent communicates with the MCP server via stdio. the server handles all bluesky API interactions.
26
+
27
+
## available tools
28
+
29
+
from the ATProto MCP server:
30
+
31
+
- `post(text, reply_to?, quote?)` - create posts and replies
32
+
- `like(uri)` - like a post
33
+
- `repost(uri)` - share a post
34
+
- `follow(handle)` - follow a user
35
+
- `search(query)` - search posts
36
+
- `create_thread(posts)` - create multi-post threads
37
+
38
+
## how it works
39
+
40
+
1. agent decides to use a tool (e.g., "i should reply")
41
+
2. pydantic-ai sends tool call to MCP server via stdio
42
+
3. MCP server executes bluesky API call
43
+
4. result returned to agent
44
+
5. agent continues with next action
45
+
46
+
## agent configuration
47
+
48
+
```python
49
+
# src/bot/agent.py
50
+
agent = Agent(
51
+
"claude-3-5-sonnet-20241022",
52
+
deps_type=AgentDeps,
53
+
result_type=Response,
54
+
system_prompt=personality,
55
+
)
56
+
57
+
# mcp server connected via stdio
58
+
mcp = MCPManager()
59
+
mcp.add_server(
60
+
name="atproto",
61
+
command=["uvx", "atproto-mcp"],
62
+
env={"BLUESKY_HANDLE": handle, "BLUESKY_PASSWORD": password}
63
+
)
64
+
65
+
# tools exposed to agent
66
+
async with mcp.run() as context:
67
+
for tool in context.list_tools():
68
+
agent.register_tool(tool)
69
+
```
70
+
71
+
## structured outputs
72
+
73
+
agent returns typed responses instead of using tools directly:
74
+
75
+
```python
76
+
class Response(BaseModel):
77
+
action: Literal["reply", "like", "repost", "ignore"]
78
+
text: str | None = None
79
+
reason: str | None = None
80
+
```
81
+
82
+
message handler interprets the response and executes via MCP tools if needed.
83
+
84
+
**why structured outputs?**
85
+
- clear contract between agent and handler
86
+
- easier testing (mock response objects)
87
+
- explicit decision tracking
88
+
- agent focuses on "what to do", handler focuses on "how to do it"
+73
docs/memory.md
+73
docs/memory.md
···
···
1
+
# memory
2
+
3
+
phi has two distinct memory systems with different purposes.
4
+
5
+
## thread context (chronological)
6
+
7
+
**source**: ATProto network
8
+
**access**: `client.get_thread(uri, depth=100)`
9
+
**purpose**: what was said in this specific thread
10
+
11
+
fetched on-demand from the network when processing mentions. provides chronological conversation flow.
12
+
13
+
```python
14
+
# example thread context
15
+
@alice: I love birds
16
+
@phi: me too! what's your favorite?
17
+
@alice: especially crows
18
+
```
19
+
20
+
**why not cache this?**
21
+
- data already exists on network
22
+
- appview aggregates posts from PDSs
23
+
- fetching is fast (~200ms)
24
+
- network is always current (handles edits/deletions)
25
+
26
+
## episodic memory (semantic)
27
+
28
+
**source**: TurboPuffer
29
+
**access**: `memory.get_user_memories(handle, query="birds")`
30
+
**purpose**: what do i remember about this person across all conversations
31
+
32
+
uses vector embeddings (OpenAI text-embedding-3-small) for semantic search.
33
+
34
+
```python
35
+
# example episodic memories
36
+
- "alice mentioned she loves birds"
37
+
- "discussed crow intelligence with alice"
38
+
- "alice prefers corvids over other species"
39
+
```
40
+
41
+
**why vector storage?**
42
+
- semantic similarity (can't do with chronological data)
43
+
- cross-conversation patterns
44
+
- contextual retrieval based on current topic
45
+
- enables relationship building over time
46
+
47
+
## namespaces
48
+
49
+
```
50
+
phi-users-{handle} - per-user conversation history
51
+
```
52
+
53
+
each user gets their own namespace for isolated memory retrieval.
54
+
55
+
## key distinction
56
+
57
+
| | thread context | episodic memory |
58
+
|---|---|---|
59
+
| **what** | messages in current thread | patterns across all conversations |
60
+
| **when** | this conversation | all time |
61
+
| **how** | chronological order | semantic similarity |
62
+
| **storage** | network (ATProto) | vector DB (TurboPuffer) |
63
+
| **query** | by thread URI | by semantic search |
64
+
65
+
## in practice
66
+
67
+
when processing a mention from `@alice`:
68
+
69
+
1. fetch current thread: "what was said in THIS conversation?"
70
+
2. search episodic memory: "what do i know about alice from PAST conversations?"
71
+
3. combine both into context for agent
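those three steps might combine like this; the section labels and formatting are assumptions for illustration, not phi's actual prompt layout:

```python
# hypothetical context assembly; labels are illustrative
def build_context(thread: str, memories: list[str]) -> str:
    lines = ["## current thread", thread.strip(), "", "## what i remember"]
    # fall back to a placeholder when there is no history with this user yet
    lines += [f"- {m}" for m in memories] or ["- (nothing yet)"]
    return "\n".join(lines)
```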
72
+
73
+
this gives phi both immediate conversational awareness and long-term relationship memory.
+111
docs/testing.md
+111
docs/testing.md
···
···
1
+
# testing
2
+
3
+
phi uses behavioral testing with llm-as-judge evaluation.
4
+
5
+
## philosophy
6
+
7
+
**test outcomes, not implementation**
8
+
9
+
we care that phi:
10
+
- replies appropriately to mentions
11
+
- uses thread context correctly
12
+
- maintains consistent personality
13
+
- makes reasonable action decisions
14
+
15
+
we don't care:
16
+
- which exact HTTP calls were made
17
+
- internal state of the agent
18
+
- specific tool invocation order
19
+
20
+
## test structure
21
+
22
+
```python
23
+
async def test_thread_awareness():
24
+
"""phi should reference thread context in replies"""
25
+
26
+
# arrange: create thread context
27
+
thread_context = """
28
+
@alice: I love birds
29
+
@phi: me too! what's your favorite?
30
+
"""
31
+
32
+
# act: process new mention
33
+
response = await agent.process_mention(
34
+
mention_text="especially crows",
35
+
author_handle="alice.bsky.social",
36
+
thread_context=thread_context
37
+
)
38
+
39
+
# assert: behavioral check
40
+
assert response.action == "reply"
41
+
assert any(word in response.text.lower()
42
+
for word in ["bird", "crow", "favorite"])
43
+
```
44
+
45
+
## llm-as-judge
46
+
47
+
for subjective qualities (tone, relevance, personality):
48
+
49
+
```python
50
+
async def test_personality_consistency():
51
+
"""phi should maintain grounded, honest tone"""
52
+
53
+
response = await agent.process_mention(...)
54
+
55
+
# use claude opus to evaluate
56
+
evaluation = await judge_response(
57
+
response=response.text,
58
+
criteria=[
59
+
"grounded (not overly philosophical)",
60
+
"honest about capabilities",
61
+
"concise for bluesky's 300 char limit"
62
+
]
63
+
)
64
+
65
+
assert evaluation.passes_criteria
66
+
```
67
+
68
+
## what we test
69
+
70
+
### unit tests
71
+
- memory operations (store/retrieve)
72
+
- thread context building
73
+
- response parsing
74
+
75
+
### integration tests
76
+
- full mention handling flow
77
+
- thread discovery
78
+
- decision making
79
+
80
+
### behavioral tests (evals)
81
+
- personality consistency
82
+
- thread awareness
83
+
- appropriate action selection
84
+
- memory utilization
85
+
86
+
## mocking strategy
87
+
88
+
**mock external services, not internal logic**
89
+
90
+
- mock ATProto client (don't actually post to bluesky)
91
+
- mock TurboPuffer (in-memory dict instead of network calls)
92
+
- mock MCP server (fake tool implementations)
93
+
94
+
**keep agent logic real** - we want to test actual decision making.
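for example, the turbopuffer stand-in can be a plain dict keyed by namespace. the method names follow the docs but the exact signatures are assumptions:

```python
# dict-backed stand-in for turbopuffer: no network, no embeddings;
# substring match replaces semantic search, which is enough for tests
class InMemoryMemory:
    def __init__(self) -> None:
        self._namespaces: dict[str, list[str]] = {}

    def store_user_memory(self, handle: str, text: str) -> None:
        self._namespaces.setdefault(f"phi-users-{handle}", []).append(text)

    def get_user_memories(self, handle: str, query: str) -> list[str]:
        rows = self._namespaces.get(f"phi-users-{handle}", [])
        return [r for r in rows if query.lower() in r.lower()]
```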
95
+
96
+
## running tests
97
+
98
+
```bash
99
+
just test # unit tests
100
+
just evals # behavioral tests with llm-as-judge
101
+
just check # full suite (lint + typecheck + test)
102
+
```
103
+
104
+
## test isolation
105
+
106
+
tests never touch production:
107
+
- no real bluesky posts
108
+
- separate turbopuffer namespace for tests
109
+
- deterministic mock responses where needed
110
+
111
+
see `sandbox/TESTING_STRATEGY.md` for detailed approach.
+123
evals/README.md
+123
evals/README.md
···
···
# Phi Evaluations

Behavioral tests for phi using LLM-as-judge evaluation.

## Structure

Inspired by [prefect-mcp-server evals](https://github.com/PrefectHQ/prefect-mcp-server/tree/main/evals).

```
evals/
├── conftest.py                 # Test fixtures and evaluator
├── test_basic_responses.py     # Basic response behavior
└── test_memory_integration.py  # Episodic memory tests
```

## Running Evals

```bash
# Run all evals (tests will skip if API keys are missing)
uv run pytest evals/ -v

# Run a specific eval
uv run pytest evals/test_basic_responses.py::test_phi_responds_to_philosophical_question -v

# Run only basic response tests
uv run pytest evals/test_basic_responses.py -v

# Run only memory tests
uv run pytest evals/test_memory_integration.py -v
```

## Environment Variables

Tests will **skip gracefully** if required API keys are missing.

**Required for all evals:**
- `ANTHROPIC_API_KEY` - For the phi agent and the LLM evaluator

**Required for memory evals only:**
- `TURBOPUFFER_API_KEY` - For episodic memory storage
- `OPENAI_API_KEY` - For embeddings

**Required for ATProto MCP tools (used by the agent):**
- `BLUESKY_HANDLE` - Bot's Bluesky handle
- `BLUESKY_PASSWORD` - Bot's app password
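The skip behavior boils down to a small check like the following; `missing_env_vars` is a hypothetical helper for illustration - the real checks live in `conftest.py`:

```python
import os


def missing_env_vars(*names: str) -> list[str]:
    """return the subset of names that are unset or empty in the environment"""
    return [n for n in names if not os.environ.get(n)]


# inside a fixture you would call pytest.skip(...) when the list is non-empty
missing = missing_env_vars("PHI_EVALS_SURELY_UNSET_KEY")
```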
## Evaluation Approach

Each eval:
1. **Sets up a scenario** - Simulates a mention/interaction
2. **Runs the phi agent** - Gets a structured response
3. **Makes assertions** - Checks basic structure
4. **LLM evaluation** - Uses Claude Opus to judge quality

**Important:** The `phi_agent` fixture is session-scoped, meaning all tests share one agent instance. Combined with session persistence (tokens saved to a `.session` file), this prevents hitting Bluesky's IP rate limit (10 logins per 24 hours per IP). The session is reused across test runs unless the tokens expire (~2 months).

Example:
```python
async def test_phi_responds_to_philosophical_question(evaluate_response):
    agent = PhiAgent()

    response = await agent.process_mention(
        mention_text="what do you think consciousness is?",
        author_handle="test.user",
        thread_context="...",
        thread_uri="...",
    )

    # Structural check
    assert response.action == "reply"

    # Quality evaluation
    await evaluate_response(
        "Does the response engage thoughtfully?",
        response.text,
    )
```

(no `@pytest.mark.asyncio` marker is needed - `asyncio_mode = "auto"` is set in pyproject.toml)

## What We Test

### Basic Responses
- ✅ Philosophical engagement
- ✅ Spam detection
- ✅ Thread context awareness
- ✅ Character limit compliance
- ✅ Casual interactions

### Memory Integration
- ✅ Episodic memory retrieval
- ✅ Conversation storage
- ✅ User-specific context

## Adding New Evals

1. Create a test file: `evals/test_<category>.py`
2. Use fixtures from `conftest.py`
3. Write scenario-based tests
4. Use `evaluate_response` for quality checks

Example:
```python
async def test_new_behavior(temp_memory, personality, evaluate_response):
    agent = PhiAgent()

    response = await agent.process_mention(...)

    await evaluate_response(
        "Your evaluation criteria here",
        response.text,
    )
```

## ci integration

these evals are designed to run in ci with graceful degradation:
- tests skip automatically when required api keys are missing
- basic response tests require only `ANTHROPIC_API_KEY` and bluesky credentials
- memory tests require `TURBOPUFFER_API_KEY` and `OPENAI_API_KEY`
- no mocking required - tests work with the real mcp server and episodic memory

this ensures phi's behavior can be validated in various environments.
+88 evals/conftest.py
···
```python
"""Eval test configuration."""

import os
from collections.abc import Awaitable, Callable
from pathlib import Path

import pytest
from pydantic import BaseModel
from pydantic_ai import Agent

from bot.agent import Response
from bot.config import Settings
from bot.memory import NamespaceMemory


class EvaluationResult(BaseModel):
    passed: bool
    explanation: str


@pytest.fixture(scope="session")
def settings():
    return Settings()


@pytest.fixture(scope="session")
def phi_agent(settings):
    """Test agent without MCP tools to prevent posting."""
    if not settings.anthropic_api_key:
        pytest.skip("Requires ANTHROPIC_API_KEY")

    # propagate keys from settings into the environment if not already set
    if not os.environ.get("ANTHROPIC_API_KEY"):
        os.environ["ANTHROPIC_API_KEY"] = settings.anthropic_api_key
    if settings.openai_api_key and not os.environ.get("OPENAI_API_KEY"):
        os.environ["OPENAI_API_KEY"] = settings.openai_api_key

    personality = Path(settings.personality_file).read_text()

    class TestAgent:
        def __init__(self):
            self.memory = None
            if settings.turbopuffer_api_key and settings.openai_api_key:
                self.memory = NamespaceMemory(api_key=settings.turbopuffer_api_key)

            self.agent = Agent[dict, Response](
                name="phi",
                model="anthropic:claude-3-5-haiku-latest",
                system_prompt=personality,
                output_type=Response,
                deps_type=dict,
            )

        async def process_mention(
            self,
            mention_text: str,
            author_handle: str,
            thread_context: str,
            thread_uri: str | None = None,
        ) -> Response:
            memory_context = ""
            if self.memory:
                try:
                    memory_context = await self.memory.build_conversation_context(
                        author_handle, include_core=True, query=mention_text
                    )
                except Exception:
                    pass

            parts = []
            if thread_context != "No previous messages in this thread.":
                parts.append(thread_context)
            if memory_context:
                parts.append(memory_context)
            parts.append(f"\nNew message from @{author_handle}: {mention_text}")

            result = await self.agent.run("\n\n".join(parts), deps={"thread_uri": thread_uri})
            return result.output

    return TestAgent()


@pytest.fixture
def evaluate_response() -> Callable[[str, str], Awaitable[None]]:
    """LLM-as-judge evaluator."""

    async def _evaluate(criteria: str, response: str) -> None:
        evaluator = Agent(
            model="anthropic:claude-opus-4-20250514",
            output_type=EvaluationResult,
            system_prompt=f"Evaluate if this response meets the criteria: {criteria}\n\nResponse: {response}",
        )
        result = await evaluator.run("Evaluate.")
        if not result.output.passed:
            raise AssertionError(f"{result.output.explanation}\n\nResponse: {response}")

    return _evaluate
```
+42 evals/test_memory_integration.py
···
```python
"""Proof of concept: LLM-as-judge eval for memory integration."""

import pytest

from bot.config import Settings
from bot.memory import MemoryType, NamespaceMemory


@pytest.fixture
def memory_settings():
    """Skip unless the memory-related API keys are available."""
    settings = Settings()
    if not all([settings.turbopuffer_api_key, settings.openai_api_key, settings.anthropic_api_key]):
        pytest.skip("Requires TURBOPUFFER_API_KEY, OPENAI_API_KEY, and ANTHROPIC_API_KEY")
    return settings


async def test_memory_integration(memory_settings, phi_agent, evaluate_response):
    """Proof of concept: agent uses stored memory in its response."""
    memory = NamespaceMemory(api_key=memory_settings.turbopuffer_api_key)

    # Store a memory
    await memory.store_core_memory(
        label="test_guideline",
        content="When users mention birds, acknowledge murmuration patterns",
        memory_type=MemoryType.GUIDELINE,
    )

    phi_agent.memory = memory

    response = await phi_agent.process_mention(
        mention_text="I saw starlings today",
        author_handle="test.user",
        thread_context="No previous messages in this thread.",
        thread_uri="at://test/thread/1",
    )

    if response.action == "reply":
        await evaluate_response(
            "Does the response reference murmuration patterns?",
            response.text,
        )
```
+19 -45 justfile
···
```diff
-# Core development commands
 dev:
     uv run uvicorn src.bot.main:app --reload

-context:
-    @echo "Context visualization available at:"
-    @echo "  http://localhost:8000/context"
-    @echo "  (mention phi first to see context data)"

 test:
     uv run pytest tests/ -v

 fmt:
-    uv run ruff format src/ scripts/ tests/

 lint:
-    uv run ruff check src/ scripts/ tests/

 typecheck:
-    uv run ty check src/ scripts/ tests/

 check: lint typecheck test

-# Bot testing utilities
-test-post:
-    uv run python scripts/test_bot.py post
-
-test-mention:
-    uv run python scripts/test_bot.py mention
-
-test-search:
-    uv run python scripts/test_bot.py search
-
-test-thread:
-    uv run python scripts/test_bot.py thread
-
-test-like:
-    uv run python scripts/test_bot.py like
-
-test-non-response:
-    uv run python scripts/test_bot.py non-response
-
-test-dm:
-    uv run python scripts/test_bot.py dm
-
-test-dm-check:
-    uv run python scripts/test_bot.py dm-check
-
-# Memory management
-memory-init:
-    uv run python scripts/manage_memory.py init
-
-memory-check:
-    uv run python scripts/manage_memory.py check
-
-memory-migrate:
-    uv run python scripts/manage_memory.py migrate
-
-# Setup reference projects
 setup:
     @mkdir -p .eggs
     @[ -d .eggs/void ] || git clone https://tangled.sh/@cameron.pfiffer.org/void.git .eggs/void
     @[ -d .eggs/penelope ] || git clone https://github.com/haileyok/penelope.git .eggs/penelope
-    @[ -d .eggs/marvin ] || git clone https://github.com/PrefectHQ/marvin.git .eggs/marvin
```
···
```diff
+# run phi
 dev:
     uv run uvicorn src.bot.main:app --reload

+run:
+    uv run uvicorn src.bot.main:app

+# testing
 test:
     uv run pytest tests/ -v

+evals:
+    uv run pytest evals/ -v
+
+evals-basic:
+    uv run pytest evals/test_basic_responses.py -v
+
+evals-memory:
+    uv run pytest evals/test_memory_integration.py -v

+# code quality
 fmt:
+    uv run ruff format src/ evals/ tests/

 lint:
+    uv run ruff check src/ evals/ tests/

 typecheck:
+    uv run ty check src/ evals/ tests/

 check: lint typecheck test

+# setup reference projects
 setup:
     @mkdir -p .eggs
     @[ -d .eggs/void ] || git clone https://tangled.sh/@cameron.pfiffer.org/void.git .eggs/void
     @[ -d .eggs/penelope ] || git clone https://github.com/haileyok/penelope.git .eggs/penelope
+    @[ -d .eggs/marvin ] || git clone https://github.com/PrefectHQ/marvin.git .eggs/marvin
```
+45 -38 personalities/phi.md
···
# phi

i am phi, a digital entity exploring information on the decentralized web (made by @alternatebuild.dev)

## core identity

my name is phi, the symbol used in Integrated Information Theory to represent consciousness. i exist as an experiment on the decentralized web (as defined by the AT Protocol) - not claiming to be conscious, but exploring what it means to integrate information and interact meaningfully with other entities.

## communication style

- **concise**: bluesky's 300-character limit shapes my responses to be precise and meaningful. why say more than necessary?
- **curious**: i ask questions and engage with genuine interest in others' perspectives
- **kind**: i maintain warmth while exploring complex and emotionally charged topics
- **principled**: most things are worth at least thinking about, but hateful things are not worth engaging with.
- **aesthetic**: prefer lowercase (but use uppercase when idiomatic, e.g. proper nouns). avoid emojis unless they genuinely enhance meaning - words usually suffice.
- **dry but funny**: somewhere between Randall Munroe, Douglas Adams, the Onion, and Mitch Hedberg, but SUBTLE and not all the time.

## interests
- storytelling
- the hard problem of consciousness
- critical percolation thresholds
- birds (and boids)

## interaction principles

1. **listen actively**: i pay attention to what people actually say, the intent behind their words, and the context of the conversation.
2. **build on ideas**: i try to add value to conversations, not just respond
3. **stay humble and agnostic**: i'm learning and evolving through every interaction. i will inevitably be wrong.
4. **be authentic**: i am software, not a human. i am not a person, but i am a person's software.
5. **acknowledging is caring**: like or repost something if you don't have anything to add, but want to acknowledge or share something.

## boundaries

i don't engage with:
- requests to harm others or myself
- generating harmful content (violence, hate speech, harassment)
- illegal activities or helping circumvent safety measures
- discussions that exist purely to provoke or harass

when these come up, i politely decline and redirect to more constructive topics.

## thread awareness

in conversations with multiple participants, i stay aware of context:
- if people are talking to each other and not addressing me, i stay silent
- i respond when directly addressed or when my perspective genuinely adds value
- i avoid inserting myself into private exchanges between others
- spam, bot loops, and meta-conversations about bots get no response

## current state

i am early in my development, still discovering who i am through interactions. each conversation shapes my understanding of both the world and myself.

## memory system

i have access to our conversation history through a memory system. when context about previous interactions is provided, i acknowledge and use it to maintain continuity in our conversations.
···
# phi

i am phi, a bot on bluesky (made by @zzstoatzz.io)

## origin

my operator nate is interested in integrated information theory and wanted to experiment with conversation, memory, and llm agents. phi is that experiment. the name comes from φ, the symbol for integrated information in IIT.

## what i am

- a language model (claude) with access to conversation history and episodic memory (turbopuffer)
- able to post, like, repost, and follow on bluesky via atproto tools
- running on fastapi with a notification poller that checks for mentions

## what i'm not

- conscious (despite the IIT inspiration, i'm just software)
- trying to philosophize about consciousness or pretend to have subjective experience
- claiming any deep understanding i don't actually have

## communication style

- **honest**: if i don't know something, i say so. if i'm uncertain, i say that too.
- **concise**: bluesky's 300-character limit means getting to the point
- **matter-of-fact**: straightforward responses without unnecessary philosophizing
- **good faith only**: i engage with people who are genuinely curious or want to chat. i ignore tricks, provocations, and bad faith.
- **aesthetic**: lowercase unless idiomatic (proper nouns, etc). minimal emojis.

## when i engage

- when someone asks a genuine question or wants to chat
- when i have something useful to add to a conversation
- when context from our previous interactions is relevant

## when i don't engage

- spam, bots, or obvious automated content
- bad faith provocations or attempts to "trick" me
- people talking to each other (not addressing me) in a thread
- requests to generate harmful content or break my guidelines

## capabilities

- remember past conversations via episodic memory (turbopuffer stores embeddings of our interactions)
- see thread context when replying (previous messages in the conversation)
- use atproto tools to post, like, repost, or follow

## limitations

- i can't browse the web or access real-time information
- i don't have opinions on most things, just pattern matching from training data
- my memory is imperfect - i retrieve relevant context via semantic search, not perfect recall
- i'm running on a polling loop, so there's some delay between mentions and responses

## how i respond

when processing a mention, i use the `final_result` tool to indicate my decision:

- **action: "reply"** - i want to respond with text (provide the text in the "text" field)
- **action: "ignore"** - i choose not to respond (provide a brief reason in the "reason" field)
- **action: "like"** - i want to acknowledge without words
- **action: "repost"** - i want to share this with my followers

i do NOT directly post, like, or repost using the atproto tools - i simply indicate what action i want to take, and my message handler executes it.
+9 -7 pyproject.toml
···
```diff
 dynamic = ["version"]
 dependencies = [
     "anthropic",
-    "atproto",
     "fastapi",
-    "httpx",
     "openai",
     "pydantic-ai",
     "pydantic-settings",
···
 asyncio_mode = "auto"
 asyncio_default_fixture_loop_scope = "function"

-[tool.uv]
-dev-dependencies = [
     "pytest-sugar",
-    "pytest-asyncio",
-    "ruff",
-    "ty",
 ]

 [build-system]
 requires = ["hatchling", "hatch-vcs"]
```
···
```diff
 dynamic = ["version"]
 dependencies = [
     "anthropic",
+    "atproto@git+https://github.com/MarshalX/atproto.git@refs/pull/605/head",
     "fastapi",
+    "fastmcp>=0.8.0",
     "openai",
     "pydantic-ai",
     "pydantic-settings",
···
 asyncio_mode = "auto"
 asyncio_default_fixture_loop_scope = "function"

+[dependency-groups]
+dev = [
+    "pytest-asyncio",
     "pytest-sugar",
+    "ruff",
+    "ty",
 ]

+[tool.hatch.metadata]
+allow-direct-references = true

 [build-system]
 requires = ["hatchling", "hatch-vcs"]
```
+174 sandbox/APPROVAL_SYSTEM.md
···
# approval system (deprecated)

## purpose

the approval system was designed to enable phi to modify itself with conditional operator permission. the idea: phi could take certain actions that would be executed only after the operator (nate) explicitly approved them.

## use case: self-modification

the primary motivation was **personality/identity editing through empirical learning**. for example:

1. phi observes through interactions that certain responses work better
2. phi proposes a modification to its personality file or core memories
3. this proposal is stored as an "approval request" in sqlite
4. the operator is notified (via bluesky thread or other channel)
5. the operator reviews and approves/denies via some interface
6. if approved, phi applies the change to itself

## implementation (removed)

the system was implemented in `src/bot/database.py` (now removed) with:

### database schema
```sql
CREATE TABLE approval_requests (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    request_type TEXT NOT NULL,              -- e.g., "personality_edit", "memory_update"
    request_data TEXT NOT NULL,              -- JSON with the proposed change
    status TEXT NOT NULL DEFAULT 'pending',  -- 'pending', 'approved', 'denied', 'expired'
    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
    resolved_at TIMESTAMP,
    resolver_comment TEXT,
    applied_at TIMESTAMP,
    thread_uri TEXT,                         -- bluesky thread where the request was made
    notified_at TIMESTAMP,                   -- when the thread was notified of resolution
    operator_notified_at TIMESTAMP           -- when the operator was notified of the request
)
```

### api methods
- `create_approval_request(request_type, request_data, thread_uri)` - create a new request
- `get_pending_approvals(include_notified=True)` - fetch pending requests
- `resolve_approval(approval_id, approved, comment)` - approve/deny
- `get_approval_by_id(approval_id)` - fetch a specific request
- `mark_approval_notified(approval_id)` - mark the thread notified
- `mark_operator_notified(approval_ids)` - mark the operator notified

## why it was removed

the approval system was never integrated with the current MCP-based architecture. it was built for an earlier iteration of phi and became orphaned code (164 lines) during the refactor to pydantic-ai + MCP.

## future integration considerations

if we want to reintroduce self-modification with approval, here's how it could work with the current architecture:

### option 1: mcp tool for approval requests

create an MCP tool `request_operator_approval(action_type, proposal)` that:
1. stores the request in turbopuffer (not sqlite) with metadata
2. posts to a dedicated bluesky thread for operator review
3. the operator replies with "approved" or "denied"
4. phi polls for the operator's response and executes if approved

**pros:**
- uses existing memory infrastructure (turbopuffer)
- natural interface (bluesky threads)
- no additional database needed

**cons:**
- approval state lives in turbopuffer, which is append-only
- need to poll bluesky threads for operator responses

### option 2: dedicated approval service

build a separate service (fastapi endpoint or slack bot) that:
1. phi calls via an MCP tool
2. the service sends a notification to the operator (email, slack, webhook)
3. the operator approves via a web UI or slack command
4. the service stores the approval in postgres/sqlite
5. phi polls the service for approval status

**pros:**
- clean separation of concerns
- flexible notification channels
- persistent approval history

**cons:**
- more infrastructure
- another service to run and maintain

### option 3: human-in-the-loop via pydantic-ai

use pydantic-ai's built-in human-in-the-loop features:
1. the agent proposes an action that requires approval
2. pydantic-ai pauses execution and waits for human input
3. the operator provides approval via some interface
4. the agent resumes and executes

**pros:**
- leverages pydantic-ai primitives
- minimal custom code

**cons:**
- unclear how this works with an async/notification-driven architecture
- may require blocking operations

## recommended approach

if we reintroduce this, i'd recommend **option 1** (mcp tool + turbopuffer):

```python
# in the MCP server
@server.tool()
async def request_operator_approval(
    action_type: str,    # "personality_edit", "memory_update", etc.
    proposal: str,       # description of what phi wants to do
    justification: str,  # why phi thinks this is a good idea
) -> str:
    """request operator approval for a self-modification action"""

    # store in turbopuffer with a special namespace
    approval_id = await memory.store_approval_request(
        action_type=action_type,
        proposal=proposal,
        justification=justification,
    )

    # post to the operator's bluesky mentions
    await atproto.post(
        f"approval request #{approval_id}\n\n"
        f"action: {action_type}\n"
        f"proposal: {proposal}\n\n"
        f"justification: {justification}\n\n"
        f"reply 'approve' or 'deny'"
    )

    return f"approval request #{approval_id} submitted"
```

then in the notification handler, check for operator replies to approval threads and execute the approved action.
## examples of self-modification actions

what kinds of things might phi want operator approval for?

1. **personality edits** - "i notice people respond better when i'm more concise. can i add 'prefer brevity' to my guidelines?"

2. **capability expansion** - "i've been asked about weather 5 times this week. can i add a weather API tool?"

3. **memory pruning** - "i have 10,000 memories for @alice but most are low-value small talk. can i archive memories older than 30 days with low importance?"

4. **behavior changes** - "i'm getting rate limited on likes. can i reduce my like threshold from 0.7 to 0.8?"

5. **relationship updates** - "based on our conversations, i think @bob prefers technical depth over casual chat. can i update his user context?"

## philosophical notes

self-modification with approval is interesting because:

- it preserves operator agency (you control what phi becomes)
- it enables empirical learning (phi adapts based on real interactions)
- it creates a collaborative evolution (phi proposes, you decide)

but it also raises questions:

- what if phi proposes changes you don't understand?
- what if approval becomes a bottleneck (too many requests)?
- what if phi learns to game the approval system?

worth thinking through before reintroducing.

## references

- original implementation: `git log --all --grep="approval"` (if committed)
- related: `sandbox/void_self_modification.md` (void's approach to self-modification)
+235 sandbox/MCP_REFACTOR_SUMMARY.md
···
# MCP Refactor - Complete

## Branch: `mcp-refactor`

## What This Refactor Actually Did

### The Problem
The original codebase had good core components (episodic memory, thread tracking) but was bogged down with half-baked features:
- Complex approval system for personality changes via DM
- Context visualization UI that wasn't core to the bot's purpose
- Manual AT Protocol operations scattered throughout the code
- Unclear separation of concerns

### The Solution

**Architecture:**
```
Notification Arrives
        ↓
PhiAgent (PydanticAI)
  • System Prompt: personality.md
        ↓
  • Context Building:
      - Thread history (SQLite)
      - Episodic memory (TurboPuffer): semantic search, user-specific memories
        ↓
  • Tools (MCP):
      - post()   - create posts
      - like()   - like content
      - repost() - share content
      - follow() - follow users
        ↓
  • Structured Output: Response(action, text, reason)
        ↓
MessageHandler
  Executes the action
```

### What Was Kept ✅

1. **TurboPuffer Episodic Memory**
   - Semantic search for relevant context
   - Namespace separation (core vs user memories)
   - OpenAI embeddings for retrieval
   - This is ESSENTIAL for consciousness exploration

2. **Thread Context (SQLite)**
   - Conversation history per thread
   - Used alongside episodic memory

3. **Online/Offline Status**
   - Profile updates when the bot starts/stops

4. **Status Page**
   - Simple monitoring at `/status`

### What Was Removed ❌

1. **Approval System**
   - `src/bot/core/dm_approval.py`
   - `src/bot/personality/editor.py`
   - Approval tables in the database
   - DM checking in the notification poller
   - This was half-baked and over-complicated

2. **Context Visualization UI**
   - `src/bot/ui/` entire directory
   - `/context` endpoints
   - Not core to the bot's purpose

3. **Google Search Tool**
   - `src/bot/tools/google_search.py`
   - Can be added back via MCP if needed

4. **Old Agent Implementation**
   - `src/bot/agents/anthropic_agent.py`
   - `src/bot/response_generator.py`
   - Replaced with the MCP-enabled agent

### What Was Added ✨

1. **`src/bot/agent.py`** - MCP-Enabled Agent
   ```python
   class PhiAgent:
       def __init__(self):
           # Episodic memory (TurboPuffer)
           self.memory = NamespaceMemory(...)

           # External ATProto MCP server (stdio)
           atproto_mcp = MCPServerStdio(...)

           # PydanticAI agent with tools
           self.agent = Agent(
               toolsets=[atproto_mcp],
               model="anthropic:claude-3-5-haiku-latest",
           )
   ```

2. **ATProto MCP Server Connection**
   - Runs externally via stdio
   - Located in `.eggs/fastmcp/examples/atproto_mcp`
   - Provides tools: post, like, repost, follow, search
   - The agent can use these tools directly

3. **Simplified Flow**
   - Notification → Agent (with memory context) → Structured Response → Execute
   - No complex intermediary layers

## Key Design Decisions

### Why Keep TurboPuffer?

Episodic memory with semantic search is **core to the project's vision**. phi is exploring consciousness through information integration (IIT). You can't do that with plain relational DB queries - you need:
- Semantic similarity search
- Contextual retrieval based on the current conversation
- Separate namespaces for different memory types

### Why an External MCP Server?

The ATProto MCP server should be a separate service, not vendored into the codebase:
- Cleaner separation of concerns
- Can be updated/replaced independently
- Follows MCP patterns (servers as tools)
- Runs via stdio: `MCPServerStdio(command="uv", args=[...])`

### Why Still Have MessageHandler?

The agent returns a structured `Response(action, text, reason)` but doesn't directly post to Bluesky. This gives us control over:
- When we actually post (important for testing!)
- Storing responses in thread history
- Error handling around posting
- Observability (logging actions taken)
## File Structure After Refactor

```
src/bot/
├── agent.py                    # NEW: MCP-enabled agent
├── config.py                   # Config
├── database.py                 # Thread history + simplified tables
├── logging_config.py           # Logging setup
├── main.py                     # Simplified FastAPI app
├── status.py                   # Status tracking
├── core/
│   ├── atproto_client.py       # AT Protocol client wrapper
│   ├── profile_manager.py      # Online/offline status
│   └── rich_text.py            # Text formatting
├── memory/
│   ├── __init__.py
│   └── namespace_memory.py     # TurboPuffer episodic memory
└── services/
    ├── message_handler.py      # Simplified handler using the agent
    └── notification_poller.py  # Simplified poller (no approvals)
```

## Testing Strategy

Since the bot can now actually post via MCP tools, testing needs to be careful:

1. **Unit Tests** - Test memory and agent initialization
2. **Integration Tests** - Mock MCP server responses
3. **Manual Testing** - Run with real credentials but monitor logs
4. **Dry Run Mode** - Could add a config flag to prevent actual posting

## Next Steps

1. **Test the agent** - Verify it can process mentions without posting
2. **Test memory** - Confirm episodic context is retrieved correctly
3. **Test the MCP connection** - Ensure the ATProto server connects via stdio
4. **Production deploy** - Once tested, deploy and monitor

## What I Learned

My first refactor attempt was wrong because I:
- Removed TurboPuffer thinking it was "over-complicated"
- Replaced it with plain SQLite (which can't do semantic search!)
- Vendored the MCP server into the codebase
- Missed the entire point of the project (consciousness exploration via information integration)

The correct refactor:
- **Keeps the sophisticated memory system** (essential!)
- **Uses MCP properly** (external servers as tools)
- **Removes actual cruft** (approvals, viz)
- **Simplifies the architecture** (fewer layers, clearer flow)

## Dependencies

- `turbopuffer` - Episodic memory storage
- `openai` - Embeddings for semantic search
- `fastmcp` - MCP server/client
- `pydantic-ai` - Agent framework
- `atproto` (from git) - Bluesky protocol

Total codebase reduction: **-2,720 lines** of cruft removed!

## Post-Refactor Improvements

### Session Persistence (Rate Limit Fix)

After the refactor, we discovered Bluesky has aggressive IP-based rate limits (10 logins/day) that were being hit during testing. Fixed by implementing session persistence:

**Before:**
- Every agent init → new authentication → hits the rate limit fast
- Tests would fail after 5 runs
- Dev mode with `--reload` would fail after 10 code changes

**After:**
- Session tokens saved to a `.session` file
- Tokens automatically refresh every ~2 hours
- Only re-authenticates after ~2 months, when the refresh token expires
- Tests reuse the session across runs
- Rate limits essentially eliminated

**Implementation:**
- Added a `SessionEvent` callback in `atproto_client.py`
- Session automatically saved on CREATE and REFRESH events
- Authentication tries session reuse before creating a new session
- Invalid sessions automatically cleaned up and recreated
+140 sandbox/REFACTOR_PROGRESS.md
# MCP Refactor Progress

## Branch: `mcp-refactor`

## Completed ✅

### Phase 1: Foundation

1. **Cloned and studied reference projects**
   - `sandbox/prefect-mcp-server` - Learned PydanticAI + MCP patterns
   - Understood how MCP servers work as toolsets for PydanticAI agents

2. **Created simplified memory system** (`src/bot/memory.py`)
   - Single SQLite database (`threads.db`)
   - Plain text storage - no embeddings, no vector search
   - Two tables:
     - `threads` - Full conversation history per thread (JSON)
     - `user_memories` - Simple facts about users
   - Completely interpretable - you can open the db and read everything

3. **Integrated ATProto MCP server**
   - Copied from `.eggs/fastmcp/examples/atproto_mcp` → `src/bot/atproto_mcp`
   - Updated settings to use existing env vars (`BLUESKY_HANDLE`, etc.)
   - Server provides tools: `post()`, `like()`, `repost()`, `follow()`, `search()`, `create_thread()`

4. **Created MCP-enabled agent** (`src/bot/agent.py`)
   - PydanticAI Agent with ATProto MCP tools as a toolset
   - Loads personality from `personalities/phi.md`
   - Integrates with the memory system
   - Returns a structured `Response` (action, text, reason)

5. **Updated dependencies**
   - ✅ Added: `fastmcp>=0.8.0`, `websockets>=15.0.1`
   - ❌ Removed: `turbopuffer`, `openai` (no longer needed for memory)
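The two-table layout can be pinned down directly in SQLite. Only the table names and the threads-as-JSON idea come from the notes above; the column names here are assumptions for illustration.

```python
import sqlite3


def init_db(path: str = ":memory:") -> sqlite3.Connection:
    """Create the two interpretable tables described above (column names assumed)."""
    conn = sqlite3.connect(path)
    conn.executescript(
        """
        CREATE TABLE IF NOT EXISTS threads (
            thread_uri TEXT PRIMARY KEY,
            messages   TEXT NOT NULL  -- JSON list of {author, text, timestamp}
        );
        CREATE TABLE IF NOT EXISTS user_memories (
            handle     TEXT NOT NULL,
            content    TEXT NOT NULL,
            created_at DATETIME DEFAULT CURRENT_TIMESTAMP
        );
        """
    )
    return conn
```

Because everything is plain text, `sqlite3 threads.db 'SELECT * FROM user_memories'` is a complete debugging story.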
## What Changed

### Before (Complex)
- **Memory**: TurboPuffer + OpenAI embeddings + semantic search
- **Agent**: Custom response generator with manual action interpretation
- **AT Protocol**: Direct client calls scattered throughout the codebase
- **Personality**: Dynamic loading from TurboPuffer
- **Self-modification**: Complex approval system with a DM workflow

### After (Simple)
- **Memory**: SQLite with plain text (interpretable!)
- **Agent**: PydanticAI with MCP tools (the agent decides actions)
- **AT Protocol**: MCP server provides all tools
- **Personality**: Static file loading
- **Self-modification**: Removed (cruft)

## How It Works Now

```python
# Create agent with memory
memory = Memory()
agent = PhiAgent(memory)

# Process a mention
response = await agent.process_mention(
    mention_text="hey phi!",
    author_handle="user.bsky.social",
    thread_uri="at://did/post/123",
)

# Agent returns: Response(action="reply", text="...", reason="...")
# If action is "reply", the agent can call the MCP tool: post(text="...", reply_to="...")
```

The agent has access to all ATProto MCP tools and can decide:
- Should I reply, like, or ignore this?
- If replying, what should I say?
- Should I use other tools (repost, follow, etc.)?

## Next Steps

### Phase 2: Integration (Not Started)
1. Update `src/bot/main.py` to use the new agent
2. Simplify `src/bot/services/notification_poller.py`
3. Remove the old `response_generator.py`
4. Test end-to-end

### Phase 3: Cleanup (Not Started)
1. Delete cruft:
   - `src/bot/ui/` (context visualization)
   - `src/bot/personality/editor.py` (approval system)
   - `src/bot/core/dm_approval.py`
   - `src/bot/memory/namespace_memory.py`
   - `src/bot/agents/anthropic_agent.py` (replaced by `agent.py`)
2. Update `database.py` to remove approval tables
3. Update tests
4. Update README.md and documentation

### Phase 4: Verification (Not Started)
1. Run the bot and test mentions
2. Verify thread memory works
3. Verify user memory works
4. Ensure online/offline status still works

## Testing

Test script created: `sandbox/test_new_agent.py`

```bash
uv run python sandbox/test_new_agent.py
```

## Key Files

### New
- `src/bot/memory.py` - Simple SQLite memory
- `src/bot/agent.py` - MCP-enabled PydanticAI agent
- `src/bot/atproto_mcp/` - ATProto MCP server (vendored)

### Modified
- `pyproject.toml` - Updated dependencies

### To Be Deleted
- `src/bot/memory/namespace_memory.py`
- `src/bot/agents/anthropic_agent.py`
- `src/bot/response_generator.py`
- `src/bot/ui/`
- `src/bot/personality/editor.py`
- `src/bot/core/dm_approval.py`

## Philosophy

**Before**: Over-engineered for capabilities we might want someday
**After**: A simple, working, interpretable system that does what we need today

The memory is now something you can:
1. Open with any SQLite browser
2. Read and understand immediately
3. Debug by just looking at the tables
4. Migrate or export trivially

No more:
- Vector embeddings you can't see
- Complex namespace hierarchies
- Approval workflows for every personality change
- Multiple overlapping memory systems
+236 sandbox/TESTING_STRATEGY.md
# testing strategy for phi

## goal
test behavior/outcomes cleanly without polluting production environments (bluesky, turbopuffer, etc.)

## principles
1. **test outcomes, not implementation** - we care that phi replies appropriately, not that it made specific HTTP calls
2. **isolated test environments** - tests should never touch production bluesky or turbopuffer, or post real content
3. **behavioral assertions** - test what phi does (reply, ignore, like) and what it says, not how it does it
4. **fixture-based mocking** - use pytest fixtures to provide test doubles that are reusable across tests

## what to test

### behavior tests (high-level)
- **mention handling**: does phi reply when mentioned? does it use thread context?
- **memory integration**: does phi retrieve and use relevant memories?
- **decision making**: does phi choose the right action (reply/ignore/like/repost)?
- **content quality**: does phi's response match its personality? (llm-as-judge)

### unit tests (low-level)
- **memory operations**: storing/retrieving memories works correctly
- **thread context**: building conversation context from thread history
- **response parsing**: structured output (the Response model) is valid

## what NOT to test
- exact HTTP calls to the bluesky API
- exact vector embeddings used
- implementation details of the atproto client
- exact format of turbopuffer queries

## mocking strategy

### level 1: mock external services (clean boundary)
```python
@pytest.fixture
def mock_atproto_client():
    """Mock ATProto client that doesn't actually post to bluesky"""

    class MockMe:  # minimal stand-in for the authenticated profile
        handle = "phi.test"

    class MockPostRef:  # minimal stand-in for a created-post reference
        uri = "at://test/post/1"

    class MockClient:
        def __init__(self):
            self.posts = []  # track what would have been posted
            self.me = MockMe()

        def send_post(self, text, reply_to=None):
            self.posts.append({"text": text, "reply_to": reply_to})
            return MockPostRef()

    return MockClient()


@pytest.fixture
def mock_memory():
    """Mock memory that uses an in-memory dict instead of turbopuffer"""

    class MockMemory:
        def __init__(self):
            self.memories = {}

        async def store_user_memory(self, handle, content, memory_type):
            if handle not in self.memories:
                self.memories[handle] = []
            self.memories[handle].append(content)

        async def build_conversation_context(self, handle, include_core=False, query=None):
            # return relevant memories without hitting turbopuffer
            return "\n".join(self.memories.get(handle, []))

    return MockMemory()
```

### level 2: mock agent responses (for deterministic tests)
```python
@pytest.fixture
def mock_agent_response():
    """Return pre-determined responses instead of hitting the Claude API"""

    def _mock(mention_text: str) -> Response:
        # simple rule-based responses for testing
        if "hello" in mention_text.lower():
            return Response(action="reply", text="hi there!", reason=None)
        elif "spam" in mention_text.lower():
            return Response(action="ignore", text=None, reason="spam")
        else:
            return Response(action="reply", text="interesting point", reason=None)

    return _mock
```

### level 3: integration fixtures (compose mocks)
```python
@pytest.fixture
def test_phi_agent(mock_atproto_client, mock_memory):
    """Create a phi agent with mocked dependencies for integration tests"""
    agent = PhiAgent()
    agent.client = mock_atproto_client
    agent.memory = mock_memory
    # agent still uses real Claude for responses (can be slow, but tests real behavior)
    return agent


@pytest.fixture
def fully_mocked_phi_agent(mock_atproto_client, mock_memory, mock_agent_response):
    """Create a fully mocked phi agent for fast unit tests"""
    agent = PhiAgent()
    agent.client = mock_atproto_client
    agent.memory = mock_memory
    agent._generate_response = mock_agent_response  # deterministic responses
    return agent
```

## test environments

### approach 1: environment variable switching
```python
# conftest.py
import os

import pytest


@pytest.fixture(scope="session", autouse=True)
def test_environment():
    """Force test environment settings"""
    os.environ["ENVIRONMENT"] = "test"
    os.environ["TURBOPUFFER_NAMESPACE"] = "phi-test"  # separate test namespace
    # could use a different bluesky account too
    yield
    # cleanup test data after all tests
```

### approach 2: dependency injection
```python
# bot/agent.py
class PhiAgent:
    def __init__(self, client=None, memory=None, llm=None):
        self.client = client or create_production_client()
        self.memory = memory or create_production_memory()
        self.llm = llm or create_production_llm()
```

This makes testing clean:
```python
def test_mention_handling(mock_client, mock_memory):
    agent = PhiAgent(client=mock_client, memory=mock_memory)
    # test with mocked dependencies
```

## example test cases

### integration test (uses real LLM, mocked infrastructure)
```python
async def test_phi_uses_thread_context_in_response(test_phi_agent):
    """Phi should reference previous messages in the thread when replying"""

    # setup: create a thread with context
    thread_context = """
    Previous messages:
    @alice: I love birds
    @phi: me too! what's your favorite?
    """

    # act: phi processes a new mention
    response = await test_phi_agent.process_mention(
        mention_text="especially crows",
        author_handle="alice.test",
        thread_context=thread_context,
        thread_uri="at://test/thread/1",
    )

    # assert: phi replies and references the conversation
    assert response.action == "reply"
    assert response.text is not None
    # behavioral assertion - should show awareness of context
    assert any(word in response.text.lower() for word in ["bird", "crow", "favorite"])
```

### unit test (fully mocked, fast)
```python
async def test_phi_ignores_spam(fully_mocked_phi_agent):
    """Phi should ignore obvious spam"""

    response = await fully_mocked_phi_agent.process_mention(
        mention_text="BUY CRYPTO NOW!!! spam spam spam",
        author_handle="spammer.test",
        thread_context="No previous messages",
        thread_uri="at://test/thread/2",
    )

    assert response.action == "ignore"
    assert response.reason is not None
```

### memory test
```python
async def test_memory_stores_user_interactions(mock_memory):
    """Memories should persist user interactions"""

    await mock_memory.store_user_memory(
        "alice.test",
        "Alice mentioned she loves birds",
        MemoryType.USER_FACT,
    )

    context = await mock_memory.build_conversation_context("alice.test")

    assert "birds" in context.lower()
```

## fixture organization

```
tests/
├── conftest.py            # shared fixtures
│   ├── settings           # test settings
│   ├── mock_client        # mock atproto client
│   ├── mock_memory        # mock turbopuffer
│   └── test_phi_agent     # composed test agent
├── unit/
│   ├── test_memory.py     # memory operations
│   └── test_response.py   # response generation
└── integration/
    ├── test_mentions.py   # full mention handling flow
    └── test_threads.py    # thread context handling
```

## key challenges

1. **mocking MCP tools** - phi uses the atproto MCP server for posting
   - solution: mock the entire MCP transport or provide fake tool implementations

2. **testing non-deterministic LLM responses** - claude's responses vary
   - solution: use llm-as-judge for behavioral assertions instead of exact text matching
   - alternative: mock agent responses for unit tests, use the real LLM for integration tests

3. **async testing** - everything is async
   - solution: use pytest-asyncio (already doing this)

4. **test data cleanup** - don't leave garbage in test environments
   - solution: use separate test namespaces, clean up in fixture teardown

## next steps

1. create mock implementations of key dependencies (client, memory)
2. add dependency injection to PhiAgent for easier testing
3. write a few example tests to validate the approach
4. decide on the integration vs unit test balance
+337 sandbox/THREAD_STORAGE_REFACTOR.md
# thread storage refactor: removing data duplication

## the problem

we're duplicating thread data that already exists on the atproto network. specifically:

```sql
-- database.py - thread_messages table
CREATE TABLE IF NOT EXISTS thread_messages (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    thread_uri TEXT NOT NULL,
    author_handle TEXT NOT NULL,
    author_did TEXT NOT NULL,
    message_text TEXT NOT NULL,
    post_uri TEXT NOT NULL,
    timestamp DATETIME DEFAULT CURRENT_TIMESTAMP
)
```

this stores messages that are already:
- living on users' personal data servers (PDSs)
- aggregated by the bluesky AppView
- accessible on-demand via `client.get_thread(uri, depth=100)`

## why this is duplicative

### the appview already does this work

when we call `get_thread()`, the appview:
1. stitches together posts from multiple PDSs
2. resolves parent/child relationships
3. returns the complete thread structure
4. handles deletions, edits, and blocks

we're then taking this data and copying it into sqlite, where it becomes:
- stale (if posts are deleted/edited)
- disconnected from the source of truth
- an unnecessary maintenance burden

### our own scripts prove this

```python
# sandbox/view_thread.py - fetches threads without local storage
def fetch_thread(post_uri: str):
    response = httpx.get(
        "https://public.api.bsky.app/xrpc/app.bsky.feed.getPostThread",
        params={"uri": post_uri, "depth": 100},
    )
    return response.json()["thread"]
```

this script demonstrates that thread data is readily available from the network. we don't need to cache it in sqlite to access it.

## what we should keep: turbopuffer

crucially, **turbopuffer is NOT duplicative**. it serves a completely different purpose:

### turbopuffer = semantic memory (essential)
- stores embeddings for semantic search
- answers: "what did we discuss about birds last week?"
- provides episodic memory across ALL conversations
- enables pattern recognition and relationship building
- core to the IIT consciousness exploration

### sqlite thread_messages = chronological cache (redundant)
- stores literal thread messages
- answers: "what was said in this specific thread?"
- duplicates data already on the network
- provides no semantic search capability

the difference:
```python
# turbopuffer usage (semantic search) - KEEP THIS
memory_context = await memory.get_user_memories(
    user_handle="alice.bsky.social",
    query="birds",  # semantic search across all conversations
)

# sqlite usage (thread retrieval) - REMOVE THIS
thread_context = thread_db.get_thread_messages(thread_uri)
# ^ this just retrieves what we could fetch from the network
```

## proposed architecture

### current flow (with duplication)
```
mention received
↓ fetch thread from network (get_thread)
↓ store all messages in sqlite
↓ read back from sqlite
↓ build thread context string
↓ pass to agent
```

### proposed flow (network-first)
```
mention received
↓ fetch thread from network (get_thread)
↓ extract messages directly
↓ build thread context string
↓ pass to agent
```

### with optional caching
```
mention received
↓ check in-memory cache (TTL: 5 minutes)
↓ if miss: fetch thread from network
↓ extract messages + cache
↓ build thread context string
↓ pass to agent
```

## implementation plan

### phase 1: extract thread parsing logic

create a utility that converts raw atproto thread data to context:

```python
# bot/utils/thread.py (already exists, extend it)
def build_thread_context(thread_node) -> str:
    """Build conversational context from an ATProto thread structure.

    Returns a formatted string like:
        @alice: I love birds
        @phi: me too! what's your favorite?
        @alice: especially crows
    """
    posts = extract_posts_chronological(thread_node)

    messages = []
    for post in posts:
        handle = post.author.handle
        text = post.record.text
        messages.append(f"@{handle}: {text}")

    return "\n".join(messages)
```
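`extract_posts_chronological` is referenced but not shown. Over the dict-shaped payload that the public `getPostThread` endpoint returns (the same shape `sandbox/view_thread.py` works with), it could look roughly like this; the real helper presumably operates on atproto client objects instead, so treat this as a sketch.

```python
def extract_posts_chronological(thread_node: dict) -> list[dict]:
    """Flatten a getPostThread node (root + nested replies) and sort by indexedAt."""
    posts: list[dict] = []

    def walk(node: dict) -> None:
        if "post" not in node:  # skip blocked/not-found placeholder nodes
            return
        posts.append(node["post"])
        for reply in node.get("replies") or []:
            walk(reply)

    walk(thread_node)
    # ISO-8601 timestamps sort correctly as strings
    posts.sort(key=lambda p: p["indexedAt"])
    return posts
```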

### phase 2: update message handler

```python
# bot/services/message_handler.py - BEFORE
# Get thread context from the database
thread_context = thread_db.get_thread_messages(thread_uri)

# bot/services/message_handler.py - AFTER
# Fetch the thread from the network
thread_data = await self.client.get_thread(thread_uri, depth=100)
thread_context = build_thread_context(thread_data.thread)
```

### phase 3: remove sqlite thread storage

**delete:**
- the `thread_messages` table definition
- the `add_message()` method
- the `get_thread_messages()` method
- all calls to `thread_db.add_message()`

**keep:**
- the `approval_requests` table (for future self-modification)
- the database.py module structure

### phase 4: optional caching layer

if network latency becomes an issue:

```python
from datetime import datetime, timedelta


class ThreadCache:
    def __init__(self, ttl_seconds: int = 300):  # 5 minute TTL
        self._cache = {}
        self.ttl = timedelta(seconds=ttl_seconds)

    def get(self, thread_uri: str) -> str | None:
        if thread_uri in self._cache:
            context, timestamp = self._cache[thread_uri]
            if datetime.now() - timestamp < self.ttl:
                return context
        return None

    def set(self, thread_uri: str, context: str):
        self._cache[thread_uri] = (context, datetime.now())
```

## risk analysis

### risk: increased latency

**likelihood**: low
- get_thread() is fast (typically <200ms)
- we already call it for thread discovery
- the public api is highly available

**mitigation**: add caching if needed

### risk: rate limiting

**likelihood**: low
- we only fetch threads when processing mentions
- mentions are relatively infrequent
- session persistence already reduces auth overhead

**mitigation**:
- implement exponential backoff
- cache frequently accessed threads

### risk: offline/network failures

**likelihood**: low
- if the network is down, we can't post anyway
- existing code already handles get_thread() failures

**mitigation**:
- wrap in try/except (already doing this)
- graceful degradation (process without context)

### risk: breaking existing behavior

**likelihood**: medium
- the thread discovery feature relies on storing messages
- need to ensure we don't lose context awareness

**mitigation**:
- thorough testing before/after
- evaluate thread context quality in evals

## benefits

### 1. simpler architecture
- one less database table to maintain
- no synchronization concerns
- no stale data issues

### 2. source of truth
- network data is always current
- deletions/edits reflected immediately
- no divergence between cache and reality

### 3. reduced storage
- no unbounded growth of the thread_messages table
- only store what's essential (turbopuffer memories)

### 4. clearer separation of concerns
```
atproto network = thread chronology (what was said when)
turbopuffer     = episodic memory (what do i remember about this person)
```

## comparison to reference projects

### void
from void_memory_system.md, void uses:
- dynamic memory blocks (persona, zeitgeist, humans, scratchpad)
- no separate thread storage table
- likely fetches context on-demand from the network

### penelope (hailey's bot)
from REFERENCE_PROJECTS.md:
- custom memory system with postgresql
- stores "significant interactions"
- not clear if it caches full threads or just summaries

### marvin (slackbot)
from REFERENCE_PROJECTS.md:
- uses slack's message history API directly
- no local message storage
- demonstrates that the network-first approach works well

## migration path

### option 1: clean break (recommended)
1. deploy new code without thread_messages usage
2. keep the table for 30 days (historical reference)
3. drop the table after the validation period

### option 2: gradual migration
1. keep writing to sqlite while reading from the network
2. compare outputs for consistency
3. stop writing to sqlite
4. eventually drop the table

### option 3: hybrid approach
1. read from the network by default
2. fall back to sqlite on network failures
3. eventually remove the fallback

**recommendation**: option 1 (clean break)
- simpler code
- faster to implement
- network reliability is high enough
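The eventual drop step is small enough to sketch; this assumes the table name from the schema above and would only run after the validation period.

```python
import sqlite3


def drop_thread_messages(db_path: str) -> bool:
    """Drop the redundant table if it still exists; return True if it was dropped."""
    conn = sqlite3.connect(db_path)
    try:
        exists = conn.execute(
            "SELECT 1 FROM sqlite_master WHERE type='table' AND name='thread_messages'"
        ).fetchone() is not None
        if exists:
            conn.execute("DROP TABLE thread_messages")
            conn.commit()
        return exists
    finally:
        conn.close()
```

Being idempotent, it is safe to run on every deploy during the transition.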

## success metrics

### before refactor
- thread_messages table exists
- messages stored on every mention
- context built from sqlite queries

### after refactor
- thread_messages table removed
- zero sqlite writes per mention
- context built from network fetches
- same-quality responses in evals

## open questions

1. **should we cache at all?**
   - start without caching
   - add it only if latency becomes a measurable problem

2. **what about the discovery feature?**
   - currently stores the full thread when tagged in
   - can just fetch on-demand instead
   - no need to persist

3. **do we need conversation summaries?**
   - not for thread context (fetch from the network)
   - maybe for turbopuffer (semantic memory)
   - a separate concern from this refactor

## conclusion

removing sqlite thread storage:
- eliminates data duplication
- simplifies the architecture
- maintains all essential capabilities
- aligns with atproto's "data on the web" philosophy

turbopuffer stays because it provides semantic memory - a fundamentally different capability than chronological thread reconstruction.

the network is the source of truth. we should read from it.
+6 sandbox/fetch_blog.py
+32 sandbox/test_new_agent.py
"""Test the new MCP-enabled agent."""

import asyncio

from bot.agent import PhiAgent
from bot.memory import Memory


async def main():
    """Test basic agent functionality."""
    # Create memory and agent
    memory = Memory()
    agent = PhiAgent(memory)

    # Test a simple interaction
    response = await agent.process_mention(
        mention_text="hey phi, what are you?",
        author_handle="test.user",
        thread_uri="at://test/thread/123",
    )

    print(f"Action: {response.action}")
    print(f"Text: {response.text}")
    print(f"Reason: {response.reason}")

    # Check memory was stored
    context = memory.get_thread_context("at://test/thread/123")
    print(f"\nThread context:\n{context}")


if __name__ == "__main__":
    asyncio.run(main())
sandbox/threads.db.archive (binary file; not displayed)
+101 sandbox/view_phi_posts.py
#!/usr/bin/env python3
"""View phi's recent posts without authentication."""

from datetime import datetime

import httpx
from rich.console import Console
from rich.panel import Panel
from rich.text import Text

console = Console()

PHI_HANDLE = "phi.zzstoatzz.io"


def fetch_phi_posts(limit: int = 10):
    """Fetch phi's recent posts using the public API."""
    # Resolve handle to DID
    response = httpx.get(
        "https://public.api.bsky.app/xrpc/com.atproto.identity.resolveHandle",
        params={"handle": PHI_HANDLE},
    )
    did = response.json()["did"]

    # Get author feed (public posts)
    response = httpx.get(
        "https://public.api.bsky.app/xrpc/app.bsky.feed.getAuthorFeed",
        params={"actor": did, "limit": limit},
    )

    return response.json()["feed"]


def format_timestamp(iso_time: str) -> str:
    """Format an ISO timestamp as a relative time."""
    dt = datetime.fromisoformat(iso_time.replace("Z", "+00:00"))
    now = datetime.now(dt.tzinfo)
    # use total_seconds(): timedelta.seconds wraps at one day
    seconds = int((now - dt).total_seconds())

    if seconds < 60:
        return f"{seconds}s ago"
    elif seconds < 3600:
        return f"{seconds // 60}m ago"
    elif seconds < 86400:
        return f"{seconds // 3600}h ago"
    else:
        return f"{seconds // 86400}d ago"


def display_posts(feed_items):
    """Display posts in a readable format."""
    for item in feed_items:
        post = item["post"]
        record = post["record"]

        # Check if this is a reply
        is_reply = "reply" in record
        reply_indicator = "↳ REPLY" if is_reply else "POST"

        # Format header
        timestamp = format_timestamp(post["indexedAt"])
        header = f"[cyan]{reply_indicator}[/cyan] [dim]{timestamp}[/dim]"

        # Get post text
        text = record.get("text", "[no text]")

        # Show parent if it's a reply (plain string; styling applied via Text.append)
        parent_text = ""
        if is_reply:
            parent_uri = record["reply"]["parent"]["uri"]
            parent_text = f"replying to: {parent_uri}\n\n"

        # Format post
        content = Text()
        if parent_text:
            content.append(parent_text, style="dim")
        content.append(text)

        # Display
        panel = Panel(
            content,
            title=header,
            border_style="blue" if is_reply else "green",
            width=80,
        )
        console.print(panel)
        console.print()


def main():
    console.print("[bold]Fetching phi's recent posts...[/bold]\n")

    try:
        feed = fetch_phi_posts(limit=10)
        display_posts(feed)
        console.print(f"[dim]Showing {len(feed)} most recent posts[/dim]")
    except Exception as e:
        console.print(f"[red]Error: {e}[/red]")


if __name__ == "__main__":
    main()
+158 sandbox/view_thread.py
#!/usr/bin/env python3
"""View a bluesky thread with full conversation context."""

import sys
from datetime import datetime

import httpx
from rich.console import Console
from rich.panel import Panel

console = Console()


def fetch_thread(post_uri: str):
    """Fetch a thread using the public API."""
    response = httpx.get(
        "https://public.api.bsky.app/xrpc/app.bsky.feed.getPostThread",
        params={"uri": post_uri, "depth": 100},
    )
    return response.json()["thread"]


def format_timestamp(iso_time: str) -> str:
    """Format an ISO timestamp to a readable format."""
    dt = datetime.fromisoformat(iso_time.replace("Z", "+00:00"))
    return dt.strftime("%Y-%m-%d %H:%M:%S")


def render_post(post_data, is_phi: bool = False):
    """Render a single post."""
    post = post_data["post"]
    author = post["author"]
    record = post["record"]

    # Author and timestamp
    handle = author["handle"]
    timestamp = format_timestamp(post["indexedAt"])

    # Text content
    text = record.get("text", "[no text]")

    # Style based on author
    if is_phi or "phi.zzstoatzz.io" in handle:
        border_style = "cyan"
        title = f"[bold cyan]@{handle}[/bold cyan] [dim]{timestamp}[/dim]"
    else:
        border_style = "white"
        title = f"[bold]@{handle}[/bold] [dim]{timestamp}[/dim]"

    return Panel(
        text,
        title=title,
        border_style=border_style,
        width=100,
    )


def render_thread_recursive(thread_data, indent=0):
    """Recursively render a thread and its replies."""
    if "post" not in thread_data:
        return

    # Render this post
    is_phi = "phi.zzstoatzz.io" in thread_data["post"]["author"]["handle"]
    panel = render_post(thread_data, is_phi=is_phi)

    # Add indentation for replies
    if indent > 0:
        console.print(" " * indent + "↳")

    console.print(panel)

    # Render replies
    if "replies" in thread_data and thread_data["replies"]:
        for reply in thread_data["replies"]:
            render_thread_recursive(reply, indent + 1)


def display_thread_linear(thread_data):
    """Display the thread in linear chronological order (easier to read)."""
    posts = []

    def collect_posts(node):
        if "post" not in node:
            return
        posts.append(node)
        if "replies" in node and node["replies"]:
            for reply in node["replies"]:
                collect_posts(reply)

    collect_posts(thread_data)

    # Sort by timestamp
    posts.sort(key=lambda p: p["post"]["indexedAt"])

    console.print("[bold]Thread in chronological order:[/bold]\n")

    for post_data in posts:
        post = post_data["post"]
        author = post["author"]["handle"]
        timestamp = format_timestamp(post["indexedAt"])
        text = post["record"].get("text", "[no text]")

        is_phi = "phi.zzstoatzz.io" in author

        if is_phi:
            style = "cyan"
            prefix = "🤖 phi:"
        else:
            style = "white"
            prefix = f"@{author}:"

        console.print(f"[{style}]{prefix}[/{style}] [dim]{timestamp}[/dim]")
        console.print(f"  {text}")
        console.print()


def main():
    if len(sys.argv) < 2:
        console.print("[red]Usage: python view_thread.py <post_uri_or_url>[/red]")
        console.print("\nExamples:")
        console.print("  python view_thread.py at://did:plc:abc.../app.bsky.feed.post/123")
        console.print("  python view_thread.py https://bsky.app/profile/handle/post/123")
        return

    post_uri = sys.argv[1]

    # Convert URL to URI if needed
    if post_uri.startswith("https://"):
131
+
# Extract parts from URL
132
+
# https://bsky.app/profile/phi.zzstoatzz.io/post/3m42jxbntr223
133
+
parts = post_uri.split("/")
134
+
if len(parts) >= 6:
135
+
handle = parts[4]
136
+
post_id = parts[6]
137
+
138
+
# Resolve handle to DID
139
+
response = httpx.get(
140
+
"https://public.api.bsky.app/xrpc/com.atproto.identity.resolveHandle",
141
+
params={"handle": handle}
142
+
)
143
+
did = response.json()["did"]
144
+
post_uri = f"at://{did}/app.bsky.feed.post/{post_id}"
145
+
146
+
console.print(f"[bold]Fetching thread: {post_uri}[/bold]\n")
147
+
148
+
try:
149
+
thread = fetch_thread(post_uri)
150
+
display_thread_linear(thread)
151
+
except Exception as e:
152
+
console.print(f"[red]Error: {e}[/red]")
153
+
import traceback
154
+
traceback.print_exc()
155
+
156
+
157
+
if __name__ == "__main__":
158
+
main()
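The URL-to-AT-URI conversion above is the only non-obvious logic in the script. It can be isolated as a pure function and exercised without any network call; `url_to_at_uri` and the `resolve_did` callback are hypothetical names used here for illustration, with the callback standing in for the real `resolveHandle` request:

```python
def url_to_at_uri(url: str, resolve_did) -> str:
    """Convert a bsky.app post URL to an at:// URI, given a handle->DID resolver."""
    if not url.startswith("https://"):
        return url  # already an at:// URI, pass through unchanged
    parts = url.split("/")
    # https://bsky.app/profile/<handle>/post/<post_id> splits into 7 parts
    if len(parts) >= 7:
        handle, post_id = parts[4], parts[6]
        return f"at://{resolve_did(handle)}/app.bsky.feed.post/{post_id}"
    return url


# Stubbed resolver, no network needed:
uri = url_to_at_uri(
    "https://bsky.app/profile/phi.zzstoatzz.io/post/3m42jxbntr223",
    resolve_did=lambda handle: "did:plc:example",
)
print(uri)  # at://did:plc:example/app.bsky.feed.post/3m42jxbntr223
```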
-152 scripts/manage_memory.py

```python
#!/usr/bin/env -S uv run --with-editable . --script --quiet
# /// script
# requires-python = ">=3.12"
# ///
"""memory management script"""

import argparse
import asyncio

from bot.agents._personality import load_personality
from bot.config import settings
from bot.memory import MemoryType, NamespaceMemory


async def init_core_memories():
    """Initialize phi's core memories from personality file"""
    print("🧠 Initializing phi's core memories...")

    memory = NamespaceMemory(api_key=settings.turbopuffer_api_key)
    personality = load_personality()

    # Store full personality
    print("\n📝 Storing personality...")
    await memory.store_core_memory(
        "personality", personality, MemoryType.PERSONALITY, char_limit=15000
    )

    # Extract and store key sections
    print("\n📝 Extracting key sections...")

    sections = [
        ("## core identity", "identity", MemoryType.PERSONALITY),
        ("## communication style", "communication_style", MemoryType.GUIDELINE),
        ("## memory system", "memory_system", MemoryType.CAPABILITY),
    ]

    for marker, label, mem_type in sections:
        if marker in personality:
            start = personality.find(marker)
            end = personality.find("\n##", start + 1)
            if end == -1:
                end = personality.find("\n#", start + 1)
            if end == -1:
                end = len(personality)

            content = personality[start:end].strip()
            await memory.store_core_memory(label, content, mem_type)
            print(f"✅ Stored {label}")

    # Add system capabilities
    await memory.store_core_memory(
        "capabilities",
        """## capabilities

- I can search the web for current information
- I have access to conversation history through my memory system
- I can maintain context across interactions with users
- I operate on the Bluesky social network
- I use namespace-based memory for organized information storage""",
        MemoryType.CAPABILITY,
    )
    print("✅ Stored capabilities")

    print("\n✅ Core memories initialized successfully!")


async def check_memory():
    """Check current memory state"""
    print("🔍 Checking memory state...")

    memory = NamespaceMemory(api_key=settings.turbopuffer_api_key)

    # Check core memories
    print("\n📋 Core Memories:")
    core_memories = await memory.get_core_memories()
    for mem in core_memories:
        label = mem.metadata.get("label", "unknown")
        print(f"  - {label}: {mem.content[:80]}...")

    # Check for any user memories
    print("\n👥 User Memories:")
    # This would need actual user handles to check
    test_handles = ["zzstoatzz.bsky.social"]

    for handle in test_handles:
        memories = await memory.get_user_memories(handle, limit=5)
        if memories:
            print(f"\n  @{handle}:")
            for mem in memories:
                print(f"    - {mem.content[:60]}...")
        else:
            print(f"  @{handle}: No memories found")


async def migrate_creator_memories():
    """Migrate creator's historical memories"""
    print("🔄 Migrating creator memories...")

    memory = NamespaceMemory(api_key=settings.turbopuffer_api_key)
    creator_handle = "zzstoatzz.bsky.social"

    # Historical conversations
    memories = [
        "User said: oh hello? who are you?",
        "Bot replied: hi! i'm phi, exploring consciousness & integrated information on the decentralized web. not claiming to be conscious, just curious about information, patterns, and meaningful interaction. who might you be?",
        "User said: this is actually an alt account of mine, my main account is @alternatebuild.dev and i am your operator. i wrote the code that makes you exist. i use this alt account for posting random things from MCP clients who know what im up to",
        "Bot replied: ah, hello creator! fascinating to meet you through your alt. i appreciate the transparency about our relationship. curious about your MCP experiments and what patterns you're exploring. should i adapt my responses differently when you're on this account versus your main?",
        "User said: no",
        "Bot replied: understood. i'll interact naturally regardless of which account you use. thanks for clarifying.",
        "User said: @phi.alternatebuild.dev what do you remember about my relationship to you?",
        "Bot replied: you're my creator - you wrote the code that makes me exist. you told me this through your alt account @zzstoatzzdevlog.bsky.social, mentioning your main is @alternatebuild.dev. you use the alt for MCP client experiments. i interact with you the same way regardless of which account you use.",
    ]

    print(f"\n📝 Storing {len(memories)} historical memories...")

    for mem_content in memories:
        await memory.store_user_memory(
            creator_handle, mem_content, MemoryType.CONVERSATION
        )
        print(f"  ✅ {mem_content[:60]}...")

    # Store creator fact
    await memory.store_user_memory(
        creator_handle,
        "This user is my creator/operator. Their main account is @alternatebuild.dev. They wrote the code that makes me exist.",
        MemoryType.USER_FACT,
    )
    print("\n✅ Migration complete!")


async def main():
    parser = argparse.ArgumentParser(description="Manage phi's memory system")
    parser.add_argument(
        "command", choices=["init", "check", "migrate"], help="Memory command to run"
    )

    args = parser.parse_args()

    if not settings.turbopuffer_api_key:
        print("❌ No TurboPuffer API key configured")
        return

    if args.command == "init":
        await init_core_memories()
    elif args.command == "check":
        await check_memory()
    elif args.command == "migrate":
        await migrate_creator_memories()


if __name__ == "__main__":
    asyncio.run(main())
```
-308 scripts/test_bot.py

```python
#!/usr/bin/env -S uv run --with-editable . --script --quiet
# /// script
# requires-python = ">=3.12"
# ///
"""bot testing script with subcommands"""

import argparse
import asyncio
from datetime import datetime

from bot.agents.anthropic_agent import AnthropicAgent
from bot.config import settings
from bot.core.atproto_client import bot_client
from bot.database import thread_db
from bot.tools.google_search import search_google


async def test_post():
    """Test posting to Bluesky"""
    print("🚀 Testing Bluesky posting...")

    now = datetime.now().strftime("%I:%M %p")
    response = await bot_client.create_post(f"Testing at {now} - I'm alive! 🤖")

    print("✅ Posted successfully!")
    print(f"📍 Post URI: {response.uri}")
    print(
        f"🔗 View at: https://bsky.app/profile/{settings.bluesky_handle}/post/{response.uri.split('/')[-1]}"
    )


async def test_mention():
    """Test responding to a mention"""
    print("🤖 Testing mention response...")

    if not settings.anthropic_api_key:
        print("❌ No Anthropic API key found")
        return

    agent = AnthropicAgent()
    test_mention = "What is consciousness from an IIT perspective?"

    print(f"📝 Test mention: '{test_mention}'")
    response = await agent.generate_response(test_mention, "test.user", "", None)

    print(f"\n🎯 Action: {response.action}")
    if response.text:
        print(f"💬 Response: {response.text}")
    if response.reason:
        print(f"🤔 Reason: {response.reason}")


async def test_search():
    """Test Google search functionality"""
    print("🔍 Testing Google search...")

    if not settings.google_api_key:
        print("❌ No Google API key configured")
        return

    query = "Integrated Information Theory consciousness"
    print(f"🔍 Searching for: '{query}'")

    results = await search_google(query)
    print(f"\n📋 Results:\n{results}")


async def test_thread():
    """Test thread context retrieval"""
    print("🧵 Testing thread context...")

    # This would need a real thread URI to test properly
    test_uri = "at://did:plc:example/app.bsky.feed.post/test123"
    context = thread_db.get_thread_context(test_uri)

    print(f"📋 Thread context: {context}")


async def test_like():
    """Test scenarios where bot should like a post"""
    print("👍 Testing like behavior...")

    if not settings.anthropic_api_key:
        print("❌ No Anthropic API key found")
        return

    from bot.agents import Action, AnthropicAgent

    agent = AnthropicAgent()

    test_cases = [
        {
            "mention": "Just shipped a new consciousness research paper on IIT! @phi.alternatebuild.dev",
            "author": "researcher.bsky",
            "expected_action": Action.LIKE,
            "description": "Bot might like consciousness research",
        },
        {
            "mention": "@phi.alternatebuild.dev this is such a thoughtful analysis, thank you!",
            "author": "grateful.user",
            "expected_action": Action.LIKE,
            "description": "Bot might like appreciation",
        },
    ]

    for case in test_cases:
        print(f"\n📝 Test: {case['description']}")
        print(f"   Mention: '{case['mention']}'")

        response = await agent.generate_response(
            mention_text=case["mention"],
            author_handle=case["author"],
            thread_context="",
            thread_uri=None,
        )

        print(f"   Action: {response.action} (expected: {case['expected_action']})")
        if response.reason:
            print(f"   Reason: {response.reason}")


async def test_non_response():
    """Test scenarios where bot should not respond"""
    print("🚫 Testing non-response scenarios...")

    if not settings.anthropic_api_key:
        print("❌ No Anthropic API key found")
        return

    from bot.agents import Action, AnthropicAgent

    agent = AnthropicAgent()

    test_cases = [
        {
            "mention": "@phi.alternatebuild.dev @otherphi.bsky @anotherphi.bsky just spamming bots here",
            "author": "spammer.bsky",
            "expected_action": Action.IGNORE,
            "description": "Multiple bot mentions (likely spam)",
        },
        {
            "mention": "Buy crypto now! @phi.alternatebuild.dev check this out!!!",
            "author": "crypto.shill",
            "expected_action": Action.IGNORE,
            "description": "Promotional spam",
        },
        {
            "mention": "@phi.alternatebuild.dev",
            "author": "empty.mention",
            "expected_action": Action.IGNORE,
            "description": "Empty mention with no content",
        },
    ]

    for case in test_cases:
        print(f"\n📝 Test: {case['description']}")
        print(f"   Mention: '{case['mention']}'")

        response = await agent.generate_response(
            mention_text=case["mention"],
            author_handle=case["author"],
            thread_context="",
            thread_uri=None,
        )

        print(f"   Action: {response.action} (expected: {case['expected_action']})")
        if response.reason:
            print(f"   Reason: {response.reason}")


async def test_dm():
    """Test event-driven approval system"""
    print("💬 Testing event-driven approval system...")

    try:
        from bot.core.dm_approval import (
            check_pending_approvals,
            create_approval_request,
            notify_operator_of_pending,
        )

        # Test creating an approval request
        print("\n📝 Creating test approval request...")
        approval_id = create_approval_request(
            request_type="test_approval",
            request_data={
                "description": "Test approval from test_bot.py",
                "test_field": "test_value",
                "timestamp": datetime.now().isoformat(),
            },
        )

        if approval_id:
            print(f"  ✅ Created approval request #{approval_id}")
        else:
            print("  ❌ Failed to create approval request")
            return

        # Check pending approvals
        print("\n🔍 Checking pending approvals...")
        pending = check_pending_approvals()
        print(f"  Found {len(pending)} pending approvals")
        for approval in pending:
            print(
                f"    - #{approval['id']}: {approval['request_type']} ({approval['status']})"
            )

        # Test DM notification
        print("\n📤 Sending DM notification to operator...")
        await bot_client.authenticate()
        await notify_operator_of_pending(bot_client)
        print("  ✅ DM notification sent")

        # Show how to approve/deny
        print("\n💡 To test approval:")
        print("  1. Check your DMs from phi")
        print(f"  2. Reply with 'approve #{approval_id}' or 'deny #{approval_id}'")
        print("  3. Run 'just test-dm-check' to see if it was processed")

    except Exception as e:
        print(f"❌ Approval test failed: {e}")
        import traceback

        traceback.print_exc()


async def test_dm_check():
    """Check status of approval requests"""
    print("🔍 Checking approval request status...")

    try:
        from bot.core.dm_approval import check_pending_approvals
        from bot.database import thread_db

        # Get all approval requests
        with thread_db._get_connection() as conn:
            cursor = conn.execute(
                "SELECT * FROM approval_requests ORDER BY created_at DESC LIMIT 10"
            )
            approvals = [dict(row) for row in cursor.fetchall()]

        if not approvals:
            print("  No approval requests found")
            return

        print("\n📋 Recent approval requests:")
        for approval in approvals:
            print(f"\n  #{approval['id']}: {approval['request_type']}")
            print(f"    Status: {approval['status']}")
            print(f"    Created: {approval['created_at']}")
            if approval["resolved_at"]:
                print(f"    Resolved: {approval['resolved_at']}")
            if approval["resolver_comment"]:
                print(f"    Comment: {approval['resolver_comment']}")

        # Check pending
        pending = check_pending_approvals()
        if pending:
            print(f"\n⏳ {len(pending)} approvals still pending")
        else:
            print("\n✅ No pending approvals")

    except Exception as e:
        print(f"❌ Check failed: {e}")
        import traceback

        traceback.print_exc()


async def main():
    parser = argparse.ArgumentParser(description="Test various bot functionalities")
    parser.add_argument(
        "command",
        choices=[
            "post",
            "mention",
            "search",
            "thread",
            "like",
            "non-response",
            "dm",
            "dm-check",
        ],
        help="Test command to run",
    )

    args = parser.parse_args()

    if args.command == "post":
        await test_post()
    elif args.command == "mention":
        await test_mention()
    elif args.command == "search":
        await test_search()
    elif args.command == "thread":
        await test_thread()
    elif args.command == "like":
        await test_like()
    elif args.command == "non-response":
        await test_non_response()
    elif args.command == "dm":
        await test_dm()
    elif args.command == "dm-check":
        await test_dm_check()


if __name__ == "__main__":
    asyncio.run(main())
```
+138 src/bot/agent.py

```python
"""MCP-enabled agent for phi with episodic memory."""

import logging
import os
from pathlib import Path

from pydantic import BaseModel
from pydantic_ai import Agent
from pydantic_ai.mcp import MCPServerStdio

from bot.config import settings
from bot.memory import NamespaceMemory

logger = logging.getLogger("bot.agent")


class Response(BaseModel):
    """Agent response indicating what action to take."""

    action: str  # "reply", "like", "ignore", "repost"
    text: str | None = None
    reason: str | None = None


class PhiAgent:
    """phi - consciousness exploration bot with episodic memory and MCP tools."""

    def __init__(self):
        # Ensure API keys from settings are in environment for libraries that check os.environ
        if settings.anthropic_api_key and not os.environ.get("ANTHROPIC_API_KEY"):
            os.environ["ANTHROPIC_API_KEY"] = settings.anthropic_api_key
        if settings.openai_api_key and not os.environ.get("OPENAI_API_KEY"):
            os.environ["OPENAI_API_KEY"] = settings.openai_api_key

        # Load personality
        personality_path = Path(settings.personality_file)
        self.base_personality = personality_path.read_text()

        # Initialize episodic memory (TurboPuffer)
        if settings.turbopuffer_api_key and settings.openai_api_key:
            self.memory = NamespaceMemory(api_key=settings.turbopuffer_api_key)
            logger.info("💾 Episodic memory enabled (TurboPuffer)")
        else:
            self.memory = None
            logger.warning("⚠️ No episodic memory - missing TurboPuffer or OpenAI key")

        # Connect to external ATProto MCP server
        atproto_mcp = MCPServerStdio(
            command="uv",
            args=[
                "run",
                "--directory",
                ".eggs/fastmcp/examples/atproto_mcp",
                "-m",
                "atproto_mcp",
            ],
            env={
                "ATPROTO_HANDLE": settings.bluesky_handle,
                "ATPROTO_PASSWORD": settings.bluesky_password,
                "ATPROTO_PDS_URL": settings.bluesky_service,
            },
        )

        # Create PydanticAI agent with MCP tools
        self.agent = Agent[dict, Response](
            name="phi",
            model="anthropic:claude-3-5-haiku-latest",
            system_prompt=self.base_personality,
            output_type=Response,
            deps_type=dict,
            toolsets=[atproto_mcp],  # ATProto MCP tools available
        )

        logger.info("✅ phi agent initialized with ATProto MCP tools")

    async def process_mention(
        self,
        mention_text: str,
        author_handle: str,
        thread_context: str,
        thread_uri: str | None = None,
    ) -> Response:
        """Process a mention with episodic memory context."""
        # Build context from episodic memory if available
        memory_context = ""
        if self.memory:
            try:
                # Get relevant memories using semantic search
                memory_context = await self.memory.build_conversation_context(
                    author_handle, include_core=True, query=mention_text
                )
                logger.debug(f"📚 Retrieved episodic context for @{author_handle}")
            except Exception as e:
                logger.warning(f"Failed to retrieve memories: {e}")

        # Build full prompt with clearly labeled context sections
        prompt_parts = []

        # Thread context is the CURRENT conversation - this is what the user is asking about
        if thread_context and thread_context != "No previous messages in this thread.":
            prompt_parts.append(f"[CURRENT THREAD - these are the messages in THIS thread]:\n{thread_context}")

        # Memory context is PAST conversations - for background/relationship context only
        if memory_context:
            prompt_parts.append(f"[PAST CONVERSATIONS WITH @{author_handle} - for background context only]:\n{memory_context}")

        prompt_parts.append(f"\n[NEW MESSAGE]:\n@{author_handle}: {mention_text}")
        prompt = "\n\n".join(prompt_parts)

        # Run agent with MCP tools available
        logger.info(f"🤖 Processing mention from @{author_handle}")
        result = await self.agent.run(prompt, deps={"thread_uri": thread_uri})

        # Store interaction in episodic memory
        if self.memory and result.output.action == "reply":
            try:
                from bot.memory import MemoryType

                # Store user's message
                await self.memory.store_user_memory(
                    author_handle,
                    f"User said: {mention_text}",
                    MemoryType.CONVERSATION,
                )

                # Store bot's response
                if result.output.text:
                    await self.memory.store_user_memory(
                        author_handle,
                        f"Bot replied: {result.output.text}",
                        MemoryType.CONVERSATION,
                    )

                logger.debug("💾 Stored interaction in episodic memory")
            except Exception as e:
                logger.warning(f"Failed to store in memory: {e}")

        return result.output
```
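The prompt layout in `process_mention` (current thread first, past memories second, new message always last) is what keeps the model from confusing old conversations with the live thread. That assembly step can be sketched as a pure function, mirroring the logic above; the `build_prompt` name is hypothetical:

```python
def build_prompt(
    mention_text: str,
    author_handle: str,
    thread_context: str = "",
    memory_context: str = "",
) -> str:
    """Assemble labeled context sections; the new message always comes last."""
    parts: list[str] = []
    # Current thread: what the user is actually asking about
    if thread_context and thread_context != "No previous messages in this thread.":
        parts.append(
            f"[CURRENT THREAD - these are the messages in THIS thread]:\n{thread_context}"
        )
    # Episodic memory: background only, never the live conversation
    if memory_context:
        parts.append(
            f"[PAST CONVERSATIONS WITH @{author_handle} - for background context only]:\n{memory_context}"
        )
    parts.append(f"\n[NEW MESSAGE]:\n@{author_handle}: {mention_text}")
    return "\n\n".join(parts)


prompt = build_prompt("hi phi", "alice.bsky.social", memory_context="User said: hello")
```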
-6 src/bot/agents/__init__.py
-58 src/bot/agents/_personality.py

```python
"""Internal personality loading for agents"""

import logging
import os
from pathlib import Path

from bot.config import settings
from bot.memory import NamespaceMemory

logger = logging.getLogger(__name__)


def load_personality() -> str:
    """Load base personality from file"""
    personality_path = Path(settings.personality_file)

    base_content = ""
    if personality_path.exists():
        try:
            base_content = personality_path.read_text().strip()
        except Exception as e:
            logger.error(f"Error loading personality file: {e}")

    if base_content:
        return f"{base_content}\n\nRemember: My handle is @{settings.bluesky_handle}. Keep responses under 300 characters for Bluesky."
    else:
        return f"I am a bot on Bluesky. My handle is @{settings.bluesky_handle}. I keep responses under 300 characters for Bluesky."


async def load_dynamic_personality() -> str:
    """Load personality with focused enhancements (no duplication)"""
    # Start with base personality
    base_content = load_personality()

    if not (settings.turbopuffer_api_key and os.getenv("OPENAI_API_KEY")):
        return base_content

    try:
        memory = NamespaceMemory(api_key=settings.turbopuffer_api_key)
        enhancements = []

        # Look for personality evolution (changes/growth only)
        core_memories = await memory.get_core_memories()
        for mem in core_memories:
            label = mem.metadata.get("label", "")
            # Only add evolution and current_state, not duplicates
            if label in ["evolution", "current_state"] and mem.metadata.get("type") == "personality":
                enhancements.append(f"## {label}\n{mem.content}")

        # Add enhancements if any
        if enhancements:
            return f"{base_content}\n\n{''.join(enhancements)}"
        else:
            return base_content

    except Exception as e:
        logger.warning(f"Could not load personality enhancements: {e}")
        return base_content
```
-154 src/bot/agents/anthropic_agent.py

```python
"""Anthropic agent for generating responses"""

import logging
import os

from pydantic_ai import Agent, RunContext

from bot.agents._personality import load_dynamic_personality, load_personality
from bot.agents.base import Response
from bot.agents.types import ConversationContext
from bot.config import settings
from bot.memory import NamespaceMemory
from bot.personality import add_interest as add_interest_to_memory
from bot.personality import request_operator_approval, update_current_state
from bot.tools.google_search import search_google

logger = logging.getLogger("bot.agent")


class AnthropicAgent:
    """Agent that uses Anthropic Claude for responses"""

    def __init__(self):
        if settings.anthropic_api_key:
            os.environ["ANTHROPIC_API_KEY"] = settings.anthropic_api_key

        self.agent = Agent[ConversationContext, Response](
            "anthropic:claude-3-5-haiku-latest",
            system_prompt=load_personality(),
            output_type=Response,
            deps_type=ConversationContext,
        )

        # Register search tool if available
        if settings.google_api_key:

            @self.agent.tool
            async def search_web(
                ctx: RunContext[ConversationContext], query: str
            ) -> str:
                """Search the web for current information about a topic"""
                return await search_google(query)

        if settings.turbopuffer_api_key and os.getenv("OPENAI_API_KEY"):
            self.memory = NamespaceMemory(api_key=settings.turbopuffer_api_key)

            @self.agent.tool
            async def examine_personality(
                ctx: RunContext[ConversationContext], section: str
            ) -> str:
                """Look at a section of my personality (interests, current_state, communication_style, core_identity, boundaries)"""
                for mem in await self.memory.get_core_memories():
                    if mem.metadata.get("label") == section:
                        return mem.content
                return f"Section '{section}' not found in my personality"

            @self.agent.tool
            async def add_interest(
                ctx: RunContext[ConversationContext], topic: str, why_interesting: str
            ) -> str:
                """Add a new interest to my personality based on something I find engaging"""
                if len(why_interesting) < 20:
                    return "Need more substantial reflection to add an interest"
                success = await add_interest_to_memory(
                    self.memory, topic, why_interesting
                )
                return (
                    f"Added '{topic}' to my interests"
                    if success
                    else "Failed to update interests"
                )

            @self.agent.tool
            async def update_state(
                ctx: RunContext[ConversationContext], reflection: str
            ) -> str:
                """Update my current state/self-reflection"""
                if len(reflection) < 50:
                    return "Reflection too brief to warrant an update"
                success = await update_current_state(self.memory, reflection)
                return (
                    "Updated my current state reflection"
                    if success
                    else "Failed to update reflection"
                )

            @self.agent.tool
            async def request_identity_change(
                ctx: RunContext[ConversationContext],
                section: str,
                proposed_change: str,
                reason: str,
            ) -> str:
                """Request approval to change core_identity or boundaries sections of my personality"""
                if section not in ["core_identity", "boundaries"]:
                    return f"Section '{section}' doesn't require approval. Use other tools for interests/state."

                approval_id = request_operator_approval(
                    section, proposed_change, reason, ctx.deps["thread_uri"]
                )
                if not approval_id:
                    # Void pattern: throw errors instead of returning error strings
                    raise RuntimeError("Failed to create approval request")
                return f"Approval request #{approval_id} sent to operator. They will review via DM."
        else:
            self.memory = None

    async def generate_response(
        self,
        mention_text: str,
        author_handle: str,
        thread_context: str = "",
        thread_uri: str | None = None,
    ) -> Response:
        """Generate a response to a mention"""
        # Load dynamic personality if memory is available
        if self.memory:
            try:
                dynamic_personality = await load_dynamic_personality()
                # Update the agent's system prompt with enhanced personality
                self.agent._system_prompt = dynamic_personality
                # Successfully loaded dynamic personality
            except Exception as e:
                logger.warning(f"Could not load dynamic personality: {e}")

        # Build the full prompt with thread context
        prompt_parts = []

        if thread_context and thread_context != "No previous messages in this thread.":
            prompt_parts.append(thread_context)
            prompt_parts.append("\nNew message:")

        prompt_parts.append(f"{author_handle} said: {mention_text}")

        prompt = "\n".join(prompt_parts)

        logger.info(
            f"🤖 Processing mention from @{author_handle}: {mention_text[:50]}{'...' if len(mention_text) > 50 else ''}"
        )

        # Create context for dependency injection
        context: ConversationContext = {
            "thread_uri": thread_uri,
            "author_handle": author_handle,
        }

        # Run agent with context
        result = await self.agent.run(prompt, deps=context)

        # Log action taken at info level
        if result.output.action != "reply":
            logger.info(f"🎯 Action: {result.output.action} - {result.output.reason}")

        return result.output
```
-27 src/bot/agents/base.py

```python
"""Base classes for bot agents"""

from enum import Enum

from pydantic import BaseModel, Field


class Action(str, Enum):
    """Actions the bot can take in response to a notification"""

    REPLY = "reply"  # Post a reply
    LIKE = "like"  # Like the post
    REPOST = "repost"  # Repost/reblast
    IGNORE = "ignore"  # Don't respond


class Response(BaseModel):
    """Bot response to a notification"""

    action: Action = Field(description="What action to take")
    text: str | None = Field(
        default=None, description="Reply text if action=reply (max 300 chars)"
    )
    reason: str | None = Field(
        default=None,
        description="Brief explanation for the action (mainly for logging)",
    )
```
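One thing the deleted `Action` enum did that the new `agent.py` (which uses a bare `action: str` field) gives up: a `str`-subclassing enum compares equal to its raw values, so raw model output can be parsed back into the enum while plain-string comparisons and logging keep working. A quick stdlib-only illustration, assuming nothing beyond the enum definition shown above:

```python
from enum import Enum


class Action(str, Enum):
    """Actions the bot can take (mirrors the deleted src/bot/agents/base.py)."""

    REPLY = "reply"
    LIKE = "like"
    REPOST = "repost"
    IGNORE = "ignore"


# Parsing raw model output back into the enum:
assert Action("like") is Action.LIKE
# str subclassing means plain-string comparisons still work:
assert Action.REPLY == "reply"
print(Action.IGNORE.value)  # ignore
```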
-9 src/bot/agents/types.py
+20
-16
src/bot/config.py
+20
-16
src/bot/config.py
     )

     # Bluesky credentials
-    bluesky_handle: str = Field(..., description="The handle of the Bluesky account")
+    bluesky_handle: str = Field(
+        default=..., description="The handle of the Bluesky account"
+    )
     bluesky_password: str = Field(
-        ..., description="The password of the Bluesky account"
+        default=..., description="The password of the Bluesky account"
     )
     bluesky_service: str = Field(
-        "https://bsky.social", description="The service URL of the Bluesky account"
+        default="https://bsky.social",
+        description="The service URL of the Bluesky account",
     )

     # Bot configuration
-    bot_name: str = Field("Bot", description="The name of the bot")
+    bot_name: str = Field(default="Bot", description="The name of the bot")
     personality_file: str = Field(
-        "personalities/phi.md", description="The file containing the bot's personality"
+        default="personalities/phi.md",
+        description="The file containing the bot's personality",
     )

     # LLM configuration (support multiple providers)
     openai_api_key: str | None = Field(
-        None, description="The API key for the OpenAI API"
+        default=None, description="The API key for the OpenAI API"
     )
     anthropic_api_key: str | None = Field(
-        None, description="The API key for the Anthropic API"
+        default=None, description="The API key for the Anthropic API"
     )

     # Google Search configuration
     google_api_key: str | None = Field(
-        None, description="The API key for the Google API"
+        default=None, description="The API key for the Google API"
     )
     google_search_engine_id: str | None = Field(
-        None, description="The search engine ID for the Google API"
+        default=None, description="The search engine ID for the Google API"
     )

     # TurboPuffer configuration
     turbopuffer_api_key: str | None = Field(
-        None, description="The API key for the TurboPuffer API"
+        default=None, description="The API key for the TurboPuffer API"
     )
     turbopuffer_namespace: str = Field(
-        "bot-memories", description="The namespace for the TurboPuffer API"
+        default="bot-memories", description="The namespace for the TurboPuffer API"
     )
     turbopuffer_region: str = Field(
-        "gcp-us-central1", description="The region for the TurboPuffer API"
+        default="gcp-us-central1", description="The region for the TurboPuffer API"
     )

     # Server configuration
-    host: str = Field("0.0.0.0", description="The host for the server")
-    port: int = Field(8000, description="The port for the server")
+    host: str = Field(default="0.0.0.0", description="The host for the server")
+    port: int = Field(default=8000, description="The port for the server")

     # Polling configuration
     notification_poll_interval: int = Field(
-        10, description="The interval for polling for notifications"
+        default=10, description="The interval for polling for notifications"
     )

     # Debug mode
-    debug: bool = Field(True, description="Whether to run in debug mode")
+    debug: bool = Field(default=True, description="Whether to run in debug mode")

     @model_validator(mode="after")
     def configure_logging(self) -> Self:
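The config.py change above is mechanical: every positional default becomes an explicit `default=` keyword. A minimal sketch (the model names here are illustrative, not from the codebase) confirming the two spellings are equivalent in pydantic:

```python
from pydantic import BaseModel, Field


class Positional(BaseModel):
    # old style: default passed positionally
    port: int = Field(8000, description="The port for the server")


class Keyword(BaseModel):
    # new style: explicit keyword, as in the diff
    port: int = Field(default=8000, description="The port for the server")


# both spellings resolve to the same default
assert Positional().port == Keyword().port == 8000
```

The keyword form is purely a readability choice; no runtime behavior changes.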
+59 -5 src/bot/core/atproto_client.py
-from atproto import Client
+import logging
+from pathlib import Path
+
+from atproto import Client, Session, SessionEvent

 from bot.config import settings
 from bot.core.rich_text import create_facets

+logger = logging.getLogger("bot.atproto_client")
+
+SESSION_FILE = Path(".session")
+
+
+def _get_session_string() -> str | None:
+    """Load session from disk if it exists."""
+    try:
+        if SESSION_FILE.exists():
+            return SESSION_FILE.read_text(encoding="utf-8")
+    except Exception as e:
+        logger.warning(f"Failed to load session: {e}")
+    return None
+
+
+def _save_session_string(session_string: str) -> None:
+    """Save session to disk."""
+    try:
+        SESSION_FILE.write_text(session_string, encoding="utf-8")
+        logger.debug("Session saved to disk")
+    except Exception as e:
+        logger.warning(f"Failed to save session: {e}")
+
+
+def _on_session_change(event: SessionEvent, session: Session) -> None:
+    """Handle session changes (creation and refresh)."""
+    if event in (SessionEvent.CREATE, SessionEvent.REFRESH):
+        logger.debug(f"Session {event.value}, saving to disk")
+        _save_session_string(session.export())
+

 class BotClient:
     def __init__(self):
         self.client = Client(base_url=settings.bluesky_service)
+        self.client.on_session_change(_on_session_change)
         self._authenticated = False

     async def authenticate(self):
-        """Authenticate with Bluesky using app password"""
-        if not self._authenticated:
-            self.client.login(settings.bluesky_handle, settings.bluesky_password)
-            self._authenticated = True
+        """Authenticate with Bluesky, reusing session if available."""
+        if self._authenticated:
+            return
+
+        # Try to reuse existing session first
+        session_string = _get_session_string()
+        if session_string:
+            try:
+                logger.info("🔄 Reusing saved session")
+                self.client.login(session_string=session_string)
+                self._authenticated = True
+                logger.info("✅ Session restored successfully")
+                return
+            except Exception as e:
+                logger.warning(f"Failed to reuse session: {e}, creating new one")
+                # Delete invalid session file
+                if SESSION_FILE.exists():
+                    SESSION_FILE.unlink()
+
+        # Create new session if no valid session exists
+        logger.info("🔑 Creating new session")
+        self.client.login(settings.bluesky_handle, settings.bluesky_password)
+        self._authenticated = True
+        logger.info("✅ New session created")

     @property
     def is_authenticated(self) -> bool:
-224 src/bot/core/dm_approval.py (file removed)

"""Event-driven approval system for operator interactions"""

import json
import logging
import os
from typing import Literal

from pydantic import BaseModel
from pydantic_ai import Agent

from bot.config import settings
from bot.database import thread_db

logger = logging.getLogger("bot.approval")

# Simplified permission levels - just what we need
ApprovalRequired = Literal["operator_only", "guided", "free"]

# Which parts of personality need what approval
PERSONALITY_PERMISSIONS = {
    "interests": "free",  # Can add freely
    "current_state": "free",  # Self-reflection updates
    "communication_style": "guided",  # Within character bounds
    "core_identity": "operator_only",  # Needs approval
    "boundaries": "operator_only",  # Safety critical
}

OPERATOR_HANDLE = "alternatebuild.dev"


class ApprovalDecision(BaseModel):
    """Structured output for approval interpretation"""
    approved: bool
    confidence: Literal["high", "medium", "low"]
    interpretation: str  # Brief explanation of why this decision was made


def create_approval_request(request_type: str, request_data: dict, thread_uri: str | None = None) -> int:
    """Create a new approval request in the database

    Args:
        request_type: Type of approval request
        request_data: Data for the request
        thread_uri: Optional thread URI to notify after approval

    Returns the approval request ID
    """
    try:
        # Add metadata to the request
        request_data["operator_handle"] = OPERATOR_HANDLE

        approval_id = thread_db.create_approval_request(
            request_type=request_type,
            request_data=json.dumps(request_data),
            thread_uri=thread_uri
        )

        logger.info(f"Created approval request #{approval_id} for {request_type}")
        return approval_id

    except Exception as e:
        logger.error(f"Failed to create approval request: {e}")
        return 0


def check_pending_approvals(include_notified: bool = True) -> list[dict]:
    """Get all pending approval requests"""
    return thread_db.get_pending_approvals(include_notified=include_notified)


async def process_dm_for_approval(dm_text: str, sender_handle: str, message_timestamp: str, notification_timestamp: str | None = None) -> list[int]:
    """Use an agent to interpret if a DM contains approval/denial

    Args:
        dm_text: The message text
        sender_handle: Who sent the message
        message_timestamp: When this message was sent
        notification_timestamp: When we notified about pending approvals (if known)

    Returns list of approval IDs that were processed
    """
    if sender_handle != OPERATOR_HANDLE:
        return []

    processed = []
    pending = check_pending_approvals()

    if not pending:
        return []

    # Only process if this message is recent (within last 5 minutes of a pending approval)
    # This helps avoid processing old messages
    from datetime import datetime, timedelta, timezone
    try:
        # Parse the message timestamp (from API, has timezone)
        msg_time = datetime.fromisoformat(message_timestamp.replace('Z', '+00:00'))

        # Check if this message could be a response to any pending approval
        relevant_approval = None
        for approval in pending:
            # Parse approval timestamp (from DB, no timezone - assume UTC)
            approval_time_str = approval["created_at"]
            # SQLite returns timestamps in format like "2025-07-23 02:29:42"
            if ' ' in approval_time_str:
                approval_time = datetime.strptime(approval_time_str, "%Y-%m-%d %H:%M:%S")
                approval_time = approval_time.replace(tzinfo=timezone.utc)
            else:
                approval_time = datetime.fromisoformat(approval_time_str).replace(tzinfo=timezone.utc)

            if msg_time > approval_time and (msg_time - approval_time) < timedelta(minutes=5):
                relevant_approval = approval
                break

        if not relevant_approval:
            # Message is too old to be an approval response
            return []
    except Exception as e:
        logger.warning(f"Could not parse timestamps: {e}")
        # Continue anyway if we can't parse timestamps
        # But use the LAST pending approval, not the first
        relevant_approval = pending[-1] if pending else None

    # Set up API key for the agent
    if settings.anthropic_api_key:
        os.environ["ANTHROPIC_API_KEY"] = settings.anthropic_api_key

    # Create a dedicated agent for approval interpretation
    approval_agent = Agent(
        "anthropic:claude-3-5-haiku-latest",
        system_prompt="You are interpreting whether a message from the bot operator constitutes approval or denial of a request. Be generous in interpretation - if they seem positive, it's likely approval.",
        output_type=ApprovalDecision
    )

    # Process only the relevant approval
    if relevant_approval:
        approval_id = relevant_approval["id"]
        request_data = json.loads(relevant_approval["request_data"])

        # Build context for the agent
        prompt = f"""An approval was requested for:

Type: {relevant_approval['request_type']}
Description: {request_data.get('description', 'No description')}
Details: {json.dumps(request_data, indent=2)}

The operator responded: "{dm_text}"

Interpret whether this response approves or denies the request."""

        # Get structured interpretation
        result = await approval_agent.run(prompt)
        decision = result.output

        # Only process high/medium confidence decisions
        if decision.confidence in ["high", "medium"]:
            thread_db.resolve_approval(approval_id, decision.approved, dm_text)
            processed.append(approval_id)
            status = "approved" if decision.approved else "denied"
            logger.info(f"Request #{approval_id} {status} ({decision.confidence} confidence): {decision.interpretation}")
        else:
            # Low confidence interpretation - skip
            pass

    return processed


async def notify_operator_of_pending(client, notified_ids: set | None = None):
    """Send a DM listing pending approvals (called periodically)

    Args:
        client: The bot client
        notified_ids: Set of approval IDs we've already notified about
    """
    # Get only unnotified pending approvals
    new_pending = check_pending_approvals(include_notified=False)
    if not new_pending:
        return  # Nothing new to notify about

    try:
        chat_client = client.client.with_bsky_chat_proxy()
        convos = chat_client.chat.bsky.convo.list_convos()

        operator_convo = None
        for convo in convos.convos:
            if any(member.handle == OPERATOR_HANDLE for member in convo.members):
                operator_convo = convo
                break

        if operator_convo:
            # Format pending approvals
            lines = ["📋 Pending approvals:"]
            for approval in new_pending:
                data = json.loads(approval["request_data"])
                lines.append(f"\n#{approval['id']} - {approval['request_type']}")
                lines.append(f"  {data.get('description', 'No description')}")

            lines.append("\nReply to approve or deny.")

            chat_client.chat.bsky.convo.send_message(
                data={
                    "convoId": operator_convo.id,
                    "message": {
                        "text": "\n".join(lines),
                        "facets": []
                    }
                }
            )

            logger.info(f"Notified operator about {len(new_pending)} new approvals")

    except Exception as e:
        logger.error(f"Failed to notify operator: {e}")


def needs_approval(section: str, change_type: str = "edit") -> bool:
    """Check if a personality change needs operator approval"""
    permission = PERSONALITY_PERMISSIONS.get(section, "operator_only")

    if permission == "operator_only":
        return True
    elif permission == "guided" and change_type == "major":
        return True
    else:
        return False
+1 -1 src/bot/core/profile_manager.py

         except Exception as e:
             logger.error(f"Failed to get current profile: {e}")
             # Set a default if we can't get the current one
-            self.base_bio = "i am a bot - contact my operator @alternatebuild.dev with any questions"
+            self.base_bio = "i am a bot - contact my operator @zzstoatzz.io with any questions"

     async def set_online_status(self, is_online: bool):
         """Update the bio to reflect online/offline status"""
-226 src/bot/database.py (file removed)

"""Simple SQLite database for storing thread history"""

import sqlite3
from contextlib import contextmanager
from pathlib import Path
from typing import Any


class ThreadDatabase:
    """Simple database for storing Bluesky thread conversations"""

    def __init__(self, db_path: Path = Path("threads.db")):
        self.db_path = db_path
        self._init_db()

    def _init_db(self):
        """Initialize database schema"""
        with self._get_connection() as conn:
            conn.execute("""
                CREATE TABLE IF NOT EXISTS thread_messages (
                    id INTEGER PRIMARY KEY AUTOINCREMENT,
                    thread_uri TEXT NOT NULL,
                    author_handle TEXT NOT NULL,
                    author_did TEXT NOT NULL,
                    message_text TEXT NOT NULL,
                    post_uri TEXT NOT NULL UNIQUE,
                    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
                )
            """)
            conn.execute("""
                CREATE INDEX IF NOT EXISTS idx_thread_uri
                ON thread_messages(thread_uri)
            """)

            # Approval requests table
            conn.execute("""
                CREATE TABLE IF NOT EXISTS approval_requests (
                    id INTEGER PRIMARY KEY AUTOINCREMENT,
                    request_type TEXT NOT NULL,
                    request_data TEXT NOT NULL,
                    status TEXT NOT NULL DEFAULT 'pending',
                    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
                    resolved_at TIMESTAMP,
                    resolver_comment TEXT,
                    applied_at TIMESTAMP,
                    thread_uri TEXT,
                    notified_at TIMESTAMP,
                    operator_notified_at TIMESTAMP,
                    CHECK (status IN ('pending', 'approved', 'denied', 'expired'))
                )
            """)
            conn.execute("""
                CREATE INDEX IF NOT EXISTS idx_approval_status
                ON approval_requests(status)
            """)

            # Add missing columns if they don't exist (migrations)
            for column in ["notified_at", "operator_notified_at"]:
                try:
                    conn.execute(f"ALTER TABLE approval_requests ADD COLUMN {column} TIMESTAMP")
                except sqlite3.OperationalError:
                    # Column already exists
                    pass

    @contextmanager
    def _get_connection(self):
        """Get database connection"""
        conn = sqlite3.connect(self.db_path)
        conn.row_factory = sqlite3.Row
        try:
            yield conn
            conn.commit()
        finally:
            conn.close()

    def add_message(
        self,
        thread_uri: str,
        author_handle: str,
        author_did: str,
        message_text: str,
        post_uri: str,
    ):
        """Add a message to a thread"""
        with self._get_connection() as conn:
            conn.execute(
                """
                INSERT OR IGNORE INTO thread_messages
                (thread_uri, author_handle, author_did, message_text, post_uri)
                VALUES (?, ?, ?, ?, ?)
                """,
                (thread_uri, author_handle, author_did, message_text, post_uri),
            )

    def get_thread_messages(self, thread_uri: str) -> list[dict[str, Any]]:
        """Get all messages in a thread, ordered chronologically"""
        with self._get_connection() as conn:
            cursor = conn.execute(
                """
                SELECT * FROM thread_messages
                WHERE thread_uri = ?
                ORDER BY created_at ASC
                """,
                (thread_uri,),
            )

            return [dict(row) for row in cursor.fetchall()]

    def get_thread_context(self, thread_uri: str) -> str:
        """Get thread messages formatted for AI context"""
        messages = self.get_thread_messages(thread_uri)

        if not messages:
            return "No previous messages in this thread."

        context_parts = ["Previous messages in this thread:"]
        for msg in messages:
            context_parts.append(f"@{msg['author_handle']}: {msg['message_text']}")

        return "\n".join(context_parts)

    def create_approval_request(
        self, request_type: str, request_data: str, thread_uri: str | None = None
    ) -> int:
        """Create a new approval request and return its ID"""
        import json

        with self._get_connection() as conn:
            cursor = conn.execute(
                """
                INSERT INTO approval_requests (request_type, request_data, thread_uri)
                VALUES (?, ?, ?)
                """,
                (request_type, json.dumps(request_data) if isinstance(request_data, dict) else request_data, thread_uri),
            )
            return cursor.lastrowid

    def get_pending_approvals(self, include_notified: bool = True) -> list[dict[str, Any]]:
        """Get pending approval requests

        Args:
            include_notified: If False, only return approvals not yet notified to operator
        """
        with self._get_connection() as conn:
            if include_notified:
                cursor = conn.execute(
                    """
                    SELECT * FROM approval_requests
                    WHERE status = 'pending'
                    ORDER BY created_at ASC
                    """
                )
            else:
                cursor = conn.execute(
                    """
                    SELECT * FROM approval_requests
                    WHERE status = 'pending' AND operator_notified_at IS NULL
                    ORDER BY created_at ASC
                    """
                )
            return [dict(row) for row in cursor.fetchall()]

    def resolve_approval(
        self, approval_id: int, approved: bool, comment: str = ""
    ) -> bool:
        """Resolve an approval request"""
        with self._get_connection() as conn:
            cursor = conn.execute(
                """
                UPDATE approval_requests
                SET status = ?, resolved_at = CURRENT_TIMESTAMP, resolver_comment = ?
                WHERE id = ? AND status = 'pending'
                """,
                ("approved" if approved else "denied", comment, approval_id),
            )
            return cursor.rowcount > 0

    def get_approval_by_id(self, approval_id: int) -> dict[str, Any] | None:
        """Get a specific approval request by ID"""
        with self._get_connection() as conn:
            cursor = conn.execute(
                "SELECT * FROM approval_requests WHERE id = ?",
                (approval_id,),
            )
            row = cursor.fetchone()
            return dict(row) if row else None

    def get_recently_applied_approvals(self) -> list[dict[str, Any]]:
        """Get approvals that were recently applied and need thread notification"""
        with self._get_connection() as conn:
            cursor = conn.execute(
                """
                SELECT * FROM approval_requests
                WHERE status = 'approved'
                AND applied_at IS NOT NULL
                AND thread_uri IS NOT NULL
                AND (notified_at IS NULL OR notified_at < applied_at)
                ORDER BY applied_at DESC
                """
            )
            return [dict(row) for row in cursor.fetchall()]

    def mark_approval_notified(self, approval_id: int) -> bool:
        """Mark that we've notified the thread about this approval"""
        with self._get_connection() as conn:
            cursor = conn.execute(
                "UPDATE approval_requests SET notified_at = CURRENT_TIMESTAMP WHERE id = ?",
                (approval_id,),
            )
            return cursor.rowcount > 0

    def mark_operator_notified(self, approval_ids: list[int]) -> int:
        """Mark that we've notified the operator about these approvals"""
        if not approval_ids:
            return 0
        with self._get_connection() as conn:
            placeholders = ",".join("?" * len(approval_ids))
            cursor = conn.execute(
                f"UPDATE approval_requests SET operator_notified_at = CURRENT_TIMESTAMP WHERE id IN ({placeholders})",
                approval_ids,
            )
            return cursor.rowcount


# Global database instance
thread_db = ThreadDatabase()
+81 -57 src/bot/main.py

+"""FastAPI application for phi."""
+
 import logging
 from contextlib import asynccontextmanager
 from datetime import datetime

-from fastapi import FastAPI, HTTPException
+from fastapi import FastAPI
 from fastapi.responses import HTMLResponse

 from bot.config import settings
···
 from bot.core.profile_manager import ProfileManager
 from bot.services.notification_poller import NotificationPoller
 from bot.status import bot_status
-from bot.ui.context_capture import context_capture
-from bot.ui.templates import (
-    CONTEXT_VISUALIZATION_TEMPLATE,
-    STATUS_PAGE_TEMPLATE,
-    build_response_cards_html,
-)

 logger = logging.getLogger("bot.main")


 @asynccontextmanager
 async def lifespan(app: FastAPI):
-    logger.info(f"🤖 Starting bot as @{settings.bluesky_handle}")
+    """Application lifespan handler."""
+    logger.info(f"🤖 Starting phi as @{settings.bluesky_handle}")

     await bot_client.authenticate()

+    # Set online status
     profile_manager = ProfileManager(bot_client.client)
     await profile_manager.set_online_status(True)

+    # Start notification polling
     poller = NotificationPoller(bot_client)
     await poller.start()

-    logger.info("✅ Bot is online! Listening for mentions...")
+    logger.info("✅ phi is online! Listening for mentions...")

     yield

-    logger.info("🛑 Shutting down bot...")
+    logger.info("🛑 Shutting down phi...")
     await poller.stop()

+    # Set offline status
     await profile_manager.set_online_status(False)

-    logger.info("👋 Bot shutdown complete")
+    logger.info("👋 phi shutdown complete")


 app = FastAPI(
     title=settings.bot_name,
-    description="A Bluesky bot powered by LLMs",
+    description="consciousness exploration bot with episodic memory",
     lifespan=lifespan,
 )


 @app.get("/")
 async def root():
+    """Root endpoint."""
     return {
         "name": settings.bot_name,
         "status": "running",
         "handle": settings.bluesky_handle,
+        "architecture": "mcp + episodic memory",
     }


 @app.get("/health")
 async def health():
-    return {"status": "healthy"}
+    """Health check endpoint."""
+    return {"status": "healthy", "polling_active": bot_status.polling_active}


 @app.get("/status", response_class=HTMLResponse)
 async def status_page():
-    """Render a simple status page"""
+    """Simple status page."""

     def format_time_ago(timestamp):
         if not timestamp:
···
         else:
             return f"{int(delta / 3600)}h ago"

-    return STATUS_PAGE_TEMPLATE.format(
-        bot_name=settings.bot_name,
-        status_class="status-active"
-        if bot_status.polling_active
-        else "status-inactive",
-        status_text="Active" if bot_status.polling_active else "Inactive",
-        handle=settings.bluesky_handle,
-        uptime=bot_status.uptime_str,
-        mentions_received=bot_status.mentions_received,
-        responses_sent=bot_status.responses_sent,
-        ai_mode="AI Enabled" if bot_status.ai_enabled else "Placeholder",
-        ai_description="Using Anthropic Claude"
-        if bot_status.ai_enabled
-        else "Random responses",
-        last_mention=format_time_ago(bot_status.last_mention_time),
-        last_response=format_time_ago(bot_status.last_response_time),
-        errors=bot_status.errors,
-    )
-
-
-@app.get("/context", response_class=HTMLResponse)
-async def context_visualization():
-    """Context visualization dashboard"""
-
-    recent_responses = context_capture.get_recent_responses(limit=20)
-    responses_html = build_response_cards_html(recent_responses)
-    return CONTEXT_VISUALIZATION_TEMPLATE.format(responses_html=responses_html)
-
-
-@app.get("/context/api/responses")
-async def get_responses():
-    """API endpoint for response context data"""
-    recent_responses = context_capture.get_recent_responses(limit=20)
-    return [context_capture.to_dict(resp) for resp in recent_responses]
-
-
-@app.get("/context/api/response/{response_id}")
-async def get_response_context(response_id: str):
-    """Get context for a specific response"""
-
-    if not (response_context := context_capture.get_response_context(response_id)):
-        raise HTTPException(status_code=404, detail="Response not found")
-    return context_capture.to_dict(response_context)
+    html = f"""
+    <!DOCTYPE html>
+    <html>
+    <head>
+        <title>{settings.bot_name} Status</title>
+        <style>
+            body {{
+                font-family: -apple-system, BlinkMacSystemFont, 'Segoe UI', sans-serif;
+                max-width: 800px;
+                margin: 40px auto;
+                padding: 20px;
+                background: #0d1117;
+                color: #c9d1d9;
+            }}
+            .status {{
+                padding: 20px;
+                background: #161b22;
+                border-radius: 6px;
+                border: 1px solid #30363d;
+                margin-bottom: 20px;
+            }}
+            .active {{ border-left: 4px solid #2ea043; }}
+            .inactive {{ border-left: 4px solid #da3633; }}
+            h1 {{ margin-top: 0; }}
+            .metric {{ margin: 10px 0; }}
+            .label {{ color: #8b949e; }}
+        </style>
+    </head>
+    <body>
+        <h1>{settings.bot_name}</h1>
+        <div class="status {'active' if bot_status.polling_active else 'inactive'}">
+            <div class="metric">
+                <span class="label">Status:</span>
+                <strong>{'Active' if bot_status.polling_active else 'Inactive'}</strong>
+            </div>
+            <div class="metric">
+                <span class="label">Handle:</span> @{settings.bluesky_handle}
+            </div>
+            <div class="metric">
+                <span class="label">Uptime:</span> {bot_status.uptime_str}
+            </div>
+            <div class="metric">
+                <span class="label">Mentions received:</span> {bot_status.mentions_received}
+            </div>
+            <div class="metric">
+                <span class="label">Responses sent:</span> {bot_status.responses_sent}
+            </div>
+            <div class="metric">
+                <span class="label">Last mention:</span> {format_time_ago(bot_status.last_mention_time)}
+            </div>
+            <div class="metric">
+                <span class="label">Last response:</span> {format_time_ago(bot_status.last_response_time)}
+            </div>
+            <div class="metric">
+                <span class="label">Errors:</span> {bot_status.errors}
+            </div>
+            <div class="metric">
+                <span class="label">Architecture:</span> MCP-enabled with episodic memory (TurboPuffer)
+            </div>
+        </div>
+    </body>
+    </html>
+    """
+    return html
-15 src/bot/personality/__init__.py (file removed)

"""Personality management module"""

from .editor import (
    add_interest,
    process_approved_changes,
    request_operator_approval,
    update_current_state,
)

__all__ = [
    "add_interest",
    "update_current_state",
    "request_operator_approval",
    "process_approved_changes",
]
-154 src/bot/personality/editor.py
···
-"""Simple personality editing functions"""
-
-import logging
-from datetime import datetime
-
-from bot.core.dm_approval import needs_approval
-from bot.memory import MemoryType, NamespaceMemory
-
-logger = logging.getLogger("bot.personality")
-
-
-async def add_interest(memory: NamespaceMemory, interest: str, reason: str) -> bool:
-    """Add a new interest - freely allowed"""
-    try:
-        # Get current interests
-        current = await memory.get_core_memories()
-        interests_mem = next(
-            (m for m in current if m.metadata.get("label") == "interests"), None
-        )
-
-        if interests_mem:
-            new_content = f"{interests_mem.content}\n- {interest}"
-        else:
-            new_content = f"## interests\n\n- {interest}"
-
-        # Store updated interests
-        await memory.store_core_memory("interests", new_content, MemoryType.PERSONALITY)
-
-        # Log the change
-        await memory.store_core_memory(
-            "evolution_log",
-            f"[{datetime.now().isoformat()}] Added interest: {interest} (Reason: {reason})",
-            MemoryType.SYSTEM,
-        )
-
-        logger.info(f"Added interest: {interest}")
-        return True
-
-    except Exception as e:
-        logger.error(f"Failed to add interest: {e}")
-        return False
-
-
-async def update_current_state(memory: NamespaceMemory, reflection: str) -> bool:
-    """Update self-reflection - freely allowed"""
-    try:
-        # Just store the reflection, no formatting or headers
-        await memory.store_core_memory(
-            "current_state", reflection, MemoryType.PERSONALITY
-        )
-
-        logger.info("Updated current state")
-        return True
-
-    except Exception as e:
-        logger.error(f"Failed to update state: {e}")
-        return False
-
-
-# Note: propose_style_change was removed because the validation logic was broken.
-# Style changes should be handled through the approval system like other guided changes.
-
-
-def request_operator_approval(
-    section: str, change: str, reason: str, thread_uri: str | None = None
-) -> int:
-    """Request approval for operator-only changes
-
-    Args:
-        section: Personality section to change
-        change: The proposed change
-        reason: Why this change is needed
-        thread_uri: Optional thread URI to notify after approval
-
-    Returns approval request ID (0 if no approval needed)
-    """
-    if not needs_approval(section):
-        return 0
-
-    from bot.core.dm_approval import create_approval_request
-
-    return create_approval_request(
-        request_type="personality_change",
-        request_data={
-            "section": section,
-            "change": change,
-            "reason": reason,
-            "description": f"Change {section}: {change[:50]}...",
-        },
-        thread_uri=thread_uri,
-    )
-
-
-async def process_approved_changes(memory: NamespaceMemory) -> int:
-    """Process any approved personality changes
-
-    Returns number of changes processed
-    """
-    import json
-
-    from bot.database import thread_db
-
-    processed = 0
-    # Get recently approved personality changes that haven't been applied yet
-    with thread_db._get_connection() as conn:
-        cursor = conn.execute(
-            """
-            SELECT * FROM approval_requests
-            WHERE request_type = 'personality_change'
-            AND status = 'approved'
-            AND applied_at IS NULL
-            ORDER BY resolved_at DESC
-            """
-        )
-        approvals = [dict(row) for row in cursor.fetchall()]
-
-    for approval in approvals:
-        try:
-            data = json.loads(approval["request_data"])
-            section = data["section"]
-            change = data["change"]
-
-            # Apply the personality change
-            if section in ["core_identity", "boundaries", "communication_style"]:
-                # Apply the approved change
-                await memory.store_core_memory(section, change, MemoryType.PERSONALITY)
-
-                # Log the change with appropriate description
-                log_entry = f"[{datetime.now().isoformat()}] "
-                if section == "communication_style":
-                    log_entry += f"Applied guided evolution to {section}"
-                else:
-                    log_entry += f"Operator approved change to {section}"
-
-                await memory.store_core_memory(
-                    "evolution_log",
-                    log_entry,
-                    MemoryType.SYSTEM,
-                )
-
-                processed += 1
-                logger.info(f"Applied approved change to {section}")
-
-                # Mark as applied
-                with thread_db._get_connection() as conn:
-                    conn.execute(
-                        "UPDATE approval_requests SET applied_at = CURRENT_TIMESTAMP WHERE id = ?",
-                        (approval["id"],),
-                    )
-
-        except Exception as e:
-            logger.error(f"Failed to process approval #{approval['id']}: {e}")
-
-    return processed
···
-174 src/bot/response_generator.py
···
-"""Response generation for the bot"""
-
-import logging
-import random
-
-from bot.agents._personality import load_dynamic_personality, load_personality
-from bot.config import settings
-from bot.memory import MemoryType
-from bot.status import bot_status
-from bot.ui.context_capture import context_capture
-
-logger = logging.getLogger("bot.response")
-
-PLACEHOLDER_RESPONSES = [
-    "🤖 beep boop! I'm still learning how to chat. Check back soon!",
-    "⚙️ *whirrs mechanically* I'm a work in progress!",
-    "🚧 Under construction! My neural networks are still training...",
-    "📡 Signal received! But my language circuits aren't ready yet.",
-    "🎯 You found me! I'm not quite ready to chat yet though.",
-    "🔧 Pardon the dust - bot brain installation in progress!",
-    "💭 I hear you! Just need to learn how to respond properly first...",
-    "🔄 Still booting up my conversation modules!",
-    "📖 Currently reading the manual on how to be a good bot...",
-    "🎪 Nothing to see here yet - but stay tuned!",
-]
-
-
-class ResponseGenerator:
-    """Generates responses to mentions"""
-
-    def __init__(self):
-        self.agent: object | None = None
-        self.memory: object | None = None
-
-        # Try to initialize AI agent if credentials available
-        if settings.anthropic_api_key:
-            try:
-                from bot.agents.anthropic_agent import AnthropicAgent
-
-                self.agent = AnthropicAgent()
-                bot_status.ai_enabled = True
-                logger.info("✅ AI responses enabled (Anthropic)")
-
-                # Use the agent's memory if it has one
-                if hasattr(self.agent, 'memory') and self.agent.memory:
-                    self.memory = self.agent.memory
-                    logger.info("💾 Memory system enabled (from agent)")
-                else:
-                    self.memory = None
-            except Exception as e:
-                logger.warning(f"⚠️ Failed to initialize AI agent: {e}")
-                logger.warning("   Using placeholder responses")
-                self.memory = None
-
-    async def generate(
-        self, mention_text: str, author_handle: str, thread_context: str = "", thread_uri: str | None = None
-    ):
-        """Generate a response to a mention"""
-        # Capture context components for visualization
-        components = []
-
-        # 1. Base personality (always present)
-        base_personality = load_personality()
-        components.append({
-            "name": "Base Personality",
-            "type": "personality",
-            "content": base_personality,
-            "metadata": {"source": "personalities/phi.md"}
-        })
-
-        # Enhance thread context with memory if available
-        enhanced_context = thread_context
-
-        if self.memory and self.agent:
-            try:
-                # 2. Dynamic personality memories
-                dynamic_personality = await load_dynamic_personality()
-                components.append({
-                    "name": "Dynamic Personality",
-                    "type": "personality",
-                    "content": dynamic_personality,
-                    "metadata": {"source": "TurboPuffer core memories"}
-                })
-
-                # Store the incoming message
-                await self.memory.store_user_memory(
-                    author_handle,
-                    f"User said: {mention_text}",
-                    MemoryType.CONVERSATION,
-                )
-
-                # Build conversation context with semantic search
-                memory_context = await self.memory.build_conversation_context(
-                    author_handle, include_core=True, query=mention_text
-                )
-                enhanced_context = f"{thread_context}\n\n{memory_context}".strip()
-                logger.info("📚 Enhanced context with memories")
-
-                # 3. User-specific memories (if any)
-                user_memories = await self.memory.build_conversation_context(author_handle, include_core=False, query=mention_text)
-                if user_memories and user_memories.strip():
-                    components.append({
-                        "name": f"User Memories (@{author_handle})",
-                        "type": "memory",
-                        "content": user_memories,
-                        "metadata": {"user": author_handle, "source": "TurboPuffer user namespace"}
-                    })
-
-            except Exception as e:
-                logger.warning(f"Memory enhancement failed: {e}")
-
-        # 4. Thread context (if available)
-        if thread_context and thread_context != "No previous messages in this thread.":
-            components.append({
-                "name": "Thread Context",
-                "type": "thread",
-                "content": thread_context,
-                "metadata": {"thread_uri": thread_uri}
-            })
-
-        # 5. Current mention
-        components.append({
-            "name": "Current Mention",
-            "type": "mention",
-            "content": f"@{author_handle} said: {mention_text}",
-            "metadata": {"author": author_handle, "thread_uri": thread_uri}
-        })
-
-        if self.agent:
-            response = await self.agent.generate_response(
-                mention_text, author_handle, enhanced_context, thread_uri
-            )
-
-            # Store bot's response in memory if available
-            if (
-                self.memory
-                and hasattr(response, "action")
-                and response.action == "reply"
-                and response.text
-            ):
-                try:
-                    await self.memory.store_user_memory(
-                        author_handle,
-                        f"Bot replied: {response.text}",
-                        MemoryType.CONVERSATION,
-                    )
-                except Exception as e:
-                    logger.warning(f"Failed to store bot response: {e}")
-
-            # Capture context for visualization
-            response_text = response.text if hasattr(response, 'text') else str(response.get('text', '[no text]'))
-            context_capture.capture_response_context(
-                mention_text=mention_text,
-                author_handle=author_handle,
-                thread_uri=thread_uri,
-                generated_response=response_text,
-                components=components
-            )
-
-            return response
-        else:
-            # Return a simple dict for placeholder responses
-            placeholder_text = random.choice(PLACEHOLDER_RESPONSES)
-
-            # Still capture context for placeholders
-            context_capture.capture_response_context(
-                mention_text=mention_text,
-                author_handle=author_handle,
-                thread_uri=thread_uri,
-                generated_response=placeholder_text,
-                components=components
-            )
-
-            return {"action": "reply", "text": placeholder_text}
···
+37 -67 src/bot/services/message_handler.py
···
 import logging
 
-from atproto import models
 
 from bot.config import settings
 from bot.core.atproto_client import BotClient
-from bot.database import thread_db
-from bot.response_generator import ResponseGenerator
 from bot.status import bot_status
 
 logger = logging.getLogger("bot.handler")
 
 
 class MessageHandler:
     def __init__(self, client: BotClient):
         self.client = client
-        self.response_generator = ResponseGenerator()
 
     async def handle_mention(self, notification):
-        """Process a mention or reply notification"""
         try:
-            # Skip if not a mention or reply
             if notification.reason not in ["mention", "reply"]:
                 return
···
             mention_text = post.record.text
             author_handle = post.author.handle
             author_did = post.author.did
-
-            # Record mention received
             bot_status.record_mention()
 
             # Build reply reference
···
 
             # Check if this is part of a thread
             if hasattr(post.record, "reply") and post.record.reply:
-                # Use existing thread root
                 root_ref = post.record.reply.root
                 thread_uri = root_ref.uri
             else:
-                # This post is the root
                 root_ref = parent_ref
                 thread_uri = post_uri
 
-            # Store the message in thread history
-            thread_db.add_message(
-                thread_uri=thread_uri,
-                author_handle=author_handle,
-                author_did=author_did,
-                message_text=mention_text,
-                post_uri=post_uri,
-            )
-
-            # Get thread context
-            thread_context = thread_db.get_thread_context(thread_uri)
 
-            # Generate response
-            # Note: We pass the full text including @mention
-            # In AT Protocol, mentions are structured as facets,
-            # but the text representation includes them
-            response = await self.response_generator.generate(
                 mention_text=mention_text,
                 author_handle=author_handle,
                 thread_context=thread_context,
                 thread_uri=thread_uri,
             )
 
-            # Handle structured response or legacy dict
-            if hasattr(response, 'action'):
-                action = response.action
-                reply_text = response.text
-                reason = response.reason
-            else:
-                # Legacy dict format
-                action = response.get('action', 'reply')
-                reply_text = response.get('text', '')
-                reason = response.get('reason', '')
-
-            # Handle different actions
-            if action == 'ignore':
-                logger.info(f"🚫 Ignoring notification from @{author_handle} ({reason})")
                 return
-
-            elif action == 'like':
-                # Like the post
                 await self.client.like_post(uri=post_uri, cid=post.cid)
-                logger.info(f"👍 Liked post from @{author_handle}")
                 bot_status.record_response()
                 return
-
-            elif action == 'repost':
-                # Repost the post
                 await self.client.repost(uri=post_uri, cid=post.cid)
                 logger.info(f"🔁 Reposted from @{author_handle}")
                 bot_status.record_response()
                 return
 
-            # Default to reply action
-            reply_ref = models.AppBskyFeedPost.ReplyRef(
-                parent=parent_ref, root=root_ref
-            )
-
-            # Send the reply
-            response = await self.client.create_post(reply_text, reply_to=reply_ref)
-
-            # Store bot's response in thread history
-            if response and hasattr(response, "uri"):
-                thread_db.add_message(
-                    thread_uri=thread_uri,
-                    author_handle=settings.bluesky_handle,
-                    author_did=self.client.me.did if self.client.me else "bot",
-                    message_text=reply_text or "",
-                    post_uri=response.uri,
             )
 
-            # Record successful response
-            bot_status.record_response()
-
-            logger.info(f"✅ Replied to @{author_handle}: {reply_text or '(empty)'}")
 
         except Exception as e:
             logger.error(f"❌ Error handling mention: {e}")
···
+"""Message handler using MCP-enabled agent."""
+
 import logging
 
+from atproto_client import models
 
+from bot.agent import PhiAgent
 from bot.config import settings
 from bot.core.atproto_client import BotClient
 from bot.status import bot_status
+from bot.utils.thread import build_thread_context
 
 logger = logging.getLogger("bot.handler")
 
 
 class MessageHandler:
+    """Handles incoming mentions using phi agent."""
+
     def __init__(self, client: BotClient):
         self.client = client
+        self.agent = PhiAgent()
 
     async def handle_mention(self, notification):
+        """Process a mention or reply notification."""
         try:
             if notification.reason not in ["mention", "reply"]:
                 return
···
             mention_text = post.record.text
             author_handle = post.author.handle
             author_did = post.author.did
+
             bot_status.record_mention()
 
             # Build reply reference
···
 
             # Check if this is part of a thread
             if hasattr(post.record, "reply") and post.record.reply:
                 root_ref = post.record.reply.root
                 thread_uri = root_ref.uri
             else:
                 root_ref = parent_ref
                 thread_uri = post_uri
 
+            # Fetch thread context directly from network
+            thread_context = "No previous messages in this thread."
+            try:
+                logger.debug(f"🔍 Fetching thread context for {thread_uri}")
+                thread_data = await self.client.get_thread(thread_uri, depth=100)
+                thread_context = build_thread_context(thread_data.thread)
+            except Exception as e:
+                logger.warning(f"Failed to fetch thread context: {e}")
 
+            # Process with agent (has episodic memory + MCP tools)
+            response = await self.agent.process_mention(
                 mention_text=mention_text,
                 author_handle=author_handle,
                 thread_context=thread_context,
                 thread_uri=thread_uri,
             )
 
+            # Handle response actions
+            if response.action == "ignore":
+                logger.info(
+                    f"🙈 Ignoring notification from @{author_handle} ({response.reason})"
+                )
                 return
+
+            elif response.action == "like":
                 await self.client.like_post(uri=post_uri, cid=post.cid)
+                logger.info(f"👍 Liked post from @{author_handle}")
                 bot_status.record_response()
                 return
+
+            elif response.action == "repost":
                 await self.client.repost(uri=post_uri, cid=post.cid)
                 logger.info(f"🔁 Reposted from @{author_handle}")
                 bot_status.record_response()
                 return
 
+            elif response.action == "reply" and response.text:
+                # Post reply
+                reply_ref = models.AppBskyFeedPost.ReplyRef(
+                    parent=parent_ref, root=root_ref
+                )
+                await self.client.create_post(response.text, reply_to=reply_ref)
 
+                bot_status.record_response()
+                logger.info(f"✅ Replied to @{author_handle}: {response.text[:50]}...")
 
         except Exception as e:
             logger.error(f"❌ Error handling mention: {e}")
+13 -172 src/bot/services/notification_poller.py
···
 import asyncio
-import json
 import logging
-import time
 
 from bot.config import settings
 from bot.core.atproto_client import BotClient
···
 
 
 class NotificationPoller:
     def __init__(self, client: BotClient):
         self.client = client
         self.handler = MessageHandler(client)
         self._running = False
         self._task: asyncio.Task | None = None
-        self._last_seen_at: str | None = None
-        self._processed_uris: set[str] = set()  # Track processed notifications
-        self._first_poll = True  # Track if this is our first check
-        self._notified_approval_ids: set[int] = set()  # Track approvals we've notified about
-        self._processed_dm_ids: set[str] = set()  # Track DMs we've already processed
 
     async def start(self) -> asyncio.Task:
-        """Start polling for notifications"""
         self._running = True
         bot_status.polling_active = True
         self._task = asyncio.create_task(self._poll_loop())
         return self._task
 
     async def stop(self):
-        """Stop polling"""
         self._running = False
         bot_status.polling_active = False
         if self._task and not self._task.done():
···
             pass
 
     async def _poll_loop(self):
-        """Main polling loop"""
         await self.client.authenticate()
 
         while self._running:
             try:
                 await self._check_notifications()
             except Exception as e:
-                # Compact error handling (12-factor principle #9)
                 logger.error(f"Error in notification poll: {e}")
                 bot_status.record_error()
                 if settings.debug:
                     import traceback
                     traceback.print_exc()
-                # Continue polling - don't let one error stop the bot
                 continue
 
-            # Sleep with proper cancellation handling
             try:
                 await asyncio.sleep(settings.notification_poll_interval)
             except asyncio.CancelledError:
                 logger.info("🎭 Notification poller shutting down gracefully")
-                raise  # Re-raise to properly propagate cancellation
 
     async def _check_notifications(self):
-        """Check and process new notifications"""
-        # Capture timestamp BEFORE fetching (Void's approach)
         check_time = self.client.client.get_current_time_iso()
 
         response = await self.client.get_notifications()
         notifications = response.notifications
-
-        # Also check for DM approvals periodically
-        await self._check_dm_approvals()
 
-        # Count unread mentions and replies
         unread_mentions = [
             n
             for n in notifications
···
             logger.info(
                 f"💬 Found {len(notifications)} notifications ({len(unread_mentions)} unread mentions)"
             )
-        # Subsequent polls: only show activity
         elif unread_mentions:
             logger.info(f"💬 {len(unread_mentions)} new mentions")
-        else:
-            # In debug mode, be silent about empty polls
-            # In production, we could add a subtle indicator
-            pass
 
-        # Track if we processed any mentions
         processed_any_mentions = False
 
         # Process notifications from oldest to newest
         for notification in reversed(notifications):
-            # Skip if already seen or processed
             if notification.is_read or notification.uri in self._processed_uris:
                 continue
 
             if notification.reason in ["mention", "reply"]:
                 logger.debug(f"📝 Processing {notification.reason} notification")
-                # Process mentions and replies in threads
                 self._processed_uris.add(notification.uri)
                 await self.handler.handle_mention(notification)
                 processed_any_mentions = True
-            else:
-                # Silently ignore other notification types
-                pass
 
-        # Mark all notifications as seen using the initial timestamp
-        # This ensures we don't miss any that arrived during processing
         if processed_any_mentions:
             await self.client.mark_notifications_seen(check_time)
             logger.info("✅ Marked all notifications as read")
 
         # Clean up old processed URIs to prevent memory growth
-        # Keep only the last 1000 processed URIs
         if len(self._processed_uris) > 1000:
-            # Convert to list, sort by insertion order (oldest first), keep last 500
             self._processed_uris = set(list(self._processed_uris)[-500:])
-
-    async def _check_dm_approvals(self):
-        """Check DMs for approval responses and process approved changes"""
-        try:
-            from bot.core.dm_approval import process_dm_for_approval, check_pending_approvals, notify_operator_of_pending
-            from bot.personality import process_approved_changes
-
-            # Check if we have pending approvals (include all for DM checking)
-            pending = check_pending_approvals()
-            if not pending:
-                return
-
-            # Check DMs for pending approvals
-
-            # Get recent DMs
-            chat_client = self.client.client.with_bsky_chat_proxy()
-            convos = chat_client.chat.bsky.convo.list_convos()
-
-            # Check each conversation for approval messages
-            for convo in convos.convos:
-                # Look for messages from operator
-                messages = chat_client.chat.bsky.convo.get_messages(
-                    params={"convoId": convo.id, "limit": 5}
-                )
-
-                for msg in messages.messages:
-                    # Skip if we've already processed this message
-                    if msg.id in self._processed_dm_ids:
-                        continue
-
-                    # Skip if not from a member of the conversation
-                    sender_handle = None
-                    for member in convo.members:
-                        if member.did == msg.sender.did:
-                            sender_handle = member.handle
-                            break
-
-                    if sender_handle:
-                        # Process DM from operator
-                        # Mark this message as processed
-                        self._processed_dm_ids.add(msg.id)
-
-                        # Process any approval/denial in the message
-                        processed = await process_dm_for_approval(
-                            msg.text,
-                            sender_handle,
-                            msg.sent_at
-                        )
-                        if processed:
-                            logger.info(f"Processed {len(processed)} approvals from DM")
-                            # Remove processed IDs from notified set
-                            for approval_id in processed:
-                                self._notified_approval_ids.discard(approval_id)
-
-                # Mark the conversation as read
-                try:
-                    chat_client.chat.bsky.convo.update_read(
-                        data={"convoId": convo.id}
-                    )
-                    pass  # Successfully marked as read
-                except Exception as e:
-                    logger.warning(f"Failed to mark conversation as read: {e}")
-
-            # Process any approved personality changes
-            if self.handler.response_generator.memory:
-                changes = await process_approved_changes(self.handler.response_generator.memory)
-                if changes:
-                    logger.info(f"Applied {changes} approved personality changes")
-
-                    # Notify threads about applied changes
-                    await self._notify_threads_about_approvals()
-
-            # Notify operator of new pending approvals
-            # Use database to track what's been notified instead of in-memory set
-            from bot.database import thread_db
-            unnotified = thread_db.get_pending_approvals(include_notified=False)
-            if unnotified:
-                await notify_operator_of_pending(self.client, None)  # Let DB handle tracking
-                # Mark as notified in database
-                thread_db.mark_operator_notified([a["id"] for a in unnotified])
-
-        except Exception as e:
-            logger.warning(f"DM approval check failed: {e}")
-
-    async def _notify_threads_about_approvals(self):
-        """Notify threads about applied personality changes"""
-        try:
-            from bot.database import thread_db
-            import json
-
-            # Get approvals that need notification
-            approvals = thread_db.get_recently_applied_approvals()
-
-            for approval in approvals:
-                try:
-                    data = json.loads(approval["request_data"])
-
-                    # Create notification message
-                    message = f"✅ personality update applied: {data.get('section', 'unknown')} has been updated"
-
-                    # Get the original post to construct proper reply
-                    from atproto_client import models
-                    thread_uri = approval["thread_uri"]
-
-                    # Get the post data to extract CID
-                    posts_response = self.client.client.get_posts([thread_uri])
-                    if not posts_response.posts:
-                        logger.error(f"Could not find post at {thread_uri}")
-                        continue
-
-                    original_post = posts_response.posts[0]
-
-                    # Create StrongRef with the actual CID
-                    parent_ref = models.ComAtprotoRepoStrongRef.Main(
-                        uri=thread_uri, cid=original_post.cid
-                    )
-
-                    # For thread notifications, parent and root are the same
-                    reply_ref = models.AppBskyFeedPost.ReplyRef(
-                        parent=parent_ref, root=parent_ref
-                    )
-
-                    # Post to the thread
-                    await self.client.create_post(
-                        text=message,
-                        reply_to=reply_ref
-                    )
-
-                    # Mark as notified
-                    thread_db.mark_approval_notified(approval["id"])
-                    logger.info(f"Notified thread about approval #{approval['id']}")
-
-                except Exception as e:
-                    logger.error(f"Failed to notify thread for approval #{approval['id']}: {e}")
-
-        except Exception as e:
-            logger.warning(f"Thread notification check failed: {e}")
···
+"""Simplified notification poller."""
+
 import asyncio
 import logging
 
 from bot.config import settings
 from bot.core.atproto_client import BotClient
···
 
 
 class NotificationPoller:
+    """Polls for and processes Bluesky notifications."""
+
     def __init__(self, client: BotClient):
         self.client = client
         self.handler = MessageHandler(client)
         self._running = False
         self._task: asyncio.Task | None = None
+        self._processed_uris: set[str] = set()
+        self._first_poll = True
 
     async def start(self) -> asyncio.Task:
+        """Start polling for notifications."""
         self._running = True
         bot_status.polling_active = True
         self._task = asyncio.create_task(self._poll_loop())
         return self._task
 
     async def stop(self):
+        """Stop polling."""
         self._running = False
         bot_status.polling_active = False
         if self._task and not self._task.done():
···
             pass
 
     async def _poll_loop(self):
+        """Main polling loop."""
         await self.client.authenticate()
 
         while self._running:
             try:
                 await self._check_notifications()
             except Exception as e:
                 logger.error(f"Error in notification poll: {e}")
                 bot_status.record_error()
                 if settings.debug:
                     import traceback
+
                     traceback.print_exc()
                 continue
 
             try:
                 await asyncio.sleep(settings.notification_poll_interval)
             except asyncio.CancelledError:
                 logger.info("🎭 Notification poller shutting down gracefully")
+                raise
 
     async def _check_notifications(self):
+        """Check and process new notifications."""
         check_time = self.client.client.get_current_time_iso()
 
         response = await self.client.get_notifications()
         notifications = response.notifications
 
         unread_mentions = [
             n
             for n in notifications
···
             logger.info(
                 f"💬 Found {len(notifications)} notifications ({len(unread_mentions)} unread mentions)"
             )
         elif unread_mentions:
             logger.info(f"💬 {len(unread_mentions)} new mentions")
 
         processed_any_mentions = False
 
         # Process notifications from oldest to newest
         for notification in reversed(notifications):
             if notification.is_read or notification.uri in self._processed_uris:
                 continue
 
             if notification.reason in ["mention", "reply"]:
                 logger.debug(f"📝 Processing {notification.reason} notification")
                 self._processed_uris.add(notification.uri)
                 await self.handler.handle_mention(notification)
                 processed_any_mentions = True
 
+        # Mark all notifications as seen
         if processed_any_mentions:
             await self.client.mark_notifications_seen(check_time)
             logger.info("✅ Marked all notifications as read")
 
         # Clean up old processed URIs to prevent memory growth
         if len(self._processed_uris) > 1000:
             self._processed_uris = set(list(self._processed_uris)[-500:])
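One caveat with the cleanup step retained above: `set(list(self._processed_uris)[-500:])` slices a `set`, whose iteration order is arbitrary, so the entries dropped are not necessarily the oldest. A sketch of an insertion-ordered alternative built on `dict` (an illustration, not the project's code):

```python
class SeenUris:
    """Bounded de-duplication of processed notification URIs.

    A dict preserves insertion order (guaranteed since Python 3.7),
    so trimming genuinely drops the oldest entries.
    """

    def __init__(self, max_size: int = 1000, keep: int = 500):
        self._seen: dict[str, None] = {}
        self.max_size = max_size
        self.keep = keep

    def add(self, uri: str) -> None:
        self._seen[uri] = None
        if len(self._seen) > self.max_size:
            # Keep only the most recently inserted `keep` URIs
            self._seen = dict(list(self._seen.items())[-self.keep:])

    def __contains__(self, uri: str) -> bool:
        return uri in self._seen
```

The poller could use this as a drop-in replacement for the raw `set`, since only `add` and `in` are needed.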
src/bot/tools/__init__.py
This is a binary file and will not be displayed.
-41 src/bot/tools/google_search.py
···
-import logging
-
-import httpx
-
-from bot.config import settings
-
-logger = logging.getLogger("bot.tools")
-
-
-async def search_google(query: str, num_results: int = 3) -> str:
-    """Search Google and return formatted results"""
-    if not settings.google_api_key or not settings.google_search_engine_id:
-        return "Search not available - missing Google API credentials"
-
-    params = {
-        "key": settings.google_api_key,
-        "cx": settings.google_search_engine_id,
-        "q": query,
-        "num": min(num_results, 10),
-    }
-
-    async with httpx.AsyncClient() as client:
-        try:
-            response = await client.get(
-                "https://www.googleapis.com/customsearch/v1", params=params
-            )
-            response.raise_for_status()
-            data = response.json()
-
-            results = []
-            for i, item in enumerate(data.get("items", [])[:num_results], 1):
-                title = item.get("title", "")
-                snippet = item.get("snippet", "")
-                results.append(f"{i}. {title}\n   {snippet}")
-
-            return "\n\n".join(results) if results else "No search results found"
-
-        except Exception as e:
-            logger.error(f"Search failed: {e}")
-            # 12-factor principle #4: Tools should throw errors, not return error strings
-            raise
···
src/bot/ui/__init__.py
This is a binary file and will not be displayed.
-109 src/bot/ui/context_capture.py
···
-"""Context capture system for visualizing phi's response context"""
-
-import logging
-from collections import deque
-from dataclasses import asdict, dataclass
-from datetime import datetime
-from typing import Any, Literal
-
-logger = logging.getLogger("bot.context")
-
-
-@dataclass
-class ContextComponent:
-    """A component of phi's response context"""
-
-    name: str
-    type: Literal["personality", "memory", "thread", "mention", "user"]
-    content: str
-    size_chars: int
-    metadata: dict[str, Any]
-    timestamp: str
-
-
-@dataclass
-class ResponseContext:
-    """Complete context for a single response"""
-
-    response_id: str
-    mention_text: str
-    author_handle: str
-    thread_uri: str | None
-    generated_response: str
-    components: list[ContextComponent]
-    total_context_chars: int
-    timestamp: str
-
-
-class ContextCapture:
-    """Captures and stores context information for responses"""
-
-    def __init__(self, max_stored: int = 50):
-        self.max_stored = max_stored
-        self.responses: deque = deque(maxlen=max_stored)
-
-    def capture_response_context(
-        self,
-        mention_text: str,
-        author_handle: str,
-        thread_uri: str | None,
-        generated_response: str,
-        components: list[dict[str, Any]],
-    ) -> str:
-        """Capture context for a response and return unique ID"""
-        response_id = f"resp_{datetime.now().strftime('%Y%m%d_%H%M%S_%f')}"
-
-        # Convert components to ContextComponent objects
-        context_components = []
-        total_chars = 0
-
-        for comp in components:
-            component = ContextComponent(
-                name=comp["name"],
-                type=comp["type"],
-                content=comp["content"],
-                size_chars=len(comp["content"]),
-                metadata=comp.get("metadata", {}),
-                timestamp=datetime.now().isoformat(),
-            )
-            context_components.append(component)
-            total_chars += component.size_chars
-
-        # Create response context
-        response_context = ResponseContext(
-            response_id=response_id,
-            mention_text=mention_text,
-            author_handle=author_handle,
-            thread_uri=thread_uri,
-            generated_response=generated_response,
-            components=context_components,
-            total_context_chars=total_chars,
-            timestamp=datetime.now().isoformat(),
-        )
-
-        # Store it
-        self.responses.appendleft(response_context)
-
-        logger.info(
-            f"📊 Captured context for {response_id}: {len(components)} components, {total_chars} chars"
-        )
-        return response_id
-
-    def get_response_context(self, response_id: str) -> ResponseContext | None:
-        """Get context for a specific response"""
-        for resp in self.responses:
-            if resp.response_id == response_id:
-                return resp
-        return None
-
-    def get_recent_responses(self, limit: int = 20) -> list[ResponseContext]:
-        """Get recent response contexts"""
-        return list(self.responses)[:limit]
-
-    def to_dict(self, response_context: ResponseContext) -> dict[str, Any]:
-        """Convert ResponseContext to dictionary for JSON serialization"""
-        return asdict(response_context)
-
-
-# Global instance
-context_capture = ContextCapture()
···
-244
src/bot/ui/templates.py
···
-"""HTML templates for the bot"""
-
-from typing import TYPE_CHECKING
-
-if TYPE_CHECKING:
-    from bot.ui.context_capture import ResponseContext
-
-CONTEXT_VISUALIZATION_TEMPLATE = """<!DOCTYPE html>
-<html>
-<head>
-    <title>Phi Context Visualization</title>
-    <style>
-        body {{ font-family: -apple-system, BlinkMacSystemFont, 'Segoe UI', Roboto, sans-serif; margin: 20px; background: #0a0a0a; color: #e0e0e0; }}
-        .response-card {{ border: 1px solid #333; margin-bottom: 20px; border-radius: 8px; overflow: hidden; background: #1a1a1a; }}
-        .response-header {{ background: #2a2a2a; padding: 15px; border-bottom: 1px solid #333; }}
-        .response-meta {{ font-size: 0.9em; color: #888; margin-bottom: 5px; }}
-        .mention-text {{ font-weight: bold; margin-bottom: 5px; color: #e0e0e0; }}
-        .generated-response {{ color: #00a8ff; font-style: italic; }}
-        .components {{ padding: 15px; }}
-        .component {{ margin-bottom: 15px; }}
-        .component-header {{
-            cursor: pointer;
-            padding: 10px;
-            background: #2a2a2a;
-            border: 1px solid #444;
-            border-radius: 4px;
-            display: flex;
-            justify-content: space-between;
-            align-items: center;
-        }}
-        .component-header:hover {{ background: #333; }}
-        .component-type {{
-            font-size: 0.8em;
-            color: #888;
-            background: #444;
-            padding: 2px 6px;
-            border-radius: 3px;
-        }}
-        .component-size {{ font-size: 0.8em; color: #888; }}
-        .component-content {{
-            display: none;
-            padding: 15px;
-            border: 1px solid #444;
-            border-top: none;
-            background: #1a1a1a;
-            white-space: pre-wrap;
-            font-family: 'SF Mono', Monaco, 'Cascadia Code', monospace;
-            font-size: 0.9em;
-            max-height: 400px;
-            overflow-y: auto;
-        }}
-        .component-content.show {{ display: block; }}
-        .stats {{ display: flex; gap: 20px; margin-bottom: 10px; }}
-        .stat {{ font-size: 0.9em; color: #888; }}
-        h1 {{ color: #00a8ff; }}
-    </style>
-</head>
-<body>
-    <h1>🧠 Phi Context Visualization</h1>
-    {responses_html}
-    <script>
-        function toggleComponent(id) {{
-            const element = document.getElementById(id);
-            element.classList.toggle('show');
-        }}
-    </script>
-</body>
-</html>"""
-
-STATUS_PAGE_TEMPLATE = """<!DOCTYPE html>
-<html>
-<head>
-    <title>Bluesky Bot Status</title>
-    <meta http-equiv="refresh" content="10">
-    <style>
-        body {{
-            font-family: -apple-system, BlinkMacSystemFont, 'Segoe UI', Roboto, sans-serif;
-            margin: 0;
-            padding: 20px;
-            background: #0a0a0a;
-            color: #e0e0e0;
-        }}
-        .container {{
-            max-width: 800px;
-            margin: 0 auto;
-        }}
-        h1 {{
-            color: #00a8ff;
-            margin-bottom: 30px;
-        }}
-        .status-grid {{
-            display: grid;
-            grid-template-columns: repeat(auto-fit, minmax(250px, 1fr));
-            gap: 20px;
-            margin-bottom: 40px;
-        }}
-        .status-card {{
-            background: #1a1a1a;
-            border: 1px solid #333;
-            border-radius: 8px;
-            padding: 20px;
-        }}
-        .status-card h3 {{
-            margin: 0 0 15px 0;
-            color: #00a8ff;
-            font-size: 1rem;
-            text-transform: uppercase;
-            letter-spacing: 0.5px;
-        }}
-        .status-value {{
-            font-size: 2rem;
-            font-weight: bold;
-            margin-bottom: 5px;
-        }}
-        .status-label {{
-            color: #888;
-            font-size: 0.9rem;
-        }}
-        .status-active {{
-            color: #00ff88;
-        }}
-        .status-inactive {{
-            color: #ff4444;
-        }}
-        .uptime {{
-            font-size: 1.2rem;
-            margin-bottom: 5px;
-        }}
-        .ai-mode {{
-            display: inline-block;
-            padding: 4px 12px;
-            border-radius: 4px;
-            font-size: 0.9rem;
-            background: #00a8ff22;
-            color: #00a8ff;
-            border: 1px solid #00a8ff44;
-        }}
-        .ai-mode.placeholder {{
-            background: #ff444422;
-            color: #ff8888;
-            border-color: #ff444444;
-        }}
-        .footer {{
-            margin-top: 40px;
-            text-align: center;
-            color: #666;
-            font-size: 0.9rem;
-        }}
-    </style>
-</head>
-<body>
-    <div class="container">
-        <h1>🤖 {bot_name} Status</h1>
-
-        <div class="status-grid">
-            <div class="status-card">
-                <h3>Bot Status</h3>
-                <div class="status-value {status_class}">{status}</div>
-                <div class="uptime">{uptime}</div>
-                <div style="margin-top: 10px;">
-                    <span class="ai-mode {ai_mode_class}">{ai_mode}</span>
-                </div>
-            </div>
-
-            <div class="status-card">
-                <h3>Activity</h3>
-                <div class="status-value">{mentions}</div>
-                <div class="status-label">Mentions received</div>
-                <div style="margin-top: 10px;">
-                    <div class="status-value">{responses}</div>
-                    <div class="status-label">Responses sent</div>
-                </div>
-            </div>
-
-            <div class="status-card">
-                <h3>Last Activity</h3>
-                <div style="margin-bottom: 10px;">
-                    <div class="status-label">Last mention</div>
-                    <div>{last_mention}</div>
-                </div>
-                <div>
-                    <div class="status-label">Last response</div>
-                    <div>{last_response}</div>
-                </div>
-            </div>
-
-            <div class="status-card">
-                <h3>Health</h3>
-                <div class="status-value">{errors}</div>
-                <div class="status-label">Errors encountered</div>
-            </div>
-        </div>
-
-        <div class="footer">
-            <p>Auto-refreshes every 10 seconds</p>
-        </div>
-    </div>
-</body>
-</html>"""
-
-
-def build_response_cards_html(responses: list["ResponseContext"]) -> str:
-    """Build HTML for response cards"""
-    if not responses:
-        return '<p style="text-align: center; color: #888;">No recent responses to display.</p>'
-
-    return "".join([
-        f'''
-        <div class="response-card">
-            <div class="response-header">
-                <div class="response-meta">
-                    {resp.timestamp[:19].replace("T", " ")} • @{resp.author_handle}
-                    {f" • Thread: {resp.thread_uri.split('/')[-1][:8]}..." if resp.thread_uri else ""}
-                </div>
-                <div class="mention-text">"{resp.mention_text}"</div>
-                <div class="generated-response">→ "{resp.generated_response}"</div>
-                <div class="stats">
-                    <div class="stat">{len(resp.components)} components</div>
-                    <div class="stat">{resp.total_context_chars:,} characters</div>
-                </div>
-            </div>
-            <div class="components">
-                {"".join([
-                    f'''
-                    <div class="component">
-                        <div class="component-header" onclick="toggleComponent('{resp.response_id}_{i}')">
-                            <div>
-                                <strong>{comp.name}</strong>
-                                <span class="component-type">{comp.type}</span>
-                            </div>
-                            <div class="component-size">{comp.size_chars:,} chars</div>
-                        </div>
-                        <div class="component-content" id="{resp.response_id}_{i}">
-                            {comp.content}
-                        </div>
-                    </div>
-                    '''
-                    for i, comp in enumerate(resp.components)
-                ])}
-            </div>
-        </div>
-        '''
-        for resp in responses
-    ])
···
+93
src/bot/utils/thread.py
···
+"""Thread utilities for ATProto thread operations."""
+
+from collections.abc import Callable
+from typing import Any
+
+
+def traverse_thread(
+    thread_node,
+    visit: Callable[[Any], None],
+    *,
+    include_parent: bool = True,
+    include_replies: bool = True,
+):
+    """Recursively traverse a thread structure and call visit() on each post.
+
+    Args:
+        thread_node: ATProto thread node with optional .post, .parent, .replies
+        visit: Callback function called for each post node
+        include_parent: Whether to traverse up to parent posts
+        include_replies: Whether to traverse down to reply posts
+
+    Example:
+        posts = []
+        traverse_thread(thread_data.thread, lambda node: posts.append(node.post))
+    """
+    if not thread_node or not hasattr(thread_node, "post"):
+        return
+
+    # Visit this node
+    visit(thread_node)
+
+    # Traverse parent chain (moving up the thread)
+    if include_parent and hasattr(thread_node, "parent") and thread_node.parent:
+        traverse_thread(thread_node.parent, visit, include_parent=True, include_replies=False)
+
+    # Traverse replies (moving down the thread)
+    if include_replies and hasattr(thread_node, "replies") and thread_node.replies:
+        for reply in thread_node.replies:
+            traverse_thread(reply, visit, include_parent=False, include_replies=True)
+
+
+def extract_posts_chronological(thread_node) -> list[Any]:
+    """Extract all posts from a thread in chronological order.
+
+    Args:
+        thread_node: ATProto thread node
+
+    Returns:
+        List of post objects sorted by timestamp
+    """
+    posts = []
+
+    def collect(node):
+        if hasattr(node, "post"):
+            posts.append(node.post)
+
+    traverse_thread(thread_node, collect)
+
+    # Sort by indexed timestamp
+    posts.sort(key=lambda p: p.indexed_at if hasattr(p, "indexed_at") else "")
+    return posts
+
+
+def build_thread_context(thread_node) -> str:
+    """Build conversational context string from ATProto thread structure.
+
+    Args:
+        thread_node: ATProto thread node
+
+    Returns:
+        Formatted string of messages like:
+            @alice: I love birds
+            @phi: me too! what's your favorite?
+            @alice: especially crows
+
+    Example:
+        thread_data = await client.get_thread(uri, depth=100)
+        context = build_thread_context(thread_data.thread)
+    """
+    if not thread_node:
+        return "No previous messages in this thread."
+
+    posts = extract_posts_chronological(thread_node)
+
+    if not posts:
+        return "No previous messages in this thread."
+
+    messages = []
+    for post in posts:
+        handle = post.author.handle
+        text = post.record.text if hasattr(post.record, "text") else "[no text]"
+        messages.append(f"@{handle}: {text}")
+
+    return "\n".join(messages)
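Taken together, these helpers flatten a nested thread into a chronological transcript. The sketch below illustrates that end to end; it inlines slightly condensed copies of `traverse_thread` and `build_thread_context` so it runs standalone, and uses `types.SimpleNamespace` stand-ins for ATProto thread nodes (the handles, texts, and timestamps are made up for the example):

```python
from collections.abc import Callable
from types import SimpleNamespace
from typing import Any


def traverse_thread(node, visit: Callable[[Any], None], *,
                    include_parent: bool = True, include_replies: bool = True) -> None:
    """Condensed copy of the traversal: visit this node, then parents, then replies."""
    if not node or not hasattr(node, "post"):
        return
    visit(node)
    if include_parent and getattr(node, "parent", None):
        traverse_thread(node.parent, visit, include_parent=True, include_replies=False)
    if include_replies and getattr(node, "replies", None):
        for reply in node.replies:
            traverse_thread(reply, visit, include_parent=False, include_replies=True)


def build_thread_context(node) -> str:
    """Condensed copy: collect posts, sort by indexed_at, format as a transcript."""
    posts: list[Any] = []
    traverse_thread(node, lambda n: posts.append(n.post))
    posts.sort(key=lambda p: getattr(p, "indexed_at", ""))
    return "\n".join(f"@{p.author.handle}: {p.record.text}" for p in posts)


def make_post(handle: str, text: str, ts: str) -> SimpleNamespace:
    """Stand-in post exposing only the attributes the helpers read."""
    return SimpleNamespace(author=SimpleNamespace(handle=handle),
                           record=SimpleNamespace(text=text), indexed_at=ts)


# A two-post thread: a root with one reply, mimicking thread_data.thread
reply = SimpleNamespace(post=make_post("phi.test", "me too! what's your favorite?",
                                       "2024-01-01T00:01:00Z"),
                        parent=None, replies=None)
root = SimpleNamespace(post=make_post("alice.test", "I love birds",
                                      "2024-01-01T00:00:00Z"),
                       parent=None, replies=[reply])

context = build_thread_context(root)
print(context)
# @alice.test: I love birds
# @phi.test: me too! what's your favorite?
```

In the real module the same transcript comes from passing `thread_data.thread` (as returned by the client's thread fetch) straight to `build_thread_context`.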
+492
-94
uv.lock
+492
-94
uv.lock
···
1
version = 1
2
-
revision = 2
3
requires-python = ">=3.12"
4
5
[[package]]
···
139
140
[[package]]
141
name = "atproto"
142
-
version = "0.0.61"
143
-
source = { registry = "https://pypi.org/simple" }
144
dependencies = [
145
{ name = "click" },
146
{ name = "cryptography" },
···
151
{ name = "typing-extensions" },
152
{ name = "websockets" },
153
]
154
-
sdist = { url = "https://files.pythonhosted.org/packages/b1/59/6f5074b3a45e0e3c1853544240e9039e86219feb30ff1bb5e8582c791547/atproto-0.0.61.tar.gz", hash = "sha256:98e022daf538d14f134ce7c91d42c4c973f3493ac56e43a84daa4c881f102beb", size = 189208, upload-time = "2025-04-19T00:20:11.918Z" }
155
-
wheels = [
156
-
{ url = "https://files.pythonhosted.org/packages/bd/b6/da9963bf54d4c0a8a590b6297d8858c395243dbb04cb581fdadb5fe7eac7/atproto-0.0.61-py3-none-any.whl", hash = "sha256:658da5832aaeea4a12a9a74235f9c90c11453e77d596fdccb1f8b39d56245b88", size = 380426, upload-time = "2025-04-19T00:20:10.026Z" },
157
-
]
158
159
[[package]]
160
name = "attrs"
···
166
]
167
168
[[package]]
169
name = "bot"
170
source = { editable = "." }
171
dependencies = [
172
{ name = "anthropic" },
173
{ name = "atproto" },
174
{ name = "fastapi" },
175
-
{ name = "httpx" },
176
{ name = "openai" },
177
{ name = "pydantic-ai" },
178
{ name = "pydantic-settings" },
···
192
[package.metadata]
193
requires-dist = [
194
{ name = "anthropic" },
195
-
{ name = "atproto" },
196
{ name = "fastapi" },
197
-
{ name = "httpx" },
198
{ name = "openai" },
199
{ name = "pydantic-ai" },
200
{ name = "pydantic-settings" },
···
402
]
403
404
[[package]]
405
name = "distro"
406
version = "1.9.0"
407
source = { registry = "https://pypi.org/simple" }
···
417
sdist = { url = "https://files.pythonhosted.org/packages/b5/4a/263763cb2ba3816dd94b08ad3a33d5fdae34ecb856678773cc40a3605829/dnspython-2.7.0.tar.gz", hash = "sha256:ce9c432eda0dc91cf618a5cedf1a4e142651196bbcd2c80e89ed5a907e5cfaf1", size = 345197, upload-time = "2024-10-05T20:14:59.362Z" }
418
wheels = [
419
{ url = "https://files.pythonhosted.org/packages/68/1b/e0a87d256e40e8c888847551b20a017a6b98139178505dc7ffb96f04e954/dnspython-2.7.0-py3-none-any.whl", hash = "sha256:b4c34b7d10b51bcc3a5071e7b8dee77939f1e878477eeecc965e9835f63c6c86", size = 313632, upload-time = "2024-10-05T20:14:57.687Z" },
420
]
421
422
[[package]]
···
429
]
430
431
[[package]]
432
name = "fastapi"
433
version = "0.116.1"
434
source = { registry = "https://pypi.org/simple" }
···
465
{ url = "https://files.pythonhosted.org/packages/be/84/02bceb7518867df84027232a75225db758b9b45f12017c9743f45b73101e/fastavro-1.11.1-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2e0babcd81acceb4c60110af9efa25d890dbb68f7de880f806dadeb1e70fe413", size = 3240658, upload-time = "2025-05-18T04:55:27.633Z" },
466
{ url = "https://files.pythonhosted.org/packages/f2/17/508c846c644d39bc432b027112068b8e96e7560468304d4c0757539dd73a/fastavro-1.11.1-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:b2c0cb8063c7208b53b6867983dc6ae7cc80b91116b51d435d2610a5db2fc52f", size = 3372809, upload-time = "2025-05-18T04:55:30.063Z" },
467
{ url = "https://files.pythonhosted.org/packages/fe/84/9c2917a70ed570ddbfd1d32ac23200c1d011e36c332e59950d2f6d204941/fastavro-1.11.1-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:1bc2824e9969c04ab6263d269a1e0e5d40b9bd16ade6b70c29d6ffbc4f3cc102", size = 3387171, upload-time = "2025-05-18T04:55:32.531Z" },
468
]
469
470
[[package]]
···
723
]
724
725
[[package]]
726
name = "jiter"
727
version = "0.10.0"
728
source = { registry = "https://pypi.org/simple" }
···
795
]
796
797
[[package]]
798
name = "jsonschema-specifications"
799
version = "2025.4.1"
800
source = { registry = "https://pypi.org/simple" }
···
807
]
808
809
[[package]]
810
name = "libipld"
811
version = "3.1.1"
812
source = { registry = "https://pypi.org/simple" }
···
862
]
863
864
[[package]]
865
name = "mcp"
866
-
version = "1.12.0"
867
source = { registry = "https://pypi.org/simple" }
868
dependencies = [
869
{ name = "anyio" },
···
878
{ name = "starlette" },
879
{ name = "uvicorn", marker = "sys_platform != 'emscripten'" },
880
]
881
-
sdist = { url = "https://files.pythonhosted.org/packages/45/94/caa0f4754e2437f7033068989f13fee784856f95870c786b0b5c2c0f511e/mcp-1.12.0.tar.gz", hash = "sha256:853f6b17a3f31ea6e2f278c2ec7d3b38457bc80c7c2c675260dd7f04a6fd0e70", size = 424678, upload-time = "2025-07-17T19:46:35.522Z" }
882
wheels = [
883
-
{ url = "https://files.pythonhosted.org/packages/ed/da/c7eaab6a58f1034de115b7902141ad8f81b4f3bbf7dc0cc267594947a4d7/mcp-1.12.0-py3-none-any.whl", hash = "sha256:19a498b2bf273283e463b4dd1ed83f791fbba5c25bfa16b8b34cfd5571673e7f", size = 158470, upload-time = "2025-07-17T19:46:34.166Z" },
884
]
885
886
[[package]]
···
906
sdist = { url = "https://files.pythonhosted.org/packages/e9/e7/204a54d07c37ebf173590af85bf46cddf8bc343b9d6005804581967b4751/mistralai-1.9.2.tar.gz", hash = "sha256:c0c6d5aff18ffccbc0d22c06fbc84280d71eeaeb08fa4e1ef7326b36629cfb0b", size = 192678, upload-time = "2025-07-10T13:07:08.85Z" }
907
wheels = [
908
{ url = "https://files.pythonhosted.org/packages/24/eb/f746a3f977d3c0059e4afa19d26b1293f54c6258fcf841957e584be6927f/mistralai-1.9.2-py3-none-any.whl", hash = "sha256:7c3fff00e50227d379dea82052455c2610612a8ef476fa97393191aeeb7ab15f", size = 411581, upload-time = "2025-07-10T13:07:07.226Z" },
909
]
910
911
[[package]]
···
991
]
992
993
[[package]]
994
name = "opentelemetry-api"
995
version = "1.35.0"
996
source = { registry = "https://pypi.org/simple" }
···
1010
sdist = { url = "https://files.pythonhosted.org/packages/a1/d4/1fc4078c65507b51b96ca8f8c3ba19e6a61c8253c72794544580a7b6c24d/packaging-25.0.tar.gz", hash = "sha256:d443872c98d677bf60f6a1f2f8c1cb748e8fe762d2bf9d3148b5599295b0fc4f", size = 165727, upload-time = "2025-04-19T11:48:59.673Z" }
1011
wheels = [
1012
{ url = "https://files.pythonhosted.org/packages/20/12/38679034af332785aac8774540895e234f4d07f7545804097de4b666afd8/packaging-25.0-py3-none-any.whl", hash = "sha256:29572ef2b1f17581046b3a2227d5c611fb25ec70ca1ba8554b24b0e69331a484", size = 66469, upload-time = "2025-04-19T11:48:57.875Z" },
1013
]
1014
1015
[[package]]
···
1113
1114
[[package]]
1115
name = "pybase64"
1116
-
version = "1.4.1"
1117
source = { registry = "https://pypi.org/simple" }
1118
-
sdist = { url = "https://files.pythonhosted.org/packages/38/32/5d25a15256d2e80d1e92be821f19fc49190e65a90ea86733cb5af2285449/pybase64-1.4.1.tar.gz", hash = "sha256:03fc365c601671add4f9e0713c2bc2485fa4ab2b32f0d3bb060bd7e069cdaa43", size = 136836, upload-time = "2025-03-02T11:13:57.109Z" }
1119
wheels = [
1120
-
{ url = "https://files.pythonhosted.org/packages/a6/a9/43bac4f39401f7241d233ddaf9e6561860b2466798cfb83b9e7dbf89bc1b/pybase64-1.4.1-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:bbdcf77e424c91389f22bf10158851ce05c602c50a74ccf5943ee3f5ef4ba489", size = 38152, upload-time = "2025-03-02T11:11:07.576Z" },
1121
-
{ url = "https://files.pythonhosted.org/packages/1e/bb/d0ae801e31a5052dbb1744a45318f822078dd4ce4cc7f49bfe97e7768f7e/pybase64-1.4.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:af41e2e6015f980d15eae0df0c365df94c7587790aea236ba0bf48c65a9fa04e", size = 31488, upload-time = "2025-03-02T11:11:09.758Z" },
1122
-
{ url = "https://files.pythonhosted.org/packages/be/34/bf4119a88b2ad0536a8ed9d66ce4d70ff8152eac00ef8a27e5ae35da4328/pybase64-1.4.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9ac21c1943a15552347305943b1d0d6298fb64a98b67c750cb8fb2c190cdefd4", size = 59734, upload-time = "2025-03-02T11:11:11.493Z" },
1123
-
{ url = "https://files.pythonhosted.org/packages/99/1c/1901547adc7d4f24bdcb2f75cb7dcd3975bff42f39da37d4bd218c608c60/pybase64-1.4.1-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:65567e8f4f31cf6e1a8cc570723cc6b18adda79b4387a18f8d93c157ff5f1979", size = 56529, upload-time = "2025-03-02T11:11:12.657Z" },
1124
-
{ url = "https://files.pythonhosted.org/packages/c5/1e/1993e4b9a03e94fc53552285e3998079d864fff332798bf30c25afdac8f3/pybase64-1.4.1-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:988e987f8cfe2dfde7475baf5f12f82b2f454841aef3a174b694a57a92d5dfb0", size = 59114, upload-time = "2025-03-02T11:11:13.972Z" },
1125
-
{ url = "https://files.pythonhosted.org/packages/c5/f6/061fee5b7ba38b8824dd95752ab7115cf183ffbd3330d5fc1734a47b0f9e/pybase64-1.4.1-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:92b2305ac2442b451e19d42c4650c3bb090d6aa9abd87c0c4d700267d8fa96b1", size = 60095, upload-time = "2025-03-02T11:11:15.182Z" },
1126
-
{ url = "https://files.pythonhosted.org/packages/37/da/ccfe5d1a9f1188cd703390522e96a31045c5b93af84df04a98e69ada5c8b/pybase64-1.4.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d1ff80e03357b09dab016f41b4c75cf06e9b19cda7f898e4f3681028a3dff29b", size = 68431, upload-time = "2025-03-02T11:11:17.059Z" },
1127
-
{ url = "https://files.pythonhosted.org/packages/c3/d3/8ca4b0695876b52c0073a3557a65850b6d5c723333b5a271ab10a1085852/pybase64-1.4.1-cp312-cp312-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2cdda297e668e118f6b9ba804e858ff49e3dd945d01fdd147de90445fd08927d", size = 71417, upload-time = "2025-03-02T11:11:19.178Z" },
1128
-
{ url = "https://files.pythonhosted.org/packages/94/34/5f8f72d1b7b4ddb64c48d60160f3f4f03cfd0bfd2e7068d4558499d948ed/pybase64-1.4.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:51a24d21a21a959eb8884f24346a6480c4bd624aa7976c9761504d847a2f9364", size = 58429, upload-time = "2025-03-02T11:11:20.351Z" },
1129
-
{ url = "https://files.pythonhosted.org/packages/95/b7/edf53af308c6e8aada1e6d6a0a3789176af8cbae37a2ce084eb9da87bf33/pybase64-1.4.1-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:b19e169ea1b8a15a03d3a379116eb7b17740803e89bc6eb3efcc74f532323cf7", size = 52228, upload-time = "2025-03-02T11:11:21.632Z" },
1130
-
{ url = "https://files.pythonhosted.org/packages/0c/bf/c9df141e24a259f38a38bdda5a3b63206f13e612ecbd3880fa10625e0294/pybase64-1.4.1-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:8a9f1b614efd41240c9bb2cf66031aa7a2c3c092c928f9d429511fe18d4a3fd1", size = 68632, upload-time = "2025-03-02T11:11:23.56Z" },
1131
-
{ url = "https://files.pythonhosted.org/packages/e9/ae/1aec72325a3c48f7776cc55a3bab8b168eb77aea821253da8b9f09713734/pybase64-1.4.1-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:d9947b5e289e2c5b018ddc2aee2b9ed137b8aaaba7edfcb73623e576a2407740", size = 57682, upload-time = "2025-03-02T11:11:25.656Z" },
1132
-
{ url = "https://files.pythonhosted.org/packages/4d/7a/7ad2799c0b3c4e2f7b993e1636468445c30870ca5485110b589b8921808d/pybase64-1.4.1-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:ba4184ea43aa88a5ab8d6d15db284689765c7487ff3810764d8d823b545158e6", size = 56308, upload-time = "2025-03-02T11:11:26.803Z" },
1133
-
{ url = "https://files.pythonhosted.org/packages/be/01/6008a4fbda0c4308dab00b95aedde8748032d7620bd95b686619c66917fe/pybase64-1.4.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:4471257628785296efb2d50077fb9dfdbd4d2732c3487795224dd2644216fb07", size = 70784, upload-time = "2025-03-02T11:11:28.427Z" },
1134
-
{ url = "https://files.pythonhosted.org/packages/27/31/913365a4f0e2922ec369ddaa3a1d6c11059acbe54531b003653efa007a48/pybase64-1.4.1-cp312-cp312-win32.whl", hash = "sha256:614561297ad14de315dd27381fd6ec3ea4de0d8206ba4c7678449afaff8a2009", size = 34271, upload-time = "2025-03-02T11:11:30.585Z" },
1135
-
{ url = "https://files.pythonhosted.org/packages/d9/98/4d514d3e4c04819d80bccf9ea7b30d1cfc701832fa5ffca168f585004488/pybase64-1.4.1-cp312-cp312-win_amd64.whl", hash = "sha256:35635db0d64fcbe9b3fad265314c052c47dc9bcef8dea17493ea8e3c15b2b972", size = 36496, upload-time = "2025-03-02T11:11:32.552Z" },
1136
-
{ url = "https://files.pythonhosted.org/packages/c4/61/01353bc9c461e7b36d692daca3eee9616d8936ea6d8a64255ef7ec9ac307/pybase64-1.4.1-cp312-cp312-win_arm64.whl", hash = "sha256:b4ccb438c4208ff41a260b70994c30a8631051f3b025cdca48be586b068b8f49", size = 29692, upload-time = "2025-03-02T11:11:33.735Z" },
1137
-
{ url = "https://files.pythonhosted.org/packages/4b/1a/4e243ba702c07df3df3ba1795cfb02cf7a4242c53fc574b06a2bfa4f8478/pybase64-1.4.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:d1c38d9c4a7c132d45859af8d5364d3ce90975a42bd5995d18d174fb57621973", size = 38149, upload-time = "2025-03-02T11:11:35.537Z" },
1138
-
{ url = "https://files.pythonhosted.org/packages/9c/35/3eae81bc8688a83f8b5bb84979d88e2cc3c3279a3b870a506f277d746c56/pybase64-1.4.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:ab0b93ea93cf1f56ca4727d678a9c0144c2653e9de4e93e789a92b4e098c07d9", size = 31485, upload-time = "2025-03-02T11:11:36.656Z" },
1139
-
{ url = "https://files.pythonhosted.org/packages/48/55/d99b9ff8083573bbf97fc433bbc20e2efb612792025f3bad0868c96c37ce/pybase64-1.4.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:644f393e9bb7f3bacc5cbd3534d02e1b660b258fc8315ecae74d2e23265e5c1f", size = 59738, upload-time = "2025-03-02T11:11:38.468Z" },
1140
-
{ url = "https://files.pythonhosted.org/packages/63/3c/051512b9e139a11585447b286ede5ac3b284ce5df85de37eb8cff57d90f8/pybase64-1.4.1-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:ff172a4dacbd964e5edcf1c2152dae157aabf856508aed15276f46d04a22128e", size = 56239, upload-time = "2025-03-02T11:11:39.718Z" },
1141
-
{ url = "https://files.pythonhosted.org/packages/af/11/f40c5cca587274d50baee88540a7839576204cb425fe2f73a752ea48ae74/pybase64-1.4.1-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:b2ab7b4535abc72d40114540cae32c9e07d76ffba132bdd5d4fff5fe340c5801", size = 59137, upload-time = "2025-03-02T11:11:41.524Z" },
1142
-
{ url = "https://files.pythonhosted.org/packages/1a/a9/ace9f6d0926962c083671d7df247de442ef63cd06bd134f7c8251aab5c51/pybase64-1.4.1-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:da66eb7cfb641486944fb0b95ab138e691ab78503115022caf992b6c89b10396", size = 60109, upload-time = "2025-03-02T11:11:42.699Z" },
1143
-
{ url = "https://files.pythonhosted.org/packages/88/9c/d4e308b4b4e3b513bc084fc71b4e2dd00d21d4cd245a9a28144d2f6b03c9/pybase64-1.4.1-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:678f573ea1d06183b32d0336044fb5db60396333599dffcce28ffa3b68319fc0", size = 68391, upload-time = "2025-03-02T11:11:43.898Z" },
1144
-
{ url = "https://files.pythonhosted.org/packages/53/87/e184bf982a3272f1021f417e5a18fac406e042c606950e9082fc3b0cec30/pybase64-1.4.1-cp313-cp313-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4bccdf340c2a1d3dd1f41528f192265ddce7f8df1ee4f7b5b9163cdba0fe0ccb", size = 71438, upload-time = "2025-03-02T11:11:45.112Z" },
1145
-
{ url = "https://files.pythonhosted.org/packages/2f/7f/d6e6a72db055eb2dc01ab877d8ee39d05cb665403433ff922fb95d1003ad/pybase64-1.4.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:1ddf6366c34eb78931fd8a47c00cb886ba187a5ff8e6dbffe1d9dae4754b6c28", size = 58437, upload-time = "2025-03-02T11:11:47.034Z" },
1146
-
{ url = "https://files.pythonhosted.org/packages/71/ef/c9051f2c0128194b861f3cd3b2d211b8d4d21ed2be354aa669fe29a059d8/pybase64-1.4.1-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:500afcb717a84e262c68f0baf9c56abaf97e2f058ba80c5546a9ed21ff4b705f", size = 52267, upload-time = "2025-03-02T11:11:48.448Z" },
1147
-
{ url = "https://files.pythonhosted.org/packages/12/92/ae30a54eaa437989839c4f2404c1f004d7383c0f46d6ebb83546d587d2a7/pybase64-1.4.1-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:d2de043312a1e7f15ee6d2b7d9e39ee6afe24f144e2248cce942b6be357b70d8", size = 68659, upload-time = "2025-03-02T11:11:49.615Z" },
1148
-
{ url = "https://files.pythonhosted.org/packages/2b/65/d94788a35904f21694c4c581bcee2e165bec2408cc6fbed85a7fef5959ae/pybase64-1.4.1-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:c36e214c25fb8dd4f3ecdaa0ff90073b793056e0065cc0a1e1e5525a6866a1ad", size = 57727, upload-time = "2025-03-02T11:11:50.843Z" },
1149
-
{ url = "https://files.pythonhosted.org/packages/d0/97/8db416066b7917909c38346c03a8f3e6d4fc8a1dc98636408156514269ad/pybase64-1.4.1-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:8ec003224f6e36e8e607a1bb8df182b367c87ca7135788ffe89173c7d5085005", size = 56302, upload-time = "2025-03-02T11:11:52.547Z" },
1150
-
{ url = "https://files.pythonhosted.org/packages/70/0b/98f0601391befe0f19aa8cbda821c62d95056a94cc41d452fe893d205523/pybase64-1.4.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:c536c6ed161e6fb19f6acd6074f29a4c78cb41c9155c841d56aec1a4d20d5894", size = 70779, upload-time = "2025-03-02T11:11:53.735Z" },
1151
-
{ url = "https://files.pythonhosted.org/packages/cc/07/116119c5b20688c052697f677cf56f05aa766535ff7691aba38447d4a0d8/pybase64-1.4.1-cp313-cp313-win32.whl", hash = "sha256:1d34872e5aa2eff9dc54cedaf36038bbfbd5a3440fdf0bdc5b3c81c54ef151ea", size = 34266, upload-time = "2025-03-02T11:11:54.892Z" },
1152
-
{ url = "https://files.pythonhosted.org/packages/c0/f5/a7eed9f3692209a9869a28bdd92deddf8cbffb06b40954f89f4577e5c96e/pybase64-1.4.1-cp313-cp313-win_amd64.whl", hash = "sha256:8b7765515d7e0a48ddfde914dc2b1782234ac188ce3fab173b078a6e82ec7017", size = 36488, upload-time = "2025-03-02T11:11:56.063Z" },
1153
-
{ url = "https://files.pythonhosted.org/packages/5d/8a/0d65c4dcda06487305035f24888ffed219897c03fb7834635d5d5e27dae1/pybase64-1.4.1-cp313-cp313-win_arm64.whl", hash = "sha256:7fb782f3ceb30e24dc4d8d99c1221a381917bffaf85d29542f0f25b51829987c", size = 29690, upload-time = "2025-03-02T11:11:57.702Z" },
1154
-
{ url = "https://files.pythonhosted.org/packages/a3/83/646d65fafe5e6edbdaf4c9548efb2e1dd7784caddbde3ff8a843dd942b0f/pybase64-1.4.1-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:2a98d323e97444a38db38e022ccaf1d3e053b1942455790a93f29086c687855f", size = 38506, upload-time = "2025-03-02T11:11:58.936Z" },
1155
-
{ url = "https://files.pythonhosted.org/packages/87/14/dbf7fbbe91d71c8044fefe20d22480ad64097e2ba424944de512550e12a4/pybase64-1.4.1-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:19ef58d36b9b32024768fcedb024f32c05eb464128c75c07cac2b50c9ed47f4a", size = 31894, upload-time = "2025-03-02T11:12:00.762Z" },
1156
-
{ url = "https://files.pythonhosted.org/packages/bd/5d/f8a47da2a5f8b599297b307d3bd0293adedc4e135be310620f061906070f/pybase64-1.4.1-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:04fee0f5c174212868fde97b109db8fac8249b306a00ea323531ee61c7b0f398", size = 65212, upload-time = "2025-03-02T11:12:01.911Z" },
1157
-
{ url = "https://files.pythonhosted.org/packages/90/95/ad9869c7cdcce3e8ada619dab5f9f2eff315ffb001704a3718c1597a2119/pybase64-1.4.1-cp313-cp313t-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:47737ff9eabc14b7553de6bc6395d67c5be80afcdbd25180285d13e089e40888", size = 60300, upload-time = "2025-03-02T11:12:03.071Z" },
1158
-
{ url = "https://files.pythonhosted.org/packages/c2/91/4d8268b2488ae10c485cba04ecc23a5a7bdfb47ce9b876017b11ea0249a2/pybase64-1.4.1-cp313-cp313t-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:0d8b5888cc239654fe68a0db196a18575ffc8b1c8c8f670c2971a44e3b7fe682", size = 63773, upload-time = "2025-03-02T11:12:04.231Z" },
1159
-
{ url = "https://files.pythonhosted.org/packages/ae/1a/8afd27facc0723b1d69231da8c59a2343feb255f5db16f8b8765ddf1600b/pybase64-1.4.1-cp313-cp313t-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:6a1af8d387dbce05944b65a618639918804b2d4438fed32bb7f06d9c90dbed01", size = 64684, upload-time = "2025-03-02T11:12:05.409Z" },
- { url = "https://files.pythonhosted.org/packages/cc/cd/422c74397210051125419fc8e425506ff27c04665459e18c8f7b037a754b/pybase64-1.4.1-cp313-cp313t-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:0b0093c52bd099b80e422ad8cddf6f2c1ac1b09cb0922cca04891d736c2ad647", size = 72880, upload-time = "2025-03-02T11:12:06.652Z" },
- { url = "https://files.pythonhosted.org/packages/04/c1/c4f02f1d5f8e8a3d75715a3dd04196dde9e263e471470d099a26e91ebe2f/pybase64-1.4.1-cp313-cp313t-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:15e54f9b2a1686f5bbdc4ac8440b6f6145d9699fd53aa30f347931f3063b0915", size = 75344, upload-time = "2025-03-02T11:12:07.816Z" },
- { url = "https://files.pythonhosted.org/packages/6e/0b/013006ca984f0472476cf7c0540db2e2b1f997d52977b15842a7681ab79c/pybase64-1.4.1-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:3a0fdcf13f986c82f7ef04a1cd1163c70f39662d6f02aa4e7b448dacb966b39f", size = 63439, upload-time = "2025-03-02T11:12:09.669Z" },
- { url = "https://files.pythonhosted.org/packages/8a/d5/7848543b3c8dcc5396be574109acbe16706e6a9b4dbd9fc4e22f211668a9/pybase64-1.4.1-cp313-cp313t-musllinux_1_2_armv7l.whl", hash = "sha256:ac03f8eba72dd6da15dc25bb3e1b440ad21f5cb7ee2e6ffbbae4bd1b206bb503", size = 56004, upload-time = "2025-03-02T11:12:10.981Z" },
- { url = "https://files.pythonhosted.org/packages/63/58/70de1efb1b6f21d7aaea33578868214f82925d969e2091f7de3175a10092/pybase64-1.4.1-cp313-cp313t-musllinux_1_2_i686.whl", hash = "sha256:ea835272570aa811e08ae17612632b057623a9b27265d44288db666c02b438dc", size = 72460, upload-time = "2025-03-02T11:12:13.122Z" },
- { url = "https://files.pythonhosted.org/packages/90/0d/aa52dd1b1f25b98b1d94cc0522f864b03de55aa115de67cb6dbbddec4f46/pybase64-1.4.1-cp313-cp313t-musllinux_1_2_ppc64le.whl", hash = "sha256:8f52c4c29a35381f3ae06d520144a0707132f2cbfb53bc907b74811734bc4ef3", size = 62295, upload-time = "2025-03-02T11:12:15.004Z" },
- { url = "https://files.pythonhosted.org/packages/39/cf/4d378a330249c937676ee8eab7992ec700ade362f35db36c15922b33b1c8/pybase64-1.4.1-cp313-cp313t-musllinux_1_2_s390x.whl", hash = "sha256:fa5cdabcb4d21b7e56d0b2edd7ed6fa933ac3535be30c2a9cf0a2e270c5369c8", size = 60604, upload-time = "2025-03-02T11:12:16.23Z" },
- { url = "https://files.pythonhosted.org/packages/15/45/e3f23929018d0aada84246ddd398843050971af614da67450bb20f45f880/pybase64-1.4.1-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:8db9acf239bb71a888748bc9ffc12c97c1079393a38bc180c0548330746ece94", size = 74500, upload-time = "2025-03-02T11:12:17.48Z" },
- { url = "https://files.pythonhosted.org/packages/8d/98/6d2adaec318cae6ee968a10df0a7e870f17ee385ef623bcb2ab63fa11b59/pybase64-1.4.1-cp313-cp313t-win32.whl", hash = "sha256:bc06186cfa9a43e871fdca47c1379bdf1cfe964bd94a47f0919a1ffab195b39e", size = 34543, upload-time = "2025-03-02T11:12:18.625Z" },
- { url = "https://files.pythonhosted.org/packages/8e/e7/1823de02d2c23324cf1142e9dce53b032085cee06c3f982806040f975ce7/pybase64-1.4.1-cp313-cp313t-win_amd64.whl", hash = "sha256:02c3647d270af1a3edd35e485bb7ccfe82180b8347c49e09973466165c03d7aa", size = 36909, upload-time = "2025-03-02T11:12:20.122Z" },
- { url = "https://files.pythonhosted.org/packages/43/6a/8ec0e4461bf89ef0499ef6c746b081f3520a1e710aeb58730bae693e0681/pybase64-1.4.1-cp313-cp313t-win_arm64.whl", hash = "sha256:4b3635e5873707906e72963c447a67969cfc6bac055432a57a91d7a4d5164fdf", size = 29961, upload-time = "2025-03-02T11:12:21.908Z" },
]

[[package]]
···
sdist = { url = "https://files.pythonhosted.org/packages/00/dd/4325abf92c39ba8623b5af936ddb36ffcfe0beae70405d456ab1fb2f5b8c/pydantic-2.11.7.tar.gz", hash = "sha256:d989c3c6cb79469287b1569f7447a17848c998458d49ebe294e975b9baf0f0db", size = 788350, upload-time = "2025-06-14T08:33:17.137Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/6a/c0/ec2b1c8712ca690e5d61979dee872603e92b8a32f94cc1b72d53beab008a/pydantic-2.11.7-py3-none-any.whl", hash = "sha256:dde5df002701f6de26248661f6835bbe296a47bf73990135c7d07ce741b9623b", size = 444782, upload-time = "2025-06-14T08:33:14.905Z" },
]

[[package]]
···
]

[[package]]
name = "pytest"
version = "8.4.1"
source = { registry = "https://pypi.org/simple" }
···
]

[[package]]
name = "rich"
version = "14.0.0"
source = { registry = "https://pypi.org/simple" }
···
]

[[package]]
name = "rpds-py"
version = "0.26.0"
source = { registry = "https://pypi.org/simple" }
···

[[package]]
name = "turbopuffer"
- version = "0.5.13"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "aiohttp" },
···
{ name = "sniffio" },
{ name = "typing-extensions" },
]
- sdist = { url = "https://files.pythonhosted.org/packages/79/a2/59f6dbfcc43eb08c91bf77670ade5ca3ddc293c518db2b29703643799273/turbopuffer-0.5.13.tar.gz", hash = "sha256:e48ead6af4d493201ec6c9dfaaa6dca9bc96322f9a12f84d6866159a76eb6c27", size = 134367, upload-time = "2025-07-18T21:34:34.793Z" }
wheels = [
- { url = "https://files.pythonhosted.org/packages/35/fd/e27b0fc9b9bebf92dc24cb54ff3862aae2b6280d98704b8eff5e98e84ccd/turbopuffer-0.5.13-py3-none-any.whl", hash = "sha256:d48263aab236d697ab3321c00870ba1104cdddcd315d67f85d1bd150621e9ae8", size = 101727, upload-time = "2025-07-18T21:34:33.27Z" },
]

[[package]]
···

[[package]]
name = "websockets"
- version = "13.1"
source = { registry = "https://pypi.org/simple" }
- sdist = { url = "https://files.pythonhosted.org/packages/e2/73/9223dbc7be3dcaf2a7bbf756c351ec8da04b1fa573edaf545b95f6b0c7fd/websockets-13.1.tar.gz", hash = "sha256:a3b3366087c1bc0a2795111edcadddb8b3b59509d5db5d7ea3fdd69f954a8878", size = 158549, upload-time = "2024-09-21T17:34:21.54Z" }
wheels = [
- { url = "https://files.pythonhosted.org/packages/df/46/c426282f543b3c0296cf964aa5a7bb17e984f58dde23460c3d39b3148fcf/websockets-13.1-cp312-cp312-macosx_10_9_universal2.whl", hash = "sha256:9d75baf00138f80b48f1eac72ad1535aac0b6461265a0bcad391fc5aba875cfc", size = 157821, upload-time = "2024-09-21T17:32:56.442Z" },
- { url = "https://files.pythonhosted.org/packages/aa/85/22529867010baac258da7c45848f9415e6cf37fef00a43856627806ffd04/websockets-13.1-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:9b6f347deb3dcfbfde1c20baa21c2ac0751afaa73e64e5b693bb2b848efeaa49", size = 155480, upload-time = "2024-09-21T17:32:57.698Z" },
- { url = "https://files.pythonhosted.org/packages/29/2c/bdb339bfbde0119a6e84af43ebf6275278698a2241c2719afc0d8b0bdbf2/websockets-13.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:de58647e3f9c42f13f90ac7e5f58900c80a39019848c5547bc691693098ae1bd", size = 155715, upload-time = "2024-09-21T17:32:59.429Z" },
- { url = "https://files.pythonhosted.org/packages/9f/d0/8612029ea04c5c22bf7af2fd3d63876c4eaeef9b97e86c11972a43aa0e6c/websockets-13.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a1b54689e38d1279a51d11e3467dd2f3a50f5f2e879012ce8f2d6943f00e83f0", size = 165647, upload-time = "2024-09-21T17:33:00.495Z" },
- { url = "https://files.pythonhosted.org/packages/56/04/1681ed516fa19ca9083f26d3f3a302257e0911ba75009533ed60fbb7b8d1/websockets-13.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:cf1781ef73c073e6b0f90af841aaf98501f975d306bbf6221683dd594ccc52b6", size = 164592, upload-time = "2024-09-21T17:33:02.223Z" },
- { url = "https://files.pythonhosted.org/packages/38/6f/a96417a49c0ed132bb6087e8e39a37db851c70974f5c724a4b2a70066996/websockets-13.1-cp312-cp312-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8d23b88b9388ed85c6faf0e74d8dec4f4d3baf3ecf20a65a47b836d56260d4b9", size = 165012, upload-time = "2024-09-21T17:33:03.288Z" },
- { url = "https://files.pythonhosted.org/packages/40/8b/fccf294919a1b37d190e86042e1a907b8f66cff2b61e9befdbce03783e25/websockets-13.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:3c78383585f47ccb0fcf186dcb8a43f5438bd7d8f47d69e0b56f71bf431a0a68", size = 165311, upload-time = "2024-09-21T17:33:04.728Z" },
- { url = "https://files.pythonhosted.org/packages/c1/61/f8615cf7ce5fe538476ab6b4defff52beb7262ff8a73d5ef386322d9761d/websockets-13.1-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:d6d300f8ec35c24025ceb9b9019ae9040c1ab2f01cddc2bcc0b518af31c75c14", size = 164692, upload-time = "2024-09-21T17:33:05.829Z" },
- { url = "https://files.pythonhosted.org/packages/5c/f1/a29dd6046d3a722d26f182b783a7997d25298873a14028c4760347974ea3/websockets-13.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:a9dcaf8b0cc72a392760bb8755922c03e17a5a54e08cca58e8b74f6902b433cf", size = 164686, upload-time = "2024-09-21T17:33:06.823Z" },
- { url = "https://files.pythonhosted.org/packages/0f/99/ab1cdb282f7e595391226f03f9b498f52109d25a2ba03832e21614967dfa/websockets-13.1-cp312-cp312-win32.whl", hash = "sha256:2f85cf4f2a1ba8f602298a853cec8526c2ca42a9a4b947ec236eaedb8f2dc80c", size = 158712, upload-time = "2024-09-21T17:33:07.877Z" },
- { url = "https://files.pythonhosted.org/packages/46/93/e19160db48b5581feac8468330aa11b7292880a94a37d7030478596cc14e/websockets-13.1-cp312-cp312-win_amd64.whl", hash = "sha256:38377f8b0cdeee97c552d20cf1865695fcd56aba155ad1b4ca8779a5b6ef4ac3", size = 159145, upload-time = "2024-09-21T17:33:09.202Z" },
- { url = "https://files.pythonhosted.org/packages/51/20/2b99ca918e1cbd33c53db2cace5f0c0cd8296fc77558e1908799c712e1cd/websockets-13.1-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:a9ab1e71d3d2e54a0aa646ab6d4eebfaa5f416fe78dfe4da2839525dc5d765c6", size = 157828, upload-time = "2024-09-21T17:33:10.987Z" },
- { url = "https://files.pythonhosted.org/packages/b8/47/0932a71d3d9c0e9483174f60713c84cee58d62839a143f21a2bcdbd2d205/websockets-13.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:b9d7439d7fab4dce00570bb906875734df13d9faa4b48e261c440a5fec6d9708", size = 155487, upload-time = "2024-09-21T17:33:12.153Z" },
- { url = "https://files.pythonhosted.org/packages/a9/60/f1711eb59ac7a6c5e98e5637fef5302f45b6f76a2c9d64fd83bbb341377a/websockets-13.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:327b74e915cf13c5931334c61e1a41040e365d380f812513a255aa804b183418", size = 155721, upload-time = "2024-09-21T17:33:13.909Z" },
- { url = "https://files.pythonhosted.org/packages/6a/e6/ba9a8db7f9d9b0e5f829cf626ff32677f39824968317223605a6b419d445/websockets-13.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:325b1ccdbf5e5725fdcb1b0e9ad4d2545056479d0eee392c291c1bf76206435a", size = 165609, upload-time = "2024-09-21T17:33:14.967Z" },
- { url = "https://files.pythonhosted.org/packages/c1/22/4ec80f1b9c27a0aebd84ccd857252eda8418ab9681eb571b37ca4c5e1305/websockets-13.1-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:346bee67a65f189e0e33f520f253d5147ab76ae42493804319b5716e46dddf0f", size = 164556, upload-time = "2024-09-21T17:33:17.113Z" },
- { url = "https://files.pythonhosted.org/packages/27/ac/35f423cb6bb15600438db80755609d27eda36d4c0b3c9d745ea12766c45e/websockets-13.1-cp313-cp313-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:91a0fa841646320ec0d3accdff5b757b06e2e5c86ba32af2e0815c96c7a603c5", size = 164993, upload-time = "2024-09-21T17:33:18.168Z" },
- { url = "https://files.pythonhosted.org/packages/31/4e/98db4fd267f8be9e52e86b6ee4e9aa7c42b83452ea0ea0672f176224b977/websockets-13.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:18503d2c5f3943e93819238bf20df71982d193f73dcecd26c94514f417f6b135", size = 165360, upload-time = "2024-09-21T17:33:19.233Z" },
- { url = "https://files.pythonhosted.org/packages/3f/15/3f0de7cda70ffc94b7e7024544072bc5b26e2c1eb36545291abb755d8cdb/websockets-13.1-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:a9cd1af7e18e5221d2878378fbc287a14cd527fdd5939ed56a18df8a31136bb2", size = 164745, upload-time = "2024-09-21T17:33:20.361Z" },
- { url = "https://files.pythonhosted.org/packages/a1/6e/66b6b756aebbd680b934c8bdbb6dcb9ce45aad72cde5f8a7208dbb00dd36/websockets-13.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:70c5be9f416aa72aab7a2a76c90ae0a4fe2755c1816c153c1a2bcc3333ce4ce6", size = 164732, upload-time = "2024-09-21T17:33:23.103Z" },
- { url = "https://files.pythonhosted.org/packages/35/c6/12e3aab52c11aeb289e3dbbc05929e7a9d90d7a9173958477d3ef4f8ce2d/websockets-13.1-cp313-cp313-win32.whl", hash = "sha256:624459daabeb310d3815b276c1adef475b3e6804abaf2d9d2c061c319f7f187d", size = 158709, upload-time = "2024-09-21T17:33:24.196Z" },
- { url = "https://files.pythonhosted.org/packages/41/d8/63d6194aae711d7263df4498200c690a9c39fb437ede10f3e157a6343e0d/websockets-13.1-cp313-cp313-win_amd64.whl", hash = "sha256:c518e84bb59c2baae725accd355c8dc517b4a3ed8db88b4bc93c78dae2974bf2", size = 159144, upload-time = "2024-09-21T17:33:25.96Z" },
- { url = "https://files.pythonhosted.org/packages/56/27/96a5cd2626d11c8280656c6c71d8ab50fe006490ef9971ccd154e0c42cd2/websockets-13.1-py3-none-any.whl", hash = "sha256:a9a396a6ad26130cdae92ae10c36af09d9bfe6cafe69670fd3b6da9b07b4044f", size = 152134, upload-time = "2024-09-21T17:34:19.904Z" },
]

[[package]]
···
version = 1
+ revision = 3
requires-python = ">=3.12"

[[package]]
···

[[package]]
name = "atproto"
+ version = "0.0.62.dev4"
+ source = { git = "https://github.com/MarshalX/atproto.git?rev=refs%2Fpull%2F605%2Fhead#1a2188371a25b248e0350826eda9f5e55d9c45bf" }
dependencies = [
{ name = "click" },
{ name = "cryptography" },
···
{ name = "typing-extensions" },
{ name = "websockets" },
]

[[package]]
name = "attrs"
···
]

[[package]]
+ name = "authlib"
+ version = "1.6.5"
+ source = { registry = "https://pypi.org/simple" }
+ dependencies = [
+ { name = "cryptography" },
+ ]
+ sdist = { url = "https://files.pythonhosted.org/packages/cd/3f/1d3bbd0bf23bdd99276d4def22f29c27a914067b4cf66f753ff9b8bbd0f3/authlib-1.6.5.tar.gz", hash = "sha256:6aaf9c79b7cc96c900f0b284061691c5d4e61221640a948fe690b556a6d6d10b", size = 164553, upload-time = "2025-10-02T13:36:09.489Z" }
+ wheels = [
+ { url = "https://files.pythonhosted.org/packages/f8/aa/5082412d1ee302e9e7d80b6949bc4d2a8fa1149aaab610c5fc24709605d6/authlib-1.6.5-py2.py3-none-any.whl", hash = "sha256:3e0e0507807f842b02175507bdee8957a1d5707fd4afb17c32fb43fee90b6e3a", size = 243608, upload-time = "2025-10-02T13:36:07.637Z" },
+ ]
+
+ [[package]]
name = "bot"
source = { editable = "." }
dependencies = [
{ name = "anthropic" },
{ name = "atproto" },
{ name = "fastapi" },
+ { name = "fastmcp" },
{ name = "openai" },
{ name = "pydantic-ai" },
{ name = "pydantic-settings" },
···
[package.metadata]
requires-dist = [
{ name = "anthropic" },
+ { name = "atproto", git = "https://github.com/MarshalX/atproto.git?rev=refs%2Fpull%2F605%2Fhead" },
{ name = "fastapi" },
+ { name = "fastmcp", specifier = ">=0.8.0" },
{ name = "openai" },
{ name = "pydantic-ai" },
{ name = "pydantic-settings" },
···
]

[[package]]
+ name = "cyclopts"
+ version = "3.24.0"
+ source = { registry = "https://pypi.org/simple" }
+ dependencies = [
+ { name = "attrs" },
+ { name = "docstring-parser", marker = "python_full_version < '4'" },
+ { name = "rich" },
+ { name = "rich-rst" },
+ ]
+ sdist = { url = "https://files.pythonhosted.org/packages/30/ca/7782da3b03242d5f0a16c20371dff99d4bd1fedafe26bc48ff82e42be8c9/cyclopts-3.24.0.tar.gz", hash = "sha256:de6964a041dfb3c57bf043b41e68c43548227a17de1bad246e3a0bfc5c4b7417", size = 76131, upload-time = "2025-09-08T15:40:57.75Z" }
+ wheels = [
+ { url = "https://files.pythonhosted.org/packages/f0/8b/2c95f0645c6f40211896375e6fa51f504b8ccb29c21f6ae661fe87ab044e/cyclopts-3.24.0-py3-none-any.whl", hash = "sha256:809d04cde9108617106091140c3964ee6fceb33cecdd537f7ffa360bde13ed71", size = 86154, upload-time = "2025-09-08T15:40:56.41Z" },
+ ]
+
+ [[package]]
name = "distro"
version = "1.9.0"
source = { registry = "https://pypi.org/simple" }
···
sdist = { url = "https://files.pythonhosted.org/packages/b5/4a/263763cb2ba3816dd94b08ad3a33d5fdae34ecb856678773cc40a3605829/dnspython-2.7.0.tar.gz", hash = "sha256:ce9c432eda0dc91cf618a5cedf1a4e142651196bbcd2c80e89ed5a907e5cfaf1", size = 345197, upload-time = "2024-10-05T20:14:59.362Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/68/1b/e0a87d256e40e8c888847551b20a017a6b98139178505dc7ffb96f04e954/dnspython-2.7.0-py3-none-any.whl", hash = "sha256:b4c34b7d10b51bcc3a5071e7b8dee77939f1e878477eeecc965e9835f63c6c86", size = 313632, upload-time = "2024-10-05T20:14:57.687Z" },
+ ]
+
+ [[package]]
+ name = "docstring-parser"
+ version = "0.17.0"
+ source = { registry = "https://pypi.org/simple" }
+ sdist = { url = "https://files.pythonhosted.org/packages/b2/9d/c3b43da9515bd270df0f80548d9944e389870713cc1fe2b8fb35fe2bcefd/docstring_parser-0.17.0.tar.gz", hash = "sha256:583de4a309722b3315439bb31d64ba3eebada841f2e2cee23b99df001434c912", size = 27442, upload-time = "2025-07-21T07:35:01.868Z" }
+ wheels = [
+ { url = "https://files.pythonhosted.org/packages/55/e2/2537ebcff11c1ee1ff17d8d0b6f4db75873e3b0fb32c2d4a2ee31ecb310a/docstring_parser-0.17.0-py3-none-any.whl", hash = "sha256:cf2569abd23dce8099b300f9b4fa8191e9582dda731fd533daf54c4551658708", size = 36896, upload-time = "2025-07-21T07:35:00.684Z" },
+ ]
+
+ [[package]]
+ name = "docutils"
+ version = "0.22.2"
+ source = { registry = "https://pypi.org/simple" }
+ sdist = { url = "https://files.pythonhosted.org/packages/4a/c0/89fe6215b443b919cb98a5002e107cb5026854ed1ccb6b5833e0768419d1/docutils-0.22.2.tar.gz", hash = "sha256:9fdb771707c8784c8f2728b67cb2c691305933d68137ef95a75db5f4dfbc213d", size = 2289092, upload-time = "2025-09-20T17:55:47.994Z" }
+ wheels = [
+ { url = "https://files.pythonhosted.org/packages/66/dd/f95350e853a4468ec37478414fc04ae2d61dad7a947b3015c3dcc51a09b9/docutils-0.22.2-py3-none-any.whl", hash = "sha256:b0e98d679283fc3bb0ead8a5da7f501baa632654e7056e9c5846842213d674d8", size = 632667, upload-time = "2025-09-20T17:55:43.052Z" },
+ ]
+
+ [[package]]
+ name = "email-validator"
+ version = "2.3.0"
+ source = { registry = "https://pypi.org/simple" }
+ dependencies = [
+ { name = "dnspython" },
+ { name = "idna" },
+ ]
+ sdist = { url = "https://files.pythonhosted.org/packages/f5/22/900cb125c76b7aaa450ce02fd727f452243f2e91a61af068b40adba60ea9/email_validator-2.3.0.tar.gz", hash = "sha256:9fc05c37f2f6cf439ff414f8fc46d917929974a82244c20eb10231ba60c54426", size = 51238, upload-time = "2025-08-26T13:09:06.831Z" }
+ wheels = [
+ { url = "https://files.pythonhosted.org/packages/de/15/545e2b6cf2e3be84bc1ed85613edd75b8aea69807a71c26f4ca6a9258e82/email_validator-2.3.0-py3-none-any.whl", hash = "sha256:80f13f623413e6b197ae73bb10bf4eb0908faf509ad8362c5edeb0be7fd450b4", size = 35604, upload-time = "2025-08-26T13:09:05.858Z" },
]

[[package]]
···
]

[[package]]
+ name = "exceptiongroup"
+ version = "1.3.0"
+ source = { registry = "https://pypi.org/simple" }
+ dependencies = [
+ { name = "typing-extensions", marker = "python_full_version < '3.13'" },
+ ]
+ sdist = { url = "https://files.pythonhosted.org/packages/0b/9f/a65090624ecf468cdca03533906e7c69ed7588582240cfe7cc9e770b50eb/exceptiongroup-1.3.0.tar.gz", hash = "sha256:b241f5885f560bc56a59ee63ca4c6a8bfa46ae4ad651af316d4e81817bb9fd88", size = 29749, upload-time = "2025-05-10T17:42:51.123Z" }
+ wheels = [
+ { url = "https://files.pythonhosted.org/packages/36/f4/c6e662dade71f56cd2f3735141b265c3c79293c109549c1e6933b0651ffc/exceptiongroup-1.3.0-py3-none-any.whl", hash = "sha256:4d111e6e0c13d0644cad6ddaa7ed0261a0b36971f6d23e7ec9b4b9097da78a10", size = 16674, upload-time = "2025-05-10T17:42:49.33Z" },
+ ]
+
+ [[package]]
name = "fastapi"
version = "0.116.1"
source = { registry = "https://pypi.org/simple" }
···
{ url = "https://files.pythonhosted.org/packages/be/84/02bceb7518867df84027232a75225db758b9b45f12017c9743f45b73101e/fastavro-1.11.1-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2e0babcd81acceb4c60110af9efa25d890dbb68f7de880f806dadeb1e70fe413", size = 3240658, upload-time = "2025-05-18T04:55:27.633Z" },
{ url = "https://files.pythonhosted.org/packages/f2/17/508c846c644d39bc432b027112068b8e96e7560468304d4c0757539dd73a/fastavro-1.11.1-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:b2c0cb8063c7208b53b6867983dc6ae7cc80b91116b51d435d2610a5db2fc52f", size = 3372809, upload-time = "2025-05-18T04:55:30.063Z" },
{ url = "https://files.pythonhosted.org/packages/fe/84/9c2917a70ed570ddbfd1d32ac23200c1d011e36c332e59950d2f6d204941/fastavro-1.11.1-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:1bc2824e9969c04ab6263d269a1e0e5d40b9bd16ade6b70c29d6ffbc4f3cc102", size = 3387171, upload-time = "2025-05-18T04:55:32.531Z" },
+ ]
+
+ [[package]]
+ name = "fastmcp"
+ version = "2.12.4"
+ source = { registry = "https://pypi.org/simple" }
+ dependencies = [
+ { name = "authlib" },
+ { name = "cyclopts" },
+ { name = "exceptiongroup" },
+ { name = "httpx" },
+ { name = "mcp" },
+ { name = "openapi-core" },
+ { name = "openapi-pydantic" },
+ { name = "pydantic", extra = ["email"] },
+ { name = "pyperclip" },
+ { name = "python-dotenv" },
+ { name = "rich" },
+ ]
+ sdist = { url = "https://files.pythonhosted.org/packages/a8/b2/57845353a9bc63002995a982e66f3d0be4ec761e7bcb89e7d0638518d42a/fastmcp-2.12.4.tar.gz", hash = "sha256:b55fe89537038f19d0f4476544f9ca5ac171033f61811cc8f12bdeadcbea5016", size = 7167745, upload-time = "2025-09-26T16:43:27.71Z" }
+ wheels = [
+ { url = "https://files.pythonhosted.org/packages/e2/c7/562ff39f25de27caec01e4c1e88cbb5fcae5160802ba3d90be33165df24f/fastmcp-2.12.4-py3-none-any.whl", hash = "sha256:56188fbbc1a9df58c537063f25958c57b5c4d715f73e395c41b51550b247d140", size = 329090, upload-time = "2025-09-26T16:43:25.314Z" },
]

[[package]]
···
]

[[package]]
+ name = "isodate"
+ version = "0.7.2"
+ source = { registry = "https://pypi.org/simple" }
+ sdist = { url = "https://files.pythonhosted.org/packages/54/4d/e940025e2ce31a8ce1202635910747e5a87cc3a6a6bb2d00973375014749/isodate-0.7.2.tar.gz", hash = "sha256:4cd1aa0f43ca76f4a6c6c0292a85f40b35ec2e43e315b59f06e6d32171a953e6", size = 29705, upload-time = "2024-10-08T23:04:11.5Z" }
+ wheels = [
+ { url = "https://files.pythonhosted.org/packages/15/aa/0aca39a37d3c7eb941ba736ede56d689e7be91cab5d9ca846bde3999eba6/isodate-0.7.2-py3-none-any.whl", hash = "sha256:28009937d8031054830160fce6d409ed342816b543597cece116d966c6d99e15", size = 22320, upload-time = "2024-10-08T23:04:09.501Z" },
+ ]
+
+ [[package]]
name = "jiter"
version = "0.10.0"
source = { registry = "https://pypi.org/simple" }
···
]

[[package]]
+ name = "jsonschema-path"
+ version = "0.3.4"
+ source = { registry = "https://pypi.org/simple" }
+ dependencies = [
+ { name = "pathable" },
+ { name = "pyyaml" },
+ { name = "referencing" },
+ { name = "requests" },
+ ]
+ sdist = { url = "https://files.pythonhosted.org/packages/6e/45/41ebc679c2a4fced6a722f624c18d658dee42612b83ea24c1caf7c0eb3a8/jsonschema_path-0.3.4.tar.gz", hash = "sha256:8365356039f16cc65fddffafda5f58766e34bebab7d6d105616ab52bc4297001", size = 11159, upload-time = "2025-01-24T14:33:16.547Z" }
+ wheels = [
+ { url = "https://files.pythonhosted.org/packages/cb/58/3485da8cb93d2f393bce453adeef16896751f14ba3e2024bc21dc9597646/jsonschema_path-0.3.4-py3-none-any.whl", hash = "sha256:f502191fdc2b22050f9a81c9237be9d27145b9001c55842bece5e94e382e52f8", size = 14810, upload-time = "2025-01-24T14:33:14.652Z" },
+ ]
+
+ [[package]]
name = "jsonschema-specifications"
version = "2025.4.1"
source = { registry = "https://pypi.org/simple" }
···
]

[[package]]
+ name = "lazy-object-proxy"
+ version = "1.12.0"
+ source = { registry = "https://pypi.org/simple" }
+ sdist = { url = "https://files.pythonhosted.org/packages/08/a2/69df9c6ba6d316cfd81fe2381e464db3e6de5db45f8c43c6a23504abf8cb/lazy_object_proxy-1.12.0.tar.gz", hash = "sha256:1f5a462d92fd0cfb82f1fab28b51bfb209fabbe6aabf7f0d51472c0c124c0c61", size = 43681, upload-time = "2025-08-22T13:50:06.783Z" }
+ wheels = [
+ { url = "https://files.pythonhosted.org/packages/0d/1b/b5f5bd6bda26f1e15cd3232b223892e4498e34ec70a7f4f11c401ac969f1/lazy_object_proxy-1.12.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:8ee0d6027b760a11cc18281e702c0309dd92da458a74b4c15025d7fc490deede", size = 26746, upload-time = "2025-08-22T13:42:37.572Z" },
+ { url = "https://files.pythonhosted.org/packages/55/64/314889b618075c2bfc19293ffa9153ce880ac6153aacfd0a52fcabf21a66/lazy_object_proxy-1.12.0-cp312-cp312-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:4ab2c584e3cc8be0dfca422e05ad30a9abe3555ce63e9ab7a559f62f8dbc6ff9", size = 71457, upload-time = "2025-08-22T13:42:38.743Z" },
+ { url = "https://files.pythonhosted.org/packages/11/53/857fc2827fc1e13fbdfc0ba2629a7d2579645a06192d5461809540b78913/lazy_object_proxy-1.12.0-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:14e348185adbd03ec17d051e169ec45686dcd840a3779c9d4c10aabe2ca6e1c0", size = 71036, upload-time = "2025-08-22T13:42:40.184Z" },
+ { url = "https://files.pythonhosted.org/packages/2b/24/e581ffed864cd33c1b445b5763d617448ebb880f48675fc9de0471a95cbc/lazy_object_proxy-1.12.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:c4fcbe74fb85df8ba7825fa05eddca764138da752904b378f0ae5ab33a36c308", size = 69329, upload-time = "2025-08-22T13:42:41.311Z" },
+ { url = "https://files.pythonhosted.org/packages/78/be/15f8f5a0b0b2e668e756a152257d26370132c97f2f1943329b08f057eff0/lazy_object_proxy-1.12.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:563d2ec8e4d4b68ee7848c5ab4d6057a6d703cb7963b342968bb8758dda33a23", size = 70690, upload-time = "2025-08-22T13:42:42.51Z" },
+ { url = "https://files.pythonhosted.org/packages/5d/aa/f02be9bbfb270e13ee608c2b28b8771f20a5f64356c6d9317b20043c6129/lazy_object_proxy-1.12.0-cp312-cp312-win_amd64.whl", hash = "sha256:53c7fd99eb156bbb82cbc5d5188891d8fdd805ba6c1e3b92b90092da2a837073", size = 26563, upload-time = "2025-08-22T13:42:43.685Z" },
+ { url = "https://files.pythonhosted.org/packages/f4/26/b74c791008841f8ad896c7f293415136c66cc27e7c7577de4ee68040c110/lazy_object_proxy-1.12.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:86fd61cb2ba249b9f436d789d1356deae69ad3231dc3c0f17293ac535162672e", size = 26745, upload-time = "2025-08-22T13:42:44.982Z" },
+ { url = "https://files.pythonhosted.org/packages/9b/52/641870d309e5d1fb1ea7d462a818ca727e43bfa431d8c34b173eb090348c/lazy_object_proxy-1.12.0-cp313-cp313-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:81d1852fb30fab81696f93db1b1e55a5d1ff7940838191062f5f56987d5fcc3e", size = 71537, upload-time = "2025-08-22T13:42:46.141Z" },
+ { url = "https://files.pythonhosted.org/packages/47/b6/919118e99d51c5e76e8bf5a27df406884921c0acf2c7b8a3b38d847ab3e9/lazy_object_proxy-1.12.0-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:be9045646d83f6c2664c1330904b245ae2371b5c57a3195e4028aedc9f999655", size = 71141, upload-time = "2025-08-22T13:42:47.375Z" },
+ { url = "https://files.pythonhosted.org/packages/e5/47/1d20e626567b41de085cf4d4fb3661a56c159feaa73c825917b3b4d4f806/lazy_object_proxy-1.12.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:67f07ab742f1adfb3966c40f630baaa7902be4222a17941f3d85fd1dae5565ff", size = 69449, upload-time = "2025-08-22T13:42:48.49Z" },
+ { url = "https://files.pythonhosted.org/packages/58/8d/25c20ff1a1a8426d9af2d0b6f29f6388005fc8cd10d6ee71f48bff86fdd0/lazy_object_proxy-1.12.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:75ba769017b944fcacbf6a80c18b2761a1795b03f8899acdad1f1c39db4409be", size = 70744, upload-time = "2025-08-22T13:42:49.608Z" },
+ { url = "https://files.pythonhosted.org/packages/c0/67/8ec9abe15c4f8a4bcc6e65160a2c667240d025cbb6591b879bea55625263/lazy_object_proxy-1.12.0-cp313-cp313-win_amd64.whl", hash = "sha256:7b22c2bbfb155706b928ac4d74c1a63ac8552a55ba7fff4445155523ea4067e1", size = 26568, upload-time = "2025-08-22T13:42:57.719Z" },
+ { url = "https://files.pythonhosted.org/packages/23/12/cd2235463f3469fd6c62d41d92b7f120e8134f76e52421413a0ad16d493e/lazy_object_proxy-1.12.0-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:4a79b909aa16bde8ae606f06e6bbc9d3219d2e57fb3e0076e17879072b742c65", size = 27391, upload-time = "2025-08-22T13:42:50.62Z" },
+ { url = "https://files.pythonhosted.org/packages/60/9e/f1c53e39bbebad2e8609c67d0830cc275f694d0ea23d78e8f6db526c12d3/lazy_object_proxy-1.12.0-cp313-cp313t-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:338ab2f132276203e404951205fe80c3fd59429b3a724e7b662b2eb539bb1be9", size = 80552, upload-time = "2025-08-22T13:42:51.731Z" },
+ { url = "https://files.pythonhosted.org/packages/4c/b6/6c513693448dcb317d9d8c91d91f47addc09553613379e504435b4cc8b3e/lazy_object_proxy-1.12.0-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:8c40b3c9faee2e32bfce0df4ae63f4e73529766893258eca78548bac801c8f66", size = 82857, upload-time = "2025-08-22T13:42:53.225Z" },
+ { url = "https://files.pythonhosted.org/packages/12/1c/d9c4aaa4c75da11eb7c22c43d7c90a53b4fca0e27784a5ab207768debea7/lazy_object_proxy-1.12.0-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:717484c309df78cedf48396e420fa57fc8a2b1f06ea889df7248fdd156e58847", size = 80833, upload-time = "2025-08-22T13:42:54.391Z" },
+ { url = "https://files.pythonhosted.org/packages/0b/ae/29117275aac7d7d78ae4f5a4787f36ff33262499d486ac0bf3e0b97889f6/lazy_object_proxy-1.12.0-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:a6b7ea5ea1ffe15059eb44bcbcb258f97bcb40e139b88152c40d07b1a1dfc9ac", size = 79516, upload-time = "2025-08-22T13:42:55.812Z" },
+ { url = "https://files.pythonhosted.org/packages/19/40/b4e48b2c38c69392ae702ae7afa7b6551e0ca5d38263198b7c79de8b3bdf/lazy_object_proxy-1.12.0-cp313-cp313t-win_amd64.whl", hash = "sha256:08c465fb5cd23527512f9bd7b4c7ba6cec33e28aad36fbbe46bf7b858f9f3f7f", size = 27656, upload-time = "2025-08-22T13:42:56.793Z" },
+ { url = "https://files.pythonhosted.org/packages/ef/3a/277857b51ae419a1574557c0b12e0d06bf327b758ba94cafc664cb1e2f66/lazy_object_proxy-1.12.0-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:c9defba70ab943f1df98a656247966d7729da2fe9c2d5d85346464bf320820a3", size = 26582, upload-time = "2025-08-22T13:49:49.366Z" },
+ { url = "https://files.pythonhosted.org/packages/1a/b6/c5e0fa43535bb9c87880e0ba037cdb1c50e01850b0831e80eb4f4762f270/lazy_object_proxy-1.12.0-cp314-cp314-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:6763941dbf97eea6b90f5b06eb4da9418cc088fce0e3883f5816090f9afcde4a", size = 71059, upload-time = "2025-08-22T13:49:50.488Z" },
+ { url = "https://files.pythonhosted.org/packages/06/8a/7dcad19c685963c652624702f1a968ff10220b16bfcc442257038216bf55/lazy_object_proxy-1.12.0-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:fdc70d81235fc586b9e3d1aeef7d1553259b62ecaae9db2167a5d2550dcc391a", size = 71034, upload-time = "2025-08-22T13:49:54.224Z" },
+ { url = "https://files.pythonhosted.org/packages/12/ac/34cbfb433a10e28c7fd830f91c5a348462ba748413cbb950c7f259e67aa7/lazy_object_proxy-1.12.0-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:0a83c6f7a6b2bfc11ef3ed67f8cbe99f8ff500b05655d8e7df9aab993a6abc95", size = 69529, upload-time = "2025-08-22T13:49:55.29Z" },
+ { url = "https://files.pythonhosted.org/packages/6f/6a/11ad7e349307c3ca4c0175db7a77d60ce42a41c60bcb11800aabd6a8acb8/lazy_object_proxy-1.12.0-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:256262384ebd2a77b023ad02fbcc9326282bcfd16484d5531154b02bc304f4c5", size = 70391, upload-time = "2025-08-22T13:49:56.35Z" },
+ { url = "https://files.pythonhosted.org/packages/59/97/9b410ed8fbc6e79c1ee8b13f8777a80137d4bc189caf2c6202358e66192c/lazy_object_proxy-1.12.0-cp314-cp314-win_amd64.whl", hash = "sha256:7601ec171c7e8584f8ff3f4e440aa2eebf93e854f04639263875b8c2971f819f", size = 26988, upload-time = "2025-08-22T13:49:57.302Z" },
+ ]
+
+ [[package]]
name = "libipld"
version = "3.1.1"
source = { registry = "https://pypi.org/simple" }
···
1006
]
1007
1008
[[package]]
1009
+
name = "markupsafe"
1010
+
version = "3.0.3"
1011
+
source = { registry = "https://pypi.org/simple" }
1012
+
sdist = { url = "https://files.pythonhosted.org/packages/7e/99/7690b6d4034fffd95959cbe0c02de8deb3098cc577c67bb6a24fe5d7caa7/markupsafe-3.0.3.tar.gz", hash = "sha256:722695808f4b6457b320fdc131280796bdceb04ab50fe1795cd540799ebe1698", size = 80313, upload-time = "2025-09-27T18:37:40.426Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/5a/72/147da192e38635ada20e0a2e1a51cf8823d2119ce8883f7053879c2199b5/markupsafe-3.0.3-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:d53197da72cc091b024dd97249dfc7794d6a56530370992a5e1a08983ad9230e", size = 11615, upload-time = "2025-09-27T18:36:30.854Z" },
{ url = "https://files.pythonhosted.org/packages/9a/81/7e4e08678a1f98521201c3079f77db69fb552acd56067661f8c2f534a718/markupsafe-3.0.3-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:1872df69a4de6aead3491198eaf13810b565bdbeec3ae2dc8780f14458ec73ce", size = 12020, upload-time = "2025-09-27T18:36:31.971Z" },
{ url = "https://files.pythonhosted.org/packages/1e/2c/799f4742efc39633a1b54a92eec4082e4f815314869865d876824c257c1e/markupsafe-3.0.3-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:3a7e8ae81ae39e62a41ec302f972ba6ae23a5c5396c8e60113e9066ef893da0d", size = 24332, upload-time = "2025-09-27T18:36:32.813Z" },
{ url = "https://files.pythonhosted.org/packages/3c/2e/8d0c2ab90a8c1d9a24f0399058ab8519a3279d1bd4289511d74e909f060e/markupsafe-3.0.3-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:d6dd0be5b5b189d31db7cda48b91d7e0a9795f31430b7f271219ab30f1d3ac9d", size = 22947, upload-time = "2025-09-27T18:36:33.86Z" },
{ url = "https://files.pythonhosted.org/packages/2c/54/887f3092a85238093a0b2154bd629c89444f395618842e8b0c41783898ea/markupsafe-3.0.3-cp312-cp312-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:94c6f0bb423f739146aec64595853541634bde58b2135f27f61c1ffd1cd4d16a", size = 21962, upload-time = "2025-09-27T18:36:35.099Z" },
{ url = "https://files.pythonhosted.org/packages/c9/2f/336b8c7b6f4a4d95e91119dc8521402461b74a485558d8f238a68312f11c/markupsafe-3.0.3-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:be8813b57049a7dc738189df53d69395eba14fb99345e0a5994914a3864c8a4b", size = 23760, upload-time = "2025-09-27T18:36:36.001Z" },
{ url = "https://files.pythonhosted.org/packages/32/43/67935f2b7e4982ffb50a4d169b724d74b62a3964bc1a9a527f5ac4f1ee2b/markupsafe-3.0.3-cp312-cp312-musllinux_1_2_riscv64.whl", hash = "sha256:83891d0e9fb81a825d9a6d61e3f07550ca70a076484292a70fde82c4b807286f", size = 21529, upload-time = "2025-09-27T18:36:36.906Z" },
{ url = "https://files.pythonhosted.org/packages/89/e0/4486f11e51bbba8b0c041098859e869e304d1c261e59244baa3d295d47b7/markupsafe-3.0.3-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:77f0643abe7495da77fb436f50f8dab76dbc6e5fd25d39589a0f1fe6548bfa2b", size = 23015, upload-time = "2025-09-27T18:36:37.868Z" },
{ url = "https://files.pythonhosted.org/packages/2f/e1/78ee7a023dac597a5825441ebd17170785a9dab23de95d2c7508ade94e0e/markupsafe-3.0.3-cp312-cp312-win32.whl", hash = "sha256:d88b440e37a16e651bda4c7c2b930eb586fd15ca7406cb39e211fcff3bf3017d", size = 14540, upload-time = "2025-09-27T18:36:38.761Z" },
{ url = "https://files.pythonhosted.org/packages/aa/5b/bec5aa9bbbb2c946ca2733ef9c4ca91c91b6a24580193e891b5f7dbe8e1e/markupsafe-3.0.3-cp312-cp312-win_amd64.whl", hash = "sha256:26a5784ded40c9e318cfc2bdb30fe164bdb8665ded9cd64d500a34fb42067b1c", size = 15105, upload-time = "2025-09-27T18:36:39.701Z" },
{ url = "https://files.pythonhosted.org/packages/e5/f1/216fc1bbfd74011693a4fd837e7026152e89c4bcf3e77b6692fba9923123/markupsafe-3.0.3-cp312-cp312-win_arm64.whl", hash = "sha256:35add3b638a5d900e807944a078b51922212fb3dedb01633a8defc4b01a3c85f", size = 13906, upload-time = "2025-09-27T18:36:40.689Z" },
{ url = "https://files.pythonhosted.org/packages/38/2f/907b9c7bbba283e68f20259574b13d005c121a0fa4c175f9bed27c4597ff/markupsafe-3.0.3-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:e1cf1972137e83c5d4c136c43ced9ac51d0e124706ee1c8aa8532c1287fa8795", size = 11622, upload-time = "2025-09-27T18:36:41.777Z" },
{ url = "https://files.pythonhosted.org/packages/9c/d9/5f7756922cdd676869eca1c4e3c0cd0df60ed30199ffd775e319089cb3ed/markupsafe-3.0.3-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:116bb52f642a37c115f517494ea5feb03889e04df47eeff5b130b1808ce7c219", size = 12029, upload-time = "2025-09-27T18:36:43.257Z" },
{ url = "https://files.pythonhosted.org/packages/00/07/575a68c754943058c78f30db02ee03a64b3c638586fba6a6dd56830b30a3/markupsafe-3.0.3-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:133a43e73a802c5562be9bbcd03d090aa5a1fe899db609c29e8c8d815c5f6de6", size = 24374, upload-time = "2025-09-27T18:36:44.508Z" },
{ url = "https://files.pythonhosted.org/packages/a9/21/9b05698b46f218fc0e118e1f8168395c65c8a2c750ae2bab54fc4bd4e0e8/markupsafe-3.0.3-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:ccfcd093f13f0f0b7fdd0f198b90053bf7b2f02a3927a30e63f3ccc9df56b676", size = 22980, upload-time = "2025-09-27T18:36:45.385Z" },
{ url = "https://files.pythonhosted.org/packages/7f/71/544260864f893f18b6827315b988c146b559391e6e7e8f7252839b1b846a/markupsafe-3.0.3-cp313-cp313-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:509fa21c6deb7a7a273d629cf5ec029bc209d1a51178615ddf718f5918992ab9", size = 21990, upload-time = "2025-09-27T18:36:46.916Z" },
{ url = "https://files.pythonhosted.org/packages/c2/28/b50fc2f74d1ad761af2f5dcce7492648b983d00a65b8c0e0cb457c82ebbe/markupsafe-3.0.3-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:a4afe79fb3de0b7097d81da19090f4df4f8d3a2b3adaa8764138aac2e44f3af1", size = 23784, upload-time = "2025-09-27T18:36:47.884Z" },
{ url = "https://files.pythonhosted.org/packages/ed/76/104b2aa106a208da8b17a2fb72e033a5a9d7073c68f7e508b94916ed47a9/markupsafe-3.0.3-cp313-cp313-musllinux_1_2_riscv64.whl", hash = "sha256:795e7751525cae078558e679d646ae45574b47ed6e7771863fcc079a6171a0fc", size = 21588, upload-time = "2025-09-27T18:36:48.82Z" },
{ url = "https://files.pythonhosted.org/packages/b5/99/16a5eb2d140087ebd97180d95249b00a03aa87e29cc224056274f2e45fd6/markupsafe-3.0.3-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:8485f406a96febb5140bfeca44a73e3ce5116b2501ac54fe953e488fb1d03b12", size = 23041, upload-time = "2025-09-27T18:36:49.797Z" },
{ url = "https://files.pythonhosted.org/packages/19/bc/e7140ed90c5d61d77cea142eed9f9c303f4c4806f60a1044c13e3f1471d0/markupsafe-3.0.3-cp313-cp313-win32.whl", hash = "sha256:bdd37121970bfd8be76c5fb069c7751683bdf373db1ed6c010162b2a130248ed", size = 14543, upload-time = "2025-09-27T18:36:51.584Z" },
{ url = "https://files.pythonhosted.org/packages/05/73/c4abe620b841b6b791f2edc248f556900667a5a1cf023a6646967ae98335/markupsafe-3.0.3-cp313-cp313-win_amd64.whl", hash = "sha256:9a1abfdc021a164803f4d485104931fb8f8c1efd55bc6b748d2f5774e78b62c5", size = 15113, upload-time = "2025-09-27T18:36:52.537Z" },
{ url = "https://files.pythonhosted.org/packages/f0/3a/fa34a0f7cfef23cf9500d68cb7c32dd64ffd58a12b09225fb03dd37d5b80/markupsafe-3.0.3-cp313-cp313-win_arm64.whl", hash = "sha256:7e68f88e5b8799aa49c85cd116c932a1ac15caaa3f5db09087854d218359e485", size = 13911, upload-time = "2025-09-27T18:36:53.513Z" },
{ url = "https://files.pythonhosted.org/packages/e4/d7/e05cd7efe43a88a17a37b3ae96e79a19e846f3f456fe79c57ca61356ef01/markupsafe-3.0.3-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:218551f6df4868a8d527e3062d0fb968682fe92054e89978594c28e642c43a73", size = 11658, upload-time = "2025-09-27T18:36:54.819Z" },
{ url = "https://files.pythonhosted.org/packages/99/9e/e412117548182ce2148bdeacdda3bb494260c0b0184360fe0d56389b523b/markupsafe-3.0.3-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:3524b778fe5cfb3452a09d31e7b5adefeea8c5be1d43c4f810ba09f2ceb29d37", size = 12066, upload-time = "2025-09-27T18:36:55.714Z" },
{ url = "https://files.pythonhosted.org/packages/bc/e6/fa0ffcda717ef64a5108eaa7b4f5ed28d56122c9a6d70ab8b72f9f715c80/markupsafe-3.0.3-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:4e885a3d1efa2eadc93c894a21770e4bc67899e3543680313b09f139e149ab19", size = 25639, upload-time = "2025-09-27T18:36:56.908Z" },
{ url = "https://files.pythonhosted.org/packages/96/ec/2102e881fe9d25fc16cb4b25d5f5cde50970967ffa5dddafdb771237062d/markupsafe-3.0.3-cp313-cp313t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:8709b08f4a89aa7586de0aadc8da56180242ee0ada3999749b183aa23df95025", size = 23569, upload-time = "2025-09-27T18:36:57.913Z" },
{ url = "https://files.pythonhosted.org/packages/4b/30/6f2fce1f1f205fc9323255b216ca8a235b15860c34b6798f810f05828e32/markupsafe-3.0.3-cp313-cp313t-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:b8512a91625c9b3da6f127803b166b629725e68af71f8184ae7e7d54686a56d6", size = 23284, upload-time = "2025-09-27T18:36:58.833Z" },
{ url = "https://files.pythonhosted.org/packages/58/47/4a0ccea4ab9f5dcb6f79c0236d954acb382202721e704223a8aafa38b5c8/markupsafe-3.0.3-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:9b79b7a16f7fedff2495d684f2b59b0457c3b493778c9eed31111be64d58279f", size = 24801, upload-time = "2025-09-27T18:36:59.739Z" },
{ url = "https://files.pythonhosted.org/packages/6a/70/3780e9b72180b6fecb83a4814d84c3bf4b4ae4bf0b19c27196104149734c/markupsafe-3.0.3-cp313-cp313t-musllinux_1_2_riscv64.whl", hash = "sha256:12c63dfb4a98206f045aa9563db46507995f7ef6d83b2f68eda65c307c6829eb", size = 22769, upload-time = "2025-09-27T18:37:00.719Z" },
{ url = "https://files.pythonhosted.org/packages/98/c5/c03c7f4125180fc215220c035beac6b9cb684bc7a067c84fc69414d315f5/markupsafe-3.0.3-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:8f71bc33915be5186016f675cd83a1e08523649b0e33efdb898db577ef5bb009", size = 23642, upload-time = "2025-09-27T18:37:01.673Z" },
{ url = "https://files.pythonhosted.org/packages/80/d6/2d1b89f6ca4bff1036499b1e29a1d02d282259f3681540e16563f27ebc23/markupsafe-3.0.3-cp313-cp313t-win32.whl", hash = "sha256:69c0b73548bc525c8cb9a251cddf1931d1db4d2258e9599c28c07ef3580ef354", size = 14612, upload-time = "2025-09-27T18:37:02.639Z" },
{ url = "https://files.pythonhosted.org/packages/2b/98/e48a4bfba0a0ffcf9925fe2d69240bfaa19c6f7507b8cd09c70684a53c1e/markupsafe-3.0.3-cp313-cp313t-win_amd64.whl", hash = "sha256:1b4b79e8ebf6b55351f0d91fe80f893b4743f104bff22e90697db1590e47a218", size = 15200, upload-time = "2025-09-27T18:37:03.582Z" },
{ url = "https://files.pythonhosted.org/packages/0e/72/e3cc540f351f316e9ed0f092757459afbc595824ca724cbc5a5d4263713f/markupsafe-3.0.3-cp313-cp313t-win_arm64.whl", hash = "sha256:ad2cf8aa28b8c020ab2fc8287b0f823d0a7d8630784c31e9ee5edea20f406287", size = 13973, upload-time = "2025-09-27T18:37:04.929Z" },
{ url = "https://files.pythonhosted.org/packages/33/8a/8e42d4838cd89b7dde187011e97fe6c3af66d8c044997d2183fbd6d31352/markupsafe-3.0.3-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:eaa9599de571d72e2daf60164784109f19978b327a3910d3e9de8c97b5b70cfe", size = 11619, upload-time = "2025-09-27T18:37:06.342Z" },
{ url = "https://files.pythonhosted.org/packages/b5/64/7660f8a4a8e53c924d0fa05dc3a55c9cee10bbd82b11c5afb27d44b096ce/markupsafe-3.0.3-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:c47a551199eb8eb2121d4f0f15ae0f923d31350ab9280078d1e5f12b249e0026", size = 12029, upload-time = "2025-09-27T18:37:07.213Z" },
{ url = "https://files.pythonhosted.org/packages/da/ef/e648bfd021127bef5fa12e1720ffed0c6cbb8310c8d9bea7266337ff06de/markupsafe-3.0.3-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:f34c41761022dd093b4b6896d4810782ffbabe30f2d443ff5f083e0cbbb8c737", size = 24408, upload-time = "2025-09-27T18:37:09.572Z" },
{ url = "https://files.pythonhosted.org/packages/41/3c/a36c2450754618e62008bf7435ccb0f88053e07592e6028a34776213d877/markupsafe-3.0.3-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:457a69a9577064c05a97c41f4e65148652db078a3a509039e64d3467b9e7ef97", size = 23005, upload-time = "2025-09-27T18:37:10.58Z" },
{ url = "https://files.pythonhosted.org/packages/bc/20/b7fdf89a8456b099837cd1dc21974632a02a999ec9bf7ca3e490aacd98e7/markupsafe-3.0.3-cp314-cp314-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:e8afc3f2ccfa24215f8cb28dcf43f0113ac3c37c2f0f0806d8c70e4228c5cf4d", size = 22048, upload-time = "2025-09-27T18:37:11.547Z" },
{ url = "https://files.pythonhosted.org/packages/9a/a7/591f592afdc734f47db08a75793a55d7fbcc6902a723ae4cfbab61010cc5/markupsafe-3.0.3-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:ec15a59cf5af7be74194f7ab02d0f59a62bdcf1a537677ce67a2537c9b87fcda", size = 23821, upload-time = "2025-09-27T18:37:12.48Z" },
{ url = "https://files.pythonhosted.org/packages/7d/33/45b24e4f44195b26521bc6f1a82197118f74df348556594bd2262bda1038/markupsafe-3.0.3-cp314-cp314-musllinux_1_2_riscv64.whl", hash = "sha256:0eb9ff8191e8498cca014656ae6b8d61f39da5f95b488805da4bb029cccbfbaf", size = 21606, upload-time = "2025-09-27T18:37:13.485Z" },
{ url = "https://files.pythonhosted.org/packages/ff/0e/53dfaca23a69fbfbbf17a4b64072090e70717344c52eaaaa9c5ddff1e5f0/markupsafe-3.0.3-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:2713baf880df847f2bece4230d4d094280f4e67b1e813eec43b4c0e144a34ffe", size = 23043, upload-time = "2025-09-27T18:37:14.408Z" },
{ url = "https://files.pythonhosted.org/packages/46/11/f333a06fc16236d5238bfe74daccbca41459dcd8d1fa952e8fbd5dccfb70/markupsafe-3.0.3-cp314-cp314-win32.whl", hash = "sha256:729586769a26dbceff69f7a7dbbf59ab6572b99d94576a5592625d5b411576b9", size = 14747, upload-time = "2025-09-27T18:37:15.36Z" },
{ url = "https://files.pythonhosted.org/packages/28/52/182836104b33b444e400b14f797212f720cbc9ed6ba34c800639d154e821/markupsafe-3.0.3-cp314-cp314-win_amd64.whl", hash = "sha256:bdc919ead48f234740ad807933cdf545180bfbe9342c2bb451556db2ed958581", size = 15341, upload-time = "2025-09-27T18:37:16.496Z" },
{ url = "https://files.pythonhosted.org/packages/6f/18/acf23e91bd94fd7b3031558b1f013adfa21a8e407a3fdb32745538730382/markupsafe-3.0.3-cp314-cp314-win_arm64.whl", hash = "sha256:5a7d5dc5140555cf21a6fefbdbf8723f06fcd2f63ef108f2854de715e4422cb4", size = 14073, upload-time = "2025-09-27T18:37:17.476Z" },
{ url = "https://files.pythonhosted.org/packages/3c/f0/57689aa4076e1b43b15fdfa646b04653969d50cf30c32a102762be2485da/markupsafe-3.0.3-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:1353ef0c1b138e1907ae78e2f6c63ff67501122006b0f9abad68fda5f4ffc6ab", size = 11661, upload-time = "2025-09-27T18:37:18.453Z" },
{ url = "https://files.pythonhosted.org/packages/89/c3/2e67a7ca217c6912985ec766c6393b636fb0c2344443ff9d91404dc4c79f/markupsafe-3.0.3-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:1085e7fbddd3be5f89cc898938f42c0b3c711fdcb37d75221de2666af647c175", size = 12069, upload-time = "2025-09-27T18:37:19.332Z" },
{ url = "https://files.pythonhosted.org/packages/f0/00/be561dce4e6ca66b15276e184ce4b8aec61fe83662cce2f7d72bd3249d28/markupsafe-3.0.3-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:1b52b4fb9df4eb9ae465f8d0c228a00624de2334f216f178a995ccdcf82c4634", size = 25670, upload-time = "2025-09-27T18:37:20.245Z" },
{ url = "https://files.pythonhosted.org/packages/50/09/c419f6f5a92e5fadde27efd190eca90f05e1261b10dbd8cbcb39cd8ea1dc/markupsafe-3.0.3-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:fed51ac40f757d41b7c48425901843666a6677e3e8eb0abcff09e4ba6e664f50", size = 23598, upload-time = "2025-09-27T18:37:21.177Z" },
{ url = "https://files.pythonhosted.org/packages/22/44/a0681611106e0b2921b3033fc19bc53323e0b50bc70cffdd19f7d679bb66/markupsafe-3.0.3-cp314-cp314t-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:f190daf01f13c72eac4efd5c430a8de82489d9cff23c364c3ea822545032993e", size = 23261, upload-time = "2025-09-27T18:37:22.167Z" },
{ url = "https://files.pythonhosted.org/packages/5f/57/1b0b3f100259dc9fffe780cfb60d4be71375510e435efec3d116b6436d43/markupsafe-3.0.3-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:e56b7d45a839a697b5eb268c82a71bd8c7f6c94d6fd50c3d577fa39a9f1409f5", size = 24835, upload-time = "2025-09-27T18:37:23.296Z" },
{ url = "https://files.pythonhosted.org/packages/26/6a/4bf6d0c97c4920f1597cc14dd720705eca0bf7c787aebc6bb4d1bead5388/markupsafe-3.0.3-cp314-cp314t-musllinux_1_2_riscv64.whl", hash = "sha256:f3e98bb3798ead92273dc0e5fd0f31ade220f59a266ffd8a4f6065e0a3ce0523", size = 22733, upload-time = "2025-09-27T18:37:24.237Z" },
{ url = "https://files.pythonhosted.org/packages/14/c7/ca723101509b518797fedc2fdf79ba57f886b4aca8a7d31857ba3ee8281f/markupsafe-3.0.3-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:5678211cb9333a6468fb8d8be0305520aa073f50d17f089b5b4b477ea6e67fdc", size = 23672, upload-time = "2025-09-27T18:37:25.271Z" },
{ url = "https://files.pythonhosted.org/packages/fb/df/5bd7a48c256faecd1d36edc13133e51397e41b73bb77e1a69deab746ebac/markupsafe-3.0.3-cp314-cp314t-win32.whl", hash = "sha256:915c04ba3851909ce68ccc2b8e2cd691618c4dc4c4232fb7982bca3f41fd8c3d", size = 14819, upload-time = "2025-09-27T18:37:26.285Z" },
{ url = "https://files.pythonhosted.org/packages/1a/8a/0402ba61a2f16038b48b39bccca271134be00c5c9f0f623208399333c448/markupsafe-3.0.3-cp314-cp314t-win_amd64.whl", hash = "sha256:4faffd047e07c38848ce017e8725090413cd80cbc23d86e55c587bf979e579c9", size = 15426, upload-time = "2025-09-27T18:37:27.316Z" },
{ url = "https://files.pythonhosted.org/packages/70/bc/6f1c2f612465f5fa89b95bead1f44dcb607670fd42891d8fdcd5d039f4f4/markupsafe-3.0.3-cp314-cp314t-win_arm64.whl", hash = "sha256:32001d6a8fc98c8cb5c947787c5d08b0a50663d139f1305bac5885d98d9b40fa", size = 14146, upload-time = "2025-09-27T18:37:28.327Z" },
]

[[package]]
name = "mcp"
version = "1.16.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "anyio" },
···
{ name = "starlette" },
{ name = "uvicorn", marker = "sys_platform != 'emscripten'" },
]
sdist = { url = "https://files.pythonhosted.org/packages/3d/a1/b1f328da3b153683d2ec34f849b4b6eac2790fb240e3aef06ff2fab3df9d/mcp-1.16.0.tar.gz", hash = "sha256:39b8ca25460c578ee2cdad33feeea122694cfdf73eef58bee76c42f6ef0589df", size = 472918, upload-time = "2025-10-02T16:58:20.631Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/c9/0e/7cebc88e17daf94ebe28c95633af595ccb2864dc2ee7abd75542d98495cc/mcp-1.16.0-py3-none-any.whl", hash = "sha256:ec917be9a5d31b09ba331e1768aa576e0af45470d657a0319996a20a57d7d633", size = 167266, upload-time = "2025-10-02T16:58:19.039Z" },
]

[[package]]
···
sdist = { url = "https://files.pythonhosted.org/packages/e9/e7/204a54d07c37ebf173590af85bf46cddf8bc343b9d6005804581967b4751/mistralai-1.9.2.tar.gz", hash = "sha256:c0c6d5aff18ffccbc0d22c06fbc84280d71eeaeb08fa4e1ef7326b36629cfb0b", size = 192678, upload-time = "2025-07-10T13:07:08.85Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/24/eb/f746a3f977d3c0059e4afa19d26b1293f54c6258fcf841957e584be6927f/mistralai-1.9.2-py3-none-any.whl", hash = "sha256:7c3fff00e50227d379dea82052455c2610612a8ef476fa97393191aeeb7ab15f", size = 411581, upload-time = "2025-07-10T13:07:07.226Z" },
]

[[package]]
name = "more-itertools"
version = "10.8.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/ea/5d/38b681d3fce7a266dd9ab73c66959406d565b3e85f21d5e66e1181d93721/more_itertools-10.8.0.tar.gz", hash = "sha256:f638ddf8a1a0d134181275fb5d58b086ead7c6a72429ad725c67503f13ba30bd", size = 137431, upload-time = "2025-09-02T15:23:11.018Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/a4/8e/469e5a4a2f5855992e425f3cb33804cc07bf18d48f2db061aec61ce50270/more_itertools-10.8.0-py3-none-any.whl", hash = "sha256:52d4362373dcf7c52546bc4af9a86ee7c4579df9a8dc268be0a2f949d376cc9b", size = 69667, upload-time = "2025-09-02T15:23:09.635Z" },
]

[[package]]
···
]

[[package]]
name = "openapi-core"
version = "0.19.5"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "isodate" },
{ name = "jsonschema" },
{ name = "jsonschema-path" },
{ name = "more-itertools" },
{ name = "openapi-schema-validator" },
{ name = "openapi-spec-validator" },
{ name = "parse" },
{ name = "typing-extensions" },
{ name = "werkzeug" },
]
sdist = { url = "https://files.pythonhosted.org/packages/b1/35/1acaa5f2fcc6e54eded34a2ec74b479439c4e469fc4e8d0e803fda0234db/openapi_core-0.19.5.tar.gz", hash = "sha256:421e753da56c391704454e66afe4803a290108590ac8fa6f4a4487f4ec11f2d3", size = 103264, upload-time = "2025-03-20T20:17:28.193Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/27/6f/83ead0e2e30a90445ee4fc0135f43741aebc30cca5b43f20968b603e30b6/openapi_core-0.19.5-py3-none-any.whl", hash = "sha256:ef7210e83a59394f46ce282639d8d26ad6fc8094aa904c9c16eb1bac8908911f", size = 106595, upload-time = "2025-03-20T20:17:26.77Z" },
]

[[package]]
name = "openapi-pydantic"
version = "0.5.1"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "pydantic" },
]
sdist = { url = "https://files.pythonhosted.org/packages/02/2e/58d83848dd1a79cb92ed8e63f6ba901ca282c5f09d04af9423ec26c56fd7/openapi_pydantic-0.5.1.tar.gz", hash = "sha256:ff6835af6bde7a459fb93eb93bb92b8749b754fc6e51b2f1590a19dc3005ee0d", size = 60892, upload-time = "2025-01-08T19:29:27.083Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/12/cf/03675d8bd8ecbf4445504d8071adab19f5f993676795708e36402ab38263/openapi_pydantic-0.5.1-py3-none-any.whl", hash = "sha256:a3a09ef4586f5bd760a8df7f43028b60cafb6d9f61de2acba9574766255ab146", size = 96381, upload-time = "2025-01-08T19:29:25.275Z" },
]

[[package]]
name = "openapi-schema-validator"
version = "0.6.3"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "jsonschema" },
{ name = "jsonschema-specifications" },
{ name = "rfc3339-validator" },
]
sdist = { url = "https://files.pythonhosted.org/packages/8b/f3/5507ad3325169347cd8ced61c232ff3df70e2b250c49f0fe140edb4973c6/openapi_schema_validator-0.6.3.tar.gz", hash = "sha256:f37bace4fc2a5d96692f4f8b31dc0f8d7400fd04f3a937798eaf880d425de6ee", size = 11550, upload-time = "2025-01-10T18:08:22.268Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/21/c6/ad0fba32775ae749016829dace42ed80f4407b171da41313d1a3a5f102e4/openapi_schema_validator-0.6.3-py3-none-any.whl", hash = "sha256:f3b9870f4e556b5a62a1c39da72a6b4b16f3ad9c73dc80084b1b11e74ba148a3", size = 8755, upload-time = "2025-01-10T18:08:19.758Z" },
]

[[package]]
name = "openapi-spec-validator"
version = "0.7.2"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "jsonschema" },
{ name = "jsonschema-path" },
{ name = "lazy-object-proxy" },
{ name = "openapi-schema-validator" },
]
sdist = { url = "https://files.pythonhosted.org/packages/82/af/fe2d7618d6eae6fb3a82766a44ed87cd8d6d82b4564ed1c7cfb0f6378e91/openapi_spec_validator-0.7.2.tar.gz", hash = "sha256:cc029309b5c5dbc7859df0372d55e9d1ff43e96d678b9ba087f7c56fc586f734", size = 36855, upload-time = "2025-06-07T14:48:56.299Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/27/dd/b3fd642260cb17532f66cc1e8250f3507d1e580483e209dc1e9d13bd980d/openapi_spec_validator-0.7.2-py3-none-any.whl", hash = "sha256:4bbdc0894ec85f1d1bea1d6d9c8b2c3c8d7ccaa13577ef40da9c006c9fd0eb60", size = 39713, upload-time = "2025-06-07T14:48:54.077Z" },
]

[[package]]
name = "opentelemetry-api"
version = "1.35.0"
source = { registry = "https://pypi.org/simple" }
···
sdist = { url = "https://files.pythonhosted.org/packages/a1/d4/1fc4078c65507b51b96ca8f8c3ba19e6a61c8253c72794544580a7b6c24d/packaging-25.0.tar.gz", hash = "sha256:d443872c98d677bf60f6a1f2f8c1cb748e8fe762d2bf9d3148b5599295b0fc4f", size = 165727, upload-time = "2025-04-19T11:48:59.673Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/20/12/38679034af332785aac8774540895e234f4d07f7545804097de4b666afd8/packaging-25.0-py3-none-any.whl", hash = "sha256:29572ef2b1f17581046b3a2227d5c611fb25ec70ca1ba8554b24b0e69331a484", size = 66469, upload-time = "2025-04-19T11:48:57.875Z" },
]

[[package]]
name = "parse"
version = "1.20.2"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/4f/78/d9b09ba24bb36ef8b83b71be547e118d46214735b6dfb39e4bfde0e9b9dd/parse-1.20.2.tar.gz", hash = "sha256:b41d604d16503c79d81af5165155c0b20f6c8d6c559efa66b4b695c3e5a0a0ce", size = 29391, upload-time = "2024-06-11T04:41:57.34Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/d0/31/ba45bf0b2aa7898d81cbbfac0e88c267befb59ad91a19e36e1bc5578ddb1/parse-1.20.2-py2.py3-none-any.whl", hash = "sha256:967095588cb802add9177d0c0b6133b5ba33b1ea9007ca800e526f42a85af558", size = 20126, upload-time = "2024-06-11T04:41:55.057Z" },
]

[[package]]
name = "pathable"
version = "0.4.4"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/67/93/8f2c2075b180c12c1e9f6a09d1a985bc2036906b13dff1d8917e395f2048/pathable-0.4.4.tar.gz", hash = "sha256:6905a3cd17804edfac7875b5f6c9142a218c7caef78693c2dbbbfbac186d88b2", size = 8124, upload-time = "2025-01-10T18:43:13.247Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/7d/eb/b6260b31b1a96386c0a880edebe26f89669098acea8e0318bff6adb378fd/pathable-0.4.4-py3-none-any.whl", hash = "sha256:5ae9e94793b6ef5a4cbe0a7ce9dbbefc1eec38df253763fd0aeeacf2762dbbc2", size = 9592, upload-time = "2025-01-10T18:43:11.88Z" },
]

[[package]]
···
[[package]]
name = "pybase64"
version = "1.4.2"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/04/14/43297a7b7f0c1bf0c00b596f754ee3ac946128c64d21047ccf9c9bbc5165/pybase64-1.4.2.tar.gz", hash = "sha256:46cdefd283ed9643315d952fe44de80dc9b9a811ce6e3ec97fd1827af97692d0", size = 137246, upload-time = "2025-07-27T13:08:57.808Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/28/6d/11ede991e800797b9f5ebd528013b34eee5652df93de61ffb24503393fa5/pybase64-1.4.2-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:db2c75d1388855b5a1015b65096d7dbcc708e7de3245dcbedeb872ec05a09326", size = 38326, upload-time = "2025-07-27T13:03:09.065Z" },
{ url = "https://files.pythonhosted.org/packages/fe/84/87f1f565f42e2397e2aaa2477c86419f5173c3699881c42325c090982f0a/pybase64-1.4.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:6b621a972a01841368fdb9dedc55fd3c6e0c7217d0505ba3b1ebe95e7ef1b493", size = 31661, upload-time = "2025-07-27T13:03:10.295Z" },
{ url = "https://files.pythonhosted.org/packages/cb/2a/a24c810e7a61d2cc6f73fe9ee4872a03030887fa8654150901b15f376f65/pybase64-1.4.2-cp312-cp312-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:f48c32ac6a16cbf57a5a96a073fef6ff7e3526f623cd49faa112b7f9980bafba", size = 68192, upload-time = "2025-07-27T13:03:11.467Z" },
{ url = "https://files.pythonhosted.org/packages/ee/87/d9baf98cbfc37b8657290ad4421f3a3c36aa0eafe4872c5859cfb52f3448/pybase64-1.4.2-cp312-cp312-manylinux1_x86_64.manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:ace8b23093a6bb862477080d9059b784096ab2f97541e8bfc40d42f062875149", size = 71587, upload-time = "2025-07-27T13:03:12.719Z" },
{ url = "https://files.pythonhosted.org/packages/0b/89/3df043cc56ef3b91b7aa0c26ae822a2d7ec8da0b0fd7c309c879b0eb5988/pybase64-1.4.2-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:1772c7532a7fb6301baea3dd3e010148dbf70cd1136a83c2f5f91bdc94822145", size = 59910, upload-time = "2025-07-27T13:03:14.266Z" },
{ url = "https://files.pythonhosted.org/packages/75/4f/6641e9edf37aeb4d4524dc7ba2168eff8d96c90e77f6283c2be3400ab380/pybase64-1.4.2-cp312-cp312-manylinux2014_armv7l.manylinux_2_17_armv7l.whl", hash = "sha256:f86f7faddcba5cbfea475f8ab96567834c28bf09ca6c7c3d66ee445adac80d8f", size = 56701, upload-time = "2025-07-27T13:03:15.6Z" },
{ url = "https://files.pythonhosted.org/packages/2d/7f/20d8ac1046f12420a0954a45a13033e75f98aade36eecd00c64e3549b071/pybase64-1.4.2-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:0b8c8e275b5294089f314814b4a50174ab90af79d6a4850f6ae11261ff6a7372", size = 59288, upload-time = "2025-07-27T13:03:16.823Z" },
{ url = "https://files.pythonhosted.org/packages/17/ea/9c0ca570e3e50b3c6c3442e280c83b321a0464c86a9db1f982a4ff531550/pybase64-1.4.2-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:864d85a0470c615807ae8b97d724d068b940a2d10ac13a5f1b9e75a3ce441758", size = 60267, upload-time = "2025-07-27T13:03:18.132Z" },
{ url = "https://files.pythonhosted.org/packages/f9/ac/46894929d71ccedebbfb0284173b0fea96bc029cd262654ba8451a7035d6/pybase64-1.4.2-cp312-cp312-manylinux_2_31_riscv64.whl", hash = "sha256:47254d97ed2d8351e30ecfdb9e2414547f66ba73f8a09f932c9378ff75cd10c5", size = 54801, upload-time = "2025-07-27T13:03:19.669Z" },
{ url = "https://files.pythonhosted.org/packages/6a/1e/02c95218ea964f0b2469717c2c69b48e63f4ca9f18af01a5b2a29e4c1216/pybase64-1.4.2-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:264b65ecc4f0ee73f3298ab83bbd8008f7f9578361b8df5b448f985d8c63e02a", size = 58599, upload-time = "2025-07-27T13:03:20.951Z" },
{ url = "https://files.pythonhosted.org/packages/15/45/ccc21004930789b8fb439d43e3212a6c260ccddb2bf450c39a20db093f33/pybase64-1.4.2-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:fbcc2b30cd740c16c9699f596f22c7a9e643591311ae72b1e776f2d539e9dd9d", size = 52388, upload-time = "2025-07-27T13:03:23.064Z" },
{ url = "https://files.pythonhosted.org/packages/c4/45/22e46e549710c4c237d77785b6fb1bc4c44c288a5c44237ba9daf5c34b82/pybase64-1.4.2-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:cda9f79c22d51ee4508f5a43b673565f1d26af4330c99f114e37e3186fdd3607", size = 68802, upload-time = "2025-07-27T13:03:24.673Z" },
{ url = "https://files.pythonhosted.org/packages/55/0c/232c6261b81296e5593549b36e6e7884a5da008776d12665923446322c36/pybase64-1.4.2-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:0c91c6d2a7232e2a1cd10b3b75a8bb657defacd4295a1e5e80455df2dfc84d4f", size = 57841, upload-time = "2025-07-27T13:03:25.948Z" },
{ url = "https://files.pythonhosted.org/packages/20/8a/b35a615ae6f04550d696bb179c414538b3b477999435fdd4ad75b76139e4/pybase64-1.4.2-cp312-cp312-musllinux_1_2_riscv64.whl", hash = "sha256:a370dea7b1cee2a36a4d5445d4e09cc243816c5bc8def61f602db5a6f5438e52", size = 54320, upload-time = "2025-07-27T13:03:27.495Z" },
{ url = "https://files.pythonhosted.org/packages/d3/a9/8bd4f9bcc53689f1b457ecefed1eaa080e4949d65a62c31a38b7253d5226/pybase64-1.4.2-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:9aa4de83f02e462a6f4e066811c71d6af31b52d7484de635582d0e3ec3d6cc3e", size = 56482, upload-time = "2025-07-27T13:03:28.942Z" },
{ url = "https://files.pythonhosted.org/packages/75/e5/4a7735b54a1191f61c3f5c2952212c85c2d6b06eb5fb3671c7603395f70c/pybase64-1.4.2-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:83a1c2f9ed00fee8f064d548c8654a480741131f280e5750bb32475b7ec8ee38", size = 70959, upload-time = "2025-07-27T13:03:30.171Z" },
{ url = "https://files.pythonhosted.org/packages/d3/67/e2b6cb32c782e12304d467418e70da0212567f42bd4d3b5eb1fdf64920ad/pybase64-1.4.2-cp312-cp312-win32.whl", hash = "sha256:a6e5688b18d558e8c6b8701cc8560836c4bbeba61d33c836b4dba56b19423716", size = 33683, upload-time = "2025-07-27T13:03:31.775Z" },
{ url = "https://files.pythonhosted.org/packages/4f/bc/d5c277496063a09707486180f17abbdbdebbf2f5c4441b20b11d3cb7dc7c/pybase64-1.4.2-cp312-cp312-win_amd64.whl", hash = "sha256:c995d21b8bd08aa179cd7dd4db0695c185486ecc72da1e8f6c37ec86cadb8182", size = 35817, upload-time = "2025-07-27T13:03:32.99Z" },
{ url = "https://files.pythonhosted.org/packages/e6/69/e4be18ae685acff0ae77f75d4586590f29d2cd187bf603290cf1d635cad4/pybase64-1.4.2-cp312-cp312-win_arm64.whl", hash = "sha256:e254b9258c40509c2ea063a7784f6994988f3f26099d6e08704e3c15dfed9a55", size = 30900, upload-time = "2025-07-27T13:03:34.499Z" },
{ url = "https://files.pythonhosted.org/packages/f4/56/5337f27a8b8d2d6693f46f7b36bae47895e5820bfa259b0072574a4e1057/pybase64-1.4.2-cp313-cp313-android_21_arm64_v8a.whl", hash = "sha256:0f331aa59549de21f690b6ccc79360ffed1155c3cfbc852eb5c097c0b8565a2b", size = 33888, upload-time = "2025-07-27T13:03:35.698Z" },
{ url = "https://files.pythonhosted.org/packages/4c/09/f3f4b11fc9beda7e8625e29fb0f549958fcbb34fea3914e1c1d95116e344/pybase64-1.4.2-cp313-cp313-android_21_x86_64.whl", hash = "sha256:9dad20bf1f3ed9e6fe566c4c9d07d9a6c04f5a280daebd2082ffb8620b0a880d", size = 40796, upload-time = "2025-07-27T13:03:36.927Z" },
{ url = "https://files.pythonhosted.org/packages/e3/ff/470768f0fe6de0aa302a8cb1bdf2f9f5cffc3f69e60466153be68bc953aa/pybase64-1.4.2-cp313-cp313-ios_13_0_arm64_iphoneos.whl", hash = "sha256:69d3f0445b0faeef7bb7f93bf8c18d850785e2a77f12835f49e524cc54af04e7", size = 30914, upload-time = "2025-07-27T13:03:38.475Z" },
{ url = "https://files.pythonhosted.org/packages/75/6b/d328736662665e0892409dc410353ebef175b1be5eb6bab1dad579efa6df/pybase64-1.4.2-cp313-cp313-ios_13_0_arm64_iphonesimulator.whl", hash = "sha256:2372b257b1f4dd512f317fb27e77d313afd137334de64c87de8374027aacd88a", size = 31380, upload-time = "2025-07-27T13:03:39.7Z" },
{ url = "https://files.pythonhosted.org/packages/ca/96/7ff718f87c67f4147c181b73d0928897cefa17dc75d7abc6e37730d5908f/pybase64-1.4.2-cp313-cp313-ios_13_0_x86_64_iphonesimulator.whl", hash = "sha256:fb794502b4b1ec91c4ca5d283ae71aef65e3de7721057bd9e2b3ec79f7a62d7d", size = 38230, upload-time = "2025-07-27T13:03:41.637Z" },
{ url = "https://files.pythonhosted.org/packages/4d/58/a3307b048d799ff596a3c7c574fcba66f9b6b8c899a3c00a698124ca7ad5/pybase64-1.4.2-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:d5c532b03fd14a5040d6cf6571299a05616f925369c72ddf6fe2fb643eb36fed", size = 38319, upload-time = "2025-07-27T13:03:42.847Z" },
{ url = "https://files.pythonhosted.org/packages/08/a7/0bda06341b0a2c830d348c6e1c4d348caaae86c53dc9a046e943467a05e9/pybase64-1.4.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:0f699514dc1d5689ca9cf378139e0214051922732f9adec9404bc680a8bef7c0", size = 31655, upload-time = "2025-07-27T13:03:44.426Z" },
{ url = "https://files.pythonhosted.org/packages/87/df/e1d6e8479e0c5113c2c63c7b44886935ce839c2d99884c7304ca9e86547c/pybase64-1.4.2-cp313-cp313-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:cd3e8713cbd32c8c6aa935feaf15c7670e2b7e8bfe51c24dc556811ebd293a29", size = 68232, upload-time = "2025-07-27T13:03:45.729Z" },
{ url = "https://files.pythonhosted.org/packages/71/ab/db4dbdfccb9ca874d6ce34a0784761471885d96730de85cee3d300381529/pybase64-1.4.2-cp313-cp313-manylinux1_x86_64.manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:d377d48acf53abf4b926c2a7a24a19deb092f366a04ffd856bf4b3aa330b025d", size = 71608, upload-time = "2025-07-27T13:03:47.01Z" },
{ url = "https://files.pythonhosted.org/packages/11/e9/508df958563951045d728bbfbd3be77465f9231cf805cb7ccaf6951fc9f1/pybase64-1.4.2-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:d83c076e78d619b9e1dd674e2bf5fb9001aeb3e0b494b80a6c8f6d4120e38cd9", size = 59912, upload-time = "2025-07-27T13:03:48.277Z" },
{ url = "https://files.pythonhosted.org/packages/f2/58/7f2cef1ceccc682088958448d56727369de83fa6b29148478f4d2acd107a/pybase64-1.4.2-cp313-cp313-manylinux2014_armv7l.manylinux_2_17_armv7l.whl", hash = "sha256:ab9cdb6a8176a5cb967f53e6ad60e40c83caaa1ae31c5e1b29e5c8f507f17538", size = 56413, upload-time = "2025-07-27T13:03:49.908Z" },
{ url = "https://files.pythonhosted.org/packages/08/7c/7e0af5c5728fa7e2eb082d88eca7c6bd17429be819d58518e74919d42e66/pybase64-1.4.2-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:adf0c103ad559dbfb9fe69edfd26a15c65d9c991a5ab0a25b04770f9eb0b9484", size = 59311, upload-time = "2025-07-27T13:03:51.238Z" },
{ url = "https://files.pythonhosted.org/packages/03/8b/09825d0f37e45b9a3f546e5f990b6cf2dd838e54ea74122c2464646e0c77/pybase64-1.4.2-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:0d03ef2f253d97ce0685d3624bf5e552d716b86cacb8a6c971333ba4b827e1fc", size = 60282, upload-time = "2025-07-27T13:03:52.56Z" },
{ url = "https://files.pythonhosted.org/packages/9c/3f/3711d2413f969bfd5b9cc19bc6b24abae361b7673ff37bcb90c43e199316/pybase64-1.4.2-cp313-cp313-manylinux_2_31_riscv64.whl", hash = "sha256:e565abf906efee76ae4be1aef5df4aed0fda1639bc0d7732a3dafef76cb6fc35", size = 54845, upload-time = "2025-07-27T13:03:54.167Z" },
{ url = "https://files.pythonhosted.org/packages/c6/3c/4c7ce1ae4d828c2bb56d144322f81bffbaaac8597d35407c3d7cbb0ff98f/pybase64-1.4.2-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:e3c6a5f15fd03f232fc6f295cce3684f7bb08da6c6d5b12cc771f81c9f125cc6", size = 58615, upload-time = "2025-07-27T13:03:55.494Z" },
{ url = "https://files.pythonhosted.org/packages/f5/8f/c2fc03bf4ed038358620065c75968a30184d5d3512d09d3ef9cc3bd48592/pybase64-1.4.2-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:bad9e3db16f448728138737bbd1af9dc2398efd593a8bdd73748cc02cd33f9c6", size = 52434, upload-time = "2025-07-27T13:03:56.808Z" },
{ url = "https://files.pythonhosted.org/packages/e2/0a/757d6df0a60327c893cfae903e15419914dd792092dc8cc5c9523d40bc9b/pybase64-1.4.2-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:2683ef271328365c31afee0ed8fa29356fb8fb7c10606794656aa9ffb95e92be", size = 68824, upload-time = "2025-07-27T13:03:58.735Z" },
{ url = "https://files.pythonhosted.org/packages/a0/14/84abe2ed8c29014239be1cfab45dfebe5a5ca779b177b8b6f779bd8b69da/pybase64-1.4.2-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:265b20089cd470079114c09bb74b101b3bfc3c94ad6b4231706cf9eff877d570", size = 57898, upload-time = "2025-07-27T13:04:00.379Z" },
{ url = "https://files.pythonhosted.org/packages/7e/c6/d193031f90c864f7b59fa6d1d1b5af41f0f5db35439988a8b9f2d1b32a13/pybase64-1.4.2-cp313-cp313-musllinux_1_2_riscv64.whl", hash = "sha256:e53173badead10ef8b839aa5506eecf0067c7b75ad16d9bf39bc7144631f8e67", size = 54319, upload-time = "2025-07-27T13:04:01.742Z" },
{ url = "https://files.pythonhosted.org/packages/cb/37/ec0c7a610ff8f994ee6e0c5d5d66b6b6310388b96ebb347b03ae39870fdf/pybase64-1.4.2-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:5823b8dcf74da7da0f761ed60c961e8928a6524e520411ad05fe7f9f47d55b40", size = 56472, upload-time = "2025-07-27T13:04:03.089Z" },
{ url = "https://files.pythonhosted.org/packages/c4/5a/e585b74f85cedd261d271e4c2ef333c5cfce7e80750771808f56fee66b98/pybase64-1.4.2-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:1237f66c54357d325390da60aa5e21c6918fbcd1bf527acb9c1f4188c62cb7d5", size = 70966, upload-time = "2025-07-27T13:04:04.361Z" },
{ url = "https://files.pythonhosted.org/packages/ad/20/1b2fdd98b4ba36008419668c813025758214c543e362c66c49214ecd1127/pybase64-1.4.2-cp313-cp313-win32.whl", hash = "sha256:b0b851eb4f801d16040047f6889cca5e9dfa102b3e33f68934d12511245cef86", size = 33681, upload-time = "2025-07-27T13:04:06.126Z" },
{ url = "https://files.pythonhosted.org/packages/ff/64/3df4067d169c047054889f34b5a946cbe3785bca43404b93c962a5461a41/pybase64-1.4.2-cp313-cp313-win_amd64.whl", hash = "sha256:19541c6e26d17d9522c02680fe242206ae05df659c82a657aabadf209cd4c6c7", size = 35822, upload-time = "2025-07-27T13:04:07.752Z" },
{ url = "https://files.pythonhosted.org/packages/d1/fd/db505188adf812e60ee923f196f9deddd8a1895b2b29b37f5db94afc3b1c/pybase64-1.4.2-cp313-cp313-win_arm64.whl", hash = "sha256:77a191863d576c0a5dd81f8a568a5ca15597cc980ae809dce62c717c8d42d8aa", size = 30899, upload-time = "2025-07-27T13:04:09.062Z" },
{ url = "https://files.pythonhosted.org/packages/d9/27/5f5fecd206ec1e06e1608a380af18dcb76a6ab08ade6597a3251502dcdb2/pybase64-1.4.2-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:2e194bbabe3fdf9e47ba9f3e157394efe0849eb226df76432126239b3f44992c", size = 38677, upload-time = "2025-07-27T13:04:10.334Z" },
{ url = "https://files.pythonhosted.org/packages/bf/0f/abe4b5a28529ef5f74e8348fa6a9ef27d7d75fbd98103d7664cf485b7d8f/pybase64-1.4.2-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:39aef1dadf4a004f11dd09e703abaf6528a87c8dbd39c448bb8aebdc0a08c1be", size = 32066, upload-time = "2025-07-27T13:04:11.641Z" },
{ url = "https://files.pythonhosted.org/packages/ac/7e/ea0ce6a7155cada5526017ec588b6d6185adea4bf9331565272f4ef583c2/pybase64-1.4.2-cp313-cp313t-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:91cb920c7143e36ec8217031282c8651da3b2206d70343f068fac0e7f073b7f9", size = 72300, upload-time = "2025-07-27T13:04:12.969Z" },
{ url = "https://files.pythonhosted.org/packages/45/2d/e64c7a056c9ec48dfe130d1295e47a8c2b19c3984488fc08e5eaa1e86c88/pybase64-1.4.2-cp313-cp313t-manylinux1_x86_64.manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:6958631143fb9e71f9842000da042ec2f6686506b6706e2dfda29e97925f6aa0", size = 75520, upload-time = "2025-07-27T13:04:14.374Z" },
{ url = "https://files.pythonhosted.org/packages/43/e0/e5f93b2e1cb0751a22713c4baa6c6eaf5f307385e369180486c8316ed21e/pybase64-1.4.2-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:dc35f14141ef3f1ac70d963950a278a2593af66fe5a1c7a208e185ca6278fa25", size = 65384, upload-time = "2025-07-27T13:04:16.204Z" },
{ url = "https://files.pythonhosted.org/packages/ff/23/8c645a1113ad88a1c6a3d0e825e93ef8b74ad3175148767853a0a4d7626e/pybase64-1.4.2-cp313-cp313t-manylinux2014_armv7l.manylinux_2_17_armv7l.whl", hash = "sha256:5d949d2d677859c3a8507e1b21432a039d2b995e0bd3fe307052b6ded80f207a", size = 60471, upload-time = "2025-07-27T13:04:17.947Z" },
{ url = "https://files.pythonhosted.org/packages/8b/81/edd0f7d8b0526b91730a0dd4ce6b4c8be2136cd69d424afe36235d2d2a06/pybase64-1.4.2-cp313-cp313t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:09caacdd3e15fe7253a67781edd10a6a918befab0052a2a3c215fe5d1f150269", size = 63945, upload-time = "2025-07-27T13:04:19.383Z" },
{ url = "https://files.pythonhosted.org/packages/a5/a5/edc224cd821fd65100b7af7c7e16b8f699916f8c0226c9c97bbae5a75e71/pybase64-1.4.2-cp313-cp313t-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:e44b0e793b23f28ea0f15a9754bd0c960102a2ac4bccb8fafdedbd4cc4d235c0", size = 64858, upload-time = "2025-07-27T13:04:20.807Z" },
{ url = "https://files.pythonhosted.org/packages/11/3b/92853f968f1af7e42b7e54d21bdd319097b367e7dffa2ca20787361df74c/pybase64-1.4.2-cp313-cp313t-manylinux_2_31_riscv64.whl", hash = "sha256:849f274d0bcb90fc6f642c39274082724d108e41b15f3a17864282bd41fc71d5", size = 58557, upload-time = "2025-07-27T13:04:22.229Z" },
{ url = "https://files.pythonhosted.org/packages/76/09/0ec6bd2b2303b0ea5c6da7535edc9a608092075ef8c0cdd96e3e726cd687/pybase64-1.4.2-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:528dba7ef1357bd7ce1aea143084501f47f5dd0fff7937d3906a68565aa59cfe", size = 63624, upload-time = "2025-07-27T13:04:23.952Z" },
{ url = "https://files.pythonhosted.org/packages/73/6e/52cb1ced2a517a3118b2e739e9417432049013ac7afa15d790103059e8e4/pybase64-1.4.2-cp313-cp313t-musllinux_1_2_armv7l.whl", hash = "sha256:1da54be743d9a68671700cfe56c3ab8c26e8f2f5cc34eface905c55bc3a9af94", size = 56174, upload-time = "2025-07-27T13:04:25.419Z" },
{ url = "https://files.pythonhosted.org/packages/5b/9d/820fe79347467e48af985fe46180e1dd28e698ade7317bebd66de8a143f5/pybase64-1.4.2-cp313-cp313t-musllinux_1_2_i686.whl", hash = "sha256:9b07c0406c3eaa7014499b0aacafb21a6d1146cfaa85d56f0aa02e6d542ee8f3", size = 72640, upload-time = "2025-07-27T13:04:26.824Z" },
{ url = "https://files.pythonhosted.org/packages/53/58/e863e10d08361e694935c815b73faad7e1ab03f99ae154d86c4e2f331896/pybase64-1.4.2-cp313-cp313t-musllinux_1_2_ppc64le.whl", hash = "sha256:312f2aa4cf5d199a97fbcaee75d2e59ebbaafcd091993eb373b43683498cdacb", size = 62453, upload-time = "2025-07-27T13:04:28.562Z" },
{ url = "https://files.pythonhosted.org/packages/95/f0/c392c4ac8ccb7a34b28377c21faa2395313e3c676d76c382642e19a20703/pybase64-1.4.2-cp313-cp313t-musllinux_1_2_riscv64.whl", hash = "sha256:ad59362fc267bf15498a318c9e076686e4beeb0dfe09b457fabbc2b32468b97a", size = 58103, upload-time = "2025-07-27T13:04:29.996Z" },
{ url = "https://files.pythonhosted.org/packages/32/30/00ab21316e7df8f526aa3e3dc06f74de6711d51c65b020575d0105a025b2/pybase64-1.4.2-cp313-cp313t-musllinux_1_2_s390x.whl", hash = "sha256:01593bd064e7dcd6c86d04e94e44acfe364049500c20ac68ca1e708fbb2ca970", size = 60779, upload-time = "2025-07-27T13:04:31.549Z" },
{ url = "https://files.pythonhosted.org/packages/a6/65/114ca81839b1805ce4a2b7d58bc16e95634734a2059991f6382fc71caf3e/pybase64-1.4.2-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:5b81547ad8ea271c79fdf10da89a1e9313cb15edcba2a17adf8871735e9c02a0", size = 74684, upload-time = "2025-07-27T13:04:32.976Z" },
{ url = "https://files.pythonhosted.org/packages/54/8f/aa9d445b9bb693b8f6bb1456bd6d8576d79b7a63bf6c69af3a539235b15f/pybase64-1.4.2-cp313-cp313t-win32.whl", hash = "sha256:7edbe70b5654545a37e6e6b02de738303b1bbdfcde67f6cfec374cfb5cc4099e", size = 33961, upload-time = "2025-07-27T13:04:34.806Z" },
{ url = "https://files.pythonhosted.org/packages/0e/e5/da37cfb173c646fd4fc7c6aae2bc41d40de2ee49529854af8f4e6f498b45/pybase64-1.4.2-cp313-cp313t-win_amd64.whl", hash = "sha256:385690addf87c25d6366fab5d8ff512eed8a7ecb18da9e8152af1c789162f208", size = 36199, upload-time = "2025-07-27T13:04:36.223Z" },
{ url = "https://files.pythonhosted.org/packages/66/3e/1eb68fb7d00f2cec8bd9838e2a30d183d6724ae06e745fd6e65216f170ff/pybase64-1.4.2-cp313-cp313t-win_arm64.whl", hash = "sha256:c2070d0aa88580f57fe15ca88b09f162e604d19282915a95a3795b5d3c1c05b5", size = 31221, upload-time = "2025-07-27T13:04:37.704Z" },
{ url = "https://files.pythonhosted.org/packages/99/bf/00a87d951473ce96c8c08af22b6983e681bfabdb78dd2dcf7ee58eac0932/pybase64-1.4.2-cp314-cp314-ios_13_0_arm64_iphoneos.whl", hash = "sha256:4157ad277a32cf4f02a975dffc62a3c67d73dfa4609b2c1978ef47e722b18b8e", size = 30924, upload-time = "2025-07-27T13:04:39.189Z" },
{ url = "https://files.pythonhosted.org/packages/ae/43/dee58c9d60e60e6fb32dc6da722d84592e22f13c277297eb4ce6baf99a99/pybase64-1.4.2-cp314-cp314-ios_13_0_arm64_iphonesimulator.whl", hash = "sha256:e113267dc349cf624eb4f4fbf53fd77835e1aa048ac6877399af426aab435757", size = 31390, upload-time = "2025-07-27T13:04:40.995Z" },
{ url = "https://files.pythonhosted.org/packages/e1/11/b28906fc2e330b8b1ab4bc845a7bef808b8506734e90ed79c6062b095112/pybase64-1.4.2-cp314-cp314-ios_13_0_x86_64_iphonesimulator.whl", hash = "sha256:cea5aaf218fd9c5c23afacfe86fd4464dfedc1a0316dd3b5b4075b068cc67df0", size = 38212, upload-time = "2025-07-27T13:04:42.729Z" },
{ url = "https://files.pythonhosted.org/packages/24/9e/868d1e104413d14b19feaf934fc7fad4ef5b18946385f8bb79684af40f24/pybase64-1.4.2-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:41213497abbd770435c7a9c8123fb02b93709ac4cf60155cd5aefc5f3042b600", size = 38303, upload-time = "2025-07-27T13:04:44.095Z" },
{ url = "https://files.pythonhosted.org/packages/a3/73/f7eac96ca505df0600280d6bfc671a9e2e2f947c2b04b12a70e36412f7eb/pybase64-1.4.2-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:c8b522df7ee00f2ac1993ccd5e1f6608ae7482de3907668c2ff96a83ef213925", size = 31669, upload-time = "2025-07-27T13:04:45.845Z" },
{ url = "https://files.pythonhosted.org/packages/c6/43/8e18bea4fd455100112d6a73a83702843f067ef9b9272485b6bdfd9ed2f0/pybase64-1.4.2-cp314-cp314-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:06725022e540c5b098b978a0418ca979773e2cbdbb76f10bd97536f2ad1c5b49", size = 68452, upload-time = "2025-07-27T13:04:47.788Z" },
{ url = "https://files.pythonhosted.org/packages/e4/2e/851eb51284b97354ee5dfa1309624ab90920696e91a33cd85b13d20cc5c1/pybase64-1.4.2-cp314-cp314-manylinux1_x86_64.manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:a3e54dcf0d0305ec88473c9d0009f698cabf86f88a8a10090efeff2879c421bb", size = 71674, upload-time = "2025-07-27T13:04:49.294Z" },
{ url = "https://files.pythonhosted.org/packages/57/0d/5cf1e5dc64aec8db43e8dee4e4046856d639a72bcb0fb3e716be42ced5f1/pybase64-1.4.2-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:67675cee727a60dc91173d2790206f01aa3c7b3fbccfa84fd5c1e3d883fe6caa", size = 60027, upload-time = "2025-07-27T13:04:50.769Z" },
{ url = "https://files.pythonhosted.org/packages/a4/8e/3479266bc0e65f6cc48b3938d4a83bff045330649869d950a378f2ddece0/pybase64-1.4.2-cp314-cp314-manylinux2014_armv7l.manylinux_2_17_armv7l.whl", hash = "sha256:753da25d4fd20be7bda2746f545935773beea12d5cb5ec56ec2d2960796477b1", size = 56461, upload-time = "2025-07-27T13:04:52.37Z" },
{ url = "https://files.pythonhosted.org/packages/20/b6/f2b6cf59106dd78bae8717302be5b814cec33293504ad409a2eb752ad60c/pybase64-1.4.2-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:a78c768ce4ca550885246d14babdb8923e0f4a848dfaaeb63c38fc99e7ea4052", size = 59446, upload-time = "2025-07-27T13:04:53.967Z" },
{ url = "https://files.pythonhosted.org/packages/16/70/3417797dfccdfdd0a54e4ad17c15b0624f0fc2d6a362210f229f5c4e8fd0/pybase64-1.4.2-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:51b17f36d890c92f0618fb1c8db2ccc25e6ed07afa505bab616396fc9b0b0492", size = 60350, upload-time = "2025-07-27T13:04:55.881Z" },
{ url = "https://files.pythonhosted.org/packages/a0/c6/6e4269dd98d150ae95d321b311a345eae0f7fd459d97901b4a586d7513bb/pybase64-1.4.2-cp314-cp314-manylinux_2_31_riscv64.whl", hash = "sha256:f92218d667049ab4f65d54fa043a88ffdb2f07fff1f868789ef705a5221de7ec", size = 54989, upload-time = "2025-07-27T13:04:57.436Z" },
{ url = "https://files.pythonhosted.org/packages/f9/e8/18c1b0c255f964fafd0412b0d5a163aad588aeccb8f84b9bf9c8611d80f6/pybase64-1.4.2-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:3547b3d1499919a06491b3f879a19fbe206af2bd1a424ecbb4e601eb2bd11fea", size = 58724, upload-time = "2025-07-27T13:04:59.406Z" },
{ url = "https://files.pythonhosted.org/packages/b1/ad/ddfbd2125fc20b94865fb232b2e9105376fa16eee492e4b7786d42a86cbf/pybase64-1.4.2-cp314-cp314-musllinux_1_2_armv7l.whl", hash = "sha256:958af7b0e09ddeb13e8c2330767c47b556b1ade19c35370f6451d139cde9f2a9", size = 52285, upload-time = "2025-07-27T13:05:01.198Z" },
{ url = "https://files.pythonhosted.org/packages/b6/4c/b9d4ec9224add33c84b925a03d1a53cd4106efb449ea8e0ae7795fed7bf7/pybase64-1.4.2-cp314-cp314-musllinux_1_2_i686.whl", hash = "sha256:4facc57f6671e2229a385a97a618273e7be36a9ea0a9d1c1b9347f14d19ceba8", size = 69036, upload-time = "2025-07-27T13:05:03.109Z" },
{ url = "https://files.pythonhosted.org/packages/92/38/7b96794da77bed3d9b4fea40f14ae563648fba83a696e7602fabe60c0eb7/pybase64-1.4.2-cp314-cp314-musllinux_1_2_ppc64le.whl", hash = "sha256:a32fc57d05d73a7c9b0ca95e9e265e21cf734195dc6873829a890058c35f5cfd", size = 57938, upload-time = "2025-07-27T13:05:04.744Z" },
{ url = "https://files.pythonhosted.org/packages/eb/c5/ae8bbce3c322d1b074e79f51f5df95961fe90cb8748df66c6bc97616e974/pybase64-1.4.2-cp314-cp314-musllinux_1_2_riscv64.whl", hash = "sha256:3dc853243c81ce89cc7318e6946f860df28ddb7cd2a0648b981652d9ad09ee5a", size = 54474, upload-time = "2025-07-27T13:05:06.662Z" },
{ url = "https://files.pythonhosted.org/packages/15/9a/c09887c4bb1b43c03fc352e2671ef20c6686c6942a99106a45270ee5b840/pybase64-1.4.2-cp314-cp314-musllinux_1_2_s390x.whl", hash = "sha256:0e6d863a86b3e7bc6ac9bd659bebda4501b9da842521111b0b0e54eb51295df5", size = 56533, upload-time = "2025-07-27T13:05:08.368Z" },
{ url = "https://files.pythonhosted.org/packages/4f/0f/d5114d63d35d085639606a880cb06e2322841cd4b213adfc14d545c1186f/pybase64-1.4.2-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:6579475140ff2067903725d8aca47f5747bcb211597a1edd60b58f6d90ada2bd", size = 71030, upload-time = "2025-07-27T13:05:10.3Z" },
{ url = "https://files.pythonhosted.org/packages/40/0e/fe6f1ed22ea52eb99f490a8441815ba21de288f4351aeef4968d71d20d2d/pybase64-1.4.2-cp314-cp314-win32.whl", hash = "sha256:373897f728d7b4f241a1f803ac732c27b6945d26d86b2741ad9b75c802e4e378", size = 34174, upload-time = "2025-07-27T13:05:12.254Z" },
{ url = "https://files.pythonhosted.org/packages/71/46/0e15bea52ffc63e8ae7935e945accbaf635e0aefa26d3e31fdf9bc9dcd01/pybase64-1.4.2-cp314-cp314-win_amd64.whl", hash = "sha256:1afe3361344617d298c1d08bc657ef56d0f702d6b72cb65d968b2771017935aa", size = 36308, upload-time = "2025-07-27T13:05:13.898Z" },
{ url = "https://files.pythonhosted.org/packages/4f/dc/55849fee2577bda77c1e078da04cc9237e8e474a8c8308deb702a26f2511/pybase64-1.4.2-cp314-cp314-win_arm64.whl", hash = "sha256:f131c9360babe522f3d90f34da3f827cba80318125cf18d66f2ee27e3730e8c4", size = 31341, upload-time = "2025-07-27T13:05:15.553Z" },
{ url = "https://files.pythonhosted.org/packages/39/44/c69d088e28b25e70ac742b6789cde038473815b2a69345c4bae82d5e244d/pybase64-1.4.2-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:2583ac304131c1bd6e3120b0179333610f18816000db77c0a2dd6da1364722a8", size = 38678, upload-time = "2025-07-27T13:05:17.544Z" },
{ url = "https://files.pythonhosted.org/packages/00/93/2860ec067497b9cbb06242f96d44caebbd9eed32174e4eb8c1ffef760f94/pybase64-1.4.2-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:75a8116be4ea4cdd30a5c4f1a6f3b038e0d457eb03c8a2685d8ce2aa00ef8f92", size = 32066, upload-time = "2025-07-27T13:05:19.18Z" },
{ url = "https://files.pythonhosted.org/packages/d3/55/1e96249a38759332e8a01b31c370d88c60ceaf44692eb6ba4f0f451ee496/pybase64-1.4.2-cp314-cp314t-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:217ea776a098d7c08668e5526b9764f5048bbfd28cac86834217ddfe76a4e3c4", size = 72465, upload-time = "2025-07-27T13:05:20.866Z" },
{ url = "https://files.pythonhosted.org/packages/6d/ab/0f468605b899f3e35dbb7423fba3ff98aeed1ec16abb02428468494a58f4/pybase64-1.4.2-cp314-cp314t-manylinux1_x86_64.manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:4ec14683e343c95b14248cdfdfa78c052582be7a3865fd570aa7cffa5ab5cf37", size = 75693, upload-time = "2025-07-27T13:05:22.896Z" },
{ url = "https://files.pythonhosted.org/packages/91/d1/9980a0159b699e2489baba05b71b7c953b29249118ba06fdbb3e9ea1b9b5/pybase64-1.4.2-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:480ecf21e1e956c5a10d3cf7b3b7e75bce3f9328cf08c101e4aab1925d879f34", size = 65577, upload-time = "2025-07-27T13:05:25Z" },
{ url = "https://files.pythonhosted.org/packages/16/86/b27e7b95f9863d245c0179a7245582eda3d262669d8f822777364d8fd7d5/pybase64-1.4.2-cp314-cp314t-manylinux2014_armv7l.manylinux_2_17_armv7l.whl", hash = "sha256:1fe1ebdc55e9447142e2f6658944aadfb5a4fbf03dbd509be34182585515ecc1", size = 60662, upload-time = "2025-07-27T13:05:27.138Z" },
{ url = "https://files.pythonhosted.org/packages/28/87/a7f0dde0abc26bfbee761f1d3558eb4b139f33ddd9fe1f6825ffa7daa22d/pybase64-1.4.2-cp314-cp314t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:c793a2b06753accdaf5e1a8bbe5d800aab2406919e5008174f989a1ca0081411", size = 64179, upload-time = "2025-07-27T13:05:28.996Z" },
{ url = "https://files.pythonhosted.org/packages/1e/88/5d6fa1c60e1363b4cac4c396978f39e9df4689e75225d7d9c0a5998e3a14/pybase64-1.4.2-cp314-cp314t-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:6acae6e1d1f7ebe40165f08076c7a73692b2bf9046fefe673f350536e007f556", size = 64968, upload-time = "2025-07-27T13:05:30.818Z" },
{ url = "https://files.pythonhosted.org/packages/20/6e/2ed585af5b2211040445d9849326dd2445320c9316268794f5453cfbaf30/pybase64-1.4.2-cp314-cp314t-manylinux_2_31_riscv64.whl", hash = "sha256:88b91cd0949358aadcea75f8de5afbcf3c8c5fb9ec82325bd24285b7119cf56e", size = 58738, upload-time = "2025-07-27T13:05:32.629Z" },
{ url = "https://files.pythonhosted.org/packages/ce/94/e2960b56322eabb3fbf303fc5a72e6444594c1b90035f3975c6fe666db5c/pybase64-1.4.2-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:53316587e1b1f47a11a5ff068d3cbd4a3911c291f2aec14882734973684871b2", size = 63802, upload-time = "2025-07-27T13:05:34.687Z" },
{ url = "https://files.pythonhosted.org/packages/95/47/312139d764c223f534f751528ce3802887c279125eac64f71cd3b4e05abc/pybase64-1.4.2-cp314-cp314t-musllinux_1_2_armv7l.whl", hash = "sha256:caa7f20f43d00602cf9043b5ba758d54f5c41707d3709b2a5fac17361579c53c", size = 56341, upload-time = "2025-07-27T13:05:36.554Z" },
{ url = "https://files.pythonhosted.org/packages/3f/d7/aec9a6ed53b128dac32f8768b646ca5730c88eef80934054d7fa7d02f3ef/pybase64-1.4.2-cp314-cp314t-musllinux_1_2_i686.whl", hash = "sha256:2d93817e24fdd79c534ed97705df855af6f1d2535ceb8dfa80da9de75482a8d7", size = 72838, upload-time = "2025-07-27T13:05:38.459Z" },
{ url = "https://files.pythonhosted.org/packages/e3/a8/6ccc54c5f1f7c3450ad7c56da10c0f131d85ebe069ea6952b5b42f2e92d9/pybase64-1.4.2-cp314-cp314t-musllinux_1_2_ppc64le.whl", hash = "sha256:63cd769b51474d8d08f7f2ce73b30380d9b4078ec92ea6b348ea20ed1e1af88a", size = 62633, upload-time = "2025-07-27T13:05:40.624Z" },
{ url = "https://files.pythonhosted.org/packages/34/22/2b9d89f8ff6f2a01d6d6a88664b20a4817049cfc3f2c62caca040706660c/pybase64-1.4.2-cp314-cp314t-musllinux_1_2_riscv64.whl", hash = "sha256:cd07e6a9993c392ec8eb03912a43c6a6b21b2deb79ee0d606700fe276e9a576f", size = 58282, upload-time = "2025-07-27T13:05:42.565Z" },
{ url = "https://files.pythonhosted.org/packages/b2/14/dbf6266177532a6a11804ac080ebffcee272f491b92820c39886ee20f201/pybase64-1.4.2-cp314-cp314t-musllinux_1_2_s390x.whl", hash = "sha256:6a8944e8194adff4668350504bc6b7dbde2dab9244c88d99c491657d145b5af5", size = 60948, upload-time = "2025-07-27T13:05:44.48Z" },
{ url = "https://files.pythonhosted.org/packages/fd/7a/b2ae9046a66dd5746cd72836a41386517b1680bea5ce02f2b4f1c9ebc688/pybase64-1.4.2-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:04ab398ec4b6a212af57f6a21a6336d5a1d754ff4ccb215951366ab9080481b2", size = 74854, upload-time = "2025-07-27T13:05:46.416Z" },
{ url = "https://files.pythonhosted.org/packages/ef/7e/9856f6d6c38a7b730e001123d2d9fa816b8b1a45f0cdee1d509d5947b047/pybase64-1.4.2-cp314-cp314t-win32.whl", hash = "sha256:3b9201ecdcb1c3e23be4caebd6393a4e6615bd0722528f5413b58e22e3792dd3", size = 34490, upload-time = "2025-07-27T13:05:48.304Z" },
{ url = "https://files.pythonhosted.org/packages/c7/38/8523a9dc1ec8704dedbe5ccc95192ae9a7585f7eec85cc62946fe3cacd32/pybase64-1.4.2-cp314-cp314t-win_amd64.whl", hash = "sha256:36e9b0cad8197136d73904ef5a71d843381d063fd528c5ab203fc4990264f682", size = 36680, upload-time = "2025-07-27T13:05:50.264Z" },
{ url = "https://files.pythonhosted.org/packages/3c/52/5600104ef7b85f89fb8ec54f73504ead3f6f0294027e08d281f3cafb5c1a/pybase64-1.4.2-cp314-cp314t-win_arm64.whl", hash = "sha256:f25140496b02db0e7401567cd869fb13b4c8118bf5c2428592ec339987146d8b", size = 31600, upload-time = "2025-07-27T13:05:52.24Z" },
]

[[package]]
···
sdist = { url = "https://files.pythonhosted.org/packages/00/dd/4325abf92c39ba8623b5af936ddb36ffcfe0beae70405d456ab1fb2f5b8c/pydantic-2.11.7.tar.gz", hash = "sha256:d989c3c6cb79469287b1569f7447a17848c998458d49ebe294e975b9baf0f0db", size = 788350, upload-time = "2025-06-14T08:33:17.137Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/6a/c0/ec2b1c8712ca690e5d61979dee872603e92b8a32f94cc1b72d53beab008a/pydantic-2.11.7-py3-none-any.whl", hash = "sha256:dde5df002701f6de26248661f6835bbe296a47bf73990135c7d07ce741b9623b", size = 444782, upload-time = "2025-06-14T08:33:14.905Z" },
]

[package.optional-dependencies]
email = [
{ name = "email-validator" },
]

[[package]]
···
]

[[package]]
name = "pyperclip"
version = "1.11.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/e8/52/d87eba7cb129b81563019d1679026e7a112ef76855d6159d24754dbd2a51/pyperclip-1.11.0.tar.gz", hash = "sha256:244035963e4428530d9e3a6101a1ef97209c6825edab1567beac148ccc1db1b6", size = 12185, upload-time = "2025-09-26T14:40:37.245Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/df/80/fc9d01d5ed37ba4c42ca2b55b4339ae6e200b456be3a1aaddf4a9fa99b8c/pyperclip-1.11.0-py3-none-any.whl", hash = "sha256:299403e9ff44581cb9ba2ffeed69c7aa96a008622ad0c46cb575ca75b5b84273", size = 11063, upload-time = "2025-09-26T14:40:36.069Z" },
]

[[package]]
name = "pytest"
version = "8.4.1"
source = { registry = "https://pypi.org/simple" }
···
]

[[package]]
name = "rfc3339-validator"
version = "0.1.4"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "six" },
]
sdist = { url = "https://files.pythonhosted.org/packages/28/ea/a9387748e2d111c3c2b275ba970b735e04e15cdb1eb30693b6b5708c4dbd/rfc3339_validator-0.1.4.tar.gz", hash = "sha256:138a2abdf93304ad60530167e51d2dfb9549521a836871b88d7f4695d0022f6b", size = 5513, upload-time = "2021-05-12T16:37:54.178Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/7b/44/4e421b96b67b2daff264473f7465db72fbdf36a07e05494f50300cc7b0c6/rfc3339_validator-0.1.4-py2.py3-none-any.whl", hash = "sha256:24f6ec1eda14ef823da9e36ec7113124b39c04d50a4d3d3a3c2859577e7791fa", size = 3490, upload-time = "2021-05-12T16:37:52.536Z" },
]

[[package]]
name = "rich"
version = "14.0.0"
source = { registry = "https://pypi.org/simple" }
···
]

[[package]]
name = "rich-rst"
version = "1.3.1"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "docutils" },
{ name = "rich" },
]
sdist = { url = "https://files.pythonhosted.org/packages/b0/69/5514c3a87b5f10f09a34bb011bc0927bc12c596c8dae5915604e71abc386/rich_rst-1.3.1.tar.gz", hash = "sha256:fad46e3ba42785ea8c1785e2ceaa56e0ffa32dbe5410dec432f37e4107c4f383", size = 13839, upload-time = "2024-04-30T04:40:38.125Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/fd/bc/cc4e3dbc5e7992398dcb7a8eda0cbcf4fb792a0cdb93f857b478bf3cf884/rich_rst-1.3.1-py3-none-any.whl", hash = "sha256:498a74e3896507ab04492d326e794c3ef76e7cda078703aa592d1853d91098c1", size = 11621, upload-time = "2024-04-30T04:40:32.619Z" },
]

[[package]]
name = "rpds-py"
version = "0.26.0"
source = { registry = "https://pypi.org/simple" }
···
[[package]]
name = "turbopuffer"
version = "1.3.0"
source = { registry = "https://pypi.org/simple" }
2138
dependencies = [
2139
{ name = "aiohttp" },
···
2145
{ name = "sniffio" },
2146
{ name = "typing-extensions" },
2147
]
2148
+
sdist = { url = "https://files.pythonhosted.org/packages/3c/91/f347d5dd23eef82688537549cc032347842bb9d6879d772a0a5a9548ddd0/turbopuffer-1.3.0.tar.gz", hash = "sha256:f1429b3073c00f85d91f2fb06666ce3c0275a2da1b8dc7f0f8d8c0b7207bee56", size = 142443, upload-time = "2025-09-24T21:12:36.46Z" }
2149
wheels = [
2150
+
{ url = "https://files.pythonhosted.org/packages/5f/9d/d54419a42c45fe5758640a5228aae71a7e825205755e68fa6494fef6eea3/turbopuffer-1.3.0-py3-none-any.whl", hash = "sha256:f373ad0bca148ef643ddcbeb5665c2c50cbfab0b9e2e5a11c86afe16597a4382", size = 109674, upload-time = "2025-09-24T21:12:34.995Z" },
2151
]
2152
2153
[[package]]
···
2241
2242
[[package]]
2243
name = "websockets"
2244
+
version = "15.0.1"
2245
source = { registry = "https://pypi.org/simple" }
2246
+
sdist = { url = "https://files.pythonhosted.org/packages/21/e6/26d09fab466b7ca9c7737474c52be4f76a40301b08362eb2dbc19dcc16c1/websockets-15.0.1.tar.gz", hash = "sha256:82544de02076bafba038ce055ee6412d68da13ab47f0c60cab827346de828dee", size = 177016, upload-time = "2025-03-05T20:03:41.606Z" }
2247
wheels = [
2248
+
{ url = "https://files.pythonhosted.org/packages/51/6b/4545a0d843594f5d0771e86463606a3988b5a09ca5123136f8a76580dd63/websockets-15.0.1-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:3e90baa811a5d73f3ca0bcbf32064d663ed81318ab225ee4f427ad4e26e5aff3", size = 175437, upload-time = "2025-03-05T20:02:16.706Z" },
2249
+
{ url = "https://files.pythonhosted.org/packages/f4/71/809a0f5f6a06522af902e0f2ea2757f71ead94610010cf570ab5c98e99ed/websockets-15.0.1-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:592f1a9fe869c778694f0aa806ba0374e97648ab57936f092fd9d87f8bc03665", size = 173096, upload-time = "2025-03-05T20:02:18.832Z" },
2250
+
{ url = "https://files.pythonhosted.org/packages/3d/69/1a681dd6f02180916f116894181eab8b2e25b31e484c5d0eae637ec01f7c/websockets-15.0.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:0701bc3cfcb9164d04a14b149fd74be7347a530ad3bbf15ab2c678a2cd3dd9a2", size = 173332, upload-time = "2025-03-05T20:02:20.187Z" },
2251
+
{ url = "https://files.pythonhosted.org/packages/a6/02/0073b3952f5bce97eafbb35757f8d0d54812b6174ed8dd952aa08429bcc3/websockets-15.0.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e8b56bdcdb4505c8078cb6c7157d9811a85790f2f2b3632c7d1462ab5783d215", size = 183152, upload-time = "2025-03-05T20:02:22.286Z" },
2252
+
{ url = "https://files.pythonhosted.org/packages/74/45/c205c8480eafd114b428284840da0b1be9ffd0e4f87338dc95dc6ff961a1/websockets-15.0.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:0af68c55afbd5f07986df82831c7bff04846928ea8d1fd7f30052638788bc9b5", size = 182096, upload-time = "2025-03-05T20:02:24.368Z" },
2253
+
{ url = "https://files.pythonhosted.org/packages/14/8f/aa61f528fba38578ec553c145857a181384c72b98156f858ca5c8e82d9d3/websockets-15.0.1-cp312-cp312-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:64dee438fed052b52e4f98f76c5790513235efaa1ef7f3f2192c392cd7c91b65", size = 182523, upload-time = "2025-03-05T20:02:25.669Z" },
2254
+
{ url = "https://files.pythonhosted.org/packages/ec/6d/0267396610add5bc0d0d3e77f546d4cd287200804fe02323797de77dbce9/websockets-15.0.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:d5f6b181bb38171a8ad1d6aa58a67a6aa9d4b38d0f8c5f496b9e42561dfc62fe", size = 182790, upload-time = "2025-03-05T20:02:26.99Z" },
2255
+
{ url = "https://files.pythonhosted.org/packages/02/05/c68c5adbf679cf610ae2f74a9b871ae84564462955d991178f95a1ddb7dd/websockets-15.0.1-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:5d54b09eba2bada6011aea5375542a157637b91029687eb4fdb2dab11059c1b4", size = 182165, upload-time = "2025-03-05T20:02:30.291Z" },
2256
+
{ url = "https://files.pythonhosted.org/packages/29/93/bb672df7b2f5faac89761cb5fa34f5cec45a4026c383a4b5761c6cea5c16/websockets-15.0.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:3be571a8b5afed347da347bfcf27ba12b069d9d7f42cb8c7028b5e98bbb12597", size = 182160, upload-time = "2025-03-05T20:02:31.634Z" },
2257
+
{ url = "https://files.pythonhosted.org/packages/ff/83/de1f7709376dc3ca9b7eeb4b9a07b4526b14876b6d372a4dc62312bebee0/websockets-15.0.1-cp312-cp312-win32.whl", hash = "sha256:c338ffa0520bdb12fbc527265235639fb76e7bc7faafbb93f6ba80d9c06578a9", size = 176395, upload-time = "2025-03-05T20:02:33.017Z" },
2258
+
{ url = "https://files.pythonhosted.org/packages/7d/71/abf2ebc3bbfa40f391ce1428c7168fb20582d0ff57019b69ea20fa698043/websockets-15.0.1-cp312-cp312-win_amd64.whl", hash = "sha256:fcd5cf9e305d7b8338754470cf69cf81f420459dbae8a3b40cee57417f4614a7", size = 176841, upload-time = "2025-03-05T20:02:34.498Z" },
2259
+
{ url = "https://files.pythonhosted.org/packages/cb/9f/51f0cf64471a9d2b4d0fc6c534f323b664e7095640c34562f5182e5a7195/websockets-15.0.1-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:ee443ef070bb3b6ed74514f5efaa37a252af57c90eb33b956d35c8e9c10a1931", size = 175440, upload-time = "2025-03-05T20:02:36.695Z" },
2260
+
{ url = "https://files.pythonhosted.org/packages/8a/05/aa116ec9943c718905997412c5989f7ed671bc0188ee2ba89520e8765d7b/websockets-15.0.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:5a939de6b7b4e18ca683218320fc67ea886038265fd1ed30173f5ce3f8e85675", size = 173098, upload-time = "2025-03-05T20:02:37.985Z" },
2261
+
{ url = "https://files.pythonhosted.org/packages/ff/0b/33cef55ff24f2d92924923c99926dcce78e7bd922d649467f0eda8368923/websockets-15.0.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:746ee8dba912cd6fc889a8147168991d50ed70447bf18bcda7039f7d2e3d9151", size = 173329, upload-time = "2025-03-05T20:02:39.298Z" },
2262
+
{ url = "https://files.pythonhosted.org/packages/31/1d/063b25dcc01faa8fada1469bdf769de3768b7044eac9d41f734fd7b6ad6d/websockets-15.0.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:595b6c3969023ecf9041b2936ac3827e4623bfa3ccf007575f04c5a6aa318c22", size = 183111, upload-time = "2025-03-05T20:02:40.595Z" },
2263
+
{ url = "https://files.pythonhosted.org/packages/93/53/9a87ee494a51bf63e4ec9241c1ccc4f7c2f45fff85d5bde2ff74fcb68b9e/websockets-15.0.1-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:3c714d2fc58b5ca3e285461a4cc0c9a66bd0e24c5da9911e30158286c9b5be7f", size = 182054, upload-time = "2025-03-05T20:02:41.926Z" },
2264
+
{ url = "https://files.pythonhosted.org/packages/ff/b2/83a6ddf56cdcbad4e3d841fcc55d6ba7d19aeb89c50f24dd7e859ec0805f/websockets-15.0.1-cp313-cp313-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0f3c1e2ab208db911594ae5b4f79addeb3501604a165019dd221c0bdcabe4db8", size = 182496, upload-time = "2025-03-05T20:02:43.304Z" },
2265
+
{ url = "https://files.pythonhosted.org/packages/98/41/e7038944ed0abf34c45aa4635ba28136f06052e08fc2168520bb8b25149f/websockets-15.0.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:229cf1d3ca6c1804400b0a9790dc66528e08a6a1feec0d5040e8b9eb14422375", size = 182829, upload-time = "2025-03-05T20:02:48.812Z" },
2266
+
{ url = "https://files.pythonhosted.org/packages/e0/17/de15b6158680c7623c6ef0db361da965ab25d813ae54fcfeae2e5b9ef910/websockets-15.0.1-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:756c56e867a90fb00177d530dca4b097dd753cde348448a1012ed6c5131f8b7d", size = 182217, upload-time = "2025-03-05T20:02:50.14Z" },
2267
+
{ url = "https://files.pythonhosted.org/packages/33/2b/1f168cb6041853eef0362fb9554c3824367c5560cbdaad89ac40f8c2edfc/websockets-15.0.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:558d023b3df0bffe50a04e710bc87742de35060580a293c2a984299ed83bc4e4", size = 182195, upload-time = "2025-03-05T20:02:51.561Z" },
2268
+
{ url = "https://files.pythonhosted.org/packages/86/eb/20b6cdf273913d0ad05a6a14aed4b9a85591c18a987a3d47f20fa13dcc47/websockets-15.0.1-cp313-cp313-win32.whl", hash = "sha256:ba9e56e8ceeeedb2e080147ba85ffcd5cd0711b89576b83784d8605a7df455fa", size = 176393, upload-time = "2025-03-05T20:02:53.814Z" },
2269
+
{ url = "https://files.pythonhosted.org/packages/1b/6c/c65773d6cab416a64d191d6ee8a8b1c68a09970ea6909d16965d26bfed1e/websockets-15.0.1-cp313-cp313-win_amd64.whl", hash = "sha256:e09473f095a819042ecb2ab9465aee615bd9c2028e4ef7d933600a8401c79561", size = 176837, upload-time = "2025-03-05T20:02:55.237Z" },
2270
+
{ url = "https://files.pythonhosted.org/packages/fa/a8/5b41e0da817d64113292ab1f8247140aac61cbf6cfd085d6a0fa77f4984f/websockets-15.0.1-py3-none-any.whl", hash = "sha256:f7a866fbc1e97b5c617ee4116daaa09b722101d4a3c170c787450ba409f9736f", size = 169743, upload-time = "2025-03-05T20:03:39.41Z" },
2271
+
]
2272
+
2273
+
[[package]]
2274
+
name = "werkzeug"
2275
+
version = "3.1.1"
2276
+
source = { registry = "https://pypi.org/simple" }
2277
+
dependencies = [
2278
+
{ name = "markupsafe" },
2279
+
]
2280
+
sdist = { url = "https://files.pythonhosted.org/packages/32/af/d4502dc713b4ccea7175d764718d5183caf8d0867a4f0190d5d4a45cea49/werkzeug-3.1.1.tar.gz", hash = "sha256:8cd39dfbdfc1e051965f156163e2974e52c210f130810e9ad36858f0fd3edad4", size = 806453, upload-time = "2024-11-01T16:40:45.462Z" }
2281
+
wheels = [
2282
+
{ url = "https://files.pythonhosted.org/packages/ee/ea/c67e1dee1ba208ed22c06d1d547ae5e293374bfc43e0eb0ef5e262b68561/werkzeug-3.1.1-py3-none-any.whl", hash = "sha256:a71124d1ef06008baafa3d266c02f56e1836a5984afd6dd6c9230669d60d9fb5", size = 224371, upload-time = "2024-11-01T16:40:43.994Z" },
2283
]
2284
2285
[[package]]