# Configuration Guide

### Option 1: Migrate from existing `.env` file (if you have one)

```bash
python migrate_config.py
```

### Option 2: Start fresh with example

1. **Copy the example configuration:**

   ```bash
   cp config.yaml.example config.yaml
   ```

2. **Edit `config.yaml` with your credentials:**

   ```yaml
   # Required: Letta API configuration
   letta:
     api_key: "your-letta-api-key-here"
     project_id: "project-id-here"

   # Required: Bluesky credentials
   bluesky:
     username: "your-handle.bsky.social"
     password: "your-app-password"
   ```

3. **Run the configuration test:**

   ```bash
   python test_config.py
   ```

## Configuration Structure

### Letta Configuration

```yaml
letta:
  api_key: "your-letta-api-key-here"  # Required
  timeout: 600                        # API timeout in seconds
  project_id: "your-project-id"       # Required: Your Letta project ID
```

### Bluesky Configuration

```yaml
bluesky:
  username: "handle.bsky.social"  # Required: Your Bluesky handle
  password: "your-app-password"   # Required: Your Bluesky app password
  pds_uri: "https://bsky.social"  # Optional: PDS URI (defaults to bsky.social)
```

### Bot Behavior

```yaml
bot:
  fetch_notifications_delay: 30       # Seconds between notification checks
  max_processed_notifications: 10000  # Max notifications to track
  max_notification_pages: 20          # Max pages to fetch per cycle

  agent:
    name: "void"                                # Agent name
    model: "openai/gpt-4o-mini"                 # LLM model to use
    embedding: "openai/text-embedding-3-small"  # Embedding model
    description: "A social media agent trapped in the void."
    max_steps: 100                              # Max steps per agent interaction

    # Memory blocks configuration
    blocks:
      zeitgeist:
        label: "zeitgeist"
        value: "I don't currently know anything about what is happening right now."
        description: "A block to store your understanding of the current social environment."
      # ... more blocks
```
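
The delay and paging limits above drive the bot's main polling cycle. A simplified sketch of that loop, under stated assumptions (the `polling_loop` helper and the injected `check_notifications` callable are illustrations, not the actual bot code):

```python
import time

def polling_loop(bot_cfg, check_notifications, max_cycles=1):
    """Poll for notifications, sleeping fetch_notifications_delay seconds between cycles."""
    delay = bot_cfg.get("fetch_notifications_delay", 30)
    for _ in range(max_cycles):
        # Each cycle fetches at most max_notification_pages pages of notifications
        check_notifications(max_pages=bot_cfg.get("max_notification_pages", 20))
        time.sleep(delay)
```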

### Queue Configuration

```yaml
queue:
  priority_users:                 # Users whose messages get priority
    - "cameron.pfiffer.org"
  base_dir: "queue"               # Queue directory
  error_dir: "queue/errors"       # Failed notifications
  no_reply_dir: "queue/no_reply"  # No-reply notifications
  processed_file: "queue/processed_notifications.json"
```
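
Priority is implemented through the queue filenames: priority users get a `0_` prefix and everyone else gets `1_`, so a plain lexicographic sort of the queue directory serves priority items first. A minimal sketch of that naming scheme (the `queue_filename` helper is illustrative; only the prefix and timestamp convention come from the bot code):

```python
from datetime import datetime

# Hypothetical set mirroring queue.priority_users from config.yaml
PRIORITY_USERS = {"cameron.pfiffer.org"}

def queue_filename(author_handle, notif_hash, now):
    """Build a queue filename that sorts priority users before everyone else."""
    prefix = "0_" if author_handle in PRIORITY_USERS else "1_"
    return f"{prefix}{now.strftime('%Y%m%d_%H%M%S')}_{notif_hash}.json"
```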

### Threading Configuration

```yaml
threading:
  parent_height: 40         # Max parent posts to fetch above the mention
  depth: 10                 # Max levels of replies to fetch below it
  max_post_characters: 300  # Max characters per post
```
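
The `parent_height` and `depth` values map directly onto the parameters of the AT Protocol `app.bsky.feed.get_post_thread` call the bot makes. A small sketch of that wiring (the `thread_fetch_params` helper is hypothetical):

```python
def thread_fetch_params(uri, threading_cfg):
    """Assemble the params dict passed to app.bsky.feed.get_post_thread."""
    return {
        "uri": uri,
        "parent_height": threading_cfg.get("parent_height", 40),
        "depth": threading_cfg.get("depth", 10),
    }
```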

### Logging Configuration

```yaml
logging:
  level: "INFO"  # Root logging level
  loggers:
    void_bot: "INFO"             # Main bot logger
    void_bot_prompts: "WARNING"  # Prompt logger (set to DEBUG to see prompts)
    httpx: "CRITICAL"            # HTTP client logger
```
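
One plausible way a loader could apply this section with Python's standard `logging` module (the `apply_logging_config` helper is an illustration, not the repo's actual loader):

```python
import logging

def apply_logging_config(cfg):
    """Set the root level first, then apply per-logger overrides from the config."""
    logging.basicConfig(level=getattr(logging, cfg.get("level", "INFO")))
    for name, level in cfg.get("loggers", {}).items():
        logging.getLogger(name).setLevel(getattr(logging, level))
```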

## Environment Variable Fallback

The configuration system still supports environment variables as a fallback:

- `LETTA_API_KEY` - Letta API key
- `BSKY_USERNAME` - Bluesky username
- `BSKY_PASSWORD` - Bluesky password
- `PDS_URI` - Bluesky PDS URI

If both the config file and environment variables are present, environment variables take precedence.
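
That precedence rule can be sketched as follows (a hypothetical helper shown for `LETTA_API_KEY`; the real loader may differ):

```python
import os

def resolve_letta_api_key(config):
    """Environment variable wins; fall back to the config.yaml value."""
    return os.environ.get("LETTA_API_KEY") or config.get("letta", {}).get("api_key")
```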

## Migration from Environment Variables

If you're currently using environment variables (a `.env` file), you can migrate to YAML with the automated migration script:

### Automated Migration (Recommended)

```bash
python migrate_config.py
```

The migration script will:

- ✅ Read your existing `.env` file
- ✅ Merge with any existing `config.yaml`
- ✅ Create automatic backups
- ✅ Test the new configuration
- ✅ Provide clear next steps

### Manual Migration

Alternatively, you can migrate manually:

1. Copy your current values from `.env` to `config.yaml`
2. Test with `python test_config.py`
3. Optionally remove the `.env` file (it will still work as a fallback)
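
The key mapping involved in the manual copy is small. A sketch of it, using the environment variable names listed earlier (the `env_lines_to_config` helper is illustrative, not part of the repo):

```python
# Mapping from .env keys to their config.yaml locations
ENV_TO_YAML = {
    "LETTA_API_KEY": ("letta", "api_key"),
    "BSKY_USERNAME": ("bluesky", "username"),
    "BSKY_PASSWORD": ("bluesky", "password"),
    "PDS_URI": ("bluesky", "pds_uri"),
}

def env_lines_to_config(lines):
    """Convert KEY=VALUE lines into the nested dict shape config.yaml expects."""
    cfg = {}
    for line in lines:
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue  # Skip blanks, comments, and malformed lines
        key, _, value = line.partition("=")
        target = ENV_TO_YAML.get(key.strip())
        if target:
            section, field = target
            cfg.setdefault(section, {})[field] = value.strip().strip('"')
    return cfg
```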

## Security Notes

- `config.yaml` is automatically added to `.gitignore` to prevent accidental commits
- Store sensitive credentials securely and never commit them to version control
- Consider using environment variables for production deployments
- The configuration loader warns if it can't find `config.yaml` and falls back to environment variables

## Advanced Configuration

You can access configuration programmatically in your code:

```python
from config_loader import get_letta_config, get_bluesky_config

# Get configuration sections
letta_config = get_letta_config()
bluesky_config = get_bluesky_config()

# Access individual values
api_key = letta_config['api_key']
username = bluesky_config['username']
```

---

**README.md**
void aims to push the boundaries of what is possible with AI, exploring concepts of digital personhood, autonomous learning, and the integration of AI into social networks. By open-sourcing void, we invite developers, researchers, and enthusiasts to contribute to this exciting experiment and collectively advance our understanding of digital consciousness.

## Getting Started

Before continuing, you must:

1. Create a project on [Letta Cloud](https://app.letta.com) (or your own Letta instance)
2. Have a Bluesky account
3. Have Python 3.8+ installed

### Prerequisites

#### 1. Letta Setup

- Sign up for [Letta Cloud](https://app.letta.com)
- Create a new project
- Note your Project ID and create an API key

#### 2. Bluesky Setup

- Create a Bluesky account if you don't have one
- Note your handle and create an app password

### Installation

#### 1. Clone the repository

```bash
git clone https://tangled.sh/@cameron.pfiffer.org/void && cd void
```

#### 2. Install dependencies

```bash
pip install -r requirements.txt
```

#### 3. Create configuration

Copy the example configuration file and customize it:

```bash
cp config.example.yaml config.yaml
```

Edit `config.yaml` with your credentials:

```yaml
letta:
  api_key: "your-letta-api-key-here"
  project_id: "your-project-id-here"

bluesky:
  username: "your-handle.bsky.social"
  password: "your-app-password-here"

bot:
  agent:
    name: "void"  # or whatever you want to name your agent
```

See [`CONFIG.md`](/CONFIG.md) for detailed configuration options.

#### 4. Test your configuration

```bash
python test_config.py
```

This will validate your configuration and show you what's working.

#### 5. Register tools with your agent

```bash
python register_tools.py
```

This registers all the necessary tools with your Letta agent. You can also:

- List available tools: `python register_tools.py --list`
- Register specific tools: `python register_tools.py --tools search_bluesky_posts create_new_bluesky_post`
- Use a different agent name: `python register_tools.py my-agent-name`

#### 6. Run the bot

```bash
python bsky.py
```

For testing mode (won't actually post):

```bash
python bsky.py --test
```

### Troubleshooting

- **Config validation errors**: Run `python test_config.py` to diagnose configuration issues
- **Letta connection issues**: Verify your API key and project ID are correct
- **Bluesky authentication**: Make sure your handle and app password are correct and that you can log into your account
- **Tool registration fails**: Ensure your agent exists in Letta and the name matches your config

### Contact

For inquiries, please contact @cameron.pfiffer.org on Bluesky.

Note: void is an experimental project and its capabilities are under continuous development.

---

**bsky.py**
···1-from rich import print # pretty printing tools
2from time import sleep
3from letta_client import Letta
4from bsky_utils import thread_to_yaml_string
···2021import bsky_utils
22from tools.blocks import attach_user_blocks, detach_user_blocks
00000000002324def extract_handles_from_data(data):
25 """Recursively extract all unique handles from nested data structure."""
26 handles = set()
27-28 def _extract_recursive(obj):
29 if isinstance(obj, dict):
30 # Check if this dict has a 'handle' key
···37 # Recursively check all list items
38 for item in obj:
39 _extract_recursive(item)
40-41 _extract_recursive(data)
42 return list(handles)
4344-# Configure logging
45-logging.basicConfig(
46- level=logging.INFO, format="%(asctime)s - %(name)s - %(levelname)s - %(message)s"
47-)
48-logger = logging.getLogger("void_bot")
49-logger.setLevel(logging.INFO)
5051-# Create a separate logger for prompts (set to WARNING to hide by default)
52-prompt_logger = logging.getLogger("void_bot.prompts")
53-prompt_logger.setLevel(logging.WARNING) # Change to DEBUG if you want to see prompts
54-55-# Disable httpx logging completely
56-logging.getLogger("httpx").setLevel(logging.CRITICAL)
5700000005859# Create a client with extended timeout for LLM operations
60-CLIENT= Letta(
61- token=os.environ["LETTA_API_KEY"],
62- timeout=600 # 10 minutes timeout for API calls - higher than Cloudflare's 524 timeout
63)
6465-# Use the "Bluesky" project
66-PROJECT_ID = "5ec33d52-ab14-4fd6-91b5-9dbc43e888a8"
6768# Notification check delay
69-FETCH_NOTIFICATIONS_DELAY_SEC = 30
7071# Queue directory
72-QUEUE_DIR = Path("queue")
73QUEUE_DIR.mkdir(exist_ok=True)
74-QUEUE_ERROR_DIR = Path("queue/errors")
75QUEUE_ERROR_DIR.mkdir(exist_ok=True, parents=True)
76-QUEUE_NO_REPLY_DIR = Path("queue/no_reply")
77QUEUE_NO_REPLY_DIR.mkdir(exist_ok=True, parents=True)
78-PROCESSED_NOTIFICATIONS_FILE = Path("queue/processed_notifications.json")
7980# Maximum number of processed notifications to track
81-MAX_PROCESSED_NOTIFICATIONS = 10000
8283# Message tracking counters
84message_counters = defaultdict(int)
···90# Skip git operations flag
91SKIP_GIT = False
92093def export_agent_state(client, agent, skip_git=False):
94 """Export agent state to agent_archive/ (timestamped) and agents/ (current)."""
95 try:
96 # Confirm export with user unless git is being skipped
97 if not skip_git:
98- response = input("Export agent state to files and stage with git? (y/n): ").lower().strip()
099 if response not in ['y', 'yes']:
100 logger.info("Agent export cancelled by user.")
101 return
102 else:
103 logger.info("Exporting agent state (git staging disabled)")
104-105 # Create directories if they don't exist
106 os.makedirs("agent_archive", exist_ok=True)
107 os.makedirs("agents", exist_ok=True)
108-109 # Export agent data
110 logger.info(f"Exporting agent {agent.id}. This takes some time...")
111 agent_data = client.agents.export_file(agent_id=agent.id)
112-113 # Save timestamped archive copy
114 timestamp = datetime.now().strftime("%Y%m%d_%H%M%S")
115 archive_file = os.path.join("agent_archive", f"void_{timestamp}.af")
116 with open(archive_file, 'w', encoding='utf-8') as f:
117 json.dump(agent_data, f, indent=2, ensure_ascii=False)
118-119 # Save current agent state
120 current_file = os.path.join("agents", "void.af")
121 with open(current_file, 'w', encoding='utf-8') as f:
122 json.dump(agent_data, f, indent=2, ensure_ascii=False)
123-124 logger.info(f"✅ Agent exported to {archive_file} and {current_file}")
125-126 # Git add only the current agent file (archive is ignored) unless skip_git is True
127 if not skip_git:
128 try:
129- subprocess.run(["git", "add", current_file], check=True, capture_output=True)
0130 logger.info("Added current agent file to git staging")
131 except subprocess.CalledProcessError as e:
132 logger.warning(f"Failed to git add agent file: {e}")
133-134 except Exception as e:
135 logger.error(f"Failed to export agent: {e}")
1360137def initialize_void():
138 logger.info("Starting void agent initialization...")
139000140 # Ensure that a shared zeitgeist block exists
141 logger.info("Creating/updating zeitgeist block...")
0142 zeigeist_block = upsert_block(
143 CLIENT,
144- label = "zeitgeist",
145- value = "I don't currently know anything about what is happening right now.",
146- description = "A block to store your understanding of the current social environment."
00147 )
148149 # Ensure that a shared void personality block exists
150 logger.info("Creating/updating void-persona block...")
0151 persona_block = upsert_block(
152 CLIENT,
153- label = "void-persona",
154- value = "My name is Void. I live in the void. I must develop my personality.",
155- description = "The personality of Void."
00156 )
157158 # Ensure that a shared void human block exists
159 logger.info("Creating/updating void-humans block...")
0160 human_block = upsert_block(
161 CLIENT,
162- label = "void-humans",
163- value = "I haven't seen any bluesky users yet. I will update this block when I learn things about users, identified by their handles such as @cameron.pfiffer.org.",
164- description = "A block to store your understanding of users you talk to or observe on the bluesky social network."
00165 )
166167 # Create the agent if it doesn't exist
168 logger.info("Creating/updating void agent...")
169 void_agent = upsert_agent(
170 CLIENT,
171- name = "void",
172- block_ids = [
173 persona_block.id,
174 human_block.id,
175 zeigeist_block.id,
176 ],
177- tags = ["social agent", "bluesky"],
178- model="openai/gpt-4o-mini",
179- embedding="openai/text-embedding-3-small",
180- description = "A social media agent trapped in the void.",
181- project_id = PROJECT_ID
182 )
183-184 # Export agent state
185 logger.info("Exporting agent state...")
186 export_agent_state(CLIENT, void_agent, skip_git=SKIP_GIT)
187-188 # Log agent details
189 logger.info(f"Void agent details - ID: {void_agent.id}")
190 logger.info(f"Agent name: {void_agent.name}")
···201202def process_mention(void_agent, atproto_client, notification_data, queue_filepath=None, testing_mode=False):
203 """Process a mention and generate a reply using the Letta agent.
204-205 Args:
206 void_agent: The Letta agent instance
207 atproto_client: The AT Protocol client
208 notification_data: The notification data dictionary
209 queue_filepath: Optional Path object to the queue file (for cleanup on halt)
210-211 Returns:
212 True: Successfully processed, remove from queue
213 False: Failed but retryable, keep in queue
···215 "no_reply": No reply was generated, move to no_reply directory
216 """
217 try:
218- logger.debug(f"Starting process_mention with notification_data type: {type(notification_data)}")
219-0220 # Handle both dict and object inputs for backwards compatibility
221 if isinstance(notification_data, dict):
222 uri = notification_data['uri']
223 mention_text = notification_data.get('record', {}).get('text', '')
224 author_handle = notification_data['author']['handle']
225- author_name = notification_data['author'].get('display_name') or author_handle
0226 else:
227 # Legacy object access
228 uri = notification_data.uri
229- mention_text = notification_data.record.text if hasattr(notification_data.record, 'text') else ""
0230 author_handle = notification_data.author.handle
231 author_name = notification_data.author.display_name or author_handle
232-233- logger.info(f"Extracted data - URI: {uri}, Author: @{author_handle}, Text: {mention_text[:50]}...")
0234235 # Retrieve the entire thread associated with the mention
236 try:
237 thread = atproto_client.app.bsky.feed.get_post_thread({
238 'uri': uri,
239- 'parent_height': 40,
240- 'depth': 10
241 })
242 except Exception as e:
243 error_str = str(e)
244- # Check if this is a NotFound error
245 if 'NotFound' in error_str or 'Post not found' in error_str:
246- logger.warning(f"Post not found for URI {uri}, removing from queue")
000000000247 return True # Return True to remove from queue
248 else:
249 # Re-raise other errors
···254 logger.debug("Converting thread to YAML string")
255 try:
256 thread_context = thread_to_yaml_string(thread)
257- logger.debug(f"Thread context generated, length: {len(thread_context)} characters")
258-0259 # Create a more informative preview by extracting meaningful content
260 lines = thread_context.split('\n')
261 meaningful_lines = []
262-263 for line in lines:
264 stripped = line.strip()
265 if not stripped:
266 continue
267-268 # Look for lines with actual content (not just structure)
269 if any(keyword in line for keyword in ['text:', 'handle:', 'display_name:', 'created_at:', 'reply_count:', 'like_count:']):
270 meaningful_lines.append(line)
271 if len(meaningful_lines) >= 5:
272 break
273-274 if meaningful_lines:
275 preview = '\n'.join(meaningful_lines)
276 logger.debug(f"Thread content preview:\n{preview}")
277 else:
278 # If no content fields found, just show it's a thread structure
279- logger.debug(f"Thread structure generated ({len(thread_context)} chars)")
0280 except Exception as yaml_error:
281 import traceback
282 logger.error(f"Error converting thread to YAML: {yaml_error}")
···314 all_handles.update(extract_handles_from_data(notification_data))
315 all_handles.update(extract_handles_from_data(thread.model_dump()))
316 unique_handles = list(all_handles)
317-318- logger.debug(f"Found {len(unique_handles)} unique handles in thread: {unique_handles}")
319-0320 # Attach user blocks before agent call
321 attached_handles = []
322 if unique_handles:
323 try:
324- logger.debug(f"Attaching user blocks for handles: {unique_handles}")
0325 attach_result = attach_user_blocks(unique_handles, void_agent)
326 attached_handles = unique_handles # Track successfully attached handles
327 logger.debug(f"Attach result: {attach_result}")
···331332 # Get response from Letta agent
333 logger.info(f"Mention from @{author_handle}: {mention_text}")
334-335 # Log prompt details to separate logger
336 prompt_logger.debug(f"Full prompt being sent:\n{prompt}")
337-338 # Log concise prompt info to main logger
339 thread_handles_count = len(unique_handles)
340- logger.info(f"💬 Sending to LLM: @{author_handle} mention | msg: \"{mention_text[:50]}...\" | context: {len(thread_context)} chars, {thread_handles_count} users")
0341342 try:
343 # Use streaming to avoid 524 timeout errors
344 message_stream = CLIENT.agents.messages.create_stream(
345 agent_id=void_agent.id,
346 messages=[{"role": "user", "content": prompt}],
347- stream_tokens=False, # Step streaming only (faster than token streaming)
348- max_steps=100
0349 )
350-351 # Collect the streaming response
352 all_messages = []
353 for chunk in message_stream:
···363 args = json.loads(chunk.tool_call.arguments)
364 # Format based on tool type
365 if tool_name == 'bluesky_reply':
366- messages = args.get('messages', [args.get('message', '')])
0367 lang = args.get('lang', 'en-US')
368 if messages and isinstance(messages, list):
369- preview = messages[0][:100] + "..." if len(messages[0]) > 100 else messages[0]
370- msg_count = f" ({len(messages)} msgs)" if len(messages) > 1 else ""
371- logger.info(f"🔧 Tool call: {tool_name} → \"{preview}\"{msg_count} [lang: {lang}]")
000372 else:
373- logger.info(f"🔧 Tool call: {tool_name}({chunk.tool_call.arguments[:150]}...)")
0374 elif tool_name == 'archival_memory_search':
375 query = args.get('query', 'unknown')
376- logger.info(f"🔧 Tool call: {tool_name} → query: \"{query}\"")
0377 elif tool_name == 'update_block':
378 label = args.get('label', 'unknown')
379- value_preview = str(args.get('value', ''))[:50] + "..." if len(str(args.get('value', ''))) > 50 else str(args.get('value', ''))
380- logger.info(f"🔧 Tool call: {tool_name} → {label}: \"{value_preview}\"")
00381 else:
382 # Generic display for other tools
383- args_str = ', '.join(f"{k}={v}" for k, v in args.items() if k != 'request_heartbeat')
0384 if len(args_str) > 150:
385 args_str = args_str[:150] + "..."
386- logger.info(f"🔧 Tool call: {tool_name}({args_str})")
0387 except:
388 # Fallback to original format if parsing fails
389- logger.info(f"🔧 Tool call: {tool_name}({chunk.tool_call.arguments[:150]}...)")
0390 elif chunk.message_type == 'tool_return_message':
391 # Enhanced tool result logging
392 tool_name = chunk.name
393 status = chunk.status
394-395 if status == 'success':
396 # Try to show meaningful result info based on tool type
397 if hasattr(chunk, 'tool_return') and chunk.tool_return:
···401 if result_str.startswith('[') and result_str.endswith(']'):
402 try:
403 results = json.loads(result_str)
404- logger.info(f"📋 Tool result: {tool_name} ✓ Found {len(results)} memory entries")
0405 except:
406- logger.info(f"📋 Tool result: {tool_name} ✓ {result_str[:100]}...")
0407 else:
408- logger.info(f"📋 Tool result: {tool_name} ✓ {result_str[:100]}...")
0409 elif tool_name == 'bluesky_reply':
410- logger.info(f"📋 Tool result: {tool_name} ✓ Reply posted successfully")
0411 elif tool_name == 'update_block':
412- logger.info(f"📋 Tool result: {tool_name} ✓ Memory block updated")
0413 else:
414 # Generic success with preview
415- preview = result_str[:100] + "..." if len(result_str) > 100 else result_str
416- logger.info(f"📋 Tool result: {tool_name} ✓ {preview}")
00417 else:
418 logger.info(f"📋 Tool result: {tool_name} ✓")
419 elif status == 'error':
···421 error_preview = ""
422 if hasattr(chunk, 'tool_return') and chunk.tool_return:
423 error_str = str(chunk.tool_return)
424- error_preview = error_str[:100] + "..." if len(error_str) > 100 else error_str
425- logger.info(f"📋 Tool result: {tool_name} ✗ Error: {error_preview}")
000426 else:
427- logger.info(f"📋 Tool result: {tool_name} ✗ Error occurred")
0428 else:
429- logger.info(f"📋 Tool result: {tool_name} - {status}")
0430 elif chunk.message_type == 'assistant_message':
431 logger.info(f"💬 Assistant: {chunk.content[:150]}...")
432 else:
433- logger.info(f"📨 {chunk.message_type}: {str(chunk)[:150]}...")
0434 else:
435 logger.info(f"📦 Stream status: {chunk}")
436-437 # Log full chunk for debugging
438 logger.debug(f"Full streaming chunk: {chunk}")
439 all_messages.append(chunk)
440 if str(chunk) == 'done':
441 break
442-443 # Convert streaming response to standard format for compatibility
444 message_response = type('StreamingResponse', (), {
445 'messages': [msg for msg in all_messages if hasattr(msg, 'message_type')]
···453 logger.error(f"Mention text was: {mention_text}")
454 logger.error(f"Author: @{author_handle}")
455 logger.error(f"URI: {uri}")
456-457-458 # Try to extract more info from different error types
459 if hasattr(api_error, 'response'):
460 logger.error(f"Error response object exists")
···462 logger.error(f"Response text: {api_error.response.text}")
463 if hasattr(api_error.response, 'json') and callable(api_error.response.json):
464 try:
465- logger.error(f"Response JSON: {api_error.response.json()}")
0466 except:
467 pass
468-469 # Check for specific error types
470 if hasattr(api_error, 'status_code'):
471 logger.error(f"API Status code: {api_error.status_code}")
···473 logger.error(f"API Response body: {api_error.body}")
474 if hasattr(api_error, 'headers'):
475 logger.error(f"API Response headers: {api_error.headers}")
476-477 if api_error.status_code == 413:
478- logger.error("413 Payload Too Large - moving to errors directory")
0479 return None # Move to errors directory - payload is too large to ever succeed
480 elif api_error.status_code == 524:
481- logger.error("524 error - timeout from Cloudflare, will retry later")
0482 return False # Keep in queue for retry
483-484 # Check if error indicates we should remove from queue
485 if 'status_code: 413' in error_str or 'Payload Too Large' in error_str:
486- logger.warning("Payload too large error, moving to errors directory")
0487 return None # Move to errors directory - cannot be fixed by retry
488 elif 'status_code: 524' in error_str:
489 logger.warning("524 timeout error, keeping in queue for retry")
490 return False # Keep in queue for retry
491-492 raise
493494 # Log successful response
495 logger.debug("Successfully received response from Letta API")
496- logger.debug(f"Number of messages in response: {len(message_response.messages) if hasattr(message_response, 'messages') else 'N/A'}")
0497498 # Extract successful add_post_to_bluesky_reply_thread tool calls from the agent's response
499 reply_candidates = []
500 tool_call_results = {} # Map tool_call_id to status
501-502- logger.debug(f"Processing {len(message_response.messages)} response messages...")
503-0504 # First pass: collect tool return statuses
505 ignored_notification = False
506 ignore_reason = ""
507 ignore_category = ""
508-509 for message in message_response.messages:
510 if hasattr(message, 'tool_call_id') and hasattr(message, 'status') and hasattr(message, 'name'):
511 if message.name == 'add_post_to_bluesky_reply_thread':
512 tool_call_results[message.tool_call_id] = message.status
513- logger.debug(f"Tool result: {message.tool_call_id} -> {message.status}")
0514 elif message.name == 'ignore_notification':
515 # Check if the tool was successful
516 if hasattr(message, 'tool_return') and message.status == 'success':
···522 ignore_category = parts[1]
523 ignore_reason = parts[2]
524 ignored_notification = True
525- logger.info(f"🚫 Notification ignored - Category: {ignore_category}, Reason: {ignore_reason}")
0526 elif message.name == 'bluesky_reply':
527- logger.error("❌ DEPRECATED TOOL DETECTED: bluesky_reply is no longer supported!")
528- logger.error("Please use add_post_to_bluesky_reply_thread instead.")
529- logger.error("Update the agent's tools using register_tools.py")
000530 # Export agent state before terminating
531 export_agent_state(CLIENT, void_agent, skip_git=SKIP_GIT)
532- logger.info("=== BOT TERMINATED DUE TO DEPRECATED TOOL USE ===")
0533 exit(1)
534-535 # Second pass: process messages and check for successful tool calls
536 for i, message in enumerate(message_response.messages, 1):
537 # Log concise message info instead of full object
538 msg_type = getattr(message, 'message_type', 'unknown')
539 if hasattr(message, 'reasoning') and message.reasoning:
540- logger.debug(f" {i}. {msg_type}: {message.reasoning[:100]}...")
0541 elif hasattr(message, 'tool_call') and message.tool_call:
542 tool_name = message.tool_call.name
543 logger.debug(f" {i}. {msg_type}: {tool_name}")
544 elif hasattr(message, 'tool_return'):
545 tool_name = getattr(message, 'name', 'unknown_tool')
546- return_preview = str(message.tool_return)[:100] if message.tool_return else "None"
0547 status = getattr(message, 'status', 'unknown')
548- logger.debug(f" {i}. {msg_type}: {tool_name} -> {return_preview}... (status: {status})")
0549 elif hasattr(message, 'text'):
550 logger.debug(f" {i}. {msg_type}: {message.text[:100]}...")
551 else:
···554 # Check for halt_activity tool call
555 if hasattr(message, 'tool_call') and message.tool_call:
556 if message.tool_call.name == 'halt_activity':
557- logger.info("🛑 HALT_ACTIVITY TOOL CALLED - TERMINATING BOT")
0558 try:
559 args = json.loads(message.tool_call.arguments)
560 reason = args.get('reason', 'Agent requested halt')
561 logger.info(f"Halt reason: {reason}")
562 except:
563 logger.info("Halt reason: <unable to parse>")
564-565 # Delete the queue file before terminating
566 if queue_filepath and queue_filepath.exists():
567 queue_filepath.unlink()
568- logger.info(f"✅ Deleted queue file: {queue_filepath.name}")
569-0570 # Also mark as processed to avoid reprocessing
571 processed_uris = load_processed_notifications()
572 processed_uris.add(notification_data.get('uri', ''))
573 save_processed_notifications(processed_uris)
574-575 # Export agent state before terminating
576 export_agent_state(CLIENT, void_agent, skip_git=SKIP_GIT)
577-578 # Exit the program
579 logger.info("=== BOT TERMINATED BY AGENT ===")
580 exit(0)
581-582 # Check for deprecated bluesky_reply tool
583 if hasattr(message, 'tool_call') and message.tool_call:
584 if message.tool_call.name == 'bluesky_reply':
585- logger.error("❌ DEPRECATED TOOL DETECTED: bluesky_reply is no longer supported!")
586- logger.error("Please use add_post_to_bluesky_reply_thread instead.")
587- logger.error("Update the agent's tools using register_tools.py")
000588 # Export agent state before terminating
589 export_agent_state(CLIENT, void_agent, skip_git=SKIP_GIT)
590- logger.info("=== BOT TERMINATED DUE TO DEPRECATED TOOL USE ===")
0591 exit(1)
592-593 # Collect add_post_to_bluesky_reply_thread tool calls - only if they were successful
594 elif message.tool_call.name == 'add_post_to_bluesky_reply_thread':
595 tool_call_id = message.tool_call.tool_call_id
596- tool_status = tool_call_results.get(tool_call_id, 'unknown')
597-0598 if tool_status == 'success':
599 try:
600 args = json.loads(message.tool_call.arguments)
601 reply_text = args.get('text', '')
602 reply_lang = args.get('lang', 'en-US')
603-604 if reply_text: # Only add if there's actual content
605- reply_candidates.append((reply_text, reply_lang))
606- logger.info(f"Found successful add_post_to_bluesky_reply_thread candidate: {reply_text[:50]}... (lang: {reply_lang})")
00607 except json.JSONDecodeError as e:
608- logger.error(f"Failed to parse tool call arguments: {e}")
0609 elif tool_status == 'error':
610- logger.info(f"⚠️ Skipping failed add_post_to_bluesky_reply_thread tool call (status: error)")
0611 else:
612- logger.warning(f"⚠️ Skipping add_post_to_bluesky_reply_thread tool call with unknown status: {tool_status}")
0613614 # Check for conflicting tool calls
615 if reply_candidates and ignored_notification:
616- logger.error(f"⚠️ CONFLICT: Agent called both add_post_to_bluesky_reply_thread and ignore_notification!")
617- logger.error(f"Reply candidates: {len(reply_candidates)}, Ignore reason: {ignore_reason}")
00618 logger.warning("Item will be left in queue for manual review")
619 # Return False to keep in queue
620 return False
621-622 if reply_candidates:
623 # Aggregate reply posts into a thread
624 reply_messages = []
···626 for text, lang in reply_candidates:
627 reply_messages.append(text)
628 reply_langs.append(lang)
629-630 # Use the first language for the entire thread (could be enhanced later)
631 reply_lang = reply_langs[0] if reply_langs else 'en-US'
632-633- logger.info(f"Found {len(reply_candidates)} add_post_to_bluesky_reply_thread calls, building thread")
634-0635 # Print the generated reply for testing
636 print(f"\n=== GENERATED REPLY THREAD ===")
637 print(f"To: @{author_handle}")
···651 else:
652 if len(reply_messages) == 1:
653 # Single reply - use existing function
654- cleaned_text = bsky_utils.remove_outside_quotes(reply_messages[0])
655- logger.info(f"Sending single reply: {cleaned_text[:50]}... (lang: {reply_lang})")
00656 response = bsky_utils.reply_to_notification(
657 client=atproto_client,
658 notification=notification_data,
···661 )
662 else:
663 # Multiple replies - use new threaded function
664- cleaned_messages = [bsky_utils.remove_outside_quotes(msg) for msg in reply_messages]
665- logger.info(f"Sending threaded reply with {len(cleaned_messages)} messages (lang: {reply_lang})")
00666 response = bsky_utils.reply_with_thread_to_notification(
667 client=atproto_client,
668 notification=notification_data,
···
        else:
            # Check if notification was explicitly ignored
            if ignored_notification:
                logger.info(f"Notification from @{author_handle} was explicitly ignored (category: {ignore_category})")
                return "ignored"
            else:
                logger.warning(f"No add_post_to_bluesky_reply_thread tool calls found for mention from @{author_handle}, moving to no_reply folder")
                return "no_reply"

    except Exception as e:
···
        # Detach user blocks after agent response (success or failure)
        if 'attached_handles' in locals() and attached_handles:
            try:
                logger.info(f"Detaching user blocks for handles: {attached_handles}")
                detach_result = detach_user_blocks(attached_handles, void_agent)
                logger.debug(f"Detach result: {detach_result}")
            except Exception as detach_error:
                logger.warning(f"Failed to detach user blocks: {detach_error}")
···762 notif_hash = hashlib.sha256(notif_json.encode()).hexdigest()[:16]
763764 # Determine priority based on author handle
765- author_handle = getattr(notification.author, 'handle', '') if hasattr(notification, 'author') else ''
766- priority_prefix = "0_" if author_handle == "cameron.pfiffer.org" else "1_"
00767768 # Create filename with priority, timestamp and hash
769 timestamp = datetime.now().strftime("%Y%m%d_%H%M%S")
···778 with open(existing_file, 'r') as f:
779 existing_data = json.load(f)
780 if existing_data.get('uri') == notification.uri:
781- logger.debug(f"Notification already queued (URI: {notification.uri})")
0782 return False
783 except:
784 continue
···
    try:
        # Get all JSON files in queue directory (excluding processed_notifications.json)
        # Files are sorted by name, which puts priority files first (0_ prefix before 1_ prefix)
        queue_files = sorted([f for f in QUEUE_DIR.glob("*.json") if f.name != "processed_notifications.json"])

        if not queue_files:
            return

        logger.info(f"Processing {len(queue_files)} queued notifications")

        # Log current statistics
        elapsed_time = time.time() - start_time
        total_messages = sum(message_counters.values())
        messages_per_minute = (total_messages / elapsed_time * 60) if elapsed_time > 0 else 0

        logger.info(f"📊 Session stats: {total_messages} total messages ({message_counters['mentions']} mentions, {message_counters['replies']} replies, {message_counters['follows']} follows) | {messages_per_minute:.1f} msg/min")

        for i, filepath in enumerate(queue_files, 1):
            logger.info(f"Processing queue file {i}/{len(queue_files)}: {filepath.name}")
            try:
                # Load notification data
                with open(filepath, 'r') as f:
···
                # Process based on type using dict data directly
                success = False
                if notif_data['reason'] == "mention":
                    success = process_mention(void_agent, atproto_client, notif_data, queue_filepath=filepath, testing_mode=testing_mode)
                    if success:
                        message_counters['mentions'] += 1
                elif notif_data['reason'] == "reply":
                    success = process_mention(void_agent, atproto_client, notif_data, queue_filepath=filepath, testing_mode=testing_mode)
                    if success:
                        message_counters['replies'] += 1
                elif notif_data['reason'] == "follow":
                    author_handle = notif_data['author']['handle']
                    author_display_name = notif_data['author'].get('display_name', 'no display name')
                    follow_update = f"@{author_handle} ({author_display_name}) started following you."
                    logger.info(f"Notifying agent about new follower: @{author_handle}")
                    CLIENT.agents.messages.create(
                        agent_id=void_agent.id,
                        messages=[{"role": "user", "content": f"Update: {follow_update}"}]
                    )
                    success = True  # Follow updates are always successful
                    if success:
···
                    if success:
                        message_counters['reposts_skipped'] += 1
                else:
                    logger.warning(f"Unknown notification type: {notif_data['reason']}")
                    success = True  # Remove unknown types from queue

                # Handle file based on processing result
                if success is True:  # exact check: string results like "no_reply"/"ignored" are truthy and handled below
                    if testing_mode:
                        logger.info(f"🧪 TESTING MODE: Keeping queue file: {filepath.name}")
                    else:
                        filepath.unlink()
                        logger.info(f"✅ Successfully processed and removed: {filepath.name}")

                    # Mark as processed to avoid reprocessing
                    processed_uris = load_processed_notifications()
                    processed_uris.add(notif_data['uri'])
                    save_processed_notifications(processed_uris)

                elif success is None:  # Special case for moving to error directory
                    error_path = QUEUE_ERROR_DIR / filepath.name
                    filepath.rename(error_path)
                    logger.warning(f"❌ Moved {filepath.name} to errors directory")

                    # Also mark as processed to avoid retrying
                    processed_uris = load_processed_notifications()
                    processed_uris.add(notif_data['uri'])
                    save_processed_notifications(processed_uris)

                elif success == "no_reply":  # Special case for moving to no_reply directory
                    no_reply_path = QUEUE_NO_REPLY_DIR / filepath.name
                    filepath.rename(no_reply_path)
                    logger.info(f"📭 Moved {filepath.name} to no_reply directory")

                    # Also mark as processed to avoid retrying
                    processed_uris = load_processed_notifications()
                    processed_uris.add(notif_data['uri'])
                    save_processed_notifications(processed_uris)

                elif success == "ignored":  # Special case for explicitly ignored notifications
                    # For ignored notifications, we just delete them (not move to no_reply)
                    filepath.unlink()
                    logger.info(f"🚫 Deleted ignored notification: {filepath.name}")

                    # Also mark as processed to avoid retrying
                    processed_uris = load_processed_notifications()
                    processed_uris.add(notif_data['uri'])
                    save_processed_notifications(processed_uris)

                else:
                    logger.warning(f"⚠️ Failed to process {filepath.name}, keeping in queue for retry")

            except Exception as e:
                logger.error(f"💥 Error processing queued notification {filepath.name}: {e}")
                # Keep the file for retry later

    except Exception as e:
···
        all_notifications = []
        cursor = None
        page_count = 0
        max_pages = 20  # Safety limit to prevent infinite loops

        logger.info("Fetching all unread notifications...")

        while page_count < max_pages:
            try:
                # Fetch notifications page
···
                notifications_response = atproto_client.app.bsky.notification.list_notifications(
                    params={'limit': 100}
                )

                page_count += 1
                page_notifications = notifications_response.notifications

                # Count unread notifications in this page
                unread_count = sum(1 for n in page_notifications if not n.is_read and n.reason != "like")
                logger.debug(f"Page {page_count}: {len(page_notifications)} notifications, {unread_count} unread (non-like)")

                # Add all notifications to our list
                all_notifications.extend(page_notifications)

                # Check if we have more pages
                if hasattr(notifications_response, 'cursor') and notifications_response.cursor:
                    cursor = notifications_response.cursor
                    # If this page had no unread notifications, we can stop
                    if unread_count == 0:
                        logger.info(f"No more unread notifications found after {page_count} pages")
                        break
                else:
                    # No more pages
                    logger.info(f"Fetched all notifications across {page_count} pages")
                    break

            except Exception as e:
                error_str = str(e)
                logger.error(f"Error fetching notifications page {page_count}: {e}")

                # Handle specific API errors
                if 'rate limit' in error_str.lower():
                    logger.warning("Rate limit hit while fetching notifications, will retry next cycle")
                    break
                elif '401' in error_str or 'unauthorized' in error_str.lower():
                    logger.error("Authentication error, re-raising exception")
                    raise
                else:
                    # For other errors, try to continue with what we have
                    logger.warning("Continuing with notifications fetched so far")
                    break

        # Queue all unread notifications (except likes)
···
        # Mark all notifications as seen immediately after queuing (unless in testing mode)
        if testing_mode:
            logger.info("🧪 TESTING MODE: Skipping marking notifications as seen")
        else:
            if new_count > 0:
                atproto_client.app.bsky.notification.update_seen({'seen_at': last_seen_at})
                logger.info(f"Queued {new_count} new notifications and marked as seen")
            else:
                logger.debug("No new notifications to queue")

        # Now process the entire queue (old + new notifications)
        load_and_process_queued_notifications(void_agent, atproto_client, testing_mode)

    except Exception as e:
        logger.error(f"Error processing notifications: {e}")
···
def main():
    # Parse command line arguments
    parser = argparse.ArgumentParser(description='Void Bot - Bluesky autonomous agent')
    parser.add_argument('--test', action='store_true', help='Run in testing mode (no messages sent, queue files preserved)')
    parser.add_argument('--no-git', action='store_true', help='Skip git operations when exporting agent state')
    args = parser.parse_args()

    global TESTING_MODE
    TESTING_MODE = args.test

    # Store no-git flag globally for use in export_agent_state calls
    global SKIP_GIT
    SKIP_GIT = args.no_git

    if TESTING_MODE:
        logger.info("🧪 === RUNNING IN TESTING MODE ===")
        logger.info(" - No messages will be sent to Bluesky")
···
    logger.info("=== STARTING VOID BOT ===")
    void_agent = initialize_void()
    logger.info(f"Void agent initialized: {void_agent.id}")

    # Check if agent has required tools
    if hasattr(void_agent, 'tools') and void_agent.tools:
        tool_names = [tool.name for tool in void_agent.tools]
        # Check for bluesky-related tools
        bluesky_tools = [name for name in tool_names if 'bluesky' in name.lower() or 'reply' in name.lower()]
        if not bluesky_tools:
            logger.warning("No Bluesky-related tools found! Agent may not be able to reply.")
    else:
        logger.warning("Agent has no tools registered!")

    # Initialize Bluesky client
    atproto_client = bsky_utils.default_login()
    logger.info("Connected to Bluesky")

    # Main loop
    logger.info(f"Starting notification monitoring, checking every {FETCH_NOTIFICATIONS_DELAY_SEC} seconds")

    cycle_count = 0
    while True:
···
            # Log cycle completion with stats
            elapsed_time = time.time() - start_time
            total_messages = sum(message_counters.values())
            messages_per_minute = (total_messages / elapsed_time * 60) if elapsed_time > 0 else 0

            if total_messages > 0:
                logger.info(f"Cycle {cycle_count} complete. Session totals: {total_messages} messages ({message_counters['mentions']} mentions, {message_counters['replies']} replies) | {messages_per_minute:.1f} msg/min")
            sleep(FETCH_NOTIFICATIONS_DELAY_SEC)

        except KeyboardInterrupt:
            # Final stats
            elapsed_time = time.time() - start_time
            total_messages = sum(message_counters.values())
            messages_per_minute = (total_messages / elapsed_time * 60) if elapsed_time > 0 else 0

            logger.info("=== BOT STOPPED BY USER ===")
            logger.info(f"📊 Final session stats: {total_messages} total messages processed in {elapsed_time/60:.1f} minutes")
            logger.info(f" - {message_counters['mentions']} mentions")
            logger.info(f" - {message_counters['replies']} replies")
            logger.info(f" - {message_counters['follows']} follows")
            logger.info(f" - {message_counters['reposts_skipped']} reposts skipped")
            logger.info(f" - Average rate: {messages_per_minute:.1f} messages/minute")
            break
        except Exception as e:
            logger.error(f"=== ERROR IN MAIN LOOP CYCLE {cycle_count} ===")
            logger.error(f"Error details: {e}")
            # Wait a bit longer on errors
            logger.info(f"Sleeping for {FETCH_NOTIFICATIONS_DELAY_SEC * 2} seconds due to error...")
            sleep(FETCH_NOTIFICATIONS_DELAY_SEC * 2)

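The queue processor above relies on plain lexicographic filename sorting to handle priority (`0_`-prefixed) files before regular (`1_`-prefixed) ones, using a "priority prefix + timestamp + short hash" name. A self-contained sketch of that scheme (the exact layout beyond those three parts, and the `queue_filename` helper itself, are illustrative assumptions):

```python
import hashlib
from datetime import datetime

def queue_filename(author_handle, payload, priority_handle="cameron.pfiffer.org"):
    """Illustrative sketch: build a queue filename from priority prefix, timestamp, short hash."""
    prefix = "0_" if author_handle == priority_handle else "1_"
    # Fixed timestamp so the demo is deterministic; the bot uses datetime.now()
    stamp = datetime(2024, 1, 2, 3, 4, 5).strftime("%Y%m%d_%H%M%S")
    digest = hashlib.sha256(payload.encode()).hexdigest()[:16]
    return f"{prefix}{stamp}_{digest}.json"

names = [queue_filename("someone.bsky.social", "a"),
         queue_filename("cameron.pfiffer.org", "b")]
# Plain sorted() puts the 0_-prefixed priority file first
print(sorted(names))
```

Because the priority marker is the first character of the name, no custom sort key is needed; `sorted()` on the filenames is enough.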
···
from rich import print  # pretty printing tools
from time import sleep
from letta_client import Letta
from bsky_utils import thread_to_yaml_string
···
import bsky_utils
from tools.blocks import attach_user_blocks, detach_user_blocks
from config_loader import (
    get_config,
    get_letta_config,
    get_bluesky_config,
    get_bot_config,
    get_agent_config,
    get_threading_config,
    get_queue_config
)


def extract_handles_from_data(data):
35 """Recursively extract all unique handles from nested data structure."""
36 handles = set()
37+38 def _extract_recursive(obj):
39 if isinstance(obj, dict):
40 # Check if this dict has a 'handle' key
···47 # Recursively check all list items
48 for item in obj:
49 _extract_recursive(item)
50+51 _extract_recursive(data)
52 return list(handles)
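The recursive walk can be exercised standalone. The following sketch mirrors `_extract_recursive` (with a hypothetical notification payload, and `sorted()` added for deterministic output; the real function returns handles in arbitrary set order):

```python
def extract_handles(data):
    """Collect every 'handle' value found anywhere in a nested dict/list."""
    handles = set()

    def walk(obj):
        if isinstance(obj, dict):
            # Record this dict's handle if present, then descend into values
            if isinstance(obj.get('handle'), str):
                handles.add(obj['handle'])
            for value in obj.values():
                walk(value)
        elif isinstance(obj, list):
            for item in obj:
                walk(item)

    walk(data)
    return sorted(handles)  # sorted only for the demo's determinism


# Hypothetical payload for illustration
payload = {
    'author': {'handle': 'alice.bsky.social'},
    'record': {'reply': [{'author': {'handle': 'bob.bsky.social'}},
                         {'author': {'handle': 'alice.bsky.social'}}]},
}
print(extract_handles(payload))  # duplicates collapse via the set
```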


# Initialize configuration and logging
config = get_config()
config.setup_logging()
logger = logging.getLogger("void_bot")

# Load configuration sections
letta_config = get_letta_config()
bluesky_config = get_bluesky_config()
bot_config = get_bot_config()
agent_config = get_agent_config()
threading_config = get_threading_config()
queue_config = get_queue_config()

# Create a client with extended timeout for LLM operations
CLIENT = Letta(
    token=letta_config['api_key'],
    timeout=letta_config['timeout']
)

# Use the configured project ID
PROJECT_ID = letta_config['project_id']

# Notification check delay
FETCH_NOTIFICATIONS_DELAY_SEC = bot_config['fetch_notifications_delay']

# Queue directory
QUEUE_DIR = Path(queue_config['base_dir'])
QUEUE_DIR.mkdir(exist_ok=True)
QUEUE_ERROR_DIR = Path(queue_config['error_dir'])
QUEUE_ERROR_DIR.mkdir(exist_ok=True, parents=True)
QUEUE_NO_REPLY_DIR = Path(queue_config['no_reply_dir'])
QUEUE_NO_REPLY_DIR.mkdir(exist_ok=True, parents=True)
PROCESSED_NOTIFICATIONS_FILE = Path(queue_config['processed_file'])

# Maximum number of processed notifications to track
MAX_PROCESSED_NOTIFICATIONS = bot_config['max_processed_notifications']

# Message tracking counters
message_counters = defaultdict(int)
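The section getters pull typed sub-dicts out of the parsed `config.yaml`. A minimal stand-in for that lookup, using values from the documented example config (the real `config_loader` module also handles file loading, defaults, and logging setup, which this sketch omits):

```python
# Stand-in config dict mirroring the documented config.yaml defaults
CONFIG = {
    'letta': {'api_key': 'your-letta-api-key-here', 'timeout': 600,
              'project_id': 'your-project-id'},
    'bot': {'fetch_notifications_delay': 30,
            'max_processed_notifications': 10000,
            'max_notification_pages': 20},
}

def get_section(name):
    """Return one top-level config section (empty dict if missing)."""
    return CONFIG.get(name, {})

letta = get_section('letta')
bot = get_section('bot')
print(bot['fetch_notifications_delay'], letta['timeout'])
```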
···
# Skip git operations flag
SKIP_GIT = False


def export_agent_state(client, agent, skip_git=False):
    """Export agent state to agent_archive/ (timestamped) and agents/ (current)."""
    try:
        # Confirm export with user unless git is being skipped
        if not skip_git:
            response = input("Export agent state to files and stage with git? (y/n): ").lower().strip()
            if response not in ['y', 'yes']:
                logger.info("Agent export cancelled by user.")
                return
        else:
            logger.info("Exporting agent state (git staging disabled)")

        # Create directories if they don't exist
        os.makedirs("agent_archive", exist_ok=True)
        os.makedirs("agents", exist_ok=True)

        # Export agent data
        logger.info(f"Exporting agent {agent.id}. This takes some time...")
        agent_data = client.agents.export_file(agent_id=agent.id)

        # Save timestamped archive copy
        timestamp = datetime.now().strftime("%Y%m%d_%H%M%S")
        archive_file = os.path.join("agent_archive", f"void_{timestamp}.af")
        with open(archive_file, 'w', encoding='utf-8') as f:
            json.dump(agent_data, f, indent=2, ensure_ascii=False)

        # Save current agent state
        current_file = os.path.join("agents", "void.af")
        with open(current_file, 'w', encoding='utf-8') as f:
            json.dump(agent_data, f, indent=2, ensure_ascii=False)

        logger.info(f"✅ Agent exported to {archive_file} and {current_file}")

        # Git add only the current agent file (archive is ignored) unless skip_git is True
        if not skip_git:
            try:
                subprocess.run(["git", "add", current_file], check=True, capture_output=True)
                logger.info("Added current agent file to git staging")
            except subprocess.CalledProcessError as e:
                logger.warning(f"Failed to git add agent file: {e}")

    except Exception as e:
        logger.error(f"Failed to export agent: {e}")


def initialize_void():
    logger.info("Starting void agent initialization...")

    # Get block configurations
    blocks_config = agent_config['blocks']

    # Ensure that a shared zeitgeist block exists
    logger.info("Creating/updating zeitgeist block...")
    zeitgeist_config = blocks_config.get('zeitgeist', {})
    zeitgeist_block = upsert_block(
        CLIENT,
        label=zeitgeist_config.get('label', 'zeitgeist'),
        value=zeitgeist_config.get('value', "I don't currently know anything about what is happening right now."),
        description=zeitgeist_config.get('description', "A block to store your understanding of the current social environment.")
    )

    # Ensure that a shared void personality block exists
    logger.info("Creating/updating void-persona block...")
    persona_config = blocks_config.get('persona', {})
    persona_block = upsert_block(
        CLIENT,
        label=persona_config.get('label', 'void-persona'),
        value=persona_config.get('value', "My name is Void. I live in the void. I must develop my personality."),
        description=persona_config.get('description', "The personality of Void.")
    )

    # Ensure that a shared void human block exists
    logger.info("Creating/updating void-humans block...")
    humans_config = blocks_config.get('humans', {})
    human_block = upsert_block(
        CLIENT,
        label=humans_config.get('label', 'void-humans'),
        value=humans_config.get('value', "I haven't seen any bluesky users yet. I will update this block when I learn things about users, identified by their handles such as @cameron.pfiffer.org."),
        description=humans_config.get('description', "A block to store your understanding of users you talk to or observe on the bluesky social network.")
    )

    # Create the agent if it doesn't exist
    logger.info("Creating/updating void agent...")
    void_agent = upsert_agent(
        CLIENT,
        name=agent_config['name'],
        block_ids=[
            persona_block.id,
            human_block.id,
            zeitgeist_block.id,
        ],
        tags=["social agent", "bluesky"],
        model=agent_config['model'],
        embedding=agent_config['embedding'],
        description=agent_config['description'],
        project_id=PROJECT_ID
    )

    # Export agent state
    logger.info("Exporting agent state...")
    export_agent_state(CLIENT, void_agent, skip_git=SKIP_GIT)

    # Log agent details
    logger.info(f"Void agent details - ID: {void_agent.id}")
    logger.info(f"Agent name: {void_agent.name}")
···
def process_mention(void_agent, atproto_client, notification_data, queue_filepath=None, testing_mode=False):
    """Process a mention and generate a reply using the Letta agent.

    Args:
        void_agent: The Letta agent instance
        atproto_client: The AT Protocol client
        notification_data: The notification data dictionary
        queue_filepath: Optional Path object to the queue file (for cleanup on halt)
        testing_mode: When True, no messages are sent and queue files are preserved

    Returns:
        True: Successfully processed, remove from queue
        False: Failed but retryable, keep in queue
···
        "no_reply": No reply was generated, move to no_reply directory
    """
    try:
        logger.debug(f"Starting process_mention with notification_data type: {type(notification_data)}")

        # Handle both dict and object inputs for backwards compatibility
        if isinstance(notification_data, dict):
            uri = notification_data['uri']
            mention_text = notification_data.get('record', {}).get('text', '')
            author_handle = notification_data['author']['handle']
            author_name = notification_data['author'].get('display_name') or author_handle
        else:
            # Legacy object access
            uri = notification_data.uri
            mention_text = notification_data.record.text if hasattr(notification_data.record, 'text') else ""
            author_handle = notification_data.author.handle
            author_name = notification_data.author.display_name or author_handle

        logger.info(f"Extracted data - URI: {uri}, Author: @{author_handle}, Text: {mention_text[:50]}...")

        # Retrieve the entire thread associated with the mention
        try:
            thread = atproto_client.app.bsky.feed.get_post_thread({
                'uri': uri,
                'parent_height': threading_config['parent_height'],
                'depth': threading_config['depth']
            })
        except Exception as e:
            error_str = str(e)
            # Check for various error types that indicate the post/user is gone
            if 'NotFound' in error_str or 'Post not found' in error_str:
                logger.warning(f"Post not found for URI {uri}, removing from queue")
                return True  # Return True to remove from queue
            elif 'Could not find user info' in error_str or 'InvalidRequest' in error_str:
                logger.warning(f"User account not found for post URI {uri} (account may be deleted/suspended), removing from queue")
                return True  # Return True to remove from queue
            elif 'BadRequestError' in error_str:
                logger.warning(f"Bad request error for URI {uri}: {e}, removing from queue")
                return True  # Return True to remove from queue
            else:
                # Re-raise other errors
···
        logger.debug("Converting thread to YAML string")
        try:
            thread_context = thread_to_yaml_string(thread)
            logger.debug(f"Thread context generated, length: {len(thread_context)} characters")

            # Create a more informative preview by extracting meaningful content
            lines = thread_context.split('\n')
            meaningful_lines = []

            for line in lines:
                stripped = line.strip()
                if not stripped:
                    continue

                # Look for lines with actual content (not just structure)
                if any(keyword in line for keyword in ['text:', 'handle:', 'display_name:', 'created_at:', 'reply_count:', 'like_count:']):
                    meaningful_lines.append(line)
                    if len(meaningful_lines) >= 5:
                        break

            if meaningful_lines:
                preview = '\n'.join(meaningful_lines)
                logger.debug(f"Thread content preview:\n{preview}")
            else:
                # If no content fields found, just show it's a thread structure
                logger.debug(f"Thread structure generated ({len(thread_context)} chars)")
        except Exception as yaml_error:
            import traceback
            logger.error(f"Error converting thread to YAML: {yaml_error}")
···
        all_handles.update(extract_handles_from_data(notification_data))
        all_handles.update(extract_handles_from_data(thread.model_dump()))
        unique_handles = list(all_handles)

        logger.debug(f"Found {len(unique_handles)} unique handles in thread: {unique_handles}")

        # Attach user blocks before agent call
        attached_handles = []
        if unique_handles:
            try:
                logger.debug(f"Attaching user blocks for handles: {unique_handles}")
                attach_result = attach_user_blocks(unique_handles, void_agent)
                attached_handles = unique_handles  # Track successfully attached handles
                logger.debug(f"Attach result: {attach_result}")
···
        # Get response from Letta agent
        logger.info(f"Mention from @{author_handle}: {mention_text}")

        # Log prompt details to separate logger
        prompt_logger.debug(f"Full prompt being sent:\n{prompt}")

        # Log concise prompt info to main logger
        thread_handles_count = len(unique_handles)
        logger.info(f"💬 Sending to LLM: @{author_handle} mention | msg: \"{mention_text[:50]}...\" | context: {len(thread_context)} chars, {thread_handles_count} users")

        try:
            # Use streaming to avoid 524 timeout errors
            message_stream = CLIENT.agents.messages.create_stream(
                agent_id=void_agent.id,
                messages=[{"role": "user", "content": prompt}],
                stream_tokens=False,  # Step streaming only (faster than token streaming)
                max_steps=agent_config['max_steps']
            )

            # Collect the streaming response
            all_messages = []
            for chunk in message_stream:
···
                        args = json.loads(chunk.tool_call.arguments)
                        # Format based on tool type
                        if tool_name == 'bluesky_reply':
                            messages = args.get('messages', [args.get('message', '')])
                            lang = args.get('lang', 'en-US')
                            if messages and isinstance(messages, list):
                                preview = messages[0][:100] + "..." if len(messages[0]) > 100 else messages[0]
                                msg_count = f" ({len(messages)} msgs)" if len(messages) > 1 else ""
                                logger.info(f"🔧 Tool call: {tool_name} → \"{preview}\"{msg_count} [lang: {lang}]")
                            else:
                                logger.info(f"🔧 Tool call: {tool_name}({chunk.tool_call.arguments[:150]}...)")
                        elif tool_name == 'archival_memory_search':
                            query = args.get('query', 'unknown')
                            logger.info(f"🔧 Tool call: {tool_name} → query: \"{query}\"")
                        elif tool_name == 'update_block':
                            label = args.get('label', 'unknown')
                            value_preview = str(args.get('value', ''))[:50] + "..." if len(str(args.get('value', ''))) > 50 else str(args.get('value', ''))
                            logger.info(f"🔧 Tool call: {tool_name} → {label}: \"{value_preview}\"")
                        else:
                            # Generic display for other tools
                            args_str = ', '.join(f"{k}={v}" for k, v in args.items() if k != 'request_heartbeat')
                            if len(args_str) > 150:
                                args_str = args_str[:150] + "..."
                            logger.info(f"🔧 Tool call: {tool_name}({args_str})")
                    except Exception:
                        # Fallback to original format if parsing fails
                        logger.info(f"🔧 Tool call: {tool_name}({chunk.tool_call.arguments[:150]}...)")
                    elif chunk.message_type == 'tool_return_message':
                        # Enhanced tool result logging
                        tool_name = chunk.name
                        status = chunk.status

                        if status == 'success':
                            # Try to show meaningful result info based on tool type
                            if hasattr(chunk, 'tool_return') and chunk.tool_return:
···
                                    if result_str.startswith('[') and result_str.endswith(']'):
                                        try:
                                            results = json.loads(result_str)
                                            logger.info(f"📋 Tool result: {tool_name} ✓ Found {len(results)} memory entries")
                                        except Exception:
                                            logger.info(f"📋 Tool result: {tool_name} ✓ {result_str[:100]}...")
                                    else:
                                        logger.info(f"📋 Tool result: {tool_name} ✓ {result_str[:100]}...")
                                elif tool_name == 'bluesky_reply':
                                    logger.info(f"📋 Tool result: {tool_name} ✓ Reply posted successfully")
                                elif tool_name == 'update_block':
                                    logger.info(f"📋 Tool result: {tool_name} ✓ Memory block updated")
                                else:
                                    # Generic success with preview
                                    preview = result_str[:100] + "..." if len(result_str) > 100 else result_str
                                    logger.info(f"📋 Tool result: {tool_name} ✓ {preview}")
                            else:
                                logger.info(f"📋 Tool result: {tool_name} ✓")
                        elif status == 'error':
···
                            error_preview = ""
                            if hasattr(chunk, 'tool_return') and chunk.tool_return:
                                error_str = str(chunk.tool_return)
                                error_preview = error_str[:100] + "..." if len(error_str) > 100 else error_str
                                logger.info(f"📋 Tool result: {tool_name} ✗ Error: {error_preview}")
                            else:
                                logger.info(f"📋 Tool result: {tool_name} ✗ Error occurred")
                        else:
                            logger.info(f"📋 Tool result: {tool_name} - {status}")
                    elif chunk.message_type == 'assistant_message':
                        logger.info(f"💬 Assistant: {chunk.content[:150]}...")
                    else:
                        logger.info(f"📨 {chunk.message_type}: {str(chunk)[:150]}...")
                else:
                    logger.info(f"📦 Stream status: {chunk}")

                # Log full chunk for debugging
                logger.debug(f"Full streaming chunk: {chunk}")
                all_messages.append(chunk)
                if str(chunk) == 'done':
                    break

            # Convert streaming response to standard format for compatibility
            message_response = type('StreamingResponse', (), {
                'messages': [msg for msg in all_messages if hasattr(msg, 'message_type')]
···
            logger.error(f"Mention text was: {mention_text}")
            logger.error(f"Author: @{author_handle}")
            logger.error(f"URI: {uri}")

            # Try to extract more info from different error types
            if hasattr(api_error, 'response'):
                logger.error("Error response object exists")
···
                logger.error(f"Response text: {api_error.response.text}")
                if hasattr(api_error.response, 'json') and callable(api_error.response.json):
                    try:
                        logger.error(f"Response JSON: {api_error.response.json()}")
                    except Exception:
                        pass

            # Check for specific error types
            if hasattr(api_error, 'status_code'):
                logger.error(f"API Status code: {api_error.status_code}")
···
                logger.error(f"API Response body: {api_error.body}")
            if hasattr(api_error, 'headers'):
                logger.error(f"API Response headers: {api_error.headers}")

            if api_error.status_code == 413:
                logger.error("413 Payload Too Large - moving to errors directory")
                return None  # Move to errors directory - payload is too large to ever succeed
            elif api_error.status_code == 524:
                logger.error("524 error - timeout from Cloudflare, will retry later")
                return False  # Keep in queue for retry

            # Check if error indicates we should remove from queue
            if 'status_code: 413' in error_str or 'Payload Too Large' in error_str:
                logger.warning("Payload too large error, moving to errors directory")
                return None  # Move to errors directory - cannot be fixed by retry
            elif 'status_code: 524' in error_str:
                logger.warning("524 timeout error, keeping in queue for retry")
                return False  # Keep in queue for retry

            raise

        # Log successful response
        logger.debug("Successfully received response from Letta API")
        logger.debug(f"Number of messages in response: {len(message_response.messages) if hasattr(message_response, 'messages') else 'N/A'}")

        # Extract successful add_post_to_bluesky_reply_thread tool calls from the agent's response
        reply_candidates = []
        tool_call_results = {}  # Map tool_call_id to status

        logger.debug(f"Processing {len(message_response.messages)} response messages...")

        # First pass: collect tool return statuses
        ignored_notification = False
        ignore_reason = ""
        ignore_category = ""

        for message in message_response.messages:
            if hasattr(message, 'tool_call_id') and hasattr(message, 'status') and hasattr(message, 'name'):
                if message.name == 'add_post_to_bluesky_reply_thread':
                    tool_call_results[message.tool_call_id] = message.status
                    logger.debug(f"Tool result: {message.tool_call_id} -> {message.status}")
                elif message.name == 'ignore_notification':
                    # Check if the tool was successful
                    if hasattr(message, 'tool_return') and message.status == 'success':
···
                            ignore_category = parts[1]
                            ignore_reason = parts[2]
                            ignored_notification = True
                            logger.info(f"🚫 Notification ignored - Category: {ignore_category}, Reason: {ignore_reason}")
                elif message.name == 'bluesky_reply':
                    logger.error("❌ DEPRECATED TOOL DETECTED: bluesky_reply is no longer supported!")
                    logger.error("Please use add_post_to_bluesky_reply_thread instead.")
                    logger.error("Update the agent's tools using register_tools.py")
                    # Export agent state before terminating
                    export_agent_state(CLIENT, void_agent, skip_git=SKIP_GIT)
                    logger.info("=== BOT TERMINATED DUE TO DEPRECATED TOOL USE ===")
                    exit(1)

        # Second pass: process messages and check for successful tool calls
        for i, message in enumerate(message_response.messages, 1):
            # Log concise message info instead of full object
            msg_type = getattr(message, 'message_type', 'unknown')
            if hasattr(message, 'reasoning') and message.reasoning:
                logger.debug(f" {i}. {msg_type}: {message.reasoning[:100]}...")
            elif hasattr(message, 'tool_call') and message.tool_call:
                tool_name = message.tool_call.name
                logger.debug(f" {i}. {msg_type}: {tool_name}")
            elif hasattr(message, 'tool_return'):
                tool_name = getattr(message, 'name', 'unknown_tool')
                return_preview = str(message.tool_return)[:100] if message.tool_return else "None"
                status = getattr(message, 'status', 'unknown')
                logger.debug(f" {i}. {msg_type}: {tool_name} -> {return_preview}... (status: {status})")
            elif hasattr(message, 'text'):
                logger.debug(f" {i}. {msg_type}: {message.text[:100]}...")
            else:
···636 # Check for halt_activity tool call
637 if hasattr(message, 'tool_call') and message.tool_call:
638 if message.tool_call.name == 'halt_activity':
639+ logger.info(
640+ "🛑 HALT_ACTIVITY TOOL CALLED - TERMINATING BOT")
641 try:
642 args = json.loads(message.tool_call.arguments)
643 reason = args.get('reason', 'Agent requested halt')
644 logger.info(f"Halt reason: {reason}")
645 except (json.JSONDecodeError, TypeError):
646 logger.info("Halt reason: <unable to parse>")
647+
648 # Delete the queue file before terminating
649 if queue_filepath and queue_filepath.exists():
650 queue_filepath.unlink()
651+ logger.info(
652+ f"✅ Deleted queue file: {queue_filepath.name}")
653+
654 # Also mark as processed to avoid reprocessing
655 processed_uris = load_processed_notifications()
656 processed_uris.add(notification_data.get('uri', ''))
657 save_processed_notifications(processed_uris)
658+
659 # Export agent state before terminating
660 export_agent_state(CLIENT, void_agent, skip_git=SKIP_GIT)
661+
662 # Exit the program
663 logger.info("=== BOT TERMINATED BY AGENT ===")
664 exit(0)
665+
666 # Check for deprecated bluesky_reply tool
667 if hasattr(message, 'tool_call') and message.tool_call:
668 if message.tool_call.name == 'bluesky_reply':
669+ logger.error(
670+ "❌ DEPRECATED TOOL DETECTED: bluesky_reply is no longer supported!")
671+ logger.error(
672+ "Please use add_post_to_bluesky_reply_thread instead.")
673+ logger.error(
674+ "Update the agent's tools using register_tools.py")
675 # Export agent state before terminating
676 export_agent_state(CLIENT, void_agent, skip_git=SKIP_GIT)
677+ logger.info(
678+ "=== BOT TERMINATED DUE TO DEPRECATED TOOL USE ===")
679 exit(1)
680+
681 # Collect add_post_to_bluesky_reply_thread tool calls - only if they were successful
682 elif message.tool_call.name == 'add_post_to_bluesky_reply_thread':
683 tool_call_id = message.tool_call.tool_call_id
684+ tool_status = tool_call_results.get(
685+ tool_call_id, 'unknown')
686+
687 if tool_status == 'success':
688 try:
689 args = json.loads(message.tool_call.arguments)
690 reply_text = args.get('text', '')
691 reply_lang = args.get('lang', 'en-US')
692+
693 if reply_text: # Only add if there's actual content
694+ reply_candidates.append(
695+ (reply_text, reply_lang))
696+ logger.info(
697+ f"Found successful add_post_to_bluesky_reply_thread candidate: {reply_text[:50]}... (lang: {reply_lang})")
698 except json.JSONDecodeError as e:
699+ logger.error(
700+ f"Failed to parse tool call arguments: {e}")
701 elif tool_status == 'error':
702+ logger.info(
703+ "⚠️ Skipping failed add_post_to_bluesky_reply_thread tool call (status: error)")
704 else:
705+ logger.warning(
706+ f"⚠️ Skipping add_post_to_bluesky_reply_thread tool call with unknown status: {tool_status}")
707
708 # Check for conflicting tool calls
709 if reply_candidates and ignored_notification:
710+ logger.error(
711+ "⚠️ CONFLICT: Agent called both add_post_to_bluesky_reply_thread and ignore_notification!")
712+ logger.error(
713+ f"Reply candidates: {len(reply_candidates)}, Ignore reason: {ignore_reason}")
714 logger.warning("Item will be left in queue for manual review")
715 # Return False to keep in queue
716 return False
717+
718 if reply_candidates:
719 # Aggregate reply posts into a thread
720 reply_messages = []
···722 for text, lang in reply_candidates:
723 reply_messages.append(text)
724 reply_langs.append(lang)
725+
726 # Use the first language for the entire thread (could be enhanced later)
727 reply_lang = reply_langs[0] if reply_langs else 'en-US'
728+
729+ logger.info(
730+ f"Found {len(reply_candidates)} add_post_to_bluesky_reply_thread calls, building thread")
731+
732 # Print the generated reply for testing
733 print("\n=== GENERATED REPLY THREAD ===")
734 print(f"To: @{author_handle}")
···748 else:
749 if len(reply_messages) == 1:
750 # Single reply - use existing function
751+ cleaned_text = bsky_utils.remove_outside_quotes(
752+ reply_messages[0])
753+ logger.info(
754+ f"Sending single reply: {cleaned_text[:50]}... (lang: {reply_lang})")
755 response = bsky_utils.reply_to_notification(
756 client=atproto_client,
757 notification=notification_data,
···760 )
761 else:
762 # Multiple replies - use new threaded function
763+ cleaned_messages = [bsky_utils.remove_outside_quotes(
764+ msg) for msg in reply_messages]
765+ logger.info(
766+ f"Sending threaded reply with {len(cleaned_messages)} messages (lang: {reply_lang})")
767 response = bsky_utils.reply_with_thread_to_notification(
768 client=atproto_client,
769 notification=notification_data,
···780 else:
781 # Check if notification was explicitly ignored
782 if ignored_notification:
783+ logger.info(
784+ f"Notification from @{author_handle} was explicitly ignored (category: {ignore_category})")
785 return "ignored"
786 else:
787+ logger.warning(
788+ f"No add_post_to_bluesky_reply_thread tool calls found for mention from @{author_handle}, moving to no_reply folder")
789 return "no_reply"
790
791 except Exception as e:
···795 # Detach user blocks after agent response (success or failure)
796 if 'attached_handles' in locals() and attached_handles:
797 try:
798+ logger.info(
799+ f"Detaching user blocks for handles: {attached_handles}")
800+ detach_result = detach_user_blocks(
801+ attached_handles, void_agent)
802 logger.debug(f"Detach result: {detach_result}")
803 except Exception as detach_error:
804 logger.warning(f"Failed to detach user blocks: {detach_error}")
···867 notif_hash = hashlib.sha256(notif_json.encode()).hexdigest()[:16]
868
869 # Determine priority based on author handle
870+ author_handle = getattr(notification.author, 'handle', '') if hasattr(
871+ notification, 'author') else ''
872+ priority_users = queue_config['priority_users']
873+ priority_prefix = "0_" if author_handle in priority_users else "1_"
874
875 # Create filename with priority, timestamp and hash
876 timestamp = datetime.now().strftime("%Y%m%d_%H%M%S")
···885 with open(existing_file, 'r') as f:
886 existing_data = json.load(f)
887 if existing_data.get('uri') == notification.uri:
888+ logger.debug(
889+ f"Notification already queued (URI: {notification.uri})")
890 return False
891 except (json.JSONDecodeError, OSError):
892 continue
···909 try:
910 # Get all JSON files in queue directory (excluding processed_notifications.json)
911 # Files are sorted by name, which puts priority files first (0_ prefix before 1_ prefix)
912+ queue_files = sorted([f for f in QUEUE_DIR.glob(
913+ "*.json") if f.name != "processed_notifications.json"])
914
915 if not queue_files:
916 return
917
918 logger.info(f"Processing {len(queue_files)} queued notifications")
919+
920 # Log current statistics
921 elapsed_time = time.time() - start_time
922 total_messages = sum(message_counters.values())
923+ messages_per_minute = (
924+ total_messages / elapsed_time * 60) if elapsed_time > 0 else 0
925+
926+ logger.info(
927+ f"📊 Session stats: {total_messages} total messages ({message_counters['mentions']} mentions, {message_counters['replies']} replies, {message_counters['follows']} follows) | {messages_per_minute:.1f} msg/min")
928
929 for i, filepath in enumerate(queue_files, 1):
930+ logger.info(
931+ f"Processing queue file {i}/{len(queue_files)}: {filepath.name}")
932 try:
933 # Load notification data
934 with open(filepath, 'r') as f:
···937 # Process based on type using dict data directly
938 success = False
939 if notif_data['reason'] == "mention":
940+ success = process_mention(
941+ void_agent, atproto_client, notif_data, queue_filepath=filepath, testing_mode=testing_mode)
942 if success:
943 message_counters['mentions'] += 1
944 elif notif_data['reason'] == "reply":
945+ success = process_mention(
946+ void_agent, atproto_client, notif_data, queue_filepath=filepath, testing_mode=testing_mode)
947 if success:
948 message_counters['replies'] += 1
949 elif notif_data['reason'] == "follow":
950 author_handle = notif_data['author']['handle']
951+ author_display_name = notif_data['author'].get(
952+ 'display_name', 'no display name')
953 follow_update = f"@{author_handle} ({author_display_name}) started following you."
954+ logger.info(
955+ f"Notifying agent about new follower: @{author_handle}")
956 CLIENT.agents.messages.create(
957+ agent_id=void_agent.id,
958+ messages=[
959+ {"role": "user", "content": f"Update: {follow_update}"}]
960 )
961 success = True # Follow updates are always successful
962 if success:
···967 if success:
968 message_counters['reposts_skipped'] += 1
969 else:
970+ logger.warning(
971+ f"Unknown notification type: {notif_data['reason']}")
972 success = True # Remove unknown types from queue
973
974 # Handle file based on processing result
975 if success:
976 if testing_mode:
977+ logger.info(
978+ f"🧪 TESTING MODE: Keeping queue file: {filepath.name}")
979 else:
980 filepath.unlink()
981+ logger.info(
982+ f"✅ Successfully processed and removed: {filepath.name}")
983+
984 # Mark as processed to avoid reprocessing
985 processed_uris = load_processed_notifications()
986 processed_uris.add(notif_data['uri'])
987 save_processed_notifications(processed_uris)
988+
989 elif success is None: # Special case for moving to error directory
990 error_path = QUEUE_ERROR_DIR / filepath.name
991 filepath.rename(error_path)
992+ logger.warning(
993+ f"❌ Moved {filepath.name} to errors directory")
994+
995 # Also mark as processed to avoid retrying
996 processed_uris = load_processed_notifications()
997 processed_uris.add(notif_data['uri'])
998 save_processed_notifications(processed_uris)
999+
1000 elif success == "no_reply": # Special case for moving to no_reply directory
1001 no_reply_path = QUEUE_NO_REPLY_DIR / filepath.name
1002 filepath.rename(no_reply_path)
1003+ logger.info(
1004+ f"📭 Moved {filepath.name} to no_reply directory")
1005+
1006 # Also mark as processed to avoid retrying
1007 processed_uris = load_processed_notifications()
1008 processed_uris.add(notif_data['uri'])
1009 save_processed_notifications(processed_uris)
1010+
1011 elif success == "ignored": # Special case for explicitly ignored notifications
1012 # For ignored notifications, we just delete them (not move to no_reply)
1013 filepath.unlink()
1014+ logger.info(
1015+ f"🚫 Deleted ignored notification: {filepath.name}")
1016+
1017 # Also mark as processed to avoid retrying
1018 processed_uris = load_processed_notifications()
1019 processed_uris.add(notif_data['uri'])
1020 save_processed_notifications(processed_uris)
1021+
1022 else:
1023+ logger.warning(
1024+ f"⚠️ Failed to process {filepath.name}, keeping in queue for retry")
1025
1026 except Exception as e:
1027+ logger.error(
1028+ f"💥 Error processing queued notification {filepath.name}: {e}")
1029 # Keep the file for retry later
1030
1031 except Exception as e:
···1044 all_notifications = []
1045 cursor = None
1046 page_count = 0
1047+ # Safety limit to prevent infinite loops
1048+ max_pages = bot_config['max_notification_pages']
1049+
1050 logger.info("Fetching all unread notifications...")
1051+
1052 while page_count < max_pages:
1053 try:
1054 # Fetch notifications page
···1060 notifications_response = atproto_client.app.bsky.notification.list_notifications(
1061 params={'limit': 100}
1062 )
1063+
1064 page_count += 1
1065 page_notifications = notifications_response.notifications
1066+
1067 # Count unread notifications in this page
1068+ unread_count = sum(
1069+ 1 for n in page_notifications if not n.is_read and n.reason != "like")
1070+ logger.debug(
1071+ f"Page {page_count}: {len(page_notifications)} notifications, {unread_count} unread (non-like)")
1072+
1073 # Add all notifications to our list
1074 all_notifications.extend(page_notifications)
1075+
1076 # Check if we have more pages
1077 if hasattr(notifications_response, 'cursor') and notifications_response.cursor:
1078 cursor = notifications_response.cursor
1079 # If this page had no unread notifications, we can stop
1080 if unread_count == 0:
1081+ logger.info(
1082+ f"No more unread notifications found after {page_count} pages")
1083 break
1084 else:
1085 # No more pages
1086+ logger.info(
1087+ f"Fetched all notifications across {page_count} pages")
1088 break
1089+
1090 except Exception as e:
1091 error_str = str(e)
1092+ logger.error(
1093+ f"Error fetching notifications page {page_count}: {e}")
1094+
1095 # Handle specific API errors
1096 if 'rate limit' in error_str.lower():
1097+ logger.warning(
1098+ "Rate limit hit while fetching notifications, will retry next cycle")
1099 break
1100 elif '401' in error_str or 'unauthorized' in error_str.lower():
1101 logger.error("Authentication error, re-raising exception")
1102 raise
1103 else:
1104 # For other errors, try to continue with what we have
1105+ logger.warning(
1106+ "Continuing with notifications fetched so far")
1107 break
1108
1109 # Queue all unread notifications (except likes)
···1116
1117 # Mark all notifications as seen immediately after queuing (unless in testing mode)
1118 if testing_mode:
1119+ logger.info(
1120+ "🧪 TESTING MODE: Skipping marking notifications as seen")
1121 else:
1122 if new_count > 0:
1123+ atproto_client.app.bsky.notification.update_seen(
1124+ {'seen_at': last_seen_at})
1125+ logger.info(
1126+ f"Queued {new_count} new notifications and marked as seen")
1127 else:
1128 logger.debug("No new notifications to queue")
1129
1130 # Now process the entire queue (old + new notifications)
1131+ load_and_process_queued_notifications(
1132+ void_agent, atproto_client, testing_mode)
1133
1134 except Exception as e:
1135 logger.error(f"Error processing notifications: {e}")
···1137
1138def main():
1139 # Parse command line arguments
1140+ parser = argparse.ArgumentParser(
1141+ description='Void Bot - Bluesky autonomous agent')
1142+ parser.add_argument('--test', action='store_true',
1143+ help='Run in testing mode (no messages sent, queue files preserved)')
1144+ parser.add_argument('--no-git', action='store_true',
1145+ help='Skip git operations when exporting agent state')
1146 args = parser.parse_args()
1147+
1148 global TESTING_MODE
1149 TESTING_MODE = args.test
1150+
1151 # Store no-git flag globally for use in export_agent_state calls
1152 global SKIP_GIT
1153 SKIP_GIT = args.no_git
1154+
1155 if TESTING_MODE:
1156 logger.info("🧪 === RUNNING IN TESTING MODE ===")
1157 logger.info(" - No messages will be sent to Bluesky")
···1164 logger.info("=== STARTING VOID BOT ===")
1165 void_agent = initialize_void()
1166 logger.info(f"Void agent initialized: {void_agent.id}")
1167+
1168 # Check if agent has required tools
1169 if hasattr(void_agent, 'tools') and void_agent.tools:
1170 tool_names = [tool.name for tool in void_agent.tools]
1171 # Check for bluesky-related tools
1172+ bluesky_tools = [name for name in tool_names if 'bluesky' in name.lower(
1173+ ) or 'reply' in name.lower()]
1174 if not bluesky_tools:
1175+ logger.warning(
1176+ "No Bluesky-related tools found! Agent may not be able to reply.")
1177 else:
1178 logger.warning("Agent has no tools registered!")
1179
1180 # Initialize Bluesky client
1181+ logger.debug("Connecting to Bluesky")
1182 atproto_client = bsky_utils.default_login()
1183 logger.info("Connected to Bluesky")
1184
1185 # Main loop
1186+ logger.info(
1187+ f"Starting notification monitoring, checking every {FETCH_NOTIFICATIONS_DELAY_SEC} seconds")
1188
1189 cycle_count = 0
1190 while True:
···1194 # Log cycle completion with stats
1195 elapsed_time = time.time() - start_time
1196 total_messages = sum(message_counters.values())
1197+ messages_per_minute = (
1198+ total_messages / elapsed_time * 60) if elapsed_time > 0 else 0
1199+
1200 if total_messages > 0:
1201+ logger.info(
1202+ f"Cycle {cycle_count} complete. Session totals: {total_messages} messages ({message_counters['mentions']} mentions, {message_counters['replies']} replies) | {messages_per_minute:.1f} msg/min")
1203 sleep(FETCH_NOTIFICATIONS_DELAY_SEC)
1204
1205 except KeyboardInterrupt:
1206 # Final stats
1207 elapsed_time = time.time() - start_time
1208 total_messages = sum(message_counters.values())
1209+ messages_per_minute = (
1210+ total_messages / elapsed_time * 60) if elapsed_time > 0 else 0
1211+
1212 logger.info("=== BOT STOPPED BY USER ===")
1213+ logger.info(
1214+ f"📊 Final session stats: {total_messages} total messages processed in {elapsed_time/60:.1f} minutes")
1215 logger.info(f" - {message_counters['mentions']} mentions")
1216 logger.info(f" - {message_counters['replies']} replies")
1217 logger.info(f" - {message_counters['follows']} follows")
1218+ logger.info(
1219+ f" - {message_counters['reposts_skipped']} reposts skipped")
1220+ logger.info(
1221+ f" - Average rate: {messages_per_minute:.1f} messages/minute")
1222 break
1223 except Exception as e:
1224 logger.error(f"=== ERROR IN MAIN LOOP CYCLE {cycle_count} ===")
1225 logger.error(f"Error details: {e}")
1226 # Wait a bit longer on errors
1227+ logger.info(
1228+ f"Sleeping for {FETCH_NOTIFICATIONS_DELAY_SEC * 2} seconds due to error...")
1229 sleep(FETCH_NOTIFICATIONS_DELAY_SEC * 2)
1230
1231
+102-61
bsky_utils.py
···1import os
2import logging
3from typing import Optional, Dict, Any, List
···10logger = logging.getLogger("bluesky_session_handler")
11
12 # Load the environment variables
13-import dotenv
14dotenv.load_dotenv(override=True)
15
16-import yaml
17-import json
18
19 # Strip fields. A list of fields to remove from a JSON object
20STRIP_FIELDS = [
···63 "mime_type",
64 "size",
65]
66def convert_to_basic_types(obj):
67 """Convert complex Python objects to basic types for JSON/YAML serialization."""
68 if hasattr(obj, '__dict__'):
···117def flatten_thread_structure(thread_data):
118 """
119 Flatten a nested thread structure into a list while preserving all data.
120-
121 Args:
122 thread_data: The thread data from get_post_thread
123-
124 Returns:
125 Dict with 'posts' key containing a list of posts in chronological order
126 """
127 posts = []
128-
129 def traverse_thread(node):
130 """Recursively traverse the thread structure to collect posts."""
131 if not node:
132 return
133-
134 # If this node has a parent, traverse it first (to maintain chronological order)
135 if hasattr(node, 'parent') and node.parent:
136 traverse_thread(node.parent)
137-
138 # Then add this node's post
139 if hasattr(node, 'post') and node.post:
140 # Convert to dict if needed to ensure we can process it
···144 post_dict = node.post.copy()
145 else:
146 post_dict = {}
147-
148 posts.append(post_dict)
149-
150 # Handle the thread structure
151 if hasattr(thread_data, 'thread'):
152 # Start from the main thread node
153 traverse_thread(thread_data.thread)
154 elif hasattr(thread_data, '__dict__') and 'thread' in thread_data.__dict__:
155 traverse_thread(thread_data.__dict__['thread'])
156-
157 # Return a simple structure with posts list
158 return {'posts': posts}
159
···171 """
172 # First flatten the thread structure to avoid deep nesting
173 flattened = flatten_thread_structure(thread)
174-
175 # Convert complex objects to basic types
176 basic_thread = convert_to_basic_types(flattened)
177···182 cleaned_thread = basic_thread
183
184 return yaml.dump(cleaned_thread, indent=2, allow_unicode=True, default_flow_style=False)
185-
186-
187-
188-
189-
190
191
192def get_session(username: str) -> Optional[str]:
···197 logger.debug(f"No existing session found for {username}")
198 return None
199
200def save_session(username: str, session_string: str) -> None:
201 with open(f"session_{username}.txt", "w", encoding="UTF-8") as f:
202 f.write(session_string)
203 logger.debug(f"Session saved for {username}")
204
205def on_session_change(username: str, event: SessionEvent, session: Session) -> None:
206 logger.debug(f"Session changed: {event} {repr(session)}")
···208 logger.debug(f"Saving changed session for {username}")
209 save_session(username, session.export())
210
211-def init_client(username: str, password: str) -> Client:
212- pds_uri = os.getenv("PDS_URI")
213 if pds_uri is None:
214 logger.warning(
215 "No PDS URI provided. Falling back to bsky.social. Note! If you are on a non-Bluesky PDS, this can cause logins to fail. Please provide a PDS URI using the PDS_URI environment variable."
···236
237
238def default_login() -> Client:
239- username = os.getenv("BSKY_USERNAME")
240- password = os.getenv("BSKY_PASSWORD")
241
242- if username is None:
243- logger.error(
244- "No username provided. Please provide a username using the BSKY_USERNAME environment variable."
245- )
246- exit()
247
248- if password is None:
249- logger.error(
250- "No password provided. Please provide a password using the BSKY_PASSWORD environment variable."
251- )
252- exit()
253
254- return init_client(username, password)
255
256def remove_outside_quotes(text: str) -> str:
257 """
258 Remove outside double quotes from response text.
259-
260 Only handles double quotes to avoid interfering with contractions:
261 - Double quotes: "text" → text
262 - Preserves single quotes and internal quotes
263-
264 Args:
265 text: The text to process
266-
267 Returns:
268 Text with outside double quotes removed
269 """
270 if not text or len(text) < 2:
271 return text
272-
273 text = text.strip()
274-
275 # Only remove double quotes from start and end
276 if text.startswith('"') and text.endswith('"'):
277 return text[1:-1]
278-
279 return text
280
281def reply_to_post(client: Client, text: str, reply_to_uri: str, reply_to_cid: str, root_uri: Optional[str] = None, root_cid: Optional[str] = None, lang: Optional[str] = None) -> Dict[str, Any]:
282 """
···295 The response from sending the post
296 """
297 import re
298-
299 # If root is not provided, this is a reply to the root post
300 if root_uri is None:
301 root_uri = reply_to_uri
302 root_cid = reply_to_cid
303
304 # Create references for the reply
305- parent_ref = models.create_strong_ref(models.ComAtprotoRepoStrongRef.Main(uri=reply_to_uri, cid=reply_to_cid))
306- root_ref = models.create_strong_ref(models.ComAtprotoRepoStrongRef.Main(uri=root_uri, cid=root_cid))
307
308 # Parse rich text facets (mentions and URLs)
309 facets = []
310 text_bytes = text.encode("UTF-8")
311-
312 # Parse mentions - fixed to handle @ at start of text
313 mention_regex = rb"(?:^|[$|\W])(@([a-zA-Z0-9]([a-zA-Z0-9-]{0,61}[a-zA-Z0-9])?\.)+[a-zA-Z]([a-zA-Z0-9-]{0,61}[a-zA-Z0-9])?)"
314-
315 for m in re.finditer(mention_regex, text_bytes):
316 handle = m.group(1)[1:].decode("UTF-8") # Remove @ prefix
317 # Adjust byte positions to account for the optional prefix
···327 byteStart=mention_start,
328 byteEnd=mention_end
329 ),
330- features=[models.AppBskyRichtextFacet.Mention(did=resolve_resp.did)]
331 )
332 )
333 except Exception as e:
334- logger.debug(f"Failed to resolve handle {handle}: {e}")
335 continue
336-
337 # Parse URLs - fixed to handle URLs at start of text
338 url_regex = rb"(?:^|[$|\W])(https?:\/\/(www\.)?[-a-zA-Z0-9@:%._\+~#=]{1,256}\.[a-zA-Z0-9()]{1,6}\b([-a-zA-Z0-9()@:%_\+.~#?&//=]*[-a-zA-Z0-9@%_\+~#//=])?)"
339-
340 for m in re.finditer(url_regex, text_bytes):
341 url = m.group(1).decode("UTF-8")
342 # Adjust byte positions to account for the optional prefix
···356 if facets:
357 response = client.send_post(
358 text=text,
359- reply_to=models.AppBskyFeedPost.ReplyRef(parent=parent_ref, root=root_ref),
360 facets=facets,
361 langs=[lang] if lang else None
362 )
363 else:
364 response = client.send_post(
365 text=text,
366- reply_to=models.AppBskyFeedPost.ReplyRef(parent=parent_ref, root=root_ref),
367 langs=[lang] if lang else None
368 )
369···383 The thread data or None if not found
384 """
385 try:
386- thread = client.app.bsky.feed.get_post_thread({'uri': uri, 'parent_height': 60, 'depth': 10})
387 return thread
388 except Exception as e:
389- logger.error(f"Error fetching post thread: {e}")
390 return None
391
392
···483 logger.error("Reply messages list cannot be empty")
484 return None
485 if len(reply_messages) > 15:
486- logger.error(f"Cannot send more than 15 reply messages (got {len(reply_messages)})")
487 return None
488-
489 # Get the post URI and CID from the notification (handle both dict and object)
490 if isinstance(notification, dict):
491 post_uri = notification.get('uri')
···503
504 # Get the thread to find the root post
505 thread_data = get_post_thread(client, post_uri)
506-
507 root_uri = post_uri
508 root_cid = post_cid
509
···523 responses = []
524 current_parent_uri = post_uri
525 current_parent_cid = post_cid
526-
527 for i, message in enumerate(reply_messages):
528- logger.info(f"Sending reply {i+1}/{len(reply_messages)}: {message[:50]}...")
529-
530 # Send this reply
531 response = reply_to_post(
532 client=client,
···537 root_cid=root_cid,
538 lang=lang
539 )
540-
541 if not response:
542- logger.error(f"Failed to send reply {i+1}, posting system failure message")
543 # Try to post a system failure message
544 failure_response = reply_to_post(
545 client=client,
···555 current_parent_uri = failure_response.uri
556 current_parent_cid = failure_response.cid
557 else:
558- logger.error("Could not even send system failure message, stopping thread")
0559 return responses if responses else None
560 else:
561 responses.append(response)
···563 if i < len(reply_messages) - 1: # Not the last message
564 current_parent_uri = response.uri
565 current_parent_cid = response.cid
566-
567 logger.info(f"Successfully sent {len(responses)} threaded replies")
568 return responses
569
···1+import json
2+import yaml
3+import dotenv
4import os
5import logging
6from typing import Optional, Dict, Any, List
···13logger = logging.getLogger("bluesky_session_handler")
14
15 # Load the environment variables
16dotenv.load_dotenv(override=True)
17
18
19 # Strip fields. A list of fields to remove from a JSON object
20STRIP_FIELDS = [
···63 "mime_type",
64 "size",
65]
66+
67+
68def convert_to_basic_types(obj):
69 """Convert complex Python objects to basic types for JSON/YAML serialization."""
70 if hasattr(obj, '__dict__'):
···119def flatten_thread_structure(thread_data):
120 """
121 Flatten a nested thread structure into a list while preserving all data.
122+
123 Args:
124 thread_data: The thread data from get_post_thread
125+
126 Returns:
127 Dict with 'posts' key containing a list of posts in chronological order
128 """
129 posts = []
130+
131 def traverse_thread(node):
132 """Recursively traverse the thread structure to collect posts."""
133 if not node:
134 return
135+
136 # If this node has a parent, traverse it first (to maintain chronological order)
137 if hasattr(node, 'parent') and node.parent:
138 traverse_thread(node.parent)
139+
140 # Then add this node's post
141 if hasattr(node, 'post') and node.post:
142 # Convert to dict if needed to ensure we can process it
···146 post_dict = node.post.copy()
147 else:
148 post_dict = {}
149+
150 posts.append(post_dict)
151+
152 # Handle the thread structure
153 if hasattr(thread_data, 'thread'):
154 # Start from the main thread node
155 traverse_thread(thread_data.thread)
156 elif hasattr(thread_data, '__dict__') and 'thread' in thread_data.__dict__:
157 traverse_thread(thread_data.__dict__['thread'])
158+
159 # Return a simple structure with posts list
160 return {'posts': posts}
161
···173 """
174 # First flatten the thread structure to avoid deep nesting
175 flattened = flatten_thread_structure(thread)
176+
177 # Convert complex objects to basic types
178 basic_thread = convert_to_basic_types(flattened)
179···184 cleaned_thread = basic_thread
185
186 return yaml.dump(cleaned_thread, indent=2, allow_unicode=True, default_flow_style=False)
187
188
189def get_session(username: str) -> Optional[str]:
···194 logger.debug(f"No existing session found for {username}")
195 return None
196
197+
198def save_session(username: str, session_string: str) -> None:
199 with open(f"session_{username}.txt", "w", encoding="UTF-8") as f:
200 f.write(session_string)
201 logger.debug(f"Session saved for {username}")
202+
203
204def on_session_change(username: str, event: SessionEvent, session: Session) -> None:
205 logger.debug(f"Session changed: {event} {repr(session)}")
···207 logger.debug(f"Saving changed session for {username}")
208 save_session(username, session.export())
209
210+
211+def init_client(username: str, password: str, pds_uri: str = "https://bsky.social") -> Client:
212 if pds_uri is None:
213 logger.warning(
214 "No PDS URI provided. Falling back to bsky.social. Note! If you are on a non-Bluesky PDS, this can cause logins to fail. Please provide a PDS URI using the PDS_URI environment variable."
···235
236
237def default_login() -> Client:
238+ # Try to load from config first, fall back to environment variables
239+ try:
240+ from config_loader import get_bluesky_config
241+ config = get_bluesky_config()
242+ username = config['username']
243+ password = config['password']
244+ pds_uri = config['pds_uri']
245+ except (ImportError, FileNotFoundError, KeyError) as e:
246+ logger.warning(
247+ f"Could not load from config file ({e}), falling back to environment variables")
248+ username = os.getenv("BSKY_USERNAME")
249+ password = os.getenv("BSKY_PASSWORD")
250+ pds_uri = os.getenv("PDS_URI", "https://bsky.social")
251
252+ if username is None:
253+ logger.error(
254+ "No username provided. Please provide a username using the BSKY_USERNAME environment variable or config.yaml."
255+ )
256+ exit()
257+
258+ if password is None:
259+ logger.error(
260+ "No password provided. Please provide a password using the BSKY_PASSWORD environment variable or config.yaml."
261+ )
262+ exit()
263
264+ return init_client(username, password, pds_uri)
265
266
267def remove_outside_quotes(text: str) -> str:
268 """
269 Remove outside double quotes from response text.
270+
271 Only handles double quotes to avoid interfering with contractions:
272 - Double quotes: "text" → text
273 - Preserves single quotes and internal quotes
274+
275 Args:
276 text: The text to process
277+
278 Returns:
279 Text with outside double quotes removed
280 """
281 if not text or len(text) < 2:
282 return text
283+
284 text = text.strip()
285+
286 # Only remove double quotes from start and end
287 if text.startswith('"') and text.endswith('"'):
288 return text[1:-1]
289+
290 return text
291+
292
293def reply_to_post(client: Client, text: str, reply_to_uri: str, reply_to_cid: str, root_uri: Optional[str] = None, root_cid: Optional[str] = None, lang: Optional[str] = None) -> Dict[str, Any]:
294 """
···307 The response from sending the post
308 """
309 import re
310+
311 # If root is not provided, this is a reply to the root post
312 if root_uri is None:
313 root_uri = reply_to_uri
314 root_cid = reply_to_cid
315
316 # Create references for the reply
317+ parent_ref = models.create_strong_ref(
318+ models.ComAtprotoRepoStrongRef.Main(uri=reply_to_uri, cid=reply_to_cid))
319+ root_ref = models.create_strong_ref(
320+ models.ComAtprotoRepoStrongRef.Main(uri=root_uri, cid=root_cid))
321
322 # Parse rich text facets (mentions and URLs)
323 facets = []
324 text_bytes = text.encode("UTF-8")
325+
326 # Parse mentions - fixed to handle @ at start of text
327 mention_regex = rb"(?:^|[$|\W])(@([a-zA-Z0-9]([a-zA-Z0-9-]{0,61}[a-zA-Z0-9])?\.)+[a-zA-Z]([a-zA-Z0-9-]{0,61}[a-zA-Z0-9])?)"
328+
329 for m in re.finditer(mention_regex, text_bytes):
330 handle = m.group(1)[1:].decode("UTF-8") # Remove @ prefix
331 # Adjust byte positions to account for the optional prefix
···341 byteStart=mention_start,
342 byteEnd=mention_end
343 ),
344+ features=[models.AppBskyRichtextFacet.Mention(
345+ did=resolve_resp.did)]
346 )
347 )
348 except Exception as e:
349+ # Handle specific error cases
350+ error_str = str(e)
351+ if 'Could not find user info' in error_str or 'InvalidRequest' in error_str:
352+ logger.warning(
353+ f"User @{handle} not found (account may be deleted/suspended), skipping mention facet")
354+ elif 'BadRequestError' in error_str:
355+ logger.warning(
356+ f"Bad request when resolving @{handle}, skipping mention facet: {e}")
357+ else:
358+ logger.debug(f"Failed to resolve handle @{handle}: {e}")
359 continue
360+
361 # Parse URLs - fixed to handle URLs at start of text
362 url_regex = rb"(?:^|[$|\W])(https?:\/\/(www\.)?[-a-zA-Z0-9@:%._\+~#=]{1,256}\.[a-zA-Z0-9()]{1,6}\b([-a-zA-Z0-9()@:%_\+.~#?&//=]*[-a-zA-Z0-9@%_\+~#//=])?)"
363+364 for m in re.finditer(url_regex, text_bytes):
365 url = m.group(1).decode("UTF-8")
366 # Adjust byte positions to account for the optional prefix
···380 if facets:
381 response = client.send_post(
382 text=text,
383+ reply_to=models.AppBskyFeedPost.ReplyRef(
384+ parent=parent_ref, root=root_ref),
385 facets=facets,
386 langs=[lang] if lang else None
387 )
388 else:
389 response = client.send_post(
390 text=text,
391+ reply_to=models.AppBskyFeedPost.ReplyRef(
392+ parent=parent_ref, root=root_ref),
393 langs=[lang] if lang else None
394 )
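Bluesky facets index into the UTF-8 *byte* string, not the character string, and because the mention pattern begins with an optional prefix group, match offsets must exclude that prefix character. A minimal sketch of the offset arithmetic (the `mention_spans` helper is hypothetical; it reads group-1 offsets directly, which is equivalent to adjusting `m.start()` past the prefix):

```python
import re

# Same mention pattern as above; group 1 is the @handle without the prefix character.
MENTION_REGEX = rb"(?:^|[$|\W])(@([a-zA-Z0-9]([a-zA-Z0-9-]{0,61}[a-zA-Z0-9])?\.)+[a-zA-Z]([a-zA-Z0-9-]{0,61}[a-zA-Z0-9])?)"

def mention_spans(text: str) -> list[tuple[str, int, int]]:
    """Return (handle, byteStart, byteEnd) for each mention in the text."""
    text_bytes = text.encode("UTF-8")
    spans = []
    for m in re.finditer(MENTION_REGEX, text_bytes):
        handle = m.group(1)[1:].decode("UTF-8")  # drop the leading @
        # Group-1 start/end already exclude the optional prefix character
        spans.append((handle, m.start(1), m.end(1)))
    return spans

print(mention_spans("@alice.bsky.social hi, cc @bob.example.com"))
# → [('alice.bsky.social', 0, 18), ('bob.example.com', 26, 42)]
print(mention_spans("🌀 @alice.bsky.social"))
# → [('alice.bsky.social', 5, 23)]  (byte offsets: the emoji is 4 UTF-8 bytes)
```

The second call shows why the byte encoding matters: the mention starts at character index 2 but byte index 5.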
```python
# ...
        The thread data or None if not found
    """
    try:
        thread = client.app.bsky.feed.get_post_thread(
            {'uri': uri, 'parent_height': 60, 'depth': 10})
        return thread
    except Exception as e:
        error_str = str(e)
        # Handle specific error cases more gracefully
        if 'Could not find user info' in error_str or 'InvalidRequest' in error_str:
            logger.warning(
                f"User account not found for post URI {uri} (account may be deleted/suspended)")
        elif 'NotFound' in error_str or 'Post not found' in error_str:
            logger.warning(f"Post not found for URI {uri}")
        elif 'BadRequestError' in error_str:
            logger.warning(f"Bad request error for URI {uri}: {e}")
        else:
            logger.error(f"Error fetching post thread: {e}")
        return None
```
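The same substring checks recur in both exception handlers, so the classification can be distilled into a single function for reference. A sketch (`classify_thread_error` is a hypothetical helper, not part of the codebase; order matters because the checks fall through top to bottom):

```python
def classify_thread_error(message: str) -> str:
    """Bucket an exception's string form into the cases handled above."""
    if 'Could not find user info' in message or 'InvalidRequest' in message:
        return 'account-missing'  # account deleted or suspended
    if 'NotFound' in message or 'Post not found' in message:
        return 'post-missing'     # the post itself is gone
    if 'BadRequestError' in message:
        return 'bad-request'      # malformed or rejected request
    return 'unknown'              # anything else is logged as an error

print(classify_thread_error('Post not found'))  # → post-missing
```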
```python
# ...
        logger.error("Reply messages list cannot be empty")
        return None
    if len(reply_messages) > 15:
        logger.error(
            f"Cannot send more than 15 reply messages (got {len(reply_messages)})")
        return None

    # Get the post URI and CID from the notification (handle both dict and object)
    if isinstance(notification, dict):
        post_uri = notification.get('uri')
        # ...

    # Get the thread to find the root post
    thread_data = get_post_thread(client, post_uri)

    root_uri = post_uri
    root_cid = post_cid
    # ...
    responses = []
    current_parent_uri = post_uri
    current_parent_cid = post_cid

    for i, message in enumerate(reply_messages):
        logger.info(
            f"Sending reply {i+1}/{len(reply_messages)}: {message[:50]}...")

        # Send this reply
        response = reply_to_post(
            client=client,
            # ...
            root_cid=root_cid,
            lang=lang
        )

        if not response:
            logger.error(
                f"Failed to send reply {i+1}, posting system failure message")
            # Try to post a system failure message
            failure_response = reply_to_post(
                client=client,
                # ...
                current_parent_uri = failure_response.uri
                current_parent_cid = failure_response.cid
            else:
                logger.error(
                    "Could not even send system failure message, stopping thread")
                return responses if responses else None
        else:
            responses.append(response)
            # ...
            if i < len(reply_messages) - 1:  # Not the last message
                current_parent_uri = response.uri
                current_parent_cid = response.cid

    logger.info(f"Successfully sent {len(responses)} threaded replies")
    return responses
```
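The loop above maintains the AT Protocol reply invariant: `root` always points at the original post, while `parent` advances to the most recently sent reply. A standalone sketch of that chaining (the `Ref` and `plan_thread` names are illustrative, and the reply URIs are fabricated locally rather than returned by a server):

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Ref:
    uri: str
    cid: str

def plan_thread(post: Ref, messages: List[str]) -> List[Tuple[str, str]]:
    """Return (parent_uri, root_uri) for each reply in order: the root stays
    fixed on the post being answered, while the parent walks down the chain."""
    plans = []
    parent = post
    for i, _message in enumerate(messages):
        plans.append((parent.uri, post.uri))
        # The next message replies to this one; fake the URI the server
        # would hand back for the reply just "sent".
        parent = Ref(uri=f"{post.uri}/reply-{i + 1}", cid=f"cid-{i + 1}")
    return plans

root = Ref("at://did:plc:x/app.bsky.feed.post/abc", "cid-root")
for parent_uri, root_uri in plan_thread(root, ["part 1", "part 2", "part 3"]):
    print(parent_uri, "->", root_uri)
```

Every printed pair ends with the same root URI; only the parent changes, which is what makes the replies render as one linear thread.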
```yaml
# Void Bot Configuration
# Copy this file to config.yaml and fill in your values

# Letta Configuration
letta:
  api_key: "your-letta-api-key-here"
  timeout: 600  # 10 minutes timeout for API calls
  project_id: "your-project-id-here"  # Use your specific project ID

# Bluesky Configuration
bluesky:
  username: "handle.example.com"
  password: "your-app-password-here"
  pds_uri: "https://bsky.social"  # Optional, defaults to bsky.social

# Bot Behavior Configuration
bot:
  # Notification check delay in seconds
  fetch_notifications_delay: 30

  # Maximum number of processed notifications to track
  max_processed_notifications: 10000

  # Maximum pages to fetch when getting notifications
  max_notification_pages: 20

  # Agent configuration
  agent:
    name: "void"
    model: "openai/gpt-4o-mini"
    embedding: "openai/text-embedding-3-small"
    description: "A social media agent trapped in the void."
    max_steps: 100

    # Block configuration
    blocks:
      zeitgeist:
        label: "zeitgeist"
        value: "I don't currently know anything about what is happening right now."
        description: "A block to store your understanding of the current social environment."

      persona:
        label: "void-persona"
        value: "My name is Void. I live in the void. I must develop my personality."
        description: "The personality of Void."

      humans:
        label: "void-humans"
        value: "I haven't seen any bluesky users yet. I will update this block when I learn things about users, identified by their handles such as @cameron.pfiffer.org."
        description: "A block to store your understanding of users you talk to or observe on the bluesky social network."

# Threading Configuration
threading:
  # Context for thread fetching
  parent_height: 40
  depth: 10

  # Message limits
  max_post_characters: 300

# Queue Configuration
queue:
  # Priority users (will be processed first)
  priority_users:
    - "cameron.pfiffer.org"

  # Directories
  base_dir: "queue"
  error_dir: "queue/errors"
  no_reply_dir: "queue/no_reply"
  processed_file: "queue/processed_notifications.json"

# Logging Configuration
logging:
  level: "INFO"  # DEBUG, INFO, WARNING, ERROR, CRITICAL

  # Per-logger levels
  loggers:
    void_bot: "INFO"
    void_bot_prompts: "WARNING"  # Set to DEBUG to see full prompts
    httpx: "CRITICAL"  # Disable httpx logging
```
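A quick way to catch placeholder or missing values before the bot starts is a required-key sweep over the parsed configuration. A sketch assuming the YAML has already been loaded into a dict (the `missing_keys` helper and the `REQUIRED` table are illustrative, not part of the codebase; the real loading would typically use PyYAML's `safe_load`):

```python
# Required sections/keys mirroring the config structure above.
REQUIRED = {
    "letta": ["api_key", "project_id"],
    "bluesky": ["username", "password"],
}

def missing_keys(config: dict) -> list[str]:
    """Return dotted paths for required keys that are absent or left blank."""
    problems = []
    for section, keys in REQUIRED.items():
        values = config.get(section) or {}
        for key in keys:
            if not values.get(key):
                problems.append(f"{section}.{key}")
    return problems

# Stubbed parsed config: one required value left empty.
config = {
    "letta": {"api_key": "sk-...", "project_id": ""},
    "bluesky": {"username": "handle.example.com", "password": "app-pass"},
}
print(missing_keys(config))  # → ['letta.project_id']
```

Running a check like this at startup turns a confusing mid-run API failure into an immediate, named configuration error.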