`.pre-commit-config.yaml` (+7)
···
      language: system
      pass_filenames: false
      types: [file]
      stages: [pre-commit]
+   - id: check-readme-for-bad-links
+     name: Check README for bad links
+     entry: ./check-files-for-bad-links README.md
+     language: system
+     pass_filenames: false
+     types: [file]
+     stages: [pre-commit]
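The new hook only fires at the pre-commit stage, but it can also be exercised on demand; assuming `pre-commit` is already installed for the repo, a single hook can be run by id:

```bash
pre-commit run check-readme-for-bad-links --all-files
```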
`README.md` (+213)
··· 10 11 ## scripts 12 13 - [`check-files-for-bad-links`](#check-files-for-bad-links) 14 - [`kill-processes`](#kill-processes) 15 - [`update-lights`](#update-lights) 16 - [`update-readme`](#update-readme) 17 18 --- 19 20 ### `check-files-for-bad-links` 21 22 Check files for bad links. ··· 27 ./check-files-for-bad-links *.md 28 ``` 29 30 ### `kill-processes` 31 32 AI-powered TUI for killing processes. ··· 37 ./kill-processes 38 ``` 39 40 ### `update-lights` 41 42 Make some change to my phillips hue network of lights via agent + MCP server. ··· 46 ```bash 47 ./update-lights -m "turn on sahara in the living room and nightlight in the kitchen" 48 ``` 49 50 ### `update-readme` 51
··· 10 11 ## scripts 12 13 + - [`analyze-github-followers`](#analyze-github-followers) 14 - [`check-files-for-bad-links`](#check-files-for-bad-links) 15 + - [`dm-me-when-a-flight-passes-over`](#dm-me-when-a-flight-passes-over) 16 + - [`find-longest-bsky-thread`](#find-longest-bsky-thread) 17 + - [`find-stale-bsky-follows`](#find-stale-bsky-follows) 18 - [`kill-processes`](#kill-processes) 19 + - [`predict-github-stars`](#predict-github-stars) 20 - [`update-lights`](#update-lights) 21 - [`update-readme`](#update-readme) 22 23 --- 24 25 + ### `analyze-github-followers` 26 + 27 + analyze your github followers and following. 28 + 29 + usage: 30 + ./analyze-github-followers 31 + ./analyze-github-followers --summary-only # skip detailed analysis 32 + 33 + details: 34 + - uses github rest api to fetch followers/following 35 + - shows rich tables with follower stats 36 + - identifies mutual follows, notable followers, etc. 37 + - requires GITHUB_TOKEN in .env file 38 + 39 + --- 40 + 41 ### `check-files-for-bad-links` 42 43 Check files for bad links. ··· 48 ./check-files-for-bad-links *.md 49 ``` 50 51 + Details: 52 + - uses [`httpx`](https://www.python-httpx.org/) to check links 53 + - uses [`anyio`](https://anyio.readthedocs.io/en/stable/) to run the checks concurrently 54 + - pass include globs to scan (e.g. `*.md`) 55 + - pass exclude globs to skip (e.g. `*.md`) 56 + - pass ignore-url prefixes to ignore (e.g. `http://localhost` or `https://localhost`) 57 + - pass concurrency to run the checks concurrently (default is 50) 58 + 59 + --- 60 + 61 + ### `dm-me-when-a-flight-passes-over` 62 + 63 + Monitor flights passing overhead and send BlueSky DMs. 64 + 65 + Usage: 66 + # Single user mode (backward compatible) 67 + ./dm-me-when-a-flight-passes-over 68 + 69 + # Multi-subscriber mode with JSON file 70 + ./dm-me-when-a-flight-passes-over --subscribers subscribers.json 71 + 72 + # Multi-subscriber mode with stdin 73 + echo '[{"handle": "user1.bsky.social", "latitude": 41.8781, "longitude": -87.6298, "radius_miles": 5}]' | ./dm-me-when-a-flight-passes-over --subscribers - 74 + 75 + This script monitors flights within a configurable radius and sends DMs on BlueSky 76 + when flights pass overhead. Supports multiple subscribers with different locations. 77 + 78 + ## Future Architecture Ideas 79 + 80 + ### Web App Deployment Options 81 + 82 + 1. **FastAPI + Fly.io/Railway/Render** 83 + - REST API with endpoints: 84 + - POST /subscribe - Register user with BlueSky handle 85 + - DELETE /unsubscribe - Remove subscription 86 + - POST /update-location - Update user's location 87 + - GET /status - Check subscription status 88 + - Background worker using Celery/RQ/APScheduler 89 + - PostgreSQL/SQLite for subscriber persistence 90 + - Redis for caching flight data & deduplication 91 + 92 + 2. **Vercel/Netlify Edge Functions** 93 + - Serverless approach with scheduled cron jobs 94 + - Use Vercel KV or Upstash Redis for state 95 + - Challenge: Long-running monitoring needs workarounds 96 + - Solution: Trigger checks via cron every minute 97 + 98 + 3. **Self-Hosted with ngrok/Cloudflare Tunnel** 99 + - Quick prototype option 100 + - Run this script as daemon 101 + - Expose simple Flask/FastAPI wrapper 102 + - Security concerns: rate limiting, auth required 103 + 104 + ### Mobile/Browser Integration 105 + 106 + 1. 
**Progressive Web App (PWA)** 107 + - Service worker for background location updates 108 + - Geolocation API for current position 109 + - Push notifications instead of/alongside DMs 110 + - IndexedDB for offline capability 111 + 112 + 2. **iOS Shortcuts Integration** 113 + - Create shortcut that gets location 114 + - Calls webhook with location + BlueSky handle 115 + - Could run automatically based on focus modes 116 + 117 + 3. **Browser Extension** 118 + - Background script polls location 119 + - Lighter weight than full app 120 + - Cross-platform solution 121 + 122 + ### Architecture Components 123 + 124 + 1. **Location Services Layer** 125 + - Browser Geolocation API 126 + - IP-based geolocation fallback 127 + - Manual location picker UI 128 + - Privacy: Only send location when checking flights 129 + 130 + 2. **Notification Options** 131 + - BlueSky DMs (current) 132 + - Web Push Notifications 133 + - Webhooks to other services 134 + - Email/SMS via Twilio/SendGrid 135 + 136 + 3. **Subscription Management** 137 + - OAuth with BlueSky for auth 138 + - User preferences: radius, notification types 139 + - Quiet hours/Do Not Disturb 140 + - Rate limiting per user 141 + 142 + 4. **Data Optimization** 143 + - Cache FlightRadar API responses 144 + - Batch location updates 145 + - Aggregate nearby users for efficiency 146 + - WebSocket for real-time updates 147 + 148 + ### Implementation Approach 149 + 150 + Phase 1: Web API Wrapper 151 + - FastAPI with /subscribe endpoint 152 + - SQLite for subscribers 153 + - Run monitoring in background thread 154 + - Deploy to Fly.io free tier 155 + 156 + Phase 2: Web UI 157 + - Simple React/Vue form 158 + - Geolocation permission request 159 + - Show nearby flights on map 160 + - Subscription management 161 + 162 + Phase 3: Mobile Experience 163 + - PWA with service workers 164 + - Background location updates 165 + - Local notifications 166 + - Offline support 167 + 168 + ### Security Considerations 169 + - Rate limit FlightRadar API calls 170 + - Authenticate BlueSky handles 171 + - Validate location bounds 172 + - Prevent subscription spam 173 + - GDPR compliance for location data 174 + 175 + --- 176 + 177 + ### `find-longest-bsky-thread` 178 + 179 + Find the longest reply thread from a Bluesky post. 180 + 181 + Usage: 182 + 183 + ```bash 184 + ./find-longest-bsky-thread https://bsky.app/profile/nerditry.bsky.social/post/3lnofix5nlc23 185 + ``` 186 + 187 + Details: 188 + - uses [`atproto`](https://github.com/MarshalX/atproto) to fetch the thread 189 + - uses [`jinja2`](https://github.com/pallets/jinja) to render the thread 190 + 191 + --- 192 + 193 + ### `find-stale-bsky-follows` 194 + 195 + Find stale/inactive accounts among those you follow on Bluesky. 196 + 197 + Usage: 198 + 199 + ```bash 200 + ./find-stale-bsky-follows 201 + # or with custom inactivity threshold (days) 202 + ./find-stale-bsky-follows --days 180 203 + ``` 204 + 205 + Details: 206 + - uses [`atproto`](https://github.com/MarshalX/atproto) to fetch following list 207 + - uses [`rich`](https://github.com/Textualize/rich) for pretty output 208 + - identifies accounts with no recent posts 209 + 210 + --- 211 + 212 ### `kill-processes` 213 214 AI-powered TUI for killing processes. 
··· 219 ./kill-processes 220 ``` 221 222 + Details: 223 + - uses [`textual`](https://textual.textualize.io/) for the TUI 224 + - uses [`marvin`](https://github.com/prefecthq/marvin) (built on [`pydantic-ai`](https://github.com/pydantic/pydantic-ai)) to annotate processes 225 + 226 + --- 227 + 228 + ### `predict-github-stars` 229 + 230 + Predict when a GitHub repository will reach a target number of stars. 231 + 232 + Usage: 233 + ./predict-github-stars owner/repo 10000 234 + 235 + Details: 236 + - Uses GitHub REST API to fetch star history (with timestamps). 237 + - Fits polynomial regression (degree 1–3) to full history. 238 + - Falls back to recent-trend linear extrapolation if the polynomial 239 + cannot reach the target within ten years. 240 + - Shows recent growth rate and a caution for long-range estimates. 241 + - Requires `GITHUB_TOKEN` in the environment for higher rate limits (optional). 242 + 243 + --- 244 + 245 ### `update-lights` 246 247 Make some change to my phillips hue network of lights via agent + MCP server. ··· 251 ```bash 252 ./update-lights -m "turn on sahara in the living room and nightlight in the kitchen" 253 ``` 254 + 255 + Details: 256 + - uses a [`marvin`](https://github.com/prefecthq/marvin) (built on [`pydantic-ai`](https://github.com/pydantic/pydantic-ai)) agent 257 + - the agent spins up a [`fastmcp`](https://github.com/jlowin/fastmcp) MCP server that talks to my [`phue`](https://github.com/studioimaginaire/phue) bridge 258 + - set `HUE_BRIDGE_IP` and `HUE_BRIDGE_USERNAME` in `.env` or otherwise in environment 259 + - uses `OPENAI_API_KEY` by default, but you can set `AI_MODEL` in `.env` or otherwise in environment to use a different model 260 + 261 + --- 262 263 ### `update-readme` 264
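The `check-files-for-bad-links` details earlier in the README name `httpx` and `anyio` but not the code; as a rough sketch (not the script's actual implementation), bounded-concurrency link checking with those two libraries could look like this, with the limiter standing in for the script's default concurrency of 50:

```python
# Illustrative sketch only: check URLs concurrently with httpx + anyio,
# capping in-flight requests with a CapacityLimiter.
import anyio
import httpx


async def check_links(urls: list[str], concurrency: int = 50) -> dict[str, int]:
    results: dict[str, int] = {}
    limiter = anyio.CapacityLimiter(concurrency)

    async def check(client: httpx.AsyncClient, url: str) -> None:
        async with limiter:
            try:
                resp = await client.head(url, follow_redirects=True, timeout=10)
                results[url] = resp.status_code
            except httpx.HTTPError:
                results[url] = -1  # unreachable or otherwise bad link

    async with httpx.AsyncClient() as client:
        async with anyio.create_task_group() as tg:
            for url in urls:
                tg.start_soon(check, client, url)
    return results


# e.g. anyio.run(check_links, ["https://example.com", "https://example.org"])
```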
`analyze-github-followers` (new file, +366)
··· 1 + #!/usr/bin/env -S uv run --script --quiet 2 + # /// script 3 + # requires-python = ">=3.12" 4 + # dependencies = ["httpx", "rich", "pydantic-settings"] 5 + # /// 6 + """ 7 + analyze your github followers and following. 8 + 9 + usage: 10 + ./analyze-github-followers 11 + ./analyze-github-followers --summary-only # skip detailed analysis 12 + 13 + details: 14 + - uses github rest api to fetch followers/following 15 + - shows rich tables with follower stats 16 + - identifies mutual follows, notable followers, etc. 17 + - requires GITHUB_TOKEN in .env file 18 + """ 19 + 20 + from __future__ import annotations 21 + 22 + import argparse 23 + import os 24 + from datetime import datetime, timezone 25 + from typing import NamedTuple 26 + 27 + import httpx 28 + from pydantic import Field 29 + from pydantic_settings import BaseSettings, SettingsConfigDict 30 + from rich.console import Console 31 + from rich.panel import Panel 32 + from rich.progress import Progress, SpinnerColumn, TextColumn 33 + from rich.table import Table 34 + 35 + console = Console() 36 + 37 + 38 + class Settings(BaseSettings): 39 + """load settings from environment""" 40 + 41 + model_config = SettingsConfigDict( 42 + env_file=os.environ.get("ENV_FILE", ".env"), extra="ignore" 43 + ) 44 + github_token: str = Field(description="github api token") 45 + 46 + 47 + class GitHubUser(NamedTuple): 48 + """github user information""" 49 + 50 + login: str 51 + name: str | None 52 + bio: str | None 53 + followers: int 54 + following: int 55 + public_repos: int 56 + created_at: datetime 57 + url: str 58 + company: str | None 59 + location: str | None 60 + blog: str | None 61 + 62 + @property 63 + def follower_ratio(self) -> float: 64 + """ratio of followers to following (higher = more influential)""" 65 + if self.following == 0: 66 + return float(self.followers) if self.followers > 0 else 0.0 67 + return self.followers / self.following 68 + 69 + 70 + def _headers(token: str) -> dict[str, str]: 71 + return { 72 + "Accept": "application/vnd.github.v3+json", 73 + "Authorization": f"token {token}", 74 + } 75 + 76 + 77 + def get_authenticated_user(token: str) -> str: 78 + """get the authenticated user's login""" 79 + with httpx.Client() as client: 80 + r = client.get("https://api.github.com/user", headers=_headers(token)) 81 + r.raise_for_status() 82 + return r.json()["login"] 83 + 84 + 85 + def get_user_details(username: str, token: str) -> GitHubUser: 86 + """fetch detailed user information""" 87 + with httpx.Client() as client: 88 + r = client.get( 89 + f"https://api.github.com/users/{username}", headers=_headers(token) 90 + ) 91 + r.raise_for_status() 92 + data = r.json() 93 + return GitHubUser( 94 + login=data["login"], 95 + name=data.get("name"), 96 + bio=data.get("bio"), 97 + followers=data["followers"], 98 + following=data["following"], 99 + public_repos=data["public_repos"], 100 + created_at=datetime.fromisoformat(data["created_at"].replace("Z", "+00:00")), 101 + url=data["html_url"], 102 + company=data.get("company"), 103 + location=data.get("location"), 104 + blog=data.get("blog"), 105 + ) 106 + 107 + 108 + def get_all_followers(username: str, token: str) -> list[str]: 109 + """fetch all followers (just logins for now)""" 110 + followers = [] 111 + page = 1 112 + with httpx.Client() as client: 113 + while True: 114 + r = client.get( 115 + f"https://api.github.com/users/{username}/followers?page={page}&per_page=100", 116 + headers=_headers(token), 117 + ) 118 + r.raise_for_status() 119 + data = r.json() 120 + if not data: 121 + 
break 122 + followers.extend([user["login"] for user in data]) 123 + page += 1 124 + return followers 125 + 126 + 127 + def get_all_following(username: str, token: str) -> list[str]: 128 + """fetch all users being followed""" 129 + following = [] 130 + page = 1 131 + with httpx.Client() as client: 132 + while True: 133 + r = client.get( 134 + f"https://api.github.com/users/{username}/following?page={page}&per_page=100", 135 + headers=_headers(token), 136 + ) 137 + r.raise_for_status() 138 + data = r.json() 139 + if not data: 140 + break 141 + following.extend([user["login"] for user in data]) 142 + page += 1 143 + return following 144 + 145 + 146 + def main(): 147 + """main function to analyze github followers""" 148 + parser = argparse.ArgumentParser(description="analyze your github followers") 149 + parser.add_argument( 150 + "--summary-only", 151 + action="store_true", 152 + help="show summary only, skip detailed follower analysis", 153 + ) 154 + args = parser.parse_args() 155 + 156 + try: 157 + settings = Settings() # type: ignore 158 + except Exception as e: 159 + console.print(f"[red]error loading settings: {e}[/red]") 160 + console.print("[dim]ensure .env file exists with GITHUB_TOKEN[/dim]") 161 + return 162 + 163 + token = settings.github_token.strip() 164 + 165 + try: 166 + # get authenticated user 167 + username = get_authenticated_user(token) 168 + console.print(f"[blue]analyzing followers for @{username}[/blue]\n") 169 + 170 + # fetch user details 171 + with Progress( 172 + SpinnerColumn(), 173 + TextColumn("[progress.description]{task.description}"), 174 + console=console, 175 + ) as progress: 176 + task = progress.add_task("fetching your profile...", total=None) 177 + user = get_user_details(username, token) 178 + progress.update(task, completed=True) 179 + 180 + # show profile info 181 + profile_text = f"[bold cyan]@{user.login}[/bold cyan]" 182 + if user.name: 183 + profile_text += f" ({user.name})" 184 + profile_text += f"\n[dim]joined {user.created_at:%Y-%m-%d}[/dim]" 185 + if user.bio: 186 + profile_text += f"\n{user.bio}" 187 + if user.location: 188 + profile_text += f"\n๐Ÿ“ {user.location}" 189 + if user.company: 190 + profile_text += f"\n๐Ÿข {user.company}" 191 + 192 + console.print(Panel.fit(profile_text, border_style="blue")) 193 + console.print() 194 + 195 + # fetch followers and following 196 + with Progress( 197 + SpinnerColumn(), 198 + TextColumn("[progress.description]{task.description}"), 199 + console=console, 200 + ) as progress: 201 + task1 = progress.add_task("fetching followers...", total=None) 202 + followers = get_all_followers(username, token) 203 + progress.update(task1, completed=True) 204 + 205 + task2 = progress.add_task("fetching following...", total=None) 206 + following = get_all_following(username, token) 207 + progress.update(task2, completed=True) 208 + 209 + # analyze relationships 210 + followers_set = set(followers) 211 + following_set = set(following) 212 + 213 + mutual = followers_set & following_set 214 + followers_only = followers_set - following_set 215 + following_only = following_set - followers_set 216 + 217 + # summary table 218 + summary_table = Table(show_header=True, header_style="bold magenta") 219 + summary_table.add_column("metric", style="cyan") 220 + summary_table.add_column("count", justify="right", style="white") 221 + 222 + summary_table.add_row("total followers", str(len(followers))) 223 + summary_table.add_row("total following", str(len(following))) 224 + summary_table.add_row("mutual follows", 
f"[green]{len(mutual)}[/green]") 225 + summary_table.add_row( 226 + "followers not following back", f"[yellow]{len(followers_only)}[/yellow]" 227 + ) 228 + summary_table.add_row( 229 + "following but not following back", f"[red]{len(following_only)}[/red]" 230 + ) 231 + summary_table.add_row("public repos", str(user.public_repos)) 232 + 233 + console.print(summary_table) 234 + console.print() 235 + 236 + # fetch details for all followers 237 + if followers and not args.summary_only: 238 + console.print( 239 + f"\n[bold yellow]analyzing all {len(followers)} followers...[/bold yellow]" 240 + ) 241 + follower_details = [] 242 + 243 + with Progress( 244 + SpinnerColumn(), 245 + TextColumn("[progress.description]{task.description}"), 246 + console=console, 247 + ) as progress: 248 + task = progress.add_task( 249 + f"fetching follower details...", total=len(followers) 250 + ) 251 + 252 + for follower_login in followers: 253 + try: 254 + details = get_user_details(follower_login, token) 255 + follower_details.append(details) 256 + except Exception: 257 + pass # skip if we can't fetch details 258 + progress.advance(task) 259 + 260 + if follower_details: 261 + # most influential followers by follower ratio 262 + # filter out accounts with very few followers to avoid noise 263 + influential = [f for f in follower_details if f.followers >= 100] 264 + influential = sorted(influential, key=lambda u: u.follower_ratio, reverse=True)[:10] 265 + 266 + console.print() 267 + console.print("[bold magenta]most influential followers:[/bold magenta]") 268 + console.print("[dim]ranked by followers-to-following ratio[/dim]\n") 269 + followers_table = Table(show_header=True, header_style="bold magenta") 270 + followers_table.add_column("username", style="cyan") 271 + followers_table.add_column("name", style="white") 272 + followers_table.add_column("followers", justify="right", style="blue") 273 + followers_table.add_column("following", justify="right", style="yellow") 274 + followers_table.add_column("ratio", justify="right", style="green") 275 + followers_table.add_column("mutual", justify="center", style="magenta") 276 + 277 + for follower in influential: 278 + is_mutual = "โœ“" if follower.login in mutual else "" 279 + followers_table.add_row( 280 + f"@{follower.login}", 281 + follower.name or "[dim]no name[/dim]", 282 + f"{follower.followers:,}", 283 + f"{follower.following:,}", 284 + f"{follower.follower_ratio:.1f}x", 285 + is_mutual, 286 + ) 287 + 288 + console.print(followers_table) 289 + 290 + # location analysis 291 + locations = [ 292 + f.location for f in follower_details if f.location 293 + ] 294 + if locations: 295 + from collections import Counter 296 + 297 + location_counts = Counter(locations).most_common(5) 298 + console.print("\n[bold magenta]top follower locations:[/bold magenta]") 299 + location_table = Table(show_header=False) 300 + location_table.add_column("location", style="cyan") 301 + location_table.add_column("count", justify="right", style="white") 302 + for loc, count in location_counts: 303 + location_table.add_row(loc, str(count)) 304 + console.print(location_table) 305 + 306 + # company analysis 307 + companies = [ 308 + f.company.lstrip("@").strip() 309 + for f in follower_details 310 + if f.company 311 + ] 312 + if companies: 313 + from collections import Counter 314 + 315 + company_counts = Counter(companies).most_common(5) 316 + console.print("\n[bold magenta]top follower companies:[/bold magenta]") 317 + company_table = Table(show_header=False) 318 + 
company_table.add_column("company", style="cyan") 319 + company_table.add_column("count", justify="right", style="white") 320 + for comp, count in company_counts: 321 + company_table.add_row(comp, str(count)) 322 + console.print(company_table) 323 + 324 + # account age analysis 325 + now = datetime.now(timezone.utc) 326 + ages_years = [(now - f.created_at).days / 365.25 for f in follower_details] 327 + avg_age = sum(ages_years) / len(ages_years) 328 + oldest = min(follower_details, key=lambda f: f.created_at) 329 + newest = max(follower_details, key=lambda f: f.created_at) 330 + 331 + console.print("\n[bold magenta]account age stats:[/bold magenta]") 332 + age_table = Table(show_header=False) 333 + age_table.add_column("metric", style="cyan") 334 + age_table.add_column("value", style="white") 335 + age_table.add_row("average follower account age", f"{avg_age:.1f} years") 336 + age_table.add_row("oldest follower account", f"@{oldest.login} ({oldest.created_at:%Y-%m-%d})") 337 + age_table.add_row("newest follower account", f"@{newest.login} ({newest.created_at:%Y-%m-%d})") 338 + console.print(age_table) 339 + 340 + # repo stats 341 + repo_counts = [f.public_repos for f in follower_details] 342 + avg_repos = sum(repo_counts) / len(repo_counts) 343 + most_repos = max(follower_details, key=lambda f: f.public_repos) 344 + 345 + console.print("\n[bold magenta]repository stats:[/bold magenta]") 346 + repo_table = Table(show_header=False) 347 + repo_table.add_column("metric", style="cyan") 348 + repo_table.add_column("value", style="white") 349 + repo_table.add_row("average repos per follower", f"{avg_repos:.1f}") 350 + repo_table.add_row("follower with most repos", f"@{most_repos.login} ({most_repos.public_repos:,} repos)") 351 + repo_table.add_row("followers with 0 repos", str(sum(1 for f in follower_details if f.public_repos == 0))) 352 + console.print(repo_table) 353 + 354 + except httpx.HTTPStatusError as e: 355 + if e.response.status_code == 401: 356 + console.print("[red]error: invalid github token[/red]") 357 + elif e.response.status_code == 403: 358 + console.print("[red]error: rate limit exceeded[/red]") 359 + else: 360 + console.print(f"[red]github api error: {e.response.status_code}[/red]") 361 + except Exception as e: 362 + console.print(f"[red]error: {e}[/red]") 363 + 364 + 365 + if __name__ == "__main__": 366 + main()
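As the docstring says, the script expects `GITHUB_TOKEN` in a `.env` file (loaded via `pydantic-settings`); a minimal setup with a placeholder token:

```bash
# placeholder token value
echo 'GITHUB_TOKEN=ghp_xxxxxxxxxxxxxxxxxxxx' > .env

# summary tables only, skipping the per-follower detail pass
./analyze-github-followers --summary-only
```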
`dm-me-when-a-flight-passes-over` (new file, +591)
··· 1 + #!/usr/bin/env -S uv run --script --quiet 2 + # /// script 3 + # requires-python = ">=3.12" 4 + # dependencies = ["atproto", "pydantic-settings", "geopy", "httpx", "jinja2"] 5 + # /// 6 + """ 7 + Monitor flights passing overhead and send BlueSky DMs. 8 + 9 + Usage: 10 + # Single user mode (backward compatible) 11 + ./dm-me-when-a-flight-passes-over 12 + 13 + # Multi-subscriber mode with JSON file 14 + ./dm-me-when-a-flight-passes-over --subscribers subscribers.json 15 + 16 + # Multi-subscriber mode with stdin 17 + echo '[{"handle": "user1.bsky.social", "latitude": 41.8781, "longitude": -87.6298, "radius_miles": 5}]' | ./dm-me-when-a-flight-passes-over --subscribers - 18 + 19 + This script monitors flights within a configurable radius and sends DMs on BlueSky 20 + when flights pass overhead. Supports multiple subscribers with different locations. 21 + 22 + ## Future Architecture Ideas 23 + 24 + ### Web App Deployment Options 25 + 26 + 1. **FastAPI + Fly.io/Railway/Render** 27 + - REST API with endpoints: 28 + - POST /subscribe - Register user with BlueSky handle 29 + - DELETE /unsubscribe - Remove subscription 30 + - POST /update-location - Update user's location 31 + - GET /status - Check subscription status 32 + - Background worker using Celery/RQ/APScheduler 33 + - PostgreSQL/SQLite for subscriber persistence 34 + - Redis for caching flight data & deduplication 35 + 36 + 2. **Vercel/Netlify Edge Functions** 37 + - Serverless approach with scheduled cron jobs 38 + - Use Vercel KV or Upstash Redis for state 39 + - Challenge: Long-running monitoring needs workarounds 40 + - Solution: Trigger checks via cron every minute 41 + 42 + 3. **Self-Hosted with ngrok/Cloudflare Tunnel** 43 + - Quick prototype option 44 + - Run this script as daemon 45 + - Expose simple Flask/FastAPI wrapper 46 + - Security concerns: rate limiting, auth required 47 + 48 + ### Mobile/Browser Integration 49 + 50 + 1. **Progressive Web App (PWA)** 51 + - Service worker for background location updates 52 + - Geolocation API for current position 53 + - Push notifications instead of/alongside DMs 54 + - IndexedDB for offline capability 55 + 56 + 2. **iOS Shortcuts Integration** 57 + - Create shortcut that gets location 58 + - Calls webhook with location + BlueSky handle 59 + - Could run automatically based on focus modes 60 + 61 + 3. **Browser Extension** 62 + - Background script polls location 63 + - Lighter weight than full app 64 + - Cross-platform solution 65 + 66 + ### Architecture Components 67 + 68 + 1. **Location Services Layer** 69 + - Browser Geolocation API 70 + - IP-based geolocation fallback 71 + - Manual location picker UI 72 + - Privacy: Only send location when checking flights 73 + 74 + 2. **Notification Options** 75 + - BlueSky DMs (current) 76 + - Web Push Notifications 77 + - Webhooks to other services 78 + - Email/SMS via Twilio/SendGrid 79 + 80 + 3. **Subscription Management** 81 + - OAuth with BlueSky for auth 82 + - User preferences: radius, notification types 83 + - Quiet hours/Do Not Disturb 84 + - Rate limiting per user 85 + 86 + 4. 
**Data Optimization** 87 + - Cache FlightRadar API responses 88 + - Batch location updates 89 + - Aggregate nearby users for efficiency 90 + - WebSocket for real-time updates 91 + 92 + ### Implementation Approach 93 + 94 + Phase 1: Web API Wrapper 95 + - FastAPI with /subscribe endpoint 96 + - SQLite for subscribers 97 + - Run monitoring in background thread 98 + - Deploy to Fly.io free tier 99 + 100 + Phase 2: Web UI 101 + - Simple React/Vue form 102 + - Geolocation permission request 103 + - Show nearby flights on map 104 + - Subscription management 105 + 106 + Phase 3: Mobile Experience 107 + - PWA with service workers 108 + - Background location updates 109 + - Local notifications 110 + - Offline support 111 + 112 + ### Security Considerations 113 + - Rate limit FlightRadar API calls 114 + - Authenticate BlueSky handles 115 + - Validate location bounds 116 + - Prevent subscription spam 117 + - GDPR compliance for location data 118 + """ 119 + 120 + import argparse 121 + import time 122 + import math 123 + import json 124 + import sys 125 + from datetime import datetime 126 + from concurrent.futures import ThreadPoolExecutor, as_completed 127 + 128 + import httpx 129 + from atproto import Client 130 + from geopy import distance 131 + from jinja2 import Template 132 + from pydantic import BaseModel, Field 133 + from pydantic_settings import BaseSettings, SettingsConfigDict 134 + 135 + 136 + class Settings(BaseSettings): 137 + """App settings loaded from environment variables""" 138 + 139 + model_config = SettingsConfigDict(env_file=".env", extra="ignore") 140 + 141 + bsky_handle: str = Field(...) 142 + bsky_password: str = Field(...) 143 + flightradar_api_token: str = Field(...) 144 + 145 + 146 + class Subscriber(BaseModel): 147 + """Subscriber with location and notification preferences""" 148 + 149 + handle: str 150 + latitude: float 151 + longitude: float 152 + radius_miles: float = 5.0 153 + filters: dict[str, list[str]] = Field(default_factory=dict) 154 + message_template: str | None = None 155 + 156 + 157 + class Flight(BaseModel): 158 + """Flight data model""" 159 + 160 + hex: str 161 + latitude: float 162 + longitude: float 163 + altitude: float | None = None 164 + ground_speed: float | None = None 165 + heading: float | None = None 166 + aircraft_type: str | None = None 167 + registration: str | None = None 168 + origin: str | None = None 169 + destination: str | None = None 170 + callsign: str | None = None 171 + distance_miles: float 172 + 173 + 174 + def get_flights_in_area( 175 + settings: Settings, latitude: float, longitude: float, radius_miles: float 176 + ) -> list[Flight]: 177 + """Get flights within the specified radius using FlightRadar24 API.""" 178 + lat_offset = radius_miles / 69 # 1 degree latitude โ‰ˆ 69 miles 179 + lon_offset = radius_miles / (69 * abs(math.cos(math.radians(latitude)))) 180 + 181 + bounds = { 182 + "north": latitude + lat_offset, 183 + "south": latitude - lat_offset, 184 + "west": longitude - lon_offset, 185 + "east": longitude + lon_offset, 186 + } 187 + 188 + headers = { 189 + "Authorization": f"Bearer {settings.flightradar_api_token}", 190 + "Accept": "application/json", 191 + "Accept-Version": "v1", 192 + } 193 + 194 + url = "https://fr24api.flightradar24.com/api/live/flight-positions/full" 195 + params = { 196 + "bounds": f"{bounds['north']},{bounds['south']},{bounds['west']},{bounds['east']}" 197 + } 198 + 199 + try: 200 + with httpx.Client() as client: 201 + response = client.get(url, headers=headers, params=params, timeout=10) 202 + 
response.raise_for_status() 203 + data = response.json() 204 + 205 + flights_in_radius = [] 206 + center = (latitude, longitude) 207 + 208 + if isinstance(data, dict) and "data" in data: 209 + for flight_data in data["data"]: 210 + lat = flight_data.get("lat") 211 + lon = flight_data.get("lon") 212 + 213 + if lat and lon: 214 + flight_pos = (lat, lon) 215 + dist = distance.distance(center, flight_pos).miles 216 + if dist <= radius_miles: 217 + flight = Flight( 218 + hex=flight_data.get("fr24_id", ""), 219 + latitude=lat, 220 + longitude=lon, 221 + altitude=flight_data.get("alt"), 222 + ground_speed=flight_data.get("gspeed"), 223 + heading=flight_data.get("track"), 224 + aircraft_type=flight_data.get("type"), 225 + registration=flight_data.get("reg"), 226 + origin=flight_data.get("orig_iata"), 227 + destination=flight_data.get("dest_iata"), 228 + callsign=flight_data.get("flight"), 229 + distance_miles=round(dist, 2), 230 + ) 231 + flights_in_radius.append(flight) 232 + 233 + return flights_in_radius 234 + except httpx.HTTPStatusError as e: 235 + print(f"HTTP error fetching flights: {e}") 236 + print(f"Response status: {e.response.status_code}") 237 + print(f"Response content: {e.response.text[:500]}") 238 + return [] 239 + except Exception as e: 240 + print(f"Error fetching flights: {e}") 241 + return [] 242 + 243 + 244 + DEFAULT_MESSAGE_TEMPLATE = """โœˆ๏ธ Flight passing overhead! 245 + 246 + Flight: {{ flight.callsign or 'Unknown' }} 247 + Distance: {{ flight.distance_miles }} miles 248 + {%- if flight.altitude %} 249 + Altitude: {{ "{:,.0f}".format(flight.altitude) }} ft 250 + {%- endif %} 251 + {%- if flight.ground_speed %} 252 + Speed: {{ "{:.0f}".format(flight.ground_speed) }} kts 253 + {%- endif %} 254 + {%- if flight.heading %} 255 + Heading: {{ "{:.0f}".format(flight.heading) }}ยฐ 256 + {%- endif %} 257 + {%- if flight.aircraft_type %} 258 + Aircraft: {{ flight.aircraft_type }} 259 + {%- endif %} 260 + {%- if flight.origin or flight.destination %} 261 + Route: {{ flight.origin or '???' }} โ†’ {{ flight.destination or '???' }} 262 + {%- endif %} 263 + 264 + Time: {{ timestamp }}""" 265 + 266 + 267 + def format_flight_info(flight: Flight, template_str: str | None = None) -> str: 268 + """Format flight information for a DM using Jinja2 template.""" 269 + template_str = template_str or DEFAULT_MESSAGE_TEMPLATE 270 + template = Template(template_str) 271 + 272 + return template.render( 273 + flight=flight, 274 + timestamp=datetime.now().strftime('%H:%M:%S') 275 + ) 276 + 277 + 278 + def send_dm(client: Client, message: str, target_handle: str) -> bool: 279 + """Send a direct message to the specified handle on BlueSky.""" 280 + try: 281 + resolved = client.com.atproto.identity.resolve_handle( 282 + params={"handle": target_handle} 283 + ) 284 + target_did = resolved.did 285 + 286 + chat_client = client.with_bsky_chat_proxy() 287 + 288 + convo_response = chat_client.chat.bsky.convo.get_convo_for_members( 289 + {"members": [target_did]} 290 + ) 291 + 292 + if not convo_response or not convo_response.convo: 293 + print(f"Could not create/get conversation with {target_handle}") 294 + return False 295 + 296 + recipient = None 297 + for member in convo_response.convo.members: 298 + if member.did != client.me.did: 299 + recipient = member 300 + break 301 + 302 + if not recipient or recipient.handle != target_handle: 303 + print( 304 + f"ERROR: About to message wrong person! 
Expected {target_handle}, but found {recipient.handle if recipient else 'no recipient'}" 305 + ) 306 + return False 307 + 308 + chat_client.chat.bsky.convo.send_message( 309 + data={ 310 + "convoId": convo_response.convo.id, 311 + "message": {"text": message, "facets": []}, 312 + } 313 + ) 314 + 315 + print(f"DM sent to {target_handle}") 316 + return True 317 + 318 + except Exception as e: 319 + print(f"Error sending DM to {target_handle}: {e}") 320 + return False 321 + 322 + 323 + def flight_matches_filters(flight: Flight, filters: dict[str, list[str]]) -> bool: 324 + """Check if a flight matches the subscriber's filters.""" 325 + if not filters: 326 + return True 327 + 328 + for field, allowed_values in filters.items(): 329 + if not allowed_values: 330 + continue 331 + 332 + flight_value = getattr(flight, field, None) 333 + if flight_value is None: 334 + return False 335 + 336 + if field == "aircraft_type": 337 + # Case-insensitive partial matching for aircraft types 338 + flight_value_lower = str(flight_value).lower() 339 + if not any(allowed.lower() in flight_value_lower for allowed in allowed_values): 340 + return False 341 + else: 342 + # Exact matching for other fields 343 + if str(flight_value) not in [str(v) for v in allowed_values]: 344 + return False 345 + 346 + return True 347 + 348 + 349 + def process_subscriber( 350 + client: Client, 351 + settings: Settings, 352 + subscriber: Subscriber, 353 + notified_flights: dict[str, set[str]], 354 + ) -> None: 355 + """Process flights for a single subscriber.""" 356 + try: 357 + flights = get_flights_in_area( 358 + settings, subscriber.latitude, subscriber.longitude, subscriber.radius_miles 359 + ) 360 + 361 + if subscriber.handle not in notified_flights: 362 + notified_flights[subscriber.handle] = set() 363 + 364 + subscriber_notified = notified_flights[subscriber.handle] 365 + filtered_count = 0 366 + 367 + for flight in flights: 368 + flight_id = flight.hex 369 + 370 + if not flight_matches_filters(flight, subscriber.filters): 371 + filtered_count += 1 372 + continue 373 + 374 + if flight_id not in subscriber_notified: 375 + message = format_flight_info(flight, subscriber.message_template) 376 + print(f"\n[{subscriber.handle}] {message}\n") 377 + 378 + if send_dm(client, message, subscriber.handle): 379 + print(f"DM sent to {subscriber.handle} for flight {flight_id}") 380 + subscriber_notified.add(flight_id) 381 + else: 382 + print( 383 + f"Failed to send DM to {subscriber.handle} for flight {flight_id}" 384 + ) 385 + 386 + current_flight_ids = {f.hex for f in flights} 387 + notified_flights[subscriber.handle] &= current_flight_ids 388 + 389 + if not flights: 390 + print( 391 + f"[{subscriber.handle}] No flights in range at {datetime.now().strftime('%H:%M:%S')}" 392 + ) 393 + elif filtered_count > 0 and filtered_count == len(flights): 394 + print( 395 + f"[{subscriber.handle}] {filtered_count} flights filtered out at {datetime.now().strftime('%H:%M:%S')}" 396 + ) 397 + 398 + except Exception as e: 399 + print(f"Error processing subscriber {subscriber.handle}: {e}") 400 + 401 + 402 + def load_subscribers(subscribers_input: str | None) -> list[Subscriber]: 403 + """Load subscribers from JSON file or stdin.""" 404 + if subscribers_input: 405 + with open(subscribers_input, "r") as f: 406 + data = json.load(f) 407 + else: 408 + print("Reading subscriber data from stdin (provide JSON array)...") 409 + data = json.load(sys.stdin) 410 + 411 + return [Subscriber(**item) for item in data] 412 + 413 + 414 + def main(): 415 + """Main monitoring 
loop.""" 416 + parser = argparse.ArgumentParser( 417 + description="Monitor flights overhead and send BlueSky DMs" 418 + ) 419 + 420 + parser.add_argument( 421 + "--subscribers", 422 + type=str, 423 + help="JSON file with subscriber list, or '-' for stdin", 424 + ) 425 + parser.add_argument( 426 + "--latitude", type=float, default=41.8781, help="Latitude (default: Chicago)" 427 + ) 428 + parser.add_argument( 429 + "--longitude", type=float, default=-87.6298, help="Longitude (default: Chicago)" 430 + ) 431 + parser.add_argument( 432 + "--radius", type=float, default=5.0, help="Radius in miles (default: 5)" 433 + ) 434 + parser.add_argument( 435 + "--handle", 436 + type=str, 437 + default="alternatebuild.dev", 438 + help="BlueSky handle to DM (default: alternatebuild.dev)", 439 + ) 440 + parser.add_argument( 441 + "--filter-aircraft-type", 442 + type=str, 443 + nargs="+", 444 + help="Filter by aircraft types (e.g., B737 A320 C172)", 445 + ) 446 + parser.add_argument( 447 + "--filter-callsign", 448 + type=str, 449 + nargs="+", 450 + help="Filter by callsigns (e.g., UAL DL AAL)", 451 + ) 452 + parser.add_argument( 453 + "--filter-origin", 454 + type=str, 455 + nargs="+", 456 + help="Filter by origin airports (e.g., ORD LAX JFK)", 457 + ) 458 + parser.add_argument( 459 + "--filter-destination", 460 + type=str, 461 + nargs="+", 462 + help="Filter by destination airports (e.g., ORD LAX JFK)", 463 + ) 464 + parser.add_argument( 465 + "--message-template", 466 + type=str, 467 + help="Custom Jinja2 template for messages", 468 + ) 469 + parser.add_argument( 470 + "--message-template-file", 471 + type=str, 472 + help="Path to file containing custom Jinja2 template", 473 + ) 474 + parser.add_argument( 475 + "--interval", 476 + type=int, 477 + default=60, 478 + help="Check interval in seconds (default: 60)", 479 + ) 480 + parser.add_argument( 481 + "--once", action="store_true", help="Run once and exit (for testing)" 482 + ) 483 + parser.add_argument( 484 + "--max-workers", 485 + type=int, 486 + default=5, 487 + help="Max concurrent workers for processing subscribers (default: 5)", 488 + ) 489 + args = parser.parse_args() 490 + 491 + try: 492 + settings = Settings() 493 + except Exception as e: 494 + print(f"Error loading settings: {e}") 495 + print( 496 + "Ensure .env file exists with BSKY_HANDLE, BSKY_PASSWORD, and FLIGHTRADAR_API_TOKEN" 497 + ) 498 + return 499 + 500 + client = Client() 501 + try: 502 + client.login(settings.bsky_handle, settings.bsky_password) 503 + print(f"Logged in to BlueSky as {settings.bsky_handle}") 504 + except Exception as e: 505 + print(f"Error logging into BlueSky: {e}") 506 + return 507 + 508 + if args.subscribers: 509 + if args.subscribers == "-": 510 + subscribers_input = None 511 + else: 512 + subscribers_input = args.subscribers 513 + 514 + try: 515 + subscribers = load_subscribers(subscribers_input) 516 + print(f"Loaded {len(subscribers)} subscriber(s)") 517 + except Exception as e: 518 + print(f"Error loading subscribers: {e}") 519 + return 520 + else: 521 + # Build filters from CLI args 522 + filters = {} 523 + if args.filter_aircraft_type: 524 + filters["aircraft_type"] = args.filter_aircraft_type 525 + if args.filter_callsign: 526 + filters["callsign"] = args.filter_callsign 527 + if args.filter_origin: 528 + filters["origin"] = args.filter_origin 529 + if args.filter_destination: 530 + filters["destination"] = args.filter_destination 531 + 532 + # Load custom template if provided 533 + message_template = None 534 + if args.message_template_file: 535 + with 
open(args.message_template_file, "r") as f: 536 + message_template = f.read() 537 + elif args.message_template: 538 + message_template = args.message_template 539 + 540 + subscribers = [ 541 + Subscriber( 542 + handle=args.handle, 543 + latitude=args.latitude, 544 + longitude=args.longitude, 545 + radius_miles=args.radius, 546 + filters=filters, 547 + message_template=message_template, 548 + ) 549 + ] 550 + print( 551 + f"Monitoring flights within {args.radius} miles of ({args.latitude}, {args.longitude}) for {args.handle}" 552 + ) 553 + if filters: 554 + print(f"Active filters: {filters}") 555 + 556 + print(f"Checking every {args.interval} seconds...") 557 + 558 + notified_flights: dict[str, set[str]] = {} 559 + 560 + while True: 561 + try: 562 + with ThreadPoolExecutor(max_workers=args.max_workers) as executor: 563 + futures = [] 564 + for subscriber in subscribers: 565 + future = executor.submit( 566 + process_subscriber, 567 + client, 568 + settings, 569 + subscriber, 570 + notified_flights, 571 + ) 572 + futures.append(future) 573 + 574 + for future in as_completed(futures): 575 + future.result() 576 + 577 + if args.once: 578 + break 579 + 580 + time.sleep(args.interval) 581 + 582 + except KeyboardInterrupt: 583 + print("\nStopping flight monitor...") 584 + break 585 + except Exception as e: 586 + print(f"Error in monitoring loop: {e}") 587 + time.sleep(args.interval) 588 + 589 + 590 + if __name__ == "__main__": 591 + main()
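For multi-subscriber mode, the JSON mirrors the fields of the `Subscriber` model above (`handle`, `latitude`, `longitude`, `radius_miles`, plus optional `filters` and `message_template`); a hypothetical `subscribers.json` with made-up handles and coordinates:

```bash
cat > subscribers.json <<'EOF'
[
  {"handle": "user1.bsky.social", "latitude": 41.8781, "longitude": -87.6298, "radius_miles": 5,
   "filters": {"aircraft_type": ["B737", "A320"]}},
  {"handle": "user2.bsky.social", "latitude": 40.7128, "longitude": -74.0060, "radius_miles": 10}
]
EOF

./dm-me-when-a-flight-passes-over --subscribers subscribers.json --once
```

`--once` does a single pass and exits, which is handy for checking the subscriber file before leaving the monitor running.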
`find-longest-bsky-thread` (new file, +237)
··· 1 + #!/usr/bin/env -S uv run --script --quiet 2 + # /// script 3 + # requires-python = ">=3.12" 4 + # dependencies = ["atproto", "jinja2", "pydantic-settings"] 5 + # /// 6 + """ 7 + Find the longest reply thread from a Bluesky post. 8 + 9 + Usage: 10 + 11 + ```bash 12 + ./find-longest-bsky-thread https://bsky.app/profile/nerditry.bsky.social/post/3lnofix5nlc23 13 + ``` 14 + 15 + Details: 16 + - uses [`atproto`](https://github.com/MarshalX/atproto) to fetch the thread 17 + - uses [`jinja2`](https://github.com/pallets/jinja) to render the thread 18 + """ 19 + 20 + import argparse 21 + import os 22 + from datetime import datetime 23 + from typing import Any 24 + 25 + from atproto import Client 26 + from atproto.exceptions import BadRequestError 27 + from atproto_client.models.app.bsky.feed.defs import ThreadViewPost 28 + from jinja2 import Environment 29 + from pydantic_settings import BaseSettings, SettingsConfigDict 30 + 31 + 32 + class Settings(BaseSettings): 33 + """App settings loaded from environment variables""" 34 + 35 + model_config = SettingsConfigDict( 36 + env_file=os.environ.get("ENV_FILE", ".env"), extra="ignore" 37 + ) 38 + 39 + bsky_handle: str 40 + bsky_password: str 41 + bsky_pds_url: str = "https://bsky.social" 42 + 43 + 44 + def extract_post_uri(bluesky_url: str) -> str: 45 + """Extract the AT URI from a Bluesky post URL""" 46 + import re 47 + 48 + pattern = r"https?://bsky\.app/profile/([^/]+)/post/([a-zA-Z0-9]+)" 49 + match = re.match(pattern, bluesky_url) 50 + if not match: 51 + raise ValueError(f"Invalid Bluesky URL format: {bluesky_url}") 52 + profile_did_or_handle = match.group(1) 53 + post_id = match.group(2) 54 + 55 + # We need the DID, not necessarily the handle, for the URI 56 + # However, getPostThread seems to work with handles too, but let's be robust 57 + # For now, we construct the URI assuming the input might be a handle or DID 58 + # A more robust solution would resolve the handle to a DID if needed. 59 + # Let's try constructing a basic URI first. `get_post_thread` might handle resolution. 
60 + return f"at://{profile_did_or_handle}/app.bsky.feed.post/{post_id}" 61 + 62 + 63 + def get_thread(client: Client, post_uri: str) -> ThreadViewPost | None: 64 + """Fetch the full thread view for a given post URI.""" 65 + # Slightly reduced depth, as we might fetch sub-threads explicitly 66 + depth = 50 67 + # Parent height arguably less crucial for finding the *longest child* path 68 + parent_height = 2 69 + try: 70 + response = client.app.bsky.feed.get_post_thread( 71 + {"uri": post_uri, "depth": depth, "parent_height": parent_height} 72 + ) 73 + if isinstance(response.thread, ThreadViewPost): 74 + return response.thread 75 + else: 76 + # Handle cases where the post is not found, blocked, or deleted 77 + # Suppress print for non-root calls later if needed 78 + print( 79 + f"Could not fetch thread or it's not a standard post thread: {post_uri}" 80 + ) 81 + return None 82 + except BadRequestError as e: 83 + print(f"Error fetching thread {post_uri}: {e}") 84 + return None 85 + except Exception as e: 86 + print(f"An unexpected error occurred fetching thread {post_uri}: {e}") 87 + return None 88 + 89 + 90 + def find_longest_thread_path( 91 + client: Client, thread: ThreadViewPost | None 92 + ) -> list[ThreadViewPost]: 93 + """Find the longest path of replies starting from the given thread view.""" 94 + if not thread or not isinstance(thread, ThreadViewPost) or not thread.post: 95 + # Base case: Invalid or deleted/blocked post in the middle of a thread 96 + return [] 97 + 98 + longest_reply_extension: list[ThreadViewPost] = [] 99 + max_len = 0 100 + 101 + # Use replies from the current view, but potentially refresh if they seem incomplete 102 + replies_to_check = thread.replies if thread.replies else [] 103 + 104 + for reply_view in replies_to_check: 105 + # Recurse only on valid ThreadViewPost replies 106 + if isinstance(reply_view, ThreadViewPost) and reply_view.post: 107 + current_reply_view = reply_view 108 + 109 + # If this reply has no children loaded, try fetching its thread directly 110 + if not current_reply_view.replies: 111 + # Check if the post *claims* to have replies (optional optimization, needs PostView check) 112 + # For simplicity now, just always try fetching if replies are empty. 113 + fetched_reply_view = get_thread(client, current_reply_view.post.uri) 114 + if fetched_reply_view and fetched_reply_view.replies: 115 + current_reply_view = fetched_reply_view # Use the richer view 116 + 117 + # Now recurse with the potentially updated view 118 + recursive_path = find_longest_thread_path(client, current_reply_view) 119 + if len(recursive_path) > max_len: 120 + max_len = len(recursive_path) 121 + longest_reply_extension = recursive_path 122 + 123 + # The full path includes the current post + the longest path found among its replies 124 + return [thread] + longest_reply_extension 125 + 126 + 127 + def format_post_for_template(post_view: ThreadViewPost) -> dict[str, Any] | None: 128 + """Extract relevant data from a ThreadViewPost for template rendering.""" 129 + if not post_view or not post_view.post: 130 + return None 131 + 132 + post = post_view.post 133 + record = post.record 134 + 135 + # Attempt to parse the timestamp 136 + timestamp_str = getattr(record, "created_at", None) 137 + timestamp_dt = None 138 + if timestamp_str: 139 + try: 140 + # Handle different possible ISO 8601 formats from Bluesky 141 + if "." 
in timestamp_str and "Z" in timestamp_str: 142 + # Format like 2024-07-26T15:07:19.123Z 143 + timestamp_dt = datetime.fromisoformat( 144 + timestamp_str.replace("Z", "+00:00") 145 + ) 146 + else: 147 + # Potentially other formats, add more parsing if needed 148 + print(f"Warning: Unrecognized timestamp format {timestamp_str}") 149 + timestamp_dt = None # Or handle error appropriately 150 + except ValueError: 151 + print(f"Warning: Could not parse timestamp {timestamp_str}") 152 + timestamp_dt = None 153 + 154 + return { 155 + "author": post.author.handle, 156 + "text": getattr(record, "text", "[No text content]"), 157 + "timestamp": timestamp_dt.strftime("%Y-%m-%d %H:%M:%S UTC") 158 + if timestamp_dt 159 + else "[Unknown time]", 160 + "uri": post.uri, 161 + "cid": post.cid, 162 + } 163 + 164 + 165 + def main(post_url: str, template_str: str): 166 + """Main function to find and render the longest thread.""" 167 + try: 168 + settings = Settings() # type: ignore 169 + except Exception as e: 170 + print( 171 + f"Error loading settings (ensure .env file exists with BSKY_HANDLE and BSKY_PASSWORD): {e}" 172 + ) 173 + return 174 + 175 + client = Client(base_url=settings.bsky_pds_url) 176 + try: 177 + client.login(settings.bsky_handle, settings.bsky_password) 178 + except Exception as e: 179 + print(f"Error logging into Bluesky: {e}") 180 + return 181 + 182 + try: 183 + post_uri = extract_post_uri(post_url) 184 + except ValueError as e: 185 + print(e) 186 + return 187 + 188 + print(f"Fetching thread for: {post_uri}") 189 + root_thread_view = get_thread(client, post_uri) 190 + 191 + if not root_thread_view: 192 + print("Failed to fetch the root post thread.") 193 + return 194 + 195 + # --- Finding the longest path --- 196 + print("Finding the longest thread path...") 197 + longest_path_views = find_longest_thread_path(client, root_thread_view) 198 + print(f"Found {len(longest_path_views)} post(s) in the longest path.") 199 + # --- End Finding --- 200 + 201 + thread_data = [ 202 + data 203 + for view in longest_path_views 204 + if (data := format_post_for_template(view)) is not None 205 + ] 206 + 207 + if not thread_data: 208 + print("No valid posts found in the path to render.") 209 + return 210 + 211 + # Render using Jinja 212 + environment = Environment() 213 + template = environment.from_string(template_str) 214 + output = template.render(posts=thread_data) 215 + 216 + print("\\n--- Rendered Thread ---") 217 + print(output) 218 + print("--- End Rendered Thread ---") 219 + 220 + 221 + if __name__ == "__main__": 222 + parser = argparse.ArgumentParser( 223 + description="Find and render the longest reply thread from a Bluesky post." 224 + ) 225 + parser.add_argument("post_url", help="The URL of the starting Bluesky post.") 226 + args = parser.parse_args() 227 + 228 + # Default Jinja Template 229 + default_template = """ 230 + {% for post in posts %} 231 + {{ loop.index }}. {{ post.author }} at {{ post.timestamp }} 232 + URI: {{ post.uri }} 233 + Text: {{ post.text | indent(width=4, first=false) }} 234 + {% endfor %} 235 + """ 236 + 237 + main(args.post_url, default_template)
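For reference, `extract_post_uri` is a purely textual rewrite of the bsky.app URL (it does not resolve the handle to a DID, as the comment above notes), so the README's example URL maps like this:

```python
extract_post_uri("https://bsky.app/profile/nerditry.bsky.social/post/3lnofix5nlc23")
# -> "at://nerditry.bsky.social/app.bsky.feed.post/3lnofix5nlc23"
```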
`find-stale-bsky-follows` (new file, +268)
··· 1 + #!/usr/bin/env -S uv run --script --quiet 2 + # /// script 3 + # requires-python = ">=3.12" 4 + # dependencies = ["atproto", "pydantic-settings", "rich"] 5 + # /// 6 + """ 7 + Find stale/inactive accounts among those you follow on Bluesky. 8 + 9 + Usage: 10 + 11 + ```bash 12 + ./find-stale-bsky-follows 13 + # or with custom inactivity threshold (days) 14 + ./find-stale-bsky-follows --days 180 15 + ``` 16 + 17 + Details: 18 + - uses [`atproto`](https://github.com/MarshalX/atproto) to fetch following list 19 + - uses [`rich`](https://github.com/Textualize/rich) for pretty output 20 + - identifies accounts with no recent posts 21 + """ 22 + 23 + import argparse 24 + import os 25 + from datetime import datetime, timedelta, timezone 26 + from typing import NamedTuple 27 + 28 + from atproto import Client 29 + from pydantic_settings import BaseSettings, SettingsConfigDict 30 + from rich.console import Console 31 + from rich.progress import Progress, SpinnerColumn, TextColumn 32 + from rich.table import Table 33 + 34 + 35 + class Settings(BaseSettings): 36 + """App settings loaded from environment variables""" 37 + 38 + model_config = SettingsConfigDict( 39 + env_file=os.environ.get("ENV_FILE", ".env"), extra="ignore" 40 + ) 41 + 42 + bsky_handle: str 43 + bsky_password: str 44 + bsky_pds_url: str = "https://bsky.social" 45 + 46 + 47 + class AccountActivity(NamedTuple): 48 + """Activity information for a Bluesky account""" 49 + 50 + handle: str 51 + display_name: str | None 52 + did: str 53 + posts_count: int 54 + last_post_date: datetime | None 55 + days_inactive: int | None 56 + is_stale: bool 57 + 58 + 59 + def get_following_list(client: Client) -> list[dict]: 60 + """Fetch all accounts the authenticated user follows""" 61 + following = [] 62 + cursor = None 63 + 64 + while True: 65 + assert client.me, "client.me should be set" 66 + response = client.get_follows(client.me.did, cursor=cursor) 67 + following.extend(response.follows) 68 + 69 + if not response.cursor: 70 + break 71 + cursor = response.cursor 72 + 73 + return following 74 + 75 + 76 + def check_account_activity( 77 + client: Client, actor: dict, inactivity_threshold_days: int 78 + ) -> AccountActivity: 79 + """ 80 + Check the activity of a single account. 
81 + 82 + Returns AccountActivity with stale status based on: 83 + - No posts at all 84 + - Last post older than threshold 85 + """ 86 + handle = actor.handle 87 + did = actor.did 88 + display_name = getattr(actor, "display_name", None) 89 + 90 + try: 91 + # Get the user's profile to check post count 92 + profile = client.get_profile(handle) 93 + posts_count = profile.posts_count or 0 94 + 95 + # If no posts, immediately mark as stale 96 + if posts_count == 0: 97 + return AccountActivity( 98 + handle=handle, 99 + display_name=display_name, 100 + did=did, 101 + posts_count=0, 102 + last_post_date=None, 103 + days_inactive=None, 104 + is_stale=True, 105 + ) 106 + 107 + # Get author feed to find last post 108 + feed_response = client.get_author_feed(actor=handle, limit=1) 109 + 110 + last_post_date = None 111 + if feed_response.feed: 112 + last_post = feed_response.feed[0].post 113 + if hasattr(last_post.record, "created_at"): 114 + created_at_str = last_post.record.created_at 115 + # Parse ISO 8601 timestamp 116 + last_post_date = datetime.fromisoformat( 117 + created_at_str.replace("Z", "+00:00") 118 + ) 119 + 120 + # Calculate days inactive 121 + if last_post_date: 122 + days_inactive = (datetime.now(timezone.utc) - last_post_date).days 123 + is_stale = days_inactive > inactivity_threshold_days 124 + else: 125 + # Has posts but couldn't determine date - consider stale 126 + days_inactive = None 127 + is_stale = True 128 + 129 + return AccountActivity( 130 + handle=handle, 131 + display_name=display_name, 132 + did=did, 133 + posts_count=posts_count, 134 + last_post_date=last_post_date, 135 + days_inactive=days_inactive, 136 + is_stale=is_stale, 137 + ) 138 + 139 + except Exception as e: 140 + # If we can't check activity, mark as potentially problematic 141 + # (could be deleted, suspended, or private) 142 + return AccountActivity( 143 + handle=handle, 144 + display_name=display_name, 145 + did=did, 146 + posts_count=0, 147 + last_post_date=None, 148 + days_inactive=None, 149 + is_stale=True, 150 + ) 151 + 152 + 153 + def format_account_link(handle: str) -> str: 154 + """Format a clickable Bluesky profile link""" 155 + return f"https://bsky.app/profile/{handle}" 156 + 157 + 158 + def main(inactivity_threshold_days: int): 159 + """Main function to find stale accounts""" 160 + console = Console() 161 + 162 + try: 163 + settings = Settings() # type: ignore 164 + except Exception as e: 165 + console.print( 166 + f"[red]Error loading settings (ensure .env file exists with BSKY_HANDLE and BSKY_PASSWORD): {e}[/red]" 167 + ) 168 + return 169 + 170 + client = Client(base_url=settings.bsky_pds_url) 171 + try: 172 + client.login(settings.bsky_handle, settings.bsky_password) 173 + except Exception as e: 174 + console.print(f"[red]Error logging into Bluesky: {e}[/red]") 175 + return 176 + 177 + console.print(f"[blue]Logged in as {client.me.handle}[/blue]") 178 + console.print( 179 + f"[blue]Checking for accounts inactive for more than {inactivity_threshold_days} days...[/blue]\n" 180 + ) 181 + 182 + # Fetch following list 183 + with Progress( 184 + SpinnerColumn(), 185 + TextColumn("[progress.description]{task.description}"), 186 + console=console, 187 + ) as progress: 188 + task = progress.add_task("Fetching following list...", total=None) 189 + following = get_following_list(client) 190 + progress.update(task, completed=True) 191 + 192 + console.print(f"[green]Found {len(following)} accounts you follow[/green]\n") 193 + 194 + # Check activity for each account 195 + stale_accounts = [] 196 + with 
Progress( 197 + SpinnerColumn(), 198 + TextColumn("[progress.description]{task.description}"), 199 + console=console, 200 + ) as progress: 201 + task = progress.add_task("Analyzing account activity...", total=len(following)) 202 + 203 + for actor in following: 204 + activity = check_account_activity( 205 + client, actor, inactivity_threshold_days 206 + ) 207 + if activity.is_stale: 208 + stale_accounts.append(activity) 209 + progress.advance(task) 210 + 211 + # Display results 212 + console.print(f"\n[yellow]Found {len(stale_accounts)} stale accounts:[/yellow]\n") 213 + 214 + if stale_accounts: 215 + table = Table(show_header=True, header_style="bold magenta") 216 + table.add_column("Handle", style="cyan") 217 + table.add_column("Display Name", style="white") 218 + table.add_column("Posts", justify="right", style="blue") 219 + table.add_column("Last Post", style="yellow") 220 + table.add_column("Days Inactive", justify="right", style="red") 221 + 222 + # Sort by days inactive (None values last) 223 + stale_accounts.sort( 224 + key=lambda x: (x.days_inactive is None, x.days_inactive or 0), 225 + reverse=True, 226 + ) 227 + 228 + for account in stale_accounts: 229 + last_post = ( 230 + account.last_post_date.strftime("%Y-%m-%d") 231 + if account.last_post_date 232 + else "Never" 233 + ) 234 + days = str(account.days_inactive) if account.days_inactive else "Unknown" 235 + 236 + table.add_row( 237 + f"@{account.handle}", 238 + account.display_name or "[dim]No name[/dim]", 239 + str(account.posts_count), 240 + last_post, 241 + days, 242 + ) 243 + 244 + console.print(table) 245 + 246 + # Print links for easy access 247 + console.print("\n[dim]Profile links:[/dim]") 248 + for account in stale_accounts[:10]: # Limit to first 10 249 + console.print(f" {format_account_link(account.handle)}") 250 + if len(stale_accounts) > 10: 251 + console.print(f" [dim]... and {len(stale_accounts) - 10} more[/dim]") 252 + else: 253 + console.print("[green]All accounts you follow are active![/green]") 254 + 255 + 256 + if __name__ == "__main__": 257 + parser = argparse.ArgumentParser( 258 + description="Find stale/inactive accounts you follow on Bluesky." 259 + ) 260 + parser.add_argument( 261 + "--days", 262 + type=int, 263 + default=180, 264 + help="Number of days of inactivity to consider an account stale (default: 180)", 265 + ) 266 + args = parser.parse_args() 267 + 268 + main(args.days)
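Like `find-longest-bsky-thread`, this script loads `BSKY_HANDLE` and `BSKY_PASSWORD` (and optionally `BSKY_PDS_URL`, defaulting to `https://bsky.social`) from `.env` via `pydantic-settings`; a minimal setup with placeholder credentials:

```bash
cat > .env <<'EOF'
BSKY_HANDLE=yourhandle.bsky.social
BSKY_PASSWORD=xxxx-xxxx-xxxx-xxxx
EOF

./find-stale-bsky-follows --days 365
```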
+10
kill-processes
··· 11 ```bash 12 ./kill-processes 13 ``` 14 """ 15 16 import os ··· 19 import marvin 20 import hashlib 21 import json 22 from pathlib import Path 23 from cachetools import TTLCache 24 from textual.app import App, ComposeResult ··· 345 346 347 if __name__ == "__main__": 348 ProcessTUI().run()
··· 11 ```bash 12 ./kill-processes 13 ``` 14 + 15 + Details: 16 + - uses [`textual`](https://textual.textualize.io/) for the TUI 17 + - uses [`marvin`](https://github.com/prefecthq/marvin) (built on [`pydantic-ai`](https://github.com/pydantic/pydantic-ai)) to annotate processes 18 """ 19 20 import os ··· 23 import marvin 24 import hashlib 25 import json 26 + import asyncio 27 from pathlib import Path 28 from cachetools import TTLCache 29 from textual.app import App, ComposeResult ··· 350 351 352 if __name__ == "__main__": 353 + try: 354 + loop = asyncio.get_event_loop() 355 + except RuntimeError: 356 + loop = asyncio.new_event_loop() 357 + asyncio.set_event_loop(loop) 358 ProcessTUI().run()
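The `hashlib`, `json`, and `TTLCache` imports above suggest the AI annotations are cached between refreshes, but this diff does not show how. A minimal sketch of that pattern, assuming a hypothetical `annotate` callable and made-up cache sizes:

```python
import hashlib
import json

from cachetools import TTLCache

# hypothetical size/TTL; the real values are not visible in this diff
_annotation_cache = TTLCache(maxsize=256, ttl=600)


def cached_annotation(proc_info: dict, annotate) -> str:
    """Return an AI annotation for a process, cached by a hash of its info."""
    key = hashlib.sha256(json.dumps(proc_info, sort_keys=True).encode()).hexdigest()
    if key not in _annotation_cache:
        # `annotate` stands in for the expensive (e.g. marvin-backed) call
        _annotation_cache[key] = annotate(proc_info)
    return _annotation_cache[key]
```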
+298
predict-github-stars
···
···
1 + #!/usr/bin/env -S uv run --script --quiet
2 + # /// script
3 + # requires-python = ">=3.12"
4 + # dependencies = ["httpx", "rich", "numpy", "scikit-learn", "python-dateutil", "pandas", "pydantic-settings"]
5 + # ///
6 + """
7 + Predict when a GitHub repository will reach a target number of stars.
8 +
9 + Usage:
10 + ./predict-github-stars owner/repo 10000
11 +
12 + Details:
13 + - Uses GitHub REST API to fetch star history (with timestamps).
14 + - Fits polynomial regression (degree 1–3) to full history.
15 + - Falls back to recent-trend linear extrapolation if the polynomial
16 + cannot reach the target within ten years.
17 + - Shows recent growth rate and a caution for long-range estimates.
18 + - Requires `GITHUB_TOKEN` in the environment for higher rate limits (optional).
19 + """
20 +
21 + from __future__ import annotations
22 +
23 + import argparse
24 + import os
25 + import sys
26 + from datetime import datetime, timezone
27 +
28 + import httpx
29 + import numpy as np
30 + import pandas as pd
31 + from dateutil import parser as date_parser
32 + from pydantic import Field
33 + from pydantic_settings import BaseSettings, SettingsConfigDict
34 + from rich.console import Console
35 + from rich.panel import Panel
36 + from rich.table import Table
37 + from sklearn.linear_model import LinearRegression
38 + from sklearn.metrics import r2_score
39 + from sklearn.preprocessing import PolynomialFeatures
40 +
41 + console = Console()
42 +
43 +
44 + class Settings(BaseSettings):
45 + """Load settings (e.g. GitHub token) from environment."""
46 +
47 + model_config = SettingsConfigDict(
48 + env_file=os.environ.get("ENV_FILE", ".env"), extra="ignore"
49 + )
50 + github_token: str = Field(default="", description="GitHub API token")
51 +
52 +
53 + # ──────────────────────────────── GitHub helpers ────────────────────────────────
54 +
55 +
56 + def _headers(token: str | None = None) -> dict[str, str]:
57 + h = {"Accept": "application/vnd.github.v3+json"}
58 + if token:
59 + h["Authorization"] = f"token {token}"
60 + return h
61 +
62 +
63 + def get_repo_data(owner: str, repo: str, token: str | None = None) -> dict:
64 + url = f"https://api.github.com/repos/{owner}/{repo}"
65 + with httpx.Client() as c:
66 + r = c.get(url, headers=_headers(token))
67 + r.raise_for_status()
68 + return r.json()
69 +
70 +
71 + def get_star_history(
72 + owner: str, repo: str, token: str | None, total_stars: int
73 + ) -> list[tuple[datetime, int]]:
74 + """Return (timestamp, cumulative_star_count) pairs, sampled if repo is huge."""
75 + hdrs = _headers(token)
76 + hdrs["Accept"] = "application/vnd.github.v3.star+json" # need starred_at
77 +
78 + history: list[tuple[datetime, int]] = []
79 +
80 + if total_stars > 10_000:
81 + # sample ~200 evenly-spaced star indices
82 + sample_points = 200
83 + step = max(1, total_stars // sample_points)
84 + pages_needed: dict[int, list[int]] = {}
85 + for s in range(1, total_stars, step):
86 + pg = (s - 1) // 100 + 1
87 + idx = (s - 1) % 100
88 + pages_needed.setdefault(pg, []).append(idx)
89 +
90 + # always include final star
91 + last_pg = (total_stars - 1) // 100 + 1
92 + last_idx = (total_stars - 1) % 100
93 + pages_needed.setdefault(last_pg, []).append(last_idx)
94 +
95 + with httpx.Client() as c:
96 + for pg, idxs in pages_needed.items():
97 + url = f"https://api.github.com/repos/{owner}/{repo}/stargazers?page={pg}&per_page=100"
98 + r = c.get(url, headers=hdrs)
99 + r.raise_for_status()
100 + data = r.json()
101 + for i in sorted(set(idxs)):
102 + if i < len(data) and "starred_at" in data[i]:
103 + ts = date_parser.parse(data[i]["starred_at"])
104 + history.append((ts, (pg - 1) * 100 + i + 1))
105 +
106 + console.print(f"[dim]sampled {len(history)} points across star history[/dim]")
107 +
108 + else:
109 + # fetch all pages
110 + page = 1
111 + with httpx.Client() as c:
112 + while True:
113 + url = f"https://api.github.com/repos/{owner}/{repo}/stargazers?page={page}&per_page=100"
114 + r = c.get(url, headers=hdrs)
115 + r.raise_for_status()
116 + data = r.json()
117 + if not data:
118 + break
119 + for i, star in enumerate(data):
120 + if "starred_at" in star:
121 + ts = date_parser.parse(star["starred_at"])
122 + history.append((ts, (page - 1) * 100 + i + 1))
123 + page += 1
124 +
125 + # ensure order and anchor today's count
126 + history.sort(key=lambda t: t[0])
127 + if history and history[-1][1] < total_stars:
128 + history.append((datetime.now(timezone.utc), total_stars))
129 + return history
130 +
131 +
132 + # ──────────────────────────────── modelling ─────────────────────────────────────
133 +
134 +
135 + def best_poly_fit(
136 + X: np.ndarray, y: np.ndarray
137 + ) -> tuple[LinearRegression, PolynomialFeatures, int, float]:
138 + best_r2 = -1.0
139 + best_model: LinearRegression | None = None
140 + best_poly: PolynomialFeatures | None = None
141 + best_deg = 1
142 + for deg in (1, 2, 3):
143 + poly = PolynomialFeatures(degree=deg)
144 + Xpoly = poly.fit_transform(X)
145 + model = LinearRegression().fit(Xpoly, y)
146 + r2 = r2_score(y, model.predict(Xpoly))
147 + if r2 > best_r2:
148 + best_r2, best_model, best_poly, best_deg = r2, model, poly, deg
149 + return best_model, best_poly, best_deg, best_r2 # type: ignore
150 +
151 +
152 + def predict_date(history: list[tuple[datetime, int]], target: int) -> datetime | None:
153 + if len(history) < 10:
154 + return None
155 + origin = history[0][0]
156 + X = np.array([(t - origin).total_seconds() / 86400 for t, _ in history]).reshape(
157 + -1, 1
158 + )
159 + y = np.array([s for _, s in history])
160 +
161 + model, poly, deg, r2 = best_poly_fit(X, y)
162 + console.print(f"[dim]best fit: degree {deg} polynomial (r² = {r2:.3f})[/dim]")
163 +
164 + current_day = X[-1, 0]
165 + for d in range(0, 3650): # up to 10 years
166 + future = current_day + d
167 + if model.predict(poly.transform([[future]]))[0] >= target:
168 + return origin + pd.Timedelta(days=future)
169 + return None
170 +
171 +
172 + # ──────────────────────────────── utils ─────────────────────────────────────────
173 +
174 +
175 + def timeframe_str(dt: datetime) -> str:
176 + now = datetime.now(timezone.utc)
177 + if dt <= now:
178 + return "already reached"
179 + days = (dt - now).days
180 + if days == 0:
181 + return "today"
182 + if days == 1:
183 + return "tomorrow"
184 + if days < 7:
185 + return f"in {days} days"
186 + if days < 30:
187 + return f"in {days // 7} week(s)"
188 + if days < 365:
189 + return f"in {days // 30} month(s)"
190 + return f"in {days // 365} year(s)"
191 +
192 +
193 + # ──────────────────────────────── main ──────────────────────────────────────────
194 +
195 +
196 + def main() -> None:
197 + p = argparse.ArgumentParser(
198 + description="Predict when a GitHub repo will reach a target number of stars"
199 + )
200 + p.add_argument("repo", help="owner/repo")
201 + p.add_argument("stars", type=int, help="target star count")
202 + args = p.parse_args()
203 +
204 + if "/" not in args.repo:
205 + console.print("[red]error: repo must be owner/repo[/red]")
206 + sys.exit(1)
207 + owner, repo = args.repo.split("/", 1)
208 +
209 + try:
210 + settings = Settings() # load token
211 + except Exception as e: # pragma: no cover
212 + console.print(f"[red]error loading settings: {e}[/red]")
213 + sys.exit(1)
214 + token = settings.github_token.strip() or None
215 +
216 + try:
217 + repo_data = get_repo_data(owner, repo, token)
218 + current_stars = repo_data["stargazers_count"]
219 + created_at = date_parser.parse(repo_data["created_at"])
220 +
221 + console.print(
222 + Panel.fit(
223 + f"[bold cyan]{owner}/{repo}[/bold cyan]\n"
224 + f"[dim]current stars: {current_stars:,}\ncreated: {created_at:%Y-%m-%d}[/dim]",
225 + border_style="blue",
226 + )
227 + )
228 +
229 + if current_stars >= args.stars:
230 + console.print("\n[green]✓ already at or above target![/green]")
231 + sys.exit(0)
232 +
233 + console.print("\nfetching star history…")
234 + history = get_star_history(owner, repo, token, current_stars)
235 + if not history:
236 + console.print("[red]error: no star history[/red]")
237 + sys.exit(1)
238 + if len(history) > 1000: # down-sample for speed
239 + step = len(history) // 1000
240 + history = history[::step] + [history[-1]]
241 +
242 + console.print(f"[dim]analysing {len(history)} data points…[/dim]")
243 + poly_date = predict_date(history, args.stars)
244 +
245 + def recent_rate(window: int = 30) -> float:
246 + cutoff = datetime.now(timezone.utc) - pd.Timedelta(days=window)
247 + pts = [s for t, s in history if t >= cutoff]
248 + return (pts[-1] - pts[0]) / window if len(pts) >= 2 else 0.0
249 +
250 + rate = recent_rate() or recent_rate(90)
251 +
252 + if poly_date:
253 + out_date, tf = poly_date, timeframe_str(poly_date)
254 + elif rate > 0:
255 + days_needed = (args.stars - current_stars) / rate
256 + out_date = datetime.now(timezone.utc) + pd.Timedelta(days=days_needed)
257 + tf = timeframe_str(out_date)
258 + console.print(
259 + "[dim]poly model pessimistic; using recent growth trend[/dim]"
260 + )
261 + else:
262 + console.print(
263 + f"[red]✗ unlikely to reach {args.stars:,} stars in the next 10 years[/red]"
264 + )
265 + sys.exit(0)
266 +
267 + table = Table(show_header=True, header_style="bold magenta")
268 + table.add_column("metric")
269 + table.add_column("value", style="white")
270 + table.add_row("target stars", f"{args.stars:,}")
271 + table.add_row("current stars", f"{current_stars:,}")
272 + table.add_row("stars needed", f"{args.stars - current_stars:,}")
273 + table.add_row("predicted date", out_date.strftime("%Y-%m-%d"))
274 + table.add_row("timeframe", tf)
275 + if rate:
276 + table.add_row("recent growth", f"{rate:.1f} stars/day")
277 +
278 + console.print()
279 + console.print(table)
280 + if "year" in tf and "1 year" not in tf:
281 + console.print("\n[dim]⚠ prediction far in future; uncertainty high[/dim]")
282 +
283 + except httpx.HTTPStatusError as e:
284 + if e.response.status_code == 404:
285 + msg = "repository not found"
286 + elif e.response.status_code == 403:
287 +
msg = "rate limit exceeded (set GITHUB_TOKEN)" 288 + else: 289 + msg = f"GitHub API error {e.response.status_code}" 290 + console.print(f"[red]error: {msg}[/red]") 291 + sys.exit(1) 292 + except Exception as e: # pragma: no cover 293 + console.print(f"[red]error: {e}[/red]") 294 + sys.exit(1) 295 + 296 + 297 + if __name__ == "__main__": 298 + main()
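To make the fallback path concrete: when the polynomial fit never reaches the target within ten years, the script extrapolates from the recent star rate. A tiny worked example with made-up numbers (8,200 current stars, 10,000 target, 12.5 stars/day over the last 30 days):

```python
from datetime import datetime, timedelta, timezone

current_stars, target_stars = 8_200, 10_000  # hypothetical repo state
rate = 12.5  # stars/day over the recent window

days_needed = (target_stars - current_stars) / rate  # 1800 / 12.5 = 144 days
predicted = datetime.now(timezone.utc) + timedelta(days=days_needed)
print(f"~{days_needed:.0f} days, around {predicted:%Y-%m-%d}")
```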
+48 -4
update-lights
··· 11 ```bash 12 ./update-lights -m "turn on sahara in the living room and nightlight in the kitchen" 13 ``` 14 """ 15 16 import marvin ··· 19 from pydantic import Field 20 from pydantic_ai.mcp import MCPServerStdio 21 from pydantic_ai.models import KnownModelName 22 23 24 class Settings(BaseSettings): ··· 26 27 hue_bridge_ip: str = Field(default=...) 28 hue_bridge_username: str = Field(default=...) 29 30 - ai_model: KnownModelName = Field(default="gpt-4o") 31 32 33 settings = Settings() 34 35 hub_mcp = MCPServerStdio( 36 command="uvx", ··· 45 46 47 if __name__ == "__main__": 48 parser = argparse.ArgumentParser(description="Send a command to the Marvin agent.") 49 parser.add_argument( 50 "--message", 51 "-m", 52 type=str, 53 - default="turn off all the lights", 54 - help="The message to send to the agent (defaults to 'turn off all the lights').", 55 ) 56 args = parser.parse_args() 57 ··· 59 model=settings.ai_model, 60 mcp_servers=[hub_mcp], 61 ) 62 - agent.run(str(args.message))
··· 11 ```bash
12 ./update-lights -m "turn on sahara in the living room and nightlight in the kitchen"
13 ```
14 +
15 + Details:
16 + - uses a [`marvin`](https://github.com/prefecthq/marvin) (built on [`pydantic-ai`](https://github.com/pydantic/pydantic-ai)) agent
17 + - the agent spins up a [`fastmcp`](https://github.com/jlowin/fastmcp) MCP server that talks to my [`phue`](https://github.com/studioimaginaire/phue) bridge
18 + - set `HUE_BRIDGE_IP` and `HUE_BRIDGE_USERNAME` in `.env` or otherwise in environment
19 + - uses `ANTHROPIC_API_KEY` by default (the default model is `anthropic:claude-opus-4-5`), but you can set `AI_MODEL` in `.env` or otherwise in environment to use a different model
20 """
21
22 import marvin
··· 25 from pydantic import Field
26 from pydantic_ai.mcp import MCPServerStdio
27 from pydantic_ai.models import KnownModelName
28 + from rich.console import Console
29 + from rich.panel import Panel
30 + from rich.prompt import Prompt
31
32
33 class Settings(BaseSettings):
··· 35
36 hue_bridge_ip: str = Field(default=...)
37 hue_bridge_username: str = Field(default=...)
38 + anthropic_api_key: str | None = Field(default=None)
39
40 + ai_model: KnownModelName = Field(default="anthropic:claude-opus-4-5")
41
42
43 settings = Settings()
44 + console = Console()
45
46 hub_mcp = MCPServerStdio(
47 command="uvx",
··· 56
57
58 if __name__ == "__main__":
59 + import os
60 +
61 + if settings.anthropic_api_key:
62 + os.environ["ANTHROPIC_API_KEY"] = settings.anthropic_api_key
63 +
64 parser = argparse.ArgumentParser(description="Send a command to the Marvin agent.")
65 parser.add_argument(
66 "--message",
67 "-m",
68 type=str,
69 + default="soft and dim - Jessica Pratt energy, all areas",
70 + help="The message to send to the agent (defaults to 'soft and dim - Jessica Pratt energy, all areas').",
71 + )
72 + parser.add_argument(
73 + "--once",
74 + action="store_true",
75 + help="Run once and exit instead of entering interactive mode.",
76 )
77 args = parser.parse_args()
78
··· 80 model=settings.ai_model,
81 mcp_servers=[hub_mcp],
82 )
83 +
84 + console.print(
85 + Panel.fit(
86 + f"[bold cyan]🏠 lights agent[/bold cyan]\n"
87 + f"[dim]model: {settings.ai_model}[/dim]",
88 + border_style="blue",
89 + )
90 + )
91 +
92 + with marvin.Thread():
93 + console.print(f"\n[bold yellow]→[/bold yellow] {args.message}")
94 + agent.run(str(args.message))
95 +
96 + if not args.once:
97 + while True:
98 + try:
99 + user_input = Prompt.ask(
100 + "\n[bold green]enter a message[/bold green]"
101 + )
102 + console.print(f"[bold yellow]→[/bold yellow] {user_input}")
103 + agent.run(str(user_input))
104 + except KeyboardInterrupt:
105 + console.print("\n[dim red]exiting...[/dim red]")
106 + break
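Given the `Settings` fields above, a minimal `.env` for this script might look like the following; all values are placeholders, and pydantic-settings matches the field names to environment variables case-insensitively:

```bash
# placeholders - use your own bridge address, bridge username, and API key
HUE_BRIDGE_IP=192.168.1.50
HUE_BRIDGE_USERNAME=your-bridge-username
# only needed for the default anthropic model
ANTHROPIC_API_KEY=sk-ant-...
# optional override; defaults to anthropic:claude-opus-4-5
AI_MODEL=anthropic:claude-opus-4-5
```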
+3 -3
update-readme
··· 62 scripts = get_scripts() 63 script_list = "\n\n## scripts\n\n" 64 65 - # Add directory 66 for script, _ in scripts: 67 script_list += f"- [`{script.name}`](#{script.name})\n" 68 script_list += "\n---\n\n" 69 70 - # Add detailed entries 71 - for script, doc in scripts: 72 script_list += f"### `{script.name}`\n\n{doc or 'no description'}\n\n" 73 74 new_content = base_content + script_list 75
··· 62 scripts = get_scripts() 63 script_list = "\n\n## scripts\n\n" 64 65 for script, _ in scripts: 66 script_list += f"- [`{script.name}`](#{script.name})\n" 67 script_list += "\n---\n\n" 68 69 + for i, (script, doc) in enumerate(scripts): 70 script_list += f"### `{script.name}`\n\n{doc or 'no description'}\n\n" 71 + if i < len(scripts) - 1: 72 + script_list += "---\n\n" 73 74 new_content = base_content + script_list 75
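The `i < len(scripts) - 1` guard means the regenerated section puts a `---` rule between entries but not after the last one. Roughly, for two hypothetical scripts `a` and `b`, the generated markdown looks like:

```
## scripts

- [`a`](#a)
- [`b`](#b)

---

### `a`

(docstring for a)

---

### `b`

(docstring for b)
```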