feat: docket background tasks, atproto refactor, header stats styling (#534)

* feat: add pydocket for background task infrastructure

- add pydocket dependency for Redis-backed background tasks
- add DocketSettings to config (defaults to memory:// for local dev)
- create background.py with docket initialization and worker lifespan
- create background_tasks.py with scan_copyright task
- migrate copyright scan from asyncio.create_task to docket.add()
- add Redis to test docker-compose
- update AGENTS.md with test command note

the copyright scan is the first task migrated to docket. upload processing
still uses FastAPI BackgroundTasks pending auth session refactoring.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>

* fix: make docket opt-in, add fallback to asyncio.create_task

- change DOCKET_URL default from "memory://plyr" to "" (disabled)
- add is_docket_enabled() check function
- update background_worker_lifespan() to yield None when disabled
- uploads.py: check is_docket_enabled(), fallback to asyncio.create_task
- remove stub process_upload from background_tasks.py

memory mode won't work in production with multiple machines, so docket
must be explicitly enabled with a Redis URL. when disabled, copyright
scans fall back to fire-and-forget asyncio.create_task().
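roughly what the opt-in gate looks like (function and setting names follow the bullets above; the bodies are assumptions, not the repo's actual code):

```python
import asyncio
from contextlib import asynccontextmanager

from docket import Docket

DOCKET_URL = ""  # new default: empty string means docket is disabled


def is_docket_enabled() -> bool:
    return bool(DOCKET_URL)


async def scan_copyright(track_id: str) -> None:
    ...  # stub for illustration


@asynccontextmanager
async def background_worker_lifespan():
    """yield a worker when docket is enabled, otherwise yield None."""
    if not is_docket_enabled():
        yield None
        return
    # real code starts a docket worker here; elided in this sketch
    yield None


async def schedule(track_id: str) -> None:
    if is_docket_enabled():
        async with Docket(name="plyr", url=DOCKET_URL) as docket:
            await docket.add(scan_copyright)(track_id)
    else:
        # fire-and-forget fallback when no redis url is configured
        asyncio.create_task(scan_copyright(track_id))
```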

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>

* refactor: clean up upload background processing

- extract featured artist resolution to handles.py
- extract schedule_copyright_scan to unify docket/asyncio branching
- break _process_upload_background into smaller helper functions:
  - _save_audio_to_storage
  - _save_image_to_storage
  - _add_tags_to_track
  - _send_track_notification
- fix sketchy json.JSONDecodeError pass with proper logging
- add compose.yaml for local redis
- add just commands: dev-up, dev-down, dev
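the json.JSONDecodeError fix mentioned above, in miniature (the real version lives in the featured-artist resolution in handles.py; see the diff below):

```python
import json
import logging

logger = logging.getLogger(__name__)


def parse_features(features_json: str) -> list:
    # before: `except json.JSONDecodeError: pass` silently swallowed bad input
    try:
        return json.loads(features_json)
    except json.JSONDecodeError:
        # after: log the malformed payload and degrade gracefully
        logger.warning("malformed features JSON, ignoring", extra={"raw": features_json})
        return []
```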

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>

* chore: simplify just dev to single command

removes dev-up/dev-down; use 'just dev' instead, which starts redis, the
backend (with DOCKET_URL set), and the frontend.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>

* docs: add background tasks and redis documentation

- add docs/backend/background-tasks.md covering docket/redis setup
- update configuration.md with docket settings
- update environments.md with redis instances per environment
- created Upstash Redis instances:
  - plyr-redis-prd (production, iad region)
  - plyr-redis-stg (staging, iad region)

note: DOCKET_URL not yet wired to fly secrets - will test in dev first

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>

* refactor: use UploadContext dataclass for background processing

- replace 12 function arguments with single UploadContext dataclass
- cleaner interface for _process_upload_background
- update just commands: dev-services / dev-services-down
- update local dev docs to reflect separate terminal workflow
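illustrative shape of the change (the field names here are made up; the real UploadContext bundles whatever the twelve former arguments were):

```python
from dataclasses import dataclass


@dataclass
class UploadContext:
    # hypothetical fields standing in for the twelve arguments
    # _process_upload_background previously took
    track_id: str
    title: str
    artist_did: str
    file_type: str
    album: str | None = None
    features_json: str | None = None


async def _process_upload_background(ctx: UploadContext) -> None:
    ...
```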

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>

* refactor: reorganize atproto module by lexicon namespace

- split monolithic records.py (1000+ lines) into focused modules:
  - client.py: PDS request/auth logic, token refresh with per-session locks
  - records/fm_plyr/: plyr.fm lexicons (track, like, comment, list, profile)
  - records/fm_teal/: teal.fm lexicons (play, status)
  - sync.py: high-level sync orchestration

- maintain backward compatibility via re-exports in records/__init__.py
- update test patch paths for new module locations
- add dev-services commands for local redis via docker compose
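callers shouldn't notice the move; the old flat imports and the new namespaced ones both work (paths taken from the diff below):

```python
# old import path, still valid via re-exports
from backend._internal.atproto import create_track_record, sync_atproto_records

# new namespace-specific paths
from backend._internal.atproto.client import make_pds_request
from backend._internal.atproto.records.fm_plyr import create_like_record
from backend._internal.atproto.records.fm_teal import create_teal_play_record
```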

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>

* style: make header stats smaller and evenly distributed

- reduce stats font size (0.75rem → 0.65rem) and icons (14px → 12px)
- distribute stats and search evenly across left margin with space-evenly
- margin width responds correctly when queue panel opens

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>

* fix: resolve docket startup and logging issues

- rename 'name' to 'docket_name' in log extra dict (conflicts with LogRecord.name)
- suppress docket logger noise by setting level to WARNING
- remove unnecessary sleep(0.1) from worker shutdown
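context for the first bullet: stdlib logging refuses to overwrite reserved LogRecord attributes passed via `extra`, so a key named `name` blows up:

```python
import logging

logger = logging.getLogger(__name__)

# raises KeyError: "Attempt to overwrite 'name' in LogRecord"
# logger.info("starting docket worker", extra={"name": "plyr"})

# renaming the key avoids the collision
logger.info("starting docket worker", extra={"docket_name": "plyr"})
```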

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>

* refactor: make suppressed loggers configurable

- add LOGFIRE_SUPPRESSED_LOGGERS setting (defaults to "docket")
- iterate over setting in main.py instead of hardcoding
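roughly (the setting name comes from this commit; treating it as a comma-separated string is an assumption of this sketch):

```python
import logging

suppressed = "docket"  # LOGFIRE_SUPPRESSED_LOGGERS default

for name in (n.strip() for n in suppressed.split(",") if n.strip()):
    logging.getLogger(name).setLevel(logging.WARNING)
```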

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>

---------

Co-authored-by: Claude <noreply@anthropic.com>

authored by zzstoatzz.io Claude and committed by GitHub c498199d 9f856c67

+1 -1
AGENTS.md
··· 33 ## 💻 Development Commands 34 * **Backend:** `just backend run` 35 * **Frontend:** `just frontend dev` 36 - * **Tests:** `just backend test` 37 * **Linting:** `just backend lint` (Python) / `just frontend check` (Svelte) 38 * **Migrations:** `just backend migrate "message"` (create), `just backend migrate-up` (apply) 39
··· 33 ## 💻 Development Commands 34 * **Backend:** `just backend run` 35 * **Frontend:** `just frontend dev` 36 + * **Tests:** `just backend test` (run from repo root, not from backend/) 37 * **Linting:** `just backend lint` (Python) / `just frontend check` (Svelte) 38 * **Migrations:** `just backend migrate "message"` (create), `just backend migrate-up` (apply) 39
+1
backend/pyproject.toml
··· 27 "slowapi @ git+https://github.com/zzstoatzz/slowapi.git@fix-deprecation", 28 "orjson>=3.11.4", 29 "mutagen>=1.47.0", 30 ] 31 32 requires-python = ">=3.11"
··· 27 "slowapi @ git+https://github.com/zzstoatzz/slowapi.git@fix-deprecation", 28 "orjson>=3.11.4", 29 "mutagen>=1.47.0", 30 + "pydocket>=0.15.2", 31 ] 32 33 requires-python = ">=3.11"
+6 -2
backend/src/backend/_internal/CLAUDE.md
··· 3 internal services and business logic. 4 5 - **auth**: OAuth session encryption (Fernet), token refresh with per-session locks 6 - - **atproto**: record creation (fm.plyr.track, fm.plyr.like), PDS resolution with caching 7 - **queue**: fisher-yates shuffle with retry, postgres LISTEN/NOTIFY for cache invalidation 8 - **uploads**: streaming chunked uploads to R2/filesystem, duplicate detection via file_id 9 10 gotchas: 11 - - ATProto records use `_internal/atproto/records.py` (not `src/backend/atproto/`) 12 - file_id is sha256 hash truncated to 16 chars 13 - queue cache is TTL-based (5min), hydration includes duplicate track_ids
··· 3 internal services and business logic. 4 5 - **auth**: OAuth session encryption (Fernet), token refresh with per-session locks 6 + - **atproto**: record creation organized by lexicon namespace 7 + - `client.py`: low-level PDS requests, token refresh with per-session locks 8 + - `records/fm_plyr/`: plyr.fm lexicons (track, like, comment, list, profile) 9 + - `records/fm_teal/`: teal.fm lexicons (play, status) 10 + - `sync.py`: high-level sync orchestration (profile, albums, liked list) 11 - **queue**: fisher-yates shuffle with retry, postgres LISTEN/NOTIFY for cache invalidation 12 - **uploads**: streaming chunked uploads to R2/filesystem, duplicate detection via file_id 13 14 gotchas: 15 + - ATProto records organized under `_internal/atproto/records/` by lexicon namespace 16 - file_id is sha256 hash truncated to 16 chars 17 - queue cache is TTL-based (5min), hydration includes duplicate track_ids
+5 -1
backend/src/backend/_internal/atproto/__init__.py
··· 11 create_list_record, 12 create_track_record, 13 delete_record_by_uri, 14 - sync_atproto_records, 15 update_comment_record, 16 update_list_record, 17 upsert_album_list_record, 18 upsert_liked_list_record, 19 upsert_profile_record, 20 ) 21 22 __all__ = [ 23 "create_comment_record", ··· 27 "delete_record_by_uri", 28 "fetch_user_avatar", 29 "fetch_user_profile", 30 "normalize_avatar_url", 31 "sync_atproto_records", 32 "update_comment_record", 33 "update_list_record", 34 "upsert_album_list_record", 35 "upsert_liked_list_record", 36 "upsert_profile_record",
··· 11 create_list_record, 12 create_track_record, 13 delete_record_by_uri, 14 + get_record_public, 15 update_comment_record, 16 update_list_record, 17 + update_record, 18 upsert_album_list_record, 19 upsert_liked_list_record, 20 upsert_profile_record, 21 ) 22 + from backend._internal.atproto.sync import sync_atproto_records 23 24 __all__ = [ 25 "create_comment_record", ··· 29 "delete_record_by_uri", 30 "fetch_user_avatar", 31 "fetch_user_profile", 32 + "get_record_public", 33 "normalize_avatar_url", 34 "sync_atproto_records", 35 "update_comment_record", 36 "update_list_record", 37 + "update_record", 38 "upsert_album_list_record", 39 "upsert_liked_list_record", 40 "upsert_profile_record",
+229
backend/src/backend/_internal/atproto/client.py
···
··· 1 + """low-level ATProto PDS client with OAuth and token refresh.""" 2 + 3 + import asyncio 4 + import json 5 + import logging 6 + from typing import Any 7 + 8 + from atproto_oauth.models import OAuthSession 9 + 10 + from backend._internal import Session as AuthSession 11 + from backend._internal import get_oauth_client, get_session, update_session_tokens 12 + 13 + logger = logging.getLogger(__name__) 14 + 15 + # per-session locks for token refresh to prevent concurrent refresh races 16 + _refresh_locks: dict[str, asyncio.Lock] = {} 17 + 18 + 19 + def reconstruct_oauth_session(oauth_data: dict[str, Any]) -> OAuthSession: 20 + """reconstruct OAuthSession from serialized data.""" 21 + from cryptography.hazmat.backends import default_backend 22 + from cryptography.hazmat.primitives import serialization 23 + from cryptography.hazmat.primitives.asymmetric.ec import EllipticCurvePrivateKey 24 + 25 + # deserialize DPoP private key 26 + dpop_key_pem = oauth_data.get("dpop_private_key_pem") 27 + if not dpop_key_pem: 28 + raise ValueError("DPoP private key not found in session") 29 + 30 + private_key = serialization.load_pem_private_key( 31 + dpop_key_pem.encode("utf-8"), 32 + password=None, 33 + backend=default_backend(), 34 + ) 35 + if not isinstance(private_key, EllipticCurvePrivateKey): 36 + raise ValueError("DPoP private key must be an elliptic curve key") 37 + dpop_private_key: EllipticCurvePrivateKey = private_key 38 + 39 + return OAuthSession( 40 + did=oauth_data["did"], 41 + handle=oauth_data["handle"], 42 + pds_url=oauth_data["pds_url"], 43 + authserver_iss=oauth_data["authserver_iss"], 44 + access_token=oauth_data["access_token"], 45 + refresh_token=oauth_data["refresh_token"], 46 + dpop_private_key=dpop_private_key, 47 + dpop_authserver_nonce=oauth_data.get("dpop_authserver_nonce", ""), 48 + dpop_pds_nonce=oauth_data.get("dpop_pds_nonce", ""), 49 + scope=oauth_data["scope"], 50 + ) 51 + 52 + 53 + async def _refresh_session_tokens( 54 + auth_session: AuthSession, 55 + oauth_session: OAuthSession, 56 + ) -> OAuthSession: 57 + """refresh expired access token using refresh token. 58 + 59 + uses per-session locking to prevent concurrent refresh attempts for the same session. 60 + if another coroutine already refreshed the token, reloads from DB instead of making 61 + a redundant network call. 
62 + """ 63 + session_id = auth_session.session_id 64 + 65 + # get or create lock for this session 66 + if session_id not in _refresh_locks: 67 + _refresh_locks[session_id] = asyncio.Lock() 68 + 69 + lock = _refresh_locks[session_id] 70 + 71 + async with lock: 72 + # check if another coroutine already refreshed while we were waiting 73 + # reload session from DB to get potentially updated tokens 74 + updated_auth_session = await get_session(session_id) 75 + if not updated_auth_session: 76 + raise ValueError(f"session {session_id} no longer exists") 77 + 78 + # reconstruct oauth session from potentially updated data 79 + updated_oauth_data = updated_auth_session.oauth_session 80 + if not updated_oauth_data or "access_token" not in updated_oauth_data: 81 + raise ValueError(f"OAuth session data missing for {auth_session.did}") 82 + 83 + current_oauth_session = reconstruct_oauth_session(updated_oauth_data) 84 + 85 + # if tokens are different from what we had, another coroutine already refreshed 86 + if current_oauth_session.access_token != oauth_session.access_token: 87 + logger.info( 88 + f"tokens already refreshed by another request for {auth_session.did}" 89 + ) 90 + return current_oauth_session 91 + 92 + # we need to refresh - no one else did it yet 93 + logger.info(f"refreshing access token for {auth_session.did}") 94 + 95 + try: 96 + # use OAuth client to refresh tokens 97 + refreshed_session = await get_oauth_client().refresh_session( 98 + current_oauth_session 99 + ) 100 + 101 + # serialize updated tokens back to database 102 + from cryptography.hazmat.primitives import serialization 103 + 104 + dpop_key_pem = refreshed_session.dpop_private_key.private_bytes( 105 + encoding=serialization.Encoding.PEM, 106 + format=serialization.PrivateFormat.PKCS8, 107 + encryption_algorithm=serialization.NoEncryption(), 108 + ).decode("utf-8") 109 + 110 + updated_session_data = { 111 + "did": refreshed_session.did, 112 + "handle": refreshed_session.handle, 113 + "pds_url": refreshed_session.pds_url, 114 + "authserver_iss": refreshed_session.authserver_iss, 115 + "scope": refreshed_session.scope, 116 + "access_token": refreshed_session.access_token, 117 + "refresh_token": refreshed_session.refresh_token, 118 + "dpop_private_key_pem": dpop_key_pem, 119 + "dpop_authserver_nonce": refreshed_session.dpop_authserver_nonce, 120 + "dpop_pds_nonce": refreshed_session.dpop_pds_nonce or "", 121 + } 122 + 123 + # update session in database 124 + await update_session_tokens(session_id, updated_session_data) 125 + 126 + logger.info(f"successfully refreshed access token for {auth_session.did}") 127 + return refreshed_session 128 + 129 + except Exception as e: 130 + logger.error( 131 + f"failed to refresh token for {auth_session.did}: {e}", exc_info=True 132 + ) 133 + 134 + # on failure, try reloading session one more time in case another 135 + # coroutine succeeded while we were failing 136 + await asyncio.sleep(0.1) # brief pause 137 + retry_session = await get_session(session_id) 138 + if retry_session and retry_session.oauth_session: 139 + retry_oauth_session = reconstruct_oauth_session( 140 + retry_session.oauth_session 141 + ) 142 + if retry_oauth_session.access_token != oauth_session.access_token: 143 + logger.info( 144 + f"using tokens refreshed by parallel request for {auth_session.did}" 145 + ) 146 + return retry_oauth_session 147 + 148 + raise ValueError(f"failed to refresh access token: {e}") from e 149 + 150 + 151 + async def make_pds_request( 152 + auth_session: AuthSession, 153 + method: str, 154 + 
endpoint: str, 155 + payload: dict[str, Any] | None = None, 156 + params: dict[str, Any] | None = None, 157 + success_codes: tuple[int, ...] = (200, 201), 158 + ) -> dict[str, Any]: 159 + """make an authenticated request to the PDS with automatic token refresh. 160 + 161 + args: 162 + auth_session: authenticated user session 163 + method: HTTP method (POST, GET, etc.) 164 + endpoint: XRPC endpoint (e.g., "com.atproto.repo.createRecord") 165 + payload: request JSON payload (for POST) 166 + params: query parameters (for GET) 167 + success_codes: HTTP status codes considered successful 168 + 169 + returns: 170 + response JSON dict (empty dict for 204 responses) 171 + 172 + raises: 173 + ValueError: if session is invalid 174 + Exception: if request fails after retry 175 + """ 176 + oauth_data = auth_session.oauth_session 177 + if not oauth_data or "access_token" not in oauth_data: 178 + raise ValueError( 179 + f"OAuth session data missing or invalid for {auth_session.did}" 180 + ) 181 + 182 + oauth_session = reconstruct_oauth_session(oauth_data) 183 + url = f"{oauth_data['pds_url']}/xrpc/{endpoint}" 184 + 185 + for attempt in range(2): 186 + kwargs: dict[str, Any] = {} 187 + if payload: 188 + kwargs["json"] = payload 189 + if params: 190 + kwargs["params"] = params 191 + 192 + response = await get_oauth_client().make_authenticated_request( 193 + session=oauth_session, 194 + method=method, 195 + url=url, 196 + **kwargs, 197 + ) 198 + 199 + if response.status_code in success_codes: 200 + if response.status_code == 204: 201 + return {} 202 + return response.json() 203 + 204 + # token expired - refresh and retry 205 + if response.status_code == 401 and attempt == 0: 206 + try: 207 + error_data = response.json() 208 + if "exp" in error_data.get("message", ""): 209 + logger.info( 210 + f"access token expired for {auth_session.did}, attempting refresh" 211 + ) 212 + oauth_session = await _refresh_session_tokens( 213 + auth_session, oauth_session 214 + ) 215 + continue 216 + except (json.JSONDecodeError, KeyError): 217 + pass 218 + 219 + raise Exception(f"PDS request failed: {response.status_code} {response.text}") 220 + 221 + 222 + def parse_at_uri(uri: str) -> tuple[str, str, str]: 223 + """parse an AT URI into (repo, collection, rkey).""" 224 + if not uri.startswith("at://"): 225 + raise ValueError(f"Invalid AT URI format: {uri}") 226 + parts = uri.replace("at://", "").split("/") 227 + if len(parts) != 3: 228 + raise ValueError(f"Invalid AT URI structure: {uri}") 229 + return parts[0], parts[1], parts[2]
+50
backend/src/backend/_internal/atproto/handles.py
··· 72 return None 73 74 75 async def search_handles(query: str, limit: int = 10) -> list[dict]: 76 """search for ATProto handles by prefix. 77
··· 72 return None 73 74 75 + async def resolve_featured_artists( 76 + features_json: str | None, 77 + exclude_handle: str, 78 + ) -> list[dict]: 79 + """resolve featured artist handles from JSON array. 80 + 81 + args: 82 + features_json: JSON array string of handles, e.g., '["user1.bsky.social"]' 83 + exclude_handle: handle to exclude (typically the uploading artist) 84 + 85 + returns: 86 + list of resolved artist dicts, excluding failures and the uploader 87 + """ 88 + if not features_json: 89 + return [] 90 + 91 + import asyncio 92 + import json 93 + 94 + try: 95 + handles_list = json.loads(features_json) 96 + except json.JSONDecodeError: 97 + logger.warning( 98 + "malformed features JSON, ignoring", extra={"raw": features_json} 99 + ) 100 + return [] 101 + 102 + if not isinstance(handles_list, list): 103 + return [] 104 + 105 + # filter valid handles, excluding the uploading artist 106 + valid_handles = [ 107 + handle 108 + for handle in handles_list 109 + if isinstance(handle, str) and handle.lstrip("@") != exclude_handle 110 + ] 111 + 112 + if not valid_handles: 113 + return [] 114 + 115 + # resolve concurrently 116 + resolved = await asyncio.gather( 117 + *[resolve_handle(h) for h in valid_handles], 118 + return_exceptions=True, 119 + ) 120 + 121 + # filter out exceptions and None values 122 + return [r for r in resolved if isinstance(r, dict) and r is not None] 123 + 124 + 125 async def search_handles(query: str, limit: int = 10) -> list[dict]: 126 """search for ATProto handles by prefix. 127
-1005
backend/src/backend/_internal/atproto/records.py
··· 1 - """ATProto record creation for relay audio items.""" 2 - 3 - import asyncio 4 - import json 5 - import logging 6 - from datetime import UTC, datetime 7 - from typing import Any 8 - 9 - from atproto_oauth.models import OAuthSession 10 - from sqlalchemy import select 11 - 12 - from backend._internal import Session as AuthSession 13 - from backend._internal import get_oauth_client, get_session, update_session_tokens 14 - from backend.config import settings 15 - 16 - logger = logging.getLogger(__name__) 17 - 18 - # per-session locks for token refresh to prevent concurrent refresh races 19 - _refresh_locks: dict[str, asyncio.Lock] = {} 20 - 21 - 22 - def _reconstruct_oauth_session(oauth_data: dict[str, Any]) -> OAuthSession: 23 - """reconstruct OAuthSession from serialized data.""" 24 - from cryptography.hazmat.backends import default_backend 25 - from cryptography.hazmat.primitives import serialization 26 - from cryptography.hazmat.primitives.asymmetric.ec import EllipticCurvePrivateKey 27 - 28 - # deserialize DPoP private key 29 - dpop_key_pem = oauth_data.get("dpop_private_key_pem") 30 - if not dpop_key_pem: 31 - raise ValueError("DPoP private key not found in session") 32 - 33 - private_key = serialization.load_pem_private_key( 34 - dpop_key_pem.encode("utf-8"), 35 - password=None, 36 - backend=default_backend(), 37 - ) 38 - if not isinstance(private_key, EllipticCurvePrivateKey): 39 - raise ValueError("DPoP private key must be an elliptic curve key") 40 - dpop_private_key: EllipticCurvePrivateKey = private_key 41 - 42 - return OAuthSession( 43 - did=oauth_data["did"], 44 - handle=oauth_data["handle"], 45 - pds_url=oauth_data["pds_url"], 46 - authserver_iss=oauth_data["authserver_iss"], 47 - access_token=oauth_data["access_token"], 48 - refresh_token=oauth_data["refresh_token"], 49 - dpop_private_key=dpop_private_key, 50 - dpop_authserver_nonce=oauth_data.get("dpop_authserver_nonce", ""), 51 - dpop_pds_nonce=oauth_data.get("dpop_pds_nonce", ""), 52 - scope=oauth_data["scope"], 53 - ) 54 - 55 - 56 - async def _refresh_session_tokens( 57 - auth_session: AuthSession, 58 - oauth_session: OAuthSession, 59 - ) -> OAuthSession: 60 - """refresh expired access token using refresh token. 61 - 62 - uses per-session locking to prevent concurrent refresh attempts for the same session. 63 - if another coroutine already refreshed the token, reloads from DB instead of making 64 - a redundant network call. 
65 - """ 66 - session_id = auth_session.session_id 67 - 68 - # get or create lock for this session 69 - if session_id not in _refresh_locks: 70 - _refresh_locks[session_id] = asyncio.Lock() 71 - 72 - lock = _refresh_locks[session_id] 73 - 74 - async with lock: 75 - # check if another coroutine already refreshed while we were waiting 76 - # reload session from DB to get potentially updated tokens 77 - updated_auth_session = await get_session(session_id) 78 - if not updated_auth_session: 79 - raise ValueError(f"session {session_id} no longer exists") 80 - 81 - # reconstruct oauth session from potentially updated data 82 - updated_oauth_data = updated_auth_session.oauth_session 83 - if not updated_oauth_data or "access_token" not in updated_oauth_data: 84 - raise ValueError(f"OAuth session data missing for {auth_session.did}") 85 - 86 - current_oauth_session = _reconstruct_oauth_session(updated_oauth_data) 87 - 88 - # if tokens are different from what we had, another coroutine already refreshed 89 - if current_oauth_session.access_token != oauth_session.access_token: 90 - logger.info( 91 - f"tokens already refreshed by another request for {auth_session.did}" 92 - ) 93 - return current_oauth_session 94 - 95 - # we need to refresh - no one else did it yet 96 - logger.info(f"refreshing access token for {auth_session.did}") 97 - 98 - try: 99 - # use OAuth client to refresh tokens 100 - refreshed_session = await get_oauth_client().refresh_session( 101 - current_oauth_session 102 - ) 103 - 104 - # serialize updated tokens back to database 105 - from cryptography.hazmat.primitives import serialization 106 - 107 - dpop_key_pem = refreshed_session.dpop_private_key.private_bytes( 108 - encoding=serialization.Encoding.PEM, 109 - format=serialization.PrivateFormat.PKCS8, 110 - encryption_algorithm=serialization.NoEncryption(), 111 - ).decode("utf-8") 112 - 113 - updated_session_data = { 114 - "did": refreshed_session.did, 115 - "handle": refreshed_session.handle, 116 - "pds_url": refreshed_session.pds_url, 117 - "authserver_iss": refreshed_session.authserver_iss, 118 - "scope": refreshed_session.scope, 119 - "access_token": refreshed_session.access_token, 120 - "refresh_token": refreshed_session.refresh_token, 121 - "dpop_private_key_pem": dpop_key_pem, 122 - "dpop_authserver_nonce": refreshed_session.dpop_authserver_nonce, 123 - "dpop_pds_nonce": refreshed_session.dpop_pds_nonce or "", 124 - } 125 - 126 - # update session in database 127 - await update_session_tokens(session_id, updated_session_data) 128 - 129 - logger.info(f"successfully refreshed access token for {auth_session.did}") 130 - return refreshed_session 131 - 132 - except Exception as e: 133 - logger.error( 134 - f"failed to refresh token for {auth_session.did}: {e}", exc_info=True 135 - ) 136 - 137 - # on failure, try reloading session one more time in case another 138 - # coroutine succeeded while we were failing 139 - await asyncio.sleep(0.1) # brief pause 140 - retry_session = await get_session(session_id) 141 - if retry_session and retry_session.oauth_session: 142 - retry_oauth_session = _reconstruct_oauth_session( 143 - retry_session.oauth_session 144 - ) 145 - if retry_oauth_session.access_token != oauth_session.access_token: 146 - logger.info( 147 - f"using tokens refreshed by parallel request for {auth_session.did}" 148 - ) 149 - return retry_oauth_session 150 - 151 - raise ValueError(f"failed to refresh access token: {e}") from e 152 - 153 - 154 - async def _make_pds_request( 155 - auth_session: AuthSession, 156 - method: str, 157 
- endpoint: str, 158 - payload: dict[str, Any], 159 - success_codes: tuple[int, ...] = (200, 201), 160 - ) -> dict[str, Any]: 161 - """make an authenticated request to the PDS with automatic token refresh. 162 - 163 - args: 164 - auth_session: authenticated user session 165 - method: HTTP method (POST, GET, etc.) 166 - endpoint: XRPC endpoint (e.g., "com.atproto.repo.createRecord") 167 - payload: request payload 168 - success_codes: HTTP status codes considered successful 169 - 170 - returns: 171 - response JSON dict (empty dict for 204 responses) 172 - 173 - raises: 174 - ValueError: if session is invalid 175 - Exception: if request fails after retry 176 - """ 177 - oauth_data = auth_session.oauth_session 178 - if not oauth_data or "access_token" not in oauth_data: 179 - raise ValueError( 180 - f"OAuth session data missing or invalid for {auth_session.did}" 181 - ) 182 - 183 - oauth_session = _reconstruct_oauth_session(oauth_data) 184 - url = f"{oauth_data['pds_url']}/xrpc/{endpoint}" 185 - 186 - for attempt in range(2): 187 - response = await get_oauth_client().make_authenticated_request( 188 - session=oauth_session, 189 - method=method, 190 - url=url, 191 - json=payload, 192 - ) 193 - 194 - if response.status_code in success_codes: 195 - if response.status_code == 204: 196 - return {} 197 - return response.json() 198 - 199 - # token expired - refresh and retry 200 - if response.status_code == 401 and attempt == 0: 201 - try: 202 - error_data = response.json() 203 - if "exp" in error_data.get("message", ""): 204 - logger.info( 205 - f"access token expired for {auth_session.did}, attempting refresh" 206 - ) 207 - oauth_session = await _refresh_session_tokens( 208 - auth_session, oauth_session 209 - ) 210 - continue 211 - except (json.JSONDecodeError, KeyError): 212 - pass 213 - 214 - raise Exception(f"PDS request failed: {response.status_code} {response.text}") 215 - 216 - 217 - def build_track_record( 218 - title: str, 219 - artist: str, 220 - audio_url: str, 221 - file_type: str, 222 - album: str | None = None, 223 - duration: int | None = None, 224 - features: list[dict] | None = None, 225 - image_url: str | None = None, 226 - ) -> dict[str, Any]: 227 - """Build a track record dict for ATProto. 
228 - 229 - args: 230 - title: track title 231 - artist: artist name 232 - audio_url: R2 URL for audio file 233 - file_type: file extension (mp3, wav, etc) 234 - album: optional album name 235 - duration: optional duration in seconds 236 - features: optional list of featured artists [{did, handle, display_name, avatar_url}] 237 - image_url: optional cover art image URL 238 - 239 - returns: 240 - record dict ready for ATProto 241 - """ 242 - record: dict[str, Any] = { 243 - "$type": settings.atproto.track_collection, 244 - "title": title, 245 - "artist": artist, 246 - "audioUrl": audio_url, 247 - "fileType": file_type, 248 - "createdAt": datetime.now(UTC).isoformat().replace("+00:00", "Z"), 249 - } 250 - 251 - # add optional fields 252 - if album: 253 - record["album"] = album 254 - if duration: 255 - record["duration"] = duration 256 - if features: 257 - # only include essential fields for ATProto record 258 - record["features"] = [ 259 - { 260 - "did": f["did"], 261 - "handle": f["handle"], 262 - "displayName": f.get("display_name", f["handle"]), 263 - } 264 - for f in features 265 - ] 266 - if image_url: 267 - # validate image URL comes from allowed origin 268 - settings.storage.validate_image_url(image_url) 269 - record["imageUrl"] = image_url 270 - 271 - return record 272 - 273 - 274 - async def create_track_record( 275 - auth_session: AuthSession, 276 - title: str, 277 - artist: str, 278 - audio_url: str, 279 - file_type: str, 280 - album: str | None = None, 281 - duration: int | None = None, 282 - features: list[dict] | None = None, 283 - image_url: str | None = None, 284 - ) -> tuple[str, str]: 285 - """Create a track record on the user's PDS using the configured collection. 286 - 287 - args: 288 - auth_session: authenticated user session 289 - title: track title 290 - artist: artist name 291 - audio_url: R2 URL for audio file 292 - file_type: file extension (mp3, wav, etc) 293 - album: optional album name 294 - duration: optional duration in seconds 295 - features: optional list of featured artists [{did, handle, display_name, avatar_url}] 296 - image_url: optional cover art image URL 297 - 298 - returns: 299 - tuple of (record_uri, record_cid) 300 - 301 - raises: 302 - ValueError: if session is invalid 303 - Exception: if record creation fails 304 - """ 305 - record = build_track_record( 306 - title=title, 307 - artist=artist, 308 - audio_url=audio_url, 309 - file_type=file_type, 310 - album=album, 311 - duration=duration, 312 - features=features, 313 - image_url=image_url, 314 - ) 315 - 316 - payload = { 317 - "repo": auth_session.did, 318 - "collection": settings.atproto.track_collection, 319 - "record": record, 320 - } 321 - 322 - result = await _make_pds_request( 323 - auth_session, "POST", "com.atproto.repo.createRecord", payload 324 - ) 325 - return result["uri"], result["cid"] 326 - 327 - 328 - def _parse_at_uri(uri: str) -> tuple[str, str, str]: 329 - """parse an AT URI into (repo, collection, rkey).""" 330 - if not uri.startswith("at://"): 331 - raise ValueError(f"Invalid AT URI format: {uri}") 332 - parts = uri.replace("at://", "").split("/") 333 - if len(parts) != 3: 334 - raise ValueError(f"Invalid AT URI structure: {uri}") 335 - return parts[0], parts[1], parts[2] 336 - 337 - 338 - async def get_record_public( 339 - record_uri: str, 340 - pds_url: str | None = None, 341 - ) -> dict[str, Any]: 342 - """fetch an ATProto record without authentication. 343 - 344 - ATProto records are public by design - any client can read them. 
345 - uses the owner's PDS URL if provided, otherwise falls back to 346 - bsky.network relay which indexes all public records. 347 - 348 - args: 349 - record_uri: AT URI of the record (at://did/collection/rkey) 350 - pds_url: optional PDS URL to use (falls back to bsky.network) 351 - 352 - returns: 353 - the record value dict 354 - 355 - raises: 356 - ValueError: if URI is malformed 357 - Exception: if fetch fails 358 - """ 359 - import httpx 360 - 361 - repo, collection, rkey = _parse_at_uri(record_uri) 362 - 363 - base_url = pds_url or "https://bsky.network" 364 - url = f"{base_url}/xrpc/com.atproto.repo.getRecord" 365 - params = {"repo": repo, "collection": collection, "rkey": rkey} 366 - 367 - async with httpx.AsyncClient() as client: 368 - response = await client.get(url, params=params, timeout=10.0) 369 - 370 - if response.status_code != 200: 371 - raise Exception( 372 - f"failed to fetch record: {response.status_code} {response.text}" 373 - ) 374 - 375 - return response.json() 376 - 377 - 378 - async def update_record( 379 - auth_session: AuthSession, 380 - record_uri: str, 381 - record: dict[str, Any], 382 - ) -> tuple[str, str]: 383 - """Update an existing record on the user's PDS. 384 - 385 - args: 386 - auth_session: authenticated user session 387 - record_uri: AT URI of the record to update (e.g., at://did:plc:.../fm.plyr.track/...) 388 - record: complete record data to update with (must include $type) 389 - 390 - returns: 391 - tuple of (record_uri, record_cid) 392 - 393 - raises: 394 - ValueError: if session is invalid or URI is malformed 395 - Exception: if record update fails 396 - """ 397 - repo, collection, rkey = _parse_at_uri(record_uri) 398 - 399 - payload = { 400 - "repo": repo, 401 - "collection": collection, 402 - "rkey": rkey, 403 - "record": record, 404 - } 405 - 406 - result = await _make_pds_request( 407 - auth_session, "POST", "com.atproto.repo.putRecord", payload 408 - ) 409 - return result["uri"], result["cid"] 410 - 411 - 412 - async def create_like_record( 413 - auth_session: AuthSession, 414 - subject_uri: str, 415 - subject_cid: str, 416 - ) -> str: 417 - """create a like record on the user's PDS. 418 - 419 - args: 420 - auth_session: authenticated user session 421 - subject_uri: AT URI of the track being liked 422 - subject_cid: CID of the track being liked 423 - 424 - returns: 425 - like record URI 426 - 427 - raises: 428 - ValueError: if session is invalid 429 - Exception: if record creation fails 430 - """ 431 - record = { 432 - "$type": settings.atproto.like_collection, 433 - "subject": { 434 - "uri": subject_uri, 435 - "cid": subject_cid, 436 - }, 437 - "createdAt": datetime.now(UTC).isoformat().replace("+00:00", "Z"), 438 - } 439 - 440 - payload = { 441 - "repo": auth_session.did, 442 - "collection": settings.atproto.like_collection, 443 - "record": record, 444 - } 445 - 446 - result = await _make_pds_request( 447 - auth_session, "POST", "com.atproto.repo.createRecord", payload 448 - ) 449 - return result["uri"] 450 - 451 - 452 - async def delete_record_by_uri( 453 - auth_session: AuthSession, 454 - record_uri: str, 455 - ) -> None: 456 - """delete a record on the user's PDS. 
457 - 458 - args: 459 - auth_session: authenticated user session 460 - record_uri: AT URI of the record to delete 461 - 462 - raises: 463 - ValueError: if session is invalid or URI is malformed 464 - Exception: if record deletion fails 465 - """ 466 - repo, collection, rkey = _parse_at_uri(record_uri) 467 - 468 - payload = { 469 - "repo": repo, 470 - "collection": collection, 471 - "rkey": rkey, 472 - } 473 - 474 - await _make_pds_request( 475 - auth_session, 476 - "POST", 477 - "com.atproto.repo.deleteRecord", 478 - payload, 479 - success_codes=(200, 201, 204), 480 - ) 481 - 482 - 483 - async def create_comment_record( 484 - auth_session: AuthSession, 485 - subject_uri: str, 486 - subject_cid: str, 487 - text: str, 488 - timestamp_ms: int, 489 - ) -> str: 490 - """create a timed comment record on the user's PDS. 491 - 492 - args: 493 - auth_session: authenticated user session 494 - subject_uri: AT URI of the track being commented on 495 - subject_cid: CID of the track being commented on 496 - text: comment text content 497 - timestamp_ms: playback position in milliseconds when comment was made 498 - 499 - returns: 500 - comment record URI 501 - 502 - raises: 503 - ValueError: if session is invalid 504 - Exception: if record creation fails 505 - """ 506 - record = { 507 - "$type": settings.atproto.comment_collection, 508 - "subject": { 509 - "uri": subject_uri, 510 - "cid": subject_cid, 511 - }, 512 - "text": text, 513 - "timestampMs": timestamp_ms, 514 - "createdAt": datetime.now(UTC).isoformat().replace("+00:00", "Z"), 515 - } 516 - 517 - payload = { 518 - "repo": auth_session.did, 519 - "collection": settings.atproto.comment_collection, 520 - "record": record, 521 - } 522 - 523 - result = await _make_pds_request( 524 - auth_session, "POST", "com.atproto.repo.createRecord", payload 525 - ) 526 - return result["uri"] 527 - 528 - 529 - def build_list_record( 530 - items: list[dict[str, str]], 531 - name: str | None = None, 532 - list_type: str | None = None, 533 - created_at: datetime | None = None, 534 - updated_at: datetime | None = None, 535 - ) -> dict[str, Any]: 536 - """Build a list record dict for ATProto. 537 - 538 - args: 539 - items: list of record references, each with {"uri": str, "cid": str} 540 - name: optional display name 541 - list_type: optional semantic type (e.g., "album", "playlist", "liked") 542 - created_at: creation timestamp (defaults to now) 543 - updated_at: optional last modification timestamp 544 - 545 - returns: 546 - record dict ready for ATProto 547 - """ 548 - record: dict[str, Any] = { 549 - "$type": settings.atproto.list_collection, 550 - "items": [ 551 - {"subject": {"uri": item["uri"], "cid": item["cid"]}} for item in items 552 - ], 553 - "createdAt": (created_at or datetime.now(UTC)) 554 - .isoformat() 555 - .replace("+00:00", "Z"), 556 - } 557 - 558 - if name: 559 - record["name"] = name 560 - if list_type: 561 - record["listType"] = list_type 562 - if updated_at: 563 - record["updatedAt"] = updated_at.isoformat().replace("+00:00", "Z") 564 - 565 - return record 566 - 567 - 568 - async def create_list_record( 569 - auth_session: AuthSession, 570 - items: list[dict[str, str]], 571 - name: str | None = None, 572 - list_type: str | None = None, 573 - ) -> tuple[str, str]: 574 - """Create a list record on the user's PDS. 
575 - 576 - args: 577 - auth_session: authenticated user session 578 - items: list of record references, each with {"uri": str, "cid": str} 579 - name: optional display name 580 - list_type: optional semantic type (e.g., "album", "playlist", "liked") 581 - 582 - returns: 583 - tuple of (record_uri, record_cid) 584 - """ 585 - record = build_list_record(items=items, name=name, list_type=list_type) 586 - 587 - payload = { 588 - "repo": auth_session.did, 589 - "collection": settings.atproto.list_collection, 590 - "record": record, 591 - } 592 - 593 - result = await _make_pds_request( 594 - auth_session, "POST", "com.atproto.repo.createRecord", payload 595 - ) 596 - return result["uri"], result["cid"] 597 - 598 - 599 - async def update_list_record( 600 - auth_session: AuthSession, 601 - list_uri: str, 602 - items: list[dict[str, str]], 603 - name: str | None = None, 604 - list_type: str | None = None, 605 - created_at: datetime | None = None, 606 - ) -> tuple[str, str]: 607 - """Update an existing list record on the user's PDS. 608 - 609 - args: 610 - auth_session: authenticated user session 611 - list_uri: AT URI of the list record to update 612 - items: list of record references (array order = display order) 613 - name: optional display name 614 - list_type: optional semantic type (e.g., "album", "playlist", "liked") 615 - created_at: original creation timestamp (preserved on updates) 616 - 617 - returns: 618 - tuple of (record_uri, new_record_cid) 619 - """ 620 - record = build_list_record( 621 - items=items, 622 - name=name, 623 - list_type=list_type, 624 - created_at=created_at, 625 - updated_at=datetime.now(UTC), 626 - ) 627 - 628 - return await update_record( 629 - auth_session=auth_session, 630 - record_uri=list_uri, 631 - record=record, 632 - ) 633 - 634 - 635 - async def update_comment_record( 636 - auth_session: AuthSession, 637 - comment_uri: str, 638 - subject_uri: str, 639 - subject_cid: str, 640 - text: str, 641 - timestamp_ms: int, 642 - created_at: datetime, 643 - updated_at: datetime, 644 - ) -> str: 645 - """update a timed comment record on the user's PDS. 646 - 647 - args: 648 - auth_session: authenticated user session 649 - comment_uri: AT URI of the comment record to update 650 - subject_uri: AT URI of the track being commented on 651 - subject_cid: CID of the track being commented on 652 - text: updated comment text content 653 - timestamp_ms: original playback position in milliseconds 654 - created_at: original creation timestamp 655 - updated_at: timestamp of this update 656 - 657 - returns: 658 - new CID for the updated record 659 - 660 - raises: 661 - ValueError: if session is invalid 662 - Exception: if record update fails 663 - """ 664 - record = { 665 - "$type": settings.atproto.comment_collection, 666 - "subject": { 667 - "uri": subject_uri, 668 - "cid": subject_cid, 669 - }, 670 - "text": text, 671 - "timestampMs": timestamp_ms, 672 - "createdAt": created_at.isoformat().replace("+00:00", "Z"), 673 - "updatedAt": updated_at.isoformat().replace("+00:00", "Z"), 674 - } 675 - 676 - _, new_cid = await update_record( 677 - auth_session=auth_session, 678 - record_uri=comment_uri, 679 - record=record, 680 - ) 681 - return new_cid 682 - 683 - 684 - def build_profile_record( 685 - bio: str | None = None, 686 - created_at: datetime | None = None, 687 - updated_at: datetime | None = None, 688 - ) -> dict[str, Any]: 689 - """Build a profile record dict for ATProto. 
690 - 691 - args: 692 - bio: artist bio/description 693 - created_at: creation timestamp (defaults to now) 694 - updated_at: optional last modification timestamp 695 - 696 - returns: 697 - record dict ready for ATProto 698 - """ 699 - record: dict[str, Any] = { 700 - "$type": settings.atproto.profile_collection, 701 - "createdAt": (created_at or datetime.now(UTC)) 702 - .isoformat() 703 - .replace("+00:00", "Z"), 704 - } 705 - 706 - if bio: 707 - record["bio"] = bio 708 - if updated_at: 709 - record["updatedAt"] = updated_at.isoformat().replace("+00:00", "Z") 710 - 711 - return record 712 - 713 - 714 - async def upsert_profile_record( 715 - auth_session: AuthSession, 716 - bio: str | None = None, 717 - ) -> tuple[str, str] | None: 718 - """Create or update the user's plyr.fm profile record. 719 - 720 - uses putRecord with rkey="self" for upsert semantics - creates if 721 - doesn't exist, updates if it does. skips write if record already 722 - exists with the same bio (no-op for unchanged data). 723 - 724 - args: 725 - auth_session: authenticated user session 726 - bio: artist bio/description 727 - 728 - returns: 729 - tuple of (record_uri, record_cid) or None if skipped (unchanged) 730 - """ 731 - # check if profile already exists to preserve createdAt and skip if unchanged 732 - existing_created_at = None 733 - existing_bio = None 734 - existing_uri = None 735 - existing_cid = None 736 - 737 - try: 738 - # try to get existing record 739 - oauth_data = auth_session.oauth_session 740 - if oauth_data and "pds_url" in oauth_data: 741 - oauth_session = _reconstruct_oauth_session(oauth_data) 742 - url = f"{oauth_data['pds_url']}/xrpc/com.atproto.repo.getRecord" 743 - params = { 744 - "repo": auth_session.did, 745 - "collection": settings.atproto.profile_collection, 746 - "rkey": "self", 747 - } 748 - response = await get_oauth_client().make_authenticated_request( 749 - session=oauth_session, 750 - method="GET", 751 - url=url, 752 - params=params, 753 - ) 754 - if response.status_code == 200: 755 - existing = response.json() 756 - existing_uri = existing.get("uri") 757 - existing_cid = existing.get("cid") 758 - if "value" in existing: 759 - existing_bio = existing["value"].get("bio") 760 - if "createdAt" in existing["value"]: 761 - existing_created_at = datetime.fromisoformat( 762 - existing["value"]["createdAt"].replace("Z", "+00:00") 763 - ) 764 - except Exception: 765 - # record doesn't exist yet, that's fine 766 - pass 767 - 768 - # skip write if record exists with same bio (no changes needed) 769 - if existing_uri and existing_cid and existing_bio == bio: 770 - return None 771 - 772 - record = build_profile_record( 773 - bio=bio, 774 - created_at=existing_created_at, 775 - updated_at=datetime.now(UTC) if existing_created_at else None, 776 - ) 777 - 778 - payload = { 779 - "repo": auth_session.did, 780 - "collection": settings.atproto.profile_collection, 781 - "rkey": "self", 782 - "record": record, 783 - } 784 - 785 - result = await _make_pds_request( 786 - auth_session, "POST", "com.atproto.repo.putRecord", payload 787 - ) 788 - return result["uri"], result["cid"] 789 - 790 - 791 - async def upsert_album_list_record( 792 - auth_session: AuthSession, 793 - album_id: str, 794 - album_title: str, 795 - track_refs: list[dict[str, str]], 796 - existing_uri: str | None = None, 797 - existing_created_at: datetime | None = None, 798 - ) -> tuple[str, str] | None: 799 - """Create or update an album as a list record. 
800 - 801 - args: 802 - auth_session: authenticated user session 803 - album_id: internal album ID (for logging) 804 - album_title: album display name 805 - track_refs: list of track references [{"uri": str, "cid": str}, ...] 806 - existing_uri: existing ATProto record URI if updating 807 - existing_created_at: original creation timestamp to preserve 808 - 809 - returns: 810 - tuple of (record_uri, record_cid) or None if no tracks to sync 811 - """ 812 - if not track_refs: 813 - logger.debug(f"album {album_id} has no tracks with ATProto records, skipping") 814 - return None 815 - 816 - if existing_uri: 817 - # update existing record 818 - uri, cid = await update_list_record( 819 - auth_session=auth_session, 820 - list_uri=existing_uri, 821 - items=track_refs, 822 - name=album_title, 823 - list_type="album", 824 - created_at=existing_created_at, 825 - ) 826 - logger.info(f"updated album list record for {album_id}: {uri}") 827 - return uri, cid 828 - else: 829 - # create new record 830 - uri, cid = await create_list_record( 831 - auth_session=auth_session, 832 - items=track_refs, 833 - name=album_title, 834 - list_type="album", 835 - ) 836 - logger.info(f"created album list record for {album_id}: {uri}") 837 - return uri, cid 838 - 839 - 840 - async def upsert_liked_list_record( 841 - auth_session: AuthSession, 842 - track_refs: list[dict[str, str]], 843 - existing_uri: str | None = None, 844 - existing_created_at: datetime | None = None, 845 - ) -> tuple[str, str] | None: 846 - """Create or update the user's liked tracks list record. 847 - 848 - args: 849 - auth_session: authenticated user session 850 - track_refs: list of liked track references [{"uri": str, "cid": str}, ...] 851 - existing_uri: existing ATProto record URI if updating 852 - existing_created_at: original creation timestamp to preserve 853 - 854 - returns: 855 - tuple of (record_uri, record_cid) or None if no likes to sync 856 - """ 857 - if not track_refs: 858 - logger.debug(f"user {auth_session.did} has no liked tracks to sync") 859 - return None 860 - 861 - if existing_uri: 862 - # update existing record 863 - uri, cid = await update_list_record( 864 - auth_session=auth_session, 865 - list_uri=existing_uri, 866 - items=track_refs, 867 - name="Liked Tracks", 868 - list_type="liked", 869 - created_at=existing_created_at, 870 - ) 871 - logger.info(f"updated liked list record for {auth_session.did}: {uri}") 872 - return uri, cid 873 - else: 874 - # create new record 875 - uri, cid = await create_list_record( 876 - auth_session=auth_session, 877 - items=track_refs, 878 - name="Liked Tracks", 879 - list_type="liked", 880 - ) 881 - logger.info(f"created liked list record for {auth_session.did}: {uri}") 882 - return uri, cid 883 - 884 - 885 - async def sync_atproto_records( 886 - auth_session: AuthSession, 887 - user_did: str, 888 - ) -> None: 889 - """sync profile, albums, and liked tracks to ATProto. 890 - 891 - this is the actual sync logic - runs all queries and PDS calls. 892 - should be called from a background task to avoid blocking. 
893 - """ 894 - from backend.models import Album, Artist, Track, TrackLike, UserPreferences 895 - from backend.utilities.database import db_session 896 - 897 - # sync profile record 898 - async with db_session() as session: 899 - artist_result = await session.execute( 900 - select(Artist).where(Artist.did == user_did) 901 - ) 902 - artist = artist_result.scalar_one_or_none() 903 - artist_bio = artist.bio if artist else None 904 - 905 - if artist_bio is not None or artist: 906 - try: 907 - profile_result = await upsert_profile_record(auth_session, bio=artist_bio) 908 - if profile_result: 909 - logger.info(f"synced ATProto profile record for {user_did}") 910 - except Exception as e: 911 - logger.warning(f"failed to sync ATProto profile record for {user_did}: {e}") 912 - 913 - # query and sync album list records 914 - async with db_session() as session: 915 - albums_result = await session.execute( 916 - select(Album).where(Album.artist_did == user_did) 917 - ) 918 - albums = albums_result.scalars().all() 919 - 920 - for album in albums: 921 - tracks_result = await session.execute( 922 - select(Track) 923 - .where( 924 - Track.album_id == album.id, 925 - Track.atproto_record_uri.isnot(None), 926 - Track.atproto_record_cid.isnot(None), 927 - ) 928 - .order_by(Track.created_at.asc()) 929 - ) 930 - tracks = tracks_result.scalars().all() 931 - 932 - if tracks: 933 - track_refs = [ 934 - {"uri": t.atproto_record_uri, "cid": t.atproto_record_cid} 935 - for t in tracks 936 - ] 937 - try: 938 - album_result = await upsert_album_list_record( 939 - auth_session, 940 - album_id=album.id, 941 - album_title=album.title, 942 - track_refs=track_refs, 943 - existing_uri=album.atproto_record_uri, 944 - ) 945 - if album_result: 946 - album.atproto_record_uri = album_result[0] 947 - album.atproto_record_cid = album_result[1] 948 - await session.commit() 949 - logger.info( 950 - f"synced album list record for {album.id}: {album_result[0]}" 951 - ) 952 - except Exception as e: 953 - logger.warning( 954 - f"failed to sync album list record for {album.id}: {e}" 955 - ) 956 - 957 - # query and sync liked tracks list record 958 - async with db_session() as session: 959 - prefs_result = await session.execute( 960 - select(UserPreferences).where(UserPreferences.did == user_did) 961 - ) 962 - prefs = prefs_result.scalar_one_or_none() 963 - 964 - likes_result = await session.execute( 965 - select(Track) 966 - .join(TrackLike, TrackLike.track_id == Track.id) 967 - .where( 968 - TrackLike.user_did == user_did, 969 - Track.atproto_record_uri.isnot(None), 970 - Track.atproto_record_cid.isnot(None), 971 - ) 972 - .order_by(TrackLike.created_at.desc()) 973 - ) 974 - liked_tracks = likes_result.scalars().all() 975 - 976 - if liked_tracks: 977 - liked_refs = [ 978 - {"uri": t.atproto_record_uri, "cid": t.atproto_record_cid} 979 - for t in liked_tracks 980 - ] 981 - existing_liked_uri = prefs.liked_list_uri if prefs else None 982 - 983 - try: 984 - liked_result = await upsert_liked_list_record( 985 - auth_session, 986 - track_refs=liked_refs, 987 - existing_uri=existing_liked_uri, 988 - ) 989 - if liked_result: 990 - if prefs: 991 - prefs.liked_list_uri = liked_result[0] 992 - prefs.liked_list_cid = liked_result[1] 993 - else: 994 - prefs = UserPreferences( 995 - did=user_did, 996 - liked_list_uri=liked_result[0], 997 - liked_list_cid=liked_result[1], 998 - ) 999 - session.add(prefs) 1000 - await session.commit() 1001 - logger.info( 1002 - f"synced liked list record for {user_did}: {liked_result[0]}" 1003 - ) 1004 - except 
Exception as e: 1005 - logger.warning(f"failed to sync liked list record for {user_did}: {e}")
···
+53
backend/src/backend/_internal/atproto/records/__init__.py
···
··· 1 + """ATProto record types organized by lexicon namespace.""" 2 + 3 + # re-export commonly used functions for convenience 4 + from backend._internal.atproto.records.fm_plyr import ( 5 + build_track_record, 6 + create_comment_record, 7 + create_like_record, 8 + create_list_record, 9 + create_track_record, 10 + delete_record_by_uri, 11 + get_record_public, 12 + update_comment_record, 13 + update_list_record, 14 + update_record, 15 + upsert_album_list_record, 16 + upsert_liked_list_record, 17 + upsert_profile_record, 18 + ) 19 + from backend._internal.atproto.records.fm_teal import ( 20 + create_teal_play_record, 21 + update_teal_status, 22 + ) 23 + 24 + # re-export client functions for backward compatibility 25 + # these were previously in records.py and some code imports them from here 26 + from backend._internal.atproto.client import ( 27 + _refresh_session_tokens, 28 + make_pds_request as _make_pds_request, 29 + parse_at_uri as _parse_at_uri, 30 + reconstruct_oauth_session as _reconstruct_oauth_session, 31 + ) 32 + 33 + __all__ = [ 34 + "_make_pds_request", 35 + "_parse_at_uri", 36 + "_reconstruct_oauth_session", 37 + "_refresh_session_tokens", 38 + "build_track_record", 39 + "create_comment_record", 40 + "create_like_record", 41 + "create_list_record", 42 + "create_teal_play_record", 43 + "create_track_record", 44 + "delete_record_by_uri", 45 + "get_record_public", 46 + "update_comment_record", 47 + "update_list_record", 48 + "update_record", 49 + "update_teal_status", 50 + "upsert_album_list_record", 51 + "upsert_liked_list_record", 52 + "upsert_profile_record", 53 + ]
+39
backend/src/backend/_internal/atproto/records/fm_plyr/__init__.py
···
··· 1 + """fm.plyr.* lexicon record types.""" 2 + 3 + from backend._internal.atproto.records.fm_plyr.comment import ( 4 + create_comment_record, 5 + update_comment_record, 6 + ) 7 + from backend._internal.atproto.records.fm_plyr.like import create_like_record 8 + from backend._internal.atproto.records.fm_plyr.list import ( 9 + build_list_record, 10 + create_list_record, 11 + update_list_record, 12 + upsert_album_list_record, 13 + upsert_liked_list_record, 14 + ) 15 + from backend._internal.atproto.records.fm_plyr.profile import upsert_profile_record 16 + from backend._internal.atproto.records.fm_plyr.track import ( 17 + build_track_record, 18 + create_track_record, 19 + delete_record_by_uri, 20 + get_record_public, 21 + update_record, 22 + ) 23 + 24 + __all__ = [ 25 + "build_list_record", 26 + "build_track_record", 27 + "create_comment_record", 28 + "create_like_record", 29 + "create_list_record", 30 + "create_track_record", 31 + "delete_record_by_uri", 32 + "get_record_public", 33 + "update_comment_record", 34 + "update_list_record", 35 + "update_record", 36 + "upsert_album_list_record", 37 + "upsert_liked_list_record", 38 + "upsert_profile_record", 39 + ]
+103
backend/src/backend/_internal/atproto/records/fm_plyr/comment.py
···
··· 1 + """fm.plyr.comment record operations.""" 2 + 3 + from datetime import UTC, datetime 4 + 5 + from backend._internal import Session as AuthSession 6 + from backend._internal.atproto.client import make_pds_request 7 + from backend._internal.atproto.records.fm_plyr.track import update_record 8 + from backend.config import settings 9 + 10 + 11 + async def create_comment_record( 12 + auth_session: AuthSession, 13 + subject_uri: str, 14 + subject_cid: str, 15 + text: str, 16 + timestamp_ms: int, 17 + ) -> str: 18 + """create a timed comment record on the user's PDS. 19 + 20 + args: 21 + auth_session: authenticated user session 22 + subject_uri: AT URI of the track being commented on 23 + subject_cid: CID of the track being commented on 24 + text: comment text content 25 + timestamp_ms: playback position in milliseconds when comment was made 26 + 27 + returns: 28 + comment record URI 29 + 30 + raises: 31 + ValueError: if session is invalid 32 + Exception: if record creation fails 33 + """ 34 + record = { 35 + "$type": settings.atproto.comment_collection, 36 + "subject": { 37 + "uri": subject_uri, 38 + "cid": subject_cid, 39 + }, 40 + "text": text, 41 + "timestampMs": timestamp_ms, 42 + "createdAt": datetime.now(UTC).isoformat().replace("+00:00", "Z"), 43 + } 44 + 45 + payload = { 46 + "repo": auth_session.did, 47 + "collection": settings.atproto.comment_collection, 48 + "record": record, 49 + } 50 + 51 + result = await make_pds_request( 52 + auth_session, "POST", "com.atproto.repo.createRecord", payload 53 + ) 54 + return result["uri"] 55 + 56 + 57 + async def update_comment_record( 58 + auth_session: AuthSession, 59 + comment_uri: str, 60 + subject_uri: str, 61 + subject_cid: str, 62 + text: str, 63 + timestamp_ms: int, 64 + created_at: datetime, 65 + updated_at: datetime, 66 + ) -> str: 67 + """update a timed comment record on the user's PDS. 68 + 69 + args: 70 + auth_session: authenticated user session 71 + comment_uri: AT URI of the comment record to update 72 + subject_uri: AT URI of the track being commented on 73 + subject_cid: CID of the track being commented on 74 + text: updated comment text content 75 + timestamp_ms: original playback position in milliseconds 76 + created_at: original creation timestamp 77 + updated_at: timestamp of this update 78 + 79 + returns: 80 + new CID for the updated record 81 + 82 + raises: 83 + ValueError: if session is invalid 84 + Exception: if record update fails 85 + """ 86 + record = { 87 + "$type": settings.atproto.comment_collection, 88 + "subject": { 89 + "uri": subject_uri, 90 + "cid": subject_cid, 91 + }, 92 + "text": text, 93 + "timestampMs": timestamp_ms, 94 + "createdAt": created_at.isoformat().replace("+00:00", "Z"), 95 + "updatedAt": updated_at.isoformat().replace("+00:00", "Z"), 96 + } 97 + 98 + _, new_cid = await update_record( 99 + auth_session=auth_session, 100 + record_uri=comment_uri, 101 + record=record, 102 + ) 103 + return new_cid
+47
backend/src/backend/_internal/atproto/records/fm_plyr/like.py
··· 1 + """fm.plyr.like record operations.""" 2 + 3 + from datetime import UTC, datetime 4 + 5 + from backend._internal import Session as AuthSession 6 + from backend._internal.atproto.client import make_pds_request 7 + from backend.config import settings 8 + 9 + 10 + async def create_like_record( 11 + auth_session: AuthSession, 12 + subject_uri: str, 13 + subject_cid: str, 14 + ) -> str: 15 + """create a like record on the user's PDS. 16 + 17 + args: 18 + auth_session: authenticated user session 19 + subject_uri: AT URI of the track being liked 20 + subject_cid: CID of the track being liked 21 + 22 + returns: 23 + like record URI 24 + 25 + raises: 26 + ValueError: if session is invalid 27 + Exception: if record creation fails 28 + """ 29 + record = { 30 + "$type": settings.atproto.like_collection, 31 + "subject": { 32 + "uri": subject_uri, 33 + "cid": subject_cid, 34 + }, 35 + "createdAt": datetime.now(UTC).isoformat().replace("+00:00", "Z"), 36 + } 37 + 38 + payload = { 39 + "repo": auth_session.did, 40 + "collection": settings.atproto.like_collection, 41 + "record": record, 42 + } 43 + 44 + result = await make_pds_request( 45 + auth_session, "POST", "com.atproto.repo.createRecord", payload 46 + ) 47 + return result["uri"]
+212
backend/src/backend/_internal/atproto/records/fm_plyr/list.py
··· 1 + """fm.plyr.list record operations.""" 2 + 3 + import logging 4 + from datetime import UTC, datetime 5 + from typing import Any 6 + 7 + from backend._internal import Session as AuthSession 8 + from backend._internal.atproto.client import make_pds_request 9 + from backend._internal.atproto.records.fm_plyr.track import update_record 10 + from backend.config import settings 11 + 12 + logger = logging.getLogger(__name__) 13 + 14 + 15 + def build_list_record( 16 + items: list[dict[str, str]], 17 + name: str | None = None, 18 + list_type: str | None = None, 19 + created_at: datetime | None = None, 20 + updated_at: datetime | None = None, 21 + ) -> dict[str, Any]: 22 + """Build a list record dict for ATProto. 23 + 24 + args: 25 + items: list of record references, each with {"uri": str, "cid": str} 26 + name: optional display name 27 + list_type: optional semantic type (e.g., "album", "playlist", "liked") 28 + created_at: creation timestamp (defaults to now) 29 + updated_at: optional last modification timestamp 30 + 31 + returns: 32 + record dict ready for ATProto 33 + """ 34 + record: dict[str, Any] = { 35 + "$type": settings.atproto.list_collection, 36 + "items": [ 37 + {"subject": {"uri": item["uri"], "cid": item["cid"]}} for item in items 38 + ], 39 + "createdAt": (created_at or datetime.now(UTC)) 40 + .isoformat() 41 + .replace("+00:00", "Z"), 42 + } 43 + 44 + if name: 45 + record["name"] = name 46 + if list_type: 47 + record["listType"] = list_type 48 + if updated_at: 49 + record["updatedAt"] = updated_at.isoformat().replace("+00:00", "Z") 50 + 51 + return record 52 + 53 + 54 + async def create_list_record( 55 + auth_session: AuthSession, 56 + items: list[dict[str, str]], 57 + name: str | None = None, 58 + list_type: str | None = None, 59 + ) -> tuple[str, str]: 60 + """Create a list record on the user's PDS. 61 + 62 + args: 63 + auth_session: authenticated user session 64 + items: list of record references, each with {"uri": str, "cid": str} 65 + name: optional display name 66 + list_type: optional semantic type (e.g., "album", "playlist", "liked") 67 + 68 + returns: 69 + tuple of (record_uri, record_cid) 70 + """ 71 + record = build_list_record(items=items, name=name, list_type=list_type) 72 + 73 + payload = { 74 + "repo": auth_session.did, 75 + "collection": settings.atproto.list_collection, 76 + "record": record, 77 + } 78 + 79 + result = await make_pds_request( 80 + auth_session, "POST", "com.atproto.repo.createRecord", payload 81 + ) 82 + return result["uri"], result["cid"] 83 + 84 + 85 + async def update_list_record( 86 + auth_session: AuthSession, 87 + list_uri: str, 88 + items: list[dict[str, str]], 89 + name: str | None = None, 90 + list_type: str | None = None, 91 + created_at: datetime | None = None, 92 + ) -> tuple[str, str]: 93 + """Update an existing list record on the user's PDS. 
94 + 95 + args: 96 + auth_session: authenticated user session 97 + list_uri: AT URI of the list record to update 98 + items: list of record references (array order = display order) 99 + name: optional display name 100 + list_type: optional semantic type (e.g., "album", "playlist", "liked") 101 + created_at: original creation timestamp (preserved on updates) 102 + 103 + returns: 104 + tuple of (record_uri, new_record_cid) 105 + """ 106 + record = build_list_record( 107 + items=items, 108 + name=name, 109 + list_type=list_type, 110 + created_at=created_at, 111 + updated_at=datetime.now(UTC), 112 + ) 113 + 114 + return await update_record( 115 + auth_session=auth_session, 116 + record_uri=list_uri, 117 + record=record, 118 + ) 119 + 120 + 121 + async def upsert_album_list_record( 122 + auth_session: AuthSession, 123 + album_id: str, 124 + album_title: str, 125 + track_refs: list[dict[str, str]], 126 + existing_uri: str | None = None, 127 + existing_created_at: datetime | None = None, 128 + ) -> tuple[str, str] | None: 129 + """Create or update an album as a list record. 130 + 131 + args: 132 + auth_session: authenticated user session 133 + album_id: internal album ID (for logging) 134 + album_title: album display name 135 + track_refs: list of track references [{"uri": str, "cid": str}, ...] 136 + existing_uri: existing ATProto record URI if updating 137 + existing_created_at: original creation timestamp to preserve 138 + 139 + returns: 140 + tuple of (record_uri, record_cid) or None if no tracks to sync 141 + """ 142 + if not track_refs: 143 + logger.debug(f"album {album_id} has no tracks with ATProto records, skipping") 144 + return None 145 + 146 + if existing_uri: 147 + # update existing record 148 + uri, cid = await update_list_record( 149 + auth_session=auth_session, 150 + list_uri=existing_uri, 151 + items=track_refs, 152 + name=album_title, 153 + list_type="album", 154 + created_at=existing_created_at, 155 + ) 156 + logger.info(f"updated album list record for {album_id}: {uri}") 157 + return uri, cid 158 + else: 159 + # create new record 160 + uri, cid = await create_list_record( 161 + auth_session=auth_session, 162 + items=track_refs, 163 + name=album_title, 164 + list_type="album", 165 + ) 166 + logger.info(f"created album list record for {album_id}: {uri}") 167 + return uri, cid 168 + 169 + 170 + async def upsert_liked_list_record( 171 + auth_session: AuthSession, 172 + track_refs: list[dict[str, str]], 173 + existing_uri: str | None = None, 174 + existing_created_at: datetime | None = None, 175 + ) -> tuple[str, str] | None: 176 + """Create or update the user's liked tracks list record. 177 + 178 + args: 179 + auth_session: authenticated user session 180 + track_refs: list of liked track references [{"uri": str, "cid": str}, ...] 
181 + existing_uri: existing ATProto record URI if updating 182 + existing_created_at: original creation timestamp to preserve 183 + 184 + returns: 185 + tuple of (record_uri, record_cid) or None if no likes to sync 186 + """ 187 + if not track_refs: 188 + logger.debug(f"user {auth_session.did} has no liked tracks to sync") 189 + return None 190 + 191 + if existing_uri: 192 + # update existing record 193 + uri, cid = await update_list_record( 194 + auth_session=auth_session, 195 + list_uri=existing_uri, 196 + items=track_refs, 197 + name="Liked Tracks", 198 + list_type="liked", 199 + created_at=existing_created_at, 200 + ) 201 + logger.info(f"updated liked list record for {auth_session.did}: {uri}") 202 + return uri, cid 203 + else: 204 + # create new record 205 + uri, cid = await create_list_record( 206 + auth_session=auth_session, 207 + items=track_refs, 208 + name="Liked Tracks", 209 + list_type="liked", 210 + ) 211 + logger.info(f"created liked list record for {auth_session.did}: {uri}") 212 + return uri, cid
+116
backend/src/backend/_internal/atproto/records/fm_plyr/profile.py
··· 1 + """fm.plyr.profile record operations.""" 2 + 3 + from datetime import UTC, datetime 4 + from typing import Any 5 + 6 + from backend._internal import Session as AuthSession 7 + from backend._internal import get_oauth_client 8 + from backend._internal.atproto.client import make_pds_request, reconstruct_oauth_session 9 + from backend.config import settings 10 + 11 + 12 + def build_profile_record( 13 + bio: str | None = None, 14 + created_at: datetime | None = None, 15 + updated_at: datetime | None = None, 16 + ) -> dict[str, Any]: 17 + """Build a profile record dict for ATProto. 18 + 19 + args: 20 + bio: artist bio/description 21 + created_at: creation timestamp (defaults to now) 22 + updated_at: optional last modification timestamp 23 + 24 + returns: 25 + record dict ready for ATProto 26 + """ 27 + record: dict[str, Any] = { 28 + "$type": settings.atproto.profile_collection, 29 + "createdAt": (created_at or datetime.now(UTC)) 30 + .isoformat() 31 + .replace("+00:00", "Z"), 32 + } 33 + 34 + if bio: 35 + record["bio"] = bio 36 + if updated_at: 37 + record["updatedAt"] = updated_at.isoformat().replace("+00:00", "Z") 38 + 39 + return record 40 + 41 + 42 + async def upsert_profile_record( 43 + auth_session: AuthSession, 44 + bio: str | None = None, 45 + ) -> tuple[str, str] | None: 46 + """Create or update the user's plyr.fm profile record. 47 + 48 + uses putRecord with rkey="self" for upsert semantics - creates if 49 + doesn't exist, updates if it does. skips write if record already 50 + exists with the same bio (no-op for unchanged data). 51 + 52 + args: 53 + auth_session: authenticated user session 54 + bio: artist bio/description 55 + 56 + returns: 57 + tuple of (record_uri, record_cid) or None if skipped (unchanged) 58 + """ 59 + # check if profile already exists to preserve createdAt and skip if unchanged 60 + existing_created_at = None 61 + existing_bio = None 62 + existing_uri = None 63 + existing_cid = None 64 + 65 + try: 66 + # try to get existing record 67 + oauth_data = auth_session.oauth_session 68 + if oauth_data and "pds_url" in oauth_data: 69 + oauth_session = reconstruct_oauth_session(oauth_data) 70 + url = f"{oauth_data['pds_url']}/xrpc/com.atproto.repo.getRecord" 71 + params = { 72 + "repo": auth_session.did, 73 + "collection": settings.atproto.profile_collection, 74 + "rkey": "self", 75 + } 76 + response = await get_oauth_client().make_authenticated_request( 77 + session=oauth_session, 78 + method="GET", 79 + url=url, 80 + params=params, 81 + ) 82 + if response.status_code == 200: 83 + existing = response.json() 84 + existing_uri = existing.get("uri") 85 + existing_cid = existing.get("cid") 86 + if "value" in existing: 87 + existing_bio = existing["value"].get("bio") 88 + if "createdAt" in existing["value"]: 89 + existing_created_at = datetime.fromisoformat( 90 + existing["value"]["createdAt"].replace("Z", "+00:00") 91 + ) 92 + except Exception: 93 + # record doesn't exist yet, that's fine 94 + pass 95 + 96 + # skip write if record exists with same bio (no changes needed) 97 + if existing_uri and existing_cid and existing_bio == bio: 98 + return None 99 + 100 + record = build_profile_record( 101 + bio=bio, 102 + created_at=existing_created_at, 103 + updated_at=datetime.now(UTC) if existing_created_at else None, 104 + ) 105 + 106 + payload = { 107 + "repo": auth_session.did, 108 + "collection": settings.atproto.profile_collection, 109 + "rkey": "self", 110 + "record": record, 111 + } 112 + 113 + result = await make_pds_request( 114 + auth_session, "POST", 
"com.atproto.repo.putRecord", payload 115 + ) 116 + return result["uri"], result["cid"]
+227
backend/src/backend/_internal/atproto/records/fm_plyr/track.py
··· 1 + """fm.plyr.track record operations.""" 2 + 3 + import logging 4 + from datetime import UTC, datetime 5 + from typing import Any 6 + 7 + from backend._internal import Session as AuthSession 8 + from backend._internal.atproto.client import make_pds_request, parse_at_uri 9 + from backend.config import settings 10 + 11 + logger = logging.getLogger(__name__) 12 + 13 + 14 + def build_track_record( 15 + title: str, 16 + artist: str, 17 + audio_url: str, 18 + file_type: str, 19 + album: str | None = None, 20 + duration: int | None = None, 21 + features: list[dict] | None = None, 22 + image_url: str | None = None, 23 + ) -> dict[str, Any]: 24 + """Build a track record dict for ATProto. 25 + 26 + args: 27 + title: track title 28 + artist: artist name 29 + audio_url: R2 URL for audio file 30 + file_type: file extension (mp3, wav, etc) 31 + album: optional album name 32 + duration: optional duration in seconds 33 + features: optional list of featured artists [{did, handle, display_name, avatar_url}] 34 + image_url: optional cover art image URL 35 + 36 + returns: 37 + record dict ready for ATProto 38 + """ 39 + record: dict[str, Any] = { 40 + "$type": settings.atproto.track_collection, 41 + "title": title, 42 + "artist": artist, 43 + "audioUrl": audio_url, 44 + "fileType": file_type, 45 + "createdAt": datetime.now(UTC).isoformat().replace("+00:00", "Z"), 46 + } 47 + 48 + # add optional fields 49 + if album: 50 + record["album"] = album 51 + if duration: 52 + record["duration"] = duration 53 + if features: 54 + # only include essential fields for ATProto record 55 + record["features"] = [ 56 + { 57 + "did": f["did"], 58 + "handle": f["handle"], 59 + "displayName": f.get("display_name", f["handle"]), 60 + } 61 + for f in features 62 + ] 63 + if image_url: 64 + # validate image URL comes from allowed origin 65 + settings.storage.validate_image_url(image_url) 66 + record["imageUrl"] = image_url 67 + 68 + return record 69 + 70 + 71 + async def create_track_record( 72 + auth_session: AuthSession, 73 + title: str, 74 + artist: str, 75 + audio_url: str, 76 + file_type: str, 77 + album: str | None = None, 78 + duration: int | None = None, 79 + features: list[dict] | None = None, 80 + image_url: str | None = None, 81 + ) -> tuple[str, str]: 82 + """Create a track record on the user's PDS using the configured collection. 
83 + 84 + args: 85 + auth_session: authenticated user session 86 + title: track title 87 + artist: artist name 88 + audio_url: R2 URL for audio file 89 + file_type: file extension (mp3, wav, etc) 90 + album: optional album name 91 + duration: optional duration in seconds 92 + features: optional list of featured artists [{did, handle, display_name, avatar_url}] 93 + image_url: optional cover art image URL 94 + 95 + returns: 96 + tuple of (record_uri, record_cid) 97 + 98 + raises: 99 + ValueError: if session is invalid 100 + Exception: if record creation fails 101 + """ 102 + record = build_track_record( 103 + title=title, 104 + artist=artist, 105 + audio_url=audio_url, 106 + file_type=file_type, 107 + album=album, 108 + duration=duration, 109 + features=features, 110 + image_url=image_url, 111 + ) 112 + 113 + payload = { 114 + "repo": auth_session.did, 115 + "collection": settings.atproto.track_collection, 116 + "record": record, 117 + } 118 + 119 + result = await make_pds_request( 120 + auth_session, "POST", "com.atproto.repo.createRecord", payload 121 + ) 122 + return result["uri"], result["cid"] 123 + 124 + 125 + async def get_record_public( 126 + record_uri: str, 127 + pds_url: str | None = None, 128 + ) -> dict[str, Any]: 129 + """fetch an ATProto record without authentication. 130 + 131 + ATProto records are public by design - any client can read them. 132 + uses the owner's PDS URL if provided, otherwise falls back to 133 + bsky.network relay which indexes all public records. 134 + 135 + args: 136 + record_uri: AT URI of the record (at://did/collection/rkey) 137 + pds_url: optional PDS URL to use (falls back to bsky.network) 138 + 139 + returns: 140 + the record value dict 141 + 142 + raises: 143 + ValueError: if URI is malformed 144 + Exception: if fetch fails 145 + """ 146 + import httpx 147 + 148 + repo, collection, rkey = parse_at_uri(record_uri) 149 + 150 + base_url = pds_url or "https://bsky.network" 151 + url = f"{base_url}/xrpc/com.atproto.repo.getRecord" 152 + params = {"repo": repo, "collection": collection, "rkey": rkey} 153 + 154 + async with httpx.AsyncClient() as client: 155 + response = await client.get(url, params=params, timeout=10.0) 156 + 157 + if response.status_code != 200: 158 + raise Exception( 159 + f"failed to fetch record: {response.status_code} {response.text}" 160 + ) 161 + 162 + return response.json() 163 + 164 + 165 + async def update_record( 166 + auth_session: AuthSession, 167 + record_uri: str, 168 + record: dict[str, Any], 169 + ) -> tuple[str, str]: 170 + """Update an existing record on the user's PDS. 171 + 172 + args: 173 + auth_session: authenticated user session 174 + record_uri: AT URI of the record to update (e.g., at://did:plc:.../fm.plyr.track/...) 175 + record: complete record data to update with (must include $type) 176 + 177 + returns: 178 + tuple of (record_uri, record_cid) 179 + 180 + raises: 181 + ValueError: if session is invalid or URI is malformed 182 + Exception: if record update fails 183 + """ 184 + repo, collection, rkey = parse_at_uri(record_uri) 185 + 186 + payload = { 187 + "repo": repo, 188 + "collection": collection, 189 + "rkey": rkey, 190 + "record": record, 191 + } 192 + 193 + result = await make_pds_request( 194 + auth_session, "POST", "com.atproto.repo.putRecord", payload 195 + ) 196 + return result["uri"], result["cid"] 197 + 198 + 199 + async def delete_record_by_uri( 200 + auth_session: AuthSession, 201 + record_uri: str, 202 + ) -> None: 203 + """delete a record on the user's PDS. 
204 + 205 + args: 206 + auth_session: authenticated user session 207 + record_uri: AT URI of the record to delete 208 + 209 + raises: 210 + ValueError: if session is invalid or URI is malformed 211 + Exception: if record deletion fails 212 + """ 213 + repo, collection, rkey = parse_at_uri(record_uri) 214 + 215 + payload = { 216 + "repo": repo, 217 + "collection": collection, 218 + "rkey": rkey, 219 + } 220 + 221 + await make_pds_request( 222 + auth_session, 223 + "POST", 224 + "com.atproto.repo.deleteRecord", 225 + payload, 226 + success_codes=(200, 201, 204), 227 + )
+9
backend/src/backend/_internal/atproto/records/fm_teal/__init__.py
··· 1 + """fm.teal.* lexicon record types (scrobbling integration).""" 2 + 3 + from backend._internal.atproto.records.fm_teal.play import create_teal_play_record 4 + from backend._internal.atproto.records.fm_teal.status import update_teal_status 5 + 6 + __all__ = [ 7 + "create_teal_play_record", 8 + "update_teal_status", 9 + ]
+93
backend/src/backend/_internal/atproto/records/fm_teal/play.py
··· 1 + """fm.teal.play record operations (scrobbling).""" 2 + 3 + from datetime import UTC, datetime 4 + from typing import Any 5 + 6 + from backend._internal import Session as AuthSession 7 + from backend._internal.atproto.client import make_pds_request 8 + from backend.config import settings 9 + 10 + 11 + def build_teal_play_record( 12 + track_name: str, 13 + artist_name: str, 14 + duration: int | None = None, 15 + album_name: str | None = None, 16 + origin_url: str | None = None, 17 + ) -> dict[str, Any]: 18 + """build a teal.fm play record for scrobbling. 19 + 20 + args: 21 + track_name: track title 22 + artist_name: primary artist name 23 + duration: track duration in seconds 24 + album_name: optional album/release name 25 + origin_url: optional URL to the track on plyr.fm 26 + 27 + returns: 28 + record dict ready for ATProto 29 + """ 30 + now = datetime.now(UTC) 31 + 32 + record: dict[str, Any] = { 33 + "$type": settings.teal.play_collection, 34 + "trackName": track_name, 35 + "artists": [{"artistName": artist_name}], 36 + "musicServiceBaseDomain": "plyr.fm", 37 + "submissionClientAgent": "plyr.fm/1.0", 38 + "playedTime": now.isoformat().replace("+00:00", "Z"), 39 + } 40 + 41 + if duration: 42 + record["duration"] = duration 43 + if album_name: 44 + record["releaseName"] = album_name 45 + if origin_url: 46 + record["originUrl"] = origin_url 47 + 48 + return record 49 + 50 + 51 + async def create_teal_play_record( 52 + auth_session: AuthSession, 53 + track_name: str, 54 + artist_name: str, 55 + duration: int | None = None, 56 + album_name: str | None = None, 57 + origin_url: str | None = None, 58 + ) -> str: 59 + """create a teal.fm play record (scrobble) on the user's PDS. 60 + 61 + args: 62 + auth_session: authenticated user session with teal scopes 63 + track_name: track title 64 + artist_name: primary artist name 65 + duration: track duration in seconds 66 + album_name: optional album/release name 67 + origin_url: optional URL to the track on plyr.fm 68 + 69 + returns: 70 + record URI 71 + 72 + raises: 73 + ValueError: if session is invalid 74 + Exception: if record creation fails 75 + """ 76 + record = build_teal_play_record( 77 + track_name=track_name, 78 + artist_name=artist_name, 79 + duration=duration, 80 + album_name=album_name, 81 + origin_url=origin_url, 82 + ) 83 + 84 + payload = { 85 + "repo": auth_session.did, 86 + "collection": settings.teal.play_collection, 87 + "record": record, 88 + } 89 + 90 + result = await make_pds_request( 91 + auth_session, "POST", "com.atproto.repo.createRecord", payload 92 + ) 93 + return result["uri"]
+105
backend/src/backend/_internal/atproto/records/fm_teal/status.py
··· 1 + """fm.teal.actor.status record operations (now playing).""" 2 + 3 + from datetime import UTC, datetime 4 + from typing import Any 5 + 6 + from backend._internal import Session as AuthSession 7 + from backend._internal.atproto.client import make_pds_request 8 + from backend.config import settings 9 + 10 + 11 + def build_teal_status_record( 12 + track_name: str, 13 + artist_name: str, 14 + duration: int | None = None, 15 + album_name: str | None = None, 16 + origin_url: str | None = None, 17 + ) -> dict[str, Any]: 18 + """build a teal.fm actor status record (now playing). 19 + 20 + args: 21 + track_name: track title 22 + artist_name: primary artist name 23 + duration: track duration in seconds 24 + album_name: optional album/release name 25 + origin_url: optional URL to the track on plyr.fm 26 + 27 + returns: 28 + record dict ready for ATProto 29 + """ 30 + now = datetime.now(UTC) 31 + # expiry defaults to 10 minutes from now 32 + expiry = datetime.fromtimestamp(now.timestamp() + 600, UTC) 33 + 34 + # build the playView item 35 + item: dict[str, Any] = { 36 + "trackName": track_name, 37 + "artists": [{"artistName": artist_name}], 38 + "musicServiceBaseDomain": "plyr.fm", 39 + "submissionClientAgent": "plyr.fm/1.0", 40 + "playedTime": now.isoformat().replace("+00:00", "Z"), 41 + } 42 + 43 + if duration: 44 + item["duration"] = duration 45 + if album_name: 46 + item["releaseName"] = album_name 47 + if origin_url: 48 + item["originUrl"] = origin_url 49 + 50 + record: dict[str, Any] = { 51 + "$type": settings.teal.status_collection, 52 + "time": now.isoformat().replace("+00:00", "Z"), 53 + "expiry": expiry.isoformat().replace("+00:00", "Z"), 54 + "item": item, 55 + } 56 + 57 + return record 58 + 59 + 60 + async def update_teal_status( 61 + auth_session: AuthSession, 62 + track_name: str, 63 + artist_name: str, 64 + duration: int | None = None, 65 + album_name: str | None = None, 66 + origin_url: str | None = None, 67 + ) -> str: 68 + """update the user's teal.fm status (now playing). 69 + 70 + uses putRecord with rkey "self" as per the lexicon spec. 71 + 72 + args: 73 + auth_session: authenticated user session with teal scopes 74 + track_name: track title 75 + artist_name: primary artist name 76 + duration: track duration in seconds 77 + album_name: optional album/release name 78 + origin_url: optional URL to the track on plyr.fm 79 + 80 + returns: 81 + record URI 82 + 83 + raises: 84 + ValueError: if session is invalid 85 + Exception: if record creation fails 86 + """ 87 + record = build_teal_status_record( 88 + track_name=track_name, 89 + artist_name=artist_name, 90 + duration=duration, 91 + album_name=album_name, 92 + origin_url=origin_url, 93 + ) 94 + 95 + payload = { 96 + "repo": auth_session.did, 97 + "collection": settings.teal.status_collection, 98 + "rkey": "self", 99 + "record": record, 100 + } 101 + 102 + result = await make_pds_request( 103 + auth_session, "POST", "com.atproto.repo.putRecord", payload 104 + ) 105 + return result["uri"]
+137
backend/src/backend/_internal/atproto/sync.py
··· 1 + """high-level ATProto record synchronization.""" 2 + 3 + import logging 4 + 5 + from sqlalchemy import select 6 + 7 + from backend._internal import Session as AuthSession 8 + from backend._internal.atproto.records.fm_plyr import ( 9 + upsert_album_list_record, 10 + upsert_liked_list_record, 11 + upsert_profile_record, 12 + ) 13 + from backend.utilities.database import db_session 14 + 15 + logger = logging.getLogger(__name__) 16 + 17 + 18 + async def sync_atproto_records( 19 + auth_session: AuthSession, 20 + user_did: str, 21 + ) -> None: 22 + """sync profile, albums, and liked tracks to ATProto. 23 + 24 + this is the actual sync logic - runs all queries and PDS calls. 25 + should be called from a background task to avoid blocking. 26 + """ 27 + from backend.models import Album, Artist, Track, TrackLike, UserPreferences 28 + 29 + # sync profile record 30 + async with db_session() as session: 31 + artist_result = await session.execute( 32 + select(Artist).where(Artist.did == user_did) 33 + ) 34 + artist = artist_result.scalar_one_or_none() 35 + artist_bio = artist.bio if artist else None 36 + 37 + if artist_bio is not None or artist: 38 + try: 39 + profile_result = await upsert_profile_record(auth_session, bio=artist_bio) 40 + if profile_result: 41 + logger.info(f"synced ATProto profile record for {user_did}") 42 + except Exception as e: 43 + logger.warning(f"failed to sync ATProto profile record for {user_did}: {e}") 44 + 45 + # query and sync album list records 46 + async with db_session() as session: 47 + albums_result = await session.execute( 48 + select(Album).where(Album.artist_did == user_did) 49 + ) 50 + albums = albums_result.scalars().all() 51 + 52 + for album in albums: 53 + tracks_result = await session.execute( 54 + select(Track) 55 + .where( 56 + Track.album_id == album.id, 57 + Track.atproto_record_uri.isnot(None), 58 + Track.atproto_record_cid.isnot(None), 59 + ) 60 + .order_by(Track.created_at.asc()) 61 + ) 62 + tracks = tracks_result.scalars().all() 63 + 64 + if tracks: 65 + track_refs = [ 66 + {"uri": t.atproto_record_uri, "cid": t.atproto_record_cid} 67 + for t in tracks 68 + ] 69 + try: 70 + album_result = await upsert_album_list_record( 71 + auth_session, 72 + album_id=album.id, 73 + album_title=album.title, 74 + track_refs=track_refs, 75 + existing_uri=album.atproto_record_uri, 76 + ) 77 + if album_result: 78 + album.atproto_record_uri = album_result[0] 79 + album.atproto_record_cid = album_result[1] 80 + await session.commit() 81 + logger.info( 82 + f"synced album list record for {album.id}: {album_result[0]}" 83 + ) 84 + except Exception as e: 85 + logger.warning( 86 + f"failed to sync album list record for {album.id}: {e}" 87 + ) 88 + 89 + # query and sync liked tracks list record 90 + async with db_session() as session: 91 + prefs_result = await session.execute( 92 + select(UserPreferences).where(UserPreferences.did == user_did) 93 + ) 94 + prefs = prefs_result.scalar_one_or_none() 95 + 96 + likes_result = await session.execute( 97 + select(Track) 98 + .join(TrackLike, TrackLike.track_id == Track.id) 99 + .where( 100 + TrackLike.user_did == user_did, 101 + Track.atproto_record_uri.isnot(None), 102 + Track.atproto_record_cid.isnot(None), 103 + ) 104 + .order_by(TrackLike.created_at.desc()) 105 + ) 106 + liked_tracks = likes_result.scalars().all() 107 + 108 + if liked_tracks: 109 + liked_refs = [ 110 + {"uri": t.atproto_record_uri, "cid": t.atproto_record_cid} 111 + for t in liked_tracks 112 + ] 113 + existing_liked_uri = prefs.liked_list_uri if prefs else 
None 114 + 115 + try: 116 + liked_result = await upsert_liked_list_record( 117 + auth_session, 118 + track_refs=liked_refs, 119 + existing_uri=existing_liked_uri, 120 + ) 121 + if liked_result: 122 + if prefs: 123 + prefs.liked_list_uri = liked_result[0] 124 + prefs.liked_list_cid = liked_result[1] 125 + else: 126 + prefs = UserPreferences( 127 + did=user_did, 128 + liked_list_uri=liked_result[0], 129 + liked_list_cid=liked_result[1], 130 + ) 131 + session.add(prefs) 132 + await session.commit() 133 + logger.info( 134 + f"synced liked list record for {user_did}: {liked_result[0]}" 135 + ) 136 + except Exception as e: 137 + logger.warning(f"failed to sync liked list record for {user_did}: {e}")
+15 -190
backend/src/backend/_internal/atproto/teal.py
··· 1 - """teal.fm record creation for scrobbling integration.""" 2 - 3 - import logging 4 - from datetime import UTC, datetime 5 - from typing import Any 6 - 7 - from backend._internal import Session as AuthSession 8 - from backend._internal.atproto.records import _make_pds_request 9 - from backend.config import settings 10 - 11 - logger = logging.getLogger(__name__) 12 - 13 - 14 - def build_teal_play_record( 15 - track_name: str, 16 - artist_name: str, 17 - duration: int | None = None, 18 - album_name: str | None = None, 19 - origin_url: str | None = None, 20 - ) -> dict[str, Any]: 21 - """build a teal.fm play record for scrobbling. 22 23 - args: 24 - track_name: track title 25 - artist_name: primary artist name 26 - duration: track duration in seconds 27 - album_name: optional album/release name 28 - origin_url: optional URL to the track on plyr.fm 29 30 - returns: 31 - record dict ready for ATProto 32 - """ 33 - now = datetime.now(UTC) 34 35 - record: dict[str, Any] = { 36 - "$type": settings.teal.play_collection, 37 - "trackName": track_name, 38 - "artists": [{"artistName": artist_name}], 39 - "musicServiceBaseDomain": "plyr.fm", 40 - "submissionClientAgent": "plyr.fm/1.0", 41 - "playedTime": now.isoformat().replace("+00:00", "Z"), 42 - } 43 - 44 - if duration: 45 - record["duration"] = duration 46 - if album_name: 47 - record["releaseName"] = album_name 48 - if origin_url: 49 - record["originUrl"] = origin_url 50 - 51 - return record 52 - 53 - 54 - def build_teal_status_record( 55 - track_name: str, 56 - artist_name: str, 57 - duration: int | None = None, 58 - album_name: str | None = None, 59 - origin_url: str | None = None, 60 - ) -> dict[str, Any]: 61 - """build a teal.fm actor status record (now playing). 62 - 63 - args: 64 - track_name: track title 65 - artist_name: primary artist name 66 - duration: track duration in seconds 67 - album_name: optional album/release name 68 - origin_url: optional URL to the track on plyr.fm 69 - 70 - returns: 71 - record dict ready for ATProto 72 - """ 73 - now = datetime.now(UTC) 74 - # expiry defaults to 10 minutes from now 75 - expiry = datetime.fromtimestamp(now.timestamp() + 600, UTC) 76 - 77 - # build the playView item 78 - item: dict[str, Any] = { 79 - "trackName": track_name, 80 - "artists": [{"artistName": artist_name}], 81 - "musicServiceBaseDomain": "plyr.fm", 82 - "submissionClientAgent": "plyr.fm/1.0", 83 - "playedTime": now.isoformat().replace("+00:00", "Z"), 84 - } 85 - 86 - if duration: 87 - item["duration"] = duration 88 - if album_name: 89 - item["releaseName"] = album_name 90 - if origin_url: 91 - item["originUrl"] = origin_url 92 - 93 - record: dict[str, Any] = { 94 - "$type": settings.teal.status_collection, 95 - "time": now.isoformat().replace("+00:00", "Z"), 96 - "expiry": expiry.isoformat().replace("+00:00", "Z"), 97 - "item": item, 98 - } 99 - 100 - return record 101 - 102 - 103 - async def create_teal_play_record( 104 - auth_session: AuthSession, 105 - track_name: str, 106 - artist_name: str, 107 - duration: int | None = None, 108 - album_name: str | None = None, 109 - origin_url: str | None = None, 110 - ) -> str: 111 - """create a teal.fm play record (scrobble) on the user's PDS. 
112 - 113 - args: 114 - auth_session: authenticated user session with teal scopes 115 - track_name: track title 116 - artist_name: primary artist name 117 - duration: track duration in seconds 118 - album_name: optional album/release name 119 - origin_url: optional URL to the track on plyr.fm 120 - 121 - returns: 122 - record URI 123 - 124 - raises: 125 - ValueError: if session is invalid 126 - Exception: if record creation fails 127 - """ 128 - record = build_teal_play_record( 129 - track_name=track_name, 130 - artist_name=artist_name, 131 - duration=duration, 132 - album_name=album_name, 133 - origin_url=origin_url, 134 - ) 135 - 136 - payload = { 137 - "repo": auth_session.did, 138 - "collection": settings.teal.play_collection, 139 - "record": record, 140 - } 141 - 142 - result = await _make_pds_request( 143 - auth_session, "POST", "com.atproto.repo.createRecord", payload 144 - ) 145 - return result["uri"] 146 - 147 - 148 - async def update_teal_status( 149 - auth_session: AuthSession, 150 - track_name: str, 151 - artist_name: str, 152 - duration: int | None = None, 153 - album_name: str | None = None, 154 - origin_url: str | None = None, 155 - ) -> str: 156 - """update the user's teal.fm status (now playing). 157 - 158 - uses putRecord with rkey "self" as per the lexicon spec. 159 - 160 - args: 161 - auth_session: authenticated user session with teal scopes 162 - track_name: track title 163 - artist_name: primary artist name 164 - duration: track duration in seconds 165 - album_name: optional album/release name 166 - origin_url: optional URL to the track on plyr.fm 167 - 168 - returns: 169 - record URI 170 - 171 - raises: 172 - ValueError: if session is invalid 173 - Exception: if record creation fails 174 - """ 175 - record = build_teal_status_record( 176 - track_name=track_name, 177 - artist_name=artist_name, 178 - duration=duration, 179 - album_name=album_name, 180 - origin_url=origin_url, 181 - ) 182 - 183 - payload = { 184 - "repo": auth_session.did, 185 - "collection": settings.teal.status_collection, 186 - "rkey": "self", 187 - "record": record, 188 - } 189 - 190 - result = await _make_pds_request( 191 - auth_session, "POST", "com.atproto.repo.putRecord", payload 192 - ) 193 - return result["uri"]
··· 1 + """backward compatibility - re-exports from fm_teal package. 2 3 + DEPRECATED: import from backend._internal.atproto.records.fm_teal instead. 4 + """ 5 6 + from backend._internal.atproto.records.fm_teal import ( 7 + create_teal_play_record, 8 + update_teal_status, 9 + ) 10 + from backend._internal.atproto.records.fm_teal.play import build_teal_play_record 11 + from backend._internal.atproto.records.fm_teal.status import build_teal_status_record 12 13 + __all__ = [ 14 + "build_teal_play_record", 15 + "build_teal_status_record", 16 + "create_teal_play_record", 17 + "update_teal_status", 18 + ]
+131
backend/src/backend/_internal/background.py
··· 1 + """background task infrastructure using pydocket. 2 + 3 + provides a docket instance for scheduling background tasks and a worker 4 + that runs alongside the FastAPI server. requires DOCKET_URL to be set 5 + to a Redis URL for durable execution across multiple machines. 6 + 7 + usage: 8 + from backend._internal.background import get_docket, is_docket_enabled 9 + 10 + if is_docket_enabled(): 11 + docket = get_docket() 12 + await docket.add(my_task_function)(arg1, arg2) 13 + else: 14 + # fallback to direct execution or FastAPI BackgroundTasks 15 + await my_task_function(arg1, arg2) 16 + """ 17 + 18 + import asyncio 19 + import logging 20 + from collections.abc import AsyncGenerator 21 + from contextlib import asynccontextmanager 22 + 23 + from docket import Docket, Worker 24 + 25 + from backend.config import settings 26 + 27 + logger = logging.getLogger(__name__) 28 + 29 + # global docket instance - initialized in lifespan (None if disabled) 30 + _docket: Docket | None = None 31 + _docket_enabled: bool = False 32 + 33 + 34 + def is_docket_enabled() -> bool: 35 + """check if docket is enabled and initialized.""" 36 + return _docket_enabled and _docket is not None 37 + 38 + 39 + def get_docket() -> Docket: 40 + """get the global docket instance. 41 + 42 + raises: 43 + RuntimeError: if docket is not initialized or disabled 44 + """ 45 + if not _docket_enabled: 46 + raise RuntimeError("docket is disabled - set DOCKET_URL to enable") 47 + if _docket is None: 48 + raise RuntimeError("docket not initialized - is the server running?") 49 + return _docket 50 + 51 + 52 + @asynccontextmanager 53 + async def background_worker_lifespan() -> AsyncGenerator[Docket | None, None]: 54 + """lifespan context manager for docket and its worker. 55 + 56 + if DOCKET_URL is not set, docket is disabled and this yields None. 57 + when enabled, initializes the docket connection and starts an in-process 58 + worker that processes background tasks. 
59 + 60 + yields: 61 + Docket | None: the initialized docket instance, or None if disabled 62 + """ 63 + global _docket, _docket_enabled 64 + 65 + # check if docket should be enabled 66 + if not settings.docket.url: 67 + logger.info("docket disabled (DOCKET_URL not set)") 68 + _docket_enabled = False 69 + yield None 70 + return 71 + 72 + _docket_enabled = True 73 + logger.info( 74 + "initializing docket", 75 + extra={"docket_name": settings.docket.name, "url": settings.docket.url}, 76 + ) 77 + 78 + async with Docket( 79 + name=settings.docket.name, 80 + url=settings.docket.url, 81 + ) as docket: 82 + _docket = docket 83 + 84 + # register all background task functions 85 + _register_tasks(docket) 86 + 87 + # start worker as background task 88 + worker_task: asyncio.Task[None] | None = None 89 + try: 90 + async with Worker( 91 + docket, 92 + concurrency=settings.docket.worker_concurrency, 93 + ) as worker: 94 + worker_task = asyncio.create_task( 95 + worker.run_forever(), 96 + name="docket-worker", 97 + ) 98 + logger.info( 99 + "docket worker started", 100 + extra={"concurrency": settings.docket.worker_concurrency}, 101 + ) 102 + yield docket 103 + finally: 104 + # cancel the worker task and wait for it to finish 105 + if worker_task: 106 + worker_task.cancel() 107 + try: 108 + await worker_task 109 + except asyncio.CancelledError: 110 + logger.debug("docket worker task cancelled") 111 + # clear globals after worker is fully stopped 112 + _docket = None 113 + _docket_enabled = False 114 + logger.info("docket worker stopped") 115 + 116 + 117 + def _register_tasks(docket: Docket) -> None: 118 + """register all background task functions with the docket. 119 + 120 + tasks must be registered before they can be executed by workers. 121 + add new task imports here as they're created. 122 + """ 123 + # import task functions here to avoid circular imports 124 + from backend._internal.background_tasks import scan_copyright 125 + 126 + docket.register(scan_copyright) 127 + 128 + logger.info( 129 + "registered background tasks", 130 + extra={"tasks": ["scan_copyright"]}, 131 + )
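note: the lifespan above is meant to be composed with the FastAPI application lifespan so the worker starts and stops with the server. the sketch below is illustrative only (not part of this diff); the app module and wiring are assumptions.

    # hypothetical wiring sketch, assuming the app composes this lifespan directly
    from contextlib import asynccontextmanager

    from fastapi import FastAPI

    from backend._internal.background import background_worker_lifespan


    @asynccontextmanager
    async def lifespan(app: FastAPI):
        # docket is None when DOCKET_URL is unset; schedule_copyright_scan
        # then falls back to fire-and-forget asyncio.create_task
        async with background_worker_lifespan() as docket:
            yield


    app = FastAPI(lifespan=lifespan)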
+55
backend/src/backend/_internal/background_tasks.py
··· 1 + """background task functions for docket. 2 + 3 + these functions are registered with docket and executed by workers. 4 + they should be self-contained and handle their own database sessions. 5 + """ 6 + 7 + import asyncio 8 + import logging 9 + 10 + import logfire 11 + 12 + from backend._internal.background import get_docket, is_docket_enabled 13 + 14 + logger = logging.getLogger(__name__) 15 + 16 + 17 + async def scan_copyright(track_id: int, audio_url: str) -> None: 18 + """scan a track for potential copyright matches. 19 + 20 + this is the docket version of the copyright scan task. when docket 21 + is enabled (DOCKET_URL set), this provides durability and retries 22 + compared to fire-and-forget asyncio.create_task(). 23 + 24 + args: 25 + track_id: database ID of the track to scan 26 + audio_url: public URL of the audio file (R2) 27 + """ 28 + from backend._internal.moderation import scan_track_for_copyright 29 + 30 + await scan_track_for_copyright(track_id, audio_url) 31 + 32 + 33 + async def schedule_copyright_scan(track_id: int, audio_url: str) -> None: 34 + """schedule a copyright scan, using docket if enabled, else asyncio. 35 + 36 + this is the entry point for scheduling copyright scans. it handles 37 + the docket vs asyncio fallback logic in one place. 38 + """ 39 + from backend._internal.moderation import scan_track_for_copyright 40 + 41 + if is_docket_enabled(): 42 + try: 43 + docket = get_docket() 44 + await docket.add(scan_copyright)(track_id, audio_url) 45 + logfire.info("scheduled copyright scan via docket", track_id=track_id) 46 + return 47 + except Exception as e: 48 + logfire.warning( 49 + "docket scheduling failed, falling back to asyncio", 50 + track_id=track_id, 51 + error=str(e), 52 + ) 53 + 54 + # fallback: fire-and-forget 55 + asyncio.create_task(scan_track_for_copyright(track_id, audio_url)) # noqa: RUF006
+242 -262
backend/src/backend/api/tracks/uploads.py
··· 5 import json 6 import logging 7 import tempfile 8 from datetime import UTC, datetime 9 from pathlib import Path 10 from typing import Annotated ··· 27 from backend._internal import Session as AuthSession 28 from backend._internal import require_artist_profile 29 from backend._internal.atproto import create_track_record 30 - from backend._internal.atproto.handles import resolve_handle 31 from backend._internal.audio import AudioFormat 32 from backend._internal.image import ImageFormat 33 from backend._internal.jobs import job_service 34 - from backend._internal.moderation import scan_track_for_copyright 35 from backend.config import settings 36 from backend.models import Artist, Tag, Track, TrackTag 37 from backend.models.job import JobStatus, JobType ··· 49 logger = logging.getLogger(__name__) 50 51 52 async def _get_or_create_tag( 53 db: "AsyncSession", tag_name: str, creator_did: str 54 ) -> Tag: ··· 87 return tag 88 89 90 - async def _process_upload_background( 91 upload_id: str, 92 file_path: str, 93 filename: str, 94 - title: str, 95 - artist_did: str, 96 - album: str | None, 97 - features: str | None, 98 validated_tags: list[str], 99 - auth_session: AuthSession, 100 - image_path: str | None = None, 101 - image_filename: str | None = None, 102 - image_content_type: str | None = None, 103 ) -> None: 104 - """Background task to process upload.""" 105 with logfire.span( 106 - "process upload background", upload_id=upload_id, filename=filename 107 ): 108 try: 109 await job_service.update_progress( 110 - upload_id, JobStatus.PROCESSING, "processing upload..." 111 ) 112 113 # validate file type 114 - ext = Path(filename).suffix.lower() 115 audio_format = AudioFormat.from_extension(ext) 116 if not audio_format: 117 await job_service.update_progress( 118 - upload_id, 119 JobStatus.FAILED, 120 "upload failed", 121 error=f"unsupported file type: {ext}", 122 ) 123 return 124 125 - # extract duration from audio file 126 - with open(file_path, "rb") as f: 127 duration = extract_duration(f) 128 - if duration: 129 - logfire.info("extracted duration", duration=duration) 130 131 - # save audio file 132 - await job_service.update_progress( 133 - upload_id, 134 - JobStatus.PROCESSING, 135 - "uploading to storage...", 136 - phase="upload", 137 - progress_pct=0, 138 ) 139 - try: 140 - logfire.info("preparing to save audio file", filename=filename) 141 - 142 - async with R2ProgressTracker( 143 - job_id=upload_id, 144 - message="uploading to storage...", 145 - phase="upload", 146 - ) as tracker: 147 - with open(file_path, "rb") as file_obj: 148 - logfire.info("calling storage.save") 149 - file_id = await storage.save( 150 - file_obj, filename, progress_callback=tracker.on_progress 151 - ) 152 - 153 - # Final 100% update 154 - await job_service.update_progress( 155 - upload_id, 156 - JobStatus.PROCESSING, 157 - "uploading to storage...", 158 - phase="upload", 159 - progress_pct=100.0, 160 - ) 161 - 162 - logfire.info("storage.save completed", file_id=file_id) 163 - 164 - except ValueError as e: 165 - logfire.error("ValueError during storage.save", error=str(e)) 166 - await job_service.update_progress( 167 - upload_id, JobStatus.FAILED, "upload failed", error=str(e) 168 - ) 169 - return 170 - except Exception as e: 171 - logfire.error( 172 - "unexpected exception during storage.save", 173 - error=str(e), 174 - exc_info=True, 175 - ) 176 - await job_service.update_progress( 177 - upload_id, JobStatus.FAILED, "upload failed", error=str(e) 178 - ) 179 return 180 181 - # check for duplicate uploads (same 
file_id + artist_did) 182 async with db_session() as db: 183 - stmt = select(Track).where( 184 - Track.file_id == file_id, 185 - Track.artist_did == artist_did, 186 - ) 187 - result = await db.execute(stmt) 188 - existing_track = result.scalar_one_or_none() 189 - 190 - if existing_track: 191 - logfire.info( 192 - "duplicate upload detected, returning existing track", 193 - file_id=file_id, 194 - existing_track_id=existing_track.id, 195 - artist_did=artist_did, 196 ) 197 await job_service.update_progress( 198 - upload_id, 199 JobStatus.FAILED, 200 "upload failed", 201 - error=f"duplicate upload: track already exists (id: {existing_track.id})", 202 ) 203 return 204 ··· 206 r2_url = await storage.get_url( 207 file_id, file_type="audio", extension=ext[1:] 208 ) 209 210 # save image if provided 211 - image_id = None 212 image_url = None 213 - if image_path and image_filename: 214 - await job_service.update_progress( 215 - upload_id, 216 - JobStatus.PROCESSING, 217 - "saving image...", 218 - phase="image", 219 - ) 220 - # use content_type for format detection (more reliable on iOS) 221 - image_format, is_valid = ImageFormat.validate_and_extract( 222 - image_filename, image_content_type 223 ) 224 - if is_valid and image_format: 225 - try: 226 - with open(image_path, "rb") as image_obj: 227 - # save with images/ prefix to namespace it 228 - image_id = await storage.save( 229 - image_obj, f"images/{image_filename}" 230 - ) 231 - # get R2 URL for image 232 - image_url = await storage.get_url( 233 - image_id, file_type="image" 234 - ) 235 - except Exception as e: 236 - logger.warning(f"failed to save image: {e}", exc_info=True) 237 - # continue without image - it's optional 238 - else: 239 - logger.warning(f"unsupported image format: {image_filename}") 240 241 - # get artist and resolve features 242 async with db_session() as db: 243 result = await db.execute( 244 - select(Artist).where(Artist.did == artist_did) 245 ) 246 artist = result.scalar_one_or_none() 247 if not artist: 248 await job_service.update_progress( 249 - upload_id, 250 JobStatus.FAILED, 251 "upload failed", 252 error="artist profile not found", 253 ) 254 return 255 256 - # resolve featured artist handles 257 - featured_artists = [] 258 - if features: 259 await job_service.update_progress( 260 - upload_id, 261 JobStatus.PROCESSING, 262 "resolving featured artists...", 263 phase="metadata", 264 ) 265 - try: 266 - handles_list = json.loads(features) 267 - if isinstance(handles_list, list): 268 - # filter valid handles and batch resolve concurrently 269 - valid_handles = [ 270 - handle 271 - for handle in handles_list 272 - if isinstance(handle, str) 273 - and handle.lstrip("@") != artist.handle 274 - ] 275 - if valid_handles: 276 - resolved_artists = await asyncio.gather( 277 - *[resolve_handle(h) for h in valid_handles], 278 - return_exceptions=True, 279 - ) 280 - # filter out exceptions and None values 281 - featured_artists = [ 282 - r 283 - for r in resolved_artists 284 - if isinstance(r, dict) and r is not None 285 - ] 286 - except json.JSONDecodeError: 287 - pass # ignore malformed features 288 - 289 - async def fail_atproto_sync(reason: str) -> None: 290 - """mark upload as failed when ATProto sync cannot complete.""" 291 292 logger.error( 293 - "upload %s failed during ATProto sync", 294 - upload_id, 295 - extra={ 296 - "file_id": file_id, 297 - "artist_did": artist_did, 298 - "reason": reason, 299 - }, 300 ) 301 await job_service.update_progress( 302 - upload_id, 303 JobStatus.FAILED, 304 "upload failed", 305 - error=f"failed 
to sync track to ATProto: {reason}", 306 phase="atproto", 307 ) 308 - 309 - # delete uploaded media so we don't leave orphaned files behind 310 with contextlib.suppress(Exception): 311 await storage.delete(file_id, audio_format.value) 312 if image_id: 313 with contextlib.suppress(Exception): 314 await storage.delete(image_id) 315 - 316 - # create ATProto record 317 - atproto_uri = None 318 - atproto_cid = None 319 - if r2_url: 320 - await job_service.update_progress( 321 - upload_id, 322 - JobStatus.PROCESSING, 323 - "creating atproto record...", 324 - phase="atproto", 325 - ) 326 - try: 327 - result = await create_track_record( 328 - auth_session=auth_session, 329 - title=title, 330 - artist=artist.display_name, 331 - audio_url=r2_url, 332 - file_type=ext[1:], 333 - album=album, 334 - duration=duration, 335 - features=featured_artists if featured_artists else None, 336 - image_url=image_url, 337 - ) 338 - if not result: 339 - await fail_atproto_sync("PDS returned no record data") 340 - return 341 - atproto_uri, atproto_cid = result 342 - except Exception as e: 343 - await fail_atproto_sync(str(e)) 344 - return 345 - else: 346 - await fail_atproto_sync("no public audio URL available") 347 return 348 349 # create track record 350 await job_service.update_progress( 351 - upload_id, 352 JobStatus.PROCESSING, 353 "saving track metadata...", 354 phase="database", 355 ) 356 - extra = {} 357 if duration: 358 extra["duration"] = duration 359 album_record = None 360 - if album: 361 - extra["album"] = album 362 album_record = await get_or_create_album( 363 - db, 364 - artist, 365 - album, 366 - image_id, 367 - image_url, 368 ) 369 370 track = Track( 371 - title=title, 372 file_id=file_id, 373 file_type=ext[1:], 374 - artist_did=artist_did, 375 extra=extra, 376 album_id=album_record.id if album_record else None, 377 features=featured_artists, ··· 387 await db.commit() 388 await db.refresh(track) 389 390 - # handle tags if provided (already validated) 391 - if validated_tags: 392 - try: 393 - for tag_name in validated_tags: 394 - # get or create tag with race condition handling 395 - tag = await _get_or_create_tag(db, tag_name, artist_did) 396 - 397 - # create track_tag association 398 - track_tag = TrackTag(track_id=track.id, tag_id=tag.id) 399 - db.add(track_tag) 400 - 401 - await db.commit() 402 - except Exception as e: 403 - logfire.error( 404 - "failed to add tags to track", 405 - track_id=track.id, 406 - tags=validated_tags, 407 - error=str(e), 408 - ) 409 - 410 - # send notification about new track 411 - from backend._internal.notifications import notification_service 412 - 413 - try: 414 - # eagerly load artist for notification 415 - await db.refresh(track, ["artist"]) 416 - await notification_service.send_track_notification(track) 417 - track.notification_sent = True 418 - await db.commit() 419 - except Exception as e: 420 - logger.warning( 421 - f"failed to send notification for track {track.id}: {e}" 422 - ) 423 424 - # kick off copyright scan in background (fire-and-forget) 425 - # this runs independently and doesn't affect the upload result 426 if r2_url: 427 - # intentionally not storing reference - scan failures are logged 428 - # but shouldn't affect the upload result 429 - asyncio.create_task( # noqa: RUF006 430 - scan_track_for_copyright(track.id, r2_url) 431 - ) 432 433 await job_service.update_progress( 434 - upload_id, 435 JobStatus.COMPLETED, 436 "upload completed successfully", 437 result={"track_id": track.id}, ··· 439 440 except IntegrityError as e: 441 await db.rollback() 442 
- # integrity errors now only occur for foreign key violations or other constraints 443 - error_msg = f"database constraint violation: {e!s}" 444 await job_service.update_progress( 445 - upload_id, 446 JobStatus.FAILED, 447 "upload failed", 448 - error=error_msg, 449 ) 450 - # cleanup: delete uploaded file 451 with contextlib.suppress(Exception): 452 await storage.delete(file_id, audio_format.value) 453 454 except Exception as e: 455 - logger.exception(f"upload {upload_id} failed with unexpected error") 456 await job_service.update_progress( 457 - upload_id, 458 JobStatus.FAILED, 459 "upload failed", 460 error=f"unexpected error: {e!s}", ··· 462 finally: 463 # cleanup temp files 464 with contextlib.suppress(Exception): 465 - Path(file_path).unlink(missing_ok=True) 466 - if image_path: 467 with contextlib.suppress(Exception): 468 - Path(image_path).unlink(missing_ok=True) 469 470 471 @router.post("/") ··· 575 ) 576 577 # schedule background processing once response is sent 578 - background_tasks.add_task( 579 - _process_upload_background, 580 - upload_id, 581 - file_path, 582 - file.filename, 583 - title, 584 - auth_session.did, 585 - album, 586 - features, 587 - validated_tags, 588 - auth_session, 589 - image_path, 590 - image_filename, 591 - image_content_type, 592 ) 593 except Exception: 594 if file_path: 595 with contextlib.suppress(Exception):
··· 5 import json 6 import logging 7 import tempfile 8 + from dataclasses import dataclass 9 from datetime import UTC, datetime 10 from pathlib import Path 11 from typing import Annotated ··· 28 from backend._internal import Session as AuthSession 29 from backend._internal import require_artist_profile 30 from backend._internal.atproto import create_track_record 31 + from backend._internal.atproto.handles import resolve_featured_artists 32 from backend._internal.audio import AudioFormat 33 + from backend._internal.background_tasks import schedule_copyright_scan 34 from backend._internal.image import ImageFormat 35 from backend._internal.jobs import job_service 36 from backend.config import settings 37 from backend.models import Artist, Tag, Track, TrackTag 38 from backend.models.job import JobStatus, JobType ··· 50 logger = logging.getLogger(__name__) 51 52 53 + @dataclass 54 + class UploadContext: 55 + """all data needed to process an upload in the background.""" 56 + 57 + upload_id: str 58 + auth_session: AuthSession 59 + 60 + # audio file 61 + file_path: str 62 + filename: str 63 + 64 + # track metadata 65 + title: str 66 + artist_did: str 67 + album: str | None 68 + features_json: str | None 69 + tags: list[str] 70 + 71 + # optional image 72 + image_path: str | None = None 73 + image_filename: str | None = None 74 + image_content_type: str | None = None 75 + 76 + 77 async def _get_or_create_tag( 78 db: "AsyncSession", tag_name: str, creator_did: str 79 ) -> Tag: ··· 112 return tag 113 114 115 + async def _save_audio_to_storage( 116 upload_id: str, 117 file_path: str, 118 filename: str, 119 + ) -> str | None: 120 + """save audio file to storage, returning file_id or None on failure.""" 121 + await job_service.update_progress( 122 + upload_id, 123 + JobStatus.PROCESSING, 124 + "uploading to storage...", 125 + phase="upload", 126 + progress_pct=0, 127 + ) 128 + try: 129 + async with R2ProgressTracker( 130 + job_id=upload_id, 131 + message="uploading to storage...", 132 + phase="upload", 133 + ) as tracker: 134 + with open(file_path, "rb") as file_obj: 135 + file_id = await storage.save( 136 + file_obj, filename, progress_callback=tracker.on_progress 137 + ) 138 + 139 + await job_service.update_progress( 140 + upload_id, 141 + JobStatus.PROCESSING, 142 + "uploading to storage...", 143 + phase="upload", 144 + progress_pct=100.0, 145 + ) 146 + logfire.info("storage.save completed", file_id=file_id) 147 + return file_id 148 + 149 + except Exception as e: 150 + logfire.error("storage.save failed", error=str(e), exc_info=True) 151 + await job_service.update_progress( 152 + upload_id, JobStatus.FAILED, "upload failed", error=str(e) 153 + ) 154 + return None 155 + 156 + 157 + async def _save_image_to_storage( 158 + upload_id: str, 159 + image_path: str, 160 + image_filename: str, 161 + image_content_type: str | None, 162 + ) -> tuple[str | None, str | None]: 163 + """save image to storage, returning (image_id, image_url) or (None, None).""" 164 + await job_service.update_progress( 165 + upload_id, 166 + JobStatus.PROCESSING, 167 + "saving image...", 168 + phase="image", 169 + ) 170 + image_format, is_valid = ImageFormat.validate_and_extract( 171 + image_filename, image_content_type 172 + ) 173 + if not is_valid or not image_format: 174 + logger.warning(f"unsupported image format: {image_filename}") 175 + return None, None 176 + 177 + try: 178 + with open(image_path, "rb") as image_obj: 179 + image_id = await storage.save(image_obj, f"images/{image_filename}") 180 + image_url = await 
storage.get_url(image_id, file_type="image") 181 + return image_id, image_url 182 + except Exception as e: 183 + logger.warning(f"failed to save image: {e}", exc_info=True) 184 + return None, None 185 + 186 + 187 + async def _add_tags_to_track( 188 + db: AsyncSession, 189 + track_id: int, 190 validated_tags: list[str], 191 + creator_did: str, 192 ) -> None: 193 + """add validated tags to a track.""" 194 + if not validated_tags: 195 + return 196 + 197 + try: 198 + for tag_name in validated_tags: 199 + tag = await _get_or_create_tag(db, tag_name, creator_did) 200 + track_tag = TrackTag(track_id=track_id, tag_id=tag.id) 201 + db.add(track_tag) 202 + await db.commit() 203 + except Exception as e: 204 + logfire.error( 205 + "failed to add tags to track", 206 + track_id=track_id, 207 + tags=validated_tags, 208 + error=str(e), 209 + ) 210 + 211 + 212 + async def _send_track_notification(db: AsyncSession, track: Track) -> None: 213 + """send notification for new track upload.""" 214 + from backend._internal.notifications import notification_service 215 + 216 + try: 217 + await db.refresh(track, ["artist"]) 218 + await notification_service.send_track_notification(track) 219 + track.notification_sent = True 220 + await db.commit() 221 + except Exception as e: 222 + logger.warning(f"failed to send notification for track {track.id}: {e}") 223 + 224 + 225 + async def _process_upload_background(ctx: UploadContext) -> None: 226 + """background task to process upload.""" 227 with logfire.span( 228 + "process upload background", upload_id=ctx.upload_id, filename=ctx.filename 229 ): 230 + file_id: str | None = None 231 + image_id: str | None = None 232 + audio_format: AudioFormat | None = None 233 + 234 try: 235 await job_service.update_progress( 236 + ctx.upload_id, JobStatus.PROCESSING, "processing upload..." 
237 ) 238 239 # validate file type 240 + ext = Path(ctx.filename).suffix.lower() 241 audio_format = AudioFormat.from_extension(ext) 242 if not audio_format: 243 await job_service.update_progress( 244 + ctx.upload_id, 245 JobStatus.FAILED, 246 "upload failed", 247 error=f"unsupported file type: {ext}", 248 ) 249 return 250 251 + # extract duration 252 + with open(ctx.file_path, "rb") as f: 253 duration = extract_duration(f) 254 255 + # save audio to storage 256 + file_id = await _save_audio_to_storage( 257 + ctx.upload_id, ctx.file_path, ctx.filename 258 ) 259 + if not file_id: 260 return 261 262 + # check for duplicate 263 async with db_session() as db: 264 + result = await db.execute( 265 + select(Track).where( 266 + Track.file_id == file_id, 267 + Track.artist_did == ctx.artist_did, 268 ) 269 + ) 270 + if existing := result.scalar_one_or_none(): 271 await job_service.update_progress( 272 + ctx.upload_id, 273 JobStatus.FAILED, 274 "upload failed", 275 + error=f"duplicate upload: track already exists (id: {existing.id})", 276 ) 277 return 278 ··· 280 r2_url = await storage.get_url( 281 file_id, file_type="audio", extension=ext[1:] 282 ) 283 + if not r2_url: 284 + await job_service.update_progress( 285 + ctx.upload_id, 286 + JobStatus.FAILED, 287 + "upload failed", 288 + error="failed to get public audio URL", 289 + ) 290 + return 291 292 # save image if provided 293 image_url = None 294 + if ctx.image_path and ctx.image_filename: 295 + image_id, image_url = await _save_image_to_storage( 296 + ctx.upload_id, 297 + ctx.image_path, 298 + ctx.image_filename, 299 + ctx.image_content_type, 300 ) 301 302 + # get artist and resolve featured artists 303 async with db_session() as db: 304 result = await db.execute( 305 + select(Artist).where(Artist.did == ctx.artist_did) 306 ) 307 artist = result.scalar_one_or_none() 308 if not artist: 309 await job_service.update_progress( 310 + ctx.upload_id, 311 JobStatus.FAILED, 312 "upload failed", 313 error="artist profile not found", 314 ) 315 return 316 317 + # resolve featured artists 318 + featured_artists: list[dict] = [] 319 + if ctx.features_json: 320 await job_service.update_progress( 321 + ctx.upload_id, 322 JobStatus.PROCESSING, 323 "resolving featured artists...", 324 phase="metadata", 325 ) 326 + featured_artists = await resolve_featured_artists( 327 + ctx.features_json, artist.handle 328 + ) 329 330 + # create ATProto record 331 + await job_service.update_progress( 332 + ctx.upload_id, 333 + JobStatus.PROCESSING, 334 + "creating atproto record...", 335 + phase="atproto", 336 + ) 337 + try: 338 + atproto_result = await create_track_record( 339 + auth_session=ctx.auth_session, 340 + title=ctx.title, 341 + artist=artist.display_name, 342 + audio_url=r2_url, 343 + file_type=ext[1:], 344 + album=ctx.album, 345 + duration=duration, 346 + features=featured_artists or None, 347 + image_url=image_url, 348 + ) 349 + if not atproto_result: 350 + raise ValueError("PDS returned no record data") 351 + atproto_uri, atproto_cid = atproto_result 352 + except Exception as e: 353 logger.error( 354 + "ATProto sync failed for upload %s: %s", ctx.upload_id, e 355 ) 356 await job_service.update_progress( 357 + ctx.upload_id, 358 JobStatus.FAILED, 359 "upload failed", 360 + error=f"failed to sync track to ATProto: {e}", 361 phase="atproto", 362 ) 363 + # cleanup orphaned media 364 with contextlib.suppress(Exception): 365 await storage.delete(file_id, audio_format.value) 366 if image_id: 367 with contextlib.suppress(Exception): 368 await storage.delete(image_id) 369 
return 370 371 # create track record 372 await job_service.update_progress( 373 + ctx.upload_id, 374 JobStatus.PROCESSING, 375 "saving track metadata...", 376 phase="database", 377 ) 378 + 379 + extra: dict = {} 380 if duration: 381 extra["duration"] = duration 382 + 383 album_record = None 384 + if ctx.album: 385 + extra["album"] = ctx.album 386 album_record = await get_or_create_album( 387 + db, artist, ctx.album, image_id, image_url 388 ) 389 390 track = Track( 391 + title=ctx.title, 392 file_id=file_id, 393 file_type=ext[1:], 394 + artist_did=ctx.artist_did, 395 extra=extra, 396 album_id=album_record.id if album_record else None, 397 features=featured_artists, ··· 407 await db.commit() 408 await db.refresh(track) 409 410 + await _add_tags_to_track(db, track.id, ctx.tags, ctx.artist_did) 411 + await _send_track_notification(db, track) 412 413 if r2_url: 414 + await schedule_copyright_scan(track.id, r2_url) 415 416 await job_service.update_progress( 417 + ctx.upload_id, 418 JobStatus.COMPLETED, 419 "upload completed successfully", 420 result={"track_id": track.id}, ··· 422 423 except IntegrityError as e: 424 await db.rollback() 425 await job_service.update_progress( 426 + ctx.upload_id, 427 JobStatus.FAILED, 428 "upload failed", 429 + error=f"database constraint violation: {e!s}", 430 ) 431 with contextlib.suppress(Exception): 432 await storage.delete(file_id, audio_format.value) 433 434 except Exception as e: 435 + logger.exception(f"upload {ctx.upload_id} failed with unexpected error") 436 await job_service.update_progress( 437 + ctx.upload_id, 438 JobStatus.FAILED, 439 "upload failed", 440 error=f"unexpected error: {e!s}", ··· 442 finally: 443 # cleanup temp files 444 with contextlib.suppress(Exception): 445 + Path(ctx.file_path).unlink(missing_ok=True) 446 + if ctx.image_path: 447 with contextlib.suppress(Exception): 448 + Path(ctx.image_path).unlink(missing_ok=True) 449 450 451 @router.post("/") ··· 555 ) 556 557 # schedule background processing once response is sent 558 + ctx = UploadContext( 559 + upload_id=upload_id, 560 + auth_session=auth_session, 561 + file_path=file_path, 562 + filename=file.filename, 563 + title=title, 564 + artist_did=auth_session.did, 565 + album=album, 566 + features_json=features, 567 + tags=validated_tags, 568 + image_path=image_path, 569 + image_filename=image_filename, 570 + image_content_type=image_content_type, 571 ) 572 + background_tasks.add_task(_process_upload_background, ctx) 573 except Exception: 574 if file_path: 575 with contextlib.suppress(Exception):
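the upload path above now funnels everything through `UploadContext` and hands copyright scanning to `schedule_copyright_scan` (imported near the top of this file's diff). a rough sketch of what that scheduler does, mirroring the docket/asyncio pattern documented later in docs/backend/background-tasks.md — `get_docket()` / `is_docket_enabled()` are assumed helpers from `backend/_internal/background.py`, and the real implementation may differ:

```python
# sketch only: enqueue via docket when DOCKET_URL is configured, otherwise fall
# back to a fire-and-forget asyncio task (no retries, no durability).
import asyncio
import logging

from backend._internal.background import get_docket, is_docket_enabled  # assumed helpers
from backend._internal.background_tasks import scan_copyright

logger = logging.getLogger(__name__)


async def schedule_copyright_scan(track_id: int, audio_url: str) -> None:
    if is_docket_enabled():
        try:
            docket = get_docket()
            # docket.add(fn)(*args) enqueues the task for the in-process worker
            await docket.add(scan_copyright)(track_id, audio_url)
            return
        except Exception:
            logger.warning("docket scheduling failed, falling back to asyncio")
    # no redis configured: run the scan in-process instead
    asyncio.create_task(scan_copyright(track_id, audio_url))
```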
+42
backend/src/backend/config.py
··· 496 validation_alias="LOGFIRE_ENVIRONMENT", 497 description="Logfire environment (local/production)", 498 ) 499 500 501 class ModerationSettings(AppSettingsSection): ··· 527 labeler_url: str = Field( 528 default="https://moderation.plyr.fm", 529 description="URL of the ATProto labeler service for emitting labels", 530 ) 531 532 ··· 618 bufo: BufoSettings = Field( 619 default_factory=BufoSettings, 620 description="bufo easter egg settings", 621 ) 622 623
··· 496 validation_alias="LOGFIRE_ENVIRONMENT", 497 description="Logfire environment (local/production)", 498 ) 499 + suppressed_loggers: CommaSeparatedStringSet = Field( 500 + default={"docket"}, 501 + validation_alias="LOGFIRE_SUPPRESSED_LOGGERS", 502 + description="Logger names to suppress (set to WARNING level)", 503 + ) 504 505 506 class ModerationSettings(AppSettingsSection): ··· 532 labeler_url: str = Field( 533 default="https://moderation.plyr.fm", 534 description="URL of the ATProto labeler service for emitting labels", 535 + ) 536 + 537 + 538 + class DocketSettings(AppSettingsSection): 539 + """Background task queue configuration using pydocket. 540 + 541 + By default uses in-memory mode (no Redis required). Set DOCKET_URL to a Redis 542 + URL for durable task execution that survives server restarts. 543 + """ 544 + 545 + model_config = SettingsConfigDict( 546 + env_prefix="DOCKET_", 547 + env_file=".env", 548 + case_sensitive=False, 549 + extra="ignore", 550 + ) 551 + 552 + name: str = Field( 553 + default="plyr", 554 + description="Name of the docket instance (shared across workers)", 555 + ) 556 + url: str = Field( 557 + default="", 558 + validation_alias="DOCKET_URL", 559 + description="Redis URL for docket (required in production). Empty disables docket.", 560 + ) 561 + enabled: bool = Field( 562 + default=False, 563 + description="Enable docket background tasks. Auto-enabled when url is set.", 564 + ) 565 + worker_concurrency: int = Field( 566 + default=10, 567 + description="Number of concurrent tasks per worker", 568 ) 569 570 ··· 656 bufo: BufoSettings = Field( 657 default_factory=BufoSettings, 658 description="bufo easter egg settings", 659 + ) 660 + docket: DocketSettings = Field( 661 + default_factory=DocketSettings, 662 + description="Background task queue settings", 663 ) 664 665
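a minimal sketch of how enablement could be derived from the new `DocketSettings` fields: docket stays opt-in, so an empty `DOCKET_URL` leaves background tasks on the asyncio fallback. the real check lives in `backend/_internal/background.py` and may be shaped differently:

```python
from backend.config import settings


def is_docket_enabled() -> bool:
    # auto-enabled when a redis URL is set; the explicit flag covers overrides
    return bool(settings.docket.url) or settings.docket.enabled
```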
+9 -3
backend/src/backend/main.py
··· 24 ) 25 26 from backend._internal import notification_service, queue_service 27 from backend.api import ( 28 account_router, 29 artists_router, ··· 72 format="%(asctime)s - %(name)s - %(levelname)s - %(message)s", 73 ) 74 75 - # # reduce noise from verbose loggers 76 - # logging.getLogger("httpx").setLevel(logging.WARNING) 77 78 logger = logging.getLogger(__name__) 79 ··· 155 await notification_service.setup() 156 await queue_service.setup() 157 158 - yield 159 160 # shutdown: cleanup resources 161 await notification_service.shutdown()
··· 24 ) 25 26 from backend._internal import notification_service, queue_service 27 + from backend._internal.background import background_worker_lifespan 28 from backend.api import ( 29 account_router, 30 artists_router, ··· 73 format="%(asctime)s - %(name)s - %(levelname)s - %(message)s", 74 ) 75 76 + # reduce noise from verbose loggers 77 + for logger_name in settings.observability.suppressed_loggers: 78 + logging.getLogger(logger_name).setLevel(logging.WARNING) 79 80 logger = logging.getLogger(__name__) 81 ··· 157 await notification_service.setup() 158 await queue_service.setup() 159 160 + # start background task worker (docket) 161 + async with background_worker_lifespan() as docket: 162 + # store docket on app state for access in routes if needed 163 + app.state.docket = docket 164 + yield 165 166 # shutdown: cleanup resources 167 await notification_service.shutdown()
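the lifespan change wraps the existing startup/shutdown in `background_worker_lifespan()`, which yields `None` when docket is disabled so the rest of the app is unaffected. a sketch of the shape that context manager appears to have — `init_docket()` and `run_worker()` are hypothetical placeholders for the pydocket setup, which may differ:

```python
import asyncio
import contextlib
from collections.abc import AsyncIterator


@contextlib.asynccontextmanager
async def background_worker_lifespan() -> AsyncIterator[object | None]:
    if not is_docket_enabled():   # opt-in check described in this PR
        yield None
        return
    docket = await init_docket()                      # hypothetical: connect to DOCKET_URL, register tasks
    worker = asyncio.create_task(run_worker(docket))  # hypothetical: in-process worker loop
    try:
        yield docket
    finally:
        worker.cancel()
        with contextlib.suppress(asyncio.CancelledError):
            await worker
```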
+15 -15
backend/tests/api/test_list_record_sync.py
··· 157 158 with ( 159 patch( 160 - "backend._internal.atproto.records.upsert_profile_record", 161 new_callable=AsyncMock, 162 return_value=None, 163 ), 164 patch( 165 - "backend._internal.atproto.records.upsert_album_list_record", 166 new_callable=AsyncMock, 167 return_value=( 168 "at://did:plc:testartist123/fm.plyr.list/album123", ··· 170 ), 171 ) as mock_album_sync, 172 patch( 173 - "backend._internal.atproto.records.upsert_liked_list_record", 174 new_callable=AsyncMock, 175 return_value=None, 176 ), ··· 197 198 with ( 199 patch( 200 - "backend._internal.atproto.records.upsert_profile_record", 201 new_callable=AsyncMock, 202 return_value=None, 203 ), 204 patch( 205 - "backend._internal.atproto.records.upsert_album_list_record", 206 new_callable=AsyncMock, 207 return_value=None, 208 ), 209 patch( 210 - "backend._internal.atproto.records.upsert_liked_list_record", 211 new_callable=AsyncMock, 212 return_value=( 213 "at://did:plc:testartist123/fm.plyr.list/liked456", ··· 254 255 with ( 256 patch( 257 - "backend._internal.atproto.records.upsert_profile_record", 258 new_callable=AsyncMock, 259 return_value=None, 260 ), 261 patch( 262 - "backend._internal.atproto.records.upsert_album_list_record", 263 new_callable=AsyncMock, 264 ) as mock_album_sync, 265 patch( 266 - "backend._internal.atproto.records.upsert_liked_list_record", 267 new_callable=AsyncMock, 268 ), 269 ): ··· 286 287 with ( 288 patch( 289 - "backend._internal.atproto.records.upsert_profile_record", 290 new_callable=AsyncMock, 291 return_value=None, 292 ), 293 patch( 294 - "backend._internal.atproto.records.upsert_album_list_record", 295 side_effect=Exception("PDS error"), 296 ), 297 patch( 298 - "backend._internal.atproto.records.upsert_liked_list_record", 299 new_callable=AsyncMock, 300 return_value=None, 301 ) as mock_liked_sync, ··· 319 320 with ( 321 patch( 322 - "backend._internal.atproto.records.upsert_profile_record", 323 new_callable=AsyncMock, 324 return_value=None, 325 ), 326 patch( 327 - "backend._internal.atproto.records.upsert_album_list_record", 328 new_callable=AsyncMock, 329 return_value=None, 330 ), 331 patch( 332 - "backend._internal.atproto.records.upsert_liked_list_record", 333 side_effect=Exception("PDS error"), 334 ), 335 ):
··· 157 158 with ( 159 patch( 160 + "backend._internal.atproto.sync.upsert_profile_record", 161 new_callable=AsyncMock, 162 return_value=None, 163 ), 164 patch( 165 + "backend._internal.atproto.sync.upsert_album_list_record", 166 new_callable=AsyncMock, 167 return_value=( 168 "at://did:plc:testartist123/fm.plyr.list/album123", ··· 170 ), 171 ) as mock_album_sync, 172 patch( 173 + "backend._internal.atproto.sync.upsert_liked_list_record", 174 new_callable=AsyncMock, 175 return_value=None, 176 ), ··· 197 198 with ( 199 patch( 200 + "backend._internal.atproto.sync.upsert_profile_record", 201 new_callable=AsyncMock, 202 return_value=None, 203 ), 204 patch( 205 + "backend._internal.atproto.sync.upsert_album_list_record", 206 new_callable=AsyncMock, 207 return_value=None, 208 ), 209 patch( 210 + "backend._internal.atproto.sync.upsert_liked_list_record", 211 new_callable=AsyncMock, 212 return_value=( 213 "at://did:plc:testartist123/fm.plyr.list/liked456", ··· 254 255 with ( 256 patch( 257 + "backend._internal.atproto.sync.upsert_profile_record", 258 new_callable=AsyncMock, 259 return_value=None, 260 ), 261 patch( 262 + "backend._internal.atproto.sync.upsert_album_list_record", 263 new_callable=AsyncMock, 264 ) as mock_album_sync, 265 patch( 266 + "backend._internal.atproto.sync.upsert_liked_list_record", 267 new_callable=AsyncMock, 268 ), 269 ): ··· 286 287 with ( 288 patch( 289 + "backend._internal.atproto.sync.upsert_profile_record", 290 new_callable=AsyncMock, 291 return_value=None, 292 ), 293 patch( 294 + "backend._internal.atproto.sync.upsert_album_list_record", 295 side_effect=Exception("PDS error"), 296 ), 297 patch( 298 + "backend._internal.atproto.sync.upsert_liked_list_record", 299 new_callable=AsyncMock, 300 return_value=None, 301 ) as mock_liked_sync, ··· 319 320 with ( 321 patch( 322 + "backend._internal.atproto.sync.upsert_profile_record", 323 new_callable=AsyncMock, 324 return_value=None, 325 ), 326 patch( 327 + "backend._internal.atproto.sync.upsert_album_list_record", 328 new_callable=AsyncMock, 329 return_value=None, 330 ), 331 patch( 332 + "backend._internal.atproto.sync.upsert_liked_list_record", 333 side_effect=Exception("PDS error"), 334 ), 335 ):
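the only change in these tests is the patch target: `unittest.mock.patch` must point at the module where a name is looked up at call time, and after the atproto split these upsert helpers are resolved through `backend._internal.atproto.sync` rather than the old monolithic records module (whether sync defines or re-imports them is an assumption). for example:

```python
from unittest.mock import AsyncMock, patch

with patch(
    "backend._internal.atproto.sync.upsert_profile_record",
    new_callable=AsyncMock,
    return_value=None,
):
    ...  # exercise code that calls upsert_profile_record via the sync module
```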
+6
backend/tests/docker-compose.yml
··· 13 - postgres 14 - -c 15 - max_connections=100
··· 13 - postgres 14 - -c 15 - max_connections=100 16 + 17 + test-redis: 18 + image: redis:7-alpine 19 + ports: 20 + - "6380:6379" 21 + command: redis-server --appendonly no --save ""
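the new `test-redis` service publishes container port 6379 on host port 6380. a hypothetical conftest fragment showing how the suite could point docket at it — the actual test wiring is not shown in this diff:

```python
import os

# hypothetical: opt the test run in to the compose redis above
os.environ.setdefault("DOCKET_URL", "redis://localhost:6380")
```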
+10 -10
backend/tests/test_token_refresh.py
··· 7 from atproto_oauth.models import OAuthSession 8 9 from backend._internal import Session as AuthSession 10 - from backend._internal.atproto.records import _refresh_session_tokens 11 12 13 @pytest.fixture ··· 115 116 with ( 117 patch( 118 - "backend._internal.atproto.records.get_oauth_client", 119 return_value=mock_oauth_client, 120 ), 121 patch( 122 - "backend._internal.atproto.records.get_session", 123 side_effect=mock_get_session, 124 ), 125 patch( 126 - "backend._internal.atproto.records.update_session_tokens", 127 side_effect=mock_update_session_tokens, 128 ), 129 ): ··· 182 183 with ( 184 patch( 185 - "backend._internal.atproto.records.get_oauth_client", 186 return_value=mock_oauth_client, 187 ), 188 patch( 189 - "backend._internal.atproto.records.get_session", 190 side_effect=mock_get_session, 191 ), 192 patch( 193 - "backend._internal.atproto.records.update_session_tokens", 194 side_effect=mock_update_session_tokens, 195 ), 196 ): ··· 246 247 with ( 248 patch( 249 - "backend._internal.atproto.records.get_oauth_client", 250 return_value=mock_oauth_client, 251 ), 252 patch( 253 - "backend._internal.atproto.records.get_session", 254 side_effect=mock_get_session, 255 ), 256 patch( 257 - "backend._internal.atproto.records.update_session_tokens", 258 side_effect=mock_update_session_tokens, 259 ), 260 ):
··· 7 from atproto_oauth.models import OAuthSession 8 9 from backend._internal import Session as AuthSession 10 + from backend._internal.atproto.client import _refresh_session_tokens 11 12 13 @pytest.fixture ··· 115 116 with ( 117 patch( 118 + "backend._internal.atproto.client.get_oauth_client", 119 return_value=mock_oauth_client, 120 ), 121 patch( 122 + "backend._internal.atproto.client.get_session", 123 side_effect=mock_get_session, 124 ), 125 patch( 126 + "backend._internal.atproto.client.update_session_tokens", 127 side_effect=mock_update_session_tokens, 128 ), 129 ): ··· 182 183 with ( 184 patch( 185 + "backend._internal.atproto.client.get_oauth_client", 186 return_value=mock_oauth_client, 187 ), 188 patch( 189 + "backend._internal.atproto.client.get_session", 190 side_effect=mock_get_session, 191 ), 192 patch( 193 + "backend._internal.atproto.client.update_session_tokens", 194 side_effect=mock_update_session_tokens, 195 ), 196 ): ··· 246 247 with ( 248 patch( 249 + "backend._internal.atproto.client.get_oauth_client", 250 return_value=mock_oauth_client, 251 ), 252 patch( 253 + "backend._internal.atproto.client.get_session", 254 side_effect=mock_get_session, 255 ), 256 patch( 257 + "backend._internal.atproto.client.update_session_tokens", 258 side_effect=mock_update_session_tokens, 259 ), 260 ):
+243
backend/uv.lock
··· 245 ] 246 247 [[package]] 248 name = "asyncpg" 249 version = "0.30.0" 250 source = { registry = "https://pypi.org/simple" } ··· 320 { name = "psycopg", extra = ["binary"] }, 321 { name = "pydantic" }, 322 { name = "pydantic-settings" }, 323 { name = "pytest-asyncio" }, 324 { name = "python-dotenv" }, 325 { name = "python-jose", extra = ["cryptography"] }, ··· 364 { name = "psycopg", extras = ["binary"], specifier = ">=3.2.12" }, 365 { name = "pydantic", specifier = ">=2.11.0" }, 366 { name = "pydantic-settings", specifier = ">=2.7.0" }, 367 { name = "pytest-asyncio", specifier = ">=0.25.3" }, 368 { name = "python-dotenv", specifier = ">=1.1.0" }, 369 { name = "python-jose", extras = ["cryptography"], specifier = ">=3.3.0" }, ··· 461 ] 462 463 [[package]] 464 name = "boto3" 465 version = "1.40.61" 466 source = { registry = "https://pypi.org/simple" } ··· 659 sdist = { url = "https://files.pythonhosted.org/packages/46/61/de6cd827efad202d7057d93e0fed9294b96952e188f7384832791c7b2254/click-8.3.0.tar.gz", hash = "sha256:e7b8232224eba16f4ebe410c25ced9f7875cb5f3263ffc93cc3e8da705e229c4", size = 276943, upload-time = "2025-09-18T17:32:23.696Z" } 660 wheels = [ 661 { url = "https://files.pythonhosted.org/packages/db/d3/9dcc0f5797f070ec8edf30fbadfb200e71d9db6b84d211e3b2085a7589a0/click-8.3.0-py3-none-any.whl", hash = "sha256:9b9f285302c6e3064f4330c05f05b81945b2a39544279343e6e7c5f27a9baddc", size = 107295, upload-time = "2025-09-18T17:32:22.42Z" }, 662 ] 663 664 [[package]] ··· 873 ] 874 875 [[package]] 876 name = "fancycompleter" 877 version = "0.11.1" 878 source = { registry = "https://pypi.org/simple" } ··· 1334 ] 1335 1336 [[package]] 1337 name = "mako" 1338 version = "1.3.10" 1339 source = { registry = "https://pypi.org/simple" } ··· 1622 ] 1623 1624 [[package]] 1625 name = "opentelemetry-instrumentation" 1626 version = "0.59b0" 1627 source = { registry = "https://pypi.org/simple" } ··· 1893 ] 1894 1895 [[package]] 1896 name = "prompt-toolkit" 1897 version = "3.0.52" 1898 source = { registry = "https://pypi.org/simple" } ··· 2098 ] 2099 2100 [[package]] 2101 name = "pyasn1" 2102 version = "0.6.1" 2103 source = { registry = "https://pypi.org/simple" } ··· 2242 ] 2243 2244 [[package]] 2245 name = "pygments" 2246 version = "2.19.2" 2247 source = { registry = "https://pypi.org/simple" } ··· 2386 [package.optional-dependencies] 2387 cryptography = [ 2388 { name = "cryptography" }, 2389 ] 2390 2391 [[package]] ··· 2453 ] 2454 2455 [[package]] 2456 name = "requests" 2457 version = "2.32.5" 2458 source = { registry = "https://pypi.org/simple" } ··· 2531 ] 2532 2533 [[package]] 2534 name = "six" 2535 version = "1.17.0" 2536 source = { registry = "https://pypi.org/simple" } ··· 2554 sdist = { url = "https://files.pythonhosted.org/packages/a2/87/a6771e1546d97e7e041b6ae58d80074f81b7d5121207425c964ddf5cfdbd/sniffio-1.3.1.tar.gz", hash = "sha256:f4324edc670a0f49750a81b895f35c3adb843cca46f0530f79fc1babb23789dc", size = 20372, upload-time = "2024-02-25T23:20:04.057Z" } 2555 wheels = [ 2556 { url = "https://files.pythonhosted.org/packages/e9/44/75a9c9421471a6c4805dbf2356f7c181a29c1879239abab1ea2cc8f38b40/sniffio-1.3.1-py3-none-any.whl", hash = "sha256:2f6da418d1f1e0fddd844478f41680e794e6051915791a034ff65e5f100525a2", size = 10235, upload-time = "2024-02-25T23:20:01.196Z" }, 2557 ] 2558 2559 [[package]] ··· 2701 { url = "https://files.pythonhosted.org/packages/e4/25/9324ae947fcc4322470326cf8276a3fc2f08dc82adec1de79d963fdf7af5/ty-0.0.1a25-py3-none-win32.whl", hash = 
"sha256:168fc8aee396d617451acc44cd28baffa47359777342836060c27aa6f37e2445", size = 8387095, upload-time = "2025-10-29T19:40:18.368Z" }, 2702 { url = "https://files.pythonhosted.org/packages/3b/2b/cb12cbc7db1ba310aa7b1de9b4e018576f653105993736c086ee67d2ec02/ty-0.0.1a25-py3-none-win_amd64.whl", hash = "sha256:a2fad3d8e92bb4d57a8872a6f56b1aef54539d36f23ebb01abe88ac4338efafb", size = 9059225, upload-time = "2025-10-29T19:40:20.278Z" }, 2703 { url = "https://files.pythonhosted.org/packages/2f/c1/f6be8cdd0bf387c1d8ee9d14bb299b7b5d2c0532f550a6693216a32ec0c5/ty-0.0.1a25-py3-none-win_arm64.whl", hash = "sha256:dde2962d448ed87c48736e9a4bb13715a4cced705525e732b1c0dac1d4c66e3d", size = 8536832, upload-time = "2025-10-29T19:40:22.014Z" }, 2704 ] 2705 2706 [[package]]
··· 245 ] 246 247 [[package]] 248 + name = "async-timeout" 249 + version = "5.0.1" 250 + source = { registry = "https://pypi.org/simple" } 251 + sdist = { url = "https://files.pythonhosted.org/packages/a5/ae/136395dfbfe00dfc94da3f3e136d0b13f394cba8f4841120e34226265780/async_timeout-5.0.1.tar.gz", hash = "sha256:d9321a7a3d5a6a5e187e824d2fa0793ce379a202935782d555d6e9d2735677d3", size = 9274, upload-time = "2024-11-06T16:41:39.6Z" } 252 + wheels = [ 253 + { url = "https://files.pythonhosted.org/packages/fe/ba/e2081de779ca30d473f21f5b30e0e737c438205440784c7dfc81efc2b029/async_timeout-5.0.1-py3-none-any.whl", hash = "sha256:39e3809566ff85354557ec2398b55e096c8364bacac9405a7a1fa429e77fe76c", size = 6233, upload-time = "2024-11-06T16:41:37.9Z" }, 254 + ] 255 + 256 + [[package]] 257 name = "asyncpg" 258 version = "0.30.0" 259 source = { registry = "https://pypi.org/simple" } ··· 329 { name = "psycopg", extra = ["binary"] }, 330 { name = "pydantic" }, 331 { name = "pydantic-settings" }, 332 + { name = "pydocket" }, 333 { name = "pytest-asyncio" }, 334 { name = "python-dotenv" }, 335 { name = "python-jose", extra = ["cryptography"] }, ··· 374 { name = "psycopg", extras = ["binary"], specifier = ">=3.2.12" }, 375 { name = "pydantic", specifier = ">=2.11.0" }, 376 { name = "pydantic-settings", specifier = ">=2.7.0" }, 377 + { name = "pydocket", specifier = ">=0.15.2" }, 378 { name = "pytest-asyncio", specifier = ">=0.25.3" }, 379 { name = "python-dotenv", specifier = ">=1.1.0" }, 380 { name = "python-jose", extras = ["cryptography"], specifier = ">=3.3.0" }, ··· 472 ] 473 474 [[package]] 475 + name = "beartype" 476 + version = "0.22.8" 477 + source = { registry = "https://pypi.org/simple" } 478 + sdist = { url = "https://files.pythonhosted.org/packages/8c/1d/794ae2acaa67c8b216d91d5919da2606c2bb14086849ffde7f5555f3a3a5/beartype-0.22.8.tar.gz", hash = "sha256:b19b21c9359722ee3f7cc433f063b3e13997b27ae8226551ea5062e621f61165", size = 1602262, upload-time = "2025-12-03T05:11:10.766Z" } 479 + wheels = [ 480 + { url = "https://files.pythonhosted.org/packages/14/2a/fbcbf5a025d3e71ddafad7efd43e34ec4362f4d523c3c471b457148fb211/beartype-0.22.8-py3-none-any.whl", hash = "sha256:b832882d04e41a4097bab9f63e6992bc6de58c414ee84cba9b45b67314f5ab2e", size = 1331895, upload-time = "2025-12-03T05:11:08.373Z" }, 481 + ] 482 + 483 + [[package]] 484 name = "boto3" 485 version = "1.40.61" 486 source = { registry = "https://pypi.org/simple" } ··· 679 sdist = { url = "https://files.pythonhosted.org/packages/46/61/de6cd827efad202d7057d93e0fed9294b96952e188f7384832791c7b2254/click-8.3.0.tar.gz", hash = "sha256:e7b8232224eba16f4ebe410c25ced9f7875cb5f3263ffc93cc3e8da705e229c4", size = 276943, upload-time = "2025-09-18T17:32:23.696Z" } 680 wheels = [ 681 { url = "https://files.pythonhosted.org/packages/db/d3/9dcc0f5797f070ec8edf30fbadfb200e71d9db6b84d211e3b2085a7589a0/click-8.3.0-py3-none-any.whl", hash = "sha256:9b9f285302c6e3064f4330c05f05b81945b2a39544279343e6e7c5f27a9baddc", size = 107295, upload-time = "2025-09-18T17:32:22.42Z" }, 682 + ] 683 + 684 + [[package]] 685 + name = "cloudpickle" 686 + version = "3.1.2" 687 + source = { registry = "https://pypi.org/simple" } 688 + sdist = { url = "https://files.pythonhosted.org/packages/27/fb/576f067976d320f5f0114a8d9fa1215425441bb35627b1993e5afd8111e5/cloudpickle-3.1.2.tar.gz", hash = "sha256:7fda9eb655c9c230dab534f1983763de5835249750e85fbcef43aaa30a9a2414", size = 22330, upload-time = "2025-11-03T09:25:26.604Z" } 689 + wheels = [ 690 + { url = 
"https://files.pythonhosted.org/packages/88/39/799be3f2f0f38cc727ee3b4f1445fe6d5e4133064ec2e4115069418a5bb6/cloudpickle-3.1.2-py3-none-any.whl", hash = "sha256:9acb47f6afd73f60dc1df93bb801b472f05ff42fa6c84167d25cb206be1fbf4a", size = 22228, upload-time = "2025-11-03T09:25:25.534Z" }, 691 ] 692 693 [[package]] ··· 902 ] 903 904 [[package]] 905 + name = "fakeredis" 906 + version = "2.32.1" 907 + source = { registry = "https://pypi.org/simple" } 908 + dependencies = [ 909 + { name = "redis" }, 910 + { name = "sortedcontainers" }, 911 + ] 912 + sdist = { url = "https://files.pythonhosted.org/packages/56/14/b47b8471303af7deed7080290c14cff27a831fa47b38f45643e6bf889cee/fakeredis-2.32.1.tar.gz", hash = "sha256:dd8246db159f0b66a1ced7800c9d5ef07769e3d2fde44b389a57f2ce2834e444", size = 171582, upload-time = "2025-11-06T01:40:57.836Z" } 913 + wheels = [ 914 + { url = "https://files.pythonhosted.org/packages/c2/d2/c28f6909864bfdb7411bb8f39fabedb5a50da1cbd7da5a1a3a46dfea2eab/fakeredis-2.32.1-py3-none-any.whl", hash = "sha256:e80c8886db2e47ba784f7dfe66aad6cd2eab76093c6bfda50041e5bc890d46cf", size = 118964, upload-time = "2025-11-06T01:40:55.885Z" }, 915 + ] 916 + 917 + [package.optional-dependencies] 918 + lua = [ 919 + { name = "lupa" }, 920 + ] 921 + 922 + [[package]] 923 name = "fancycompleter" 924 version = "0.11.1" 925 source = { registry = "https://pypi.org/simple" } ··· 1381 ] 1382 1383 [[package]] 1384 + name = "lupa" 1385 + version = "2.6" 1386 + source = { registry = "https://pypi.org/simple" } 1387 + sdist = { url = "https://files.pythonhosted.org/packages/b8/1c/191c3e6ec6502e3dbe25a53e27f69a5daeac3e56de1f73c0138224171ead/lupa-2.6.tar.gz", hash = "sha256:9a770a6e89576be3447668d7ced312cd6fd41d3c13c2462c9dc2c2ab570e45d9", size = 7240282, upload-time = "2025-10-24T07:20:29.738Z" } 1388 + wheels = [ 1389 + { url = "https://files.pythonhosted.org/packages/ca/29/1f66907c1ebf1881735afa695e646762c674f00738ebf66d795d59fc0665/lupa-2.6-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:6d988c0f9331b9f2a5a55186701a25444ab10a1432a1021ee58011499ecbbdd5", size = 962875, upload-time = "2025-10-24T07:17:39.107Z" }, 1390 + { url = "https://files.pythonhosted.org/packages/e6/67/4a748604be360eb9c1c215f6a0da921cd1a2b44b2c5951aae6fb83019d3a/lupa-2.6-cp311-cp311-macosx_11_0_universal2.whl", hash = "sha256:ebe1bbf48259382c72a6fe363dea61a0fd6fe19eab95e2ae881e20f3654587bf", size = 1935390, upload-time = "2025-10-24T07:17:41.427Z" }, 1391 + { url = "https://files.pythonhosted.org/packages/ac/0c/8ef9ee933a350428b7bdb8335a37ef170ab0bb008bbf9ca8f4f4310116b6/lupa-2.6-cp311-cp311-macosx_11_0_x86_64.whl", hash = "sha256:a8fcee258487cf77cdd41560046843bb38c2e18989cd19671dd1e2596f798306", size = 992193, upload-time = "2025-10-24T07:17:43.231Z" }, 1392 + { url = "https://files.pythonhosted.org/packages/65/46/e6c7facebdb438db8a65ed247e56908818389c1a5abbf6a36aab14f1057d/lupa-2.6-cp311-cp311-manylinux2010_i686.manylinux_2_12_i686.manylinux_2_28_i686.whl", hash = "sha256:561a8e3be800827884e767a694727ed8482d066e0d6edfcbf423b05e63b05535", size = 1165844, upload-time = "2025-10-24T07:17:45.437Z" }, 1393 + { url = "https://files.pythonhosted.org/packages/1c/26/9f1154c6c95f175ccbf96aa96c8f569c87f64f463b32473e839137601a8b/lupa-2.6-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:af880a62d47991cae78b8e9905c008cbfdc4a3a9723a66310c2634fc7644578c", size = 1048069, upload-time = "2025-10-24T07:17:47.181Z" }, 1394 + { url = 
"https://files.pythonhosted.org/packages/68/67/2cc52ab73d6af81612b2ea24c870d3fa398443af8e2875e5befe142398b1/lupa-2.6-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:80b22923aa4023c86c0097b235615f89d469a0c4eee0489699c494d3367c4c85", size = 2079079, upload-time = "2025-10-24T07:17:49.755Z" }, 1395 + { url = "https://files.pythonhosted.org/packages/2e/dc/f843f09bbf325f6e5ee61730cf6c3409fc78c010d968c7c78acba3019ca7/lupa-2.6-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:153d2cc6b643f7efb9cfc0c6bb55ec784d5bac1a3660cfc5b958a7b8f38f4a75", size = 1071428, upload-time = "2025-10-24T07:17:51.991Z" }, 1396 + { url = "https://files.pythonhosted.org/packages/2e/60/37533a8d85bf004697449acb97ecdacea851acad28f2ad3803662487dd2a/lupa-2.6-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:3fa8777e16f3ded50b72967dc17e23f5a08e4f1e2c9456aff2ebdb57f5b2869f", size = 1181756, upload-time = "2025-10-24T07:17:53.752Z" }, 1397 + { url = "https://files.pythonhosted.org/packages/e4/f2/cf29b20dbb4927b6a3d27c339ac5d73e74306ecc28c8e2c900b2794142ba/lupa-2.6-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:8dbdcbe818c02a2f56f5ab5ce2de374dab03e84b25266cfbaef237829bc09b3f", size = 2175687, upload-time = "2025-10-24T07:17:56.228Z" }, 1398 + { url = "https://files.pythonhosted.org/packages/94/7c/050e02f80c7131b63db1474bff511e63c545b5a8636a24cbef3fc4da20b6/lupa-2.6-cp311-cp311-win32.whl", hash = "sha256:defaf188fde8f7a1e5ce3a5e6d945e533b8b8d547c11e43b96c9b7fe527f56dc", size = 1412592, upload-time = "2025-10-24T07:17:59.062Z" }, 1399 + { url = "https://files.pythonhosted.org/packages/6f/9a/6f2af98aa5d771cea661f66c8eb8f53772ec1ab1dfbce24126cfcd189436/lupa-2.6-cp311-cp311-win_amd64.whl", hash = "sha256:9505ae600b5c14f3e17e70f87f88d333717f60411faca1ddc6f3e61dce85fa9e", size = 1669194, upload-time = "2025-10-24T07:18:01.647Z" }, 1400 + { url = "https://files.pythonhosted.org/packages/94/86/ce243390535c39d53ea17ccf0240815e6e457e413e40428a658ea4ee4b8d/lupa-2.6-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:47ce718817ef1cc0c40d87c3d5ae56a800d61af00fbc0fad1ca9be12df2f3b56", size = 951707, upload-time = "2025-10-24T07:18:03.884Z" }, 1401 + { url = "https://files.pythonhosted.org/packages/86/85/cedea5e6cbeb54396fdcc55f6b741696f3f036d23cfaf986d50d680446da/lupa-2.6-cp312-cp312-macosx_11_0_universal2.whl", hash = "sha256:7aba985b15b101495aa4b07112cdc08baa0c545390d560ad5cfde2e9e34f4d58", size = 1916703, upload-time = "2025-10-24T07:18:05.6Z" }, 1402 + { url = "https://files.pythonhosted.org/packages/24/be/3d6b5f9a8588c01a4d88129284c726017b2089f3a3fd3ba8bd977292fea0/lupa-2.6-cp312-cp312-macosx_11_0_x86_64.whl", hash = "sha256:b766f62f95b2739f2248977d29b0722e589dcf4f0ccfa827ccbd29f0148bd2e5", size = 985152, upload-time = "2025-10-24T07:18:08.561Z" }, 1403 + { url = "https://files.pythonhosted.org/packages/eb/23/9f9a05beee5d5dce9deca4cb07c91c40a90541fc0a8e09db4ee670da550f/lupa-2.6-cp312-cp312-manylinux2010_i686.manylinux_2_12_i686.manylinux_2_28_i686.whl", hash = "sha256:00a934c23331f94cb51760097ebfab14b005d55a6b30a2b480e3c53dd2fa290d", size = 1159599, upload-time = "2025-10-24T07:18:10.346Z" }, 1404 + { url = "https://files.pythonhosted.org/packages/40/4e/e7c0583083db9d7f1fd023800a9767d8e4391e8330d56c2373d890ac971b/lupa-2.6-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:21de9f38bd475303e34a042b7081aabdf50bd9bafd36ce4faea2f90fd9f15c31", size = 1038686, upload-time = "2025-10-24T07:18:12.112Z" }, 1405 + { url = 
"https://files.pythonhosted.org/packages/1c/9f/5a4f7d959d4feba5e203ff0c31889e74d1ca3153122be4a46dca7d92bf7c/lupa-2.6-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:cf3bda96d3fc41237e964a69c23647d50d4e28421111360274d4799832c560e9", size = 2071956, upload-time = "2025-10-24T07:18:14.572Z" }, 1406 + { url = "https://files.pythonhosted.org/packages/92/34/2f4f13ca65d01169b1720176aedc4af17bc19ee834598c7292db232cb6dc/lupa-2.6-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:5a76ead245da54801a81053794aa3975f213221f6542d14ec4b859ee2e7e0323", size = 1057199, upload-time = "2025-10-24T07:18:16.379Z" }, 1407 + { url = "https://files.pythonhosted.org/packages/35/2a/5f7d2eebec6993b0dcd428e0184ad71afb06a45ba13e717f6501bfed1da3/lupa-2.6-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:8dd0861741caa20886ddbda0a121d8e52fb9b5bb153d82fa9bba796962bf30e8", size = 1173693, upload-time = "2025-10-24T07:18:18.153Z" }, 1408 + { url = "https://files.pythonhosted.org/packages/e4/29/089b4d2f8e34417349af3904bb40bec40b65c8731f45e3fd8d497ca573e5/lupa-2.6-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:239e63948b0b23023f81d9a19a395e768ed3da6a299f84e7963b8f813f6e3f9c", size = 2164394, upload-time = "2025-10-24T07:18:20.403Z" }, 1409 + { url = "https://files.pythonhosted.org/packages/f3/1b/79c17b23c921f81468a111cad843b076a17ef4b684c4a8dff32a7969c3f0/lupa-2.6-cp312-cp312-win32.whl", hash = "sha256:325894e1099499e7a6f9c351147661a2011887603c71086d36fe0f964d52d1ce", size = 1420647, upload-time = "2025-10-24T07:18:23.368Z" }, 1410 + { url = "https://files.pythonhosted.org/packages/b8/15/5121e68aad3584e26e1425a5c9a79cd898f8a152292059e128c206ee817c/lupa-2.6-cp312-cp312-win_amd64.whl", hash = "sha256:c735a1ce8ee60edb0fe71d665f1e6b7c55c6021f1d340eb8c865952c602cd36f", size = 1688529, upload-time = "2025-10-24T07:18:25.523Z" }, 1411 + { url = "https://files.pythonhosted.org/packages/28/1d/21176b682ca5469001199d8b95fa1737e29957a3d185186e7a8b55345f2e/lupa-2.6-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:663a6e58a0f60e7d212017d6678639ac8df0119bc13c2145029dcba084391310", size = 947232, upload-time = "2025-10-24T07:18:27.878Z" }, 1412 + { url = "https://files.pythonhosted.org/packages/ce/4c/d327befb684660ca13cf79cd1f1d604331808f9f1b6fb6bf57832f8edf80/lupa-2.6-cp313-cp313-macosx_11_0_universal2.whl", hash = "sha256:d1f5afda5c20b1f3217a80e9bc1b77037f8a6eb11612fd3ada19065303c8f380", size = 1908625, upload-time = "2025-10-24T07:18:29.944Z" }, 1413 + { url = "https://files.pythonhosted.org/packages/66/8e/ad22b0a19454dfd08662237a84c792d6d420d36b061f239e084f29d1a4f3/lupa-2.6-cp313-cp313-macosx_11_0_x86_64.whl", hash = "sha256:26f2b3c085fe76e9119e48c1013c1cccdc1f51585d456858290475aa38e7089e", size = 981057, upload-time = "2025-10-24T07:18:31.553Z" }, 1414 + { url = "https://files.pythonhosted.org/packages/5c/48/74859073ab276bd0566c719f9ca0108b0cfc1956ca0d68678d117d47d155/lupa-2.6-cp313-cp313-manylinux2010_i686.manylinux_2_12_i686.manylinux_2_28_i686.whl", hash = "sha256:60d2f902c7b96fb8ab98493dcff315e7bb4d0b44dc9dd76eb37de575025d5685", size = 1156227, upload-time = "2025-10-24T07:18:33.981Z" }, 1415 + { url = "https://files.pythonhosted.org/packages/09/6c/0e9ded061916877253c2266074060eb71ed99fb21d73c8c114a76725bce2/lupa-2.6-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:a02d25dee3a3250967c36590128d9220ae02f2eda166a24279da0b481519cbff", size = 1035752, upload-time = "2025-10-24T07:18:36.32Z" }, 1416 + { url = 
"https://files.pythonhosted.org/packages/dd/ef/f8c32e454ef9f3fe909f6c7d57a39f950996c37a3deb7b391fec7903dab7/lupa-2.6-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:6eae1ee16b886b8914ff292dbefbf2f48abfbdee94b33a88d1d5475e02423203", size = 2069009, upload-time = "2025-10-24T07:18:38.072Z" }, 1417 + { url = "https://files.pythonhosted.org/packages/53/dc/15b80c226a5225815a890ee1c11f07968e0aba7a852df41e8ae6fe285063/lupa-2.6-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:b0edd5073a4ee74ab36f74fe61450148e6044f3952b8d21248581f3c5d1a58be", size = 1056301, upload-time = "2025-10-24T07:18:40.165Z" }, 1418 + { url = "https://files.pythonhosted.org/packages/31/14/2086c1425c985acfb30997a67e90c39457122df41324d3c179d6ee2292c6/lupa-2.6-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:0c53ee9f22a8a17e7d4266ad48e86f43771951797042dd51d1494aaa4f5f3f0a", size = 1170673, upload-time = "2025-10-24T07:18:42.426Z" }, 1419 + { url = "https://files.pythonhosted.org/packages/10/e5/b216c054cf86576c0191bf9a9f05de6f7e8e07164897d95eea0078dca9b2/lupa-2.6-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:de7c0f157a9064a400d828789191a96da7f4ce889969a588b87ec80de9b14772", size = 2162227, upload-time = "2025-10-24T07:18:46.112Z" }, 1420 + { url = "https://files.pythonhosted.org/packages/59/2f/33ecb5bedf4f3bc297ceacb7f016ff951331d352f58e7e791589609ea306/lupa-2.6-cp313-cp313-win32.whl", hash = "sha256:ee9523941ae0a87b5b703417720c5d78f72d2f5bc23883a2ea80a949a3ed9e75", size = 1419558, upload-time = "2025-10-24T07:18:48.371Z" }, 1421 + { url = "https://files.pythonhosted.org/packages/f9/b4/55e885834c847ea610e111d87b9ed4768f0afdaeebc00cd46810f25029f6/lupa-2.6-cp313-cp313-win_amd64.whl", hash = "sha256:b1335a5835b0a25ebdbc75cf0bda195e54d133e4d994877ef025e218c2e59db9", size = 1683424, upload-time = "2025-10-24T07:18:50.976Z" }, 1422 + { url = "https://files.pythonhosted.org/packages/66/9d/d9427394e54d22a35d1139ef12e845fd700d4872a67a34db32516170b746/lupa-2.6-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:dcb6d0a3264873e1653bc188499f48c1fb4b41a779e315eba45256cfe7bc33c1", size = 953818, upload-time = "2025-10-24T07:18:53.378Z" }, 1423 + { url = "https://files.pythonhosted.org/packages/10/41/27bbe81953fb2f9ecfced5d9c99f85b37964cfaf6aa8453bb11283983721/lupa-2.6-cp314-cp314-macosx_11_0_universal2.whl", hash = "sha256:a37e01f2128f8c36106726cb9d360bac087d58c54b4522b033cc5691c584db18", size = 1915850, upload-time = "2025-10-24T07:18:55.259Z" }, 1424 + { url = "https://files.pythonhosted.org/packages/a3/98/f9ff60db84a75ba8725506bbf448fb085bc77868a021998ed2a66d920568/lupa-2.6-cp314-cp314-macosx_11_0_x86_64.whl", hash = "sha256:458bd7e9ff3c150b245b0fcfbb9bd2593d1152ea7f0a7b91c1d185846da033fe", size = 982344, upload-time = "2025-10-24T07:18:57.05Z" }, 1425 + { url = "https://files.pythonhosted.org/packages/41/f7/f39e0f1c055c3b887d86b404aaf0ca197b5edfd235a8b81b45b25bac7fc3/lupa-2.6-cp314-cp314-manylinux2010_i686.manylinux_2_12_i686.manylinux_2_28_i686.whl", hash = "sha256:052ee82cac5206a02df77119c325339acbc09f5ce66967f66a2e12a0f3211cad", size = 1156543, upload-time = "2025-10-24T07:18:59.251Z" }, 1426 + { url = "https://files.pythonhosted.org/packages/9e/9c/59e6cffa0d672d662ae17bd7ac8ecd2c89c9449dee499e3eb13ca9cd10d9/lupa-2.6-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:96594eca3c87dd07938009e95e591e43d554c1dbd0385be03c100367141db5a8", size = 1047974, upload-time = "2025-10-24T07:19:01.449Z" }, 1427 + { url = 
"https://files.pythonhosted.org/packages/23/c6/a04e9cef7c052717fcb28fb63b3824802488f688391895b618e39be0f684/lupa-2.6-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:e8faddd9d198688c8884091173a088a8e920ecc96cda2ffed576a23574c4b3f6", size = 2073458, upload-time = "2025-10-24T07:19:03.369Z" }, 1428 + { url = "https://files.pythonhosted.org/packages/e6/10/824173d10f38b51fc77785228f01411b6ca28826ce27404c7c912e0e442c/lupa-2.6-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:daebb3a6b58095c917e76ba727ab37b27477fb926957c825205fbda431552134", size = 1067683, upload-time = "2025-10-24T07:19:06.2Z" }, 1429 + { url = "https://files.pythonhosted.org/packages/b6/dc/9692fbcf3c924d9c4ece2d8d2f724451ac2e09af0bd2a782db1cef34e799/lupa-2.6-cp314-cp314-musllinux_1_2_i686.whl", hash = "sha256:f3154e68972befe0f81564e37d8142b5d5d79931a18309226a04ec92487d4ea3", size = 1171892, upload-time = "2025-10-24T07:19:08.544Z" }, 1430 + { url = "https://files.pythonhosted.org/packages/84/ff/e318b628d4643c278c96ab3ddea07fc36b075a57383c837f5b11e537ba9d/lupa-2.6-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:e4dadf77b9fedc0bfa53417cc28dc2278a26d4cbd95c29f8927ad4d8fe0a7ef9", size = 2166641, upload-time = "2025-10-24T07:19:10.485Z" }, 1431 + { url = "https://files.pythonhosted.org/packages/12/f7/a6f9ec2806cf2d50826980cdb4b3cffc7691dc6f95e13cc728846d5cb793/lupa-2.6-cp314-cp314-win32.whl", hash = "sha256:cb34169c6fa3bab3e8ac58ca21b8a7102f6a94b6a5d08d3636312f3f02fafd8f", size = 1456857, upload-time = "2025-10-24T07:19:37.989Z" }, 1432 + { url = "https://files.pythonhosted.org/packages/c5/de/df71896f25bdc18360fdfa3b802cd7d57d7fede41a0e9724a4625b412c85/lupa-2.6-cp314-cp314-win_amd64.whl", hash = "sha256:b74f944fe46c421e25d0f8692aef1e842192f6f7f68034201382ac440ef9ea67", size = 1731191, upload-time = "2025-10-24T07:19:40.281Z" }, 1433 + { url = "https://files.pythonhosted.org/packages/47/3c/a1f23b01c54669465f5f4c4083107d496fbe6fb45998771420e9aadcf145/lupa-2.6-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:0e21b716408a21ab65723f8841cf7f2f37a844b7a965eeabb785e27fca4099cf", size = 999343, upload-time = "2025-10-24T07:19:12.519Z" }, 1434 + { url = "https://files.pythonhosted.org/packages/c5/6d/501994291cb640bfa2ccf7f554be4e6914afa21c4026bd01bff9ca8aac57/lupa-2.6-cp314-cp314t-macosx_11_0_universal2.whl", hash = "sha256:589db872a141bfff828340079bbdf3e9a31f2689f4ca0d88f97d9e8c2eae6142", size = 2000730, upload-time = "2025-10-24T07:19:14.869Z" }, 1435 + { url = "https://files.pythonhosted.org/packages/53/a5/457ffb4f3f20469956c2d4c4842a7675e884efc895b2f23d126d23e126cc/lupa-2.6-cp314-cp314t-macosx_11_0_x86_64.whl", hash = "sha256:cd852a91a4a9d4dcbb9a58100f820a75a425703ec3e3f049055f60b8533b7953", size = 1021553, upload-time = "2025-10-24T07:19:17.123Z" }, 1436 + { url = "https://files.pythonhosted.org/packages/51/6b/36bb5a5d0960f2a5c7c700e0819abb76fd9bf9c1d8a66e5106416d6e9b14/lupa-2.6-cp314-cp314t-manylinux2010_i686.manylinux_2_12_i686.manylinux_2_28_i686.whl", hash = "sha256:0334753be028358922415ca97a64a3048e4ed155413fc4eaf87dd0a7e2752983", size = 1133275, upload-time = "2025-10-24T07:19:20.51Z" }, 1437 + { url = "https://files.pythonhosted.org/packages/19/86/202ff4429f663013f37d2229f6176ca9f83678a50257d70f61a0a97281bf/lupa-2.6-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:661d895cd38c87658a34780fac54a690ec036ead743e41b74c3fb81a9e65a6aa", size = 1038441, upload-time = "2025-10-24T07:19:22.509Z" }, 1438 + { url = 
"https://files.pythonhosted.org/packages/a7/42/d8125f8e420714e5b52e9c08d88b5329dfb02dcca731b4f21faaee6cc5b5/lupa-2.6-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:6aa58454ccc13878cc177c62529a2056be734da16369e451987ff92784994ca7", size = 2058324, upload-time = "2025-10-24T07:19:24.979Z" }, 1439 + { url = "https://files.pythonhosted.org/packages/2b/2c/47bf8b84059876e877a339717ddb595a4a7b0e8740bacae78ba527562e1c/lupa-2.6-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:1425017264e470c98022bba8cff5bd46d054a827f5df6b80274f9cc71dafd24f", size = 1060250, upload-time = "2025-10-24T07:19:27.262Z" }, 1440 + { url = "https://files.pythonhosted.org/packages/c2/06/d88add2b6406ca1bdec99d11a429222837ca6d03bea42ca75afa169a78cb/lupa-2.6-cp314-cp314t-musllinux_1_2_i686.whl", hash = "sha256:224af0532d216e3105f0a127410f12320f7c5f1aa0300bdf9646b8d9afb0048c", size = 1151126, upload-time = "2025-10-24T07:19:29.522Z" }, 1441 + { url = "https://files.pythonhosted.org/packages/b4/a0/89e6a024c3b4485b89ef86881c9d55e097e7cb0bdb74efb746f2fa6a9a76/lupa-2.6-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:9abb98d5a8fd27c8285302e82199f0e56e463066f88f619d6594a450bf269d80", size = 2153693, upload-time = "2025-10-24T07:19:31.379Z" }, 1442 + { url = "https://files.pythonhosted.org/packages/b6/36/a0f007dc58fc1bbf51fb85dcc82fcb1f21b8c4261361de7dab0e3d8521ef/lupa-2.6-cp314-cp314t-win32.whl", hash = "sha256:1849efeba7a8f6fb8aa2c13790bee988fd242ae404bd459509640eeea3d1e291", size = 1590104, upload-time = "2025-10-24T07:19:33.514Z" }, 1443 + { url = "https://files.pythonhosted.org/packages/7d/5e/db903ce9cf82c48d6b91bf6d63ae4c8d0d17958939a4e04ba6b9f38b8643/lupa-2.6-cp314-cp314t-win_amd64.whl", hash = "sha256:fc1498d1a4fc028bc521c26d0fad4ca00ed63b952e32fb95949bda76a04bad52", size = 1913818, upload-time = "2025-10-24T07:19:36.039Z" }, 1444 + ] 1445 + 1446 + [[package]] 1447 name = "mako" 1448 version = "1.3.10" 1449 source = { registry = "https://pypi.org/simple" } ··· 1732 ] 1733 1734 [[package]] 1735 + name = "opentelemetry-exporter-prometheus" 1736 + version = "0.59b0" 1737 + source = { registry = "https://pypi.org/simple" } 1738 + dependencies = [ 1739 + { name = "opentelemetry-api" }, 1740 + { name = "opentelemetry-sdk" }, 1741 + { name = "prometheus-client" }, 1742 + ] 1743 + sdist = { url = "https://files.pythonhosted.org/packages/1b/07/39370ec7eacfca10462121a0e036b66ccea3a616bf6ae6ea5fdb72e5009d/opentelemetry_exporter_prometheus-0.59b0.tar.gz", hash = "sha256:d64f23c49abb5a54e271c2fbc8feacea0c394a30ec29876ab5ef7379f08cf3d7", size = 14972, upload-time = "2025-10-16T08:35:55.973Z" } 1744 + wheels = [ 1745 + { url = "https://files.pythonhosted.org/packages/05/ea/3005a732002242fd86203989520bdd5a752e1fd30dc225d5d45751ea19fb/opentelemetry_exporter_prometheus-0.59b0-py3-none-any.whl", hash = "sha256:71ced23207abd15b30d1fe4e7e910dcaa7c2ff1f24a6ffccbd4fdded676f541b", size = 13017, upload-time = "2025-10-16T08:35:37.253Z" }, 1746 + ] 1747 + 1748 + [[package]] 1749 name = "opentelemetry-instrumentation" 1750 version = "0.59b0" 1751 source = { registry = "https://pypi.org/simple" } ··· 2017 ] 2018 2019 [[package]] 2020 + name = "prometheus-client" 2021 + version = "0.23.1" 2022 + source = { registry = "https://pypi.org/simple" } 2023 + sdist = { url = "https://files.pythonhosted.org/packages/23/53/3edb5d68ecf6b38fcbcc1ad28391117d2a322d9a1a3eff04bfdb184d8c3b/prometheus_client-0.23.1.tar.gz", hash = "sha256:6ae8f9081eaaaf153a2e959d2e6c4f4fb57b12ef76c8c7980202f1e57b48b2ce", 
size = 80481, upload-time = "2025-09-18T20:47:25.043Z" } 2024 + wheels = [ 2025 + { url = "https://files.pythonhosted.org/packages/b8/db/14bafcb4af2139e046d03fd00dea7873e48eafe18b7d2797e73d6681f210/prometheus_client-0.23.1-py3-none-any.whl", hash = "sha256:dd1913e6e76b59cfe44e7a4b83e01afc9873c1bdfd2ed8739f1e76aeca115f99", size = 61145, upload-time = "2025-09-18T20:47:23.875Z" }, 2026 + ] 2027 + 2028 + [[package]] 2029 name = "prompt-toolkit" 2030 version = "3.0.52" 2031 source = { registry = "https://pypi.org/simple" } ··· 2231 ] 2232 2233 [[package]] 2234 + name = "py-key-value-aio" 2235 + version = "0.3.0" 2236 + source = { registry = "https://pypi.org/simple" } 2237 + dependencies = [ 2238 + { name = "beartype" }, 2239 + { name = "py-key-value-shared" }, 2240 + ] 2241 + sdist = { url = "https://files.pythonhosted.org/packages/93/ce/3136b771dddf5ac905cc193b461eb67967cf3979688c6696e1f2cdcde7ea/py_key_value_aio-0.3.0.tar.gz", hash = "sha256:858e852fcf6d696d231266da66042d3355a7f9871650415feef9fca7a6cd4155", size = 50801, upload-time = "2025-11-17T16:50:04.711Z" } 2242 + wheels = [ 2243 + { url = "https://files.pythonhosted.org/packages/99/10/72f6f213b8f0bce36eff21fda0a13271834e9eeff7f9609b01afdc253c79/py_key_value_aio-0.3.0-py3-none-any.whl", hash = "sha256:1c781915766078bfd608daa769fefb97e65d1d73746a3dfb640460e322071b64", size = 96342, upload-time = "2025-11-17T16:50:03.801Z" }, 2244 + ] 2245 + 2246 + [package.optional-dependencies] 2247 + memory = [ 2248 + { name = "cachetools" }, 2249 + ] 2250 + redis = [ 2251 + { name = "redis" }, 2252 + ] 2253 + 2254 + [[package]] 2255 + name = "py-key-value-shared" 2256 + version = "0.3.0" 2257 + source = { registry = "https://pypi.org/simple" } 2258 + dependencies = [ 2259 + { name = "beartype" }, 2260 + { name = "typing-extensions" }, 2261 + ] 2262 + sdist = { url = "https://files.pythonhosted.org/packages/7b/e4/1971dfc4620a3a15b4579fe99e024f5edd6e0967a71154771a059daff4db/py_key_value_shared-0.3.0.tar.gz", hash = "sha256:8fdd786cf96c3e900102945f92aa1473138ebe960ef49da1c833790160c28a4b", size = 11666, upload-time = "2025-11-17T16:50:06.849Z" } 2263 + wheels = [ 2264 + { url = "https://files.pythonhosted.org/packages/51/e4/b8b0a03ece72f47dce2307d36e1c34725b7223d209fc679315ffe6a4e2c3/py_key_value_shared-0.3.0-py3-none-any.whl", hash = "sha256:5b0efba7ebca08bb158b1e93afc2f07d30b8f40c2fc12ce24a4c0d84f42f9298", size = 19560, upload-time = "2025-11-17T16:50:05.954Z" }, 2265 + ] 2266 + 2267 + [[package]] 2268 name = "pyasn1" 2269 version = "0.6.1" 2270 source = { registry = "https://pypi.org/simple" } ··· 2409 ] 2410 2411 [[package]] 2412 + name = "pydocket" 2413 + version = "0.15.2" 2414 + source = { registry = "https://pypi.org/simple" } 2415 + dependencies = [ 2416 + { name = "cloudpickle" }, 2417 + { name = "fakeredis", extra = ["lua"] }, 2418 + { name = "opentelemetry-api" }, 2419 + { name = "opentelemetry-exporter-prometheus" }, 2420 + { name = "prometheus-client" }, 2421 + { name = "py-key-value-aio", extra = ["memory", "redis"] }, 2422 + { name = "python-json-logger" }, 2423 + { name = "redis" }, 2424 + { name = "rich" }, 2425 + { name = "typer" }, 2426 + { name = "typing-extensions" }, 2427 + ] 2428 + sdist = { url = "https://files.pythonhosted.org/packages/9f/a8/7786b0d528823fda886370fb58222ace079afffc9b28bbbc22e8b9e668d6/pydocket-0.15.2.tar.gz", hash = "sha256:c4d3c8d221251eb57c98c0e69aa60adf1eba163afcfeded01c8f32b2e4aaae37", size = 269495, upload-time = "2025-12-08T15:53:22.766Z" } 2429 + wheels = [ 2430 + { url = 
"https://files.pythonhosted.org/packages/82/22/d449dd17453a3b7e0168d9045f2f609c4a12a3288277f9358e580721ae9d/pydocket-0.15.2-py3-none-any.whl", hash = "sha256:8f8414315bdb2db588a09bd5ea399573052ff2cc185743e5a003db3e91f1cefc", size = 57292, upload-time = "2025-12-08T15:53:21.184Z" }, 2431 + ] 2432 + 2433 + [[package]] 2434 name = "pygments" 2435 version = "2.19.2" 2436 source = { registry = "https://pypi.org/simple" } ··· 2575 [package.optional-dependencies] 2576 cryptography = [ 2577 { name = "cryptography" }, 2578 + ] 2579 + 2580 + [[package]] 2581 + name = "python-json-logger" 2582 + version = "4.0.0" 2583 + source = { registry = "https://pypi.org/simple" } 2584 + sdist = { url = "https://files.pythonhosted.org/packages/29/bf/eca6a3d43db1dae7070f70e160ab20b807627ba953663ba07928cdd3dc58/python_json_logger-4.0.0.tar.gz", hash = "sha256:f58e68eb46e1faed27e0f574a55a0455eecd7b8a5b88b85a784519ba3cff047f", size = 17683, upload-time = "2025-10-06T04:15:18.984Z" } 2585 + wheels = [ 2586 + { url = "https://files.pythonhosted.org/packages/51/e5/fecf13f06e5e5f67e8837d777d1bc43fac0ed2b77a676804df5c34744727/python_json_logger-4.0.0-py3-none-any.whl", hash = "sha256:af09c9daf6a813aa4cc7180395f50f2a9e5fa056034c9953aec92e381c5ba1e2", size = 15548, upload-time = "2025-10-06T04:15:17.553Z" }, 2587 ] 2588 2589 [[package]] ··· 2651 ] 2652 2653 [[package]] 2654 + name = "redis" 2655 + version = "7.1.0" 2656 + source = { registry = "https://pypi.org/simple" } 2657 + dependencies = [ 2658 + { name = "async-timeout", marker = "python_full_version < '3.11.3'" }, 2659 + ] 2660 + sdist = { url = "https://files.pythonhosted.org/packages/43/c8/983d5c6579a411d8a99bc5823cc5712768859b5ce2c8afe1a65b37832c81/redis-7.1.0.tar.gz", hash = "sha256:b1cc3cfa5a2cb9c2ab3ba700864fb0ad75617b41f01352ce5779dabf6d5f9c3c", size = 4796669, upload-time = "2025-11-19T15:54:39.961Z" } 2661 + wheels = [ 2662 + { url = "https://files.pythonhosted.org/packages/89/f0/8956f8a86b20d7bb9d6ac0187cf4cd54d8065bc9a1a09eb8011d4d326596/redis-7.1.0-py3-none-any.whl", hash = "sha256:23c52b208f92b56103e17c5d06bdc1a6c2c0b3106583985a76a18f83b265de2b", size = 354159, upload-time = "2025-11-19T15:54:38.064Z" }, 2663 + ] 2664 + 2665 + [[package]] 2666 name = "requests" 2667 version = "2.32.5" 2668 source = { registry = "https://pypi.org/simple" } ··· 2741 ] 2742 2743 [[package]] 2744 + name = "shellingham" 2745 + version = "1.5.4" 2746 + source = { registry = "https://pypi.org/simple" } 2747 + sdist = { url = "https://files.pythonhosted.org/packages/58/15/8b3609fd3830ef7b27b655beb4b4e9c62313a4e8da8c676e142cc210d58e/shellingham-1.5.4.tar.gz", hash = "sha256:8dbca0739d487e5bd35ab3ca4b36e11c4078f3a234bfce294b0a0291363404de", size = 10310, upload-time = "2023-10-24T04:13:40.426Z" } 2748 + wheels = [ 2749 + { url = "https://files.pythonhosted.org/packages/e0/f9/0595336914c5619e5f28a1fb793285925a8cd4b432c9da0a987836c7f822/shellingham-1.5.4-py2.py3-none-any.whl", hash = "sha256:7ecfff8f2fd72616f7481040475a65b2bf8af90a56c89140852d1120324e8686", size = 9755, upload-time = "2023-10-24T04:13:38.866Z" }, 2750 + ] 2751 + 2752 + [[package]] 2753 name = "six" 2754 version = "1.17.0" 2755 source = { registry = "https://pypi.org/simple" } ··· 2773 sdist = { url = "https://files.pythonhosted.org/packages/a2/87/a6771e1546d97e7e041b6ae58d80074f81b7d5121207425c964ddf5cfdbd/sniffio-1.3.1.tar.gz", hash = "sha256:f4324edc670a0f49750a81b895f35c3adb843cca46f0530f79fc1babb23789dc", size = 20372, upload-time = "2024-02-25T23:20:04.057Z" } 2774 wheels = [ 2775 { url = 
"https://files.pythonhosted.org/packages/e9/44/75a9c9421471a6c4805dbf2356f7c181a29c1879239abab1ea2cc8f38b40/sniffio-1.3.1-py3-none-any.whl", hash = "sha256:2f6da418d1f1e0fddd844478f41680e794e6051915791a034ff65e5f100525a2", size = 10235, upload-time = "2024-02-25T23:20:01.196Z" }, 2776 + ] 2777 + 2778 + [[package]] 2779 + name = "sortedcontainers" 2780 + version = "2.4.0" 2781 + source = { registry = "https://pypi.org/simple" } 2782 + sdist = { url = "https://files.pythonhosted.org/packages/e8/c4/ba2f8066cceb6f23394729afe52f3bf7adec04bf9ed2c820b39e19299111/sortedcontainers-2.4.0.tar.gz", hash = "sha256:25caa5a06cc30b6b83d11423433f65d1f9d76c4c6a0c90e3379eaa43b9bfdb88", size = 30594, upload-time = "2021-05-16T22:03:42.897Z" } 2783 + wheels = [ 2784 + { url = "https://files.pythonhosted.org/packages/32/46/9cb0e58b2deb7f82b84065f37f3bffeb12413f947f9388e4cac22c4621ce/sortedcontainers-2.4.0-py2.py3-none-any.whl", hash = "sha256:a163dcaede0f1c021485e957a39245190e74249897e2ae4b2aa38595db237ee0", size = 29575, upload-time = "2021-05-16T22:03:41.177Z" }, 2785 ] 2786 2787 [[package]] ··· 2929 { url = "https://files.pythonhosted.org/packages/e4/25/9324ae947fcc4322470326cf8276a3fc2f08dc82adec1de79d963fdf7af5/ty-0.0.1a25-py3-none-win32.whl", hash = "sha256:168fc8aee396d617451acc44cd28baffa47359777342836060c27aa6f37e2445", size = 8387095, upload-time = "2025-10-29T19:40:18.368Z" }, 2930 { url = "https://files.pythonhosted.org/packages/3b/2b/cb12cbc7db1ba310aa7b1de9b4e018576f653105993736c086ee67d2ec02/ty-0.0.1a25-py3-none-win_amd64.whl", hash = "sha256:a2fad3d8e92bb4d57a8872a6f56b1aef54539d36f23ebb01abe88ac4338efafb", size = 9059225, upload-time = "2025-10-29T19:40:20.278Z" }, 2931 { url = "https://files.pythonhosted.org/packages/2f/c1/f6be8cdd0bf387c1d8ee9d14bb299b7b5d2c0532f550a6693216a32ec0c5/ty-0.0.1a25-py3-none-win_arm64.whl", hash = "sha256:dde2962d448ed87c48736e9a4bb13715a4cced705525e732b1c0dac1d4c66e3d", size = 8536832, upload-time = "2025-10-29T19:40:22.014Z" }, 2932 + ] 2933 + 2934 + [[package]] 2935 + name = "typer" 2936 + version = "0.20.0" 2937 + source = { registry = "https://pypi.org/simple" } 2938 + dependencies = [ 2939 + { name = "click" }, 2940 + { name = "rich" }, 2941 + { name = "shellingham" }, 2942 + { name = "typing-extensions" }, 2943 + ] 2944 + sdist = { url = "https://files.pythonhosted.org/packages/8f/28/7c85c8032b91dbe79725b6f17d2fffc595dff06a35c7a30a37bef73a1ab4/typer-0.20.0.tar.gz", hash = "sha256:1aaf6494031793e4876fb0bacfa6a912b551cf43c1e63c800df8b1a866720c37", size = 106492, upload-time = "2025-10-20T17:03:49.445Z" } 2945 + wheels = [ 2946 + { url = "https://files.pythonhosted.org/packages/78/64/7713ffe4b5983314e9d436a90d5bd4f63b6054e2aca783a3cfc44cb95bbf/typer-0.20.0-py3-none-any.whl", hash = "sha256:5b463df6793ec1dca6213a3cf4c0f03bc6e322ac5e16e13ddd622a889489784a", size = 47028, upload-time = "2025-10-20T17:03:47.617Z" }, 2947 ] 2948 2949 [[package]]
+9
compose.yaml
···
··· 1 + # dev services for local development 2 + # usage: docker compose up -d 3 + 4 + services: 5 + redis: 6 + image: redis:7-alpine 7 + ports: 8 + - "6379:6379" 9 + command: redis-server --appendonly no --save ""
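optional sanity check after `docker compose up -d`: ping the local redis with redis-py, which the backend now pulls in transitively via pydocket (see uv.lock above):

```python
import redis

r = redis.Redis(host="localhost", port=6379)
print(r.ping())  # True once the dev redis container is accepting connections
```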
+145
docs/backend/background-tasks.md
···
···
+ # background tasks
+
+ plyr.fm uses [pydocket](https://github.com/PrefectHQ/pydocket) for durable background task execution, backed by Redis.
+
+ ## overview
+
+ background tasks handle operations that shouldn't block the request/response cycle:
+ - **copyright scanning** - analyzes uploaded tracks for potential copyright matches
+ - (future) upload processing, notifications, etc.
+
+ ## architecture
+
+ ```
+ ┌─────────────┐      ┌─────────────┐      ┌─────────────┐
+ │   FastAPI   │─────▶│    Redis    │◀─────│   Worker    │
+ │ (add task)  │      │   (queue)   │      │  (process)  │
+ └─────────────┘      └─────────────┘      └─────────────┘
+ ```
+
+ - **docket** schedules tasks to Redis
+ - **worker** runs in-process alongside FastAPI, processing tasks from the queue
+ - tasks are durable - if the worker crashes, tasks are retried on restart
+
+ ## configuration
+
+ ### environment variables
+
+ ```bash
+ # redis URL (required to enable docket)
+ DOCKET_URL=redis://localhost:6379
+
+ # optional settings (have sensible defaults)
+ DOCKET_NAME=plyr              # queue namespace
+ DOCKET_WORKER_CONCURRENCY=10  # concurrent task limit
+ ```
+
+ when `DOCKET_URL` is not set, docket is disabled and tasks fall back to `asyncio.create_task()` (fire-and-forget).
+
+ ### local development
+
+ ```bash
+ # start redis + backend + frontend
+ just dev
+
+ # or manually:
+ docker compose up -d  # starts redis on localhost:6379
+ DOCKET_URL=redis://localhost:6379 just backend run
+ ```
+
+ ### production/staging
+
+ Redis instances are provisioned via Upstash (managed Redis):
+
+ | environment | instance | region |
+ |-------------|----------|--------|
+ | production | `plyr-redis-prd` | us-east-1 (near fly.io) |
+ | staging | `plyr-redis-stg` | us-east-1 |
+
+ set `DOCKET_URL` in fly.io secrets:
+ ```bash
+ flyctl secrets set DOCKET_URL=rediss://default:xxx@xxx.upstash.io:6379 -a relay-api
+ flyctl secrets set DOCKET_URL=rediss://default:xxx@xxx.upstash.io:6379 -a relay-api-staging
+ ```
+
+ note: use `rediss://` (with double 's') for TLS connections to Upstash.
+
+ ## usage
+
+ ### scheduling a task
+
+ ```python
+ from backend._internal.background_tasks import schedule_copyright_scan
+
+ # automatically uses docket if enabled, else asyncio.create_task
+ await schedule_copyright_scan(track_id, audio_url)
+ ```
+
+ ### adding new tasks
+
+ 1. define the task function in `backend/_internal/background_tasks.py`:
+ ```python
+ async def my_new_task(arg1: str, arg2: int) -> None:
+     """task functions must be async and take only JSON-serializable args."""
+     # do work here
+     pass
+ ```
+
+ 2. register it in `backend/_internal/background.py`:
+ ```python
+ def _register_tasks(docket: Docket) -> None:
+     from backend._internal.background_tasks import my_new_task, scan_copyright
+
+     docket.register(scan_copyright)
+     docket.register(my_new_task)  # add here
+ ```
+
+ 3. create a scheduler helper if needed:
+ ```python
+ async def schedule_my_task(arg1: str, arg2: int) -> None:
+     """schedule with docket if enabled, else asyncio."""
+     if is_docket_enabled():
+         try:
+             docket = get_docket()
+             await docket.add(my_new_task)(arg1, arg2)
+             return
+         except Exception:
+             pass  # fall through to asyncio
+
+     asyncio.create_task(my_new_task(arg1, arg2))
+ ```
+
+ ## costs
+
+ **Upstash pricing** (pay-per-request):
+ - free tier: 10k commands/day
+ - pro: $0.20 per 100k commands + $0.25/GB storage
+
+ for plyr.fm's volume (~100 uploads/day), this stays well within the free tier or costs $0-5/mo.
+
+ **tips to avoid surprise bills**:
+ - use **regional** (not global) replication
+ - set a **max data limit** (256MB is plenty for a task queue)
+ - monitor usage in the Upstash dashboard
+
+ ## fallback behavior
+
+ when docket is disabled (`DOCKET_URL` not set):
+ - `schedule_copyright_scan()` uses `asyncio.create_task()` instead
+ - tasks are fire-and-forget (no retries, no durability)
+ - suitable for local dev without Redis
+
+ ## monitoring
+
+ background task execution is traced in Logfire:
+ - span: `scheduled copyright scan via docket`
+ - span: `docket scheduling failed, falling back to asyncio`
+
+ query recent background task activity:
+ ```sql
+ SELECT start_timestamp, message, span_name, duration
+ FROM records
+ WHERE span_name LIKE '%copyright%'
+   AND start_timestamp > NOW() - INTERVAL '1 hour'
+ ORDER BY start_timestamp DESC
+ ```
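the doc above says the worker runs in-process alongside FastAPI but doesn't show that wiring. a rough sketch of how the lifespan hookup could look, assuming pydocket's `Docket`/`Worker` async context managers — the helper names, settings access, and the worker run-loop call are approximations, not the real code in `backend/_internal/background.py`:

```python
# illustrative only: run a docket worker inside the FastAPI lifespan.
import asyncio
from contextlib import asynccontextmanager

from docket import Docket, Worker  # pydocket
from fastapi import FastAPI

DOCKET_URL = "redis://localhost:6379"  # the real app reads settings.docket.url


@asynccontextmanager
async def lifespan(app: FastAPI):
    if not DOCKET_URL:
        # docket disabled: no worker; schedulers fall back to asyncio.create_task
        yield
        return
    async with Docket(name="plyr", url=DOCKET_URL) as docket:
        # _register_tasks(docket) would be called here (see background.py above)
        async with Worker(docket) as worker:
            # long-running processing loop; exact method name per pydocket's worker API
            run = asyncio.create_task(worker.run_forever())
            try:
                yield
            finally:
                run.cancel()


app = FastAPI(lifespan=lifespan)
```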
+10
docs/backend/configuration.md
···
  settings.notify.recipient_handle      # from NOTIFY_RECIPIENT_HANDLE
  settings.notify.bot.handle            # from NOTIFY_BOT_HANDLE
  settings.notify.bot.password          # from NOTIFY_BOT_PASSWORD
+
+ # background task settings (docket/redis)
+ settings.docket.name                  # "plyr" (queue namespace)
+ settings.docket.url                   # from DOCKET_URL (empty = disabled)
+ settings.docket.worker_concurrency    # 10 (concurrent tasks)
  ```

  ## environment variables
···
  NOTIFY_RECIPIENT_HANDLE=your.handle
  NOTIFY_BOT_HANDLE=bot.handle
  NOTIFY_BOT_PASSWORD=app-password
+
+ # background tasks (docket/redis)
+ DOCKET_URL=redis://localhost:6379     # or rediss:// for TLS
+ DOCKET_NAME=plyr                      # queue namespace (default: plyr)
+ DOCKET_WORKER_CONCURRENCY=10          # concurrent task limit (default: 10)
  ```

  ## computed fields
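for context, a minimal sketch of the settings block implied by the variables above, written with pydantic-settings and an env prefix; the actual definition in the backend config module may be structured differently:

```python
# sketch of the docket settings shape; defaults mirror the documented ones.
from pydantic_settings import BaseSettings, SettingsConfigDict


class DocketSettings(BaseSettings):
    model_config = SettingsConfigDict(env_prefix="DOCKET_")

    url: str = ""                  # DOCKET_URL; empty string = docket disabled
    name: str = "plyr"             # DOCKET_NAME; queue namespace
    worker_concurrency: int = 10   # DOCKET_WORKER_CONCURRENCY


# DocketSettings() reads DOCKET_* from the environment at construction time
```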
+14 -10
docs/deployment/environments.md
···

  ## environments

- | environment | trigger | backend URL | database | frontend | storage |
- |-------------|---------|-------------|----------|----------|---------|
- | **development** | local | localhost:8001 | plyr-dev (neon) | localhost:5173 | audio-dev, images-dev (r2) |
- | **staging** | push to main | api-stg.plyr.fm | plyr-staging (neon) | stg.plyr.fm (main branch) | audio-staging, images-staging (r2) |
- | **production** | github release | api.plyr.fm | plyr-prod (neon) | plyr.fm (production-fe branch) | audio-prod, images-prod (r2) |
+ | environment | trigger | backend URL | database | redis | frontend | storage |
+ |-------------|---------|-------------|----------|-------|----------|---------|
+ | **development** | local | localhost:8001 | plyr-dev (neon) | localhost:6379 (docker) | localhost:5173 | audio-dev, images-dev (r2) |
+ | **staging** | push to main | api-stg.plyr.fm | plyr-stg (neon) | plyr-redis-stg (upstash) | stg.plyr.fm (main branch) | audio-staging, images-staging (r2) |
+ | **production** | github release | api.plyr.fm | plyr-prd (neon) | plyr-redis-prd (upstash) | plyr.fm (production-fe branch) | audio-prod, images-prod (r2) |

  ## workflow

  ### local development

  ```bash
- # start backend (hot reloads)
- just run-backend
+ # terminal 1: start redis
+ just dev-services

- # start frontend (hot reloads)
+ # terminal 2: start backend (with docket enabled)
+ DOCKET_URL=redis://localhost:6379 just backend run
+
+ # terminal 3: start frontend
  just frontend dev

- # start transcoder (hot reloads)
+ # optional: start transcoder
  just transcoder run
  ```

- connects to `plyr-dev` neon database and uses `fm.plyr.dev` atproto namespace.
+ connects to the `plyr-dev` neon database and local Redis, and uses the `fm.plyr.dev` atproto namespace.

  ### staging deployment (automatic)
···

  all secrets configured via `flyctl secrets set`. key environment variables:
  - `DATABASE_URL` → neon connection string (env-specific)
+ - `DOCKET_URL` → redis URL for background tasks (env-specific, use `rediss://` for TLS)
  - `FRONTEND_URL` → frontend URL for CORS (production: `https://plyr.fm`, staging: `https://stg.plyr.fm`)
  - `ATPROTO_APP_NAMESPACE` → atproto namespace (environment-specific, separates records by environment)
  - development: `fm.plyr.dev` (local `.env`)
+7 -7
frontend/src/lib/components/Header.svelte
···

  .margin-left {
    position: absolute;
-   /* Center in the left margin: halfway between left edge and content area */
-   left: calc((100vw - var(--queue-width, 0px) - 800px) / 4);
+   left: 0;
    top: 50%;
-   transform: translate(-50%, -50%);
-   transition: left 0.3s ease;
+   transform: translateY(-50%);
+   transition: width 0.3s ease;
    display: flex;
    align-items: center;
-   gap: 1rem;
-   /* Constrain width to prevent overflow into content area */
-   max-width: calc((100vw - var(--queue-width, 0px) - 800px) / 2 - 2rem);
+   justify-content: space-evenly;
+   /* Fill the left margin area */
+   width: calc((100vw - var(--queue-width, 0px) - 800px) / 2);
+   padding: 0 1rem;
  }

  .logout-right {
+9 -9
frontend/src/lib/components/PlatformStats.svelte
···
  <div class="stats-header">
    {#if !loading && stats}
      <div class="header-stat" title="{stats.total_plays.toLocaleString()} {pluralize(stats.total_plays, 'play', 'plays')}">
-       <svg class="header-icon" width="14" height="14" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round">
+       <svg class="header-icon" width="12" height="12" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round">
          <polygon points="5 3 19 12 5 21 5 3"></polygon>
        </svg>
        <span class="header-value">{stats.total_plays.toLocaleString()}</span>
      </div>
      <div class="header-stat" title="{stats.total_tracks.toLocaleString()} {pluralize(stats.total_tracks, 'track', 'tracks')}">
-       <svg class="header-icon" width="14" height="14" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="1.5">
+       <svg class="header-icon" width="12" height="12" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="1.5">
          <path d="M9 18V5l12-2v13"></path>
          <circle cx="6" cy="18" r="3"></circle>
          <circle cx="18" cy="16" r="3"></circle>
···
        <span class="header-value">{stats.total_tracks.toLocaleString()}</span>
      </div>
      <div class="header-stat" title="{stats.total_artists.toLocaleString()} {pluralize(stats.total_artists, 'artist', 'artists')}">
-       <svg class="header-icon" width="14" height="14" viewBox="0 0 16 16" fill="none">
+       <svg class="header-icon" width="12" height="12" viewBox="0 0 16 16" fill="none">
          <circle cx="8" cy="5" r="3" stroke="currentColor" stroke-width="1.5" fill="none" />
          <path d="M3 14c0-2.5 2-4.5 5-4.5s5 2 5 4.5" stroke="currentColor" stroke-width="1.5" stroke-linecap="round" />
        </svg>
        <span class="header-value">{stats.total_artists.toLocaleString()}</span>
      </div>
      <div class="header-stat" title="{formatDuration(stats.total_duration_seconds)} of audio">
-       <svg class="header-icon" width="14" height="14" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round">
+       <svg class="header-icon" width="12" height="12" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round">
          <circle cx="12" cy="12" r="10"></circle>
          <polyline points="12 6 12 12 16 14"></polyline>
        </svg>
···
  .stats-header {
    display: flex;
    align-items: center;
-   gap: 0.5rem 0.75rem;
+   gap: 0.4rem 0.6rem;
    flex-wrap: wrap;
-   justify-content: center;
-   max-width: 280px;
+   justify-content: flex-start;
+   max-width: 260px;
  }

  .header-stat {
    display: flex;
    align-items: center;
-   gap: 0.3rem;
+   gap: 0.25rem;
    color: var(--text-secondary);
-   font-size: 0.75rem;
+   font-size: 0.65rem;
    transition: color 0.2s;
    white-space: nowrap;
  }
+10 -6
justfile
···

  # get setup
  setup:
-     # symlink AGENTS.md to CLAUDE.md and GEMINI.md
+     # symlink AGENTS.md to CLAUDE.md
      ln -s AGENTS.md CLAUDE.md
-     ln -s AGENTS.md GEMINI.md
-
-     # Setup sub-modules if they have setup recipes
-     # just frontend setup # Uncomment if frontend/justfile gets a setup
-     # just backend setup # Uncomment if backend/justfile gets a setup


  # show commits since last release
···
  release-frontend-only:
      git fetch origin main
      git push origin origin/main:production-fe
+
+ # start dev services (redis)
+ dev-services:
+     docker compose up -d
+     @echo "redis running at localhost:6379"
+
+ # stop dev services
+ dev-services-down:
+     docker compose down