A tool that tails the firehose, matches images against known perceptual hashes, and labels them.

feat: initial commit of skywatch-phash image moderation service

Perceptual hash-based image moderation system for Bluesky using the Jetstream firehose.

Features:
- Real-time image processing from the Jetstream firehose
- Perceptual hashing with configurable similarity thresholds
- Redis-based caching and queue system
- Automatic moderation actions via Ozone
- PDS resolution via PLC directory
- Cursor persistence for resumable processing
- Repo takedown detection to skip unnecessary processing
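
As a rough illustration of the cursor persistence feature: Jetstream accepts a `cursor` query parameter in unix microseconds, so resumable processing can be as simple as flushing the last processed event time to disk and re-reading it on startup. This is a hypothetical sketch, not the service's actual code; the file name and helper names are illustrative (the repo's `.gitignore` does list a `cursor.txt`).

```typescript
import { existsSync, readFileSync, writeFileSync } from "node:fs";

// Illustrative cursor persistence: Jetstream's subscribe endpoint takes a
// `cursor` query param (unix microseconds), so persisting the last event's
// time_us lets the consumer resume where it left off after a restart.
const CURSOR_FILE = "cursor.txt";

function saveCursor(timeUs: number): void {
  writeFileSync(CURSOR_FILE, String(timeUs));
}

function loadCursor(): number | undefined {
  if (!existsSync(CURSOR_FILE)) return undefined;
  return Number(readFileSync(CURSOR_FILE, "utf8"));
}
```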

Technical details:
- Built with Bun 1.1 and TypeScript
- Uses Sharp for image processing
- Docker Compose setup with Redis
- Configurable via .env file

Note: This is alpha software and requires further testing and refinement.

Skywatch 905b9820

+18
.claude/settings.local.json
+ {
+   "permissions": {
+     "allow": [
+       "Read(//Users/scarndp/dev/skywatch/skywatch-tools/src/**)",
+       "Bash(bun add:*)",
+       "Bash(bun run lint:fix:*)",
+       "Bash(bun run lint)",
+       "Bash(bun run typecheck:*)",
+       "Bash(docker compose:*)",
+       "Read(//Users/scarndp/dev/skywatch/skywatch-tail/src/blobs/**)",
+       "Read(//Users/scarndp/dev/skywatch/skywatch-tail/**)",
+       "Bash(curl:*)",
+       "Bash(cat:*)"
+     ],
+     "deny": [],
+     "ask": []
+   }
+ }
+44
.dockerignore
+ # Dependencies
+ node_modules
+ bun.lockb
+
+ # Build artifacts
+ *.log
+ *.tsbuildinfo
+
+ # Tests
+ tests
+ *.test.ts
+ *.spec.ts
+
+ # Scripts
+ scripts
+
+ # Development
+ .git
+ .gitignore
+ .env
+ .env.local
+ .env.*.local
+
+ # Documentation
+ README.md
+ docs
+ *.md
+ !package.json
+
+ # IDE
+ .vscode
+ .idea
+ *.swp
+ *.swo
+ *~
+
+ # OS
+ .DS_Store
+ Thumbs.db
+
+ # Docker
+ Dockerfile
+ docker-compose.yml
+ .dockerignore
+32
.env.example
+ # Jetstream Configuration
+ JETSTREAM_URL=wss://jetstream1.us-east.fire.hose.cam/subscribe
+
+ # Redis Configuration
+ REDIS_URL=redis://localhost:6379
+
+ # Processing Configuration
+ PROCESSING_CONCURRENCY=10
+ RETRY_ATTEMPTS=3
+ RETRY_DELAY_MS=1000
+
+ # Cache Configuration
+ CACHE_ENABLED=true
+ CACHE_TTL_SECONDS=86400
+
+ # PDS Configuration
+ PDS_ENDPOINT=https://bsky.social
+
+ # Labeler Configuration
+ LABELER_DID=did:plc:your-labeler-did
+ LABELER_HANDLE=your-labeler.bsky.social
+ LABELER_PASSWORD=your-app-password
+
+ # Ozone/Moderation Configuration
+ MOD_DID=did:plc:e4elbtctnfqocyfcml6h2lf7
+ OZONE_URL=https://ozone.skywatch.blue
+ OZONE_PDS=https://blewit.us-west.host.bsky.network
+ RATE_LIMIT_MS=100
+
+ # Logging
+ LOG_LEVEL=info
+ NODE_ENV=production
+42
.gitignore
+ # Dependencies
+ node_modules/
+ bun.lockb
+
+ # Environment
+ .env
+ .env.local
+ .env.*.local
+
+ # Session data
+ .session
+ cursor.txt
+
+ # Logs
+ logs/
+ *.log
+ pids
+ *.pid
+ *.seed
+ *.pid.lock
+
+ # Build outputs
+ dist/
+ build/
+ *.tsbuildinfo
+
+ # IDE
+ .vscode/
+ .idea/
+ *.swp
+ *.swo
+ *~
+ .DS_Store
+
+ # Test coverage
+ coverage/
+ .nyc_output/
+
+ # Temporary files
+ tmp/
+ temp/
+ .session
+13
Dockerfile
+ FROM oven/bun:1.1-slim
+
+ WORKDIR /app
+
+ COPY package.json bun.lockb* ./
+
+ RUN bun install --frozen-lockfile
+
+ COPY . .
+
+ ENV NODE_ENV=production
+
+ CMD ["bun", "run", "start"]
+128
README.md
+ # skywatch-phash
+
+ Perceptual hash-based image moderation service for Bluesky/ATProto. Detects known harassment images using phash fingerprinting and automatically applies labels and reports.
+
+ ## How it works
+
+ 1. Subscribes to the Bluesky firehose via Jetstream
+ 2. Extracts images from posts and computes perceptual hashes
+ 3. Compares against known harassment image hashes using Hamming distance
+ 4. On match, executes configured moderation actions (label/report post and/or account)
+ 5. Caches phashes in Redis to avoid re-fetching viral images
+
+ ## Features
+
+ - **Fast matching** - Hamming distance threshold for fuzzy matching (handles crops, filters, etc.)
+ - **Caching** - Redis-backed phash cache (24-hour TTL by default)
+ - **Deduplication** - Prevents duplicate labels/reports via Redis claims (7-day TTL)
+ - **Allowlisting** - Skip checks for trusted accounts via the `ignoreDID` field
+ - **Rate limiting** - Configurable delay between moderation API calls
+ - **Metrics** - Tracks cache hits, matches, labels applied, etc.
+
+ ## Setup
+
+ ### Prerequisites
+
+ - Bun runtime
+ - Redis server
+ - Bluesky labeler account with app password
+
+ ### Installation
+
+ ```bash
+ bun install
+ ```
+
+ ### Configuration
+
+ Copy `.env.example` to `.env` and configure:
+
+ ```bash
+ # Required
+ LABELER_DID=did:plc:your-labeler-did
+ LABELER_HANDLE=your-labeler.bsky.social
+ LABELER_PASSWORD=your-app-password
+
+ # Optional (defaults shown)
+ JETSTREAM_URL=wss://jetstream1.us-east.fire.hose.cam/subscribe
+ REDIS_URL=redis://localhost:6379
+ PROCESSING_CONCURRENCY=10
+ CACHE_ENABLED=true
+ CACHE_TTL_SECONDS=86400
+ OZONE_URL=https://ozone.skywatch.blue
+ OZONE_PDS=https://blewit.us-west.host.bsky.network
+ MOD_DID=did:plc:e4elbtctnfqocyfcml6h2lf7
+ RATE_LIMIT_MS=100
+ ```
+
+ ### Adding phash rules
+
+ Edit `rules/blobs.ts`:
+
+ ```typescript
+ export const BLOB_CHECKS: BlobCheck[] = [
+   {
+     phashes: ["0f1e2d3c4b5a6978", "1a2b3c4d5e6f7890"],
+     label: "harassment-image",
+     comment: "Known harassment meme detected",
+     reportAcct: false,
+     labelAcct: false,
+     reportPost: true,
+     toLabel: true,
+     hammingThreshold: 5,
+     ignoreDID: ["did:plc:trusted-account"],
+   },
+ ];
+ ```
+
+ To generate a phash from an image:
+
+ ```bash
+ bun run phash /path/to/image.png
+ ```
+
+ ## Running
+
+ ### Development
+
+ ```bash
+ bun run dev
+ ```
+
+ ### Production
+
+ ```bash
+ bun run start
+ ```
+
+ ### Docker
+
+ ```bash
+ docker compose up -d
+ ```
+
+ ## Testing
+
+ ```bash
+ bun test           # run all tests
+ bun run typecheck  # type checking
+ bun run lint       # linting
+ ```
+
+ ## VM Requirements
+
+ **Minimal:**
+ - 2GB RAM
+ - 2 vCPUs
+ - 10GB disk
+
+ **Recommended:**
+ - 4GB RAM
+ - 2-4 vCPUs
+ - 20GB disk
+
+ Scale `PROCESSING_CONCURRENCY` based on available RAM (each concurrent image process uses ~50-200MB).
+
+ ## License
+
+ MIT
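
The fuzzy matching the README describes (compare 64-bit phashes by Hamming distance against a per-rule `hammingThreshold`) can be sketched like this. This is a minimal illustration, not the service's actual code; `hammingDistance` is a hypothetical helper operating on the 16-character hex phash strings shown in the example rule.

```typescript
// Hamming distance between two same-length hex phash strings:
// XOR each hex nibble and count the set bits.
function hammingDistance(a: string, b: string): number {
  if (a.length !== b.length) throw new Error("phash length mismatch");
  let dist = 0;
  for (let i = 0; i < a.length; i++) {
    let x = parseInt(a[i], 16) ^ parseInt(b[i], 16);
    while (x) {
      dist += x & 1; // count one set bit
      x >>= 1;
    }
  }
  return dist;
}

// Distance 1 here: only the last nibble differs (0x8 vs 0xa is one flipped
// bit), so with hammingThreshold: 5 this would count as a match.
hammingDistance("0f1e2d3c4b5a6978", "0f1e2d3c4b5a697a");
```

Matching on distance rather than exact equality is what lets the service catch re-encoded, cropped, or filtered copies of a known image, at the cost of a small false-positive risk that grows with the threshold.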
+95
biome.json
+ {
+   "$schema": "https://biomejs.dev/schemas/1.9.4/schema.json",
+   "organizeImports": {
+     "enabled": true
+   },
+   "linter": {
+     "enabled": true,
+     "rules": {
+       "recommended": true,
+       "complexity": {
+         "noExtraBooleanCast": "error",
+         "noMultipleSpacesInRegularExpressionLiterals": "error",
+         "noUselessCatch": "error",
+         "noUselessTypeConstraint": "error",
+         "noWith": "error"
+       },
+       "correctness": {
+         "noConstAssign": "error",
+         "noConstantCondition": "error",
+         "noEmptyCharacterClassInRegex": "error",
+         "noEmptyPattern": "error",
+         "noGlobalObjectCalls": "error",
+         "noInvalidConstructorSuper": "error",
+         "noInvalidNewBuiltin": "error",
+         "noNonoctalDecimalEscape": "error",
+         "noPrecisionLoss": "error",
+         "noSelfAssign": "error",
+         "noSetterReturn": "error",
+         "noSwitchDeclarations": "error",
+         "noUndeclaredVariables": "error",
+         "noUnreachable": "error",
+         "noUnreachableSuper": "error",
+         "noUnsafeFinally": "error",
+         "noUnsafeOptionalChaining": "error",
+         "noUnusedLabels": "error",
+         "noUnusedVariables": "error",
+         "useIsNan": "error",
+         "useValidForDirection": "error",
+         "useYield": "error"
+       },
+       "suspicious": {
+         "noAsyncPromiseExecutor": "error",
+         "noCatchAssign": "error",
+         "noClassAssign": "error",
+         "noCompareNegZero": "error",
+         "noControlCharactersInRegex": "error",
+         "noDebugger": "error",
+         "noDoubleEquals": "error",
+         "noDuplicateCase": "error",
+         "noDuplicateClassMembers": "error",
+         "noDuplicateObjectKeys": "error",
+         "noDuplicateParameters": "error",
+         "noEmptyBlockStatements": "error",
+         "noExplicitAny": "warn",
+         "noExtraNonNullAssertion": "error",
+         "noFallthroughSwitchClause": "error",
+         "noFunctionAssign": "error",
+         "noGlobalAssign": "error",
+         "noImportAssign": "error",
+         "noMisleadingCharacterClass": "error",
+         "noPrototypeBuiltins": "error",
+         "noRedeclare": "error",
+         "noShadowRestrictedNames": "error",
+         "noUnsafeNegation": "error",
+         "useGetterReturn": "error",
+         "useValidTypeof": "error"
+       },
+       "style": {
+         "noArguments": "error",
+         "noVar": "error",
+         "useConst": "error"
+       }
+     }
+   },
+   "formatter": {
+     "enabled": true,
+     "formatWithErrors": false,
+     "indentStyle": "space",
+     "indentWidth": 2,
+     "lineEnding": "lf",
+     "lineWidth": 100
+   },
+   "javascript": {
+     "formatter": {
+       "jsxQuoteStyle": "double",
+       "quoteProperties": "asNeeded",
+       "trailingCommas": "es5",
+       "semicolons": "always",
+       "arrowParentheses": "always",
+       "bracketSpacing": true,
+       "bracketSameLine": false,
+       "quoteStyle": "double"
+     }
+   }
+ }
+270
bun.lock
··· 1 + { 2 + "lockfileVersion": 1, 3 + "workspaces": { 4 + "": { 5 + "name": "skywatch-phash", 6 + "dependencies": { 7 + "@atproto/api": "^0.13.24", 8 + "@skyware/jetstream": "^0.2.2", 9 + "ioredis": "^5.4.1", 10 + "p-ratelimit": "^1.0.1", 11 + "pino": "^9.5.0", 12 + "pino-pretty": "^11.2.2", 13 + "prom-client": "^15.1.3", 14 + "sharp": "^0.33.5", 15 + "undici": "^7.16.0", 16 + }, 17 + "devDependencies": { 18 + "@biomejs/biome": "^1.9.4", 19 + "@types/bun": "^1.1.13", 20 + "@types/node": "^22.10.2", 21 + "typescript": "^5.7.2", 22 + }, 23 + }, 24 + }, 25 + "packages": { 26 + "@atcute/atproto": ["@atcute/atproto@3.1.8", "", { "dependencies": { "@atcute/lexicons": "^1.2.2" } }, "sha512-Miu+S7RSgAYbmQWtHJKfSFUN5Kliqoo4YH0rILPmBtfmlZieORJgXNj9oO/Uive0/ulWkiRse07ATIcK8JxMnw=="], 27 + 28 + "@atcute/bluesky": ["@atcute/bluesky@3.2.8", "", { "dependencies": { "@atcute/atproto": "^3.1.8", "@atcute/lexicons": "^1.2.2" } }, "sha512-wxEnSOvX7nLH4sVzX9YFCkaNEWIDrTv3pTs6/x4NgJ3AJ3XJio0OYPM8tR7wAgsklY6BHvlAgt3yoCDK0cl1CA=="], 29 + 30 + "@atcute/lexicons": ["@atcute/lexicons@1.2.2", "", { "dependencies": { "@standard-schema/spec": "^1.0.0", "esm-env": "^1.2.2" } }, "sha512-bgEhJq5Z70/0TbK5sx+tAkrR8FsCODNiL2gUEvS5PuJfPxmFmRYNWaMGehxSPaXWpU2+Oa9ckceHiYbrItDTkA=="], 31 + 32 + "@atproto/api": ["@atproto/api@0.13.35", "", { "dependencies": { "@atproto/common-web": "^0.4.0", "@atproto/lexicon": "^0.4.6", "@atproto/syntax": "^0.3.2", "@atproto/xrpc": "^0.6.8", "await-lock": "^2.2.2", "multiformats": "^9.9.0", "tlds": "^1.234.0", "zod": "^3.23.8" } }, "sha512-vsEfBj0C333TLjDppvTdTE0IdKlXuljKSveAeI4PPx/l6eUKNnDTsYxvILtXUVzwUlTDmSRqy5O4Ryh78n1b7g=="], 33 + 34 + "@atproto/common-web": ["@atproto/common-web@0.4.3", "", { "dependencies": { "graphemer": "^1.4.0", "multiformats": "^9.9.0", "uint8arrays": "3.0.0", "zod": "^3.23.8" } }, "sha512-nRDINmSe4VycJzPo6fP/hEltBcULFxt9Kw7fQk6405FyAWZiTluYHlXOnU7GkQfeUK44OENG1qFTBcmCJ7e8pg=="], 35 + 36 + "@atproto/lexicon": ["@atproto/lexicon@0.4.14", "", 
{ "dependencies": { "@atproto/common-web": "^0.4.2", "@atproto/syntax": "^0.4.0", "iso-datestring-validator": "^2.2.2", "multiformats": "^9.9.0", "zod": "^3.23.8" } }, "sha512-jiKpmH1QER3Gvc7JVY5brwrfo+etFoe57tKPQX/SmPwjvUsFnJAow5xLIryuBaJgFAhnTZViXKs41t//pahGHQ=="], 37 + 38 + "@atproto/syntax": ["@atproto/syntax@0.3.4", "", {}, "sha512-8CNmi5DipOLaVeSMPggMe7FCksVag0aO6XZy9WflbduTKM4dFZVCs4686UeMLfGRXX+X966XgwECHoLYrovMMg=="], 39 + 40 + "@atproto/xrpc": ["@atproto/xrpc@0.6.12", "", { "dependencies": { "@atproto/lexicon": "^0.4.10", "zod": "^3.23.8" } }, "sha512-Ut3iISNLujlmY9Gu8sNU+SPDJDvqlVzWddU8qUr0Yae5oD4SguaUFjjhireMGhQ3M5E0KljQgDbTmnBo1kIZ3w=="], 41 + 42 + "@biomejs/biome": ["@biomejs/biome@1.9.4", "", { "optionalDependencies": { "@biomejs/cli-darwin-arm64": "1.9.4", "@biomejs/cli-darwin-x64": "1.9.4", "@biomejs/cli-linux-arm64": "1.9.4", "@biomejs/cli-linux-arm64-musl": "1.9.4", "@biomejs/cli-linux-x64": "1.9.4", "@biomejs/cli-linux-x64-musl": "1.9.4", "@biomejs/cli-win32-arm64": "1.9.4", "@biomejs/cli-win32-x64": "1.9.4" }, "bin": { "biome": "bin/biome" } }, "sha512-1rkd7G70+o9KkTn5KLmDYXihGoTaIGO9PIIN2ZB7UJxFrWw04CZHPYiMRjYsaDvVV7hP1dYNRLxSANLaBFGpog=="], 43 + 44 + "@biomejs/cli-darwin-arm64": ["@biomejs/cli-darwin-arm64@1.9.4", "", { "os": "darwin", "cpu": "arm64" }, "sha512-bFBsPWrNvkdKrNCYeAp+xo2HecOGPAy9WyNyB/jKnnedgzl4W4Hb9ZMzYNbf8dMCGmUdSavlYHiR01QaYR58cw=="], 45 + 46 + "@biomejs/cli-darwin-x64": ["@biomejs/cli-darwin-x64@1.9.4", "", { "os": "darwin", "cpu": "x64" }, "sha512-ngYBh/+bEedqkSevPVhLP4QfVPCpb+4BBe2p7Xs32dBgs7rh9nY2AIYUL6BgLw1JVXV8GlpKmb/hNiuIxfPfZg=="], 47 + 48 + "@biomejs/cli-linux-arm64": ["@biomejs/cli-linux-arm64@1.9.4", "", { "os": "linux", "cpu": "arm64" }, "sha512-fJIW0+LYujdjUgJJuwesP4EjIBl/N/TcOX3IvIHJQNsAqvV2CHIogsmA94BPG6jZATS4Hi+xv4SkBBQSt1N4/g=="], 49 + 50 + "@biomejs/cli-linux-arm64-musl": ["@biomejs/cli-linux-arm64-musl@1.9.4", "", { "os": "linux", "cpu": "arm64" }, 
"sha512-v665Ct9WCRjGa8+kTr0CzApU0+XXtRgwmzIf1SeKSGAv+2scAlW6JR5PMFo6FzqqZ64Po79cKODKf3/AAmECqA=="], 51 + 52 + "@biomejs/cli-linux-x64": ["@biomejs/cli-linux-x64@1.9.4", "", { "os": "linux", "cpu": "x64" }, "sha512-lRCJv/Vi3Vlwmbd6K+oQ0KhLHMAysN8lXoCI7XeHlxaajk06u7G+UsFSO01NAs5iYuWKmVZjmiOzJ0OJmGsMwg=="], 53 + 54 + "@biomejs/cli-linux-x64-musl": ["@biomejs/cli-linux-x64-musl@1.9.4", "", { "os": "linux", "cpu": "x64" }, "sha512-gEhi/jSBhZ2m6wjV530Yy8+fNqG8PAinM3oV7CyO+6c3CEh16Eizm21uHVsyVBEB6RIM8JHIl6AGYCv6Q6Q9Tg=="], 55 + 56 + "@biomejs/cli-win32-arm64": ["@biomejs/cli-win32-arm64@1.9.4", "", { "os": "win32", "cpu": "arm64" }, "sha512-tlbhLk+WXZmgwoIKwHIHEBZUwxml7bRJgk0X2sPyNR3S93cdRq6XulAZRQJ17FYGGzWne0fgrXBKpl7l4M87Hg=="], 57 + 58 + "@biomejs/cli-win32-x64": ["@biomejs/cli-win32-x64@1.9.4", "", { "os": "win32", "cpu": "x64" }, "sha512-8Y5wMhVIPaWe6jw2H+KlEm4wP/f7EW3810ZLmDlrEEy5KvBsb9ECEfu/kMWD484ijfQ8+nIi0giMgu9g1UAuuA=="], 59 + 60 + "@emnapi/runtime": ["@emnapi/runtime@1.6.0", "", { "dependencies": { "tslib": "^2.4.0" } }, "sha512-obtUmAHTMjll499P+D9A3axeJFlhdjOWdKUNs/U6QIGT7V5RjcUW1xToAzjvmgTSQhDbYn/NwfTRoJcQ2rNBxA=="], 61 + 62 + "@img/sharp-darwin-arm64": ["@img/sharp-darwin-arm64@0.33.5", "", { "optionalDependencies": { "@img/sharp-libvips-darwin-arm64": "1.0.4" }, "os": "darwin", "cpu": "arm64" }, "sha512-UT4p+iz/2H4twwAoLCqfA9UH5pI6DggwKEGuaPy7nCVQ8ZsiY5PIcrRvD1DzuY3qYL07NtIQcWnBSY/heikIFQ=="], 63 + 64 + "@img/sharp-darwin-x64": ["@img/sharp-darwin-x64@0.33.5", "", { "optionalDependencies": { "@img/sharp-libvips-darwin-x64": "1.0.4" }, "os": "darwin", "cpu": "x64" }, "sha512-fyHac4jIc1ANYGRDxtiqelIbdWkIuQaI84Mv45KvGRRxSAa7o7d1ZKAOBaYbnepLC1WqxfpimdeWfvqqSGwR2Q=="], 65 + 66 + "@img/sharp-libvips-darwin-arm64": ["@img/sharp-libvips-darwin-arm64@1.0.4", "", { "os": "darwin", "cpu": "arm64" }, "sha512-XblONe153h0O2zuFfTAbQYAX2JhYmDHeWikp1LM9Hul9gVPjFY427k6dFEcOL72O01QxQsWi761svJ/ev9xEDg=="], 67 + 68 + "@img/sharp-libvips-darwin-x64": 
["@img/sharp-libvips-darwin-x64@1.0.4", "", { "os": "darwin", "cpu": "x64" }, "sha512-xnGR8YuZYfJGmWPvmlunFaWJsb9T/AO2ykoP3Fz/0X5XV2aoYBPkX6xqCQvUTKKiLddarLaxpzNe+b1hjeWHAQ=="], 69 + 70 + "@img/sharp-libvips-linux-arm": ["@img/sharp-libvips-linux-arm@1.0.5", "", { "os": "linux", "cpu": "arm" }, "sha512-gvcC4ACAOPRNATg/ov8/MnbxFDJqf/pDePbBnuBDcjsI8PssmjoKMAz4LtLaVi+OnSb5FK/yIOamqDwGmXW32g=="], 71 + 72 + "@img/sharp-libvips-linux-arm64": ["@img/sharp-libvips-linux-arm64@1.0.4", "", { "os": "linux", "cpu": "arm64" }, "sha512-9B+taZ8DlyyqzZQnoeIvDVR/2F4EbMepXMc/NdVbkzsJbzkUjhXv/70GQJ7tdLA4YJgNP25zukcxpX2/SueNrA=="], 73 + 74 + "@img/sharp-libvips-linux-s390x": ["@img/sharp-libvips-linux-s390x@1.0.4", "", { "os": "linux", "cpu": "s390x" }, "sha512-u7Wz6ntiSSgGSGcjZ55im6uvTrOxSIS8/dgoVMoiGE9I6JAfU50yH5BoDlYA1tcuGS7g/QNtetJnxA6QEsCVTA=="], 75 + 76 + "@img/sharp-libvips-linux-x64": ["@img/sharp-libvips-linux-x64@1.0.4", "", { "os": "linux", "cpu": "x64" }, "sha512-MmWmQ3iPFZr0Iev+BAgVMb3ZyC4KeFc3jFxnNbEPas60e1cIfevbtuyf9nDGIzOaW9PdnDciJm+wFFaTlj5xYw=="], 77 + 78 + "@img/sharp-libvips-linuxmusl-arm64": ["@img/sharp-libvips-linuxmusl-arm64@1.0.4", "", { "os": "linux", "cpu": "arm64" }, "sha512-9Ti+BbTYDcsbp4wfYib8Ctm1ilkugkA/uscUn6UXK1ldpC1JjiXbLfFZtRlBhjPZ5o1NCLiDbg8fhUPKStHoTA=="], 79 + 80 + "@img/sharp-libvips-linuxmusl-x64": ["@img/sharp-libvips-linuxmusl-x64@1.0.4", "", { "os": "linux", "cpu": "x64" }, "sha512-viYN1KX9m+/hGkJtvYYp+CCLgnJXwiQB39damAO7WMdKWlIhmYTfHjwSbQeUK/20vY154mwezd9HflVFM1wVSw=="], 81 + 82 + "@img/sharp-linux-arm": ["@img/sharp-linux-arm@0.33.5", "", { "optionalDependencies": { "@img/sharp-libvips-linux-arm": "1.0.5" }, "os": "linux", "cpu": "arm" }, "sha512-JTS1eldqZbJxjvKaAkxhZmBqPRGmxgu+qFKSInv8moZ2AmT5Yib3EQ1c6gp493HvrvV8QgdOXdyaIBrhvFhBMQ=="], 83 + 84 + "@img/sharp-linux-arm64": ["@img/sharp-linux-arm64@0.33.5", "", { "optionalDependencies": { "@img/sharp-libvips-linux-arm64": "1.0.4" }, "os": "linux", "cpu": "arm64" }, 
"sha512-JMVv+AMRyGOHtO1RFBiJy/MBsgz0x4AWrT6QoEVVTyh1E39TrCUpTRI7mx9VksGX4awWASxqCYLCV4wBZHAYxA=="], 85 + 86 + "@img/sharp-linux-s390x": ["@img/sharp-linux-s390x@0.33.5", "", { "optionalDependencies": { "@img/sharp-libvips-linux-s390x": "1.0.4" }, "os": "linux", "cpu": "s390x" }, "sha512-y/5PCd+mP4CA/sPDKl2961b+C9d+vPAveS33s6Z3zfASk2j5upL6fXVPZi7ztePZ5CuH+1kW8JtvxgbuXHRa4Q=="], 87 + 88 + "@img/sharp-linux-x64": ["@img/sharp-linux-x64@0.33.5", "", { "optionalDependencies": { "@img/sharp-libvips-linux-x64": "1.0.4" }, "os": "linux", "cpu": "x64" }, "sha512-opC+Ok5pRNAzuvq1AG0ar+1owsu842/Ab+4qvU879ippJBHvyY5n2mxF1izXqkPYlGuP/M556uh53jRLJmzTWA=="], 89 + 90 + "@img/sharp-linuxmusl-arm64": ["@img/sharp-linuxmusl-arm64@0.33.5", "", { "optionalDependencies": { "@img/sharp-libvips-linuxmusl-arm64": "1.0.4" }, "os": "linux", "cpu": "arm64" }, "sha512-XrHMZwGQGvJg2V/oRSUfSAfjfPxO+4DkiRh6p2AFjLQztWUuY/o8Mq0eMQVIY7HJ1CDQUJlxGGZRw1a5bqmd1g=="], 91 + 92 + "@img/sharp-linuxmusl-x64": ["@img/sharp-linuxmusl-x64@0.33.5", "", { "optionalDependencies": { "@img/sharp-libvips-linuxmusl-x64": "1.0.4" }, "os": "linux", "cpu": "x64" }, "sha512-WT+d/cgqKkkKySYmqoZ8y3pxx7lx9vVejxW/W4DOFMYVSkErR+w7mf2u8m/y4+xHe7yY9DAXQMWQhpnMuFfScw=="], 93 + 94 + "@img/sharp-wasm32": ["@img/sharp-wasm32@0.33.5", "", { "dependencies": { "@emnapi/runtime": "^1.2.0" }, "cpu": "none" }, "sha512-ykUW4LVGaMcU9lu9thv85CbRMAwfeadCJHRsg2GmeRa/cJxsVY9Rbd57JcMxBkKHag5U/x7TSBpScF4U8ElVzg=="], 95 + 96 + "@img/sharp-win32-ia32": ["@img/sharp-win32-ia32@0.33.5", "", { "os": "win32", "cpu": "ia32" }, "sha512-T36PblLaTwuVJ/zw/LaH0PdZkRz5rd3SmMHX8GSmR7vtNSP5Z6bQkExdSK7xGWyxLw4sUknBuugTelgw2faBbQ=="], 97 + 98 + "@img/sharp-win32-x64": ["@img/sharp-win32-x64@0.33.5", "", { "os": "win32", "cpu": "x64" }, "sha512-MpY/o8/8kj+EcnxwvrP4aTJSWw/aZ7JIGR4aBeZkZw5B7/Jn+tY9/VNwtcoGmdT7GfggGIU4kygOMSbYnOrAbg=="], 99 + 100 + "@ioredis/commands": ["@ioredis/commands@1.4.0", "", {}, 
"sha512-aFT2yemJJo+TZCmieA7qnYGQooOS7QfNmYrzGtsYd3g9j5iDP8AimYYAesf79ohjbLG12XxC4nG5DyEnC88AsQ=="], 101 + 102 + "@opentelemetry/api": ["@opentelemetry/api@1.9.0", "", {}, "sha512-3giAOQvZiH5F9bMlMiv8+GSPMeqg0dbaeo58/0SlA9sxSqZhnUtxzX9/2FzyhS9sWQf5S0GJE0AKBrFqjpeYcg=="], 103 + 104 + "@pinojs/redact": ["@pinojs/redact@0.4.0", "", {}, "sha512-k2ENnmBugE/rzQfEcdWHcCY+/FM3VLzH9cYEsbdsoqrvzAKRhUZeRNhAZvB8OitQJ1TBed3yqWtdjzS6wJKBwg=="], 105 + 106 + "@skyware/jetstream": ["@skyware/jetstream@0.2.5", "", { "dependencies": { "@atcute/atproto": "^3.1.0", "@atcute/bluesky": "^3.1.4", "@atcute/lexicons": "^1.1.0", "partysocket": "^1.1.3", "tiny-emitter": "^2.1.0" } }, "sha512-fM/zs03DLwqRyzZZJFWN20e76KrdqIp97Tlm8Cek+vxn96+tu5d/fx79V6H85L0QN6HvGiX2l9A8hWFqHvYlOA=="], 107 + 108 + "@standard-schema/spec": ["@standard-schema/spec@1.0.0", "", {}, "sha512-m2bOd0f2RT9k8QJx1JN85cZYyH1RqFBdlwtkSlf4tBDYLCiiZnv1fIIwacK6cqwXavOydf0NPToMQgpKq+dVlA=="], 109 + 110 + "@types/bun": ["@types/bun@1.3.0", "", { "dependencies": { "bun-types": "1.3.0" } }, "sha512-+lAGCYjXjip2qY375xX/scJeVRmZ5cY0wyHYyCYxNcdEXrQ4AOe3gACgd4iQ8ksOslJtW4VNxBJ8llUwc3a6AA=="], 111 + 112 + "@types/node": ["@types/node@22.18.12", "", { "dependencies": { "undici-types": "~6.21.0" } }, "sha512-BICHQ67iqxQGFSzfCFTT7MRQ5XcBjG5aeKh5Ok38UBbPe5fxTyE+aHFxwVrGyr8GNlqFMLKD1D3P2K/1ks8tog=="], 113 + 114 + "@types/react": ["@types/react@19.2.2", "", { "dependencies": { "csstype": "^3.0.2" } }, "sha512-6mDvHUFSjyT2B2yeNx2nUgMxh9LtOWvkhIU3uePn2I2oyNymUAX1NIsdgviM4CH+JSrp2D2hsMvJOkxY+0wNRA=="], 115 + 116 + "abort-controller": ["abort-controller@3.0.0", "", { "dependencies": { "event-target-shim": "^5.0.0" } }, "sha512-h8lQ8tacZYnR3vNQTgibj+tODHI5/+l06Au2Pcriv/Gmet0eaj4TwWH41sO9wnHDiQsEj19q0drzdWdeAHtweg=="], 117 + 118 + "atomic-sleep": ["atomic-sleep@1.0.0", "", {}, "sha512-kNOjDqAh7px0XWNI+4QbzoiR/nTkHAWNud2uvnJquD1/x5a7EQZMJT0AczqK0Qn67oY/TTQ1LbUKajZpp3I9tQ=="], 119 + 120 + "await-lock": ["await-lock@2.2.2", "", {}, 
"sha512-aDczADvlvTGajTDjcjpJMqRkOF6Qdz3YbPZm/PyW6tKPkx2hlYBzxMhEywM/tU72HrVZjgl5VCdRuMlA7pZ8Gw=="], 121 + 122 + "base64-js": ["base64-js@1.5.1", "", {}, "sha512-AKpaYlHn8t4SVbOHCy+b5+KKgvR4vrsD8vbvrbiQJps7fKDTkjkDry6ji0rUJjC0kzbNePLwzxq8iypo41qeWA=="], 123 + 124 + "bintrees": ["bintrees@1.0.2", "", {}, "sha512-VOMgTMwjAaUG580SXn3LacVgjurrbMme7ZZNYGSSV7mmtY6QQRh0Eg3pwIcntQ77DErK1L0NxkbetjcoXzVwKw=="], 125 + 126 + "buffer": ["buffer@6.0.3", "", { "dependencies": { "base64-js": "^1.3.1", "ieee754": "^1.2.1" } }, "sha512-FTiCpNxtwiZZHEZbcbTIcZjERVICn9yq/pDFkTl95/AxzD1naBctN7YO68riM/gLSDY7sdrMby8hofADYuuqOA=="], 127 + 128 + "bun-types": ["bun-types@1.3.0", "", { "dependencies": { "@types/node": "*" }, "peerDependencies": { "@types/react": "^19" } }, "sha512-u8X0thhx+yJ0KmkxuEo9HAtdfgCBaM/aI9K90VQcQioAmkVp3SG3FkwWGibUFz3WdXAdcsqOcbU40lK7tbHdkQ=="], 129 + 130 + "cluster-key-slot": ["cluster-key-slot@1.1.2", "", {}, "sha512-RMr0FhtfXemyinomL4hrWcYJxmX6deFdCxpJzhDttxgO1+bcCnkk+9drydLVDmAMG7NE6aN/fl4F7ucU/90gAA=="], 131 + 132 + "color": ["color@4.2.3", "", { "dependencies": { "color-convert": "^2.0.1", "color-string": "^1.9.0" } }, "sha512-1rXeuUUiGGrykh+CeBdu5Ie7OJwinCgQY0bc7GCRxy5xVHy+moaqkpL/jqQq0MtQOeYcrqEz4abc5f0KtU7W4A=="], 133 + 134 + "color-convert": ["color-convert@2.0.1", "", { "dependencies": { "color-name": "~1.1.4" } }, "sha512-RRECPsj7iu/xb5oKYcsFHSppFNnsj/52OVTRKb4zP5onXwVF3zVmmToNcOfGC+CRDpfK/U584fMg38ZHCaElKQ=="], 135 + 136 + "color-name": ["color-name@1.1.4", "", {}, "sha512-dOy+3AuW3a2wNbZHIuMZpTcgjGuLU/uBL/ubcZF9OXbDo8ff4O8yVp5Bf0efS8uEoYo5q4Fx7dY9OgQGXgAsQA=="], 137 + 138 + "color-string": ["color-string@1.9.1", "", { "dependencies": { "color-name": "^1.0.0", "simple-swizzle": "^0.2.2" } }, "sha512-shrVawQFojnZv6xM40anx4CkoDP+fZsw/ZerEMsW/pyzsRbElpsL/DBVW7q3ExxwusdNXI3lXpuhEZkzs8p5Eg=="], 139 + 140 + "colorette": ["colorette@2.0.20", "", {}, "sha512-IfEDxwoWIjkeXL1eXcDiow4UbKjhLdq6/EuSVR9GMN7KVH3r9gQ83e73hsz1Nd1T3ijd5xv1wcWRYO+D6kCI2w=="], 141 + 142 + 
"csstype": ["csstype@3.1.3", "", {}, "sha512-M1uQkMl8rQK/szD0LNhtqxIPLpimGm8sOBwU7lLnCpSbTyY3yeU1Vc7l4KT5zT4s/yOxHH5O7tIuuLOCnLADRw=="], 143 + 144 + "dateformat": ["dateformat@4.6.3", "", {}, "sha512-2P0p0pFGzHS5EMnhdxQi7aJN+iMheud0UhG4dlE1DLAlvL8JHjJJTX/CSm4JXwV0Ka5nGk3zC5mcb5bUQUxxMA=="], 145 + 146 + "debug": ["debug@4.4.3", "", { "dependencies": { "ms": "^2.1.3" } }, "sha512-RGwwWnwQvkVfavKVt22FGLw+xYSdzARwm0ru6DhTVA3umU5hZc28V3kO4stgYryrTlLpuvgI9GiijltAjNbcqA=="], 147 + 148 + "denque": ["denque@2.1.0", "", {}, "sha512-HVQE3AAb/pxF8fQAoiqpvg9i3evqug3hoiwakOyZAwJm+6vZehbkYXZ0l4JxS+I3QxM97v5aaRNhj8v5oBhekw=="], 149 + 150 + "detect-libc": ["detect-libc@2.1.2", "", {}, "sha512-Btj2BOOO83o3WyH59e8MgXsxEQVcarkUOpEYrubB0urwnN10yQ364rsiByU11nZlqWYZm05i/of7io4mzihBtQ=="], 151 + 152 + "end-of-stream": ["end-of-stream@1.4.5", "", { "dependencies": { "once": "^1.4.0" } }, "sha512-ooEGc6HP26xXq/N+GCGOT0JKCLDGrq2bQUZrQ7gyrJiZANJ/8YDTxTpQBXGMn+WbIQXNVpyWymm7KYVICQnyOg=="], 153 + 154 + "esm-env": ["esm-env@1.2.2", "", {}, "sha512-Epxrv+Nr/CaL4ZcFGPJIYLWFom+YeV1DqMLHJoEd9SYRxNbaFruBwfEX/kkHUJf55j2+TUbmDcmuilbP1TmXHA=="], 155 + 156 + "event-target-polyfill": ["event-target-polyfill@0.0.4", "", {}, "sha512-Gs6RLjzlLRdT8X9ZipJdIZI/Y6/HhRLyq9RdDlCsnpxr/+Nn6bU2EFGuC94GjxqhM+Nmij2Vcq98yoHrU8uNFQ=="], 157 + 158 + "event-target-shim": ["event-target-shim@5.0.1", "", {}, "sha512-i/2XbnSz/uxRCU6+NdVJgKWDTM427+MqYbkQzD321DuCQJUqOuJKIA0IM2+W2xtYHdKOmZ4dR6fExsd4SXL+WQ=="], 159 + 160 + "events": ["events@3.3.0", "", {}, "sha512-mQw+2fkQbALzQ7V0MY0IqdnXNOeTtP4r0lN9z7AAawCXgqea7bDii20AYrIBrFd/Hx0M2Ocz6S111CaFkUcb0Q=="], 161 + 162 + "fast-copy": ["fast-copy@3.0.2", "", {}, "sha512-dl0O9Vhju8IrcLndv2eU4ldt1ftXMqqfgN4H1cpmGV7P6jeB9FwpN9a2c8DPGE1Ys88rNUJVYDHq73CGAGOPfQ=="], 163 + 164 + "fast-safe-stringify": ["fast-safe-stringify@2.1.1", "", {}, "sha512-W+KJc2dmILlPplD/H4K9l9LcAHAfPtP6BY84uVLXQ6Evcz9Lcg33Y2z1IVblT6xdY54PXYVHEv+0Wpq8Io6zkA=="], 165 + 166 + "graphemer": ["graphemer@1.4.0", "", {}, 
"sha512-EtKwoO6kxCL9WO5xipiHTZlSzBm7WLT627TqC/uVRd0HKmq8NXyebnNYxDoBi7wt8eTWrUrKXCOVaFq9x1kgag=="], 167 + 168 + "help-me": ["help-me@5.0.0", "", {}, "sha512-7xgomUX6ADmcYzFik0HzAxh/73YlKR9bmFzf51CZwR+b6YtzU2m0u49hQCqV6SvlqIqsaxovfwdvbnsw3b/zpg=="], 169 + 170 + "ieee754": ["ieee754@1.2.1", "", {}, "sha512-dcyqhDvX1C46lXZcVqCpK+FtMRQVdIMN6/Df5js2zouUsqG7I6sFxitIC+7KYK29KdXOLHdu9zL4sFnoVQnqaA=="], 171 + 172 + "ioredis": ["ioredis@5.8.2", "", { "dependencies": { "@ioredis/commands": "1.4.0", "cluster-key-slot": "^1.1.0", "debug": "^4.3.4", "denque": "^2.1.0", "lodash.defaults": "^4.2.0", "lodash.isarguments": "^3.1.0", "redis-errors": "^1.2.0", "redis-parser": "^3.0.0", "standard-as-callback": "^2.1.0" } }, "sha512-C6uC+kleiIMmjViJINWk80sOQw5lEzse1ZmvD+S/s8p8CWapftSaC+kocGTx6xrbrJ4WmYQGC08ffHLr6ToR6Q=="], 173 + 174 + "is-arrayish": ["is-arrayish@0.3.4", "", {}, "sha512-m6UrgzFVUYawGBh1dUsWR5M2Clqic9RVXC/9f8ceNlv2IcO9j9J/z8UoCLPqtsPBFNzEpfR3xftohbfqDx8EQA=="], 175 + 176 + "iso-datestring-validator": ["iso-datestring-validator@2.2.2", "", {}, "sha512-yLEMkBbLZTlVQqOnQ4FiMujR6T4DEcCb1xizmvXS+OxuhwcbtynoosRzdMA69zZCShCNAbi+gJ71FxZBBXx1SA=="], 177 + 178 + "joycon": ["joycon@3.1.1", "", {}, "sha512-34wB/Y7MW7bzjKRjUKTa46I2Z7eV62Rkhva+KkopW7Qvv/OSWBqvkSY7vusOPrNuZcUG3tApvdVgNB8POj3SPw=="], 179 + 180 + "lodash.defaults": ["lodash.defaults@4.2.0", "", {}, "sha512-qjxPLHd3r5DnsdGacqOMU6pb/avJzdh9tFX2ymgoZE27BmjXrNy/y4LoaiTeAb+O3gL8AfpJGtqfX/ae2leYYQ=="], 181 + 182 + "lodash.isarguments": ["lodash.isarguments@3.1.0", "", {}, "sha512-chi4NHZlZqZD18a0imDHnZPrDeBbTtVN7GXMwuGdRH9qotxAjYs3aVLKc7zNOG9eddR5Ksd8rvFEBc9SsggPpg=="], 183 + 184 + "minimist": ["minimist@1.2.8", "", {}, "sha512-2yyAR8qBkN3YuheJanUpWC5U3bb5osDywNB8RzDVlDwDHbocAJveqqj1u8+SVD7jkWT4yvsHCpWqqWqAxb0zCA=="], 185 + 186 + "ms": ["ms@2.1.3", "", {}, "sha512-6FlzubTLZG3J2a/NVCAleEhjzq5oxgHyaCU9yYXvcLsvoVaHJq/s5xXI6/XXP6tz7R9xAOtHnSO/tXtF3WRTlA=="], 187 + 188 + "multiformats": ["multiformats@9.9.0", "", {}, 
"sha512-HoMUjhH9T8DDBNT+6xzkrd9ga/XiBI4xLr58LJACwK6G3HTOPeMz4nB4KJs33L2BelrIJa7P0VuNaVF3hMYfjg=="], 189 + 190 + "on-exit-leak-free": ["on-exit-leak-free@2.1.2", "", {}, "sha512-0eJJY6hXLGf1udHwfNftBqH+g73EU4B504nZeKpz1sYRKafAghwxEJunB2O7rDZkL4PGfsMVnTXZ2EjibbqcsA=="], 191 + 192 + "once": ["once@1.4.0", "", { "dependencies": { "wrappy": "1" } }, "sha512-lNaJgI+2Q5URQBkccEKHTQOPaXdUxnZZElQTZY0MFUAuaEqe1E+Nyvgdz/aIyNi6Z9MzO5dv1H8n58/GELp3+w=="], 193 + 194 + "p-ratelimit": ["p-ratelimit@1.0.1", "", {}, "sha512-tKBGoow6aWRH68K2eQx+qc1gSegjd5VLirZYc1Yms9pPFsYQ9TFI6aMn0vJH2vmvzjNpjlWZOFft4aPUen2w0A=="], 195 + 196 + "partysocket": ["partysocket@1.1.6", "", { "dependencies": { "event-target-polyfill": "^0.0.4" } }, "sha512-LkEk8N9hMDDsDT0iDK0zuwUDFVrVMUXFXCeN3850Ng8wtjPqPBeJlwdeY6ROlJSEh3tPoTTasXoSBYH76y118w=="], 197 + 198 + "pino": ["pino@9.14.0", "", { "dependencies": { "@pinojs/redact": "^0.4.0", "atomic-sleep": "^1.0.0", "on-exit-leak-free": "^2.1.0", "pino-abstract-transport": "^2.0.0", "pino-std-serializers": "^7.0.0", "process-warning": "^5.0.0", "quick-format-unescaped": "^4.0.3", "real-require": "^0.2.0", "safe-stable-stringify": "^2.3.1", "sonic-boom": "^4.0.1", "thread-stream": "^3.0.0" }, "bin": { "pino": "bin.js" } }, "sha512-8OEwKp5juEvb/MjpIc4hjqfgCNysrS94RIOMXYvpYCdm/jglrKEiAYmiumbmGhCvs+IcInsphYDFwqrjr7398w=="], 199 + 200 + "pino-abstract-transport": ["pino-abstract-transport@2.0.0", "", { "dependencies": { "split2": "^4.0.0" } }, "sha512-F63x5tizV6WCh4R6RHyi2Ml+M70DNRXt/+HANowMflpgGFMAym/VKm6G7ZOQRjqN7XbGxK1Lg9t6ZrtzOaivMw=="], 201 + 202 + "pino-pretty": ["pino-pretty@11.3.0", "", { "dependencies": { "colorette": "^2.0.7", "dateformat": "^4.6.3", "fast-copy": "^3.0.2", "fast-safe-stringify": "^2.1.1", "help-me": "^5.0.0", "joycon": "^3.1.1", "minimist": "^1.2.6", "on-exit-leak-free": "^2.1.0", "pino-abstract-transport": "^2.0.0", "pump": "^3.0.0", "readable-stream": "^4.0.0", "secure-json-parse": "^2.4.0", "sonic-boom": "^4.0.1", "strip-json-comments": 
"^3.1.1" }, "bin": { "pino-pretty": "bin.js" } }, "sha512-oXwn7ICywaZPHmu3epHGU2oJX4nPmKvHvB/bwrJHlGcbEWaVcotkpyVHMKLKmiVryWYByNp0jpgAcXpFJDXJzA=="], 203 + 204 + "pino-std-serializers": ["pino-std-serializers@7.0.0", "", {}, "sha512-e906FRY0+tV27iq4juKzSYPbUj2do2X2JX4EzSca1631EB2QJQUqGbDuERal7LCtOpxl6x3+nvo9NPZcmjkiFA=="], 205 + 206 + "process": ["process@0.11.10", "", {}, "sha512-cdGef/drWFoydD1JsMzuFf8100nZl+GT+yacc2bEced5f9Rjk4z+WtFUTBu9PhOi9j/jfmBPu0mMEY4wIdAF8A=="], 207 + 208 + "process-warning": ["process-warning@5.0.0", "", {}, "sha512-a39t9ApHNx2L4+HBnQKqxxHNs1r7KF+Intd8Q/g1bUh6q0WIp9voPXJ/x0j+ZL45KF1pJd9+q2jLIRMfvEshkA=="], 209 + 210 + "prom-client": ["prom-client@15.1.3", "", { "dependencies": { "@opentelemetry/api": "^1.4.0", "tdigest": "^0.1.1" } }, "sha512-6ZiOBfCywsD4k1BN9IX0uZhF+tJkV8q8llP64G5Hajs4JOeVLPCwpPVcpXy3BwYiUGgyJzsJJQeOIv7+hDSq8g=="], 211 + 212 + "pump": ["pump@3.0.3", "", { "dependencies": { "end-of-stream": "^1.1.0", "once": "^1.3.1" } }, "sha512-todwxLMY7/heScKmntwQG8CXVkWUOdYxIvY2s0VWAAMh/nd8SoYiRaKjlr7+iCs984f2P8zvrfWcDDYVb73NfA=="], 213 + 214 + "quick-format-unescaped": ["quick-format-unescaped@4.0.4", "", {}, "sha512-tYC1Q1hgyRuHgloV/YXs2w15unPVh8qfu/qCTfhTYamaw7fyhumKa2yGpdSo87vY32rIclj+4fWYQXUMs9EHvg=="], 215 + 216 + "readable-stream": ["readable-stream@4.7.0", "", { "dependencies": { "abort-controller": "^3.0.0", "buffer": "^6.0.3", "events": "^3.3.0", "process": "^0.11.10", "string_decoder": "^1.3.0" } }, "sha512-oIGGmcpTLwPga8Bn6/Z75SVaH1z5dUut2ibSyAMVhmUggWpmDn2dapB0n7f8nwaSiRtepAsfJyfXIO5DCVAODg=="], 217 + 218 + "real-require": ["real-require@0.2.0", "", {}, "sha512-57frrGM/OCTLqLOAh0mhVA9VBMHd+9U7Zb2THMGdBUoZVOtGbJzjxsYGDJ3A9AYYCP4hn6y1TVbaOfzWtm5GFg=="], 219 + 220 + "redis-errors": ["redis-errors@1.2.0", "", {}, "sha512-1qny3OExCf0UvUV/5wpYKf2YwPcOqXzkwKKSmKHiE6ZMQs5heeE/c8eXK+PNllPvmjgAbfnsbpkGZWy8cBpn9w=="], 221 + 222 + "redis-parser": ["redis-parser@3.0.0", "", { "dependencies": { "redis-errors": "^1.0.0" } }, 
"sha512-DJnGAeenTdpMEH6uAJRK/uiyEIH9WVsUmoLwzudwGJUwZPp80PDBWPHXSAGNPwNvIXAbe7MSUB1zQFugFml66A=="], 223 + 224 + "safe-buffer": ["safe-buffer@5.2.1", "", {}, "sha512-rp3So07KcdmmKbGvgaNxQSJr7bGVSVk5S9Eq1F+ppbRo70+YeaDxkw5Dd8NPN+GD6bjnYm2VuPuCXmpuYvmCXQ=="], 225 + 226 + "safe-stable-stringify": ["safe-stable-stringify@2.5.0", "", {}, "sha512-b3rppTKm9T+PsVCBEOUR46GWI7fdOs00VKZ1+9c1EWDaDMvjQc6tUwuFyIprgGgTcWoVHSKrU8H31ZHA2e0RHA=="], 227 + 228 + "secure-json-parse": ["secure-json-parse@2.7.0", "", {}, "sha512-6aU+Rwsezw7VR8/nyvKTx8QpWH9FrcYiXXlqC4z5d5XQBDRqtbfsRjnwGyqbi3gddNtWHuEk9OANUotL26qKUw=="], 229 + 230 + "semver": ["semver@7.7.3", "", { "bin": { "semver": "bin/semver.js" } }, "sha512-SdsKMrI9TdgjdweUSR9MweHA4EJ8YxHn8DFaDisvhVlUOe4BF1tLD7GAj0lIqWVl+dPb/rExr0Btby5loQm20Q=="], 231 + 232 + "sharp": ["sharp@0.33.5", "", { "dependencies": { "color": "^4.2.3", "detect-libc": "^2.0.3", "semver": "^7.6.3" }, "optionalDependencies": { "@img/sharp-darwin-arm64": "0.33.5", "@img/sharp-darwin-x64": "0.33.5", "@img/sharp-libvips-darwin-arm64": "1.0.4", "@img/sharp-libvips-darwin-x64": "1.0.4", "@img/sharp-libvips-linux-arm": "1.0.5", "@img/sharp-libvips-linux-arm64": "1.0.4", "@img/sharp-libvips-linux-s390x": "1.0.4", "@img/sharp-libvips-linux-x64": "1.0.4", "@img/sharp-libvips-linuxmusl-arm64": "1.0.4", "@img/sharp-libvips-linuxmusl-x64": "1.0.4", "@img/sharp-linux-arm": "0.33.5", "@img/sharp-linux-arm64": "0.33.5", "@img/sharp-linux-s390x": "0.33.5", "@img/sharp-linux-x64": "0.33.5", "@img/sharp-linuxmusl-arm64": "0.33.5", "@img/sharp-linuxmusl-x64": "0.33.5", "@img/sharp-wasm32": "0.33.5", "@img/sharp-win32-ia32": "0.33.5", "@img/sharp-win32-x64": "0.33.5" } }, "sha512-haPVm1EkS9pgvHrQ/F3Xy+hgcuMV0Wm9vfIBSiwZ05k+xgb0PkBQpGsAA/oWdDobNaZTH5ppvHtzCFbnSEwHVw=="], 233 + 234 + "simple-swizzle": ["simple-swizzle@0.2.4", "", { "dependencies": { "is-arrayish": "^0.3.1" } }, "sha512-nAu1WFPQSMNr2Zn9PGSZK9AGn4t/y97lEm+MXTtUDwfP0ksAIX4nO+6ruD9Jwut4C49SB1Ws+fbXsm/yScWOHw=="], 235 + 236 
+ "sonic-boom": ["sonic-boom@4.2.0", "", { "dependencies": { "atomic-sleep": "^1.0.0" } }, "sha512-INb7TM37/mAcsGmc9hyyI6+QR3rR1zVRu36B0NeGXKnOOLiZOfER5SA+N7X7k3yUYRzLWafduTDvJAfDswwEww=="], 237 + 238 + "split2": ["split2@4.2.0", "", {}, "sha512-UcjcJOWknrNkF6PLX83qcHM6KHgVKNkV62Y8a5uYDVv9ydGQVwAHMKqHdJje1VTWpljG0WYpCDhrCdAOYH4TWg=="], 239 + 240 + "standard-as-callback": ["standard-as-callback@2.1.0", "", {}, "sha512-qoRRSyROncaz1z0mvYqIE4lCd9p2R90i6GxW3uZv5ucSu8tU7B5HXUP1gG8pVZsYNVaXjk8ClXHPttLyxAL48A=="], 241 + 242 + "string_decoder": ["string_decoder@1.3.0", "", { "dependencies": { "safe-buffer": "~5.2.0" } }, "sha512-hkRX8U1WjJFd8LsDJ2yQ/wWWxaopEsABU1XfkM8A+j0+85JAGppt16cr1Whg6KIbb4okU6Mql6BOj+uup/wKeA=="], 243 + 244 + "strip-json-comments": ["strip-json-comments@3.1.1", "", {}, "sha512-6fPc+R4ihwqP6N/aIv2f1gMH8lOVtWQHoqC4yK6oSDVVocumAsfCqjkXnqiYMhmMwS/mEHLp7Vehlt3ql6lEig=="], 245 + 246 + "tdigest": ["tdigest@0.1.2", "", { "dependencies": { "bintrees": "1.0.2" } }, "sha512-+G0LLgjjo9BZX2MfdvPfH+MKLCrxlXSYec5DaPYP1fe6Iyhf0/fSmJ0bFiZ1F8BT6cGXl2LpltQptzjXKWEkKA=="], 247 + 248 + "thread-stream": ["thread-stream@3.1.0", "", { "dependencies": { "real-require": "^0.2.0" } }, "sha512-OqyPZ9u96VohAyMfJykzmivOrY2wfMSf3C5TtFJVgN+Hm6aj+voFhlK+kZEIv2FBh1X6Xp3DlnCOfEQ3B2J86A=="], 249 + 250 + "tiny-emitter": ["tiny-emitter@2.1.0", "", {}, "sha512-NB6Dk1A9xgQPMoGqC5CVXn123gWyte215ONT5Pp5a0yt4nlEoO1ZWeCwpncaekPHXO60i47ihFnZPiRPjRMq4Q=="], 251 + 252 + "tlds": ["tlds@1.261.0", "", { "bin": { "tlds": "bin.js" } }, "sha512-QXqwfEl9ddlGBaRFXIvNKK6OhipSiLXuRuLJX5DErz0o0Q0rYxulWLdFryTkV5PkdZct5iMInwYEGe/eR++1AA=="], 253 + 254 + "tslib": ["tslib@2.8.1", "", {}, "sha512-oJFu94HQb+KVduSUQL7wnpmqnfmLsOA/nAh6b6EH0wCEoK0/mPeXU6c3wKDV83MkOuHPRHtSXKKU99IBazS/2w=="], 255 + 256 + "typescript": ["typescript@5.9.3", "", { "bin": { "tsc": "bin/tsc", "tsserver": "bin/tsserver" } }, "sha512-jl1vZzPDinLr9eUt3J/t7V6FgNEw9QjvBPdysz9KfQDD41fQrC2Y4vKQdiaUpFT4bXlb1RHhLpp8wtm6M5TgSw=="], 257 + 258 + 
"uint8arrays": ["uint8arrays@3.0.0", "", { "dependencies": { "multiformats": "^9.4.2" } }, "sha512-HRCx0q6O9Bfbp+HHSfQQKD7wU70+lydKVt4EghkdOvlK/NlrF90z+eXV34mUd48rNvVJXwkrMSPpCATkct8fJA=="], 259 + 260 + "undici": ["undici@7.16.0", "", {}, "sha512-QEg3HPMll0o3t2ourKwOeUAZ159Kn9mx5pnzHRQO8+Wixmh88YdZRiIwat0iNzNNXn0yoEtXJqFpyW7eM8BV7g=="], 261 + 262 + "undici-types": ["undici-types@6.21.0", "", {}, "sha512-iwDZqg0QAGrg9Rav5H4n0M64c3mkR59cJ6wQp+7C4nI0gsmExaedaYLNO44eT4AtBBwjbTiGPMlt2Md0T9H9JQ=="], 263 + 264 + "wrappy": ["wrappy@1.0.2", "", {}, "sha512-l4Sp/DRseor9wL6EvV2+TuQn63dMkPjZ/sp9XkghTEbV9KlPS1xUsZ3u7/IQO4wxtcFB4bgpQPRcR3QCvezPcQ=="], 265 + 266 + "zod": ["zod@3.25.76", "", {}, "sha512-gzUt/qt81nXsFGKIFcC3YnfEAx5NkunCfnDlvuBSSFS02bcXu4Lmea0AFIUwbLWxWPx3d9p8S5QoaujKcNQxcQ=="], 267 + 268 + "@atproto/lexicon/@atproto/syntax": ["@atproto/syntax@0.4.1", "", {}, "sha512-CJdImtLAiFO+0z3BWTtxwk6aY5w4t8orHTMVJgkf++QRJWTxPbIFko/0hrkADB7n2EruDxDSeAgfUGehpH6ngw=="], 269 + } 270 + }
+41
docker-compose.yml
··· 1 + services: 2 + phash: 3 + build: . 4 + container_name: skywatch-phash 5 + volumes: 6 + - ./cursor.txt:/app/cursor.txt 7 + env_file: 8 + - .env 9 + environment: 10 + # Override Redis URL for internal Docker networking 11 + REDIS_URL: redis://redis:6379 12 + NODE_ENV: production 13 + depends_on: 14 + redis: 15 + condition: service_healthy 16 + restart: unless-stopped 17 + networks: 18 + - phash-network 19 + 20 + redis: 21 + image: redis:7-alpine 22 + container_name: skywatch-phash-redis 23 + command: redis-server --appendonly yes 24 + volumes: 25 + - redis-data:/data 26 + healthcheck: 27 + test: ["CMD", "redis-cli", "ping"] 28 + interval: 5s 29 + timeout: 3s 30 + retries: 5 31 + restart: unless-stopped 32 + networks: 33 + - phash-network 34 + 35 + volumes: 36 + redis-data: 37 + driver: local 38 + 39 + networks: 40 + phash-network: 41 + driver: bridge
+34
package.json
··· 1 + { 2 + "name": "skywatch-phash", 3 + "version": "0.1.0", 4 + "type": "module", 5 + "description": "Perceptual hash-based image moderation for Bluesky", 6 + "scripts": { 7 + "dev": "bun --watch src/main.ts", 8 + "start": "bun src/main.ts", 9 + "test": "bun test", 10 + "test:watch": "bun test --watch", 11 + "lint": "biome check .", 12 + "lint:fix": "biome check --write .", 13 + "format": "biome format --write .", 14 + "typecheck": "tsc --noEmit", 15 + "phash": "bun scripts/compute-phash.ts" 16 + }, 17 + "dependencies": { 18 + "@atproto/api": "^0.13.24", 19 + "@skyware/jetstream": "^0.2.2", 20 + "ioredis": "^5.4.1", 21 + "p-ratelimit": "^1.0.1", 22 + "pino": "^9.5.0", 23 + "pino-pretty": "^11.2.2", 24 + "prom-client": "^15.1.3", 25 + "sharp": "^0.33.5", 26 + "undici": "^7.16.0" 27 + }, 28 + "devDependencies": { 29 + "@biomejs/biome": "^1.9.4", 30 + "@types/bun": "^1.1.13", 31 + "@types/node": "^22.10.2", 32 + "typescript": "^5.7.2" 33 + } 34 + }
+27
rules/blobs.ts
··· 1 + import type { BlobCheck } from "../src/types"; 2 + 3 + export const BLOB_CHECKS: BlobCheck[] = [ 4 + { 5 + phashes: ["e0e0e0e0e0fcfefe"], 6 + label: "troll", 7 + comment: "Image is used in harassment campaign targeting Will Stancil", 8 + reportAcct: false, 9 + labelAcct: true, 10 + reportPost: true, 11 + toLabel: true, 12 + hammingThreshold: 5, 13 + description: "Sample harassment image variants (placeholder hashes)", 14 + ignoreDID: ["did:plc:7umvpuxe2vbrc3zrzuquzniu"], 15 + }, 16 + { 17 + phashes: ["fffdf1c199d9fd00", "00ffbf3e1b5b0a00"], 18 + label: "maga-trump", 19 + comment: "Pro-Trump imagery", 20 + reportAcct: true, 21 + labelAcct: false, 22 + reportPost: false, 23 + toLabel: true, 24 + hammingThreshold: 5, 25 + description: "Sample harassment image variants (placeholder hashes)", 26 + }, 27 + ];
+25
scripts/compute-phash.ts
··· 1 + #!/usr/bin/env bun 2 + import { readFileSync } from "node:fs"; 3 + import { computePerceptualHash } from "../src/hasher/phash"; 4 + 5 + async function main() { 6 + const imagePath = process.argv[2]; 7 + 8 + if (!imagePath) { 9 + console.error("Usage: bun scripts/compute-phash.ts <path-to-image>"); 10 + process.exit(1); 11 + } 12 + 13 + try { 14 + const imageBuffer = readFileSync(imagePath); 15 + const phash = await computePerceptualHash(imageBuffer); 16 + 17 + console.log(`Image: ${imagePath}`); 18 + console.log(`Phash: ${phash}`); 19 + } catch (error) { 20 + console.error("Error computing phash:", error); 21 + process.exit(1); 22 + } 23 + } 24 + 25 + main();
+142
src/agent.ts
··· 1 + import { AtpAgent } from "@atproto/api"; 2 + import { Agent, setGlobalDispatcher } from "undici"; 3 + import { config } from "./config/index.js"; 4 + import { logger } from "./logger/index.js"; 5 + import { type SessionData, loadSession, saveSession } from "./session.js"; 6 + 7 + setGlobalDispatcher( 8 + new Agent({ 9 + connect: { timeout: 20_000 }, 10 + keepAliveTimeout: 10_000, 11 + keepAliveMaxTimeout: 20_000, 12 + }) 13 + ); 14 + 15 + export const agent = new AtpAgent({ 16 + service: config.ozone.pds, 17 + }); 18 + 19 + const JWT_LIFETIME_MS = 2 * 60 * 60 * 1000; // 2 hours (typical ATP JWT lifetime) 20 + const REFRESH_AT_PERCENT = 0.8; // Refresh at 80% of lifetime 21 + let refreshTimer: NodeJS.Timeout | null = null; 22 + 23 + async function refreshSession(): Promise<void> { 24 + try { 25 + logger.info("Refreshing session tokens"); 26 + if (!agent.session) { 27 + throw new Error("No active session to refresh"); 28 + } 29 + await agent.resumeSession(agent.session); 30 + 31 + saveSession(agent.session as SessionData); 32 + scheduleSessionRefresh(); 33 + } catch (error: unknown) { 34 + logger.error({ error }, "Failed to refresh session, will re-authenticate"); 35 + await performLogin(); 36 + } 37 + } 38 + 39 + function scheduleSessionRefresh(): void { 40 + if (refreshTimer) { 41 + clearTimeout(refreshTimer); 42 + } 43 + 44 + const refreshIn = JWT_LIFETIME_MS * REFRESH_AT_PERCENT; 45 + logger.debug(`Scheduling session refresh in ${(refreshIn / 1000 / 60).toFixed(1)} minutes`); 46 + 47 + refreshTimer = setTimeout(() => { 48 + refreshSession().catch((error: unknown) => { 49 + logger.error({ error }, "Scheduled session refresh failed"); 50 + }); 51 + }, refreshIn); 52 + } 53 + 54 + async function performLogin(): Promise<boolean> { 55 + try { 56 + logger.info("Performing fresh login"); 57 + const response = await agent.login({ 58 + identifier: config.labeler.handle, 59 + password: config.labeler.password, 60 + }); 61 + 62 + if (response.success && 
agent.session) { 63 + saveSession(agent.session as SessionData); 64 + scheduleSessionRefresh(); 65 + logger.info("Login successful, session saved"); 66 + return true; 67 + } 68 + 69 + logger.error("Login failed: no session returned"); 70 + return false; 71 + } catch (error) { 72 + logger.error({ error }, "Login failed"); 73 + return false; 74 + } 75 + } 76 + 77 + const MAX_LOGIN_RETRIES = 3; 78 + const RETRY_DELAY_MS = 2000; 79 + 80 + let loginPromise: Promise<void> | null = null; 81 + 82 + async function sleep(ms: number): Promise<void> { 83 + return new Promise((resolve) => setTimeout(resolve, ms)); 84 + } 85 + 86 + async function authenticate(): Promise<boolean> { 87 + const savedSession = loadSession(); 88 + 89 + if (savedSession) { 90 + try { 91 + logger.info("Attempting to resume saved session"); 92 + await agent.resumeSession(savedSession); 93 + 94 + // Verify session is still valid with a lightweight call 95 + await agent.getProfile({ actor: savedSession.did }); 96 + 97 + logger.info("Session resumed successfully"); 98 + scheduleSessionRefresh(); 99 + return true; 100 + } catch (error) { 101 + logger.warn({ error }, "Saved session invalid, will re-authenticate"); 102 + } 103 + } 104 + 105 + return performLogin(); 106 + } 107 + 108 + async function authenticateWithRetry(): Promise<void> { 109 + // Reuse existing login attempt if one is in progress 110 + if (loginPromise) { 111 + return loginPromise; 112 + } 113 + 114 + loginPromise = (async () => { 115 + for (let attempt = 1; attempt <= MAX_LOGIN_RETRIES; attempt++) { 116 + logger.info({ attempt, maxRetries: MAX_LOGIN_RETRIES }, "Attempting login"); 117 + 118 + const success = await authenticate(); 119 + 120 + if (success) { 121 + logger.info("Authentication successful"); 122 + return; 123 + } 124 + 125 + if (attempt < MAX_LOGIN_RETRIES) { 126 + logger.warn( 127 + { attempt, maxRetries: MAX_LOGIN_RETRIES, retryInMs: RETRY_DELAY_MS }, 128 + "Login failed, retrying" 129 + ); 130 + await sleep(RETRY_DELAY_MS); 
131 + } 132 + } 133 + 134 + logger.error({ maxRetries: MAX_LOGIN_RETRIES }, "All login attempts failed, aborting"); 135 + process.exit(1); 136 + })(); 137 + 138 + return loginPromise; 139 + } 140 + 141 + export const login = authenticateWithRetry; 142 + export const isLoggedIn = authenticateWithRetry().then(() => true);
+47
src/cache/moderation-claims.ts
··· 1 + import type Redis from "ioredis"; 2 + import { logger } from "../logger/index.js"; 3 + 4 + const CLAIM_TTL = 7 * 24 * 60 * 60; // 7 days in seconds 5 + 6 + export class ModerationClaims { 7 + constructor(private redis: Redis) {} 8 + 9 + async tryClaimPostLabel(uri: string, label: string): Promise<boolean> { 10 + const key = `claim:post:label:${uri}:${label}`; 11 + const result = await this.redis.set(key, "1", "EX", CLAIM_TTL, "NX"); 12 + 13 + if (result === "OK") { 14 + logger.debug({ uri, label }, "Post label claim acquired"); 15 + return true; 16 + } 17 + 18 + logger.debug({ uri, label }, "Post label already claimed"); 19 + return false; 20 + } 21 + 22 + async tryClaimAccountLabel(did: string, label: string): Promise<boolean> { 23 + const key = `claim:account:label:${did}:${label}`; 24 + const result = await this.redis.set(key, "1", "EX", CLAIM_TTL, "NX"); 25 + 26 + if (result === "OK") { 27 + logger.debug({ did, label }, "Account label claim acquired"); 28 + return true; 29 + } 30 + 31 + logger.debug({ did, label }, "Account label already claimed"); 32 + return false; 33 + } 34 + 35 + async tryClaimAccountComment(did: string, atURI: string): Promise<boolean> { 36 + const key = `claim:account:comment:${did}:${atURI}`; 37 + const result = await this.redis.set(key, "1", "EX", CLAIM_TTL, "NX"); 38 + 39 + if (result === "OK") { 40 + logger.debug({ did, atURI }, "Account comment claim acquired"); 41 + return true; 42 + } 43 + 44 + logger.debug({ did, atURI }, "Account comment already claimed"); 45 + return false; 46 + } 47 + }
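The claim methods above all rely on Redis `SET key value EX ttl NX`, which returns `"OK"` only for the first caller; every later caller within the TTL gets `null`. An in-memory analogue of that semantics, purely for illustration (the real deduplication must live in Redis so it is shared across processes and survives restarts):

```typescript
// In-memory analogue of the Redis "SET ... EX ttl NX" claim pattern used in
// ModerationClaims. Illustrative sketch only, not part of the repo.
class MemoryClaims {
  private claims = new Map<string, number>(); // key -> expiry (ms epoch)

  /** Returns true only for the first caller within the TTL window. */
  tryClaim(key: string, ttlSeconds: number, now = Date.now()): boolean {
    const expiry = this.claims.get(key);
    if (expiry !== undefined && expiry > now) {
      return false; // key already held: the NX condition fails
    }
    this.claims.set(key, now + ttlSeconds * 1000);
    return true;
  }
}
```

This is why a crashed-and-restarted worker does not double-label a post: the second attempt's `SET ... NX` fails against the claim written before the crash.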
+51
src/cache/phash-cache.ts
··· 1 + import type Redis from "ioredis"; 2 + import { logger } from "../logger/index"; 3 + 4 + export class PhashCache { 5 + private redis: Redis; 6 + private ttl: number; 7 + private readonly PHASH_PREFIX = "phash:cache:"; 8 + 9 + constructor(redis: Redis, ttlSeconds: number) { 10 + this.redis = redis; 11 + this.ttl = ttlSeconds; 12 + } 13 + 14 + async get(cid: string): Promise<string | null> { 15 + const key = this.PHASH_PREFIX + cid; 16 + const cached = await this.redis.get(key); 17 + 18 + if (cached) { 19 + logger.debug({ cid }, "Phash cache hit"); 20 + } 21 + 22 + return cached; 23 + } 24 + 25 + async set(cid: string, phash: string): Promise<void> { 26 + const key = this.PHASH_PREFIX + cid; 27 + await this.redis.set(key, phash, "EX", this.ttl); 28 + logger.debug({ cid, phash }, "Phash cached"); 29 + } 30 + 31 + async delete(cid: string): Promise<void> { 32 + const key = this.PHASH_PREFIX + cid; 33 + await this.redis.del(key); 34 + } 35 + 36 + async clear(): Promise<void> { 37 + const pattern = `${this.PHASH_PREFIX}*`; 38 + const keys = await this.redis.keys(pattern); 39 + 40 + if (keys.length > 0) { 41 + await this.redis.del(...keys); 42 + logger.info({ count: keys.length }, "Cleared phash cache"); 43 + } 44 + } 45 + 46 + async getStats(): Promise<{ size: number }> { 47 + const pattern = `${this.PHASH_PREFIX}*`; 48 + const keys = await this.redis.keys(pattern); 49 + return { size: keys.length }; 50 + } 51 + }
+102
src/config/index.ts
··· 1 + export interface Config { 2 + jetstream: { 3 + url: string; 4 + wantedCollections: string[]; 5 + }; 6 + redis: { 7 + url: string; 8 + }; 9 + processing: { 10 + concurrency: number; 11 + retryAttempts: number; 12 + retryDelay: number; 13 + }; 14 + cache: { 15 + enabled: boolean; 16 + ttl: number; 17 + }; 18 + pds: { 19 + endpoint: string; 20 + }; 21 + plc: { 22 + endpoint: string; 23 + }; 24 + labeler: { 25 + did: string; 26 + handle: string; 27 + password: string; 28 + }; 29 + ozone: { 30 + url: string; 31 + pds: string; 32 + }; 33 + moderation: { 34 + modDid: string; 35 + rateLimit: number; 36 + }; 37 + } 38 + 39 + function getEnv(key: string, defaultValue?: string): string { 40 + const value = process.env[key]; 41 + if (!value) { 42 + if (defaultValue === undefined) { 43 + throw new Error(`Missing required environment variable: ${key}`); 44 + } 45 + return defaultValue; 46 + } 47 + return value; 48 + } 49 + 50 + function getEnvNumber(key: string, defaultValue: number): number { 51 + const value = process.env[key]; 52 + if (!value) return defaultValue; 53 + const parsed = Number.parseInt(value, 10); 54 + if (Number.isNaN(parsed)) { 55 + throw new Error(`Invalid number for environment variable ${key}: ${value}`); 56 + } 57 + return parsed; 58 + } 59 + 60 + function getEnvBoolean(key: string, defaultValue: boolean): boolean { 61 + const value = process.env[key]; 62 + if (!value) return defaultValue; 63 + return value.toLowerCase() === "true" || value === "1"; 64 + } 65 + 66 + export const config: Config = { 67 + jetstream: { 68 + url: getEnv("JETSTREAM_URL", "wss://jetstream.atproto.tools/subscribe"), 69 + wantedCollections: ["app.bsky.feed.post"], 70 + }, 71 + redis: { 72 + url: getEnv("REDIS_URL", "redis://localhost:6379"), 73 + }, 74 + processing: { 75 + concurrency: getEnvNumber("PROCESSING_CONCURRENCY", 10), 76 + retryAttempts: getEnvNumber("RETRY_ATTEMPTS", 3), 77 + retryDelay: getEnvNumber("RETRY_DELAY_MS", 1000), 78 + }, 79 + cache: { 80 + enabled: 
getEnvBoolean("CACHE_ENABLED", true), 81 + ttl: getEnvNumber("CACHE_TTL_SECONDS", 86400), 82 + }, 83 + pds: { 84 + endpoint: getEnv("PDS_ENDPOINT", "https://bsky.social"), 85 + }, 86 + plc: { 87 + endpoint: getEnv("PLC_ENDPOINT", "https://plc.directory"), 88 + }, 89 + labeler: { 90 + did: getEnv("LABELER_DID"), 91 + handle: getEnv("LABELER_HANDLE"), 92 + password: getEnv("LABELER_PASSWORD"), 93 + }, 94 + ozone: { 95 + url: getEnv("OZONE_URL"), 96 + pds: getEnv("OZONE_PDS"), 97 + }, 98 + moderation: { 99 + modDid: getEnv("MOD_DID"), 100 + rateLimit: getEnvNumber("RATE_LIMIT_MS", 100), 101 + }, 102 + };
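One subtlety of `getEnvNumber` above: `Number.parseInt` truncates trailing garbage, so `RETRY_ATTEMPTS=10abc` silently parses as `10`. A stricter variant (a hypothetical alternative, not in the repo) rejects such values outright:

```typescript
// Stricter numeric env parsing than Number.parseInt: "10abc" is rejected
// instead of being truncated to 10. Hypothetical sketch; the env record is a
// parameter so the behavior is testable without mutating process.env.
function getEnvNumberStrict(
  key: string,
  defaultValue: number,
  env: Record<string, string | undefined> = process.env
): number {
  const value = env[key];
  if (!value) return defaultValue;
  const parsed = Number(value); // Number("10abc") is NaN, unlike parseInt
  if (!Number.isInteger(parsed)) {
    throw new Error(`Invalid integer for environment variable ${key}: ${value}`);
  }
  return parsed;
}
```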
+22
src/hasher/phash.ts
··· 1 + import sharp from "sharp"; 2 + 3 + export async function computePerceptualHash(buffer: Buffer): Promise<string> { 4 + const image = sharp(buffer); 5 + const metadata = await image.metadata(); 6 + 7 + if (!metadata.width || !metadata.height) { 8 + throw new Error("Invalid image metadata"); 9 + } 10 + 11 + const resized = await image.resize(8, 8, { fit: "fill" }).grayscale().raw().toBuffer(); 12 + 13 + const pixels = new Uint8Array(resized); 14 + const avg = pixels.reduce((sum, val) => sum + val, 0) / pixels.length; 15 + 16 + let hash = ""; 17 + for (let i = 0; i < pixels.length; i++) { 18 + hash += pixels[i] > avg ? "1" : "0"; 19 + } 20 + 21 + return BigInt(`0b${hash}`).toString(16).padStart(16, "0"); 22 + }
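Strictly speaking, `computePerceptualHash` implements an average hash (aHash): downscale to 8×8 grayscale, then set each bit by comparing the pixel to the mean, rather than a DCT-based pHash. Both yield 64-bit values comparable by Hamming distance. The thresholding step can be isolated from the Sharp dependency as a pure function:

```typescript
// The thresholding step of the average-hash above, isolated from Sharp:
// given 64 grayscale pixel values, each bit is 1 where the pixel exceeds the
// mean, and the 64-bit string is rendered as 16 hex characters.
function pixelsToHash(pixels: number[]): string {
  if (pixels.length !== 64) throw new Error("expected 8x8 = 64 pixels");
  const avg = pixels.reduce((sum, val) => sum + val, 0) / pixels.length;
  let bits = "";
  for (const p of pixels) {
    bits += p > avg ? "1" : "0";
  }
  return BigInt(`0b${bits}`).toString(16).padStart(16, "0");
}
```

Because every bit is relative to the image's own mean, the hash is stable under uniform brightness changes and rescaling, which is what makes near-duplicate matching by Hamming distance work.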
+120
src/limits.ts
··· 1 + import { pRateLimit } from "p-ratelimit"; 2 + import { Counter, Gauge, Histogram } from "prom-client"; 3 + import { logger } from "./logger/index.js"; 4 + 5 + interface RateLimitState { 6 + limit: number; 7 + remaining: number; 8 + reset: number; // Unix timestamp in seconds 9 + policy?: string; 10 + } 11 + 12 + // Conservative defaults based on previous static configuration 13 + // Will be replaced with dynamic values from ATP response headers 14 + let rateLimitState: RateLimitState = { 15 + limit: 280, 16 + remaining: 280, 17 + reset: Math.floor(Date.now() / 1000) + 30, 18 + }; 19 + 20 + const SAFETY_BUFFER = 5; // Keep this many requests in reserve (reduced from 20) 21 + const CONCURRENCY = 24; // Reduced from 48 to prevent rapid depletion 22 + 23 + // Metrics 24 + const rateLimitWaitsTotal = new Counter({ 25 + name: "rate_limit_waits_total", 26 + help: "Total number of times rate limit wait was triggered", 27 + }); 28 + 29 + const rateLimitWaitDuration = new Histogram({ 30 + name: "rate_limit_wait_duration_seconds", 31 + help: "Duration of rate limit waits in seconds", 32 + buckets: [0.1, 0.5, 1, 5, 10, 30, 60], 33 + }); 34 + 35 + const rateLimitRemaining = new Gauge({ 36 + name: "rate_limit_remaining", 37 + help: "Current remaining rate limit", 38 + }); 39 + 40 + const rateLimitTotal = new Gauge({ 41 + name: "rate_limit_total", 42 + help: "Total rate limit from headers", 43 + }); 44 + 45 + const concurrentRequestsGauge = new Gauge({ 46 + name: "concurrent_requests", 47 + help: "Current number of concurrent requests", 48 + }); 49 + 50 + // Use p-ratelimit purely for concurrency management 51 + const concurrencyLimiter = pRateLimit({ 52 + interval: 1000, 53 + rate: 10000, // Very high rate, we manage rate limiting separately 54 + concurrency: CONCURRENCY, 55 + maxDelay: 0, 56 + }); 57 + 58 + export function getRateLimitState(): RateLimitState { 59 + return { ...rateLimitState }; 60 + } 61 + 62 + export function updateRateLimitState(state: 
Partial<RateLimitState>): void { 63 + rateLimitState = { ...rateLimitState, ...state }; 64 + 65 + // Update Prometheus metrics 66 + if (state.remaining !== undefined) { 67 + rateLimitRemaining.set(state.remaining); 68 + } 69 + if (state.limit !== undefined) { 70 + rateLimitTotal.set(state.limit); 71 + } 72 + 73 + logger.debug( 74 + { 75 + limit: rateLimitState.limit, 76 + remaining: rateLimitState.remaining, 77 + resetIn: rateLimitState.reset - Math.floor(Date.now() / 1000), 78 + }, 79 + "Rate limit state updated" 80 + ); 81 + } 82 + 83 + async function awaitRateLimit(): Promise<void> { 84 + const state = getRateLimitState(); 85 + const now = Math.floor(Date.now() / 1000); 86 + 87 + // Only wait if we're critically low 88 + if (state.remaining <= SAFETY_BUFFER) { 89 + rateLimitWaitsTotal.inc(); 90 + 91 + const delaySeconds = Math.max(0, state.reset - now); 92 + const delayMs = delaySeconds * 1000; 93 + 94 + if (delayMs > 0) { 95 + logger.warn( 96 + `Rate limit critical (${state.remaining.toString()}/${state.limit.toString()} remaining). Waiting ${delaySeconds.toString()}s until reset...` 97 + ); 98 + 99 + const waitStart = Date.now(); 100 + await new Promise((resolve) => setTimeout(resolve, delayMs)); 101 + const waitDuration = (Date.now() - waitStart) / 1000; 102 + rateLimitWaitDuration.observe(waitDuration); 103 + 104 + // Don't manually reset state - let the next API response update it 105 + logger.info("Rate limit wait complete, resuming requests"); 106 + } 107 + } 108 + } 109 + 110 + export async function limit<T>(fn: () => Promise<T>): Promise<T> { 111 + return concurrencyLimiter(async () => { 112 + concurrentRequestsGauge.inc(); 113 + try { 114 + await awaitRateLimit(); 115 + return await fn(); 116 + } finally { 117 + concurrentRequestsGauge.dec(); 118 + } 119 + }); 120 + }
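The wait decision inside `awaitRateLimit` reduces to one rule: sleep until the `reset` timestamp whenever `remaining` has dropped to the safety buffer. As a pure function (a sketch with hypothetical names, mirroring the module's logic):

```typescript
// Pure form of the wait decision in awaitRateLimit: how many milliseconds to
// sleep before the next request, or 0 when headroom remains or the reset
// time has already passed. resetUnixSeconds mirrors the header-driven state.
function rateLimitWaitMs(
  remaining: number,
  resetUnixSeconds: number,
  nowMs: number,
  safetyBuffer = 5
): number {
  if (remaining > safetyBuffer) return 0; // still have headroom
  const nowSeconds = Math.floor(nowMs / 1000);
  return Math.max(0, resetUnixSeconds - nowSeconds) * 1000;
}
```

Keeping the decision pure like this also explains why the module deliberately does not reset `remaining` after sleeping: the next API response's headers are the source of truth.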
+17
src/logger/index.ts
··· 1 + import pino from "pino"; 2 + 3 + const isDev = process.env.NODE_ENV !== "production"; 4 + 5 + export const logger = pino({ 6 + level: process.env.LOG_LEVEL || (isDev ? "debug" : "info"), 7 + transport: isDev 8 + ? { 9 + target: "pino-pretty", 10 + options: { 11 + colorize: true, 12 + translateTime: "HH:MM:ss Z", 13 + ignore: "pid,hostname", 14 + }, 15 + } 16 + : undefined, 17 + });
+335
src/main.ts
··· 1 + import { readFile, writeFile } from "node:fs/promises"; 2 + import { existsSync } from "node:fs"; 3 + import Redis from "ioredis"; 4 + 5 + // Monkey-patch WebSocket to work with @skyware/jetstream on Bun 6 + // The library tries to set binaryType to 'blob' but Bun doesn't support it 7 + const OriginalWebSocket = globalThis.WebSocket; 8 + if (OriginalWebSocket) { 9 + globalThis.WebSocket = class PatchedWebSocket extends OriginalWebSocket { 10 + constructor(...args: ConstructorParameters<typeof OriginalWebSocket>) { 11 + super(...args); 12 + // Silently ignore binaryType changes to 'blob' 13 + const descriptor = Object.getOwnPropertyDescriptor(this, "binaryType") || 14 + Object.getOwnPropertyDescriptor(OriginalWebSocket.prototype, "binaryType"); 15 + if (descriptor) { 16 + Object.defineProperty(this, "binaryType", { 17 + get: descriptor.get, 18 + set: (value: string) => { 19 + // Only set if it's arraybuffer, ignore 'blob' 20 + if (value === "arraybuffer" && descriptor.set) { 21 + descriptor.set.call(this, value); 22 + } 23 + }, 24 + enumerable: true, 25 + configurable: true, 26 + }); 27 + } 28 + } 29 + } as any; 30 + } 31 + 32 + import type { CommitCreateEvent } from "@skyware/jetstream"; 33 + import { Jetstream } from "@skyware/jetstream"; 34 + import { BLOB_CHECKS } from "../rules/blobs"; 35 + import { agent, isLoggedIn } from "./agent"; 36 + import { ModerationClaims } from "./cache/moderation-claims"; 37 + import { PhashCache } from "./cache/phash-cache"; 38 + import { config } from "./config/index"; 39 + import { logger } from "./logger/index"; 40 + import { MetricsCollector } from "./metrics/collector"; 41 + import { createAccountLabel, createAccountReport } from "./moderation/account"; 42 + import { createPostLabel, createPostReport } from "./moderation/post"; 43 + import { RedisQueue } from "./queue/redis-queue"; 44 + import { QueueWorker } from "./queue/worker"; 45 + import type { ImageJob } from "./types"; 46 + 47 + logger.info("Starting 
skywatch-phash service"); 48 + 49 + // Cursor persistence 50 + const CURSOR_FILE = "/app/cursor.txt"; 51 + 52 + async function loadCursor(): Promise<number | undefined> { 53 + try { 54 + if (existsSync(CURSOR_FILE)) { 55 + const content = await readFile(CURSOR_FILE, "utf-8"); 56 + const cursor = Number.parseInt(content.trim(), 10); 57 + if (!Number.isNaN(cursor) && cursor > 0) { 58 + logger.info({ cursor, cursorDate: new Date(cursor / 1000).toISOString() }, "Loaded cursor from file"); 59 + return cursor; 60 + } 61 + } 62 + } catch (error) { 63 + logger.warn({ error }, "Failed to load cursor from file"); 64 + } 65 + return undefined; 66 + } 67 + 68 + async function saveCursor(cursor: number): Promise<void> { 69 + try { 70 + await writeFile(CURSOR_FILE, cursor.toString(), "utf-8"); 71 + } catch (error) { 72 + logger.error({ error }, "Failed to save cursor to file"); 73 + } 74 + } 75 + 76 + // Load cursor before creating Jetstream instance 77 + const savedCursor = await loadCursor(); 78 + 79 + // Create Jetstream instance at module level 80 + const jetstream = new Jetstream({ 81 + endpoint: config.jetstream.url, 82 + wantedCollections: ["app.bsky.feed.post"], 83 + cursor: savedCursor, 84 + }); 85 + 86 + // Module-level variables for queue and worker 87 + // biome-ignore lint/style/useConst: These are assigned later via top-level await 88 + let redis: Redis; 89 + // biome-ignore lint/style/useConst: These are assigned later via top-level await 90 + let queue: RedisQueue; 91 + // biome-ignore lint/style/useConst: These are assigned later via top-level await 92 + let worker: QueueWorker; 93 + // biome-ignore lint/style/useConst: These are assigned later via top-level await 94 + let cache: PhashCache | undefined; 95 + // biome-ignore lint/style/useConst: These are assigned later via top-level await 96 + let claims: ModerationClaims; 97 + // biome-ignore lint/style/useConst: These are assigned later via top-level await 98 + let metrics: MetricsCollector; 99 + // 
biome-ignore lint/style/useConst: These are assigned later via top-level await 100 + let statsInterval: NodeJS.Timeout; 101 + // biome-ignore lint/style/useConst: These are assigned later via top-level await 102 + let cursorInterval: NodeJS.Timeout; 103 + 104 + // Register Jetstream event handlers at module level 105 + jetstream.on("open", () => { 106 + logger.info( 107 + { 108 + url: config.jetstream.url, 109 + cursor: jetstream.cursor, 110 + }, 111 + "Connected to Jetstream" 112 + ); 113 + }); 114 + 115 + jetstream.on("close", () => { 116 + logger.info("Jetstream connection closed"); 117 + }); 118 + 119 + jetstream.on("error", (error) => { 120 + logger.error({ error: error.message }, "Jetstream error"); 121 + }); 122 + 123 + // Register onCreate handler for posts 124 + jetstream.onCreate("app.bsky.feed.post", async (event: CommitCreateEvent<"app.bsky.feed.post">) => { 125 + try { 126 + const record = event.commit.record as Record<string, unknown>; 127 + const embed = record.embed; 128 + 129 + // Extract blob references 130 + const blobs: Array<{ cid: string; mimeType?: string }> = []; 131 + 132 + if (embed && typeof embed === "object") { 133 + const embedObj = embed as Record<string, unknown>; 134 + 135 + if (Array.isArray(embedObj.images)) { 136 + for (const img of embedObj.images) { 137 + if (typeof img === "object" && img !== null) { 138 + const image = img as Record<string, unknown>; 139 + if (image.image && typeof image.image === "object" && image.image !== null) { 140 + const imageObj = image.image as Record<string, unknown>; 141 + const ref = imageObj.ref as Record<string, unknown> | undefined; 142 + if (ref && typeof ref.$link === "string") { 143 + blobs.push({ 144 + cid: ref.$link, 145 + mimeType: typeof imageObj.mimeType === "string" ? 
imageObj.mimeType : undefined, 146 + }); 147 + } 148 + } 149 + } 150 + } 151 + } 152 + 153 + if (embedObj.media && typeof embedObj.media === "object" && embedObj.media !== null) { 154 + const media = embedObj.media as Record<string, unknown>; 155 + if (Array.isArray(media.images)) { 156 + for (const img of media.images) { 157 + if (typeof img === "object" && img !== null) { 158 + const image = img as Record<string, unknown>; 159 + if (image.image && typeof image.image === "object" && image.image !== null) { 160 + const imageObj = image.image as Record<string, unknown>; 161 + const ref = imageObj.ref as Record<string, unknown> | undefined; 162 + if (ref && typeof ref.$link === "string") { 163 + blobs.push({ 164 + cid: ref.$link, 165 + mimeType: typeof imageObj.mimeType === "string" ? imageObj.mimeType : undefined, 166 + }); 167 + } 168 + } 169 + } 170 + } 171 + } 172 + } 173 + } 174 + 175 + if (blobs.length === 0) { 176 + return; 177 + } 178 + 179 + const postUri = `at://${event.did}/${event.commit.collection}/${event.commit.rkey}`; 180 + 181 + logger.debug({ uri: postUri, blobCount: blobs.length }, "Post with blobs detected"); 182 + 183 + const job: ImageJob = { 184 + postUri, 185 + postCid: event.commit.cid, 186 + postDid: event.did, 187 + blobs, 188 + timestamp: Date.now(), 189 + attempts: 0, 190 + }; 191 + 192 + await queue.enqueue(job); 193 + } catch (error) { 194 + logger.error({ error, event }, "Error processing jetstream event"); 195 + } 196 + }); 197 + 198 + // Async setup 199 + logger.info("Authenticating labeler"); 200 + await isLoggedIn; 201 + logger.info("Authentication complete"); 202 + 203 + logger.info("Connecting to Redis"); 204 + redis = new Redis(config.redis.url); 205 + queue = new RedisQueue(config.redis.url); 206 + 207 + redis.on("connect", () => { 208 + logger.info("Redis connected"); 209 + }); 210 + 211 + redis.on("error", (error) => { 212 + logger.error({ error }, "Redis error"); 213 + }); 214 + 215 + cache = config.cache.enabled ? 
new PhashCache(redis, config.cache.ttl) : undefined; 216 + claims = new ModerationClaims(redis); 217 + metrics = new MetricsCollector(); 218 + 219 + if (cache) { 220 + logger.info({ ttl: config.cache.ttl }, "Phash caching enabled"); 221 + } 222 + 223 + worker = new QueueWorker( 224 + queue, 225 + BLOB_CHECKS, 226 + { 227 + concurrency: config.processing.concurrency, 228 + retryAttempts: config.processing.retryAttempts, 229 + retryDelay: config.processing.retryDelay, 230 + }, 231 + agent, 232 + cache, 233 + metrics 234 + ); 235 + 236 + worker.onMatchFound(async (postUri, postCid, postDid, match) => { 237 + const check = match.matchedCheck; 238 + 239 + logger.warn( 240 + { 241 + postUri, 242 + postDid, 243 + label: check.label, 244 + comment: check.comment, 245 + phash: match.phash, 246 + matchedPhash: match.matchedPhash, 247 + hammingDistance: match.hammingDistance, 248 + }, 249 + "Match found - executing moderation actions" 250 + ); 251 + 252 + try { 253 + if (check.toLabel) { 254 + await createPostLabel(postUri, postCid, check.label, check.comment, match.phash, claims, metrics); 255 + } 256 + 257 + if (check.reportPost) { 258 + await createPostReport(postUri, postCid, check.comment, match.phash, metrics); 259 + } 260 + 261 + if (check.labelAcct) { 262 + await createAccountLabel(postDid, check.label, check.comment, postUri, match.phash, claims, metrics); 263 + } 264 + 265 + if (check.reportAcct) { 266 + await createAccountReport(postDid, check.comment, postUri, match.phash, metrics); 267 + } 268 + } catch (error) { 269 + logger.error({ error, postUri, postDid }, "Failed to execute moderation actions"); 270 + } 271 + }); 272 + 273 + const shutdown = async () => { 274 + try { 275 + logger.info("Shutting down gracefully"); 276 + jetstream.close(); 277 + 278 + // Save cursor one last time before shutdown 279 + if (jetstream.cursor) { 280 + await saveCursor(jetstream.cursor); 281 + logger.info({ cursor: jetstream.cursor }, "Saved final cursor position"); 282 + } 283 + 
284 + await worker.stop(); 285 + await queue.disconnect(); 286 + await redis.quit(); 287 + clearInterval(statsInterval); 288 + clearInterval(cursorInterval); 289 + logger.info("Service stopped"); 290 + process.exit(0); 291 + } catch (error) { 292 + logger.error({ error }, "Error shutting down gracefully"); 293 + process.exit(1); 294 + } 295 + }; 296 + 297 + process.on("SIGTERM", () => void shutdown()); 298 + process.on("SIGINT", () => void shutdown()); 299 + 300 + logger.info("Starting queue worker"); 301 + await worker.start(); 302 + 303 + logger.info("Starting Jetstream subscription"); 304 + jetstream.start(); 305 + 306 + logger.info("Service started and ready"); 307 + 308 + // Save cursor position every 10 seconds 309 + cursorInterval = setInterval(async () => { 310 + if (jetstream.cursor) { 311 + const cursorDate = new Date(jetstream.cursor / 1000).toISOString(); 312 + logger.info( 313 + { 314 + cursor: jetstream.cursor, 315 + cursorDate, 316 + }, 317 + "Jetstream cursor position" 318 + ); 319 + await saveCursor(jetstream.cursor); 320 + } 321 + }, 10_000); 322 + 323 + statsInterval = setInterval(async () => { 324 + const workerStats = await worker.getStats(); 325 + const cacheStats = cache ? await cache.getStats() : null; 326 + const metricsData = metrics.getWithRates(); 327 + logger.info( 328 + { 329 + worker: workerStats, 330 + cache: cacheStats, 331 + metrics: metricsData, 332 + }, 333 + "Service stats" 334 + ); 335 + }, 60000);
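The nested embed walk in the `onCreate` handler collects blob CIDs from two shapes: `embed.images` (plain image embeds) and `embed.media.images` (record-with-media embeds). The same traversal as a standalone function, a sketch mirroring the handler with a hypothetical `BlobRef` type:

```typescript
interface BlobRef {
  cid: string;
  mimeType?: string;
}

// Standalone version of the embed traversal in the onCreate handler: collects
// image blob CIDs from embed.images and embed.media.images, the two shapes
// Bluesky posts use for image and record-with-media embeds.
function extractImageBlobs(embed: unknown): BlobRef[] {
  const blobs: BlobRef[] = [];
  const collect = (images: unknown) => {
    if (!Array.isArray(images)) return;
    for (const img of images) {
      const image = (img as Record<string, unknown> | null)?.image as
        | Record<string, unknown>
        | undefined;
      const ref = image?.ref as Record<string, unknown> | undefined;
      if (ref && typeof ref.$link === "string") {
        blobs.push({
          cid: ref.$link,
          mimeType: typeof image?.mimeType === "string" ? image.mimeType : undefined,
        });
      }
    }
  };
  if (embed && typeof embed === "object") {
    const e = embed as Record<string, unknown>;
    collect(e.images);
    collect((e.media as Record<string, unknown> | undefined)?.images);
  }
  return blobs;
}
```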
+32
src/matcher/hamming.ts
··· 1 + import type { BlobCheck } from "../types"; 2 + 3 + export function hammingDistance(hash1: string, hash2: string): number { 4 + const a = BigInt(`0x${hash1}`); 5 + const b = BigInt(`0x${hash2}`); 6 + const xor = a ^ b; 7 + 8 + let count = 0; 9 + let n = xor; 10 + while (n > 0n) { 11 + count++; 12 + n &= n - 1n; 13 + } 14 + 15 + return count; 16 + } 17 + 18 + export function findMatch(phash: string, checks: BlobCheck[]): BlobCheck | null { 19 + for (const check of checks) { 20 + const threshold = check.hammingThreshold ?? 5; 21 + 22 + for (const checkPhash of check.phashes) { 23 + const distance = hammingDistance(phash, checkPhash); 24 + 25 + if (distance <= threshold) { 26 + return check; 27 + } 28 + } 29 + } 30 + 31 + return null; 32 + }
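The popcount loop in `hammingDistance` is Brian Kernighan's bit-clearing trick: each `n &= n - 1n` removes the lowest set bit of the XOR, so the loop runs once per differing bit. A self-contained copy of the same function with a couple of spot checks:

```typescript
// Same approach as hammingDistance above: XOR the two hex-encoded hashes,
// then clear the lowest set bit until nothing remains.
function hammingDistance(hash1: string, hash2: string): number {
  let n = BigInt(`0x${hash1}`) ^ BigInt(`0x${hash2}`);
  let count = 0;
  while (n > 0n) {
    count++;
    n &= n - 1n; // clears the lowest set bit
  }
  return count;
}

console.log(hammingDistance("ff00", "ff00")); // identical hashes → 0
console.log(hammingDistance("00", "ff"));     // all 8 bits differ → 8
```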
+54
src/metrics/collector.ts
··· 1 + export class MetricsCollector { 2 + private counters: Map<string, number> = new Map(); 3 + private startTime: number = Date.now(); 4 + 5 + increment(metric: string, value = 1): void { 6 + const current = this.counters.get(metric) || 0; 7 + this.counters.set(metric, current + value); 8 + } 9 + 10 + get(metric: string): number { 11 + return this.counters.get(metric) || 0; 12 + } 13 + 14 + getAll(): Record<string, number> { 15 + const metrics: Record<string, number> = {}; 16 + for (const [key, value] of this.counters) { 17 + metrics[key] = value; 18 + } 19 + return metrics; 20 + } 21 + 22 + getWithRates(): Record<string, number | string> { 23 + const uptimeSeconds = (Date.now() - this.startTime) / 1000; 24 + const all = this.getAll(); 25 + const withRates: Record<string, number | string> = { ...all }; 26 + 27 + withRates.uptimeSeconds = Math.floor(uptimeSeconds); 28 + 29 + if (all.postsProcessed) { 30 + withRates.postsPerSecond = (all.postsProcessed / uptimeSeconds).toFixed(2); 31 + } 32 + 33 + if (all.blobsProcessed) { 34 + withRates.blobsPerSecond = (all.blobsProcessed / uptimeSeconds).toFixed(2); 35 + } 36 + 37 + if (all.cacheHits && all.cacheMisses) { 38 + const hitRate = (all.cacheHits / (all.cacheHits + all.cacheMisses)) * 100; 39 + withRates.cacheHitRate = `${hitRate.toFixed(1)}%`; 40 + } 41 + 42 + if (all.matchesFound && all.blobsProcessed) { 43 + const matchRate = (all.matchesFound / all.blobsProcessed) * 100; 44 + withRates.matchRate = `${matchRate.toFixed(2)}%`; 45 + } 46 + 47 + return withRates; 48 + } 49 + 50 + reset(): void { 51 + this.counters.clear(); 52 + this.startTime = Date.now(); 53 + } 54 + }
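The rate math in `getWithRates` is straightforward but worth seeing in isolation. Here is the cache-hit-rate computation as a standalone function (hypothetical name; note it mirrors the guard above, so a rate is only reported once both hits and misses are nonzero):

```typescript
// Sketch of the hit-rate formula from getWithRates(): hits as a
// percentage of all cache lookups, formatted to one decimal place.
function cacheHitRate(hits: number, misses: number): string | null {
  if (!hits || !misses) return null; // mirrors the guard above: both counters must be nonzero
  return `${((hits / (hits + misses)) * 100).toFixed(1)}%`;
}

console.log(cacheHitRate(75, 25)); // → "75.0%"
console.log(cacheHitRate(10, 0)); // → null (no misses recorded yet)
```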
+200
src/moderation/account.ts
··· 1 + import { agent, isLoggedIn } from "../agent.js"; 2 + import type { ModerationClaims } from "../cache/moderation-claims.js"; 3 + import { config } from "../config/index.js"; 4 + import { limit } from "../limits.js"; 5 + import { logger } from "../logger/index.js"; 6 + import type { MetricsCollector } from "../metrics/collector.js"; 7 + 8 + const doesLabelExist = (labels: { val: string }[] | undefined, labelVal: string): boolean => { 9 + if (!labels) { 10 + return false; 11 + } 12 + return labels.some((label) => label.val === labelVal); 13 + }; 14 + 15 + export const createAccountLabel = async ( 16 + did: string, 17 + label: string, 18 + comment: string, 19 + atUri: string, 20 + phash: string, 21 + claims: ModerationClaims, 22 + metrics?: MetricsCollector 23 + ) => { 24 + await isLoggedIn; 25 + 26 + const claimed = await claims.tryClaimAccountLabel(did, label); 27 + if (!claimed) { 28 + logger.debug( 29 + { process: "MODERATION", did, label }, 30 + "Account label already claimed in Redis, skipping" 31 + ); 32 + metrics?.increment("labelsCached"); 33 + return; 34 + } 35 + 36 + const hasLabel = await checkAccountLabels(did, label); 37 + if (hasLabel) { 38 + logger.debug({ process: "MODERATION", did, label }, "Account already has label, skipping"); 39 + metrics?.increment("labelsCached"); 40 + return; 41 + } 42 + 43 + logger.info({ process: "MODERATION", did, label }, "Labeling account"); 44 + metrics?.increment("labelsApplied"); 45 + 46 + const timestamp = new Date().toISOString(); 47 + const formattedComment = `${timestamp}: ${comment} at ${atUri} with phash "${phash}"`; 48 + 49 + await limit(async () => { 50 + try { 51 + await agent.tools.ozone.moderation.emitEvent( 52 + { 53 + event: { 54 + $type: "tools.ozone.moderation.defs#modEventLabel", 55 + comment: formattedComment, 56 + createLabelVals: [label], 57 + negateLabelVals: [], 58 + }, 59 + subject: { 60 + $type: "com.atproto.admin.defs#repoRef", 61 + did, 62 + }, 63 + createdBy: agent.did ?? 
"", 64 + createdAt: new Date().toISOString(), 65 + modTool: { 66 + name: "skywatch/skywatch-phash", 67 + }, 68 + }, 69 + { 70 + encoding: "application/json", 71 + headers: { 72 + "atproto-proxy": `${config.moderation.modDid}#atproto_labeler`, 73 + "atproto-accept-labelers": "did:plc:ar7c4by46qjdydhdevvrndac;redact", 74 + }, 75 + } 76 + ); 77 + } catch (e) { 78 + logger.error({ process: "MODERATION", error: e }, "Failed to create account label"); 79 + } 80 + }); 81 + }; 82 + 83 + export const createAccountComment = async ( 84 + did: string, 85 + comment: string, 86 + atURI: string, 87 + claims: ModerationClaims 88 + ) => { 89 + await isLoggedIn; 90 + 91 + const claimed = await claims.tryClaimAccountComment(did, atURI); 92 + if (!claimed) { 93 + logger.debug( 94 + { process: "MODERATION", did, atURI }, 95 + "Account comment already claimed in Redis, skipping" 96 + ); 97 + return; 98 + } 99 + 100 + logger.info({ process: "MODERATION", did, atURI }, "Commenting on account"); 101 + 102 + await limit(async () => { 103 + try { 104 + await agent.tools.ozone.moderation.emitEvent( 105 + { 106 + event: { 107 + $type: "tools.ozone.moderation.defs#modEventComment", 108 + comment, 109 + }, 110 + subject: { 111 + $type: "com.atproto.admin.defs#repoRef", 112 + did, 113 + }, 114 + createdBy: agent.did ?? 
"", 115 + createdAt: new Date().toISOString(), 116 + modTool: { 117 + name: "skywatch/skywatch-phash", 118 + }, 119 + }, 120 + { 121 + encoding: "application/json", 122 + headers: { 123 + "atproto-proxy": `${config.moderation.modDid}#atproto_labeler`, 124 + "atproto-accept-labelers": "did:plc:ar7c4by46qjdydhdevvrndac;redact", 125 + }, 126 + } 127 + ); 128 + } catch (e) { 129 + logger.error({ process: "MODERATION", error: e }, "Failed to create account comment"); 130 + } 131 + }); 132 + }; 133 + 134 + export const createAccountReport = async ( 135 + did: string, 136 + comment: string, 137 + atUri: string, 138 + phash: string, 139 + metrics?: MetricsCollector 140 + ) => { 141 + await isLoggedIn; 142 + metrics?.increment("reportsCreated"); 143 + 144 + const timestamp = new Date().toISOString(); 145 + const formattedComment = `${timestamp}: ${comment} at ${atUri} with phash "${phash}"`; 146 + 147 + await limit(async () => { 148 + try { 149 + await agent.tools.ozone.moderation.emitEvent( 150 + { 151 + event: { 152 + $type: "tools.ozone.moderation.defs#modEventReport", 153 + comment: formattedComment, 154 + reportType: "com.atproto.moderation.defs#reasonOther", 155 + }, 156 + subject: { 157 + $type: "com.atproto.admin.defs#repoRef", 158 + did, 159 + }, 160 + createdBy: agent.did ?? 
"", 161 + createdAt: new Date().toISOString(), 162 + modTool: { 163 + name: "skywatch/skywatch-phash", 164 + }, 165 + }, 166 + { 167 + encoding: "application/json", 168 + headers: { 169 + "atproto-proxy": `${config.moderation.modDid}#atproto_labeler`, 170 + "atproto-accept-labelers": "did:plc:ar7c4by46qjdydhdevvrndac;redact", 171 + }, 172 + } 173 + ); 174 + } catch (e) { 175 + logger.error({ process: "MODERATION", error: e }, "Failed to create account report"); 176 + } 177 + }); 178 + }; 179 + 180 + export const checkAccountLabels = async (did: string, label: string): Promise<boolean> => { 181 + await isLoggedIn; 182 + return await limit(async () => { 183 + try { 184 + const response = await agent.tools.ozone.moderation.getRepo( 185 + { did }, 186 + { 187 + headers: { 188 + "atproto-proxy": `${config.moderation.modDid}#atproto_labeler`, 189 + "atproto-accept-labelers": "did:plc:ar7c4by46qjdydhdevvrndac;redact", 190 + }, 191 + } 192 + ); 193 + 194 + return doesLabelExist(response.data.labels, label); 195 + } catch (e) { 196 + logger.error({ process: "MODERATION", did, error: e }, "Failed to check account labels"); 197 + return false; 198 + } 199 + }); 200 + };
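`createAccountLabel` deduplicates in two layers: a Redis claim first (cheap, first caller wins), then an Ozone lookup for an existing label. An in-memory sketch of the claim step only, with a hypothetical `Set`-backed stand-in for `ModerationClaims.tryClaimAccountLabel`:

```typescript
// In-memory stand-in for ModerationClaims.tryClaimAccountLabel: the first
// caller claims a (did, label) pair; later callers are told to skip.
const claims = new Set<string>();

function tryClaimAccountLabel(did: string, label: string): boolean {
  const key = `${did}:${label}`;
  if (claims.has(key)) return false; // already claimed, skip the moderation action
  claims.add(key);
  return true;
}

console.log(tryClaimAccountLabel("did:plc:abc", "spam")); // first claim → true
console.log(tryClaimAccountLabel("did:plc:abc", "spam")); // duplicate → false
```

The real implementation uses Redis so the claim survives restarts and is shared across workers; this sketch only shows the decision logic.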
+164
src/moderation/post.ts
··· 1 + import { agent, isLoggedIn } from "../agent.js"; 2 + import type { ModerationClaims } from "../cache/moderation-claims.js"; 3 + import { config } from "../config/index.js"; 4 + import { limit } from "../limits.js"; 5 + import { logger } from "../logger/index.js"; 6 + import type { MetricsCollector } from "../metrics/collector.js"; 7 + 8 + const doesLabelExist = (labels: { val: string }[] | undefined, labelVal: string): boolean => { 9 + if (!labels) { 10 + return false; 11 + } 12 + return labels.some((label) => label.val === labelVal); 13 + }; 14 + 15 + export const createPostLabel = async ( 16 + uri: string, 17 + cid: string, 18 + label: string, 19 + comment: string, 20 + phash: string, 21 + claims: ModerationClaims, 22 + metrics?: MetricsCollector, 23 + duration?: number 24 + ) => { 25 + await isLoggedIn; 26 + 27 + const claimed = await claims.tryClaimPostLabel(uri, label); 28 + if (!claimed) { 29 + logger.debug( 30 + { process: "MODERATION", uri, label }, 31 + "Post label already claimed in Redis, skipping" 32 + ); 33 + metrics?.increment("labelsCached"); 34 + return; 35 + } 36 + 37 + const hasLabel = await checkRecordLabels(uri, label); 38 + if (hasLabel) { 39 + logger.debug({ process: "MODERATION", uri, label }, "Post already has label, skipping"); 40 + metrics?.increment("labelsCached"); 41 + return; 42 + } 43 + 44 + logger.info({ process: "MODERATION", label, atURI: uri }, "Labeling post"); 45 + metrics?.increment("labelsApplied"); 46 + 47 + const timestamp = new Date().toISOString(); 48 + const formattedComment = `${timestamp}: ${comment} at ${uri} with phash "${phash}"`; 49 + 50 + await limit(async () => { 51 + try { 52 + const event: { 53 + $type: string; 54 + comment: string; 55 + createLabelVals: string[]; 56 + negateLabelVals: string[]; 57 + durationInHours?: number; 58 + } = { 59 + $type: "tools.ozone.moderation.defs#modEventLabel", 60 + comment: formattedComment, 61 + createLabelVals: [label], 62 + negateLabelVals: [], 63 + }; 64 + 65 + if 
(duration) { 66 + event.durationInHours = duration; 67 + } 68 + 69 + await agent.tools.ozone.moderation.emitEvent( 70 + { 71 + event, 72 + subject: { 73 + $type: "com.atproto.repo.strongRef", 74 + uri, 75 + cid, 76 + }, 77 + createdBy: agent.did ?? "", 78 + createdAt: new Date().toISOString(), 79 + modTool: { 80 + name: "skywatch/skywatch-phash", 81 + }, 82 + }, 83 + { 84 + encoding: "application/json", 85 + headers: { 86 + "atproto-proxy": `${config.moderation.modDid}#atproto_labeler`, 87 + "atproto-accept-labelers": "did:plc:ar7c4by46qjdydhdevvrndac;redact", 88 + }, 89 + } 90 + ); 91 + } catch (e) { 92 + logger.error({ process: "MODERATION", error: e }, "Failed to create post label"); 93 + } 94 + }); 95 + }; 96 + 97 + export const createPostReport = async ( 98 + uri: string, 99 + cid: string, 100 + comment: string, 101 + phash: string, 102 + metrics?: MetricsCollector 103 + ) => { 104 + await isLoggedIn; 105 + metrics?.increment("reportsCreated"); 106 + 107 + const timestamp = new Date().toISOString(); 108 + const formattedComment = `${timestamp}: ${comment} at ${uri} with phash "${phash}"`; 109 + 110 + await limit(async () => { 111 + try { 112 + return await agent.tools.ozone.moderation.emitEvent( 113 + { 114 + event: { 115 + $type: "tools.ozone.moderation.defs#modEventReport", 116 + comment: formattedComment, 117 + reportType: "com.atproto.moderation.defs#reasonOther", 118 + }, 119 + subject: { 120 + $type: "com.atproto.repo.strongRef", 121 + uri, 122 + cid, 123 + }, 124 + createdBy: agent.did ?? 
"", 125 + createdAt: new Date().toISOString(), 126 + modTool: { 127 + name: "skywatch/skywatch-phash", 128 + }, 129 + }, 130 + { 131 + encoding: "application/json", 132 + headers: { 133 + "atproto-proxy": `${config.moderation.modDid}#atproto_labeler`, 134 + "atproto-accept-labelers": "did:plc:ar7c4by46qjdydhdevvrndac;redact", 135 + }, 136 + } 137 + ); 138 + } catch (e) { 139 + logger.error({ process: "MODERATION", error: e }, "Failed to create post report"); 140 + } 141 + }); 142 + }; 143 + 144 + export const checkRecordLabels = async (uri: string, label: string): Promise<boolean> => { 145 + await isLoggedIn; 146 + return await limit(async () => { 147 + try { 148 + const response = await agent.tools.ozone.moderation.getRecord( 149 + { uri }, 150 + { 151 + headers: { 152 + "atproto-proxy": `${config.moderation.modDid}#atproto_labeler`, 153 + "atproto-accept-labelers": "did:plc:ar7c4by46qjdydhdevvrndac;redact", 154 + }, 155 + } 156 + ); 157 + 158 + return doesLabelExist(response.data.labels, label); 159 + } catch (e) { 160 + logger.error({ process: "MODERATION", uri, error: e }, "Failed to check record labels"); 161 + return false; 162 + } 163 + }); 164 + };
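Both the label and report events above embed the same comment format: timestamp, comment, subject URI, and the matched phash. As a standalone sketch (hypothetical helper; the files build the string inline):

```typescript
// Mirrors the comment format used by createPostLabel/createPostReport:
//   <ISO timestamp>: <comment> at <uri> with phash "<phash>"
function formatComment(comment: string, uri: string, phash: string, now = new Date()): string {
  return `${now.toISOString()}: ${comment} at ${uri} with phash "${phash}"`;
}

console.log(formatComment("Matched known image", "at://did:x/app.bsky.feed.post/1", "abcd"));
```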
+261
src/processor/image-processor.ts
··· 1 + import type { AtpAgent } from "@atproto/api"; 2 + import type { PhashCache } from "../cache/phash-cache"; 3 + import { config } from "../config/index"; 4 + import { computePerceptualHash } from "../hasher/phash"; 5 + import { logger } from "../logger/index"; 6 + import { findMatch } from "../matcher/hamming"; 7 + import type { MetricsCollector } from "../metrics/collector"; 8 + import type { BlobCheck, BlobReference, MatchResult } from "../types"; 9 + 10 + export interface ProcessResult { 11 + cid: string; 12 + phash: string; 13 + match: MatchResult | null; 14 + } 15 + 16 + export class ImageProcessor { 17 + private pdsCache = new Map<string, string | null>(); 18 + private repoTakendownCache = new Map<string, boolean>(); 19 + 20 + constructor( 21 + private checks: BlobCheck[], 22 + private agent: AtpAgent, 23 + private cache?: PhashCache, 24 + private metrics?: MetricsCollector 25 + ) {} 26 + 27 + private async resolvePds(did: string): Promise<string | null> { 28 + if (this.pdsCache.has(did)) { 29 + return this.pdsCache.get(did) ?? 
null; 30 + } 31 + 32 + try { 33 + const didDocResponse = await fetch(`${config.plc.endpoint}/${did}`); 34 + if (!didDocResponse.ok) { 35 + logger.warn({ did, status: didDocResponse.status }, "Failed to fetch DID document"); 36 + this.pdsCache.set(did, null); 37 + return null; 38 + } 39 + 40 + const didDoc = await didDocResponse.json(); 41 + const pdsService = didDoc.service?.find((s: any) => 42 + s.id === "#atproto_pds" && s.type === "AtprotoPersonalDataServer" 43 + ); 44 + 45 + if (!pdsService?.serviceEndpoint) { 46 + logger.warn({ did }, "No PDS endpoint found in DID document"); 47 + this.pdsCache.set(did, null); 48 + return null; 49 + } 50 + 51 + this.pdsCache.set(did, pdsService.serviceEndpoint); 52 + return pdsService.serviceEndpoint; 53 + } catch (error) { 54 + logger.error({ error, did }, "Failed to resolve PDS from DID"); 55 + this.pdsCache.set(did, null); 56 + return null; 57 + } 58 + } 59 + 60 + private async checkRepoTakendown(did: string, pdsEndpoint: string): Promise<boolean> { 61 + if (this.repoTakendownCache.has(did)) { 62 + return this.repoTakendownCache.get(did)!; 63 + } 64 + 65 + try { 66 + const response = await fetch( 67 + `${pdsEndpoint}/xrpc/com.atproto.repo.describeRepo?repo=${did}` 68 + ); 69 + 70 + if (!response.ok) { 71 + const data = await response.json().catch(() => ({})); 72 + if (data.error === "RepoTakendown") { 73 + logger.info({ did, message: data.message }, "Repo has been taken down, skipping"); 74 + this.repoTakendownCache.set(did, true); 75 + return true; 76 + } 77 + } 78 + 79 + this.repoTakendownCache.set(did, false); 80 + return false; 81 + } catch (error) { 82 + logger.warn({ error, did }, "Failed to check repo status, assuming not taken down"); 83 + this.repoTakendownCache.set(did, false); 84 + return false; 85 + } 86 + } 87 + 88 + async processBlob(did: string, blob: BlobReference, postUri?: string): Promise<ProcessResult | null> { 89 + try { 90 + if (!blob.mimeType?.startsWith("image/") || blob.mimeType.includes("svg")) { 
91 + logger.debug({ cid: blob.cid, mimeType: blob.mimeType }, "Skipping non-image blob"); 92 + return null; 93 + } 94 + 95 + const applicableChecks = this.checks.filter((check) => !check.ignoreDID?.includes(did)); 96 + 97 + if (applicableChecks.length === 0) { 98 + logger.debug({ did, cid: blob.cid }, "DID is in ignoreDID for all checks, skipping"); 99 + return null; 100 + } 101 + 102 + if (this.cache) { 103 + const cachedPhash = await this.cache.get(blob.cid); 104 + if (cachedPhash) { 105 + this.metrics?.increment("cacheHits"); 106 + logger.debug({ cid: blob.cid, phash: cachedPhash }, "Using cached phash"); 107 + const matchedCheck = findMatch(cachedPhash, applicableChecks); 108 + 109 + if (matchedCheck) { 110 + const match = this.createMatchResult(cachedPhash, matchedCheck); 111 + if (match) { 112 + this.metrics?.increment("matchesFound"); 113 + logger.info( 114 + { 115 + cid: blob.cid, 116 + phash: cachedPhash, 117 + label: matchedCheck.label, 118 + hammingDistance: match.hammingDistance, 119 + }, 120 + "Match found (cached)" 121 + ); 122 + return { cid: blob.cid, phash: cachedPhash, match }; 123 + } 124 + } 125 + 126 + return { cid: blob.cid, phash: cachedPhash, match: null }; 127 + } 128 + this.metrics?.increment("cacheMisses"); 129 + } 130 + 131 + const pdsEndpoint = await this.resolvePds(did); 132 + if (!pdsEndpoint) { 133 + logger.warn({ did, cid: blob.cid, postUri }, "Cannot fetch blob without PDS endpoint"); 134 + this.metrics?.increment("fetchErrors"); 135 + return null; 136 + } 137 + 138 + // Check if repo has been taken down before fetching blobs 139 + const isTakendown = await this.checkRepoTakendown(did, pdsEndpoint); 140 + if (isTakendown) { 141 + logger.debug({ did, cid: blob.cid, postUri }, "Skipping blob from taken down repo"); 142 + this.metrics?.increment("fetchErrors"); 143 + return null; 144 + } 145 + 146 + const blobUrl = `${pdsEndpoint}/xrpc/com.atproto.sync.getBlob?did=${did}&cid=${blob.cid}`; 147 + logger.debug({ did, cid: blob.cid, 
pdsEndpoint, postUri }, "Fetching blob"); 148 + 149 + const response = await fetch(blobUrl, { 150 + redirect: 'follow' 151 + }); 152 + 153 + if (!response.ok) { 154 + this.metrics?.increment("fetchErrors"); 155 + logger.warn({ did, cid: blob.cid, status: response.status, pdsEndpoint, postUri }, "Failed to fetch blob"); 156 + return null; 157 + } 158 + 159 + this.metrics?.increment("blobsFetched"); 160 + 161 + const blobData = Buffer.from(await response.arrayBuffer()); 162 + 163 + let phash: string; 164 + try { 165 + phash = await computePerceptualHash(blobData); 166 + } catch (error) { 167 + // Handle corrupt or invalid image data 168 + if (error instanceof Error && 169 + (error.message.includes("Corrupt JPEG") || 170 + error.message.includes("premature end of data") || 171 + error.message.includes("Invalid image") || 172 + error.message.includes("unsupported image format"))) { 173 + logger.debug({ cid: blob.cid, error: error.message, postUri }, "Skipping corrupt or invalid image"); 174 + this.metrics?.increment("blobsCorrupted"); 175 + return null; 176 + } 177 + // Re-throw other errors 178 + throw error; 179 + } 180 + 181 + this.metrics?.increment("blobsProcessed"); 182 + logger.debug({ cid: blob.cid, phash }, "Computed perceptual hash"); 183 + 184 + if (this.cache) { 185 + await this.cache.set(blob.cid, phash); 186 + } 187 + 188 + const matchedCheck = findMatch(phash, applicableChecks); 189 + 190 + if (matchedCheck) { 191 + const match = this.createMatchResult(phash, matchedCheck); 192 + if (match) { 193 + this.metrics?.increment("matchesFound"); 194 + logger.info( 195 + { 196 + cid: blob.cid, 197 + phash, 198 + label: matchedCheck.label, 199 + hammingDistance: match.hammingDistance, 200 + }, 201 + "Match found" 202 + ); 203 + return { cid: blob.cid, phash, match }; 204 + } 205 + } 206 + 207 + return { cid: blob.cid, phash, match: null }; 208 + } catch (error) { 209 + logger.error({ error, did, cid: blob.cid }, "Failed to process blob"); 210 + throw error; 211 + 
} 212 + } 213 + 214 + private createMatchResult(phash: string, matchedCheck: BlobCheck): MatchResult | null { 215 + for (const checkPhash of matchedCheck.phashes) { 216 + const a = BigInt(`0x${phash}`); 217 + const b = BigInt(`0x${checkPhash}`); 218 + const xor = a ^ b; 219 + 220 + let distance = 0; 221 + let n = xor; 222 + while (n > 0n) { 223 + distance++; 224 + n &= n - 1n; 225 + } 226 + 227 + const threshold = matchedCheck.hammingThreshold ?? 5; 228 + if (distance <= threshold) { 229 + return { 230 + phash, 231 + matchedCheck, 232 + matchedPhash: checkPhash, 233 + hammingDistance: distance, 234 + }; 235 + } 236 + } 237 + 238 + return null; 239 + } 240 + 241 + async processPost( 242 + postUri: string, 243 + did: string, 244 + blobs: BlobReference[] 245 + ): Promise<ProcessResult[]> { 246 + const results: ProcessResult[] = []; 247 + 248 + for (const blob of blobs) { 249 + try { 250 + const result = await this.processBlob(did, blob, postUri); 251 + if (result) { 252 + results.push(result); 253 + } 254 + } catch (error) { 255 + logger.error({ error, postUri, cid: blob.cid }, "Failed to process blob in post"); 256 + } 257 + } 258 + 259 + return results; 260 + } 261 + }
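`resolvePds` fetches the DID document from the PLC directory and looks for the `#atproto_pds` service entry. The extraction step on its own, run against a sample document (network fetch omitted; the sample endpoint is illustrative):

```typescript
// Extract the PDS endpoint from a parsed DID document, as resolvePds
// does after fetching it from the PLC directory.
interface DidService { id: string; type: string; serviceEndpoint: string }

function findPdsEndpoint(didDoc: { service?: DidService[] }): string | null {
  const pds = didDoc.service?.find(
    (s) => s.id === "#atproto_pds" && s.type === "AtprotoPersonalDataServer"
  );
  return pds?.serviceEndpoint ?? null;
}

const sampleDoc = {
  service: [
    {
      id: "#atproto_pds",
      type: "AtprotoPersonalDataServer",
      serviceEndpoint: "https://example.host.bsky.network", // illustrative
    },
  ],
};
console.log(findPdsEndpoint(sampleDoc)); // → "https://example.host.bsky.network"
console.log(findPdsEndpoint({}));        // no service list → null
```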
+113
src/queue/redis-queue.ts
··· 1 + import Redis from "ioredis"; 2 + import { logger } from "../logger/index"; 3 + import type { ImageJob } from "../types"; 4 + 5 + export class RedisQueue { 6 + private redis: Redis; 7 + private readonly PENDING_KEY = "phash:queue:pending"; 8 + private readonly PROCESSING_KEY = "phash:queue:processing"; 9 + private readonly FAILED_KEY = "phash:queue:failed"; 10 + 11 + constructor(redisUrl: string) { 12 + this.redis = new Redis(redisUrl); 13 + 14 + this.redis.on("error", (error) => { 15 + logger.error({ error }, "Redis connection error"); 16 + }); 17 + 18 + this.redis.on("connect", () => { 19 + logger.info("Redis connected"); 20 + }); 21 + } 22 + 23 + async enqueue(job: ImageJob): Promise<void> { 24 + const jobData = JSON.stringify(job); 25 + await this.redis.rpush(this.PENDING_KEY, jobData); 26 + logger.debug({ job }, "Job enqueued"); 27 + } 28 + 29 + async dequeue(): Promise<ImageJob | null> { 30 + const jobData = await this.redis.lpop(this.PENDING_KEY); 31 + if (!jobData) { 32 + return null; 33 + } 34 + 35 + const job = JSON.parse(jobData) as ImageJob; 36 + await this.redis.rpush(this.PROCESSING_KEY, jobData); 37 + logger.debug({ job }, "Job dequeued"); 38 + 39 + return job; 40 + } 41 + 42 + async markComplete(job: ImageJob): Promise<void> { 43 + const jobData = JSON.stringify(job); 44 + await this.redis.lrem(this.PROCESSING_KEY, 1, jobData); 45 + logger.debug({ job }, "Job completed"); 46 + } 47 + 48 + async markFailed(job: ImageJob, error: Error): Promise<void> { 49 + const failedJob = { 50 + ...job, 51 + error: error.message, 52 + failedAt: Date.now(), 53 + }; 54 + 55 + const jobData = JSON.stringify(job); 56 + await this.redis.lrem(this.PROCESSING_KEY, 1, jobData); 57 + await this.redis.rpush(this.FAILED_KEY, JSON.stringify(failedJob)); 58 + 59 + logger.error({ job, error }, "Job failed"); 60 + } 61 + 62 + async retry(job: ImageJob, maxAttempts: number): Promise<boolean> { 63 + job.attempts++; 64 + 65 + if (job.attempts >= maxAttempts) { 66 + 
logger.warn({ job }, "Job exceeded max retry attempts"); 67 + return false; 68 + } 69 + 70 + await this.enqueue(job); 71 + logger.debug({ job }, "Job requeued for retry"); 72 + 73 + return true; 74 + } 75 + 76 + async getPendingCount(): Promise<number> { 77 + return await this.redis.llen(this.PENDING_KEY); 78 + } 79 + 80 + async getProcessingCount(): Promise<number> { 81 + return await this.redis.llen(this.PROCESSING_KEY); 82 + } 83 + 84 + async getFailedCount(): Promise<number> { 85 + return await this.redis.llen(this.FAILED_KEY); 86 + } 87 + 88 + async getStats(): Promise<{ 89 + pending: number; 90 + processing: number; 91 + failed: number; 92 + }> { 93 + const [pending, processing, failed] = await Promise.all([ 94 + this.getPendingCount(), 95 + this.getProcessingCount(), 96 + this.getFailedCount(), 97 + ]); 98 + 99 + return { pending, processing, failed }; 100 + } 101 + 102 + async clearFailed(): Promise<number> { 103 + const count = await this.redis.llen(this.FAILED_KEY); 104 + await this.redis.del(this.FAILED_KEY); 105 + logger.info({ count }, "Cleared failed jobs"); 106 + return count; 107 + } 108 + 109 + async disconnect(): Promise<void> { 110 + await this.redis.quit(); 111 + logger.info("Redis disconnected"); 112 + } 113 + }
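The queue implements a pending → processing → failed lifecycle over three Redis lists. An in-memory model of the same state machine, with arrays standing in for the lists:

```typescript
// In-memory model of RedisQueue's three lists: enqueue appends to pending,
// dequeue moves pending → processing, markComplete/markFailed drain
// processing (failed jobs land on the failed list).
const pending: string[] = [];
const processing: string[] = [];
const failed: string[] = [];

function enqueue(job: string) { pending.push(job); }   // RPUSH pending
function dequeue(): string | null {
  const job = pending.shift() ?? null;                 // LPOP pending
  if (job !== null) processing.push(job);              // RPUSH processing
  return job;
}
function markComplete(job: string) {
  const i = processing.indexOf(job);                   // LREM processing 1
  if (i !== -1) processing.splice(i, 1);
}
function markFailed(job: string) {
  markComplete(job);                                   // remove from processing
  failed.push(job);                                    // RPUSH failed
}

enqueue("job-1");
const j = dequeue();
if (j) markFailed(j);
console.log(pending.length, processing.length, failed.length); // 0 0 1
```

One design note: in the Redis-backed version, `LPOP` followed by `RPUSH` is two round trips; if atomicity between them matters, Redis's `LMOVE` performs the pop-and-push as a single command.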
+155
src/queue/worker.ts
··· 1 + import type { AtpAgent } from "@atproto/api"; 2 + import type { PhashCache } from "../cache/phash-cache"; 3 + import { logger } from "../logger/index"; 4 + import type { MetricsCollector } from "../metrics/collector"; 5 + import { ImageProcessor } from "../processor/image-processor"; 6 + import type { BlobCheck, ImageJob, MatchResult } from "../types"; 7 + import type { RedisQueue } from "./redis-queue"; 8 + 9 + export interface WorkerConfig { 10 + concurrency: number; 11 + retryAttempts: number; 12 + retryDelay: number; 13 + pollInterval?: number; 14 + } 15 + 16 + export class QueueWorker { 17 + private queue: RedisQueue; 18 + private processor: ImageProcessor; 19 + private config: WorkerConfig; 20 + private running = false; 21 + private activeJobs = 0; 22 + private metrics?: MetricsCollector; 23 + private onMatch?: ( 24 + postUri: string, 25 + postCid: string, 26 + postDid: string, 27 + match: MatchResult 28 + ) => void | Promise<void>; 29 + 30 + constructor( 31 + queue: RedisQueue, 32 + checks: BlobCheck[], 33 + config: WorkerConfig, 34 + agent: AtpAgent, 35 + cache?: PhashCache, 36 + metrics?: MetricsCollector 37 + ) { 38 + this.queue = queue; 39 + this.metrics = metrics; 40 + this.processor = new ImageProcessor(checks, agent, cache, metrics); 41 + this.config = config; 42 + } 43 + 44 + onMatchFound( 45 + handler: ( 46 + postUri: string, 47 + postCid: string, 48 + postDid: string, 49 + match: MatchResult 50 + ) => void | Promise<void> 51 + ): void { 52 + this.onMatch = handler; 53 + } 54 + 55 + async start(): Promise<void> { 56 + this.running = true; 57 + logger.info({ config: this.config }, "Queue worker started"); 58 + 59 + // Start worker loops in background (don't await) 60 + for (let i = 0; i < this.config.concurrency; i++) { 61 + this.processLoop().catch((error) => { 62 + logger.error({ error }, "Worker loop crashed"); 63 + }); 64 + } 65 + } 66 + 67 + async stop(): Promise<void> { 68 + this.running = false; 69 + logger.info("Queue worker stopping, 
waiting for active jobs to complete"); 70 + 71 + while (this.activeJobs > 0) { 72 + await new Promise((resolve) => setTimeout(resolve, 100)); 73 + } 74 + 75 + logger.info("Queue worker stopped"); 76 + } 77 + 78 + private async processLoop(): Promise<void> { 79 + const pollInterval = this.config.pollInterval ?? 1000; 80 + 81 + while (this.running) { 82 + try { 83 + const job = await this.queue.dequeue(); 84 + 85 + if (!job) { 86 + await new Promise((resolve) => setTimeout(resolve, pollInterval)); 87 + continue; 88 + } 89 + 90 + this.activeJobs++; 91 + await this.processJob(job); 92 + this.activeJobs--; 93 + } catch (error) { 94 + logger.error({ error }, "Error in process loop"); 95 + await new Promise((resolve) => setTimeout(resolve, pollInterval)); 96 + } 97 + } 98 + } 99 + 100 + private async processJob(job: ImageJob): Promise<void> { 101 + try { 102 + logger.debug({ job }, "Processing job"); 103 + 104 + for (const blob of job.blobs) { 105 + try { 106 + const result = await this.processor.processBlob(job.postDid, blob, job.postUri); 107 + 108 + if (result?.match && this.onMatch) { 109 + await this.onMatch(job.postUri, job.postCid, job.postDid, result.match); 110 + } 111 + } catch (error) { 112 + // Log corrupt image errors at debug level since they're expected 113 + if (error instanceof Error && 114 + (error.message.includes("Corrupt JPEG") || 115 + error.message.includes("premature end of data") || 116 + error.message.includes("Invalid image") || 117 + error.message.includes("unsupported image format"))) { 118 + logger.debug({ error: error.message, job, blob }, "Skipping corrupt or invalid image"); 119 + } else { 120 + logger.error({ error, job, blob }, "Error processing blob, continuing with next blob"); 121 + } 122 + } 123 + } 124 + 125 + this.metrics?.increment("postsProcessed"); 126 + await this.queue.markComplete(job); 127 + } catch (error) { 128 + logger.error({ error, job }, "Error processing job"); 129 + 130 + if (error instanceof Error) { 131 + const 
shouldRetry = await this.queue.retry(job, this.config.retryAttempts); 132 + 133 + if (!shouldRetry) { 134 + await this.queue.markFailed(job, error); 135 + } else { 136 + await new Promise((resolve) => setTimeout(resolve, this.config.retryDelay)); 137 + } 138 + } 139 + } 140 + } 141 + 142 + async getStats(): Promise<{ 143 + queue: { pending: number; processing: number; failed: number }; 144 + worker: { running: boolean; activeJobs: number }; 145 + }> { 146 + const queueStats = await this.queue.getStats(); 147 + return { 148 + queue: queueStats, 149 + worker: { 150 + running: this.running, 151 + activeJobs: this.activeJobs, 152 + }, 153 + }; 154 + } 155 + }
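The retry policy is simple: each failure increments `attempts`, and the job is requeued only while `attempts` is still below `retryAttempts`. The decision logic in isolation (hypothetical standalone function mirroring `RedisQueue.retry`):

```typescript
// Mirrors RedisQueue.retry's decision: bump the attempt counter, then
// requeue only while still below the max.
function shouldRetry(job: { attempts: number }, maxAttempts: number): boolean {
  job.attempts++;
  return job.attempts < maxAttempts;
}

const job = { attempts: 0 };
console.log(shouldRetry(job, 3)); // attempt 1 → true
console.log(shouldRetry(job, 3)); // attempt 2 → true
console.log(shouldRetry(job, 3)); // attempt 3 → false, job goes to the failed list
```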
+62
src/session.ts
··· 1 + import { chmodSync, existsSync, readFileSync, unlinkSync, writeFileSync } from "node:fs"; 2 + import { join } from "node:path"; 3 + import { logger } from "./logger/index.js"; 4 + 5 + const SESSION_FILE_PATH = join(process.cwd(), ".session"); 6 + 7 + export interface SessionData { 8 + accessJwt: string; 9 + refreshJwt: string; 10 + did: string; 11 + handle: string; 12 + email?: string; 13 + emailConfirmed?: boolean; 14 + emailAuthFactor?: boolean; 15 + active: boolean; 16 + status?: string; 17 + } 18 + 19 + export function loadSession(): SessionData | null { 20 + try { 21 + if (!existsSync(SESSION_FILE_PATH)) { 22 + logger.debug("No session file found"); 23 + return null; 24 + } 25 + 26 + const data = readFileSync(SESSION_FILE_PATH, "utf-8"); 27 + const session = JSON.parse(data) as SessionData; 28 + 29 + if (!session.accessJwt || !session.refreshJwt || !session.did) { 30 + logger.warn("Session file is missing required fields, ignoring"); 31 + return null; 32 + } 33 + 34 + logger.info("Loaded existing session from file"); 35 + return session; 36 + } catch (error) { 37 + logger.error({ error }, "Failed to load session file, will authenticate fresh"); 38 + return null; 39 + } 40 + } 41 + 42 + export function saveSession(session: SessionData): void { 43 + try { 44 + const data = JSON.stringify(session, null, 2); 45 + writeFileSync(SESSION_FILE_PATH, data, "utf-8"); 46 + chmodSync(SESSION_FILE_PATH, 0o600); 47 + logger.info("Session saved to file"); 48 + } catch (error) { 49 + logger.error({ error }, "Failed to save session to file"); 50 + } 51 + } 52 + 53 + export function clearSession(): void { 54 + try { 55 + if (existsSync(SESSION_FILE_PATH)) { 56 + unlinkSync(SESSION_FILE_PATH); 57 + logger.info("Session file cleared"); 58 + } 59 + } catch (error) { 60 + logger.error({ error }, "Failed to clear session file"); 61 + } 62 + }
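A round-trip of the save/load pattern above, parameterized on the file path so it can run against a temp directory (the module itself pins `SESSION_FILE_PATH` to the working directory; the path here is hypothetical):

```typescript
import { mkdtempSync, readFileSync, writeFileSync } from "node:fs";
import { tmpdir } from "node:os";
import { join } from "node:path";

// Trimmed SessionData with just the fields loadSession validates.
interface SessionData { accessJwt: string; refreshJwt: string; did: string; handle: string; active: boolean }

function save(path: string, session: SessionData): void {
  writeFileSync(path, JSON.stringify(session, null, 2), "utf-8");
}

function load(path: string): SessionData | null {
  const session = JSON.parse(readFileSync(path, "utf-8")) as SessionData;
  // Mirror loadSession's validation: required fields must be present.
  if (!session.accessJwt || !session.refreshJwt || !session.did) return null;
  return session;
}

const file = join(mkdtempSync(join(tmpdir(), "session-")), ".session");
save(file, { accessJwt: "a", refreshJwt: "r", did: "did:plc:x", handle: "h", active: true });
console.log(load(file)?.did); // → "did:plc:x"
```

The real module also `chmod`s the file to `0o600` so the tokens aren't world-readable; that step is omitted here for brevity.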
+33
src/types.ts
··· 1 + export interface BlobCheck { 2 + phashes: string[]; 3 + label: string; 4 + comment: string; 5 + reportAcct: boolean; 6 + labelAcct: boolean; 7 + reportPost: boolean; 8 + toLabel: boolean; 9 + hammingThreshold?: number; 10 + description?: string; 11 + ignoreDID?: string[]; 12 + } 13 + 14 + export interface BlobReference { 15 + cid: string; 16 + mimeType?: string; 17 + } 18 + 19 + export interface ImageJob { 20 + postUri: string; 21 + postCid: string; 22 + postDid: string; 23 + blobs: BlobReference[]; 24 + timestamp: number; 25 + attempts: number; 26 + } 27 + 28 + export interface MatchResult { 29 + phash: string; 30 + matchedCheck: BlobCheck; 31 + matchedPhash: string; 32 + hammingDistance: number; 33 + }
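For reference, a `BlobCheck` entry might look like this (illustrative values only; real phashes and DIDs come from the `BLOB_CHECKS` list):

```typescript
// Hypothetical check: label and report any post whose image phash is
// within 5 bits of a known hash, but never act on an allow-listed DID.
const exampleCheck /* : BlobCheck */ = {
  phashes: ["8f373714acfcf4d0"],          // 64-bit perceptual hash, hex-encoded
  label: "spam",
  comment: "Matched known spam image",
  reportAcct: false,
  labelAcct: false,
  reportPost: true,
  toLabel: true,
  hammingThreshold: 5,                    // the default findMatch uses when omitted
  ignoreDID: ["did:plc:trusteduploader"], // hypothetical allow-listed DID
};

console.log(exampleCheck.label); // → "spam"
```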
+216
tests/integration/queue.test.ts
··· 1 + import { afterAll, beforeAll, describe, expect, it } from "bun:test"; 2 + import Redis from "ioredis"; 3 + import { RedisQueue } from "../../src/queue/redis-queue"; 4 + import type { ImageJob } from "../../src/types"; 5 + 6 + describe("RedisQueue Integration", () => { 7 + let queue: RedisQueue | undefined; 8 + let redis: Redis | undefined; 9 + let redisAvailable = false; 10 + const testRedisUrl = process.env.REDIS_URL || "redis://localhost:6379"; 11 + 12 + beforeAll(async () => { 13 + try { 14 + redis = new Redis(testRedisUrl, { 15 + connectTimeout: 2000, 16 + maxRetriesPerRequest: 1, 17 + }); 18 + await redis.ping(); 19 + await redis.flushdb(); 20 + queue = new RedisQueue(testRedisUrl); 21 + redisAvailable = true; 22 + } catch (_error) { 23 + redisAvailable = false; 24 + console.log("Redis not available, skipping integration tests"); 25 + if (redis) { 26 + redis.disconnect(); 27 + } 28 + } 29 + }); 30 + 31 + afterAll(async () => { 32 + if (queue) { 33 + await queue.disconnect(); 34 + } 35 + if (redis) { 36 + await redis.quit(); 37 + } 38 + }); 39 + 40 + it("should enqueue and dequeue jobs", async () => { 41 + if (!redisAvailable || !queue) { 42 + return; 43 + } 44 + 45 + const job: ImageJob = { 46 + postUri: "at://did:test/app.bsky.feed.post/test", 47 + postCid: "bafypostcid1", 48 + postDid: "did:test", 49 + blobs: [{ cid: "bafytest", mimeType: "image/png" }], 50 + timestamp: Date.now(), 51 + attempts: 0, 52 + }; 53 + 54 + await queue.enqueue(job); 55 + const dequeued = await queue.dequeue(); 56 + 57 + expect(dequeued).toEqual(job); 58 + }); 59 + 60 + it("should return null when queue is empty", async () => { 61 + if (!redisAvailable || !queue) { 62 + return; 63 + } 64 + 65 + const job = await queue.dequeue(); 66 + expect(job).toBeNull(); 67 + }); 68 + 69 + it("should mark jobs as complete", async () => { 70 + if (!redisAvailable || !queue) { 71 + return; 72 + } 73 + 74 + const job: ImageJob = { 75 + postUri: "at://did:test/app.bsky.feed.post/test2", 76 + 
postCid: "bafypostcid2", 77 + postDid: "did:test", 78 + blobs: [{ cid: "bafytest2", mimeType: "image/jpeg" }], 79 + timestamp: Date.now(), 80 + attempts: 0, 81 + }; 82 + 83 + await queue.enqueue(job); 84 + const dequeued = await queue.dequeue(); 85 + 86 + expect(dequeued).not.toBeNull(); 87 + 88 + if (dequeued) { 89 + await queue.markComplete(dequeued); 90 + 91 + const stats = await queue.getStats(); 92 + expect(stats.processing).toBe(0); 93 + } 94 + }); 95 + 96 + it("should mark jobs as failed", async () => { 97 + if (!redisAvailable || !queue) { 98 + return; 99 + } 100 + 101 + const job: ImageJob = { 102 + postUri: "at://did:test/app.bsky.feed.post/test3", 103 + postCid: "bafypostcid3", 104 + postDid: "did:test", 105 + blobs: [{ cid: "bafytest3", mimeType: "image/webp" }], 106 + timestamp: Date.now(), 107 + attempts: 0, 108 + }; 109 + 110 + await queue.enqueue(job); 111 + const dequeued = await queue.dequeue(); 112 + 113 + expect(dequeued).not.toBeNull(); 114 + 115 + if (dequeued) { 116 + await queue.markFailed(dequeued, new Error("Test error")); 117 + 118 + const stats = await queue.getStats(); 119 + expect(stats.failed).toBeGreaterThan(0); 120 + } 121 + }); 122 + 123 + it("should retry jobs up to max attempts", async () => { 124 + if (!redisAvailable || !queue) { 125 + return; 126 + } 127 + 128 + const job: ImageJob = { 129 + postUri: "at://did:test/app.bsky.feed.post/test4", 130 + postCid: "bafypostcid4", 131 + postDid: "did:test", 132 + blobs: [{ cid: "bafytest4", mimeType: "image/png" }], 133 + timestamp: Date.now(), 134 + attempts: 0, 135 + }; 136 + 137 + const shouldRetry1 = await queue.retry(job, 3); 138 + expect(shouldRetry1).toBe(true); 139 + expect(job.attempts).toBe(1); 140 + 141 + const shouldRetry2 = await queue.retry(job, 3); 142 + expect(shouldRetry2).toBe(true); 143 + expect(job.attempts).toBe(2); 144 + 145 + const shouldRetry3 = await queue.retry(job, 3); 146 + expect(shouldRetry3).toBe(false); 147 + expect(job.attempts).toBe(3); 148 + }); 149 + 
150 + it("should track queue stats accurately", async () => { 151 + if (!redisAvailable || !queue || !redis) { 152 + return; 153 + } 154 + 155 + await redis.flushdb(); 156 + 157 + const job1: ImageJob = { 158 + postUri: "at://did:test/app.bsky.feed.post/stats1", 159 + postCid: "bafypoststats1", 160 + postDid: "did:test", 161 + blobs: [{ cid: "bafystats1" }], 162 + timestamp: Date.now(), 163 + attempts: 0, 164 + }; 165 + 166 + const job2: ImageJob = { 167 + postUri: "at://did:test/app.bsky.feed.post/stats2", 168 + postCid: "bafypoststats2", 169 + postDid: "did:test", 170 + blobs: [{ cid: "bafystats2" }], 171 + timestamp: Date.now(), 172 + attempts: 0, 173 + }; 174 + 175 + await queue.enqueue(job1); 176 + await queue.enqueue(job2); 177 + 178 + let stats = await queue.getStats(); 179 + expect(stats.pending).toBe(2); 180 + expect(stats.processing).toBe(0); 181 + 182 + await queue.dequeue(); 183 + 184 + stats = await queue.getStats(); 185 + expect(stats.pending).toBe(1); 186 + expect(stats.processing).toBe(1); 187 + }); 188 + 189 + it("should clear failed jobs", async () => { 190 + if (!redisAvailable || !queue) { 191 + return; 192 + } 193 + 194 + const job: ImageJob = { 195 + postUri: "at://did:test/app.bsky.feed.post/clear", 196 + postCid: "bafypostclear", 197 + postDid: "did:test", 198 + blobs: [{ cid: "bafyclear" }], 199 + timestamp: Date.now(), 200 + attempts: 0, 201 + }; 202 + 203 + await queue.enqueue(job); 204 + const dequeued = await queue.dequeue(); 205 + 206 + if (dequeued) { 207 + await queue.markFailed(dequeued, new Error("Clear test")); 208 + } 209 + 210 + const clearedCount = await queue.clearFailed(); 211 + expect(clearedCount).toBeGreaterThan(0); 212 + 213 + const stats = await queue.getStats(); 214 + expect(stats.failed).toBe(0); 215 + }); 216 + });
+147
tests/unit/hamming.test.ts
··· 1 + import { describe, expect, test } from "bun:test"; 2 + import { findMatch, hammingDistance } from "../../src/matcher/hamming"; 3 + import type { BlobCheck } from "../../src/types"; 4 + 5 + describe("hammingDistance", () => { 6 + test("should return 0 for identical hashes", () => { 7 + const distance = hammingDistance("a1b2c3d4e5f6a7b8", "a1b2c3d4e5f6a7b8"); 8 + expect(distance).toBe(0); 9 + }); 10 + 11 + test("should calculate correct distance for 1-bit difference", () => { 12 + const distance = hammingDistance("0000000000000000", "0000000000000001"); 13 + expect(distance).toBe(1); 14 + }); 15 + 16 + test("should calculate correct distance for multi-bit difference", () => { 17 + const distance = hammingDistance("0000000000000000", "000000000000000f"); 18 + expect(distance).toBe(4); 19 + }); 20 + 21 + test("should calculate correct distance for completely different hashes", () => { 22 + const distance = hammingDistance("0000000000000000", "ffffffffffffffff"); 23 + expect(distance).toBe(64); 24 + }); 25 + 26 + test("should calculate distance correctly for real-world hashes", () => { 27 + const hash1 = "a1b2c3d4e5f6a7b8"; 28 + const hash2 = "a1b2c3d4e5f6a7b9"; 29 + const distance = hammingDistance(hash1, hash2); 30 + expect(distance).toBeGreaterThan(0); 31 + expect(distance).toBeLessThan(64); 32 + }); 33 + }); 34 + 35 + describe("findMatch", () => { 36 + const sampleChecks: BlobCheck[] = [ 37 + { 38 + phashes: ["a1b2c3d4e5f6a7b8"], 39 + label: "test-label-1", 40 + comment: "Test comment", 41 + reportAcct: false, 42 + labelAcct: false, 43 + reportPost: true, 44 + toLabel: true, 45 + hammingThreshold: 5, 46 + }, 47 + { 48 + phashes: ["1234567890abcdef", "fedcba0987654321"], 49 + label: "test-label-2", 50 + comment: "Test comment 2", 51 + reportAcct: true, 52 + labelAcct: true, 53 + reportPost: true, 54 + toLabel: true, 55 + hammingThreshold: 3, 56 + }, 57 + ]; 58 + 59 + test("should find exact match", () => { 60 + const match = findMatch("a1b2c3d4e5f6a7b8", 
sampleChecks); 61 + expect(match).not.toBeNull(); 62 + expect(match?.label).toBe("test-label-1"); 63 + }); 64 + 65 + test("should find match within threshold", () => { 66 + const match = findMatch("a1b2c3d4e5f6a7b9", sampleChecks); 67 + expect(match).not.toBeNull(); 68 + expect(match?.label).toBe("test-label-1"); 69 + }); 70 + 71 + test("should not match if distance exceeds threshold", () => { 72 + const match = findMatch("0000000000000000", sampleChecks); 73 + expect(match).toBeNull(); 74 + }); 75 + 76 + test("should match against multiple phashes in same check", () => { 77 + const match = findMatch("fedcba0987654321", sampleChecks); 78 + expect(match).not.toBeNull(); 79 + expect(match?.label).toBe("test-label-2"); 80 + }); 81 + 82 + test("should respect custom threshold", () => { 83 + const checks: BlobCheck[] = [ 84 + { 85 + phashes: ["a1b2c3d4e5f6a7b8"], 86 + label: "strict-check", 87 + comment: "Strict threshold", 88 + reportAcct: true, 89 + labelAcct: false, 90 + reportPost: true, 91 + toLabel: true, 92 + hammingThreshold: 1, 93 + }, 94 + ]; 95 + 96 + const closeMatch = findMatch("a1b2c3d4e5f6a7bf", checks); 97 + expect(closeMatch).toBeNull(); 98 + }); 99 + 100 + test("should use default threshold of 5 when not specified", () => { 101 + const checks: BlobCheck[] = [ 102 + { 103 + phashes: ["a1b2c3d4e5f6a7b8"], 104 + label: "default-threshold", 105 + comment: "No threshold specified", 106 + reportAcct: false, 107 + labelAcct: false, 108 + reportPost: true, 109 + toLabel: true, 110 + }, 111 + ]; 112 + 113 + const match = findMatch("a1b2c3d4e5f6a7bd", checks); 114 + expect(match).not.toBeNull(); 115 + }); 116 + 117 + test("should return null for empty checks array", () => { 118 + const match = findMatch("a1b2c3d4e5f6a7b8", []); 119 + expect(match).toBeNull(); 120 + }); 121 + 122 + test("should return first matching check", () => { 123 + const checks: BlobCheck[] = [ 124 + { 125 + phashes: ["a1b2c3d4e5f6a7b8"], 126 + label: "first", 127 + comment: "First", 128 + 
reportAcct: false, 129 + labelAcct: false, 130 + reportPost: true, 131 + toLabel: true, 132 + }, 133 + { 134 + phashes: ["a1b2c3d4e5f6a7b8"], 135 + label: "second", 136 + comment: "Second", 137 + reportAcct: false, 138 + labelAcct: false, 139 + reportPost: true, 140 + toLabel: true, 141 + }, 142 + ]; 143 + 144 + const match = findMatch("a1b2c3d4e5f6a7b8", checks); 145 + expect(match?.label).toBe("first"); 146 + }); 147 + });
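The hamming tests above fully determine the matcher's observable behavior: hashes are 64-bit values encoded as 16 hex characters, the default threshold is 5, a check matches if any of its `phashes` is within threshold, and the first matching check wins. A sketch consistent with those assertions (the actual `src/matcher/hamming.ts` is not shown in this diff, so treat this as a reconstruction):

```typescript
// Reconstructed sketch of src/matcher/hamming.ts from the tested behavior.
interface BlobCheck {
  phashes: string[];
  label: string;
  comment: string;
  reportAcct: boolean;
  labelAcct: boolean;
  reportPost: boolean;
  toLabel: boolean;
  hammingThreshold?: number;
  ignoreDID?: string[];
}

const DEFAULT_THRESHOLD = 5;

// Number of differing bits between two 64-bit hashes given as 16 hex chars.
export function hammingDistance(a: string, b: string): number {
  let xor = BigInt(`0x${a}`) ^ BigInt(`0x${b}`);
  let bits = 0;
  while (xor > 0n) {
    bits++;
    xor &= xor - 1n; // clear the lowest set bit (Kernighan popcount)
  }
  return bits;
}

// Return the first check with any phash within its threshold, else null.
export function findMatch(phash: string, checks: BlobCheck[]): BlobCheck | null {
  for (const check of checks) {
    const threshold = check.hammingThreshold ?? DEFAULT_THRESHOLD;
    for (const known of check.phashes) {
      if (hammingDistance(phash, known) <= threshold) {
        return check;
      }
    }
  }
  return null;
}
```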
+405
tests/unit/image-processor.test.ts
··· 1 + import { beforeEach, describe, expect, mock, test } from "bun:test"; 2 + import sharp from "sharp"; 3 + import { ImageProcessor } from "../../src/processor/image-processor"; 4 + import type { BlobCheck, BlobReference } from "../../src/types"; 5 + 6 + function bufferToArrayBuffer(buffer: Buffer): ArrayBuffer { 7 + const ab = new ArrayBuffer(buffer.length); 8 + const view = new Uint8Array(ab); 9 + for (let i = 0; i < buffer.length; i++) { 10 + view[i] = buffer[i]; 11 + } 12 + return ab; 13 + } 14 + 15 + describe("ImageProcessor", () => { 16 + let processor: ImageProcessor; 17 + const testChecks: BlobCheck[] = [ 18 + { 19 + phashes: ["ffffffffffffffff"], 20 + label: "test-label", 21 + comment: "Test comment", 22 + reportAcct: false, 23 + labelAcct: false, 24 + reportPost: true, 25 + toLabel: true, 26 + hammingThreshold: 5, 27 + }, 28 + ]; 29 + 30 + beforeEach(() => { 31 + processor = new ImageProcessor(testChecks, "https://test.pds"); 32 + }); 33 + 34 + test("should skip non-image blobs", async () => { 35 + const mockFetch = mock(() => 36 + Promise.resolve({ 37 + ok: true, 38 + arrayBuffer: () => Promise.resolve(new ArrayBuffer(0)), 39 + } as Response) 40 + ); 41 + 42 + globalThis.fetch = mockFetch as unknown as typeof fetch; 43 + 44 + const blob: BlobReference = { 45 + cid: "test-cid", 46 + mimeType: "text/plain", 47 + }; 48 + 49 + const result = await processor.processBlob("did:test:123", blob); 50 + 51 + expect(result).toBeNull(); 52 + }); 53 + 54 + test("should skip SVG images", async () => { 55 + const mockFetch = mock(() => 56 + Promise.resolve({ 57 + ok: true, 58 + arrayBuffer: () => Promise.resolve(new ArrayBuffer(0)), 59 + } as Response) 60 + ); 61 + 62 + globalThis.fetch = mockFetch as unknown as typeof fetch; 63 + 64 + const blob: BlobReference = { 65 + cid: "test-cid", 66 + mimeType: "image/svg+xml", 67 + }; 68 + 69 + const result = await processor.processBlob("did:test:123", blob); 70 + 71 + expect(result).toBeNull(); 72 + }); 73 + 74 + 
test("should return null when blob fetch fails", async () => { 75 + const mockFetch = mock(() => 76 + Promise.resolve({ 77 + ok: false, 78 + status: 404, 79 + } as Response) 80 + ); 81 + 82 + globalThis.fetch = mockFetch as unknown as typeof fetch; 83 + 84 + const blob: BlobReference = { 85 + cid: "test-cid", 86 + mimeType: "image/png", 87 + }; 88 + 89 + const result = await processor.processBlob("did:test:123", blob); 90 + 91 + expect(result).toBeNull(); 92 + }); 93 + 94 + test("should process image and compute phash", async () => { 95 + const testImage = await sharp({ 96 + create: { 97 + width: 100, 98 + height: 100, 99 + channels: 3, 100 + background: { r: 128, g: 128, b: 128 }, 101 + }, 102 + }) 103 + .png() 104 + .toBuffer(); 105 + 106 + const mockFetch = mock(() => 107 + Promise.resolve({ 108 + ok: true, 109 + arrayBuffer: async () => bufferToArrayBuffer(testImage), 110 + } as Response) 111 + ); 112 + 113 + globalThis.fetch = mockFetch as unknown as typeof fetch; 114 + 115 + const blob: BlobReference = { 116 + cid: "test-cid", 117 + mimeType: "image/png", 118 + }; 119 + 120 + const result = await processor.processBlob("did:test:123", blob); 121 + 122 + expect(result).not.toBeNull(); 123 + expect(result?.cid).toBe("test-cid"); 124 + expect(result?.phash).toBeTypeOf("string"); 125 + expect(result?.phash).toHaveLength(16); 126 + }); 127 + 128 + test("should match against checks", async () => { 129 + const testImage = await sharp({ 130 + create: { 131 + width: 100, 132 + height: 100, 133 + channels: 3, 134 + background: { r: 128, g: 128, b: 128 }, 135 + }, 136 + }) 137 + .png() 138 + .toBuffer(); 139 + 140 + const mockFetch = mock(() => 141 + Promise.resolve({ 142 + ok: true, 143 + arrayBuffer: async () => bufferToArrayBuffer(testImage), 144 + } as Response) 145 + ); 146 + 147 + globalThis.fetch = mockFetch as unknown as typeof fetch; 148 + 149 + const blob: BlobReference = { 150 + cid: "test-cid", 151 + mimeType: "image/png", 152 + }; 153 + 154 + const 
testResult = await processor.processBlob("did:test:123", blob); 155 + const actualPhash = testResult?.phash || ""; 156 + 157 + const checksWithActualHash: BlobCheck[] = [ 158 + { 159 + phashes: [actualPhash], 160 + label: "matching-label", 161 + comment: "Test match", 162 + reportAcct: false, 163 + labelAcct: true, 164 + reportPost: true, 165 + toLabel: true, 166 + hammingThreshold: 0, 167 + }, 168 + ]; 169 + 170 + const matchingProcessor = new ImageProcessor(checksWithActualHash, "https://test.pds"); 171 + 172 + const result = await matchingProcessor.processBlob("did:test:123", blob); 173 + 174 + expect(result?.match).not.toBeNull(); 175 + expect(result?.match?.matchedCheck.label).toBe("matching-label"); 176 + expect(result?.match?.hammingDistance).toBe(0); 177 + }); 178 + 179 + test("should process multiple blobs in a post", async () => { 180 + const testImage = await sharp({ 181 + create: { 182 + width: 100, 183 + height: 100, 184 + channels: 3, 185 + background: { r: 128, g: 128, b: 128 }, 186 + }, 187 + }) 188 + .png() 189 + .toBuffer(); 190 + 191 + const mockFetch = mock(() => 192 + Promise.resolve({ 193 + ok: true, 194 + arrayBuffer: async () => bufferToArrayBuffer(testImage), 195 + } as Response) 196 + ); 197 + 198 + globalThis.fetch = mockFetch as unknown as typeof fetch; 199 + 200 + const blobs: BlobReference[] = [ 201 + { cid: "cid-1", mimeType: "image/png" }, 202 + { cid: "cid-2", mimeType: "image/jpeg" }, 203 + ]; 204 + 205 + const results = await processor.processPost("at://did:test:123/post/1", "did:test:123", blobs); 206 + 207 + expect(results).toHaveLength(2); 208 + expect(results[0].cid).toBe("cid-1"); 209 + expect(results[1].cid).toBe("cid-2"); 210 + }); 211 + 212 + test("should continue processing other blobs if one fails", async () => { 213 + const testImage = await sharp({ 214 + create: { 215 + width: 100, 216 + height: 100, 217 + channels: 3, 218 + background: { r: 128, g: 128, b: 128 }, 219 + }, 220 + }) 221 + .png() 222 + .toBuffer(); 223 + 
224 + let callCount = 0; 225 + const mockFetch = mock(() => { 226 + callCount++; 227 + if (callCount === 1) { 228 + return Promise.resolve({ 229 + ok: false, 230 + status: 404, 231 + } as Response); 232 + } 233 + return Promise.resolve({ 234 + ok: true, 235 + arrayBuffer: async () => bufferToArrayBuffer(testImage), 236 + } as Response); 237 + }); 238 + 239 + globalThis.fetch = mockFetch as unknown as typeof fetch; 240 + 241 + const blobs: BlobReference[] = [ 242 + { cid: "cid-1", mimeType: "image/png" }, 243 + { cid: "cid-2", mimeType: "image/jpeg" }, 244 + ]; 245 + 246 + const results = await processor.processPost("at://did:test:123/post/1", "did:test:123", blobs); 247 + 248 + expect(results).toHaveLength(1); 249 + expect(results[0].cid).toBe("cid-2"); 250 + }); 251 + 252 + test("should skip processing if DID is in ignoreDID list", async () => { 253 + const checksWithIgnore: BlobCheck[] = [ 254 + { 255 + phashes: ["0000000000000000"], 256 + label: "test-label", 257 + comment: "Test comment", 258 + reportAcct: false, 259 + labelAcct: false, 260 + reportPost: true, 261 + toLabel: true, 262 + hammingThreshold: 5, 263 + ignoreDID: ["did:plc:ignored123", "did:plc:trusted456"], 264 + }, 265 + ]; 266 + 267 + const processorWithIgnore = new ImageProcessor(checksWithIgnore, "https://test.pds"); 268 + 269 + const testImage = await sharp({ 270 + create: { 271 + width: 100, 272 + height: 100, 273 + channels: 3, 274 + background: { r: 128, g: 128, b: 128 }, 275 + }, 276 + }) 277 + .png() 278 + .toBuffer(); 279 + 280 + const mockFetch = mock(() => 281 + Promise.resolve({ 282 + ok: true, 283 + arrayBuffer: async () => bufferToArrayBuffer(testImage), 284 + } as Response) 285 + ); 286 + 287 + globalThis.fetch = mockFetch as unknown as typeof fetch; 288 + 289 + const blob: BlobReference = { 290 + cid: "test-cid", 291 + mimeType: "image/png", 292 + }; 293 + 294 + const result = await processorWithIgnore.processBlob("did:plc:ignored123", blob); 295 + 296 + expect(result).toBeNull(); 
297 + expect(mockFetch).not.toHaveBeenCalled(); 298 + }); 299 + 300 + test("should process normally if DID is not in ignoreDID list", async () => { 301 + const checksWithIgnore: BlobCheck[] = [ 302 + { 303 + phashes: ["0000000000000000"], 304 + label: "matching-label", 305 + comment: "Test comment", 306 + reportAcct: false, 307 + labelAcct: false, 308 + reportPost: true, 309 + toLabel: true, 310 + hammingThreshold: 5, 311 + ignoreDID: ["did:plc:ignored123"], 312 + }, 313 + ]; 314 + 315 + const processorWithIgnore = new ImageProcessor(checksWithIgnore, "https://test.pds"); 316 + 317 + const testImage = await sharp({ 318 + create: { 319 + width: 100, 320 + height: 100, 321 + channels: 3, 322 + background: { r: 128, g: 128, b: 128 }, 323 + }, 324 + }) 325 + .png() 326 + .toBuffer(); 327 + 328 + const mockFetch = mock(() => 329 + Promise.resolve({ 330 + ok: true, 331 + arrayBuffer: async () => bufferToArrayBuffer(testImage), 332 + } as Response) 333 + ); 334 + 335 + globalThis.fetch = mockFetch as unknown as typeof fetch; 336 + 337 + const blob: BlobReference = { 338 + cid: "test-cid", 339 + mimeType: "image/png", 340 + }; 341 + 342 + const result = await processorWithIgnore.processBlob("did:plc:allowed999", blob); 343 + 344 + expect(result).not.toBeNull(); 345 + expect(result?.match?.matchedCheck.label).toBe("matching-label"); 346 + }); 347 + 348 + test("should apply checks without ignoreDID to all DIDs", async () => { 349 + const mixedChecks: BlobCheck[] = [ 350 + { 351 + phashes: ["0000000000000000"], 352 + label: "check-with-ignore", 353 + comment: "Test", 354 + reportAcct: false, 355 + labelAcct: false, 356 + reportPost: true, 357 + toLabel: true, 358 + hammingThreshold: 5, 359 + ignoreDID: ["did:plc:ignored123"], 360 + }, 361 + { 362 + phashes: ["0000000000000000"], 363 + label: "check-without-ignore", 364 + comment: "Test", 365 + reportAcct: false, 366 + labelAcct: false, 367 + reportPost: true, 368 + toLabel: true, 369 + hammingThreshold: 5, 370 + }, 371 + ]; 
372 + 373 + const processorMixed = new ImageProcessor(mixedChecks, "https://test.pds"); 374 + 375 + const testImage = await sharp({ 376 + create: { 377 + width: 100, 378 + height: 100, 379 + channels: 3, 380 + background: { r: 128, g: 128, b: 128 }, 381 + }, 382 + }) 383 + .png() 384 + .toBuffer(); 385 + 386 + const mockFetch = mock(() => 387 + Promise.resolve({ 388 + ok: true, 389 + arrayBuffer: async () => bufferToArrayBuffer(testImage), 390 + } as Response) 391 + ); 392 + 393 + globalThis.fetch = mockFetch as unknown as typeof fetch; 394 + 395 + const blob: BlobReference = { 396 + cid: "test-cid", 397 + mimeType: "image/png", 398 + }; 399 + 400 + const result = await processorMixed.processBlob("did:plc:ignored123", blob); 401 + 402 + expect(result).not.toBeNull(); 403 + expect(result?.match?.matchedCheck.label).toBe("check-without-ignore"); 404 + }); 405 + });
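The first three tests above assert that `processBlob` bails out before fetching when the blob is not a raster image. That gate can be captured as a small predicate — a hypothetical helper written for illustration, not a function from the source:

```typescript
// Hypothetical predicate for the gate the "skip" tests exercise: only
// raster image blobs are fetched and hashed. SVG is excluded because it
// is vector markup with no single pixel representation to hash.
export function isHashableImage(mimeType: string | undefined): boolean {
  if (!mimeType || !mimeType.startsWith("image/")) return false;
  if (mimeType === "image/svg+xml") return false;
  return true;
}
```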
+137
tests/unit/phash-cache.test.ts
··· 1 + import { afterAll, beforeAll, describe, expect, it } from "bun:test"; 2 + import Redis from "ioredis"; 3 + import { PhashCache } from "../../src/cache/phash-cache"; 4 + 5 + describe("PhashCache", () => { 6 + let redis: Redis | undefined; 7 + let cache: PhashCache | undefined; 8 + let redisAvailable = false; 9 + const testRedisUrl = process.env.REDIS_URL || "redis://localhost:6379"; 10 + 11 + beforeAll(async () => { 12 + try { 13 + redis = new Redis(testRedisUrl, { 14 + connectTimeout: 2000, 15 + maxRetriesPerRequest: 1, 16 + }); 17 + await redis.ping(); 18 + await redis.flushdb(); 19 + cache = new PhashCache(redis, 60); 20 + redisAvailable = true; 21 + } catch (_error) { 22 + redisAvailable = false; 23 + console.log("Redis not available, skipping cache tests"); 24 + if (redis) { 25 + redis.disconnect(); 26 + } 27 + } 28 + }); 29 + 30 + afterAll(async () => { 31 + if (redis) { 32 + await redis.quit(); 33 + } 34 + }); 35 + 36 + it("should cache and retrieve phash by CID", async () => { 37 + if (!redisAvailable || !cache) { 38 + return; 39 + } 40 + 41 + const cid = "bafytest123"; 42 + const phash = "0123456789abcdef"; 43 + 44 + await cache.set(cid, phash); 45 + const retrieved = await cache.get(cid); 46 + 47 + expect(retrieved).toBe(phash); 48 + }); 49 + 50 + it("should return null for non-existent CID", async () => { 51 + if (!redisAvailable || !cache) { 52 + return; 53 + } 54 + 55 + const result = await cache.get("nonexistent"); 56 + expect(result).toBeNull(); 57 + }); 58 + 59 + it("should delete cached phash", async () => { 60 + if (!redisAvailable || !cache) { 61 + return; 62 + } 63 + 64 + const cid = "bafydelete"; 65 + const phash = "fedcba9876543210"; 66 + 67 + await cache.set(cid, phash); 68 + let retrieved = await cache.get(cid); 69 + expect(retrieved).toBe(phash); 70 + 71 + await cache.delete(cid); 72 + retrieved = await cache.get(cid); 73 + expect(retrieved).toBeNull(); 74 + }); 75 + 76 + it("should clear all cached phashes", async () => { 77 + if 
(!redisAvailable || !cache || !redis) { 78 + return; 79 + } 80 + 81 + await redis.flushdb(); 82 + 83 + await cache.set("cid1", "hash1"); 84 + await cache.set("cid2", "hash2"); 85 + await cache.set("cid3", "hash3"); 86 + 87 + let stats = await cache.getStats(); 88 + expect(stats.size).toBe(3); 89 + 90 + await cache.clear(); 91 + 92 + stats = await cache.getStats(); 93 + expect(stats.size).toBe(0); 94 + }); 95 + 96 + it("should track cache size accurately", async () => { 97 + if (!redisAvailable || !cache || !redis) { 98 + return; 99 + } 100 + 101 + await redis.flushdb(); 102 + 103 + let stats = await cache.getStats(); 104 + expect(stats.size).toBe(0); 105 + 106 + await cache.set("cid1", "hash1"); 107 + stats = await cache.getStats(); 108 + expect(stats.size).toBe(1); 109 + 110 + await cache.set("cid2", "hash2"); 111 + stats = await cache.getStats(); 112 + expect(stats.size).toBe(2); 113 + 114 + await cache.delete("cid1"); 115 + stats = await cache.getStats(); 116 + expect(stats.size).toBe(1); 117 + }); 118 + 119 + it("should expire cached phashes after TTL", async () => { 120 + if (!redisAvailable || !redis) { 121 + return; 122 + } 123 + 124 + const shortTtlCache = new PhashCache(redis, 1); 125 + const cid = "bafyexpire"; 126 + const phash = "expirablehash"; 127 + 128 + await shortTtlCache.set(cid, phash); 129 + let retrieved = await shortTtlCache.get(cid); 130 + expect(retrieved).toBe(phash); 131 + 132 + await new Promise((resolve) => setTimeout(resolve, 1100)); 133 + 134 + retrieved = await shortTtlCache.get(cid); 135 + expect(retrieved).toBeNull(); 136 + }); 137 + });
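The cache tests above describe a simple cid → phash map with a TTL, deletion, clearing, and a size stat. A minimal in-memory sketch of that contract, using expiry timestamps in place of Redis key TTLs (hypothetical class, not the Redis-backed `PhashCache` from the source):

```typescript
// Hypothetical in-memory sketch of the PhashCache contract; the real
// implementation stores keys in Redis and relies on Redis TTL expiry.
class InMemoryPhashCache {
  private store = new Map<string, { phash: string; expiresAt: number }>();

  constructor(private ttlSeconds: number) {}

  async set(cid: string, phash: string): Promise<void> {
    this.store.set(cid, {
      phash,
      expiresAt: Date.now() + this.ttlSeconds * 1000,
    });
  }

  async get(cid: string): Promise<string | null> {
    const entry = this.store.get(cid);
    if (!entry) return null;
    if (Date.now() >= entry.expiresAt) {
      this.store.delete(cid); // lazy expiry, approximating a Redis key TTL
      return null;
    }
    return entry.phash;
  }

  async delete(cid: string): Promise<void> {
    this.store.delete(cid);
  }

  async clear(): Promise<void> {
    this.store.clear();
  }

  async getStats(): Promise<{ size: number }> {
    return { size: this.store.size };
  }
}
```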
+135
tests/unit/phash.test.ts
··· 1 + import { describe, expect, test } from "bun:test"; 2 + import sharp from "sharp"; 3 + import { computePerceptualHash } from "../../src/hasher/phash"; 4 + 5 + describe("computePerceptualHash", () => { 6 + test("should compute hash for a solid color image", async () => { 7 + const buffer = await sharp({ 8 + create: { 9 + width: 100, 10 + height: 100, 11 + channels: 3, 12 + background: { r: 128, g: 128, b: 128 }, 13 + }, 14 + }) 15 + .png() 16 + .toBuffer(); 17 + 18 + const hash = await computePerceptualHash(buffer); 19 + 20 + expect(hash).toBeTypeOf("string"); 21 + expect(hash).toHaveLength(16); 22 + expect(hash).toMatch(/^[0-9a-f]+$/); 23 + }); 24 + 25 + test("should compute same hash for identical images", async () => { 26 + const buffer1 = await sharp({ 27 + create: { 28 + width: 200, 29 + height: 200, 30 + channels: 3, 31 + background: { r: 255, g: 0, b: 0 }, 32 + }, 33 + }) 34 + .png() 35 + .toBuffer(); 36 + 37 + const buffer2 = await sharp({ 38 + create: { 39 + width: 200, 40 + height: 200, 41 + channels: 3, 42 + background: { r: 255, g: 0, b: 0 }, 43 + }, 44 + }) 45 + .png() 46 + .toBuffer(); 47 + 48 + const hash1 = await computePerceptualHash(buffer1); 49 + const hash2 = await computePerceptualHash(buffer2); 50 + 51 + expect(hash1).toBe(hash2); 52 + }); 53 + 54 + test("should compute different hashes for different images", async () => { 55 + const whitePixel = Buffer.from([255, 255, 255]); 56 + const blackPixel = Buffer.from([0, 0, 0]); 57 + 58 + const buffer1 = await sharp( 59 + Buffer.concat([...new Array(50).fill(whitePixel), ...new Array(50).fill(blackPixel)]), 60 + { 61 + raw: { 62 + width: 10, 63 + height: 10, 64 + channels: 3, 65 + }, 66 + } 67 + ) 68 + .png() 69 + .toBuffer(); 70 + 71 + const buffer2 = await sharp( 72 + Buffer.concat([...new Array(50).fill(blackPixel), ...new Array(50).fill(whitePixel)]), 73 + { 74 + raw: { 75 + width: 10, 76 + height: 10, 77 + channels: 3, 78 + }, 79 + } 80 + ) 81 + .png() 82 + .toBuffer(); 83 + 84 + const 
hash1 = await computePerceptualHash(buffer1); 85 + const hash2 = await computePerceptualHash(buffer2); 86 + 87 + expect(hash1).not.toBe(hash2); 88 + }); 89 + 90 + test("should compute similar hashes for slightly modified images", async () => { 91 + const buffer1 = await sharp({ 92 + create: { 93 + width: 100, 94 + height: 100, 95 + channels: 3, 96 + background: { r: 128, g: 128, b: 128 }, 97 + }, 98 + }) 99 + .png() 100 + .toBuffer(); 101 + 102 + const buffer2 = await sharp({ 103 + create: { 104 + width: 100, 105 + height: 100, 106 + channels: 3, 107 + background: { r: 130, g: 130, b: 130 }, 108 + }, 109 + }) 110 + .png() 111 + .toBuffer(); 112 + 113 + const hash1 = await computePerceptualHash(buffer1); 114 + const hash2 = await computePerceptualHash(buffer2); 115 + 116 + const h1 = BigInt(`0x${hash1}`); 117 + const h2 = BigInt(`0x${hash2}`); 118 + const xor = h1 ^ h2; 119 + 120 + let hammingDistance = 0; 121 + let n = xor; 122 + while (n > 0n) { 123 + hammingDistance++; 124 + n &= n - 1n; 125 + } 126 + 127 + expect(hammingDistance).toBeLessThan(10); 128 + }); 129 + 130 + test("should throw error for invalid image", async () => { 131 + const invalidBuffer = Buffer.from("not an image"); 132 + 133 + await expect(computePerceptualHash(invalidBuffer)).rejects.toThrow(); 134 + }); 135 + });
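The phash tests only pin down the output shape — a 64-bit hash rendered as 16 lowercase hex characters, stable for identical inputs and close for similar ones. The actual `src/hasher/phash.ts` (not shown in this diff) may well use a DCT-based pHash; as an illustration of the general technique, here is the core of an average-hash (aHash) over 64 grayscale samples, with the Sharp decode/downscale step omitted:

```typescript
// Hypothetical aHash-style core: given 64 grayscale samples (e.g. an image
// downscaled to 8x8), set each bit to whether the pixel exceeds the mean,
// and render the 64-bit result as 16 hex characters. The real
// computePerceptualHash may use a DCT-based pHash instead.
export function averageHash64(pixels: number[]): string {
  if (pixels.length !== 64) {
    throw new Error("expected exactly 64 grayscale samples");
  }
  const mean = pixels.reduce((sum, p) => sum + p, 0) / 64;
  let bits = 0n;
  for (const p of pixels) {
    bits = (bits << 1n) | (p > mean ? 1n : 0n);
  }
  // Pad to a fixed 16-hex-char (64-bit) string, matching the tested shape.
  return bits.toString(16).padStart(16, "0");
}
```

Because each bit is relative to the image's own mean, the hash is insensitive to small global brightness shifts — which is why the "slightly modified images" test above can expect a small Hamming distance between the two gray images.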
+20
tsconfig.json
··· 1 + { 2 + "compilerOptions": { 3 + "target": "ES2022", 4 + "module": "ESNext", 5 + "lib": ["ES2022"], 6 + "moduleResolution": "bundler", 7 + "allowImportingTsExtensions": true, 8 + "noEmit": true, 9 + "composite": false, 10 + "strict": true, 11 + "esModuleInterop": true, 12 + "skipLibCheck": true, 13 + "forceConsistentCasingInFileNames": true, 14 + "resolveJsonModule": true, 15 + "isolatedModules": true, 16 + "types": ["bun-types"] 17 + }, 18 + "include": ["src/**/*", "tests/**/*"], 19 + "exclude": ["node_modules"] 20 + }