Your music, beautifully tracked. All yours. (coming soon) teal.fm
teal-fm atproto

sqlx + some ci stuff

Changed files: +2562 -51 (.github, .sqlx, services)
+201
.github/WORKFLOWS.md
# GitHub Actions Workflows Documentation

This document describes the CI/CD workflows configured for the Teal project.

## Overview

The project uses GitHub Actions for continuous integration, deployment, and security scanning. The workflows are designed to handle a polyglot codebase with Rust services, Node.js packages, and a React Native application.

## Workflows

### 🔧 CI (`ci.yml`)

**Triggers:** Push/PR to `main` or `develop` branches

**Purpose:** Primary continuous integration workflow that runs tests, linting, and type checking.

**Jobs:**
- **rust-check**: Checks formatting and runs clippy for all Rust code in `services/` and `apps/`, and runs tests for the `services/` workspace
- **node-check**: Type checking, linting, building, and testing Node.js packages
- **lexicon-check**: Validates lexicon files and ensures generated code is up to date

**Key Features:**
- Caches Rust and Node.js dependencies for faster builds
- Runs jobs in parallel for optimal performance
- Fails fast if any check fails

### 🚀 Aqua (`aqua.yml`)

**Triggers:** Push/PR to `main` with changes to `apps/aqua/**`

**Purpose:** Builds and pushes the Aqua Rust application Docker image.

**Features:**
- Multi-platform builds (linux/amd64, linux/arm64)
- Pushes to GitHub Container Registry (ghcr.io)
- Only pushes on main branch (not PRs)
- Uses GitHub Actions cache for Docker layers

### 🤖 Cadet (`cadet.yml`)

**Triggers:** Push/PR to `main` with changes to `services/cadet/**`

**Purpose:** Builds and pushes the Cadet Rust service Docker image.

**Features:**
- Multi-platform builds (linux/amd64, linux/arm64)
- Pushes to GitHub Container Registry (ghcr.io)
- Only pushes on main branch (not PRs)
- Uses GitHub Actions cache for Docker layers

### 🔮 Amethyst (`amethyst.yml`)

**Triggers:** Push/PR to `main` with changes to `apps/amethyst/**`

**Purpose:** Builds the React Native/Expo application for different platforms.

**Jobs:**
- **build-web**: Builds web version and uploads artifacts
- **build-ios**: Builds iOS version (only on main branch pushes, requires macOS runner)
- **lint-and-test**: Type checking and testing

**Features:**
- Generates lexicons before building
- Platform-specific builds
- Artifact uploads for build assets

### 🛠️ Services (`services.yml`)

**Triggers:** Push/PR to `main` with changes to `services/**`

**Purpose:** Dynamically detects and builds all services with Dockerfiles.

**Jobs:**
- **detect-services**: Scans for services with Dockerfiles
- **build-service**: Matrix build for each detected service
- **test-services**: Runs tests for all services

**Features:**
- Dynamic service detection
- Skips special directories (target, migrations, types, .sqlx)
- Per-service Docker caching
- Multi-platform builds

### 🎉 Release (`release.yml`)

**Triggers:**
- Push to tags matching `v*`
- Manual workflow dispatch

**Purpose:** Creates GitHub releases and builds production Docker images.

**Jobs:**
- **create-release**: Creates GitHub release with changelog
- **build-and-release-aqua**: Builds and tags Aqua for release
- **build-and-release-cadet**: Builds and tags Cadet for release
- **release-other-services**: Builds other services (rocketman, satellite)
- **build-and-release-amethyst**: Builds Amethyst and uploads to release

**Features:**
- Automatic changelog extraction
- Production Docker tags (latest + version)
- Release artifact uploads
- Support for pre-releases (tags with `-`)

### 🔒 Security (`security.yml`)

**Triggers:**
- Push/PR to `main` or `develop`
- Weekly on Sundays at 2 AM UTC (cron `0 2 * * 0`)

**Purpose:** Comprehensive security scanning and vulnerability detection.

**Jobs:**
- **rust-security-audit**: Uses `cargo audit` for Rust dependencies
- **node-security-audit**: Uses `pnpm audit` for Node.js dependencies
- **codeql-analysis**: GitHub's semantic code analysis
- **docker-security-scan**: Trivy vulnerability scanning for Docker images
- **secrets-scan**: TruffleHog for secrets detection

**Features:**
- Fails on high/critical vulnerabilities
- SARIF upload for security tab integration
- Historical scanning with git history

## Configuration Files

### Dependabot (`dependabot.yml`)

Automated dependency updates for:
- **npm**: Weekly updates for Node.js dependencies
- **cargo**: Weekly updates for Rust dependencies (services + apps)
- **github-actions**: Weekly updates for workflow actions
- **docker**: Weekly updates for Docker base images

**Schedule:** Monday-Tuesday mornings, staggered to avoid conflicts

## Container Registry

All Docker images are pushed to GitHub Container Registry:
- `ghcr.io/[owner]/[repo]/aqua`
- `ghcr.io/[owner]/[repo]/cadet`
- `ghcr.io/[owner]/[repo]/[service-name]`

**Tags:**
- `latest`: Latest build from main branch
- `sha-[commit]`: Specific commit builds
- `v[version]`: Release builds
- `pr-[number]`: Pull request builds (for testing)

## Secrets and Permissions

**Required secrets:**
- `GITHUB_TOKEN`: Automatically provided (for registry access and releases)

**Permissions used:**
- `contents: read`: Read repository contents
- `packages: write`: Push to GitHub Container Registry
- `security-events: write`: Upload security scan results
- `actions: read`: Access workflow information

## Best Practices

1. **Path-based triggers**: Workflows only run when relevant files change
2. **Caching**: Aggressive caching for Rust, Node.js, and Docker layers
3. **Multi-platform**: Docker images built for amd64 and arm64
4. **Security-first**: Regular vulnerability scanning and secrets detection
5. **Fail-fast**: Early termination on critical issues
6. **Artifact preservation**: Build outputs stored for debugging/deployment

## Usage Examples

### Manual Release
```bash
# Tag and push for automatic release
git tag v1.0.0
git push origin v1.0.0

# Or use workflow dispatch in GitHub UI
```

### Local Development
```bash
# Run the same checks locally
pnpm rust:fmt
pnpm rust:clippy
pnpm typecheck
pnpm test
```

### Debugging Failed Builds
1. Check the Actions tab for detailed logs
2. Download artifacts from successful builds
3. Use the same commands locally with cached dependencies

## Maintenance

- **Weekly**: Review Dependabot PRs
- **Monthly**: Update action versions if not auto-updated
- **Quarterly**: Review and update security scanning tools
- **As needed**: Add new services to release workflow matrix
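As a quick sanity check of the tag scheme above (a sketch; `OWNER/REPO` stands in for the real `[owner]/[repo]`):

```bash
# Pull the latest main-branch build of a service and confirm the multi-platform manifest
docker pull ghcr.io/OWNER/REPO/cadet:latest
docker buildx imagetools inspect ghcr.io/OWNER/REPO/cadet:latest   # should list linux/amd64 and linux/arm64
```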
+96
.github/dependabot.yml
··· 1 + version: 2 2 + updates: 3 + # Enable version updates for npm (Node.js dependencies) 4 + - package-ecosystem: "npm" 5 + directory: "/" 6 + schedule: 7 + interval: "weekly" 8 + day: "monday" 9 + time: "09:00" 10 + open-pull-requests-limit: 10 11 + assignees: 12 + - "@me" 13 + reviewers: 14 + - "@me" 15 + commit-message: 16 + prefix: "deps" 17 + include: "scope" 18 + 19 + # Enable version updates for Cargo (Rust dependencies) - services 20 + - package-ecosystem: "cargo" 21 + directory: "/services" 22 + schedule: 23 + interval: "weekly" 24 + day: "monday" 25 + time: "10:00" 26 + open-pull-requests-limit: 5 27 + assignees: 28 + - "@me" 29 + reviewers: 30 + - "@me" 31 + commit-message: 32 + prefix: "deps(rust)" 33 + include: "scope" 34 + 35 + # Enable version updates for Cargo (Rust dependencies) - aqua app 36 + - package-ecosystem: "cargo" 37 + directory: "/apps/aqua" 38 + schedule: 39 + interval: "weekly" 40 + day: "monday" 41 + time: "10:30" 42 + open-pull-requests-limit: 5 43 + assignees: 44 + - "@me" 45 + reviewers: 46 + - "@me" 47 + commit-message: 48 + prefix: "deps(rust)" 49 + include: "scope" 50 + 51 + # Enable version updates for GitHub Actions 52 + - package-ecosystem: "github-actions" 53 + directory: "/" 54 + schedule: 55 + interval: "weekly" 56 + day: "monday" 57 + time: "11:00" 58 + open-pull-requests-limit: 5 59 + assignees: 60 + - "@me" 61 + reviewers: 62 + - "@me" 63 + commit-message: 64 + prefix: "deps(actions)" 65 + include: "scope" 66 + 67 + # Enable version updates for Docker 68 + - package-ecosystem: "docker" 69 + directory: "/apps/aqua" 70 + schedule: 71 + interval: "weekly" 72 + day: "tuesday" 73 + time: "09:00" 74 + open-pull-requests-limit: 3 75 + assignees: 76 + - "@me" 77 + reviewers: 78 + - "@me" 79 + commit-message: 80 + prefix: "deps(docker)" 81 + include: "scope" 82 + 83 + - package-ecosystem: "docker" 84 + directory: "/services/cadet" 85 + schedule: 86 + interval: "weekly" 87 + day: "tuesday" 88 + time: "09:30" 89 + open-pull-requests-limit: 3 90 + assignees: 91 + - "@me" 92 + reviewers: 93 + - "@me" 94 + commit-message: 95 + prefix: "deps(docker)" 96 + include: "scope"
+128
.github/workflows/amethyst.yml
··· 1 + # yaml-language-server: $schema=https://json.schemastore.org/github-workflow.json 2 + 3 + name: Build Amethyst 4 + 5 + on: 6 + push: 7 + branches: [main] 8 + paths: 9 + - "apps/amethyst/**" 10 + - "packages/**" 11 + - "lexicons/**" 12 + - "package.json" 13 + - "pnpm-lock.yaml" 14 + - ".github/workflows/amethyst.yml" 15 + pull_request: 16 + branches: [main] 17 + paths: 18 + - "apps/amethyst/**" 19 + - "packages/**" 20 + - "lexicons/**" 21 + - "package.json" 22 + - "pnpm-lock.yaml" 23 + - ".github/workflows/amethyst.yml" 24 + 25 + jobs: 26 + build-web: 27 + name: Build Web 28 + runs-on: ubuntu-latest 29 + steps: 30 + - name: Checkout repository 31 + uses: actions/checkout@v4 32 + 33 + - name: Install pnpm 34 + uses: pnpm/action-setup@v4 35 + 36 + - name: Setup Node.js 37 + uses: actions/setup-node@v4 38 + with: 39 + node-version: "20" 40 + cache: "pnpm" 41 + 42 + - name: Install dependencies 43 + run: pnpm install --frozen-lockfile 44 + 45 + - name: Generate lexicons 46 + run: pnpm lex:gen-server 47 + 48 + - name: Build web 49 + run: | 50 + cd apps/amethyst 51 + pnpm build:web 52 + 53 + - name: Upload web build artifacts 54 + uses: actions/upload-artifact@v4 55 + with: 56 + name: amethyst-web-build 57 + path: apps/amethyst/build/ 58 + retention-days: 7 59 + 60 + build-ios: 61 + name: Build iOS 62 + runs-on: macos-latest 63 + if: github.event_name == 'push' && github.ref == 'refs/heads/main' 64 + steps: 65 + - name: Checkout repository 66 + uses: actions/checkout@v4 67 + 68 + - name: Install pnpm 69 + uses: pnpm/action-setup@v4 70 + 71 + - name: Setup Node.js 72 + uses: actions/setup-node@v4 73 + with: 74 + node-version: "20" 75 + cache: "pnpm" 76 + 77 + - name: Install dependencies 78 + run: pnpm install --frozen-lockfile 79 + 80 + - name: Generate lexicons 81 + run: pnpm lex:gen-server 82 + 83 + - name: Setup Expo CLI 84 + run: npm install -g @expo/cli 85 + 86 + - name: Build iOS 87 + run: | 88 + cd apps/amethyst 89 + pnpm build:ios 90 + 91 + - name: Upload iOS build artifacts 92 + uses: actions/upload-artifact@v4 93 + with: 94 + name: amethyst-ios-build 95 + path: apps/amethyst/build/ 96 + retention-days: 7 97 + 98 + lint-and-test: 99 + name: Lint and Test 100 + runs-on: ubuntu-latest 101 + steps: 102 + - name: Checkout repository 103 + uses: actions/checkout@v4 104 + 105 + - name: Install pnpm 106 + uses: pnpm/action-setup@v4 107 + 108 + - name: Setup Node.js 109 + uses: actions/setup-node@v4 110 + with: 111 + node-version: "20" 112 + cache: "pnpm" 113 + 114 + - name: Install dependencies 115 + run: pnpm install --frozen-lockfile 116 + 117 + - name: Generate lexicons 118 + run: pnpm lex:gen-server 119 + 120 + - name: Type check 121 + run: | 122 + cd apps/amethyst 123 + npx tsc --noEmit 124 + 125 + - name: Run tests 126 + run: | 127 + cd apps/amethyst 128 + pnpm test --watchAll=false
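The build-web job can be reproduced locally with the same scripts the workflow calls (run from the repository root):

```bash
# Mirrors the build-web job: install, generate lexicons, then build the web bundle
pnpm install --frozen-lockfile
pnpm lex:gen-server
cd apps/amethyst && pnpm build:web   # output lands in apps/amethyst/build/
```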
+7 -3
.github/workflows/appview.yml → .github/workflows/aqua.yml
··· 1 + # yaml-language-server: $schema=https://json.schemastore.org/github-workflow.json 2 + 1 3 name: Build and Push Aqua 2 4 3 5 on: 4 6 push: 5 - branches: [main] 7 + branches: 8 + - main 6 9 paths: 7 10 - "apps/aqua/**" 8 11 - "Cargo.toml" 9 12 - "Cargo.lock" 10 13 - ".github/workflows/aqua.yml" 11 14 pull_request: 12 - branches: [main] 15 + branches: 16 + - main 13 17 paths: 14 18 - "apps/aqua/**" 15 19 - "Cargo.toml" ··· 17 21 - ".github/workflows/aqua.yml" 18 22 19 23 env: 20 24 REGISTRY: ghcr.io 21 25 IMAGE_NAME: ${{ github.repository }}/aqua 22 26 23 27 jobs:
+44 -42
.github/workflows/cadet.yml
··· 1 + # yaml-language-server: $schema=https://json.schemastore.org/github-workflow.json 2 + 1 3 name: Build and Push Cadet 2 4 3 5 on: 4 6 push: 5 - branches: [ main ] 7 + branches: [main] 6 8 paths: 7 - - 'services/cadet/**' 8 - - 'Cargo.toml' 9 - - 'Cargo.lock' 10 - - '.github/workflows/cadet.yml' 9 + - "services/cadet/**" 10 + - "Cargo.toml" 11 + - "Cargo.lock" 12 + - ".github/workflows/cadet.yml" 11 13 pull_request: 12 - branches: [ main ] 14 + branches: [main] 13 15 paths: 14 - - 'services/cadet/**' 15 - - 'Cargo.toml' 16 - - 'Cargo.lock' 17 - - '.github/workflows/cadet.yml' 16 + - "services/cadet/**" 17 + - "Cargo.toml" 18 + - "Cargo.lock" 19 + - ".github/workflows/cadet.yml" 18 20 19 21 env: 20 22 REGISTRY: ghcr.io ··· 28 30 packages: write 29 31 30 32 steps: 31 - - name: Checkout repository 32 - uses: actions/checkout@v4 33 + - name: Checkout repository 34 + uses: actions/checkout@v4 33 35 34 - - name: Log in to Container Registry 35 - if: github.event_name != 'pull_request' 36 - uses: docker/login-action@v3 37 - with: 38 - registry: ${{ env.REGISTRY }} 39 - username: ${{ github.actor }} 40 - password: ${{ secrets.GITHUB_TOKEN }} 36 + - name: Log in to Container Registry 37 + if: github.event_name != 'pull_request' 38 + uses: docker/login-action@v3 39 + with: 40 + registry: ${{ env.REGISTRY }} 41 + username: ${{ github.actor }} 42 + password: ${{ secrets.GITHUB_TOKEN }} 41 43 42 - - name: Extract metadata 43 - id: meta 44 - uses: docker/metadata-action@v5 45 - with: 46 - images: ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }} 47 - tags: | 48 - type=ref,event=branch 49 - type=ref,event=pr 50 - type=sha,prefix=sha- 51 - type=raw,value=latest,enable={{is_default_branch}} 44 + - name: Extract metadata 45 + id: meta 46 + uses: docker/metadata-action@v5 47 + with: 48 + images: ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }} 49 + tags: | 50 + type=ref,event=branch 51 + type=ref,event=pr 52 + type=sha,prefix=sha- 53 + type=raw,value=latest,enable={{is_default_branch}} 52 54 53 - - name: Set up Docker Buildx 54 - uses: docker/setup-buildx-action@v3 55 + - name: Set up Docker Buildx 56 + uses: docker/setup-buildx-action@v3 55 57 56 - - name: Build and push Docker image 57 - uses: docker/build-push-action@v5 58 - with: 59 - context: ./services/cadet 60 - file: ./services/cadet/Dockerfile 61 - push: ${{ github.event_name != 'pull_request' }} 62 - tags: ${{ steps.meta.outputs.tags }} 63 - labels: ${{ steps.meta.outputs.labels }} 64 - platforms: linux/amd64,linux/arm64 65 - cache-from: type=gha 66 - cache-to: type=gha,mode=max 58 + - name: Build and push Docker image 59 + uses: docker/build-push-action@v5 60 + with: 61 + context: ./services/cadet 62 + file: ./services/cadet/Dockerfile 63 + push: ${{ github.event_name != 'pull_request' }} 64 + tags: ${{ steps.meta.outputs.tags }} 65 + labels: ${{ steps.meta.outputs.labels }} 66 + platforms: linux/amd64,linux/arm64 67 + cache-from: type=gha 68 + cache-to: type=gha,mode=max
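A rough local equivalent of this image build, assuming Docker Buildx with QEMU emulation is available (the `cadet:dev` tag is arbitrary):

```bash
# Build the cadet image for both CI platforms; drop --platform for a quick native-only build
docker buildx build \
  --platform linux/amd64,linux/arm64 \
  -f services/cadet/Dockerfile \
  -t cadet:dev \
  services/cadet
```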
+128
.github/workflows/ci.yml
··· 1 + # yaml-language-server: $schema=https://json.schemastore.org/github-workflow.json 2 + 3 + name: CI 4 + 5 + on: 6 + push: 7 + branches: [main, develop] 8 + pull_request: 9 + branches: [main, develop] 10 + 11 + env: 12 + CARGO_TERM_COLOR: always 13 + SQLX_OFFLINE: true 14 + 15 + jobs: 16 + rust-check: 17 + name: Rust Check 18 + runs-on: ubuntu-latest 19 + steps: 20 + - name: Checkout repository 21 + uses: actions/checkout@v4 22 + 23 + - name: Install Rust toolchain 24 + uses: dtolnay/rust-toolchain@stable 25 + with: 26 + components: rustfmt, clippy 27 + 28 + - name: Cache Rust dependencies 29 + uses: Swatinem/rust-cache@v2 30 + with: 31 + workspaces: | 32 + services 33 + apps/aqua 34 + 35 + - name: Check Rust formatting (services) 36 + run: | 37 + cd services 38 + cargo fmt --all -- --check 39 + 40 + - name: Check Rust formatting (apps) 41 + run: | 42 + for dir in apps/*/; do 43 + if [ -f "$dir/Cargo.toml" ]; then 44 + echo "Checking formatting for $dir" 45 + cd "$dir" 46 + cargo fmt --all -- --check 47 + cd ../.. 48 + fi 49 + done 50 + 51 + - name: Run Clippy (services) 52 + run: | 53 + cd services 54 + cargo clippy --all-targets --all-features -- -D warnings 55 + 56 + - name: Run Clippy (apps) 57 + run: | 58 + for dir in apps/*/; do 59 + if [ -f "$dir/Cargo.toml" ]; then 60 + echo "Running clippy for $dir" 61 + cd "$dir" 62 + cargo clippy --all-targets --all-features -- -D warnings 63 + cd ../.. 64 + fi 65 + done 66 + 67 + - name: Run Rust tests (services) 68 + run: | 69 + cd services 70 + cargo test --all-features 71 + 72 + node-check: 73 + name: Node.js Check 74 + runs-on: ubuntu-latest 75 + steps: 76 + - name: Checkout repository 77 + uses: actions/checkout@v4 78 + 79 + - name: Install pnpm 80 + uses: pnpm/action-setup@v4 81 + 82 + - name: Setup Node.js 83 + uses: actions/setup-node@v4 84 + with: 85 + node-version: "20" 86 + cache: "pnpm" 87 + 88 + - name: Install dependencies 89 + run: pnpm install --frozen-lockfile 90 + 91 + - name: Type check 92 + run: pnpm typecheck 93 + 94 + - name: Lint and format check 95 + run: pnpm fix --check 96 + 97 + - name: Build packages 98 + run: pnpm build 99 + 100 + - name: Run tests 101 + run: pnpm test 102 + 103 + lexicon-check: 104 + name: Lexicon Validation 105 + runs-on: ubuntu-latest 106 + steps: 107 + - name: Checkout repository 108 + uses: actions/checkout@v4 109 + 110 + - name: Install pnpm 111 + uses: pnpm/action-setup@v4 112 + 113 + - name: Setup Node.js 114 + uses: actions/setup-node@v4 115 + with: 116 + node-version: "20" 117 + cache: "pnpm" 118 + 119 + - name: Install dependencies 120 + run: pnpm install --frozen-lockfile 121 + 122 + - name: Validate lexicons 123 + run: pnpm lex:validate 124 + 125 + - name: Check lexicon generation 126 + run: | 127 + pnpm lex:gen-server 128 + git diff --exit-code || (echo "Lexicon files are out of sync. Run 'pnpm lex:gen-server' locally." && exit 1)
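Since the Rust jobs build with `SQLX_OFFLINE=true`, the committed `.sqlx` query cache has to stay in sync with the `query!` macros in the code. A local drift check in the spirit of lexicon-check, assuming a recent sqlx-cli (0.7+) and a migrated database reachable via `DATABASE_URL`:

```bash
# Fails if any cached query metadata in .sqlx no longer matches the source
cd services
cargo sqlx prepare --workspace --check
```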
+283
.github/workflows/release.yml
··· 1 + # yaml-language-server: $schema=https://json.schemastore.org/github-workflow.json 2 + 3 + name: Release 4 + 5 + on: 6 + push: 7 + tags: 8 + - "v*" 9 + workflow_dispatch: 10 + inputs: 11 + tag: 12 + description: "Release tag" 13 + required: true 14 + type: string 15 + 16 + env: 17 + REGISTRY: ghcr.io 18 + CARGO_TERM_COLOR: always 19 + SQLX_OFFLINE: true 20 + 21 + jobs: 22 + create-release: 23 + name: Create Release 24 + runs-on: ubuntu-latest 25 + outputs: 26 + release_id: ${{ steps.create_release.outputs.id }} 27 + upload_url: ${{ steps.create_release.outputs.upload_url }} 28 + steps: 29 + - name: Checkout repository 30 + uses: actions/checkout@v4 31 + 32 + - name: Get tag name 33 + id: tag 34 + run: | 35 + if [ "${{ github.event_name }}" = "workflow_dispatch" ]; then 36 + echo "tag=${{ github.event.inputs.tag }}" >> $GITHUB_OUTPUT 37 + else 38 + echo "tag=${GITHUB_REF#refs/tags/}" >> $GITHUB_OUTPUT 39 + fi 40 + 41 + - name: Generate changelog 42 + id: changelog 43 + run: | 44 + if [ -f "CHANGELOG.md" ]; then 45 + # Extract changelog for this version 46 + awk '/^## \[${{ steps.tag.outputs.tag }}\]/{flag=1; next} /^## \[/{flag=0} flag' CHANGELOG.md > release_notes.md 47 + else 48 + echo "Release ${{ steps.tag.outputs.tag }}" > release_notes.md 49 + fi 50 + 51 + - name: Create Release 52 + id: create_release 53 + uses: actions/create-release@v1 54 + env: 55 + GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} 56 + with: 57 + tag_name: ${{ steps.tag.outputs.tag }} 58 + release_name: Release ${{ steps.tag.outputs.tag }} 59 + body_path: release_notes.md 60 + draft: false 61 + prerelease: ${{ contains(steps.tag.outputs.tag, '-') }} 62 + 63 + build-and-release-aqua: 64 + name: Release Aqua 65 + runs-on: ubuntu-latest 66 + needs: create-release 67 + permissions: 68 + contents: read 69 + packages: write 70 + steps: 71 + - name: Checkout repository 72 + uses: actions/checkout@v4 73 + 74 + - name: Log in to Container Registry 75 + uses: docker/login-action@v3 76 + with: 77 + registry: ${{ env.REGISTRY }} 78 + username: ${{ github.actor }} 79 + password: ${{ secrets.GITHUB_TOKEN }} 80 + 81 + - name: Get tag name 82 + id: tag 83 + run: | 84 + if [ "${{ github.event_name }}" = "workflow_dispatch" ]; then 85 + echo "tag=${{ github.event.inputs.tag }}" >> $GITHUB_OUTPUT 86 + else 87 + echo "tag=${GITHUB_REF#refs/tags/}" >> $GITHUB_OUTPUT 88 + fi 89 + 90 + - name: Extract metadata 91 + id: meta 92 + uses: docker/metadata-action@v5 93 + with: 94 + images: ${{ env.REGISTRY }}/${{ github.repository }}/aqua 95 + tags: | 96 + type=raw,value=latest 97 + type=raw,value=${{ steps.tag.outputs.tag }} 98 + 99 + - name: Set up Docker Buildx 100 + uses: docker/setup-buildx-action@v3 101 + 102 + - name: Build and push Docker image 103 + uses: docker/build-push-action@v5 104 + with: 105 + context: ./apps/aqua 106 + file: ./apps/aqua/Dockerfile 107 + push: true 108 + tags: ${{ steps.meta.outputs.tags }} 109 + labels: ${{ steps.meta.outputs.labels }} 110 + platforms: linux/amd64,linux/arm64 111 + cache-from: type=gha 112 + cache-to: type=gha,mode=max 113 + 114 + build-and-release-cadet: 115 + name: Release Cadet 116 + runs-on: ubuntu-latest 117 + needs: create-release 118 + permissions: 119 + contents: read 120 + packages: write 121 + steps: 122 + - name: Checkout repository 123 + uses: actions/checkout@v4 124 + 125 + - name: Log in to Container Registry 126 + uses: docker/login-action@v3 127 + with: 128 + registry: ${{ env.REGISTRY }} 129 + username: ${{ github.actor }} 130 + password: ${{ secrets.GITHUB_TOKEN }} 131 + 132 
+ - name: Get tag name 133 + id: tag 134 + run: | 135 + if [ "${{ github.event_name }}" = "workflow_dispatch" ]; then 136 + echo "tag=${{ github.event.inputs.tag }}" >> $GITHUB_OUTPUT 137 + else 138 + echo "tag=${GITHUB_REF#refs/tags/}" >> $GITHUB_OUTPUT 139 + fi 140 + 141 + - name: Extract metadata 142 + id: meta 143 + uses: docker/metadata-action@v5 144 + with: 145 + images: ${{ env.REGISTRY }}/${{ github.repository }}/cadet 146 + tags: | 147 + type=raw,value=latest 148 + type=raw,value=${{ steps.tag.outputs.tag }} 149 + 150 + - name: Set up Docker Buildx 151 + uses: docker/setup-buildx-action@v3 152 + 153 + - name: Build and push Docker image 154 + uses: docker/build-push-action@v5 155 + with: 156 + context: ./services/cadet 157 + file: ./services/cadet/Dockerfile 158 + push: true 159 + tags: ${{ steps.meta.outputs.tags }} 160 + labels: ${{ steps.meta.outputs.labels }} 161 + platforms: linux/amd64,linux/arm64 162 + cache-from: type=gha 163 + cache-to: type=gha,mode=max 164 + 165 + release-other-services: 166 + name: Release Other Services 167 + runs-on: ubuntu-latest 168 + needs: create-release 169 + permissions: 170 + contents: read 171 + packages: write 172 + strategy: 173 + matrix: 174 + service: [rocketman, satellite] 175 + steps: 176 + - name: Checkout repository 177 + uses: actions/checkout@v4 178 + 179 + - name: Check if service has Dockerfile 180 + id: check 181 + run: | 182 + if [ -f "services/${{ matrix.service }}/Dockerfile" ]; then 183 + echo "has_dockerfile=true" >> $GITHUB_OUTPUT 184 + else 185 + echo "has_dockerfile=false" >> $GITHUB_OUTPUT 186 + fi 187 + 188 + - name: Log in to Container Registry 189 + if: steps.check.outputs.has_dockerfile == 'true' 190 + uses: docker/login-action@v3 191 + with: 192 + registry: ${{ env.REGISTRY }} 193 + username: ${{ github.actor }} 194 + password: ${{ secrets.GITHUB_TOKEN }} 195 + 196 + - name: Get tag name 197 + if: steps.check.outputs.has_dockerfile == 'true' 198 + id: tag 199 + run: | 200 + if [ "${{ github.event_name }}" = "workflow_dispatch" ]; then 201 + echo "tag=${{ github.event.inputs.tag }}" >> $GITHUB_OUTPUT 202 + else 203 + echo "tag=${GITHUB_REF#refs/tags/}" >> $GITHUB_OUTPUT 204 + fi 205 + 206 + - name: Extract metadata 207 + if: steps.check.outputs.has_dockerfile == 'true' 208 + id: meta 209 + uses: docker/metadata-action@v5 210 + with: 211 + images: ${{ env.REGISTRY }}/${{ github.repository }}/${{ matrix.service }} 212 + tags: | 213 + type=raw,value=latest 214 + type=raw,value=${{ steps.tag.outputs.tag }} 215 + 216 + - name: Set up Docker Buildx 217 + if: steps.check.outputs.has_dockerfile == 'true' 218 + uses: docker/setup-buildx-action@v3 219 + 220 + - name: Build and push Docker image 221 + if: steps.check.outputs.has_dockerfile == 'true' 222 + uses: docker/build-push-action@v5 223 + with: 224 + context: ./services/${{ matrix.service }} 225 + file: ./services/${{ matrix.service }}/Dockerfile 226 + push: true 227 + tags: ${{ steps.meta.outputs.tags }} 228 + labels: ${{ steps.meta.outputs.labels }} 229 + platforms: linux/amd64,linux/arm64 230 + cache-from: type=gha,scope=${{ matrix.service }} 231 + cache-to: type=gha,mode=max,scope=${{ matrix.service }} 232 + 233 + build-and-release-amethyst: 234 + name: Release Amethyst 235 + runs-on: ubuntu-latest 236 + needs: create-release 237 + steps: 238 + - name: Checkout repository 239 + uses: actions/checkout@v4 240 + 241 + - name: Install pnpm 242 + uses: pnpm/action-setup@v4 243 + 244 + - name: Setup Node.js 245 + uses: actions/setup-node@v4 246 + with: 247 + node-version: 
"20" 248 + cache: "pnpm" 249 + 250 + - name: Install dependencies 251 + run: pnpm install --frozen-lockfile 252 + 253 + - name: Generate lexicons 254 + run: pnpm lex:gen-server 255 + 256 + - name: Build for all platforms 257 + run: | 258 + cd apps/amethyst 259 + pnpm build 260 + 261 + - name: Create build archive 262 + run: | 263 + cd apps/amethyst 264 + tar -czf amethyst-build.tar.gz build/ 265 + 266 + - name: Get tag name 267 + id: tag 268 + run: | 269 + if [ "${{ github.event_name }}" = "workflow_dispatch" ]; then 270 + echo "tag=${{ github.event.inputs.tag }}" >> $GITHUB_OUTPUT 271 + else 272 + echo "tag=${GITHUB_REF#refs/tags/}" >> $GITHUB_OUTPUT 273 + fi 274 + 275 + - name: Upload Amethyst build to release 276 + uses: actions/upload-release-asset@v1 277 + env: 278 + GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} 279 + with: 280 + upload_url: ${{ needs.create-release.outputs.upload_url }} 281 + asset_path: ./apps/amethyst/amethyst-build.tar.gz 282 + asset_name: amethyst-${{ steps.tag.outputs.tag }}.tar.gz 283 + asset_content_type: application/gzip
+180
.github/workflows/security.yml
··· 1 + # yaml-language-server: $schema=https://json.schemastore.org/github-workflow.json 2 + 3 + name: Security 4 + 5 + on: 6 + push: 7 + branches: [main, develop] 8 + pull_request: 9 + branches: [main, develop] 10 + schedule: 11 + # Run security checks weekly on Sundays at 2 AM UTC 12 + - cron: "0 2 * * 0" 13 + 14 + jobs: 15 + rust-security-audit: 16 + name: Rust Security Audit 17 + runs-on: ubuntu-latest 18 + steps: 19 + - name: Checkout repository 20 + uses: actions/checkout@v4 21 + 22 + - name: Install Rust toolchain 23 + uses: dtolnay/rust-toolchain@stable 24 + 25 + - name: Install cargo-audit 26 + run: cargo install cargo-audit 27 + 28 + - name: Cache Rust dependencies 29 + uses: Swatinem/rust-cache@v2 30 + with: 31 + workspaces: | 32 + services 33 + apps/aqua 34 + 35 + - name: Run cargo audit (services) 36 + run: | 37 + cd services 38 + cargo audit 39 + 40 + - name: Run cargo audit (apps) 41 + run: | 42 + for dir in apps/*/; do 43 + if [ -f "$dir/Cargo.toml" ]; then 44 + echo "Running security audit for $dir" 45 + cd "$dir" 46 + cargo audit 47 + cd ../.. 48 + fi 49 + done 50 + 51 + node-security-audit: 52 + name: Node.js Security Audit 53 + runs-on: ubuntu-latest 54 + steps: 55 + - name: Checkout repository 56 + uses: actions/checkout@v4 57 + 58 + - name: Install pnpm 59 + uses: pnpm/action-setup@v4 60 + 61 + - name: Setup Node.js 62 + uses: actions/setup-node@v4 63 + with: 64 + node-version: "20" 65 + cache: "pnpm" 66 + 67 + - name: Install dependencies 68 + run: pnpm install --frozen-lockfile 69 + 70 + - name: Run npm audit 71 + run: pnpm audit --prod 72 + 73 + - name: Check for known vulnerabilities 74 + run: | 75 + # Run audit and capture output 76 + pnpm audit --json > audit-results.json || true 77 + 78 + # Check if there are any high or critical vulnerabilities 79 + if jq '.vulnerabilities | to_entries[] | select(.value.severity == "high" or .value.severity == "critical")' audit-results.json | grep -q .; then 80 + echo "High or critical vulnerabilities found!" 81 + jq '.vulnerabilities | to_entries[] | select(.value.severity == "high" or .value.severity == "critical")' audit-results.json 82 + exit 1 83 + else 84 + echo "No high or critical vulnerabilities found."
85 + fi 86 + 87 + codeql-analysis: 88 + name: CodeQL Analysis 89 + runs-on: ubuntu-latest 90 + permissions: 91 + actions: read 92 + contents: read 93 + security-events: write 94 + strategy: 95 + fail-fast: false 96 + matrix: 97 + language: ["javascript", "typescript"] 98 + steps: 99 + - name: Checkout repository 100 + uses: actions/checkout@v4 101 + 102 + - name: Initialize CodeQL 103 + uses: github/codeql-action/init@v3 104 + with: 105 + languages: ${{ matrix.language }} 106 + queries: security-extended,security-and-quality 107 + 108 + - name: Install pnpm 109 + uses: pnpm/action-setup@v4 110 + 111 + - name: Setup Node.js 112 + uses: actions/setup-node@v4 113 + with: 114 + node-version: "20" 115 + cache: "pnpm" 116 + 117 + - name: Install dependencies 118 + run: pnpm install --frozen-lockfile 119 + 120 + - name: Build 121 + run: pnpm build 122 + 123 + - name: Perform CodeQL Analysis 124 + uses: github/codeql-action/analyze@v3 125 + with: 126 + category: "/language:${{matrix.language}}" 127 + 128 + docker-security-scan: 129 + name: Docker Security Scan 130 + runs-on: ubuntu-latest 131 + if: github.event_name == 'push' || github.event_name == 'schedule' 132 + strategy: 133 + matrix: 134 + service: [aqua, cadet] 135 + steps: 136 + - name: Checkout repository 137 + uses: actions/checkout@v4 138 + 139 + - name: Set up Docker Buildx 140 + uses: docker/setup-buildx-action@v3 141 + 142 + - name: Build Docker image 143 + uses: docker/build-push-action@v5 144 + with: 145 + context: ${{ matrix.service == 'aqua' && './apps/aqua' || './services/cadet' }} 146 + file: ${{ matrix.service == 'aqua' && './apps/aqua/Dockerfile' || './services/cadet/Dockerfile' }} 147 + load: true 148 + tags: ${{ matrix.service }}:latest 149 + cache-from: type=gha,scope=${{ matrix.service }} 150 + cache-to: type=gha,mode=max,scope=${{ matrix.service }} 151 + 152 + - name: Run Trivy vulnerability scanner 153 + uses: aquasecurity/trivy-action@master 154 + with: 155 + image-ref: "${{ matrix.service }}:latest" 156 + format: "sarif" 157 + output: "trivy-results-${{ matrix.service }}.sarif" 158 + 159 + - name: Upload Trivy scan results to GitHub Security tab 160 + uses: github/codeql-action/upload-sarif@v3 161 + if: always() 162 + with: 163 + sarif_file: "trivy-results-${{ matrix.service }}.sarif" 164 + 165 + secrets-scan: 166 + name: Secrets Scan 167 + runs-on: ubuntu-latest 168 + steps: 169 + - name: Checkout repository 170 + uses: actions/checkout@v4 171 + with: 172 + fetch-depth: 0 173 + 174 + - name: Run TruffleHog OSS 175 + uses: trufflesecurity/trufflehog@main 176 + with: 177 + path: ./ 178 + base: main 179 + head: HEAD 180 + extra_args: --debug --only-verified
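The docker-security-scan job can be approximated locally for a single service, assuming Trivy is installed:

```bash
# Build the image the same way the workflow does, then scan it
docker build -t cadet:latest -f services/cadet/Dockerfile services/cadet
trivy image --severity HIGH,CRITICAL cadet:latest
```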
+124
.github/workflows/services.yml
··· 1 + # yaml-language-server: $schema=https://json.schemastore.org/github-workflow.json 2 + 3 + name: Build and Push Services 4 + 5 + on: 6 + push: 7 + branches: [main] 8 + paths: 9 + - "services/**" 10 + - "Cargo.toml" 11 + - "Cargo.lock" 12 + - ".github/workflows/services.yml" 13 + pull_request: 14 + branches: [main] 15 + paths: 16 + - "services/**" 17 + - "Cargo.toml" 18 + - "Cargo.lock" 19 + - ".github/workflows/services.yml" 20 + 21 + env: 22 + REGISTRY: ghcr.io 23 + CARGO_TERM_COLOR: always 24 + SQLX_OFFLINE: true 25 + 26 + jobs: 27 + detect-services: 28 + name: Detect Services to Build 29 + runs-on: ubuntu-latest 30 + outputs: 31 + services: ${{ steps.detect.outputs.services }} 32 + steps: 33 + - name: Checkout repository 34 + uses: actions/checkout@v4 35 + 36 + - name: Detect services with Dockerfiles 37 + id: detect 38 + run: | 39 + services=() 40 + for service_dir in services/*/; do 41 + service_name=$(basename "$service_dir") 42 + # Skip special directories 43 + if [[ "$service_name" == "target" || "$service_name" == "migrations" || "$service_name" == "types" || "$service_name" == ".sqlx" ]]; then 44 + continue 45 + fi 46 + # Check if service has a Dockerfile 47 + if [[ -f "$service_dir/Dockerfile" ]]; then 48 + services+=("$service_name") 49 + fi 50 + done 51 + 52 + # Convert to JSON array 53 + services_json=$(printf '%s\n' "${services[@]}" | jq -R . | jq -s .) 54 + echo "services=$services_json" >> $GITHUB_OUTPUT 55 + echo "Detected services: $services_json" 56 + 57 + build-service: 58 + name: Build ${{ matrix.service }} 59 + runs-on: ubuntu-latest 60 + needs: detect-services 61 + if: needs.detect-services.outputs.services != '[]' 62 + strategy: 63 + matrix: 64 + service: ${{ fromJson(needs.detect-services.outputs.services) }} 65 + permissions: 66 + contents: read 67 + packages: write 68 + steps: 69 + - name: Checkout repository 70 + uses: actions/checkout@v4 71 + 72 + - name: Log in to Container Registry 73 + if: github.event_name != 'pull_request' 74 + uses: docker/login-action@v3 75 + with: 76 + registry: ${{ env.REGISTRY }} 77 + username: ${{ github.actor }} 78 + password: ${{ secrets.GITHUB_TOKEN }} 79 + 80 + - name: Extract metadata 81 + id: meta 82 + uses: docker/metadata-action@v5 83 + with: 84 + images: ${{ env.REGISTRY }}/${{ github.repository }}/${{ matrix.service }} 85 + tags: | 86 + type=ref,event=branch 87 + type=ref,event=pr 88 + type=sha,prefix=sha- 89 + type=raw,value=latest,enable={{is_default_branch}} 90 + 91 + - name: Set up Docker Buildx 92 + uses: docker/setup-buildx-action@v3 93 + 94 + - name: Build and push Docker image 95 + uses: docker/build-push-action@v5 96 + with: 97 + context: ./services/${{ matrix.service }} 98 + file: ./services/${{ matrix.service }}/Dockerfile 99 + push: ${{ github.event_name != 'pull_request' }} 100 + tags: ${{ steps.meta.outputs.tags }} 101 + labels: ${{ steps.meta.outputs.labels }} 102 + platforms: linux/amd64,linux/arm64 103 + cache-from: type=gha,scope=${{ matrix.service }} 104 + cache-to: type=gha,mode=max,scope=${{ matrix.service }} 105 + 106 + test-services: 107 + name: Test Services 108 + runs-on: ubuntu-latest 109 + steps: 110 + - name: Checkout repository 111 + uses: actions/checkout@v4 112 + 113 + - name: Install Rust toolchain 114 + uses: dtolnay/rust-toolchain@stable 115 + 116 + - name: Cache Rust dependencies 117 + uses: Swatinem/rust-cache@v2 118 + with: 119 + workspaces: services 120 + 121 + - name: Run service tests 122 + run: | 123 + cd services 124 + cargo test --all-features --workspace
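The detection loop is easy to preview locally; this condensed version prints the same JSON matrix the detect-services job emits:

```bash
# Lists every services/* directory with a Dockerfile, minus the skipped special dirs
for d in services/*/; do
  n=$(basename "$d")
  case "$n" in target|migrations|types|.sqlx) continue ;; esac
  [ -f "$d/Dockerfile" ] && echo "$n"
done | jq -R . | jq -s .
```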
+46
.sqlx/query-00b655145e9033d951628a8bc69521815b3af632d0433f87d78c5403dd22eb75.json
··· 1 + { 2 + "db_name": "PostgreSQL", 3 + "query": "\n SELECT DISTINCT\n ae1.name as synthetic_name,\n ae2.name as target_name,\n similarity(LOWER(TRIM(ae1.name)), LOWER(TRIM(ae2.name))) as similarity_score,\n COUNT(ptae1.play_uri) as synthetic_plays,\n COUNT(ptae2.play_uri) as target_plays\n FROM artists_extended ae1\n CROSS JOIN artists_extended ae2\n LEFT JOIN play_to_artists_extended ptae1 ON ae1.id = ptae1.artist_id\n LEFT JOIN play_to_artists_extended ptae2 ON ae2.id = ptae2.artist_id\n WHERE ae1.id != ae2.id\n AND ae1.mbid_type = 'synthetic'\n AND ae2.mbid_type = 'musicbrainz'\n AND similarity(LOWER(TRIM(ae1.name)), LOWER(TRIM(ae2.name))) >= $1\n GROUP BY ae1.id, ae1.name, ae2.id, ae2.name, similarity_score\n ORDER BY similarity_score DESC\n LIMIT 10\n ", 4 + "describe": { 5 + "columns": [ 6 + { 7 + "ordinal": 0, 8 + "name": "synthetic_name", 9 + "type_info": "Text" 10 + }, 11 + { 12 + "ordinal": 1, 13 + "name": "target_name", 14 + "type_info": "Text" 15 + }, 16 + { 17 + "ordinal": 2, 18 + "name": "similarity_score", 19 + "type_info": "Float4" 20 + }, 21 + { 22 + "ordinal": 3, 23 + "name": "synthetic_plays", 24 + "type_info": "Int8" 25 + }, 26 + { 27 + "ordinal": 4, 28 + "name": "target_plays", 29 + "type_info": "Int8" 30 + } 31 + ], 32 + "parameters": { 33 + "Left": [ 34 + "Float4" 35 + ] 36 + }, 37 + "nullable": [ 38 + false, 39 + false, 40 + null, 41 + null, 42 + null 43 + ] 44 + }, 45 + "hash": "00b655145e9033d951628a8bc69521815b3af632d0433f87d78c5403dd22eb75" 46 + }
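The `.sqlx/query-*.json` files in this change are sqlx's offline query cache: each entry records a query's SQL text, its parameter and column type information, and a content hash, which is what lets CI compile the `query!`/`query_as!` macros with `SQLX_OFFLINE=true` and no live database. A sketch of regenerating the cache after editing a query (the connection string is a placeholder; assumes sqlx-cli 0.7+):

```bash
cargo install sqlx-cli --no-default-features --features postgres
cd services
DATABASE_URL=postgres://localhost/teal cargo sqlx prepare --workspace
# the refreshed .sqlx entries should be committed alongside the query change
```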
+12
.sqlx/query-0d7c3ef80c20dac6efd0fe3c430d7f41b1c90368ff99ce8a09f66bca63864d1e.json
··· 1 + { 2 + "db_name": "PostgreSQL", 3 + "query": "REFRESH MATERIALIZED VIEW mv_release_play_counts;", 4 + "describe": { 5 + "columns": [], 6 + "parameters": { 7 + "Left": [] 8 + }, 9 + "nullable": [] 10 + }, 11 + "hash": "0d7c3ef80c20dac6efd0fe3c430d7f41b1c90368ff99ce8a09f66bca63864d1e" 12 + }
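This cached statement uses the blocking form of refresh, which locks the view against readers for the duration. Postgres also supports `REFRESH MATERIALIZED VIEW CONCURRENTLY`, which stays readable during the rebuild but requires a unique index on the view; whether these `mv_*` views have one isn't visible here, so treat this as a hypothetical variant:

```bash
psql "$DATABASE_URL" -c "REFRESH MATERIALIZED VIEW CONCURRENTLY mv_release_play_counts;"
```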
+35
.sqlx/query-0e053ba402c8b769b697f60d189675eceb89f1d14e52174bda67dc65cc68d273.json
··· 1 + { 2 + "db_name": "PostgreSQL", 3 + "query": "\n SELECT\n pta.artist_mbid as mbid,\n pta.artist_name as name,\n COUNT(*) as play_count\n FROM plays p\n INNER JOIN play_to_artists pta ON p.uri = pta.play_uri\n WHERE p.did = $1\n AND pta.artist_mbid IS NOT NULL\n AND pta.artist_name IS NOT NULL\n GROUP BY pta.artist_mbid, pta.artist_name\n ORDER BY play_count DESC\n LIMIT $2\n ", 4 + "describe": { 5 + "columns": [ 6 + { 7 + "ordinal": 0, 8 + "name": "mbid", 9 + "type_info": "Uuid" 10 + }, 11 + { 12 + "ordinal": 1, 13 + "name": "name", 14 + "type_info": "Text" 15 + }, 16 + { 17 + "ordinal": 2, 18 + "name": "play_count", 19 + "type_info": "Int8" 20 + } 21 + ], 22 + "parameters": { 23 + "Left": [ 24 + "Text", 25 + "Int8" 26 + ] 27 + }, 28 + "nullable": [ 29 + false, 30 + true, 31 + null 32 + ] 33 + }, 34 + "hash": "0e053ba402c8b769b697f60d189675eceb89f1d14e52174bda67dc65cc68d273" 35 + }
+14
.sqlx/query-0f62d18dcac06b6da3fc90e2206af0fc21e46e42ce1402750f9cc4dd08b54cec.json
··· 1 + { 2 + "db_name": "PostgreSQL", 3 + "query": "DELETE FROM artists_extended WHERE id = $1", 4 + "describe": { 5 + "columns": [], 6 + "parameters": { 7 + "Left": [ 8 + "Int4" 9 + ] 10 + }, 11 + "nullable": [] 12 + }, 13 + "hash": "0f62d18dcac06b6da3fc90e2206af0fc21e46e42ce1402750f9cc4dd08b54cec" 14 + }
+112
.sqlx/query-0ff59e15ce4faa50bb4b9996ae7877681060ed462a7905012f8097c9545f60b1.json
··· 1 + { 2 + "db_name": "PostgreSQL", 3 + "query": "\n SELECT\n uri, did, rkey, cid, isrc, duration, track_name, played_time, processed_time,\n release_mbid, release_name, recording_mbid, submission_client_agent,\n music_service_base_domain, origin_url,\n COALESCE(\n json_agg(\n json_build_object(\n 'artist_mbid', pta.artist_mbid,\n 'artist_name', pta.artist_name\n )\n ) FILTER (WHERE pta.artist_name IS NOT NULL),\n '[]'\n ) AS artists\n FROM plays\n LEFT JOIN play_to_artists as pta ON uri = pta.play_uri\n WHERE did = ANY($1)\n GROUP BY uri, did, rkey, cid, isrc, duration, track_name, played_time, processed_time,\n release_mbid, release_name, recording_mbid, submission_client_agent,\n music_service_base_domain, origin_url\n ORDER BY processed_time desc\n ", 4 + "describe": { 5 + "columns": [ 6 + { 7 + "ordinal": 0, 8 + "name": "uri", 9 + "type_info": "Text" 10 + }, 11 + { 12 + "ordinal": 1, 13 + "name": "did", 14 + "type_info": "Text" 15 + }, 16 + { 17 + "ordinal": 2, 18 + "name": "rkey", 19 + "type_info": "Text" 20 + }, 21 + { 22 + "ordinal": 3, 23 + "name": "cid", 24 + "type_info": "Text" 25 + }, 26 + { 27 + "ordinal": 4, 28 + "name": "isrc", 29 + "type_info": "Text" 30 + }, 31 + { 32 + "ordinal": 5, 33 + "name": "duration", 34 + "type_info": "Int4" 35 + }, 36 + { 37 + "ordinal": 6, 38 + "name": "track_name", 39 + "type_info": "Text" 40 + }, 41 + { 42 + "ordinal": 7, 43 + "name": "played_time", 44 + "type_info": "Timestamptz" 45 + }, 46 + { 47 + "ordinal": 8, 48 + "name": "processed_time", 49 + "type_info": "Timestamptz" 50 + }, 51 + { 52 + "ordinal": 9, 53 + "name": "release_mbid", 54 + "type_info": "Uuid" 55 + }, 56 + { 57 + "ordinal": 10, 58 + "name": "release_name", 59 + "type_info": "Text" 60 + }, 61 + { 62 + "ordinal": 11, 63 + "name": "recording_mbid", 64 + "type_info": "Uuid" 65 + }, 66 + { 67 + "ordinal": 12, 68 + "name": "submission_client_agent", 69 + "type_info": "Text" 70 + }, 71 + { 72 + "ordinal": 13, 73 + "name": "music_service_base_domain", 74 + "type_info": "Text" 75 + }, 76 + { 77 + "ordinal": 14, 78 + "name": "origin_url", 79 + "type_info": "Text" 80 + }, 81 + { 82 + "ordinal": 15, 83 + "name": "artists", 84 + "type_info": "Json" 85 + } 86 + ], 87 + "parameters": { 88 + "Left": [ 89 + "TextArray" 90 + ] 91 + }, 92 + "nullable": [ 93 + false, 94 + false, 95 + false, 96 + false, 97 + true, 98 + true, 99 + false, 100 + true, 101 + true, 102 + true, 103 + true, 104 + true, 105 + true, 106 + true, 107 + true, 108 + null 109 + ] 110 + }, 111 + "hash": "0ff59e15ce4faa50bb4b9996ae7877681060ed462a7905012f8097c9545f60b1" 112 + }
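The `COALESCE(json_agg(...) FILTER (...), '[]')` wrapper in this query exists because `json_agg` returns SQL `NULL`, not an empty array, when the `FILTER` clause removes every input row. A quick demonstration:

```bash
# Aggregating zero rows yields NULL, hence the COALESCE to '[]' above
psql "$DATABASE_URL" -c "SELECT json_agg(x) FILTER (WHERE false) IS NULL AS is_null FROM (VALUES (1)) AS t(x);"
```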
+22
.sqlx/query-193ac753fc587fa24887d8be61eea86f74de6a1a8d4546304fb023532dfaefe7.json
··· 1 + { 2 + "db_name": "PostgreSQL", 3 + "query": "SELECT extract_discriminant($1)", 4 + "describe": { 5 + "columns": [ 6 + { 7 + "ordinal": 0, 8 + "name": "extract_discriminant", 9 + "type_info": "Text" 10 + } 11 + ], 12 + "parameters": { 13 + "Left": [ 14 + "Text" 15 + ] 16 + }, 17 + "nullable": [ 18 + null 19 + ] 20 + }, 21 + "hash": "193ac753fc587fa24887d8be61eea86f74de6a1a8d4546304fb023532dfaefe7" 22 + }
+14
.sqlx/query-1d35c8cf83ad859a8c50986ef1f587fb7f9aef2067feccd8af89d3b03d88020c.json
··· 1 + { 2 + "db_name": "PostgreSQL", 3 + "query": "DELETE FROM releases WHERE mbid = $1", 4 + "describe": { 5 + "columns": [], 6 + "parameters": { 7 + "Left": [ 8 + "Uuid" 9 + ] 10 + }, 11 + "nullable": [] 12 + }, 13 + "hash": "1d35c8cf83ad859a8c50986ef1f587fb7f9aef2067feccd8af89d3b03d88020c" 14 + }
+14
.sqlx/query-1e4e6b89ac28b1b6cb21c9fbab8f22348943b3f27e9ba9642785d33129f98363.json
··· 1 + { 2 + "db_name": "PostgreSQL", 3 + "query": "DELETE FROM play_to_artists WHERE play_uri = $1", 4 + "describe": { 5 + "columns": [], 6 + "parameters": { 7 + "Left": [ 8 + "Text" 9 + ] 10 + }, 11 + "nullable": [] 12 + }, 13 + "hash": "1e4e6b89ac28b1b6cb21c9fbab8f22348943b3f27e9ba9642785d33129f98363" 14 + }
+22
.sqlx/query-28b1d571a1d045115bcae785b2583f7bf6d02b0b19946b322192dd7f62748d4e.json
··· 1 + { 2 + "db_name": "PostgreSQL", 3 + "query": "SELECT extract_edition_discriminant($1)", 4 + "describe": { 5 + "columns": [ 6 + { 7 + "ordinal": 0, 8 + "name": "extract_edition_discriminant", 9 + "type_info": "Text" 10 + } 11 + ], 12 + "parameters": { 13 + "Left": [ 14 + "Text" 15 + ] 16 + }, 17 + "nullable": [ 18 + null 19 + ] 20 + }, 21 + "hash": "28b1d571a1d045115bcae785b2583f7bf6d02b0b19946b322192dd7f62748d4e" 22 + }
+52
.sqlx/query-2bdfb2ec8d91cffc761dc72be1a4f540e6cc918a9f7941bfdbefbea6f3dee149.json
··· 1 + { 2 + "db_name": "PostgreSQL", 3 + "query": "\n SELECT DISTINCT\n r1.mbid as release1_mbid,\n r1.name as release1_name,\n r2.mbid as release2_mbid,\n r2.name as release2_name,\n similarity(LOWER(TRIM(r1.name)), LOWER(TRIM(r2.name))) as similarity_score,\n COUNT(DISTINCT ptae1.artist_id) as shared_artists\n FROM releases r1\n CROSS JOIN releases r2\n INNER JOIN plays p1 ON p1.release_mbid = r1.mbid\n INNER JOIN plays p2 ON p2.release_mbid = r2.mbid\n INNER JOIN play_to_artists_extended ptae1 ON p1.uri = ptae1.play_uri\n INNER JOIN play_to_artists_extended ptae2 ON p2.uri = ptae2.play_uri\n WHERE r1.mbid != r2.mbid\n AND similarity(LOWER(TRIM(r1.name)), LOWER(TRIM(r2.name))) >= $1\n AND ptae1.artist_id = ptae2.artist_id -- Same artist\n AND (\n (r1.discriminant IS NULL AND r2.discriminant IS NULL) OR\n (LOWER(TRIM(COALESCE(r1.discriminant, ''))) = LOWER(TRIM(COALESCE(r2.discriminant, ''))))\n ) -- Same or no discriminants\n GROUP BY r1.mbid, r1.name, r2.mbid, r2.name, similarity_score\n HAVING COUNT(DISTINCT ptae1.artist_id) > 0 -- At least one shared artist\n ORDER BY similarity_score DESC, shared_artists DESC\n ", 4 + "describe": { 5 + "columns": [ 6 + { 7 + "ordinal": 0, 8 + "name": "release1_mbid", 9 + "type_info": "Uuid" 10 + }, 11 + { 12 + "ordinal": 1, 13 + "name": "release1_name", 14 + "type_info": "Text" 15 + }, 16 + { 17 + "ordinal": 2, 18 + "name": "release2_mbid", 19 + "type_info": "Uuid" 20 + }, 21 + { 22 + "ordinal": 3, 23 + "name": "release2_name", 24 + "type_info": "Text" 25 + }, 26 + { 27 + "ordinal": 4, 28 + "name": "similarity_score", 29 + "type_info": "Float4" 30 + }, 31 + { 32 + "ordinal": 5, 33 + "name": "shared_artists", 34 + "type_info": "Int8" 35 + } 36 + ], 37 + "parameters": { 38 + "Left": [ 39 + "Float4" 40 + ] 41 + }, 42 + "nullable": [ 43 + false, 44 + false, 45 + false, 46 + false, 47 + null, 48 + null 49 + ] 50 + }, 51 + "hash": "2bdfb2ec8d91cffc761dc72be1a4f540e6cc918a9f7941bfdbefbea6f3dee149" 52 + }
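The `similarity()` calls in this and the other dedup queries come from Postgres's `pg_trgm` extension, which scores trigram overlap between two strings on a 0-1 scale (the `$1` threshold is bound at runtime); note the queries normalize with `LOWER(TRIM(...))` before comparing. The extension has to be installed in the target database, and a quick sanity check looks like:

```bash
psql "$DATABASE_URL" -c "CREATE EXTENSION IF NOT EXISTS pg_trgm;"
psql "$DATABASE_URL" -c "SELECT similarity('deluxe edition', 'deluxe editions');"   # near 1.0 for close matches
```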
+14
.sqlx/query-2c2f9db90b7465147a6a696a628e2542d51c42844162455230e702a87719588a.json
··· 1 + { 2 + "db_name": "PostgreSQL", 3 + "query": "DELETE FROM play_to_artists_extended WHERE artist_id = $1", 4 + "describe": { 5 + "columns": [], 6 + "parameters": { 7 + "Left": [ 8 + "Int4" 9 + ] 10 + }, 11 + "nullable": [] 12 + }, 13 + "hash": "2c2f9db90b7465147a6a696a628e2542d51c42844162455230e702a87719588a" 14 + }
+12
.sqlx/query-3d84a9e1ed05846bc931eea9b90fd88cae8b636968af4bd2f9b1a9927d15379d.json
··· 1 + { 2 + "db_name": "PostgreSQL", 3 + "query": "REFRESH MATERIALIZED VIEW mv_global_play_count;", 4 + "describe": { 5 + "columns": [], 6 + "parameters": { 7 + "Left": [] 8 + }, 9 + "nullable": [] 10 + }, 11 + "hash": "3d84a9e1ed05846bc931eea9b90fd88cae8b636968af4bd2f9b1a9927d15379d" 12 + }
+22
.sqlx/query-413d8c111e295ddda68a47f38f6b9df88d4b45b149288caba54c339742a718a0.json
··· 1 + { 2 + "db_name": "PostgreSQL", 3 + "query": "SELECT COUNT(*) FROM plays WHERE recording_mbid = $1", 4 + "describe": { 5 + "columns": [ 6 + { 7 + "ordinal": 0, 8 + "name": "count", 9 + "type_info": "Int8" 10 + } 11 + ], 12 + "parameters": { 13 + "Left": [ 14 + "Uuid" 15 + ] 16 + }, 17 + "nullable": [ 18 + null 19 + ] 20 + }, 21 + "hash": "413d8c111e295ddda68a47f38f6b9df88d4b45b149288caba54c339742a718a0" 22 + }
+14
.sqlx/query-5095c5a6b62d018f95c5c1f58c274b9682f33d918ab02d4d78963fa9ca9c07d1.json
··· 1 + { 2 + "db_name": "PostgreSQL", 3 + "query": "\n DELETE FROM profiles WHERE did = $1\n ", 4 + "describe": { 5 + "columns": [], 6 + "parameters": { 7 + "Left": [ 8 + "Text" 9 + ] 10 + }, 11 + "nullable": [] 12 + }, 13 + "hash": "5095c5a6b62d018f95c5c1f58c274b9682f33d918ab02d4d78963fa9ca9c07d1" 14 + }
+112
.sqlx/query-651c94b4edd5afa55c3679a5f8c1ef1cbe53f7dac01b050ec7ad9100950527c0.json
··· 1 + { 2 + "db_name": "PostgreSQL", 3 + "query": "\n SELECT\n uri, did, rkey, cid, isrc, duration, track_name, played_time, processed_time,\n release_mbid, release_name, recording_mbid, submission_client_agent,\n music_service_base_domain, origin_url,\n COALESCE(\n json_agg(\n json_build_object(\n 'artist_mbid', pta.artist_mbid,\n 'artist_name', pta.artist_name\n )\n ) FILTER (WHERE pta.artist_name IS NOT NULL),\n '[]'\n ) AS artists\n FROM plays\n LEFT JOIN play_to_artists as pta ON uri = pta.play_uri\n WHERE uri = $1\n GROUP BY uri, did, rkey, cid, isrc, duration, track_name, played_time, processed_time,\n release_mbid, release_name, recording_mbid, submission_client_agent,\n music_service_base_domain, origin_url\n ORDER BY processed_time desc\n ", 4 + "describe": { 5 + "columns": [ 6 + { 7 + "ordinal": 0, 8 + "name": "uri", 9 + "type_info": "Text" 10 + }, 11 + { 12 + "ordinal": 1, 13 + "name": "did", 14 + "type_info": "Text" 15 + }, 16 + { 17 + "ordinal": 2, 18 + "name": "rkey", 19 + "type_info": "Text" 20 + }, 21 + { 22 + "ordinal": 3, 23 + "name": "cid", 24 + "type_info": "Text" 25 + }, 26 + { 27 + "ordinal": 4, 28 + "name": "isrc", 29 + "type_info": "Text" 30 + }, 31 + { 32 + "ordinal": 5, 33 + "name": "duration", 34 + "type_info": "Int4" 35 + }, 36 + { 37 + "ordinal": 6, 38 + "name": "track_name", 39 + "type_info": "Text" 40 + }, 41 + { 42 + "ordinal": 7, 43 + "name": "played_time", 44 + "type_info": "Timestamptz" 45 + }, 46 + { 47 + "ordinal": 8, 48 + "name": "processed_time", 49 + "type_info": "Timestamptz" 50 + }, 51 + { 52 + "ordinal": 9, 53 + "name": "release_mbid", 54 + "type_info": "Uuid" 55 + }, 56 + { 57 + "ordinal": 10, 58 + "name": "release_name", 59 + "type_info": "Text" 60 + }, 61 + { 62 + "ordinal": 11, 63 + "name": "recording_mbid", 64 + "type_info": "Uuid" 65 + }, 66 + { 67 + "ordinal": 12, 68 + "name": "submission_client_agent", 69 + "type_info": "Text" 70 + }, 71 + { 72 + "ordinal": 13, 73 + "name": "music_service_base_domain", 74 + "type_info": "Text" 75 + }, 76 + { 77 + "ordinal": 14, 78 + "name": "origin_url", 79 + "type_info": "Text" 80 + }, 81 + { 82 + "ordinal": 15, 83 + "name": "artists", 84 + "type_info": "Json" 85 + } 86 + ], 87 + "parameters": { 88 + "Left": [ 89 + "Text" 90 + ] 91 + }, 92 + "nullable": [ 93 + false, 94 + false, 95 + false, 96 + false, 97 + true, 98 + true, 99 + false, 100 + true, 101 + true, 102 + true, 103 + true, 104 + true, 105 + true, 106 + true, 107 + true, 108 + null 109 + ] 110 + }, 111 + "hash": "651c94b4edd5afa55c3679a5f8c1ef1cbe53f7dac01b050ec7ad9100950527c0" 112 + }
+16
.sqlx/query-6b1a3660fc7e391293278d11020b1f37ddf7446cbc73931c8e30ee38c2f3ae48.json
··· 1 + { 2 + "db_name": "PostgreSQL", 3 + "query": "\n UPDATE play_to_artists_extended\n SET artist_id = $1, artist_name = $2\n WHERE artist_id = $3\n AND NOT EXISTS (\n SELECT 1 FROM play_to_artists_extended existing\n WHERE existing.play_uri = play_to_artists_extended.play_uri\n AND existing.artist_id = $1\n )\n ", 4 + "describe": { 5 + "columns": [], 6 + "parameters": { 7 + "Left": [ 8 + "Int4", 9 + "Text", 10 + "Int4" 11 + ] 12 + }, 13 + "nullable": [] 14 + }, 15 + "hash": "6b1a3660fc7e391293278d11020b1f37ddf7446cbc73931c8e30ee38c2f3ae48" 16 + }
+52
.sqlx/query-6fec79345247c090a72f32d06cb53290156d41f49abba3a9280bc2bedc1c9c56.json
··· 1 + { 2 + "db_name": "PostgreSQL", 3 + "query": "\n SELECT DISTINCT\n ae1.id as synthetic_id,\n ae1.name as synthetic_name,\n ae2.id as target_id,\n ae2.name as target_name,\n ae2.mbid as target_mbid,\n similarity(LOWER(TRIM(ae1.name)), LOWER(TRIM(ae2.name))) as similarity_score\n FROM artists_extended ae1\n CROSS JOIN artists_extended ae2\n WHERE ae1.id != ae2.id\n AND ae1.mbid_type = 'synthetic'\n AND ae2.mbid_type = 'musicbrainz'\n AND similarity(LOWER(TRIM(ae1.name)), LOWER(TRIM(ae2.name))) >= $1\n ORDER BY similarity_score DESC\n ", 4 + "describe": { 5 + "columns": [ 6 + { 7 + "ordinal": 0, 8 + "name": "synthetic_id", 9 + "type_info": "Int4" 10 + }, 11 + { 12 + "ordinal": 1, 13 + "name": "synthetic_name", 14 + "type_info": "Text" 15 + }, 16 + { 17 + "ordinal": 2, 18 + "name": "target_id", 19 + "type_info": "Int4" 20 + }, 21 + { 22 + "ordinal": 3, 23 + "name": "target_name", 24 + "type_info": "Text" 25 + }, 26 + { 27 + "ordinal": 4, 28 + "name": "target_mbid", 29 + "type_info": "Uuid" 30 + }, 31 + { 32 + "ordinal": 5, 33 + "name": "similarity_score", 34 + "type_info": "Float4" 35 + } 36 + ], 37 + "parameters": { 38 + "Left": [ 39 + "Float4" 40 + ] 41 + }, 42 + "nullable": [ 43 + false, 44 + false, 45 + false, 46 + false, 47 + true, 48 + null 49 + ] 50 + }, 51 + "hash": "6fec79345247c090a72f32d06cb53290156d41f49abba3a9280bc2bedc1c9c56" 52 + }
+23
.sqlx/query-76c4d9600293bb80c2a6009b2b823ba85b02f77442ce3a783643e89676fef9a0.json
··· 1 + { 2 + "db_name": "PostgreSQL", 3 + "query": "\n INSERT INTO artists_extended (mbid, name, mbid_type) VALUES ($1, $2, 'musicbrainz')\n ON CONFLICT (mbid) DO UPDATE SET\n name = EXCLUDED.name,\n updated_at = NOW()\n RETURNING id;\n ", 4 + "describe": { 5 + "columns": [ 6 + { 7 + "ordinal": 0, 8 + "name": "id", 9 + "type_info": "Int4" 10 + } 11 + ], 12 + "parameters": { 13 + "Left": [ 14 + "Uuid", 15 + "Text" 16 + ] 17 + }, 18 + "nullable": [ 19 + false 20 + ] 21 + }, 22 + "hash": "76c4d9600293bb80c2a6009b2b823ba85b02f77442ce3a783643e89676fef9a0" 23 + }
+29
.sqlx/query-7cdcd5e8ecada65d351a38c38cfda64ad3d9f04982181dbb32bde93ebd5adc85.json
··· 1 + { 2 + "db_name": "PostgreSQL", 3 + "query": "\n INSERT INTO plays (\n uri, cid, did, rkey, isrc, duration, track_name, played_time,\n processed_time, release_mbid, release_name, recording_mbid,\n submission_client_agent, music_service_base_domain, artist_names_raw,\n track_discriminant, release_discriminant\n ) VALUES (\n $1, $2, $3, $4, $5, $6, $7, $8,\n NOW(), $9, $10, $11, $12, $13, $14, $15, $16\n ) ON CONFLICT(uri) DO UPDATE SET\n isrc = EXCLUDED.isrc,\n duration = EXCLUDED.duration,\n track_name = EXCLUDED.track_name,\n played_time = EXCLUDED.played_time,\n processed_time = EXCLUDED.processed_time,\n release_mbid = EXCLUDED.release_mbid,\n release_name = EXCLUDED.release_name,\n recording_mbid = EXCLUDED.recording_mbid,\n submission_client_agent = EXCLUDED.submission_client_agent,\n music_service_base_domain = EXCLUDED.music_service_base_domain,\n artist_names_raw = EXCLUDED.artist_names_raw,\n track_discriminant = EXCLUDED.track_discriminant,\n release_discriminant = EXCLUDED.release_discriminant;\n ", 4 + "describe": { 5 + "columns": [], 6 + "parameters": { 7 + "Left": [ 8 + "Text", 9 + "Text", 10 + "Text", 11 + "Text", 12 + "Text", 13 + "Int4", 14 + "Text", 15 + "Timestamptz", 16 + "Uuid", 17 + "Text", 18 + "Uuid", 19 + "Text", 20 + "Text", 21 + "Jsonb", 22 + "Text", 23 + "Text" 24 + ] 25 + }, 26 + "nullable": [] 27 + }, 28 + "hash": "7cdcd5e8ecada65d351a38c38cfda64ad3d9f04982181dbb32bde93ebd5adc85" 29 + }
+16
.sqlx/query-7cfece6879feb2653c647d1248913c9cf54bd02a20e9694c7f6d7e92f28f8d10.json
··· 1 + { 2 + "db_name": "PostgreSQL", 3 + "query": "UPDATE plays SET release_mbid = $1, release_name = $2 WHERE release_mbid = $3", 4 + "describe": { 5 + "columns": [], 6 + "parameters": { 7 + "Left": [ 8 + "Uuid", 9 + "Text", 10 + "Uuid" 11 + ] 12 + }, 13 + "nullable": [] 14 + }, 15 + "hash": "7cfece6879feb2653c647d1248913c9cf54bd02a20e9694c7f6d7e92f28f8d10" 16 + }
+18
.sqlx/query-8758f5bb57feedca6cd65831f36aabc811e8b7072dc6bdbfd4a49242e5d7c946.json
··· 1 + { 2 + "db_name": "PostgreSQL", 3 + "query": "\n INSERT INTO statii (uri, did, rkey, cid, record)\n VALUES ($1, $2, $3, $4, $5)\n ON CONFLICT (uri) DO UPDATE SET\n cid = EXCLUDED.cid,\n record = EXCLUDED.record,\n indexed_at = NOW();\n ", 4 + "describe": { 5 + "columns": [], 6 + "parameters": { 7 + "Left": [ 8 + "Text", 9 + "Text", 10 + "Text", 11 + "Text", 12 + "Jsonb" 13 + ] 14 + }, 15 + "nullable": [] 16 + }, 17 + "hash": "8758f5bb57feedca6cd65831f36aabc811e8b7072dc6bdbfd4a49242e5d7c946" 18 + }
+34
.sqlx/query-97e98ede9b32adab5e1ad9808ae827387eba7ad376fba8e41217862a76179f59.json
··· 1 + { 2 + "db_name": "PostgreSQL", 3 + "query": "\n SELECT\n p.release_mbid as mbid,\n p.release_name as name,\n COUNT(*) as play_count\n FROM plays p\n WHERE p.release_mbid IS NOT NULL\n AND p.release_name IS NOT NULL\n GROUP BY p.release_mbid, p.release_name\n ORDER BY play_count DESC\n LIMIT $1\n ", 4 + "describe": { 5 + "columns": [ 6 + { 7 + "ordinal": 0, 8 + "name": "mbid", 9 + "type_info": "Uuid" 10 + }, 11 + { 12 + "ordinal": 1, 13 + "name": "name", 14 + "type_info": "Text" 15 + }, 16 + { 17 + "ordinal": 2, 18 + "name": "play_count", 19 + "type_info": "Int8" 20 + } 21 + ], 22 + "parameters": { 23 + "Left": [ 24 + "Int8" 25 + ] 26 + }, 27 + "nullable": [ 28 + true, 29 + true, 30 + null 31 + ] 32 + }, 33 + "hash": "97e98ede9b32adab5e1ad9808ae827387eba7ad376fba8e41217862a76179f59" 34 + }
+12
.sqlx/query-9af33e4329198dee7814519573b63858eaf69f08ad2959d96ffee5c8387af0ba.json
··· 1 + { 2 + "db_name": "PostgreSQL", 3 + "query": "REFRESH MATERIALIZED VIEW mv_artist_play_counts;", 4 + "describe": { 5 + "columns": [], 6 + "parameters": { 7 + "Left": [] 8 + }, 9 + "nullable": [] 10 + }, 11 + "hash": "9af33e4329198dee7814519573b63858eaf69f08ad2959d96ffee5c8387af0ba" 12 + }
+16
.sqlx/query-9bac472357fa38a6e3bb38d02ebb56a6e11c85d4aff91096f8ea68f1196e8bd3.json
··· 1 + { 2 + "db_name": "PostgreSQL", 3 + "query": "\n INSERT INTO play_to_artists_extended (play_uri, artist_id, artist_name) VALUES\n ($1, $2, $3)\n ON CONFLICT (play_uri, artist_id) DO NOTHING;\n ", 4 + "describe": { 5 + "columns": [], 6 + "parameters": { 7 + "Left": [ 8 + "Text", 9 + "Int4", 10 + "Text" 11 + ] 12 + }, 13 + "nullable": [] 14 + }, 15 + "hash": "9bac472357fa38a6e3bb38d02ebb56a6e11c85d4aff91096f8ea68f1196e8bd3" 16 + }
+24
.sqlx/query-9c08de3ad1dd8e005e6cf15694ad1878203772969a3b280c3db4193631a98f81.json
{
  "db_name": "PostgreSQL",
  "query": "\n INSERT INTO recordings (mbid, name, discriminant) VALUES ($1, $2, $3)\n ON CONFLICT (mbid) DO UPDATE SET\n name = EXCLUDED.name,\n discriminant = COALESCE(EXCLUDED.discriminant, recordings.discriminant)\n RETURNING mbid;\n ",
  "describe": {
    "columns": [
      {
        "ordinal": 0,
        "name": "mbid",
        "type_info": "Uuid"
      }
    ],
    "parameters": {
      "Left": [
        "Uuid",
        "Text",
        "Text"
      ]
    },
    "nullable": [
      false
    ]
  },
  "hash": "9c08de3ad1dd8e005e6cf15694ad1878203772969a3b280c3db4193631a98f81"
}
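The `RETURNING mbid` clause gives this entry one non-nullable `Uuid` column, so the macro hands back the key of the inserted-or-updated row directly. A sketch under the same assumptions as before (binding names are ours; the SQL is reflowed for readability, so a real call site with this formatting would produce a different cache hash):

```rust
use sqlx::PgPool;
use uuid::Uuid;

/// Upsert a recording; COALESCE keeps the existing discriminant when
/// the incoming one is NULL.
async fn upsert_recording(
    pool: &PgPool,
    mbid: Uuid,
    name: &str,
    discriminant: Option<&str>,
) -> Result<Uuid, sqlx::Error> {
    let row = sqlx::query!(
        r#"
        INSERT INTO recordings (mbid, name, discriminant) VALUES ($1, $2, $3)
        ON CONFLICT (mbid) DO UPDATE SET
            name = EXCLUDED.name,
            discriminant = COALESCE(EXCLUDED.discriminant, recordings.discriminant)
        RETURNING mbid;
        "#,
        mbid,
        name,
        discriminant
    )
    .fetch_one(pool)
    .await?;
    // `nullable: [false]` in the cache means row.mbid is Uuid, not Option<Uuid>.
    Ok(row.mbid)
}
```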
+14
.sqlx/query-9d4e872755f90087f64f116d8fee340218e09b40ab8f94b5d9d17b9c39bf3d4f.json
{
  "db_name": "PostgreSQL",
  "query": "DELETE FROM plays WHERE uri = $1",
  "describe": {
    "columns": [],
    "parameters": {
      "Left": [
        "Text"
      ]
    },
    "nullable": []
  },
  "hash": "9d4e872755f90087f64f116d8fee340218e09b40ab8f94b5d9d17b9c39bf3d4f"
}
+22
.sqlx/query-ad02971766fb37f49f4a75a6414807606be0562574826f8fe88827c645c01acd.json
{
  "db_name": "PostgreSQL",
  "query": "SELECT generate_synthetic_mbid($1)",
  "describe": {
    "columns": [
      {
        "ordinal": 0,
        "name": "generate_synthetic_mbid",
        "type_info": "Uuid"
      }
    ],
    "parameters": {
      "Left": [
        "Text"
      ]
    },
    "nullable": [
      null
    ]
  },
  "hash": "ad02971766fb37f49f4a75a6414807606be0562574826f8fe88827c645c01acd"
}
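`generate_synthetic_mbid` is a database-side function (it would be defined in the project's migrations), and Postgres cannot prove that a function's result is non-null, hence `"nullable": [null]`: the macro exposes the column as `Option<Uuid>`. A sketch of how a call site would unwrap it (the wrapper and its expectation message are assumptions):

```rust
use sqlx::PgPool;
use uuid::Uuid;

/// Derive a deterministic, synthetic MBID for entities that have no
/// real MusicBrainz ID. The SQL function itself lives in the database.
async fn synthetic_mbid(pool: &PgPool, seed: &str) -> Result<Uuid, sqlx::Error> {
    let row = sqlx::query!("SELECT generate_synthetic_mbid($1)", seed)
        .fetch_one(pool)
        .await?;
    // Postgres reports function results as possibly-null, so the field
    // is Option<Uuid> even if the function always returns a value.
    Ok(row
        .generate_synthetic_mbid
        .expect("generate_synthetic_mbid never returns NULL"))
}
```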
+35
.sqlx/query-af5c1fdabaee1cbc49f89d1df92e13cbb4a0837e3c644de9c7cf8e33e170d2e3.json
{
  "db_name": "PostgreSQL",
  "query": "\n SELECT\n p.release_mbid as mbid,\n p.release_name as name,\n COUNT(*) as play_count\n FROM plays p\n WHERE p.did = $1\n AND p.release_mbid IS NOT NULL\n AND p.release_name IS NOT NULL\n GROUP BY p.release_mbid, p.release_name\n ORDER BY play_count DESC\n LIMIT $2\n ",
  "describe": {
    "columns": [
      {
        "ordinal": 0,
        "name": "mbid",
        "type_info": "Uuid"
      },
      {
        "ordinal": 1,
        "name": "name",
        "type_info": "Text"
      },
      {
        "ordinal": 2,
        "name": "play_count",
        "type_info": "Int8"
      }
    ],
    "parameters": {
      "Left": [
        "Text",
        "Int8"
      ]
    },
    "nullable": [
      true,
      true,
      null
    ]
  },
  "hash": "af5c1fdabaee1cbc49f89d1df92e13cbb4a0837e3c644de9c7cf8e33e170d2e3"
}
+46
.sqlx/query-b0036bbbb21b71900394c33f4c1db6f8281159b68ca492f6977dc153c60ab453.json
{
  "db_name": "PostgreSQL",
  "query": "\n SELECT DISTINCT\n r1.name as recording1_name,\n r2.name as recording2_name,\n similarity(LOWER(TRIM(r1.name)), LOWER(TRIM(r2.name))) as similarity_score,\n COUNT(DISTINCT ptae1.artist_id) as shared_artists,\n STRING_AGG(DISTINCT ae.name, ', ') as artist_names\n FROM recordings r1\n CROSS JOIN recordings r2\n INNER JOIN plays p1 ON p1.recording_mbid = r1.mbid\n INNER JOIN plays p2 ON p2.recording_mbid = r2.mbid\n INNER JOIN play_to_artists_extended ptae1 ON p1.uri = ptae1.play_uri\n INNER JOIN play_to_artists_extended ptae2 ON p2.uri = ptae2.play_uri\n INNER JOIN artists_extended ae ON ptae1.artist_id = ae.id\n WHERE r1.mbid != r2.mbid\n AND similarity(LOWER(TRIM(r1.name)), LOWER(TRIM(r2.name))) >= $1\n AND ptae1.artist_id = ptae2.artist_id\n GROUP BY r1.mbid, r1.name, r2.mbid, r2.name, similarity_score\n HAVING COUNT(DISTINCT ptae1.artist_id) > 0\n ORDER BY similarity_score DESC\n LIMIT 5\n ",
  "describe": {
    "columns": [
      {
        "ordinal": 0,
        "name": "recording1_name",
        "type_info": "Text"
      },
      {
        "ordinal": 1,
        "name": "recording2_name",
        "type_info": "Text"
      },
      {
        "ordinal": 2,
        "name": "similarity_score",
        "type_info": "Float4"
      },
      {
        "ordinal": 3,
        "name": "shared_artists",
        "type_info": "Int8"
      },
      {
        "ordinal": 4,
        "name": "artist_names",
        "type_info": "Text"
      }
    ],
    "parameters": {
      "Left": [
        "Float4"
      ]
    },
    "nullable": [
      false,
      false,
      null,
      null,
      null
    ]
  },
  "hash": "b0036bbbb21b71900394c33f4c1db6f8281159b68ca492f6977dc153c60ab453"
}
+15
.sqlx/query-b23dc54aec3e2bee85f1e5874df7ad4cbaeb15ca043b244bbce224dfc26d8b56.json
{
  "db_name": "PostgreSQL",
  "query": "UPDATE artists_extended SET name = $1, updated_at = NOW() WHERE id = $2",
  "describe": {
    "columns": [],
    "parameters": {
      "Left": [
        "Text",
        "Int4"
      ]
    },
    "nullable": []
  },
  "hash": "b23dc54aec3e2bee85f1e5874df7ad4cbaeb15ca043b244bbce224dfc26d8b56"
}
+65
.sqlx/query-b4e829c20bb78b9db20eccd9827e0d2f7bdbeedbaa39f6b40d1ae8a1045d6837.json
{
  "db_name": "PostgreSQL",
  "query": "SELECT\n p.avatar,\n p.banner,\n p.created_at,\n p.description,\n p.description_facets,\n p.did,\n p.display_name,\n s.record as status\n FROM profiles p\n LEFT JOIN statii s ON p.did = s.did AND s.rkey = 'self'\n WHERE (p.did = ANY($1))\n OR (p.handle = ANY($2))",
  "describe": {
    "columns": [
      {
        "ordinal": 0,
        "name": "avatar",
        "type_info": "Text"
      },
      {
        "ordinal": 1,
        "name": "banner",
        "type_info": "Text"
      },
      {
        "ordinal": 2,
        "name": "created_at",
        "type_info": "Timestamptz"
      },
      {
        "ordinal": 3,
        "name": "description",
        "type_info": "Text"
      },
      {
        "ordinal": 4,
        "name": "description_facets",
        "type_info": "Jsonb"
      },
      {
        "ordinal": 5,
        "name": "did",
        "type_info": "Text"
      },
      {
        "ordinal": 6,
        "name": "display_name",
        "type_info": "Text"
      },
      {
        "ordinal": 7,
        "name": "status",
        "type_info": "Jsonb"
      }
    ],
    "parameters": {
      "Left": [
        "TextArray",
        "TextArray"
      ]
    },
    "nullable": [
      true,
      true,
      true,
      true,
      true,
      false,
      true,
      true
    ]
  },
  "hash": "b4e829c20bb78b9db20eccd9827e0d2f7bdbeedbaa39f6b40d1ae8a1045d6837"
}
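The two `TextArray` parameters back the `ANY($1)`/`ANY($2)` filters; with sqlx's Postgres driver, a `&[String]` slice binds directly as a text array. A sketch of the lookup with a reduced column list (the function name and trimmed SELECT are ours, so this exact SQL is not the cached entry above):

```rust
use sqlx::PgPool;

/// Fetch profiles by DID and/or handle in one round trip. The LEFT JOIN
/// pulls each profile's 'self' status record along as JSONB.
async fn profiles_by_ids(
    pool: &PgPool,
    dids: &[String],
    handles: &[String],
) -> Result<(), sqlx::Error> {
    let rows = sqlx::query!(
        r#"SELECT p.did, p.display_name, s.record as status
           FROM profiles p
           LEFT JOIN statii s ON p.did = s.did AND s.rkey = 'self'
           WHERE (p.did = ANY($1)) OR (p.handle = ANY($2))"#,
        dids,
        handles
    )
    .fetch_all(pool)
    .await?;
    for row in rows {
        // display_name and status are Option<_> per the cached nullability.
        println!("{}: {:?}", row.did, row.display_name);
    }
    Ok(())
}
```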
+34
.sqlx/query-b8bf07c21c04acf3b4d908b2db93643e497db9a1f01d4d51b99dfdbddd2d4c0e.json
{
  "db_name": "PostgreSQL",
  "query": "\n SELECT\n pta.artist_mbid as mbid,\n pta.artist_name as name,\n COUNT(*) as play_count\n FROM plays p\n INNER JOIN play_to_artists pta ON p.uri = pta.play_uri\n WHERE pta.artist_mbid IS NOT NULL\n AND pta.artist_name IS NOT NULL\n GROUP BY pta.artist_mbid, pta.artist_name\n ORDER BY play_count DESC\n LIMIT $1\n ",
  "describe": {
    "columns": [
      {
        "ordinal": 0,
        "name": "mbid",
        "type_info": "Uuid"
      },
      {
        "ordinal": 1,
        "name": "name",
        "type_info": "Text"
      },
      {
        "ordinal": 2,
        "name": "play_count",
        "type_info": "Int8"
      }
    ],
    "parameters": {
      "Left": [
        "Int8"
      ]
    },
    "nullable": [
      false,
      true,
      null
    ]
  },
  "hash": "b8bf07c21c04acf3b4d908b2db93643e497db9a1f01d4d51b99dfdbddd2d4c0e"
}
+21
.sqlx/query-b9ca1a73cba5a29665e5f996fd33410054936bbd74cfd611767bf6a7893ebded.json
{
  "db_name": "PostgreSQL",
  "query": "\n INSERT INTO profiles (did, handle, display_name, description, description_facets, avatar, banner, created_at)\n VALUES ($1, $2, $3, $4, $5, $6, $7, $8)\n ON CONFLICT (did) DO UPDATE SET\n display_name = EXCLUDED.display_name,\n description = EXCLUDED.description,\n description_facets = EXCLUDED.description_facets,\n avatar = EXCLUDED.avatar,\n banner = EXCLUDED.banner,\n created_at = EXCLUDED.created_at;\n ",
  "describe": {
    "columns": [],
    "parameters": {
      "Left": [
        "Text",
        "Text",
        "Text",
        "Text",
        "Jsonb",
        "Text",
        "Text",
        "Timestamptz"
      ]
    },
    "nullable": []
  },
  "hash": "b9ca1a73cba5a29665e5f996fd33410054936bbd74cfd611767bf6a7893ebded"
}
+22
.sqlx/query-bbedc0ebf2ae8ecd086c089546f700e4c027150db583ae78ebba24da334c7224.json
{
  "db_name": "PostgreSQL",
  "query": "SELECT COUNT(*) FROM plays WHERE release_mbid = $1",
  "describe": {
    "columns": [
      {
        "ordinal": 0,
        "name": "count",
        "type_info": "Int8"
      }
    ],
    "parameters": {
      "Left": [
        "Uuid"
      ]
    },
    "nullable": [
      null
    ]
  },
  "hash": "bbedc0ebf2ae8ecd086c089546f700e4c027150db583ae78ebba24da334c7224"
}
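`COUNT(*)` is another case where the driver reports `nullable: null`, so `query!` yields `Option<i64>` even though this aggregate can never actually be NULL; a call site either unwraps with a default or renames the column with sqlx's `!` override. A sketch (the wrapper is an assumption):

```rust
use sqlx::PgPool;
use uuid::Uuid;

/// Count plays attached to a release. `count` comes back as Option<i64>
/// because sqlx cannot prove the aggregate is non-null.
async fn plays_for_release(pool: &PgPool, mbid: Uuid) -> Result<i64, sqlx::Error> {
    let row = sqlx::query!("SELECT COUNT(*) FROM plays WHERE release_mbid = $1", mbid)
        .fetch_one(pool)
        .await?;
    // Alternatively, SELECT COUNT(*) AS "count!" forces a plain i64,
    // at the cost of a new cache entry (different SQL, different hash).
    Ok(row.count.unwrap_or(0))
}
```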
+12
.sqlx/query-bf9c6d3bf0f9594ae1c02dc85c9887b747aaa5f0c3e67d9381c3867c4f67ae6d.json
{
  "db_name": "PostgreSQL",
  "query": "REFRESH MATERIALIZED VIEW mv_recording_play_counts;",
  "describe": {
    "columns": [],
    "parameters": {
      "Left": []
    },
    "nullable": []
  },
  "hash": "bf9c6d3bf0f9594ae1c02dc85c9887b747aaa5f0c3e67d9381c3867c4f67ae6d"
}
+46
.sqlx/query-cbc1d1c3cfe95d3d223ab4bb125e301436c9d6bbf09376215aa43e7abc98d87c.json
{
  "db_name": "PostgreSQL",
  "query": "\n SELECT DISTINCT\n r1.name as release1_name,\n r2.name as release2_name,\n similarity(LOWER(TRIM(r1.name)), LOWER(TRIM(r2.name))) as similarity_score,\n COUNT(DISTINCT ptae1.artist_id) as shared_artists,\n STRING_AGG(DISTINCT ae.name, ', ') as artist_names\n FROM releases r1\n CROSS JOIN releases r2\n INNER JOIN plays p1 ON p1.release_mbid = r1.mbid\n INNER JOIN plays p2 ON p2.release_mbid = r2.mbid\n INNER JOIN play_to_artists_extended ptae1 ON p1.uri = ptae1.play_uri\n INNER JOIN play_to_artists_extended ptae2 ON p2.uri = ptae2.play_uri\n INNER JOIN artists_extended ae ON ptae1.artist_id = ae.id\n WHERE r1.mbid != r2.mbid\n AND similarity(LOWER(TRIM(r1.name)), LOWER(TRIM(r2.name))) >= $1\n AND ptae1.artist_id = ptae2.artist_id\n GROUP BY r1.mbid, r1.name, r2.mbid, r2.name, similarity_score\n HAVING COUNT(DISTINCT ptae1.artist_id) > 0\n ORDER BY similarity_score DESC\n LIMIT 5\n ",
  "describe": {
    "columns": [
      {
        "ordinal": 0,
        "name": "release1_name",
        "type_info": "Text"
      },
      {
        "ordinal": 1,
        "name": "release2_name",
        "type_info": "Text"
      },
      {
        "ordinal": 2,
        "name": "similarity_score",
        "type_info": "Float4"
      },
      {
        "ordinal": 3,
        "name": "shared_artists",
        "type_info": "Int8"
      },
      {
        "ordinal": 4,
        "name": "artist_names",
        "type_info": "Text"
      }
    ],
    "parameters": {
      "Left": [
        "Float4"
      ]
    },
    "nullable": [
      false,
      false,
      null,
      null,
      null
    ]
  },
  "hash": "cbc1d1c3cfe95d3d223ab4bb125e301436c9d6bbf09376215aa43e7abc98d87c"
}
+15
.sqlx/query-cdd7488f49e0b81ab138afaf173030ef4c37d195aee42cc6e5e2c6638cb6f3b2.json
{
  "db_name": "PostgreSQL",
  "query": "UPDATE plays SET recording_mbid = $1 WHERE recording_mbid = $2",
  "describe": {
    "columns": [],
    "parameters": {
      "Left": [
        "Uuid",
        "Uuid"
      ]
    },
    "nullable": []
  },
  "hash": "cdd7488f49e0b81ab138afaf173030ef4c37d195aee42cc6e5e2c6638cb6f3b2"
}
+14
.sqlx/query-d5414741e228591937d2d3e743d0ed343ee2434cc86a8b726806959f024b7b45.json
{
  "db_name": "PostgreSQL",
  "query": "DELETE FROM recordings WHERE mbid = $1",
  "describe": {
    "columns": [],
    "parameters": {
      "Left": [
        "Uuid"
      ]
    },
    "nullable": []
  },
  "hash": "d5414741e228591937d2d3e743d0ed343ee2434cc86a8b726806959f024b7b45"
}
+14
.sqlx/query-d80a24e6b32f04c26d28823db4601960a926801000b5f37583c98ae168c7e961.json
{
  "db_name": "PostgreSQL",
  "query": "\n DELETE FROM statii WHERE uri = $1\n ",
  "describe": {
    "columns": [],
    "parameters": {
      "Left": [
        "Text"
      ]
    },
    "nullable": []
  },
  "hash": "d80a24e6b32f04c26d28823db4601960a926801000b5f37583c98ae168c7e961"
}
+112
.sqlx/query-f224b252a34a67a71266caca5affc5022e74dc42496aef9e61cec0e86d80f9d0.json
{
  "db_name": "PostgreSQL",
  "query": "\n SELECT\n uri, did, rkey, cid, isrc, duration, track_name, played_time, processed_time,\n release_mbid, release_name, recording_mbid, submission_client_agent,\n music_service_base_domain, origin_url,\n COALESCE(\n json_agg(\n json_build_object(\n 'artist_mbid', pta.artist_mbid,\n 'artist_name', pta.artist_name\n )\n ) FILTER (WHERE pta.artist_name IS NOT NULL),\n '[]'\n ) AS artists\n FROM plays p\n LEFT JOIN play_to_artists as pta ON p.uri = pta.play_uri\n GROUP BY uri, did, rkey, cid, isrc, duration, track_name, played_time, processed_time,\n release_mbid, release_name, recording_mbid, submission_client_agent,\n music_service_base_domain, origin_url\n ORDER BY processed_time DESC\n LIMIT $1\n ",
  "describe": {
    "columns": [
      {
        "ordinal": 0,
        "name": "uri",
        "type_info": "Text"
      },
      {
        "ordinal": 1,
        "name": "did",
        "type_info": "Text"
      },
      {
        "ordinal": 2,
        "name": "rkey",
        "type_info": "Text"
      },
      {
        "ordinal": 3,
        "name": "cid",
        "type_info": "Text"
      },
      {
        "ordinal": 4,
        "name": "isrc",
        "type_info": "Text"
      },
      {
        "ordinal": 5,
        "name": "duration",
        "type_info": "Int4"
      },
      {
        "ordinal": 6,
        "name": "track_name",
        "type_info": "Text"
      },
      {
        "ordinal": 7,
        "name": "played_time",
        "type_info": "Timestamptz"
      },
      {
        "ordinal": 8,
        "name": "processed_time",
        "type_info": "Timestamptz"
      },
      {
        "ordinal": 9,
        "name": "release_mbid",
        "type_info": "Uuid"
      },
      {
        "ordinal": 10,
        "name": "release_name",
        "type_info": "Text"
      },
      {
        "ordinal": 11,
        "name": "recording_mbid",
        "type_info": "Uuid"
      },
      {
        "ordinal": 12,
        "name": "submission_client_agent",
        "type_info": "Text"
      },
      {
        "ordinal": 13,
        "name": "music_service_base_domain",
        "type_info": "Text"
      },
      {
        "ordinal": 14,
        "name": "origin_url",
        "type_info": "Text"
      },
      {
        "ordinal": 15,
        "name": "artists",
        "type_info": "Json"
      }
    ],
    "parameters": {
      "Left": [
        "Int8"
      ]
    },
    "nullable": [
      false,
      false,
      false,
      false,
      true,
      true,
      false,
      true,
      true,
      true,
      true,
      true,
      true,
      true,
      true,
      null
    ]
  },
  "hash": "f224b252a34a67a71266caca5affc5022e74dc42496aef9e61cec0e86d80f9d0"
}
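`json_agg(...)` produces a plain `Json` column (ordinal 15), which sqlx maps to `serde_json::Value`; the `FILTER`/`COALESCE` pair guarantees `'[]'` rather than `[NULL]` for plays with no linked artists, though the driver still reports the column as nullable. A sketch of decoding the aggregated artists into a typed list (the `PlayArtist` struct name is an assumption; the keys match the SQL):

```rust
use serde::Deserialize;

/// Shape of each element produced by json_build_object(...) above.
/// Requires the uuid crate's `serde` feature for the mbid field.
#[derive(Debug, Default, Deserialize)]
struct PlayArtist {
    artist_mbid: Option<uuid::Uuid>,
    artist_name: String,
}

/// Decode the `artists` JSON column from one fetched row; treat a
/// missing or malformed value as an empty artist list.
fn decode_artists(artists: Option<serde_json::Value>) -> Vec<PlayArtist> {
    artists
        .map(|v| serde_json::from_value(v).unwrap_or_default())
        .unwrap_or_default()
}
```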
+23
.sqlx/query-f604394b9517a78f2dd81723bed6435b9c3a03922a50d86daa21bfb6d09ac734.json
{
  "db_name": "PostgreSQL",
  "query": "\n INSERT INTO artists_extended (mbid, name, mbid_type) VALUES ($1, $2, 'synthetic')\n ON CONFLICT (mbid) DO UPDATE SET\n name = EXCLUDED.name,\n updated_at = NOW()\n RETURNING id;\n ",
  "describe": {
    "columns": [
      {
        "ordinal": 0,
        "name": "id",
        "type_info": "Int4"
      }
    ],
    "parameters": {
      "Left": [
        "Uuid",
        "Text"
      ]
    },
    "nullable": [
      false
    ]
  },
  "hash": "f604394b9517a78f2dd81723bed6435b9c3a03922a50d86daa21bfb6d09ac734"
}
+24
.sqlx/query-f8caa11009d6220e139157dff83a0d3ffb37fcd8590527a5d7d3fc6e2e8f3672.json
{
  "db_name": "PostgreSQL",
  "query": "\n INSERT INTO releases (mbid, name, discriminant) VALUES ($1, $2, $3)\n ON CONFLICT (mbid) DO UPDATE SET\n name = EXCLUDED.name,\n discriminant = COALESCE(EXCLUDED.discriminant, releases.discriminant)\n RETURNING mbid;\n ",
  "describe": {
    "columns": [
      {
        "ordinal": 0,
        "name": "mbid",
        "type_info": "Uuid"
      }
    ],
    "parameters": {
      "Left": [
        "Uuid",
        "Text",
        "Text"
      ]
    },
    "nullable": [
      false
    ]
  },
  "hash": "f8caa11009d6220e139157dff83a0d3ffb37fcd8590527a5d7d3fc6e2e8f3672"
}
+28
.sqlx/query-fd5f376dac5f38005efa3217c9614e377703c681e1510fc0c6539b1edee289b7.json
{
  "db_name": "PostgreSQL",
  "query": "\n SELECT\n ae.id,\n ae.name\n FROM artists_extended ae\n WHERE ae.mbid_type = 'musicbrainz'\n AND (\n LOWER(TRIM(ae.name)) = $1\n OR LOWER(TRIM(ae.name)) LIKE '%' || $1 || '%'\n OR $1 LIKE '%' || LOWER(TRIM(ae.name)) || '%'\n OR similarity(LOWER(TRIM(ae.name)), $1) > 0.6\n )\n ORDER BY similarity(LOWER(TRIM(ae.name)), $1) DESC\n LIMIT 10\n ",
  "describe": {
    "columns": [
      {
        "ordinal": 0,
        "name": "id",
        "type_info": "Int4"
      },
      {
        "ordinal": 1,
        "name": "name",
        "type_info": "Text"
      }
    ],
    "parameters": {
      "Left": [
        "Text"
      ]
    },
    "nullable": [
      false,
      false
    ]
  },
  "hash": "fd5f376dac5f38005efa3217c9614e377703c681e1510fc0c6539b1edee289b7"
}
+52
.sqlx/query-ffa27ada5f1ef0d5c699277b88ad33aa6576f6d14a12ad61974e77d52b42eea0.json
{
  "db_name": "PostgreSQL",
  "query": "\n SELECT DISTINCT\n r1.mbid as recording1_mbid,\n r1.name as recording1_name,\n r2.mbid as recording2_mbid,\n r2.name as recording2_name,\n similarity(LOWER(TRIM(r1.name)), LOWER(TRIM(r2.name))) as similarity_score,\n COUNT(DISTINCT ptae1.artist_id) as shared_artists\n FROM recordings r1\n CROSS JOIN recordings r2\n INNER JOIN plays p1 ON p1.recording_mbid = r1.mbid\n INNER JOIN plays p2 ON p2.recording_mbid = r2.mbid\n INNER JOIN play_to_artists_extended ptae1 ON p1.uri = ptae1.play_uri\n INNER JOIN play_to_artists_extended ptae2 ON p2.uri = ptae2.play_uri\n WHERE r1.mbid != r2.mbid\n AND similarity(LOWER(TRIM(r1.name)), LOWER(TRIM(r2.name))) >= $1\n AND ptae1.artist_id = ptae2.artist_id -- Same artist\n AND (\n (r1.discriminant IS NULL AND r2.discriminant IS NULL) OR\n (LOWER(TRIM(COALESCE(r1.discriminant, ''))) = LOWER(TRIM(COALESCE(r2.discriminant, ''))))\n ) -- Same or no discriminants\n GROUP BY r1.mbid, r1.name, r2.mbid, r2.name, similarity_score\n HAVING COUNT(DISTINCT ptae1.artist_id) > 0 -- At least one shared artist\n ORDER BY similarity_score DESC, shared_artists DESC\n ",
  "describe": {
    "columns": [
      {
        "ordinal": 0,
        "name": "recording1_mbid",
        "type_info": "Uuid"
      },
      {
        "ordinal": 1,
        "name": "recording1_name",
        "type_info": "Text"
      },
      {
        "ordinal": 2,
        "name": "recording2_mbid",
        "type_info": "Uuid"
      },
      {
        "ordinal": 3,
        "name": "recording2_name",
        "type_info": "Text"
      },
      {
        "ordinal": 4,
        "name": "similarity_score",
        "type_info": "Float4"
      },
      {
        "ordinal": 5,
        "name": "shared_artists",
        "type_info": "Int8"
      }
    ],
    "parameters": {
      "Left": [
        "Float4"
      ]
    },
    "nullable": [
      false,
      false,
      false,
      false,
      null,
      null
    ]
  },
  "hash": "ffa27ada5f1ef0d5c699277b88ad33aa6576f6d14a12ad61974e77d52b42eea0"
}
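All of the `similarity(...)` queries in this batch depend on the Postgres `pg_trgm` extension; `cargo sqlx prepare` and compile-time checking will fail against a database where it isn't installed. A sketch of guarding for that in a setup step, using the unchecked `sqlx::query` since DDL has no cache entry (in this repo the extension would more likely be created in a SQL migration; the Rust wrapper is purely illustrative):

```rust
use sqlx::PgPool;

/// Ensure the trigram extension needed by similarity() is present.
async fn ensure_pg_trgm(pool: &PgPool) -> Result<(), sqlx::Error> {
    sqlx::query("CREATE EXTENSION IF NOT EXISTS pg_trgm;")
        .execute(pool)
        .await?;
    Ok(())
}
```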
-6
services/.sqlx/.sqlxrc
[database]
url = "postgres://localhost/teal"
migrations = "./migrations"

[compile_time_verification]
offline = false