prototypey.org - atproto lexicon typescript toolkit - mirror https://github.com/tylersayshi/prototypey

add support for creating prototypey lexicons with fromJSON, plus CLI support (#26)

* initial implementation with tests for fromJson

* cli to generate from json and updated docs

* include changeset

* fix lint and format

authored by tyler and committed by GitHub 91a8c845 922d0930

+2223 -16
+5
.changeset/eleven-steaks-search.md
···
··· 1 + --- 2 + "prototypey": minor 3 + --- 4 + 5 + generate prototypey lexicon utils from json definitions
+53 -11
packages/prototypey/README.md
··· 138 prototypey gen-emit ./lexicons ./src/lexicons/**/*.ts 139 ``` 140 141 - ### Typical Workflow 142 143 1. Author lexicons in TypeScript using the library 144 2. Emit JSON schemas with `gen-emit` for runtime validation ··· 159 npm run lexicon:emit 160 ``` 161 162 ## State of the Project 163 164 **Done:** ··· 169 - Inference of valid type from full lexicon definition 170 - the really cool part of this is that it fills in the refs from the defs all at the type level 171 - `lx.lexicon(...).validate(data)` for validating data using `@atproto/lexicon` and your lexicon definitions 172 - 173 - **TODO/In Progress:** 174 - 175 - - Library art! Please reach out if you'd be willing to contribute some drawings or anything! 176 - - Add CLI support for inferring and validating from json as the starting point 177 178 - Please give any and all feedback. I've not written many lexicons myself yet, so this project is at a point of "well I think this makes sense" 😂. Both the [issues page](https://github.com/tylersayshi/prototypey/issues) and [discussions](https://github.com/tylersayshi/prototypey/discussions) are open and ready for y'all 🙂. 179 180 - --- 181 182 - > 💝 This package was templated with 183 - > [`create-typescript-app`](https://github.com/JoshuaKGoldberg/create-typescript-app) 184 - > using the [Bingo framework](https://create.bingo).
··· 138 prototypey gen-emit ./lexicons ./src/lexicons/**/*.ts 139 ``` 140 141 + #### `gen-from-json` - Generate TypeScript from JSON schemas 142 + 143 + ```bash 144 + prototypey gen-from-json <outdir> <sources...> 145 + ``` 146 + 147 + Generates TypeScript files from JSON lexicon schemas using the `fromJSON` helper. This is useful when you have existing lexicon JSON files and want to work with them in TypeScript with full type inference. 148 + 149 + **Example:** 150 + 151 + ```bash 152 + prototypey gen-from-json ./src/lexicons ./lexicons/**/*.json 153 + ``` 154 + 155 + This will create TypeScript files that export typed lexicon objects: 156 + 157 + ```ts 158 + // Generated file: src/lexicons/app.bsky.feed.post.ts 159 + import { fromJSON } from "prototypey"; 160 + 161 + export const appBskyFeedPost = fromJSON({ 162 + // ... lexicon JSON 163 + }); 164 + ``` 165 + 166 + ### Typical Workflows 167 + 168 + #### TypeScript-first workflow 169 170 1. Author lexicons in TypeScript using the library 171 2. Emit JSON schemas with `gen-emit` for runtime validation ··· 186 npm run lexicon:emit 187 ``` 188 189 + #### JSON-first workflow 190 + 191 + 1. Start with JSON lexicon schemas (e.g., from atproto) 192 + 2. 
Generate TypeScript with `gen-from-json` for type-safe access 193 + 194 + **Recommended:** Add as a script to your `package.json`: 195 + 196 + ```json 197 + { 198 + "scripts": { 199 + "lexicon:import": "prototypey gen-from-json ./src/lexicons ./lexicons/**/*.json" 200 + } 201 + } 202 + ``` 203 + 204 + Then run: 205 + 206 + ```bash 207 + npm run lexicon:import 208 + ``` 209 + 210 ## State of the Project 211 212 **Done:** ··· 217 - Inference of valid type from full lexicon definition 218 - the really cool part of this is that it fills in the refs from the defs all at the type level 219 - `lx.lexicon(...).validate(data)` for validating data using `@atproto/lexicon` and your lexicon definitions 220 + - `fromJSON()` helper for creating lexicons directly from JSON objects with full type inference 221 222 + Please give any and all feedback. I've not written many lexicons myself yet, so this project is at a point of "well I think this makes sense". Both the [issues page](https://github.com/tylersayshi/prototypey/issues) and [discussions](https://github.com/tylersayshi/prototypey/discussions) are open and ready for y'all 🙂. 223 224 + **Call for Contributions:** 225 226 + - We need library art! Please reach out if you'd be willing to contribute some drawings or anything :)
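The "full type inference" the README describes relies on TypeScript's `const` type parameters (TS 5.0+), which preserve literal types from a plain object argument instead of widening them. A minimal self-contained sketch of that mechanism; `takeLexicon` here is a hypothetical stand-in for illustration, not the real prototypey API:

```typescript
// Sketch of the `const` type-parameter mechanism that lets a helper like
// `fromJSON` keep exact literal types from a JSON-shaped object.
// `takeLexicon` is a hypothetical stand-in, not part of prototypey.
function takeLexicon<const T extends { id: string; defs: object }>(json: T) {
  // Because T is a `const` type parameter, json.id is typed as the exact
  // string literal and json.defs keeps its full structural shape.
  return { id: json.id, defs: json.defs };
}

const result = takeLexicon({
  id: "app.bsky.feed.post",
  defs: { main: { type: "record", key: "tid" } },
});

// result.id is typed as the literal "app.bsky.feed.post", not just string.
console.log(result.id);
```

Without `const` on the type parameter, `result.id` would widen to `string` and the ref-filling described under "State of the Project" could not resolve IDs at the type level.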
+111
packages/prototypey/cli/gen-from-json.ts
···
··· 1 + import { glob } from "tinyglobby"; 2 + import { mkdir, writeFile, readFile } from "node:fs/promises"; 3 + import { join, dirname, basename } from "node:path"; 4 + 5 + interface LexiconJSON { 6 + lexicon: number; 7 + id: string; 8 + defs: Record<string, unknown>; 9 + } 10 + 11 + /** 12 + * Converts a lexicon ID to a valid TypeScript export name 13 + * e.g., "app.bsky.feed.post" -> "appBskyFeedPost" 14 + * "com.atproto.repo.createRecord" -> "comAtprotoRepoCreateRecord" 15 + */ 16 + function lexiconIdToExportName(id: string): string { 17 + // Split by dots and handle camelCase conversion 18 + const parts = id.split("."); 19 + 20 + // For the first part (e.g., "app", "com"), keep it lowercase 21 + // For subsequent parts, capitalize the first letter of each word 22 + // But preserve any existing camelCase within parts 23 + return parts 24 + .map((part, index) => { 25 + if (index === 0) return part; 26 + // Capitalize first letter of the part 27 + return part.charAt(0).toUpperCase() + part.slice(1); 28 + }) 29 + .join(""); 30 + } 31 + 32 + export async function genFromJSON( 33 + outdir: string, 34 + sources: string | string[], 35 + ): Promise<void> { 36 + try { 37 + const sourcePatterns = Array.isArray(sources) ? 
sources : [sources]; 38 + 39 + // Find all JSON files matching the patterns 40 + const jsonFiles = await glob(sourcePatterns, { 41 + absolute: true, 42 + onlyFiles: true, 43 + }); 44 + 45 + if (jsonFiles.length === 0) { 46 + console.log("No JSON files found matching patterns:", sourcePatterns); 47 + return; 48 + } 49 + 50 + console.log(`Found ${String(jsonFiles.length)} JSON file(s)`); 51 + 52 + // Ensure output directory exists 53 + await mkdir(outdir, { recursive: true }); 54 + 55 + // Process each JSON file 56 + for (const jsonPath of jsonFiles) { 57 + await processJSONFile(jsonPath, outdir); 58 + } 59 + 60 + console.log(`\nGenerated TypeScript files in ${outdir}`); 61 + } catch (error) { 62 + console.error("Error generating TypeScript from JSON:", error); 63 + process.exit(1); 64 + } 65 + } 66 + 67 + async function processJSONFile( 68 + jsonPath: string, 69 + outdir: string, 70 + ): Promise<void> { 71 + try { 72 + // Read and parse the JSON file 73 + const content = await readFile(jsonPath, "utf-8"); 74 + const lexiconJSON = JSON.parse(content); 75 + 76 + // Validate it's a lexicon 77 + if ( 78 + !lexiconJSON.lexicon || 79 + !lexiconJSON.id || 80 + !lexiconJSON.defs || 81 + typeof lexiconJSON.defs !== "object" 82 + ) { 83 + console.warn(` ⚠ ${jsonPath}: Not a valid lexicon JSON`); 84 + return; 85 + } 86 + 87 + const { id } = lexiconJSON as LexiconJSON; 88 + const exportName = lexiconIdToExportName(id); 89 + 90 + // Generate TypeScript content 91 + const tsContent = `import { fromJSON } from "prototypey"; 92 + 93 + export const ${exportName} = fromJSON(${JSON.stringify(lexiconJSON, null, "\t")}); 94 + `; 95 + 96 + // Determine output path - use same structure but in outdir 97 + const outputFileName = `${basename(jsonPath, ".json")}.ts`; 98 + const outputPath = join(outdir, outputFileName); 99 + 100 + // Ensure output directory exists 101 + await mkdir(dirname(outputPath), { recursive: true }); 102 + 103 + // Write the TypeScript file 104 + await 
writeFile(outputPath, tsContent, "utf-8"); 105 + 106 + console.log(` ✓ ${id} -> ${outputFileName}`); 107 + } catch (error) { 108 + console.error(` ✗ Error processing ${jsonPath}:`, error); 109 + throw error; 110 + } 111 + }
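The export-name rule in `lexiconIdToExportName` can be checked in isolation. A standalone copy of the conversion (duplicated here purely for illustration) with worked examples:

```typescript
// Standalone copy of the ID-to-export-name conversion used by gen-from-json:
// keep the first dot-separated segment as-is (lowercase by NSID convention),
// capitalize the first letter of each later segment, and preserve any
// camelCase already present inside a segment.
function lexiconIdToExportName(id: string): string {
  return id
    .split(".")
    .map((part, index) =>
      index === 0 ? part : part.charAt(0).toUpperCase() + part.slice(1),
    )
    .join("");
}

console.log(lexiconIdToExportName("app.bsky.feed.post")); // appBskyFeedPost
console.log(lexiconIdToExportName("com.atproto.repo.createRecord")); // comAtprotoRepoCreateRecord
console.log(lexiconIdToExportName("simple")); // simple
```

This matches the cases exercised in `gen-from-json.test.ts`, including single-segment IDs and IDs whose final segment is already camelCase.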
+7
packages/prototypey/cli/main.ts
··· 2 3 import sade from "sade"; 4 import { genEmit } from "./gen-emit.ts"; 5 import pkg from "../package.json" with { type: "json" }; 6 7 const prog = sade("prototypey"); ··· 13 .describe("Emit JSON lexicon schemas from authored TypeScript") 14 .example("gen-emit ./lexicons ./src/lexicons/**/*.ts") 15 .action(genEmit); 16 17 prog.parse(process.argv);
··· 2 3 import sade from "sade"; 4 import { genEmit } from "./gen-emit.ts"; 5 + import { genFromJSON } from "./gen-from-json.ts"; 6 import pkg from "../package.json" with { type: "json" }; 7 8 const prog = sade("prototypey"); ··· 14 .describe("Emit JSON lexicon schemas from authored TypeScript") 15 .example("gen-emit ./lexicons ./src/lexicons/**/*.ts") 16 .action(genEmit); 17 + 18 + prog 19 + .command("gen-from-json <outdir> <sources...>") 20 + .describe("Generate TypeScript files from JSON lexicon schemas") 21 + .example("gen-from-json ./src/lexicons ./lexicons/**/*.json") 22 + .action(genFromJSON); 23 24 prog.parse(process.argv);
+434
packages/prototypey/cli/tests/gen-from-json.test.ts
···
··· 1 + import { expect, test, describe, beforeEach, afterEach } from "vitest"; 2 + import { mkdir, writeFile, rm, readFile } from "node:fs/promises"; 3 + import { join } from "node:path"; 4 + import { tmpdir } from "node:os"; 5 + import { genFromJSON } from "../gen-from-json.ts"; 6 + 7 + describe("genFromJSON", () => { 8 + let testDir: string; 9 + let outDir: string; 10 + 11 + beforeEach(async () => { 12 + // Create a temporary directory for test files 13 + testDir = join(tmpdir(), `prototypey-test-import-${String(Date.now())}`); 14 + outDir = join(testDir, "output"); 15 + await mkdir(testDir, { recursive: true }); 16 + await mkdir(outDir, { recursive: true }); 17 + }); 18 + 19 + afterEach(async () => { 20 + // Clean up test directory 21 + await rm(testDir, { recursive: true, force: true }); 22 + }); 23 + 24 + test("generates TypeScript from a simple JSON lexicon", async () => { 25 + // Create a test JSON lexicon file 26 + const jsonFile = join(testDir, "app.bsky.actor.profile.json"); 27 + const lexiconJSON = { 28 + lexicon: 1, 29 + id: "app.bsky.actor.profile", 30 + defs: { 31 + main: { 32 + type: "record", 33 + key: "self", 34 + record: { 35 + type: "object", 36 + properties: { 37 + displayName: { 38 + type: "string", 39 + maxLength: 64, 40 + maxGraphemes: 64, 41 + }, 42 + description: { 43 + type: "string", 44 + maxLength: 256, 45 + maxGraphemes: 256, 46 + }, 47 + }, 48 + }, 49 + }, 50 + }, 51 + }; 52 + 53 + await writeFile(jsonFile, JSON.stringify(lexiconJSON, null, 2)); 54 + 55 + // Run the from-json command 56 + await genFromJSON(outDir, jsonFile); 57 + 58 + // Read the generated TypeScript file 59 + const outputFile = join(outDir, "app.bsky.actor.profile.ts"); 60 + const content = await readFile(outputFile, "utf-8"); 61 + 62 + // Verify the structure 63 + expect(content).toContain('import { fromJSON } from "prototypey"'); 64 + expect(content).toContain("export const appBskyActorProfile = fromJSON("); 65 + expect(content).toContain('"id": 
"app.bsky.actor.profile"'); 66 + expect(content).toContain('"lexicon": 1'); 67 + 68 + // Verify it can be parsed as JSON within the call 69 + const jsonMatch = content.match(/fromJSON\(([\s\S]+)\);/); 70 + expect(jsonMatch).toBeTruthy(); 71 + if (jsonMatch) { 72 + const parsedJSON = JSON.parse(jsonMatch[1]); 73 + expect(parsedJSON).toEqual(lexiconJSON); 74 + } 75 + }); 76 + 77 + test("handles multiple JSON files with glob pattern", async () => { 78 + // Create multiple test JSON files 79 + const lexicons = join(testDir, "lexicons"); 80 + await mkdir(lexicons, { recursive: true }); 81 + 82 + const profileJSON = { 83 + lexicon: 1, 84 + id: "app.bsky.actor.profile", 85 + defs: { 86 + main: { 87 + type: "record", 88 + key: "self", 89 + record: { type: "object", properties: {} }, 90 + }, 91 + }, 92 + }; 93 + 94 + const postJSON = { 95 + lexicon: 1, 96 + id: "app.bsky.feed.post", 97 + defs: { 98 + main: { 99 + type: "record", 100 + key: "tid", 101 + record: { type: "object", properties: {} }, 102 + }, 103 + }, 104 + }; 105 + 106 + await writeFile( 107 + join(lexicons, "app.bsky.actor.profile.json"), 108 + JSON.stringify(profileJSON, null, 2), 109 + ); 110 + await writeFile( 111 + join(lexicons, "app.bsky.feed.post.json"), 112 + JSON.stringify(postJSON, null, 2), 113 + ); 114 + 115 + // Run with glob pattern 116 + await genFromJSON(outDir, `${lexicons}/*.json`); 117 + 118 + // Verify both files were created 119 + const profileTS = await readFile( 120 + join(outDir, "app.bsky.actor.profile.ts"), 121 + "utf-8", 122 + ); 123 + const postTS = await readFile( 124 + join(outDir, "app.bsky.feed.post.ts"), 125 + "utf-8", 126 + ); 127 + 128 + expect(profileTS).toContain("appBskyActorProfile"); 129 + expect(postTS).toContain("appBskyFeedPost"); 130 + }); 131 + 132 + test("generates correct export names from lexicon IDs", async () => { 133 + const testCases = [ 134 + { id: "app.bsky.feed.post", expectedName: "appBskyFeedPost" }, 135 + { 136 + id: "com.atproto.repo.createRecord", 137 
+ expectedName: "comAtprotoRepoCreateRecord", 138 + }, 139 + { id: "app.bsky.actor.profile", expectedName: "appBskyActorProfile" }, 140 + { id: "simple", expectedName: "simple" }, 141 + ]; 142 + 143 + for (const { id, expectedName } of testCases) { 144 + const jsonFile = join(testDir, `${id}.json`); 145 + const lexiconJSON = { 146 + lexicon: 1, 147 + id, 148 + defs: { main: { type: "object", properties: {} } }, 149 + }; 150 + 151 + await writeFile(jsonFile, JSON.stringify(lexiconJSON, null, 2)); 152 + await genFromJSON(outDir, jsonFile); 153 + 154 + const outputFile = join(outDir, `${id}.ts`); 155 + const content = await readFile(outputFile, "utf-8"); 156 + 157 + expect(content).toContain(`export const ${expectedName} = fromJSON(`); 158 + } 159 + }); 160 + 161 + test("generates TypeScript from query endpoint JSON", async () => { 162 + const jsonFile = join(testDir, "app.bsky.feed.searchPosts.json"); 163 + const lexiconJSON = { 164 + lexicon: 1, 165 + id: "app.bsky.feed.searchPosts", 166 + defs: { 167 + main: { 168 + type: "query", 169 + description: "Find posts matching search criteria", 170 + parameters: { 171 + type: "params", 172 + properties: { 173 + q: { type: "string", required: true }, 174 + limit: { type: "integer", minimum: 1, maximum: 100, default: 25 }, 175 + cursor: { type: "string" }, 176 + }, 177 + required: ["q"], 178 + }, 179 + output: { 180 + encoding: "application/json", 181 + schema: { 182 + type: "object", 183 + properties: { 184 + cursor: { type: "string" }, 185 + posts: { 186 + type: "array", 187 + items: { type: "ref", ref: "app.bsky.feed.defs#postView" }, 188 + required: true, 189 + }, 190 + }, 191 + required: ["posts"], 192 + }, 193 + }, 194 + }, 195 + }, 196 + }; 197 + 198 + await writeFile(jsonFile, JSON.stringify(lexiconJSON, null, 2)); 199 + await genFromJSON(outDir, jsonFile); 200 + 201 + const outputFile = join(outDir, "app.bsky.feed.searchPosts.ts"); 202 + const content = await readFile(outputFile, "utf-8"); 203 + 204 + 
expect(content).toContain("appBskyFeedSearchPosts"); 205 + expect(content).toContain('"type": "query"'); 206 + expect(content).toContain("Find posts matching search criteria"); 207 + }); 208 + 209 + test("generates TypeScript from procedure endpoint JSON", async () => { 210 + const jsonFile = join(testDir, "com.atproto.repo.createRecord.json"); 211 + const lexiconJSON = { 212 + lexicon: 1, 213 + id: "com.atproto.repo.createRecord", 214 + defs: { 215 + main: { 216 + type: "procedure", 217 + description: "Create a record", 218 + input: { 219 + encoding: "application/json", 220 + schema: { 221 + type: "object", 222 + properties: { 223 + repo: { type: "string", required: true }, 224 + collection: { type: "string", required: true }, 225 + record: { type: "unknown", required: true }, 226 + }, 227 + required: ["repo", "collection", "record"], 228 + }, 229 + }, 230 + output: { 231 + encoding: "application/json", 232 + schema: { 233 + type: "object", 234 + properties: { 235 + uri: { type: "string", required: true }, 236 + cid: { type: "string", required: true }, 237 + }, 238 + required: ["uri", "cid"], 239 + }, 240 + }, 241 + }, 242 + }, 243 + }; 244 + 245 + await writeFile(jsonFile, JSON.stringify(lexiconJSON, null, 2)); 246 + await genFromJSON(outDir, jsonFile); 247 + 248 + const outputFile = join(outDir, "com.atproto.repo.createRecord.ts"); 249 + const content = await readFile(outputFile, "utf-8"); 250 + 251 + expect(content).toContain("comAtprotoRepoCreateRecord"); 252 + expect(content).toContain('"type": "procedure"'); 253 + }); 254 + 255 + test("generates TypeScript from subscription endpoint JSON", async () => { 256 + const jsonFile = join(testDir, "com.atproto.sync.subscribeRepos.json"); 257 + const lexiconJSON = { 258 + lexicon: 1, 259 + id: "com.atproto.sync.subscribeRepos", 260 + defs: { 261 + main: { 262 + type: "subscription", 263 + description: "Repository event stream", 264 + parameters: { 265 + type: "params", 266 + properties: { 267 + cursor: { type: 
"integer" }, 268 + }, 269 + }, 270 + message: { 271 + schema: { 272 + type: "union", 273 + refs: ["#commit", "#identity", "#account"], 274 + }, 275 + }, 276 + }, 277 + commit: { 278 + type: "object", 279 + properties: { 280 + seq: { type: "integer", required: true }, 281 + rebase: { type: "boolean", required: true }, 282 + }, 283 + required: ["seq", "rebase"], 284 + }, 285 + identity: { 286 + type: "object", 287 + properties: { 288 + seq: { type: "integer", required: true }, 289 + did: { type: "string", format: "did", required: true }, 290 + }, 291 + required: ["seq", "did"], 292 + }, 293 + account: { 294 + type: "object", 295 + properties: { 296 + seq: { type: "integer", required: true }, 297 + active: { type: "boolean", required: true }, 298 + }, 299 + required: ["seq", "active"], 300 + }, 301 + }, 302 + }; 303 + 304 + await writeFile(jsonFile, JSON.stringify(lexiconJSON, null, 2)); 305 + await genFromJSON(outDir, jsonFile); 306 + 307 + const outputFile = join(outDir, "com.atproto.sync.subscribeRepos.ts"); 308 + const content = await readFile(outputFile, "utf-8"); 309 + 310 + expect(content).toContain("comAtprotoSyncSubscribeRepos"); 311 + expect(content).toContain('"type": "subscription"'); 312 + expect(content).toContain("commit"); 313 + expect(content).toContain("identity"); 314 + expect(content).toContain("account"); 315 + }); 316 + 317 + test("generates TypeScript from complex namespace with refs and unions", async () => { 318 + const jsonFile = join(testDir, "app.bsky.feed.defs.json"); 319 + const lexiconJSON = { 320 + lexicon: 1, 321 + id: "app.bsky.feed.defs", 322 + defs: { 323 + postView: { 324 + type: "object", 325 + properties: { 326 + uri: { type: "string", format: "at-uri", required: true }, 327 + cid: { type: "string", format: "cid", required: true }, 328 + author: { 329 + type: "ref", 330 + ref: "app.bsky.actor.defs#profileViewBasic", 331 + required: true, 332 + }, 333 + embed: { 334 + type: "union", 335 + refs: ["app.bsky.embed.images#view", 
"app.bsky.embed.video#view"], 336 + }, 337 + likeCount: { type: "integer", minimum: 0 }, 338 + }, 339 + required: ["uri", "cid", "author"], 340 + }, 341 + requestLess: { 342 + type: "token", 343 + description: "Request less content like this", 344 + }, 345 + requestMore: { 346 + type: "token", 347 + description: "Request more content like this", 348 + }, 349 + }, 350 + }; 351 + 352 + await writeFile(jsonFile, JSON.stringify(lexiconJSON, null, 2)); 353 + await genFromJSON(outDir, jsonFile); 354 + 355 + const outputFile = join(outDir, "app.bsky.feed.defs.ts"); 356 + const content = await readFile(outputFile, "utf-8"); 357 + 358 + expect(content).toContain("appBskyFeedDefs"); 359 + expect(content).toContain("postView"); 360 + expect(content).toContain("requestLess"); 361 + expect(content).toContain("requestMore"); 362 + }); 363 + 364 + test("handles invalid JSON gracefully", async () => { 365 + const jsonFile = join(testDir, "invalid.json"); 366 + await writeFile(jsonFile, "{ this is not valid json }"); 367 + 368 + await expect(genFromJSON(outDir, jsonFile)).rejects.toThrow(); 369 + }); 370 + 371 + test("skips non-lexicon JSON files", async () => { 372 + const jsonFile = join(testDir, "not-a-lexicon.json"); 373 + await writeFile( 374 + jsonFile, 375 + JSON.stringify({ someKey: "someValue" }, null, 2), 376 + ); 377 + 378 + // Should not throw, just warn and skip 379 + await genFromJSON(outDir, jsonFile); 380 + 381 + // Verify no output file was created 382 + const outputFiles = await readdir(outDir).catch(() => []); 383 + expect(outputFiles.length).toBe(0); 384 + }); 385 + 386 + test("round-trip: gen-emit then gen-from-json produces equivalent types", async () => { 387 + // This is an integration test that verifies the round-trip works 388 + const intermediateDir = join(testDir, "json"); 389 + await mkdir(intermediateDir, { recursive: true }); 390 + 391 + // First, create a simple TypeScript lexicon 392 + const tsFile = join(testDir, "original.ts"); 393 + await 
writeFile( 394 + tsFile, 395 + ` 396 + import { lx } from "prototypey"; 397 + 398 + export const postSchema = lx.lexicon("app.bsky.feed.post", { 399 + main: lx.record({ 400 + key: "tid", 401 + record: lx.object({ 402 + text: lx.string({ maxLength: 300, required: true }), 403 + createdAt: lx.string({ format: "datetime", required: true }), 404 + }), 405 + }), 406 + }); 407 + `, 408 + ); 409 + 410 + // Import gen-emit dynamically to use it 411 + const { genEmit } = await import("../gen-emit.ts"); 412 + 413 + // Step 1: gen-emit to create JSON 414 + await genEmit(intermediateDir, tsFile); 415 + 416 + // Step 2: gen-from-json to create TypeScript from JSON 417 + const jsonFile = join(intermediateDir, "app.bsky.feed.post.json"); 418 + await genFromJSON(outDir, jsonFile); 419 + 420 + // Verify the output exists 421 + const outputFile = join(outDir, "app.bsky.feed.post.ts"); 422 + const content = await readFile(outputFile, "utf-8"); 423 + 424 + expect(content).toContain("appBskyFeedPost"); 425 + expect(content).toContain('"id": "app.bsky.feed.post"'); 426 + expect(content).toContain('"type": "record"'); 427 + }); 428 + }); 429 + 430 + // Helper function that was missing from imports 431 + async function readdir(path: string): Promise<string[]> { 432 + const { readdir: fsReaddir } = await import("node:fs/promises"); 433 + return fsReaddir(path); 434 + }
+5
packages/prototypey/core/lib.ts
··· 611 }); 612 }, 613 };
··· 611 }); 612 }, 613 }; 614 + 615 + /** helper to pull lexicon from json directly */ 616 + export function fromJSON<const Lex extends LexiconNamespace>(json: Lex) { 617 + return lx.lexicon<Lex["id"], Lex["defs"]>(json.id, json.defs); 618 + }
+1 -1
packages/prototypey/core/main.ts
··· 1 - export { lx } from "./lib.ts"; 2 export { type Infer } from "./infer.ts";
··· 1 + export { lx, fromJSON } from "./lib.ts"; 2 export { type Infer } from "./infer.ts";
+1258
packages/prototypey/core/tests/from-json-infer.test.ts
···
··· 1 + import { test } from "vitest"; 2 + import { attest } from "@ark/attest"; 3 + import { fromJSON } from "../lib.ts"; 4 + 5 + test("fromJSON InferNS produces expected type shape", () => { 6 + const exampleLexicon = fromJSON({ 7 + id: "com.example.post", 8 + defs: { 9 + main: { 10 + type: "record", 11 + key: "tid", 12 + record: { 13 + type: "object", 14 + properties: { 15 + text: { type: "string", required: true }, 16 + createdAt: { type: "string", required: true, format: "datetime" }, 17 + likes: { type: "integer" }, 18 + tags: { type: "array", items: { type: "string" }, maxLength: 5 }, 19 + }, 20 + required: ["text", "createdAt"], 21 + }, 22 + }, 23 + }, 24 + }); 25 + 26 + // Type snapshot - this captures how types appear on hover 27 + attest(exampleLexicon["~infer"]).type.toString.snap(`{ 28 + $type: "com.example.post" 29 + tags?: string[] | undefined 30 + likes?: number | undefined 31 + createdAt: string 32 + text: string 33 + }`); 34 + }); 35 + 36 + test("fromJSON InferObject handles required fields", () => { 37 + const schema = fromJSON({ 38 + id: "test", 39 + defs: { 40 + main: { 41 + type: "object", 42 + properties: { 43 + required: { type: "string", required: true }, 44 + optional: { type: "string" }, 45 + }, 46 + required: ["required"], 47 + }, 48 + }, 49 + }); 50 + 51 + attest(schema["~infer"]).type.toString.snap(`{ 52 + $type: "test" 53 + optional?: string | undefined 54 + required: string 55 + }`); 56 + }); 57 + 58 + test("fromJSON InferObject handles nullable fields", () => { 59 + const schema = fromJSON({ 60 + id: "test", 61 + defs: { 62 + main: { 63 + type: "object", 64 + properties: { 65 + nullable: { type: "string", nullable: true, required: true }, 66 + }, 67 + required: ["nullable"], 68 + nullable: ["nullable"], 69 + }, 70 + }, 71 + }); 72 + 73 + attest(schema["~infer"]).type.toString.snap( 74 + '{ $type: "test"; nullable: string | null }', 75 + ); 76 + }); 77 + 78 + // 
============================================================================ 79 + // PRIMITIVE TYPES TESTS 80 + // ============================================================================ 81 + 82 + test("fromJSON InferType handles string primitive", () => { 83 + const lexicon = fromJSON({ 84 + id: "test.string", 85 + defs: { 86 + main: { 87 + type: "object", 88 + properties: { 89 + simpleString: { type: "string" }, 90 + }, 91 + }, 92 + }, 93 + }); 94 + 95 + attest(lexicon["~infer"]).type.toString.snap(`{ 96 + $type: "test.string" 97 + simpleString?: string | undefined 98 + }`); 99 + }); 100 + 101 + test("fromJSON InferType handles integer primitive", () => { 102 + const lexicon = fromJSON({ 103 + id: "test.integer", 104 + defs: { 105 + main: { 106 + type: "object", 107 + properties: { 108 + count: { type: "integer" }, 109 + age: { type: "integer", minimum: 0, maximum: 120 }, 110 + }, 111 + }, 112 + }, 113 + }); 114 + 115 + attest(lexicon["~infer"]).type.toString.snap(`{ 116 + $type: "test.integer" 117 + count?: number | undefined 118 + age?: number | undefined 119 + }`); 120 + }); 121 + 122 + test("fromJSON InferType handles boolean primitive", () => { 123 + const lexicon = fromJSON({ 124 + id: "test.boolean", 125 + defs: { 126 + main: { 127 + type: "object", 128 + properties: { 129 + isActive: { type: "boolean" }, 130 + hasAccess: { type: "boolean", required: true }, 131 + }, 132 + required: ["hasAccess"], 133 + }, 134 + }, 135 + }); 136 + 137 + attest(lexicon["~infer"]).type.toString.snap(`{ 138 + $type: "test.boolean" 139 + isActive?: boolean | undefined 140 + hasAccess: boolean 141 + }`); 142 + }); 143 + 144 + test("fromJSON InferType handles null primitive", () => { 145 + const lexicon = fromJSON({ 146 + id: "test.null", 147 + defs: { 148 + main: { 149 + type: "object", 150 + properties: { 151 + nullValue: { type: "null" }, 152 + }, 153 + }, 154 + }, 155 + }); 156 + 157 + attest(lexicon["~infer"]).type.toString.snap(`{ 158 + $type: "test.null" 159 + 
nullValue?: null | undefined 160 + }`); 161 + }); 162 + 163 + test("fromJSON InferType handles unknown primitive", () => { 164 + const lexicon = fromJSON({ 165 + id: "test.unknown", 166 + defs: { 167 + main: { 168 + type: "object", 169 + properties: { 170 + metadata: { type: "unknown" }, 171 + }, 172 + }, 173 + }, 174 + }); 175 + 176 + attest(lexicon["~infer"]).type.toString.snap( 177 + '{ $type: "test.unknown"; metadata?: unknown }', 178 + ); 179 + }); 180 + 181 + test("fromJSON InferType handles bytes primitive", () => { 182 + const lexicon = fromJSON({ 183 + id: "test.bytes", 184 + defs: { 185 + main: { 186 + type: "object", 187 + properties: { 188 + data: { type: "bytes" }, 189 + }, 190 + }, 191 + }, 192 + }); 193 + 194 + attest(lexicon["~infer"]).type.toString.snap(`{ 195 + $type: "test.bytes" 196 + data?: Uint8Array<ArrayBufferLike> | undefined 197 + }`); 198 + }); 199 + 200 + test("fromJSON InferType handles blob primitive", () => { 201 + const lexicon = fromJSON({ 202 + id: "test.blob", 203 + defs: { 204 + main: { 205 + type: "object", 206 + properties: { 207 + image: { 208 + type: "blob", 209 + accept: ["image/png", "image/jpeg"], 210 + }, 211 + }, 212 + }, 213 + }, 214 + }); 215 + 216 + attest(lexicon["~infer"]).type.toString.snap( 217 + '{ $type: "test.blob"; image?: Blob | undefined }', 218 + ); 219 + }); 220 + 221 + // ============================================================================ 222 + // TOKEN TYPE TESTS 223 + // ============================================================================ 224 + 225 + test("fromJSON InferToken handles basic token without enum", () => { 226 + const lexicon = fromJSON({ 227 + id: "test.token", 228 + defs: { 229 + main: { 230 + type: "object", 231 + properties: { 232 + symbol: { type: "token", description: "A symbolic value" }, 233 + }, 234 + }, 235 + }, 236 + }); 237 + 238 + attest(lexicon["~infer"]).type.toString.snap(`{ 239 + $type: "test.token" 240 + symbol?: string | undefined 241 + }`); 242 + }); 243 
+ 244 + // ============================================================================ 245 + // ARRAY TYPE TESTS 246 + // ============================================================================ 247 + 248 + test("fromJSON InferArray handles string arrays", () => { 249 + const lexicon = fromJSON({ 250 + id: "test.array.string", 251 + defs: { 252 + main: { 253 + type: "object", 254 + properties: { 255 + tags: { type: "array", items: { type: "string" } }, 256 + }, 257 + }, 258 + }, 259 + }); 260 + 261 + attest(lexicon["~infer"]).type.toString.snap(`{ 262 + $type: "test.array.string" 263 + tags?: string[] | undefined 264 + }`); 265 + }); 266 + 267 + test("fromJSON InferArray handles integer arrays", () => { 268 + const lexicon = fromJSON({ 269 + id: "test.array.integer", 270 + defs: { 271 + main: { 272 + type: "object", 273 + properties: { 274 + scores: { 275 + type: "array", 276 + items: { type: "integer" }, 277 + minLength: 1, 278 + maxLength: 10, 279 + }, 280 + }, 281 + }, 282 + }, 283 + }); 284 + 285 + attest(lexicon["~infer"]).type.toString.snap(`{ 286 + $type: "test.array.integer" 287 + scores?: number[] | undefined 288 + }`); 289 + }); 290 + 291 + test("fromJSON InferArray handles boolean arrays", () => { 292 + const lexicon = fromJSON({ 293 + id: "test.array.boolean", 294 + defs: { 295 + main: { 296 + type: "object", 297 + properties: { 298 + flags: { type: "array", items: { type: "boolean" } }, 299 + }, 300 + }, 301 + }, 302 + }); 303 + 304 + attest(lexicon["~infer"]).type.toString.snap(`{ 305 + $type: "test.array.boolean" 306 + flags?: boolean[] | undefined 307 + }`); 308 + }); 309 + 310 + test("fromJSON InferArray handles unknown arrays", () => { 311 + const lexicon = fromJSON({ 312 + id: "test.array.unknown", 313 + defs: { 314 + main: { 315 + type: "object", 316 + properties: { 317 + items: { type: "array", items: { type: "unknown" } }, 318 + }, 319 + }, 320 + }, 321 + }); 322 + 323 + attest(lexicon["~infer"]).type.toString.snap(`{ 324 + $type: 
"test.array.unknown" 325 + items?: unknown[] | undefined 326 + }`); 327 + }); 328 + 329 + // ============================================================================ 330 + // OBJECT PROPERTY COMBINATIONS 331 + // ============================================================================ 332 + 333 + test("fromJSON InferObject handles mixed optional and required fields", () => { 334 + const lexicon = fromJSON({ 335 + id: "test.mixed", 336 + defs: { 337 + main: { 338 + type: "object", 339 + properties: { 340 + id: { type: "string", required: true }, 341 + name: { type: "string", required: true }, 342 + email: { type: "string" }, 343 + age: { type: "integer" }, 344 + }, 345 + required: ["id", "name"], 346 + }, 347 + }, 348 + }); 349 + 350 + attest(lexicon["~infer"]).type.toString.snap(`{ 351 + $type: "test.mixed" 352 + age?: number | undefined 353 + email?: string | undefined 354 + id: string 355 + name: string 356 + }`); 357 + }); 358 + 359 + test("fromJSON InferObject handles all optional fields", () => { 360 + const lexicon = fromJSON({ 361 + id: "test.allOptional", 362 + defs: { 363 + main: { 364 + type: "object", 365 + properties: { 366 + field1: { type: "string" }, 367 + field2: { type: "integer" }, 368 + field3: { type: "boolean" }, 369 + }, 370 + }, 371 + }, 372 + }); 373 + 374 + attest(lexicon["~infer"]).type.toString.snap(`{ 375 + $type: "test.allOptional" 376 + field1?: string | undefined 377 + field2?: number | undefined 378 + field3?: boolean | undefined 379 + }`); 380 + }); 381 + 382 + test("fromJSON InferObject handles all required fields", () => { 383 + const lexicon = fromJSON({ 384 + id: "test.allRequired", 385 + defs: { 386 + main: { 387 + type: "object", 388 + properties: { 389 + field1: { type: "string", required: true }, 390 + field2: { type: "integer", required: true }, 391 + field3: { type: "boolean", required: true }, 392 + }, 393 + required: ["field1", "field2", "field3"], 394 + }, 395 + }, 396 + }); 397 + 398 + 
attest(lexicon["~infer"]).type.toString.snap(`{ 399 + $type: "test.allRequired" 400 + field1: string 401 + field2: number 402 + field3: boolean 403 + }`); 404 + }); 405 + 406 + // ============================================================================ 407 + // NULLABLE FIELDS TESTS 408 + // ============================================================================ 409 + 410 + test("fromJSON InferObject handles nullable optional field", () => { 411 + const lexicon = fromJSON({ 412 + id: "test.nullableOptional", 413 + defs: { 414 + main: { 415 + type: "object", 416 + properties: { 417 + description: { type: "string", nullable: true }, 418 + }, 419 + nullable: ["description"], 420 + }, 421 + }, 422 + }); 423 + 424 + attest(lexicon["~infer"]).type.toString.snap(`{ 425 + $type: "test.nullableOptional" 426 + description?: string | null | undefined 427 + }`); 428 + }); 429 + 430 + test("fromJSON InferObject handles multiple nullable fields", () => { 431 + const lexicon = fromJSON({ 432 + id: "test.multipleNullable", 433 + defs: { 434 + main: { 435 + type: "object", 436 + properties: { 437 + field1: { type: "string", nullable: true }, 438 + field2: { type: "integer", nullable: true }, 439 + field3: { type: "boolean", nullable: true }, 440 + }, 441 + nullable: ["field1", "field2", "field3"], 442 + }, 443 + }, 444 + }); 445 + 446 + attest(lexicon["~infer"]).type.toString.snap(`{ 447 + $type: "test.multipleNullable" 448 + field1?: string | null | undefined 449 + field2?: number | null | undefined 450 + field3?: boolean | null | undefined 451 + }`); 452 + }); 453 + 454 + test("fromJSON InferObject handles nullable and required field", () => { 455 + const lexicon = fromJSON({ 456 + id: "test.nullableRequired", 457 + defs: { 458 + main: { 459 + type: "object", 460 + properties: { 461 + value: { type: "string", nullable: true, required: true }, 462 + }, 463 + required: ["value"], 464 + nullable: ["value"], 465 + }, 466 + }, 467 + }); 468 + 469 + 
attest(lexicon["~infer"]).type.toString.snap(`{ 470 + $type: "test.nullableRequired" 471 + value: string | null 472 + }`); 473 + }); 474 + 475 + test("fromJSON InferObject handles mixed nullable, required, and optional", () => { 476 + const lexicon = fromJSON({ 477 + id: "test.mixedNullable", 478 + defs: { 479 + main: { 480 + type: "object", 481 + properties: { 482 + requiredNullable: { type: "string", required: true, nullable: true }, 483 + optionalNullable: { type: "string", nullable: true }, 484 + required: { type: "string", required: true }, 485 + optional: { type: "string" }, 486 + }, 487 + required: ["requiredNullable", "required"], 488 + nullable: ["requiredNullable", "optionalNullable"], 489 + }, 490 + }, 491 + }); 492 + 493 + attest(lexicon["~infer"]).type.toString.snap(`{ 494 + $type: "test.mixedNullable" 495 + optional?: string | undefined 496 + required: string 497 + optionalNullable?: string | null | undefined 498 + requiredNullable: string | null 499 + }`); 500 + }); 501 + 502 + // ============================================================================ 503 + // REF TYPE TESTS 504 + // ============================================================================ 505 + 506 + test("fromJSON InferRef handles basic reference", () => { 507 + const lexicon = fromJSON({ 508 + id: "test.ref", 509 + defs: { 510 + main: { 511 + type: "object", 512 + properties: { 513 + post: { type: "ref", ref: "com.example.post" }, 514 + }, 515 + }, 516 + }, 517 + }); 518 + 519 + attest(lexicon["~infer"]).type.toString.snap(`{ 520 + $type: "test.ref" 521 + post?: 522 + | { [x: string]: unknown; $type: "com.example.post" } 523 + | undefined 524 + }`); 525 + }); 526 + 527 + test("fromJSON InferRef handles required reference", () => { 528 + const lexicon = fromJSON({ 529 + id: "test.refRequired", 530 + defs: { 531 + main: { 532 + type: "object", 533 + properties: { 534 + author: { type: "ref", ref: "com.example.user", required: true }, 535 + }, 536 + required: ["author"], 537 
+ }, 538 + }, 539 + }); 540 + 541 + attest(lexicon["~infer"]).type.toString.snap(`{ 542 + $type: "test.refRequired" 543 + author: { 544 + [x: string]: unknown 545 + $type: "com.example.user" 546 + } 547 + }`); 548 + }); 549 + 550 + test("fromJSON InferRef handles nullable reference", () => { 551 + const lexicon = fromJSON({ 552 + id: "test.refNullable", 553 + defs: { 554 + main: { 555 + type: "object", 556 + properties: { 557 + parent: { type: "ref", ref: "com.example.node", nullable: true }, 558 + }, 559 + nullable: ["parent"], 560 + }, 561 + }, 562 + }); 563 + 564 + attest(lexicon["~infer"]).type.toString.snap(`{ 565 + $type: "test.refNullable" 566 + parent?: 567 + | { [x: string]: unknown; $type: "com.example.node" } 568 + | null 569 + | undefined 570 + }`); 571 + }); 572 + 573 + // ============================================================================ 574 + // UNION TYPE TESTS 575 + // ============================================================================ 576 + 577 + test("fromJSON InferUnion handles basic union", () => { 578 + const lexicon = fromJSON({ 579 + id: "test.union", 580 + defs: { 581 + main: { 582 + type: "object", 583 + properties: { 584 + content: { 585 + type: "union", 586 + refs: ["com.example.text", "com.example.image"], 587 + }, 588 + }, 589 + }, 590 + }, 591 + }); 592 + 593 + attest(lexicon["~infer"]).type.toString.snap(`{ 594 + $type: "test.union" 595 + content?: 596 + | { [x: string]: unknown; $type: "com.example.text" } 597 + | { [x: string]: unknown; $type: "com.example.image" } 598 + | undefined 599 + }`); 600 + }); 601 + 602 + test("fromJSON InferUnion handles required union", () => { 603 + const lexicon = fromJSON({ 604 + id: "test.unionRequired", 605 + defs: { 606 + main: { 607 + type: "object", 608 + properties: { 609 + media: { 610 + type: "union", 611 + refs: ["com.example.video", "com.example.audio"], 612 + required: true, 613 + }, 614 + }, 615 + required: ["media"], 616 + }, 617 + }, 618 + }); 619 + 620 + 
attest(lexicon["~infer"]).type.toString.snap(`{ 621 + $type: "test.unionRequired" 622 + media: 623 + | { [x: string]: unknown; $type: "com.example.video" } 624 + | { [x: string]: unknown; $type: "com.example.audio" } 625 + }`); 626 + }); 627 + 628 + test("fromJSON InferUnion handles union with many types", () => { 629 + const lexicon = fromJSON({ 630 + id: "test.unionMultiple", 631 + defs: { 632 + main: { 633 + type: "object", 634 + properties: { 635 + attachment: { 636 + type: "union", 637 + refs: [ 638 + "com.example.image", 639 + "com.example.video", 640 + "com.example.audio", 641 + "com.example.document", 642 + ], 643 + }, 644 + }, 645 + }, 646 + }, 647 + }); 648 + 649 + attest(lexicon["~infer"]).type.toString.snap(`{ 650 + $type: "test.unionMultiple" 651 + attachment?: 652 + | { [x: string]: unknown; $type: "com.example.image" } 653 + | { [x: string]: unknown; $type: "com.example.video" } 654 + | { [x: string]: unknown; $type: "com.example.audio" } 655 + | { 656 + [x: string]: unknown 657 + $type: "com.example.document" 658 + } 659 + | undefined 660 + }`); 661 + }); 662 + 663 + // ============================================================================ 664 + // PARAMS TYPE TESTS 665 + // ============================================================================ 666 + 667 + test("fromJSON InferParams handles basic params", () => { 668 + const lexicon = fromJSON({ 669 + id: "test.params", 670 + defs: { 671 + main: { 672 + type: "params", 673 + properties: { 674 + limit: { type: "integer" }, 675 + offset: { type: "integer" }, 676 + }, 677 + }, 678 + }, 679 + }); 680 + 681 + attest(lexicon["~infer"]).type.toString.snap(`{ 682 + $type: "test.params" 683 + limit?: number | undefined 684 + offset?: number | undefined 685 + }`); 686 + }); 687 + 688 + test("fromJSON InferParams handles required params", () => { 689 + const lexicon = fromJSON({ 690 + id: "test.paramsRequired", 691 + defs: { 692 + main: { 693 + type: "params", 694 + properties: { 695 + query: { 
type: "string", required: true }, 696 + limit: { type: "integer" }, 697 + }, 698 + required: ["query"], 699 + }, 700 + }, 701 + }); 702 + 703 + attest(lexicon["~infer"]).type.toString.snap(`{ 704 + $type: "test.paramsRequired" 705 + limit?: number | undefined 706 + query: string 707 + }`); 708 + }); 709 + 710 + // ============================================================================ 711 + // RECORD TYPE TESTS 712 + // ============================================================================ 713 + 714 + test("fromJSON InferRecord handles record with object schema", () => { 715 + const lexicon = fromJSON({ 716 + id: "test.record", 717 + defs: { 718 + main: { 719 + type: "record", 720 + key: "tid", 721 + record: { 722 + type: "object", 723 + properties: { 724 + title: { type: "string", required: true }, 725 + content: { type: "string", required: true }, 726 + published: { type: "boolean" }, 727 + }, 728 + required: ["title", "content"], 729 + }, 730 + }, 731 + }, 732 + }); 733 + 734 + attest(lexicon["~infer"]).type.toString.snap(`{ 735 + $type: "test.record" 736 + published?: boolean | undefined 737 + content: string 738 + title: string 739 + }`); 740 + }); 741 + 742 + // ============================================================================ 743 + // NESTED OBJECTS TESTS 744 + // ============================================================================ 745 + 746 + test("fromJSON InferObject handles nested objects", () => { 747 + const lexicon = fromJSON({ 748 + id: "test.nested", 749 + defs: { 750 + main: { 751 + type: "object", 752 + properties: { 753 + user: { 754 + type: "object", 755 + properties: { 756 + name: { type: "string", required: true }, 757 + email: { type: "string", required: true }, 758 + }, 759 + required: ["name", "email"], 760 + }, 761 + }, 762 + }, 763 + }, 764 + }); 765 + 766 + attest(lexicon["~infer"]).type.toString.snap(`{ 767 + $type: "test.nested" 768 + user?: { name: string; email: string } | undefined 769 + }`); 770 + }); 
771 + 772 + test("fromJSON InferObject handles deeply nested objects", () => { 773 + const lexicon = fromJSON({ 774 + id: "test.deepNested", 775 + defs: { 776 + main: { 777 + type: "object", 778 + properties: { 779 + data: { 780 + type: "object", 781 + properties: { 782 + user: { 783 + type: "object", 784 + properties: { 785 + profile: { 786 + type: "object", 787 + properties: { 788 + name: { type: "string", required: true }, 789 + }, 790 + required: ["name"], 791 + }, 792 + }, 793 + }, 794 + }, 795 + }, 796 + }, 797 + }, 798 + }, 799 + }); 800 + 801 + attest(lexicon["~infer"]).type.toString.snap(`{ 802 + $type: "test.deepNested" 803 + data?: 804 + | { 805 + user?: 806 + | { profile?: { name: string } | undefined } 807 + | undefined 808 + } 809 + | undefined 810 + }`); 811 + }); 812 + 813 + // ============================================================================ 814 + // NESTED ARRAYS TESTS 815 + // ============================================================================ 816 + 817 + test("fromJSON InferArray handles arrays of objects", () => { 818 + const lexicon = fromJSON({ 819 + id: "test.arrayOfObjects", 820 + defs: { 821 + main: { 822 + type: "object", 823 + properties: { 824 + users: { 825 + type: "array", 826 + items: { 827 + type: "object", 828 + properties: { 829 + id: { type: "string", required: true }, 830 + name: { type: "string", required: true }, 831 + }, 832 + required: ["id", "name"], 833 + }, 834 + }, 835 + }, 836 + }, 837 + }, 838 + }); 839 + 840 + attest(lexicon["~infer"]).type.toString.snap(`{ 841 + $type: "test.arrayOfObjects" 842 + users?: { id: string; name: string }[] | undefined 843 + }`); 844 + }); 845 + 846 + test("fromJSON InferArray handles arrays of arrays", () => { 847 + const lexicon = fromJSON({ 848 + id: "test.nestedArrays", 849 + defs: { 850 + main: { 851 + type: "object", 852 + properties: { 853 + matrix: { 854 + type: "array", 855 + items: { type: "array", items: { type: "integer" } }, 856 + }, 857 + }, 858 + }, 859 + 
}, 860 + }); 861 + 862 + attest(lexicon["~infer"]).type.toString.snap(`{ 863 + $type: "test.nestedArrays" 864 + matrix?: number[][] | undefined 865 + }`); 866 + }); 867 + 868 + test("fromJSON InferArray handles arrays of refs", () => { 869 + const lexicon = fromJSON({ 870 + id: "test.arrayOfRefs", 871 + defs: { 872 + main: { 873 + type: "object", 874 + properties: { 875 + followers: { 876 + type: "array", 877 + items: { type: "ref", ref: "com.example.user" }, 878 + }, 879 + }, 880 + }, 881 + }, 882 + }); 883 + 884 + attest(lexicon["~infer"]).type.toString.snap(`{ 885 + $type: "test.arrayOfRefs" 886 + followers?: 887 + | { [x: string]: unknown; $type: "com.example.user" }[] 888 + | undefined 889 + }`); 890 + }); 891 + 892 + // ============================================================================ 893 + // COMPLEX NESTED STRUCTURES 894 + // ============================================================================ 895 + 896 + test("fromJSON InferObject handles complex nested structure", () => { 897 + const lexicon = fromJSON({ 898 + id: "test.complex", 899 + defs: { 900 + main: { 901 + type: "object", 902 + properties: { 903 + id: { type: "string", required: true }, 904 + author: { 905 + type: "object", 906 + properties: { 907 + did: { type: "string", required: true, format: "did" }, 908 + handle: { type: "string", required: true, format: "handle" }, 909 + avatar: { type: "string" }, 910 + }, 911 + required: ["did", "handle"], 912 + }, 913 + content: { 914 + type: "union", 915 + refs: ["com.example.text", "com.example.image"], 916 + }, 917 + tags: { type: "array", items: { type: "string" }, maxLength: 10 }, 918 + metadata: { 919 + type: "object", 920 + properties: { 921 + views: { type: "integer" }, 922 + likes: { type: "integer" }, 923 + shares: { type: "integer" }, 924 + }, 925 + }, 926 + }, 927 + required: ["id"], 928 + }, 929 + }, 930 + }); 931 + 932 + attest(lexicon["~infer"]).type.toString.snap(`{ 933 + $type: "test.complex" 934 + tags?: string[] | 
undefined 935 + content?: 936 + | { [x: string]: unknown; $type: "com.example.text" } 937 + | { [x: string]: unknown; $type: "com.example.image" } 938 + | undefined 939 + author?: 940 + | { 941 + avatar?: string | undefined 942 + did: string 943 + handle: string 944 + } 945 + | undefined 946 + metadata?: 947 + | { 948 + likes?: number | undefined 949 + views?: number | undefined 950 + shares?: number | undefined 951 + } 952 + | undefined 953 + id: string 954 + }`); 955 + }); 956 + 957 + // ============================================================================ 958 + // MULTIPLE DEFS IN NAMESPACE 959 + // ============================================================================ 960 + 961 + test("fromJSON InferNS handles multiple defs in namespace", () => { 962 + const lexicon = fromJSON({ 963 + id: "com.example.app", 964 + defs: { 965 + user: { 966 + type: "object", 967 + properties: { 968 + name: { type: "string", required: true }, 969 + email: { type: "string", required: true }, 970 + }, 971 + required: ["name", "email"], 972 + }, 973 + post: { 974 + type: "object", 975 + properties: { 976 + title: { type: "string", required: true }, 977 + content: { type: "string", required: true }, 978 + }, 979 + required: ["title", "content"], 980 + }, 981 + comment: { 982 + type: "object", 983 + properties: { 984 + text: { type: "string", required: true }, 985 + author: { type: "ref", ref: "com.example.user" }, 986 + }, 987 + required: ["text"], 988 + }, 989 + }, 990 + }); 991 + 992 + attest(lexicon["~infer"]).type.toString.snap("never"); 993 + }); 994 + 995 + test("fromJSON InferNS handles namespace with record and object defs", () => { 996 + const lexicon = fromJSON({ 997 + id: "com.example.blog", 998 + defs: { 999 + main: { 1000 + type: "record", 1001 + key: "tid", 1002 + record: { 1003 + type: "object", 1004 + properties: { 1005 + title: { type: "string", required: true }, 1006 + body: { type: "string", required: true }, 1007 + }, 1008 + required: ["title", 
"body"], 1009 + }, 1010 + }, 1011 + metadata: { 1012 + type: "object", 1013 + properties: { 1014 + category: { type: "string" }, 1015 + tags: { type: "array", items: { type: "string" } }, 1016 + }, 1017 + }, 1018 + }, 1019 + }); 1020 + 1021 + attest(lexicon["~infer"]).type.toString.snap(`{ 1022 + $type: "com.example.blog" 1023 + title: string 1024 + body: string 1025 + }`); 1026 + }); 1027 + 1028 + // ============================================================================ 1029 + // LOCAL REF RESOLUTION TESTS 1030 + // ============================================================================ 1031 + 1032 + test("fromJSON Local ref resolution: resolves refs to actual types", () => { 1033 + const ns = fromJSON({ 1034 + id: "test", 1035 + defs: { 1036 + user: { 1037 + type: "object", 1038 + properties: { 1039 + name: { type: "string", required: true }, 1040 + email: { type: "string", required: true }, 1041 + }, 1042 + required: ["name", "email"], 1043 + }, 1044 + main: { 1045 + type: "object", 1046 + properties: { 1047 + author: { type: "ref", ref: "#user", required: true }, 1048 + content: { type: "string", required: true }, 1049 + }, 1050 + required: ["author", "content"], 1051 + }, 1052 + }, 1053 + }); 1054 + 1055 + attest(ns["~infer"]).type.toString.snap(`{ 1056 + $type: "test" 1057 + content: string 1058 + author: { name: string; email: string; $type: "#user" } 1059 + }`); 1060 + }); 1061 + 1062 + test("fromJSON Local ref resolution: refs in arrays", () => { 1063 + const ns = fromJSON({ 1064 + id: "test", 1065 + defs: { 1066 + user: { 1067 + type: "object", 1068 + properties: { 1069 + name: { type: "string", required: true }, 1070 + }, 1071 + required: ["name"], 1072 + }, 1073 + main: { 1074 + type: "object", 1075 + properties: { 1076 + users: { type: "array", items: { type: "ref", ref: "#user" } }, 1077 + }, 1078 + }, 1079 + }, 1080 + }); 1081 + 1082 + attest(ns["~infer"]).type.toString.snap(`{ 1083 + $type: "test" 1084 + users?: { name: string; $type: 
"#user" }[] | undefined 1085 + }`); 1086 + }); 1087 + 1088 + test("fromJSON Local ref resolution: refs in unions", () => { 1089 + const ns = fromJSON({ 1090 + id: "test", 1091 + defs: { 1092 + text: { 1093 + type: "object", 1094 + properties: { content: { type: "string", required: true } }, 1095 + required: ["content"], 1096 + }, 1097 + image: { 1098 + type: "object", 1099 + properties: { url: { type: "string", required: true } }, 1100 + required: ["url"], 1101 + }, 1102 + main: { 1103 + type: "object", 1104 + properties: { 1105 + embed: { type: "union", refs: ["#text", "#image"] }, 1106 + }, 1107 + }, 1108 + }, 1109 + }); 1110 + 1111 + attest(ns["~infer"]).type.toString.snap(`{ 1112 + $type: "test" 1113 + embed?: 1114 + | { content: string; $type: "#text" } 1115 + | { url: string; $type: "#image" } 1116 + | undefined 1117 + }`); 1118 + }); 1119 + 1120 + test("fromJSON Local ref resolution: nested refs", () => { 1121 + const ns = fromJSON({ 1122 + id: "test", 1123 + defs: { 1124 + profile: { 1125 + type: "object", 1126 + properties: { 1127 + bio: { type: "string", required: true }, 1128 + }, 1129 + required: ["bio"], 1130 + }, 1131 + user: { 1132 + type: "object", 1133 + properties: { 1134 + name: { type: "string", required: true }, 1135 + profile: { type: "ref", ref: "#profile", required: true }, 1136 + }, 1137 + required: ["name", "profile"], 1138 + }, 1139 + main: { 1140 + type: "object", 1141 + properties: { 1142 + author: { type: "ref", ref: "#user", required: true }, 1143 + }, 1144 + required: ["author"], 1145 + }, 1146 + }, 1147 + }); 1148 + 1149 + attest(ns["~infer"]).type.toString.snap(`{ 1150 + $type: "test" 1151 + author: { 1152 + name: string 1153 + profile: { bio: string; $type: "#profile" } 1154 + $type: "#user" 1155 + } 1156 + }`); 1157 + }); 1158 + 1159 + // ============================================================================ 1160 + // EDGE CASE TESTS 1161 + // ============================================================================ 1162 
+ 1163 + test("fromJSON Edge case: circular reference detection", () => { 1164 + const ns = fromJSON({ 1165 + id: "test", 1166 + defs: { 1167 + main: { 1168 + type: "object", 1169 + properties: { 1170 + value: { type: "string", required: true }, 1171 + parent: { type: "ref", ref: "#main" }, 1172 + }, 1173 + required: ["value"], 1174 + }, 1175 + }, 1176 + }); 1177 + 1178 + attest(ns["~infer"]).type.toString.snap(`{ 1179 + $type: "test" 1180 + parent?: 1181 + | { 1182 + parent?: 1183 + | "[Circular reference detected: #main]" 1184 + | undefined 1185 + value: string 1186 + $type: "#main" 1187 + } 1188 + | undefined 1189 + value: string 1190 + }`); 1191 + }); 1192 + 1193 + test("fromJSON Edge case: circular reference between multiple types", () => { 1194 + const ns = fromJSON({ 1195 + id: "test", 1196 + defs: { 1197 + user: { 1198 + type: "object", 1199 + properties: { 1200 + name: { type: "string", required: true }, 1201 + posts: { type: "array", items: { type: "ref", ref: "#post" } }, 1202 + }, 1203 + required: ["name"], 1204 + }, 1205 + post: { 1206 + type: "object", 1207 + properties: { 1208 + title: { type: "string", required: true }, 1209 + author: { type: "ref", ref: "#user", required: true }, 1210 + }, 1211 + required: ["title", "author"], 1212 + }, 1213 + main: { 1214 + type: "object", 1215 + properties: { 1216 + users: { type: "array", items: { type: "ref", ref: "#user" } }, 1217 + }, 1218 + }, 1219 + }, 1220 + }); 1221 + 1222 + attest(ns["~infer"]).type.toString.snap(`{ 1223 + $type: "test" 1224 + users?: 1225 + | { 1226 + posts?: 1227 + | { 1228 + author: "[Circular reference detected: #user]" 1229 + title: string 1230 + $type: "#post" 1231 + }[] 1232 + | undefined 1233 + name: string 1234 + $type: "#user" 1235 + }[] 1236 + | undefined 1237 + }`); 1238 + }); 1239 + 1240 + test("fromJSON Edge case: missing reference detection", () => { 1241 + const ns = fromJSON({ 1242 + id: "test", 1243 + defs: { 1244 + main: { 1245 + type: "object", 1246 + properties: { 
1247 + author: { type: "ref", ref: "#user", required: true }, 1248 + }, 1249 + required: ["author"], 1250 + }, 1251 + }, 1252 + }); 1253 + 1254 + attest(ns["~infer"]).type.toString.snap(`{ 1255 + $type: "test" 1256 + author: "[Reference not found: #user]" 1257 + }`); 1258 + });
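The snapshot tests above pin down three behaviors of local `#ref` handling: resolution against sibling defs, one-level expansion before circular-reference detection, and a sentinel string for missing refs. prototypey does all of this at the type level; the following is a hedged runtime analogue for illustration only — `resolve` and the `Def` shape are hypothetical names, not the library's API.

```typescript
// Runtime analogue (illustrative only) of the type-level local-ref resolution
// snapshotted above: lookup against sibling defs, circular-reference
// detection, and missing-ref reporting, using the same sentinel strings.
type Def = {
  type: string;
  ref?: string;
  properties?: Record<string, Def>;
  items?: Def;
};

function resolve(
  defs: Record<string, Def>,
  def: Def,
  seen: ReadonlySet<string>,
): unknown {
  if (def.type === "ref" && def.ref !== undefined && def.ref.startsWith("#")) {
    // Same sentinel strings the snapshot tests show for the error cases.
    if (seen.has(def.ref)) return `[Circular reference detected: ${def.ref}]`;
    const target = defs[def.ref.slice(1)]; // "#user" -> defs.user
    if (!target) return `[Reference not found: ${def.ref}]`;
    const resolved = resolve(defs, target, new Set(seen).add(def.ref));
    // Resolved refs get stamped with their $type, as in the snapshots.
    return typeof resolved === "object" && resolved !== null
      ? { ...resolved, $type: def.ref }
      : resolved;
  }
  if (def.type === "array" && def.items) {
    return [resolve(defs, def.items, seen)]; // element shape only
  }
  if (def.type === "object" && def.properties) {
    const out: Record<string, unknown> = {};
    for (const [key, prop] of Object.entries(def.properties)) {
      out[key] = resolve(defs, prop, seen);
    }
    return out;
  }
  return def.type; // leaf: report the primitive type name
}
```

Resolving `{ type: "ref", ref: "#main" }` against a self-referencing `main` def expands one level and then yields the circular sentinel, matching the `parent?: ... "[Circular reference detected: #main]"` snapshot above.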
+136
packages/prototypey/core/tests/from-json.test.ts
···
··· 1 + import { expect, test } from "vitest"; 2 + import { fromJSON, lx } from "../lib.ts"; 3 + 4 + test("fromJSON creates lexicon from JSON", () => { 5 + const lexicon = fromJSON({ 6 + id: "app.bsky.actor.profile", 7 + defs: { 8 + main: { 9 + type: "record", 10 + key: "self", 11 + record: { 12 + type: "object", 13 + properties: { 14 + displayName: { 15 + type: "string", 16 + maxLength: 64, 17 + maxGraphemes: 64, 18 + }, 19 + description: { 20 + type: "string", 21 + maxLength: 256, 22 + maxGraphemes: 256, 23 + }, 24 + }, 25 + }, 26 + }, 27 + }, 28 + }); 29 + 30 + expect(lexicon.json).toEqual({ 31 + lexicon: 1, 32 + id: "app.bsky.actor.profile", 33 + defs: { 34 + main: { 35 + type: "record", 36 + key: "self", 37 + record: { 38 + type: "object", 39 + properties: { 40 + displayName: { 41 + type: "string", 42 + maxLength: 64, 43 + maxGraphemes: 64, 44 + }, 45 + description: { 46 + type: "string", 47 + maxLength: 256, 48 + maxGraphemes: 256, 49 + }, 50 + }, 51 + }, 52 + }, 53 + }, 54 + }); 55 + }); 56 + 57 + test("fromJSON and lx.lexicon produce equivalent results", () => { 58 + // Create using lx.lexicon 59 + const viaLx = lx.lexicon("app.bsky.feed.post", { 60 + main: lx.record({ 61 + key: "tid", 62 + record: lx.object({ 63 + text: lx.string({ maxLength: 300, required: true }), 64 + createdAt: lx.string({ format: "datetime", required: true }), 65 + }), 66 + }), 67 + }); 68 + 69 + // Create using fromJSON 70 + const viaJSON = fromJSON({ 71 + id: "app.bsky.feed.post", 72 + defs: { 73 + main: { 74 + type: "record", 75 + key: "tid", 76 + record: { 77 + type: "object", 78 + properties: { 79 + text: { 80 + type: "string", 81 + maxLength: 300, 82 + required: true, 83 + }, 84 + createdAt: { 85 + type: "string", 86 + format: "datetime", 87 + required: true, 88 + }, 89 + }, 90 + required: ["text", "createdAt"], 91 + }, 92 + }, 93 + }, 94 + }); 95 + 96 + expect(viaLx.json).toEqual(viaJSON.json); 97 + }); 98 + 99 + test("fromJSON supports validation", () => { 100 + const lexicon 
= fromJSON({ 101 + id: "com.example.post", 102 + defs: { 103 + main: { 104 + type: "record", 105 + key: "tid", 106 + record: { 107 + type: "object", 108 + properties: { 109 + text: { 110 + type: "string", 111 + maxLength: 100, 112 + required: true, 113 + }, 114 + }, 115 + required: ["text"], 116 + }, 117 + }, 118 + }, 119 + }); 120 + 121 + // Valid data 122 + const validResult = lexicon.validate({ 123 + text: "Hello world", 124 + }); 125 + expect(validResult.success).toBe(true); 126 + 127 + // Invalid data - missing required field 128 + const invalidResult = lexicon.validate({}); 129 + expect(invalidResult.success).toBe(false); 130 + 131 + // Invalid data - text too long 132 + const tooLongResult = lexicon.validate({ 133 + text: "a".repeat(101), 134 + }); 135 + expect(tooLongResult.success).toBe(false); 136 + });
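The validation test above asserts on a `{ success }` result for three cases: valid data, a missing required field, and a string over `maxLength`. The real `fromJSON(...).validate(...)` delegates to `@atproto/lexicon`; the following standalone sketch reproduces only those two checks for illustration (length is simplified to JS string length here; the atproto spec counts UTF-8 bytes), and `validateObject`, `StringProp`, and `ObjectSchema` are hypothetical names.

```typescript
// Minimal sketch of the validation semantics exercised above: `required`
// membership plus string `maxLength`, returning the same { success } shape.
interface StringProp {
  type: "string";
  maxLength?: number;
}

interface ObjectSchema {
  required?: string[];
  properties: Record<string, StringProp>;
}

function validateObject(
  schema: ObjectSchema,
  data: Record<string, unknown>,
): { success: boolean; error?: string } {
  for (const key of schema.required ?? []) {
    if (!(key in data)) {
      return { success: false, error: `missing required field: ${key}` };
    }
  }
  for (const [key, prop] of Object.entries(schema.properties)) {
    const value = data[key];
    if (value === undefined) continue; // optional field left out
    if (typeof value !== "string") {
      return { success: false, error: `${key} must be a string` };
    }
    if (prop.maxLength !== undefined && value.length > prop.maxLength) {
      return { success: false, error: `${key} exceeds maxLength ${prop.maxLength}` };
    }
  }
  return { success: true };
}
```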
+213 -4
packages/prototypey/core/tests/infer.bench.ts
··· 1 import { bench } from "@ark/attest"; 2 import { lx } from "../lib.ts"; 3 4 bench("infer with simple object", () => { 5 const schema = lx.lexicon("test.simple", { ··· 9 }), 10 }); 11 return schema["~infer"]; 12 - }).types([741, "instantiations"]); 13 14 bench("infer with complex nested structure", () => { 15 const schema = lx.lexicon("test.complex", { ··· 32 }), 33 }); 34 return schema["~infer"]; 35 - }).types([1040, "instantiations"]); 36 37 bench("infer with circular reference", () => { 38 const ns = lx.lexicon("test", { ··· 49 }), 50 }); 51 return ns["~infer"]; 52 - }).types([692, "instantiations"]); 53 54 bench("infer with app.bsky.feed.defs lexicon", () => { 55 const schema = lx.lexicon("app.bsky.feed.defs", { ··· 116 interactionShare: lx.token("User shared the feed item"), 117 }); 118 return schema["~infer"]; 119 - }).types([1285, "instantiations"]);
··· 1 import { bench } from "@ark/attest"; 2 import { lx } from "../lib.ts"; 3 + import { fromJSON } from "../lib.ts"; 4 5 bench("infer with simple object", () => { 6 const schema = lx.lexicon("test.simple", { ··· 10 }), 11 }); 12 return schema["~infer"]; 13 + }).types([748, "instantiations"]); 14 15 bench("infer with complex nested structure", () => { 16 const schema = lx.lexicon("test.complex", { ··· 33 }), 34 }); 35 return schema["~infer"]; 36 + }).types([1047, "instantiations"]); 37 38 bench("infer with circular reference", () => { 39 const ns = lx.lexicon("test", { ··· 50 }), 51 }); 52 return ns["~infer"]; 53 + }).types([699, "instantiations"]); 54 55 bench("infer with app.bsky.feed.defs lexicon", () => { 56 const schema = lx.lexicon("app.bsky.feed.defs", { ··· 117 interactionShare: lx.token("User shared the feed item"), 118 }); 119 return schema["~infer"]; 120 + }).types([1292, "instantiations"]); 121 + 122 + bench("fromJSON infer with simple object", () => { 123 + const schema = fromJSON({ 124 + id: "test.simple", 125 + defs: { 126 + main: { 127 + type: "object", 128 + properties: { 129 + id: { type: "string", required: true }, 130 + name: { type: "string", required: true }, 131 + }, 132 + required: ["id", "name"], 133 + }, 134 + }, 135 + }); 136 + return schema["~infer"]; 137 + }).types([477, "instantiations"]); 138 + 139 + bench("fromJSON infer with complex nested structure", () => { 140 + const schema = fromJSON({ 141 + id: "test.complex", 142 + defs: { 143 + user: { 144 + type: "object", 145 + properties: { 146 + handle: { type: "string", required: true }, 147 + displayName: { type: "string" }, 148 + }, 149 + required: ["handle"], 150 + }, 151 + reply: { 152 + type: "object", 153 + properties: { 154 + text: { type: "string", required: true }, 155 + author: { type: "ref", ref: "#user", required: true }, 156 + }, 157 + required: ["text", "author"], 158 + }, 159 + main: { 160 + type: "record", 161 + key: "tid", 162 + record: { 163 + type: "object", 164 + 
properties: { 165 + author: { type: "ref", ref: "#user", required: true }, 166 + replies: { type: "array", items: { type: "ref", ref: "#reply" } }, 167 + content: { type: "string", required: true }, 168 + createdAt: { 169 + type: "string", 170 + required: true, 171 + format: "datetime", 172 + }, 173 + }, 174 + required: ["author", "content", "createdAt"], 175 + }, 176 + }, 177 + }, 178 + }); 179 + return schema["~infer"]; 180 + }).types([538, "instantiations"]); 181 + 182 + bench("fromJSON infer with circular reference", () => { 183 + const ns = fromJSON({ 184 + id: "test", 185 + defs: { 186 + user: { 187 + type: "object", 188 + properties: { 189 + name: { type: "string", required: true }, 190 + posts: { type: "array", items: { type: "ref", ref: "#post" } }, 191 + }, 192 + required: ["name"], 193 + }, 194 + post: { 195 + type: "object", 196 + properties: { 197 + title: { type: "string", required: true }, 198 + author: { type: "ref", ref: "#user", required: true }, 199 + }, 200 + required: ["title", "author"], 201 + }, 202 + main: { 203 + type: "object", 204 + properties: { 205 + users: { type: "array", items: { type: "ref", ref: "#user" } }, 206 + }, 207 + }, 208 + }, 209 + }); 210 + return ns["~infer"]; 211 + }).types([450, "instantiations"]); 212 + 213 + bench("fromJSON infer with app.bsky.feed.defs lexicon", () => { 214 + const schema = fromJSON({ 215 + id: "app.bsky.feed.defs", 216 + defs: { 217 + viewerState: { 218 + type: "object", 219 + properties: { 220 + repost: { type: "string", format: "at-uri" }, 221 + like: { type: "string", format: "at-uri" }, 222 + bookmarked: { type: "boolean" }, 223 + threadMuted: { type: "boolean" }, 224 + replyDisabled: { type: "boolean" }, 225 + embeddingDisabled: { type: "boolean" }, 226 + pinned: { type: "boolean" }, 227 + }, 228 + }, 229 + main: { 230 + type: "object", 231 + properties: { 232 + uri: { type: "string", required: true, format: "at-uri" }, 233 + cid: { type: "string", required: true, format: "cid" }, 234 + 
author: { 235 + type: "ref", 236 + ref: "app.bsky.actor.defs#profileViewBasic", 237 + required: true, 238 + }, 239 + record: { type: "unknown", required: true }, 240 + embed: { 241 + type: "union", 242 + refs: [ 243 + "app.bsky.embed.images#view", 244 + "app.bsky.embed.video#view", 245 + "app.bsky.embed.external#view", 246 + "app.bsky.embed.record#view", 247 + "app.bsky.embed.recordWithMedia#view", 248 + ], 249 + }, 250 + bookmarkCount: { type: "integer" }, 251 + replyCount: { type: "integer" }, 252 + repostCount: { type: "integer" }, 253 + likeCount: { type: "integer" }, 254 + quoteCount: { type: "integer" }, 255 + indexedAt: { type: "string", required: true, format: "datetime" }, 256 + viewer: { type: "ref", ref: "#viewerState" }, 257 + labels: { 258 + type: "array", 259 + items: { type: "ref", ref: "com.atproto.label.defs#label" }, 260 + }, 261 + threadgate: { type: "ref", ref: "#threadgateView" }, 262 + }, 263 + required: ["uri", "cid", "author", "record", "indexedAt"], 264 + }, 265 + requestLess: { 266 + type: "token", 267 + description: 268 + "Request that less content like the given feed item be shown in the feed", 269 + }, 270 + requestMore: { 271 + type: "token", 272 + description: 273 + "Request that more content like the given feed item be shown in the feed", 274 + }, 275 + clickthroughItem: { 276 + type: "token", 277 + description: "User clicked through to the feed item", 278 + }, 279 + clickthroughAuthor: { 280 + type: "token", 281 + description: "User clicked through to the author of the feed item", 282 + }, 283 + clickthroughReposter: { 284 + type: "token", 285 + description: "User clicked through to the reposter of the feed item", 286 + }, 287 + clickthroughEmbed: { 288 + type: "token", 289 + description: 290 + "User clicked through to the embedded content of the feed item", 291 + }, 292 + contentModeUnspecified: { 293 + type: "token", 294 + description: "Declares the feed generator returns any types of posts.", 295 + }, 296 + contentModeVideo: { 
297 + type: "token", 298 + description: 299 + "Declares the feed generator returns posts containing app.bsky.embed.video embeds.", 300 + }, 301 + interactionSeen: { 302 + type: "token", 303 + description: "Feed item was seen by user", 304 + }, 305 + interactionLike: { 306 + type: "token", 307 + description: "User liked the feed item", 308 + }, 309 + interactionRepost: { 310 + type: "token", 311 + description: "User reposted the feed item", 312 + }, 313 + interactionReply: { 314 + type: "token", 315 + description: "User replied to the feed item", 316 + }, 317 + interactionQuote: { 318 + type: "token", 319 + description: "User quoted the feed item", 320 + }, 321 + interactionShare: { 322 + type: "token", 323 + description: "User shared the feed item", 324 + }, 325 + }, 326 + }); 327 + return schema["~infer"]; 328 + }).types([552, "instantiations"]);
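Across the tests and benchmarks in this diff, the inferred snapshots consistently show the same mapping: keys listed in `required` stay required, everything else becomes `?: ... | undefined`, and keys listed in `nullable` gain `| null`. A hedged type-level sketch of that mapping (hypothetical, not prototypey's implementation):

```typescript
// Type-level sketch of the required/nullable mapping the snapshots pin down:
// keys in `Req` stay required, all others become optional, and keys in `Nul`
// get `| null` added to their value type.
type InferProps<
  P extends Record<string, unknown>,
  Req extends string,
  Nul extends string,
> = {
  [K in keyof P & string as K extends Req ? K : never]: K extends Nul
    ? P[K] | null
    : P[K];
} & {
  [K in keyof P & string as K extends Req ? never : K]?: K extends Nul
    ? P[K] | null
    : P[K];
};

// Mirrors the `test.mixedNullable` snapshot shape: `id` required and
// non-nullable, `note` optional and nullable ->
// { id: string } & { note?: string | null }.
const example: InferProps<{ id: string; note: string }, "id", "note"> = {
  id: "123",
  note: null,
};
```

If this compiles, the mapping agrees with the snapshots; assigning `note: 1` or omitting `id` would be a type error.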