An ATProto Lexicon validator for Gleam.


.gitignore (+1)

```diff
 build/
 erl_crash.dump
 .claude
+node_modules/
```
CHANGELOG.md (+25)

```diff
+# Changelog
+
+## 1.2.0
+
+### Added
+
+- Validate full ATProto blob structure with stricter field checking
+
+## 1.1.0
+
+### Added
+
+- Add `ValidationContext` type for external use
+- Add `build_validation_context` function to build a reusable validation context from lexicons
+- Add `validate_record_with_context` function for faster batch validation using a pre-built context
+
+## 1.0.1
+
+### Fixed
+
+- Fix `is_null_dynamic` to use `dynamic.classify` for consistent null detection
+
+## 1.0.0
+
+- Initial release
```
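The 1.1.0 entries above describe a pre-built-context workflow for batch validation. A minimal sketch of how those functions could be combined, assuming they live in a top-level `honk` module and keep the signatures the changelog implies (`build_validation_context` taking parsed lexicons, `validate_record_with_context` taking the context, an NSID, and record data):

```gleam
import gleam/list
import honk

// Hypothetical batch helper: build the validation context once from the
// lexicons (added in 1.1.0), then reuse it for every record instead of
// re-resolving the lexicons per call.
pub fn validate_batch(lexicons, records) {
  let assert Ok(ctx) = honk.build_validation_context(lexicons)
  list.map(records, fn(record) {
    let #(nsid, data) = record
    honk.validate_record_with_context(ctx, nsid, data)
  })
}
```

The module name and exact argument order are assumptions for illustration; only the function names come from the changelog.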
LICENSE (+201)

Adds the standard Apache License, Version 2.0 text verbatim (terms and conditions plus the appendix boilerplate notice). The full text is omitted here; see http://www.apache.org/licenses/LICENSE-2.0.
README.md (+9 -32)

````diff
 An [AT Protocol](https://atproto.com/) Lexicon validator for Gleam.
 
+> [!WARNING]
+> While I've tried to be as thorough as possible checking the validators against various atproto
+> validation libraries, this may contain bugs. Please report any issues you find.
+
 ## Installation
 
 ```sh
···
 ## Features
 
-- ✅ **Type Validators**: string, integer, boolean, bytes, blob, cid-link, null, object, array, union, ref, record, query, procedure, subscription, token, unknown
-- ✅ **String Format Validators**: datetime (RFC3339), uri, at-uri, did, handle, at-identifier, nsid, cid, language, tid, record-key
-- ✅ **Constraint Validation**: length limits, ranges, enums, required fields
-- ✅ **Reference Resolution**: local (`#def`), global (`nsid#def`), and cross-lexicon references
-- ✅ **Circular Dependency Detection**: prevents infinite reference loops
-- ✅ **Detailed Error Messages**: validation errors with path information
+- **Type Validators**: string, integer, boolean, bytes, blob, cid-link, null, object, array, union, ref, record, query, procedure, subscription, token, unknown
+- **String Format Validators**: datetime (RFC3339), uri, at-uri, did, handle, at-identifier, nsid, cid, language, tid, record-key
+- **Constraint Validation**: length limits, ranges, enums, required fields
+- **Reference Resolution**: local (`#def`), global (`nsid#def`), and cross-lexicon references
+- **Detailed Error Messages**: validation errors with path information
 
 ## CLI Usage
···
 When validating a directory, all lexicons are loaded together to resolve cross-lexicon references
 
-## API Overview
-
-### Main Functions
-
-- `validate(lexicons: List(Json))` - Validates one or more lexicon schemas
-- `validate_record(lexicons, nsid, data)` - Validates record data against a schema
-- `is_valid_nsid(value)` - Checks if a string is a valid NSID
-- `validate_string_format(value, format)` - Validates string against a format
-
-### Context Builder Pattern
-
-```gleam
-import validation/context
-import validation/field
-
-let assert Ok(ctx) =
-  context.builder()
-  |> context.with_validator(field.dispatch_data_validation)
-  |> context.with_lexicons([lexicon])
-  |> context.build
-```
-
 ## Testing
 
 ```sh
 gleam test
 ```
-
-## Implementation
-
-This implementation aligns with the [indigo/atproto/lexicon](https://github.com/bluesky-social/indigo/tree/main/atproto/lexicon) implementation as much as possible, ensuring compatibility with the ATProto specification and ecosystem.
 
 ## Documentation
````
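The README diff removes the API Overview section; for reference, a minimal sketch of the two main entry points that section documented (`validate` and `validate_record`). The `honk` module name and the use of `parse_json_string` (a name taken from the bundle's exports) as the JSON helper are assumptions:

```gleam
import honk

// Hypothetical end-to-end check: parse a lexicon schema and a record,
// validate the schema itself, then validate the record against it.
pub fn check(lexicon_json: String, nsid: String, record_json: String) {
  let assert Ok(lexicon) = honk.parse_json_string(lexicon_json)
  let assert Ok(record) = honk.parse_json_string(record_json)
  // `validate` takes a list so cross-lexicon refs can be resolved together.
  let assert Ok(_) = honk.validate([lexicon])
  honk.validate_record([lexicon], nsid, record)
}
```

Only the function names and the `List`-of-lexicons shape come from the removed "Main Functions" list; error handling here is simplified for illustration.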
build-js.sh (+16)

```diff
+#!/bin/bash
+set -e
+
+gleam build --target javascript
+
+rm -rf dist
+mkdir -p dist
+
+npx esbuild src/honk_bundle.mjs \
+  --bundle \
+  --minify \
+  --format=iife \
+  --global-name=honk \
+  --outfile=dist/honk.min.js
+
+echo "Built to dist/honk.min.js"
```
dist/honk.min.js (+4)

Adds the minified JavaScript bundle produced by `build-js.sh`: an IIFE exposing a global `honk` object whose exports include `validate`, `validate_record`, `validate_record_with_context`, `build_validation_context`, `is_valid_nsid`, `validate_string_format`, `parse_json_string`, `parse_json_strings`, and `dynamic_to_json`. The generated, minified source (truncated in the original capture) is omitted here.
{ ... }`}if(this.#e.size===this.#e.add(t).size)return"//js(circular reference)";let r;if(Array.isArray(t))r=`#(${t.map(i=>this.inspect(i)).join(", ")})`;else if(t instanceof Q)r=this.#i(t);else if(t instanceof v)r=this.#r(t);else if(t instanceof ye)r=this.#n(t);else{if(t instanceof Set)return`//js(Set(${[...t].map(i=>this.inspect(i)).join(", ")}))`;r=this.#t(t)}return this.#e.delete(t),r}#t(t){let n=Object.getPrototypeOf(t)?.constructor?.name||"Object",r=[];for(let l of Object.keys(t))r.push(`${this.inspect(l)}: ${this.inspect(t[l])}`);let i=r.length?" "+r.join(", ")+" ":"";return`//js(${n==="Object"?"":n+" "}{${i}})`}#n(t){let n="dict.from_list([",r=!0;return t.forEach((i,s)=>{r||(n=n+", "),n=n+"#("+this.inspect(s)+", "+this.inspect(i)+")",r=!1}),n+"])"}#r(t){let n=Object.keys(t).map(r=>{let i=this.inspect(t[r]);return isNaN(parseInt(r))?`${r}: ${i}`:i}).join(", ");return n?`${t.constructor.name}(${n})`:t.constructor.name}#i(t){if(t instanceof y)return"[]";let n='charlist.from_string("',r="[",i=t;for(;i instanceof Te;){let s=i.head;i=i.tail,r!=="["&&(r+=", "),r+=this.inspect(s),n&&(Number.isInteger(s)&&s>=32&&s<=126?n+=String.fromCharCode(s):n=null)}return n?n+'")':r+"]"}#s(t){let n='"';for(let r=0;r<t.length;r++){let i=t[r];switch(i){case` 3 + `:n+="\\n";break;case"\r":n+="\\r";break;case" ":n+="\\t";break;case"\f":n+="\\f";break;case"\\":n+="\\\\";break;case'"':n+='\\"';break;default:i<" "||i>"~"&&i<"\xA0"?n+="\\u{"+i.charCodeAt(0).toString(16).toUpperCase().padStart(4,"0")+"}":n+=i}}return n+='"',n}#o(t){return`//utfcodepoint(${String.fromCodePoint(t.value)})`}#l(t){if(t.bitSize===0)return"<<>>";let n="<<";for(let r=0;r<t.byteSize-1;r++)n+=t.byteAt(r).toString(),n+=", ";if(t.byteSize*8===t.bitSize)n+=t.byteAt(t.byteSize-1).toString();else{let r=t.bitSize%8;n+=t.byteAt(t.byteSize-1)>>8-r,n+=`:size(${r})`}return n+=">>",n}};function Bi(e,t){if(e instanceof ye||e instanceof WeakMap||e instanceof Map){let r={},i=e.get(t,r);return i===r?new o(new A):new o(new 
d(i))}let n=Number.isInteger(t);if(n&&t>=0&&t<8&&e instanceof Q){let r=0;for(let i of e){if(r===t)return new o(new d(i));r++}return new u("Indexable")}return n&&Array.isArray(e)||e&&typeof e=="object"||e&&Object.getPrototypeOf(e)===Object.prototype?t in e?new o(new d(e[t])):new o(new A):new u(n?"Indexable":"Dict")}function Mi(e,t,n,r,i){if(!(e instanceof Q||Array.isArray(e))){let l=new De("List",ge(e),i);return[i,Q.fromArray([l])]}let s=[];for(let l of e){let a=t(l),[f,c]=a;if(c instanceof Te){let[$,_]=n(a,r.toString());return[i,_]}s.push(f),r++}return[Q.fromArray(s),i]}function Di(e){if(e instanceof ye)return new o(e);if(e instanceof Map||e instanceof WeakMap)return new o(ye.fromMap(e));if(e==null)return new u("Dict");if(typeof e!="object")return new u("Dict");let t=Object.getPrototypeOf(e);return t===Object.prototype||t===null?new o(ye.fromObject(e)):new u("Dict")}function Li(e){return typeof e=="number"?new o(e):new u(0)}function Ti(e){return Number.isInteger(e)?new o(e):new u(0)}function Ii(e){return typeof e=="string"?new o(e):new u("")}function Ce(e){return e===""}function Wt(e,t,n){if(n<=0)return"";if(t<0){let s=G(e)+t;return s<0?"":An(e,s,n)}else return An(e,t,n)}function Ji(e,t){return t<=0?e:Wt(e,0,G(e)-t)}function Or(e,t){return e+t}function Xo(e,t,n){for(;;){let r=e,i=t,s=n,l;r%2===0?l=s:l=s+i;let f=l,c=globalThis.Math.trunc(r/2);if(c<=0)return f;e=c,t=i+i,n=f}}function Gi(e,t){return t<=0?"":Xo(t,e,"")}function Qo(e,t,n){for(;;){let r=e,i=t,s=n;if(r instanceof y)return s;{let l=r.head;e=r.tail,t=i,n=s+i+l}}}function Ve(e,t){if(e instanceof y)return"";{let n=e.head,r=e.tail;return Qo(r,t,n)}}function fe(e,t){if(t==="")return Gt(e);{let r=e,i=zr(r,t);return Ne(i,Y)}}function Ht(e){let n=Ri(e);return n}function Re(e,t){if(t<=0)return e;{let r=An(e,0,t),i=Zt(r);return Ui(e,i,Zt(e)-i)}}function Zi(e){let t,n=Cn(On(e))%4;return n===0?t=e:t=Or(e,Gi("=",4-n)),Vi(t)}function Wi(e){return e instanceof o}function Xe(e,t){if(e instanceof o)return e;{let 
n=e[0];return new u(t(n))}}function p(e,t){if(e instanceof o){let n=e[0];return t(n)}else return e}function Br(e){return JSON.stringify(e)}function Hi(e){return Object.fromEntries(e)}function Yi(e){let t=[];for(;di(e);)t.push(pi(e)),e=$i(e);return t}function Ki(){return null}function Xi(e){try{let t=JSON.parse(e);return mi(t)}catch(t){return hi(tl(t,e))}}function tl(e,t){return nl(e)?Qi():rl(e,t)}function nl(e){return/((unexpected (end|eof))|(end of data)|(unterminated string)|(json( parse error|\.parse)\: expected '(\:|\}|\])'))/i.test(e.message)}function rl(e,t){let n=[il,sl,ll,ol];for(let r of n){let i=r(e,t);if(i)return i}return ct("")}function il(e){let n=/unexpected token '(.)', ".+" is not valid JSON/i.exec(e.message);if(!n)return null;let r=Bn(n[1]);return ct(r)}function sl(e){let n=/unexpected token (.) in JSON at position (\d+)/i.exec(e.message);if(!n)return null;let r=Bn(n[1]);return ct(r)}function ol(e,t){let r=/(unexpected character|expected .*) at line (\d+) column (\d+)/i.exec(e.message);if(!r)return null;let i=Number(r[2]),s=Number(r[3]),l=al(i,s,t),a=Bn(t[l]);return ct(a)}function ll(e){let n=/unexpected (identifier|token) "(.)"/i.exec(e.message);if(!n)return null;let r=Bn(n[2]);return ct(r)}function Bn(e){return"0x"+e.charCodeAt(0).toString(16).toUpperCase()}function al(e,t,n){if(e===1)return t-1;let r=1,i=0;return n.split("").find((s,l)=>(s===` 4 + `&&(r+=1),r===e?(i=l+t,!0):!1)),i}var Mr=class extends v{},Qi=()=>new Mr;var Dr=class extends v{constructor(t){super(),this[0]=t}},ct=e=>new Dr(e);var Lr=class extends v{constructor(t){super(),this[0]=t}};function ul(e,t){return p(Xi(e),n=>{let r=j(n,t);return Xe(r,i=>new Lr(i))})}function Mn(e,t){return ul(e,t)}function je(e){return Br(e)}function es(e){return e}function ts(e){return e}function ns(e){return e}function rs(e){return e}function is(){return Ki()}function Ir(e){return Hi(e)}function fl(e){return Yi(e)}function Nr(e,t){let r=Ne(e,t);return fl(r)}var pt=class extends 
v{constructor(t){super(),this.collection=t}};var $t=class extends v{constructor(t){super(),this.message=t}};var qr=class extends v{constructor(t){super(),this.message=t}};function Ur(e){return e instanceof pt?"Lexicon not found for collection: "+e.collection:e instanceof $t?"Invalid lexicon schema: "+e.message:"Data validation failed: "+e.message}function m(e){return new $t(e)}function h(e){return new qr(e)}function Fr(e){return new pt(e)}function ce(e){let t=je(e),n=Mn(t,se);return Xe(n,r=>"Failed to parse JSON")}function _t(e){return je(e)==="null"}function Pe(e){let t=ce(e);if(t instanceof o){let n=t[0];return j(n,L)instanceof o}else return!1}function mt(e){let t=ce(e);if(t instanceof o){let n=t[0];return j(n,Ue)instanceof o}else return!1}function ht(e){let t=ce(e);if(t instanceof o){let n=t[0];return j(n,zn)instanceof o}else return!1}function Yt(e){let t=ce(e);if(t instanceof o){let n=t[0];return j(n,ut(se))instanceof o}else return!1}function q(e){let t=ce(e);if(t instanceof o){let n=t[0];return j(n,at(L,se))instanceof o}else return!1}function k(e,t){let n=ce(e);if(n instanceof o){let r=n[0],i=j(r,ft(w([t]),L));if(i instanceof o){let s=i[0];return new d(s)}else return new A}else return new A}function M(e,t){let n=ce(e);if(n instanceof o){let r=n[0],i=j(r,ft(w([t]),Ue));if(i instanceof o){let s=i[0];return new d(s)}else return new A}else return new A}function Je(e,t){let n=ce(e);if(n instanceof o){let r=n[0],i=j(r,ft(w([t]),zn));if(i instanceof o){let s=i[0];return new d(s)}else return new A}else return new A}function T(e,t){let n=ce(e);if(n instanceof o){let r=n[0],i=j(r,ft(w([t]),ut(se)));if(i instanceof o){let s=i[0];return new d(s)}else return new A}else return new A}function z(e){let t=ce(e);if(t instanceof o){let n=t[0],r=j(n,at(L,se));if(r instanceof o){let i=r[0];return vn(i)}else return w([])}else return w([])}function Vr(e){let t=ce(e);if(t instanceof o){let n=t[0],r=j(n,ut(se));if(r instanceof o){let i=r[0];return new d(i)}else return new A}else 
return new A}function ss(e){return ge(e)==="Nil"}function ke(e){let t=ce(e);if(t instanceof o){let n=t[0],r=j(n,at(L,se));return r instanceof o?r:new u(h("Failed to convert JSON to dictionary"))}else return new u(h("Failed to parse JSON as dynamic"))}function te(e){let t=ge(e);if(t==="Nil")return new o(is());if(t==="String"){let n=j(e,L);if(n instanceof o){let r=n[0];return new o(es(r))}else return new u(h("Failed to decode string"))}else if(t==="Int"){let n=j(e,Ue);if(n instanceof o){let r=n[0];return new o(ns(r))}else return new u(h("Failed to decode int"))}else if(t==="Float"){let n=j(e,Ci);if(n instanceof o){let r=n[0];return new o(rs(r))}else return new u(h("Failed to decode float"))}else if(t==="Bool"){let n=j(e,zn);if(n instanceof o){let r=n[0];return new o(ts(r))}else return new u(h("Failed to decode bool"))}else if(t==="List"){let n=j(e,ut(se));if(n instanceof o){let r=n[0],i=qe(r,te);if(i instanceof o){let s=i[0];return new o(Nr(s,l=>l))}else return i}else return new u(h("Failed to decode list"))}else if(t==="Array"){let n=j(e,ut(se));if(n instanceof o){let r=n[0],i=qe(r,te);if(i instanceof o){let s=i[0];return new o(Nr(s,l=>l))}else return i}else return new u(h("Failed to decode list"))}else if(t==="Dict"){let n=j(e,at(L,se));if(n instanceof o){let r=n[0],i=it(r),s=qe(i,l=>{let a,f;a=l[0],f=l[1];let c=te(f);if(c instanceof o){let $=c[0];return new o([a,$])}else return c});if(s instanceof o){let l=s[0];return new o(Ir(l))}else return s}else return new u(h("Failed to decode dict"))}else if(t==="Object"){let n=j(e,at(L,se));if(n instanceof o){let r=n[0],i=it(r),s=qe(i,l=>{let a,f;a=l[0],f=l[1];let c=te(f);if(c instanceof o){let $=c[0];return new o([a,$])}else return c});if(s instanceof o){let l=s[0];return new o(Ir(l))}else return s}else return new u(h("Failed to decode dict"))}else{let n=t;return new u(h("Unsupported type for JSON conversion: "+n))}}function C(e,t){let n=ce(e);if(n instanceof o){let r=n[0],i=j(r,ft(w([t]),se));if(i instanceof o){let 
s=i[0],l=te(s);if(l instanceof o){let a=l[0];return new d(a)}else return new A}else return new A}else return new A}function Dn(){return ee()}function Qe(e,t){return ue(e,t)instanceof o}function Ln(e,t,n){return me(e,t,n)}var Tn=class extends v{constructor(t,n){super(),this.id=t,this.defs=n}};var wt=class extends v{};var xt=class extends v{};var yt=class extends v{};var gt=class extends v{};var bt=class extends v{};var vt=class extends v{};var jt=class extends v{};var kt=class extends v{};var St=class extends v{};var Et=class extends v{};var Rr=class extends v{};function Pr(e){return e==="datetime"?new o(new wt):e==="uri"?new o(new xt):e==="at-uri"?new o(new yt):e==="did"?new o(new gt):e==="handle"?new o(new bt):e==="at-identifier"?new o(new vt):e==="nsid"?new o(new jt):e==="cid"?new o(new kt):e==="language"?new o(new St):e==="tid"?new o(new Et):e==="record-key"?new o(new Rr):new u(void 0)}function In(e){return e instanceof wt?"datetime":e instanceof xt?"uri":e instanceof yt?"at-uri":e instanceof gt?"did":e instanceof bt?"handle":e instanceof vt?"at-identifier":e instanceof jt?"nsid":e instanceof kt?"cid":e instanceof St?"language":e instanceof Et?"tid":"record-key"}var qn=class extends v{constructor(t){super(),this.dict=t}};function Jr(){return new qn(ee())}function Gr(e,t){let n=e.dict,r=ue(n,t);return Wi(r)}var cl=void 0;function Zr(e,t){return new qn(_e(e.dict,t,cl))}function ls(e,t){return e.lastIndex=0,e.test(t)}function as(e,t){try{let n="gu";return t.case_insensitive&&(n+="i"),t.multi_line&&(n+="m"),new o(new RegExp(e,n))}catch(n){let r=(n.columnNumber||0)|0;return new u(new Un(n.message,r))}}var Un=class extends v{constructor(t,n){super(),this.error=t,this.byte_index=n}};var Wr=class extends v{constructor(t,n){super(),this.case_insensitive=t,this.multi_line=n}};function dl(e,t){return as(e,t)}function Be(e){return dl(e,new Wr(!1,!1))}function Me(e,t){return ls(e,t)}var Xt=class extends v{constructor(t,n){super(),this.seconds=t,this.nanoseconds=n}};function 
wl(e){let t=1e9,n=Ut(e.nanoseconds,t),r=e.nanoseconds-n,i=e.seconds+Ft(r,t);return n>=0?new Xt(i,n):new Xt(i-1,t+n)}function xl(e){return e%4===0&&(e%100!==0||e%400===0)}function yl(e){if(e.bitSize>=8)if(e.byteAt(0)===43)if((e.bitSize-8)%8===0){let t=le(e,8);return new o(["+",t])}else return new u(void 0);else if(e.byteAt(0)===45&&(e.bitSize-8)%8===0){let t=le(e,8);return new o(["-",t])}else return new u(void 0);else return new u(void 0)}function Kt(e,t){if(e.bitSize>=8&&(e.bitSize-8)%8===0)if(e.byteAt(0)===t){let r=le(e,8);return new o(r)}else return new u(void 0);else return new u(void 0)}function gl(e){return e.bitSize===0?new o(void 0):new u(void 0)}function bl(e,t,n){let r=globalThis.Math.trunc((14-t)/12),i=e+4800-r,s=t+12*r-3;return n+globalThis.Math.trunc((153*s+2)/5)+365*i+globalThis.Math.trunc(i/4)-globalThis.Math.trunc(i/100)+globalThis.Math.trunc(i/400)-32045}var vl=86400,cs=3600,ds=60;function jl(e,t,n){let r=t*cs+n*ds;return e==="-"?-r:r}function kl(e,t,n,r,i,s){return bl(e,t,n)*vl+r*cs+i*ds+s}var Sl=1e9,Yr=58,fs=45;function El(e,t,n){for(;;){let r=e,i=t,s=n,l=globalThis.Math.trunc(s/10);if(r.bitSize>=8&&(r.bitSize-8)%8===0){let a=r.byteAt(0);if(48<=a&&a<=57&&l<1)e=le(r,8),t=i,n=l;else{let f=r.byteAt(0);if(48<=f&&f<=57){let c=le(r,8),$=f-48;e=c,t=i+$*l,n=l}else return new o([i,r])}}else return new o([i,r])}}function zl(e){if(e.bitSize>=8&&e.byteAt(0)===46)if(e.bitSize>=16&&(e.bitSize-16)%8===0){let t=e.byteAt(1);if(48<=t&&t<=57){let n=le(e,16);return El(hn([t,n]),0,Sl)}else return(e.bitSize-8)%8===0?new u(void 0):new o([0,e])}else return(e.bitSize-8)%8===0?new u(void 0):new o([0,e]);else return new o([0,e])}function Al(e,t,n,r){for(;;){let i=e,s=t,l=n,a=r;if(a>=s)return new o([l,i]);if(i.bitSize>=8&&(i.bitSize-8)%8===0){let f=i.byteAt(0);if(48<=f&&f<=57)e=le(i,8),t=s,n=l*10+(f-48),r=a+1;else return new u(void 0)}else return new u(void 0)}}function zt(e,t){return Al(e,t,0,0)}function Ol(e){return zt(e,4)}function Cl(e){return p(zt(e,2),t=>{let 
n,r;return n=t[0],r=t[1],1<=n&&n<=12?new o([n,r]):new u(void 0)})}function Bl(e,t,n){return p(zt(e,2),r=>{let i,s;return i=r[0],s=r[1],p(n===1?new o(31):n===3?new o(31):n===5?new o(31):n===7?new o(31):n===8?new o(31):n===10?new o(31):n===12?new o(31):n===4?new o(30):n===6?new o(30):n===9?new o(30):n===11?new o(30):n===2?xl(t)?new o(29):new o(28):new u(void 0),l=>1<=i&&i<=l?new o([i,s]):new u(void 0))})}function ps(e){return p(zt(e,2),t=>{let n,r;return n=t[0],r=t[1],0<=n&&n<=23?new o([n,r]):new u(void 0)})}function $s(e){return p(zt(e,2),t=>{let n,r;return n=t[0],r=t[1],0<=n&&n<=59?new o([n,r]):new u(void 0)})}function Ml(e){return p(zt(e,2),t=>{let n,r;return n=t[0],r=t[1],0<=n&&n<=60?new o([n,r]):new u(void 0)})}function Hr(e){return p(yl(e),t=>{let n,r;return n=t[0],r=t[1],p(ps(r),i=>{let s,l;return s=i[0],l=i[1],p(Kt(l,Yr),a=>p($s(a),f=>{let c,$;c=f[0],$=f[1];let _=jl(n,s,c);return new o([_,$])}))})})}function Dl(e){if(e.bitSize>=8)if(e.byteAt(0)===90)if((e.bitSize-8)%8===0){let t=le(e,8);return new o([0,t])}else return Hr(e);else if(e.byteAt(0)===122&&(e.bitSize-8)%8===0){let t=le(e,8);return new o([0,t])}else return Hr(e);else return Hr(e)}function Ll(e){if(e.bitSize>=8&&(e.bitSize-8)%8===0){let t=e.byteAt(0);if(t===84||t===116||t===32){let n=le(e,8);return new o(n)}else return new u(void 0)}else return new u(void 0)}var Tl=210866803200;function Il(e,t,n,r,i,s,l,a){let c=kl(e,t,n,r,i,s)-Tl,$=new Xt(c-a,l);return wl($)}function _s(e){let t=On(e);return p(Ol(t),n=>{let r,i;return r=n[0],i=n[1],p(Kt(i,fs),s=>p(Cl(s),l=>{let a,f;return a=l[0],f=l[1],p(Kt(f,fs),c=>p(Bl(c,r,a),$=>{let _,x;return _=$[0],x=$[1],p(Ll(x),g=>p(ps(g),S=>{let E,I;return E=S[0],I=S[1],p(Kt(I,Yr),xe=>p($s(xe),H=>{let J,de;return J=H[0],de=H[1],p(Kt(de,Yr),We=>p(Ml(We),Le=>{let ir,_n;return ir=Le[0],_n=Le[1],p(zl(_n),Nt=>{let qt,He;return qt=Nt[0],He=Nt[1],p(Dl(He),si=>{let oi,li;return oi=si[0],li=si[1],p(gl(li),iu=>new o(Il(r,a,_,E,J,ir,qt,oi)))})})}))}))}))}))}))})}function ql(e){let 
t=G(e);if(t===0||t>64)return!1;{let i=Be("^[0-9]{4}-[01][0-9]-[0-3][0-9]T[0-2][0-9]:[0-6][0-9]:[0-6][0-9](\\.[0-9]{1,20})?(Z|([+-][0-2][0-9]:[0-5][0-9]))$");if(i instanceof o){let s=i[0],l=Me(s,e);return l&&(Oe(e,"-00:00")?!1:_s(e)instanceof o)}else return!1}}function Ul(e){let t=G(e);if(t===0||t>8192)return!1;{let i=Be("^[a-z][a-z.-]{0,80}:[!-~]+$");if(i instanceof o){let s=i[0];return Me(s,e)}else return!1}}function Qt(e){let t=G(e);if(t===0||t>2048)return!1;{let r=Z(e,"did:");if(r){let s=Be("^did:[a-z]+:[a-zA-Z0-9._:%-]*[a-zA-Z0-9._-]$");if(s instanceof o){let l=s[0];return Me(l,e)}else return!1}else return r}}function en(e){let t="^([a-zA-Z0-9]([a-zA-Z0-9-]{0,61}[a-zA-Z0-9])?\\.)+[a-zA-Z]([a-zA-Z0-9-]{0,61}[a-zA-Z0-9])?$",n=G(e)===0||G(e)>253,r=Be(t);if(n)return!1;if(r instanceof o){let i=r[0],s=Me(i,e);if(s){let l=fe(e,"."),a=Ei(l);if(a instanceof o){let f=a[0];return f==="local"||f==="arpa"||f==="invalid"||f==="localhost"||f==="internal"||f==="example"||f==="onion"?!1:f!=="alt"}else return!1}else return s}else return n}function Fl(e){return Qt(e)||en(e)}function et(e){let n=Be("^[a-zA-Z]([a-zA-Z0-9-]{0,61}[a-zA-Z0-9])?(\\.[a-zA-Z]([a-zA-Z0-9-]{0,61}[a-zA-Z0-9])?)+$");if(n instanceof o){let r=n[0],i=Me(r,e);if(i){let s=fe(e,".");return he(s)>=3&&G(e)<=317}else return i}else return!1}function Fn(e){let t=G(e);if(t<8||t>256)return!1;if(Z(e,"Qmb"))return!1;{let s=Be("^[a-zA-Z0-9+=]{8,256}$");if(s instanceof o){let l=s[0];return Me(l,e)}else return!1}}function ms(e){let t=Fn(e);return t&&Z(e,"bafkrei")}function Vl(e){let t=G(e);if(t===0||t>128)return!1;{let i=Be("^(i|[a-z]{2,3})(-[a-zA-Z0-9]+)*$");if(i instanceof o){let s=i[0];return Me(s,e)}else return!1}}function hs(e){let t="^[234567abcdefghij][234567abcdefghijklmnopqrstuvwxyz]{12}$",n=G(e)===13,r=Be(t);if(n&&r instanceof o){let i=r[0];return Me(i,e)}else return!1}function ws(e){let t=G(e);if(e==="."||e==="..")return!1;{let r=t>=1&&t<=512;if(r){let i=hs(e);if(i)return i;{let l=Be("^[a-zA-Z0-9_~.:-]+$");if(l 
instanceof o){let a=l[0];return Me(a,e)}else return!1}}else return r}}function Rl(e){let t=G(e);if(t===0||t>8192)return!1;{let r=Z(e,"at://");if(r){let i=Re(e,5),s=fe(i,"/");if(s instanceof y)return!1;{let l=s.tail;if(l instanceof y){let a=s.head;return Qt(a)||en(a)}else{let a=l.tail;if(a instanceof y){let f=s.head,c=l.head,$=Qt(f)||en(f);return $&&et(c)}else if(a.tail instanceof y){let c=s.head,$=l.head,_=a.head,x=Qt(c)||en(c);if(x){let g=et($);return g&&ws(_)}else return x}else return!1}}}else return r}}function Vn(e,t){return t instanceof wt?ql(e):t instanceof xt?Ul(e):t instanceof yt?Rl(e):t instanceof gt?Qt(e):t instanceof bt?en(e):t instanceof vt?Fl(e):t instanceof jt?et(e):t instanceof kt?Fn(e):t instanceof St?Vl(e):t instanceof Et?hs(e):ws(e)}var At=class extends v{constructor(t,n,r,i,s){super(),this.lexicons=t,this.path=n,this.current_lexicon_id=r,this.reference_stack=i,this.validator=s}};var Rn=class extends v{constructor(t,n){super(),this.lexicons=t,this.validator=n}};function Kr(){return new Rn(ee(),new A)}function Xr(e){let t,n=e.validator;n instanceof d?t=n[0]:t=(i,s,l)=>new o(void 0);let r=t;return new o(new At(e.lexicons,"",new A,Jr(),r))}function Pn(e,t){let n=ue(e.lexicons,t);if(n instanceof o){let r=n[0];return new d(r)}else return new A}function b(e){return e.path}function N(e,t){let n;e.path===""?n=t:n=e.path+"."+t;let i=n;return new At(e.lexicons,i,e.current_lexicon_id,e.reference_stack,e.validator)}function Ot(e){return e.current_lexicon_id}function nn(e,t){return new At(e.lexicons,e.path,new d(t),e.reference_stack,e.validator)}function xs(e,t){return new At(e.lexicons,e.path,e.current_lexicon_id,Zr(e.reference_stack,t),e.validator)}function Qr(e,t){return Gr(e.reference_stack,t)}function ys(e,t){let n=fe(t,"#");if(n instanceof y)return new u(m("Invalid reference format: "+t));{let r=n.tail;if(r instanceof y){let i=n.head;return i!==""?new o([i,"main"]):new u(m("Invalid reference format: "+t))}else if(r.tail instanceof y){let 
s=n.head;if(s===""){let l=r.head,a=e.current_lexicon_id;if(a instanceof d){let f=a[0];return new o([f,l])}else return new u(m("Local reference '"+t+"' used without current lexicon context"))}else{let l=s,a=r.head;return l!==""&&a!==""?new o([l,a]):new u(m("Invalid reference format: "+t))}}else return new u(m("Invalid reference format: "+t))}}function Pl(e){let t,n=k(e,"id");if(n instanceof d){let i=n[0];t=new o(i)}else t=new u(m("Lexicon missing required 'id' field"));return p(t,i=>p(et(i)?new o(void 0):new u(m("Lexicon 'id' field is not a valid NSID: "+i)),s=>{let l,a=C(e,"defs");if(a instanceof d){let c=a[0];q(c)?l=new o(c):l=new u(m("Lexicon 'defs' must be an object at "+i))}else l=new u(m("Lexicon missing required 'defs' field at "+i));return p(l,c=>new o(new Tn(i,c)))}))}function ei(e,t){return K(t,e,(n,r)=>{let i=Pl(r);if(i instanceof o){let s=i[0],l=_e(n.lexicons,s.id,s);return new o(new Rn(l,n.validator))}else return i})}function ti(e,t,n,r,i){let s;if(n instanceof d){let a=n[0];t<a?s=new u(h(e+": "+i+" length "+B(t)+" is less than minLength "+B(a))):s=new o(void 0)}else s=new o(void 0);return p(s,a=>{if(r instanceof d){let f=r[0];return t>f?new u(h(e+": "+i+" length "+B(t)+" exceeds maxLength "+B(f))):new o(void 0)}else return new o(void 0)})}function tt(e,t,n,r){if(t instanceof d&&n instanceof d){let i=t[0],s=n[0];return i>s?new u(m(e+": "+r+" minLength ("+B(i)+") cannot be greater than maxLength ("+B(s)+")")):new o(void 0)}else return new o(void 0)}function gs(e,t,n,r){let i;if(n instanceof d){let l=n[0];t<l?i=new u(h(e+": value "+B(t)+" is less than minimum "+B(l))):i=new o(void 0)}else i=new o(void 0);return p(i,l=>{if(r instanceof d){let a=r[0];return t>a?new u(h(e+": value "+B(t)+" exceeds maximum "+B(a))):new o(void 0)}else return new o(void 0)})}function bs(e,t,n){if(t instanceof d&&n instanceof d){let r=t[0],i=n[0];return r>i?new u(m(e+": minimum ("+B(r)+") cannot be greater than maximum ("+B(i)+")")):new o(void 0)}else return new o(void 
0)}function Jn(e,t,n,r,i,s){return En(n,a=>s(t,a))?new o(void 0):new u(h(e+": "+r+" value '"+i(t)+"' is not in enum"))}function Ct(e,t,n,r){return t&&n?new u(m(e+": "+r+" cannot have both 'const' and 'default'")):new o(void 0)}function O(e,t,n,r){let i=kn(t,s=>!st(n,s));if(i instanceof y)return new o(void 0);{let s=i;return new u(m(e+": "+r+" has unknown fields: "+Ve(s,", ")+". Allowed fields: "+Ve(n,", ")))}}function Bt(e,t,n){let r=nn(t,n),i=ys(r,e);if(i instanceof o){let s=i[0][0],l=i[0][1],a=Pn(r,s);if(a instanceof d){let f=a[0],c=C(f.defs,l);if(c instanceof d){let $=c[0];return new o(new d($))}else return new u(m("Definition '"+l+"' not found in lexicon '"+s+"'"))}else return new u(m("Referenced lexicon not found: "+s))}else return i}function Gn(e,t,n){let r=b(n);return p((()=>{let i=k(t,"ref");if(i instanceof d){let s=i[0];return new o(s)}else return new u(h(r+": ref schema missing 'ref' field"))})(),i=>{if(Qr(n,i))return new u(h(r+": circular reference detected: "+i));{let l=xs(n,i);return p((()=>{let a=Ot(l);if(a instanceof d){let f=a[0];return new o(f)}else return new u(h(r+": no current lexicon set for resolving reference"))})(),a=>p(Bt(i,l,a),f=>p((()=>{if(f instanceof d){let c=f[0];return new o(c)}else return new u(h(r+": reference not found: "+i))})(),c=>{let $=l.validator;return $(e,c,l)})))}})}function Jl(e,t){let n=fe(e,"#");if(n instanceof y)return new u(m(t+": global reference can only contain one # character"));{let r=n.tail;if(r instanceof y)return new u(m(t+": global reference can only contain one # character"));if(r.tail instanceof y){let s=n.head,l=r.head;return Ce(s)?new u(m(t+": NSID part of reference cannot be empty")):Ce(l)?new u(m(t+": definition name part of reference cannot be empty")):new o(void 0)}else return new u(m(t+": global reference can only contain one # character"))}}function Gl(e,t){if(Ce(e))return new u(m(t+": reference cannot be empty"));if(Z(e,"#")){let i=Re(e,1);return Ce(i)?new u(m(t+": local reference must have a 
definition name after #")):new o(void 0)}else return Fe(e,"#")?Jl(e,t):new o(void 0)}var Zl=w(["type","ref","description"]);function rn(e,t){let n=b(t),r=z(e);return p(O(n,r,Zl,"ref"),i=>{let s,l=k(e,"ref");if(l instanceof d){let f=l[0];s=new o(f)}else s=new u(m(n+": ref missing required 'ref' field"));return p(s,f=>p(Gl(f,n),c=>{if(Z(f,"#"))return new o(void 0);{let _=Ot(t);if(_ instanceof d){let x=_[0];return p(Bt(f,t,x),g=>g instanceof d?new o(void 0):new u(m(n+": reference not found: "+f)))}else return new o(void 0)}}))})}function Wl(e,t,n){let r=ie(e,i=>j(i,L));return K(r,void 0,(i,s)=>{let l=Ot(t);if(l instanceof d){let a=l[0];return p(Bt(s,t,a),f=>f instanceof d?new o(void 0):new u(m(n+": reference not found: "+s)))}else return new o(void 0)})}function Hl(e,t,n,r){let i=Ot(n);if(i instanceof d){let s=i[0];return p(Bt(t,n,s),l=>{if(l instanceof d){let a=l[0],f=n.validator;return f(e,a,n)}else return new u(h(r+": reference not found: "+t))})}else return new o(void 0)}function Yl(e,t){let n=e===t;if(n)return n;if(Z(e,"#")){let i=Re(e,1),s=t===i;return s||Oe(t,"#"+i)}else if(Oe(t,"#main")){let s=Ji(t,5);return e===s}else{if(Fe(t,"#"))return!1;{let l=t+"#main";return e===l}}}function Kl(e){return _t(e)?"null":ht(e)?"boolean":mt(e)?"number":Pe(e)?"string":Yt(e)?"array":q(e)?"object":"unknown"}function Zn(e,t,n){let r=b(n);if(q(e)){let s,l=k(e,"$type");if(l instanceof d){let f=l[0];s=new o(f)}else s=new u(h(r+': union data must be an object which includes the "$type" property'));return p(s,f=>{let c,$=T(t,"refs");if($ instanceof d){let x=$[0];c=new o(x)}else c=new u(h(r+": union schema missing or invalid 'refs' field"));return p(c,x=>{if(Ke(x))return new u(h(r+": union schema has empty refs array"));{let S=ie(x,I=>j(I,L)),E=Si(S,I=>Yl(I,f));if(E instanceof o){let I=E[0];return Hl(e,I,n,r)}else{let I,xe=Je(t,"closed");return xe instanceof d?I=xe[0]:I=!1,I?new u(h(r+": union data $type must be one of "+Ve(S,", ")+", found '"+f+"'")):new o(void 0)}}})})}else{let 
s=Kl(e);return new u(h(r+': union data must be an object which includes the "$type" property, found '+s))}}var Xl=w(["type","refs","closed","description"]);function Wn(e,t){let n=b(t),r=z(e);return p(O(n,r,Xl,"union"),i=>{let s,l=T(e,"refs");if(l instanceof d){let f=l[0];s=new o(f)}else s=new u(m(n+": union missing required 'refs' field"));return p(s,f=>p(ot(f,new o(void 0),(c,$,_)=>p(c,x=>j($,L)instanceof o?new o(void 0):new u(m(n+": refs["+Ht(_)+"] must be a string")))),c=>p((()=>{let $=Je(e,"closed");return $ instanceof d?$[0]&&Ke(f)?new u(m(n+": union cannot be closed with empty refs array")):new o(void 0):new o(void 0)})(),$=>p((()=>{if(Ke(f)){let x=Je(e,"closed");return x instanceof d?x[0]?new u(m(n+": union cannot have empty refs array when closed=true")):new o(void 0):new o(void 0)}else return new o(void 0)})(),_=>Wl(f,t,n)))))})}function js(e,t,n){let r=b(n);if(Pe(e)){let s=je(e),l;return Z(s,'"')&&Oe(s,'"')?l=Wt(s,1,G(s)-2):l=s,Ce(l)?new u(h(r+": token value cannot be empty string")):new o(void 0)}else return new u(h(r+": expected string for token data, got other type"))}var Ql=w(["type","description"]);function Yn(e,t){let n=b(t),r=z(e);return O(n,r,Ql,"token")}function sn(e,t,n){let r=b(n);if(q(e)){if(k(e,"$bytes")instanceof d)return new u(h(r+": unknown type cannot be a bytes object"));{let l=k(e,"$type");return l instanceof d?l[0]==="blob"?new u(h(r+": unknown type cannot be a blob object")):new o(void 0):new o(void 0)}}else return new u(h(r+": unknown type must be an object, not a primitive or array"))}var ea=w(["type","description"]);function Mt(e,t){let n=b(t),r=z(e);return O(n,r,ea,"unknown")}function ta(e,t){let n=C(e,"ref");if(n instanceof d){let r=n[0];if(q(r)){let s=k(r,"$link");if(s instanceof d){let l=s[0];return ms(l)?new o(void 0):new u(h(t+": blob ref.$link must be a valid CID with raw multicodec (bafkrei prefix)"))}else return new u(h(t+": blob ref must have $link field"))}else return new u(h(t+": blob ref must be an object"))}else 
return new u(h(t+": blob missing required 'ref' field"))}function Ss(e,t,n,r){return Fe(t,"*")?t==="*"?new o(void 0):new u(m(e+": blob MIME type '"+r+"' can only use '*' as a complete wildcard for "+n)):new o(void 0)}function na(e,t,n){if(Ce(t))return new u(m(e+": blob MIME type cannot be empty"));if(t==="*/*")return new o(void 0);if(Fe(t,"/")){let s=fe(t,"/");if(s instanceof y)return new u(m(e+": blob MIME type '"+t+"' must have exactly one '/' character"));{let l=s.tail;if(l instanceof y)return new u(m(e+": blob MIME type '"+t+"' must have exactly one '/' character"));if(l.tail instanceof y){let f=s.head,c=l.head;return p(Ss(e,f,"type",t),$=>Ss(e,c,"subtype",t))}else return new u(m(e+": blob MIME type '"+t+"' must have exactly one '/' character"))}}else return new u(m(e+": blob MIME type '"+t+"' must contain a '/' character"))}function ra(e,t){return ot(t,new o(void 0),(n,r,i)=>p(n,s=>{let l=j(r,L);if(l instanceof o){let a=l[0];return na(e,a,i)}else return new u(m(e+": blob accept["+B(i)+"] must be a string"))}))}function ia(e,t){if(t==="*/*")return!0;{let n=fe(e,"/"),r=fe(t,"/");if(n instanceof y)return!1;if(r instanceof y)return!1;{let i=n.tail;if(i instanceof y)return!1;{let s=r.tail;if(s instanceof y)return!1;if(i.tail instanceof y)if(s.tail instanceof y){let f=n.head,c=r.head,$=i.head,_=s.head,x;c==="*"?x=!0:x=f===c;let g=x,S;return _==="*"?S=!0:S=$===_,g&&S}else return!1;else return!1}}}}function sa(e,t,n){let r=ie(n,s=>j(s,L));return En(r,s=>ia(t,s))?new o(void 0):new u(h(e+": blob mimeType '"+t+"' not accepted. 
Allowed: "+Ve(r,", ")))}var oa=w(["type","accept","maxSize","description"]);function Kn(e,t){let n=b(t),r=z(e);return p(O(n,r,oa,"blob"),i=>p((()=>{let s=T(e,"accept");if(s instanceof d){let l=s[0];return ra(n,l)}else return new o(void 0)})(),s=>{let l=M(e,"maxSize");return l instanceof d?l[0]>0?new o(void 0):new u(m(n+": blob maxSize must be greater than 0")):new o(void 0)}))}var la=w(["$type","ref","mimeType","size"]);function aa(e,t){let n=kn(t,r=>!st(la,r));if(n instanceof y)return new o(void 0);{let r=n.head;return new u(h(e+": blob has unexpected field '"+r+"'"))}}function Es(e,t,n){let r=b(n);if(q(e)){let s=z(e);return p(aa(r,s),l=>p((()=>{let a=k(e,"$type");if(a instanceof d){let f=a[0];if(f==="blob")return new o(void 0);{let c=f;return new u(h(r+": blob $type must be 'blob', got '"+c+"'"))}}else return new u(h(r+": blob missing required '$type' field"))})(),a=>p(ta(e,r),f=>p((()=>{let c=k(e,"mimeType");if(c instanceof d){let $=c[0];return Ce($)?new u(h(r+": blob mimeType cannot be empty")):new o($)}else return new u(h(r+": blob missing required 'mimeType' field"))})(),c=>p((()=>{let $=M(e,"size");if($ instanceof d){let _=$[0];return _>=0?new o(_):new u(h(r+": blob size must be non-negative"))}else return new u(h(r+": blob missing or invalid 'size' field"))})(),$=>p((()=>{let _=T(t,"accept");if(_ instanceof d){let x=_[0];return sa(r,c,x)}else return new o(void 0)})(),_=>{let x=M(t,"maxSize");if(x instanceof d){let g=x[0];return $<=g?new o(void 0):new u(h(r+": blob size "+B($)+" exceeds maxSize "+B(g)))}else return new o(void 0)}))))))}else return new u(h(r+": expected blob object"))}function ln(e,t,n){let r=b(n);if(ht(e)){let s=je(e),l=s==="true";if(l||s==="false"){let c=l,$=Je(t,"const");if($ instanceof d){let _=$[0];return _!==c?new u(h(r+": must be constant value "+(_?"true":"false"))):new o(void 0)}else return new o(void 0)}else return new u(h(r+": invalid boolean representation"))}else return new u(h(r+": expected boolean, got other type"))}var 
ua=w(["type","const","default","description"]);function Dt(e,t){let n=b(t),r=z(e);return p(O(n,r,ua,"boolean"),i=>{let s=!D(Je(e,"const"),new A),l=!D(Je(e,"default"),new A);return Ct(n,s,l,"boolean")})}function As(e,t,n){let r=b(n);if(q(e)){let s=z(e);return p(he(s)===1?new o(void 0):new u(h(r+": $bytes objects must have a single field")),l=>{let a=k(e,"$bytes");if(a instanceof d){let f=a[0],c=Zi(f);if(c instanceof o){let $=c[0],_=Cn($),x=M(t,"minLength"),g=M(t,"maxLength");return p((()=>{if(x instanceof d){let S=x[0];return _<S?new u(h(r+": bytes size out of bounds: "+Ht(_))):new o(void 0)}else return new o(void 0)})(),S=>p((()=>{if(g instanceof d){let E=g[0];return _>E?new u(h(r+": bytes size out of bounds: "+Ht(_))):new o(void 0)}else return new o(void 0)})(),E=>new o(void 0)))}else return new u(h(r+": decoding $bytes value: invalid base64 encoding"))}else return new u(h(r+": $bytes field missing or not a string"))})}else return new u(h(r+": expecting bytes"))}var fa=w(["type","minLength","maxLength","description"]);function Xn(e,t){let n=b(t),r=z(e);return p(O(n,r,fa,"bytes"),i=>{let s=M(e,"minLength"),l=M(e,"maxLength");return p(s instanceof d?s[0]<0?new u(m(n+": bytes schema minLength below zero")):new o(void 0):new o(void 0),a=>p(l instanceof d?l[0]<0?new u(m(n+": bytes schema maxLength below zero")):new o(void 0):new o(void 0),f=>tt(n,s,l,"bytes")))})}function Cs(e,t,n){let r=b(n);if(q(e)){let s=k(e,"$link");if(s instanceof d){let l=s[0];return Fn(l)?new o(void 0):new u(h(r+": invalid CID format in $link"))}else return new u(h(r+": CID link must have $link field"))}else return new u(h(r+": expected CID link object"))}var ca=w(["type","description"]);function Qn(e,t){let n=b(t),r=z(e);return O(n,r,ca,"cid-link")}function da(e,t,n){return Jn(n,e,t,"integer",B,(r,i)=>r===i)}function un(e,t,n){let r=b(n);if(mt(e)){let s=je(e),l=Sr(s);if(l instanceof o){let a=l[0],f=M(t,"const");if(f instanceof d){let c=f[0];return c!==a?new u(h(r+": must be constant value 
"+B(c)+", found "+B(a))):new o(void 0)}else return p((()=>{let c=T(t,"enum");if(c instanceof d){let $=c[0],_=ie($,x=>j(x,Ue));return da(a,_,r)}else return new o(void 0)})(),c=>{let $=M(t,"minimum"),_=M(t,"maximum");return gs(r,a,$,_)})}else return new u(h(r+": failed to parse integer value"))}else return new u(h(r+": expected integer, got other type"))}var pa=w(["type","minimum","maximum","enum","const","default","description"]);function Lt(e,t){let n=b(t),r=z(e);return p(O(n,r,pa,"integer"),i=>{let s=M(e,"minimum"),l=M(e,"maximum");return p(bs(n,s,l),a=>p((()=>{let f=T(e,"enum");if(f instanceof d){let c=f[0];return K(c,void 0,($,_)=>j(_,Ue)instanceof o?new o(void 0):new u(m(n+": enum values must be integers")))}else return new o(void 0)})(),f=>{let c=!D(M(e,"const"),new A),$=!D(M(e,"default"),new A);return Ct(n,c,$,"integer")}))})}function Ms(e,t,n){let r=b(n);return _t(e)?new o(void 0):new u(h(r+": expected null, got other type"))}var $a=w(["type","description"]);function er(e,t){let n=b(t),r=z(e);return O(n,r,$a,"null")}function _a(e,t,n,r){let i=Zt(e);return ti(r,i,t,n,"string")}function ma(e,t,n,r){let i,l=Gt(e);return i=he(l),ti(r,i,t,n,"string (graphemes)")}function ha(e,t,n){if(Vn(e,t))return new o(void 0);{let i=In(t);return new u(h(n+": string does not match format '"+i+"'"))}}function wa(e,t,n){return Jn(n,e,t,"string",r=>r,(r,i)=>r===i)}function cn(e,t,n){let r=b(n);if(Pe(e)){let s=je(e),l;Z(s,'"')&&Oe(s,'"')?l=Wt(s,1,G(s)-2):l=s;let f=l,c=M(t,"minLength"),$=M(t,"maxLength");return p(_a(f,c,$,r),_=>{let x=M(t,"minGraphemes"),g=M(t,"maxGraphemes");return p(ma(f,x,g,r),S=>{let E,I=k(t,"format");if(I instanceof d){let H=I[0],J=Pr(H);if(J instanceof o){let de=J[0];E=ha(f,de,r)}else E=new o(void 0)}else E=new o(void 0);return p(E,H=>{let J=T(t,"enum");if(J instanceof d){let de=J[0],We=ie(de,Le=>j(Le,L));return wa(f,We,r)}else return new o(void 0)})})})}else return new u(h(r+": expected string, got other type"))}var 
xa=w(["type","format","minLength","maxLength","minGraphemes","maxGraphemes","enum","knownValues","const","default","description"]);function Tt(e,t){let n=b(t),r=z(e);return p(O(n,r,xa,"string"),i=>{let s,l=k(e,"format");if(l instanceof d){let _=l[0];Pr(_)instanceof o?s=new o(void 0):s=new u(m(n+": unknown format '"+_+"'. Valid formats: datetime, uri, at-uri, did, handle, at-identifier, nsid, cid, language, tid, record-key"))}else s=new o(void 0);let f=p(s,_=>{let x=M(e,"minLength"),g=M(e,"maxLength"),S=M(e,"minGraphemes"),E=M(e,"maxGraphemes");return p(x instanceof d?x[0]<0?new u(m(n+": string schema minLength below zero")):new o(void 0):new o(void 0),I=>p(g instanceof d?g[0]<0?new u(m(n+": string schema maxLength below zero")):new o(void 0):new o(void 0),xe=>p(S instanceof d?S[0]<0?new u(m(n+": string schema minGraphemes below zero")):new o(void 0):new o(void 0),H=>p(E instanceof d?E[0]<0?new u(m(n+": string schema maxGraphemes below zero")):new o(void 0):new o(void 0),J=>p(tt(n,x,g,"string"),de=>tt(n,S,E,"string (graphemes)"))))))}),c=p(f,_=>{let x=T(e,"enum");if(x instanceof d){let g=x[0];return K(g,void 0,(S,E)=>j(E,L)instanceof o?new o(void 0):new u(m(n+": enum values must be strings")))}else return new o(void 0)}),$=p(c,_=>{let x=T(e,"knownValues");if(x instanceof d){let g=x[0];return K(g,void 0,(S,E)=>j(E,L)instanceof o?new o(void 0):new u(m(n+": knownValues must be strings")))}else return new o(void 0)});return p($,_=>{let x=!D(k(e,"const"),new A),g=!D(k(e,"default"),new A);return Ct(n,x,g,"string")})})}function ya(e,t,n){let r=ie(t,i=>j(i,L));return p(he(r)===he(t)?new o(void 0):new u(m(e+": required fields must be strings")),i=>{if(n instanceof d){let s=n[0],l=ke(s);if(l instanceof o){let a=l[0];return K(r,void 0,(f,c)=>Qe(a,c)?new o(void 0):new u(m(e+": required field '"+c+"' not found in properties")))}else return new o(void 0)}else return Ke(r)?new o(void 0):new u(m(e+": required fields specified but no properties defined"))})}function ga(e,t,n){let 
r=ie(t,i=>j(i,L));return p(he(r)===he(t)?new o(void 0):new u(m(e+": nullable fields must be strings")),i=>{if(n instanceof d){let s=n[0],l=ke(s);if(l instanceof o){let a=l[0];return K(r,void 0,(f,c)=>Qe(a,c)?new o(void 0):new u(m(e+": nullable field '"+c+"' not found in properties")))}else return new o(void 0)}else return Ke(r)?new o(void 0):new u(m(e+": nullable fields specified but no properties defined"))})}function ba(e,t,n){let r=ie(t,i=>j(i,L));return K(r,void 0,(i,s)=>{if(C(n,s)instanceof d)return new o(void 0);{let a;e===""?a="required field '"+s+"' is missing":a=e+": required field '"+s+"' is missing";let f=a;return new u(h(f))}})}function Ls(e){return _t(e)?"null":ht(e)?"boolean":mt(e)?"number":Pe(e)?"string":Yt(e)?"array":"object"}var va=w(["type","properties","required","nullable","description"]),ja=w(["type","items","minLength","maxLength","description"]);function ka(e,t,n){return ri(e,t,n)}function ri(e,t,n){let r=k(t,"type");if(r instanceof d){let i=r[0];if(i==="string")return cn(e,t,n);if(i==="integer")return un(e,t,n);if(i==="boolean")return ln(e,t,n);if(i==="bytes")return As(e,t,n);if(i==="blob")return Es(e,t,n);if(i==="cid-link")return Cs(e,t,n);if(i==="null")return Ms(e,t,n);if(i==="object")return tr(e,t,n);if(i==="array")return nr(e,t,n);if(i==="union")return Zn(e,t,n);if(i==="ref")return Gn(e,t,n);if(i==="token")return js(e,t,n);if(i==="unknown")return sn(e,t,n);{let s=i;return new u(h("Unknown schema type '"+s+"' at '"+b(n)+"'"))}}else return new u(h("Schema missing type field at '"+b(n)+"'"))}function tr(e,t,n){let r=b(n);if(q(e))return p((()=>{let s=T(t,"required");if(s instanceof d){let l=s[0];return ba(r,l,e)}else return new o(void 0)})(),s=>{let l,a=T(t,"nullable");if(a instanceof d){let $=a[0];l=ie($,_=>j(_,L))}else l=w([]);let f=l,c=C(t,"properties");if(c instanceof d){let $=c[0];return Sa(e,$,f,n)}else return new o(void 0)});{let s=Ls(e);return new u(h("Expected object at '"+r+"', found "+s))}}function Sa(e,t,n,r){let i=ke(e);if(i 
instanceof o){let s=i[0],l=ke(t);if(l instanceof o){let a=l[0];return me(s,new o(void 0),(f,c,$)=>p(f,_=>{let x=ue(a,c);if(x instanceof o){let g=x[0],S=te(g);if(S instanceof o){let E=S[0],I=N(r,c);if(ss($))return st(n,c)?new o(void 0):new u(h("Field '"+c+"' at '"+b(r)+"' cannot be null"));{let H=te($);if(H instanceof o){let J=H[0];return ka(J,E,I)}else return H}}else return S}else return new o(void 0)}))}else return l}else return i}function Ea(e,t,n){let r=te(e);return p(r,i=>{let s=k(t,"type");return s instanceof d&&s[0]==="ref"?Gn(i,t,n):ri(i,t,n)})}function nr(e,t,n){let r=b(n);if(Yt(e)){let s,l=Vr(e);if(l instanceof d){let f=l[0];s=new o(f)}else s=new u(h(r+": failed to parse array"));return p(s,f=>{let c=he(f);return p((()=>{let $=M(t,"minLength");if($ instanceof d){let _=$[0];return c<_?new u(h(r+": array has length "+B(c)+" but minimum length is "+B(_))):new o(void 0)}else return new o(void 0)})(),$=>p((()=>{let _=M(t,"maxLength");if(_ instanceof d){let x=_[0];return c>x?new u(h(r+": array has length "+B(c)+" but maximum length is "+B(x))):new o(void 0)}else return new o(void 0)})(),_=>{let x=C(t,"items");if(x instanceof d){let g=x[0];return ot(f,new o(void 0),(S,E,I)=>p(S,xe=>{let H=N(n,"["+B(I)+"]");return Ea(E,g,H)}))}else return new o(void 0)}))})}else{let s=Ls(e);return new u(h(r+": expected array, found "+s))}}function za(e,t){return ii(e,t)}function ii(e,t){let n=k(e,"type");if(n instanceof d){let r=n[0];if(r==="string")return Tt(e,t);if(r==="integer")return Lt(e,t);if(r==="boolean")return Dt(e,t);if(r==="bytes")return Xn(e,t);if(r==="blob")return Kn(e,t);if(r==="cid-link")return Qn(e,t);if(r==="null")return er(e,t);if(r==="object")return pn(e,t);if(r==="array")return $n(e,t);if(r==="union")return Wn(e,t);if(r==="ref")return rn(e,t);if(r==="token")return Yn(e,t);if(r==="unknown")return Mt(e,t);{let i=r;return new u(m(b(t)+": unknown type '"+i+"'"))}}else return new u(m(b(t)+": schema missing type field"))}function pn(e,t){let n=b(t),r=z(e);return 
p(O(n,r,va,"object"),i=>{let s;return T(e,"properties")instanceof d?s=new u(m(n+": properties must be an object, not an array")):q(e)?s=new o(new A):s=new o(new A),p(s,f=>{let c=C(e,"properties");return p((()=>{let $=T(e,"required");if($ instanceof d){let _=$[0];return ya(n,_,c)}else return new o(void 0)})(),$=>p((()=>{let _=T(e,"nullable");if(_ instanceof d){let x=_[0];return ga(n,x,c)}else return new o(void 0)})(),_=>{if(c instanceof d){let x=c[0];return q(x)?Aa(x,t):new o(void 0)}else return new o(void 0)}))})})}function Aa(e,t){let n=ke(e);if(n instanceof o){let r=n[0];return me(r,new o(void 0),(i,s,l)=>p(i,a=>{let f=te(l);if(f instanceof o){let c=f[0],$=N(t,"properties."+s);return za(c,$)}else return f}))}else return n}function Oa(e,t){let n=k(e,"type");return n instanceof d&&n[0]==="ref"?rn(e,t):ii(e,t)}function $n(e,t){let n=b(t),r=z(e);return p(O(n,r,ja,"array"),i=>{let s,l=C(e,"items");if(l instanceof d){let f=l[0];s=new o(f)}else s=new u(m(n+": array missing required 'items' field"));return p(s,f=>{let c=N(t,".items");return p(Oa(f,c),$=>{let _=M(e,"minLength"),x=M(e,"maxLength");return p(tt(n,_,x,"array"),g=>new o(void 0))})})})}function Ca(e,t,n){if(t instanceof d){let r=t[0];return K(r,void 0,(i,s)=>{let l=j(s,L);if(l instanceof o){let a=l[0];return Qe(n,a)?new o(void 0):new u(m(e+": required field '"+a+"' not found in properties"))}else return new u(m(e+": required field must be a string"))})}else return new o(void 0)}function Ge(e,t){let n=k(e,"type");if(n instanceof d){let r=n[0];if(r==="boolean")return Dt(e,t);if(r==="integer")return Lt(e,t);if(r==="string")return Tt(e,t);if(r==="unknown")return Mt(e,t);if(r==="array")return $n(e,t);{let i=r;return new u(m(b(t)+": unknown type '"+i+"'"))}}else return new u(m(b(t)+": schema missing type field"))}function Ba(e,t,n,r){let i=e+".properties."+t,s=k(n,"type");if(s instanceof d){let l=s[0];if(l==="boolean"){let a=N(r,"properties."+t);return Ge(n,a)}else if(l==="integer"){let a=N(r,"properties."+t);return 
Ge(n,a)}else if(l==="string"){let a=N(r,"properties."+t);return Ge(n,a)}else if(l==="unknown"){let a=N(r,"properties."+t);return Ge(n,a)}else if(l==="array"){let a=C(n,"items");if(a instanceof d){let f=a[0],c=k(f,"type");if(c instanceof d){let $=c[0];if($==="boolean"){let _=N(r,"properties."+t);return Ge(n,_)}else if($==="integer"){let _=N(r,"properties."+t);return Ge(n,_)}else if($==="string"){let _=N(r,"properties."+t);return Ge(n,_)}else if($==="unknown"){let _=N(r,"properties."+t);return Ge(n,_)}else{let _=$;return new u(m(i+": params array items must be boolean, integer, string, or unknown, got '"+_+"'"))}}else return new u(m(i+": array items missing type field"))}else return new u(m(i+": array property missing items field"))}else{let a=l;return new u(m(i+": params properties must be boolean, integer, string, unknown, or arrays of these, got '"+a+"'"))}}else return new u(m(i+": property missing type field"))}function Ma(e,t,n){return Ln(t,new o(void 0),(r,i,s)=>r instanceof o?p(i===""?new u(m(e+": empty property name not allowed")):new o(void 0),l=>p((()=>{let a=te(s);return a instanceof o?a:new u(m(e+": invalid property value for '"+i+"'"))})(),a=>Ba(e,i,a,n))):r)}var Da=w(["type","description","properties","required"]);function Ze(e,t){let n=b(t),r=z(e);return p(O(n,r,Da,"params"),i=>p((()=>{let s=k(e,"type");if(s instanceof d){let l=s[0];if(l==="params")return new o(void 0);{let a=l;return new u(m(n+": expected type 'params', got '"+a+"'"))}}else return new u(m(n+": params missing type field"))})(),s=>{let l,a=C(e,"properties");if(a instanceof d){let x=a[0];l=ke(x)}else l=new o(Dn());let f=l,c,$=T(e,"required");$ instanceof d,c=$;let _=c;return p(f,x=>p(Ca(n,_,x),g=>Ma(n,x,t)))}))}function La(e,t){let n=N(t,"parameters");return Ze(e,n)}function Is(e,t,n){return k(t,"encoding")instanceof d?new o(void 0):new u(m(e+": procedure "+n+" missing encoding field"))}var Ta=w(["type","parameters","input","output","errors","description"]);function Ns(e,t){let 
n=b(t),r=z(e);return p(O(n,r,Ta,"procedure"),i=>p((()=>{let s=C(e,"parameters");if(s instanceof d){let l=s[0];return La(l,t)}else return new o(void 0)})(),s=>p((()=>{let l=C(e,"input");if(l instanceof d){let a=l[0];return Is(n,a,"input")}else return new o(void 0)})(),l=>p((()=>{let a=C(e,"output");if(a instanceof d){let f=a[0];return Is(n,f,"output")}else return new o(void 0)})(),a=>T(e,"errors")instanceof d?new o(void 0):new o(void 0)))))}function Na(e,t){let n=N(t,"parameters");return Ze(e,n)}function qa(e,t){return k(t,"encoding")instanceof d?new o(void 0):new u(m(e+": query output missing encoding field"))}var Ua=w(["type","parameters","output","errors","description"]);function qs(e,t){let n=b(t),r=z(e);return p(O(n,r,Ua,"query"),i=>p((()=>{let s=C(e,"parameters");if(s instanceof d){let l=s[0];return Na(l,t)}else return new o(void 0)})(),s=>p((()=>{let l=C(e,"output");if(l instanceof d){let a=l[0];return qa(n,a)}else return new o(void 0)})(),l=>T(e,"errors")instanceof d?new o(void 0):new o(void 0))))}function Us(e,t,n){let r=b(n);if(q(e)){let s=C(t,"record");if(s instanceof d){let l=s[0];return tr(e,l,n)}else return new u(h(r+": record schema missing 'record' field"))}else return new u(h(r+": expected object for record"))}function Va(e,t){return t==="tid"?new o(void 0):t==="any"?new o(void 0):t==="nsid"?new o(void 0):Z(t,"literal:")?new o(void 0):new u(m(e+": record has invalid key type '"+t+"'. 
Must be 'tid', 'any', 'nsid', or 'literal:*'"))}var Ra=w(["type","key","record","description"]),Pa=w(["type","properties","required","nullable","description"]);function Ja(e,t){let n=k(t,"type");if(n instanceof d){let r=n[0];if(r==="object"){let i=z(t);return p(O(e,i,Pa,"record object"),s=>p((()=>{let l=C(t,"properties");if(l instanceof d){let a=l[0];return q(a)?new o(void 0):new u(m(e+": record properties must be an object"))}else return new o(void 0)})(),l=>T(t,"nullable")instanceof d?new o(void 0):C(t,"nullable")instanceof d?new u(m(e+": record nullable field must be an array")):new o(void 0)))}else{let i=r;return new u(m(e+": record field must be type 'object', got '"+i+"'"))}}else return new u(m(e+": record field missing type"))}function Fs(e,t){let n=b(t),r=z(e);return p(O(n,r,Ra,"record"),i=>{let s,l=k(e,"key");if(l instanceof d){let f=l[0];s=new o(f)}else s=new u(m(n+": record missing required 'key' field"));return p(s,f=>p(Va(n,f),c=>{let $,_=C(e,"record");if(_ instanceof d){let g=_[0];$=new o(g)}else $=new u(m(n+": record missing required 'record' field"));return p($,g=>p(Ja(n,g),S=>{let E=N(t,".record");return pn(g,E)}))}))})}function Za(e,t){let n=N(t,"parameters");return Ze(e,n)}function Wa(e,t){let n=C(t,"schema");if(n instanceof d){let r=n[0],i=k(r,"type");if(i instanceof d){let s=i[0];if(s==="union")return new o(void 0);{let l=s;return new u(m(e+": subscription message schema must be type 'union', got '"+l+"'"))}}else return new u(m(e+": subscription message schema missing type field"))}else return new u(m(e+": subscription message missing schema field"))}var Ha=w(["type","parameters","message","errors","description"]);function Vs(e,t){let n=b(t),r=z(e);return p(O(n,r,Ha,"subscription"),i=>p((()=>{let s=C(e,"parameters");if(s instanceof d){let l=s[0];return Za(l,t)}else return new o(void 0)})(),s=>p((()=>{let l=C(e,"message");if(l instanceof d){let a=l[0];return Wa(n,a)}else return new o(void 0)})(),l=>T(e,"errors")instanceof d?new o(void 0):new 
o(void 0))))}function Ka(e,t){let n=k(e,"type");if(n instanceof d){let r=n[0];if(r==="record")return Fs(e,t);if(r==="query")return qs(e,t);if(r==="procedure")return Ns(e,t);if(r==="subscription")return Vs(e,t);if(r==="params")return Ze(e,t);if(r==="object")return pn(e,t);if(r==="array")return $n(e,t);if(r==="union")return Wn(e,t);if(r==="string")return Tt(e,t);if(r==="integer")return Lt(e,t);if(r==="boolean")return Dt(e,t);if(r==="bytes")return Xn(e,t);if(r==="blob")return Kn(e,t);if(r==="cid-link")return Qn(e,t);if(r==="null")return er(e,t);if(r==="ref")return rn(e,t);if(r==="token")return Yn(e,t);if(r==="unknown")return Mt(e,t);{let i=r;return new u(m("Unknown type: "+i))}}else return new u(m("Definition missing type field"))}function Xa(e){let t,n=Kr();t=ei(n,e);let r=t;if(r instanceof o){let i=r[0],s=Xr(i);if(s instanceof o){let l=s[0],a=me(l.lexicons,ee(),(c,$,_)=>{let x=z(_.defs),g=nn(l,$);return Sn(x,c,(S,E)=>{let I=C(_.defs,E);if(I instanceof d){let xe=I[0],H=Ka(xe,g);if(H instanceof o)return S;{let J=H[0],de;J instanceof pt?de="Lexicon not found: "+J.collection:(J instanceof $t,de=J.message);let We=de,Le;Z(We,": ")?Le=Re(We,2):Le=We;let _n=Le,Nt=$+"#"+E+": "+_n,qt=ue(S,$);if(qt instanceof o){let He=qt[0];return _e(S,$,F(Nt,He))}else return _e(S,$,w([Nt]))}}else return S})});return xr(a)?new o(void 0):new u(a)}else{let l=s[0];return new u(bn(w([["builder",w([Ur(l)])]])))}}else{let i=r[0];return new u(bn(w([["builder",w([Ur(i)])]])))}}function Rs(e){let t,n=Kr();return t=ei(n,e),p(t,i=>Xr(i))}function Ps(e,t,n){let r=Pn(e,t);if(r instanceof d){let i=r[0],s=C(i.defs,"main");if(s instanceof d){let l=s[0],a=nn(e,t),f=N(a,"defs.main");return Us(n,l,f)}else return new u(m("Lexicon '"+t+"' has no main definition"))}else return new u(Fr(t))}function Qa(e,t,n){return p(Rs(e),r=>Ps(r,t,n))}function eu(e){return et(e)}function tu(e,t){if(Vn(e,t))return new o(void 0);{let r=In(t);return new u("Value does not match format: "+r)}}function Js(e){return te(e)}function 
Gs(e){return p((()=>{let t=Mn(e,se);return Xe(t,n=>m("Failed to parse JSON string"))})(),t=>Js(t))}function nu(e){let n=qe(e,Gs);return Xe(n,r=>m("Failed to parse JSON strings"))}return Xs(ru);})();
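The bundled validator above implements the stricter blob checking from 1.2.0: a blob value must be an object containing exactly the fields `$type`, `ref`, `mimeType`, and `size`, where `$type` is the literal `"blob"`, `ref.$link` is a CID with the raw multicodec (`bafkrei` prefix), `mimeType` is a non-empty string matched against the schema's `accept` patterns (with `*` wildcards), and `size` is non-negative and within the schema's `maxSize`. An illustrative blob that passes these checks (CID truncated, values hypothetical):

```json
{
  "$type": "blob",
  "ref": { "$link": "bafkrei..." },
  "mimeType": "image/png",
  "size": 24567
}
```

Any extra field, a non-`blob` `$type`, or an empty `mimeType` is now rejected with a targeted error message rather than passing through.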
+3 -1
gleam.toml
···
1 1 name = "honk"
2 - version = "1.0.0"
2 + version = "1.2.0"
3 3 description = "ATProtocol lexicon validator for Gleam"
4 4 internal_modules = ["honk/internal", "honk/internal/*"]
5 + licences = ["Apache-2.0"]
6 + repository = { type = "github", user = "bigmoves", repo = "honk" }
5 7
6 8 [dependencies]
7 9 gleam_stdlib = ">= 0.44.0 and < 2.0.0"
+2 -2
manifest.toml
···
6 6 { name = "filepath", version = "1.1.2", build_tools = ["gleam"], requirements = ["gleam_stdlib"], otp_app = "filepath", source = "hex", outer_checksum = "B06A9AF0BF10E51401D64B98E4B627F1D2E48C154967DA7AF4D0914780A6D40A" },
7 7 { name = "gleam_json", version = "3.1.0", build_tools = ["gleam"], requirements = ["gleam_stdlib"], otp_app = "gleam_json", source = "hex", outer_checksum = "44FDAA8847BE8FC48CA7A1C089706BD54BADCC4C45B237A992EDDF9F2CDB2836" },
8 8 { name = "gleam_regexp", version = "1.1.1", build_tools = ["gleam"], requirements = ["gleam_stdlib"], otp_app = "gleam_regexp", source = "hex", outer_checksum = "9C215C6CA84A5B35BB934A9B61A9A306EC743153BE2B0425A0D032E477B062A9" },
9 - { name = "gleam_stdlib", version = "0.65.0", build_tools = ["gleam"], requirements = [], otp_app = "gleam_stdlib", source = "hex", outer_checksum = "7C69C71D8C493AE11A5184828A77110EB05A7786EBF8B25B36A72F879C3EE107" },
10 - { name = "gleam_time", version = "1.5.0", build_tools = ["gleam"], requirements = ["gleam_stdlib"], otp_app = "gleam_time", source = "hex", outer_checksum = "D560E672C7279C89908981E068DF07FD16D0C859DCA266F908B18F04DF0EB8E6" },
9 + { name = "gleam_stdlib", version = "0.67.1", build_tools = ["gleam"], requirements = [], otp_app = "gleam_stdlib", source = "hex", outer_checksum = "6CE3E4189A8B8EC2F73AB61A2FBDE49F159D6C9C61C49E3B3082E439F260D3D0" },
10 + { name = "gleam_time", version = "1.6.0", build_tools = ["gleam"], requirements = ["gleam_stdlib"], otp_app = "gleam_time", source = "hex", outer_checksum = "0DF3834D20193F0A38D0EB21F0A78D48F2EC276C285969131B86DF8D4EF9E762" },
11 11 { name = "gleeunit", version = "1.9.0", build_tools = ["gleam"], requirements = ["gleam_stdlib"], otp_app = "gleeunit", source = "hex", outer_checksum = "DA9553CE58B67924B3C631F96FE3370C49EB6D6DC6B384EC4862CC4AAA718F3C" },
12 12 { name = "simplifile", version = "2.3.1", build_tools = ["gleam"], requirements = ["filepath", "gleam_stdlib"], otp_app = "simplifile", source = "hex", outer_checksum = "957E0E5B75927659F1D2A1B7B75D7B9BA96FAA8D0C53EA71C4AD9CD0C6B848F6" },
13 13 ]
+51 -53
src/honk/internal/json_helpers.gleam
··· 221 221 222 222 /// Check if dynamic value is null 223 223 pub fn is_null_dynamic(dyn: Dynamic) -> Bool { 224 - case decode.run(dyn, decode.string) { 225 - Ok("null") -> True 226 - _ -> False 227 - } 224 + dynamic.classify(dyn) == "Nil" 228 225 } 229 226 230 227 /// Convert JSON object to a dictionary ··· 243 240 } 244 241 245 242 /// Convert a dynamic value back to Json 246 - /// This works by trying different decoders 247 243 pub fn dynamic_to_json(dyn: Dynamic) -> Result(Json, ValidationError) { 248 - // Try null 249 - case decode.run(dyn, decode.string) { 250 - Ok(s) -> { 251 - case s { 252 - "null" -> Ok(json.null()) 253 - _ -> Ok(json.string(s)) 244 + case dynamic.classify(dyn) { 245 + "Nil" -> Ok(json.null()) 246 + "String" -> { 247 + case decode.run(dyn, decode.string) { 248 + Ok(s) -> Ok(json.string(s)) 249 + Error(_) -> Error(data_validation("Failed to decode string")) 254 250 } 255 251 } 256 - Error(_) -> { 257 - // Try number 252 + "Int" -> { 258 253 case decode.run(dyn, decode.int) { 259 254 Ok(i) -> Ok(json.int(i)) 260 - Error(_) -> { 261 - // Try boolean 262 - case decode.run(dyn, decode.bool) { 263 - Ok(b) -> Ok(json.bool(b)) 264 - Error(_) -> { 265 - // Try array 266 - case decode.run(dyn, decode.list(decode.dynamic)) { 267 - Ok(arr) -> { 268 - // Recursively convert array items 269 - case list.try_map(arr, dynamic_to_json) { 270 - Ok(json_arr) -> Ok(json.array(json_arr, fn(x) { x })) 271 - Error(e) -> Error(e) 272 - } 273 - } 274 - Error(_) -> { 275 - // Try object 276 - case 277 - decode.run(dyn, decode.dict(decode.string, decode.dynamic)) 278 - { 279 - Ok(dict_val) -> { 280 - // Convert dict to object 281 - let pairs = dict.to_list(dict_val) 282 - case 283 - list.try_map(pairs, fn(pair) { 284 - let #(key, value_dyn) = pair 285 - case dynamic_to_json(value_dyn) { 286 - Ok(value_json) -> Ok(#(key, value_json)) 287 - Error(e) -> Error(e) 288 - } 289 - }) 290 - { 291 - Ok(json_pairs) -> Ok(json.object(json_pairs)) 292 - Error(e) -> Error(e) 293 
- } 294 - } 295 - Error(_) -> 296 - Error(data_validation("Failed to convert dynamic to Json")) 297 - } 298 - } 255 + Error(_) -> Error(data_validation("Failed to decode int")) 256 + } 257 + } 258 + "Float" -> { 259 + case decode.run(dyn, decode.float) { 260 + Ok(f) -> Ok(json.float(f)) 261 + Error(_) -> Error(data_validation("Failed to decode float")) 262 + } 263 + } 264 + "Bool" -> { 265 + case decode.run(dyn, decode.bool) { 266 + Ok(b) -> Ok(json.bool(b)) 267 + Error(_) -> Error(data_validation("Failed to decode bool")) 268 + } 269 + } 270 + "List" | "Array" -> { 271 + case decode.run(dyn, decode.list(decode.dynamic)) { 272 + Ok(arr) -> { 273 + case list.try_map(arr, dynamic_to_json) { 274 + Ok(json_arr) -> Ok(json.array(json_arr, fn(x) { x })) 275 + Error(e) -> Error(e) 276 + } 277 + } 278 + Error(_) -> Error(data_validation("Failed to decode list")) 279 + } 280 + } 281 + "Dict" | "Object" -> { 282 + case decode.run(dyn, decode.dict(decode.string, decode.dynamic)) { 283 + Ok(dict_val) -> { 284 + let pairs = dict.to_list(dict_val) 285 + case 286 + list.try_map(pairs, fn(pair) { 287 + let #(key, value_dyn) = pair 288 + case dynamic_to_json(value_dyn) { 289 + Ok(value_json) -> Ok(#(key, value_json)) 290 + Error(e) -> Error(e) 299 291 } 300 - } 292 + }) 293 + { 294 + Ok(json_pairs) -> Ok(json.object(json_pairs)) 295 + Error(e) -> Error(e) 301 296 } 302 297 } 298 + Error(_) -> Error(data_validation("Failed to decode dict")) 303 299 } 304 300 } 301 + other -> 302 + Error(data_validation("Unsupported type for JSON conversion: " <> other)) 305 303 } 306 304 } 307 305
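The `json_helpers` rewrite above replaces trial-and-error decoding with a single `dynamic.classify` call, then runs only the decoder that can succeed for that tag. A minimal standalone sketch of the same classify-first dispatch (not honk's internal module; the tag strings mirror the ones handled in the patch, where `"Array"` and `"Object"` arise on the JavaScript target):

```gleam
import gleam/dynamic.{type Dynamic}

// Map a dynamic value to a JSON-ish kind name by classifying first,
// instead of attempting each decoder in turn and branching on failure.
pub fn json_kind(dyn: Dynamic) -> String {
  case dynamic.classify(dyn) {
    "Nil" -> "null"
    "Bool" -> "boolean"
    "Int" | "Float" -> "number"
    "String" -> "string"
    "List" | "Array" -> "array"
    "Dict" | "Object" -> "object"
    other -> "unsupported: " <> other
  }
}
```

Classifying first also fixes the old `is_null_dynamic` bug visible in the removed lines, where a null was detected by string-decoding to `"null"`, which misclassified the literal string `"null"`.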
+89 -7
src/honk/validation/field/union.gleam
··· 9 9 import honk/errors 10 10 import honk/internal/constraints 11 11 import honk/internal/json_helpers 12 + import honk/internal/resolution 12 13 import honk/validation/context.{type ValidationContext} 13 14 14 15 const allowed_fields = ["type", "refs", "closed", "description"] ··· 70 71 }) 71 72 72 73 // Empty refs array is only allowed for open unions 73 - case list.is_empty(refs_array) { 74 + use _ <- result.try(case list.is_empty(refs_array) { 74 75 True -> { 75 76 case json_helpers.get_bool(schema, "closed") { 76 77 Some(True) -> ··· 81 82 } 82 83 } 83 84 False -> Ok(Nil) 84 - } 85 - // Note: Full implementation would validate that each reference can be resolved 85 + }) 86 + 87 + // Validate that each reference can be resolved 88 + validate_refs_resolvable(refs_array, ctx, def_name) 89 + } 90 + 91 + /// Validates that all references in the refs array can be resolved 92 + fn validate_refs_resolvable( 93 + refs_array: List(decode.Dynamic), 94 + ctx: ValidationContext, 95 + def_name: String, 96 + ) -> Result(Nil, errors.ValidationError) { 97 + // Convert refs to strings 98 + let ref_strings = 99 + list.filter_map(refs_array, fn(r) { decode.run(r, decode.string) }) 100 + 101 + // Check each reference can be resolved (both local and global refs) 102 + list.try_fold(ref_strings, Nil, fn(_, ref_str) { 103 + case context.current_lexicon_id(ctx) { 104 + Some(lex_id) -> { 105 + // We have a full validation context, so validate reference resolution 106 + // This works for both local refs (#def) and global refs (nsid#def) 107 + use resolved <- result.try(resolution.resolve_reference( 108 + ref_str, 109 + ctx, 110 + lex_id, 111 + )) 112 + 113 + case resolved { 114 + Some(_) -> Ok(Nil) 115 + None -> 116 + Error(errors.invalid_schema( 117 + def_name <> ": reference not found: " <> ref_str, 118 + )) 119 + } 120 + } 121 + None -> { 122 + // No current lexicon (e.g., unit test context) 123 + // Just validate syntax, can't check if reference exists 124 + Ok(Nil) 125 + } 126 + 
} 127 + }) 86 128 } 87 129 88 130 /// Validates union data against schema ··· 143 185 refs_contain_type(ref_str, type_name) 144 186 }) 145 187 { 146 - Ok(_matching_ref) -> { 147 - // Found matching ref 148 - // In full implementation, would validate against the resolved schema 149 - Ok(Nil) 188 + Ok(matching_ref) -> { 189 + // Found matching ref - validate data against the resolved schema 190 + validate_against_resolved_ref(data, matching_ref, ctx, def_name) 150 191 } 151 192 Error(Nil) -> { 152 193 // No matching ref found ··· 177 218 } 178 219 } 179 220 } 221 + } 222 + } 223 + } 224 + 225 + /// Validates data against a resolved reference from the union 226 + fn validate_against_resolved_ref( 227 + data: Json, 228 + ref_str: String, 229 + ctx: ValidationContext, 230 + def_name: String, 231 + ) -> Result(Nil, errors.ValidationError) { 232 + // Get current lexicon ID to resolve the reference 233 + case context.current_lexicon_id(ctx) { 234 + Some(lex_id) -> { 235 + // We have a validation context, try to resolve and validate 236 + use resolved_opt <- result.try(resolution.resolve_reference( 237 + ref_str, 238 + ctx, 239 + lex_id, 240 + )) 241 + 242 + case resolved_opt { 243 + Some(resolved_schema) -> { 244 + // Successfully resolved - validate data against the resolved schema 245 + let validator = ctx.validator 246 + validator(data, resolved_schema, ctx) 247 + } 248 + None -> { 249 + // Reference couldn't be resolved 250 + // This shouldn't happen as schema validation should have caught it, 251 + // but handle gracefully 252 + Error(errors.data_validation( 253 + def_name <> ": reference not found: " <> ref_str, 254 + )) 255 + } 256 + } 257 + } 258 + None -> { 259 + // No lexicon context (e.g., unit test) 260 + // Can't validate against resolved schema, just accept the data 261 + Ok(Nil) 180 262 } 181 263 } 182 264 }
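The union changes above mean each entry in a union's `refs` is now checked for resolvability during schema validation, and matching data is validated against the resolved definition instead of being accepted on a bare `$type` match. Both local (`#def`) and global (`nsid#def`) reference forms go through the same resolution path. A hypothetical union schema exercising both forms:

```json
{
  "type": "union",
  "closed": false,
  "refs": ["#imageEmbed", "com.example.embed#external"]
}
```

If either ref names a definition that does not exist, schema validation now fails with a "reference not found" error rather than deferring the problem to data validation.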
+88 -16
src/honk/validation/field.gleam
··· 160 160
161 161 use _ <- result.try(properties)
162 162
163 + // Get properties for validation
164 + let properties_json = json_helpers.get_field(schema, "properties")
165 +
163 166 // Validate required fields reference existing properties
164 167 use _ <- result.try(case json_helpers.get_array(schema, "required") {
165 - Some(required_array) -> validate_required_fields(def_name, required_array)
168 + Some(required_array) ->
169 + validate_required_fields(def_name, required_array, properties_json)
166 170 None -> Ok(Nil)
167 171 })
168 172
169 173 // Validate nullable fields reference existing properties
170 174 use _ <- result.try(case json_helpers.get_array(schema, "nullable") {
171 - Some(nullable_array) -> validate_nullable_fields(def_name, nullable_array)
175 + Some(nullable_array) ->
176 + validate_nullable_fields(def_name, nullable_array, properties_json)
172 177 None -> Ok(Nil)
173 178 })
174 179
175 180 // Validate each property schema recursively
176 - case json_helpers.get_field(schema, "properties") {
181 + case properties_json {
177 182 Some(properties) -> {
178 183 case json_helpers.is_object(properties) {
179 184 True -> {
··· 235 240 fn validate_required_fields(
236 241 def_name: String,
237 242 required: List(Dynamic),
243 + properties: option.Option(Json),
238 244 ) -> Result(Nil, errors.ValidationError) {
239 245 // Convert dynamics to strings
240 246 let field_names =
241 247 list.filter_map(required, fn(item) { decode.run(item, decode.string) })
242 248
243 - // Each required field should be validated against properties
244 - // Simplified: just check they're strings
245 - case list.length(field_names) == list.length(required) {
249 + // Check all items are strings
250 + use _ <- result.try(case list.length(field_names) == list.length(required) {
246 251 True -> Ok(Nil)
247 252 False ->
248 253 Error(errors.invalid_schema(
249 254 def_name <> ": required fields must be strings",
250 255 ))
256 + })
257 +
258 + // Validate each required field exists in properties
259 + case properties {
260 + Some(props) -> {
261 + case json_helpers.json_to_dict(props) {
262 + Ok(props_dict) -> {
263 + list.try_fold(field_names, Nil, fn(_, field_name) {
264 + case json_helpers.dict_has_key(props_dict, field_name) {
265 + True -> Ok(Nil)
266 + False ->
267 + Error(errors.invalid_schema(
268 + def_name
269 + <> ": required field '"
270 + <> field_name
271 + <> "' not found in properties",
272 + ))
273 + }
274 + })
275 + }
276 + Error(_) -> Ok(Nil)
277 + }
278 + }
279 + None -> {
280 + // No properties defined, but required fields specified - this is an error
281 + case list.is_empty(field_names) {
282 + True -> Ok(Nil)
283 + False ->
284 + Error(errors.invalid_schema(
285 + def_name <> ": required fields specified but no properties defined",
286 + ))
287 + }
288 + }
251 289 }
252 290 }
253 291
··· 255 293 fn validate_nullable_fields(
256 294 def_name: String,
257 295 nullable: List(Dynamic),
296 + properties: option.Option(Json),
258 297 ) -> Result(Nil, errors.ValidationError) {
259 298 // Convert dynamics to strings
260 299 let field_names =
261 300 list.filter_map(nullable, fn(item) { decode.run(item, decode.string) })
262 301
263 - // Each nullable field should be validated against properties
264 - // Simplified: just check they're strings
265 - case list.length(field_names) == list.length(nullable) {
302 + // Check all items are strings
303 + use _ <- result.try(case list.length(field_names) == list.length(nullable) {
266 304 True -> Ok(Nil)
267 305 False ->
268 306 Error(errors.invalid_schema(
269 307 def_name <> ": nullable fields must be strings",
270 308 ))
309 + })
310 +
311 + // Validate each nullable field exists in properties
312 + case properties {
313 + Some(props) -> {
314 + case json_helpers.json_to_dict(props) {
315 + Ok(props_dict) -> {
316 + list.try_fold(field_names, Nil, fn(_, field_name) {
317 + case json_helpers.dict_has_key(props_dict, field_name) {
318 + True -> Ok(Nil)
319 + False ->
320 + Error(errors.invalid_schema(
321 + def_name
322 + <> ": nullable field '"
323 + <> field_name
324 + <> "' not found in properties",
325 + ))
326 + }
327 + })
328 + }
329 + Error(_) -> Ok(Nil)
330 + }
331 + }
332 + None -> {
333 + // No properties defined, but nullable fields specified - this is an error
334 + case list.is_empty(field_names) {
335 + True -> Ok(Nil)
336 + False ->
337 + Error(errors.invalid_schema(
338 + def_name <> ": nullable fields specified but no properties defined",
339 + ))
340 + }
341 + }
271 342 }
272 343 }
273 344
··· 283 354
284 355 // Check each required field exists in data
285 356 list.try_fold(field_names, Nil, fn(_, field_name) {
286 - case json_helpers.get_string(data, field_name) {
357 + case json_helpers.get_field(data, field_name) {
287 358 Some(_) -> Ok(Nil)
288 - None ->
289 - // Field might not be a string, check if it exists at all
290 - // Simplified: just report missing
291 - Error(errors.data_validation(
292 - def_name <> ": required field '" <> field_name <> "' is missing",
293 - ))
359 + None -> {
360 + let message = case def_name {
361 + "" -> "required field '" <> field_name <> "' is missing"
362 + _ -> def_name <> ": required field '" <> field_name <> "' is missing"
363 + }
364 + Error(errors.data_validation(message))
365 + }
294 366 }
295 367 })
296 368 }
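The new required/nullable cross-checks can be exercised end to end through the public API; a minimal sketch, assuming `honk.validate` as exported from `src/honk.gleam` (the `com.example.bad` lexicon is invented for illustration):

```gleam
import gleam/json
import honk

// Hypothetical lexicon: "required" names a field that is not in
// "properties", which the new check rejects at schema-validation time.
pub fn required_not_in_properties_example() {
  let lexicon =
    json.object([
      #("lexicon", json.int(1)),
      #("id", json.string("com.example.bad")),
      #(
        "defs",
        json.object([
          #(
            "main",
            json.object([
              #("type", json.string("record")),
              #("key", json.string("tid")),
              #(
                "record",
                json.object([
                  #("type", json.string("object")),
                  #(
                    "required",
                    json.array([json.string("missing")], fn(x) { x }),
                  ),
                  #(
                    "properties",
                    json.object([
                      #(
                        "title",
                        json.object([#("type", json.string("string"))]),
                      ),
                    ]),
                  ),
                ]),
              ),
            ]),
          ),
        ]),
      ),
    ])

  // Expected to be an Error mentioning
  // "required field 'missing' not found in properties"
  honk.validate([lexicon])
}
```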
+9
src/honk/validation/formats.gleam
··· 217 217 } 218 218 } 219 219 220 + /// Validates CID format with raw multicodec (0x55) for blobs 221 + /// Base32 CIDv1 with raw multicodec starts with "bafkrei" 222 + pub fn is_valid_raw_cid(value: String) -> Bool { 223 + case is_valid_cid(value) { 224 + False -> False 225 + True -> string.starts_with(value, "bafkrei") 226 + } 227 + } 228 + 220 229 /// Validates BCP47 language tag 221 230 pub fn is_valid_language_tag(value: String) -> Bool { 222 231 // Lenient BCP47 validation (max 128 chars)
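A usage sketch for the new `is_valid_raw_cid` helper, using the two CIDs that appear in this diff's test suite:

```gleam
import honk/validation/formats

pub fn raw_cid_check_example() -> #(Bool, Bool) {
  // CIDv1 with the raw multicodec (0x55) encodes in base32 with a
  // "bafkrei" prefix; dag-cbor CIDs ("bafyrei") must be rejected for blobs.
  #(
    formats.is_valid_raw_cid(
      "bafkreigh2akiscaildcqabsyg3dfr6chu3fgpregiymsck7e7aqa4s52zy",
    ),
    formats.is_valid_raw_cid(
      "bafyreidfayvfuwqa7qlnopdjiqrxzs6blmoeu4rujcjtnci5beludirz2a",
    ),
  )
  // -> #(True, False)
}
```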
+109 -8
src/honk/validation/primary/params.gleam
··· 1 1 // Params type validator 2 - // Mirrors the Go implementation's validation/primary/params 3 2 // Params define query/procedure/subscription parameters (XRPC endpoint arguments) 4 3 5 4 import gleam/dynamic/decode ··· 219 218 220 219 /// Validates params data against schema 221 220 pub fn validate_data( 222 - _data: Json, 223 - _schema: Json, 224 - _ctx: ValidationContext, 221 + data: Json, 222 + schema: Json, 223 + ctx: ValidationContext, 225 224 ) -> Result(Nil, errors.ValidationError) { 226 - // Params data validation would check that all required parameters are present 227 - // and that each parameter value matches its schema 228 - // For now, simplified implementation 229 - Ok(Nil) 225 + let def_name = context.path(ctx) 226 + 227 + // Get data as dict 228 + use data_dict <- result.try(json_helpers.json_to_dict(data)) 229 + 230 + // Get properties and required from params schema 231 + let properties_dict = case json_helpers.get_field(schema, "properties") { 232 + Some(props) -> json_helpers.json_to_dict(props) 233 + None -> Ok(json_helpers.empty_dict()) 234 + } 235 + 236 + let required_array = json_helpers.get_array(schema, "required") 237 + 238 + use props_dict <- result.try(properties_dict) 239 + 240 + // Check all required parameters are present 241 + use _ <- result.try(case required_array { 242 + Some(required) -> { 243 + list.try_fold(required, Nil, fn(_, item) { 244 + case decode.run(item, decode.string) { 245 + Ok(param_name) -> { 246 + case json_helpers.dict_has_key(data_dict, param_name) { 247 + True -> Ok(Nil) 248 + False -> 249 + Error(errors.data_validation( 250 + def_name 251 + <> ": missing required parameter '" 252 + <> param_name 253 + <> "'", 254 + )) 255 + } 256 + } 257 + Error(_) -> Ok(Nil) 258 + } 259 + }) 260 + } 261 + None -> Ok(Nil) 262 + }) 263 + 264 + // Validate each parameter in data 265 + json_helpers.dict_fold(data_dict, Ok(Nil), fn(acc, param_name, param_value) { 266 + case acc { 267 + Error(e) -> Error(e) 268 + Ok(_) -> { 
269 + // Get the schema for this parameter 270 + case json_helpers.dict_get(props_dict, param_name) { 271 + Some(param_schema_dyn) -> { 272 + // Convert dynamic to JSON 273 + case json_helpers.dynamic_to_json(param_schema_dyn) { 274 + Ok(param_schema) -> { 275 + // Convert param value to JSON 276 + case json_helpers.dynamic_to_json(param_value) { 277 + Ok(param_json) -> { 278 + // Validate the parameter value against its schema 279 + let param_ctx = context.with_path(ctx, param_name) 280 + validate_parameter_value( 281 + param_json, 282 + param_schema, 283 + param_ctx, 284 + ) 285 + } 286 + Error(e) -> Error(e) 287 + } 288 + } 289 + Error(e) -> Error(e) 290 + } 291 + } 292 + None -> { 293 + // Parameter not in schema - could warn or allow 294 + // For now, allow unknown parameters 295 + Ok(Nil) 296 + } 297 + } 298 + } 299 + } 300 + }) 301 + } 302 + 303 + /// Validates a single parameter value against its schema 304 + fn validate_parameter_value( 305 + value: Json, 306 + schema: Json, 307 + ctx: ValidationContext, 308 + ) -> Result(Nil, errors.ValidationError) { 309 + // Dispatch based on schema type 310 + case json_helpers.get_string(schema, "type") { 311 + Some("boolean") -> 312 + validation_primitive_boolean.validate_data(value, schema, ctx) 313 + Some("integer") -> 314 + validation_primitive_integer.validate_data(value, schema, ctx) 315 + Some("string") -> 316 + validation_primitive_string.validate_data(value, schema, ctx) 317 + Some("unknown") -> validation_meta_unknown.validate_data(value, schema, ctx) 318 + Some("array") -> validation_field.validate_array_data(value, schema, ctx) 319 + Some(other_type) -> 320 + Error(errors.data_validation( 321 + context.path(ctx) 322 + <> ": unsupported parameter type '" 323 + <> other_type 324 + <> "'", 325 + )) 326 + None -> 327 + Error(errors.data_validation( 328 + context.path(ctx) <> ": parameter schema missing type field", 329 + )) 330 + } 230 331 }
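The shapes the new params `validate_data` expects can be sketched as follows; this is an illustrative schema/data pair (the endpoint and field names are invented), to be passed to the validator in `honk/validation/primary/params`:

```gleam
import gleam/json

// Hypothetical query parameters for an XRPC endpoint. validate_data
// first checks that "limit" is present, then checks each supplied
// value against its property schema ("integer" here, "string" for cursor).
pub fn example_params_schema_and_data() -> #(json.Json, json.Json) {
  let schema =
    json.object([
      #("type", json.string("params")),
      #("required", json.array([json.string("limit")], fn(x) { x })),
      #(
        "properties",
        json.object([
          #("limit", json.object([#("type", json.string("integer"))])),
          #("cursor", json.object([#("type", json.string("string"))])),
        ]),
      ),
    ])
  let data = json.object([#("limit", json.int(25))])
  #(schema, data)
}
```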
+93 -4
src/honk/validation/primitive/blob.gleam
··· 13 13 import honk/internal/constraints
14 14 import honk/internal/json_helpers
15 15 import honk/validation/context.{type ValidationContext}
16 + import honk/validation/formats
16 17
17 18 const allowed_fields = ["type", "accept", "maxSize", "description"]
19 +
20 + const allowed_data_fields = ["$type", "ref", "mimeType", "size"]
18 21
19 22 /// Validates blob schema definition
20 23 pub fn validate_schema(
··· 66 69 Error(errors.data_validation(def_name <> ": expected blob object"))
67 70 }
68 71 True -> {
69 - // Validate required mimeType field
72 + // Validate no extra fields (strict mode per atproto implementation)
73 + let keys = json_helpers.get_keys(data)
74 + use _ <- result.try(validate_no_extra_fields(def_name, keys))
75 +
76 + // Validate $type field must be "blob"
77 + use _ <- result.try(case json_helpers.get_string(data, "$type") {
78 + Some("blob") -> Ok(Nil)
79 + Some(other) ->
80 + Error(errors.data_validation(
81 + def_name <> ": blob $type must be 'blob', got '" <> other <> "'",
82 + ))
83 + None ->
84 + Error(errors.data_validation(
85 + def_name <> ": blob missing required '$type' field",
86 + ))
87 + })
88 +
89 + // Validate ref field with $link containing raw CID
90 + use _ <- result.try(validate_ref_field(data, def_name))
91 +
92 + // Validate required mimeType field (non-empty)
70 93 use mime_type <- result.try(
71 94 case json_helpers.get_string(data, "mimeType") {
72 - Some(mt) -> Ok(mt)
95 + Some(mt) ->
96 + case string.is_empty(mt) {
97 + True ->
98 + Error(errors.data_validation(
99 + def_name <> ": blob mimeType cannot be empty",
100 + ))
101 + False -> Ok(mt)
102 + }
73 103 None ->
74 104 Error(errors.data_validation(
75 105 def_name <> ": blob missing required 'mimeType' field",
··· 77 107 },
78 108 )
79 109
80 - // Validate required size field
110 + // Validate required size field (non-negative integer)
81 111 use size <- result.try(case json_helpers.get_int(data, "size") {
82 - Some(s) -> Ok(s)
112 + Some(s) ->
113 + case s >= 0 {
114 + True -> Ok(s)
115 + False ->
116 + Error(errors.data_validation(
117 + def_name <> ": blob size must be non-negative",
118 + ))
119 + }
83 120 None ->
84 121 Error(errors.data_validation(
85 122 def_name <> ": blob missing or invalid 'size' field",
··· 111 148 None -> Ok(Nil)
112 149 }
113 150 }
151 + }
152 + }
153 +
154 + /// Validates that blob data has no extra fields
155 + fn validate_no_extra_fields(
156 + def_name: String,
157 + keys: List(String),
158 + ) -> Result(Nil, errors.ValidationError) {
159 + let extra_keys =
160 + list.filter(keys, fn(key) { !list.contains(allowed_data_fields, key) })
161 + case extra_keys {
162 + [] -> Ok(Nil)
163 + [first, ..] ->
164 + Error(errors.data_validation(
165 + def_name <> ": blob has unexpected field '" <> first <> "'",
166 + ))
167 + }
168 + }
169 +
170 + /// Validates the ref field containing $link with raw CID
171 + fn validate_ref_field(
172 + data: Json,
173 + def_name: String,
174 + ) -> Result(Nil, errors.ValidationError) {
175 + case json_helpers.get_field(data, "ref") {
176 + Some(ref_json) ->
177 + case json_helpers.is_object(ref_json) {
178 + False ->
179 + Error(errors.data_validation(
180 + def_name <> ": blob ref must be an object",
181 + ))
182 + True ->
183 + case json_helpers.get_string(ref_json, "$link") {
184 + Some(cid) ->
185 + case formats.is_valid_raw_cid(cid) {
186 + True -> Ok(Nil)
187 + False ->
188 + Error(errors.data_validation(
189 + def_name
190 + <> ": blob ref.$link must be a valid CID with raw multicodec (bafkrei prefix)",
191 + ))
192 + }
193 + None ->
194 + Error(errors.data_validation(
195 + def_name <> ": blob ref must have $link field",
196 + ))
197 + }
198 + }
199 + None ->
200 + Error(errors.data_validation(
201 + def_name <> ": blob missing required 'ref' field",
202 + ))
114 203 }
115 204 }
116 205
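For reference, the only blob wire shape the stricter `validate_data` now accepts is the following (the CID is the one used throughout the test suite); a missing or non-`"blob"` `$type`, an extra field, an empty `mimeType`, a negative `size`, or a non-raw CID in `ref.$link` is rejected:

```json
{
  "$type": "blob",
  "ref": {
    "$link": "bafkreigh2akiscaildcqabsyg3dfr6chu3fgpregiymsck7e7aqa4s52zy"
  },
  "mimeType": "image/jpeg",
  "size": 50000
}
```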
+44 -8
src/honk.gleam
··· 1 1 // Main public API for the ATProtocol lexicon validator
2 2
3 + @target(erlang)
3 4 import argv
4 5 import gleam/dict.{type Dict}
5 6 import gleam/dynamic
6 7 import gleam/dynamic/decode
8 + @target(erlang)
7 9 import gleam/int
10 + @target(erlang)
8 11 import gleam/io
9 12 import gleam/json.{type Json}
10 13 import gleam/list
··· 16 19 import honk/types
17 20 import honk/validation/context
18 21 import honk/validation/formats
22 + @target(erlang)
19 23 import simplifile
20 24
21 25 // Import validators
··· 149 153 }
150 154 }
151 155
152 - /// Validates a single data record against a collection schema
153 - pub fn validate_record(
156 + /// Validation context type (re-exported for external use)
157 + pub type ValidationContext =
158 + context.ValidationContext
159 +
160 + /// Build a reusable validation context from lexicons
161 + /// Call this once, then use validate_record_with_context for each record
162 + pub fn build_validation_context(
154 163 lexicons: List(Json),
155 - collection: String,
156 - record: Json,
157 - ) -> Result(Nil, ValidationError) {
158 - // Build validation context
164 + ) -> Result(ValidationContext, ValidationError) {
159 165 let builder_result =
160 166 context.builder()
161 167 |> context.with_lexicons(lexicons)
162 168
163 169 use builder <- result.try(builder_result)
164 - use ctx <- result.try(context.build(builder))
170 + context.build(builder)
171 + }
165 172
173 + /// Validates a single data record against a collection schema using pre-built context
174 + /// This is much faster when validating many records - build context once with
175 + /// build_validation_context, then call this for each record
176 + pub fn validate_record_with_context(
177 + ctx: ValidationContext,
178 + collection: String,
179 + record: Json,
180 + ) -> Result(Nil, ValidationError) {
166 181 // Get the lexicon for this collection
167 182 case context.get_lexicon(ctx, collection) {
168 183 Some(lexicon) -> {
··· 170 185 case json_helpers.get_field(lexicon.defs, "main") {
171 186 Some(main_def) -> {
172 187 let lex_ctx = context.with_current_lexicon(ctx, collection)
188 + // Set the path to include the definition name
189 + let def_ctx = context.with_path(lex_ctx, "defs.main")
173 190 // Validate the record data against the main definition
174 - validation_primary_record.validate_data(record, main_def, lex_ctx)
191 + validation_primary_record.validate_data(record, main_def, def_ctx)
175 192 }
176 193 None ->
177 194 Error(errors.invalid_schema(
··· 183 200 }
184 201 }
185 202
203 + /// Validates a single data record against a collection schema
204 + pub fn validate_record(
205 + lexicons: List(Json),
206 + collection: String,
207 + record: Json,
208 + ) -> Result(Nil, ValidationError) {
209 + // Build validation context
210 + use ctx <- result.try(build_validation_context(lexicons))
211 + validate_record_with_context(ctx, collection, record)
212 + }
213 +
186 214 /// Validates NSID format
187 215 pub fn is_valid_nsid(nsid: String) -> Bool {
188 216 formats.is_valid_nsid(nsid)
··· 257 285 })
258 286 }
288 + @target(erlang)
260 289 /// CLI entry point for the honk lexicon validator
261 290 ///
262 291 /// Usage:
··· 273 302 }
274 303 }
305 + @target(erlang)
276 306 /// Validate a path (auto-detects file or directory)
277 307 fn validate_path(path: String) -> Nil {
278 308 case simplifile.is_file(path) {
··· 298 328 }
299 329 }
331 + @target(erlang)
301 332 /// Validate a single lexicon file
302 333 fn validate_file(file_path: String) -> Nil {
303 334 case read_and_validate_file(file_path) {
··· 313 344 }
314 345 }
347 + @target(erlang)
316 348 /// Validate all .json files in a directory
317 349 fn validate_directory(dir_path: String) -> Nil {
318 350 case simplifile.get_files(dir_path) {
··· 450 482 }
451 483 }
485 + @target(erlang)
453 486 /// Read and parse a JSON file (without validation)
454 487 fn read_json_file(file_path: String) -> Result(Json, String) {
455 488 use content <- result.try(
··· 466 499 |> result.map_error(fn(_) { "Failed to convert JSON" })
467 500 }
502 + @target(erlang)
469 503 /// Read a file and validate it as a lexicon
470 504 fn read_and_validate_file(file_path: String) -> Result(Nil, String) {
471 505 use content <- result.try(
··· 491 525 Ok(Nil)
492 526 }
528 + @target(erlang)
494 529 /// Format validation errors from the error map
495 530 fn format_validation_errors(error_map: Dict(String, List(String))) -> String {
496 531 error_map
··· 502 537 |> string.join("\n ")
503 538 }
540 + @target(erlang)
505 541 /// Show help text
506 542 fn show_help() -> Nil {
507 543 io.println(
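The intended batch-validation pattern for the new API can be sketched as below; `com.example.post` and the input lists are placeholders, and the error type is assumed to come from `honk/errors` as in the test suite:

```gleam
import gleam/json.{type Json}
import gleam/list
import honk
import honk/errors

// Build the validation context once, then reuse it for every record,
// instead of rebuilding it per validate_record call.
pub fn validate_batch(
  lexicons: List(Json),
  records: List(Json),
) -> List(Result(Nil, errors.ValidationError)) {
  case honk.build_validation_context(lexicons) {
    Ok(ctx) ->
      list.map(records, fn(record) {
        honk.validate_record_with_context(ctx, "com.example.post", record)
      })
    Error(e) -> [Error(e)]
  }
}
```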
+3
src/honk_bundle.mjs
··· 1 + // Bundle entry point - re-exports honk functions plus toList for JS interop 2 + export * from "../build/dev/javascript/honk/honk.mjs"; 3 + export { toList } from "../build/dev/javascript/prelude.mjs";
+336 -2
test/blob_validator_test.gleam
··· 90 90 91 91 let data = 92 92 json.object([ 93 + #("$type", json.string("blob")), 94 + #( 95 + "ref", 96 + json.object([ 97 + #( 98 + "$link", 99 + json.string( 100 + "bafkreigh2akiscaildcqabsyg3dfr6chu3fgpregiymsck7e7aqa4s52zy", 101 + ), 102 + ), 103 + ]), 104 + ), 93 105 #("mimeType", json.string("image/jpeg")), 94 106 #("size", json.int(50_000)), 95 107 ]) ··· 109 121 110 122 let data = 111 123 json.object([ 124 + #("$type", json.string("blob")), 125 + #( 126 + "ref", 127 + json.object([ 128 + #( 129 + "$link", 130 + json.string( 131 + "bafkreigh2akiscaildcqabsyg3dfr6chu3fgpregiymsck7e7aqa4s52zy", 132 + ), 133 + ), 134 + ]), 135 + ), 112 136 #("mimeType", json.string("video/mp4")), 113 137 #("size", json.int(50_000)), 114 138 ]) ··· 128 152 129 153 let data = 130 154 json.object([ 155 + #("$type", json.string("blob")), 156 + #( 157 + "ref", 158 + json.object([ 159 + #( 160 + "$link", 161 + json.string( 162 + "bafkreigh2akiscaildcqabsyg3dfr6chu3fgpregiymsck7e7aqa4s52zy", 163 + ), 164 + ), 165 + ]), 166 + ), 131 167 #("mimeType", json.string("image/jpeg")), 132 168 #("size", json.int(50_000)), 133 169 ]) ··· 141 177 pub fn missing_mime_type_test() { 142 178 let schema = json.object([#("type", json.string("blob"))]) 143 179 144 - let data = json.object([#("size", json.int(50_000))]) 180 + let data = 181 + json.object([ 182 + #("$type", json.string("blob")), 183 + #( 184 + "ref", 185 + json.object([ 186 + #( 187 + "$link", 188 + json.string( 189 + "bafkreigh2akiscaildcqabsyg3dfr6chu3fgpregiymsck7e7aqa4s52zy", 190 + ), 191 + ), 192 + ]), 193 + ), 194 + #("size", json.int(50_000)), 195 + ]) 145 196 146 197 let assert Ok(ctx) = context.builder() |> context.build 147 198 let result = blob.validate_data(data, schema, ctx) ··· 152 203 pub fn missing_size_test() { 153 204 let schema = json.object([#("type", json.string("blob"))]) 154 205 155 - let data = json.object([#("mimeType", json.string("image/jpeg"))]) 206 + let data = 207 + json.object([ 208 + #("$type", 
json.string("blob")), 209 + #( 210 + "ref", 211 + json.object([ 212 + #( 213 + "$link", 214 + json.string( 215 + "bafkreigh2akiscaildcqabsyg3dfr6chu3fgpregiymsck7e7aqa4s52zy", 216 + ), 217 + ), 218 + ]), 219 + ), 220 + #("mimeType", json.string("image/jpeg")), 221 + ]) 222 + 223 + let assert Ok(ctx) = context.builder() |> context.build 224 + let result = blob.validate_data(data, schema, ctx) 225 + result |> should.be_error 226 + } 227 + 228 + // ========== FULL BLOB STRUCTURE TESTS ========== 229 + 230 + // Test valid full blob structure 231 + pub fn valid_full_blob_structure_test() { 232 + let schema = json.object([#("type", json.string("blob"))]) 233 + 234 + let data = 235 + json.object([ 236 + #("$type", json.string("blob")), 237 + #( 238 + "ref", 239 + json.object([ 240 + #( 241 + "$link", 242 + json.string( 243 + "bafkreigh2akiscaildcqabsyg3dfr6chu3fgpregiymsck7e7aqa4s52zy", 244 + ), 245 + ), 246 + ]), 247 + ), 248 + #("mimeType", json.string("image/jpeg")), 249 + #("size", json.int(50_000)), 250 + ]) 251 + 252 + let assert Ok(ctx) = context.builder() |> context.build 253 + let result = blob.validate_data(data, schema, ctx) 254 + result |> should.be_ok 255 + } 256 + 257 + // Test missing $type field 258 + pub fn missing_type_field_test() { 259 + let schema = json.object([#("type", json.string("blob"))]) 260 + 261 + let data = 262 + json.object([ 263 + #( 264 + "ref", 265 + json.object([ 266 + #( 267 + "$link", 268 + json.string( 269 + "bafkreigh2akiscaildcqabsyg3dfr6chu3fgpregiymsck7e7aqa4s52zy", 270 + ), 271 + ), 272 + ]), 273 + ), 274 + #("mimeType", json.string("image/jpeg")), 275 + #("size", json.int(50_000)), 276 + ]) 277 + 278 + let assert Ok(ctx) = context.builder() |> context.build 279 + let result = blob.validate_data(data, schema, ctx) 280 + result |> should.be_error 281 + } 282 + 283 + // Test wrong $type value 284 + pub fn wrong_type_value_test() { 285 + let schema = json.object([#("type", json.string("blob"))]) 286 + 287 + let data = 288 + 
json.object([ 289 + #("$type", json.string("notblob")), 290 + #( 291 + "ref", 292 + json.object([ 293 + #( 294 + "$link", 295 + json.string( 296 + "bafkreigh2akiscaildcqabsyg3dfr6chu3fgpregiymsck7e7aqa4s52zy", 297 + ), 298 + ), 299 + ]), 300 + ), 301 + #("mimeType", json.string("image/jpeg")), 302 + #("size", json.int(50_000)), 303 + ]) 304 + 305 + let assert Ok(ctx) = context.builder() |> context.build 306 + let result = blob.validate_data(data, schema, ctx) 307 + result |> should.be_error 308 + } 309 + 310 + // Test missing ref field 311 + pub fn missing_ref_field_test() { 312 + let schema = json.object([#("type", json.string("blob"))]) 313 + 314 + let data = 315 + json.object([ 316 + #("$type", json.string("blob")), 317 + #("mimeType", json.string("image/jpeg")), 318 + #("size", json.int(50_000)), 319 + ]) 320 + 321 + let assert Ok(ctx) = context.builder() |> context.build 322 + let result = blob.validate_data(data, schema, ctx) 323 + result |> should.be_error 324 + } 325 + 326 + // Test ref without $link 327 + pub fn ref_missing_link_test() { 328 + let schema = json.object([#("type", json.string("blob"))]) 329 + 330 + let data = 331 + json.object([ 332 + #("$type", json.string("blob")), 333 + #("ref", json.object([#("cid", json.string("bafkrei..."))])), 334 + #("mimeType", json.string("image/jpeg")), 335 + #("size", json.int(50_000)), 336 + ]) 337 + 338 + let assert Ok(ctx) = context.builder() |> context.build 339 + let result = blob.validate_data(data, schema, ctx) 340 + result |> should.be_error 341 + } 342 + 343 + // Test ref with invalid CID 344 + pub fn ref_invalid_cid_test() { 345 + let schema = json.object([#("type", json.string("blob"))]) 346 + 347 + let data = 348 + json.object([ 349 + #("$type", json.string("blob")), 350 + #("ref", json.object([#("$link", json.string("not-a-valid-cid"))])), 351 + #("mimeType", json.string("image/jpeg")), 352 + #("size", json.int(50_000)), 353 + ]) 354 + 355 + let assert Ok(ctx) = context.builder() |> context.build 356 
+ let result = blob.validate_data(data, schema, ctx) 357 + result |> should.be_error 358 + } 359 + 360 + // Test ref with dag-cbor CID (should fail - blobs need raw multicodec) 361 + pub fn ref_dag_cbor_cid_test() { 362 + let schema = json.object([#("type", json.string("blob"))]) 363 + 364 + let data = 365 + json.object([ 366 + #("$type", json.string("blob")), 367 + #( 368 + "ref", 369 + json.object([ 370 + #( 371 + "$link", 372 + json.string( 373 + "bafyreidfayvfuwqa7qlnopdjiqrxzs6blmoeu4rujcjtnci5beludirz2a", 374 + ), 375 + ), 376 + ]), 377 + ), 378 + #("mimeType", json.string("image/jpeg")), 379 + #("size", json.int(50_000)), 380 + ]) 381 + 382 + let assert Ok(ctx) = context.builder() |> context.build 383 + let result = blob.validate_data(data, schema, ctx) 384 + result |> should.be_error 385 + } 386 + 387 + // Test empty mimeType rejected 388 + pub fn empty_mime_type_test() { 389 + let schema = json.object([#("type", json.string("blob"))]) 390 + 391 + let data = 392 + json.object([ 393 + #("$type", json.string("blob")), 394 + #( 395 + "ref", 396 + json.object([ 397 + #( 398 + "$link", 399 + json.string( 400 + "bafkreigh2akiscaildcqabsyg3dfr6chu3fgpregiymsck7e7aqa4s52zy", 401 + ), 402 + ), 403 + ]), 404 + ), 405 + #("mimeType", json.string("")), 406 + #("size", json.int(50_000)), 407 + ]) 408 + 409 + let assert Ok(ctx) = context.builder() |> context.build 410 + let result = blob.validate_data(data, schema, ctx) 411 + result |> should.be_error 412 + } 413 + 414 + // Test size zero is allowed (per atproto implementation) 415 + pub fn size_zero_allowed_test() { 416 + let schema = json.object([#("type", json.string("blob"))]) 417 + 418 + let data = 419 + json.object([ 420 + #("$type", json.string("blob")), 421 + #( 422 + "ref", 423 + json.object([ 424 + #( 425 + "$link", 426 + json.string( 427 + "bafkreigh2akiscaildcqabsyg3dfr6chu3fgpregiymsck7e7aqa4s52zy", 428 + ), 429 + ), 430 + ]), 431 + ), 432 + #("mimeType", json.string("image/jpeg")), 433 + #("size", 
json.int(0)), 434 + ]) 435 + 436 + let assert Ok(ctx) = context.builder() |> context.build 437 + let result = blob.validate_data(data, schema, ctx) 438 + result |> should.be_ok 439 + } 440 + 441 + // Test negative size rejected 442 + pub fn negative_size_test() { 443 + let schema = json.object([#("type", json.string("blob"))]) 444 + 445 + let data = 446 + json.object([ 447 + #("$type", json.string("blob")), 448 + #( 449 + "ref", 450 + json.object([ 451 + #( 452 + "$link", 453 + json.string( 454 + "bafkreigh2akiscaildcqabsyg3dfr6chu3fgpregiymsck7e7aqa4s52zy", 455 + ), 456 + ), 457 + ]), 458 + ), 459 + #("mimeType", json.string("image/jpeg")), 460 + #("size", json.int(-100)), 461 + ]) 462 + 463 + let assert Ok(ctx) = context.builder() |> context.build 464 + let result = blob.validate_data(data, schema, ctx) 465 + result |> should.be_error 466 + } 467 + 468 + // Test extra fields are rejected (strict mode per atproto implementation) 469 + pub fn extra_fields_rejected_test() { 470 + let schema = json.object([#("type", json.string("blob"))]) 471 + 472 + let data = 473 + json.object([ 474 + #("$type", json.string("blob")), 475 + #( 476 + "ref", 477 + json.object([ 478 + #( 479 + "$link", 480 + json.string( 481 + "bafkreigh2akiscaildcqabsyg3dfr6chu3fgpregiymsck7e7aqa4s52zy", 482 + ), 483 + ), 484 + ]), 485 + ), 486 + #("mimeType", json.string("image/jpeg")), 487 + #("size", json.int(50_000)), 488 + #("extraField", json.string("not allowed")), 489 + ]) 156 490 157 491 let assert Ok(ctx) = context.builder() |> context.build 158 492 let result = blob.validate_data(data, schema, ctx)
+199
test/end_to_end_test.gleam
··· 5 5 import gleeunit 6 6 import gleeunit/should 7 7 import honk 8 + import honk/errors 8 9 import honk/types.{DateTime, Uri} 9 10 10 11 pub fn main() { ··· 327 328 honk.validate([lexicon]) 328 329 |> should.be_ok 329 330 } 331 + 332 + // Test missing required field error message with full defs.main path 333 + pub fn validate_record_missing_required_field_message_test() { 334 + let lexicon = 335 + json.object([ 336 + #("lexicon", json.int(1)), 337 + #("id", json.string("com.example.post")), 338 + #( 339 + "defs", 340 + json.object([ 341 + #( 342 + "main", 343 + json.object([ 344 + #("type", json.string("record")), 345 + #("key", json.string("tid")), 346 + #( 347 + "record", 348 + json.object([ 349 + #("type", json.string("object")), 350 + #("required", json.array([json.string("title")], fn(x) { x })), 351 + #( 352 + "properties", 353 + json.object([ 354 + #( 355 + "title", 356 + json.object([#("type", json.string("string"))]), 357 + ), 358 + ]), 359 + ), 360 + ]), 361 + ), 362 + ]), 363 + ), 364 + ]), 365 + ), 366 + ]) 367 + 368 + let data = json.object([#("description", json.string("No title"))]) 369 + 370 + let assert Error(error) = 371 + honk.validate_record([lexicon], "com.example.post", data) 372 + 373 + let error_message = errors.to_string(error) 374 + error_message 375 + |> should.equal( 376 + "Data validation failed: defs.main: required field 'title' is missing", 377 + ) 378 + } 379 + 380 + // Test missing required field in nested object with full path 381 + pub fn validate_record_nested_missing_required_field_message_test() { 382 + let lexicon = 383 + json.object([ 384 + #("lexicon", json.int(1)), 385 + #("id", json.string("com.example.post")), 386 + #( 387 + "defs", 388 + json.object([ 389 + #( 390 + "main", 391 + json.object([ 392 + #("type", json.string("record")), 393 + #("key", json.string("tid")), 394 + #( 395 + "record", 396 + json.object([ 397 + #("type", json.string("object")), 398 + #( 399 + "properties", 400 + json.object([ 401 + #( 402 + 
"title", 403 + json.object([#("type", json.string("string"))]), 404 + ), 405 + #( 406 + "metadata", 407 + json.object([ 408 + #("type", json.string("object")), 409 + #( 410 + "required", 411 + json.array([json.string("author")], fn(x) { x }), 412 + ), 413 + #( 414 + "properties", 415 + json.object([ 416 + #( 417 + "author", 418 + json.object([#("type", json.string("string"))]), 419 + ), 420 + ]), 421 + ), 422 + ]), 423 + ), 424 + ]), 425 + ), 426 + ]), 427 + ), 428 + ]), 429 + ), 430 + ]), 431 + ), 432 + ]) 433 + 434 + let data = 435 + json.object([ 436 + #("title", json.string("My Post")), 437 + #("metadata", json.object([#("tags", json.string("tech"))])), 438 + ]) 439 + 440 + let assert Error(error) = 441 + honk.validate_record([lexicon], "com.example.post", data) 442 + 443 + let error_message = errors.to_string(error) 444 + error_message 445 + |> should.equal( 446 + "Data validation failed: defs.main.metadata: required field 'author' is missing", 447 + ) 448 + } 449 + 450 + // Test schema validation error for non-main definition includes correct path 451 + pub fn validate_schema_non_main_definition_error_test() { 452 + let lexicon = 453 + json.object([ 454 + #("lexicon", json.int(1)), 455 + #("id", json.string("com.example.test")), 456 + #( 457 + "defs", 458 + json.object([ 459 + #( 460 + "objectDef", 461 + json.object([ 462 + #("type", json.string("object")), 463 + #( 464 + "properties", 465 + json.object([ 466 + #( 467 + "fieldA", 468 + json.object([ 469 + #("type", json.string("string")), 470 + // Invalid: maxLength must be an integer, not a string 471 + #("maxLength", json.string("300")), 472 + ]), 473 + ), 474 + ]), 475 + ), 476 + ]), 477 + ), 478 + #( 479 + "recordDef", 480 + json.object([ 481 + #("type", json.string("record")), 482 + #("key", json.string("tid")), 483 + #( 484 + "record", 485 + json.object([ 486 + #("type", json.string("object")), 487 + #( 488 + "properties", 489 + json.object([ 490 + #( 491 + "fieldB", 492 + json.object([ 493 + #("type", 
json.string("ref")), 494 + // Invalid: missing required "ref" field for ref type 495 + ]), 496 + ), 497 + ]), 498 + ), 499 + ]), 500 + ), 501 + ]), 502 + ), 503 + ]), 504 + ), 505 + ]) 506 + 507 + let result = honk.validate([lexicon]) 508 + 509 + // Should have errors 510 + result |> should.be_error 511 + 512 + case result { 513 + Error(error_map) -> { 514 + // Get errors for this lexicon 515 + case dict.get(error_map, "com.example.test") { 516 + Ok(error_list) -> { 517 + // Should have exactly one error from the recordDef (ref missing 'ref' field) 518 + error_list 519 + |> should.equal([ 520 + "com.example.test#recordDef: .record.properties.fieldB: ref missing required 'ref' field", 521 + ]) 522 + } 523 + Error(_) -> should.fail() 524 + } 525 + } 526 + Ok(_) -> should.fail() 527 + } 528 + }
+30
test/format_validator_test.gleam
··· 256 256 formats.is_valid_cid("") |> should.be_false 257 257 } 258 258 259 + // ========== RAW CID TESTS ========== 260 + 261 + // Test valid raw CID (bafkrei prefix = CIDv1 + raw multicodec 0x55) 262 + pub fn valid_raw_cid_test() { 263 + formats.is_valid_raw_cid( 264 + "bafkreigh2akiscaildcqabsyg3dfr6chu3fgpregiymsck7e7aqa4s52zy", 265 + ) 266 + |> should.be_true 267 + } 268 + 269 + // Test dag-cbor CID rejected (bafyrei prefix = CIDv1 + dag-cbor multicodec 0x71) 270 + pub fn invalid_raw_cid_dag_cbor_test() { 271 + formats.is_valid_raw_cid( 272 + "bafyreidfayvfuwqa7qlnopdjiqrxzs6blmoeu4rujcjtnci5beludirz2a", 273 + ) 274 + |> should.be_false 275 + } 276 + 277 + // Test CIDv0 rejected for raw CID 278 + pub fn invalid_raw_cid_v0_test() { 279 + formats.is_valid_raw_cid("QmbWqxBEKC3P8tqsKc98xmWNzrzDtRLMiMPL8wBuTGsMnR") 280 + |> should.be_false 281 + } 282 + 283 + // Test invalid CID rejected 284 + pub fn invalid_raw_cid_garbage_test() { 285 + formats.is_valid_raw_cid("not-a-cid") 286 + |> should.be_false 287 + } 288 + 259 289 // ========== LANGUAGE TESTS ========== 260 290 261 291 pub fn language_valid_test() {
+88
test/integration_test.gleam
··· 1 1 import gleam/json 2 2 import gleeunit 3 3 import gleeunit/should 4 + import honk/errors 4 5 import honk/validation/context 5 6 import honk/validation/primary/record 6 7 ··· 230 231 let result = record.validate_schema(schema, ctx) 231 232 result |> should.be_ok 232 233 } 234 + 235 + // Test missing required field error message at record root level 236 + pub fn record_missing_required_field_message_test() { 237 + let schema = 238 + json.object([ 239 + #("type", json.string("record")), 240 + #("key", json.string("tid")), 241 + #( 242 + "record", 243 + json.object([ 244 + #("type", json.string("object")), 245 + #("required", json.array([json.string("title")], fn(x) { x })), 246 + #( 247 + "properties", 248 + json.object([ 249 + #("title", json.object([#("type", json.string("string"))])), 250 + ]), 251 + ), 252 + ]), 253 + ), 254 + ]) 255 + 256 + let data = json.object([#("description", json.string("No title"))]) 257 + 258 + let assert Ok(ctx) = context.builder() |> context.build 259 + let assert Error(error) = record.validate_data(data, schema, ctx) 260 + 261 + let error_message = errors.to_string(error) 262 + error_message 263 + |> should.equal("Data validation failed: required field 'title' is missing") 264 + } 265 + 266 + // Test missing required field error message in nested object 267 + pub fn record_nested_missing_required_field_message_test() { 268 + let schema = 269 + json.object([ 270 + #("type", json.string("record")), 271 + #("key", json.string("tid")), 272 + #( 273 + "record", 274 + json.object([ 275 + #("type", json.string("object")), 276 + #( 277 + "properties", 278 + json.object([ 279 + #("title", json.object([#("type", json.string("string"))])), 280 + #( 281 + "metadata", 282 + json.object([ 283 + #("type", json.string("object")), 284 + #( 285 + "required", 286 + json.array([json.string("author")], fn(x) { x }), 287 + ), 288 + #( 289 + "properties", 290 + json.object([ 291 + #( 292 + "author", 293 + json.object([#("type", 
json.string("string"))]), 294 + ), 295 + #("tags", json.object([#("type", json.string("string"))])), 296 + ]), 297 + ), 298 + ]), 299 + ), 300 + ]), 301 + ), 302 + ]), 303 + ), 304 + ]) 305 + 306 + let data = 307 + json.object([ 308 + #("title", json.string("My Post")), 309 + #("metadata", json.object([#("tags", json.string("tech"))])), 310 + ]) 311 + 312 + let assert Ok(ctx) = context.builder() |> context.build 313 + let assert Error(error) = record.validate_data(data, schema, ctx) 314 + 315 + let error_message = errors.to_string(error) 316 + error_message 317 + |> should.equal( 318 + "Data validation failed: metadata: required field 'author' is missing", 319 + ) 320 + }
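The error messages asserted here show that the path prefix only appears once validation descends into a nested object: `required field 'title' is missing` at the root, but `metadata: required field 'author' is missing` one level down. A rough Python sketch of how that path accumulates (names are illustrative; honk itself is Gleam):

```python
def find_missing_required(data: dict, schema: dict, path: str = "") -> list[str]:
    """Report missing required fields with their nested path, mirroring
    the messages asserted in the two tests above."""
    errors = []
    # Check this level's required list against the data keys.
    for name in schema.get("required", []):
        if name not in data:
            prefix = f"{path}: " if path else ""
            errors.append(f"{prefix}required field '{name}' is missing")
    # Recurse into nested object properties, extending the path.
    for name, sub_schema in schema.get("properties", {}).items():
        if sub_schema.get("type") == "object" and isinstance(data.get(name), dict):
            child_path = f"{path}.{name}" if path else name
            errors.extend(find_missing_required(data[name], sub_schema, child_path))
    return errors
```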
+117
test/object_validator_test.gleam
··· 1 1 import gleam/json 2 2 import gleeunit 3 3 import gleeunit/should 4 + import honk/errors 4 5 import honk/validation/context 5 6 import honk/validation/field 6 7 ··· 74 75 let result = field.validate_object_data(data, schema, ctx) 75 76 result |> should.be_error 76 77 } 78 + 79 + // Test missing required field error message at root level (no path) 80 + pub fn missing_required_field_message_root_test() { 81 + let schema = 82 + json.object([ 83 + #("type", json.string("object")), 84 + #( 85 + "properties", 86 + json.object([ 87 + #("title", json.object([#("type", json.string("string"))])), 88 + ]), 89 + ), 90 + #("required", json.array([json.string("title")], fn(x) { x })), 91 + ]) 92 + 93 + let data = json.object([#("other", json.string("value"))]) 94 + 95 + let assert Ok(ctx) = context.builder() |> context.build 96 + let assert Error(error) = field.validate_object_data(data, schema, ctx) 97 + 98 + let error_message = errors.to_string(error) 99 + error_message 100 + |> should.equal("Data validation failed: required field 'title' is missing") 101 + } 102 + 103 + // Test nullable field accepts null value 104 + pub fn nullable_field_accepts_null_test() { 105 + let schema = 106 + json.object([ 107 + #("type", json.string("object")), 108 + #( 109 + "properties", 110 + json.object([ 111 + #("name", json.object([#("type", json.string("string"))])), 112 + #("duration", json.object([#("type", json.string("integer"))])), 113 + ]), 114 + ), 115 + #("nullable", json.array([json.string("duration")], fn(x) { x })), 116 + ]) 117 + 118 + let data = 119 + json.object([ 120 + #("name", json.string("test")), 121 + #("duration", json.null()), 122 + ]) 123 + 124 + let assert Ok(ctx) = context.builder() |> context.build 125 + let result = field.validate_object_data(data, schema, ctx) 126 + result |> should.be_ok 127 + } 128 + 129 + // Test non-nullable field rejects null value 130 + pub fn non_nullable_field_rejects_null_test() { 131 + let schema = 132 + json.object([ 133 + 
#("type", json.string("object")), 134 + #( 135 + "properties", 136 + json.object([ 137 + #("name", json.object([#("type", json.string("string"))])), 138 + #("count", json.object([#("type", json.string("integer"))])), 139 + ]), 140 + ), 141 + // No nullable array - count cannot be null 142 + ]) 143 + 144 + let data = 145 + json.object([ 146 + #("name", json.string("test")), 147 + #("count", json.null()), 148 + ]) 149 + 150 + let assert Ok(ctx) = context.builder() |> context.build 151 + let result = field.validate_object_data(data, schema, ctx) 152 + result |> should.be_error 153 + } 154 + 155 + // Test nullable field must exist in properties (schema validation) 156 + pub fn nullable_field_not_in_properties_fails_test() { 157 + let schema = 158 + json.object([ 159 + #("type", json.string("object")), 160 + #( 161 + "properties", 162 + json.object([ 163 + #("name", json.object([#("type", json.string("string"))])), 164 + ]), 165 + ), 166 + // "nonexistent" is not in properties 167 + #("nullable", json.array([json.string("nonexistent")], fn(x) { x })), 168 + ]) 169 + 170 + let assert Ok(ctx) = context.builder() |> context.build 171 + let result = field.validate_object_schema(schema, ctx) 172 + result |> should.be_error 173 + } 174 + 175 + // Test valid nullable schema passes validation 176 + pub fn valid_nullable_schema_test() { 177 + let schema = 178 + json.object([ 179 + #("type", json.string("object")), 180 + #( 181 + "properties", 182 + json.object([ 183 + #("name", json.object([#("type", json.string("string"))])), 184 + #("duration", json.object([#("type", json.string("integer"))])), 185 + ]), 186 + ), 187 + #("nullable", json.array([json.string("duration")], fn(x) { x })), 188 + ]) 189 + 190 + let assert Ok(ctx) = context.builder() |> context.build 191 + let result = field.validate_object_schema(schema, ctx) 192 + result |> should.be_ok 193 + }
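The nullable tests above exercise two separate rules: at schema-validation time every name in `nullable` must also appear in `properties`, and at data-validation time a `null` value is only legal for fields listed in `nullable`. A minimal Python sketch of both rules (hypothetical helpers, not honk's API):

```python
def check_nullable_schema(schema: dict) -> list[str]:
    """Schema rule: every 'nullable' entry must be a defined property."""
    props = schema.get("properties", {})
    return [f"nullable field '{name}' is not defined in properties"
            for name in schema.get("nullable", []) if name not in props]

def check_nulls(data: dict, schema: dict) -> list[str]:
    """Data rule: null is only accepted for fields listed in 'nullable'."""
    nullable = set(schema.get("nullable", []))
    return [f"field '{name}' is not nullable"
            for name, value in data.items()
            if value is None and name not in nullable]
```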
+271
test/params_validator_test.gleam
··· 334 334 Error(_) -> should.fail() 335 335 } 336 336 } 337 + 338 + // ==================== DATA VALIDATION TESTS ==================== 339 + 340 + // Test valid data with required parameters 341 + pub fn valid_data_with_required_params_test() { 342 + let schema = 343 + json.object([ 344 + #("type", json.string("params")), 345 + #( 346 + "properties", 347 + json.object([ 348 + #("repo", json.object([#("type", json.string("string"))])), 349 + #("limit", json.object([#("type", json.string("integer"))])), 350 + ]), 351 + ), 352 + #( 353 + "required", 354 + json.array([json.string("repo"), json.string("limit")], fn(x) { x }), 355 + ), 356 + ]) 357 + 358 + let data = 359 + json.object([ 360 + #("repo", json.string("alice.bsky.social")), 361 + #("limit", json.int(50)), 362 + ]) 363 + 364 + let assert Ok(c) = context.builder() |> context.build() 365 + params.validate_data(data, schema, c) |> should.be_ok 366 + } 367 + 368 + // Test valid data with optional parameters 369 + pub fn valid_data_with_optional_params_test() { 370 + let schema = 371 + json.object([ 372 + #("type", json.string("params")), 373 + #( 374 + "properties", 375 + json.object([ 376 + #("repo", json.object([#("type", json.string("string"))])), 377 + #("cursor", json.object([#("type", json.string("string"))])), 378 + ]), 379 + ), 380 + #("required", json.array([json.string("repo")], fn(x) { x })), 381 + ]) 382 + 383 + // Data has required param but not optional cursor 384 + let data = json.object([#("repo", json.string("alice.bsky.social"))]) 385 + 386 + let assert Ok(c) = context.builder() |> context.build() 387 + params.validate_data(data, schema, c) |> should.be_ok 388 + } 389 + 390 + // Test valid data with all parameter types 391 + pub fn valid_data_all_types_test() { 392 + let schema = 393 + json.object([ 394 + #("type", json.string("params")), 395 + #( 396 + "properties", 397 + json.object([ 398 + #("name", json.object([#("type", json.string("string"))])), 399 + #("count", json.object([#("type", 
json.string("integer"))])), 400 + #("enabled", json.object([#("type", json.string("boolean"))])), 401 + #("metadata", json.object([#("type", json.string("unknown"))])), 402 + ]), 403 + ), 404 + ]) 405 + 406 + let data = 407 + json.object([ 408 + #("name", json.string("test")), 409 + #("count", json.int(42)), 410 + #("enabled", json.bool(True)), 411 + #("metadata", json.object([#("key", json.string("value"))])), 412 + ]) 413 + 414 + let assert Ok(c) = context.builder() |> context.build() 415 + params.validate_data(data, schema, c) |> should.be_ok 416 + } 417 + 418 + // Test valid data with array parameter 419 + pub fn valid_data_with_array_test() { 420 + let schema = 421 + json.object([ 422 + #("type", json.string("params")), 423 + #( 424 + "properties", 425 + json.object([ 426 + #( 427 + "tags", 428 + json.object([ 429 + #("type", json.string("array")), 430 + #("items", json.object([#("type", json.string("string"))])), 431 + ]), 432 + ), 433 + ]), 434 + ), 435 + ]) 436 + 437 + let data = 438 + json.object([ 439 + #( 440 + "tags", 441 + json.array([json.string("foo"), json.string("bar")], fn(x) { x }), 442 + ), 443 + ]) 444 + 445 + let assert Ok(c) = context.builder() |> context.build() 446 + params.validate_data(data, schema, c) |> should.be_ok 447 + } 448 + 449 + // Test invalid data: missing required parameter 450 + pub fn invalid_data_missing_required_test() { 451 + let schema = 452 + json.object([ 453 + #("type", json.string("params")), 454 + #( 455 + "properties", 456 + json.object([ 457 + #("repo", json.object([#("type", json.string("string"))])), 458 + #("limit", json.object([#("type", json.string("integer"))])), 459 + ]), 460 + ), 461 + #("required", json.array([json.string("repo")], fn(x) { x })), 462 + ]) 463 + 464 + // Data is missing required "repo" parameter 465 + let data = json.object([#("limit", json.int(50))]) 466 + 467 + let assert Ok(c) = context.builder() |> context.build() 468 + params.validate_data(data, schema, c) |> should.be_error 469 + } 
470 + 471 + // Test invalid data: wrong type for parameter 472 + pub fn invalid_data_wrong_type_test() { 473 + let schema = 474 + json.object([ 475 + #("type", json.string("params")), 476 + #( 477 + "properties", 478 + json.object([ 479 + #("limit", json.object([#("type", json.string("integer"))])), 480 + ]), 481 + ), 482 + ]) 483 + 484 + // limit should be integer but is string 485 + let data = json.object([#("limit", json.string("not a number"))]) 486 + 487 + let assert Ok(c) = context.builder() |> context.build() 488 + params.validate_data(data, schema, c) |> should.be_error 489 + } 490 + 491 + // Test invalid data: string exceeds maxLength 492 + pub fn invalid_data_string_too_long_test() { 493 + let schema = 494 + json.object([ 495 + #("type", json.string("params")), 496 + #( 497 + "properties", 498 + json.object([ 499 + #( 500 + "name", 501 + json.object([ 502 + #("type", json.string("string")), 503 + #("maxLength", json.int(5)), 504 + ]), 505 + ), 506 + ]), 507 + ), 508 + ]) 509 + 510 + // name is longer than maxLength of 5 511 + let data = json.object([#("name", json.string("toolongname"))]) 512 + 513 + let assert Ok(c) = context.builder() |> context.build() 514 + params.validate_data(data, schema, c) |> should.be_error 515 + } 516 + 517 + // Test invalid data: integer below minimum 518 + pub fn invalid_data_integer_below_minimum_test() { 519 + let schema = 520 + json.object([ 521 + #("type", json.string("params")), 522 + #( 523 + "properties", 524 + json.object([ 525 + #( 526 + "count", 527 + json.object([ 528 + #("type", json.string("integer")), 529 + #("minimum", json.int(1)), 530 + ]), 531 + ), 532 + ]), 533 + ), 534 + ]) 535 + 536 + // count is below minimum of 1 537 + let data = json.object([#("count", json.int(0))]) 538 + 539 + let assert Ok(c) = context.builder() |> context.build() 540 + params.validate_data(data, schema, c) |> should.be_error 541 + } 542 + 543 + // Test invalid data: array with wrong item type 544 + pub fn 
invalid_data_array_wrong_item_type_test() { 545 + let schema = 546 + json.object([ 547 + #("type", json.string("params")), 548 + #( 549 + "properties", 550 + json.object([ 551 + #( 552 + "ids", 553 + json.object([ 554 + #("type", json.string("array")), 555 + #("items", json.object([#("type", json.string("integer"))])), 556 + ]), 557 + ), 558 + ]), 559 + ), 560 + ]) 561 + 562 + // Array contains strings instead of integers 563 + let data = 564 + json.object([ 565 + #( 566 + "ids", 567 + json.array([json.string("one"), json.string("two")], fn(x) { x }), 568 + ), 569 + ]) 570 + 571 + let assert Ok(c) = context.builder() |> context.build() 572 + params.validate_data(data, schema, c) |> should.be_error 573 + } 574 + 575 + // Test valid data with no properties defined (empty schema) 576 + pub fn valid_data_empty_schema_test() { 577 + let schema = json.object([#("type", json.string("params"))]) 578 + 579 + let data = json.object([]) 580 + 581 + let assert Ok(c) = context.builder() |> context.build() 582 + params.validate_data(data, schema, c) |> should.be_ok 583 + } 584 + 585 + // Test valid data allows unknown parameters not in schema 586 + pub fn valid_data_unknown_parameters_allowed_test() { 587 + let schema = 588 + json.object([ 589 + #("type", json.string("params")), 590 + #( 591 + "properties", 592 + json.object([ 593 + #("repo", json.object([#("type", json.string("string"))])), 594 + ]), 595 + ), 596 + ]) 597 + 598 + // Data has "extra" parameter not in schema 599 + let data = 600 + json.object([ 601 + #("repo", json.string("alice.bsky.social")), 602 + #("extra", json.string("allowed")), 603 + ]) 604 + 605 + let assert Ok(c) = context.builder() |> context.build() 606 + params.validate_data(data, schema, c) |> should.be_ok 607 + }
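Taken together, the params data tests above pin down three behaviors: required parameters must be present, present parameters must match their declared type, and parameters not named in the schema pass through untouched. A compact Python sketch of that dispatch (illustrative only; honk itself is Gleam and its API differs):

```python
def param_type_ok(value, type_name: str) -> bool:
    """Match a value against a params property type."""
    if type_name == "boolean":
        return isinstance(value, bool)
    if type_name == "integer":
        # bool is a subclass of int in Python, so exclude it explicitly
        return isinstance(value, int) and not isinstance(value, bool)
    if type_name == "string":
        return isinstance(value, str)
    return True  # "unknown" (and anything unhandled here) accepts any value

def validate_params(data: dict, schema: dict) -> list[str]:
    errors = [f"missing required parameter '{name}'"
              for name in schema.get("required", []) if name not in data]
    props = schema.get("properties", {})
    for name, value in data.items():
        spec = props.get(name)
        if spec is None:
            continue  # parameters not named in the schema are allowed
        if not param_type_ok(value, spec.get("type", "")):
            errors.append(f"parameter '{name}' has the wrong type")
    return errors
```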
+632 -38
test/union_validator_test.gleam
··· 2 2 import gleeunit 3 3 import gleeunit/should 4 4 import honk/validation/context 5 + import honk/validation/field 5 6 import honk/validation/field/union 6 7 7 8 pub fn main() { 8 9 gleeunit.main() 9 10 } 10 11 11 - // Test valid union schema with refs 12 - pub fn valid_union_schema_test() { 13 - let schema = 14 - json.object([ 15 - #("type", json.string("union")), 16 - #( 17 - "refs", 18 - json.array([json.string("#post"), json.string("#repost")], fn(x) { x }), 19 - ), 20 - ]) 21 - 22 - let assert Ok(ctx) = context.builder() |> context.build 23 - let result = union.validate_schema(schema, ctx) 24 - result |> should.be_ok 25 - } 26 - 27 - // Test union schema with closed flag 28 - pub fn closed_union_schema_test() { 29 - let schema = 30 - json.object([ 31 - #("type", json.string("union")), 32 - #("refs", json.array([json.string("#post")], fn(x) { x })), 33 - #("closed", json.bool(True)), 34 - ]) 35 - 36 - let assert Ok(ctx) = context.builder() |> context.build 37 - let result = union.validate_schema(schema, ctx) 38 - result |> should.be_ok 39 - } 40 - 41 12 // Test open union with empty refs 42 13 pub fn open_union_empty_refs_test() { 43 14 let schema = ··· 75 46 result |> should.be_error 76 47 } 77 48 78 - // Test valid union data with $type 49 + // Test valid union data with $type matching global ref 79 50 pub fn valid_union_data_test() { 80 51 let schema = 81 52 json.object([ 82 53 #("type", json.string("union")), 83 - #("refs", json.array([json.string("app.bsky.feed.post")], fn(x) { x })), 54 + #("refs", json.array([json.string("com.example.post")], fn(x) { x })), 84 55 ]) 85 56 86 57 let data = 87 58 json.object([ 88 - #("$type", json.string("app.bsky.feed.post")), 59 + #("$type", json.string("com.example.post")), 89 60 #("text", json.string("Hello world")), 90 61 ]) 91 62 ··· 99 70 let schema = 100 71 json.object([ 101 72 #("type", json.string("union")), 102 - #("refs", json.array([json.string("#post")], fn(x) { x })), 73 + #("refs", 
json.array([json.string("com.example.post")], fn(x) { x })), 103 74 ]) 104 75 105 76 let data = json.object([#("text", json.string("Hello"))]) ··· 114 85 let schema = 115 86 json.object([ 116 87 #("type", json.string("union")), 117 - #("refs", json.array([json.string("#post")], fn(x) { x })), 88 + #("refs", json.array([json.string("com.example.post")], fn(x) { x })), 118 89 ]) 119 90 120 91 let data = json.string("not an object") ··· 124 95 result |> should.be_error 125 96 } 126 97 127 - // Test union data with $type not in refs 98 + // Test closed union rejects $type not in refs 128 99 pub fn union_data_type_not_in_refs_test() { 129 100 let schema = 130 101 json.object([ 131 102 #("type", json.string("union")), 132 - #("refs", json.array([json.string("app.bsky.feed.post")], fn(x) { x })), 103 + #("refs", json.array([json.string("com.example.typeA")], fn(x) { x })), 133 104 #("closed", json.bool(True)), 134 105 ]) 135 106 136 107 let data = 137 108 json.object([ 138 - #("$type", json.string("app.bsky.feed.repost")), 109 + #("$type", json.string("com.example.typeB")), 110 + #("data", json.string("some data")), 111 + ]) 112 + 113 + let assert Ok(ctx) = context.builder() |> context.build 114 + let result = union.validate_data(data, schema, ctx) 115 + result |> should.be_error 116 + } 117 + 118 + // Test union with invalid ref (non-string in array) 119 + pub fn union_with_invalid_ref_type_test() { 120 + let schema = 121 + json.object([ 122 + #("type", json.string("union")), 123 + #( 124 + "refs", 125 + json.array([json.int(123), json.string("com.example.post")], fn(x) { x }), 126 + ), 127 + ]) 128 + 129 + let assert Ok(ctx) = context.builder() |> context.build 130 + let result = union.validate_schema(schema, ctx) 131 + result |> should.be_error 132 + } 133 + 134 + // Test local ref matching in data validation 135 + pub fn union_data_local_ref_matching_test() { 136 + let schema = 137 + json.object([ 138 + #("type", json.string("union")), 139 + #( 140 + "refs", 141 + 
json.array([json.string("#post"), json.string("#reply")], fn(x) { x }), 142 + ), 143 + ]) 144 + 145 + // Data with $type matching local ref pattern 146 + let data = 147 + json.object([ 148 + #("$type", json.string("post")), 139 149 #("text", json.string("Hello")), 140 150 ]) 141 151 142 152 let assert Ok(ctx) = context.builder() |> context.build 143 153 let result = union.validate_data(data, schema, ctx) 154 + // Should pass because local ref #post matches bare name "post" 155 + result |> should.be_ok 156 + } 157 + 158 + // Test local ref with NSID in data 159 + pub fn union_data_local_ref_with_nsid_test() { 160 + let schema = 161 + json.object([ 162 + #("type", json.string("union")), 163 + #("refs", json.array([json.string("#view")], fn(x) { x })), 164 + ]) 165 + 166 + // Data with $type as full NSID#fragment 167 + let data = 168 + json.object([ 169 + #("$type", json.string("com.example.feed#view")), 170 + #("uri", json.string("at://did:plc:abc/com.example.feed/123")), 171 + ]) 172 + 173 + let assert Ok(ctx) = context.builder() |> context.build 174 + let result = union.validate_data(data, schema, ctx) 175 + // Should pass because local ref #view matches NSID with #view fragment 176 + result |> should.be_ok 177 + } 178 + 179 + // Test multiple local refs in schema 180 + pub fn union_with_multiple_local_refs_test() { 181 + let schema = 182 + json.object([ 183 + #("type", json.string("union")), 184 + #( 185 + "refs", 186 + json.array( 187 + [json.string("#post"), json.string("#repost"), json.string("#reply")], 188 + fn(x) { x }, 189 + ), 190 + ), 191 + ]) 192 + 193 + let assert Ok(ctx) = context.builder() |> context.build 194 + let result = union.validate_schema(schema, ctx) 195 + // In test context without lexicon catalog, local refs are syntactically valid 196 + result |> should.be_ok 197 + } 198 + 199 + // Test mixed global and local refs 200 + pub fn union_with_mixed_refs_test() { 201 + let schema = 202 + json.object([ 203 + #("type", json.string("union")), 204 + 
#( 205 + "refs", 206 + json.array( 207 + [json.string("com.example.post"), json.string("#localDef")], 208 + fn(x) { x }, 209 + ), 210 + ), 211 + ]) 212 + 213 + let assert Ok(ctx) = context.builder() |> context.build 214 + let result = union.validate_schema(schema, ctx) 215 + // In test context without lexicon catalog, both types are syntactically valid 216 + result |> should.be_ok 217 + } 218 + 219 + // Test all primitive types for non-object validation 220 + pub fn union_data_all_non_object_types_test() { 221 + let schema = 222 + json.object([ 223 + #("type", json.string("union")), 224 + #("refs", json.array([json.string("com.example.post")], fn(x) { x })), 225 + ]) 226 + 227 + let assert Ok(ctx) = context.builder() |> context.build 228 + 229 + // Test number 230 + let number_data = json.int(123) 231 + union.validate_data(number_data, schema, ctx) |> should.be_error 232 + 233 + // Test string 234 + let string_data = json.string("not an object") 235 + union.validate_data(string_data, schema, ctx) |> should.be_error 236 + 237 + // Test null 238 + let null_data = json.null() 239 + union.validate_data(null_data, schema, ctx) |> should.be_error 240 + 241 + // Test array 242 + let array_data = json.array([json.string("item")], fn(x) { x }) 243 + union.validate_data(array_data, schema, ctx) |> should.be_error 244 + 245 + // Test boolean 246 + let bool_data = json.bool(True) 247 + union.validate_data(bool_data, schema, ctx) |> should.be_error 248 + } 249 + 250 + // Test empty refs in data validation context 251 + pub fn union_data_empty_refs_test() { 252 + let schema = 253 + json.object([ 254 + #("type", json.string("union")), 255 + #("refs", json.array([], fn(x) { x })), 256 + ]) 257 + 258 + let data = 259 + json.object([ 260 + #("$type", json.string("any.type")), 261 + #("data", json.string("some data")), 262 + ]) 263 + 264 + let assert Ok(ctx) = context.builder() |> context.build 265 + let result = union.validate_data(data, schema, ctx) 266 + // Data validation should 
fail with empty refs array 144 267 result |> should.be_error 145 268 } 269 + 270 + // Test comprehensive reference matching with full lexicon catalog 271 + pub fn union_data_reference_matching_test() { 272 + // Set up lexicons with local, global main, and fragment refs 273 + let main_lexicon = 274 + json.object([ 275 + #("lexicon", json.int(1)), 276 + #("id", json.string("com.example.test")), 277 + #( 278 + "defs", 279 + json.object([ 280 + #( 281 + "main", 282 + json.object([ 283 + #("type", json.string("union")), 284 + #( 285 + "refs", 286 + json.array( 287 + [ 288 + json.string("#localType"), 289 + json.string("com.example.global#main"), 290 + json.string("com.example.types#fragmentType"), 291 + ], 292 + fn(x) { x }, 293 + ), 294 + ), 295 + ]), 296 + ), 297 + #( 298 + "localType", 299 + json.object([ 300 + #("type", json.string("object")), 301 + #("properties", json.object([])), 302 + ]), 303 + ), 304 + ]), 305 + ), 306 + ]) 307 + 308 + let global_lexicon = 309 + json.object([ 310 + #("lexicon", json.int(1)), 311 + #("id", json.string("com.example.global")), 312 + #( 313 + "defs", 314 + json.object([ 315 + #( 316 + "main", 317 + json.object([ 318 + #("type", json.string("object")), 319 + #("properties", json.object([])), 320 + ]), 321 + ), 322 + ]), 323 + ), 324 + ]) 325 + 326 + let types_lexicon = 327 + json.object([ 328 + #("lexicon", json.int(1)), 329 + #("id", json.string("com.example.types")), 330 + #( 331 + "defs", 332 + json.object([ 333 + #( 334 + "fragmentType", 335 + json.object([ 336 + #("type", json.string("object")), 337 + #("properties", json.object([])), 338 + ]), 339 + ), 340 + ]), 341 + ), 342 + ]) 343 + 344 + let assert Ok(builder) = 345 + context.builder() 346 + |> context.with_validator(field.dispatch_data_validation) 347 + |> context.with_lexicons([main_lexicon, global_lexicon, types_lexicon]) 348 + 349 + let assert Ok(ctx) = builder |> context.build() 350 + let ctx = context.with_current_lexicon(ctx, "com.example.test") 351 + 352 + let 
schema = 353 + json.object([ 354 + #("type", json.string("union")), 355 + #( 356 + "refs", 357 + json.array( 358 + [ 359 + json.string("#localType"), 360 + json.string("com.example.global#main"), 361 + json.string("com.example.types#fragmentType"), 362 + ], 363 + fn(x) { x }, 364 + ), 365 + ), 366 + ]) 367 + 368 + // Test local reference match 369 + let local_data = json.object([#("$type", json.string("localType"))]) 370 + union.validate_data(local_data, schema, ctx) |> should.be_ok 371 + 372 + // Test global main reference match 373 + let global_data = 374 + json.object([#("$type", json.string("com.example.global#main"))]) 375 + union.validate_data(global_data, schema, ctx) |> should.be_ok 376 + 377 + // Test global fragment reference match 378 + let fragment_data = 379 + json.object([#("$type", json.string("com.example.types#fragmentType"))]) 380 + union.validate_data(fragment_data, schema, ctx) |> should.be_ok 381 + } 382 + 383 + // Test full schema resolution with constraint validation 384 + pub fn union_data_with_schema_resolution_test() { 385 + let main_lexicon = 386 + json.object([ 387 + #("lexicon", json.int(1)), 388 + #("id", json.string("com.example.feed")), 389 + #( 390 + "defs", 391 + json.object([ 392 + #( 393 + "main", 394 + json.object([ 395 + #("type", json.string("union")), 396 + #( 397 + "refs", 398 + json.array( 399 + [ 400 + json.string("#post"), 401 + json.string("#repost"), 402 + json.string("com.example.types#like"), 403 + ], 404 + fn(x) { x }, 405 + ), 406 + ), 407 + ]), 408 + ), 409 + #( 410 + "post", 411 + json.object([ 412 + #("type", json.string("object")), 413 + #( 414 + "properties", 415 + json.object([ 416 + #( 417 + "title", 418 + json.object([ 419 + #("type", json.string("string")), 420 + #("maxLength", json.int(100)), 421 + ]), 422 + ), 423 + #("content", json.object([#("type", json.string("string"))])), 424 + ]), 425 + ), 426 + #("required", json.array([json.string("title")], fn(x) { x })), 427 + ]), 428 + ), 429 + #( 430 + 
"repost", 431 + json.object([ 432 + #("type", json.string("object")), 433 + #( 434 + "properties", 435 + json.object([ 436 + #("original", json.object([#("type", json.string("string"))])), 437 + #("comment", json.object([#("type", json.string("string"))])), 438 + ]), 439 + ), 440 + #("required", json.array([json.string("original")], fn(x) { x })), 441 + ]), 442 + ), 443 + ]), 444 + ), 445 + ]) 446 + 447 + let types_lexicon = 448 + json.object([ 449 + #("lexicon", json.int(1)), 450 + #("id", json.string("com.example.types")), 451 + #( 452 + "defs", 453 + json.object([ 454 + #( 455 + "like", 456 + json.object([ 457 + #("type", json.string("object")), 458 + #( 459 + "properties", 460 + json.object([ 461 + #("target", json.object([#("type", json.string("string"))])), 462 + #( 463 + "emoji", 464 + json.object([ 465 + #("type", json.string("string")), 466 + #("maxLength", json.int(10)), 467 + ]), 468 + ), 469 + ]), 470 + ), 471 + #("required", json.array([json.string("target")], fn(x) { x })), 472 + ]), 473 + ), 474 + ]), 475 + ), 476 + ]) 477 + 478 + let assert Ok(builder) = 479 + context.builder() 480 + |> context.with_validator(field.dispatch_data_validation) 481 + |> context.with_lexicons([main_lexicon, types_lexicon]) 482 + 483 + let assert Ok(ctx) = builder |> context.build() 484 + let ctx = context.with_current_lexicon(ctx, "com.example.feed") 485 + 486 + let union_schema = 487 + json.object([ 488 + #("type", json.string("union")), 489 + #( 490 + "refs", 491 + json.array( 492 + [ 493 + json.string("#post"), 494 + json.string("#repost"), 495 + json.string("com.example.types#like"), 496 + ], 497 + fn(x) { x }, 498 + ), 499 + ), 500 + ]) 501 + 502 + // Test valid post data (with all required fields) 503 + let valid_post = 504 + json.object([ 505 + #("$type", json.string("post")), 506 + #("title", json.string("My Post")), 507 + #("content", json.string("This is my post content")), 508 + ]) 509 + union.validate_data(valid_post, union_schema, ctx) |> should.be_ok 510 + 
511 + // Test invalid post data (missing required field) 512 + let invalid_post = 513 + json.object([ 514 + #("$type", json.string("post")), 515 + #("content", json.string("This is missing a title")), 516 + ]) 517 + union.validate_data(invalid_post, union_schema, ctx) |> should.be_error 518 + 519 + // Test valid repost data (with all required fields) 520 + let valid_repost = 521 + json.object([ 522 + #("$type", json.string("repost")), 523 + #("original", json.string("original-post-uri")), 524 + #("comment", json.string("Great post!")), 525 + ]) 526 + union.validate_data(valid_repost, union_schema, ctx) |> should.be_ok 527 + 528 + // Test valid like data (global reference with all required fields) 529 + let valid_like = 530 + json.object([ 531 + #("$type", json.string("com.example.types#like")), 532 + #("target", json.string("post-uri")), 533 + #("emoji", json.string("👍")), 534 + ]) 535 + union.validate_data(valid_like, union_schema, ctx) |> should.be_ok 536 + 537 + // Test invalid like data (missing required field) 538 + let invalid_like = 539 + json.object([ 540 + #("$type", json.string("com.example.types#like")), 541 + #("emoji", json.string("👍")), 542 + ]) 543 + union.validate_data(invalid_like, union_schema, ctx) |> should.be_error 544 + } 545 + 546 + // Test open vs closed union comparison 547 + pub fn union_data_open_vs_closed_test() { 548 + let lexicon = 549 + json.object([ 550 + #("lexicon", json.int(1)), 551 + #("id", json.string("com.example.test")), 552 + #( 553 + "defs", 554 + json.object([ 555 + #( 556 + "main", 557 + json.object([ 558 + #("type", json.string("union")), 559 + #("refs", json.array([json.string("#post")], fn(x) { x })), 560 + #("closed", json.bool(False)), 561 + ]), 562 + ), 563 + #( 564 + "post", 565 + json.object([ 566 + #("type", json.string("object")), 567 + #( 568 + "properties", 569 + json.object([ 570 + #("title", json.object([#("type", json.string("string"))])), 571 + ]), 572 + ), 573 + ]), 574 + ), 575 + ]), 576 + ), 577 +
])
578 +
579 +   let assert Ok(builder) =
580 +     context.builder()
581 +     |> context.with_validator(field.dispatch_data_validation)
582 +     |> context.with_lexicons([lexicon])
583 +   let assert Ok(ctx) = builder |> context.build()
584 +   let ctx = context.with_current_lexicon(ctx, "com.example.test")
585 +
586 +   let open_union_schema =
587 +     json.object([
588 +       #("type", json.string("union")),
589 +       #("refs", json.array([json.string("#post")], fn(x) { x })),
590 +       #("closed", json.bool(False)),
591 +     ])
592 +
593 +   let closed_union_schema =
594 +     json.object([
595 +       #("type", json.string("union")),
596 +       #("refs", json.array([json.string("#post")], fn(x) { x })),
597 +       #("closed", json.bool(True)),
598 +     ])
599 +
600 +   // Known $type should work in both
601 +   let known_type =
602 +     json.object([
603 +       #("$type", json.string("post")),
604 +       #("title", json.string("Test")),
605 +     ])
606 +   union.validate_data(known_type, open_union_schema, ctx) |> should.be_ok
607 +   union.validate_data(known_type, closed_union_schema, ctx) |> should.be_ok
608 +
609 +   // Unknown $type - behavior differs between open/closed
610 +   let unknown_type =
611 +     json.object([
612 +       #("$type", json.string("unknown_type")),
613 +       #("data", json.string("test")),
614 +     ])
615 +   // Open union should accept unknown types
616 +   union.validate_data(unknown_type, open_union_schema, ctx) |> should.be_ok
617 +   // Closed union should reject unknown types
618 +   union.validate_data(unknown_type, closed_union_schema, ctx) |> should.be_error
619 + }
620 +
621 + // Test basic union with full lexicon context
622 + pub fn union_data_basic_with_full_context_test() {
623 +   let main_lexicon =
624 +     json.object([
625 +       #("lexicon", json.int(1)),
626 +       #("id", json.string("com.example.test")),
627 +       #(
628 +         "defs",
629 +         json.object([
630 +           #(
631 +             "main",
632 +             json.object([
633 +               #("type", json.string("union")),
634 +               #(
635 +                 "refs",
636 +                 json.array(
637 +                   [
638 +                     json.string("#post"),
639 +                     json.string("#repost"),
640 +                     json.string("com.example.like#main"),
641 +                   ],
642 +                   fn(x) { x },
643 +                 ),
644 +               ),
645 +             ]),
646 +           ),
647 +           #(
648 +             "post",
649 +             json.object([
650 +               #("type", json.string("object")),
651 +               #(
652 +                 "properties",
653 +                 json.object([
654 +                   #("title", json.object([#("type", json.string("string"))])),
655 +                   #("content", json.object([#("type", json.string("string"))])),
656 +                 ]),
657 +               ),
658 +             ]),
659 +           ),
660 +           #(
661 +             "repost",
662 +             json.object([
663 +               #("type", json.string("object")),
664 +               #(
665 +                 "properties",
666 +                 json.object([
667 +                   #("original", json.object([#("type", json.string("string"))])),
668 +                 ]),
669 +               ),
670 +             ]),
671 +           ),
672 +         ]),
673 +       ),
674 +     ])
675 +
676 +   let like_lexicon =
677 +     json.object([
678 +       #("lexicon", json.int(1)),
679 +       #("id", json.string("com.example.like")),
680 +       #(
681 +         "defs",
682 +         json.object([
683 +           #(
684 +             "main",
685 +             json.object([
686 +               #("type", json.string("object")),
687 +               #(
688 +                 "properties",
689 +                 json.object([
690 +                   #("target", json.object([#("type", json.string("string"))])),
691 +                 ]),
692 +               ),
693 +             ]),
694 +           ),
695 +         ]),
696 +       ),
697 +     ])
698 +
699 +   let assert Ok(builder) =
700 +     context.builder()
701 +     |> context.with_validator(field.dispatch_data_validation)
702 +     |> context.with_lexicons([main_lexicon, like_lexicon])
703 +
704 +   let assert Ok(ctx) = builder |> context.build()
705 +   let ctx = context.with_current_lexicon(ctx, "com.example.test")
706 +
707 +   let schema =
708 +     json.object([
709 +       #("type", json.string("union")),
710 +       #(
711 +         "refs",
712 +         json.array(
713 +           [
714 +             json.string("#post"),
715 +             json.string("#repost"),
716 +             json.string("com.example.like#main"),
717 +           ],
718 +           fn(x) { x },
719 +         ),
720 +       ),
721 +     ])
722 +
723 +   // Valid union data with local reference
724 +   let post_data =
725 +     json.object([
726 +       #("$type", json.string("post")),
727 +       #("title", json.string("My Post")),
728 +       #("content", json.string("Post content")),
729 +     ])
730 +   union.validate_data(post_data, schema, ctx) |> should.be_ok
731 +
732 +   // Valid union data with global reference
733 +   let like_data =
734 +     json.object([
735 +       #("$type", json.string("com.example.like#main")),
736 +       #("target", json.string("some-target")),
737 +     ])
738 +   union.validate_data(like_data, schema, ctx) |> should.be_ok
739 + }