# Vite SSR & Environment API Implementation Plan

**For Claude:** REQUIRED SUB-SKILL: Use superpowers:executing-plans to implement this plan task-by-task.

**Goal:** Replace `tsx watch` with the Vite 8 Environment API for in-process HMR, rewrite the server to Web Standard Request/Response, add framework-agnostic SSR, and produce a single deployable artifact.

**Architecture:** The server becomes a pure `(Request) → Response` function. A ~50-line Node.js adapter bridges it to `createServer` for production. In dev, Vite's `RunnableDevEnvironment` runs the handler with HMR. `defineRenderer` provides framework-agnostic SSR. `vite build` produces client assets plus a server entry in two stages.

**Tech stack:** Vite 8 Environment API, Web Standard Request/Response, TypeScript

**Working directory:** /Users/chadmiller/code/hatk/.worktrees/server-directory
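The target shape can be sketched in a few lines. Everything in this sketch (the route, the payload, the `Handler` alias) is illustrative, not hatk's actual API:

```typescript
// The whole server is one pure async function: Request in, Response out.
// No Node.js types leak into application code; only the adapter knows about node:http.
type Handler = (request: Request) => Promise<Response>

const handler: Handler = async (request) => {
  const url = new URL(request.url)
  if (url.pathname === '/_health') {
    return new Response(JSON.stringify({ ok: true }), {
      headers: { 'Content-Type': 'application/json' },
    })
  }
  return new Response('Not Found', { status: 404 })
}
```

In production the ~50-line adapter (Task 1) feeds this function to `node:http`; in dev, Vite's middleware calls it directly, so the same function runs unchanged in both modes.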
## Task 1: Node.js Request/Response adapter

Create the ~50-line bridge between Node.js HTTP and the Web Standard APIs. This is the foundation everything else builds on.

**Files:**
- Create: `packages/hatk/src/adapter.ts`

**Step 1: Write adapter.ts**
```ts
import { type IncomingMessage, type ServerResponse, createServer } from 'node:http'

/**
 * Convert a Node.js IncomingMessage to a Web Standard Request.
 */
export function toRequest(req: IncomingMessage, base: string): Request {
  const url = new URL(req.url!, base)
  const headers = new Headers()
  for (const [key, value] of Object.entries(req.headers)) {
    if (value) {
      if (Array.isArray(value)) {
        for (const v of value) headers.append(key, v)
      } else {
        headers.set(key, value)
      }
    }
  }
  const init: RequestInit = {
    method: req.method,
    headers,
  }
  // GET and HEAD requests cannot have a body
  if (req.method !== 'GET' && req.method !== 'HEAD') {
    // @ts-expect-error — Node.js streams are valid body sources
    init.body = req
    // Required whenever a Request body is a stream
    init.duplex = 'half'
  }
  return new Request(url.href, init)
}

/**
 * Pipe a Web Standard Response back to a Node.js ServerResponse.
 */
export async function sendResponse(res: ServerResponse, response: Response): Promise<void> {
  res.writeHead(response.status, Object.fromEntries(response.headers.entries()))
  if (!response.body) {
    res.end()
    return
  }
  const reader = response.body.getReader()
  try {
    while (true) {
      const { done, value } = await reader.read()
      if (done) break
      res.write(value)
    }
  } finally {
    reader.releaseLock()
    res.end()
  }
}

/**
 * Create a Node.js HTTP server from a Web Standard fetch handler.
 */
export function serve(
  handler: (request: Request) => Promise<Response>,
  port: number,
  base?: string,
) {
  const origin = base || `http://localhost:${port}`
  const server = createServer(async (req, res) => {
    try {
      const request = toRequest(req, origin)
      const response = await handler(request)
      await sendResponse(res, response)
    } catch (err: any) {
      if (!res.headersSent) {
        res.writeHead(500, { 'Content-Type': 'application/json' })
        res.end(JSON.stringify({ error: err.message }))
      } else {
        // Headers (and possibly part of the body) already went out —
        // appending a JSON error would corrupt the stream, so just terminate.
        res.end()
      }
    }
  })
  server.listen(port)
  return server
}
```
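The header loop is the subtle part of `toRequest`: Node hands us `string | string[] | undefined` values, while `Headers` wants one append per value. A standalone sketch of just that conversion (the `toHeaders` name is hypothetical, introduced here for illustration):

```typescript
import type { IncomingHttpHeaders } from 'node:http'

// Hypothetical helper mirroring the header loop inside toRequest.
function toHeaders(raw: IncomingHttpHeaders): Headers {
  const headers = new Headers()
  for (const [key, value] of Object.entries(raw)) {
    if (value === undefined) continue // Node drops some malformed headers
    if (Array.isArray(value)) {
      // Repeated headers (notably set-cookie) arrive as arrays; append each one
      for (const v of value) headers.append(key, v)
    } else {
      headers.set(key, value)
    }
  }
  return headers
}
```

Per the fetch spec, `Headers.get()` joins repeated values with `", "`, which is why the multi-value case must use `append` rather than `set`.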
**Step 2: Verify build**

Run: `cd packages/hatk && npx tsc -p tsconfig.build.json --noEmit`
Expected: No errors

**Step 3: Commit**

```sh
git add packages/hatk/src/adapter.ts
git commit -m "feat: add Node.js Request/Response adapter"
```
## Task 2: Response helper functions

Create the utility functions the rewritten server will use. These replace the old `jsonResponse`/`jsonError`/`sendJson` functions.

**Files:**
- Create: `packages/hatk/src/response.ts`

**Step 1: Write response.ts**
```ts
import { gzipSync } from 'node:zlib'
import { normalizeValue } from './database/db.ts'

/**
 * Create a JSON Response with optional gzip compression.
 * Mirrors the old jsonResponse/sendJson behavior.
 */
export function json(data: unknown, status = 200, acceptEncoding?: string | null): Response {
  const body = Buffer.from(JSON.stringify(data, (_, v) => normalizeValue(v)))
  if (body.length > 1024 && acceptEncoding && /\bgzip\b/.test(acceptEncoding)) {
    const compressed = gzipSync(body)
    return new Response(compressed, {
      status,
      headers: {
        'Content-Type': 'application/json',
        'Content-Encoding': 'gzip',
        'Vary': 'Accept-Encoding',
        ...(status === 200 ? { 'Cache-Control': 'no-store' } : {}),
      },
    })
  }
  return new Response(body, {
    status,
    headers: {
      'Content-Type': 'application/json',
      ...(status === 200 ? { 'Cache-Control': 'no-store' } : {}),
    },
  })
}

/** Create a JSON error Response. */
export function jsonError(status: number, message: string, acceptEncoding?: string | null): Response {
  return json({ error: message }, status, acceptEncoding)
}

/** CORS preflight Response. */
export function cors(): Response {
  return new Response(null, {
    status: 200,
    headers: {
      'Access-Control-Allow-Origin': '*',
      'Access-Control-Allow-Headers': '*',
      'Access-Control-Allow-Methods': 'GET, POST, OPTIONS',
    },
  })
}

/** Add CORS headers to an existing Response. */
export function withCors(response: Response): Response {
  const headers = new Headers(response.headers)
  headers.set('Access-Control-Allow-Origin', '*')
  headers.set('Access-Control-Allow-Headers', '*')
  headers.set('Access-Control-Allow-Methods', 'GET, POST, OPTIONS')
  return new Response(response.body, {
    status: response.status,
    statusText: response.statusText,
    headers,
  })
}

/** Create a static file Response with correct MIME type. */
export function file(content: Buffer | Uint8Array, contentType: string, cacheControl?: string): Response {
  return new Response(content, {
    status: 200,
    headers: {
      'Content-Type': contentType,
      ...(cacheControl ? { 'Cache-Control': cacheControl } : {}),
    },
  })
}

/** 404 Not Found. */
export function notFound(): Response {
  return new Response('Not Found', { status: 404 })
}
```
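A `Response`'s headers are immutable once constructed, so "adding" CORS headers means copying into a fresh `Headers` and rebuilding the Response around the same body stream. A standalone sanity check of that pattern, runnable on Node 18+ (it inlines a copy of `withCors` so it stands alone):

```typescript
// Inline copy of the withCors pattern, for a self-contained check.
function withCors(response: Response): Response {
  const headers = new Headers(response.headers)
  headers.set('Access-Control-Allow-Origin', '*')
  headers.set('Access-Control-Allow-Headers', '*')
  headers.set('Access-Control-Allow-Methods', 'GET, POST, OPTIONS')
  return new Response(response.body, {
    status: response.status,
    statusText: response.statusText,
    headers,
  })
}

const wrapped = withCors(
  new Response('{"ok":true}', {
    status: 201,
    headers: { 'Content-Type': 'application/json' },
  }),
)
```

The status, statusText, original headers, and body all survive the wrap; only the three CORS headers are added.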
**Step 2: Verify build**

Run: `cd packages/hatk && npx tsc -p tsconfig.build.json --noEmit`
Expected: No errors

**Step 3: Commit**

```sh
git add packages/hatk/src/response.ts
git commit -m "feat: add Web Standard Response helper functions"
```
## Task 3: Rewrite server.ts to Request → Response

This is the biggest task. Rewrite the 1200-line `startServer` function as a pure `createHandler` function that returns `(Request) → Promise<Response>`. Every `res.writeHead()`/`res.end()` becomes a `return new Response(...)`. Every `readBody(req)` becomes `await request.text()`.

**Files:**
- Modify: `packages/hatk/src/server.ts` (complete rewrite)

**Step 1: Understand the mapping**
| Old pattern | New pattern |
|---|---|
| `readBody(req)` | `await request.text()` |
| `readBodyRaw(req)` | `Buffer.from(await request.arrayBuffer())` |
| `req.url` | `request.url` (already a full URL string) |
| `req.method` | `request.method` |
| `req.headers['authorization']` | `request.headers.get('authorization')` |
| `jsonResponse(res, data)` | `return json(data, 200, acceptEncoding)` |
| `jsonError(res, 400, 'msg')` | `return jsonError(400, 'msg', acceptEncoding)` |
| `res.writeHead(200, {...}); res.end(buf)` | `return file(buf, 'image/png', 'public, max-age=300')` |
| `url.searchParams.get(...)` | Same (URL is constructed from `request.url`) |
**Step 2: Rewrite the file**

The new structure:

```ts
import { json, jsonError, cors, withCors, file, notFound } from './response.ts'
// ... existing imports minus createServer, IncomingMessage, gzipSync

export interface HandlerConfig {
  collections: string[]
  publicDir: string | null
  oauth: OAuthConfig | null
  admins: string[]
  renderer?: (request: Request, manifest: any) => Promise<{ html: string; head?: string }>
  resolveViewer?: (request: Request) => { did: string } | null
  onResync?: () => void
}

/**
 * Create a Web Standard request handler for all hatk routes.
 * Returns a pure function: (Request) → Promise<Response>
 */
export function createHandler(config: HandlerConfig): (request: Request) => Promise<Response> {
  const { collections, publicDir, oauth, admins } = config
  const devMode = process.env.DEV_MODE === '1'
  const coreXrpc = (method: string) => `/xrpc/dev.hatk.${method}`

  return async (request: Request): Promise<Response> => {
    const url = new URL(request.url)
    const acceptEncoding = request.headers.get('accept-encoding')

    // CORS preflight
    if (request.method === 'OPTIONS') return cors()

    // ... all the existing route handlers, converted to return Response objects
    // Each `jsonResponse(res, data)` becomes `return withCors(json(data, 200, acceptEncoding))`
    // Each `jsonError(res, status, msg)` becomes `return withCors(jsonError(status, msg, acceptEncoding))`

    return notFound()
  }
}

// Keep startServer as a thin wrapper for backward compatibility during migration
export { serve } from './adapter.ts'
```
Key conversion rules for each route:

- Auth: `req.headers['authorization']` → `request.headers.get('authorization')`
- Body: `readBody(req)` → `await request.text()`
- Binary body: `readBodyRaw(req)` → `Buffer.from(await request.arrayBuffer())`
- JSON response: `jsonResponse(res, data); return` → `return withCors(json(data, 200, acceptEncoding))`
- Error: `jsonError(res, 400, msg); return` → `return withCors(jsonError(400, msg, acceptEncoding))`
- Binary response: `res.writeHead(200, ...); res.end(png)` → `return withCors(file(png, 'image/png', 'public, max-age=300'))`
- HTML response: `res.writeHead(200, ...); res.end(html)` → `return withCors(file(Buffer.from(html), 'text/html'))`
- Static files: Same pattern as HTML but with the appropriate MIME type
- SPA fallback with OG meta: Read index.html, inject OG meta, return as an HTML Response
- Request origin: `req.headers['x-forwarded-proto']` → `request.headers.get('x-forwarded-proto')`
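To make the mechanical conversion concrete, here is one hypothetical route in the new style. The route name and payload are illustrative, and the helpers are inlined as minimal stand-ins (no gzip or CORS branch) so the sketch runs standalone:

```typescript
// Minimal inline stand-ins for the json/jsonError helpers from response.ts.
const json = (data: unknown, status = 200) =>
  new Response(JSON.stringify(data), {
    status,
    headers: { 'Content-Type': 'application/json' },
  })
const jsonError = (status: number, message: string) => json({ error: message }, status)

// Old style (mutates res, returns nothing):
//   if (url.pathname === coreXrpc('getInfo')) { jsonResponse(res, { version }); return }
// New style: the route is a pure expression that returns a Response.
async function handleGetInfo(request: Request): Promise<Response> {
  if (request.method !== 'GET') return jsonError(405, 'Method not allowed')
  return json({ version: '0.0.0' }) // version value is illustrative
}
```

Because every branch *returns* a Response, a route can no longer accidentally fall through after writing, which is a class of bug the old `res`-mutating style allowed.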
**Step 3: Handle the requireAdmin helper**

Convert it from mutating `res` to returning a `Response` when auth fails and `null` when it succeeds:

```ts
function requireAdmin(viewer: { did: string } | null, acceptEncoding: string | null): Response | null {
  if (!viewer) return withCors(jsonError(401, 'Authentication required', acceptEncoding))
  if (!devMode && !admins.includes(viewer.did)) return withCors(jsonError(403, 'Admin access required', acceptEncoding))
  return null // auth OK
}
```

Usage: `const denied = requireAdmin(viewer, acceptEncoding); if (denied) return denied`
**Step 4: Handle the viewer authentication**

Convert it from reading `req.headers` to reading `request.headers`:

```ts
let viewer: { did: string } | null = config.resolveViewer?.(request) ?? null
if (!viewer && oauth) {
  try {
    viewer = await authenticate(
      request.headers.get('authorization'),
      request.headers.get('dpop'),
      request.method,
      `${requestOrigin}${url.pathname}`,
    )
  } catch (err: any) {
    emit('oauth', 'authenticate_error', { error: err.message })
  }
}
```
**Step 5: Convert OAuth routes**

The OAuth routes (`/oauth/par`, `/oauth/token`, `/oauth/jwks`, etc.) follow the same pattern. The key ones:

- `handlePar` currently receives `(body, origin)` — keep this, just get the body from `request.text()`
- `handleToken` — same
- `handleCallback` receives `(code, iss, state)` from URL params

**Step 6: Convert all remaining routes**

Work through every `if (url.pathname === ...)` block systematically. The conversion is mechanical — just apply the mapping from Step 1.

**Step 7: Keep proxyToPds and proxyToPdsRaw unchanged**

These functions already use the fetch API internally. They return `{ ok, status, body }` objects. The caller just wraps the result in a Response.
**Step 8: Remove the old startServer function**

Replace it with:

```ts
export function startServer(
  port: number,
  collections: string[],
  publicDir: string | null,
  oauth: OAuthConfig | null,
  admins: string[] = [],
  resolveViewer?: (request: Request) => { did: string } | null,
  onResync?: () => void,
): import('node:http').Server {
  const handler = createHandler({ collections, publicDir, oauth, admins, resolveViewer, onResync })
  return serve(handler, port)
}
```
**Step 9: Verify build**

Run: `cd packages/hatk && npx tsc -p tsconfig.build.json --noEmit`
Expected: No errors

**Step 10: Commit**

```sh
git add packages/hatk/src/server.ts
git commit -m "feat: rewrite server.ts to Web Standard Request/Response"
```
## Task 4: defineRenderer and SSR assembly

Add the renderer define function and the logic that assembles SSR output into an HTML page.

**Files:**
- Create: `packages/hatk/src/renderer.ts`
- Modify: `packages/hatk/src/server.ts` (add SSR rendering to the SPA fallback route)

**Step 1: Write renderer.ts**
```ts
import { log } from './logger.ts'

export interface SSRManifest {
  getPreloadTags(url: string): string
}

export interface RenderResult {
  html: string
  head?: string
}

export type RendererHandler = (request: Request, manifest: SSRManifest) => Promise<RenderResult>

let renderer: RendererHandler | null = null
let ssrManifest: SSRManifest | null = null

export function defineRenderer(handler: RendererHandler) {
  return { __type: 'renderer' as const, handler }
}

export function registerRenderer(handler: RendererHandler): void {
  renderer = handler
  log('[renderer] SSR renderer registered')
}

export function setSSRManifest(manifest: SSRManifest): void {
  ssrManifest = manifest
}

export function getRenderer(): RendererHandler | null {
  return renderer
}

export function getSSRManifest(): SSRManifest | null {
  return ssrManifest
}

/**
 * Render an HTML page by calling the user's renderer and assembling the result
 * into the index.html template.
 *
 * @param template - The index.html content (with <!--ssr-outlet--> placeholder)
 * @param request - The incoming Request
 * @param ogMeta - Optional OG meta tags to inject
 * @returns Assembled HTML string, or null if no renderer is registered
 */
export async function renderPage(
  template: string,
  request: Request,
  ogMeta?: string | null,
): Promise<string | null> {
  if (!renderer) return null
  const manifest = ssrManifest || { getPreloadTags: () => '' }
  const result = await renderer(request, manifest)
  let html = template
  // Inject SSR head tags (preloads, styles)
  if (result.head) {
    html = html.replace('</head>', `${result.head}\n</head>`)
  }
  // Inject OG meta tags
  if (ogMeta) {
    html = html.replace('</head>', `${ogMeta}\n</head>`)
  }
  // Inject rendered HTML into the outlet
  html = html.replace('<!--ssr-outlet-->', result.html)
  return html
}
```
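The assembly contract is just string replacement on two anchors, `</head>` and `<!--ssr-outlet-->`. A standalone check of that contract (this inlines a minimal copy of the assembly logic, without the module-level renderer state; the `assemble` name is hypothetical):

```typescript
// Minimal copy of the renderPage assembly logic, parameterized for testing.
async function assemble(
  template: string,
  render: () => Promise<{ html: string; head?: string }>,
  ogMeta?: string | null,
): Promise<string> {
  const result = await render()
  let html = template
  if (result.head) html = html.replace('</head>', `${result.head}\n</head>`)
  if (ogMeta) html = html.replace('</head>', `${ogMeta}\n</head>`)
  return html.replace('<!--ssr-outlet-->', result.html)
}

const template = '<html><head></head><body><div id="app"><!--ssr-outlet--></div></body></html>'
const page = await assemble(
  template,
  async () => ({ html: '<h1>hi</h1>', head: '<link rel="modulepreload" href="/x.js">' }),
  '<meta property="og:title" content="t">',
)
```

Note both injections target the first `</head>` occurrence, so head tags and OG meta both end up inside the document head regardless of order.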
**Step 2: Wire renderer into server.ts SPA fallback**

In the SPA fallback section of `createHandler` (the part that serves index.html), add SSR rendering:

```ts
// SSR or SPA fallback for HTML requests
if (publicDir && request.headers.get('accept')?.includes('text/html')) {
  const template = await readFile(join(publicDir, 'index.html'), 'utf-8')
  const ogMeta = buildOgMeta(url.pathname, requestOrigin)

  // Try SSR first
  const renderedHtml = await renderPage(template, request, ogMeta)
  if (renderedHtml) {
    return withCors(file(Buffer.from(renderedHtml), 'text/html'))
  }

  // SPA fallback — inject OG meta only
  let html = template
  if (ogMeta) {
    html = html.replace('</head>', `${ogMeta}\n</head>`)
  }
  return withCors(file(Buffer.from(html), 'text/html'))
}
```
**Step 3: Add renderer to scanner**

Update `packages/hatk/src/scanner.ts` to handle `__type: 'renderer'` and add it to `ScanResult`.
Update `packages/hatk/src/server-init.ts` to call `registerRenderer` for scanned renderer modules.

**Step 4: Export defineRenderer**

Add to the `packages/hatk/src/cli.ts` codegen:

```ts
out += `export { defineRenderer } from '@hatk/hatk/renderer'\n`
```

Add `"./renderer": "./dist/renderer.js"` to the `packages/hatk/package.json` exports.

**Step 5: Verify build**

Run: `cd packages/hatk && npx tsc -p tsconfig.build.json --noEmit`
Expected: No errors

**Step 6: Commit**

```sh
git add packages/hatk/src/renderer.ts packages/hatk/src/server.ts packages/hatk/src/scanner.ts packages/hatk/src/server-init.ts packages/hatk/src/cli.ts packages/hatk/package.json
git commit -m "feat: add defineRenderer and SSR assembly"
```
## Task 5: Rewrite Vite plugin with Environment API

Replace the tsx watch child process + proxy approach with Vite 8's `RunnableDevEnvironment` and middleware.

**Files:**
- Modify: `packages/hatk/src/vite-plugin.ts` (complete rewrite)

**Step 1: Read Vite 8 Environment API types**

Before writing, read:
- Vite's `Plugin` type
- `RunnableDevEnvironment` / `isRunnableDevEnvironment`
- `DevEnvironment` and its `runner` property
- How `configureServer` hooks work with environments

**Step 2: Rewrite vite-plugin.ts**
```ts
import type { Plugin, ViteDevServer } from 'vite'
import { resolve } from 'node:path'

export function hatk(opts?: { port?: number }): Plugin {
  const devPort = opts?.port ?? 3000
  let handler: ((request: Request) => Promise<Response>) | null = null

  return {
    name: 'vite-plugin-hatk',

    config() {
      return {
        environments: {
          hatk: {
            // RunnableDevEnvironment for in-process module execution
            dev: {
              optimizeDeps: {
                // Externalize native modules
                exclude: ['better-sqlite3', '@duckdb/node-api'],
              },
            },
            build: {
              outDir: 'dist/server',
              ssr: true,
              rollupOptions: {
                external: ['better-sqlite3', '@duckdb/node-api'],
              },
            },
          },
        },
        server: {
          host: '127.0.0.1',
          port: devPort,
          watch: {
            ignored: ['**/db/**', '**/data/**'],
          },
        },
        test: {
          projects: [
            {
              test: {
                name: 'unit',
                include: ['test/server/**/*.test.ts', 'test/feeds/**/*.test.ts', 'test/xrpc/**/*.test.ts'],
              },
            },
            {
              test: {
                name: 'integration',
                include: ['test/integration/**/*.test.ts'],
              },
            },
          ],
        },
      }
    },

    configureServer(server: ViteDevServer) {
      // Boot hatk infrastructure and load handlers through the module runner.
      // Return a function so our middleware runs AFTER Vite's internal middleware
      // (this way Vite handles client assets, we handle backend routes).
      return async () => {
        // Import the boot module through the hatk environment's runner
        const env = server.environments.hatk
        if (!env || !('runner' in env)) {
          console.error('[hatk] hatk environment not available')
          return
        }
        // Load the hatk boot module — this initializes DB, indexer, OAuth, scans server/
        const mainPath = resolve(import.meta.dirname!, 'dev-entry.js')
        const mod = await (env as any).runner.import(mainPath)
        handler = mod.handler

        // Mount hatk as middleware for backend routes
        server.middlewares.use(async (req, res, next) => {
          const url = new URL(req.url!, `http://localhost:${devPort}`)
          // Only handle backend routes — let Vite handle everything else
          const isBackend =
            url.pathname.startsWith('/xrpc/') ||
            url.pathname.startsWith('/oauth/') ||
            url.pathname.startsWith('/.well-known/') ||
            url.pathname.startsWith('/og/') ||
            url.pathname.startsWith('/admin') ||
            url.pathname === '/_health' ||
            url.pathname === '/info' ||
            url.pathname === '/repos' ||
            url.pathname === '/robots.txt'
          if (!isBackend || !handler) {
            next()
            return
          }
          try {
            const { toRequest, sendResponse } = await import('./adapter.js')
            const request = toRequest(req, `http://localhost:${devPort}`)
            const response = await handler(request)
            await sendResponse(res, response)
          } catch (err: any) {
            console.error('[hatk]', err.message)
            next(err)
          }
        })
      }
    },

    // Handle HMR for server/ files in the hatk environment
    hotUpdate({ file, server, modules }) {
      if (file.includes('/server/')) {
        // Invalidate modules in the hatk environment
        const env = server.environments.hatk
        if (env) {
          for (const mod of modules) {
            env.moduleGraph.invalidateModule(mod)
          }
          // Re-import the entry to pick up changes
          // The handler reference will be updated on next request
        }
      }
    },

    // Two-stage production build
    async buildApp(builder) {
      // Stage 1: Build client
      await builder.build(builder.environments.client)
      // Stage 2: Build hatk server
      await builder.build(builder.environments.hatk)
    },
  }
}
```
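The routing split is the part worth getting exactly right: anything the predicate misses falls through to Vite, and anything it over-matches steals requests from the client dev server. A standalone version of the predicate for testing, with the prefix and exact-match lists pulled out as data (the extracted names are hypothetical):

```typescript
// Backend-route predicate, extracted from the middleware sketch above.
const BACKEND_PREFIXES = ['/xrpc/', '/oauth/', '/.well-known/', '/og/', '/admin']
const BACKEND_EXACT = new Set(['/_health', '/info', '/repos', '/robots.txt'])

function isBackendPath(pathname: string): boolean {
  return (
    BACKEND_PREFIXES.some((prefix) => pathname.startsWith(prefix)) ||
    BACKEND_EXACT.has(pathname)
  )
}
```

Keeping the lists as data makes it cheap to unit-test the split and to extend it when new backend routes are added.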
**Step 3: Create dev-entry.ts**

This is the entry module loaded by the module runner in dev mode. It boots infrastructure and exports the handler.

Create `packages/hatk/src/dev-entry.ts`:
```ts
/**
 * Dev mode entry point — loaded through Vite's module runner.
 * Boots hatk infrastructure and exports the fetch handler.
 */
import { loadConfig } from './config.ts'
import { loadLexicons, storeLexicons, discoverCollections, buildSchemas } from './database/schema.ts'
import { discoverViews } from './views.ts'
import { initDatabase, migrateSchema } from './database/db.ts'
import { createAdapter } from './database/adapter-factory.ts'
import { getDialect } from './database/dialect.ts'
import { setSearchPort } from './database/fts.ts'
import { configureRelay } from './xrpc.ts'
import { initOAuth } from './oauth/server.ts'
import { initServer } from './server-init.ts'
import { createHandler } from './server.ts'
import { startIndexer } from './indexer.ts'
import { getCursor } from './database/db.ts'
import { runBackfill } from './backfill.ts'
import { rebuildAllIndexes } from './database/fts.ts'
import { relayHttpUrl } from './config.ts'
import { validateLexicons } from '@bigmoves/lexicon'
import { log } from './logger.ts'
import { mkdirSync } from 'node:fs'
import { dirname, resolve } from 'node:path'

// Boot sequence (mirrors main.ts but exports handler instead of starting server)
const configPath = 'hatk.config.ts'
const configDir = dirname(resolve(configPath))
const config = await loadConfig(configPath)
configureRelay(config.relay)

const lexicons = loadLexicons(resolve(configDir, 'lexicons'))
const lexiconErrors = validateLexicons([...lexicons.values()])
if (lexiconErrors) {
  for (const [nsid, errors] of Object.entries(lexiconErrors)) {
    for (const err of errors) console.error(`Invalid lexicon ${nsid}: ${err}`)
  }
  throw new Error('Invalid lexicons')
}
storeLexicons(lexicons)

const collections = config.collections.length > 0 ? config.collections : discoverCollections(lexicons)
discoverViews()

const engineDialect = getDialect(config.databaseEngine)
const { schemas, ddlStatements } = buildSchemas(lexicons, collections, engineDialect)
if (config.database !== ':memory:') {
  mkdirSync(dirname(config.database), { recursive: true })
}
const { adapter, searchPort } = await createAdapter(config.databaseEngine)
setSearchPort(searchPort)
await initDatabase(adapter, config.database, schemas, ddlStatements)
await migrateSchema(schemas)

// Initialize handlers from server/ directory
await initServer(resolve(configDir, 'server'))

if (config.oauth) {
  await initOAuth(config.oauth, config.plc, config.relay)
}

// Start indexer
const collectionSet = new Set(collections)
const cursor = await getCursor('relay')
startIndexer({
  relayUrl: config.relay,
  collections: collectionSet,
  signalCollections: config.backfill.signalCollections ? new Set(config.backfill.signalCollections) : undefined,
  pinnedRepos: config.backfill.repos ? new Set(config.backfill.repos) : undefined,
  cursor,
  fetchTimeout: config.backfill.fetchTimeout,
  maxRetries: config.backfill.maxRetries,
  parallelism: config.backfill.parallelism,
  ftsRebuildInterval: config.ftsRebuildInterval,
})

// Run backfill in background (no restart in dev mode)
runBackfill({
  pdsUrl: relayHttpUrl(config.relay),
  plcUrl: config.plc,
  collections: collectionSet,
  config: config.backfill,
}).then(() => rebuildAllIndexes(Array.from(collectionSet)))
  .catch((err) => console.error('[backfill]', err.message))

// Export the handler for Vite middleware
export const handler = createHandler({
  collections: Array.from(collectionSet),
  publicDir: null, // Vite serves static assets in dev
  oauth: config.oauth,
  admins: config.admins,
})

log(`[hatk] Dev server ready`)
log(`  Relay: ${config.relay}`)
log(`  Database: ${config.database}`)
log(`  Collections: ${collections.join(', ')}`)
```
**Step 4: Update main.ts for production**

Modify `packages/hatk/src/main.ts` to use `createHandler` plus the serve adapter. Replace the `startServer(...)` call near the end with:

```ts
import { createHandler } from './server.ts'
import { serve } from './adapter.ts'

const handler = createHandler({
  collections,
  publicDir: config.publicDir,
  oauth: config.oauth,
  admins: config.admins,
  onResync: runBackfillAndRestart,
})
serve(handler, config.port)
```
**Step 5: Verify build**

Run: `cd packages/hatk && npx tsc -p tsconfig.build.json --noEmit`
Expected: No errors

**Step 6: Commit**

```sh
git add packages/hatk/src/vite-plugin.ts packages/hatk/src/dev-entry.ts packages/hatk/src/main.ts
git commit -m "feat: rewrite Vite plugin with Environment API and dev entry"
```
## Task 6: Update Vite peer dependency

hatk needs to declare Vite 8 as a peer dependency.

**Files:**
- Modify: `packages/hatk/package.json`

**Step 1: Update peer dependency**

Add to package.json:

```json
"peerDependencies": {
  "vite": "^8.0.0"
}
```

Keep `vite` in devDependencies as well, so the package itself can still build and test locally while consumers provide their own Vite.

**Step 2: Commit**

```sh
git add packages/hatk/package.json
git commit -m "feat: require Vite 8 as peer dependency"
```
## Task 7: Test with statusphere template

Validate the full dev and SSR flow by converting the statusphere template.

**Files:**
- Work in: /Users/chadmiller/code/hatk-template-statusphere

**Step 1: Add entry-server and entry-client (if using SSR)**

For now, statusphere can skip SSR and just validate that dev mode works without tsx watch:

- Ensure the `server/` directory exists (from Tasks 11-12 of the prior plan)
- Ensure `vite.config.ts` uses the `hatk()` plugin
- Run `npm run dev` and verify:
  - No child process spawned
  - XRPC routes work
  - Feeds work
  - OAuth works
  - Frontend hot reloads
  - Editing a server/ file reloads the handler

**Step 2: Test SSR (optional)**

If we want to validate SSR with React or Svelte:

- Add `src/entry-server.tsx` and `src/entry-client.tsx`
- Add `server/render.tsx` with `defineRenderer`
- Add `<!--ssr-outlet-->` to index.html
- Verify server-rendered HTML appears on initial load
- Verify client hydration takes over

**Step 3: Test production build**

```sh
npm run build
node dist/server/index.js
```

Verify: static assets served, API routes work, SSR renders (if configured).
## Task 8: Test with teal template

Same as Task 7 but for the more complex teal template. Validates:

- Multiple feeds
- Complex XRPC queries
- OG image generation
- Setup scripts
- Label definitions

This task is validation only — no new code, just running the templates against the new server.