# Ableton Live Set (.als) XML Structure Documentation

*Generated 2025.01.24*

This document provides a comprehensive analysis of the Ableton Live Set (.als) file format and structure, to enable semantic parsing and animated notation of ALS projects within the Aesthetic Computer environment.

## Table of Contents

1. [File Format Overview](#file-format-overview)
2. [XML Structure Analysis](#xml-structure-analysis)
3. [Core Elements](#core-elements)
4. [Track Types and Properties](#track-types-and-properties)
5. [MIDI and Timing Data](#midi-and-timing-data)
6. [Device and Effects Structure](#device-and-effects-structure)
7. [Arrangement and Session Data](#arrangement-and-session-data)
8. [Implementation Notes](#implementation-notes)
9. [Aesthetic Computer Integration](#aesthetic-computer-integration)

## File Format Overview

### Physical Structure
- **Extension**: `.als`
- **Format**: GZIP-compressed XML data
- **Version**: Created by Ableton Live 8.0+
- **Encoding**: UTF-8 XML

### Basic Extraction Process

As implemented in `bios.mjs`:

1. The file is dropped as `.als`
2. It is decompressed using `pako.ungzip()`
3. The XML content is extracted as a string
4. The string is sent to `ableton.mjs` via a `"dropped:als"` event

## XML Structure Analysis

### Root Element
```xml
<Ableton MajorVersion="5" MinorVersion="12.1_12115" SchemaChangeCount="3" Creator="Ableton Live 12.1.5 Suite" Revision="">
  <LiveSet>
    <!-- All project data contained here -->
  </LiveSet>
</Ableton>
```

### Primary Sections

#### 1. Global Project Settings
```xml
<LiveSet>
  <LockedScripts/>
  <MidiControllers/>
  <OverwriteProtectionNumber Value="1024"/>
  <SceneNameManager>
    <Name Value="1"/>
  </SceneNameManager>
  <TimeSelection>
    <AnchorTime Value="0"/>
    <EndTime Value="0"/>
  </TimeSelection>
  <MasterTrack Id="0">
    <!-- Master track configuration -->
  </MasterTrack>
</LiveSet>
```

#### 2. Tempo and Time Signature
```xml
<MasterTrack Id="0">
  <DeviceChain>
    <DeviceChain>
      <AutomationEnvelopes/>
      <Mixer>
        <Tempo>
          <Manual Value="143"/> <!-- BPM VALUE HERE -->
        </Tempo>
        <TimeSignature>
          <TimeSignatureNumerator Value="4"/>
          <TimeSignatureDenominator Value="4"/>
        </TimeSignature>
      </Mixer>
    </DeviceChain>
  </DeviceChain>
</MasterTrack>
```

## Core Elements

### 1. Tracks Structure

#### MIDI Track Example
```xml
<MidiTrack Id="1">
  <LomId Value="0"/>
  <LomIdView Value="0"/>
  <IsGrouped Value="false"/>
  <TrackGroupId Value="-1"/>
  <Name>
    <EffectiveName Value="BASS"/>
    <UserName Value="BASS"/>
    <Annotation Value=""/>
  </Name>
  <Color Value="2315382"/>
  <TrackSizeState Value="0"/>
  <SizeState Value="1"/>
  <AutomationVisible Value="false"/>
  <DeviceChain>
    <!-- Device chain contains instruments and effects -->
  </DeviceChain>
</MidiTrack>
```

#### Audio Track Example
```xml
<AudioTrack Id="2">
  <LomId Value="0"/>
  <Name>
    <EffectiveName Value="KICK"/>
    <UserName Value="KICK"/>
  </Name>
  <Color Value="16777215"/>
  <!-- Audio-specific properties -->
  <DeviceChain>
    <!-- Audio effects chain -->
  </DeviceChain>
</AudioTrack>
```

### 2. Track Types Identification

Based on XML elements:
- **MidiTrack**: MIDI instrument tracks
- **AudioTrack**: Audio tracks with recorded or imported audio
- **ReturnTrack**: Send/return effects tracks
- **GroupTrack**: Folder tracks grouping other tracks
- **MasterTrack**: Main output bus

### 3. Track Colors and Visual Properties

```xml
<Color Value="2315382"/> <!-- RGB color packed into an integer -->
```

Color conversion:
```javascript
// Convert a packed integer to RGB components
function intToRgb(colorInt) {
  return {
    r: (colorInt >> 16) & 255,
    g: (colorInt >> 8) & 255,
    b: colorInt & 255
  };
}

// Example: intToRgb(2315382) → { r: 35, g: 84, b: 118 }
```

## Track Types and Properties

### MIDI Track Properties
- **Instruments**: Wavetable, Operator, Impulse, etc.
- **MIDI Effects**: Arpeggiator, Scale, Note Length, etc.
- **Clips**: MIDI note data, automation
- **Routing**: Input/output, sends, groups

### Audio Track Properties
- **Audio Effects**: EQ Eight, Compressor, Reverb, etc.
- **Clips**: Audio file references, warping data
- **Recording**: Input settings, monitoring
- **Processing**: Freeze, flatten options

### Return Track Properties
- **Effects**: Typically reverb, delay, modulation
- **Routing**: Receives sends from other tracks
- **Control**: Send amounts, pre/post fader

## MIDI and Timing Data

### MIDI Clip Structure
```xml
<MidiClip Id="0">
  <LomId Value="0"/>
  <Time Value="0.0"/>
  <Duration Value="4.0"/>
  <Loop>
    <LoopStart Value="0.0"/>
    <LoopEnd Value="4.0"/>
    <StartRelative Value="0.0"/>
    <LoopOn Value="true"/>
    <OutMarker Value="4.0"/>
    <HiddenLoopStart Value="0.0"/>
    <HiddenLoopEnd Value="4.0"/>
  </Loop>
  <Name Value=""/>
  <ColorIndex Value="0"/>
  <HasLegato Value="false"/>
  <MidiKey>
    <Notes>
      <!-- Individual MIDI notes -->
      <KeyTrack Id="0">
        <Notes>
          <MidiNote Time="0.0" Duration="0.25" Velocity="100" OffVelocity="64" IsEnabled="true"/>
          <MidiNote Time="0.5" Duration="0.25" Velocity="100" OffVelocity="64" IsEnabled="true"/>
          <!-- More notes... -->
        </Notes>
      </KeyTrack>
    </Notes>
  </MidiKey>
</MidiClip>
```

### Key MIDI Properties
- **Time**: Note start position in beats
- **Duration**: Note length in beats
- **Velocity**: Note velocity (0-127)
- **Pitch**: MIDI note number (implicit from KeyTrack position)

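Because Time and Duration are expressed in beats, mapping notes onto a real-time clock needs only the project tempo. A minimal conversion sketch (the `beatsToSeconds` helper name is illustrative):

```javascript
// Beats-to-seconds conversion: note and clip positions are stored in beats,
// so the project tempo (BPM) is the only extra input required.
function beatsToSeconds(beats, bpm) {
  return beats * (60 / bpm);
}

// A note at Time="2.0" in the 143 BPM zzzZWAP project starts at:
console.log(beatsToSeconds(2.0, 143)); // ≈ 0.839 seconds
```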
### Audio Clip Structure
```xml
<AudioClip Id="0">
  <LomId Value="0"/>
  <Time Value="0.0"/>
  <Duration Value="4.0"/>
  <Loop>
    <!-- Loop settings -->
  </Loop>
  <SampleRef>
    <FileRef>
      <Name Value="KICK_SAMPLE.wav"/>
      <Type Value="1"/>
      <Data>
        <RelativePathElement Dir="Samples"/>
        <RelativePathElement Dir="Processed"/>
        <RelativePathElement Dir="Crop"/>
        <RelativePathElement Dir="KICK_SAMPLE.wav"/>
      </Data>
    </FileRef>
  </SampleRef>
  <WarpMarkers>
    <!-- Warping/timing data -->
  </WarpMarkers>
</AudioClip>
```

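The `RelativePathElement` entries describe the sample's location one directory segment at a time. A regex-based sketch for reassembling the path, matching the parsing style used elsewhere in this document (the `extractSamplePath` helper name is illustrative):

```javascript
// Reassemble a sample path from <RelativePathElement Dir="…"/> entries,
// which appear in document order inside <FileRef><Data>.
function extractSamplePath(fileRefXml) {
  const parts = [];
  const re = /<RelativePathElement Dir="([^"]*)"/g;
  let m;
  while ((m = re.exec(fileRefXml)) !== null) parts.push(m[1]);
  return parts.join("/");
}

const snippet = `
  <RelativePathElement Dir="Samples"/>
  <RelativePathElement Dir="Processed"/>
  <RelativePathElement Dir="Crop"/>
  <RelativePathElement Dir="KICK_SAMPLE.wav"/>`;
console.log(extractSamplePath(snippet)); // → Samples/Processed/Crop/KICK_SAMPLE.wav
```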
## Device and Effects Structure

### Instrument Device Example
```xml
<InstrumentBranch>
  <Items>
    <Wavetable Id="0">
      <LomId Value="0"/>
      <LomIdView Value="0"/>
      <IsExpanded Value="true"/>
      <On>
        <LomId Value="0"/>
        <Manual Value="true"/>
      </On>
      <ModulationSourceCount Value="0"/>
      <ParametersListWrapper>
        <!-- Device parameters -->
      </ParametersListWrapper>
    </Wavetable>
  </Items>
</InstrumentBranch>
```

### Audio Effect Example
```xml
<AudioEffectBranch>
  <Items>
    <Eq8 Id="0">
      <LomId Value="0"/>
      <LomIdView Value="0"/>
      <IsExpanded Value="true"/>
      <On>
        <LomId Value="0"/>
        <Manual Value="true"/>
      </On>
      <ParametersListWrapper>
        <!-- EQ parameters -->
        <FilterBands>
          <HighQualityOn Value="false"/>
          <Mode Value="0"/>
          <Bands>
            <!-- 8 EQ bands -->
          </Bands>
        </FilterBands>
      </ParametersListWrapper>
    </Eq8>
  </Items>
</AudioEffectBranch>
```

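Device elements such as `<Wavetable>` and `<Eq8>` can be enumerated the same way tracks are. A sketch that scans a device-chain fragment for a known set of tag names (the whitelist below covers only devices named in this document; a real set contains many more, so this is illustrative, not exhaustive):

```javascript
// List known device tags inside a DeviceChain XML fragment.
// The tag whitelist is an assumption for illustration, not a complete list.
function listDevices(deviceChainXml) {
  const devices = [];
  const re = /<(Wavetable|Operator|Impulse|Eq8)\b[^>]*Id="(\d+)"/g;
  let m;
  while ((m = re.exec(deviceChainXml)) !== null) {
    devices.push({ type: m[1], id: m[2] });
  }
  return devices;
}

console.log(listDevices('<Eq8 Id="0">')); // → [ { type: 'Eq8', id: '0' } ]
```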
## Arrangement and Session Data

### Session View Clips
```xml
<Scenes>
  <Scene Id="0">
    <LomId Value="0"/>
    <LomIdView Value="0"/>
    <Name Value=""/>
    <ColorIndex Value="0"/>
    <TempoEnabled Value="false"/>
    <Tempo Value="120"/>
    <TimeSignatureEnabled Value="false"/>
    <TimeSignature>
      <TimeSignatureNumerator Value="4"/>
      <TimeSignatureDenominator Value="4"/>
    </TimeSignature>
  </Scene>
</Scenes>
```

### Arrangement View Data
```xml
<ArrangerAutomation>
  <Events>
    <!-- Automation events with timing -->
  </Events>
</ArrangerAutomation>
```

## Implementation Notes

### Current ALSProject Class Enhancement

Based on the zzzZWAP project mentioned in the notebook:
- **BPM**: 143 (found in `<Manual Value="143"/>`)
- **Tracks**: BASS (MIDI), KICK (audio), HATS (MIDI), snare (MIDI)
- **Structure**: Each track has specific MIDI data and timing

### Enhanced Parsing Strategy

```javascript
class ALSProject {
  constructor(xmlData) {
    this.tracks = [];
    this.scenes = [];
    this.tempo = 120;
    this.timeSignature = { numerator: 4, denominator: 4 };
    this.creator = "Unknown";
    this.version = "Unknown";
    this.clips = [];
    this.devices = [];

    if (xmlData) {
      this.parseXML(xmlData);
    }
  }

  parseXML(xmlData) {
    // Enhanced parsing for comprehensive data extraction
    this.parseGlobalSettings(xmlData);
    this.parseTempo(xmlData);
    this.parseTracks(xmlData);
    this.parseScenes(xmlData);
    this.parseClips(xmlData);
    this.parseDevices(xmlData);
  }

  parseGlobalSettings(xmlData) {
    // Extract project-level metadata from the root <Ableton> attributes
    const creatorMatch = xmlData.match(/Creator="([^"]+)"/);
    if (creatorMatch) {
      this.creator = creatorMatch[1];
      const versionMatch = creatorMatch[1].match(/(\d+\.\d+)/);
      if (versionMatch) this.version = versionMatch[1];
    }
  }

  parseTempo(xmlData) {
    // Tempo lives at <Tempo><Manual Value="…"/></Tempo> on the master track
    const tempoMatch = xmlData.match(/<Tempo>[\s\S]*?<Manual Value="([^"]+)"[\s\S]*?<\/Tempo>/);
    if (tempoMatch) {
      this.tempo = parseFloat(tempoMatch[1]);
    }

    // Time signature
    const sigNumMatch = xmlData.match(/<TimeSignatureNumerator Value="([^"]+)"/);
    const sigDenMatch = xmlData.match(/<TimeSignatureDenominator Value="([^"]+)"/);
    if (sigNumMatch && sigDenMatch) {
      this.timeSignature = {
        numerator: parseInt(sigNumMatch[1], 10),
        denominator: parseInt(sigDenMatch[1], 10)
      };
    }
  }

  parseTracks(xmlData) {
    // Parse all track types with detailed information
    const trackTypes = ["MidiTrack", "AudioTrack", "ReturnTrack", "GroupTrack"];

    trackTypes.forEach((trackType) => {
      const regex = new RegExp(
        `<${trackType}[^>]*Id="([^"]*)"[^>]*>([\\s\\S]*?)</${trackType}>`,
        "g"
      );
      let match;

      while ((match = regex.exec(xmlData)) !== null) {
        const trackId = match[1];
        const trackContent = match[2];

        const track = {
          id: trackId,
          type: trackType,
          name: this.extractTrackName(trackContent),
          color: this.extractTrackColor(trackContent),
          muted: this.extractTrackMuted(trackContent),
          solo: this.extractTrackSolo(trackContent),
          clips: this.extractTrackClips(trackContent),
          devices: this.extractTrackDevices(trackContent)
        };

        this.tracks.push(track);
      }
    });
  }

  parseScenes(xmlData) {
    // Extract scene information
    const sceneRegex = /<Scene[^>]*Id="([^"]*)"[^>]*>([\s\S]*?)<\/Scene>/g;
    let match;

    while ((match = sceneRegex.exec(xmlData)) !== null) {
      const sceneId = match[1];
      const sceneContent = match[2];

      const scene = {
        id: sceneId,
        name: this.extractSceneName(sceneContent),
        tempo: this.extractSceneTempo(sceneContent),
        timeSignature: this.extractSceneTimeSignature(sceneContent)
      };

      this.scenes.push(scene);
    }
  }

  parseClips(xmlData) {
    // Parse MIDI and audio clips with timing data
    this.parseMIDIClips(xmlData);
    this.parseAudioClips(xmlData);
  }

  parseMIDIClips(xmlData) {
    const clipRegex = /<MidiClip[^>]*Id="([^"]*)"[^>]*>([\s\S]*?)<\/MidiClip>/g;
    let match;

    while ((match = clipRegex.exec(xmlData)) !== null) {
      const clipId = match[1];
      const clipContent = match[2];

      const clip = {
        id: clipId,
        type: "MIDI",
        time: this.extractClipTime(clipContent),
        duration: this.extractClipDuration(clipContent),
        name: this.extractClipName(clipContent),
        notes: this.extractMIDINotes(clipContent),
        loop: this.extractLoopData(clipContent)
      };

      this.clips.push(clip);
    }
  }

  extractMIDINotes(clipContent) {
    const notes = [];
    const noteRegex = /<MidiNote[^>]+Time="([^"]+)"[^>]+Duration="([^"]+)"[^>]+Velocity="([^"]+)"[^>]*\/>/g;
    let noteMatch;

    while ((noteMatch = noteRegex.exec(clipContent)) !== null) {
      notes.push({
        time: parseFloat(noteMatch[1]),
        duration: parseFloat(noteMatch[2]),
        velocity: parseInt(noteMatch[3], 10)
      });
    }

    return notes;
  }

  // Additional extraction methods...
  extractTrackName(content) {
    const nameMatch = content.match(/<UserName Value="([^"]*)"/);
    return nameMatch ? nameMatch[1] : "Untitled";
  }

  extractTrackColor(content) {
    const colorMatch = content.match(/<Color Value="([^"]*)"/);
    if (colorMatch) {
      const colorInt = parseInt(colorMatch[1], 10);
      return {
        r: (colorInt >> 16) & 255,
        g: (colorInt >> 8) & 255,
        b: colorInt & 255
      };
    }
    return { r: 128, g: 128, b: 128 };
  }

  // Generate semantic map for visualization
  generateSemanticMap() {
    return {
      project: {
        name: this.extractProjectName(),
        tempo: this.tempo,
        timeSignature: this.timeSignature,
        creator: this.creator,
        version: this.version
      },
      tracks: this.tracks.map((track) => ({
        ...track,
        visualPosition: this.calculateTrackPosition(track),
        timeline: this.generateTrackTimeline(track)
      })),
      timeline: this.generateGlobalTimeline(),
      structure: this.analyzeProjectStructure()
    };
  }

  // Beat-synchronized visualization data
  generateBeatMap(currentBeat) {
    const activeElements = [];

    this.clips.forEach((clip) => {
      if (this.isClipActiveAtBeat(clip, currentBeat)) {
        activeElements.push({
          type: "clip",
          clip: clip,
          intensity: this.calculateClipIntensity(clip, currentBeat),
          visualEffect: this.getClipVisualEffect(clip)
        });
      }
    });

    return {
      beat: currentBeat,
      tempo: this.tempo,
      activeElements: activeElements,
      visualCues: this.generateVisualCues(currentBeat)
    };
  }
}
```

## Aesthetic Computer Integration

### Visualization Strategy

1. **Minimal Track Display**
   - Each track as a colored bar/strip
   - Height represents activity/volume
   - Color from ALS track colors

2. **Beat Synchronization**
   - Use tempo for animation timing
   - Flash/pulse on note triggers
   - Smooth transitions between beats

3. **MIDI Note Visualization**
   - Vertical position = pitch
   - Width = note duration
   - Opacity = velocity

4. **Real-time Animation**
   ```javascript
   // Defined as methods on ALSProject so `this` resolves to the project
   drawMiniDiagram(api, beatPosition) {
     const beatMap = this.generateBeatMap(beatPosition);

     beatMap.activeElements.forEach((element) => {
       if (element.type === "clip") {
         this.drawClipActivity(api, element, beatPosition);
       }
     });
   }

   drawClipActivity(api, element, beatPosition) {
     const { ink, screen } = api;
     const track = this.findTrackForClip(element.clip);

     // Draw a beat-synchronized visual
     const intensity = element.intensity;
     const color = track.color;

     ink(color.r * intensity, color.g * intensity, color.b * intensity);
     // Draw animated element...
   }
   ```

### Integration with Current ableton.mjs

The enhanced ALSProject class should integrate with the existing minimal sculpture design:

```javascript
// In the paint() function
if (alsProject && wavFile && sound.time > 0) {
  const beatPosition = (sound.time * alsProject.tempo) / 60;
  alsProject.drawMiniDiagram(api, beatPosition);
}
```

## Advanced Features for Future Implementation

### 1. Device Parameter Automation
- Extract automation curves
- Visualize parameter changes over time
- Sync visual effects to parameter modulation

### 2. Advanced MIDI Analysis
- Chord detection and visualization
- Rhythm pattern analysis
- Key signature detection

### 3. Audio Analysis Integration
- Extract audio feature data if available
- Sync visual effects to audio characteristics
- Multi-layer visualization combining MIDI and audio

### 4. Interactive Elements
- Click tracks to solo/mute
- Scrub the timeline with the mouse
- Live parameter control

## Real-World Implementation Examples

Based on research of existing ALS parsing tools, here are proven approaches:

### 1. Basic Decompression (Ruby guard-live-set)
```ruby
# Converts .als files to XML for git version control
Zlib::GzipReader.open(path) do |gz|
  File.open(path + '.xml', 'w') do |file|
    file << gz.read
  end
end
```

### 2. C# Implementation (corrupt-als-dotnet)
```cs
// Blazor app for fixing corrupt ALS files with duplicate IDs
public static async Task<byte[]> DecompressAsync(byte[] input)
{
    using var inmem = new MemoryStream(input);
    using var output = new MemoryStream();
    using var gz = new GZipStream(inmem, CompressionMode.Decompress);

    await gz.CopyToAsync(output);
    return output.ToArray();
}
```

### 3. MIDI Note Processing Example
```cs
// From corrupt-als-dotnet - processing MIDI notes with IDs
var midiNoteEvents = keyTrackCollection
    .Elements().Where((c) => c.Name == "KeyTrack")
    .Elements().Where((t) => t.Name == "Notes")
    .Elements().Where((n) => n.Name == "MidiNoteEvent");

foreach (var noteEvent in midiNoteEvents)
{
    var noteId = noteEvent.Attributes().First((a) => a.Name == "NoteId");
    noteId.Value = newUniqueId.ToString();
}
```

### 4. XML Structure Navigation
```cs
// Finding duplicate MIDI note IDs in the XML structure
var nodes = root.Descendants()
    .Where((d) => d.Attributes().Any((a) => a.Name == "NoteId"));
```

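The duplicate-NoteId scan from corrupt-als-dotnet translates directly to the regex style used in this document's JavaScript. A hedged sketch (the `findDuplicateNoteIds` helper name is illustrative):

```javascript
// Find NoteId values that occur more than once — the corruption that
// corrupt-als-dotnet repairs. Regex-based, matching the rest of this doc.
function findDuplicateNoteIds(xml) {
  const seen = new Set();
  const dupes = new Set();
  const re = /NoteId="(\d+)"/g;
  let m;
  while ((m = re.exec(xml)) !== null) {
    if (seen.has(m[1])) dupes.add(m[1]);
    seen.add(m[1]);
  }
  return [...dupes];
}

const xml = '<MidiNoteEvent NoteId="7"/><MidiNoteEvent NoteId="7"/><MidiNoteEvent NoteId="8"/>';
console.log(findDuplicateNoteIds(xml)); // → [ '7' ]
```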
### Key Findings from Real Implementations

1. **File Format**: Confirmed that .als files are GZIP-compressed XML
2. **Structure**: The XML contains a hierarchical Track > Clip > Notes organization
3. **MIDI Notes**: Each MidiNoteEvent has a unique NoteId attribute
4. **Navigation**: KeyTracks contain MidiNoteEvent elements with timing/pitch data
5. **Processing**: Standard XML parsing with XPath/LINQ works well

This documentation provides the foundation for building a sophisticated, semantically aware visualization system for Ableton Live projects within the Aesthetic Computer environment.