Temporal Weaving with TidalCycles
2020-2024
TidalCycles code on the left generates MIDI notes, visualized in real-time with TouchDesigner on the right.
Real-time coding with TidalCycles and TouchDesigner.
In the spirit of TOPLAP’s mission to make live coding inclusive and accessible, this session highlights the creative possibilities of programming music and visuals in the moment.
Hydra
Live-Coding Visuals
2022-2024
Hydra is a browser-based visual synthesizer that enables live coding of visuals through simple functions and feedback loops.
Screen-captured live coding in Hydra
Art In the Open:
Flow State Workshop Series
https://hydra.ojack.xyz/
Clear all - Resets the environment and clears text from the editor.
Load library or extension - Loads community extensions for Hydra-Synth.
Show random sketch - Loads a random sketch example.
Make random change - Modifies a single value automatically.
Upload to gallery - Uploads a sketch to Hydra’s gallery and creates a shorter URL.
Show info window - Shows an overlay window with help text and links.
Core principles: Input + Modify + Output
osc() is the input.
.rotate() and .repeat() are the modifiers.
Modify could be lots of things like:
.brightness
.rotate
.kaleid
.invert
Output = o0
The output o0 is like a TV channel, and .out(o0) is telling Hydra, “Put my visuals on this channel so they show up.”
In other words: “Send this visual to screen number 0 so I can see it.”
The visual you build goes into a “pipe.”
.out(o0) opens the pipe and shows it on the main display. Without .out(o0), Hydra still made something, but you wouldn’t see it.
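As a minimal sketch of Input + Modify + Output (the specific values here are placeholder choices, not from the workshop):
osc(20, 0.1, 0.8)  // input: an oscillator source
  .rotate(0.5)     // modify: tilt the stripes
  .kaleid(4)       // modify: mirror the image into 4 slices
  .out(o0)         // output: send the result to channel o0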
An oscillator is a signal that creates a repeating wave or pattern that goes back and forth, like a smooth pulse. In Hydra, we’re going to use it as an input.
osc() // assumed source
.out(o0)
Hydra expects an input and an output argument: osc() // Input, .out(o0) // Output. Hydra is written in JavaScript, a programming language and a core technology of websites alongside HTML and CSS; it enables dynamic and interactive content on websites and web applications. Syntax is the grammar of code: it’s how we tell the computer what to do, step by step. In Hydra, code is read from left to right, top to bottom.
osc() // Input (assumed source)
.out(o0) // Output
We have 3 arguments in osc(): osc(frequency, sync, offset), with defaults osc( frequency = 20, sync = 0.2, offset = 0 ).
In coding, an argument is information you give to a function so it knows how to behave.
• osc() is a function — it creates something.
• The dot . chains actions together.
• Each pair of parentheses () passes arguments (values) that control behavior.
Think of a function like a machine, and the arguments as the settings or ingredients you give it. Each function has its own arguments. A parameter is a blank spot a function expects, and an argument is the actual value you fill that spot with when you use the function.
The first argument, 20, is the frequency of the lines: osc(20, 0.2, 0) .out(o0)
The second number is the synchronization, also known as speed. At 0.2 the lines move quite slowly; change it to 0.8 and they move a lot faster. Warning: too fast and it will flash!
osc(20, 0.2, 0)
.out(o0)
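To see the first two arguments in action, here is a hedged comparison sketch (the exact numbers are only examples):
osc(20, 0.2, 0).out(o0)    // 20 stripes moving slowly (sync = 0.2)
// osc(20, 0.8, 0).out(o0) // the same stripes moving much faster (careful: it can flash)
// osc(60, 0.2, 0).out(o0) // a higher frequency gives more, thinner stripes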
The last number is what’s known as the offset. Three oscillators are moving simultaneously: one is red, one is green, and one is blue. If you mix all of them, you get white and black. The offset moves them slightly out of sync with each other, which reveals the colors of each oscillator and mixes them all together.
osc (20, 0.2,2) // source assumed to match the blocks below
.color(1, 0, 0) // Red
.out(o0)
//
osc (20, 0.2,2)
.color(0, 1, 0) // Green
.out(o0)
//
osc (20, 0.2,2)
.color(0, 0, 1) // Blue
.out(o0)
//
osc (20, 0.2,2)
.color(1.1, 0.5, 1) // MIX
.out(o0)
Pro Tip
// Two forward slashes create a comment: a note added to a line of code without breaking the syntax.
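For example (a small sketch with arbitrary values):
osc(20, 0.2, 0)  // this note explains the line but is ignored by Hydra
// .rotate(1)    // this whole line is commented out, so Hydra skips it
.out(o0)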
osc (10, 0.2,2) // source assumed from the blocks below
.color (1.1, 0.5, 0.5)
.rotate(1)
.out (o0)
//
osc (10, 0.2,2)
.color (1.1, 0.5, 0.5)
.rotate(90 * Math.PI / 25)
.out (o0)
//
osc (10, 0.2,2)
.color (1.1, 0.5, 0.5)
.rotate (90*4)
.out (o0)
Rotation is measured in radians; a radian is like a slice of a pie. Pi is about 3.14 and equals half a turn, so a full 360-degree rotation is pi times 2 (about 6.28). JavaScript / Hydra accepts math expressions, so you can write the conversion directly.
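A worked sketch of the conversion (the oscillator values are just examples):
osc(10, 0.2, 2)
  .rotate(90 * Math.PI / 180)  // 90 degrees = PI / 2 radians, a quarter turn
  .out(o0)
// .rotate(Math.PI)      // half turn (180 degrees)
// .rotate(2 * Math.PI)  // full 360-degree turn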
osc (10, 0.2,2) // source assumed
.color (1.1, 0.5, 0.5)
.rotate (90*4)
.pixelate( 25, 25)
.out (o0)
.pixelate( pixelX = 25, pixelY = 25 )
osc (10, 0.2,2) // source assumed
.color (1.1, 0.5, 0.5)
.rotate (90*4)
.pixelate( 50, 50)
.kaleid(3)
.out (o0)
kaleid( nSides = 3 )
osc (10, 0.2,2) // source assumed
.kaleid(3)
.color (1.1, 0.5, 0.5)
.rotate (90*4)
.pixelate( 50, 50)
.out (o0)
Change the order: move .kaleid(3) to the top of the chain. The order of modifiers matters in Hydra, because code is read from left to right and top to bottom.
osc (10, 0.2,2) // source assumed
.pixelate( 25, 25)
.kaleid(3)
.color (1.1, 0.5, 0.5)
.rotate (90 * Math.PI / 180)
.out (o0)
osc (10, 0.2,2) // source assumed
.kaleid(2)
.rotate (90*Math.PI/90)
// .pixelate( 15, 25)
.color (1.5, 0.5, 0.5)
.out (o0)
Remix the order and jam out. Turn lines on and off by commenting them with two forward slashes //
Shortcuts
Run all: control + shift + enter
Hide code: control + shift + h
noise() // assumed source
.out (o0)
In Hydra, the noise() function is one of the core source generators. It creates a dynamic field of random pixel values that shift and flow over time. You can think of it like digital static or clouds that move and evolve. It’s often used as a base texture or as a modulator to distort other visuals.
noise() // assumed source
.out (o0)
Cloud-like shapes that move and evolve.
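As a rough sketch (the scale and offset values are assumptions, not from the workshop), a lower scale reads as larger, cloud-like shapes:
noise(3, 0.2)  // noise(scale, offset): a small scale gives big soft blobs; the second value drives the drift
  .out(o0)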
noise() // assumed source
.color(1,0,-1)
.saturate(1.1)
.out(o0)
noise() // assumed source
.color (1,0,-1)
.saturate(1.3)
.blend( o0,0.99)
.out (o0)
noise() // assumed source
.color (1,0,-1)
.saturate(1.3)
.blend(osc (10, 0.2,2),0.50)
// .blend( o0,0.75)
.out (o0)
.blend( texture, amount = 0.50 ). Since the first argument can take a texture, we can use the oscillator as a texture input; a texture is the graphical source. Comment out the feedback loop to see what’s happening.
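A minimal sketch of that feedback loop on its own (values are only examples): blending the previous frame of o0 back into the chain creates trails.
osc(10, 0.1, 1)
  .blend(o0, 0.9)  // mix in 90% of the last rendered frame (feedback)
  .out(o0)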
noise() // assumed source
.color (1,0,-1)
.saturate(1.3)
.blend(osc (10, 0.2,2),0.50)
.add(noise(5000,0.78),0.05)
.blend( o0,0.75)
.out (o0)
.add( texture, amount = 1 ). Adding noise with .add at a very high frequency adds texture. When you “add a texture,” you’re not just stacking an image: you’re passing a texture as an input into another function, and that function then uses the texture to modify, mask, or mix visuals.
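A small sketch of .add() in isolation (the numbers are arbitrary):
osc(10, 0.1, 0.8)
  .add(noise(500, 0.5), 0.1)  // layer a small amount of fine-grained noise on top
  .out(o0)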
noise() // assumed source
.color(1,0,-1)
.saturate(1.3)
.blend(osc(10, 0.2, 2), 0.50)
.modulate(osc(15,0.3), 0.4)
// more modulation (0.1–1 range)
.add(noise(5000,0.78), 0.05)
.blend(o0, 0.75)
.out(o0)
For more of a water feel and less repetition, we can use .modulate(). Modulate functions use the colors from one source to affect the geometry of a second source, creating a sort of warping or distorting effect. .modulate() does not change color or luminosity; it distorts one visual source using another visual source. A real-world analogy would be looking through a textured glass window or water. You can add a second parameter to the modulate() function to control the amount of warping: modulate(o1, 0.9). In that case, the red and green channels of the oscillator are converted to x and y displacement of the camera image. Think of it like bending one image with another image’s energy: the second texture becomes a map that tells Hydra how to distort the first one.
noise() // assumed source
.color(1,0, -3)
.saturate(1.3)
.blend(osc (10, 0.2,5),0.50)
.blend(o0, 0.75)
.add(noise(5000,0.78), 0.05)
.modulate(noise(1.5, 0.7))
.out(o0)
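The modulate notes above mention modulate(o1, 0.9) displacing a camera image; here is a hedged sketch of that idea, assuming a webcam is available and using Hydra's s0 external source:
s0.initCam()          // capture the webcam into source s0
osc(10, 0.1, 1.5)
  .out(o1)            // render an oscillator into buffer o1
src(s0)
  .modulate(o1, 0.9)  // warp the camera image, using o1 as a displacement map
  .out(o0)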
Runway’s Creative Partners Program
2024
As a selected member of Runway's Creative Partners Program, I get to explore AI-generated art more deeply. The CPP Discord server keeps me in constant conversation with amazing artists worldwide who have been on this journey with me, surfacing the latest cutting-edge AI tools and models that Runway continues to update at a massive scale.
This maiden voyage with Runway's Gen-3 Alpha model showcases a fusion of surreal AI-generated imagery and a custom score with sound-design-forward audio.
Gen-3 Alpha FLUX.1-Dev (Images)
Runway Frames
Third Echo
An exploration of the evolution of AI-generated art, using Stable Diffusion (a text-to-image AI) and AnimateDiff (an AI animation tool). It builds on my earlier AI art experiments in StyleGAN.
Training frames came from Looking Glass, a notebook originally by Sber AI (@ai_curio). The notebook implements an image-to-image generation technique that fine-tunes ruDALL-E. Using this method, I was able to create new images that closely resembled the given input images.
The input video was made in StyleGAN, a generative adversarial network (GAN) that creates highly realistic images using a style-based generator architecture, allowing fine-grained control over various aspects of the generated images.
Scyphozoa
2021
This run of the machine learning script was trained on jellyfish imagery; training the model took 10 hours.
Epoch 200
Features original music live-coded using TidalCycles, a programming language for music creation.
21.04.2022-20.44.45.mp3
Soundcloud
For the past two years, I've been on an exhilarating journey with Delenda. Our collaboration melds her haunting vocals and raw storytelling with my AI-enhanced surreal visuals. From making music videos to designing live visuals, we're exploring new frontiers together.
Catalyst
We merged AI-generated animations with analog video techniques for Catalyst's music video.
Fine-tuned models produced surreal visuals, which were then recaptured on vintage TVs for a uniquely tailored visual language.
Animated results using Runway's Gen-2 model.
Luminaria
For Delenda’s Luminaria Contemporary Arts Festival show, I transformed live footage of Delenda through custom AI processing, creating a surreal visual backdrop. I ran real-time visuals using TouchDesigner during the performance, performing alongside Delenda and her band.
A dynamic visual environment was created using two projectors running TouchDesigner. This setup allowed us to run fluid, ever-changing lighting conditions that interacted with the performer's various outfits in real time.
I curated two distinct image sets: one from our live-action video shoot and another of visually striking reference images.
These were used to train custom Stable Diffusion 1.5 checkpoint and LoRA models, enabling us to generate AI visuals that could synthesize and amplify Delenda's visual identity.
I applied an AI-style pass to the original footage using Stable WarpFusion and our custom-trained LoRA models. This process tracked movements through generated optical flow and consistency maps. I developed a Python script for frame glitching and leveraged TouchDesigner's feedback network to achieve the final aesthetic.
TouchDesigner live-visual setup and frame grabs.
Treatment
Pathetic
This project began as a live-action music video, traditionally shot and edited. We then applied multiple passes of Stable WarpFusion AI to the footage, creating a surreal, dream-like version.
Project Breakdown
After filming, I created a fully color-corrected live-action edit of the music video.
This served as the foundation for our subsequent AI-enhanced visual treatments.
I then gathered and curated images to train a custom Stable Diffusion 1.5 checkpoint.
I processed sections of the live-action music video through Stable WarpFusion, using my custom model. This was done both on Google Colab and locally on a high-performance gaming laptop.
The final step involved compositing the original live footage over the AI-generated imagery in After Effects, followed by a final pass in DaVinci Resolve.
For the past decade, I've created multimedia experiences with A.M. Architect alongside Daniel Stanush. Our collaboration weaves Stanush's melodic sensibility with my soundscape manipulation, using TouchDesigner and machine learning to transform performances into interactive installations. Through beats, generative visuals, and audience interaction, we're redefining the boundaries between electronic music and digital art.
Hydra
Cynatica Conductor II
Cynatica Conductor invites the viewer to interact with light and sound, playing the conductor overseeing an ensemble of sonic texture and fractured melody.
As the art interacts with the conductor (the viewer), it grows in complexity and vibrancy, becoming an immersive collaboration between the art and the audience.
Installation Review: Glasstire Art Magazine:
A Technological Dream: "Cynatica Conductor"
Cynatica Conductor I
An immersive art experience created with thirty-five artists' site-specific installations in the former PAC:SAT satellite news headquarters. It was a limited-run show before the building was demolished to make way for new development.
Avicenna
Avicenna is featured on the Territories LP compilation album from 79Ancestors, released in May 2017. The album includes tracks from other electronic artists such as Telefon Tel Aviv, Deru, and Shigeto.
Color Field
Color Field is the companion film to the 2017 A.M. Architect release, Color Field, with 79 Ancestors.
As a co-partner, videographer, editor, animator, and music producer, I help create vibrant media that inspires and educates.
Nathan’s story
We aimed to portray Nathan as a multi-faceted person - an athlete, designer, and family man - not defined by SMA. This project celebrates living life forward, embracing one's whole self beyond others' expectations.
Media Capabilities
Design