SYNTHesthesia is an audio visualization project that generates a visual display in real time from the notes and parameters of a synthesizer performance. Custom Max for Live devices send the notes and parameters of a synthesizer hosted in Ableton Live over UDP to a standalone Max/MSP/Jitter patch, which generates the visuals from those values. The visuals are built with modified modules from the Vsynth package by Kevin Kripper.
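To make the data flow concrete, here is a minimal Python sketch of the kind of UDP traffic involved. The actual project does this inside Max for Live and Max/MSP rather than Python, and the port number and message format below are assumptions for illustration only.

```python
import socket

# Hypothetical illustration: the real project sends these messages from
# Max for Live devices, not Python. The port and text format are assumed.
MAX_PATCH_ADDR = ("127.0.0.1", 7400)  # standalone Jitter patch listening here (assumed port)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def send_param(name, value):
    """Send one note or synth parameter as a plain-text UDP datagram."""
    sock.sendto(f"{name} {value}".encode("utf-8"), MAX_PATCH_ADDR)

# A note-on plus a couple of synth parameters, as they might stream out
# of the devices during a performance.
send_param("note", 60)         # MIDI note number (C3)
send_param("cutoff", 0.75)     # normalized filter cutoff
send_param("distortion", 0.2)  # normalized drive amount
```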
Below is a sample of what the system can generate. In this particular composition, parameters of two synths (note, waveform, cutoff, and distortion) control parameters of the generated visuals. Other visual parameters were automated in Ableton Live using the custom Max for Live devices.
The SYNTHesthesia system has also powered another audiovisual project of mine, Synth Cube, which projection-maps the visual output of SYNTHesthesia onto cubes.
If you’d like to learn in detail about the programming behind the SYNTHesthesia project, you can find it all in my Master’s thesis document, starting on page 57.
A primary component of the SYNTHesthesia system is the Color Keyboard, which lets you map colors to notes, as seen in the demo video below (also outlined in the thesis document, starting on page 46).
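The idea behind the Color Keyboard can be sketched as a lookup from a note's pitch class to a color. The real mapping lives in the Max for Live device and is user-configurable; the pitch classes and colors below are assumptions, not the project's actual values.

```python
# Hypothetical sketch of the note-to-color mapping idea; the actual
# Color Keyboard device defines its own (user-assignable) colors.
COLOR_MAP = {
    0: (255, 0, 0),      # C -> red
    2: (255, 165, 0),    # D -> orange
    4: (255, 255, 0),    # E -> yellow
    5: (0, 128, 0),      # F -> green
    7: (0, 0, 255),      # G -> blue
    9: (75, 0, 130),     # A -> indigo
    11: (148, 0, 211),   # B -> violet
}

def color_for_note(midi_note):
    """Return the RGB color assigned to a MIDI note's pitch class."""
    pitch_class = midi_note % 12
    # Notes without an explicit entry fall back to white.
    return COLOR_MAP.get(pitch_class, (255, 255, 255))

print(color_for_note(60))  # C3 -> (255, 0, 0)
print(color_for_note(67))  # G3 -> (0, 0, 255)
```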
Check out my live VJ/Lighting rig or some of my TouchDesigner visualizers!