
Interactive Tech Demonstrations for Live Performance

Historical Note: Analog vs. Digital

Starting back in 1965, I think Variations V is a prelude of things to come:

Cage, Cunningham, VanDerBeek (with Moog and Tudor): Variations V

Video Tracking
Frieder Weiss, the creator of the EyeCon software used in Chunky Move’s GLOW

Demonstration of EyeCon Software (Windows)

HARDWARE: Depth-Sensing Cameras
The Xbox Kinect and similar depth-sensing cameras are at the root of many gesture- and movement-controlled artworks and performances.

Rashaad Newsome uses a Kinect for his Five piece at SFMOMA; I will demonstrate how that piece works.

Below are a few basic examples so you can begin to see the correlation between body and camera. Some samples are long, so feel free to scrub through the videos.
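To make the body-camera correlation concrete, here is a minimal sketch (in Python with NumPy, not Max) of the core idea behind depth-camera segmentation: keep only the pixels whose distance falls inside the performance zone. The array values and thresholds are illustrative, not taken from any specific Kinect driver.

```python
import numpy as np

def silhouette(depth_mm, near=500, far=2000):
    """Mask of pixels whose depth (in millimeters) falls inside the performance zone."""
    return (depth_mm > near) & (depth_mm < far)

# A tiny synthetic depth frame: empty stage with a far wall at 4 m...
depth = np.full((4, 4), 4000)
# ...and a "performer" standing about 1.2 m from the sensor.
depth[1:3, 1:3] = 1200

print(silhouette(depth).sum())  # 4 pixels belong to the body
```

On a real Kinect frame the mask would be hundreds of thousands of pixels; that silhouette is what gets mapped to graphics, sound, or light.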


I teach the software Max/MSP from Cycling ’74, so most of these examples are rooted in that software.

(other software of interest: Isadora, MadMapper, TouchDesigner, Processing, openFrameworks)

Demonstration: Computer Vision with Max Jitter.

Demonstration: Motion detection with Max MSP Jitter and a web camera.
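The motion detection in that demonstration boils down to frame differencing: subtract the previous camera frame from the current one and count the pixels that changed. A minimal Python/NumPy sketch of the same logic, using synthetic frames in place of the webcam input (function and variable names are illustrative):

```python
import numpy as np

def motion_amount(prev_frame, curr_frame, threshold=30):
    """Fraction of pixels whose brightness changed by more than `threshold`."""
    diff = np.abs(curr_frame.astype(int) - prev_frame.astype(int))
    return (diff > threshold).mean()

# Two synthetic 4x4 grayscale "frames": a bright shape appears in one corner.
prev = np.zeros((4, 4), dtype=np.uint8)
curr = prev.copy()
curr[:2, :2] = 255  # simulated movement in the top-left corner

print(motion_amount(prev, curr))  # 4 of 16 pixels changed -> 0.25
```

That single number (how much movement, and where, if you take it per region) is what gets mapped to sound or video parameters in the patch.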

Color Tracking with a Camera
Below are two samples that use the color red, painted on hands, to create audio effects:

Demonstration: color tracking with Max Jitter
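The core operation in color tracking is a threshold on the color channels followed by a centroid: the (x, y) center of the red pixels becomes a control signal for the audio effects. A sketch in Python/NumPy with a synthetic frame (the thresholds and names are illustrative, not from the Max patch):

```python
import numpy as np

def track_red(frame_rgb, min_red=150, max_other=80):
    """Return the (x, y) centroid of strongly red pixels, or None if none found."""
    r, g, b = frame_rgb[..., 0], frame_rgb[..., 1], frame_rgb[..., 2]
    mask = (r > min_red) & (g < max_other) & (b < max_other)
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None
    return float(xs.mean()), float(ys.mean())

# Synthetic 8x8 RGB frame with one "red hand" pixel at x=5, y=2.
frame = np.zeros((8, 8, 3), dtype=np.uint8)
frame[2, 5] = (255, 0, 0)

print(track_red(frame))  # (5.0, 2.0)
```

Mapping that centroid to, say, filter cutoff and delay time is what turns the painted hands into an instrument.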

Kinect Sensor Cameras

Basic Kinect with movement:


More advanced samples:

Hiroaki Umeda

Christian Mio Loclair

Claire Bardainne & Adrien Mondot

Molmol Kuo

Zach Lieberman has many wonderful artworks; have a look at his website:


Here a Kinect is used to control lighting effects in real time (imagine combining this with the heartbeat monitor!)

Demonstration: Kinect sensor and drawing with Max Jitter and Windows

Physical Computing involves interactive systems that can sense and respond to the world around them.

Heartbeats are always an interesting way to make the unseen physical, as we saw in the Sandro Masai video.

Arduino, the staple of Physical Computing

An affordable heart-rate sensor

Sound artist Dafna Naphtali uses gesture and voice with Wii remote controls and robots:

The hardware and software mirrors of artist Daniel Rozin are clear and entertaining examples of human-computer interaction.
In this video, it’s interesting to watch how people interact with the works.

Here, Danny Rozin discusses one of his mirrors from his studio.

Interactive Lighting with DJ Equipment


Enttec USB to DMX interface

Chauvet 4 channel dimmer pack

Demonstration of Interactive Lighting with DMX and Max
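Under the hood, talking to an Enttec USB-to-DMX interface means wrapping up to 512 channel levels in a small serial message. Below is a sketch of the "send DMX" framing as I understand the Enttec DMX USB Pro protocol (message label 6, little-endian length, 0x7E/0xE7 delimiters); in a real patch you would write these bytes to the interface's serial port, e.g. with pySerial, and Max handles this for you via a DMX external or serial object.

```python
def enttec_packet(channels):
    """Build an Enttec DMX USB Pro 'send DMX' message for the given channel levels.

    channels: list of 0-255 levels for DMX channels 1..n.
    The DMX payload itself begins with the mandatory start code 0x00.
    """
    data = bytes([0x00] + list(channels))  # DMX start code + channel levels
    length = len(data)
    return (bytes([0x7E, 6, length & 0xFF, length >> 8])  # delimiter, label, length
            + data
            + bytes([0xE7]))                               # end delimiter

# Channel 1 at full, channels 2-4 off: one cell of a 4-channel dimmer pack.
pkt = enttec_packet([255, 0, 0, 0])
print(pkt.hex())  # 7e06050000ff000000e7
```

Sending a fresh packet every time your sensor data changes is what makes the lighting "interactive": motion value in, dimmer level out.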


The Coded Gaze: Face Tracking

Tracking bodies and color is the groundwork for artificial intelligence and machine learning. Face tracking is ubiquitous these days and, disturbingly, embedded with algorithmic bias. Joy Buolamwini (the Poet of Code) uses face tracking and art as research to illuminate the social implications of artificial intelligence.

Joy’s website is here for more exploration and information on her artwork: The Aspire Mirror. Joy’s work with students using face tracking to make video graffiti and projection mapping is documented here.

In-Ear Sound Monitors

These systems can be a little pricey, but you want a good system when using them in performance!

Here is a link to an in-ear monitor system

Singers and musicians use in-ear monitors for pitch information and more.

For Annie Dorsen’s piece Yesterday Tomorrow, in-ear monitors were used to provide a metronome and pitch for the singers while they performed live sight-reading of a musical score.
Watch an example here:
More info on that project here:

