
Chase Skiles

Independent Study
Process Breakdown
5/2/11
How this paper is organized:
Because of the complexity of this project, I've broken its description into separate categories so the information is chunked in a way that is clear and easy to follow.

Purpose
The reason for taking on this project is to further develop the brand I'm trying to build as Kaleid. I derived the name from the word kaleidoscope because of my fascination with the uniformity and expression of music and visual mediums. My main influences for this project are kaleidoscopes, for their generative artistic nature; Cirque du Soleil, for its uniformity of art, music and physical expression; and the contemporary, emerging genre of projection mapping, which fuses visuals with a physical structure.

Planning:
Evaluating Old Presentation
Very similar to this project, I had previously created a performance for Professor Betsy Pike my sophomore year that used a system I built from a network of programs: Ableton, VVVV, OSCulator and VDMX. That system worked as designed: a Windows XP machine ran Ableton and VVVV, with a custom VST wrapper pulling OSC data out of Ableton Live by means of VVVV. VVVV then piped the data over the UDP and TCP network protocols to the MacBook Pro running OSCulator, which decoded it and acted as the control hub between Ableton and VDMX.
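As a rough illustration of that bridge, the sketch below sends a single OSC message over UDP using the python-osc library. The IP address, port and OSC address pattern are placeholders I made up for the example, not values from the original VVVV patch:

```python
# Minimal sketch of the kind of OSC-over-UDP bridge the old rig relied on,
# using python-osc. The IP, port and address pattern are hypothetical
# placeholders, not the values from the original VVVV patch.
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("192.168.1.20", 8000)  # MacBook running OSCulator (assumed address/port)

# Relay a normalized fader value pulled out of Ableton to the control hub.
client.send_message("/ableton/track/1/volume", 0.8)
```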
Evaluating Equipment
Since the last performance, my available gear and number of controllers have increased dramatically. First I inventoried and surveyed what kind of controls I had at my disposal. Here is a quick list of the gear I would be utilizing for this performance.
MIDI
APC-20 – 8x5 grid for launching "clips", 9 channel sliders
Novation LaunchPad – 8x8 grid for launching "clips"
Zero4 – 4-channel mixer and MIDI controller with 8x4 knobs that can all be custom MIDI mapped
Korg nanoPAD – 2x8 grid
Korg nanoKONTROL – 9 sliders, 9 knobs, 18 buttons
M-Audio Evolution X-Session – 2x8 knobs and a slider
OSC
iPad - TouchOSC - completely programmable/customizable
Lighting
4 x Chauvet COLORsplash JR. – 4-channel DMX Par 38 cans
Chauvet Vue VI – 6-channel DMX LED effect light
Enttec DMX USB Pro – USB adapter for DMX control
Audio
Pioneer DJM-600 – 4 Channel Audio Mixer
Native Instruments Audio 8 – 16 Channel External Sound Card

Other
8' T-stand
Cardboard Logo
VGA to RCA/S-video adapter – created an SD preview feed for the V-55
Sonic Impact V-55 Monitor – used as an external preview so I could see what was happening on the screen behind me
VJ Kung Fu styled projector mount: http://vjkungfu.tv/archive/build-projector-mount/
Epson Powerlite S3 Projector

Evaluating Current Processes Used by Others

Luckily for the development of this project, there are resources available online that explain how to achieve some of these processes:
Memo TV: http://www.memo.tv/
Vade: http://abstrakt.vade.info/
VJKungFu: http://vjkungfu.com/

Evaluating Projection Mapping

Online there is a plethora of projection mapping performances, ranging from extremely complex to simplistic. Vimeo videos were a great resource in researching this. A few artists I drew inspiration from were:
Anti-VJ: http://www.antivj.com/
1024: http://1024d.wordpress.com/

Generative vs. Pre-Generated

From experience with the previous live video performance, I understood the strain that any visuals being presented would put on my video processor. This was important because I didn't have the processing power of a quad-core Mac Pro; I had a 15" MacBook Pro with a 2.8 GHz (single core) i7 and 8 GB of memory. Early in the development of this project I determined I wouldn't be using as many Quartz compositions because of the strain they can put on the system if demanding enough, especially when one computer is processing three very different controls at once. Through trial and error (trying ProRes initially) and research across VJ and VDMX forums, I found that the clips should be compressed with the Apple Intermediate Codec (AIC) for the best optimization. This codec does not use temporal compression, so every frame can be decoded and displayed immediately without first decoding other frames (http://support.apple.com/kb/HT2704). This translates into smoother playback and less strain on my hardware.
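To make the temporal-compression point concrete, here is a back-of-the-envelope sketch, not a benchmark of AIC or ProRes: with a long-GOP codec you may have to decode every frame since the last keyframe before the frame you scrubbed to can be shown, while an intra-only codec like AIC decodes exactly one frame.

```python
# Back-of-the-envelope illustration of why an intra-only codec (every frame
# is a keyframe) scrubs more smoothly than a long-GOP codec. Numbers are
# made up for illustration; they are not measurements of AIC or ProRes.
def frames_to_decode(target_frame: int, keyframe_interval: int) -> int:
    """How many frames must be decoded to display `target_frame`."""
    last_keyframe = (target_frame // keyframe_interval) * keyframe_interval
    return target_frame - last_keyframe + 1

target = 137  # an arbitrary frame the VJ jumps to mid-clip
print(frames_to_decode(target, keyframe_interval=30))  # long-GOP: up to 30 frames of work
print(frames_to_decode(target, keyframe_interval=1))   # intra-only (AIC-style): always 1
```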

Designing
Configuring Networks
All in all, there were three different data networks in place: OSC, MIDI and DMX. To ensure compatibility, great measures were taken to prevent data overlap and feedback. Many of the controls were disabled for individual programs, e.g. not sending the MIDI visual controls to the audio program and vice versa. The main hub for triggering most aspects of this performance, however, was Ableton Live.

Controllers to Computer
With the tremendous amount of data flowing into the computer from external controllers, it was necessary to control which data went to which programs. Within both Ableton and VDMX you can limit/control which devices will send MIDI into each application. Since there are more input devices than my computer has ports for, I used two hubs to connect them. One was a 7-port USB hub that let all of the USB controllers connect while leaving one port open on my computer for the Audio 8, so I wouldn't encounter latency issues with audio playback. On top of that, one of the controllers (the Korg Zero4) uses FireWire to connect to the computer. I have encountered issues before when daisy-chaining a hard drive into the second FireWire port on the back of the mixer, so to avoid crashes I ran both the Zero4 and my external hard drive into a FireWire hub. For all controllers, the data coming off them flows like this:
Controller>Hub>Port>Software
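Ableton and VDMX do this filtering in their own MIDI preferences rather than in code, but as a sketch of the same idea, the snippet below uses the mido library (python-rtmidi backend) to list attached MIDI inputs and open only the ones assigned to the visual rig; the device-name fragments are hypothetical.

```python
# Sketch of per-application MIDI input filtering using mido (requires the
# python-rtmidi backend). Ableton and VDMX do this in their MIDI preferences;
# this only models the idea. Device names below are hypothetical.
import mido

VISUAL_CONTROLLERS = {"Launchpad", "nanoPAD"}   # devices allowed to drive visuals

available = mido.get_input_names()
print("Attached MIDI inputs:", available)

# Open only the ports whose names match controllers assigned to the visual rig.
visual_ports = [mido.open_input(name) for name in available
                if any(tag in name for tag in VISUAL_CONTROLLERS)]

for port in visual_ports:
    for msg in port.iter_pending():   # non-blocking poll of each allowed device
        print(port.name, msg)
```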

Ableton-To-VDMX
Routing MIDI information from Ableton to VDMX is possible thanks to the IAC Driver (Inter-Application Communication), which enables MIDI data to pass from one application to another without the need for external hardware routing. I established this network and created MIDI tracks within Ableton with loop lengths identical to the song lengths. These were then placed directly beside the audio clips in Ableton's Session View, so the video clips trigger in perfect synchronization with the corresponding parts of the songs. The final routing of this data flows something like this:
Ableton>IAC>VDMX
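The snippet below sketches what that Ableton > IAC > VDMX hop amounts to: a note-on message written to the IAC bus, which VDMX then sees as ordinary MIDI input. The bus name and note number are assumptions; in the actual setup the notes come from clips in Ableton's Session View, not from a script.

```python
# Sketch of the Ableton > IAC > VDMX hop: anything written to the IAC bus
# shows up in VDMX as ordinary MIDI input. The port name and note number
# are assumptions (macOS names the default bus "IAC Driver Bus 1").
import mido

out = mido.open_output("IAC Driver Bus 1")

# Trigger whatever media bin / clip is mapped to this note in VDMX.
out.send(mido.Message("note_on", note=72, velocity=100, channel=0))
out.send(mido.Message("note_off", note=72, velocity=0, channel=0))
```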

Ableton-To-DMX
Stage lighting systems run on a protocol called DMX512 (Digital Multiplex, 512 channels of information). This is often abbreviated to just DMX, which is what I'll be referring to it as. All of my lighting fixtures also run this protocol, so any fixture can be coordinated and synced on this network. Since Ableton Live is the main hub for the synchronization of my performance, I needed to find a way to ensure that these would work together.
This was achieved by utilizing various tools and software developed for varying purposes. The first is Max for Live, an implementation of Max/MSP by Cycling '74. This is a visual programming language, similar to VVVV, that lets you develop both musical and visual tools that run seamlessly inside Ableton Live, and it is more stable than the VVVV VST wrapper used previously.
The handling of DMX was made possible by a few patches and plug-ins from a couple of developers. The first is DmAX, developed by The Impersonal Stereo (http://www.theimpersonalstereo.com/software/dmax/). This translates any modulation and envelopes into DMX data, which is then relayed to the Enttec DMX USB Pro dongle. For DmAX to communicate with the Enttec device it also needed a custom external to handle the translation of data; this is where Olaf Matthes' dmxusbpro Max/MSP external came into play (http://www.nullmedium.de/dev/dmxusbpro/). The routing of this data flowed something like this, with the exception of how it was triggered externally (via controllers):
Ableton>Max For Live>DmAX>dmxusbpro>DMX>Fixtures
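For a sense of what the dmxusbpro external is doing at the lowest level, the sketch below frames one DMX universe into an Enttec DMX USB Pro "send DMX packet" message and writes it to the widget's virtual serial port with pyserial. This is only my reading of Enttec's published framing; the serial device path and channel values are placeholders, and in the actual rig DmAX and the Max external handle all of this.

```python
# Rough sketch of the framing the dmxusbpro external performs: one DMX
# universe wrapped in an Enttec DMX USB Pro "send DMX packet" message and
# written to the widget's virtual serial port. Device path and channel
# values are placeholders; DmAX/Max handle this in the real rig.
import serial  # pyserial

START, END, SEND_DMX_LABEL = 0x7E, 0xE7, 6

def send_dmx(port: serial.Serial, channels: bytes) -> None:
    data = bytes([0x00]) + channels            # DMX start code 0, then channel levels
    header = bytes([START, SEND_DMX_LABEL, len(data) & 0xFF, (len(data) >> 8) & 0xFF])
    port.write(header + data + bytes([END]))

universe = bytearray(512)
universe[0:4] = [255, 0, 0, 255]               # arbitrary example levels for one fixture's four channels

with serial.Serial("/dev/tty.usbserial-ENxxxxxx", 57600) as widget:  # placeholder device path
    send_dmx(widget, bytes(universe))
```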

Creation
Sign creation / Logo Design
First off, I made a symbol in Illustrator that would be the design of the surface I would project onto behind me. For the design of this symbol I had in mind a brand logo that would carry across projects, a capstone logo that would become the brand for my persona, Kaleid. When using this persona I often market and promote it with a reversed K, so the final logo I created minimized the reversed K and a forward K down to two carets and a bar. I then made the cardboard sign by projecting this image onto the cardboard. The original size of the cardboard was 4'x8'. I maximized size by scaling the image to the full 4' height and making the overall width around 5'. I then cut it out, took scraps and hot-glued corrugated strips in perpendicular directions to increase the rigidity of the individual pieces... it is just cardboard after all.
The plus side of developing the surface this way is that I now had exactly the same image digitally as the physical sign, which could then be inverted and used to create the mask that maps the image solely onto the sign. The learning curve for this procedure caught me initially; I wasn't sure how to make something digital align with a physical object.
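A minimal sketch of the digital side of that step, using Pillow: take the same logo artwork that traced the cardboard, reduce it to a hard black-and-white matte, and invert it so everything except the logo is blocked off. The file names and the assumption that the logo is light on a dark background are mine; in practice the mask still has to be warped to match the physical sign (see the pre-setup section).

```python
# Minimal sketch of turning the logo artwork into a projection mask with
# Pillow: everything that isn't the logo ends up black (masked off).
# File names are placeholders; the mask still has to be warped to the sign.
from PIL import Image, ImageOps

logo = Image.open("kaleid_logo.png").convert("L")      # grayscale; assumes logo is light on dark
matte = logo.point(lambda v: 255 if v > 128 else 0)    # hard threshold into a two-tone matte
mask = ImageOps.invert(matte)                           # flip it if the opposite polarity is needed
mask.save("kaleid_mask.png")
```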
Motion Projects
For the majority of the visuals I used in this performance, I generated quite a few of the effects inside of Apple's Motion. Within Motion I sampled from stock footage I had purchased from Artbeats (http://www.artbeats.com), from custom graphics, and from simple shapes built within Motion. To nearly all of these I applied some kind of kaleidoscope or mirror filter. The reason for applying such effects is that my projection surface is a mirrored shape and the persona itself is Kaleid; I had every intention of fusing this concept into the imagery of the project so that the identity would be well formed.
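As a toy illustration of what a mirror filter does (Motion's filters are far more configurable), the sketch below builds a simple four-way kaleidoscope from one quadrant of a frame using NumPy and Pillow; the input file name is a placeholder.

```python
# Toy four-way mirror/kaleidoscope: take the top-left quadrant of a frame and
# reflect it into the other three quadrants. Motion's Kaleidoscope filter is
# far more configurable; this only shows the underlying mirroring idea.
# The input file name is a placeholder.
import numpy as np
from PIL import Image

frame = np.asarray(Image.open("frame.png").convert("RGB"))
h, w = frame.shape[0] // 2, frame.shape[1] // 2
quad = frame[:h, :w]

top = np.concatenate([quad, np.flip(quad, axis=1)], axis=1)   # mirror left-to-right
full = np.concatenate([top, np.flip(top, axis=0)], axis=0)    # mirror top-to-bottom

Image.fromarray(full).save("frame_kaleido.png")
```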

Quartz Projects
As mentioned earlier, I had experimented with both generative and pre-generated materials. Quartz Composer is a visual programming language that allows the creation and live generation of visuals. It is also a framework that nearly any Apple software developer would recognize, since it is one of Apple's core graphics frameworks. I decided to use a couple of these compositions, regardless of the strain they can put on a system, because of their audio-reactive nature.

Audio planning
Before the performance can take place, the arrangement of tracks isn't 100% necessary for playback; however, to keep the performance easy to read and handle, I organized two main audio tracks and a side "drum" channel, along with the accompanying video-triggering MIDI channels and a channel for the lighting triggers. I chose two main audio tracks because this was the easiest way to trigger two video files without putting too much strain on the processor and graphics card. Keep in mind my computer was running a DAW, a visual performance program, and also controlling the lighting. For my style of audio mixing it also wasn't necessary to mix more than two main tracks, and the side "drum" audio track was used for simple drum patterns that wouldn't have a direct visual representation.

Coordinating Visuals with Audio


I organized the video clips for each song into bins inside of VDMX, one bin per song title. Each of these bins can then be triggered via MIDI coming through the IAC driver explained previously. The video-triggering MIDI tracks each had a note (C-2 to C-3) assigned to them that was relayed to VDMX to trigger the respective media bin. Each of the two media bins had notes that would trigger each video clip, and each bin would load its clips into its respective layer, one or two. Notes on separate octaves, C4-B4 and C5-B5, triggered the clips of both bins, so the same clip could essentially be loaded to two separate layers. This automated the workflow tremendously by reusing the same scale of notes rather than assigning different notes to each and every clip.
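A small sketch of that note layout, assuming Ableton's octave naming where C3 is MIDI note 60 (so C4-B4 is 72-83 and C5-B5 is 84-95); it only illustrates how one scale of notes can address the same clip index on two layers, not VDMX's internals.

```python
# Sketch of the clip-triggering note layout, assuming Ableton's octave naming
# where C3 = MIDI note 60, so C4-B4 (72-83) drives layer 1 and C5-B5 (84-95)
# drives layer 2. An illustration of the mapping only, not VDMX's internals.
def note_to_layer_and_clip(note: int):
    if 72 <= note <= 83:        # C4-B4: layer 1
        return 1, note - 72
    if 84 <= note <= 95:        # C5-B5: layer 2
        return 2, note - 84
    return None                 # notes outside the clip range (e.g. bin-select notes)

print(note_to_layer_and_clip(74))  # (1, 2): third clip in the bin, loaded to layer 1
print(note_to_layer_and_clip(86))  # (2, 2): the same clip index, loaded to layer 2
```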

Performing
Pre Setup
For this performance to work visually, a few details are required on the mapping side of getting the image onto the shape. This calls for a "registration" of the projector to the sign, which is made easier by a couple of plug-ins for VDMX by Memo.tv of MSA Visuals. I utilized his MSA Quad Warp and MSA Quad Mask plug-ins, which can be found here: http://vdmx.memo.tv/. The quad warp allows you to correct both horizontal and vertical keystone, so that regardless of where the projector is positioned, the image warps to the full size of the logo. The quad mask was then used to prevent overflow/bleed from the projection onto the areas surrounding the sign.
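The corner-pin geometry behind that quad warp can be sketched with OpenCV: measure where the sign's four corners land in the projector's frame, compute the perspective transform that maps the rendered output onto them, and warp every frame through it. This is not MSA Quad Warp's implementation, just the same idea; the corner coordinates below are made-up numbers.

```python
# Sketch of the corner-pin geometry behind a quad warp: map the rendered
# frame's corners onto the four measured corners of the sign in projector
# space. Not MSA Quad Warp's code, just the same idea in OpenCV; the corner
# coordinates and file names are made up.
import cv2
import numpy as np

frame = cv2.imread("vdmx_output.png")                 # placeholder for one rendered frame
h, w = frame.shape[:2]

src = np.float32([[0, 0], [w, 0], [w, h], [0, h]])    # corners of the rendered frame
dst = np.float32([[112, 64], [880, 90], [860, 540], [130, 510]])  # measured sign corners (projector pixels)

H = cv2.getPerspectiveTransform(src, dst)
warped = cv2.warpPerspective(frame, H, (w, h))        # what actually gets sent to the projector
cv2.imwrite("warped_output.png", warped)
```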

Post Performance
For the remainder of the evening I was responsible for continuing to perform. For that reason I didn't program anything onto the Novation Launchpad, so that my configuration for Traktor DJ Studio would still function; this way I could bounce from Ableton to Traktor with minimal downtime. However, because a few controllers do overlap with the Traktor configuration I have in place, I was required to close the MIDI connection between Ableton and the Zero4 and X-Session controllers. This also required me to switch OSC layouts on the iPad and in OSCulator.

Closing Comments
The outcome of this presentation has led to a clearer understanding of the processes required to create such a performance and has only fueled a desire to expand this process from the minimized 30-minute performance into a full two-hour one. I would also like to challenge the structures upon which I perform; creating a full 3D performance enclosure could also be a goal: http://vimeo.com/15734398
