
UNIVERSITY OF VIRGINIA

A Survey of Modern Music Software Ecosystems

Submitted as part of the qualifying examinations for candidacy to

Doctor of Philosophy in Composition and Computer Technologies

by

Eli Michael Stine

2017
Contents
Introduction ............................................................................................................... 1
Overview .................................................................................................................... 3
1. The Creative Software Ecosystem .......................................................................... 5
1.1 Creative Practice .......................................................................................................... 6
1.2 Defining the Creative Software Ecosystem .................................................................. 7
1.3 Analyzing the Creative Software Ecosystem ................................................................ 9
1.3.1 Design ........................................................................................................................... 9
1.3.2 Functionality .............................................................................................................. 12
1.3.3 Use ............................................................................................................................... 14
1.3.4 Environment .............................................................................................................. 17
2. Music Software Types .......................................................................................... 18
2.1 Music Software Modes of Engagement ....................................................................... 18
2.1.1 Text-Based .................................................................................................................. 19
2.1.2 Score-Based ................................................................................................ 20
2.1.3 Patch-Based ................................................................................................................ 22
2.1.4 Performance-Based .................................................................................................... 23
2.2 Environmental Metaphors .......................................................................................... 25
2.3 Music Software Characteristics .................................................................................. 28
3. Case Studies ......................................................................................................... 31
3.1 SuperCollider 3.7.2 ..................................................................................................... 31
3.2 Renoise 3.1.0 ................................................................................................................ 34
3.3 Max 7.3.3 ..................................................................................................................... 37
3.4 Ocarina 1.4.6 ............................................................................................................... 39
4. Conclusion ............................................................................................................ 41
Bibliography ............................................................................................................ 43

List of Figures
Figure 1. The Complex, Multi-Directional Interaction of Components in the Creative
Software Ecosystem .................................................................................................... 8
Figure 2 Examples of User Design Modification: The GarageBand Effect Plug-ins Menu
(left), a Pro Tools Presets Menu (center), Max Objects Menu (right) ...................... 10
Figure 3 AfroDJMac's Performance and Recording Setup (from left to right: Akai
MPK49, APC40, and Novation Launchpad). (Belyamani, 2011) ............................ 12
Figure 4 Different Designs, Same Function: Outputting a 440 Hertz Sine Wave in 5
Music Softwares........................................................................................................ 13
Figure 5 A Variety of Uses of the Max Music Software (Max Artists Interviews, 2017) 15
Figure 6 Incorporating Community Into Design: Upload to SoundCloud Feature in
Ableton Live ............................................................................................................. 16

Figure 7 Use-Directed Advertising: Bitwig Studio Emphasizing Its New, Better
Hardware Integration ................................................................................................ 16
Figure 8 Software Use in Creative Practice as a Point of Convergence Between Art,
Community, and Technology.................................................................................... 18
Figure 9 The text-based GUI component of the CsoundQT integrated development
environment (IDE). ................................................................................................... 20
Figure 10 The Jesusonic IDE embedded in Reaper, affording a text-based mode of
engagement. .............................................................................................................. 20
Figure 11 A Sibelius score, including the full score view (top) and accompanying
timeline view (bottom). ............................................................................................. 21
Figure 12 Propellerhead's Reason sequencer view, showing different threads (tracks) for
different musical materials. ....................................................................................... 21
Figure 13 Pure Data Patch (left) and Kyma Patch (right). ................................................ 22
Figure 14 Reaktor Patch Defining a Synthesizer. ............................................................. 23
Figure 15 The User Interface of Pitter, a program created by the author using Max,
including a GUI of parameters that invite performance ........................................... 24
Figure 16 VirtualDJ Deck Control View. ......................................................................... 24
Figure 17 Drum Set Pro App Primary GUI ...................................................................... 25
Figure 18 A Pro Tools Session, with Edit (left) and Mix (right) Views, Typifying the
DAW environment metaphor. ................................................................................... 26
Figure 19 A Real-Time Spectrum in Amadeus Pro, which assists users while in a
particular mode of engagement ................................................................................. 27
Figure 20 SuperCollider, with a primary coding view, accessible help and documentation
(top right), and post window (bottom right). ............................................................ 32
Figure 21 Various uses of (mostly) SuperCollider by Joo Won Park, from his 100
Strange Sounds project ............................................................................................. 34
Figure 22 Renoise, with a primary tracker view, pattern edit view (left) and effects
(bottom)..................................................................................................................... 35
Figure 23 The Art of the Tracker Interface ....................................................................... 36
Figure 24 Max's Patching Mode, dominated by a view of objects, connection, and
customizable GUI elements ...................................................................................... 38
Figure 25 Ocarina, with a performance view (left), a social view (center), and settings
view (right)................................................................................................................ 40

List of Tables

Table 1 Music Software Modes of Engagement ............................................................... 19


Table 2 Music Software Characteristics ........................................................................... 28

Introduction
The use of music software in modern musical contexts is ubiquitous, to the extent that "musicians can't do much today without software" (Puckette, The Deadly Embrace..., 2014, p. 8). The uses of music software are as varied as the ways in which people engage with music, encompassing a large, diverse set of users and applications. The context of these engagements defines the role of the music software, from toy to canvas to laboratory to instrument. The level of engagement ranges from simple, arbitrary choices, such as when a child plays with a mobile music app that lets them move brightly-colored circles around a screen to turn music loops on and off1, to the management of complex systems that users study for years to master, as is the case when a studio engineer uses a professional music editing and mixing software2 to prepare a record for global distribution.

Platforms range from calculators3 to mobile phones4 to video game consoles5,6 to personal computers7 to professional studio hardware and accompanying software8. Developers of music software vary from huge companies that employ hundreds9 to small groups of academics who create music software as part of their research10 to freelance artists who develop individualized music software as part of their artistic practice11. The life of music software ranges from several performances or works to decades, during which the software retains its paradigmatic identity or splits off into multiple projects, all the while undergoing numerous rewrites and updates as a function of the community using it, advancements in technology, and the feature interests of the developer(s) (Möllenkamp, 2014, p. 2).

1. BabyDJ. http://www.babydj.ru/promo
2. Such as Avid's Pro Tools. http://www.avid.com/pro-tools
3. Houston Tracker 2. http://irrlichtproject.de/houston/
4. GarageBand for iOS. https://www.apple.com/ios/garageband/
5. Mario Paint, which included the Mario Paint Composer tool, allowing players to arrange sprites on a grid corresponding to time and pitch. An emulator may be found here: https://danielx.net/composer/
6. Electroplankton, a game bundled with ear buds, whose gameplay mechanic is the creation of interacting systems of plankton that produce sounds. https://www.nintendo.co.jp/ds/dsiware/electroplankton/index.html
7. FL Studio. https://www.image-line.com/flstudio/
8. Fairlight. http://www.fairlightau.com/
9. Avid. http://www.avid.com/

Early on in the history of computer music, a concern over lack of engagement with human gesture gave rise to many different hardware-based software controllers, devices which make the control of music software visually accessible and tangible, and which grow in number and rank to this day (Cook, 2001, p. 1). These controllers take many forms: encrusted with interactive knobs, buttons, pads, and sliders12, attaching to and/or extending the design of musical instruments13, or mitigating touch altogether and facilitating the control of music through movement in space14,15.

Networked and distributed collaborative music software16,17 and massive online communities dedicated to the dissemination and discussion of music made with specific softwares and in specific genres18,19 bring music software users together. This virtual social space is complemented by (and more recently a defining factor of) performances by the many touring live artists whose primary means of expression (and of supporting themselves) is their engagement with music software, along with the many devoted attendees of these music software performances.

10. The Center for Computer Research in Music and Acoustics at Stanford, for example.
11. Michael Klingbeil's SPEAR. http://www.klingbeil.com/spear/
12. Launchpad. https://us.novationmusic.com/launch/launchpad
13. K-Bow. https://www.keithmcmillen.com/labs/k-bow/
14. Radio Baton. https://ccrma.stanford.edu/radiobaton/
15. PlayStation Move motion controller. https://www.playstation.com/en-us/explore/accessories/playstation-move/
16. Splice. https://splice.com/
17. Blend. https://blend.io/
18. SoundCloud. https://soundcloud.com/
19. BandCamp. https://bandcamp.com/

The universal appeal of engagement with music not as a consumer but as a producer, coupled with the current accessibility of modern music software, whose barrier to entry is seemingly only access to some digital device and a speaker, has led to a global ecosystem of modern music software. In this ecosystem, musical communities give rise to music software (or vice versa), and musical usage and software design form a feedback loop, each component feeding off the other: "the music shaping the software, the software shaping the music, this new music affecting the software, and so on" (Puckette, The Deadly Embrace..., 2014, p. 8).

This work seeks to explore the different types of music software available and to analyze how they function in such ecosystems, with an emphasis on software existing both within the creative toolbox of the artist(s) directly engaging with it and as part of larger musical, social, and technological environments.

Overview
For the purposes of this paper I define a piece of music software as, quite simply, a software that is used to create music. The terms "create" and "music" are defined by the people using the software and the communities within which they use that software, but generally these softwares are used to organize, modify, and/or otherwise affect sound. The emphasis on use in this definition is very intentional. A software tool or set of tools that may not be considered music software at this point in time may, in a very short amount of time, be used for that purpose and then be redefined20. In this paper I will concentrate exclusively on software primarily designed for music composition, production, and/or performance.

Music software facilitates creativity in the domain of sound in many different ways, ranging from the recording, editing, and affecting of real-world sound, to the design and sequencing of software synthesizers, to the programming of code to define complex, evolving musical textures. Other software aids the user in getting "outside of the box," emphasizing the computer's role in physical, analog musical environments and activating the performative role of the computer (Charrieras & Mouillot, 2015). Still other software interfaces with non-musical disciplines (biology, architecture, computer science, etc.), using sound, and the control of sound, in a multitude of ways to engage with that discipline's interests (Vickers, 2011).

These software capabilities and uses came about as a function of the contexts within which they were spawned. Because of this, it is absolutely necessary to contextualize the uses of these softwares within a historical, developmental framework as well as to investigate their use as integral components in social, technological, and musical environments. Research spanning design studies, software studies, and ethnographically-informed and musicology-driven analyses of the uses of music software will be used in this contextualization process.

20. A case in point comes from Minecraft, a multi-user sandbox video game within which users can create and destroy blocks (among many other things). In version 1.2 of the game the developers introduced a special type of block called a Note Block (http://minecraft.gamepedia.com/Note_Block). When activated by an energy source called redstone, note blocks emit a user-selectable pitch. Combining this functionality with the various other mechanical control facilities of Minecraft, users began to create their own note sequencers and music software within the sandbox world of Minecraft. https://www.youtube.com/watch?v=mjLDM1AY1-E

Towards discussing music software ecosystems, in Section 1 I briefly describe and analyze the creative software ecosystem, including software's role in the creative process and its interaction with design, functionality, and use. Informed by this discussion, in Section 2 I then present a set of primary music software modes of engagement, discuss their involvement in software environment metaphors, and delineate a list of significant music software characteristics with which music software may be described. Next, in Section 3 I investigate four exemplars of different types of music software, comparing and contrasting them through brief system analyses, developmental histories, and ethnographic portraits of the musical communities that engage with them. I conclude by describing the topology of the modern music software ecosystem as an ever-shifting network whose impact on global media and culture is complex and profound (Section 4).

1. The Creative Software Ecosystem


Software alone, an interface, a set of functionality, a program running on a computer and its associated input and output hardware, does not create music. It is at the first interaction with a human agent that it begins to sing, that it sculpts, morphs, sequences, and otherwise controls data over time. It is this interactive performance, this direction of computation, that I am most concerned with in this work. Towards analyzing the externalized and internalized human motivations, contexts, histories, and the multitude of other influences at play during this complex act (Duignan, 2008, p. 28), I will decompose the process of creation using several analytical lenses.
1.1 Creative Practice

Creative practices are as varied as creative practitioners, and no two people or groups of people engage with art and artistic tools in the same way, which is part of what makes art so expressive and varied. For our purposes, I will model creative practice via the reflection-in-action paradigm (Schön, 1983). Reflection-in-action is a cycle involving making an action, analyzing the results of that action, determining one's next action, and then repeating that cycle until, after analysis of the results of a previous action, one's next action is to take no action at all. This cycle may take place at many (or simultaneously multiple) time scales, ranging from intuitive, moment-to-moment performative decisions21 to the shaping of compositions over years. During the creative process the analysis step of this paradigm is dynamically informed by local or global factors; that is, during the reflection-in-action process a larger goal may drive an artist's choices, or their next choice may simply be informed by the results of the last action, approaches that I will label utilitarian and experimental, respectively. The choices made and the influences on this cycle are informed by a vast number of factors encompassing the artist's knowledge, abilities, resources, and aesthetic preferences.

When a computer is integrated into the reflection-in-action cycle, interactions with the computer's software may be described via the execution-evaluation cycle (Norman, 1988, p. 41). The execution-evaluation cycle is a necessarily discrete process that involves establishing a desired output, determining the action(s) needed to (best) achieve that output (what is called the intention), executing those action(s), perceiving and interpreting the post-execution system state, and lastly evaluating the system state with respect to goals and intentions (and modifying them accordingly) (Dix, 2009, p. 15). This digital codification of the portion of the reflection-in-action cycle that involves human-computer interaction will be used when we discuss the design, functionality, and use of types of music software.

21. Driven by reflexive actions, as when performing on an instrument.
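The discrete loop described above can be made concrete in ordinary code. The following Python sketch is purely illustrative: the function names and the toy "fader level" goal are my own inventions, not part of Norman's or Dix's formulations; the sketch only shows the ordering of intention, execution, and evaluation.

```python
# A toy rendering of the execution-evaluation cycle as a discrete loop.
# All names here are illustrative; no real music software is being modeled.

def execution_evaluation_cycle(state, goal, form_intention, evaluate, max_steps=100):
    """Repeat: form an intention, execute it, perceive and evaluate the result,
    until the chosen next action is to take no action at all."""
    for _ in range(max_steps):
        action = form_intention(state, goal)  # determine the action(s) to take
        if action is None:                    # next action is "no action": stop
            return state
        state = action(state)                 # execute the action
        evaluate(state, goal)                 # perceive/interpret the new state
    return state

def approach_volume(state, goal):
    """Intention: nudge a fader one step toward the desired level, or stop."""
    if state == goal:
        return None
    step = 1 if goal > state else -1
    return lambda s: s + step

# Starting at level 0 with a goal of 5, the cycle settles at the goal.
final = execution_evaluation_cycle(0, 5, approach_volume, lambda s, g: None)
print(final)  # → 5
```

The point of the sketch is that the cycle is discrete and terminates only when evaluation produces no further intention, which is exactly the stopping condition of the reflection-in-action paradigm described above.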

1.2 Defining the Creative Software Ecosystem


Zooming out from our brief analysis of the intimate creative process, we now expand our view by tracing how software, its users, its developer(s), and the communities they are a part of engage with a music software. This tracing process is an analysis of development: development from the perspectives of a user and a developer, as they learn and build, respectively, their relationship to a software.

First, a creative software has a material design, defined by its developer(s) using software development tools (including software development kits (SDKs) and application programming interfaces (APIs), tools which may themselves be traced back to earlier development). This design is most immediately palpable to a user via the graphical user interface (GUI), but it also includes the ways in which that GUI interacts with the software's capabilities, what is "under the hood" of the interface. The design of the software, when combined with a user's knowledge, spawns affordances, defined here as detected potentials for action (Norman, 1988, p. 11). As these affordances are arranged into a mental model of the software and subsequently acted upon, the software takes on a particular functionality to the user, a personal establishment of what the software can do and how it does it (Duignan, 2008, p. 25). As this functionality permeates and intermingles with the environment of the user, the space within which the creative software exists, the use of the software, how its functionality is purposefully deployed, is ultimately determined.
Figure 1. The Complex, Multi-Directional Interaction of Components in the Creative Software Ecosystem

The interaction of design, functionality, use, and environment is complex, multi-directional, and saturated with continuous adaptation, with a linear navigation of this space the exception rather than the rule (Figure 1). Most important to note is that in this ecosystem design informs but does not dictate a specific functionality, and likewise functionality informs but does not dictate a specific use, processes which are intimately linked to a specific environment. To better understand the complexities of this interaction, we separate and discuss its constituent parts, relating them to concrete music software examples.
1.3 Analyzing the Creative Software Ecosystem
1.3.1 Design
The design of a software, at a given moment during its use, is singular, defined entirely through its code22. Design encompasses both the look of the software (its GUI) and its capabilities (the full set of actions it can execute). Software design, as with any type of design, is highly informed by previous designs and design applications, resulting in codifications of models of common design problems and their solutions, called pattern languages, in domains including software engineering and user-interface design (Duignan, 2008, p. 38).

The environment that a software is designed in, including the creative and technological communities of its users, has a significant impact on its design, as application drives development. In turn, how a software is used affects its design, indirectly through the developers' observation of its deployment and directly via feedback received from users. On comparing direct and indirect environmental influences on creative software design, Miller Puckette, inventor of the Max lineage of music software23, states that "much more can be learned much faster if the software developer becomes personally involved in at least some projects in which artists use the software" (Puckette, The Deadly Embrace..., 2014, p. 6).

A user may personally, locally modify the design of a software in a variety of ways. A user may take advantage of a program's modularity (the degree to which a system's components can be separated and combined24) and incorporate new modules into the software. In the context of music software, plug-ins, described by audio-visual software company Avid as "special purpose software components that provide additional signal processing and other functionality"25, are widely used and supported. A software may incorporate plug-ins into its standard distribution and may also support third-party plug-ins produced by one of the many music software plug-in companies26 that produce plug-ins in standardized formats (such as Audio Units, Virtual Studio Technology (VST), etc.). Figure 2, left, displays plug-ins provided by a software distributor (top) and a menu populated by user-added plug-ins (bottom).

22. More accurately, a software's design is the result of the effects that executing its code has on the user-perceivable input and output components of a specific machine.
23. Including jMax, Pure Data, Cycling '74's Max, and several others. Discussed further in Section 3.3.

Figure 2 Examples of User Design Modification: The GarageBand Effect Plug-ins Menu (left), a Pro Tools Presets
Menu (center), Max Objects Menu (right)

24. Definition of modularity. http://www.dictionary.com/browse/modularity
25. Pro Tools Online Docs, Audio Plug-ins Guide, Version 11.2.
26. iZotope, for example.
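The modularity that plug-in support relies on can be sketched in miniature. The following Python sketch is conceptual only: the registry, the decorator, and the two trivial "effects" are hypothetical names invented for illustration, and real plug-in formats such as VST and Audio Units define far richer native interfaces.

```python
# Conceptual sketch of a plug-in architecture: a host exposes a registry,
# and separately authored modules add signal processors to it.
# All names are illustrative, not drawn from any real plug-in format.

PLUGIN_REGISTRY = {}

def register_plugin(name):
    """Decorator a third party uses to add a processor to the host."""
    def wrap(fn):
        PLUGIN_REGISTRY[name] = fn
        return fn
    return wrap

@register_plugin("gain")
def gain(samples, factor=0.5):
    """A trivial 'effect': scale every sample."""
    return [s * factor for s in samples]

@register_plugin("invert")
def invert(samples):
    """Flip the polarity of the signal."""
    return [-s for s in samples]

def process(chain, samples):
    """Run audio through a user-chosen chain of registered plug-ins."""
    for name in chain:
        samples = PLUGIN_REGISTRY[name](samples)
    return samples

print(process(["gain", "invert"], [1.0, -0.5, 0.25]))  # → [-0.5, 0.25, -0.125]
```

The design point the sketch makes is that the host never needs to know which processors exist in advance: separability (each effect is a self-contained function) and combinability (any chain of registered names) are exactly the two halves of the modularity definition cited above.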

A user may also customize a software, altering the organization and specifics of its design as per their preferences, actions facilitated internally by the software. In the context of music software, customization may be done by saving presets: snapshots of the set parameters of a particular component of the software. Figure 2, center, displays a preset menu for an equalization plug-in containing standard distribution presets (top) and user-defined presets (bottom).
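A preset, in this sense, is just a serialized snapshot of parameter values that can later be restored. A minimal Python sketch, assuming a hypothetical equalizer whose parameters form a flat dictionary (the parameter names and file name are invented for illustration, not taken from any real plug-in):

```python
import json

# Hypothetical EQ parameters; the names are illustrative only.
current_settings = {"low_gain_db": -3.0, "mid_gain_db": 0.0, "high_gain_db": 2.5}

def save_preset(path, settings):
    """Snapshot the current parameter set to disk as JSON."""
    with open(path, "w") as f:
        json.dump(settings, f, indent=2)

def load_preset(path):
    """Restore a previously saved parameter snapshot."""
    with open(path) as f:
        return json.load(f)

save_preset("vocal_eq.json", current_settings)
restored = load_preset("vocal_eq.json")
print(restored == current_settings)  # → True
```

Real music software typically uses its own binary or XML preset formats, but the round trip shown here (capture the parameter state, write it out, read it back) is the essential mechanism.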

Lastly, if supported by a particular program, a user may also extend a software, writing new code that alters the program's design at a base level. In the context of music software, by virtue of accessibility, extension most frequently happens in open-source software27 or through the use of a specific API distributed by the developer28, created to enable users to extend the software. Figure 2, right, displays a list of objects installed in the Max music software, including external objects created by its community.

Other modifications to design may happen external to the software. Altering the means of interaction with, and overall tangibility of, a software design through the incorporation of a hardware controller, for example, may offer new affordances to the user, changing the system's functionality and potentially its creative use (Figure 3). Running multiple softwares simultaneously and linking their designs offers another way to contextualize a software as part of a larger meta-design (addressed in Section 2.2).

27. As in the creation of Pd-extended from Pure Data (vanilla).
28. Cycling '74 distributes
Figure 3 AfroDJMac's Performance and Recording Setup (from left to right: Akai MPK49, APC40, and Novation Launchpad). (Belyamani, 2011)

1.3.2 Functionality
As a user perceives, interprets, and evaluates a music software's design, its affordances are revealed, which, when acted upon, define the software's functionality. Functionality may simply encompass a subset of the available actions of a software, or may also include functions that involve combinations of actions. The exact same software design may have many different functionalities, as different users bring different skill sets, backgrounds, and goals to the software, identifying different affordances and their resultant functionalities (Möllenkamp, 2014, p. 2). Conversely, different software designs may have the exact same functionality, allowing users to do the same thing (producing a sine wave at 440 hertz, for example (Figure 4)) in different ways. How many actions this function requires, along with the perspicuity of the affordances that must be acted upon to facilitate those tasks (that is, functional clarity), may cause a user or a musical community to privilege one software's ability to accomplish a task over another's, causing that software to be a "go-to" software for that particular function in a given music software ecosystem (D'Errico, 2016, p. 59).

Figure 4 Different Designs, Same Function: Outputting a 440 Hertz Sine Wave in 5 Music Softwares
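The shared function of Figure 4 can also be stated outside any of these environments. As a point of reference, here is the same task, a one-second 440 Hz sine tone, in plain Python using only the standard library; the output file name is arbitrary, and the parameter values are ordinary defaults rather than anything mandated by the softwares in the figure.

```python
import math
import struct
import wave

SAMPLE_RATE = 44100   # samples per second (CD quality)
FREQUENCY = 440.0     # concert A, in hertz
DURATION = 1.0        # seconds
AMPLITUDE = 0.5       # half of full scale, to leave headroom

def sine_samples(freq, duration, rate=SAMPLE_RATE, amp=AMPLITUDE):
    """Return a list of 16-bit sample values for a sine tone."""
    n = int(rate * duration)
    return [int(amp * 32767 * math.sin(2 * math.pi * freq * i / rate))
            for i in range(n)]

def write_wav(path, samples, rate=SAMPLE_RATE):
    """Write mono 16-bit PCM samples to a WAV file."""
    with wave.open(path, "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(2)   # 2 bytes = 16-bit samples
        w.setframerate(rate)
        w.writeframes(struct.pack("<%dh" % len(samples), *samples))

if __name__ == "__main__":
    write_wav("a440.wav", sine_samples(FREQUENCY, DURATION))
```

However it is expressed, whether as a unit generator patched to an output, a line of code, or a tracker entry, the underlying function is identical, which is precisely the design/functionality distinction drawn above.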

A user's perception of the functionality of a software changes over time as a product of their knowledge of its affordances, a process that is informed by the user's engagement with a software's design, the continuous formulation of a mental model driven by that engagement, their musical and technological training, and other changes caused by the musical, technological, and social communities that they are a part of. Ways that the subjective affordances of a particular software, and its resultant functionality, may change include self-guided experimentation, developer tutorials, independently-created tutorials, and conversational engagement with other users of that software ("shop talk").
During this process of software design evaluation, users may reveal hidden affordances of the design, uncovering functionalities that the original designer did not recognize (Gaver, 1991, p. 80). As mentioned in the preceding discussion of design, the interaction between users and developers is essential for the improvement of a software and the evolution of media technology in general, and the recognition and deployment of new functionalities in designs the developer did not consider is a significant part of this process. The birth of new functionality may cause a developer to modify their design to make the affordances that facilitate that functionality more foregrounded or expressive, affecting its subsequent users, who may, in turn, reveal more hidden affordances.

1.3.3 Use
Directing functionality to a specific task in an environment defines a software's use. As with the relationship between design and functionality, a certain functionality may be used in many ways (Figure 5), and different functionalities may be used in the same way. The uses of music software, as mentioned in the introduction, are vast, covering a wide gamut of the creative expression space. A sampling of music software uses includes the creation, editing, and processing of fixed media (production), live performance (laptop performance, instrument extension), musical pedagogy, the creation of notation (engraving), engagement with other media software (video, animation, etc.), research in academic settings, integration into ludic systems (video games), connecting to other musicians remotely, and many more.
Figure 5 A Variety of Uses of the Max Music Software (Max Artists Interviews, 2017)

A music software's environment directs the way in which its functionality is used, to the extent that the two are absolutely inseparable. A particular functionality (for example, the ability to produce an A440 sine wave) in one musical setting may be directly applicable to some application in that setting (say, tuning with an ensemble before performance), whereas in another musical setting (a bedroom studio, for example) that functionality has no problem to solve, no musical application to engage with, and thus has no use.

Long before the environment that a user positions themselves in affects how they use a particular software, software developers and marketing teams consciously direct their design towards a particular use (Figure 6), privileging behaviors and actions whose functionality is frequently deployed in their target use and de-emphasizing other, less-used actions and behaviors.

Figure 6 Incorporating Community Into Design: Upload to SoundCloud Feature in Ableton Live

This directing of use extends past the bounds of the software, saturating its promotional materials (Figure 7), its tutorials, and ultimately the minds of its users (and potential users). This extension may also affect the design of other music software, functioning as a catalyst for the evolution of the uses of music software systems in general (D'Errico, 2016, p. 53).

Figure 7 Use-Directed Advertising: Bitwig Studio Emphasizing Its New, Better Hardware Integration

1.3.4 Environment
Where, when, how, and why a software is used and by whom it is being used are all

elements of a software's environment. The environment of a software has already played a role in the preceding descriptions of design, functionality, and use, but I now briefly discuss

how the creative practice of engagement with a software pushes outward, defining its

own environment.

First, the production and distribution of music created using music composition software

(its use) is an essential part of defining many musical communities. Producers and

consumers in these musical communities have an interest in how the music they engage

with is created, i.e. the "story behind the sound", an integral part of which is an artist's

engagement with music software and their transparency about that use. Second,

the interaction between the users of a particular software and its developers is absolutely

essential in order for the software to evolve and improve, and more generally bi-

directional interaction between developers and users drives media technology progress.

This evolution is predicated on users deeply engaging with a software. Third, the

evolution of music produced with music composition software can take place only so

much in the theoretical domain. Exploratory new techniques that create new

combinations, transformations, and presentations of sound come about through the

application of theory and experimental engagement with music software; in other words,

new use expands the set of ways by which music software may be used within a

particular environment, in turn expanding the types of music in that environment (Figure

8).

Figure 8 Software Use in Creative Practice as a Point of Convergence Between Art, Community, and Technology

2. Music Software Types


Moving forward from our descriptions of creative process and the creative software

ecosystem, we now focus on methods to compare and contrast the designs, functions, and

uses of music software.

2.1 Music Software Modes of Engagement

In order to better decompose the archetypal designs of music software, I propose as a first

means of differentiating music software types a set of modes of engagement, descriptions

of designs that relate to 1) the GUI, 2) the underlying capabilities of the software, and 3)

the mental model that a user engages with while using these designs within their creative

musical practice. Presented below are the four primary modes of engagement I have

chosen to focus on, heavily influenced by the work of Möllenkamp, Duignan, and D'Errico ((Möllenkamp, 2014), (Duignan, 2008), and (D'Errico, 2016), respectively):

Table 1 Music Software Modes of Engagement

I. Text-Based
II. Score-Based
III. Patch-Based
IV. Performance-Based

Within a music software these modes of engagement may be presented in parallel (for
example, a design allowing a user to switch between text-based and patch-based modes
of engagement), presented in series (for example, a design that has a compound,
performance-text-based mode of engagement, where a user must engage with both
performative and textual modes of musical creation), or both in parallel and in series (a
design that involves a user mode-switching between compound modes). A brief discussion of each mode of engagement, along with examples of each, follows.

2.1.1 Text-Based
The text-based mode of engagement privileges a GUI dominated by text. Its execution-

evaluation cycle typically involves writing code that defines ways of processing,

generating, and controlling sound over time, attempting to compile that code into an

executable program, correcting any issues that arise, and repeating this process. This

mode of engagement makes use of the modal affordances of text: conciseness, open-endedness (flexibility), and linearity of comprehension. The mental model engaged with

while coding in this mode includes a mixture of computational and creative thinking

(D'Errico, 2016, p. 90).
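This execution-evaluation cycle can be sketched with a toy text-based instrument definition; the format and parameter names below are invented for illustration and do not belong to any actual language such as Csound:

```python
import math

# A toy text-based instrument definition; the format is invented.
SOURCE = """
freq 220
amp 0.5
dur 0.25
"""

def compile_instrument(source):
    """'Compile' the text into a parameter dictionary; a malformed line
    raises an error, prompting the correction step of the cycle."""
    params = {}
    for line in source.strip().splitlines():
        name, value = line.split()
        params[name] = float(value)
    return params

def render(params, sample_rate=8000):
    """Render the compiled instrument into a list of samples."""
    n_samples = int(params["dur"] * sample_rate)
    return [params["amp"] * math.sin(2 * math.pi * params["freq"] * n / sample_rate)
            for n in range(n_samples)]

samples = render(compile_instrument(SOURCE))
```

Editing SOURCE, re-compiling, and re-auditioning the result enacts, in miniature, the write-compile-correct-repeat loop described above.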

Figure 9 The text-based GUI component of the CsoundQT integrated development environment (IDE).

Figure 10 The Jesusonic IDE embedded in Reaper, affording a text-based mode of engagement.

2.1.2 Score-Based
The score-based mode of engagement privileges a GUI that shows, and allows for the

control of, abstract representations of musical events over time. Its execution-evaluation

cycle typically involves adding, removing, or moving events on a timeline (score) and

then (optionally) hearing these alterations by auditioning (playing back) the score. The

score events may be defined in many musical dimensions (e.g. pitches, amplitudes,

references to audio files, etc.), with different types of events typically grouped into

different threads or tracks. This mode of engagement is included in a great deal of music

software as a primary means to engage with time-based processes (D'Errico, 2016, p. 38).

The mental model of this mode is focused on the management of multiple musical

parameters changing over time.
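The organization of multi-dimensional events into tracks can be sketched as a data structure (a generic Python illustration, not the internal model of any particular sequencer):

```python
from dataclasses import dataclass

@dataclass
class Event:
    start: float     # onset time in beats
    duration: float  # length in beats
    pitch: int       # MIDI note number
    velocity: int    # loudness, 0-127

# Different types of events are grouped into different tracks (threads).
score = {
    "melody": [Event(0.0, 1.0, 60, 90), Event(1.0, 1.0, 64, 90)],
    "bass":   [Event(0.0, 2.0, 36, 80)],
}

def audition_order(score):
    """Flatten the score into playback order, mimicking the audition
    (playback) step of the execution-evaluation cycle."""
    events = [event for track in score.values() for event in track]
    return sorted(events, key=lambda event: event.start)

playback = audition_order(score)
```

Adding, removing, or moving Event instances and re-auditioning the result is the cycle this mode foregrounds.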

Figure 11 A Sibelius score, including the full score view (top) and accompanying timeline view (bottom).

Figure 12 Propellerhead's Reason sequencer view, showing different threads (tracks) for different musical materials.

2.1.3 Patch-Based
The patch-based mode of engagement privileges a GUI of a virtual environment where

sound-generating and sound-processing objects (elements, modules, building blocks,

etc.) are made to interact. The execution-evaluation cycle typically involves adding,

removing, or changing the relationship between (connecting, disconnecting) these

virtual elements, and auditioning the effects of those changes. Notably, this mode of

engagement typically starts with a clean slate, privileging "bottom-up creative environments in which the programmer-artist can create comprehensive systems or tools by working through the interactions of the smallest possible units" (D'Errico, 2016, p.

111). The mental model of this mode foregrounds a type of "sandbox" experimentation, as the deconstruction of sonic design into these small building blocks privileges a "zoomed in," action-to-action sensitive reflection-in-action cycle (D'Errico, 2016, p.

115).
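The bottom-up connection of small sound-generating and sound-processing units can be sketched as follows; this is a minimal Python illustration of the patching idea, not the architecture of Pure Data, Kyma, or Max:

```python
import math

class Sine:
    """A toy unit generator producing one sample per tick."""
    def __init__(self, freq, sample_rate=8000):
        self.freq, self.sample_rate, self.phase = freq, sample_rate, 0

    def tick(self):
        value = math.sin(2 * math.pi * self.freq * self.phase / self.sample_rate)
        self.phase += 1
        return value

class Gain:
    """A processing object whose input is another object: the connection
    stands in for a virtual patch cord."""
    def __init__(self, source, amount):
        self.source, self.amount = source, amount

    def tick(self):
        return self.source.tick() * self.amount

# Patching: connect a sine oscillator into a gain stage and audition it.
patch = Gain(Sine(440), 0.5)
output = [patch.tick() for _ in range(16)]
```

Connecting, disconnecting, and swapping these objects, then re-auditioning the output, reproduces the execution-evaluation cycle described above.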

Figure 13 Pure Data Patch (left) and Kyma Patch (right).

Figure 14 Reaktor Patch Defining a Synthesizer.

2.1.4 Performance-Based
The performance-based mode of engagement privileges a GUI that foregrounds real-time,

performative activities. The execution-evaluation cycle typically takes place at a short

time scale, with a user executing some action and receiving instant sonic feedback with

which to evaluate their next action. Performance-based modes of engagement range from

a mode that is stateless, entirely focused on real-time performance (a traditional

instrumental mode of engagement (Figure 17)) to a mode with state, used in the service

of other modes (recording performance gestures onto a score or modifying the parameters

of a patch-based synthesizer (Figure 15), for example). The performer may also use the

software dynamically in more experimental or utilitarian ways (see section 1.1). Because

of the emphasis on real-time, fluid execution and evaluation, the gulfs of execution and evaluation are deeply important to this mode of engagement: these are, respectively, the barriers to use involved in determining how to operate a system at a given moment and in determining what state a system is in after its last executed actions (Norman, 1988, p. 38). The

performance-based mode of engagement affords real-time interactivity (between

musicians and music software) and integration with hardware controllers. This mode of

engagement is frequently combined with other modes of engagement in the service of

bridging the "editing / performance divide" (Duignan, 2008, p. 209), externalizing and recontextualizing the engagement with the other modes (typified by real-time music programming languages such as Super Collider and Csound, for example, which are

dominated by a performance-text-based mode of engagement) (Charrieras & Mouillot,

2015, p. 193).
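The distinction between a stateless performance mode and one whose state serves other modes can be sketched as follows; the class and method names are hypothetical:

```python
class Performance:
    """A toy performance surface: each gesture sounds immediately and,
    when recording, is also captured with a timestamp (the stateful mode,
    usable later by a score-based mode of engagement)."""
    def __init__(self, recording=False):
        self.recording = recording
        self.captured = []  # recorded gesture history

    def gesture(self, time, pitch):
        if self.recording:
            self.captured.append((time, pitch))
        return f"sound pitch {pitch}"  # stand-in for instant sonic feedback

live = Performance(recording=True)
feedback = [live.gesture(t, p) for t, p in [(0.0, 60), (0.5, 62), (1.0, 64)]]
```

With recording disabled the surface behaves like a traditional instrument; with it enabled, the captured gestures bridge into score- or patch-based modes.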

Figure 15 The User Interface of Pitter, a program created by the author within Max, including a GUI of parameters that invite performance

Figure 16 VirtualDJ Deck Control View.

Figure 17 Drum Set Pro App Primary GUI

2.2 Environmental Metaphors


The concept of metaphor saturates these modes of engagement, defined with respect to

user-interface as "a device for explaining some system functionality or structure by asserting its similarity to another concept or thing already familiar to the user" (Möllenkamp, 2014, pp. 32-38). Typical metaphors used while engaging with these

modes of engagement are orientational metaphors (relating sound and spatial

orientation), ontological metaphors (relating abstracted objects and their relationships),

and structural metaphors (which link a design with its use) (Duignan, 2008, pp. 32-35). These types of metaphors may operate in the context of a skeuomorphic design

philosophy, in which interface elements are direct metaphors for real objects (e.g. a drum

set graphic is a physical drum set (Figure 17) and the virtual score is a physical, paper

score (Figure 11)) or may be more abstractly related to real-world objects (Figure 15)

(D'Errico, 2016, p. 7).

Modes of engagement are frequently combined in the service of realizing a real-world,

environmental metaphor, with the most common of these being the multitrack-mixing

metaphor embodied in digital audio workstations (DAWs) (Duignan, 2008, p. 51).

DAWs combine many modes of engagement in parallel and in series to construct a music

software that typically emulates an environment consisting of the multitrack tape recorder

(as a means to record, edit, and arrange sound in time) and the mixing console (as a

means to layer, affect, and otherwise mix sound), devices which together afford the

primary functionalities of a physical recording and mixing studio (Duignan, 2008, p. 52). Designs that realize the DAW metaphor typically include a sequencing

timeline (score-based), a virtual mixing console (performance-based), and effects racks

(patch-based).
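At its core, the mixing-console half of this metaphor reduces to gain-scaled summation of parallel tracks, which can be sketched as (a generic Python illustration):

```python
def mix(tracks, gains):
    """Mix parallel tracks sample-by-sample, each scaled by its fader
    gain, as a mixing console does; shorter tracks are padded with silence."""
    length = max(len(t) for t in tracks)
    return [sum(g * (t[i] if i < len(t) else 0.0)
                for t, g in zip(tracks, gains))
            for i in range(length)]

drums = [1.0, 0.0, 1.0, 0.0]  # one hypothetical track of samples
bass = [0.5, 0.5]             # a shorter one
mixed = mix([drums, bass], [0.5, 1.0])  # one fader gain per track
```

Everything beyond this summation (inserts, sends, automation) layers further metaphors on top of the same basic operation.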

Figure 18 A Pro Tools session with Edit (left) and Mix (right) views, typifying the DAW environment metaphor

The design of DAWs frequently diverges from exactly replicating physical studio

systems, making use of the special affordances attainable through computer software

design. These divergences include assistive GUI elements which aid in a particular mode

of engagement (Figure 19), enhanced interconnectivity that would be prohibitively impractical (and expensive) to implement in physical space, and deeper levels of ontological metaphorical abstraction afforded by computational design (Duignan, 2008, p. 64).

Figure 19 A Real-Time Spectrum in Amadeus Pro, which assists users while in a particular mode of engagement

Lastly, users may create their own metaphorical software environments by running

several different softwares simultaneously, effectively combining the functionalities of

different programs. This type of organization makes use of the interoperability of music

software, their ability to send and receive information to and from one another. An

example of an interface that affords interoperability is ReWire29, which facilitates the

streaming of audio to and from ReWire-enabled applications.
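As a concrete sketch of the kind of data such interfaces carry, the following encodes a minimal OpenSoundControl message, another common interoperability protocol; the address /mixer/gain is an invented example:

```python
import struct

def osc_pad(data: bytes) -> bytes:
    """Null-pad to a 4-byte boundary, as the OSC encoding requires."""
    return data + b"\x00" * (4 - len(data) % 4)

def osc_message(address: str, value: float) -> bytes:
    """Encode a minimal OpenSoundControl message carrying one float:
    address string, type tag string ",f", then a big-endian float32."""
    return (osc_pad(address.encode()) +
            osc_pad(b",f") +
            struct.pack(">f", value))

# One software could send this datagram to another, e.g. over UDP.
msg = osc_message("/mixer/gain", 0.5)
```

A receiving program parses the address to decide which of its parameters the payload should control, which is how two independent softwares come to act as one environment.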

29. Originally created by Steinberg and Propellerheads as part of their ReBirth software.

2.3 Music Software Characteristics
Complementing the use of modes of engagement and environmental metaphors in

describing a music software, I present a list of music software characteristics. These

software characteristics were chosen for their close relationships to

design, functionality, use, and the environment of the music software, and are heavily

influenced by Duignan (Duignan, 2008, pp. 249-265). Each characteristic is named, described through the questions it answers (or, where it has a closed set of options, its possible answers), and presented alongside brief notes.

Table 2 Music Software Characteristics

IDENTITY AND CONTEXT

Developer
  Questions: Who made the software?

Version
  Questions: Which version? Which channel?
  Notes: Different versions and distribution channels have different designs and functions.

Platform
  Questions: Hardware? Operating system?
  Notes: In what technological context may this software be used?

Access
  Questions: Where can one gain access to this software?
  Notes: Download via a website, only on physical media, through a web portal, only on a limited number of machines, etc.

Cost
  Questions: How much?
  Notes: Single purchase? Subscription-based?

Source Availability
  Questions: Open? Closed?
  Notes: Access to source affects extensibility specifically and the type of community involvement generally.

Development
  Questions: Active? Updated how often?
  Notes: Update regularity and development activity affect how a user engages with a software's development, as an active, regularly updated software is more likely to change in response to users' requests.

Developmental Resources
  Questions: What computer language(s) and tool(s) were used to make the software?
  Notes: Is it based on another software?

DESIGN AND FUNCTION

Media Types
  Questions: What types of media does the software have the capability to work with? Real-world sound? Synthesized sound? Sound event abstractions (MIDI, notation)? Etc.
  Notes: The types of sounds a music software has the ability to engage with inform the modes of engagement that are best suited to those types of sound.

Media Asset System
  Questions: Existent? Open? Closed?
  Notes: An asset system manages the deployment of instances of media in the software. A closed asset system does not allow a user to import sound materials.

Media Representations
  Questions: How is the media graphically represented? Waveform? Text? Piano roll? Etc.
  Notes: Representations are heavily informed by, and used to define, modes of engagement.

Media Metadata
  Questions: None? Administrative (name, duration, non content-related)? Structural (contextual: in a group, of the same type (genre))? Descriptive (rich, potentially uses on-the-fly MIR, tagging, instance-based)?
  Notes: The inclusion of metadata affects how a user builds a mental model of the software and its functionality, as it groups or enhances different media instances, simplifying the process of structural abstraction.

Media Control
  Questions: Non-linear editing? Performance (capture)? Algorithmic? Medium-mapping/sonification?
  Notes: The way media is controlled is heavily informed by, and used to define, modes of engagement.

Hardware Integration
  Questions: Hardware ambivalent? Allows hardware control? Requires hardware control?
  Notes: Hardware ambivalence is present in a number of trackers, hardware control is frequently incorporated in DAWs, and some software (Maschine, Fairlight, DControl, etc.) is built around, and requires the use of, a particular hardware to operate.

Modularity
  Questions: Modular? Singular?
  Notes: Equal modular parts or an environment-plug-in metaphor?

Extensibility
  Questions: Can the software be extended?
  Notes: To what extent may the software be extended?

Customizability
  Questions: What customizability options does the software have?

Interoperability
  Questions: Can and how does the software connect to other software? What data do they share?
  Notes: Interoperability facilitates the creation of meta-designs, the creation of music software environments through the modular use of several interconnected music softwares, e.g. through Propellerhead's ReWire or OpenSoundControl.

Export
  Questions: In what formats may a project created with the music software be exported, if applicable?
  Notes: Audio only? Audio and a representation of the project? A concrete representation (sheet music)?

Modes of Engagement
  Questions: Text-based? Score-based? Patch-based? Performance-based? Combinations?
  Notes: How does the user switch between modes of engagement, or are they integrated into one view?

Environmental Metaphors
  Questions: Virtual studio? Physical space? Other?
  Notes: Can the user abstract the program's design to a known environment?

ACTIVITY ABSTRACTION (after Duignan)

Processing Management
  Questions: How are effects and processes applied to sound materials represented, abstracted, and controlled?
  Notes: See (Duignan, 2008, pp. 139-155).

Voice Management
  Questions: How are multiple streams of musical parameters represented, abstracted, and controlled?
  Notes: See (Duignan, 2008, pp. 157-177).

Temporality Management
  Questions: How is time represented, abstracted, and controlled?
  Notes: See (Duignan, 2008, pp. 179-214).

Reuse and Versioning
  Questions: How does all of the work done with the software speak to each other (communication across projects)? How does the software engage with versioning?
  Notes: See (Duignan, 2008, pp. 217-247).

ENVIRONMENT

Extra-Musical Media Integration
  Questions: What types of media and data, other than musical data, does the software understand?
  Notes: Video? Text? Scientific data?

Assistance
  Questions: In what ways does the software help the user learn the affordances of its design?
  Notes: Help files? Embedded tutorials? Real-time hint tooltips?

Social Integration
  Questions: In what ways does the design of the software facilitate its integration into a community?
  Notes: Networked (live, offline)? Connected to social media?

Community
  Questions: What kinds of communities does the software have around it?
  Notes: Is it quintessential to the definition of a musical community? A social community? A technological community?

3. Case Studies
Using the models and methods of analysis presented in the previous sections, I will now

briefly analyze four examples of music software. These programs were chosen because of

the diversity of their music software ecosystems, the existence of previous analyses, and the availability of enough documentation, history, and community to make an analysis of their real-world use possible.

3.1 Super Collider 3.7.2


Super Collider is a computer music programming language released in 1996 by James

McCartney. It is free and open source, is available for Linux, macOS, Windows, and

FreeBSD, and was written in C++ (McCartney, SuperCollider: A New Real Time

Synthesis Language, 1996). It is accessible via its website, supercollider.github.io.

On motivations for creating Super Collider, McCartney writes:

Motivations for the design of Super Collider were the ability to realize sound

processes that were different every time they are played, to write pieces in a way that

describes a range of possibilities rather than a fixed entity, and to facilitate live

improvisation by a composer/performer. (McCartney, Rethinking the Computer Music

Language: Super Collider, 2002, p. 61)

Super Collider's GUI (by default) is divided into several panels, with the primary panel

facilitating a text-based mode of engagement with the computer programming language

and secondary panels showing extensive, well-presented documentation and a

terminal post window, aiding in evaluating the results of execution (Figure 20). The

Super Collider language itself organizes all elements into classes of objects which may be

made to interact. For example, members of the class of UGens (unit generators, which

produce a signal) may be grouped together into a Synth (a synthesizer definition). In this

way, in Super Collider "when one writes a sound-processing function, one is actually writing a function that creates and connects unit generators," an approach that typifies the patch-based mode of engagement (McCartney, Rethinking the Computer

Music Language: Super Collider, 2002, p. 62). This patch-based mode is combined with

a performance-based mode, as Super Collider allows a user to run sections of code (these

groups of objects) on-the-fly, producing sound from a document while code in that

document is currently being edited, facilitating live coding (Blackwell & Collins, 2005).
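This on-the-fly re-evaluation can be sketched in Python rather than in Super Collider's own language; the following is a toy illustration of redefining a running sound function, not SuperCollider's actual evaluation mechanism:

```python
# The running "environment" persists while code is re-evaluated into it.
environment = {}

def evaluate(code):
    """Re-run a block of code, as a live coder re-evaluates a region of
    the document without stopping the running program."""
    exec(code, environment)

evaluate("def synth(t): return 0.0        # silence")
first = environment["synth"](0.5)

evaluate("def synth(t): return 0.2 * t    # redefined mid-performance")
second = environment["synth"](0.5)
```

The key property is that state outside the re-evaluated region (here, the rest of the environment) survives each evaluation, so sound need never stop while code changes.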

Figure 20 Super Collider, with a primary coding view, accessible help and documentation (top right), and post window
(bottom right).

Super Collider may be used for "algorithmic composition and sequencing, finding new sound synthesis methods, connecting your app to external hardware including MIDI controllers, network music, writing GUIs and visual displays" and more30. A user may

modify their Super Collider experience via Quarks, class extensions that add more

objects to the program, allowing for new functionality and recontextualizations of

existing objects in the system. The system has also been extended to allow for live,

networked interaction, facilitating group live coding, including sharing information about

the state of objects in the system over a network (de Carvalho Junior, Lee, & Essl, 2015).

The community of Super Collider users takes the form of an annual symposium,

involving talks, lectures, and performances31, regular meetups of users across the world32,

and an online presence consisting of a mailing list, a public code sharing repository33 and

a large network of blogs, video tutorials, and music-sharing hubs that focus on Super

Collider (Figure 21). The DIY community around Super Collider, a function of its unique

and open combination of text-, patch-, and live performance-based modes of engagement,

coupled with its non-proprietary and highly extensible design, has resulted in a music

software ecosystem containing "real-time interaction, installations, electroacoustic pieces, generative music, and audiovisuals" and more (Wilson, Cottle, & Collins, 2011).

30. http://supercollider.github.io/
31. http://supercollider.sourceforge.net/symposium/
32. http://supercollider.sourceforge.net/meetings/
33. http://sccode.org/

Figure 21 Various uses of (mostly) Super Collider by Joo Won Park, from his 100 Strange Sounds project34

3.2 Renoise 3.1.0


Renoise is a DAW released in 2002 by Eduard Müller and Zvonko Tesic, with further development since then by Paul Rogalinski, Martin Alnæs, Simon Finne, Lucio Asnaghi, Erik Jälevik, and Kieran Foster35. It is proprietary software, costs $75.00 to purchase, and

is available for Windows, macOS and Linux. It is accessible via its website,

www.renoise.com.

Renoise is based on the source code of NoiseTrekker, a tracker program created by Juan

Antonio Arguelles Rius. A tracker is a text-based, computer keyboard-manipulated

music software that allows the user to create patterns of note data (often 4 bars)

comprising a short passage of music. These patterns (resembling a spreadsheet in appearance, and analogous to a step-sequencer or player piano in function) are then

arranged in a specific order to produce a song. The saved file (or module) stores the

34. http://www.100strangesounds.com/
35. http://www.renoise.com/who-are-we

song together with all the notes, samples and instrument settings. (Nash & Blackwell,

2011, p. 575). Trackers thus facilitate a multi-level score-based mode of engagement, one

that affords the division of music produced with the software into sections (corresponding

to patterns). This mode of engagement, coupled with the method of input being primarily

the computer keyboard, a device used quickly and fluidly by many people, results in a

system that facilitates a virtuosic, rapid production of music, although one that relies less

on easier-to-understand, metaphorical engagements with the interface than a more traditional

score-based interface (a waveform sequencer, for example) (Nash & Blackwell, 2011, p.

581).
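The tracker's pattern-and-order organization can be sketched as a data structure; this is a generic Python illustration, with note names following tracker convention but the layout greatly simplified:

```python
# Each pattern is a grid: one row per time step (read top to bottom),
# one column per track; "---" marks an empty cell.
pattern_a = [
    ["C-4", "---"],
    ["---", "E-3"],
    ["G-4", "---"],
    ["---", "---"],
]
pattern_b = [
    ["C-4", "C-2"],
    ["C-4", "---"],
]

def arrange(order, patterns):
    """Arrange patterns in a specific order to produce the song, as a
    tracker module stores an ordered list of pattern indices."""
    return [row for name in order for row in patterns[name]]

song = arrange(["a", "b", "a"], {"a": pattern_a, "b": pattern_b})
```

Reusing pattern names in the order list is what gives the tracker its multi-level, section-based score organization.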

Figure 22 Renoise, with a primary tracker view, pattern edit view (left) and effects (bottom)

Renoise exemplifies the DAW metaphor, except that rather than modeling a multi-channel tape deck, its score-based mode of engagement is built around this tracker interface, organizing

events on a grid with time from top to bottom and different tracks/parameters from left to

right (Figure 22). On this tracker interface being the primary score-based mode of engagement in Renoise, one of its developers, Bjørn Næsby, states that "we realize that using Renoise as the main DAW isn't everybody's cup of tea"36, but from this choice of

score-based mode of engagement a diverse and highly-engaged community has

developed around Renoise. Use of trackers comes from the demoscene, a computer art

subculture that specializes in realtime, non-interactive audio-visual presentations

designed to demonstrate coding and artistic skill (Nash & Blackwell, 2011, p. 575), and

Renoise is defined both as an extension of this demoscene community and as a part of

the community that has arisen around it in its own right.

Figure 23 The Tracker Interface

36. "Meet the programmers: Renoise." http://www.musicradar.com/news/tech/meet-the-programmers-renoise-604106

The community of Renoise users takes the form of an active forum hosted on its website

(with over 11,000 users)37, a page on its website highlighting and interviewing artists38, and the hosting of song competitions39.

Renoise's music software ecosystem is a hybrid, both with respect to its incorporation of

DAW and tracker modes of engagement and in the way that it straddles multiple musical

communities, representing a mixture of tradition (stemming from trackers and their

community) and progress (through the incorporation of full-fledged DAW elements).

3.3 Max 7.3.3


Max is a media software that "creates interactive sounds, graphics, and custom effects"40

released in its current form in 1997 by the software development company Cycling '74, but dating back to Miller Puckette's work on its proto-languages in the mid-1980s41. It is

proprietary software, with licenses ranging from a $399 permanent license to annual and

monthly subscription options, is available for Windows and macOS, and is written in C

and C++. It is accessible via its website, www.cycling74.com.

On its design, Miller Puckette states that "the Max paradigm can be described as a way of combining pre-designed building blocks into configurations useful for real-time computer music performance" (Puckette, Max at Seventeen, 2002). In general, the

execution-evaluation cycle of Max involves typing into text boxes which instantiate

objects of different types and functionalities (list processing, signal processing and

37. http://forum.renoise.com/
38. http://renoise.com/artists
39. http://renoise.com/songs
40. https://cycling74.com/products/max
41. https://web.archive.org/web/20090609205550/http://www.cycling74.com/twiki/bin/view/FAQs/MaxMSPHistory

generation, GUI, etc.), combining the functionality of these objects by creating patches of

connected objects, and auditioning the results. Note that the core capabilities and mental

model of Max mirror those of Super Collider (3.1), but with the privileging of text- and

patch-based modes of engagement inverted.

At the surface level Max appears to exemplify a patch-based mode of engagement, which

it does, but more recently Max's design has expanded to incorporate text-based (through

the gen~ class of objects42) and score-based (through its integration with Ableton Live, a

DAW with a full timeline and tracker-like pattern sequencer, via Max for Live43) modes

of engagement.

Figure 24 Max's Patching Mode, dominated by a view of objects, connections, and customizable GUI elements

42. https://docs.cycling74.com/max7/maxobject/gen~
43. https://www.ableton.com/en/live/max-for-live/

The online community of Max is extensive, including a repository of projects created

with the software44, interviews with artists whose creative practices make use of Max45,

and an active forum46. Although used for music, the generality of Max (enhanced by its bundling with a sister video program, Jitter) causes it to be found in contexts ranging from installation art and video art to electronics-extended live performance and many others. Thus,

Max positions itself as a central point in an ecosystem that bridges the divide between

different kinds of media, and is functional as a tool within many extra-musical creative

practices that make use of computers.

3.4 Ocarina 1.4.6


Ocarina is a mobile music app released in 2008 by software development company

Smule (headed by Ge Wang). It is proprietary software, may be downloaded from the App Store for free (although in the past it was paid), is available exclusively for iPhone,

and is built using the ChucK music programming language (Wang, 2014, p. 8). The

genesis of Ocarina arose from its developers' interest in mobile music, and

Ocarina has been described as "one of the first musical artifacts in the age of pervasive, app-based mobile computing" (Ibid.). Describing the design and functionality of this

instrument as an exemplar of mobile media production, D'Errico states:

The ocarina interface consists of four separate holes and an antenna icon at the

bottom of the screen (Figure 25, left). In order to play musical notes, the user simply

blows into the iPhone microphone while covering combinations of holes with his or her

fingers. The sounding notes all correspond to a specific musical scale, which is chosen by

44. https://cycling74.com/projects
45. https://cycling74.com/articles
46. https://cycling74.com/forums

the player. Tilting the phone downward while blowing into the microphone adjusts the

vibrato rate and depth of the sounding note. Together, these affordances abstract the

nuances of playing an actual ocarina (breath control, understanding musical scales, and producing vibrato with fingers rather than a digital technology) into an easy to use app. (D'Errico, 2016, p. 216) [figure inclusion by author].

The performance-based mode of engagement here is as streamlined as possible,

redefining the materiality of the iPhone that Ocarina is being run on as an instrument and

facilitating performance via a minimalistic, instrumental metaphor-saturated GUI.
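The tilt-to-vibrato mapping described above can be sketched as a simple parameter mapping; the numeric ranges below are invented for illustration and are not Ocarina's actual values:

```python
import math

def vibrato_frequency(base_freq, tilt, time):
    """Map device tilt (0.0 flat to 1.0 fully tilted) onto the depth and
    rate of a vibrato around a base pitch; the constants are invented."""
    depth = 10.0 * tilt        # pitch deviation in Hz
    rate = 4.0 + 4.0 * tilt    # vibrato cycles per second
    return base_freq + depth * math.sin(2 * math.pi * rate * time)

flat = vibrato_frequency(440.0, 0.0, 0.25)      # no tilt: no vibrato
tilted = vibrato_frequency(440.0, 1.0, 1 / 32)  # full tilt, peak of cycle
```

Collapsing vibrato to a single tilt gesture is precisely the kind of abstraction D'Errico describes: a continuous instrumental nuance reduced to one easily performed physical action.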

Figure 25 Ocarina, with a performance view (left), a social view (center), and settings view (right)

More so than any of the programs discussed thus far, this app engages directly with the

musical community of which it is a part: "by tapping the antenna icon at the bottom of the ocarina interface, the user is taken to a 3D map of the world that displays bright lights and plays sounds from locations where other Ocarina users are creating music with the app" (Ibid.) (Figure 25, center). A user may directly engage with this global

networking of its community by naming their ocarina (Figure 25, right) and actively

playing the instrument, broadcasting themselves live for the entire musical community

based around this app to hear, or may passively interact by listening to other users and

sending them love. Ocarina represents a music software that directly engages with its

community, transcending artificial boundaries between consumer and producer, and

presenting the making of music using music software as something accessible, social, and

inclusive. With respect to the role of music in Ocarina's design, here "producing musical sounds is less important than the experience of connecting with other Ocarina users from around the world" (D'Errico, 2016, p. 220).

4. Conclusion
In this work I have surveyed the creative software ecosystem and creative practice in the

context of music software, presented several strategies for differentiating and analyzing

music software, and lastly presented four brief music software case studies. Tracing the

elements of a music software from its design, through the reasons why it was designed,

onto its functionality and how that functionality is made use of in a particular

environment defines a vast network of interconnectivity between composers, performers,

programmers, consumers and their tools. As music technology and the communities

around music technology evolve and develop, as sonic, aesthetic, and philosophical

relationships to music within musical communities shift, and as the technologies and

contexts of music production and consumption change, at their intersection the global

ecosystem of music software will evolve and change in turn, reflecting the ways in which

these communities engage with computerized sound.

Bibliography

Belyamani, M. (2011, June 14). AfroDJMac. Retrieved April 22, 2017, from They Make

Music: http://www.theymakemusic.com/interviews/afrodjmac/

Blackwell, A., & Collins, N. (2005). The Programming Language as a Musical

Instrument. Proceedings of PPIG05 (Psychology of Programming Interest

Group), (pp. 284-289).

Cadoz, C. (2014). Tangibility, Presence, Materiality, Reality in Artistic Creation with Digital Technology. 40th International Computer Music Conference/11th Sound and Music Computing Conference, (pp. 754-761).

Charrieras, D., & Mouillot, F. (2015). Getting Out of the Black Box: analogising the use of computers in electronic music and sound art. Organised Sound, 191-199.

Cook, P. (2001). Principles for Designing Computer Music Controllers. Proceedings of the 2001 Conference on New Interfaces for Musical Expression. National University of Singapore.

de Carvalho Junior, A. D., Lee, S. W., & Essl, G. (2015). SuperCopair: Collaborative live coding on SuperCollider through the cloud. International Conference on Live Coding.

D'Errico, M. (2016). Interface Aesthetics: Sound, Software, and the Ecology of Digital

Audio Production. University of California Los Angeles.

Dix, A. (2009). Human-computer Interaction. Springer US.

Duignan, M. (2008). Computer mediated music production: A study of abstraction and

activity. Victoria University of Wellington.

43
Duignan, M., et al. (2004). Metaphors for electronic music production in Reason and Live. Asia-Pacific Conference on Computer Human Interaction (pp. 111-120). Heidelberg: Springer Berlin.

Duignan, M., Noble, J., & Biddle, R. (2005, September). A Taxonomy of Sequencer User-Interfaces. ICMC.

Galloway, A. R. (2012). The Interface Effect. Polity.

Gaver, W. W. (1991). Technology Affordances. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 79-84). ACM.

Möllenkamp, A. (2014). Paradigms of Music Software Development. Proceedings of the 9th Conference on Interdisciplinary Musicology.

Manovich, L. (2013). Software Takes Command. A&C Black.

Max Artists Interviews. (2017). Retrieved April 22, 2017, from Cycling '74:

www.cycling74.com

McCartney, J. (2002). Rethinking the Computer Music Language: SuperCollider. Computer Music Journal, 61-68.

McCartney, J. (1996). SuperCollider: A New Real Time Synthesis Language. ICMC.

Nash, C., & Blackwell, A. (2011). Tracking Virtuosity and Flow in Computer Music.

Proceedings of the International Computer Music Conference 2011, (pp. 575-

582).

Norman, D. A. (1988). The Psychology of Everyday Actions. In D. A. Norman, The Design of Everyday Things (pp. 37-73). New York: Basic Books.

Pold, S. (2005). Interface realisms: The interface as Aesthetic Form. Postmodern Culture.
Puckette, M. (2002). Max at Seventeen. Computer Music Journal, 31-43.

Puckette, M. (2014). The Deadly Embrace Between Music Software and Its Users.

Proceedings of the Electroacoustic Music Studies Network Conference. Berlin:

EMS.

Schön, D. A. (1983). The reflective practitioner: How professionals think in action. New York: Basic Books.

Vickers, P. (2011). Sonification for Process Monitoring. In T. Hermann, A. Hunt, & J. G.

Neuhoff, The Sonification Handbook (pp. 455-491). Berlin, Germany: Logos

Publishing House.

Wang, G. (2014). Ocarina: Designing the iPhone's Magic Flute. Computer Music Journal, 8-21.

Wilson, S., Cottle, D., & Collins, N. (2011). The SuperCollider Book. MIT Press.
