Chase Mitchusson holds a PhD in Experimental Music & Digital Media from LSU and works as a VR developer, composer, and computer music instructor. He researches emerging technologies that connect music to digital media, building new interfaces that help people express themselves. As part of teams in production classes and Global Game Jams, Chase has helped develop several games and experiences in Unity and Unreal Engine. His VR work aims to expand creators' toolkits and augment live musical performances. Chase's other interests include coding web audio, field recording, and live multi-channel sound.
PhD in Music, Experimental Music & Digital Media, Digital Art Minor, 2020
Louisiana State University
M.M. in Composition, 2015
University of Memphis
B.A. in Japanese Language, 2011
University of Memphis
Instructor of Introduction to Computer Music, 2018-2019
College of Music & Dramatic Arts, Louisiana State University
Lab Instructor of Introduction to Music Technology, 2017-2019
College of Music & Dramatic Arts, Louisiana State University
Primitives
Doctoral Recital, Summer 2020 | bit.ly/primitivesVR
Composed and performed a three-movement piece entirely in virtual reality based on the
indeterminate properties of the VR Sequencer. Rolled and arranged dice in VR to build a musical
sequence, then added playheads that trigger different samples at different playback rates and
ranges. Jumbled the dice with a terrain changer driven by a Perlin noise map, and organized three
distinct musical movements from the resulting dice and playhead arrangements.
VR Sequencer
New Interfaces for Musical Expression 2020 | bit.ly/VRSeqInfo
Developed a tool for making experimental music in virtual reality using Unity Engine and C#. Users
roll dice to generate audio for a sequencer. The spatial orientation and the faces of the dice
determine the rhythm and musical content for the sequence.
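As a minimal illustration (the tool itself is written in C# for Unity), the TypeScript sketch below shows one way a die's resting orientation can select a step's value: rotate each face normal into world space and pick the face most aligned with up. All names here are illustrative rather than taken from the project.

```typescript
// Sketch: infer which face of a six-sided die points up from its
// orientation quaternion. Illustrative only; the real tool is C#/Unity.

type Vec3 = [number, number, number];
type Quat = { x: number; y: number; z: number; w: number };

// Local-space normals for the six faces of a cube die, keyed by face value.
const FACE_NORMALS: Array<{ value: number; normal: Vec3 }> = [
  { value: 1, normal: [0, 1, 0] },
  { value: 6, normal: [0, -1, 0] },
  { value: 2, normal: [0, 0, 1] },
  { value: 5, normal: [0, 0, -1] },
  { value: 3, normal: [1, 0, 0] },
  { value: 4, normal: [-1, 0, 0] },
];

// Rotate a vector by a quaternion: v' = v + w*t + cross(q, t), t = 2*cross(q, v).
function rotate(q: Quat, v: Vec3): Vec3 {
  const [x, y, z] = v;
  const tx = 2 * (q.y * z - q.z * y);
  const ty = 2 * (q.z * x - q.x * z);
  const tz = 2 * (q.x * y - q.y * x);
  return [
    x + q.w * tx + (q.y * tz - q.z * ty),
    y + q.w * ty + (q.z * tx - q.x * tz),
    z + q.w * tz + (q.x * ty - q.y * tx),
  ];
}

// The face whose rotated normal points most nearly straight up is on top.
function topFace(orientation: Quat): number {
  let best = FACE_NORMALS[0];
  let bestDot = -Infinity;
  for (const face of FACE_NORMALS) {
    const dot = rotate(orientation, face.normal)[1]; // y-component = dot with world up
    if (dot > bestDot) {
      bestDot = dot;
      best = face;
    }
  }
  return best.value;
}
```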
Re-Sounding Wild
Really, Really New Music Marathon, November 2018; Red Stick Expo, April 2019
Embedded a Raspberry Pi, an Arduino, and distance sensors in driftwood to create an interactive
installation that plays original field recordings from Yellowstone National Park.
Duke Skellington Plays the Xylobone
High Voltage, March 2019; Game Sound Conference,
October 2018; New Interfaces for Musical Expression, June 2018
Built a Unity scene that tracks a performer with a Microsoft Kinect while the performer plays
audio processed in Max/MSP.
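For illustration, a TypeScript sketch of the typical bridge in a setup like this: tracked joint positions forwarded to Max/MSP as OSC messages over UDP. The address "/kinect/head" and port 7400 are placeholders, not the piece's actual routing, and the real piece drives this from Unity.

```typescript
// Sketch: forward a tracked joint position to Max/MSP as an OSC message
// over UDP. Address and port are placeholders.

import * as dgram from "node:dgram";

const socket = dgram.createSocket("udp4");

// Pad a string with NULs to a 4-byte boundary, per the OSC spec.
function oscString(s: string): Buffer {
  const len = Math.ceil((s.length + 1) / 4) * 4;
  const buf = Buffer.alloc(len); // zero-filled
  buf.write(s, "ascii");
  return buf;
}

// Encode an OSC message whose arguments are all float32s.
function oscFloats(address: string, values: number[]): Buffer {
  const typeTags = oscString("," + "f".repeat(values.length));
  const args = Buffer.alloc(values.length * 4);
  values.forEach((v, i) => args.writeFloatBE(v, i * 4));
  return Buffer.concat([oscString(address), typeTags, args]);
}

// Send one joint position; in Max, [udpreceive 7400] picks this up.
function sendJoint(x: number, y: number, z: number): void {
  socket.send(oscFloats("/kinect/head", [x, y, z]), 7400, "127.0.0.1");
}

sendJoint(0.1, 1.6, 2.3); // example reading in meters
```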
Ghostwriter
Laptop Orchestra of Louisiana Concert, November 2018
A collaboration with Anthony T. Marasco.
Programmed music in Gibberwocky, a JavaScript-based
live-coding environment that controls Max/MSP from the browser, to develop a networked piece for
three laptops and two performers.
Multipass
International Computer Music Conference, August 2018
Sampled original clarinet multiphonic recordings and mapped them in Max/MSP to the Ableton Push 2
for live playback beneath an ambisonic soundwalk of the LSU campus.
Plinko Machine Learning
Red Stick Expo, April 2017
Combined a p5.js Plinko game with Wekinator and Tone.js to make an instrument that changes effect
parameters based on the path the ball takes as it falls through the Plinko array.
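A compressed TypeScript sketch of the mapping idea: record where the ball lands at each row of pegs, treat that path as a feature vector, and map the model's outputs onto effect parameters. In the installation the model was Wekinator (reached over OSC) and the effects were Tone.js; the predict() function below is a hypothetical stand-in for the trained model.

```typescript
// Sketch: turn a ball's path through a Plinko array into effect settings.
// In the real piece a Wekinator model (over OSC) did the regression and
// Tone.js applied the effects; predict() below is a stand-in.

// Normalized x-position (0..1) of the peg hit at each row of the array.
type BallPath = number[];

// Placeholder for the trained Wekinator regression: features in,
// one value per effect parameter out.
function predict(features: BallPath): number[] {
  // Stand-in mapping: average drift and total zig-zag of the path.
  const mean = features.reduce((a, b) => a + b, 0) / features.length;
  let wiggle = 0;
  for (let i = 1; i < features.length; i++) {
    wiggle += Math.abs(features[i] - features[i - 1]);
  }
  return [mean, Math.min(1, wiggle)];
}

// Map model outputs onto effect parameters (ranges are illustrative).
function applyEffects(path: BallPath): void {
  const [a, b] = predict(path);
  const delayTime = 0.05 + a * 0.45; // seconds: 0.05..0.5
  const reverbWet = b;               // 0..1
  console.log({ delayTime, reverbWet });
  // With Tone.js this would be along the lines of:
  //   delay.delayTime.value = delayTime;
  //   reverb.wet.value = reverbWet;
}

applyEffects([0.5, 0.42, 0.55, 0.61, 0.48]); // one ball's descent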
Landscape
Maker Faire, East Baton Rouge Parish Library Central Library, 2016; LaTex, University of Texas at
Austin, 2016; Digital Media Center Theater, Laptop Orchestra of Louisiana, 2016
A piece for 4 iPads and Max/MSP over a network. Performers write a series of Japanese kanji on iPads
over a pictographic score to create textural music.
Project Columbia
Production Team Project, Spring 2019
Wrote original music and sound effects for the game and implemented them in Unreal Engine 4 with
Wwise.
Zandra
Production Team Project, Fall 2018
Wrote original music and sound effects for the game and implemented them in Unity with C#.
Lost In Space
Web Audio Conference, September 2018
Lost In Space uses Bluetooth Low Energy (BLE) beacons in tandem with mobile devices and 3D panning
on the web to overlay virtual sound arrangements onto a physical location, which users explore by
listening through their phones and tablets. Presented a research poster at WAC on using Bluetooth
and Web Audio to track smartphone locations in a room.
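A minimal Web Audio sketch in TypeScript of the panning half of the idea: estimate the listener's distance from a beacon from its RSSI, then place a virtual source with a PannerNode. The beacon position, path-loss constants, and audio file are placeholders; the BLE scanning itself lived in the mobile layer, and the real system combined several beacons to place users in the room.

```typescript
// Sketch: place a virtual sound source with Web Audio's PannerNode and
// nudge the listener position from BLE signal strength. The beacon
// layout, path-loss constants, and audio file are placeholders.

const ctx = new AudioContext();

// One fixed virtual source anchored at a beacon's position (meters).
const panner = new PannerNode(ctx, {
  panningModel: "HRTF",
  distanceModel: "inverse",
  positionX: 2, positionY: 0, positionZ: -3,
  refDistance: 1,
});

// Rough log-distance path-loss inversion: RSSI -> distance in meters.
// txPower = expected RSSI at 1 m; n = environment exponent (~2 indoors).
function rssiToDistance(rssi: number, txPower = -59, n = 2): number {
  return Math.pow(10, (txPower - rssi) / (10 * n));
}

// Move the listener toward the beacon when its signal gets stronger.
function updateListener(rssi: number): void {
  const d = rssiToDistance(rssi);
  ctx.listener.positionX.value = 2;      // same x as the beacon...
  ctx.listener.positionZ.value = -3 + d; // ...but d meters away in z
}

async function start(url: string): Promise<void> {
  const buf = await (await fetch(url)).arrayBuffer();
  const src = new AudioBufferSourceNode(ctx, {
    buffer: await ctx.decodeAudioData(buf),
    loop: true,
  });
  src.connect(panner).connect(ctx.destination);
  src.start();
}

start("field-recording.mp3"); // placeholder asset
updateListener(-70);          // e.g. a reading from one beacon
```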
RASPutin
Physical Modeling Seminar, Spring 2017
RASPutin is a piece built on physical modeling controlled by a FireFader. The model is a set of
three resonators, each with ten pluck links. The FireFader device is linked to a mass, which is
linked to a second mass, which is linked to a third. Moving the FireFader drags all three masses
across all three resonators' plectra, producing a staggered strumming.
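A tiny TypeScript sketch of the mass-link chain at the piece's core: a driver position (the FireFader) pulls the first mass, each mass pulls the next through a spring-damper link, and a pluck fires whenever a mass crosses a plectrum position. The constants and the single plectrum are stand-ins for the real model's three resonators with ten pluck links each.

```typescript
// Sketch: three masses coupled in a chain, dragged by a fader position.
// Each link is a spring-damper; crossing a plectrum position triggers a
// pluck. Constants are illustrative, not taken from the actual model.

const DT = 1 / 44100;   // simulation step (audio rate)
const K = 800;          // link stiffness
const DAMP = 2;         // link damping
const MASS = 0.01;      // kg, per mass

const pos = [0, 0, 0];  // mass positions along one axis
const vel = [0, 0, 0];
const plectrumAt = 0.5; // one stand-in plectrum position

function step(faderPos: number): void {
  for (let i = 0; i < 3; i++) {
    // Each mass is pulled toward whatever precedes it in the chain:
    // the fader for mass 0, the previous mass otherwise.
    const anchor = i === 0 ? faderPos : pos[i - 1];
    const force = K * (anchor - pos[i]) - DAMP * vel[i];
    vel[i] += (force / MASS) * DT;
    const prev = pos[i];
    pos[i] += vel[i] * DT;
    // A mass sweeping past the plectrum "plucks" the resonator.
    if ((prev < plectrumAt) !== (pos[i] < plectrumAt)) {
      pluck(i, Math.abs(vel[i]));
    }
  }
}

// Stand-in for exciting a resonator.
function pluck(massIndex: number, velocity: number): void {
  console.log(`mass ${massIndex} plucked at speed ${velocity.toFixed(3)}`);
}

// Drag the chain by slowly moving the fader across the plectrum;
// the masses lag the fader, so their plucks arrive staggered.
for (let n = 0; n < 44100; n++) step(n / 44100);
```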