bell / boom (2020)
bell / boom uses time-varying, multilayered voices created through network modulation synthesis to explore blooming musical textures. A shallow synthesis tree with one root and five leaves, each with slightly different parameters and algorithms, generated sets of audio samples in five voices, each unique but related. Steadier voices use predictive feedback, with audio that changes very slowly over time; busier voices come from standard oscillating parameter sweeps. Samples are played back in a Max patch with a MIDI controller, and different combinations of voices are layered to generate each textural note. The voiced instruments are combined with a chaotic introduction, a drone, a bell, and a boom, each created with the neural network synthesizer. This recording is a direct-to-DAW performance of the system.
A performance of bell / boom will be presented at the ISMIR 2020 conference.
bloviation encouraged! (2019)
bloviation encouraged! is an interactive compositional framework written in Python. Words from a user are translated into parameters for a neural network-based synthesizer. This performance functions as an informal user's manual for the system. On-screen text explains to the viewer that the same word will produce the same sound every time (with a few exceptions) and that longer words produce more interesting sounds. The piece encourages users to extend words by repeating letters and to type random strings of characters, and finally concludes that the tutorial has been moot: composing music from semantic language is not an ideal use of the system.
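As a rough illustration only (the actual mapping inside bloviation encouraged! is not documented here), a deterministic word-to-parameter scheme could hash each word to seed a random generator, so the same word always yields the same parameters while longer words push settings further from neutral. Every name below is hypothetical:

```python
import hashlib
import random

def word_to_params(word, n_params=8):
    """Hypothetical sketch: deterministically map a word to synth parameters in [0, 1].

    The word is hashed so the same word always yields the same parameter
    vector. Python's built-in hash() is salted per process, so a stable
    hash (SHA-256) seeds the generator instead.
    """
    seed = int.from_bytes(hashlib.sha256(word.encode("utf-8")).digest()[:8], "big")
    rng = random.Random(seed)
    params = [rng.random() for _ in range(n_params)]
    # Longer words push parameters further from the neutral 0.5 midpoint,
    # a crude stand-in for "longer words produce more interesting sounds".
    intensity = min(len(word) / 16.0, 1.0)
    return [0.5 + (p - 0.5) * (0.5 + 0.5 * intensity) for p in params]
```

Repeating letters, as the piece suggests, changes the hash and therefore the sound, while re-typing the identical word reproduces it exactly.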
to: alex, with regret (2019)
Creating music with neural networks is a form of personal activism, allowing me to grapple with my history in software engineering, a field that increasingly uses technology for unethical purposes. This piece highlights the unexpected and unique timbral effects of predicted audio content; it is, in essence, composing with instability. I hope that general listeners find that the piece demystifies deep learning, as the timbral chaos and harmony are the results of a model that is only as intelligent as the human wielding it. The titular Alex refers to Alex Krizhevsky, who created the groundbreaking AlexNet convolutional neural network that arguably incited the deep learning boom in research and industry.
An algorithmic composition using the enigmatic scale. As the piece progresses, the audio is warped by custom FFT patches in Max. The real and imaginary components of the Fourier transform are manipulated in ill-advised ways, and the material becomes increasingly unstable as the intruding effects overtake the melodic content. With this piece, I explore the sonic possibilities of audio effects that are mathematically unstable or illogical. I can technically describe what is being done to the audio, but I have no semantic explanation for the audible result. This piece is a different approach to composing with instability.
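The patches themselves live in Max, but one example of this kind of "ill-advised" spectral manipulation can be sketched in Python: transform a signal, swap the real and imaginary parts of every bin (destroying the conjugate symmetry a real signal's spectrum normally has), and resynthesize. This is an assumed illustration, not the actual patch:

```python
import cmath

def dft(x):
    """Naive discrete Fourier transform (O(n^2)), stdlib only."""
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
            for k in range(n)]

def idft(X):
    """Inverse DFT matching dft() above."""
    n = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * t / n) for k in range(n)) / n
            for t in range(n)]

def warp(signal):
    """Swap real and imaginary parts of each bin before resynthesis.

    A real signal's spectrum has conjugate symmetry; this swap breaks it,
    so the resynthesized signal picks up spurious imaginary energy. Only
    the real (audible) part is kept.
    """
    X = dft(signal)
    X_swapped = [complex(z.imag, z.real) for z in X]
    return [z.real for z in idft(X_swapped)]
```

The operation is easy to state mathematically, yet there is no tidy semantic description of how it sounds, which is the point of the piece.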
should white men play the blues (2018)
This track was created as a final project for my computer music composition class. It uses a combination of electric instruments, software instruments, manipulated vocals, and found sounds. The composition spans alternative rock, experimental electronic, and noise. The song is ideologically inspired by the concept of musical ideas being co-opted and warped over time. I wrote the following excerpt for the concert program:
“Black musicians created rock and roll. It sprung forth from the blues and jazz, and just like those genres, white people were soon to follow. What started as outright theft eventually yielded surf rock and the British Invasion. Sabbath downtuned and created metal. Thurston and Lee tuned randomly and stabbed their guitars with screwdrivers as if they were punk John Cage. Hanatarash drove a bulldozer through a music venue for the love of noise; now all we need is a computer.
How far from the source can material be before it is no longer appropriation? If Thurston threw his marriage away, is there still love in his music? If Robert Johnson never got paid, should white men play the blues?”
loneliness in space is the same on earth (2018)
This track was created as a midterm project for my computer music composition class, using a combination of electric instruments and software instruments. The music is inspired by the indie rock and shoegaze classics I discovered in my undergraduate years. Below, I have included the description accompanying the composition:
“To me, synthesizers sound like space. They came at a time when the general public was the most interested in the possibilities of space, sometimes for nationalistic reasons, but I like to believe mostly out of wonder and hope. Collectively, we are now less invested in space. NASA is ill-funded and space ventures are reserved for private research, with trips only affordable by the ultra-rich. As corporations destroy our planet, I wonder if we realized that space is not the answer to our problems. This planet may die, but others are too far for us to reach. Space travel will not allow us to escape capitalism. Understanding the vastness of the universe and our own insignificance within it does not force us to band together to ease human suffering.
Loneliness in space is the same on Earth. Perhaps it's better to stay and try another way.”
Below are compositions written for various classes, along with smaller experiments that were never fully fleshed out. Some are abandoned forever and documented here for the record; others may be expanded or readapted in the future.
I'll Give You One / Four (2020)
now we’re cookin (2019)
andy's melody (2019)
playing the cave (2019)
cave explosion (2019)
increase the gain (2018)
I played lead guitar in an indie rock band called Garbeau. Our album "Hallways" can be streamed below.