In this post I show how I built headphones into my motorbike helmet.
Since I started riding a motorbike, I have gradually taken on the challenges associated with it, such as riding on the motorway, filtering, and lately – riding with music (sometimes). While it can be considered less safe, it presents a different set of challenges and offers a different experience. Your visual acuity sharpens as you compensate for the lack of sound cues from the environment, watching out for more dangers and preempting other road users’ actions.
Listening to music while riding elevates the experience several levels. The tactility of the road, the smell, and the wind become part of the music. Pick the right track, and the ride becomes even more visceral as you massage your cerebellum with heightened senses.
I started working on some sound effects for Swordy over semester break.
Recording effects is a fun process, as it’s more tactile and immediate than trying to synthesize everything in software. I can use my voice or whatever objects I can find as foley.
The toughest part of recording is its real-time aspect. If I have 10 minutes of audio, I have to spend 10 minutes listening to it, and the time multiplies as I clean up, tweak EQ, extract pieces, and cut and export individual effects.
Audacity is the best tool there is for super-fast edits; I always come back to it for splicing and crossfades.
Semester 1 is over, meaning no more sidetrack assignments. I have a year from now to produce a thesis paper.
In review, semester 1 was a very helpful exercise and, in a way, a lifestyle shift in how I produce work and talk about things. Academia has a different mode of operating, and moving through that space changed a lot for me.
The best tangible thing that came out of this is of course the interactive piece of work that I produced for Pigsty. I was able to take it a bit further for another assignment, introducing motion control. Using a Kinect, I interrogated the play space of the application from the points of view of a participant, an observer, and the software itself, and the misalignment between their literal and metaphorical vantage points. I’m hoping to put a few more features into the app, like screenshot saving and some user options, and make Kinect control available for public download, but perhaps after some of the exhibitions I’m pitching this work for.
Last night, after little sleep and some obligatory writing, I went to Phil’s house, eager to show off my creation and watch him play the [previous version of the] app. I hadn’t yet seen anyone interact with it, and I knew the aesthetic aligned with a lot of Phil’s own works.
Half an hour into play and discussion about the sensations, feelings and relationships my work was establishing within that play session, he suggested removing buffer clearing from the camera to reinforce the painting aspect of the experience. A few seconds later another hour of play ensued.
This change took the work from a domain dominated by audio, in terms of focus and output, towards the visual. I knew the previous iteration wasn’t final; it’s just what I had when time ran out. Playing that one now feels like something is missing.
The experience in this new build is drastically different. The aquatic feeling recedes, replaced by a more temporal aspect of particle behavior. Layers upon layers of footprints create an evolving landscape of motion. Sound becomes less important, a background accompaniment to your very presence in the space.
I cleaned up the last-minute code I wrote the night before submission, and here’s the first version of what I would call a “piece of software I made”.
I will be using this as a platform for further experimentation and learning, especially since I have another university paper to fulfill, which acts as the effector shaping the process and outcome of this experiment in play.
WASD, LMB to clear visuals. Gamepad compatible.
Best played with headphones in the dark.
Finally I got to do some experimentation and play.
Here I have six loops of varying density and intensity, from low to high, for a simple tribal drum beat. The first idea is to test these six separate stages blending into one another based on some parameter in Unity.
FMOD is a major industry-standard middleware for video game audio engineering, and it already drives Unity’s native sound engine, though Unity hides all direct access to FMOD.
Luckily, however, FMOD.org has just released FMOD Studio Pro, license-free for indie developers, and offers an integration package for Unity that provides a wrapper for interfacing Unity with FMOD directly. It gives full access to effects, filters, and custom event structures in a project.
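The blending idea above can be sketched in a few lines. This is a hypothetical illustration, not FMOD or Unity API code: given a single intensity parameter in the range 0–1, it computes a gain for each of the six loop stages so that adjacent stages crossfade as the parameter rises, which is roughly what a crossfaded multi-track event in FMOD Studio does under the hood.

```python
def stage_gains(intensity, n_stages=6):
    """Return a gain (0..1) per loop stage; adjacent stages crossfade.

    `intensity` is a single 0..1 parameter (the value you would drive
    from gameplay); only the one or two stages nearest the mapped
    position are audible at any moment.
    """
    # Map the 0..1 parameter onto the stage index space 0..(n_stages - 1).
    position = intensity * (n_stages - 1)
    gains = []
    for i in range(n_stages):
        # Gain falls off linearly with distance from the current position,
        # so each stage overlaps only with its immediate neighbours.
        gains.append(max(0.0, 1.0 - abs(position - i)))
    return gains
```

At intensity 0 only the first loop plays at full gain; at intensity 1 only the sixth; halfway, stages three and four each play at half gain. In practice FMOD Studio lets you draw these crossfade curves on a parameter directly, so code like this is only needed if you blend the tracks yourself.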
These loops have so far been composed in Ableton Live, using the Kontakt 5 “West Africa” instrument set from Native Instruments. I used its built-in pattern maker to create the rhythms.
I found this process a massive exercise in play itself. With the pattern set to loop, it would play back constantly as I edited instrument states (rhombus, square, x). Often I would get carried away by the organic nature of what I was doing, completely destroying the loop: I would make it incompatible with the previous intensity settings and drift way off the rhythm, often having to start again using the previous stage as a starting point. It was a productive kind of fun.