I started working on some sound effects for Swordy over semester break.
Recording effects is a fun process, as it’s more tactile and immediate than trying to synthesize everything in software. I can use my voice or whatever objects I can find as foley.
The toughest part of recording is the realtime aspect of it: if I have 10 minutes of audio, I have to spend 10 minutes listening to it, and the time quickly multiplies as I clean up, tweak EQ, try to extract pieces, and cut and export individual effects.
Audacity is the best tool there is for super fast edits; I always come back to it for splicing and crossfades.
Semester 1 is over, meaning no more sidetrack assignments. I have a year from now to produce a thesis paper.
In review, semester 1 was a very helpful exercise and a lifestyle shift in the way I produce work and talk about things. Academia has a different mode of operating, and moving through that space changed a lot for me.
The best tangible thing that came out of this is of course the interactive piece I produced for Pigsty. I was able to take it a bit further for another assignment by introducing motion control. Using a Kinect, I interrogated the play space of the application from the points of view of a participant, an observer, and the software itself, and the misalignment between their literal and metaphorical vantage points. I'm hoping to add a few more features to the app, like screenshot saving and some user options, and make Kinect control available for public download, but perhaps after some of the exhibitions I'm pitching this work for.
WASD, LMB to clear visuals. Gamepad compatible.
Best played with headphones in the dark.
Last night Phil brought out his guitar when I was trying to work on another iteration of my sound toy. Needless to say, noise happened!
We jammed it up in Ableton: he played the guitar while I took over his laptop to fiddle with audio filters, knobs and sends.
A lot of that inspired the new version of the experiment I'm working on.
I will write about it in the next post.
Unity is drastically changing its audio tools in the 5th release.
New compression format, lower memory footprint; resource handling improvements; audio metadata access; performance profiling.

Audio mixer "first offering" (more big features to come? interactive audio tools next):
- mixing & mastering
- DSP effects anywhere in the chain
- ADSR etc. in the editor
- runtime editing (mix while the game is playing)
- modeled on existing DAWs (e.g. Ableton, Logic)
- ability to create & chain together multiple mixers
- routing of sound groups as hierarchies (groups mixed down together)
- 3D positioning, Doppler etc. (spatial sound concepts) applied to the source before the mixer
- effects stacked sequentially (like in a DAW)
- native plugin effects: custom DSP, custom UI
- snapshots: save states of your edits (awesome!)
- transition between snapshots at runtime (even awesomer!)
- expose individual effect parameters so they can be script-driven
- sends / returns
- ducking (side-chain compression) on any audio group
A very exciting update for audio in Unity. The presentation itself was pretty slack, but as an overview it makes me wonder whether I should just hold out till 5.0, or whether we will end up using both Unity's native audio and FMOD at the same time.
Picked up a Kinect on Trade Me for NZ$30, because why not. Would be cool to set up a motion-controlled sound experiment.
Was looking through the MS Kinect SDK and its demos to see how it works. You can't be too close to or too far from the camera; luckily my room has just enough space to capture the whole skeleton.
Haven’t been able to get it to work with Unity yet, but there are plenty of resources online. For tonight I’m happy sending the green stickman back to infinity.
You may have discovered this effect in previous builds: if you just touch a bass zone and leave it without passing all the way through, it triggers and leaves you in the "zero zone", where the last played bass runs through a low-pass filter. I really love the floating feeling of the LP filtering based on velocity, so I made a scene that isolates that effect, using the most appropriate of the old four tracks.
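The actual Unity/FMOD code isn't shown here, so as a minimal language-agnostic sketch in Python: the idea is to map movement speed to a low-pass cutoff frequency (the parameter names, ranges and the simple one-pole filter below are my own illustrative assumptions, not the project's real implementation).

```python
import math

def cutoff_from_speed(speed, max_speed=10.0, min_hz=200.0, max_hz=8000.0):
    """Map movement speed to a low-pass cutoff frequency.

    Slow movement -> heavily filtered, underwater sound;
    fast movement -> bright, open sound. Interpolating in
    log-frequency space makes the sweep feel even to the ear.
    """
    t = max(0.0, min(1.0, speed / max_speed))
    return min_hz * (max_hz / min_hz) ** t

def one_pole_lowpass(samples, cutoff_hz, sample_rate=44100):
    """Very simple one-pole low-pass filter over a list of samples."""
    # smoothing coefficient derived from the cutoff frequency
    a = 1.0 - math.exp(-2.0 * math.pi * cutoff_hz / sample_rate)
    out, y = [], 0.0
    for x in samples:
        y += a * (x - y)
        out.append(y)
    return out
```

In the real scene this cutoff would be fed to the audio engine's filter parameter every frame, so the "floating" feeling tracks how fast you move.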
There is something about this that just feels good: it's meditative and hypnotizing. I wonder how such qualities, carrying out certain actions for sheer pleasure, could be used as game mechanics or in-game actions tied to game states in Swordy.
Made the wall collision impact sound more pleasant, but the main thing was a spontaneous piece of code I added and the subsequent discovery. I was running the app in dual game/editor view while trying to convert my friend Jenna to Unity, and duplicated the cube. Both were controllable, and both triggered the collision sounds and background track graphics. So why stop at 2?
Press A(xbox) / X(ps) (or left click on a mouse) to spawn more cubes!
There’s no limit, and the clone function runs every frame if you keep it pressed (not advisable). GO! 😀
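To see why holding the button is inadvisable, here's a tiny Python sketch of the assumed per-frame spawn logic (one clone per frame while the button is down, at roughly 60 fps; the actual Unity script isn't in the post):

```python
def simulate_spawning(frames, button_held):
    """Simulate per-frame spawn logic: one clone is created on
    every frame the spawn button is down (assumed behaviour)."""
    cubes = 1  # the original cube
    for frame in range(frames):
        if button_held(frame):
            cubes += 1  # stand-in for Unity's Instantiate()
    return cubes

# holding the button for 2 seconds at 60 fps
print(simulate_spawning(120, lambda f: True))  # 121 cubes
```

Two seconds of holding already means over a hundred cubes, each with its own physics and collision sounds.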
At its base, this was an exercise in coding in Unity and an integration test for FMOD one-shots and event parameter manipulation. It's also an exercise in visual language for the sonic environment I've set up, and a further exploration of spatial relationships and sound.
I spent longer figuring out how to do the RGB split post effect on boundary collision than on anything else. I feel like every sound event must be accompanied by visual feedback; that way sight and hearing are stimulated at the same time, and there's reinforcement of what triggers the sound event or the visual effect, otherwise known as Juice.
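The sound-plus-visual pairing can be structured as a single event that fans out to both feedback channels, so they can never fire out of sync. A sketch of that pattern in Python (the handlers and the velocity-to-strength scaling are hypothetical, not the actual Unity/FMOD code):

```python
class JuiceEvent:
    """Fan one game event out to every registered feedback
    channel, so sound and visuals always fire together."""
    def __init__(self):
        self.handlers = []

    def subscribe(self, handler):
        self.handlers.append(handler)

    def fire(self, **payload):
        for handler in self.handlers:
            handler(**payload)

# hypothetical feedback channels, logged instead of played/drawn
log = []
wall_hit = JuiceEvent()
wall_hit.subscribe(lambda velocity: log.append(f"one-shot impact, vel={velocity}"))
wall_hit.subscribe(lambda velocity: log.append(f"RGB split, strength={velocity * 0.1:.1f}"))

wall_hit.fire(velocity=5.0)  # both channels react to one collision
```

The design point is that the collision raises one event and every sense-specific effect subscribes to it, rather than each effect polling the physics separately.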
Recommended: run in windowed mode at max settings, with a controller (or WASD on keyboard).
Last weekend I helped Edison put together a RepRap 3D printer.
We managed to get the chassis completely built in about 5 hours, and Edison sorted out the wiring and calibration the next day.
It’s a great piece of tech. It’s awesome how simple it is too.
I've always been a DIY enthusiast, building a variety of projects, many of which are still in progress.
The greatest thing about DIY for me is making something with your hands. Digital is good, but there's a different, more pure kind of satisfaction in cutting, grinding, bolting and attaching: the physical manifestation of one's imagination and thought, even if you're just following instructions and putting preexisting pieces together. The experience is a journey of motion, problem solving and thought that is so satisfying. That's why LEGO is the greatest toy ever made, and why Arduino and other devboards have gained so much popularity among maker communities.
I want to make all the things.