I wanted to familiarize myself with this technology.
For convenience, I used one of the abundant plasticine abominations we have around Colab as a subject.
The tech itself is nothing complicated in terms of usability: take lots of static shots from various angles, load them up, and processing produces a point cloud (though I imagine the science and maths behind that process are quite complex).
The potential is there to scan real-life 3D objects into a virtual space for further digital manipulation, using a regular camera instead of expensive scanning equipment.
Definitely an awesome technique to have in the quiver of experimental stuff you can do with computers.
I like the digital processing artifacts and the loss of data that’s happening here. A sort of glitch aesthetic, which I think is more interesting than a perfect 3D scan of a real-world thing.
I started working on some sound effects for Swordy over semester break.
Recording effects is a fun process, as it’s more tactile and immediate than trying to synthesize everything in software. I can use my voice or whatever objects I can find as foley.
The toughest part of recording is the realtime aspect of it. If I have 10 minutes of audio, I have to spend 10 minutes listening to it, and that time multiplies as I clean up, tweak EQ, try to extract pieces, and cut and export individual effects.
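To cut down on that manual slicing, I’ve been thinking about automating the first pass. Here’s a toy sketch in plain Python, assuming the recording is already loaded as a list of float samples; the threshold and gap values are made up, not tuned for real material:

```python
def split_on_silence(samples, threshold=0.02, min_gap=1000):
    """Return (start, end) index pairs of non-silent regions, cutting
    wherever at least `min_gap` consecutive samples stay below `threshold`."""
    segments, start, quiet = [], None, 0
    for i, s in enumerate(samples):
        if abs(s) >= threshold:
            if start is None:
                start = i        # a new sound begins
            quiet = 0
        elif start is not None:
            quiet += 1
            if quiet >= min_gap:
                # long enough silence: close off the current segment
                segments.append((start, i - quiet + 1))
                start, quiet = None, 0
    if start is not None:
        segments.append((start, len(samples)))
    return segments
```

Each pair could then be written out as its own file, turning one long take into a folder of candidate effects to audition.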
Audacity is the best tool there is for super fast edits; I always come back to it for splicing and crossfades.
Semester 1 is over, meaning no more sidetrack assignments. I have a year from now to produce a thesis paper.
In review, semester 1 was a very helpful exercise and a shift in the way I produce work and talk about things. Academia has a different mode of operating, and moving through that space changed a lot for me.
The best tangible thing to come out of it is, of course, the interactive piece I produced for Pigsty. I was able to take it a bit further for another assignment by introducing motion control. Using a Kinect, I interrogated the play space of the application from the points of view of a participant, an observer and the software itself, and the misalignment between their literal and metaphorical vantage points. I’m hoping to add a few more features to the app, like screenshot saving and some user options, and make Kinect control available for public download, but perhaps after some of the exhibitions I’m pitching this work for.
Last night, after little sleep and some obligatory writing, I went to Phil’s house, eager to show off my creation and watch him play the [previous version of the] app. I hadn’t yet seen anyone interact with it, and I knew the aesthetic aligned with a lot of Phil’s own work.
Half an hour into play and discussion about the sensations, feelings and relationships my work was establishing within that play session, he suggested removing buffer clearing from the camera to reinforce the painting aspect of the experience. A few seconds later, another hour of play ensued.
This change shifted the work from a domain dominated by audio, in terms of focus and output, towards the visual. I knew the previous iteration wasn’t final; it’s just what I had when time ran out. Playing that one now, something feels missing.
The experience in this new build is drastically different. The aquatic feeling is far weaker, replaced by a more temporal aspect of particle behavior. Layers upon layers of footprints create an evolving landscape of motion. Sound becomes less important, a background accompaniment to your very presence in the space.
I cleaned up the last-minute code I wrote the night before submission, and here’s the first version of what I would call a “piece of software I made”.
I will be using this as a platform for further experimentation and learning, especially since I have another university paper to fulfill, which will shape the process and outcome of this experiment in play.
WASD, LMB to clear visuals. Gamepad compatible.
Best played with headphones in the dark.
Last night Phil brought out his guitar when I was trying to work on another iteration of my sound toy. Needless to say, noise happened!
We jammed it up in Ableton, he was playing the guitar, while I took over his laptop to fiddle with audio filters, knobs and sends.
A lot of that is what inspired my new version of the experiment I’m working on.
I will write about it in the next post.
Unity is drastically changing its audio tools in the 5th release.
New compression format, lower memory footprint;
resource handling improvements;
audio metadata access;
performance profiling.

Audio Mixer, the “first offering” (more big features to come, interactive audio tools next):
- mixing & mastering
- DSP effects anywhere in the chain
- ADSR etc. in the editor
- runtime editing (mix while the game is playing)
- modeled on existing DAWs (e.g. Ableton, Logic)
- ability to create & chain together multiple mixers
- routing of sound groups as hierarchies (groups mixed down together)
- 3D positioning, doppler etc. (spatial sound concepts) applied to the source before the mixer
- effects stacked sequentially (like in a DAW)
- native plugin effects: custom DSP, custom UI
- snapshots: save states of your edits (awesome!)
- transitions between snapshots at runtime (even awesomer!)
- expose individual effect parameters, which can be script driven
- sends / returns
- ducking (side chain compression) on any audio group
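Snapshots and runtime transitions are the part I’m most excited about. As a toy model of the idea (plain Python, nothing to do with Unity’s actual API, and the snapshot names and values are invented), a snapshot is just a bag of parameter values, and a transition is interpolation between two bags:

```python
def blend_snapshots(a, b, t):
    """Linearly interpolate between two mixer snapshots, where each
    snapshot is a dict of parameter name -> value and t runs 0..1."""
    return {k: a[k] + (b[k] - a[k]) * t for k in a}

# Hypothetical snapshots: a calm mix and a combat mix.
calm   = {"music_db": 0.0,  "sfx_db": -6.0, "lowpass_hz": 22000.0}
combat = {"music_db": -3.0, "sfx_db": 0.0,  "lowpass_hz": 8000.0}

# Halfway through a transition, every parameter sits between the two states.
halfway = blend_snapshots(calm, combat, 0.5)
```

Driving `t` from game state each frame is all it takes to crossfade the whole mix, which is presumably what makes the feature so useful in practice.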
A very exciting update for audio in Unity. The presentation itself was pretty slack, but as an overview it makes me wonder whether I should just hold out for 5.0, or whether we will end up using both Unity native audio and FMOD at the same time.
Picked up a Kinect on Trade Me for NZD $30, because why not. Would be cool to set up a motion-controlled sound experiment.
I was looking through the MS Kinect SDK and its demos to see how it works. You can’t be too close to or too far from the camera. Luckily my room has just enough space to capture the whole skeleton.
Haven’t been able to get it to work with Unity yet, but there are plenty of resources online. For tonight I’m happy sending the green stickman back to infinity.
You may have discovered this effect in the previous builds: if you just touch a bass zone and leave it without passing completely through, it triggers off, leaving you in the “zero zone” where the last played bass goes through a low-pass filter. I really love the floating feeling of the LP filtering driven by velocity, so I made a scene that isolates that effect, using the most appropriate of the old 4 tracks.
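The core of that effect is just a low-pass filter whose cutoff tracks movement speed. A minimal Python sketch of the idea, using a one-pole filter; the cutoff range and speed scale here are made-up numbers, not the values in the app:

```python
import math

def one_pole_lowpass(samples, cutoff_hz, sample_rate=44100):
    """One-pole low-pass: each output sample chases the input with a
    smoothing coefficient derived from the cutoff frequency."""
    a = 1.0 - math.exp(-2.0 * math.pi * cutoff_hz / sample_rate)
    y, out = 0.0, []
    for x in samples:
        y += a * (x - y)
        out.append(y)
    return out

def cutoff_from_speed(speed, min_hz=200.0, max_hz=8000.0, max_speed=10.0):
    """Map movement speed onto a cutoff: standing still sounds muffled,
    moving fast opens the filter up."""
    t = max(0.0, min(speed / max_speed, 1.0))
    return min_hz + t * (max_hz - min_hz)
```

Each frame you’d measure the player’s velocity, feed it through `cutoff_from_speed`, and apply the resulting cutoff to the bass track, which is where the floating feeling comes from.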
There is something about this that just feels good. It’s meditative and hypnotizing. I wonder how this quality of carrying out actions for their sheer pleasure could be used in game mechanics or actions, in relation to game states, in Swordy.
Made the wall collision impact sound more pleasant, but the main thing was a spontaneous piece of code I added and the subsequent discovery. I was running the app in dual game/editor view while trying to convert my friend Jenna to Unity, and duplicated the cube. Both were controllable, and both triggered the collision sounds and background track graphics. So why stop at 2?
Press A (Xbox) / X (PS), or left click with the mouse, to spawn more cubes!
There’s no limit, and the clone function runs every frame if you keep it pressed (not advisable). GO! 😀
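For the curious, here’s why holding the button is “not advisable”. Since the duplicates are fully controllable, I’m assuming every live cube responds to the spawn button too, so the population doubles every frame the button stays down. A quick back-of-the-envelope sketch under that assumption, at 60 fps:

```python
def cube_count(frames_held, count=1):
    """If every live cube spawns one clone per frame while the button
    is held, the count doubles each frame."""
    for _ in range(frames_held):
        count *= 2
    return count

# A quarter of a second at 60 fps:
cube_count(15)  # 32768 cubes
```

Linear spawning would give you a leisurely 60 cubes a second; doubling buries the framerate almost instantly, which matches what happens in the build.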