Posts Tagged: gamedev

Bought a Kinect

Picked up a Kinect on Trade Me for NZD$30, because why not. Would be cool to set up a motion-controlled sound experiment.

Was looking through the MS SDK toolkit and its demos to see how it works. You can’t be too close to or too far from the camera. Luckily my room has just enough space to capture the whole skeleton.

Haven’t been able to get it to work with Unity yet, but there are plenty of resources online. For tonight I’m happy sending the green stickman back to infinity.

Low Pass Flow

velocity sound – test 05 (WIN) (OSX)

You may have discovered this effect in the previous builds: if you just touch a bass zone and leave it without passing all the way through, it triggers off, leaving you in the “zero zone”, where the last played bass goes through a low-pass filter. I really love the floating feeling of the LP filtering driven by velocity, so I made a scene that isolates that effect, using the most fitting track of the old four.

There is something about this that just feels good. It’s meditative and hypnotic. I wonder how this quality of carrying out certain actions for sheer pleasure could be used in Swordy, as game mechanics or in-game actions tied to game state.
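As a minimal sketch of the effect in Unity (assuming the moving object carries a Rigidbody plus an AudioSource with an AudioLowPassFilter; the script name and parameter values here are illustrative, not the project’s actual code):

    using UnityEngine;

    // Maps the object's speed onto a low-pass cutoff, so the track muffles
    // as you slow down and opens up as you speed up.
    [RequireComponent(typeof(Rigidbody), typeof(AudioLowPassFilter))]
    public class VelocityLowPass : MonoBehaviour
    {
        public float minCutoff = 300f;    // cutoff when standing still (Hz)
        public float maxCutoff = 22000f;  // cutoff at full speed (Hz)
        public float maxSpeed = 10f;      // speed that fully opens the filter

        Rigidbody body;
        AudioLowPassFilter lowPass;

        void Awake()
        {
            body = GetComponent<Rigidbody>();
            lowPass = GetComponent<AudioLowPassFilter>();
        }

        void Update()
        {
            // Normalize current speed into 0..1 and interpolate the cutoff.
            float t = Mathf.Clamp01(body.velocity.magnitude / maxSpeed);
            lowPass.cutoffFrequency = Mathf.Lerp(minCutoff, maxCutoff, t);
        }
    }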

Multicubes

mct_04 (WIN) (OSX)

Made the wall collision impact sound more pleasant, but the main thing was a spontaneous piece of code I added and the discovery that followed. I was running the app in a dual game/editor view while trying to convert my friend Jenna to Unity, and duplicated the cube. Both were controllable, and both triggered the collision sounds and the background track graphics. So why stop at 2?

Press A (Xbox) / X (PS) (or left-click with the mouse) to spawn more cubes!
There’s no limit, and the clone function runs every frame if you keep the button pressed (not advisable). GO! 😀
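For the curious, the cloning boils down to something like this (a sketch; the build reads gamepad buttons through InControl, while this version just uses the mouse, and the names are placeholders):

    using UnityEngine;

    // Spawns a copy of the cube prefab while the button is held.
    public class CubeSpawner : MonoBehaviour
    {
        public GameObject cubePrefab; // the controllable cube, set in the Inspector

        void Update()
        {
            // GetMouseButton (unlike GetMouseButtonDown) stays true for every
            // frame the button is held, which is why holding it floods the
            // scene with one new cube per frame.
            if (Input.GetMouseButton(0))
            {
                Instantiate(cubePrefab, transform.position, Quaternion.identity);
            }
        }
    }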

The Hard Way

I realize I’ve been lying to myself for a long time. I’ve been telling myself that “in the time that it would take me to get any good at writing code, I could get proportionately better at something else”.

Now I find myself in a place where I neither learned to code, nor got better at that “other” thing I supposedly would have improved at in the time it could’ve taken me to learn.

As hard as it is, I need to take steps towards code proficiency beyond simple scripts. Implementing my own audio and setting up the relationships that drive my sound design is a perfect opportunity to stop giving myself excuses and get it done.

I want to purposefully avoid tools like Playmaker and other node-based programming aids. They would let me make things faster, and they’re fine for prototyping, but they’re not good for production or for my learning.
Thankfully, my team @frogshark is here to help 😀

time to get over it and do some code.

The Project Project

Over the break, I had the privilege of participating in The Project expo at AUT. People came to talk, listen, and discuss various topics around digital disruption.

I was there to showcase Warp, an Oculus VR demo we created at Frogshark (before it was officially formed). We showed the demo at the Digital Nationz expo in 2013, and it hasn’t changed since, as we moved on to working on Swordy. The event wanted a VR demo on display, though, so it was an opportunity for me to see how a different kind of audience reacts to VR tech.

The audience was mostly mature. The highlight of the event for me was seeing people who had never played video games, never experienced anything like VR, some of whom had never even held a controller, try something like this for the first time.

I couldn’t hook the sound up to the TV, so the demo ran mute; a good number of people, however, made their own “pew pew pew” sounds to compensate.
Most people who tried it stared straight ahead at first, as they’re used to doing with a normal screen. I had to prompt them to actively look around. No one noticed they had no legs until I pointed it out.

It was evident again how important it was for the controls to be minimal. Only the left stick and the trigger were required. Most had never used an Xbox controller and would get lost trying to find the left analogue stick, but would eventually get everything within seconds.

One particular lady (pictured above), who claimed to have never held a controller before (she was also one of those who made their own sounds and ducked and leaned with their body in response to VR), engaged with the game completely. She instantly got the head-look aim mechanic, understood the ship controls the moment she touched the analogue sticks, shot the enemies within seconds of them appearing, and trialled the rest of the buttons on the controller to discover the barrel rolls, which she also accompanied with body movement and vocal sound effects. That lady was awesome.

There was also only one person who insisted that inverted Y is the right way, and he was a young gamer (I would agree if I were playing with an actual joystick in a free-flight mode, not an on-rails 2.5D Star Fox VR hybrid).

Other than that, it was great to see the responses from an older audience to something this unfamiliar.
In fact, the engagement didn’t begin the moment they put the headset on; there is a whole ritual that starts well before that. The inquisitive gaze from a distance, the careful approach, the childlike expression of curiosity, and the internal battle between excitement and the unfamiliar.
Something very primal is expressed in that ritual, as if they were cavemen gazing upon fire for the first time: the careful danger dance, the curious, courageous touch, and the satisfied retreat to tell the others they have done it.

Fmod Blend test

FMOD drum test 01

Download Build

I couldn’t figure out how to stack the loops with synchronous playback while blending between them based on the parameter value. Instead, I set up a system where the loop region for each intensity level has sends driven by the main parameter knob. Every loop region owns a range of that parameter within which the loop repeats. If I turn the knob, the playhead jumps to the loop for that range of values and plays it from the beginning.

It was too abrupt at first, because the playhead doesn’t jump to the same point in the bar, but to the start of the loop region. Since there are no bars, these are freeform events within events, with audio files inside of them. To alleviate the harsh pop, I added an ADSR envelope to every loop. That fades them out and blends them more smoothly, although there’s some doubling up going on, or they go quiet during quick value changes as the loops try to catch up. Everything still feels desynchronized.

I need to test a system with loops that just blend and stack well with each other, keeping the playhead in place while swapping out samples.
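As a starting point, here is a minimal sketch of that idea using plain Unity AudioSources instead of FMOD (the class, the equal-spacing crossfade, and all values are assumptions for testing the behaviour, not the eventual FMOD setup):

    using UnityEngine;

    // All intensity stems start sample-accurately in sync and loop forever;
    // the intensity parameter only crossfades their volumes, so the
    // "playhead" never jumps.
    public class StackedLoops : MonoBehaviour
    {
        public AudioSource[] stems; // one looping source per intensity level
        [Range(0f, 1f)] public float intensity;

        void Start()
        {
            // PlayScheduled locks every stem to the same DSP start time.
            double startTime = AudioSettings.dspTime + 0.1;
            foreach (var stem in stems)
            {
                stem.loop = true;
                stem.volume = 0f;
                stem.PlayScheduled(startTime);
            }
        }

        void Update()
        {
            // Each stem peaks when intensity sits at its slot and fades to
            // silence one slot away: a simple equal-spacing crossfade.
            float slot = intensity * (stems.Length - 1);
            for (int i = 0; i < stems.Length; i++)
            {
                stems[i].volume = Mathf.Clamp01(1f - Mathf.Abs(slot - i));
            }
        }
    }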

TV3 Appearance

Thanks to Ben & the NZGDA, David Farrier from TV3 approached us; they were looking to do a segment on virtual reality to air over Easter.

Having participated in VRJam 2013, we came up with a tech demo that we subsequently showcased at the Digital Nationz expo late last year.

We went in with the mentality that getting a mention is good and any exposure is good exposure, and indeed, we were able to show both Warp and Swordy, which is great.

We knew, however, that this wasn’t about us. This was about VR, and about portraying New Zealand as a place for innovative new technology, which is a worthy pitch that I was more than willing to take part in.

It is interesting, however, to see the final result. We spent around an hour with David’s team, filming and doing interviews. It’s fascinating to see how condensed the final edit is, what was used and what wasn’t, and how everything was conducted behind the scenes.

We weren’t aware, for example, of what other material we would appear alongside in the edit. We only knew about our own particular involvement and the roller-coaster playtest with an elderly gentleman, which happened in the same room as us. That segment, I think, could have felt less staged if the approach to getting an emotional response out of him had been more genuine, but it still worked out fine and everyone had fun.

I think overall this was a great opportunity, and it worked out really well. Swordy is on national TV, score!

Spaceman game sketch

Initially I was thinking of making a test scene in Unity where I would spawn a cube at the press of a button, have it make a sound as it collided with the floor, then ramp up the bounciness and see what that looks and sounds like.

That is not at all what I ended up with, however.

I started off by working out how to get controller inputs, because I wanted to be able to move the box using the analogue stick. Using the InControl package, I was able to get input readouts.

But when I tried making a movement controller script, since I don’t have coding experience, I pulled apart an existing one that used physics.

What I ended up with was a box that would only shoot in a given direction (the analogue stick), and could only reconfigure its vector when its speed dropped below a certain threshold. An unexpected result, but it led to a ☆☆GAME DESIGN IDEA☆☆

You control a spaceman, or rather, spaceman’s arms. You can reach, grab, push, pull, and aim your arms.

The idea of the game is to traverse a space station that is falling apart module by module in zero gravity, pulling and pushing off objects.

This could be an endless run through a procedural maze of modules, or something finite, but the key is the QWOP-style arm controls combined with zero gravity and random events: explosions, loss of oxygen, fire, collisions, debris.

And all that, from a coding mistake in a script to move a box.
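For the record, my guess at what that accidental behaviour might look like as a minimal script (not the original code; names, axes and values are made up, and the original read input through InControl):

    using UnityEngine;

    // The happy accident: the box only accepts a new direction once its
    // speed drops below a threshold, so it shoots along one vector, then
    // waits to be re-aimed.
    [RequireComponent(typeof(Rigidbody))]
    public class ShootyBox : MonoBehaviour
    {
        public float launchForce = 10f;
        public float speedThreshold = 0.5f; // below this, input is accepted again

        Rigidbody body;

        void Awake()
        {
            body = GetComponent<Rigidbody>();
        }

        void FixedUpdate()
        {
            Vector3 stick = new Vector3(Input.GetAxis("Horizontal"), 0f,
                                        Input.GetAxis("Vertical"));

            // The "bug": input is ignored while the box is still moving fast.
            if (body.velocity.magnitude < speedThreshold && stick.sqrMagnitude > 0.01f)
            {
                body.AddForce(stick.normalized * launchForce, ForceMode.VelocityChange);
            }
        }
    }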

Whale bone whale shadow

This is a curious coincidence: I was making a ZBrush sculpt of a whale vertebra, and when I imported my low-poly model into Unity, its shadow looked like a whale itself 🙂