We went in with the mentality that getting a mention is good and any exposure is good exposure, and indeed we were able to show both Warp and Swordy, which is great.
We knew, however, that this wasn't about us. This was about VR, and about portraying New Zealand as a place for innovative new technology, which is a worthy pitch that I was more than willing to participate in.
It is interesting, however, to see the final result. We spent around an hour with David's team, filming and doing interviews. It's fascinating to see how condensed the final edit is, what was used and what wasn't, and how everything was conducted behind the scenes.
We weren't aware, for example, of what stream of material we would end up among in the edit. We only knew about our own particular involvement and the roller-coaster playtest with an elderly gentleman, which happened in the same room as us. That segment, I think, could have felt less staged if the approach to getting an emotional response out of him had been more genuine, but it still worked out fine and everyone had fun.
I think overall this was a great opportunity, and it worked out really well. Swordy is on national TV, score!
Two weeks ago, we (the Frogshark team) visited Thought-Wired, an Auckland-based company doing R&D on thought-control interfaces and neurotech. Dmitri Selitskiy showed us where things currently stand in that tech space, and it was interesting to see how they're trying to build a platform for developers like us to interface with all these different proprietary BCI headsets.
I got to try the Emotiv BCI. It's really quite impressive to experience the early tech in this area. It's not as plug-and-play as one would expect from a consumer product: you have to lubricate the electrodes, and you have to spend hours 'training' the system to adjust to your specific brainwave activity. It's a really steep ramp to get it to "just work".
I think it's still at a very early stage; it will probably be a couple of years before we see real home consumer applications. But the potential is already here for therapy and medical research, and this kind of technology is a blessing for people with disabilities.
The gaming applications for BCIs, however, are incredible, and I'm not just talking about playing Star Wars or letting disabled people play traditional games, but also about new opportunities for gameplay.
What if I’m using the controller to perform standard mechanical actions that I don’t have to “think” about and just perform from kinesthetic memory, and how does that coexist with brainwave control schemes?
Is there going to be some kind of hyper-threading where the BCI doesn’t just pick up on particular thoughts, but on particular “ways of thinking”?
All of this would certainly raise some eyebrows among privacy advocates; ethics aside, however, the future for this technology is very bright.
During VRjam we only had one week to put the demo together, hence most of it was assets salvaged from other projects. This plane in particular was a model I built for fun, which temporarily got to be the hero vehicle. Now I'm retiring it to serve as a placeholder for enemy ships while a new hero model is being implemented.
Last week Hamish and I finished our submission for VRjam, a game jam run by OculusVR. We only got to work on it for one week part-time instead of three, but I'm really stoked about the results. We might continue working to make this into something more.
I armed myself with my 7D, went to the nearest park and walked around this tree.
I decided to play with motion blur and see what happens if I increase the blur amount while retaining the speed of motion.
This is a curious coincidence: I was making a ZBrush sculpt of a whale vertebra, and when I imported my low-poly model into Unity, its shadow looked like a whale itself 🙂
A random piece of pixel art I made, which I stumbled upon while cleaning up my Dropbox! I remember this one; I made it under the impression of playing Tiny Tower, back when it was still a thing.
A game I've been working on at Gameloft has just gone gold and been approved by Apple. It's much anticipated by fans, apparently. I'm no brony, and the only thing I'm excited about is finally not hearing the main theme play over and over in the office.
The things I did on the game were VFX art (when you see sparkles and bursts of butterflies and so on, I was responsible for those) and tech art: some behind-the-scenes sprite optimization, animated sprites, and things like that. The game went through a lot of metamorphosis in terms of asset management.
A tiny animation I made for my brother's YouTube channel as an intro to his videos. He's a metal-detecting specialist, discovering lost treasures and cleaning the environment of hazardous metal scrap. Check out his site and blog:
Made in Photoshop, with sound design done using the iPad apps BFXR and PixiTracker 1bit. The soundtrack was assembled in Live, and the frames were assembled in VirtualDub.