Two weeks ago, we (the frogshark team) visited Thought-Wired, an Auckland-based company doing R&D on thought-controlled interfaces and neurotech. Dmitri Selitskiy showed us where things currently stand in that tech space, and it was interesting to see how they’re trying to build a platform that lets developers like us interface with all the different proprietary BCI hardware.
I got to try the Emotiv BCI. It’s quite impressive to experience the early tech in this area, though it’s not as plug-and-play as you’d expect from a consumer product. You have to lubricate the electrodes, and you have to spend hours ‘training’ the system to adjust to your specific brainwave activity. It’s a really steep ramp to get it to “just work”.
I think it’s still at a very early stage, and it will probably be a couple of years before we see real home consumer applications, but the potential is already here for therapy and medical research. This kind of technology is a blessing for people with disabilities.
The gaming applications for BCIs, however, are incredible, and I’m not just talking about playing Star Wars, or letting disabled people play traditional games, but also about new opportunities for gameplay.
What if I’m using the controller to perform standard mechanical actions that I don’t have to “think” about and just perform from kinesthetic memory? How does that coexist with brainwave control schemes?
Is there going to be some kind of hyper-threading where the BCI doesn’t just pick up on particular thoughts, but on particular “ways of thinking”?
All of this would certainly raise some eyebrows among privacy advocates; ethics aside, though, the future for this technology is very bright.
During VRjam we only had one week to put together the demo, so most of it was assets salvaged from other projects. This plane in particular was a model I built for fun, which temporarily got to be a hero vehicle. Now I’m retiring it to serve as a placeholder for enemy ships while a new hero model is being implemented.
Last week Hamish and I finished our submission for VRjam, a game jam run by OculusVR. We only got to work on it for one week part-time instead of three, but I’m really stoked about the results. We might continue working to make this into something more.
I armed myself with my 7D, went to the nearest park and walked around this tree.
I decided to play with motion blur and see what happens when I increase the blur amount while keeping the speed of motion constant.
A game I’ve been working on at Gameloft has just gone gold and been approved by Apple. It’s much anticipated by fans, apparently. I’m no brony, and the only thing I’m excited about is finally not hearing the main theme play over and over in the office.
My work on the game covered VFX art – when you see sparkles and bursts of butterflies and the like, I was responsible for those – and tech art: some behind-the-scenes sprite optimization, animated sprites and so on. The game went through a lot of metamorphosis in terms of asset management.
A tiny animation I made for my brother’s YouTube channel as an intro to his videos. He’s a metal-detecting specialist, discovering lost treasures and clearing hazardous metal scrap from the environment. Check out his site and blog:
Made in Photoshop; sound design done with the iPad apps BFXR and PixiTracker 1bit. The soundtrack was assembled in Live, and the frames were assembled in VirtualDub.
I bought a Nerf Spectre for the purpose of modding it. Internally, I removed the air restrictors and added spacers for more spring power; externally, I gave it a vivid black-and-red paint job, finished with some wear and tear, oil leaks and aging. The whole thing took me roughly two days.
Inspired by Hindu and Buddhist statuettes, and by Bathing with Elephants, a theatrical audio-visual mashup performance I attended in January at the Civic Theatre in Auckland, where I saw the Ganesha statues.
Themes of time, recursive motion and repetition influenced this piece.