In November I had the chance to participate in the Natural Interaction Hackathon, organized by the OpenNI organization and hosted by GarageGeeks and GameIS. For those who haven’t participated in one, a hackathon is a gathering of programmers who work collaboratively to accomplish a specific task. There are gaming hackathons where people create computer games under a time limit, and there are also hackathons where people try to find creative solutions to global concerns such as green energy. The Natural Interaction Hackathon’s goal was to create a motion-controlled application or game using the OpenNI Natural Interaction framework. My team and I went there to do just that: learn about motion control and natural user interfaces, and create a motion-controlled game. But why create motion-controlled games in the first place?

A typical hackathon scene – geeks and laptops… (photo by Yves Tennevin – http://bit.ly/MTnZHm)

Motion control is a form of Natural User Interface (NUI) – a method of controlling a machine in a way that is transparent to the user. Instead of using control devices such as a keyboard or a TV remote, the user can simply wave a hand to issue a command. Voice control is also a good example of NUI, but it is rather limited in both speed and accuracy. That’s why voice might be great for a smart-phone assistant but can’t really replace the gamepad, keyboard or mouse in gaming environments. A true contender for a ‘replace-all’ control mechanism is motion control.

iPhone’s ‘Siri’ is a good example of NUI

Since my company’s focus is on gaming, our team came to the hackathon with a pretty solid idea of what we wanted to achieve, and having a clear picture in our minds proved extremely helpful for a development cycle under such a strict time constraint. We were amazed by the number of original gaming and non-gaming ideas other teams had come up with. The hackathon started with a short lecture by Amir Hoffnung from PrimeSense, showing OpenNI in action using the Unity3D game engine, along with sample applications demonstrating the depth-sensing capabilities of the technology. Even at that point people were still coming up with new ideas, and this was reflected in the thought-provoking questions they asked.

Amir Hoffnung from PrimeSense demonstrating OpenNI with PrimeSense Sensor

After the demonstration, each person was invited to share his idea and ask other participants to join his project. Our idea was to create an arcade game in which the player controls a flying saucer rotating around the Earth and firing its death ray at it. The motion control scheme we used was straightforward and intuitive: the player aims the weapon with his hand, moves around by holding his hand out to the far left or right, and fires by pushing his hand forward towards the screen. A rough sketch of that mapping appears below.
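For the technically curious, here is a minimal sketch (in Python, for readability) of how a scheme like this can map a tracked hand position to game commands. The normalized coordinates, the thresholds and the sample frames are assumptions for illustration only – this is not the actual code from our game, which ran on OpenNI’s tracking rather than a stand-in like this.

```python
# A minimal sketch of the control scheme described above, assuming a
# hypothetical hand tracker that reports normalized coordinates.
# The thresholds below are illustrative, not tuned values from our game.

MOVE_ZONE = 0.6   # |x| beyond this moves the saucer (assumed threshold)
FIRE_DEPTH = 0.4  # hand pushed closer to the screen than this fires

def interpret_hand(x, z):
    """Map a normalized hand position to per-frame game commands.

    x: horizontal position in [-1, 1] (far left to far right)
    z: distance from the sensor in [0, 1] (0 = right at the screen)
    """
    commands = {"aim_x": x, "move": 0, "fire": False}
    if x < -MOVE_ZONE:
        commands["move"] = -1      # hand held out far left: orbit left
    elif x > MOVE_ZONE:
        commands["move"] = 1       # hand held out far right: orbit right
    if z < FIRE_DEPTH:
        commands["fire"] = True    # hand pushed toward the screen: fire
    return commands

if __name__ == "__main__":
    # A few sample frames standing in for live tracker input.
    for frame in [(0.1, 0.8), (-0.9, 0.7), (0.2, 0.3)]:
        print(frame, "->", interpret_hand(*frame))
```

The nice property of a mapping like this is that each frame is interpreted independently, so aiming, moving and firing can all happen at once with a single hand.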

Hacker friends trying to create an original control mechanism for their game

The actual hacking started on Saturday morning. PrimeSense, the developer of the 3D depth sensors, provided us with code samples and on-site help from some of its gurus (and, of course, the sensors themselves). At first it took people some time to learn how to work with the technology – it’s new, and the supporting software is ever-changing (and that’s a good thing). Soon enough, my team as well as the other teams got the hang of it and started developing at full speed. The development session consisted of people doing crazy and funny things in front of the sensors; it was no longer a development hackathon but rather a crazy hopping-and-dancing-in-front-of-a-depth-sensor session (again, a good thing!).

Our team, working on graphic design and visual effects

One week later, people started uploading their projects to the OpenNI Arena, the ‘app store’ for applications built on OpenNI technology. To me, that’s when things got really interesting: seeing all of these crazy ideas come to life. From a ‘Quidditch’ (broom-flying) game to a robotically controlled barbecue, and our very own ‘Very Mean Aliens’ – which won first place, by the way 🙂 – all these original concepts, ideas and control mechanisms were put into practice and showed a glimpse of what this technology can do for us.

‘Very Mean Aliens’ – our award winning game from the hackathon

What’s next?

That’s the big question here. 3D depth sensors can be used for all sorts of things – from creating 3D models without artists having to spend long hours of work (a small revolution in 3D design by itself), to robots that can better sense their surroundings, to art installations. This is just the tip of the iceberg of a new revolution in computing and the arts, and I’m glad that I had the chance to be a part of it.


See you all in the next hackathon,

Itzhak Wolkowicz, CEO

282Productions