A technical note on Vogels! and a not-so-technical note on Komodo Games and its success

Before I start with the technicalities, let me first give (for those who don't know) a little background on Komodo Games and Vogels!

During the last semester of the previous academic year I was involved in a unique experimental game project. Our team consisted of seven members: four came from the Utrecht School of the Arts (HKU), while another student and I came from Utrecht University. Our project needed a name, so during our first brainstorming session we came up with Komodo Games, which isn't bad, I'd say. The diversity of our team was evident from the roles each member had: a game designer, a technical game designer, a sound designer, a 2D artist, a 3D character designer, a 3D environment artist and me as the programmer. Or more specifically:

  • Adriaan de Jongh – Game Designer
  • Jens van de Water – Technical Game Designer
  • Eri Shiroyama – Sound Designer
  • Sandra da Cruz Martins – 2D Artist
  • Tim Remmers – 3D Character Designer
  • Ronald Houtermans – 3D Environment Artist
  • Francis Laclé – Programmer

All in all, a diverse group with specific roles that had to complete a game project in roughly 22 weeks. So what is Vogels!, and how did we come up with the idea in the first place?

Vogels! (Dutch for Birds!) is a serious game designed to support therapy for patients suffering from a condition known as hemiplegia. Focal Meditech was the company that supplied the hardware for our motion game. They specialize in making custom wheelchairs for such patients and over the last few years had been doing research into games as a sort of add-on to the rehabilitation process. One of the most important types of rehabilitation is rehabilitating one's arm. If you think about it, you use your arm daily for almost everything. Losing control of your arm would in fact be worse than losing control of your legs because, for instance, you couldn't propel yourself in a wheelchair or feed yourself. For this specific type of rehabilitation they created a mechanical arm support with counterweights that allows a patient to focus on the movements and less on the weight of the arm. They already had a 2D game prototype and wanted us to extend the research and come up with a prototype of a game in 3D. In this context, 3D meant using their device to track movements along all three axes.

During our research phase a few of us visited some clinics to get an idea of possible themes and colors that were relevant to our target audience. We combined that information with what we learned from physiotherapists about the range of movements these patients execute during a standard therapy session. This turned out to be more than useful, because the combined information had a direct influence on the game mechanic that we would eventually adopt. After iterating a few times over the results of our research, we came up with the idea of a bird constantly flying around, with the camera placed right behind it and constantly targeting our protagonist. This allowed us to utilize all three axes and helped us translate the mechanical movements of the hardware intuitively into the in-game movements of our bird. This was a very important criterion for our project, because some patients had never played a computer game or even sat behind a computer before.

It's all about motion nowadays

Because this game relied heavily on its unique control system, I was put in charge of programming the capture of the movement data and sending it to our game engine. Not a very straightforward task. A fellow team member (the technical game designer) was responsible for making sure that what you did with the controller was what you saw in the game. That meant computing the possible range of movements, setting up a calibration scheme and implementing linear interpolation to smooth the movements without causing noticeable delays. It also meant showing visual guides (in the form of wind particles) and turning the bird sideways when it changed direction, thus simulating a banking rotation (see video below).
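The calibration and mapping were Jens's work, so I won't go into detail here, but to give a feeling for the kind of smoothing involved, here is a minimal Unity-style sketch. All names and values are hypothetical placeholders; the real code also handled calibration and the banking rotation.

```csharp
using UnityEngine;

// Illustrative sketch: smoothing the raw controller-derived target position
// with linear interpolation so the bird follows the arm without jitter.
public class BirdSmoothing : MonoBehaviour
{
    // Target position derived from the calibrated controller data
    // (assumed to be set elsewhere by the input manager).
    public Vector3 targetPosition;

    // Smoothing strength; higher values follow the input more tightly.
    public float smoothing = 8f;

    void Update()
    {
        // Frame-rate independent interpolation factor, so the perceived
        // delay stays roughly constant regardless of the machine.
        float t = 1f - Mathf.Exp(-smoothing * Time.deltaTime);
        transform.position = Vector3.Lerp(transform.position, targetPosition, t);
    }
}
```

The point of the exponential factor is simply that the interpolation behaves the same at 30 and 60 frames per second, which matters when you are trying to keep the delay unnoticeable.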

At the very beginning we were using the Wii Remote to capture the data. However, we had a lot of difficulties with Unity3D, our game engine. Because Unity3D at the time used an outdated version of Mono, the Wii Remote libraries were either incompatible or not working 100%. After discussing the issue with Focal Meditech, it was suggested that we find another way to capture the data, not just for technical reasons but also business-wise. Having a custom method for motion tracking is in many ways better than using components from an off-the-shelf game controller, even if it's just a prototype.

Arno Kamphuis, our supervisor at the university, suggested capturing the motion data using OpenCV. I ended up using a C# wrapper for OpenCV called EmguCV, for a few reasons. First, after doing some research online, it turned out that EmguCV was one of the most recommended wrappers. Second, it was nice to program in C# because it's managed and would cost less time than creating the same functionality in C++. Coming mostly from a Java/PHP background, this was the first time I used C# in a real project, so I also learned a thing or two. The last reason for a C# wrapper was the compatibility it promised with our game engine, Unity3D: you can import C# scripts and compile them directly from within the engine (through Mono). Later it turned out that this was not the way to go, because the Mono library was outdated and I wasted a lot of time trying to see whether it was still possible to combine it with EmguCV. In the end it turned out for the better to separate the two layers, because then the input manager is independent of the game engine.
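To give an idea of what the EmguCV side roughly looks like, here is a small sketch based on the EmguCV 2.x-era API (newer versions rename some of these types). The camera index and threshold values are placeholders, not the ones we actually used.

```csharp
using System;
using Emgu.CV;
using Emgu.CV.Structure;

// Illustrative sketch: grab a frame from the camera and keep only the
// very bright pixels, which is roughly where the LEDs show up.
class LedCapture
{
    private readonly Capture capture = new Capture(0); // default camera, placeholder index

    public Image<Gray, byte> GrabFilteredFrame()
    {
        // Query a BGR frame, convert to grayscale and threshold it so that
        // only bright blobs (the LEDs) remain in the binary image.
        Image<Bgr, byte> frame = capture.QueryFrame();
        return frame.Convert<Gray, byte>()
                    .ThresholdBinary(new Gray(240), new Gray(255));
    }
}
```

The filtered binary image is what the feature-point detection described below works on.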

With the Wii Remote issues behind us and the decision made to use a separate piece of software to track the input, I will now explain how the system works without giving too much detail. The arm has four LEDs positioned at equal distances from each other. We capture a filtered image through OpenCV with a camera that is placed on top (pictured above). This gives us four feature points. With these feature points we can form a square, which in turn gives us its circumference. This is useful if we want to track the vertical height: the greater the circumference, the smaller the distance between the arm and the camera. Rotation can also be obtained to a certain degree, but more time was needed to implement it and it doesn't work for fast motions. This could technically be solved with some hardware adjustments, but in the end we simply didn't have enough time to reach the same robustness we got from the tracking algorithm, so we decided to drop the feature at the last minute.

On a positive note, one feature that did come in handy was point compensation. It would often happen that a patient occluded one LED with his or her arm. In that case I needed to check which point was missing and virtually add it back into the tracking process. Because the shape is a standard square, however, there was no way to track movement when two or more points were occluded; for tracking you need at least three points (a triangle). Nevertheless, it lowered the probability that the points wouldn't get tracked.
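To make the geometry a bit more concrete, here is an illustrative sketch of the two ideas above: using the circumference (perimeter) of the LED square as a depth cue, and recovering a single occluded corner from the remaining three. The scale factor is assumed to come from a calibration step; everything here is simplified.

```csharp
using System;
using System.Numerics;

// Sketch of the geometric side of the tracker (simplified, hypothetical names).
static class LedGeometry
{
    // Perimeter of the quadrilateral formed by the four LED points, in image order.
    public static float Perimeter(Vector2[] p)
    {
        float sum = 0f;
        for (int i = 0; i < p.Length; i++)
            sum += Vector2.Distance(p[i], p[(i + 1) % p.Length]);
        return sum;
    }

    // The perimeter shrinks as the arm moves away from the camera, so depth
    // can be approximated as inversely proportional to it. 'scale' would be
    // determined during calibration.
    public static float EstimateDepth(float perimeter, float scale) => scale / perimeter;

    // If exactly one corner of the square is occluded, it can be recovered
    // because opposite corners share the same midpoint:
    // missing = a + c - b, where b is diagonally opposite the missing corner.
    public static Vector2 RecoverMissingCorner(Vector2 a, Vector2 b, Vector2 c) => a + c - b;
}
```

The corner recovery works because the diagonals of a square (or any parallelogram) share the same midpoint, which is also why the trick breaks down as soon as two corners are missing.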

The final idea was to have a separately running application and send the data through a network protocol (pictured above). This of course brings some performance issues, so we had to carry out some test scenarios to improve performance quite a bit, both in the motion-tracking software and in Unity3D. Besides performance, there were lots of things that could have been done more elegantly on the technical side. Personally, the most important programming lesson I have learned is not to spend too much time on making incompatible software compatible, even if you are a stubborn perfectionist.
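For completeness, here is a rough sketch of what the output side of such a separate tracking application could look like, pushing each sample to Unity3D over UDP on localhost. The port number and the message format are made up purely for illustration.

```csharp
using System.Net.Sockets;
using System.Text;

// Illustrative sketch: the stand-alone tracker sends one small text packet
// per sample to the game running on the same machine.
class MotionSender
{
    private readonly UdpClient client = new UdpClient();

    public void Send(float x, float y, float z)
    {
        // A tiny text protocol is enough for three coordinates per frame.
        byte[] payload = Encoding.ASCII.GetBytes(string.Format("{0};{1};{2}", x, y, z));
        client.Send(payload, payload.Length, "127.0.0.1", 5005);
    }
}
```

On the Unity side, a socket on a background thread can read these packets and hand the most recent sample to the input manager, which keeps the game loop free of blocking network calls.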

The Dutch Game Awards 2010

The awards were held this year in the same venue as last year, during the Game In The City event in Amersfoort, and I have to say what a pleasant event it was. There weren't many attendees, but then I was only present during the business program and not on the following day, which was meant for the general public. Vogels! was nominated in three categories: the Special Award, Best Student Game and Best Serious Game. In the end we won just the Special Award, though I don't think anyone on the team was expecting it. The other nominees delivered strong products, so the competition was tight (see video below). We also managed to get nominated (on the same day) for the Steve P. Wozniak (Wozzie) Award (link is in Dutch), which was really unexpected. Combining these nominations with the Diamond Trophy Award from last summer, I can say that we can be proud as a team. What made Vogels! attractive, in my opinion, is its unique input controller and what it means for gaming in general. The field of serious games is expanding fast and entering unknown territories every year. I hope that Focal Meditech releases a commercial product in the near future, and I expect serious games to keep growing as motion gaming becomes ever more popular thanks to advances in computer vision and the Wii revolution of the past few years.

Posing at DGA 2010
Here we see Tōru Iwatani (creator of Pac-Man) testing out Vogels! @NLGD 2010 :)
