Tuesday, December 15. 2009
Summary: Three multitouch tables built using a hybrid touch-sensing method to deal with a challenging environment. Additional discussion covers manual iris control, the advantages of Point Grey's Firefly camera drivers, an operator "driving" the user experience via gesture, and sound domes.
The first time I set up the Point Grey Firefly with the wide-angle lens from Computar, it was a moment of anticipation: would the math on paper hold up, and had I picked the lens that would give me a clear, non-distorted view of the table surface? That all worked out peachy, but I saw a new problem through the camera. The fluorescent lighting overhead was giving off a considerable amount of noise, and even worse, the noise was oscillating/strobing in and out!

The blob-tracking software I use is CCV (Community Core Vision), and it is simply great at what it does, because it gives you everything you need to tweak settings for your particular setup. One newer feature is called "learn": CCV can sense a new addition to the environment that isn't a touch and, after a time threshold, start folding it into the background it is meant to ignore. Sure, that works great when a lighting condition suddenly changes. Not so great for a rhythmic oscillation! Without Point Grey's drivers letting me dig deep into the camera settings, I'd have been stuck right then and there. I was able to access the shutter speed, which is adjustable in very small steps. I stepped it up gradually using the cursor keys and watched the oscillation speed up until it matched that pesky glow overhead, resulting in a steady-on image. I then had to tweak some of my other settings so they jibed with the new shutter setting. Problem solved. Thank you, Point Grey!
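A quick aside for anyone fighting the same flicker. This is the back-of-the-envelope version, assuming 60 Hz mains power and ordinary magnetic ballasts, so the figures are illustrative rather than my exact show-floor numbers: fluorescent tubes on those ballasts pulse at twice the line frequency, about 120 times per second, so one flicker cycle lasts 1/120 s, roughly 8.3 ms. Any shutter (exposure) time that is a whole multiple of that period, about 8.3 ms, 16.7 ms, 25 ms, and so on, collects the same amount of light on every frame, which is why the pulsing flattens out into the "steady on" I was hunting for with the cursor keys. Fixtures on electronic ballasts flicker far faster and usually aren't a problem in the first place.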
Hardware: Since I didn’t how tough the environment outside the table would be, I had to reduce the internal environment to the highest contrast view of touches possible. First I made the decision to use a Point Grey Research camera. The Sony PS3 Eye cameras are highly regarded for use in multitouch tables but I wanted the kind of access to driver settings that Pt.Grey provides. They are all about computer vision and the features of their drivers show it. This camera provides a C mount (one of the industry standards for a threaded lens mount, usually found on security cameras and microscopes) which brings me to the second part of my super ninjaskill solution.
Pictured with sound domes.
A mention about the sound domes: some of the content on the tables was video clips with a narrated audio track and some subtle sound effects. In order to deliver crisp, clear audio to the user without competing with ambient trade-show noise or the audio from the other tables, we localized the audio using sound domes. It was my first experience with them. They work remarkably well, although the ones we had were underpowered: the audio would start to clip before we reached the level that would have been ideal. Even so, we achieved clear sound at a usable volume in the most logical sweet spot for the user.
Software: We pulled off some special tricks on the software side as well. We had a lot of content provided by the client. All of it was to be redone for aesthetic purposes, but we couldn't alter any of the user-interface flow. In the world of pharmaceutical trade shows, once an experience has gone through its long and arduous approval process, it is not to be changed (lest it go before the lawyers for another round of approvals). Well, the approved interface flow did not lend itself well to a multitouch table designed for attendee education, so it was decided that each table would be staffed by a "driver" who could guide a user through the multifaceted and somewhat deep interface.

That left us facing something new: an opportunity to create a new kind of interaction. Two people would stand opposite each other, both using the interface, with an opportunity for dialog through the entire experience. To make the interface natural for both parties, we seized the chance to try something cool. We created a gesture (see Seth Sandler's blog post on AS3 gestures) similar to the turning of a large dial, which rotates the whole screen 180 degrees. As simple as that sounds, it was great fun to do: no matter where you spun your five fingertips, a huge 50"+ display rotation would ensue. I saw some people experimenting with the gesture recognition just for the fun of it, and I learned to always include a simple action with an unexpected payoff in future projects. Everyone had a good smile to go with their new information, and that came from a true-blue utilitarian function of the interface, not a bit of eye candy placed there purposely to add fun value. I was very pleased with the accidental outcome.
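For the curious, here is a rough sketch of how a dial-style rotation gesture can be detected. To be clear, this is not Seth's implementation or the library from his blog post, just a simplified ActionScript 3 stand-in with hypothetical names. It assumes your tracker client (TUIO/CCV or similar) hands you an array of touch objects carrying id, x and y once per frame:

package {
    import flash.display.DisplayObject;
    import flash.utils.Dictionary;

    public class DialRotationGesture {
        private var _lastAngles:Dictionary = new Dictionary(); // touch id -> angle last frame (radians)
        private var _accumulated:Number = 0;                   // rotation swept so far (radians)
        private var _target:DisplayObject;                     // container to flip (assumed registered at screen center)

        public function DialRotationGesture(target:DisplayObject) {
            _target = target;
        }

        // Call once per frame with the current touch points. Each touch is assumed
        // (hypothetically) to expose id, x and y; adapt to whatever your TUIO/CCV
        // client actually provides.
        public function update(touches:Array):void {
            if (touches.length < 3) {            // need a handful of fingers on the "dial"
                _lastAngles = new Dictionary();
                _accumulated = 0;
                return;
            }

            // Centroid of the fingertips: the center of the imaginary dial.
            var cx:Number = 0;
            var cy:Number = 0;
            for each (var t:Object in touches) { cx += t.x; cy += t.y; }
            cx /= touches.length;
            cy /= touches.length;

            // Average how far each finger has rotated around the centroid since last frame.
            var newAngles:Dictionary = new Dictionary();
            var frameDelta:Number = 0;
            var counted:int = 0;
            for each (var p:Object in touches) {
                var a:Number = Math.atan2(p.y - cy, p.x - cx);
                newAngles[p.id] = a;
                if (_lastAngles[p.id] !== undefined) {
                    var d:Number = a - _lastAngles[p.id];
                    if (d > Math.PI)  d -= 2 * Math.PI;   // unwrap across the +/-180 degree boundary
                    if (d < -Math.PI) d += 2 * Math.PI;
                    frameDelta += d;
                    counted++;
                }
            }
            _lastAngles = newAngles;
            if (counted > 0) _accumulated += frameDelta / counted;

            // Once the fingers have swept most of a half turn, flip the whole UI.
            if (Math.abs(_accumulated) > Math.PI * 0.75) {
                _target.rotation += 180;
                _accumulated = 0;
            }
        }
    }
}

The idea is simply to accumulate how far the fingertips have swept around their own centroid; once they have turned most of a half circle, the stage container gets its rotation bumped by 180 degrees, and because the centroid travels with the fingers it doesn't matter where on the table those five fingertips land.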
Special thanks go to Seth Sandler for his expertise with Flash ActionScript 3 and the implementation of the gesture. It's always smooth sailing with Seth: he's got skills, speed, problem-solving, and a great enthusiasm and work ethic that get the project executed with style, even when we're down to the final hours. Thanks again, my friend.