I want to thank all of the members of the NUI group for being so generous with information and assistance!
We are prototyping exhibits for our new Science Center, and have learned a lot about multi-touch thanks to all of your postings and discussions. We’ve made good progress on our own table and would like to show it off and get some feedback.
We have a DI setup using a BenQ MP512 ST (short-throw) projector.
BenQ has a nice “projection calculator” web app that computes the throw distance and the resulting screen size for a given model (make sure you select the right projector in the pull-down menu). Very useful!
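The math behind those calculators is simple: throw distance is just image width times the projector’s throw ratio. A minimal sketch, with a placeholder throw ratio (look up your projector’s actual figure in its spec sheet; the default below is not the MP512 ST’s spec):

```python
def throw_distance(image_width, throw_ratio=0.9):
    """Distance from lens to screen for a given image width.

    throw_ratio is projector-specific; 0.9 is only a placeholder
    in the typical short-throw ballpark.
    """
    return image_width * throw_ratio

def image_size(image_width, aspect=(4, 3)):
    """Return (width, height) of the image for a given aspect ratio."""
    w, h = aspect
    return image_width, image_width * h / w
```

With a mirror folding the light path inside the cabinet, the same distance applies measured along the folded path, which is what makes the layout planning work.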
This website was very helpful for figuring out the table/projector/mirror layout.
This image shows the Microsoft webcam, but the Firefly MV is in the same location. Note that the projector is mounted upside down on an old slide-projector stand, which allows adjustments.
We started out with a Microsoft LifeCam VX6000 and, based on the feedback here on the NUI forums, quickly graduated to a FireFly MV from Point Grey Research. For those of you who are hesitant, we found that it is well worth upgrading. The slow frame rates of most webcams are just not worth the headache. It’s impossible to get very smooth interactions without the faster camera, and with FireWire, there is no competition between devices for “attention” as there is with USB. We also have an OptiTrack Slim:V100 from Natural Point that uses high-speed USB that we’re going to be testing. It looks promising so far, and the fact that it does the blob tracking in hardware hopefully means that the interface will be faster.
Speaking of USB, has anyone tried installing a second USB card for their camera? It seems like this would solve some latency issues if the camera had its own USB controller that it didn’t have to share with anything else.
We’ve got three ~24-inch IR LED strips inside the table pointing down, and that seems to provide enough light to produce good blobs and to see the fiducials. Eventually we’ll connect these to an Arduino so the computer can turn them on and off. We may also use this same system to let the computer adjust the IR LED brightness depending on the ambient light level.
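Here’s roughly what we have in mind for the computer side, sketched in Python with pyserial. The port name, baud rate, one-byte protocol, and PWM range are all assumptions, not a finished design:

```python
def ir_brightness(ambient, lo=40, hi=255):
    """Map ambient light (0.0 = dark, 1.0 = bright) to a PWM value.

    Ambient IR washes out the blobs, so we drive the LEDs harder as
    the room gets brighter. lo/hi are guesses at a usable PWM range.
    """
    ambient = min(max(ambient, 0.0), 1.0)
    return int(lo + (hi - lo) * ambient)

def send_brightness(port, ambient):
    """Ship the PWM value to the Arduino as a single byte."""
    import serial  # pyserial; installed separately
    with serial.Serial(port, 9600, timeout=1) as link:
        link.write(bytes([ir_brightness(ambient)]))
```

On the Arduino end, a loop that reads a byte from Serial and analogWrite()s it to the LED driver pin would complete the picture.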
Because several of our applications require fiducials, we’re using the reacTIVision software for blob and fiducial tracking. If you don’t need fiducials, I think touchlib is the better option: in our experience it has better filtering and a better calibration system, and it also seems to be a little more forgiving when tracking blobs.
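For anyone writing applications against it: the tracker publishes fiducial state as TUIO messages over OSC. A "/tuio/2Dobj" set message carries the session ID, fiducial ID, position, angle, velocities, and accelerations (per the TUIO 1.0 spec). A minimal sketch of unpacking that argument list — receiving the OSC packets themselves is left to whatever OSC library you prefer:

```python
def parse_2dobj_set(args):
    """Unpack a TUIO 1.0 /tuio/2Dobj "set" argument list into a dict.

    Per the spec the arguments are: session id, fiducial (class) id,
    x, y (normalized 0..1), angle (radians), x/y velocity, rotation
    velocity, motion acceleration, rotation acceleration.
    """
    keys = ("session", "fiducial", "x", "y", "angle",
            "vx", "vy", "vangle", "accel", "raccel")
    return dict(zip(keys, args))
```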
We’ve been struggling to get a nice uniform background (partly because reacTIVision’s filters are pickier) and have found that painting the inside of the cabinet white and covering everything inside with white card stock, paper, and Coroplast (corrugated plastic) gives us a nice even background against which we can track blobs and fiducials. We use black construction paper in areas where we get hot spots from reflections or ambient light.
We are using 3/8-inch plexi and a piece of vellum for the screen. If we put anything on top of the vellum (even thin, clear plexi or polycarbonate), the fiducials get so blurry that the software can’t recognize them. This could be remedied with a higher-resolution camera, or two of the lower-resolution cameras. Cost-wise, two Firefly MVs are cheaper than a 1024x768 camera with a comparable frame rate, and two Fireflys side by side, each covering half the surface, would roughly double the effective resolution. As other people on this site have mentioned, it would be nice to be able to stitch together two cameras for higher-resolution imaging. We are also looking for a more permanent screen solution, and will let you know what we find.
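The stitching itself can be surprisingly simple once the cameras are calibrated. A toy sketch with NumPy, assuming undistorted, row-aligned frames and a known column overlap from calibration (real frames would also need lens-distortion and perspective correction first):

```python
import numpy as np

def stitch(left, right, overlap=0):
    """Join two side-by-side frames, dropping the columns of the
    right frame that the left frame already covers."""
    return np.hstack([left, right[:, overlap:]])

# Two 640x480 frames with a 40-pixel overlap yield one 1240x480 frame.
left = np.zeros((480, 640), dtype=np.uint8)
right = np.full((480, 640), 255, dtype=np.uint8)
wide = stitch(left, right, overlap=40)
```

In practice you would run the tracker on the combined frame, or track each half separately and merge blob lists, which is where the calibration data earns its keep.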
We added an air filter and fan to the cabinet to remove heat from the projector.
Thank you to everyone who has posted their projects online. We’ve learned a lot from the NUI community and I hope we can be as helpful in return.