This is actually my first post; most of the time I've just been browsing and reading this forum. But now I'm happy to share some of my experiences and hope they are useful for someone.
This is what I used:
40× OSRAM SFH485P IR LEDs
Mitsubishi PK20 Pocket Projector
Knight Optical interference band-pass filter 880FIW12
Java / OpenCV
First I tried the Microsoft Xbox Live Vision cam. It took me a while to fix and recompile the OpenCV framework and the Processing library so that they supported my external USB camera, but the results were not really usable. I got around 7 fps, so I decided to invest in a FireWire camera, which now gives me 30 fps.
If some of you guys are interested in the portable solution, I can post the construction plans.
No, it was just a piece of glass fitted in front of the chip. The camera is really easy to modify; you can disassemble and reassemble it completely without damaging it. The only annoying thing is the autofocus, but there is a small cable you can unplug once it is adjusted to the right distance - then it holds still.
Yeah, I replied to your YouTube message. Very interesting sound stuff you're doing, too. What are you using for the signal processing / synthesis?
The PK20 is alright, because the projection screen is quite close and you look into the beam. It has a very decent throw ratio as well (I think 1.6), and they are very cheap. You can use the box only in dim light conditions anyway (or a room with only artificial light), because even with the band-pass filter the camera gets too much IR light in daylight.
I still have to work a bit on the touch surface, because at the moment I'm using only a diffuser at the back and the plain acrylic. I tried the Heat'n'Bond stuff - it didn't work. I tried the xylene/silicone/tracing-paper method, but I could only get crappy xylene here in the UK (it doesn't seem to evaporate completely - http://www.anti-slip-paint.co.uk/xylene-litre-p-31.html), so the silicone didn't dry properly. But so far the plain acrylic is OK.
I'm writing a framework in Java that does the blob detection/tracking/calibration and lets you select an applet you want to launch. It then loads the applet onto the second screen and triggers an onBlobsUpdated() function on every camera frame. The applets are basically Processing applets (extended PApplets), but I'm writing them in Eclipse (BIG recommendation!!). The sound stuff is done with SuperCollider, receiving OSC messages from the applets.
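Just to illustrate what an OSC message to SuperCollider looks like on the wire (the address /blob/x, the single float argument, and SuperCollider's default UDP port 57120 are illustrative examples here, not my actual message layout - in a real project you'd probably use an OSC library rather than encoding packets by hand):

```java
import java.io.ByteArrayOutputStream;
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.nio.ByteBuffer;

// Minimal OSC encoder: enough to fire one float at SuperCollider over UDP.
public class OscSend {

    // OSC strings are null-terminated and padded to a multiple of 4 bytes.
    static byte[] oscString(String s) {
        int len = s.length() + 1;          // +1 for the terminating null
        int padded = (len + 3) / 4 * 4;    // round up to a multiple of 4
        byte[] out = new byte[padded];
        System.arraycopy(s.getBytes(), 0, out, 0, s.length());
        return out;
    }

    // Build a one-float OSC message, e.g. "/blob/x" 0.5
    static byte[] message(String address, float value) throws Exception {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        buf.write(oscString(address));
        buf.write(oscString(",f"));        // typetag string: one float argument
        buf.write(ByteBuffer.allocate(4).putFloat(value).array()); // big-endian
        return buf.toByteArray();
    }

    public static void main(String[] args) throws Exception {
        byte[] msg = message("/blob/x", 0.5f);
        // SuperCollider listens on UDP port 57120 by default.
        DatagramSocket sock = new DatagramSocket();
        sock.send(new DatagramPacket(msg, msg.length,
                InetAddress.getByName("127.0.0.1"), 57120));
        sock.close();
        System.out.println("sent " + msg.length + " bytes");
    }
}
```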
I'm thinking of extending the whole concept into some sort of Java suite for Processing people, so that you can write an applet in Processing and open it with the Java multi-touch app. If your applet has an onBlobsUpdated() function, it gets fed with blob data. No OSC, just plug and play - could be fun.
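A rough sketch of how that plug-and-play lookup could work with reflection (the Blob class and the exact onBlobsUpdated() signature here are made up for illustration, not my framework's real API):

```java
import java.lang.reflect.Method;
import java.util.List;

// Sketch of the "plug and play" idea: if a loaded applet declares
// onBlobsUpdated(List), call it each frame; otherwise leave it alone.
public class BlobDispatch {

    // Hypothetical blob type - just a position here.
    static class Blob {
        float x, y;
        Blob(float x, float y) { this.x = x; this.y = y; }
    }

    // Stand-in for a user-written Processing applet that opts in.
    static class DemoApplet {
        public void onBlobsUpdated(List<Blob> blobs) {
            System.out.println("got " + blobs.size() + " blobs");
        }
    }

    // Called once per camera frame by the host app.
    static void feed(Object applet, List<Blob> blobs) {
        try {
            Method m = applet.getClass().getMethod("onBlobsUpdated", List.class);
            m.invoke(applet, blobs);
        } catch (NoSuchMethodException e) {
            // Applet doesn't declare the callback; nothing to do.
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    public static void main(String[] args) {
        feed(new DemoApplet(),
             List.of(new Blob(0.2f, 0.7f), new Blob(0.5f, 0.5f)));
    }
}
```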
I've uploaded a screenshot of my Java app. It's very simple, but does the job quite well. Here I'm holding two cardboard sticks with black dots on them in front of my crappy Xbox Live Vision cam (notice the 7 fps). For the calibration I'm using a 3D plane that can be rotated and moved until it fits the frame (not the best solution, but it works).
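My app does it with the 3D plane, but just to show the general idea of mapping camera coordinates onto the screen, here is a simpler and fairly common alternative: a four-corner bilinear warp (the corner coordinates below are made-up example values):

```java
// Four-corner bilinear warp from normalized camera space to screen space.
// Not the 3D-plane trick from my app - a simpler alternative for illustration.
public class QuadCalibration {

    // Screen positions of the four calibration corners:
    // top-left, top-right, bottom-right, bottom-left (example values).
    static float[][] corners = {
        {12, 8}, {1010, 14}, {1015, 760}, {9, 755}
    };

    // Map a camera point (u, v in 0..1) into screen coordinates by
    // interpolating along the top and bottom edges, then between them.
    static float[] map(float u, float v) {
        float[] tl = corners[0], tr = corners[1], br = corners[2], bl = corners[3];
        float topX = tl[0] + u * (tr[0] - tl[0]);
        float topY = tl[1] + u * (tr[1] - tl[1]);
        float botX = bl[0] + u * (br[0] - bl[0]);
        float botY = bl[1] + u * (br[1] - bl[1]);
        return new float[]{ topX + v * (botX - topX), topY + v * (botY - topY) };
    }

    public static void main(String[] args) {
        float[] p = map(0.5f, 0.5f);     // camera center
        System.out.println(p[0] + " " + p[1]);
    }
}
```

It only needs you to touch the four corners once to calibrate, which is why a lot of people use it instead of fitting a plane.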
I never knew that you could write Processing sketches with other software.
What is the main advantage?
Eclipse is actually a Java IDE. The Processing sketchbook converts what you write in it into Java source code, which is then compiled by the Java compiler. The sketchbook does have a few advantages: it does some type conversions for you and makes programming a bit easier for beginners overall. BUT it does not show you errors or typos in your code until you compile it, and even then it is often confusing what it complains about. Managing bigger projects in the sketchbook isn't easy either.

Eclipse is a fully featured IDE, meaning it tells you what is wrong with your code while you type it, lists the methods available on a specific object, and much, much more. In order to write a Processing applet with it, you need to link the processing.core library (core.jar) and extend the PApplet class. There are a few instructions on the web about it (on the Processing forum, for example). Nevertheless, for small experiments and just some coding fun I still use the sketchbook - as the name suggests.
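To show what "extend the PApplet class" boils down to, here is a tiny example. Note that the PApplet class here is just a stub standing in for the real processing.core.PApplet from core.jar, so the snippet compiles on its own - in Eclipse you would link core.jar and import the real class instead:

```java
// Stub standing in for processing.core.PApplet (the real one lives in core.jar).
class PApplet {
    public void size(int w, int h) { /* window setup in real Processing */ }
    public void ellipse(float x, float y, float w, float h) { /* drawing */ }
    public void setup() { }
    public void draw() { }
}

// What a Processing sketch looks like once written as plain Java in Eclipse:
// setup() runs once, draw() runs every frame, exactly like in the sketchbook.
public class MySketch extends PApplet {
    @Override public void setup() { size(640, 480); }
    @Override public void draw() { ellipse(320, 240, 50, 50); }

    public static void main(String[] args) {
        MySketch s = new MySketch();
        s.setup();   // the real PApplet runs this lifecycle for you
        s.draw();
        System.out.println("sketch ran");
    }
}
```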
jimihertz - 01 August 2008 03:55 PM
Can you send your Java app in order to compare it with Touchlib?
Touchlib is probably much more advanced, but I couldn’t use it because I needed something that works on Mac. Therefore I’m also curious about the new OpenTouch. I can put my app on my webspace, but I have to clean some things up first.
I’ve uploaded the construction plan, but it shows only the wooden parts. If someone really wants to build it, I’ll document it in more detail. If there are a few people interested, it might be worth thinking about manufacturing a small series or kits. Unfortunately I will not have access to the university workshop anymore after this month, but there will be other ways…
Wow, that is awesome thoughtdiver! Great job! It would be awesome if you could get the projector and computer to be all self-contained, but that would be one large briefcase :D You might be able to do it with an LCD panel, you should look into that.
Hehe, funny that someone posted again after all this time. Here is another video of the project (also pretty old already). It shows another nice music app I've built for the touchcase, a bit like Reaktor. They are not doing much with it in the video, but it shows roughly what it looks like and how it works. Maybe it's interesting for someone.