I’ve been trying to figure this out for weeks now.
We need some way of getting finger and fiducial tracking to work in Flash. I’ve tried GestureWorks, and it works with finger tracking. My only gripe is that it doesn’t really use the underlying detection on the SUR40, but rather some kind of Windows input overlay (or however you’d describe it). This results in jittery touchpoints, much akin to the cursor on the desktop itself. If you check the Input Visualizer on the table, the underlying detection is stable and smooth.
I’m not sure if there’s a way of getting the raw image capture of the PixelSense (camera?) directly into CCV. I’d much rather rely on a third-party detector and then just relay that info to Flash. Unless, of course, there’s some kind of magic bridge between whatever the SUR40 detects and Flash itself.
We distribute the SUR40 and I wish I had a solution for you, but I have a feeling the PixelSense tracking is proprietary to Microsoft, and they may only want to allow interaction via Windows Touch and not TUIO (I think it’s Windows Touch the SUR40 outputs).
Is there a reason you aren’t coding your applications in a language that can accept the Windows Touch protocol directly? Flash is very taxing on the CPU/GPU, and there are many lighter options out there (such as the ones in this particular forum section). We at Peau Productions prefer to code in MT4j since it is very lightweight and supports Windows Touch.
The reason we target Flash is that we’ve been working with it intensively over the last six years. We can churn out various apps very quickly, and we have full control over almost all aspects.
I’ve built my own MT table once before, using the old tech from the original Surface. In that build, CCV captured the black/white image from a PS3 camera (this is something almost everyone here knows). If there were a way for CCV to grab the raw image from the SUR40, I THINK it may work. I’ve played around with Visual Studio Express and the RawImageAnalyzer program. I know there’s a way to grab the image, but I have no idea how to transport it over to CCV.
Flash, as such, does read WM_TOUCH, but the way the SUR40 interprets the actual image from PixelSense is just not accurate enough. The cursor jumps around within the same 3- or 4-pixel area on both axes. If it were smoother, I might just use it as it is.
The only way I can think of getting something good is to let a third-party program (like CCV) read and interpret the raw image from PixelSense.
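As a stopgap for the 3–4 pixel jitter, you could also smooth the WM_TOUCH points on the application side with a simple exponential moving average. This is just a minimal sketch of the idea (the class and parameter names are mine, not any SUR40 or Flash API), trading a little lag for stability:

```python
# Illustrative sketch only: exponential smoothing for a jittery touch point.
# alpha closer to 0 = smoother but laggier; closer to 1 = more responsive.

class TouchSmoother:
    def __init__(self, alpha=0.4):
        self.alpha = alpha
        self.x = None
        self.y = None

    def update(self, raw_x, raw_y):
        """Feed a raw touchpoint, get back the smoothed position."""
        if self.x is None:
            # First sample: no history to smooth against yet.
            self.x, self.y = float(raw_x), float(raw_y)
        else:
            # Move a fraction of the way toward the new raw sample.
            self.x += self.alpha * (raw_x - self.x)
            self.y += self.alpha * (raw_y - self.y)
        return self.x, self.y
```

It won’t fix the underlying detection, of course, but it can make a point that oscillates within a few pixels sit still.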
I’ll try and see if I can capture a video of the raw output and then feed that into CCV… just to see if it’d actually work. Then I just need to find someone who can build the bridge.
EDIT: come to think of it, I actually think we bought the modded PS3 camera from you :D
Alright, so I ran the raw capture app and recorded it with Fraps. CCV is more than capable of handling the raw feed, as I fully expected.
Like I said earlier, all we need now is a way of making CCV capture the raw feed in real time… now how the hell do we do that… There must be some way of building the raw image visualizer directly into CCV…
If anyone wants to try it themselves, here’s the raw video (I converted it from Fraps video to plain QT MOV):
Hmm sounds like an interesting project. Would you need a SUR40 on-hand to effectively build such a bridge? We have a software team that may be capable of doing so, though we don’t have one of the screens at this time.
It seems there are a few TUIO catchers for AS3 on the intertubes. Maybe a quicker route would be to use the InputVisualizer (part of the Surface 2.0 SDK again) to send TUIO data through to AS3. The InputVisualizer tracks fingers and objects, with direction and everything, straight from the raw output.
If THAT app could send the info straight to AS3, there’d be no real need for CCV as a bridge.
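For anyone curious what such a bridge actually has to emit: TUIO is just OSC messages over UDP (port 3333 by convention). A cursor update is a `/tuio/2Dcur` "set" message carrying the sessionID, normalized position, velocity, and acceleration. Here’s a rough Python sketch of the byte layout, per the TUIO 1.1 spec (the function names are mine):

```python
import struct

def osc_string(s):
    # OSC strings are null-terminated and padded to a 4-byte boundary.
    b = s.encode("ascii") + b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def tuio_2dcur_set(session_id, x, y, vx=0.0, vy=0.0, accel=0.0):
    """Build a /tuio/2Dcur 'set' OSC message: s(id) x y X Y m."""
    msg = osc_string("/tuio/2Dcur")
    msg += osc_string(",sifffff")          # typetags: string, int32, 5 floats
    msg += osc_string("set")
    msg += struct.pack(">i", session_id)   # big-endian int32 sessionID
    msg += struct.pack(">fffff", x, y, vx, vy, accel)
    return msg
```

A sender fires these off in a UDP datagram alongside the "alive" and "fseq" messages of each frame; on the AS3 side, a TUIO library decodes them back into touch events.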
Ok, so I came across Michael Zoellner’s SUR_TUIO app, which worked perfectly, except that it didn’t kill non-active touches. I got hold of his project and worked out a way to make it run smoothly with full cursor support (well, except for finger direction…).
I’m in no way a Visual Studio expert, but I did manage to make it run. I’ve tested it on my own SUR40 table, and it works wonders with Flash. No more jittery input; it’s smooth and stable.
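The "kill non-active touches" part is essentially the TUIO "alive" mechanism: each frame, the sender lists the sessionIDs that still exist, and the receiver drops anything not on that list. A minimal sketch of the receiving-side bookkeeping (names are mine, not from the SUR_TUIO project):

```python
# Illustrative sketch: dropping stale TUIO cursors via the "alive" list.

class CursorTracker:
    def __init__(self):
        self.cursors = {}  # sessionID -> (x, y)

    def on_set(self, session_id, x, y):
        # A "set" message updates (or creates) a cursor.
        self.cursors[session_id] = (x, y)

    def on_alive(self, alive_ids):
        # Anything not reported alive this frame is a lifted finger.
        for sid in list(self.cursors):
            if sid not in alive_ids:
                del self.cursors[sid]
```

If the sender never shrinks its alive list, the receiver keeps ghost touches around forever, which sounds like exactly the bug described above.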
If you can get this working smoothly and enabled without having to run the project files, maybe turn it into an installable program, let us know. We’d love to put it up on our software store for others to find it easily.
I can easily make an installer; that’s not a problem. I’ll have to get hold of the original author of the beta project, just to a) check if it’s OK with him, and b) see when he was planning to expand it himself.
I really would like someone with more experience using Visual Studio Express to clean it up, though, and maybe expand it further. I’m not used to working in Visual Studio, and all I did here was make sure it cleaned up unused touches. I’ve also checked it with GestureWorks 3.1.6 today, using its upgraded TUIO support. Works wonders.
As I said in an earlier post, you can develop for the SUR40 with the Surface 2.0 SDK. The SDK also has an Input Simulator. So you can develop for, and test, on a typical desktop setup.
This app is rooted in Surface 2.0, so I don’t think it’ll work on those monitors. Though I can’t imagine not being able to catch and process touchpoints directly from Windows and bridge them using the same methods. I’ll try and see what I can find once I’m happy with the current project.
Yes, I’ve used that one (you’ll see my post further down the page). I own the same Dell XT laptop he has, and it worked OK, but what I’m looking for is not a clear overlay that translates the touches, but something that runs in the background all the time.
The application now supports touchpoints and tag markers directly from the Surface 2.0 system to TUIO. I’ve tested it with the AS3_TUIO library, and it transfers touchpoint sessionIDs and tag sessionIDs + tagIDs correctly.
Both types support only X and Y locations so far. Next up is tag orientation.
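For reference while adding orientation: the TUIO 1.1 spec already reserves a slot for it. The `/tuio/2Dcur` profile has no angle at all, while `/tuio/2Dobj` (the tag/fiducial profile) carries the angle `a` in radians plus rotation velocity and acceleration. A quick Python summary of the "set" argument layouts per the spec (the dict itself is just my notation):

```python
# Argument layout of TUIO 1.1 "set" messages, per the TUIO 1.1 spec.
# s = sessionID, i = tag/class ID, x/y = normalized position,
# a = angle (radians), X/Y = velocity, A = rotation velocity,
# m = motion accel, r = rotation accel.
TUIO_SET_ARGS = {
    "/tuio/2Dcur": ["s", "x", "y", "X", "Y", "m"],
    "/tuio/2Dobj": ["s", "i", "x", "y", "a", "X", "Y", "A", "m", "r"],
}
```

So wiring tag orientation through should just be a matter of filling the `a` field from whatever angle the Surface 2.0 SDK reports for the tag.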