It's a DirectShow filter that effectively lets you stitch multiple webcams together and even apply perspective correction to each webcam individually. It's not free, but there is a free demo, and it looks like its only limitation is a demo warning scrolling across the bottom of the final output, which shouldn't be too much of a problem if we can tell our blob tracker code to ignore the bottom part of the image.
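For the "ignore the bottom part of the image" idea, here's a minimal sketch of how that could look: zero out the bottom rows of the grayscale frame before handing it to the blob tracker, so the watermark can never form a blob. The function name and buffer layout are my own assumptions, not anything from touchlib or the filter itself.

```cpp
#include <cassert>
#include <cstdint>
#include <cstring>
#include <vector>

// Hypothetical helper: blank out the bottom `maskRows` rows of an 8-bit
// grayscale frame (row-major, width*height bytes) so a blob tracker
// never sees the demo watermark scrolling there.
void maskBottomRows(uint8_t* frame, int width, int height, int maskRows) {
    if (maskRows > height) maskRows = height;
    uint8_t* start = frame + static_cast<size_t>(height - maskRows) * width;
    std::memset(start, 0, static_cast<size_t>(maskRows) * width);
}
```

You'd call this on each captured frame right before blob detection, with `maskRows` set to however tall the scrolling warning is.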
I bought a second PS3 Eye today and will be testing it out when I get home. I'll post some images of my results.
Sounds great! The only problem I could see is that running too many USB cameras would push too much data through your USB host controller and it would barf. But as long as you keep it to fewer than 5 (fairly arbitrary number) I think you'd be fine. Do you have any idea how much processor overhead this DirectShow filter takes?
OK guys, I've hit a bit of a snag: I can't seem to use 2 PS3 Eyes with Alex's DirectShow filter, and it also only lets me use 30 fps. I'm not sure if this is a restriction in the PS3 Eye filter or the VideoMixer filter. I'm going to keep trying to get both cams working, and I'll let you guys know if I get it to work.
Pleh, one thing I saw Alex mention before is that you're probably not going to be able to run more than one PS3 cam on the same USB controller at high framerates. One PS3 cam alone takes most of the bandwidth the controller has. So running two PS3 cams at 60 fps at the higher res doesn't seem like it'll work.
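A quick back-of-the-envelope check supports this. These are my own rough numbers (2 bytes/pixel assuming a YUY2-style format, and ~35 MB/s of practical USB 2.0 payload rather than the 60 MB/s theoretical peak), not anything measured:

```cpp
// Rough uncompressed bandwidth of one camera stream, in MB/s.
double streamMBps(int width, int height, int fps, double bytesPerPixel) {
    return width * height * fps * bytesPerPixel / 1e6;
}

// With the assumptions above:
//   640x480 @ 60 fps -> ~36.9 MB/s: one cam already saturates a USB 2.0 controller
//   320x240 @ 60 fps ->  ~9.2 MB/s: two cams fit comfortably on one controller
```

That lines up with what people are seeing: two Eyes at 320x240@60 work, but two at the higher res on one controller don't.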
Still having problems using 2 PS3 Eyes at once though. I'm going to try the other drivers tonight, but I hear they aren't as fast as Alex's. It looks like Alex's PS3 Eye SDK doesn't support multiple cams either, so it may be a driver limitation rather than a DirectShow filter limitation :(
Just tested the RogueStream one and it seems a lot faster and easier to use. If I ever get both Eyes working, this is the filter I'll probably end up using.
Just tried the other drivers. They let you use more than one cam, but they're capped at 30 fps and extremely unstable. They'll have to do for now until Alex sorts his out.
I have tried it with VCam, which lets you use any DirectShow filter as a virtual camera, and that worked with tbeta, but using VCam adds a bit of a delay. So I'm going to write a touchlib source filter, which should work nicely. Once they release the source of tbeta, I'll write a source filter for that instead.
I have 2 PS3 Eyes running at 320x240@60fps smoothly at the moment with the OmniVision drivers (waiting for Alex to add multiple camera support).
I will post some screenshots and videos of it in action once I have written a touchlib source filter.
I suggest you use two PCs for this job: one connected to the two PS3 Eyes, in charge of capturing the images and analysing the blobs, which then sends the point data to the other PC, which is in charge of the screen display.
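For the capture-PC to display-PC link, one option is just packing the blob points into a small byte buffer and firing it over UDP each frame (in practice you might prefer TUIO, which tbeta can already send). Here's a sketch of that packing step; the `Blob` struct and the wire layout are my own invention, not an existing protocol, and the socket code is omitted:

```cpp
#include <cstdint>
#include <cstring>
#include <vector>

// Hypothetical blob record: an id plus normalized 0..1 coordinates,
// so the display PC can scale to its own resolution.
struct Blob {
    uint16_t id;
    float x;
    float y;
};

// Pack a blob list into a flat buffer: 2-byte count, then 10 bytes
// per blob (2-byte id, 4-byte x, 4-byte y). The result is what you'd
// hand to sendto() on the capture PC.
std::vector<uint8_t> packBlobs(const std::vector<Blob>& blobs) {
    std::vector<uint8_t> buf(2 + blobs.size() * 10);
    uint16_t count = static_cast<uint16_t>(blobs.size());
    std::memcpy(buf.data(), &count, 2);
    size_t off = 2;
    for (const Blob& b : blobs) {
        std::memcpy(buf.data() + off, &b.id, 2); off += 2;
        std::memcpy(buf.data() + off, &b.x, 4);  off += 4;
        std::memcpy(buf.data() + off, &b.y, 4);  off += 4;
    }
    return buf;
}
```

The display PC would do the reverse with the same layout. Note this layout assumes both machines share endianness, which is fine for two x86 PCs on a LAN.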