Yes, I have the same thing. With the last version of his driver, the delay seems to be less than half a second on my computer. I didn’t notice this until I plugged the cam into a Windows machine and saw the difference xD
It is vlc. If you use another program like cheese or mplayer, there is less lag. I assume it is excessive buffering in vlc.
That’s a relief, because I’m working on a touch project using Linux and the PS3 Eye, so I’m glad the Eye is a usable camera for me.
Kernel 2.6.29 came out the other day also. There is a PS3 Eye driver in that release—hopefully it is more polished.
Right now I am using Kaswy’s 0.3 driver on my laptop and getting very low latency with mplayer and cheese. Its V4L2 implementation does seem to have some wrinkles, though: xawtv does not work with it, for example. I haven’t really dug into the exact problem yet.
I am hoping to try the kernel driver soon and see how solid it is.
Hmm, I’ll have to mess with that some more.. Time to set up 2.6.29 on Gentoo, and get that camera working (whoohoo, fun..).
I have noticed different results with different programs, but haven’t really experimented much with cheese or mplayer. At least someone else has the same problem, maybe now we can figure out a solution.
By the way, could you guys test those programs from a terminal and post up the freezing errors? I’m curious to see if they are the same.
Hmmm… I’ve said this before, but I think I figured out the freezing. For some stupid reason, 6 of my USB ports work very poorly with the camera. I finally got a wild hair and tested all of them. Turns out that it works almost perfectly using the one sandwiched between my mouse receiver and ethernet cable. I am still testing this theory, but it has been working for a few minutes on almost all of the useful resolution/frame rate options (320x240 @ 75 fps seems reliable and quick). I will never understand the weird stuff computers can do…
Hi all, I’m new here. I’m hoping to get my PS3 Eye working for a robotics class project, although the touch interface stuff you guys are doing sounds awesome. Here’s where I’m at; the screenshot says most of it. I’ve built and installed kaswy’s driver, and the camera does show up as /dev/video0, but the video looks all green and lined in Ekiga. vlc refuses to open it, and kaswy’s webcam test program doesn’t build for me. How exactly do I find and install cv.h and highgui.h? Any ideas about the green color and lines? I tested the camera in my PlayStation and it works great, so it’s not the camera. Also, I’m running Ubuntu 8.04; would that make a big difference? Thanks!
I went ahead and upgraded to 8.10, fixed the /v4l/.version file that was mentioned in the blog zun suggested, and recompiled everything. There is about a half-second delay in vlc, but it works perfectly in kaswy’s webcam test program and in Ekiga (hardly any noticeable delay at all). Hmmm, now to get it working with my own Java program. Anyone do Java and webcams? Any suggestions? Thanks again!
I also came across some suggestions saying you’d have to write a JNI wrapper, but I don’t think that’s what you’re looking for. I think it is possible to read the stream entirely in Java, but I don’t know if you’ll get the 125 fps, since Java isn’t really suited for real-time video processing, if I’m not mistaken.
I kind of wouldn’t be surprised either way if Java performed well enough. Modern computers are pretty beefy, and people have been trying to get Java to run as fast as they can. On the other hand, it *is* Java.
Is there a reason for choosing Java over C in this case? You may be better off using the V4L APIs directly.
haha, agreed. The reason for Java is that my professor has several robotics visualization libraries I plan on integrating with, so it makes sense for this project. It *is* Java, very true. C++ would of course be faster, but I also think Java will do well enough; the vision processing I’m doing is not terribly complicated. I’m just picking out the location of a line laser and using the parallax between it and the camera to get the shape and location of obstacles. Once it’s calibrated, it will mostly be a matter of finding the brightest, greenest pixels.
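To make that concrete, here’s a rough sketch of the triangulation I have in mind, assuming a simple pinhole camera model. All the names and constants below are made-up placeholders for illustration, not calibrated values:

```java
// Minimal laser-line triangulation sketch (pinhole camera model).
// BASELINE_M and FOCAL_PX are illustrative assumptions, not measured values.
public class LaserRange {
    // Horizontal offset between the camera and the line laser, in meters.
    static final double BASELINE_M = 0.10;
    // Camera focal length expressed in pixels (would come from calibration).
    static final double FOCAL_PX = 540.0;

    // Depth from the pixel disparity between where the laser line appears
    // and where it would appear on an infinitely distant surface:
    //   z = f * b / d
    static double depthMeters(double disparityPx) {
        return FOCAL_PX * BASELINE_M / disparityPx;
    }

    public static void main(String[] args) {
        // With these constants, a 54-pixel disparity means the obstacle
        // is 1.0 m away; 27 pixels would mean 2.0 m.
        System.out.println(depthMeters(54.0)); // prints "1.0"
    }
}
```

The nice property of this setup is that nearer obstacles shift the laser line by more pixels, so the resolution is best exactly where collisions matter most.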
thanks for the input, the thread zun put up looks like all I’ll need I think. idk, Java is still pretty new to me, video stuff especially. Access to the raw image is exactly what I’m going for though, so it can’t be too hard, right? haha, famous last words…
It does what you’re trying to do, and it uses JNI, so it should definitely be faster. I think the RGB24 format may be of interest to you, since you can pick out the greenest pixel fairly easily with it (I have never worked with it though, so I’m not *that* sure). Good luck, sounds like a fun project.
The following features are implemented in v4l4j:
* v4l4j hands out either raw, RGB24 or JPEG-compressed images in a ByteBuffer object.
* v4l4j supports both V4L1 and V4L2 video devices under a single API. Differences between these two versions are managed by v4l4j and hidden from the user application.
* All video controls are reported by v4l4j and made available. This includes standard V4L controls, driver-private controls, V4L2 extended controls and private ioctls that may be implemented by some drivers.
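Since v4l4j hands back RGB24 frames in a ByteBuffer, the greenest-pixel idea above boils down to a linear scan over 3-byte triples. A quick sketch, where the frame dimensions and the “green dominance” score are my own assumptions (nothing here is v4l4j API, just plain Java over the buffer it would give you):

```java
import java.nio.ByteBuffer;

public class GreenestPixel {
    // Scan an RGB24 frame (3 bytes per pixel: R, G, B) for the pixel whose
    // green channel most dominates the other two. Returns {x, y}.
    static int[] findGreenest(ByteBuffer frame, int width, int height) {
        int bestX = 0, bestY = 0, bestScore = Integer.MIN_VALUE;
        for (int y = 0; y < height; y++) {
            for (int x = 0; x < width; x++) {
                int i = (y * width + x) * 3;
                int r = frame.get(i) & 0xFF;     // mask to undo Java's signed bytes
                int g = frame.get(i + 1) & 0xFF;
                int b = frame.get(i + 2) & 0xFF;
                int score = 2 * g - r - b;       // crude "green dominance" heuristic
                if (score > bestScore) {
                    bestScore = score;
                    bestX = x;
                    bestY = y;
                }
            }
        }
        return new int[] { bestX, bestY };
    }

    public static void main(String[] args) {
        // Tiny 2x2 test frame: pixel (1, 0) is pure green.
        byte[] px = {
            (byte) 255, 0, 0,   0, (byte) 255, 0,   // row 0: red, green
            0, 0, (byte) 255,   60, 60, 60          // row 1: blue, grey
        };
        int[] best = findGreenest(ByteBuffer.wrap(px), 2, 2);
        System.out.println(best[0] + "," + best[1]); // prints "1,0"
    }
}
```

At 320x240 that’s only ~77k pixels per frame, so even interpreted Java should keep up with high frame rates for a scan this simple.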
This driver is great, but I’m having some issues with the microphone, if you can help.
I’m running Ubuntu 8.10 with kernel 2.6.27-11-server (this is because I have 4 GB of RAM and my Ubuntu install is 32-bit).
Video is great with the camera, but I think the microphone support is broken (I suspect this is a kernel problem, but would need confirmation from you).
Seems that while the camera is connected, alsa cannot save my volume control settings, and it gives:
~# alsactl store
alsactl: get_control:262: Cannot read control '2,0,0,Mic Capture Volume,0': Invalid argument
If I remove the webcam, its mic disappears and everything is OK again.
Where should I look to solve this?
Should I try building my own custom kernel from the vanilla sources? (I’ve done this before on servers.)