GSoC week #9 – Webcam support in kcall is working :D

This week I implemented complete webcam support in kcall. Both video input and output are working 😀 Screenshot:

Screenshot of kcall in an audio/video session

On the left side you can see the incoming video from the remote end, which in this case is my laptop, capturing me through its webcam, and on the right side you can see the video that is being sent to the other end. Because I don’t have a webcam on my desktop computer either, I am using gstreamer’s “videotestsrc” element as the video input here, just for testing. Also, on the bottom right you can see the video controls for the video input (which is shown above them). For some reason, the video coming from my laptop has wrong colors there (my shirt is actually blue!), but that seems to be a bug in empathy (which is used as the remote client there). The preview in empathy also shows the wrong colors, so… 😉

Currently, audio/video calls work only with empathy or kcall on the remote side, using the jabber protocol. I also tried to test with google’s web client (through windows/ie/gmail), but it doesn’t work. This is probably a bug in one of the underlying subsystems (telepathy-gabble, perhaps), but I don’t really care about it at the moment. There are still bugs, though: sometimes I experience weird deadlocks in gstreamer threads, and sometimes the video stream is not sent correctly and the other side doesn’t receive anything. Other times it works fine, which makes it really difficult to debug… I’m trying to track those down today, but with this extreme heat here in Crete it is really difficult to work (the temperature reached 41°C today!).

Ok, I think I’ll give up for today and go to the beach… 😀


25 thoughts on “GSoC week #9 – Webcam support in kcall is working :D”

    1. Yes, when/if kopete is ported to telepathy, it will be possible to integrate this with kopete. There are two possible ways to do it: 1) launch kcall externally to handle audio/video channels, or 2) put most kcall classes in a library and use them in kopete with a different UI. The first solution is obviously easier, but the second one is also possible, since in the current kcall design the channel handler is independent of the UI and can be abstracted into a separate library quite easily.

  1. the wrong colours look like the RGB channel order is mixed up; some webcams return BGR instead of RGB, or UYV instead of YUV…

  2. Amazing progress!

    You definitely deserve a lot of applause for having achieved so much in such a short time period (… with such heat 😉 )



  3. Wikid!

    U iz a supr l33t hax0r! Me so happy! Kcall 4eva!!!!!!!

    Or something like that. Seriously though, this is awesome. Congratulations! The free software world salutes you sir!

  4. This is awesome stuff!

    Would it be possible to have the webcam API in a separate kdelibs library? 🙂 Is it based on libv4l, or the Kopete codec code?

    I’d love to use it in KMess too, and afaik some other devs are also in need for something like this (e.g. to create a photobooth app).



    1. I didn’t write any webcam API there; I am using *Gstreamer*. Gstreamer provides a v4l capture plugin that can be used to capture video from the webcam. It is very easy to use. Here is an example with my Qt-Gstreamer bindings:

      // Note: the QtGstreamer header names are not shown in the original
      // post; include the headers that declare QGstPipeline, QGstElement
      // and QGstElementFactory, plus:
      #include <QtCore/QCoreApplication>

      using namespace QtGstreamer;

      int main(int argc, char **argv)
      {
          QCoreApplication a(argc, argv);

          QGstPipelinePtr pipeline = QGstPipeline::newPipeline();
          QGstElementPtr src = QGstElementFactory::make("v4lsrc");
          QGstElementPtr colorspace = QGstElementFactory::make("ffmpegcolorspace");
          QGstElementPtr scale = QGstElementFactory::make("videoscale");
          QGstElementPtr rate = QGstElementFactory::make("videorate");
          QGstElementPtr sink = QGstElementFactory::make("xvimagesink");

          // Add the elements to the pipeline and link them in order.
          *pipeline << src << colorspace << scale << rate << sink;
          QGstElement::link(src, colorspace, scale, rate, sink);

          // The pipeline also has to be put into the playing state before
          // entering the event loop (the exact call depends on the bindings).
          return a.exec();
      }

      This is a pipeline where video data flows from the “v4lsrc” element to the “xvimagesink” element (which internally opens a window here), passing through some video filters in the middle to correct the colorspace, framerate and resolution. In the real world, of course, you need to embed the “xvimagesink” window in a QWidget. There is an API to do that, and I have written a class that takes a QWidget pointer and renders the video on it. If you are interested, take a look at libqtgstreamer and libkgstdevices in trunk/playground/network/kcall.

  5. When should we see logo overlays for advertisements like Telstra logos or other kde project colors? This would be an interesting addition not seen on Windows.

  6. Hi, I am an undergraduate student at the University of Puerto Rico, working on my final design project. I am using GStreamer and Qt running Angstrom without X11, which prevents me from using Phonon. I want to know if it is possible to take a video stream (GStreamer udpsrc) and display it in the Qt GUI without relying on X11 (i.e. writing directly to the frame buffer). Thank you and keep up your work.

    1. Yes, it is possible to do that. The trick is this: you put a sink element in the pipeline that gives you raw frames (this can be either the standard appsink or some custom sink), then you convert each frame to a QImage and paint it with QPainter on a QWidget. This is implemented in the phonon gstreamer backend (for places where the X11 sink cannot be used, for example on a plasma widget) and you can use it as an example. Have a look at the files widgetrenderer.cpp and qwidgetvideosink.cpp. Good luck with your project!

      1. Thanks for your reply. Do I need to have X11 turned on in order to use Phonon? I ask this because we will not use X11. I took a look at the C++ source files that you suggested and I noticed that the Phonon namespace is used. Thank you once again.

        1. I think that X11 is optional in phonon, at least for the gstreamer backend. The source code I pointed you to is from the phonon gstreamer backend. The problem is that the gstreamer backend is not as good as the others, but if it works well for what you want to do, I believe you can use it.

  7. Hello once again. I’ve been trying to work with your bindings, but I have not been able to build a project from the Gitorious QtGstreamer repository: I receive an ‘Unknown CMake command "automoc4_add_executable"’ error. Please let me know if you know how to solve this problem. I am trying to build the project in Qt Creator. Thank you!
