Tag Archive: qt-gstreamer


During the past month I've been working on a new GStreamer element called qtvideosink. The purpose of this element is to allow painting video frames from GStreamer on any kind of Qt surface and on any platform supported by Qt. A "Qt surface" can be a QWidget, a QGraphicsItem in a QGraphicsView, a QDeclarativeItem in a QDeclarativeView, and even an off-screen surface like a QImage, QPixmap, QGLPixelBuffer, etc. The initial reason for working on this new element was to support GStreamer video in QML, which is something that many people have asked me about in the past. Until now, only QtMultimedia supported this, with some work in progress in phonon as well. The main disadvantage of both QtMultimedia and phonon, though, is that although they support this feature with GStreamer as the backend, they don't allow you to mix pure GStreamer code with their QML video item, so they are useless if you need to do something more advanced using the GStreamer API directly. Hence the need for something new.

My idea with qtvideosink was to implement something that would be a standalone GStreamer element, which would not require the developer to use a specific high-level API in order to paint video on QML. In the past I had also written another similar element, qwidgetvideosink, which is basically the same idea, but for QWidgets. After looking at the problem a bit more carefully, I realized that qwidgetvideosink and qtvideosink would in fact share a lot of their internal logic, and therefore I could probably write one element generic enough to paint both on QWidgets and on QML, and perhaps on more surfaces. And so I did.

I started by taking the code of qtgst-qmlsink, a project that was started by a colleague here at Collabora last year with basically the same intention, but which was never finished properly. This project was initially based on QtMultimedia's GStreamer backend. As a first step, I did some major refactoring to remove its QtMultimedia dependencies and to turn it into an independent GStreamer plugin (it used to be a library). Then I merged it with qwidgetvideosink, so that they can share the common parts of the code, and also wrote a unit test for it. Sadly, the unit test proved something that I was suspecting already: the original QtMultimedia code was quite buggy. But I must say I enjoyed fixing it. It was a good opportunity for me to learn a lot about video formats and OpenGL.

How does it work?

First of all, you create the sink with the standard gst_element_factory_make method (or its equivalent in the various bindings). You will notice that this sink provides two signals: an action signal (a slot in Qt terminology) called "paint" and a normal signal called "update". "update" is emitted every time the sink needs the surface to be repainted; it is meant to be connected directly to QWidget::update(), QGraphicsItem::update() or something similar. The "paint" slot takes a QPainter pointer and a rectangle (x, y, width, height as qreals) as its arguments and paints the video inside the given rectangle using the given painter; it is meant to be called from the widget's paint event or the graphics item's paint() function. So, all you need to do is take care of those two signals and qtvideosink will do everything else.
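To make this more concrete, here is a minimal sketch of a QWidget hosting qtvideosink, written against the plain GObject signal API. The VideoSurface class and the onUpdate callback are illustrative names of mine, and the exact signal signatures are assumed from the description above.

```cpp
// Hypothetical VideoSurface widget wired to qtvideosink's "update"/"paint"
// signals, as described in the post. Signal signatures are assumed from the
// text above, not from the element's documentation.
#include <QtGui/QWidget>
#include <QtGui/QPainter>
#include <gst/gst.h>

class VideoSurface : public QWidget
{
public:
    VideoSurface(GstElement *sink, QWidget *parent = 0)
        : QWidget(parent), m_sink(GST_ELEMENT(gst_object_ref(sink)))
    {
        // Repaint whenever the sink has a new frame to show.
        // (In a real application you may want to make sure this ends up
        // running in the GUI thread.)
        g_signal_connect(m_sink, "update", G_CALLBACK(onUpdate), this);
    }

    ~VideoSurface()
    {
        g_signal_handlers_disconnect_by_func(m_sink,
                (gpointer) G_CALLBACK(onUpdate), this);
        gst_object_unref(m_sink);
    }

protected:
    virtual void paintEvent(QPaintEvent *)
    {
        QPainter painter(this);
        // Ask the sink to paint the current frame inside our full rectangle.
        g_signal_emit_by_name(m_sink, "paint", &painter,
                              (qreal) 0, (qreal) 0,
                              (qreal) width(), (qreal) height());
    }

private:
    static void onUpdate(GstElement *, gpointer self)
    {
        static_cast<VideoSurface *>(self)->update();
    }

    GstElement *m_sink;
};
```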

Getting OpenGL into the game

You may be wondering how this sink does the actual painting: using QPainter, using OpenGL, or maybe something else? Well, there are actually two variants of this video sink. The first one, qtvideosink, just uses QPainter. It can handle only RGB data (and only a subset of the formats that QImage supports) and does format conversion and scaling in software. The second one, qtglvideosink, uses OpenGL/OpenGL ES with shaders. It can handle both RGB and YUV formats and does format conversion and scaling in hardware. It is used in exactly the same way as qtvideosink, but it requires a QGLContext pointer to be set on its "glcontext" property before its state is set to READY. This of course means that the underlying surface must support OpenGL (i.e. it must be a QGLWidget, QGLPixelBuffer or QGLFramebufferObject). To get this working on QGraphicsView/QML, you just need to set a QGLWidget as the viewport of the QGraphicsView and use this widget's QGLContext in the sink.
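Roughly, that setup looks like the following hedged sketch, using the raw GObject API. The helper name is mine, and the "glcontext" property is assumed to be a plain pointer property, as described above.

```cpp
// Illustrative setup of qtglvideosink on a QGraphicsView. setupGlViewport()
// is a made-up helper; the glcontext property type is assumed from the post.
#include <QtGui/QGraphicsView>
#include <QtOpenGL/QGLWidget>
#include <gst/gst.h>

void setupGlViewport(QGraphicsView *view, GstElement *glSink)
{
    // Render the graphics scene through OpenGL so that a QGLContext exists.
    QGLWidget *glWidget = new QGLWidget;
    view->setViewport(glWidget);

    // Hand the widget's context to the sink before it goes to READY.
    g_object_set(glSink, "glcontext", (gpointer) glWidget->context(), NULL);
    gst_element_set_state(glSink, GST_STATE_READY);
}
```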

qtglvideosink uses either GLSL shaders or ARB fragment program shaders if GLSL is not supported. This means it should work on pretty much every GPU/driver combination that exists for Linux, on both desktop and embedded systems. In case no shaders are supported at all, it will fail to change its state to READY, and you can then just substitute it with qtvideosink, which is guaranteed to work on all platforms supported by Qt.
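That failure mode lends itself to a simple fallback, sketched below; createQtSink() is a hypothetical helper that just encodes the try-GL-then-software logic described above.

```cpp
// Hypothetical fallback helper: prefer qtglvideosink, fall back to the
// software qtvideosink if shaders are unavailable.
#include <gst/gst.h>

GstElement *createQtSink(gpointer glContext)
{
    GstElement *sink = gst_element_factory_make("qtglvideosink", NULL);
    if (sink) {
        g_object_set(sink, "glcontext", glContext, NULL);
        if (gst_element_set_state(sink, GST_STATE_READY) != GST_STATE_CHANGE_FAILURE) {
            return sink;  // shaders are supported, use the GL variant
        }
        // No GLSL or ARB fragment program support on this GPU/driver.
        gst_element_set_state(sink, GST_STATE_NULL);
        gst_object_unref(sink);
    }
    return gst_element_factory_make("qtvideosink", NULL);
}
```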

qtglvideosink also has an extra feature: it supports the GstColorBalance interface. Color adjustment is done in the shaders, together with the format conversion. qtvideosink doesn't support this, as it wouldn't make sense there: color adjustment would have to be implemented in software, and that is better done by plugging a videobalance element before the sink. No need to duplicate code.
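For the software case, that suggestion looks like this; a hedged example where the pipeline string and property values are purely illustrative.

```cpp
// Example pipeline with videobalance doing the color adjustment in front of
// the software qtvideosink. The element properties here are just examples.
#include <gst/gst.h>

GstElement *makeBalancedPipeline()
{
    GError *error = NULL;
    GstElement *pipeline = gst_parse_launch(
            "videotestsrc ! videobalance saturation=1.2 brightness=0.1 ! "
            "ffmpegcolorspace ! qtvideosink", &error);
    if (!pipeline) {
        g_printerr("Failed to create pipeline: %s\n", error->message);
        g_clear_error(&error);
    }
    return pipeline;
}
```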

So, which variant to use?

If you are interested in painting video on QGraphicsView/QML, then qtglvideosink is the best choice of all the sinks, and if for any reason the system doesn't support OpenGL shaders, qtvideosink is the next best choice. If you intend to paint video on normal QWidgets, however, it is best to use one of the standard GStreamer sinks for your platform, unless you have a reason not to. QWidgets can be turned into native system windows by calling their winId() method, and therefore any sink that implements the GstXOverlay interface can be embedded in them. On X11, for example, xvimagesink is the best choice. However, if you need to do something more tricky and embedding another window doesn't suit you very well, you could use qtglvideosink in a QGLWidget (preferably) or qtvideosink / qwidgetvideosink on a standard QWidget.
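The GstXOverlay route mentioned above looks roughly like this (GStreamer 0.10 API; the helper name is mine):

```cpp
// Embedding an XOverlay-capable sink (e.g. xvimagesink) into a plain QWidget
// by handing it the widget's native window id. embedIntoWidget() is an
// illustrative helper name.
#include <QtGui/QWidget>
#include <gst/gst.h>
#include <gst/interfaces/xoverlay.h>

void embedIntoWidget(QWidget *widget, GstElement *overlaySink)
{
    // winId() forces the widget to get a native window on X11.
    gst_x_overlay_set_xwindow_id(GST_X_OVERLAY(overlaySink), widget->winId());
}
```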

Note that qwidgetvideosink is basically the same thing as qtvideosink, with the difference that it takes a QWidget pointer in its "widget" property and handles all the painting on that widget internally. It has no signals. Other than that, it still paints in software with QPainter, just like qtvideosink. It is mainly there to keep compatibility with code that may already be using it, as it already exists in QtGStreamer 0.10.1.

This is actually 0.10 stuff… What about GStreamer 0.11/1.0?

Well, if you are interested in 0.11, you will be happy to hear that there is already a partial 0.11 port around. Two weeks ago I was at the GStreamer 1.0 hackfest in Malaga, Spain, and one of the things I did there was porting qtvideosink to 0.11. I must say the port was quite easy to do. However, last week I added some more stuff to the 0.10 version that I haven't yet ported to 0.11. I'll get to that soon; it shouldn't take long.

Try it out

The code lives in the qt-gstreamer repository. The actual video sinks are independent from the qt-gstreamer bindings, but qt-gstreamer itself has some helper classes for using them. Firstly, there is QGst::Ui::VideoWidget, a QWidget subclass which will accept qtvideosink, qtglvideosink and qwidgetvideosink just like any other video sink and will transparently do all the required work to paint the video in it. Secondly, there are QGst::Ui::GraphicsVideoWidget and QGst::Ui::GraphicsVideoSurface. These two are meant to be used together to paint video on a QGraphicsView or in QML. You can find more about them in the documentation in graphicsvideosurface.h (this will soon be on the documentation website). Finally, there is a QtGStreamer QML plugin, which exports a "VideoItem" element if you "import QtGStreamer 0.10". This is also documented in the GraphicsVideoSurface header. All of this will soon be released in the upcoming qt-gstreamer 0.10.2.
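As a taste of the helper API, here is a small hedged example built around QGst::Ui::VideoWidget. It assumes the setVideoSink() setter and the camel-case qt-gstreamer headers, and skips error handling and colorspace conversion for brevity.

```cpp
// Minimal videotestsrc -> qtvideosink pipeline displayed through the
// QGst::Ui::VideoWidget helper. Treat method and header names as assumptions
// based on the qt-gstreamer documentation of that era.
#include <QtGui/QApplication>
#include <QGst/Init>
#include <QGst/Pipeline>
#include <QGst/ElementFactory>
#include <QGst/Ui/VideoWidget>

int main(int argc, char *argv[])
{
    QApplication app(argc, argv);
    QGst::init(&argc, &argv);

    QGst::PipelinePtr pipeline = QGst::Pipeline::create();
    QGst::ElementPtr src = QGst::ElementFactory::make("videotestsrc");
    QGst::ElementPtr sink = QGst::ElementFactory::make("qtvideosink");
    pipeline->add(src, sink);
    src->link(sink);

    QGst::Ui::VideoWidget widget;
    widget.setVideoSink(sink);  // the widget handles paint/update internally
    widget.show();

    pipeline->setState(QGst::StatePlaying);
    int ret = app.exec();
    pipeline->setState(QGst::StateNull);
    return ret;
}
```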

QtGStreamer 0.10.1

This weekend I released QtGStreamer 0.10.1, the first stable version of QtGStreamer. This release marks the beginning of the stable 0.10 series of QtGStreamer, which will continue for the lifetime of GStreamer 0.10. For those of you who don't yet know what QtGStreamer is: it is a set of libraries that provide Qt-style C++ bindings for GStreamer, plus extra helper classes and elements for better integration of GStreamer in Qt applications.
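To give a flavour of what "Qt-style" means in practice, here is a tiny hedged example that launches a pipeline through the bindings (smart pointers and enums instead of raw GObject calls); the exact header names are assumptions on my part.

```cpp
// Rough equivalent of `gst-launch videotestsrc ! autovideosink`, written with
// the QtGStreamer bindings. Header names and the dynamicCast usage are
// assumptions based on the documented API.
#include <QtCore/QCoreApplication>
#include <QGst/Init>
#include <QGst/Parse>
#include <QGst/Pipeline>

int main(int argc, char *argv[])
{
    QCoreApplication app(argc, argv);
    QGst::init(&argc, &argv);

    // QGst::Parse::launch() throws a QGlib::Error on failure (not handled here).
    QGst::PipelinePtr pipeline = QGst::Parse::launch(
            "videotestsrc ! autovideosink").dynamicCast<QGst::Pipeline>();
    pipeline->setState(QGst::StatePlaying);

    int ret = app.exec();
    pipeline->setState(QGst::StateNull);
    return ret;
}
```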

I must say thanks a lot to Mauricio, the co-developer of QtGStreamer, who helped me a lot with the design and code, to the GStreamer community, who accepted this project under the GStreamer umbrella with great enthusiasm, to Nokia for sponsoring it, to Collabora for assigning me and Mauricio to work on it and to all those developers who are already using it in their projects and have helped us by providing feedback.

The future

Development of course does not stop here; it has just started. We will try to improve the bindings as much as we can: by exporting more and more of GStreamer's functionality, by adding more convenience methods/classes and/or GStreamer elements that ease the use of GStreamer in Qt applications, and by collecting opinions and ideas from all of you out there who will use this API. This last bit is quite important imho, so if you have any suggestions about things that you don't like or things that you would like to see implemented, please file a bug to let us know.

Use in KDE

I am quite happy to see that this library already has early adopters in KDE. Apart, of course, from my telepathy-kde-call-ui (ex kcall), which is the "father" of QtGStreamer, QtGStreamer is also used in kamoso, a Cheese-like camera app, whose authors, Alex Fiestas and Aleix Pol, have been very patient waiting for me to release QtGStreamer before they release kamoso, and have also been very supportive during all this time (thanks!).

Personal thoughts

I must say this project was fun to develop. During development I learned a lot about C++ that I didn't know before, and I also learned how GObject works, which I must say is quite interesting, although ugly for my taste. Learning more about C++ was my main source of interest from the beginning of the project, and for some period of time I couldn't even imagine that this project would ever get this far, but I kept coding it for myself. Obviously, I am more than happy now that it has finally evolved into something that is also useful for others and enjoys wide acceptance :)

This week I wrote some exciting (for me) code. Last weekend, while playing with GStreamer, I had this crazy idea to write GStreamer bindings for Qt. So, I started writing them for fun, outside the scope of kcall. It took me about one day to write something usable and I was really excited. Then I remembered that some days ago, bradh on IRC had told me that it would be possible to use Solid to autodetect audio/video devices for GStreamer. Being excited with the bindings, I thought about making one library with the 1:1 GStreamer-Qt bindings and one extra library with extra stuff, like device autodetection using Solid. So, I started writing this new library as well.

I developed those two libraries for about 4 days and reached a point where they were usable for the purposes of kcall. So, I merged them into kcall and rewrote the part of kcall that handles audio/video streaming to use them. At that point, I also wrote a small telepathy-farsight Qt wrapper (libqtpfarsight), mostly to provide a sane API for it (as the original telepathy-farsight API is really bad) rather than to get rid of GObject stuff, but eventually I achieved both. So, now the core kcall code uses only Qt, the GObject ugliness is hidden in the libQtGstreamer and libqtpfarsight libraries, and I have device autodetection using Solid :D I think that was worth the effort, although it doesn't offer any significant functionality to kcall.

And to add to my excitement, there was already interest in my bindings from one guy who is writing a plasmoid that uses a webcam to take photos. He couldn't use phonon, because phonon has no support for video input (yet?), so he started writing it with GStreamer, and so he was interested in my work, which he has already started to use. I'm really happy to see my work becoming useful for others :)

Today I spent my day debugging, trying to understand why kcall does not correctly receive video from the remote end. I still haven't found the answer, and I'm really disappointed because everything in the code and the GStreamer logs looks perfect. :(

Sending video is not implemented yet, but with the code as it is now, adding support for it is a matter of about 10-20 lines of code. I will definitely do this in the following days, possibly tomorrow. I am also going to write a KCM for configuring device preferences, which is mostly done already, as the library I mentioned above with the extra stuff that sits on top of QtGStreamer already has a DeviceChooser widget, which can be used for selecting devices and also has support for saving and loading the selected device using KConfig :D Next weekend this will hopefully be done, and I hope I will also have solved the strange bug regarding receiving video.

The only thing that makes me sad now is that this week of coding essentially sent the code I wrote two weeks ago to the trash, which had taken me some time to write, but at least I know it was a good learning exercise.
