Introducing qtvideosink – GStreamer meets QML

During the past month I’ve been working on a new GStreamer element called qtvideosink. The purpose of this element is to allow painting video frames from GStreamer on any kind of Qt surface and on any platform supported by Qt. A “Qt surface” can be a QWidget, a QGraphicsItem in a QGraphicsView, a QDeclarativeItem in a QDeclarativeView, and even an off-screen surface like a QImage, QPixmap, QGLPixelBuffer, etc. The initial reason for working on this new element was to support GStreamer video in QML, which is something that many people have asked me about in the past. Until now, only QtMultimedia supported this, with some code in Phonon being in progress as well. The main disadvantage of both QtMultimedia and Phonon, of course, is that although they support this feature with GStreamer as the backend, they don’t allow you to mix pure GStreamer code with their QML video item, so they are useless if you need to do something more advanced using the GStreamer API directly. Hence the need for something new.

My idea with qtvideosink was to implement a standalone GStreamer element that would not require the developer to use a specific high-level API in order to paint video on QML. In the past I had also written a similar element, qwidgetvideosink, which is basically the same idea, but for QWidgets. After looking at the problem a bit more carefully, I realized that qwidgetvideosink and qtvideosink would in fact share a lot of their internal logic, so I could probably write one element generic enough to paint both on QWidgets and on QML, and perhaps on more surfaces. And so I did.

I started by taking the code of qtgst-qmlsink, a project that was started by a colleague here at Collabora last year with basically the same intention, but which was never finished properly. This project was initially based on QtMultimedia’s GStreamer backend. As a first step, I did some major refactoring to clean it up from its QtMultimedia dependencies and to make it an independent GStreamer plugin (it used to be a library). Then I merged it with qwidgetvideosink, so that they could share the common parts of the code, and also wrote a unit test for it. Sadly, the unit test proved something that I was already suspecting: the original QtMultimedia code was quite buggy. But I must say I enjoyed fixing it. It was a good opportunity for me to learn a lot about video formats and OpenGL.

How does it work?

First of all, you can create the sink with the standard gst_element_factory_make method (or its equivalent in the various bindings). You will notice that this sink provides two signals, an action signal (a slot in Qt terminology) called “paint” and a normal signal called “update”. “update” is emitted every time the sink needs the surface to be repainted. This is meant to be connected directly to QWidget::update() or QGraphicsItem::update() or something similar. The “paint” slot takes a QPainter pointer and a rectangle (x, y, width, height as qreals) as its arguments and paints the video inside the given rectangle using the given painter. This is meant to be called from the widget’s paint event or the graphics item’s paint() function. So, all you need to do is to take care of those two signals and qtvideosink will do everything else.
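To illustrate, here is a minimal sketch of such a widget (not the actual QGst::Ui::VideoWidget implementation), using the QGlib::connect() and QGlib::emit() helpers from qt-gstreamer. It assumes the project is built with QT_NO_KEYWORDS, as qt-gstreamer itself is, so that QGlib::emit compiles:

    // minimal sketch, not the actual QGst::Ui::VideoWidget code
    #include <QtGui/QWidget>
    #include <QtGui/QPainter>
    #include <QGlib/Connect>
    #include <QGlib/Signal>
    #include <QGst/Element>

    class VideoSurface : public QWidget
    {
    public:
        explicit VideoSurface(const QGst::ElementPtr & sink, QWidget *parent = 0)
            : QWidget(parent), m_sink(sink)
        {
            // schedule a repaint whenever the sink has a new frame for us
            QGlib::connect(m_sink, "update", this, &VideoSurface::onUpdate);
        }

    protected:
        virtual void paintEvent(QPaintEvent *)
        {
            QPainter painter(this);
            // ask the sink to paint the current frame into our rectangle
            QGlib::emit<void>(m_sink, "paint", (void *) &painter,
                              (qreal) 0, (qreal) 0,
                              (qreal) width(), (qreal) height());
        }

    private:
        void onUpdate() { update(); } // forwards to QWidget::update()

        QGst::ElementPtr m_sink;
    };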

Getting OpenGL into the game

You may be wondering how this sink actually does the painting. Using QPainter, using OpenGL, or maybe something else? Well, there are actually two variants of this video sink. The first one, qtvideosink, just uses QPainter. It can handle only RGB data (only a subset of the formats that QImage supports) and does format conversion and scaling in software. The second one, qtglvideosink, uses OpenGL/OpenGL ES with shaders. It can handle both RGB and YUV formats and does format conversion and scaling in hardware. It is used in exactly the same way as qtvideosink, but it requires a QGLContext pointer to be set on its “glcontext” property before its state is set to READY. This of course means that the underlying surface must support OpenGL (i.e. it must be one of QGLWidget, QGLPixelBuffer or QGLFramebufferObject). To get this working on QGraphicsView/QML, you just need to set a QGLWidget as the viewport of the QGraphicsView and use this widget’s QGLContext in the sink.
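For example, here is a rough sketch of that setup with the qt-gstreamer bindings (passing the viewport’s QGLContext directly is my assumption here; any pointer to the right context will do):

    // assumes a QGraphicsScene named 'scene' already exists
    QGraphicsView *view = new QGraphicsView(&scene);
    QGLWidget *glWidget = new QGLWidget;
    view->setViewport(glWidget); // the view now renders through OpenGL

    QGst::ElementPtr sink = QGst::ElementFactory::make("qtglvideosink");
    // the context must be set before the sink changes state to READY
    sink->setProperty("glcontext", (void *) glWidget->context());
    sink->setState(QGst::StateReady);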

qtglvideosink uses GLSL shaders, or ARB fragment program shaders if GLSL is not supported. This means it should work on pretty much every GPU/driver combination that exists for Linux, on both desktop and embedded systems. In case no shaders are supported at all, it will fail to change its state to READY, and you can then just substitute it with qtvideosink, which is guaranteed to work on all platforms supported by Qt.
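In code, this fallback can be as simple as checking the return value of the state change. Continuing the sketch above:

    // glWidget as in the previous sketch
    QGst::ElementPtr sink = QGst::ElementFactory::make("qtglvideosink");
    sink->setProperty("glcontext", (void *) glWidget->context());
    if (sink->setState(QGst::StateReady) == QGst::StateChangeFailure) {
        // no usable shaders on this system; fall back to the QPainter variant
        sink->setState(QGst::StateNull);
        sink = QGst::ElementFactory::make("qtvideosink");
    }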

qtglvideosink also has an extra feature: it supports the GstColorBalance interface. Color adjustment is done in the shaders together with the format conversion. qtvideosink doesn’t support this, as it wouldn’t make sense there: color adjustment would need to be implemented in software, and that is better done by plugging a videobalance element in front of the sink. No need to duplicate code.
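If you do need color adjustment together with qtvideosink, a bin built from a pipeline description does the job. A sketch, with arbitrary values:

    // wraps videobalance + qtvideosink in a bin that acts as a single sink;
    // note that fromDescription() throws QGlib::Error on a parse failure
    QGst::BinPtr sinkBin = QGst::Bin::fromDescription(
            "videobalance contrast=1.2 saturation=0.8 ! qtvideosink");

The resulting bin can then be set as playbin2’s video-sink or linked like any other sink element.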

So, which variant to use?

If you are interested in painting video on QGraphicsView/QML, then qtglvideosink is the best choice of all sinks, and if for any reason the system doesn’t support OpenGL shaders, qtvideosink is the next best choice. If you intend to paint video on normal QWidgets, it is best to use one of the standard GStreamer sinks for your platform, unless you have a reason not to. QWidgets can expose native system windows by calling their winId() method, and therefore any sink that implements the GstXOverlay interface can be embedded in them. On X11, for example, xvimagesink is the best choice. However, if you need to do something more tricky and embedding another window doesn’t suit you, you could use qtglvideosink in a QGLWidget (preferably) or qtvideosink / qwidgetvideosink on a standard QWidget.
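For reference, embedding an XOverlay-capable sink in a QWidget looks roughly like this (a sketch against the qt-gstreamer 0.10 interfaces, so the exact names may differ):

    QGst::ElementPtr sink = QGst::ElementFactory::make("xvimagesink");
    QGst::XOverlayPtr overlay = sink.dynamicCast<QGst::XOverlay>();
    // 'widget' is your QWidget* (assumed to exist already);
    // the sink renders into the widget's native window
    overlay->setWindowId(widget->winId());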

Note that qwidgetvideosink is basically the same thing as qtvideosink, with the difference that it takes a QWidget pointer in its “widget” property and handles everything internally for painting on this widget. It has no signals. Other than that, it still does painting in software with QPainter, just like qtvideosink. This is just there to keep compatibility with code that may already be using it, as it already exists in QtGStreamer 0.10.1.
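In other words, usage boils down to something like this (a sketch; I am assuming the property accepts the raw pointer, as with qtglvideosink’s “glcontext”, and myWidget is a hypothetical QWidget pointer):

    QGst::ElementPtr sink = QGst::ElementFactory::make("qwidgetvideosink");
    sink->setProperty("widget", (void *) myWidget); // myWidget: hypothetical QWidget*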

This is actually 0.10 stuff… What about GStreamer 0.11/1.0?

Well, if you are interested in 0.11, you will be happy to hear that there is already a partial 0.11 port around. Two weeks ago I was at the GStreamer 1.0 hackfest in Málaga, Spain, and one of the things I did there was porting qtvideosink to 0.11. I must say the port was quite easy to do. However, last week I added some more stuff to the 0.10 version that I haven’t ported to 0.11 yet. I’ll get to that soon; it shouldn’t take long.

Try it out

The code lives in the qt-gstreamer repository. The actual video sinks are independent of the qt-gstreamer bindings, but qt-gstreamer itself has some helper classes for using them. Firstly, there is QGst::Ui::VideoWidget, a QWidget subclass which will accept qtvideosink, qtglvideosink and qwidgetvideosink just like any other video sink and will transparently do all the required work to paint the video in it. Secondly, there are QGst::Ui::GraphicsVideoWidget and QGst::Ui::GraphicsVideoSurface. Those two are meant to be used together to paint video on a QGraphicsView or QML. You can find more about them in the documentation in graphicsvideosurface.h (this will soon be on the documentation website). Finally, there is a QtGStreamer QML plugin, which exports a “VideoItem” element if you “import QtGStreamer 0.10”. This is also documented in the GraphicsVideoSurface header. All of this will soon be released in the upcoming qt-gstreamer 0.10.2.
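As a hedged sketch of the QGraphicsView/QML path (names as documented for the upcoming 0.10.2 API, so details may differ slightly):

    QDeclarativeView *view = new QDeclarativeView;
    view->setViewport(new QGLWidget); // recommended; enables qtglvideosink

    QGst::Ui::GraphicsVideoSurface *surface =
            new QGst::Ui::GraphicsVideoSurface(view);
    // expose the surface to QML; plug surface->videoSink() into your pipeline
    view->rootContext()->setContextProperty("videoSurface", surface);

    // then, on the QML side:
    //   import QtGStreamer 0.10
    //   VideoItem { surface: videoSurface; width: 320; height: 240 }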

23 thoughts on “Introducing qtvideosink – GStreamer meets QML”

  1. Very interesting. :-)
    Looks like a possible candidate to use in my Mini Player Plasma applet (but it would need to be available through packages or bundled, though).

  2. The QML application works with playbin2, but how would I go about getting it to work with rtspsrc? Using playbin2 loads the video, but it adds about two seconds of delay. Any way around this?

  3. Great job!
    The application works fine when I create the pipeline like:
    m_pipeline = QGst::ElementFactory::make("playbin2").dynamicCast<QGst::Pipeline>();

    But when I use a udpsink the video isn’t working. I have been trying so many things for some days, but without success. Could somebody help me? That would be very much appreciated.

    Server:
    gst-launch-0.10 videotestsrc ! jpegenc ! udpsink host=localhost port=40000

    Client-Code:
    QString pipe1Descr = QString("udpsrc port=40000 ! jpegdec ! decodebin "); // autovideosink
    m_pipeline = QGst::Parse::launch(pipe1Descr).dynamicCast<QGst::Pipeline>();

    When I use autovideosink in the pipeline, the video is available in a new widget. So I think the problem isn’t the pipeline.

    Stefan

  4. Stefan, did you find a solution to getting udpsrc to work with this? I’m having a similar issue with my QGst VideoWidget. I get “Could not get/set settings from/on resource.”

  5. Hi,
    I’m using your QtGStreamer code quite extensively, and it’s really a very nice piece of code: very simple and effective. So first of all, thank you for providing this to the Qt community.
    Indeed, I’m now running into an issue with QtGStreamer and Qt 4.8 (tested with Qt 4.8.2 to 4.8.4), which is quite simple to reproduce and is really annoying.
    I’m using your custom VideoItem QML element. The problem is that it’s systematically drawn at the wrong place (when not at 0,0).
    For instance, if you modify your example “qmlplayer.qml” and replace (for the VideoItem):
    "width: window.width"
    with
    "width: 320; anchors.horizontalCenter: parent.horizontalCenter",
    you would expect the VideoItem to be horizontally centered, but it will appear on the right of the window.
    Indeed, all the VideoItem elements are drawn with their x and y coordinates multiplied by 2, and this is true both with and without OpenGL.

    I’ve dug into your code in order to find where the issue could be, but I couldn’t find it.
    Indeed, the coordinates of QPainter are wrong in GraphicsVideoWidget::paint (graphicsvideowidget.cpp, in attachment):
    Adding:
    QPointF offset(-painter->deviceTransform().dx() / 2, -painter->deviceTransform().dy() / 2);
    painter->translate(offset);
    partially solves the issue (only in OpenGL; not in normal drawing mode, since there is then a clipping issue).
    But it’s just an unclean workaround.
    I don’t know much about the Qt architecture, but I suspect a bug in the computations of coordinates in QGraphicsWidget in the Qt code…
    Have you ever heard of this? I can’t find anything similar on the internet, but I’m surprised I’m the first to run into this issue, which is quite simple to run into…
    Maybe that issue has disappeared with Qt5, but I can’t manage to compile your code with Qt5… Do you plan to make a release for Qt5?

    Anyway, I wish you a happy new year with many good useful programs.
    With my best regards,
    B. Steux
    Mines ParisTech – Nexter Robotics

  6. Hello George,

    Thanks for the work you have done on this element.

    I am trying to modify your qmlplayer example to use a more complex GStreamer pipeline, rather than playbin2, and am having trouble getting to see my video.

    The pipeline I am using is
    "v4l2src ! ffmpegcolorspace ! qtvideosink" – I see no video, or
    "v4l2src ! ffmpegcolorspace ! ximagesink" – video pops up in a new window.

    My guess is that I am not doing the setProperty("video-sink", …) call correctly, but I am at the limit of my knowledge.

    Can you give an example of how to set up a pipeline using QGst::Bin::fromDescription() that does not use playbin2?

    Thanks,
    Steve.

    1. OK, following up my own question with the answer… I now see that setProperty("video-sink", …) in the qmlplayer example is a playbin2-specific way of overriding its output sink, and the more normal case for a GStreamer pipeline is to add the video sink to the pipeline and link the last stage of the pipeline (in my case ffmpegcolorspace) directly to the video sink. I’m now able to build the qmlplayer example code with a custom pipeline – cool!

      1. Hello! Can you please provide a code example? I’m having trouble writing to the pipeline. Here’s what I’m trying:

        QString pipe1Descr = QString("v4l2src ! video/x-raw-yuv,width=320,height=240,framerate=30/1 ! ffmpegcolorspace ");
        m_pipeline = QGst::Parse::launch(pipe1Descr).dynamicCast<QGst::Pipeline>();

        1. Hi Pat.

          I think that you need to add your videosink to the end of your pipeline.

          I’ve moved on from using a camera as a source to using playbin2 to play a file.

          Looking back over my repository, I think that the last time I used a camera I had:

          QML:

          //Open the video stream when we are loaded and close it on unload
          Component.onCompleted: {
              gstreamer.open("v4l2src ! video/x-raw-yuv,width=640,height=480,framerate=24/1 ! videocrop bottom=140 ! videoflip method=horizontal-flip ! ffmpegcolorspace");
          }
          Component.onDestruction: {
              gstreamer.close();
          }

          and C++:

          void GStreamer::close()
          {
              if (m_pipeline) {
                  m_pipeline->setState(QGst::StateNull);
              }
          }

          void GStreamer::open(const QString & command)
          {
              if (!m_pipeline) {
                  m_pipeline = QGst::Pipeline::create();

                  if (m_pipeline) {
                      QGst::ElementPtr videosrc;
                      try {
                          videosrc = QGst::Bin::fromDescription(command);
                      } catch (const QGlib::Error & error) {
                          qCritical() << error.message();
                          return;
                      }

                      m_pipeline->add(videosrc);
                      m_pipeline->add(m_videoSink);

                      videosrc->link(m_videoSink);

                      //watch the bus for messages
                      QGst::BusPtr bus = m_pipeline->bus();
                      bus->addSignalWatch();
                      QGlib::connect(bus, "message", this, &GStreamer::onBusMessage);
                  } else {
                      qCritical() << "Failed to create the pipeline";
                      return;
                  }

                  m_pipeline->setState(QGst::StatePlaying);
              }
          }

          This seemed to work for my camera pipeline, but it failed to link the pipeline when I attempted to play from a file – I suspect that this was because the pipeline was split across more than one bin, but I never got to the bottom of it.

  7. Hi, thanks for your work.
    Is this work still maintained? I mean, I see no more updates for a lot of months, and now there is Qt5. I’m looking for a simple way to put my video inside QML using GStreamer (with some particular options).

    Thank you.

  8. I mean, does it work even with Qt5?

    I’m going crazy with QML + GStreamer. I have a videosink provided by NVIDIA that provides hardware acceleration for my videos, but I can’t understand how to make it work with QML/Qt.

  9. Hello.
    It is impossible to build…
    Generating connect.moc
    /usr/local/qt5pi/bin/moc -I/home/pi/gstreamer_custom/custom_Gstreamer/qt-gstreamer/src -I/usr/include -I/home/pi/gstreamer_custom/custom_Gstreamer/qt-gstreamer/src/QGlib -I/usr/include/glib-2.0 -I/usr/include/glib-2.0 -I/usr/lib/arm-linux-gnueabihf/glib-2.0/include -I/usr/local/qt5pi/include -I/usr/local/qt5pi/include/QtCore -I/usr/local/qt5pi/mkspecs/devices/linux-rasp-pi-g++ -DQT_NO_KEYWORDS -o /home/pi/gstreamer_custom/custom_Gstreamer/qt-gstreamer/src/QGlib/connect.moc /home/pi/gstreamer_custom/custom_Gstreamer/qt-gstreamer/src/QGlib/connect.cpp
    No such file or directory
    AUTOMOC: error: process for /home/pi/gstreamer_custom/custom_Gstreamer/qt-gstreamer/src/QGlib/connect.moc failed:
    No such file or directory
    moc failed…
    make[2]: *** [src/QGlib/CMakeFiles/QtGLib_automoc] Error 1
    make[2]: Leaving directory `/home/pi/gstreamer_custom/custom_Gstreamer/qt-gstreamer’
    make[1]: *** [src/QGlib/CMakeFiles/QtGLib_automoc.dir/all] Error 2
    make[1]: Leaving directory `/home/pi/gstreamer_custom/custom_Gstreamer/qt-gstreamer’
    make: *** [all] Error 2

    Help,pls)

  10. Hi, George
    Can qtvideosink be used on Windows? I downloaded and built qt-gstreamer 0.10.2 on Windows, but when I run the player sample, it did not show any video output, apart from the progress slider stepping. The qmlplayer was normal if I enabled QMLPLAYER_NO_OPENGL in the .pro.

    1. Sorry, I’m not sure if the qmlplayer is OK; maybe it uses autovideosink. In the player sample, I create the qtvideosink manually and set it on playbin2 with setProperty("video-sink", video_sink). If I use dshowvideosink instead, it works fine.

  11. Hi, can you help me with how to plug a videobalance element in before the sink, with some example code? Somehow in the qmlplayer2 example the video has a green hue. I’m modelling my application after qmlplayer2, because the rest of my application has a QtQuick 2.0 dependency.
