QAbstractVideoFilter Class

The QAbstractVideoFilter class represents a filter that is applied to the video frames received by a VideoOutput type. More...

Header: #include <QAbstractVideoFilter>
qmake: QT += multimedia
Since: Qt 5.5
Inherits: QObject

Properties

  • 1 property inherited from QObject

Public Functions

QAbstractVideoFilter(QObject *parent = Q_NULLPTR)
virtual QVideoFilterRunnable *createFilterRunnable() = 0
bool isActive() const
void setActive(bool v)
  • 32 public functions inherited from QObject

Signals

void activeChanged()

Additional Inherited Members

  • 1 public slot inherited from QObject
  • 11 static public members inherited from QObject
  • 9 protected functions inherited from QObject

Detailed Description

The QAbstractVideoFilter class represents a filter that is applied to the video frames received by a VideoOutput type.

QAbstractVideoFilter provides a convenient way for applications to run image processing, computer vision algorithms or any generic transformation or calculation on the output of a VideoOutput type, regardless of the source (video or camera). By providing a simple interface it allows applications and third parties to easily develop QML types that provide image processing algorithms using popular frameworks like OpenCV. Due to the close integration with the final stages of the Qt Multimedia video pipeline, accelerated and possibly zero-copy solutions are feasible too: for instance, a plugin providing OpenCL-based algorithms can use OpenCL's OpenGL interop to use the OpenGL textures created by a hardware accelerated video decoder, without additional readbacks and copies.

Note: QAbstractVideoFilter is not always the best choice. To apply effects or transformations using OpenGL shaders to the image shown on screen, the standard Qt Quick approach of using ShaderEffect items in combination with VideoOutput should be used. QAbstractVideoFilter is not a replacement for this. It is rather targeted at performing computations (which do not necessarily change the image shown on screen) and at computer vision algorithms provided by external frameworks.

QAbstractVideoFilter is meant to be subclassed. The subclasses are then registered with the QML engine, so they can be used as QML types. The list of filters is assigned to a VideoOutput type via its filters property.

A single filter represents one transformation or processing step on a video frame. The output is a modified video frame, some arbitrary data or both. For example, image transformations will result in a different image, whereas an algorithm for detecting objects on an image will likely provide a list of rectangles.

Arbitrary data can be represented as properties on the QAbstractVideoFilter subclass and on the QObject or QJSValue instances passed to its signals. Exactly which properties and signals these are is up to the individual video filter. Completion of the operations can be indicated by signals. Computations that do not result in a modified image will pass the input image through, so that subsequent filters can be placed after them.

Properties set on QAbstractVideoFilter serve as input to the computation, similarly to how uniform values are specified in ShaderEffect types. Changed property values take effect when the next video frame is processed.
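As a sketch of this pattern (the class names and the brightness property are illustrative assumptions, not part of the Qt API), a filter subclass can expose an input property that its runnable samples each time a frame is processed, so QML-side changes, including animations, apply on the next frame:

```cpp
#include <QAbstractVideoFilter>
#include <QVideoFilterRunnable>

// Hypothetical filter exposing a "brightness" input property.
class BrightnessFilter : public QAbstractVideoFilter
{
    Q_OBJECT
    Q_PROPERTY(qreal brightness READ brightness WRITE setBrightness NOTIFY brightnessChanged)
public:
    qreal brightness() const { return m_brightness; }
    void setBrightness(qreal b) {
        if (m_brightness != b) {
            m_brightness = b;
            emit brightnessChanged();
        }
    }
    QVideoFilterRunnable *createFilterRunnable() override;
signals:
    void brightnessChanged();
private:
    qreal m_brightness = 1.0;
};

class BrightnessFilterRunnable : public QVideoFilterRunnable
{
public:
    explicit BrightnessFilterRunnable(BrightnessFilter *filter) : m_filter(filter) { }
    QVideoFrame run(QVideoFrame *input, const QVideoSurfaceFormat &surfaceFormat,
                    RunFlags flags) override
    {
        Q_UNUSED(surfaceFormat);
        Q_UNUSED(flags);
        // Read the latest property value; changes made in QML since the
        // previous frame are picked up here.
        const qreal b = m_filter->brightness();
        Q_UNUSED(b);
        // ... adjust *input using b, or leave it untouched ...
        return *input; // pass the frame through
    }
private:
    BrightnessFilter *m_filter;
};

QVideoFilterRunnable *BrightnessFilter::createFilterRunnable()
{
    // Pass this so the runnable can access the filter's properties.
    return new BrightnessFilterRunnable(this);
}
```

The runnable holds a plain pointer back to its filter; this mirrors the pattern the detailed description recommends for createFilterRunnable().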

The typical usage is to subclass QAbstractVideoFilter and QVideoFilterRunnable:

class MyFilterRunnable : public QVideoFilterRunnable {
public:
    QVideoFrame run(QVideoFrame *input, const QVideoSurfaceFormat &surfaceFormat, RunFlags flags) override { ... }
};

class MyFilter : public QAbstractVideoFilter {
    Q_OBJECT // required: MyFilter declares signals
public:
    QVideoFilterRunnable *createFilterRunnable() override { return new MyFilterRunnable; }
signals:
    void finished(QObject *result);
};

int main(int argc, char **argv) {
    ...
    qmlRegisterType<MyFilter>("my.uri", 1, 0, "MyFilter");
    ...
}

MyFilter is thus accessible from QML:

import my.uri 1.0

Camera {
    id: camera
}
MyFilter {
    id: filter
    // set properties, they can also be animated
    onFinished: console.log("results of the computation: " + result)
}
VideoOutput {
    source: camera
    filters: [ filter ]
    anchors.fill: parent
}
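Multiple filters can be attached to the same VideoOutput; they run in list order, each receiving the previous filter's output frame. A hedged QML sketch (MyOtherFilter and the threshold property are hypothetical, introduced only for illustration):

```qml
import QtQuick 2.5
import QtMultimedia 5.5
import my.uri 1.0

Camera {
    id: camera
}
MyFilter {
    id: filter
    // hypothetical input property, animated like any other QML property
    NumberAnimation on threshold { from: 0; to: 1; duration: 1000; loops: Animation.Infinite }
}
MyOtherFilter {
    id: otherFilter // hypothetical second filter type
}
VideoOutput {
    source: camera
    // filters are applied in list order
    filters: [ filter, otherFilter ]
    anchors.fill: parent
}
```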

This also allows providing filters in QML plugins, separately from the application.

See also VideoOutput, Camera, MediaPlayer, and QVideoFilterRunnable.

Property Documentation

active : bool

This property holds the active status of the filter.

This is true if the filter is active, false otherwise.

By default filters are active. When set to false, the filter will be ignored by the VideoOutput type.
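From QML this is simply a binding on the filter's active property; a minimal sketch (the enabling condition is a made-up example):

```qml
MyFilter {
    id: filter
    // When false, VideoOutput skips this filter entirely and frames
    // flow on to any remaining filters unchanged.
    active: settings.processingEnabled // hypothetical binding
}
```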

Access functions:

bool isActive() const
void setActive(bool v)

Notifier signal:

void activeChanged()

Member Function Documentation

QAbstractVideoFilter::QAbstractVideoFilter(QObject *parent = Q_NULLPTR)

Constructs a new QAbstractVideoFilter instance with parent object parent.

[pure virtual] QVideoFilterRunnable *QAbstractVideoFilter::createFilterRunnable()

Factory function to create a new instance of a QVideoFilterRunnable subclass corresponding to this filter.

This function is called on the thread on which the Qt Quick scene graph performs rendering, with the OpenGL context bound. Ownership of the returned instance is transferred: the returned instance will live on the render thread and will be destroyed automatically when necessary.

Typically, implementations of the function will simply construct a new QVideoFilterRunnable instance, passing this to the constructor, since filter runnables must know their associated QAbstractVideoFilter instance in order to access dynamic properties and, optionally, emit its signals.

© 2017 The Qt Company Ltd. Documentation contributions included herein are the copyrights of their respective owners. The documentation provided herein is licensed under the terms of the GNU Free Documentation License version 1.3 as published by the Free Software Foundation. Qt and respective logos are trademarks of The Qt Company Ltd. in Finland and/or other countries worldwide. All other trademarks are property of their respective owners.