QVideoFrameFormat#
The QVideoFrameFormat class specifies the stream format of a video presentation surface.
New in version 6.1.
Synopsis#
Functions#
def colorRange()
def colorSpace()
def colorTransfer()
def fragmentShaderFileName()
def frameHeight()
def frameRate()
def frameSize()
def frameWidth()
def isMirrored()
def isValid()
def maxLuminance()
def __ne__(format)
def __eq__(format)
def pixelFormat()
def planeCount()
def scanLineDirection()
def setColorRange(range)
def setColorSpace(colorSpace)
def setColorTransfer(colorTransfer)
def setFrameRate(rate)
def setFrameSize(size)
def setFrameSize(width, height)
def setMaxLuminance(lum)
def setMirrored(mirrored)
def setScanLineDirection(direction)
def setViewport(viewport)
def setYCbCrColorSpace(colorSpace)
def swap(other)
def updateUniformData(dst, frame, transform, opacity)
def vertexShaderFileName()
def viewport()
def yCbCrColorSpace()
Static functions#
def imageFormatFromPixelFormat(format)
def pixelFormatFromImageFormat(format)
def pixelFormatToString(pixelFormat)
Note
This documentation may contain snippets that were automatically translated from C++ to Python. We always welcome contributions to the snippet translation. If you see an issue with the translation, you can also let us know by creating a ticket on https://bugreports.qt.io/projects/PYSIDE
Detailed Description#
A video sink presents a stream of video frames. QVideoFrameFormat describes the type of the frames and determines how they should be presented.
The core properties of a video stream required to set up a video sink are the pixel format given by pixelFormat(), and the frame dimensions given by frameSize().
The region of a frame that is actually displayed on a video surface is given by the viewport(). A stream may have a viewport smaller than the entire region of a frame to allow for videos smaller than the nearest optimal size of a video frame. For example, the width of a frame may be extended so that the start of each scan line is eight-byte aligned.
Other common properties are the scanLineDirection(), frameRate() and the yCbCrColorSpace().
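For example, a format for a hypothetical 1280x720, 30 fps stream stored as planar YUV 4:2:0 could be described as follows (the values are illustrative, not part of this reference):

    from PySide6.QtCore import QSize
    from PySide6.QtMultimedia import QVideoFrameFormat

    # Describe a hypothetical 1280x720 stream stored as planar YUV 4:2:0.
    fmt = QVideoFrameFormat(QSize(1280, 720),
                            QVideoFrameFormat.PixelFormat.Format_YUV420P)
    fmt.setFrameRate(30.0)

    print(fmt.isValid())      # True: both pixel format and frame size are set
    print(fmt.pixelFormat())  # the pixel format passed to the constructor
    print(fmt.frameSize())    # QSize(1280, 720)
    print(fmt.viewport())     # by default the viewport covers the whole frame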
- class PySide6.QtMultimedia.QVideoFrameFormat#
PySide6.QtMultimedia.QVideoFrameFormat(size, pixelFormat)
PySide6.QtMultimedia.QVideoFrameFormat(format)
- Parameters:
pixelFormat –
PixelFormat
size –
PySide6.QtCore.QSize
Constructs a null video stream format.
Constructs a video stream format with the given frame size and pixel format.
Constructs a copy of other.
- PySide6.QtMultimedia.QVideoFrameFormat.PixelFormat#
Enumerates video data types.
Constant
Description
QVideoFrameFormat.Format_Invalid
The frame is invalid.
QVideoFrameFormat.Format_ARGB8888
The frame is stored using an ARGB format with 8 bits per component.
QVideoFrameFormat.Format_ARGB8888_Premultiplied
The frame is stored using a premultiplied ARGB format with 8 bits per component.
QVideoFrameFormat.Format_XRGB8888
The frame is stored using a 32 bits per pixel RGB format (0xff, R, G, B).
QVideoFrameFormat.Format_BGRA8888
The frame is stored using a 32-bit BGRA format (0xBBGGRRAA).
QVideoFrameFormat.Format_BGRA8888_Premultiplied
The frame is stored using a premultiplied 32-bit BGRA format.
QVideoFrameFormat.Format_ABGR8888
The frame is stored using a 32-bit ABGR format (0xAABBGGRR).
QVideoFrameFormat.Format_XBGR8888
The frame is stored using a 32-bit BGR format (0xffBBGGRR).
QVideoFrameFormat.Format_RGBA8888
The frame is stored in memory as the bytes R, G, B, A/X, with R at the lowest address and A/X at the highest address.
QVideoFrameFormat.Format_BGRX8888
The frame is stored using a 32-bit BGRx format, [31:0] B:G:R:x 8:8:8:8 little endian.
QVideoFrameFormat.Format_RGBX8888
The frame is stored in memory as the bytes R, G, B, A/X, with R at the lowest address and A/X at the highest address.
QVideoFrameFormat.Format_AYUV
The frame is stored using a packed 32-bit AYUV format (0xAAYYUUVV).
QVideoFrameFormat.Format_AYUV_Premultiplied
The frame is stored using a packed premultiplied 32-bit AYUV format (0xAAYYUUVV).
QVideoFrameFormat.Format_YUV420P
The frame is stored using an 8-bit per component planar YUV format with the U and V planes horizontally and vertically sub-sampled, i.e. the height and width of the U and V planes are half that of the Y plane.
QVideoFrameFormat.Format_YUV422P
The frame is stored using an 8-bit per component planar YUV format with the U and V planes horizontally sub-sampled, i.e. the width of the U and V planes is half that of the Y plane, and the height of the U and V planes is the same as that of the Y plane.
QVideoFrameFormat.Format_YV12
The frame is stored using an 8-bit per component planar YVU format with the V and U planes horizontally and vertically sub-sampled, i.e. the height and width of the V and U planes are half that of the Y plane.
QVideoFrameFormat.Format_UYVY
The frame is stored using an 8-bit per component packed YUV format with the U and V planes horizontally sub-sampled (U-Y-V-Y), i.e. two horizontally adjacent pixels are stored as a 32-bit macropixel which has a Y value for each pixel and common U and V values.
QVideoFrameFormat.Format_YUYV
The frame is stored using an 8-bit per component packed YUV format with the U and V planes horizontally sub-sampled (Y-U-Y-V), i.e. two horizontally adjacent pixels are stored as a 32-bit macropixel which has a Y value for each pixel and common U and V values.
QVideoFrameFormat.Format_NV12
The frame is stored using an 8-bit per component semi-planar YUV format with a Y plane (Y) followed by a horizontally and vertically sub-sampled, packed UV plane (U-V).
QVideoFrameFormat.Format_NV21
The frame is stored using an 8-bit per component semi-planar YUV format with a Y plane (Y) followed by a horizontally and vertically sub-sampled, packed VU plane (V-U).
QVideoFrameFormat.Format_IMC1
The frame is stored using an 8-bit per component planar YUV format with the U and V planes horizontally and vertically sub-sampled. This is similar to the Format_YUV420P type, except that the bytes per line of the U and V planes are padded out to the same stride as the Y plane.
QVideoFrameFormat.Format_IMC2
The frame is stored using an 8-bit per component planar YUV format with the U and V planes horizontally and vertically sub-sampled. This is similar to the Format_YUV420P type, except that the lines of the U and V planes are interleaved, i.e. each line of U data is followed by a line of V data creating a single line of the same stride as the Y data.
QVideoFrameFormat.Format_IMC3
The frame is stored using an 8-bit per component planar YVU format with the V and U planes horizontally and vertically sub-sampled. This is similar to the Format_YV12 type, except that the bytes per line of the V and U planes are padded out to the same stride as the Y plane.
QVideoFrameFormat.Format_IMC4
The frame is stored using an 8-bit per component planar YVU format with the V and U planes horizontally and vertically sub-sampled. This is similar to the Format_YV12 type, except that the lines of the V and U planes are interleaved, i.e. each line of V data is followed by a line of U data creating a single line of the same stride as the Y data.
QVideoFrameFormat.Format_P010
The frame is stored using a 16-bit per component semi-planar YUV format with a Y plane (Y) followed by a horizontally and vertically sub-sampled, packed UV plane (U-V). Only the 10 most significant bits of each component are used.
QVideoFrameFormat.Format_P016
The frame is stored using a 16-bit per component semi-planar YUV format with a Y plane (Y) followed by a horizontally and vertically sub-sampled, packed UV plane (U-V).
QVideoFrameFormat.Format_Y8
The frame is stored using an 8-bit greyscale format.
QVideoFrameFormat.Format_Y16
The frame is stored using a 16-bit linear greyscale format. Little endian.
QVideoFrameFormat.Format_Jpeg
The frame is stored in compressed Jpeg format.
QVideoFrameFormat.Format_SamplerExternalOES
The frame is stored in external OES texture format. This is currently only being used on Android.
QVideoFrameFormat.Format_SamplerRect
The frame is stored in rectangle texture format (GL_TEXTURE_RECTANGLE). This is only used on macOS with an OpenGL-based Rendering Hardware Interface. The underlying pixel format stored in the texture is Format_BGRA8888.
QVideoFrameFormat.Format_YUV420P10
Similar to YUV420, but uses 16 bits per component, 10 of which are significant.
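As a rough illustration of how the packed and planar formats above differ, the following sketch queries planeCount() for a few of them (the frame size is arbitrary):

    from PySide6.QtCore import QSize
    from PySide6.QtMultimedia import QVideoFrameFormat

    # Plane count follows from the pixel format alone: packed RGB formats use a
    # single plane, semi-planar NV12 uses two, fully planar YUV420P uses three.
    for pixel_format in (QVideoFrameFormat.PixelFormat.Format_XRGB8888,
                         QVideoFrameFormat.PixelFormat.Format_NV12,
                         QVideoFrameFormat.PixelFormat.Format_YUV420P):
        fmt = QVideoFrameFormat(QSize(640, 480), pixel_format)
        print(QVideoFrameFormat.pixelFormatToString(pixel_format),
              fmt.planeCount())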
- PySide6.QtMultimedia.QVideoFrameFormat.Direction#
Enumerates the layout direction of video scan lines.
Constant
Description
QVideoFrameFormat.TopToBottom
Scan lines are arranged from the top of the frame to the bottom.
QVideoFrameFormat.BottomToTop
Scan lines are arranged from the bottom of the frame to the top.
- PySide6.QtMultimedia.QVideoFrameFormat.YCbCrColorSpace#
Use ColorSpace instead.
Enumerates the Y’CbCr color space of video frames.
Constant
Description
QVideoFrameFormat.YCbCr_Undefined
No color space is specified.
QVideoFrameFormat.YCbCr_BT601
A Y’CbCr color space defined by ITU-R recommendation BT.601 with Y value range from 16 to 235, and Cb/Cr range from 16 to 240. Used mostly by older videos that were targeting CRT displays.
QVideoFrameFormat.YCbCr_BT709
A Y’CbCr color space defined by ITU-R BT.709 with the same values range as YCbCr_BT601. The most commonly used color space today.
QVideoFrameFormat.YCbCr_xvYCC601
This value is deprecated. Please check the ColorRange instead. The BT.601 color space with the value range extended to 0 to 255. It is backward compatible with BT.601 and uses values outside the BT.601 range to represent a wider range of colors.
QVideoFrameFormat.YCbCr_xvYCC709
This value is deprecated. Please check the ColorRange instead. The BT.709 color space with the value range extended to 0 to 255.
QVideoFrameFormat.YCbCr_JPEG
The full range Y’CbCr color space used in most JPEG files.
QVideoFrameFormat.YCbCr_BT2020
The color space defined by ITU-R BT.2020. Used mainly for HDR videos.
- PySide6.QtMultimedia.QVideoFrameFormat.ColorSpace#
Enumerates the color space of video frames.
Constant
Description
QVideoFrameFormat.ColorSpace_Undefined
No color space is specified.
QVideoFrameFormat.ColorSpace_BT601
A color space defined by ITU-R recommendation BT.601 with Y value range from 16 to 235, and Cb/Cr range from 16 to 240. Used mostly by older videos that were targeting CRT displays.
QVideoFrameFormat.ColorSpace_BT709
A color space defined by ITU-R BT.709 with the same values range as ColorSpace_BT601. The most commonly used color space today.
QVideoFrameFormat.ColorSpace_AdobeRgb
The full range YUV color space used in most JPEG files.
QVideoFrameFormat.ColorSpace_BT2020
The color space defined by ITU-R BT.2020. Used mainly for HDR videos.
New in version 6.4.
- PySide6.QtMultimedia.QVideoFrameFormat.ColorTransfer#
Constant
Description
QVideoFrameFormat.ColorTransfer_Unknown
The color transfer function is unknown.
QVideoFrameFormat.ColorTransfer_BT709
Color values are encoded according to BT709. See also https://www.itu.int/rec/R-REC-BT.709/en. This is close to, but not identical to a gamma curve of 2.2, and the same transfer curve as is used in sRGB.
QVideoFrameFormat.ColorTransfer_BT601
Color values are encoded according to BT601. See also https://www.itu.int/rec/R-REC-BT.601/en.
QVideoFrameFormat.ColorTransfer_Linear
Color values are linear
QVideoFrameFormat.ColorTransfer_Gamma22
Color values are encoded with a gamma of 2.2
QVideoFrameFormat.ColorTransfer_Gamma28
Color values are encoded with a gamma of 2.8
QVideoFrameFormat.ColorTransfer_ST2084
Color values are encoded using SMPTE ST 2084. This transfer function is the most common HDR transfer function and is often called the ‘perceptual quantizer’. See also https://www.itu.int/rec/R-REC-BT.2100 and https://en.wikipedia.org/wiki/Perceptual_quantizer.
QVideoFrameFormat.ColorTransfer_STD_B67
Color values are encoded using ARIB STD B67. This transfer function is also often referred to as ‘hybrid log gamma’. See also https://www.itu.int/rec/R-REC-BT.2100 and https://en.wikipedia.org/wiki/Hybrid_log–gamma.
New in version 6.4.
- PySide6.QtMultimedia.QVideoFrameFormat.ColorRange#
Describes the color range used by the video data. Video data usually comes in either full color range, where all values are being used, or a more limited range traditionally used in YUV video formats, where a subset of all values is being used.
Constant
Description
QVideoFrameFormat.ColorRange_Unknown
The color range of the video is unknown.
QVideoFrameFormat.ColorRange_Video
The color range traditionally used by most YUV video formats. For 8-bit formats, the Y component is limited to values between 16 and 235. The U and V components are limited to values between 16 and 240. For higher bit depths, multiply these values by 2^(depth-8).
QVideoFrameFormat.ColorRange_Full
Full color range. All values from 0 to 2^depth - 1 are valid.
New in version 6.4.
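For example, the color-related enums above could be combined to describe an HDR10-style stream; the concrete values below are assumptions for illustration only:

    from PySide6.QtCore import QSize
    from PySide6.QtMultimedia import QVideoFrameFormat

    # Hypothetical HDR10-style description: 10-bit semi-planar pixels,
    # BT.2020 color space, PQ (ST 2084) transfer, limited "video" range.
    fmt = QVideoFrameFormat(QSize(3840, 2160),
                            QVideoFrameFormat.PixelFormat.Format_P010)
    fmt.setColorSpace(QVideoFrameFormat.ColorSpace.ColorSpace_BT2020)
    fmt.setColorTransfer(QVideoFrameFormat.ColorTransfer.ColorTransfer_ST2084)
    fmt.setColorRange(QVideoFrameFormat.ColorRange.ColorRange_Video)
    fmt.setMaxLuminance(1000.0)  # assumed peak luminance of the content, in nits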
- PySide6.QtMultimedia.QVideoFrameFormat.NPixelFormats#
- PySide6.QtMultimedia.QVideoFrameFormat.colorRange()#
- Return type:
ColorRange
Returns the color range that should be used to render the video stream.
See also
- PySide6.QtMultimedia.QVideoFrameFormat.colorSpace()#
- Return type:
ColorSpace
Returns the color space of a video stream.
See also
- PySide6.QtMultimedia.QVideoFrameFormat.colorTransfer()#
- Return type:
ColorTransfer
Returns the color transfer function that should be used to render the video stream.
See also
- PySide6.QtMultimedia.QVideoFrameFormat.fragmentShaderFileName()#
- Return type:
str
- PySide6.QtMultimedia.QVideoFrameFormat.frameHeight()#
- Return type:
int
Returns the height of frames in a video stream.
- PySide6.QtMultimedia.QVideoFrameFormat.frameRate()#
- Return type:
float
Returns the frame rate of a video stream in frames per second.
See also
- PySide6.QtMultimedia.QVideoFrameFormat.frameSize()#
- Return type:
PySide6.QtCore.QSize
Returns the dimensions of frames in a video stream.
See also
- PySide6.QtMultimedia.QVideoFrameFormat.frameWidth()#
- Return type:
int
Returns the width of frames in a video stream.
See also
- static PySide6.QtMultimedia.QVideoFrameFormat.imageFormatFromPixelFormat(format)#
- Parameters:
format –
PixelFormat
- Return type:
PySide6.QtGui.QImage.Format
Returns an image format equivalent to a video frame pixel format. If there is no equivalent format, QImage::Format_Invalid is returned instead.
Note
In general QImage does not handle YUV formats.
- PySide6.QtMultimedia.QVideoFrameFormat.isMirrored()#
- Return type:
bool
Returns true if the surface is mirrored around its vertical axis. This is typically needed for video frames coming from a front camera of a mobile device.
Note
The mirroring here differs from QImage::mirrored, as a vertically mirrored QImage will be mirrored around its x-axis.
- PySide6.QtMultimedia.QVideoFrameFormat.isValid()#
- Return type:
bool
Identifies if a video surface format has a valid pixel format and frame size.
Returns true if the format is valid, and false otherwise.
- PySide6.QtMultimedia.QVideoFrameFormat.maxLuminance()#
- Return type:
float
- PySide6.QtMultimedia.QVideoFrameFormat.__ne__(format)#
- Parameters:
format – PySide6.QtMultimedia.QVideoFrameFormat
- Return type:
bool
Returns true if other is different to this video format, and false if they are the same.
- PySide6.QtMultimedia.QVideoFrameFormat.__eq__(format)#
- Parameters:
format – PySide6.QtMultimedia.QVideoFrameFormat
- Return type:
bool
Returns true if other is the same as this video format, and false if they are different.
- PySide6.QtMultimedia.QVideoFrameFormat.pixelFormat()#
- Return type:
PixelFormat
Returns the pixel format of frames in a video stream.
- static PySide6.QtMultimedia.QVideoFrameFormat.pixelFormatFromImageFormat(format)#
- Parameters:
format –
Format
- Return type:
PixelFormat
Returns a video pixel format equivalent to an image format. If there is no equivalent format, Format_Invalid is returned instead.
Note
In general QImage does not handle YUV formats.
- static PySide6.QtMultimedia.QVideoFrameFormat.pixelFormatToString(pixelFormat)#
- Parameters:
pixelFormat –
PixelFormat
- Return type:
str
Returns a string representation of the given pixelFormat.
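The static helpers can be combined as in the following sketch; the particular QImage format chosen here is illustrative, and as noted above YUV pixel formats have no QImage equivalent:

    from PySide6.QtGui import QImage
    from PySide6.QtMultimedia import QVideoFrameFormat

    # Map a QImage format to a video pixel format and print its name.
    pf = QVideoFrameFormat.pixelFormatFromImageFormat(
        QImage.Format.Format_RGBA8888)
    print(QVideoFrameFormat.pixelFormatToString(pf))

    # Going the other way, YUV pixel formats have no QImage equivalent,
    # so the result is QImage.Format_Invalid.
    img_format = QVideoFrameFormat.imageFormatFromPixelFormat(
        QVideoFrameFormat.PixelFormat.Format_NV12)
    print(img_format == QImage.Format.Format_Invalid)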
- PySide6.QtMultimedia.QVideoFrameFormat.planeCount()#
- Return type:
int
Returns the number of planes used. This number depends on the pixel format: it is 1 for RGB-based formats, and between 1 and 3 for YUV-based formats.
- PySide6.QtMultimedia.QVideoFrameFormat.scanLineDirection()#
- Return type:
Direction
Returns the direction of scan lines.
See also
- PySide6.QtMultimedia.QVideoFrameFormat.setColorRange(range)#
- Parameters:
range –
ColorRange
Sets the color range that should be used to render the video stream to range.
See also
- PySide6.QtMultimedia.QVideoFrameFormat.setColorSpace(colorSpace)#
- Parameters:
colorSpace –
ColorSpace
Sets the colorSpace of a video stream.
See also
- PySide6.QtMultimedia.QVideoFrameFormat.setColorTransfer(colorTransfer)#
- Parameters:
colorTransfer –
ColorTransfer
Sets the color transfer function that should be used to render the video stream to colorTransfer.
See also
- PySide6.QtMultimedia.QVideoFrameFormat.setFrameRate(rate)#
- Parameters:
rate – float
Sets the frame rate of a video stream in frames per second.
See also
- PySide6.QtMultimedia.QVideoFrameFormat.setFrameSize(size)#
- Parameters:
size –
PySide6.QtCore.QSize
Sets the size of frames in a video stream to size.
This will reset the viewport() to fill the entire frame.
See also
- PySide6.QtMultimedia.QVideoFrameFormat.setFrameSize(width, height)
- Parameters:
width – int
height – int
This is an overloaded function.
Sets the width and height of frames in a video stream.
This will reset the viewport() to fill the entire frame.
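A small sketch of the viewport reset described above (sizes are arbitrary):

    from PySide6.QtCore import QRect, QSize
    from PySide6.QtMultimedia import QVideoFrameFormat

    fmt = QVideoFrameFormat(QSize(640, 480),
                            QVideoFrameFormat.PixelFormat.Format_NV12)
    fmt.setViewport(QRect(0, 0, 600, 480))  # show only part of each frame

    # Changing the frame size resets the viewport to cover the whole frame.
    fmt.setFrameSize(1280, 720)
    print(fmt.viewport())  # QRect(0, 0, 1280, 720)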
- PySide6.QtMultimedia.QVideoFrameFormat.setMaxLuminance(lum)#
- Parameters:
lum – float
Sets the maximum luminance to the given value, lum.
- PySide6.QtMultimedia.QVideoFrameFormat.setMirrored(mirrored)#
- Parameters:
mirrored – bool
Sets whether the surface is mirrored around its vertical axis. This is typically needed for video frames coming from a front camera of a mobile device. The default value is false.
Note
The mirroring here differs from QImage::mirrored, as a vertically mirrored QImage will be mirrored around its x-axis.
See also
- PySide6.QtMultimedia.QVideoFrameFormat.setScanLineDirection(direction)#
- Parameters:
direction –
Direction
Sets the direction of scan lines.
See also
- PySide6.QtMultimedia.QVideoFrameFormat.setViewport(viewport)#
- Parameters:
viewport –
PySide6.QtCore.QRect
Sets the viewport of a video stream to viewport.
See also
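For instance, continuing the scan-line alignment example from the detailed description (the sizes here are made up), a stream whose lines are padded wider than the picture can limit the displayed region:

    from PySide6.QtCore import QRect, QSize
    from PySide6.QtMultimedia import QVideoFrameFormat

    # Frames are allocated 1928 pixels wide so each scan line stays aligned,
    # but only the left 1920x1080 region contains the actual picture.
    fmt = QVideoFrameFormat(QSize(1928, 1080),
                            QVideoFrameFormat.PixelFormat.Format_YUV420P)
    fmt.setViewport(QRect(0, 0, 1920, 1080))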
- PySide6.QtMultimedia.QVideoFrameFormat.setYCbCrColorSpace(colorSpace)#
- Parameters:
colorSpace –
YCbCrColorSpace
Note
This function is deprecated. Use setColorSpace() instead.
Sets the Y’CbCr color space of a video stream. It is only used with raw YUV frame types.
See also
- PySide6.QtMultimedia.QVideoFrameFormat.swap(other)#
- Parameters:
other – PySide6.QtMultimedia.QVideoFrameFormat
Swaps the current video frame format with the other.
- PySide6.QtMultimedia.QVideoFrameFormat.updateUniformData(dst, frame, transform, opacity)#
- Parameters:
dst – PySide6.QtCore.QByteArray
frame –
PySide6.QtMultimedia.QVideoFrame
transform –
PySide6.QtGui.QMatrix4x4
opacity – float
- PySide6.QtMultimedia.QVideoFrameFormat.vertexShaderFileName()#
- Return type:
str
- PySide6.QtMultimedia.QVideoFrameFormat.viewport()#
- Return type:
PySide6.QtCore.QRect
Returns the viewport of a video stream.
The viewport is the region of a video frame that is actually displayed.
By default the viewport covers an entire frame.
See also
- PySide6.QtMultimedia.QVideoFrameFormat.yCbCrColorSpace()#
- Return type:
YCbCrColorSpace
Note
This function is deprecated. Use colorSpace() instead.
Returns the Y’CbCr color space of a video stream.
See also