Image Gestures Example¶
Demonstrates the use of simple gestures in a widget.
This example shows how to enable gestures for a widget and use gesture input to perform actions.
We use two classes to create the user interface for the application: MainWidget and ImageWidget. The MainWidget class is simply used as a container for the ImageWidget class, which we will configure to accept gesture input. Since we are interested in the way gestures are used, we will concentrate on the implementation of the ImageWidget class.
ImageWidget Class Definition¶
The ImageWidget class is a simple QWidget subclass that reimplements the general event() handler function in addition to several more specific event handlers:

```cpp
class ImageWidget : public QWidget
{
    Q_OBJECT

public:
    ImageWidget(QWidget *parent = nullptr);
    void openDirectory(const QString &path);
    void grabGestures(const QVector<Qt::GestureType> &gestures);

protected:
    bool event(QEvent *event) override;
    void paintEvent(QPaintEvent *event) override;
    void resizeEvent(QResizeEvent *event) override;
    void mouseDoubleClickEvent(QMouseEvent *event) override;

private:
    bool gestureEvent(QGestureEvent *event);
    void panTriggered(QPanGesture*);
    void pinchTriggered(QPinchGesture*);
    void swipeTriggered(QSwipeGesture*);
    ...
};
```

We also implement a private helper function, gestureEvent(), to help manage gesture events delivered to the widget, and three functions to perform actions based on gestures: panTriggered(), pinchTriggered() and swipeTriggered().
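The grabGestures() function declared above is not implemented in this excerpt. Assuming it simply forwards each requested gesture type to QWidget::grabGesture(), a minimal sketch might look like this:

```cpp
// Hypothetical sketch: subscribe to each gesture type so that recognized
// gestures are delivered to this widget as QGestureEvents via event().
void ImageWidget::grabGestures(const QVector<Qt::GestureType> &gestures)
{
    for (Qt::GestureType gesture : gestures)
        grabGesture(gesture);
}
```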
ImageWidget Class Implementation¶
In the widget’s constructor, we begin by setting up various parameters that will be used to control the way images are displayed.

```cpp
ImageWidget::ImageWidget(QWidget *parent)
    : QWidget(parent), position(0), horizontalOffset(0), verticalOffset(0),
      rotationAngle(0), scaleFactor(1), currentStepScaleFactor(1)
{
    setMinimumSize(QSize(100, 100));
}
```

We enable three of the standard gestures for the widget by calling grabGesture() with the types of gesture we need. These will be recognized by the application’s default gesture recognizer, and events will be delivered to our widget.
QWidget does not define a specific event handler for gestures; the widget therefore reimplements the general event() handler to receive gesture events.

```cpp
bool ImageWidget::event(QEvent *event)
{
    if (event->type() == QEvent::Gesture)
        return gestureEvent(static_cast<QGestureEvent*>(event));
    return QWidget::event(event);
}
```

We implement the event handler to delegate gesture events to a private function written specifically for the task, and pass all other events to QWidget’s implementation. The
gestureEvent() function examines the gestures supplied by the newly delivered QGestureEvent. Since only one gesture of a given type can be used on a widget at any particular time, we can check for each gesture type using the gesture() function:

```cpp
bool ImageWidget::gestureEvent(QGestureEvent *event)
{
    qCDebug(lcExample) << "gestureEvent():" << event;
    if (QGesture *swipe = event->gesture(Qt::SwipeGesture))
        swipeTriggered(static_cast<QSwipeGesture *>(swipe));
    else if (QGesture *pan = event->gesture(Qt::PanGesture))
        panTriggered(static_cast<QPanGesture *>(pan));
    if (QGesture *pinch = event->gesture(Qt::PinchGesture))
        pinchTriggered(static_cast<QPinchGesture *>(pinch));
    return true;
}
```

If a
QGesture object is supplied for a certain type of gesture, we call a special-purpose function to deal with it, casting the gesture object to the appropriate QGesture subclass.

To illustrate how a standard gesture can be interpreted by an application, we show the implementation of the
pinchTriggered() function, which handles the pinch gesture that occurs when the user moves two fingers around on the display or input device:

```cpp
void ImageWidget::pinchTriggered(QPinchGesture *gesture)
{
    QPinchGesture::ChangeFlags changeFlags = gesture->changeFlags();
    if (changeFlags & QPinchGesture::RotationAngleChanged) {
        qreal rotationDelta = gesture->rotationAngle() - gesture->lastRotationAngle();
        rotationAngle += rotationDelta;
        qCDebug(lcExample) << "pinchTriggered(): rotate by" << rotationDelta
                           << "->" << rotationAngle;
    }
    if (changeFlags & QPinchGesture::ScaleFactorChanged) {
        currentStepScaleFactor = gesture->totalScaleFactor();
        qCDebug(lcExample) << "pinchTriggered(): zoom by" << gesture->scaleFactor()
                           << "->" << currentStepScaleFactor;
    }
    if (gesture->state() == Qt::GestureFinished) {
        scaleFactor *= currentStepScaleFactor;
        currentStepScaleFactor = 1;
    }
    update();
}
```

The QPinchGesture class provides properties to interpret the changing distance between the two touch points as a zoom factor, and the angle delta as a rotation to be applied to the image. The center point between the touch points could be used to drag the image, but in this example we use the pan gesture for that purpose. The
scaleFactor() property is a relative value representing how much the zoom should change from one event to the next, whereas totalScaleFactor() provides the total amount of zoom since the gesture began. When the touch points are released and another gesture begins, totalScaleFactor() starts again at 1.0. In this case we store totalScaleFactor() in the currentStepScaleFactor variable so that it can be used in paintEvent() to scale the image. Alternatively, it would be possible to simply multiply the stored total scale factor by scaleFactor() here in the pinch handler.
rotationAngle() represents the amount of rotation since the pinch gesture began, while lastRotationAngle() provides the previous value, so we subtract the two to obtain an incremental delta. When the user begins a new pinch gesture, rotationAngle() will start again from zero, and we want the image to continue rotating from its current angle. This is achieved by adding the delta to the stored rotationAngle member (which will be applied in paintEvent()). If we simply assigned totalRotationAngle() to the stored rotationAngle, a new gesture would cause the image to reset to a right-side-up orientation before beginning to rotate again. Alternatively, it would be possible to store the rotation angle since the gesture began and add it to rotationAngle in paintEvent(), just as we store the amount of zoom since the gesture began.

The pan and swipe gestures in this example are also handled in separate functions, and use the values of properties from the QGesture objects passed to them.

```cpp
void ImageWidget::paintEvent(QPaintEvent*)
{
    QPainter p(this);
    const qreal iw = currentImage.width();
    const qreal ih = currentImage.height();
    const qreal wh = height();
    const qreal ww = width();

    p.translate(ww / 2, wh / 2);
    p.translate(horizontalOffset, verticalOffset);
    p.rotate(rotationAngle);
    p.scale(currentStepScaleFactor * scaleFactor, currentStepScaleFactor * scaleFactor);
    p.translate(-iw / 2, -ih / 2);
    p.drawImage(0, 0, currentImage);
}
```

In
paintEvent(), the stored scaleFactor represents the zoom level before the pinch gesture began, while currentStepScaleFactor represents the additional zoom applied while a pinch gesture is in progress. For rotation, only the current rotationAngle is stored. The horizontal and vertical offsets represent the distance that the image has been dragged by the pan gesture.
© 2022 The Qt Company Ltd. Documentation contributions included herein are the copyrights of their respective owners. The documentation provided herein is licensed under the terms of the GNU Free Documentation License version 1.3 as published by the Free Software Foundation. Qt and respective logos are trademarks of The Qt Company Ltd. in Finland and/or other countries worldwide. All other trademarks are property of their respective owners.