Using touch screen affects both Qt application and desktop



msheron
20th April 2017, 16:00
Hello,

I am having a problem with my X11 Qt5 application running on a Raspberry Pi 3 (OS: Jessie) connected to the official Raspberry Pi 7-inch touch screen display.

The application works exactly as desired when using a mouse, but when the user uses the touch screen, the event is ALSO picked up and handled by the desktop. In other words, if I press on a location where I know a desktop icon to be (while it is covered by my full-screen application), the system will think I am trying to open that application, or move it, etc.

I found this thread: http://www.qtcentre.org/threads/66214-Raspberry-Pi-QT5-mouse-focus-on-application-AND-in-background-operating-system
that describes an identical problem, but unfortunately there was no resolution.

I'm spinning my wheels trying to figure this one out. Does anyone have any suggestions? I will of course provide any necessary information, I'm just not sure what exactly is relevant at this point.

sedi
21st April 2017, 21:33
But you do accept those events, don't you?

d_stranz
22nd April 2017, 18:19
But you do accept those events, don't you?

In other words, in your event handler you call QEvent::accept() through the pointer to the event. If not, the event will be sent up the line for further handling.
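
Roughly like this -- just a sketch, with a hypothetical widget class:

    // Hypothetical widget subclass: handle the mouse press ourselves and
    // accept the event so it is not propagated to the parent widget.
    #include <QWidget>
    #include <QMouseEvent>

    class TouchPanel : public QWidget
    {
    protected:
        void mousePressEvent(QMouseEvent *event) override
        {
            // ... react to the press here ...
            event->accept();   // mark the event as handled by this widget
        }
    };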

msheron
24th April 2017, 14:34
Hello, Thanks for the replies.

I fear I might have explained my issue poorly. My application uses widgets that implement mouse event handlers, and in those handlers I do accept the events. I read somewhere that for most use cases this should be sufficient for implementing touch features (just basic 'clicking'/'pressing' is all I require), and until now that had held true. I am developing the application directly on a Raspberry Pi 3 connected to a USB touch screen monitor. When the application is running on the USB monitor, everything works exactly as intended: mouse clicks are mouse clicks, and touches are mouse clicks that do not interact with anything but the application window.

The problem only appears when running the application on the 7-inch touch screen. Mouse clicks stop at the application window, but pressing with a finger goes through the application to the desktop -- anywhere, whether on a widget or just the application window itself. Actually, I made a new project with a blank MainWindow and was able to recreate the behavior, so I'm thinking it's not really my code, but rather the screen or X11 erroneously sending events to the desktop as well.
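
For reference, the blank test project boiled down to something like this (a sketch, not the exact generated code):

    // Blank full-screen main window used to reproduce the behaviour.
    #include <QApplication>
    #include <QMainWindow>

    int main(int argc, char *argv[])
    {
        QApplication app(argc, argv);
        QMainWindow window;
        window.showFullScreen();
        return app.exec();   // touches still reach the desktop underneath
    }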

And if that is actually the case, would anyone know how to test for / fix that?

msheron
28th April 2017, 20:46
Hello again,

I resolved this problem well enough for my purposes. Just wanted to follow up and leave a record of it in case anyone else has to deal with this.

I was never able to figure out the root cause, so a workaround was required instead. If the desktop receiving events from "underneath" the application was the problem, then removing the desktop should be the solution. And that's exactly what I did: using raspi-config, I set the Raspberry Pi to boot into a console (tty1). After that, I added my application to start on '@reboot' with crontab, as in the example below. If the desktop is ever required for any reason, the application can be closed with 'ctrl+alt+backspace', followed by a 'startx' command at the terminal to launch the desktop.
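
The crontab entry is just an '@reboot' line; the path below is only a placeholder for the actual launcher:

    # added with 'crontab -e' -- the path is an example, adjust for your setup
    @reboot /home/pi/myapp/start_app.sh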

So while this is not a fix for the real problem, it certainly turns what was a show-stopper into a minor inconvenience at worst.

Thanks again for your interest.

sedi
22nd August 2017, 23:57
Hm - this really sounds like a bug to me. Would you mind raising this issue on the interest@qt-project.org mailing list and/or creating a bug report on https://bugreports.qt.io/?

[Edited:] Oops. Didn't see how old this post was. Nevermind. ;)

TFx
13th January 2018, 22:18
Hi,

I'm digging up this post because I'm facing the same issue, still present with Qt 5.10.
While working in the CLI "solves" the mouse issue, keyboard events are also forwarded, and the CLI receives them.

I've built Qt directly on the Raspberry Pi, in particular with udev, libinput and xkbcommon support.
This seems to be specific to the Raspberry Pi, because the same application does not forward the mouse and keyboard events on a CentOS machine...

Any ideas? It is very annoying...


The simple program I've created to highlight the issue is:

import QtQuick 2.10
import QtQuick.Window 2.10
import QtQuick.Controls 2.2


Window {
    visible: true
    title: qsTr("Hello World")
    visibility: "FullScreen"

    width: 480
    height: 800
    maximumHeight: height
    maximumWidth: width

    // Background
    Rectangle {
        id: mainBackground
        anchors.fill: parent
        width: parent.width
        height: parent.height
        focus: true

        gradient: Gradient {
            GradientStop {
                position: 0
                color: "#808080"
            }
            GradientStop {
                position: 1
                color: "#666666"
            }
        }

        MouseArea {
            anchors.fill: parent
            onClicked: { console.log("mouse click trapped"); mouse.accepted = true }
            onPressed: { console.log("mouse press trapped"); mouse.accepted = true }
        }

        Keys.onPressed: {
            console.log("key pressed trapped")
            event.accepted = true
        }
        Keys.onReleased: {
            console.log("key released trapped")
            event.accepted = true
        }

        Button {
            width: 100
            height: 30
            text: "quit"
            anchors.verticalCenter: parent.verticalCenter
            anchors.horizontalCenter: parent.horizontalCenter
            onClicked: {
                Qt.quit()
            }
        }
    }
}