
View Full Version : QTimer is slowing down



Lodhart
9th October 2013, 10:11
First of all, I know that Windows is not a realtime system, so I really do not expect QTimer to be very accurate. But I can see that my QTimer is slowing down from a 200 ms timeout to a 500 ms timeout over about 7 minutes.
I am using debug output with timestamps, so I can confirm that the QTimer is slowing down. I am using it like this:


void timeoutFnc() {
    stopTimer();
    /* do something (it can take up to 1 s) */
    startTimer(200);
}

So I need a 200 ms gap between each run. My idea is that maybe it is not good to stop and start a QTimer so many times, but I need this behaviour.

ChrisW67
9th October 2013, 10:46
Provide a minimal, complete program that demonstrates the problem. This, for example, shows no such issue:


#include <QtCore>

class Thingy: public QObject {
    Q_OBJECT
    QTimer timer;
    QTime stopwatch;

public:
    explicit Thingy(QObject *p = 0): QObject(p) {
        connect(&timer, SIGNAL(timeout()), SLOT(doSomethingLong()));
        timer.setInterval(200);
        timer.start();
        stopwatch.start();
    }

private slots:
    void doSomethingLong() {
        qDebug() << "Enter" << stopwatch.elapsed();
        timer.stop();

        // Simulate a long-running job: block in a local event loop for 1 s
        qDebug() << "  Doing stuff";
        QEventLoop loop;
        QTimer::singleShot(1000, &loop, SLOT(quit()));
        loop.exec();

        timer.start();
        qDebug() << "Leave" << stopwatch.elapsed();
    }
};

int main(int argc, char **argv) {
    QCoreApplication app(argc, argv);
    Thingy t;
    return app.exec();
}
#include "main.moc"

I get a 200 or 201 millisecond interval consistently: a total drift over 7 minutes of 23 milliseconds on Linux, but more like 400 or 500 milliseconds on Windows.

wysota
9th October 2013, 11:05
Do not expect a timer to fire every 200 ms if, in the meantime, you are occupying the program for more than 200 ms without returning to the event loop.

Furthermore, QTimer does not guarantee that it fires every 200 ms (or whatever you set the timeout to); it guarantees that the timespan between two consecutive timeouts will not be less than 200 ms. So if one trigger is late by x ms (for whatever reason), then all subsequent triggers will be late by at least x ms. You need to correct the skew yourself if you want a regular timespan between timeouts.

Lodhart
9th October 2013, 12:13
to wysota: I know; that is why I wrote 'First of all, I know that Windows is not a realtime system'. The problem is that I am stopping the timer and starting it again, and the timeout gets bigger with every run.

The problem is that it does not matter what I do between stopping the timer and starting it again. In my case, for example, I am sending data over a serial (COM) port. But every time I restart the timer, the next timeout is a bit longer than the 200 ms I want; after 7 minutes it is about 400-500 ms. Whatever I do, on Windows it is always the same behaviour. It would be acceptable if the timeout stayed between 200 and 300, maybe even 400 ms, but after a minute of the program running it is about 1000 ms. It keeps growing.

wysota
9th October 2013, 14:24
to wysota: I know; that is why I wrote 'First of all, I know that Windows is not a realtime system'.
A realtime system has nothing to do with this. I am talking purely about QTimer semantics.


The problem is that it does not matter what I do between stopping the timer and starting it again. In my case, for example, I am sending data over a serial (COM) port. But every time I restart the timer, the next timeout is a bit longer than the 200 ms I want; after 7 minutes it is about 400-500 ms. Whatever I do, on Windows it is always the same behaviour. It would be acceptable if the timeout stayed between 200 and 300, maybe even 400 ms, but after a minute of the program running it is about 1000 ms. It keeps growing.
Show us the code that does what you have described here, in the form of a minimal compilable example.