
QTimer / QBasicTimer: bad precision?



tonnot
7th October 2010, 20:59
I have measured the QTimer / QBasicTimer precision with different intervals.
I use the simplest code:

basictimer.start(50, this);

void AnalogClock::timerEvent(QTimerEvent *event)
{
    qDebug() << QTime::currentTime().second() << "." << QTime::currentTime().msec();
}


For a 50 msec interval I get:
0 . 0
0 . 62
0 . 125
0 . 187
0 . 250
0 . 312
0 . 375
0 . 437
0 . 500
0 . 562

For a 200 msec interval I get:

30 . 78
30 . 281
30 . 484
30 . 687
30 . 890
31 . 93
31 . 296
31 . 500
31 . 703
31 . 906
32 . 109

I suppose my computer is 'normal'.
How can I improve the precision?

wysota
7th October 2010, 21:44
I suppose my computer is 'normal'.
How can I improve the precision?
Get a real-time operating system.

Timers don't guarantee the exact interval specified. They only guarantee that you will be notified no earlier than the interval you asked for. The operating system will get you as close to that interval as it can, but on a "normal" OS the delay will practically never be exactly the interval you want. The smaller the interval you choose, the more accurate the timers Qt will use, but it all depends on what the operating system has to offer. Currently the devs are working on a new class of timers (I don't know if they made it into 4.7) which allows higher inaccuracy of a single timer in favour of better performance of the whole application: timers are synchronized to fire at the same moment, so your application is woken up less often, reducing power and CPU consumption.

SixDegrees
7th October 2010, 22:42
Note, also, that you are making two separate calls to currentTime() in your output statement. This will produce wrong output whenever the millisecond value "wraps around" between the two calls. You should make a single call, store the result, then format it before printing.