How to achieve completely smooth HD video rendering in Qt?



ZeikoRomsan
29th April 2012, 19:21
Ok, here is my problem: I cannot get completely smooth HD video playback in Qt when trying to play a video at 30 FPS.
I don't want to use the Phonon player, so instead I'm rendering each video frame in a QGLWidget by drawing a quad with the frame as a texture.

Here comes the strange part: when I play the video without an FPS lock (created with a high-performance timer for Windows, based on QueryPerformanceCounter), the video runs at around 60 FPS and looks really smooth most of the time.
As soon as I turn on the FPS lock, the video starts to look a little bit choppy. Not much, but it's annoying. If I update the glWidget twice per update, the video looks a bit smoother.
When playing the video with Windows Media Player or VLC it is completely smooth, so it should be possible to achieve.
I've also tried setting different priorities on the threads and processes, and tried turning on vsync, but nothing helped.
It seems like Qt is a bit inconsistent when updating the glWidget. I've also tried QGraphicsView with OpenGL support, but with the same result.
It's also worth mentioning that the video is captured in a separate thread.

If I run the video at 60 FPS but display each image twice, it actually looks smooth, but that's a really ugly hack.

It feels like I've tried everything, but I must have missed something, because VLC, which is built with Qt, manages to be smooth.

Any good ideas?
I can't be the only one who has faced this problem. Help appreciated!

stampede
29th April 2012, 20:06
when I play the video without and an fps lock (...) the video looks really smooth
So the problem must be in the FPS lock. Can you show how you implemented it?

ZeikoRomsan
29th April 2012, 20:31
I've made a test application without Qt, using OpenCV, and the HD video is still not completely smooth.
So maybe it's the FPS lock, or maybe OpenCV's imshow doesn't render exactly when it should either.

Here take a look:

////////////////////////High-performance timer class
#include <windows.h>
#include <iostream>
using std::cout;

void Utilities::TimerHR::StartTimer()
{
    LARGE_INTEGER li;
    if (!QueryPerformanceFrequency(&li)) cout << "QueryPerformanceFrequency failed!\n";

    PCFreq = double(li.QuadPart) / 1000.0;   // counter ticks per millisecond

    QueryPerformanceCounter(&li);
    CounterStart = li.QuadPart;              // remember the starting tick count
}

double Utilities::TimerHR::GetElapsedTime()  // returns elapsed milliseconds
{
    LARGE_INTEGER li;
    QueryPerformanceCounter(&li);
    return double(li.QuadPart - CounterStart) / PCFreq;
}

////////////////////////FPS lock within OpenCV test application (video is choppy here too)
cv::Mat videoFrame;
cv::VideoCapture videoCapture;
videoCapture.open("D:\\noiseWithMovement3.mpeg");
double desiredFrameTime = 1000.0 / 29.0;  // ms per frame; 1000/29 in integer math would truncate to 34
Utilities::TimerHR timerTR;

while (true)
{
    videoCapture >> videoFrame;           // capture frame
    if (videoFrame.empty()) break;        // end of stream

    double elapsedTime = timerTR.GetElapsedTime();
    if (elapsedTime < desiredFrameTime)
    {
        //timeBeginPeriod(1);
        Delay(desiredFrameTime - elapsedTime);  // wait out the rest of the frame period
    }

    timerTR.StartTimer();
    cv::imshow("testar", videoFrame);     // show image
    cv::waitKey(1);                       // let OpenCV process its window events
}

stampede
30th April 2012, 01:21
I don't know about that code; all the "Delay" and "waitKey" stuff looks unfriendly and will probably be gone in your final Qt-based code.
Try to create an implementation which uses a QTimer with the proper interval: on each timeout, get one frame from the video file and display it in the OpenGL widget. Don't use busy waiting or "sleep", just a simple timer with its timeout() signal connected to a grabFrame() slot.
You can see this thread (http://www.qtcentre.org/threads/43056-Qt-OpenCV-simple-example), there is an example of how to use OpenCV with Qt.
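
Something like this (just a rough sketch to show the structure; the class name, the file name, and the grabFrame() slot are made up, and the actual texture upload in paintGL() is left out):

#include <QGLWidget>
#include <QTimer>
#include <opencv2/opencv.hpp>

class VideoWidget : public QGLWidget
{
    Q_OBJECT
public:
    explicit VideoWidget(QWidget *parent = 0) : QGLWidget(parent)
    {
        capture.open("video.mpeg");                 // hypothetical file name
        connect(&timer, SIGNAL(timeout()), this, SLOT(grabFrame()));
        timer.start(1000 / 29);                     // ~34 ms per frame at 29 FPS
    }

private slots:
    void grabFrame()
    {
        capture >> frame;                           // decode the next frame
        if (!frame.empty())
            updateGL();                             // schedule a repaint
    }

private:
    QTimer timer;
    cv::VideoCapture capture;
    cv::Mat frame;
};

Between timeouts the event loop is free, so painting, disk access and input all get serviced without any busy waiting.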

ZeikoRomsan
30th April 2012, 11:25
Sorry, forgot the Delay definition, here it is:

void Delay( double ms )
{
    Utilities::TimerHR t;
    t.StartTimer();
    //int sleep = ms - 10;
    //if(sleep > 0) Sleep(sleep);
    while (t.GetElapsedTime() < ms) {}  // busy-wait until ms milliseconds have passed
}

waitKey is needed for OpenCV to update the GUI window. When I measure it together with the imshow call, they take between 5 and 16 milliseconds. In the Qt application these lines are replaced with:

glWidget.deleteTexture(1);  // drop the previous frame's texture
glWidget.bindTexture(mainVM->VideoFrame(), GL_TEXTURE_2D, GL_RGBA, QGLContext::LinearFilteringBindOption);
glWidget.updateGL();        // repaint with the new frame



/////////////////////////////////////////////////////
void Views::GLWidget::paintGL()
{
    glEnable(GL_TEXTURE_2D);

    glClear(GL_COLOR_BUFFER_BIT);
    glBindTexture(GL_TEXTURE_2D, 1);   // texture id 1 holds the current frame
    glPushMatrix();
    glBegin(GL_QUADS);                 // draw a full-widget quad with the frame texture
    glTexCoord2i(0,0); glVertex2i(0, height());
    glTexCoord2i(0,1); glVertex2i(0, 0);
    glTexCoord2i(1,1); glVertex2i(width(), 0);
    glTexCoord2i(1,0); glVertex2i(width(), height());
    glEnd();
    glPopMatrix();
}

d_stranz
30th April 2012, 20:59
Please use CODE tags when you post source code.

void Delay( double ms )
{
    Utilities::TimerHR t;
    t.StartTimer();
    //int sleep = ms - 10;
    //if(sleep > 0) Sleep(sleep);
    while (t.GetElapsedTime() < ms) {}
}

This isn't a "delay" at all; your CPU is just burning cycles waiting for the loop to exit. You might as well be calculating a million digits of pi. If this is the code you are actually using in your playback loop, it's no wonder your video looks choppy. You aren't letting the CPU do anything except ask millions of times, "Is it time yet? Is it time yet? Is it time yet?" So everything else in your playback app, like disk access, buffering, refreshing the screen, and servicing mouse events, has to happen in whatever time the CPU isn't staring at its watch.

Replace this with a proper QTimer, as was suggested above, and connect its timeout() signal to a slot that fetches and displays the next frame. Then the OS can use the time until the next timeout to do all those things you aren't letting it do in your loop.

ZeikoRomsan
30th April 2012, 23:15
I had the delay loop in another worker thread, but I've also tried using QTimer::singleShot(timeBeforeNextVideoUpdate, this, SLOT(UpdateVideoFrame())); in the GUI thread instead, which did not change anything.

I have one thread capturing and decoding new video frames, one thread applying some effects on the last captured frame (the worker thread), and one GUI thread.
A slot in the GUI thread is called as soon as a new processed frame is ready to be copied from the worker thread. As soon as the GUI thread has copied the frame, the worker thread continues applying effects to another frame.
Meanwhile, the GUI thread checks how long it has been since the last frame update and then uses a QTimer::singleShot which fires after (desiredTime - elapsedTimeFromLastRenderedVideoFrame).
The desired time (ms) is 1000/29 for a video that should run at 29 FPS.

I don't see any errors in this, except perhaps that the time to render the actual frame isn't accounted for, which I don't think is a problem anyway.
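
Roughly, the scheduling looks like this (a simplified sketch, not my exact code; frameTimer is my TimerHR instance from above):

void MainWindow::OnProcessedFrameReady()
{
    const double desiredFrameTime = 1000.0 / 29.0;               // ms per frame
    double elapsed = frameTimer.GetElapsedTime();                // ms since last update
    int remaining = qMax(0, qRound(desiredFrameTime - elapsed)); // clamp at zero
    QTimer::singleShot(remaining, this, SLOT(UpdateVideoFrame()));
}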

d_stranz
1st May 2012, 00:13
If none of the operations you are performing (capturing, decoding, applying effects, displaying, transfers, etc.) is a rate-limiting step (that is, takes more than 34.5 ms), and if you truly are processing in parallel (that is, your thread synchronization isn't actually causing the threads to run in series on each frame), then you need to look elsewhere for the source of the problem.

But if any single step takes more than 34 ms, and the other threads have to wait for that step to finish before they can go, then you have a bottleneck. What about your copying operations? Are you using shared memory, or bitblt operations that let the operating system copy frames most efficiently? Do you have more than one buffer, so you can work on one frame while another is being copied or transformed? I am pretty sure that media players such as VLC have all kinds of optimizations like this built in.
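
Here is roughly what I mean by working on one frame while another is handed over (a hypothetical double-buffer sketch, certainly not VLC's code; all names made up):

#include <QMutex>
#include <opencv2/opencv.hpp>

class FrameExchange
{
public:
    FrameExchange() : hasNewFrame(false) {}

    // Worker thread: publish a finished frame.
    void publish(const cv::Mat &processed)
    {
        QMutexLocker lock(&mutex);
        processed.copyTo(back);    // the one deep copy, done off the GUI thread
        hasNewFrame = true;
    }

    // GUI thread: take the latest frame without a deep copy.
    bool acquire(cv::Mat &out)
    {
        QMutexLocker lock(&mutex);
        if (!hasNewFrame)
            return false;
        cv::swap(front, back);     // O(1) header swap, no pixel copy
        hasNewFrame = false;
        out = front;               // cv::Mat assignment shares the data
        return true;
    }

private:
    QMutex mutex;
    cv::Mat front, back;
    bool hasNewFrame;
};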

ZeikoRomsan
1st May 2012, 14:46
They run in parallel, and when I'm not using any FPS lock the video runs at around 60 FPS.
I'm using memcpy to copy the frame data from the worker thread to the GUI thread.

I get the best result by using the Delay function right before I render the frame with the glWidget.updateGL(); command. When I use QTimer::singleShot it gets worse; I guess that's because it's not as accurate.
The best option I see is to render the QGLWidget in another thread and use the Delay function to update it as accurately in time as possible. Does anyone have a better idea?
Yes, I have one frame buffered in memory to work with while the capture thread captures a new one.

ZeikoRomsan
3rd May 2012, 16:32
The solution on Windows was to use a multimedia timer instead of, for example, QTimer or QBasicTimer.
http://msdn.microsoft.com/en-us/library/windows/desktop/dd742877(v=vs.85).aspx
Now I get basically perfectly smooth video. :)
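
In case anyone else ends up here, the setup looks roughly like this (a simplified sketch, not my exact code; the UpdateVideoFrame slot is from my earlier post, the rest is illustrative). timeSetEvent() fires its callback on a separate OS thread, so the frame update has to be queued back onto the GUI thread. Link against winmm.lib.

#include <windows.h>
#include <mmsystem.h>
#include <QObject>

static void CALLBACK onTimerTick(UINT, UINT, DWORD_PTR dwUser, DWORD_PTR, DWORD_PTR)
{
    // Runs on the multimedia timer thread; queue the update on the GUI thread.
    QObject *receiver = reinterpret_cast<QObject *>(dwUser);
    QMetaObject::invokeMethod(receiver, "UpdateVideoFrame", Qt::QueuedConnection);
}

void startVideoTimer(QObject *videoWindow)
{
    timeBeginPeriod(1);                 // request 1 ms system timer resolution
    MMRESULT timerId = timeSetEvent(
        1000 / 29,                      // period: ~34 ms for 29 FPS
        1,                              // resolution: 1 ms
        onTimerTick,
        reinterpret_cast<DWORD_PTR>(videoWindow),
        TIME_PERIODIC | TIME_CALLBACK_FUNCTION);
    // keep timerId around and call timeKillEvent(timerId) when stopping
    Q_UNUSED(timerId);
}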