
View Full Version : Multiplatform Video Viewer Application Design Options



PhilippB
14th August 2008, 16:56
Hi!

I'm doing some research for a planned video viewer application. In fact, I already have a
proof-of-concept application using Qt 4 for the GUI and the Linux XVideo extension for rendering.
This design choice was based on the fact that Linux was the only target platform. Now the
requirements have changed: the application is intended to run on both Linux and Mac OS X.

Like other video viewers, the application should receive an endless series of (uncompressed)
images and display each of them for a short amount of time.

My problem is that I'm not too familiar with the performance limitations of the X11 server
design and the resulting details of the Qt paint model. As far as I know, the most relevant
bottleneck is the transfer of image data to the X server. I imagine that, once the image
data is transferred, it can be stored as some kind of resource inside the X server, which allows
fast (recurrent) use. But that is not what I need, as every image is shown only once...

The only options I know are the XVideo extension (which is probably not available on OS X) and
SDL. SDL should run on OS X, but has a "one window/overlay per process" limitation. This is a
problem, because the viewer should display multiple videos in separate (MDI) windows.

Is there any other solution using built-in Qt features? I know that OpenGL can be utilized for
painting, but, IIRC, this doesn't speed up the image data transfer by itself.
For example, I've found QGLFramebufferObject in the Qt docs. The name sounds like it could be
the thing I'm looking for...

Thanks for any answers. General information (links) about Qt/OS X painting performance is also
greatly appreciated.

wysota
14th August 2008, 17:42
OpenGL does speed things up, although it is usually better to utilize platform-dependent features, as most players do (for instance the XVideo extension under X11 or DirectShow under Windows). Are you going to run movies or a series of stills? Because if the former, maybe you can use Phonon?

PhilippB
18th August 2008, 09:38
Phonon may be a really nice solution, indeed!

>Are you going to run movies or a series of stills?

Hmm, hard to tell... The application will be a "demo viewer" for cameras. Basically, you can think of,
e.g., USB webcams with uncompressed YUV/RGB output. I would say it's rather a series of stills that I
want to display. Normally the user will watch live video from the cameras, but I also need features like
"single shot", where individual images are acquired and shown.

I read some information about Phonon, and it seems that the biggest problems will be:
- Lack of documentation: it seems that only the most common use cases are documented so far.
- Overhead: Phonon does a lot of things I don't need.

I suppose I need to subclass AbstractMediaStream for my purpose. But looking at the
documentation (e.g. here (http://www.englishbreakfastnetwork.org/apidocs/apidox-kde-4.0/kdelibs-apidocs/phonon/html/classPhonon_1_1AbstractMediaStream-members.html)), I don't see, for example, where the actual image data format is determined.

The next question would be: once I understand the Phonon model well enough to use it for my purpose,
will it let me control all the details I need to control? The most important issue is latency, I guess. A delay
of 1-2 (maybe 3) images due to buffering is acceptable. But thinking of a fat backend optimized for
smooth movie playback, this might be an issue.