I am writing a cross-platform music jukebox app. While playing a music file, I display the current running time and the total running time of the track in this format:
mm:ss [mm:ss]
This is updated every second using the media object's tick signal. I use the totalTime() function of the media object to get the running time of the music track. On Linux this works fine, but on Windows it returns the wrong amount. It isn't specific to one Windows version either, as it happens on the XP development machine as well as the Win7 deployment machine. There doesn't seem to be any particular pattern to how much it is wrong by, but it is always wrong, and always much greater than the actual running time of the track. I did a Google search on this but came up empty. This isn't exactly a show stopper, but it is an annoyance. Sound familiar?
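For reference, the display logic itself is trivial; the tick slot boils down to something like this (the helper name is just for illustration, not my actual code):

```cpp
#include <cassert>
#include <cstdint>
#include <cstdio>
#include <string>

// Format a time in milliseconds (as reported by
// Phonon::MediaObject::currentTime()/totalTime()) as mm:ss.
std::string formatMs(std::int64_t ms)
{
    const std::int64_t totalSeconds = ms / 1000;
    char buf[16];
    std::snprintf(buf, sizeof(buf), "%02lld:%02lld",
                  static_cast<long long>(totalSeconds / 60),
                  static_cast<long long>(totalSeconds % 60));
    return std::string(buf);
}

// In the tick slot, the label text would then be built as
//   formatMs(mediaObject->currentTime()) + " [" + formatMs(mediaObject->totalTime()) + "]"
// so any error in the displayed total comes straight from totalTime().
```

So the wrong total on Windows is coming directly out of totalTime(), not from anything I do with the value afterwards.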