QDataStream << QMultiHash upper bounds?



ucntcme
25th June 2009, 21:43
I've got a QMultiHash<QString,qint32> that I am attempting to save to a file via QDataStream. Before this particular structure is saved, I save a few other QHash and QMultiHash instances.

The problem I am running into is that when I hit this save part in my code, it just sits there for several minutes. There are ~134,000 entries in the QMultiHash. With a smaller data set it worked just fine, and the entire save operation took less than 45 seconds. Now, with the larger data set, it exceeds several minutes (6+) without finishing (I have qDebug() statements wrapping it to track progress). The destination file's size grows very, very slowly.


Is there an upper bound to saving these?

Edit: The smaller version has around 80k entries, and saves almost immediately.
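
For reference, the save path looks roughly like this. This is a simplified sketch; the function name, file handling, and error checks here are just placeholders for how I call it, not my exact code:

#include <QFile>
#include <QDataStream>
#include <QMultiHash>
#include <QString>

// Sketch: write the id map to a file using QDataStream's built-in
// operator<< for hash containers.
bool saveIdMap(const QMultiHash<QString, qint32> &idMap, const QString &fileName)
{
    QFile file(fileName);
    if (!file.open(QIODevice::WriteOnly))
        return false;

    QDataStream outstream(&file);
    outstream << idMap;   // serializes every (key, value) pair in the map
    return outstream.status() == QDataStream::Ok;
}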

wysota
25th June 2009, 23:05
Can we see the code? By the way, adding qDebug() statements can slow down the whole operation several times over.

ucntcme
25th June 2009, 23:40
Yeah, I know qDebug() slows it down, but it is the same across runs. The only difference is the data set size, and it only outputs a qDebug() when it starts and finishes, as in the following:



qDebug() <<"Saving id map";
outstream << idMap;
qDebug() << "Done saving id map";


That debug hit should be the same whether I have 1 or 10 million records in the map, as it is in a save function that is called only once.

I've written some basic code to create QMultiHashes of small, medium, and large sizes to isolate the problem, and managed to narrow it down to a large number of repeated keys in the QMultiHash, not the sheer number of entries. When I create a set of 150,000 entries with a large percentage of repeated keys it bogs down badly, but with a low percentage it is nice and zippy. I've not yet narrowed down the exact profile; I'll need to write some more specific code to determine a better performance profile. The test harness is along these lines (a rough sketch; the entry count and repeat ratio are just example values):

#include <QCoreApplication>
#include <QFile>
#include <QDataStream>
#include <QMultiHash>
#include <QString>
#include <QTime>
#include <QDebug>

// Build a QMultiHash with 'count' entries where roughly 'repeatRatio'
// of the entries reuse an already-used key.
QMultiHash<QString, qint32> buildTestHash(int count, double repeatRatio)
{
    QMultiHash<QString, qint32> hash;
    const int uniqueKeys = qMax(1, int(count * (1.0 - repeatRatio)));
    for (int i = 0; i < count; ++i) {
        // Keys cycle through a limited pool, so a high repeatRatio
        // means many values piling up under the same keys.
        QString key = QString("key%1").arg(i % uniqueKeys);
        hash.insert(key, i);
    }
    return hash;
}

int main(int argc, char *argv[])
{
    QCoreApplication app(argc, argv);

    QFile file("test.dat");
    if (!file.open(QIODevice::WriteOnly))
        return 1;
    QDataStream out(&file);

    // 150k entries, ~90% of them under repeated keys.
    QMultiHash<QString, qint32> hash = buildTestHash(150000, 0.9);

    QTime timer;
    timer.start();
    out << hash;
    qDebug() << "Saved" << hash.size() << "entries in" << timer.elapsed() << "ms";
    return 0;
}

Varying the repeatRatio while holding the entry count fixed is what showed the slowdown tracks key repetition rather than overall size.
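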