
Thread: QDataStream << QMultiHash upper bounds?

  1. #1

    QDataStream << QMultiHash upper bounds?

    I've got a QMultiHash<QString,qint32> that I am attempting to save to a file via QDataStream. Before this particular structure is saved, I save a few other QHash and QMultiHash instances.

    The problem I am running into is that when I hit this save part in my code, it just sits there for several minutes. There are ~134,000 entries in the QMultiHash. With a smaller data set it worked just fine, and the entire save operation took less than 45 seconds. Now, with the larger data set, it runs for several minutes (6+) without finishing (I have qDebug() statements wrapping it to track progress). The destination file grows very, very, very slowly.


    Is there an upper bound to saving these?

    Edit: The smaller version has around 80k entries, and saves almost immediately.
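    For context, the save path is essentially the following; saveIdMap and the file path are placeholder names for this post, not my actual code:

    ```cpp
    #include <QFile>
    #include <QDataStream>
    #include <QMultiHash>
    #include <QString>

    // Sketch of the save path described above. Qt serializes the container
    // size followed by each key/value pair through operator<<.
    bool saveIdMap(const QMultiHash<QString, qint32> &idMap, const QString &path)
    {
        QFile file(path);
        if (!file.open(QIODevice::WriteOnly))
            return false;

        QDataStream out(&file);
        out.setVersion(QDataStream::Qt_4_5); // pin the format so reads match writes
        out << idMap;                        // blocks here for the large data set
        return out.status() == QDataStream::Ok;
    }
    ```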
    Last edited by ucntcme; 25th June 2009 at 20:55. Reason: added info
    --
    The Real Bill

  2. #2

    Re: QDataStream << QMultiHash upper bounds?

    Can we see the code? By the way, adding qDebug() statements slows the whole operation down several times over.
    Your biological and technological distinctiveness will be added to our own. Resistance is futile.

    Please ask Qt related questions on the forum and not using private messages or visitor messages.


  3. #3

    Re: QDataStream << QMultiHash upper bounds?

    Yeah, I know qDebug() slows it down, but that is the same across runs; the only difference was the data set size. And it only outputs a qDebug() when it starts and finishes, as in the following:

    Qt Code:

        qDebug() << "Saving id map";
        outstream << idMap;
        qDebug() << "Done saving id map";

    That debug hit should be the same whether I have 1 or 10 million records in the map, as it is in a save function that is called only once.

    I've written some basic code that creates QMultiHashes of small, medium, and large sizes to isolate the problem, and narrowed it down to the large number of repeated keys in the QMultiHash, not the sheer number of entries. When I create a set of 150,000 entries with a high percentage of repeated keys it bogs down badly, but with a low percentage it is nice and zippy. I've not yet narrowed down the exact profile; I'll need to write some more specific code to build a better performance picture.
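    The isolation test can be sketched roughly as below; all names are illustrative, not my actual test code. It builds a QMultiHash with a controllable number of distinct keys (so `entries / distinctKeys` is the average repetition per key) and times only the stream write:

    ```cpp
    #include <QBuffer>
    #include <QDataStream>
    #include <QMultiHash>
    #include <QString>
    #include <QTime>

    // Build a QMultiHash with `entries` values spread over `distinctKeys`
    // keys, then time how long the QDataStream write alone takes (ms).
    qint64 timeSerialize(int entries, int distinctKeys)
    {
        QMultiHash<QString, qint32> hash;
        for (int i = 0; i < entries; ++i)
            hash.insert(QString::number(i % distinctKeys), i); // insertMulti under the hood in Qt 4

        QBuffer buffer;
        buffer.open(QIODevice::WriteOnly);
        QDataStream out(&buffer);

        QTime timer;
        timer.start();
        out << hash;                 // serialize to memory, no disk I/O in the measurement
        return timer.elapsed();
    }
    ```

    Comparing something like timeSerialize(150000, 150000) against timeSerialize(150000, 100) shows whether repetition, rather than total size, drives the slowdown.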
    --
    The Real Bill
