Ok, what is the best way to do this? Should I not start a new get() call until one has finished? So instead, call out to get() for the next file in the requestFinished() slot? Rather than calling them all at once?
I reimplemented the program so that there is only one QFile open at a time and I still get the same problem, although the program uses a lot less memory, which is good.
I output whenever a request is started, when the header is received and when the request is finished. The program hangs after the request is started and never receives a header. I do not believe this is a problem with the QFiles, although that change was good.
I don't really know what to do... if this is unavoidable, is there some way to make it time out when it never receives a header? Shouldn't it do this automatically?
Thank you again.
No, it won't timeout by itself. Does the download stop after the same number of files each time? What system are you running your application under?
No, it stops after a different number of files each time. I am running it on Windows Vista.
Please try running the same application on some other OS (MacOS or Linux).
I ran it on openSUSE Linux 10.3, only to run into the same problem.
You can try using my downloader available here: http://www.qtcentre.org/forum/f-qt-p...-qt4-5695.html (at the end of the thread). See if it works for you.
Ok, I checked out your httpdownloader and looked at the design. I then changed the main.cpp to download the files from the ul server. Your program also stalled.
I can see why: your program allows 5 concurrent downloads, so once it hit 5 requests for which it couldn't get a response header, it simply stopped and waited for files that were never coming.
So does this mean it's a problem with the server just not being able to provide the files?
Last edited by last2kn0; 14th October 2007 at 20:03.
It could be that the server blocks traffic when it sees a big burst of requests. You can try downloading more slowly, delaying subsequent GETs a little. Using a technique similar to the one used by the public downloader, I managed to successfully download over a thousand images from a single server. If you want the code, PM me and I'll provide it to you. I don't want it to spread publicly as it's not polished yet.