I'll test it out with my firewall and such turned off and see if it changes anything.
I know the host can handle it because I performed the same downloads with wget with no problem.
If it still doesn't work, I'll post how my code looks.
I tried getting rid of the firewall and it didn't fix anything. Ok so basically I created an Http class with an interface like this:
Qt Code:
public:
    void downloadFiles();

signals:
    void done();

private slots:
    void requestFinished(int id, bool error);
    void done(bool error);
    void responseHeaderReceived(const QHttpResponseHeader& resp);

private:
    QHttp* http;
    QQueue<QFile*> files;
    QQueue<QFile*> started;
The constructor basically creates QFile*'s for files min to max. Then these are placed in the files queue. When downloadFiles() is called, the http->get() is called on each of the files, and the QFile*'s are placed in the started queue.
When the requestFinished signal is emitted the requestFinished function is called and checks the last response header for errors. If there was an error, it deletes the file, else it closes the file and dequeues it from the started queue.
When http's done(bool) is emitted, the connected done(bool) slot emits the class's own done() signal.
[EDIT:] I added a request started connection and it shows that the file request is started and then it does nothing. [/EDIT]
If you need more info or some snippets from the implementation file, please let me know.
Last edited by last2kn0; 12th October 2007 at 22:16.
My long shot is that you are opening too many files at once. Processes have limits for active file descriptors. You can open a file only when it's needed.
I open each file right before I call its respective QHttp::get, and close and delete the pointer when it's finished.
Qt Code:
void Http::downloadFiles(){
    while(files.size()){
        http->get("/stylepages/"+info.fileName(), files.front());
        started.enqueue(files.dequeue());
    }
}

void Http::requestFinished(int id, bool error){
    if(http->lastResponse().statusCode() != 200 || error){
        std::cout << id << " encountered an error:\n ";
        std::cout << http->lastResponse().statusCode() << " "
                  << http->lastResponse().reasonPhrase().toStdString() << "\n";
        started.front()->remove();
        delete started.dequeue();
    }
    else{
        started.front()->close();
        delete started.dequeue();
        std::cout << "File closed. " << id << std::endl;
    }
}
But get() calls only cause the request to be queued, not executed immediately. Thus if you do:
Qt Code:
for(int i=0;i<1000;i++)
you end up with 1000 files open by the time file1 gets downloaded.
If you could show us the exact code, we'd be able to say more. As I said, my opinion is a long shot, but it seems probable that you open files in advance.
The exact code is posted right above your post. If you look, I was opening the file in the front of the queue right before I called get(). I removed that line and it was able to download a lot more files but still stopped.
Here is my implementation file:
Qt Code:
//Setup http
connect(http,SIGNAL(requestFinished(int,bool)),this,SLOT(requestFinished(int,bool)));
connect(http,SIGNAL(done(bool)),this,SLOT(done(bool)));
connect(http,SIGNAL(responseHeaderReceived(const QHttpResponseHeader&)),
        this,SLOT(responseHeaderReceived(const QHttpResponseHeader&)));
connect(http,SIGNAL(requestStarted(int)),this,SLOT(requestStarted(int)));

//Create QFile* and place in download queue, files
for(int i = min; i < max; ++i){
    + "-AWM-Fixed.rtf");
    std::cout << "adding " << i << std::endl;
    files.enqueue(temp);
}
}

void Http::downloadFiles(){
    while(files.size()){
        http->get("/stylepages/"+info.fileName(), files.front());
        started.enqueue(files.dequeue());
    }
}

void Http::requestFinished(int id, bool error){
    if(http->lastResponse().statusCode() != 200 || error){
        std::cout << id << " encountered an error:\n ";
        std::cout << http->lastResponse().statusCode() << " "
                  << http->lastResponse().reasonPhrase().toStdString() << "\n";
        started.front()->remove();
        delete started.dequeue();
    }
    else{
        started.front()->close();
        delete started.dequeue();
        std::cout << "File closed. " << id << std::endl;
    }
}

void Http::responseHeaderReceived(const QHttpResponseHeader& resp){
    std::cout << resp.statusCode() << " "
              << resp.reasonPhrase().toStdString() << std::endl;
}

void Http::requestStarted(int id){
    std::cout << id << " started\n";
}

void Http::done(bool error){
    if (started.size() == 0){
        emit done();
        std::cout << "Files done\n";
        return;
    }
    if (error){
        std::cout << "Error in done()\n";
    }
}
How should I download these files if I can't do it like this?
Last edited by last2kn0; 13th October 2007 at 01:26.
Ok, what is the best way to do this? Should I avoid starting a new get() call until the previous one has finished, i.e. instead of queuing them all at once, call get() for the next file from inside the requestFinished() slot?
I reimplemented the program so that there is only one QFile open at a time and I still get the same problem, although the program uses a lot less memory, which is good.
I output whenever a request is started, when the header is received and when the request is finished. The program hangs after the request is started and never receives a header. I do not believe this is a problem with the QFiles, although that change was good.
I don't really know what to do... if this is unavoidable, is there some way to make it time out if it never receives a header? Shouldn't it do this automatically?
Thank you again.
No, it won't timeout by itself. Does the download stop after the same number of files each time? What system are you running your application under?
No, it stops after a different number each time. I am running it on Windows Vista.
Please try running the same application on some other OS (MacOS or Linux).
I ran it on openSUSE Linux 10.3, only to run into the same problem.