QHttp::get seems to hang on request



last2kn0
12th October 2007, 01:42
I have a class that uses QHttp and its get method to download multiple files. It does this fairly well, although once in a while it will just hang after it's done a few hundred files. I believe it hangs after sending the request. Any idea why this would be happening?

Thank you in advance!

wysota
12th October 2007, 12:19
Hard to say without seeing any code. Are you sure this is caused by QHttp and not by your system, some firewall or the remote host? I've been using QHttp to download many files and I never experienced such behaviour.

last2kn0
12th October 2007, 13:20
I'll test it out with my firewall and such turned off and see if it changes anything.
I know the host can handle it because I performed the same downloads with wget with no problem.

If it still doesn't work, I'll post how my code looks.

last2kn0
12th October 2007, 21:51
I tried getting rid of the firewall and it didn't fix anything. Ok so basically I created an Http class with an interface like this:


class Http : public QObject {
    Q_OBJECT

public:
    Http(int min, int max, QObject* parent = 0);
    void downloadFiles();

signals:
    void done();

private slots:
    void requestFinished(int id, bool error);
    void done(bool error);
    void responseHeaderReceived(const QHttpResponseHeader& resp);

private:
    QHttp* http;
    QQueue<QFile*> files;
    QQueue<QFile*> started;
};

The constructor creates QFile*'s for files min to max and places them in the files queue. When downloadFiles() is called, http->get() is called for each file and the QFile* is moved to the started queue.

When the requestFinished signal is emitted, the requestFinished slot checks the last response header for errors. If there was an error, it deletes the file; otherwise it closes the file and dequeues it from the started queue.

When QHttp::done(bool) is emitted, the connected done(bool) slot emits the class's own done() signal.

I added a requestStarted() connection and it shows that the request for the file is started, and then nothing happens.

If you need more info or some snippets from the implementation file, please let me know.

wysota
12th October 2007, 23:00
My long shot is that you are opening too many files at once - processes have limits on the number of active file descriptors. Open each file only when it's actually needed.
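
For example, something along these lines (just a sketch of what I mean; "pending" is a hypothetical QMap<int,QFile*> member you would fill when queueing the request, e.g. pending.insert(http->get(path, file), file); error handling omitted):

void Http::requestStarted(int id){
    // open the file only when its request actually begins,
    // so only the active download holds a descriptor
    if(pending.contains(id))
        pending[id]->open(QIODevice::WriteOnly);
}

void Http::requestFinished(int id, bool error){
    QFile* file = pending.take(id);
    if(!file)
        return; // ids from setHost() and friends also arrive here
    file->close(); // release the descriptor right away
    delete file;
}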

last2kn0
12th October 2007, 23:11
I open each file right before I call its respective QHttp::get, and close and delete the pointer when it's finished.


void Http::downloadFiles(){
    while(files.size()){
        files.front()->open(QIODevice::WriteOnly);
        QFileInfo info(*files.front());

        http->get("/stylepages/"+info.fileName(), files.front());
        started.enqueue(files.dequeue());
    }
}


void Http::requestFinished(int id, bool error){
    if(http->lastResponse().statusCode() != 200 || error){
        std::cout << id << " encountered an error:\n ";
        std::cout << http->lastResponse().statusCode() << " " <<
            http->lastResponse().reasonPhrase().toStdString() << "\n";
        started.front()->remove();
        delete started.dequeue();
    }
    else{
        started.front()->close();
        delete started.dequeue();
        std::cout << "File closed. " << id << std::endl;
    }
}

wysota
12th October 2007, 23:51
But get() calls only cause the request to be queued and not executed immediately. Thus if you do:

for(int i = 0; i < 1000; ++i)
    http->get(QString("/file%1").arg(i+1), new QFile(QString("file%1").arg(i+1)));

You end up with 1000 files open by the time file1 gets downloaded.

If you could show us the exact code, we'd be able to say more - as I said, my opinion is a long shot, but it seems probable given that you open the files in advance.

last2kn0
13th October 2007, 01:19
The exact code is posted right above your post. If you look, I was opening the file at the front of the queue right before I called get(). I removed that line and it was able to download a lot more files, but it still stopped.

Here is my implementation file:

Http::Http(int min, int max, QObject* parent) : QObject(parent){
    //Setup http
    http = new QHttp("iq.ul.com");
    connect(http, SIGNAL(requestFinished(int,bool)), this,
            SLOT(requestFinished(int,bool)));
    connect(http, SIGNAL(done(bool)), this, SLOT(done(bool)));
    connect(http, SIGNAL(responseHeaderReceived(const QHttpResponseHeader&)),
            this, SLOT(responseHeaderReceived(const QHttpResponseHeader&)));
    connect(http, SIGNAL(requestStarted(int)), this, SLOT(requestStarted(int)));

    //Create QFile* and place in download queue, files
    for(int i = min; i < max; ++i){
        QString newPath = "C:/Users/Desktop/UL/Update/Http/files/";
        QFile* temp = new QFile(newPath + QString::number(i)
                                + "-AWM-Fixed.rtf");
        std::cout << "adding " << i << std::endl;
        files.enqueue(temp);
    }
}

void Http::downloadFiles(){
    while(files.size()){
        QFileInfo info(*files.front());

        http->get("/stylepages/"+info.fileName(), files.front());
        started.enqueue(files.dequeue());
    }
}

void Http::requestFinished(int id, bool error){
    if(http->lastResponse().statusCode() != 200 || error){
        std::cout << id << " encountered an error:\n ";
        std::cout << http->lastResponse().statusCode() << " " <<
            http->lastResponse().reasonPhrase().toStdString() << "\n";
        started.front()->remove();
        delete started.dequeue();
    }
    else{
        started.front()->close();
        delete started.dequeue();
        std::cout << "File closed. " << id << std::endl;
    }
}

void Http::responseHeaderReceived(const QHttpResponseHeader& resp){
    std::cout << resp.statusCode() << " " << resp.reasonPhrase().toStdString() << std::endl;
}

void Http::requestStarted(int id){
    std::cout << id << " started\n";
}

void Http::done(bool error){
    if(started.size() == 0){
        emit done();
        std::cout << "Files done\n";
        return;
    }
    if(error){
        std::cout << "Error in done()\n";
    }
}


How should I download these files if I can't do it like this??

last2kn0
13th October 2007, 04:25
Ok, what is the best way to do this? Should I not start a new get() call until the previous one has finished - that is, call get() for the next file from the requestFinished() slot rather than queueing them all at once?
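
Something like this is what I mean (just a sketch; startNext() would be a new private slot, and the error handling stays as before):

void Http::downloadFiles(){
    startNext(); // kick off only the first request
}

void Http::startNext(){
    if(files.isEmpty()){
        emit done();
        return;
    }
    QFile* file = files.dequeue();
    started.enqueue(file);
    QFileInfo info(*file);
    http->get("/stylepages/" + info.fileName(), file);
}

void Http::requestFinished(int id, bool error){
    // ...same status-code check as before...
    startNext(); // issue the next GET only after this one finished
}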

last2kn0
13th October 2007, 05:28
I reimplemented the program so that there is only one QFile open at a time and I still get the same problem, although the program uses a lot less memory, which is good.

I output whenever a request is started, when the header is received and when the request is finished. The program hangs after the request is started and never receives a header. I do not believe this is a problem with the QFiles, although that change was good.

I don't really know what to do... if this is unavoidable, is there some way to make it time out if it doesn't get a header? Shouldn't it do this automatically?

Thank you again.

wysota
13th October 2007, 10:46
No, it won't time out by itself; you would have to add that yourself. Does the download stop after the same number of files each time? What system are you running your application under?
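
For the timeout, a watchdog timer is one way to do it (a rough sketch; "watchdog" would be a QTimer* member, the 30-second interval is arbitrary, and note that abort() kills the whole connection including any queued requests):

watchdog = new QTimer(this);
watchdog->setSingleShot(true);
watchdog->setInterval(30 * 1000); // give up after 30 s without progress
connect(watchdog, SIGNAL(timeout()), http, SLOT(abort()));
// restart the watchdog whenever the connection shows signs of life
connect(http, SIGNAL(requestStarted(int)), watchdog, SLOT(start()));
connect(http, SIGNAL(dataReadProgress(int,int)), watchdog, SLOT(start()));
connect(http, SIGNAL(requestFinished(int,bool)), watchdog, SLOT(stop()));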

last2kn0
13th October 2007, 18:35
No, it stops after a different number each time. I'm running it on Windows Vista.

wysota
13th October 2007, 19:39
Please try running the same application on some other OS (MacOS or Linux).

last2kn0
14th October 2007, 00:07
I ran it on OpenSuse Linux 10.3 only to run into the same problem. :(

wysota
14th October 2007, 10:47
You can try using my downloader available here: http://www.qtcentre.org/forum/f-qt-programming-2/t-qhttp-in-thread-qt4-5695.html (at the end of the thread). See if it works for you.

last2kn0
14th October 2007, 19:50
Ok, I checked out your httpdownloader and looked at the design. I then changed the main.cpp to download the files from the ul server. Your program also stalled.

I can see why. Your program runs 5 concurrent downloads, so after it hit 5 files for which it couldn't get a response header, it basically stopped and waited for responses that were never coming.

So does this mean it's a problem with the server just not being able to provide the files?

wysota
14th October 2007, 23:00
It could be that the server blocks traffic if it sees a big burst of requests. You can try downloading slower - delaying subsequent GETs a little. Using a technique similar to the one used by the public downloader, I managed to successfully download over a thousand images from a single server. If you want the code, PM me and I'll provide it to you. I don't want it to spread to the public as it's not polished yet.
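
Something along these lines, building on the one-request-at-a-time idea from earlier in the thread (just a sketch; the 500 ms delay is arbitrary, and startNext() would have to be declared as a slot for the connection to work):

void Http::requestFinished(int id, bool error){
    // ...handle the finished file as before...
    // pause briefly instead of firing the next GET immediately
    QTimer::singleShot(500, this, SLOT(startNext()));
}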