nikomaster
22nd November 2011, 07:51
Hi
I am building an application which requires a certain number of threads to work. The main idea is to retrieve data from several network sites at the same time using several worker threads.
Here is the main idea of the class definition
class workingThread : public QThread
{
    Q_OBJECT
    ThreadParams params;
protected:
    void run();
public:
    void SetParams(int param1, int param2 /* etc. */);
signals:
    void jobdone(int threadIndex);
};
The idea is to have many instances of the same thread class, with each thread assigned a different task; in this case, each task corresponds to getting data from a web site:
workingThread threads[10];
The slot receiving the jobdone signal reassigns a task to the thread, in this case a new site to retrieve data from, so if the user requests data from 100 different sites the work is distributed across the threads. Up to this point everything works fine.
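A simplified sketch of that dispatcher slot (the Dispatcher class, siteQueue, and makeParams are illustrative names, not the actual code; jobdone is emitted from a worker thread, so the connection to this slot should be queued, which is the default for cross-thread connections):

```cpp
// Hypothetical dispatcher living in the main (GUI) thread.
// It hands the next pending site to whichever thread just finished.
void Dispatcher::onJobDone(int threadIndex)
{
    if (siteQueue.isEmpty())
        return;                              // nothing left to assign

    QString nextSite = siteQueue.dequeue();
    threads[threadIndex].SetParams(makeParams(nextSite));
    // start() only works if run() has already returned; if jobdone is
    // emitted from inside run() before run() exits, start() is a no-op
    // and the thread silently never picks up the new task.
    threads[threadIndex].start();
}
```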
The problem comes when all threads reach the bottleneck function (curl_easy_perform) at the same time, which is where they retrieve the data. Some of these threads are terminated while others skip code. In both cases the thread never sends the jobdone signal, nor the "finished" signal. I do not know whether there is a problem with network connections and threads, especially with the use of cURL functions from multiple threads. This happens equally on both Windows and Linux.
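For what it's worth, the libcurl documentation imposes two requirements on multithreaded use that match these symptoms: curl_global_init() is not thread-safe and must be called once before any thread starts, and each easy handle should set CURLOPT_NOSIGNAL, because libcurl otherwise uses signals for DNS timeouts, which can abruptly kill threads. A minimal sketch of a run() that respects both (the params fields are illustrative names):

```cpp
// Assumes curl_global_init(CURL_GLOBAL_ALL) was called once in main()
// before any workingThread was started -- it is not thread-safe.
void workingThread::run()
{
    CURL *curl = curl_easy_init();           // one easy handle per thread
    if (!curl)
        return;

    curl_easy_setopt(curl, CURLOPT_URL, params.url.constData());
    // Required in threaded programs: disable signal-based DNS timeout
    // handling, which is not safe when several threads resolve at once.
    curl_easy_setopt(curl, CURLOPT_NOSIGNAL, 1L);

    CURLcode res = curl_easy_perform(curl);  // the bottleneck call
    curl_easy_cleanup(curl);

    emit jobdone(params.threadIndex);        // always report completion
}
```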