
View Full Version : how to optimize QSqlQuery?



karankumar1609
9th April 2014, 11:40
Hello Everyone,

I am working with QSqlQuery to fetch data from database tables, and I find it quite slow.

I am fetching around 15,539,502 rows from a table. The query itself takes 40-45 seconds, which is fine. But traversing that much data takes around 7-10 minutes, which is quite a long time.



QString query("SELECT * FROM DATA_TABLE_1 WHERE TICK_DATE BETWEEN :date_from AND :date_to ORDER BY TICK_DATE ASC, TICK_TIME ASC");
QSqlQuery sql_query(m_ServerDB);
sql_query.setForwardOnly(true); // must be set before exec()
sql_query.prepare(query);
sql_query.bindValue(":date_from", date_from); // bind the range values
sql_query.bindValue(":date_to", date_to);
bool ok = sql_query.exec();

if(ok && sql_query.size() > 0) {
    // Do calculations
}


Is there any way to do the same with multiple threads to save time? Or any other idea to solve my problem?

wysota
9th April 2014, 11:48
First of all, you are executing the query twice: first in line #2 and then in line #4.

karankumar1609
9th April 2014, 11:59
m_ServerDB is an object of QSqlDatabase, so how can it get executed without even having a query? Please correct me if I am wrong.

wysota
9th April 2014, 12:27
m_ServerDB is an object of QSqlDatabase, so how can it get executed without even having a query? Please correct me if I am wrong.

No, it's ok, especially after you have edited your original post. Anyway, unless you show us your "do calculations" part, there is not much we can help you with.

karankumar1609
9th April 2014, 13:06
if(ok && sql_query.size() > 0) {
    QSqlRecord record = sql_query.record();
    const int index_tick_date = record.indexOf("TICK_DATE");
    const int index_tick_time = record.indexOf("TICK_TIME");
    const int index_open = record.indexOf("OPEN");
    const int index_high = record.indexOf("HIGH");
    const int index_low = record.indexOf("LOW");
    const int index_close = record.indexOf("CLOSE");
    const int index_volume = record.indexOf("VOLUME");

    int row = sql_query.size();

    int date_prev = -1;
    int time_prev = -1, time_weekly_prev = -1, time_monthly_prev = -1;

    float open_daily_prev, open_weekly_prev, open_monthly_prev;
    float high_daily_prev, high_weekly_prev, high_monthly_prev;
    float low_daily_prev, low_weekly_prev, low_monthly_prev;
    float close_daily_prev, close_weekly_prev, close_monthly_prev;
    float volume_daily_prev, volume_weekly_prev, volume_monthly_prev;

    bool init_daily, init_weekly, init_monthly;
    unsigned char minutes_counter = 0;

    init_daily = init_weekly = init_monthly = true;

    while(sql_query.next()) {
        QString date_str = sql_query.value(index_tick_date).toDate().toString("yyyyMMdd");
        int date = date_str.toInt();

        QString time_str = sql_query.value(index_tick_time).toTime().toString("hhmm");
        int time = time_str.toInt();
        float open = sql_query.value(index_open).toFloat();
        float high = sql_query.value(index_high).toFloat();
        float low = sql_query.value(index_low).toFloat();
        float close = sql_query.value(index_close).toFloat();
        float volume = sql_query.value(index_volume).toInt();

        if(time_prev == -1) {
            time_prev = time;
            time_monthly_prev = time;
            time_weekly_prev = time_str.left(2).toInt() * 100;
        }

        if(date_prev == -1)
            date_prev = date;

        if(init_daily) {
            init_daily = false;
            open_daily_prev = open_weekly_prev = open_monthly_prev = open;
            high_daily_prev = high_weekly_prev = high_monthly_prev = high;
            low_daily_prev = low_weekly_prev = low_monthly_prev = low;
            close_daily_prev = close_weekly_prev = close_monthly_prev = close;
            volume_daily_prev = volume_weekly_prev = volume_monthly_prev = volume;
        }

        // Daily checks
        // one minute resolution data
        {
            if(time == time_prev) {
                high_daily_prev = qMax(high, high_daily_prev);
                low_daily_prev = qMin(low, low_daily_prev);
                close_daily_prev = close;
                volume_daily_prev += volume;
            } else if(time > time_prev || date > date_prev) {
                // Process data here
                createOHLCData(date_prev, time_prev, open_daily_prev, high_daily_prev, low_daily_prev, close_daily_prev, id_element_price[INDEX_DAILY]);
                createVolumeData(date_prev, time_prev, volume_daily_prev, id_element_volume[INDEX_DAILY]);

                // Weekly checks
                // one hour resolution data
                {
                    if(((time_weekly_prev / 100) < (time / 100)) ||
                       date > date_prev ||
                       init_weekly) {
                        if(!init_weekly) {
                            // Process weekly data here
                            createOHLCData(date_prev, time_weekly_prev, open_weekly_prev, high_weekly_prev, low_weekly_prev, close_weekly_prev, id_element_price[INDEX_WEEKLY]);
                            createVolumeData(date_prev, time_weekly_prev, volume_weekly_prev, id_element_volume[INDEX_WEEKLY]);

                            minutes_counter = 0;
                            time_weekly_prev = time_str.left(2).toInt() * 100;
                        } else { init_weekly = false; }

                        open_weekly_prev = open_daily_prev;
                        high_weekly_prev = high_daily_prev;
                        low_weekly_prev = low_daily_prev;
                        close_weekly_prev = close_daily_prev;
                        volume_weekly_prev = volume_daily_prev;
                    } else {
                        high_weekly_prev = qMax(high_daily_prev, high_weekly_prev);
                        low_weekly_prev = qMin(low_daily_prev, low_weekly_prev);
                        close_weekly_prev = close_daily_prev;
                        volume_weekly_prev += volume_daily_prev;
                    }
                }

                // Monthly checks
                // one day resolution data
                {
                    if(date > date_prev ||
                       init_monthly) {
                        if(!init_monthly) {
                            // Process monthly data here
                            createOHLCData(date_prev, time_monthly_prev, open_monthly_prev, high_monthly_prev, low_monthly_prev, close_monthly_prev, id_element_price[INDEX_MONTHLY]);
                            createVolumeData(date_prev, time_monthly_prev, volume_monthly_prev, id_element_volume[INDEX_MONTHLY]);
                            date_prev = date;
                            time_monthly_prev = time;
                        } else { init_monthly = false; }

                        open_monthly_prev = open_daily_prev;
                        high_monthly_prev = high_daily_prev;
                        low_monthly_prev = low_daily_prev;
                        close_monthly_prev = close_daily_prev;
                        volume_monthly_prev = volume_daily_prev;
                    } else {
                        high_monthly_prev = qMax(high_daily_prev, high_monthly_prev);
                        low_monthly_prev = qMin(low_daily_prev, low_monthly_prev);
                        close_monthly_prev = close_daily_prev;
                        volume_monthly_prev += volume_daily_prev;
                    }
                }

                open_daily_prev = open;
                high_daily_prev = high;
                low_daily_prev = low;
                close_daily_prev = close;
                volume_daily_prev = volume;
            }

            time_prev = time;
        }
    }

    if(row >= 1) {
        createOHLCData(date_prev, time_prev, open_daily_prev, high_daily_prev, low_daily_prev, close_daily_prev, id_element_price[INDEX_DAILY]);
        createVolumeData(date_prev, time_prev, volume_daily_prev, id_element_volume[INDEX_DAILY]);

        createOHLCData(date_prev, time_weekly_prev, open_weekly_prev, high_weekly_prev, low_weekly_prev, close_weekly_prev, id_element_price[INDEX_WEEKLY]);
        createVolumeData(date_prev, time_weekly_prev, volume_weekly_prev, id_element_volume[INDEX_WEEKLY]);

        createOHLCData(date_prev, time_monthly_prev, open_monthly_prev, high_monthly_prev, low_monthly_prev, close_monthly_prev, id_element_price[INDEX_MONTHLY]);
        createVolumeData(date_prev, time_monthly_prev, volume_monthly_prev, id_element_volume[INDEX_MONTHLY]);
    }
}


Here is the calculation done while iterating over the SQL query results. This part doesn't help me much...
Hope you will find the flaw. ;)
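Stripped of the Qt and database plumbing, each of the three series above uses the same accumulate-or-flush pattern. A minimal standard-C++ sketch of that core (types and names here are illustrative, not from the post):

```cpp
#include <algorithm>
#include <vector>

// One input row and one finished OHLC bar (illustrative types).
struct Tick { int time; float open, high, low, close, volume; };
struct Bar  { int time; float open, high, low, close, volume; };

// Aggregate ticks into bars keyed by Tick::time: a bar is flushed
// whenever the key changes, and the last open bar is flushed at the end.
std::vector<Bar> aggregate(const std::vector<Tick>& ticks) {
    std::vector<Bar> bars;
    Bar cur{};
    bool active = false;
    for (const Tick& t : ticks) {
        if (active && t.time != cur.time) {  // key changed: flush the bar
            bars.push_back(cur);
            active = false;
        }
        if (!active) {                       // start a new bar
            cur = {t.time, t.open, t.high, t.low, t.close, t.volume};
            active = true;
        } else {                             // extend the current bar
            cur.high   = std::max(cur.high, t.high);
            cur.low    = std::min(cur.low,  t.low);
            cur.close  = t.close;            // last tick wins
            cur.volume += t.volume;
        }
    }
    if (active) bars.push_back(cur);
    return bars;
}
```

The arithmetic itself is trivial, which suggests the time is going into the per-row QVariant conversions around it; converting each column exactly once per row and reusing the looked-up column indexes (as the post already does) is usually the first thing to check.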

wysota
9th April 2014, 13:26
Did you measure which part takes the longest time in each iteration?
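One cheap way to measure this is to accumulate wall-clock time for each section of the loop body separately with std::chrono. A generic sketch (not from the thread; the helper name is made up):

```cpp
#include <chrono>

// Accumulates wall-clock time spent in a code section across iterations.
struct SectionTimer {
    std::chrono::nanoseconds total{0};

    template <class F>
    void run(F&& f) {
        auto t0 = std::chrono::steady_clock::now();
        f();
        total += std::chrono::duration_cast<std::chrono::nanoseconds>(
            std::chrono::steady_clock::now() - t0);
    }

    double seconds() const {
        return std::chrono::duration<double>(total).count();
    }
};
```

Wrapping the `sql_query.value(...).to...()` conversions in one timer and the bar bookkeeping in another, then printing both totals after the loop, would show which half of each iteration dominates.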

karankumar1609
9th April 2014, 14:35
yes i have done function profiling and it says that iteration QSqlQuery multiple number of times and doing calculations after that takes longer time.
but calculations are necessary. :(

can't we do the same with multiple threads to save my time?

wysota
9th April 2014, 15:38
You can use threads if your calculations for iteration A do not require results of calculation for any previous step.

ChrisW67
10th April 2014, 07:34
Simple speed rules for larger data sets:

- Don't select more data than you require. "SELECT *" is almost always more than you require and causes more IO/network traffic as a result. It has negative maintainability effects as well.
- Don't iterate if an SQL set operation can do the job. These are often faster even when temporary tables are required.
- Know which indexes might speed up execution. If the limited data you are selecting all comes from an index, the engine may not need to read the actual table at all.


I assume that createOHLCData() and createVolumeData() create records in other tables, so you potentially have millions of SQL INSERT/UPDATE transactions being executed by this loop. You do not say which database you are using, but if it is SQLite, make sure you are doing the entire process inside a single database transaction: http://sqlite.org/faq.html#q19

Since you have the data in a relational database you probably should consider using the set processing power of SQL to do the work. Record by record iteration should only be used as a last resort on large record volumes.
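As a concrete illustration of that set-based approach (a sketch only, assuming the column names from the post): the hourly series could be computed entirely inside the server, so only one row per bar crosses the wire. HIGH/LOW/VOLUME are a plain GROUP BY; OPEN and CLOSE additionally need the first/last row of each bucket (e.g. via a join back on MIN/MAX of TICK_TIME), which is omitted here.

```sql
-- Hourly bars: one output row per (date, hour) bucket.
SELECT t.TICK_DATE,
       HOUR(t.TICK_TIME) AS bar_hour,
       MAX(t.HIGH)       AS bar_high,
       MIN(t.LOW)        AS bar_low,
       SUM(t.VOLUME)     AS bar_volume
FROM   DATA_TABLE_1 t
WHERE  t.TICK_DATE BETWEEN :date_from AND :date_to
GROUP BY t.TICK_DATE, HOUR(t.TICK_TIME)
ORDER BY t.TICK_DATE, bar_hour;
```

The daily series would group by (TICK_DATE, TICK_TIME) and the per-day series by TICK_DATE alone, with the same caveat about OPEN/CLOSE.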

karankumar1609
10th April 2014, 08:20
Thanks ChrisW67,

I am using a MySQL database.
Actually, I am using "SELECT *" because all the data is required. The createOHLCData() and createVolumeData() functions do not create entries in tables; they just add entries to an in-memory structure, which is a small operation.
And each record's processing is linked to the previous record. Feeling quite helpless.

Lesiok
10th April 2014, 12:59
But the functions createOHLCData() and createVolumeData() are called 3 times each in one loop iteration, and the loop has 15,539,502 iterations. Saving 0.1 ms per iteration saves about 1554 seconds (roughly 26 minutes). If you can, show us these functions. Maybe it is a problem with memory management in big containers.