PDA

View Full Version : QtConcurrent and QSqlQuery



^NyAw^
10th April 2013, 13:09
Hi,

Does anyone have experience executing multiple INSERT SQL queries into a MySQL database using QtConcurrent?

Each query stores a large amount of data, and I want to know whether running the INSERT queries from multiple threads will increase the insertion speed.
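For reference, a minimal sketch of what this could look like. Note that Qt requires each thread to use its own database connection, so each QtConcurrent task must open a connection of its own (the connection details, table and column names here are made up for illustration):

```cpp
// Sketch (untested): each worker opens its own connection, because a
// QSqlDatabase connection may only be used from the thread that created it.
#include <QtConcurrent/QtConcurrent>
#include <QSqlDatabase>
#include <QSqlQuery>
#include <QVariantList>

// Hypothetical helper: insert one batch of rows through its own connection.
void insertBatch(int workerId, const QList<QVariantList> &rows)
{
    const QString connName = QStringLiteral("worker_%1").arg(workerId);
    {
        QSqlDatabase db = QSqlDatabase::addDatabase("QMYSQL", connName);
        db.setHostName("localhost");   // assumed connection details
        db.setDatabaseName("mydb");
        db.setUserName("user");
        db.setPassword("pass");
        if (!db.open())
            return;

        QSqlQuery query(db);
        query.prepare("INSERT INTO mytable (a, b) VALUES (?, ?)");
        for (const QVariantList &row : rows) {
            query.addBindValue(row.at(0));
            query.addBindValue(row.at(1));
            query.exec();
        }
    } // db must go out of scope before the connection is removed
    QSqlDatabase::removeDatabase(connName);
}

// Launch the batches concurrently, e.g.:
// QFuture<void> f = QtConcurrent::run(insertBatch, 0, batch0);
```

Whether this is actually faster depends on the server: concurrent INSERTs into the same table may just contend for locks and I/O rather than run in parallel.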

Thanks,

BalaQT
10th April 2013, 13:19
Hi Oscar,

The general idea is this: write the data to a file and import it with mysqlimport.
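For example (database name, file path and credentials are made up; mysqlimport derives the table name from the file name, so this file would load into a table called mytable):

```shell
mysqlimport --local --user=myuser -p \
  --fields-terminated-by=',' mydb /tmp/mytable.csv
```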

Hope it helps,
Bala

^NyAw^
10th April 2013, 13:23
Hi,

Storing the data in a file and then importing the file into the database? Are you sure that will speed up my process? :p

Thanks,

BalaQT
10th April 2013, 13:40
Hi oscar,

I hope so, but I'm not sure. Maybe masters like wysota, Lykurg or others can answer this correctly.
My thought is that inserting from Qt will take much more time than a dedicated DB import tool, since the import tool is probably optimized for bulk loading.


Also check the MySQL InnoDB tuning guide: http://dev.mysql.com/doc/refman/5.1/en/innodb-tuning.html
hope it helps,
Bala.

Added after 4 minutes:

Some tips on MySQL bulk import (from the MySQL documentation):


When importing data into InnoDB, make sure that MySQL does not have autocommit mode enabled because that requires a log flush to disk for every insert. To disable autocommit during your import operation, surround it with SET autocommit and COMMIT statements:

SET autocommit=0;
... SQL import statements ...
COMMIT;

If you use the mysqldump option --opt, you get dump files that are fast to import into an InnoDB table, even without wrapping them with the SET autocommit and COMMIT statements.

If you have UNIQUE constraints on secondary keys, you can speed up table imports by temporarily turning off the uniqueness checks during the import session:

SET unique_checks=0;
... SQL import statements ...
SET unique_checks=1;

For big tables, this saves a lot of disk I/O because InnoDB can use its insert buffer to write secondary index records in a batch. Be certain that the data contains no duplicate keys.

If you have FOREIGN KEY constraints in your tables, you can speed up table imports by turning the foreign key checks off for the duration of the import session:

SET foreign_key_checks=0;
... SQL import statements ...
SET foreign_key_checks=1;

For big tables, this can save a lot of disk I/O.

^NyAw^
10th April 2013, 13:46
Hi,

Bear in mind that I would need to store the data in a file first, so I think that would be slower.

What I want is to create multiple threads that will store the data concurrently. Maybe concurrent insertion will give me the same insertion time and will not speed it up.

Thanks,

Hi,

I will try disabling "autocommit", inserting all the data, and then re-enabling it. It may help, since I have multiple insertions at a time.

Thanks,

BalaQT
10th April 2013, 15:28
Will try to disable "autocommit", insert all data and then reenable it. It maybe will help as I have multiple insertions at a time.
You can use a transaction and a single commit for multiple insertions.
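In Qt that could look roughly like this (a sketch assuming an already-open default connection; the row type, table and column names are made up). QSqlDatabase::transaction() and commit() give the same effect as the SET autocommit=0 / COMMIT pair from the MySQL tips above:

```cpp
#include <QList>
#include <QSqlDatabase>
#include <QSqlQuery>
#include <QVariant>

// Hypothetical row type, for illustration only.
struct Row { QVariant a; QVariant b; };

void insertRows(const QList<Row> &rows)
{
    QSqlDatabase db = QSqlDatabase::database(); // existing open default connection
    if (!db.transaction())                      // one transaction for the whole batch
        return;
    QSqlQuery query(db);
    query.prepare("INSERT INTO mytable (a, b) VALUES (?, ?)");
    for (const Row &row : rows) {
        query.addBindValue(row.a);
        query.addBindValue(row.b);
        query.exec();
    }
    db.commit();                                // single commit, single log flush
}
```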

bala