
Bulk insert into SQLite



munna
26th October 2006, 11:04
Hi,

I have some text data in a CSV file that needs to be imported into an SQLite database. These are the steps used to import it:

1. File->Import.
2. Select the CSV file that you want to import.
3. The data is shown in a QTableWidget, where it can be edited (a rough sketch of this step follows the list).
4. The user can also select the column name that will correspond to a table in the database.
5. When the user is done editing, they click the Import button.
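Roughly, steps 2 and 3 look like this (a minimal sketch only; the function name is just for illustration and the separator handling is simplified, no quoted fields):

#include <QFile>
#include <QTextStream>
#include <QStringList>
#include <QTableWidget>

// Fill an existing QTableWidget from a comma-separated file.
// Assumes plain CSV without quoting or embedded commas.
void loadCsvIntoTable(const QString &fileName, QTableWidget *table)
{
    QFile file(fileName);
    if (!file.open(QIODevice::ReadOnly | QIODevice::Text))
        return;

    QTextStream in(&file);
    int row = 0;
    while (!in.atEnd()) {
        const QStringList fields = in.readLine().split(',');
        if (table->columnCount() < fields.count())
            table->setColumnCount(fields.count());
        table->insertRow(row);
        for (int col = 0; col < fields.count(); ++col)
            table->setItem(row, col, new QTableWidgetItem(fields.at(col)));
        ++row;
    }
}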

Now, the problem.

The SQLite database has two tables; let us call them name_table and address_table. As the names suggest, name_table stores the names and address_table stores the addresses. The two are related by an id: name_table has a field called id and address_table has a field called nameid, and these are equal for a given pair of name and address.
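To make the relation concrete, the two tables look roughly like this (a sketch only; the address_table columns and the AUTOINCREMENT id are assumptions, not my actual schema):

QSqlQuery create;

// Sketch of the assumed schema: id in name_table matches nameid in
// address_table. The address_table columns are placeholders.
create.exec("CREATE TABLE name_table ("
            "id INTEGER PRIMARY KEY AUTOINCREMENT, "
            "prefix TEXT, firstname TEXT, middlename TEXT, "
            "lastname TEXT, suffix TEXT, nickname TEXT)");
create.exec("CREATE TABLE address_table ("
            "id INTEGER PRIMARY KEY AUTOINCREMENT, "
            "nameid INTEGER, street TEXT, city TEXT)");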

While importing, I first insert into name_table, then get the id of the inserted row, and then insert into address_table. I use QSqlTableModel to do this, something like this:




QSqlTableModel tableModel;
QSqlRecord rec;

// Load the existing rows so we know where to append the new one.
tableModel.setTable("name_table");
tableModel.select();
int rc = tableModel.rowCount();
tableModel.insertRow(rc);
rec = tableModel.record(rc);

// Fill the new record from the parsed CSV data.
rec.setValue("prefix",     nameDetails.prefix());
rec.setValue("firstname",  nameDetails.firstName());
rec.setValue("middlename", nameDetails.middleName());
rec.setValue("lastname",   nameDetails.lastName());
rec.setValue("suffix",     nameDetails.suffix());
rec.setValue("nickname",   nameDetails.nickName());

// Write the row, then re-select to find out which id it was given.
tableModel.setRecord(rc, rec);
tableModel.submitAll();
tableModel.select();
rec = tableModel.record(tableModel.rowCount() - 1);
currentNameId = rec.value("id").toInt();

// using currentNameId, now insert into address_table in a similar way



I call the above code in a loop for every row in the table. This process is too slow: it inserts no more than about 100 records per minute. I know this is completely the wrong way to do it, but I cannot think of anything better.

Can someone please suggest the best way to do this so that importing becomes fast?

Thanks a lot for your time.

patrik08
26th October 2006, 12:18
Yesterday I imported a 36 MB SQLite .sql import file, more than 250'000 rows, in about 3 minutes (500 MB RAM, 3 GHz Intel). But I don't use the Qt SQLite plugin; I use a static lib, the same on Windows / Mac / Linux:
http://svn.sourceforge.net/viewvc/qtexcel-xslt/static_sqlite_table/

See my comment at http://sourceforge.net/forum/forum.php?thread_id=1493057&forum_id=301849

munna
26th October 2006, 15:22
Is there a Qt way to do this?

Thanks a lot

patrik08
26th October 2006, 16:14
Is there a Qt way to do this?

Thanks a lot

Oh, yes... (I can only write Qt4; heavy C++ books overflow me ;-) ) You import the CSV file, convert it to SQL inserts into SQLite, and in the table model display only 100 or 50 rows at a time... a next >> button, a << prev button, editing, and you're ready.

Or another elegant method, Windows only:
1. Make a new ODBC connection to the CSV file (set up the file name, path, etc.), all through the Qt ODBC plugin.

For a small Qt4 snippet with next & prev buttons, have a look at
http://qtforum.de/forum/viewtopic.php?t=2431 - simple & fast.

You only need a QTableWidget and 2 buttons (next & prev).

Or, with a little more work, you build a fast static lib.
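A rough sketch of the paging idea (the column names are taken from the first post; the next/prev buttons would just change the page number and call this again):

#include <QSqlQuery>
#include <QTableWidget>
#include <QVariant>

// Fetch one page of rows with LIMIT/OFFSET and show it in the widget.
void showPage(QTableWidget *table, int page)
{
    const int pageSize = 100;

    QSqlQuery query;
    query.prepare("SELECT firstname, lastname FROM name_table "
                  "LIMIT :limit OFFSET :offset");
    query.bindValue(":limit", pageSize);
    query.bindValue(":offset", page * pageSize);
    query.exec();

    table->setRowCount(0);
    int row = 0;
    while (query.next()) {
        table->insertRow(row);
        table->setItem(row, 0, new QTableWidgetItem(query.value(0).toString()));
        table->setItem(row, 1, new QTableWidgetItem(query.value(1).toString()));
        ++row;
    }
}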

natbobc
16th November 2007, 00:38
I know this comment is old, but is it possible to improve performance using transactions? From what I understand from the SQLite performance comparison document (yes, very old, but probably still vaguely applicable), under normal circumstances the db file is opened and closed each time an insert is applied, unless it is encapsulated in a transaction.

Below is some pseudo code of what I mean:


SQL>> BEGIN TRANSACTION; // execute as QSqlQuery

csv = FileParser::open( "somefile.csv", CSV );

while( current_record = csv.nextRecord() ) {
    QSqlQuery q( "INSERT INTO table (firstname,lastname,etc) VALUES (:firstname,:lastname,:etc)" );

    q.bindValue( ":firstname", current_record[0] );
    q.bindValue( ":lastname", current_record[1] );

    ...

    if( !q.exec() ) throw RecordException; // or however you want to handle an error
}

close( csv );

SQL>> COMMIT; // execute as QSqlQuery

I think the alternative, and possibly preferred, method may be to use QVariantLists with a prepared statement and execBatch() for the insert.
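Something along these lines, I believe (an untested sketch; name_table and the two columns come from the earlier posts, and the sample values are made up):

// Untested sketch of the execBatch() idea: bind whole columns as
// QVariantLists and let the driver insert all the rows in one call.
QSqlQuery q;
q.prepare( "INSERT INTO name_table (firstname, lastname) VALUES (:firstname, :lastname)" );

QVariantList firstNames;
QVariantList lastNames;
// fill the lists from the parsed CSV, one entry per row
firstNames << "Alice" << "Bob";
lastNames  << "Smith" << "Jones";

q.bindValue( ":firstname", firstNames );
q.bindValue( ":lastname",  lastNames );

if( !q.execBatch() )
    qDebug() << q.lastError().text();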

Once you've loaded the data into the db, grab the table using QSqlTableModel. Note it's just an idea; I haven't actually tested it for performance, hence the mashed code.

Regards,

Nate

jacek
16th November 2007, 17:18
I know this comment is old, but is it possible to improve performance using transactions?
If the SQLite docs say so, then it might work. This really depends on the DBMS; other databases might prefer different constructs.

For tips on PostgreSQL read this: http://www.depesz.com/index.php/2007/07/05/how-to-insert-data-to-database-as-fast-as-possible/



while( current_record = csv.nextRecord() ) {
    QSqlQuery q( "INSERT INTO table (firstname,lastname,etc) VALUES (:firstname,:lastname,:etc)" );
    ...
    q.bindValue( ":firstname", current_record[0] );
    ...
If you pass the query string to the QSqlQuery constructor, it will be executed immediately. You have to use prepare() if you have placeholders:

QSqlQuery q;
q.prepare( "INSERT INTO table (firstname,lastname,etc) VALUES (:firstname,:lastname,:etc)" );
while( current_record = csv.nextRecord() ) {
    ...
    q.bindValue( ":firstname", current_record[0] );
    ...
Also, by moving the call to prepare() out of the loop, you can save a bit of time (especially in the case of nasty queries).
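Putting both points together, the whole import loop could look roughly like this (a sketch only; csv.nextRecord() is the pseudo-code reader from the post above, and the column names are again placeholders):

QSqlDatabase db = QSqlDatabase::database();
db.transaction();   // one transaction around all the inserts

QSqlQuery q;
q.prepare( "INSERT INTO name_table (firstname, lastname) VALUES (:firstname, :lastname)" );   // prepared once, outside the loop

while( current_record = csv.nextRecord() ) {
    q.bindValue( ":firstname", current_record[0] );
    q.bindValue( ":lastname",  current_record[1] );
    if( !q.exec() ) {
        db.rollback();          // undo everything on the first failure
        throw RecordException;  // or however you want to handle the error
    }
}

db.commit();   // a single commit instead of one per row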

sabeesh
19th November 2007, 03:56
Hi,
Please try this as well:
http://www.qtcentre.org/forum/f-qt-programming-2/t-qt-sqlite-connection-8380.html/?highlight=SQLite