I know this comment is old, but is it possible to improve performance by using transactions? From what I understand from the SQLite performance comparison document (yes, very old, but probably still vaguely applicable), under normal circumstances each insert is committed in its own implicit transaction, with the journal flushed to disk per statement, unless the inserts are wrapped in an explicit transaction.
Below is some pseudo code of what I mean:
db.transaction();   // BEGIN TRANSACTION on the open QSqlDatabase connection

QSqlQuery q;
q.prepare( "INSERT INTO table (firstname, lastname, etc) "
           "VALUES (:firstname, :lastname, :etc)" );   // prepare once, outside the loop

csv = FileParser::open( "somefile.csv", CSV );
while( current_record = csv.nextRecord() ) {
    q.bindValue( ":firstname", current_record[0] );
    q.bindValue( ":lastname", current_record[1] );
    ...
    if( !q.exec() ) throw RecordException; // or however you want to handle an error
}
close( csv );

db.commit();        // COMMIT
I think the alternative, and possibly preferred, method may be to use QVariantLists with a prepared statement and execBatch() for the insert.
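Roughly, I'd imagine the execBatch() version looking something like the sketch below (untested; the function name, the per-column lists and the "people" table are just placeholders for whatever your schema actually is):

#include <QSqlQuery>
#include <QVariantList>

// Insert whole columns of values in one go; execBatch() re-runs the
// prepared statement once per entry in the bound lists.
bool batchInsert( const QVariantList &firstnames, const QVariantList &lastnames )
{
    QSqlQuery q;                               // uses the default (already open) connection
    q.prepare( "INSERT INTO people (firstname, lastname) "
               "VALUES (:firstname, :lastname)" );

    q.bindValue( ":firstname", firstnames );   // bind a whole list per placeholder
    q.bindValue( ":lastname",  lastnames );

    return q.execBatch();
}

It's probably still worth wrapping the execBatch() call in db.transaction() / db.commit(), for the same reason as above.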
Once you've loaded the data into the db, grab the table using QSqlTableModel. Note it's just an idea; I haven't actually tested it for performance, hence the mashed code.
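For the display side, the QSqlTableModel use would be along these lines (again just a sketch; "people" and the bare QTableView are placeholders):

#include <QSqlTableModel>
#include <QTableView>

QSqlTableModel *model = new QSqlTableModel;   // default connection
model->setTable( "people" );
model->select();                              // fetch the rows from the db

QTableView *view = new QTableView;
view->setModel( model );
view->show();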
Regards,
Nate