Multiple database connections

2nd June 2010, 18:05
I'm building an app that is using a memory-backed database for speed (the database needs to handle at least 60 inserts/second, and I can't batch them). Periodically, I need to copy all of the records from the memory-backed database to a disk-backed one.

I assumed that I could open two connections, and then use fully-qualified names to move the records around. My code looks something like this (simplified, so it may not compile). Both databases contain behavioraldata tables with xpos, ypos, and time columns:

//Open disk-backed database
sessionDb_ = QSqlDatabase::addDatabase("QSQLITE", databaseName);
sessionDb_.setDatabaseName(QCoreApplication::applicationDirPath() + "/" + databaseName + ".sqlite");
sessionDb_.open();

//Open memory-backed database
cacheDb_ = QSqlDatabase::addDatabase("QSQLITE", cacheDatabaseName);
cacheDb_.setDatabaseName(":memory:");
cacheDb_.open();

// .....Useful code removed.....

//Try to copy the behavioraldata table from the memory database to the disk database
QSqlQuery flushQ(cacheDb_);
QString queryStr = QString("INSERT INTO %1.behavioraldata(xpos,ypos,time) "
                           "SELECT xpos, ypos, time "
                           "FROM behavioraldata").arg(databaseName);
flushQ.exec(queryStr);
However, when I execute this code, the query fails because it can't find the destination table. Since it wasn't entirely clear what the database "name" was, I tried using both the connection name and the file name, but neither works.

Does anyone have any idea what I'm doing wrong here? Does SQLite/QSqlDatabase not handle multiple databases? Am I screwing up the database name?

2nd June 2010, 18:17
You can't use one SQL query on multiple databases.

What is flushQ?

the database needs to handle at least 60 inserts/second, and I can't batch them
Why not?

2nd June 2010, 18:35
flushQ is a query on the cache (in-memory) database. I failed to include it, but edited the initial post to fix my mistake.

The application is a server that sits between a data producer and a data consumer. The producer is generating data at 60-100Hz (basically at a monitor's refresh rate) and sending it to the server to be permanently stored. At the same time, the consumer wants to see what is going on with an absolute minimum of latency (ideally less than 10 frames). If I were to batch the inserts (say, committing once a second), then the consumer wouldn't have access to the most recent data (in this case, there would be a 1-second delay).

(In case you're really curious, this is an application used for data collection in a research lab. If you really want to know more, PM me.)

2nd June 2010, 18:53
At the same time, the consumer wants to see what is going on with an absolute minimum of latency (ideally less than 10 frames). If I were to batch the inserts (say, committing once a second), then the consumer wouldn't have access to the most recent data (in this case, there would be a 1-second delay).

Depending on the amount of data, I don't think the consumer needs to see it all immediately (that's my personal opinion though). With a chart I can understand it because a chart can display a lot more information than text on the same space. But if it's a list of data (text), I have my doubts anyone can keep up with hundreds of items per second.

Looking at your insert query, I don't think inserting 1000 items in one batch will take you more than a fraction of a second. The only problem I can see is if you need to store millions of items per second and there's no way to slow down the data stream from the client(s).

But like I said above, you have two separate databases. You can't use one query to get information from one database and insert it into the other; you need at least two queries. So you first need to select all the items from your memory database, then use a second query to insert them into the other database. But when you do that, I don't see much of a performance gain over storing the items directly in the session database.

2nd June 2010, 19:05
So, the application in question is a neurological test environment. The subject is using a mouse to interact with an experiment at one machine. The server application (where my problem exists), is recording the data that is input. The experimenter needs to be able to see what the subject is doing with a minimum amount of delay, so the server is streaming x,y coordinates to the experimenter's machine (the experimenter is likely making adjustments to the experiment in near real-time). If we batch everything, the delay created makes it so that the experimenter is watching a delayed version of the experiment. However, if I can run the whole thing without batching, using a :memory: database, SQLITE is plenty fast.

The copy operation from the cache to the session database is occurring during a break in the experiment, where no new data is being collected, so we don't care about latency (also the data is already collected, so I can batch the inserts).

I'm pretty sure that I can use a single query to move information from one database to another. Otherwise, what would be the point of the ATTACH command? However, I am not a database guy, so I'm clearly struggling here.

2nd June 2010, 19:13
You can indeed use the attach command. Did you already try it?

Now that I know a bit more about your program, why don't you display the data directly to the experimenter without going through the database first? Or do you already do that? At least this should eliminate any lag from the database. The database can then be used as a "tape recorder".

Edit: oh, and sorry for answering your question with more questions. I'm trying to understand what you want to do and why.

2nd June 2010, 19:18
I'm actually using the addDatabase command since it is just as easy, and works the same way. At least according to this:


Because it's possible that multiple experimenters will be viewing the experiment, I need to route everything through the server. I could use some sort of simple data structure to store it all, but since I was already using the database, it made sense to store everything there (also this means that if I decide to change the schema, I can simply change the tables in the database).

If it turns out that there is no way to do this with a single SQL statement, I can read everything out and write it back by hand, but this will likely be slower.

2nd June 2010, 19:39
With the attach command, you can use one query. I didn't know about that.
You do need to use the actual database name, as in the file name, when using the attach command. The connection name is just a way for the Qt classes to distinguish between multiple database connections.

Here's what I would do though (see attachment)
There's a direct connection between the main experimenter (experimenter 1) and the subject. Whatever the subject does with his mouse, the data is transferred and displayed in real time on the computer of experimenter 1. The server (running on experimenter 1's computer, for example, but it can be standalone too) directly forwards the data to the connected clients (acting as a proxy) in real time. The other experimenters see the data in real time too (well, with a very minor lag; there's always some overhead). As a last step, the data is cached for a second or two and then written to the database in the background. The database is only used to record everything, not to display it (unless at a later time).

3rd June 2010, 17:53
So, I don't really want to rearrange the entire system to make this happen.

My current solution is simply to use two separate queries: I SELECT all of the relevant records from the memory-backed cacheDb_, and then, in a single transaction, INSERT them into the table in the disk-backed sessionDb_. I assume this is slower than doing the whole thing in a single SQL command, but it works for the moment.

However, I would still be interested in figuring out how to move data between two databases without having to do it one record at a time.

4th June 2010, 11:32
I still don't understand why you can't use transactions. A SELECT statement will still return up-to-date data, even in the middle of a transaction. A disk database with transactions will easily support 10,000 simple inserts a second, so you could skip the in-memory database. Anyway, here is a sample of using ATTACH. I doubt it's much faster than doing it in Qt, but it is a bit more elegant.

#include <QApplication>
#include <QSqlDatabase>
#include <QSqlQuery>
#include <QDebug>

void setupDiskDb()
{
    QSqlDatabase db = QSqlDatabase::addDatabase("QSQLITE", "disk");
    db.setDatabaseName("CatchyName");
    db.open();

    QSqlQuery query(db);
    query.exec("CREATE TABLE T1(F1, F2, F3)");
}

void doSqLiteTest2()
{
    QSqlDatabase db = QSqlDatabase::addDatabase("QSQLITE", "mem");
    db.setDatabaseName(":memory:");
    db.open();

    QSqlQuery query(db);
    query.exec("CREATE TABLE T1(F1, F2, F3)");

    query.prepare("INSERT INTO T1 (F1, F2, F3) "
                  "VALUES (:1, :2, :3)");

    for (int i = 0; i < 50000; i++) {
        query.bindValue(":1", i + 0);
        query.bindValue(":2", i + 1);
        query.bindValue(":3", i + 2);
        query.exec();
    }

    // Attach the disk database by file name and copy everything in one statement
    qDebug() << query.exec("ATTACH DATABASE 'CatchyName' AS Catchy");
    qDebug() << query.exec("INSERT INTO Catchy.T1(F1, F2, F3) SELECT F1, F2, F3 FROM T1");
}

int main(int argc, char * argv[])
{
    QApplication a(argc, argv);

    setupDiskDb();
    doSqLiteTest2();

    return a.exec();
}