Fetch millions of records using QtSql



mvbhavsar
13th January 2011, 07:49
Hi,

I have a requirement where an Oracle table holds almost 40 million rows. I need to process this data and write it to various flat files to be delivered to a client. The problem is that whenever I fetch records from this table, I get a memory fault error after 10-15 million rows have been fetched. Hence I am looking for a solution that fetches the result set in batches, clears memory, and gets the next batch, so that I can avoid the memory error.

Thanks in advance

Manish

tbscope
13th January 2011, 07:56
For large datasets, try to use transactions (if Oracle supports that, of course).

Assuming that you can process the results in batches of, say, 100 records, you can easily use SQL to limit the results. Search for "SQL limit" on Google.
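A minimal sketch of that batching idea, assuming an Oracle-style ROWNUM window since Oracle has no LIMIT clause; the table big_table, its columns, and the function name processInBatches are hypothetical placeholders, not anything from this thread:

#include <QSqlDatabase>
#include <QSqlQuery>

// Fetch and process the table in windows of batchSize rows, releasing each
// result set before requesting the next one.
void processInBatches(QSqlDatabase db, int batchSize)
{
    int offset = 0;
    for (;;) {
        QSqlQuery query(db);
        query.prepare(
            "SELECT id, payload FROM ("
            "  SELECT t.*, ROWNUM rn FROM ("
            "    SELECT id, payload FROM big_table ORDER BY id"
            "  ) t WHERE ROWNUM <= :hi"
            ") WHERE rn > :lo");
        query.bindValue(":hi", offset + batchSize);
        query.bindValue(":lo", offset);
        if (!query.exec())
            break;                  // inspect query.lastError() in real code

        int fetched = 0;
        while (query.next()) {
            // process one record here, e.g. format it for the flat file
            ++fetched;
        }
        query.finish();             // free the result set before the next window

        if (fetched < batchSize)
            break;                  // last (partial) window reached
        offset += batchSize;
    }
}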

mvbhavsar
13th January 2011, 08:04
Thanks for the prompt reply.
The SQL LIMIT clause can be used in PL/SQL; I am not sure it can be used directly in SQL.
If possible, could you elaborate on how we can code this using transactions?

Thanks

Manish

JohannesMunk
13th January 2011, 11:23
This sounds like you have a memory leak rather than a design problem. If you look at the task manager, is it your application or the database server that accumulates memory? Some time ago I didn't properly clear my query objects, which resulted in a memory pileup on the SQL server. Also try only fetching, without processing; maybe the problem is in your output code.
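For illustration, a small sketch of the cleanup described above (the statement and table name are placeholders); telling the driver you are done with a result set keeps dead cursors from piling up on the server:

#include <QSqlDatabase>
#include <QSqlQuery>

void fetchOnce(QSqlDatabase db)
{
    QSqlQuery query(db);
    query.exec("SELECT id FROM some_table");   // hypothetical statement
    while (query.next()) {
        // consume the row; no output code here, to isolate the fetch
    }
    query.finish();   // release the result set on client and server side
}   // the QSqlQuery destructor also frees its remaining resources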

Johannes

marcvanriet
13th January 2011, 11:57
Hi,

Maybe use a QSqlQuery and call setForwardOnly(true) on it before executing. A forward-only query should prevent loading the entire result set into memory at once.
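A minimal sketch of that approach; the table, columns, and output file name are hypothetical stand-ins for the real export:

#include <QFile>
#include <QSqlDatabase>
#include <QSqlQuery>
#include <QTextStream>

// Stream the table row by row into a flat file. With a forward-only query
// the driver does not cache the whole result set on the client.
bool exportTable(QSqlDatabase db)
{
    QFile out("export.txt");
    if (!out.open(QIODevice::WriteOnly | QIODevice::Text))
        return false;
    QTextStream stream(&out);

    QSqlQuery query(db);
    query.setForwardOnly(true);          // must be set before exec()
    if (!query.exec("SELECT id, payload FROM big_table"))
        return false;

    while (query.next())                 // one row at a time, constant memory
        stream << query.value(0).toString() << '\t'
               << query.value(1).toString() << '\n';
    return true;
}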

Regards,
Marc

mvbhavsar
13th January 2011, 15:22
Thanks for the reply.
I will try the setForwardOnly option and will post details.
However, I just want to add that this does not look like a database or application issue; it is the cache that gets full due to the huge number of records in the result set.

Manish

mvbhavsar
14th January 2011, 06:10
Thanks marcvanriet.

It has worked. setForwardOnly() has solved the memory problem. Now I am able to fetch 100 million records without any issue.

Manish