QProcess and memory usage



ts66
8th December 2011, 17:54
Hello everybody,

I am just working on a program that turns out to have a huge memory leak. In this program I use QProcess to start an external program which generates data that I then process in my program. The external program writes its output to stdout, from where I read it with QProcess::readLine(). Reading is triggered by the QProcess::readyReadStandardOutput() signal. The external program creates up to a few hundred MB of data every second. It looks like QProcess never throws that data away after I read it from stdout, eating up my system's memory.
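For reference, my reading code looks roughly like this (DataReader and ./generator are placeholder names, not the real ones):

#include <QtCore>

// Reads line-oriented output from a long-running child process.
class DataReader : public QObject
{
    Q_OBJECT
public:
    explicit DataReader(QObject *parent = 0) : QObject(parent)
    {
        connect(&m_proc, SIGNAL(readyReadStandardOutput()),
                this, SLOT(onReadyRead()));
        m_proc.start("./generator");   // the external data producer
    }

private slots:
    void onReadyRead()
    {
        // Pull every complete line QProcess has buffered so far.
        while (m_proc.canReadLine())
            handleLine(m_proc.readLine());
    }

private:
    void handleLine(const QByteArray &line) { /* process one data set */ }
    QProcess m_proc;
};

int main(int argc, char *argv[])
{
    QCoreApplication app(argc, argv);
    DataReader reader;
    return app.exec();
}

#include "main.moc"   // needed for a single-file qmake build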

Is there a way to free the memory used by QProcess again? Note that the QProcess has to stay active the whole time.

Or is there a fundamental flaw in my concept? Did I make obvious mistakes? I want to move to sockets to get the data in the future, but this is not an option now; I have to wait until the developer of the external program is able to do it. Until then I would like my program to work by passing data via stdout.

Thanks in advance.

(Edit: I am developing on a Linux machine, but I want my program to work on other systems at a later point.)

ChrisW67
9th December 2011, 01:19
Are you reading a few hundred megabytes per second or is QProcess buffering it because you cannot keep up? Is there a memory leak in your reading code that has nothing to do with QProcess?

In general, once memory is allocated for use in a program it is not freed back to the operating system until the program terminates, or until the OS explicitly reclaims unused memory that is still allocated to a process (which will happen if memory is exhausted). Between QProcess buffering data and your own processing, which probably involves creating copies, it is easy to see memory consumption climb rapidly to a high but reasonably stable level even when there is no leak.
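One quick way to tell those cases apart is to watch how much unread data QProcess itself is holding. A sketch (assuming m_proc is the QProcess member your reader uses):

// Call periodically, e.g. from a QTimer.
// bytesAvailable() reports data QProcess has buffered but your code
// has not read yet; steady growth here means the reader is not
// keeping up with the producer.
void Reader::reportBacklog()
{
    qDebug() << "unread bytes buffered by QProcess:"
             << m_proc.bytesAvailable();
}

If that number stays flat while overall memory climbs, look for copies piling up in your own processing instead.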

ts66
9th December 2011, 20:58
I did some more testing and it probably is as you suggested.

I reduced the amount of data the external program (which I run with QProcess) generates, and now the memory consumption stays at a constant, reasonable level.

So maybe the old external program just created more data than I could handle with my Qt application, which is why the data got buffered and the memory usage skyrocketed. That's a bummer; I will have to rethink much of the concept I currently want to implement and do some more testing.

What's your opinion in general on this: if a QProcess creates up to 1000 sets of data per second, each data set basically a one-liner on stdout, and I fetch the data with QProcess::readLine(), each read triggered by a readyReadLine signal, should Qt be able to handle this?

And thanks for giving me the hint.

ChrisW67
12th December 2011, 03:25
What is producing 100+ MB of data per second in text form?

As to the performance... maybe. If it is 1000 lines of 10 bytes each per second, that's a vastly different proposition from 1000 lines of 100000+ bytes, which is what you started with. Since QIODevice does not have a signal to tell you that a whole line is present, only that some new data is available (QIODevice::readyRead(), or the stdout/stderr versions in QProcess), you must at least be doing some buffering and construction of whole lines from fragments. Then, of course, you will be doing something with the data that takes time, which may be trivial or expensive.
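The usual pattern is to drain everything on each signal and cut complete lines out of your own buffer. A rough sketch (Reader, m_buffer and handleLine are illustrative names, with m_buffer a QByteArray member):

// Connected to QProcess::readyReadStandardOutput().
void Reader::onReadyRead()
{
    // readAllStandardOutput() empties QProcess's internal buffer,
    // which also keeps its memory use bounded.
    m_buffer += m_proc.readAllStandardOutput();

    // Hand over every complete line; the trailing fragment stays
    // in the buffer until the next readyRead.
    int pos;
    while ((pos = m_buffer.indexOf('\n')) != -1) {
        handleLine(m_buffer.left(pos));   // may be cheap or expensive
        m_buffer.remove(0, pos + 1);
    }
}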

ts66
16th December 2011, 21:34
Sorry for the long delay, but I was busy with other things. To give some more precise numbers: I tried to process about 20 MB of data per second, coming in as roughly 3,000,000 lines. Each line triggers a readLine call. That's apparently too much data. When I reduce the input to about one third of that, my program works fine without excessive memory consumption.