I did some more testing and it probably is as you suggested.
I reduced the amount of data the external program generates (which I run with QProcess), and now the memory consumption stays at a constant, reasonable level.
So maybe the old external program simply produced data faster than my Qt application could consume it, which is why the data got buffered and the memory usage skyrocketed. That's a bummer; I will have to rethink much of the concept I currently want to implement and do some more testing.
What's your opinion in general on this: if a QProcess creates up to 1000 data sets per second, each data set basically a one-liner on stdout, and I fetch the data with QProcess::readLine() on every readyRead signal, should Qt be able to handle this?
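For what it's worth, here is a minimal sketch of the pattern I mean. Note that QProcess has no "readyReadLine" signal; it emits readyRead / readyReadStandardOutput, and one emission may carry several lines, so the slot should drain all complete lines with canReadLine() in a loop. The program name "producer" is just a placeholder for the external tool:

```cpp
#include <QCoreApplication>
#include <QProcess>
#include <QByteArray>
#include <QObject>

int main(int argc, char *argv[])
{
    QCoreApplication app(argc, argv);

    QProcess proc;
    proc.setProgram("producer");   // hypothetical external program
    proc.setReadChannel(QProcess::StandardOutput);

    // Drain every complete line on each signal, so unread data does
    // not accumulate in QProcess's internal buffer.
    QObject::connect(&proc, &QProcess::readyReadStandardOutput, [&proc]() {
        while (proc.canReadLine()) {
            const QByteArray line = proc.readLine();
            // Process the line here; keep this handler fast, or hand
            // the data off to a worker thread for heavy processing.
            Q_UNUSED(line);
        }
    });

    QObject::connect(&proc,
                     QOverload<int, QProcess::ExitStatus>::of(&QProcess::finished),
                     &app, &QCoreApplication::quit);

    proc.start();
    return app.exec();
}
```

As long as the work done per line is cheap, 1000 short lines per second is a modest load for the Qt event loop; the buffering problem only appears when the consumer side falls behind the producer.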
And thanks for giving me the hint.