I have an external compression application that can run independently or piped via stdin. I would like to implement the piping option to take advantage of the performance benefits of avoiding writing to disk twice. My data is a binary stream of several MB, often a few GB.
I have implemented this using QProcess, launching the executable with the appropriate arguments via QProcess::start() in QIODevice::ReadWrite mode. My data is then written through a QDataStream attached to the QProcess. I don't have access to the external application's source code, but other implementations have done this successfully using libraries other than Qt.
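For reference, here is a stripped-down sketch of my setup. The executable name, its arguments, and the file paths are placeholders for the real compressor:

```cpp
#include <QProcess>
#include <QFile>
#include <QDataStream>

void pipeToCompressor(const QString &inputPath)
{
    // Placeholder program and arguments; the real compressor reads from stdin.
    QProcess proc;
    proc.start("compressor.exe",
               QStringList() << "--stdin" << "--out" << "out.cmp",
               QIODevice::ReadWrite);
    if (!proc.waitForStarted())
        return;

    QFile input(inputPath);
    if (!input.open(QIODevice::ReadOnly))
        return;

    // Attach a QDataStream to the process and push the raw bytes through it.
    QDataStream out(&proc);
    const QByteArray data = input.readAll();
    out.writeRawData(data.constData(), data.size());

    proc.closeWriteChannel();   // signal EOF on the child's stdin
    proc.waitForFinished(-1);   // data only seems to move once I block here
}
```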
I have two problems. First, no data actually appears to transfer until my application sits and waits for the QProcess to finish. How do I force a flush of QProcess?
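QProcess doesn't expose a public flush(), so the closest thing I have found is waitForBytesWritten(). This is what I have tried (proc and data being my QProcess and payload buffer), but it still doesn't seem to move data until the process is waited on:

```cpp
// Attempt to push buffered data into the child's stdin pipe without
// blocking until the process exits.
proc.write(data);
while (proc.bytesToWrite() > 0)
    proc.waitForBytesWritten(-1);  // block until the internal buffer drains
```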
Second, when the actual data transfer between processes does occur, it is incredibly slow: the transfer rate appears to be around 400 KB/s. When I compress a file in stand-alone mode, a 68 MB file takes approximately 1.369 seconds, so I know the compression itself is relatively fast. The same file, when piped, takes approximately 170 seconds, and the CPU reports roughly 0% load on both processes.
I suspect that the slow performance has something to do with the fact that QProcess is optimized for basic text communication between processes (very small buffers), not for huge data transfers, but I see no way to change the buffer size for QProcess so that I can test this theory.
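The only workaround I can think of to test that theory is to feed the stream in explicit fixed-size chunks myself and drain QProcess's internal write buffer between them, roughly like this (the chunk size is an arbitrary guess, and proc/data are my QProcess and payload as above):

```cpp
// Write in fixed-size chunks, draining QProcess's internal write buffer
// between chunks. 64 KB is a guess at a reasonable pipe payload size.
const qint64 chunkSize = 64 * 1024;
for (qint64 pos = 0; pos < data.size(); ) {
    const qint64 n = proc.write(data.constData() + pos,
                                qMin(chunkSize, qint64(data.size()) - pos));
    if (n <= 0)
        break;                          // write error, bail out
    pos += n;
    while (proc.bytesToWrite() > 0)
        proc.waitForBytesWritten(100);  // give the pipe time to drain
}
```

Even so, this only controls my side of the pipe; I still see no way to change the buffer size QProcess itself uses.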
I'm developing and running on Windows 7 64bit Pro, using MSVC2010.