Fast Data Collection Buffering and File Writing

19th April 2012, 00:12
In my running application, at 100 ms epoch rates or faster, I need
to create content which will represent signal levels for post-mortem
analysis of signal timing. I plan on using the Value Change Dump
(VCD) or FST file formats for this.

However, the data needs to be generated in real time, while the
writing can be deferred until after the active session terminates.

The data to be written can either be ASCII text or binary
depending upon the output file format selected.

This is for the Linux and Windows platforms.

I am looking for a fast mechanism to allow for collecting this
data, perhaps a memory mapped file, a heap allocated chunk
of memory with managed offsets, or other.

One consideration would be to allocate a very large chunk from the
heap, and simply perform memcpy's to this heap as data is
collected, followed by writing the content to a physical
file at the end of the collection session.
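Something like the following is what I have in mind; this is a minimal plain-C++ sketch (so it works on both platforms), with a made-up `CaptureBuffer` class, a single up-front allocation, a managed write offset for the memcpy's, and one write at session end:

```cpp
#include <cstddef>
#include <cstdio>
#include <cstring>
#include <vector>

// Sketch: one large pre-allocated buffer with a managed write offset.
// Samples are memcpy'd in during the session; the whole block is
// written to disk in a single pass once the session ends.
class CaptureBuffer {
public:
    explicit CaptureBuffer(std::size_t capacity) : buf_(capacity), used_(0) {}

    // Copies len bytes in at the current offset; returns false (and
    // drops the record) once the fixed-size buffer would overflow.
    bool append(const void *data, std::size_t len) {
        if (used_ + len > buf_.size())
            return false;
        std::memcpy(buf_.data() + used_, data, len);
        used_ += len;
        return true;
    }

    std::size_t used() const { return used_; }

    // Single write at the end of the collection session.
    bool writeTo(const char *path) const {
        std::FILE *f = std::fopen(path, "wb");
        if (!f)
            return false;
        std::size_t n = std::fwrite(buf_.data(), 1, used_, f);
        std::fclose(f);
        return n == used_;
    }

private:
    std::vector<unsigned char> buf_;
    std::size_t used_;
};
```

The append path is just a bounds check plus a memcpy, so it should stay cheap enough for a 100 ms epoch rate; all file I/O is deferred to `writeTo`.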

One disadvantage of this approach is that data collection would be
limited to the single large pre-session allocation.

Of course, I could always fill the memory buffer, write it to a disk
file once the buffer is full, and then fill it again, which allows
the file to grow larger than the single initial heap allocation.
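The fill/flush/refill variant could be sketched like this (again plain C++, with a hypothetical `ChunkedWriter` wrapping an already-open stdio stream): when the in-memory chunk fills, it is flushed to the file and refilled, so the file can grow past the size of the single allocation:

```cpp
#include <algorithm>
#include <cstddef>
#include <cstdio>
#include <cstring>
#include <vector>

// Sketch: fixed-size chunk that is flushed to an open file whenever it
// fills, then reused, so total output is not bounded by the allocation.
class ChunkedWriter {
public:
    ChunkedWriter(std::FILE *f, std::size_t chunkSize)
        : file_(f), buf_(chunkSize), used_(0) {}

    void append(const void *data, std::size_t len) {
        const unsigned char *p = static_cast<const unsigned char *>(data);
        while (len > 0) {
            if (used_ == buf_.size())
                flush();                       // chunk full: write it out
            std::size_t n = std::min(len, buf_.size() - used_);
            std::memcpy(buf_.data() + used_, p, n);
            used_ += n;
            p += n;
            len -= n;
        }
    }

    // Writes any partial chunk; call once more at session end.
    void flush() {
        if (used_ > 0) {
            std::fwrite(buf_.data(), 1, used_, file_);
            used_ = 0;
        }
    }

private:
    std::FILE *file_;
    std::vector<unsigned char> buf_;
    std::size_t used_;
};
```

One caveat with this variant: the flush now happens inside the collection path, so a slow disk can stall a 100 ms epoch; a second thread draining the chunks would avoid that, at the cost of some synchronization.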

The notion of buffering the data in memory and flushing it
in chunks to disk is probably already provided as part of
the QFile class.

However, I certainly do not need all of the features of a QFile, or
perhaps a QBuffer or QByteArray. I am not aware of whether QFile
provides an API for setting the buffer size used prior to a disk
flush. I also do not need to share the collected contents as a
shared memory element.
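If explicit control of the buffer size before a disk flush turns out to matter, one portable fallback (independent of whatever QFile does internally, which I have not verified) is plain C stdio with setvbuf; the helper name here is made up:

```cpp
#include <cstdio>
#include <vector>

// Sketch: open a file for writing with a caller-supplied, fully
// buffered stdio buffer. setvbuf must be called before any other
// operation on the stream, and the buffer must outlive the stream.
bool open_with_buffer(const char *path, std::vector<char> &buf,
                      std::FILE **out) {
    std::FILE *f = std::fopen(path, "wb");
    if (!f)
        return false;
    if (std::setvbuf(f, buf.data(), _IOFBF, buf.size()) != 0) {
        std::fclose(f);
        return false;
    }
    *out = f;
    return true;
}
```

With, say, a multi-megabyte buffer, writes would only hit the disk when that buffer fills or the stream is flushed/closed, which is essentially the behavior I am after.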

Anyone do something similar and have any recommendations?