Thanks, but ... sorry, I don't get it. I can't see anything wrong with this. Could you please point out what's causing a memory leak here?
This is executed every time a client calls connectToHost(). So if only one client connects, I get only one instance of ServerSocket. The socket descriptor is passed to the TCP socket, which then emits the readyRead() signal whenever data comes in. That signal is connected to my readBlockData() method, which reads the data from the stream and thus should free the buffer.
I set a breakpoint at these lines to check how often a ServerSocket is instantiated, and it was only once, as expected.
Or did I miss something else here?