So most of the time your threads just spin in idle loops. You are not using all your cores; you are wasting them.
> Encoding and decoding of messages is a non-trivial amount of time.

So do that in threads. But what is the point of having threads just to receive data from the network, which happens quite rarely?
> All together, putting networking/processing in a different thread allows the actual GUI thread more time to operate.

How many connections at most do you serve concurrently? How many cores does your machine have? Does your OS have to context switch between threads only for a thread to find it has nothing to do?
> If you don't retrieve the data quickly enough from the client buffers, the server may decide to disconnect you, or it could simply throttle its throughput rate. In some cases the client-side OS will disconnect you (the internal buffers are only so big).

Sure. You get disconnected after waiting 10 ms before fetching the data.
> Putting that in a separate thread ensures they are always clearing the buffers at a good rate.

Really? Spawn some additional threads (as many as you have cores) and make them run "while(1);". You'll see your other threads getting delayed, even though it's a different thread putting the strain on the CPU.
What you are doing is the wrong approach. Sorry to be so blunt, but it is true. Using threads just to fetch data from the network yields no benefit, and it is a very common mistake based on the false axiom (yes, axiom: nobody actually proves it!) that networking is better done in threads. It is better to offload real work to threads and make them actually do something useful, instead of having tens of empty event loops spinning around in vain.