Hello!
I'm developing an application that runs queries against a MySQL database.
As far as I know, every time I instantiate a new query and run a search on the MySQL server, the returned data is stored in the query object and read from there. No problems with that; the software runs fine.
The problem is the memory occupied by the data received from MySQL. If I declare the query as a pointer or an object inside a function, then as far as I know (from reading the documentation), when the function ends the query is destroyed and [I imagine and hope] the memory it occupied is released. But what if I declare the query as a global variable (which is the case in my app), so the query is never destroyed? In that case, is all the searched and received data kept in RAM and never released until the software is closed (so the memory used by the query never stops growing)?
-
I'm asking this question because my software's RAM usage keeps growing over time, so after two days or more the free memory runs out and the computer "crashes". And now I have to find, in a codebase of 16 million lines, what is causing this RAM problem. Most interesting of all: even after the software is closed, the extra RAM that was used is not released!
-
Thanks,
Momergil