ucntcme
24th July 2007, 23:29
First the relevant background: I have a spool directory and I need the total number of files in it. I don't care about the contents, just the count. Here's the catch: it's a big number, in this case 1,000,000 files. (Yes, the directory should be hashed into subdirectories, but that is not currently an option.)
So, I've tried QProcess with find or ls. They work. I've also tried QDir and its count() method. It works too.
So what's the problem? It's friggin' slow. For comparison, if I run find manually and pipe it to "wc -l", it is a lot faster: doing it by hand takes between a quarter and half the time of doing it in Qt via QProcess, and QDir::count() is slower still, taking a few minutes.
So, any idea which route *should* be the fastest? And why does QDir::count() take *so* long (~4-5 minutes)?