Correct me if I'm wrong, but 100,000 × 100,000 items is 10^10 (10 G) items. If each item occupies, say, 10 bytes of memory (it probably occupies more), that comes to roughly 100 GB... wow. Do you have 100 GB of memory in your system? Maybe you should try to limit the number of items a little? In any case, I'd start by finding the bottleneck in your data model. You can probably introduce some caching and thereby improve performance.
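One way to apply that caching idea, sketched in Python: compute cell values lazily and cache only the ones the view actually requests, so the full 10^10-item grid is never materialized. The `compute_cell` function here is a hypothetical stand-in for whatever your real data model does.

```python
from functools import lru_cache

# Hypothetical cell-value function: in a real app this would query your
# data model (database, file, computation), not derive the value from
# the indices as done here for illustration.
def compute_cell(row: int, col: int) -> int:
    return row * 100_000 + col

# Cache a bounded number of recently used cells; memory stays capped
# regardless of the nominal 100,000 x 100,000 grid size.
@lru_cache(maxsize=1_000_000)
def get_cell(row: int, col: int) -> int:
    return compute_cell(row, col)

# The view asks only for visible cells, e.g. a 50x20 window:
window = [[get_cell(r, c) for c in range(20)] for r in range(50)]
```

The point is that memory usage scales with the visible/accessed region (here at most a million cached cells), not with the total item count.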