I have a data model which holds about 100,000 rows, and each row has 3 columns. When I iterate through all rows to modify one of the three items per row, it takes a little more than a minute. On the other hand, if the model holds only 6,500 rows, the same change across all rows finishes in under 1 second. So the time for this change is not linear; it looks quadratic or something like n*log(n).

Is there a recommended way of iterating over such a fairly large data set?

I used (this one took 86 seconds):
Qt Code:
for (int r = 0; r < _model->rowCount(); ++r) {
    QStandardItem *item = _model->item(r, 0);
    item->setData(foo(item->data().toUInt()), Qt::DisplayRole);
}

And I tried (which took 76 seconds):
Qt Code:
QModelIndex idx = _model->indexFromItem(_model->item(0, 0));
for (int r = 0; r < _model->rowCount(); ++r) {
    _model->setData(idx, foo(_model->data(idx, Qt::UserRole + 1).toUInt()), Qt::DisplayRole);
    idx = idx.sibling(r + 1, 0);
}

The initial filling of the model with all these 100,000 rows was done within 1 or 2 seconds. However, that happened in the constructor, before show() was called. So maybe turning off some of the graphical updates might also improve the program.
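Something along these lines is what I have in mind, just an untested sketch: detach the model from its view before the loop, so the view does not react to every single setData() call, and reattach it once at the end. I am assuming here that the model is shown in a view member called _view; the loop itself is the same as above.

Qt Code:
// Untested sketch: detach the model from the view so each setData() call
// no longer triggers view updates, then reattach it once afterwards.
QAbstractItemModel *m = _view->model();
_view->setModel(nullptr);                 // view stops reacting to model signals

for (int r = 0; r < _model->rowCount(); ++r) {
    QStandardItem *item = _model->item(r, 0);
    item->setData(foo(item->data().toUInt()), Qt::DisplayRole);
}

_view->setModel(m);                       // one relayout instead of 100,000 updates

If sorting is enabled on the view, or a proxy model sits in between, disabling that during the loop might matter as well.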

Any ideas?