JimJey
9th April 2008, 06:10
I'm writing a raytracing application that should display the image being synthesized to the user. My current approach is as follows (QGraphicsView and QGraphicsScene are used to present the image):
1) Construct QImage to hold the image data
2) Periodically update QImage using setPixel
3) Convert the QImage to QPixmap and call
QGraphicsPixmapItem.setPixmap to reflect the changes in the QGraphicsScene
At first I wasn't specifying any special viewport. Updates to the QImage were performed quite quickly; however, zooming proved to be laggy.
Then I switched the viewport to a QGLWidget. Zooming became very smooth, but updating the QImage takes several seconds now!
I can't really see how setting a different viewport could affect the performance of the setPixel operation (diving into the source code of QImage didn't clarify the issue either).
Is the way I am using QImage/QGraphicsView/QGraphicsScene the recommended way? Why does setting pixels become so slow when I change the viewport to OpenGL?
Any solutions?
Thank you very much.