Alpha channel weirdness with QGLContext



renaissanz
13th March 2006, 19:22
Hi, I'm using QGLContext to render a small viewport containing some triangles and rectangles. I'm having problems reading the alpha channel back out with glReadPixels. Could this have something to do with using QGLContext?

Please see below:

I initialize like so:



QGLContext * context = 0;
QGLFormat format;                       // default format: no alpha buffer requested
context = new QGLContext(format, (QPaintDevice *)glPanel);
context->create();
context->makeCurrent();
glClearColor(0.0f, 0.0f, 0.0f, 0.0f);   // clear to transparent black
glClearDepth(1.0f);


I draw like so:



context->makeCurrent();
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glEnable(GL_CULL_FACE);
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
glEnable(GL_ALPHA_TEST);
glAlphaFunc(GL_GREATER, 0.0);           // discard fragments whose alpha is 0

for(int i = 0; i < numWidgets; i++)
{
    if(Widget[i]->type == TriangleWidget)
    {
        Widget[i]->DrawTriangle(mode);
    }
    else
    {
        Widget[i]->DrawRectangle(mode);
    }
}

glFlush();

if(context->format().doubleBuffer())
    context->swapBuffers();


I use glReadPixels like so:



unsigned char * LUT2D = new unsigned char[256 * 256 * 4];
context->makeCurrent();
glReadBuffer(GL_FRONT);
glReadPixels(0, 0, 256, 256, GL_RGBA, GL_UNSIGNED_BYTE, LUT2D);

// debug: dump the buffer as 32-bit RGBA words, one row per line
FILE * fp = fopen("/tmp/LUT2D.txt", "w+");
if(fp)
{
    unsigned int * tmp = (unsigned int *)LUT2D;
    for(int h = 0; h < 256; h++)
    {
        for(int w = 0; w < 256; w++)
        {
            fprintf(fp, "0x%08X, ", tmp[h * 256 + w]);  // row-major index, not w * h
        }
        fprintf(fp, "\n");
    }
    fclose(fp);
}
// debug
// debug


As you can see, I'm writing the buffer to a file to check what's in the alpha channel. The alpha byte is 0xFF (255, or 1.0 as a float) for every single pixel.

Once again, is this some weirdness with using QGLContext?
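
Edit: a quick sanity check, sketched on the assumption that the context above is current. If the visual has zero alpha bitplanes, glReadPixels will return 255 for every alpha value no matter what you draw:

// Sketch: check whether the framebuffer actually stores alpha.
GLint alphaBits = 0;
glGetIntegerv(GL_ALPHA_BITS, &alphaBits);
qDebug("alpha bitplanes: %d, QGLFormat::alpha(): %d",
       (int)alphaBits, (int)context->format().alpha());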

renaissanz
14th March 2006, 21:18
OK, I also tried

format.setAlpha(TRUE);

and that gave me an invalid context. Help, anyone?

renaissanz
15th March 2006, 16:10
Solved!

OK, this is for any programmer who comes along with a similar problem. If you need the alpha channel for any purpose, you have to enable it when you create the application, using a special X11 version of the QApplication constructor (this was all done on Linux).

Code as follows:



#include <qapplication.h>
#include <GL/glx.h>

static int attrListDbl[] =
{
    GLX_RGBA, GLX_DOUBLEBUFFER,
    GLX_RED_SIZE, 4,
    GLX_GREEN_SIZE, 4,
    GLX_BLUE_SIZE, 4,
    GLX_ALPHA_SIZE, 4,      // the important part: request alpha bitplanes
    GLX_DEPTH_SIZE, 16,
    None
};

int main( int argc, char ** argv )
{
    Display * dpy = XOpenDisplay(0);
    int scr = DefaultScreen(dpy);
    XVisualInfo * vis = glXChooseVisual(dpy, scr, attrListDbl);

    // QApplication a( argc, argv );
    // X11-specific constructor: hand Qt the GLX visual chosen above
    QApplication a(dpy, argc, argv, (long unsigned int)vis);

    CFormVolumeViewer w;
    a.setMainWidget((QWidget *)&w);
    w.show();
    a.connect( &a, SIGNAL( lastWindowClosed() ), &a, SLOT( quit() ) );
    return a.exec();
}


This enables the alpha channel for any subsequent QGLContext that you create. Remember that you still need to call QGLFormat::setAlpha(TRUE) before creating the context, or it won't work.
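
Putting the pieces together, a minimal sketch (assuming glPanel is the paint device from my first post):

// Sketch: with the GLX visual from main() in place, requesting alpha on the
// QGLFormat now succeeds instead of giving an invalid context.
QGLFormat format;
format.setAlpha(TRUE);
QGLContext * context = new QGLContext(format, (QPaintDevice *)glPanel);
if(!context->create() || !context->format().alpha())
    qWarning("still no alpha buffer -- check the GLX visual");
context->makeCurrent();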

Unfortunately, it doesn't look like there's a WIN32 equivalent of this constructor. I know from programming with OpenGL on WIN32 that you DO need to set the alpha bits there using the Pixel Format Descriptor.
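
For reference, a minimal WIN32 sketch of what I mean. This is standard PIXELFORMATDESCRIPTOR usage (needs windows.h); hdc is a hypothetical device context handle, not something from my code above:

// Win32 sketch: request destination alpha through the pixel format descriptor.
// hdc is assumed to come from GetDC() on the OpenGL window.
PIXELFORMATDESCRIPTOR pfd;
memset(&pfd, 0, sizeof(pfd));
pfd.nSize      = sizeof(PIXELFORMATDESCRIPTOR);
pfd.nVersion   = 1;
pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
pfd.iPixelType = PFD_TYPE_RGBA;
pfd.cColorBits = 32;
pfd.cAlphaBits = 8;   // the crucial field for a destination alpha channel
pfd.cDepthBits = 16;
int pf = ChoosePixelFormat(hdc, &pfd);
SetPixelFormat(hdc, pf, &pfd);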