11-17-2010 12:29 PM
From code or from OpenGL?
With the format being 0xAARRGGBB.
Use GL_UNSIGNED_BYTE for "type" and GL_RGBA for "format" and "internalformat".
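As a minimal sketch of what that means in practice: if the pixels really are packed as 0xAARRGGBB integers, the bytes in memory won't line up with GL_RGBA order on a little-endian machine, so you'd typically reshuffle them first. The function name here is hypothetical, just for illustration.

```c
#include <stdint.h>
#include <stddef.h>

/* Unpack 0xAARRGGBB pixels into the R,G,B,A byte order that
   glTexImage2D(..., GL_RGBA, GL_UNSIGNED_BYTE, ...) expects,
   independent of host endianness. */
static void argb_to_rgba_bytes(const uint32_t *src, uint8_t *dst, size_t count)
{
    for (size_t i = 0; i < count; ++i) {
        uint32_t p = src[i];
        dst[4*i + 0] = (uint8_t)(p >> 16); /* R */
        dst[4*i + 1] = (uint8_t)(p >> 8);  /* G */
        dst[4*i + 2] = (uint8_t)(p);       /* B */
        dst[4*i + 3] = (uint8_t)(p >> 24); /* A */
    }
}
```

The resulting byte array can then be handed to glTexImage2D with GL_RGBA for "format" and "internalformat" and GL_UNSIGNED_BYTE for "type", as described above.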
11-19-2010 09:46 AM
From what I can tell, it's not stored in the Bitmap in RGBA8888 format; it's stored in RGB565 format with a separate "layer" (as stated previously) for the alpha.
The function converts the values from RGB565 to RGB888 when it is called. It's not too hard to convert from 565 to 888 (http://stackoverflow.com/questions/2442576/how-doe
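The 565-to-888 conversion mentioned above can be sketched like this (bit replication is one common way to fill in the low bits so full-scale values map to 0xFF):

```c
#include <stdint.h>

/* Expand a 16-bit RGB565 pixel to a 24-bit 0x00RRGGBB value.
   Replicating the high bits into the low bits ensures that
   pure white (0xFFFF) expands to exactly 0xFFFFFF. */
static uint32_t rgb565_to_rgb888(uint16_t p)
{
    uint32_t r = (p >> 11) & 0x1F;
    uint32_t g = (p >> 5)  & 0x3F;
    uint32_t b =  p        & 0x1F;
    r = (r << 3) | (r >> 2);  /* 5 bits -> 8 bits */
    g = (g << 2) | (g >> 4);  /* 6 bits -> 8 bits */
    b = (b << 3) | (b >> 2);  /* 5 bits -> 8 bits */
    return (r << 16) | (g << 8) | b;
}
```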
I'll leave it to you to search for the post, but someone a while ago noted that using alpha causes a large performance hit when drawing with Graphics. I can only presume this has to do with the conversion.
Instead of Bitmap->drawBitmap simply copying pixels straight from the Bitmap into the screen buffer, with alpha Bitmap->drawBitmap needs to convert the Bitmap's values to RGB888 and merge in the alpha, do the same conversion for the screen-buffer pixels, apply the alpha to blend the two, then convert the result back into RGB565 and write it into the screen buffer.
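That expand-blend-repack round trip can be sketched per pixel like this (a minimal illustration of the cost, not the actual drawBitmap implementation; the function names are made up here):

```c
#include <stdint.h>

/* Pack 8-bit channels back into RGB565 by truncating the low bits. */
static uint16_t rgb888_to_rgb565(uint32_t r, uint32_t g, uint32_t b)
{
    return (uint16_t)(((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3));
}

/* Blend one source pixel (RGB888 color plus its 8-bit alpha from the
   separate alpha layer) onto an RGB565 destination pixel:
   expand the destination, blend "source over destination", repack. */
static uint16_t blend565(uint16_t dst,
                         uint32_t sr, uint32_t sg, uint32_t sb, uint32_t a)
{
    /* expand destination 565 -> 888 with bit replication */
    uint32_t dr = (dst >> 11) & 0x1F; dr = (dr << 3) | (dr >> 2);
    uint32_t dg = (dst >> 5)  & 0x3F; dg = (dg << 2) | (dg >> 4);
    uint32_t db =  dst        & 0x1F; db = (db << 3) | (db >> 2);

    /* standard alpha blend, a in [0, 255] */
    uint32_t r = (sr * a + dr * (255 - a)) / 255;
    uint32_t g = (sg * a + dg * (255 - a)) / 255;
    uint32_t b = (sb * a + db * (255 - a)) / 255;

    return rgb888_to_rgb565(r, g, b);
}
```

Doing all of that for every pixel, instead of a plain memory copy, is consistent with the performance hit described above.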
Hope that clears up the supposed contradiction.