AnyOldName3 wrote: ↑16 Oct 2017, 03:51
Is gamma correct rendering an issue (disregarding the fact that Morrowind specifically doesn't do it)? I thought it was as simple as enabling GL_FRAMEBUFFER_SRGB and then it magically worked when you emit 32-bit floats from the fragment shader.

It's a little more involved than that. You need to load textures as sRGB instead of the normal RGB variants (RGB8, DXT1-RGB, etc.)*, which tells the gfx driver to apply the inverse gamma correction when sampling, so the shader receives linear-space color values (on the presumption that textures were authored to display as-is on a monitor, which naturally applies the same or a similar correction). The GL_FRAMEBUFFER_SRGB flag then tells the gfx driver to treat the render surface as linear space and apply gamma correction on output, so it displays correctly. Mathematically the two conversions cancel out when no intermediate processing is done, but any processing that is done in between (such as lighting or filtering) happens in a linear color space, which makes it easier to produce accurate color and lighting ramps.
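The decode/encode pair described above can be sketched with the standard sRGB transfer functions. This is just a minimal illustration of why the two conversions cancel out, not the driver's actual code:

```python
def srgb_to_linear(s):
    """Decode an sRGB-encoded value (0..1) to linear light,
    as the GPU does when sampling an sRGB texture."""
    return s / 12.92 if s <= 0.04045 else ((s + 0.055) / 1.055) ** 2.4

def linear_to_srgb(l):
    """Encode a linear value (0..1) back to sRGB, as
    GL_FRAMEBUFFER_SRGB does when writing to the render surface."""
    return l * 12.92 if l <= 0.0031308 else 1.055 * l ** (1 / 2.4) - 0.055

# With no intermediate processing, decode followed by encode
# is an identity (up to floating-point error):
for s in (0.0, 0.02, 0.5, 1.0):
    assert abs(linear_to_srgb(srgb_to_linear(s)) - s) < 1e-9
```

Note that srgb_to_linear(0.5) is roughly 0.214, not 0.5 — mid-gray on screen is only about a fifth of full linear intensity, which is exactly why lighting math done directly on sRGB values comes out wrong.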
However, given that a gamma-corrected image has a non-linear intensity ramp, smaller steps are possible between the lower intensity values while higher intensity values have bigger steps (the difference between Red=0 and Red=1 is not the same as the difference between Red=254 and Red=255, for example). As a result, linear 8bpc (bits per channel) can't represent certain intensity values that sRGB 8bpc can, and vice-versa. This effectively reduces the usable bit-depth at the dark end, causing color banding. The fix is to use a greater bit-depth for the linear-space render surface, which provides enough precision for those lost sRGB values to be represented again.
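The precision loss is easy to demonstrate numerically: decode every 8-bit sRGB code to linear and re-quantize it, and many dark codes collapse to the same value at 8 bits but stay distinct at a higher bit depth. A small sketch of that argument (the 16-bit target is just an example of "a greater bit-depth"):

```python
def srgb_to_linear(s):
    """Standard sRGB decode for a normalized value in 0..1."""
    return s / 12.92 if s <= 0.04045 else ((s + 0.055) / 1.055) ** 2.4

# Decode all 256 sRGB codes to linear light...
linear = [srgb_to_linear(c / 255) for c in range(256)]

# ...then store the results in a linear 8-bit vs 16-bit surface.
as_8bit = {round(v * 255) for v in linear}
as_16bit = {round(v * 65535) for v in linear}

# Several dark sRGB steps land on the same 8-bit linear value
# (visible as banding), but all 256 survive at 16 bits.
assert len(as_8bit) < 256
assert len(as_16bit) == 256
```

This is why linear-space intermediate render targets are typically RGBA16F (or at least something wider than 8 bits per channel) rather than RGBA8.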
* Strictly speaking, the texture still has to be allocated with an sRGB internal format; what the EXT_texture_sRGB_decode extension then adds is the ability to turn the sRGB-to-linear conversion off again, per texture or (with GL3 sampler objects) per sampler, by setting GL_TEXTURE_SRGB_DECODE_EXT to GL_SKIP_DECODE_EXT. That lets you decide at sampling time whether the gfx driver treats the data as sRGB or as raw values.