HDR (consumer 10-bit) displays are coming, and that jump will be like going from VHS to Betacam SP because there's so much more colour data (four times more per channel, in fact). Ten years from now, we're going to look at 8-bit colour the way we look at VHS now.
Windows 10 already supports HDR as of the Anniversary Update, without buying a FirePro or a Quadro, and I can imagine Linux won't be far off. Yeah, I know the textures aren't 10-bit, but there are texture mods to look forward to.
support 10-bits per channel
- psi29a
- Posts: 5362
- Joined: 29 Sep 2011, 10:13
- Location: Belgium
- Gitlab profile: https://gitlab.com/psi29a/
- Contact:
Re: support 10-bits per channel
We still have a VHS deck under the TV in the living room. Plenty of films still not yet released on DVD or Blu-Ray.
Re: support 10-bits per channel
Also, no offense, but if you want to push people toward a new technology, drawing your analogy from a format family famous for not becoming the consumer standard is probably not the greatest idea.
Regarding this: it would be an OSG thing anyway (I believe), and it's mostly a moot point. 10-bit data is not going to be output, because Morrowind and its content provide no 10-bit source material. Likewise, there are likely compatibility issues to iron out. Once it becomes common, OpenMW could look at adopting it, but right now there's no guarantee it provides anything that really matters to the OpenMW experience, and it would come at the cost of embracing cutting-edge (potentially never widely adopted) technology that the project would have to support on its own.
Re: support 10-bits per channel
This reminds me of a recent thread: viewtopic.php?f=3&t=3687.
Btw: since the number of bits refers to bits per channel, we end up with 64 times as many tones compared to 8 bits: 4 times per channel, and 4^3 = 64 across the three RGB channels. (Plus another 10 bits for the alpha channel? Or still 8 bits?)
Anyway, if I got that right, the increased bit count makes it possible to avoid typical banding issues even on 8-bit monitors (through dithering), provided your input data for the image processing is 10-bit.
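The arithmetic above, plus the banding effect, can be checked with a minimal Python sketch (not OpenMW code; just an illustration of the tone counts and of how a smooth ramp collapses into visible steps when quantized):

```python
# Sketch: per-channel and per-pixel tone counts for 8-bit vs 10-bit colour,
# plus a toy illustration of banding when a smooth ramp is quantized.

def tones(bits_per_channel, channels=3):
    """Distinct colours representable at the given bit depth."""
    return (2 ** bits_per_channel) ** channels

assert tones(8) == 2 ** 24           # ~16.7 million colours
assert tones(10) == 2 ** 30          # ~1.07 billion colours
assert tones(10) // tones(8) == 64   # 4x per channel, 4^3 = 64x overall

# Banding: a high-precision 0..1 ramp keeps only as many distinct
# steps as the output bit depth allows.
ramp = [i / 4095 for i in range(4096)]            # high-precision source
steps_8  = len({round(v * 255)  for v in ramp})   # 8-bit output levels
steps_10 = len({round(v * 1023) for v in ramp})   # 10-bit output levels
print(steps_8, steps_10)
```

A gradient spanning the screen thus gets four times as many intermediate steps at 10 bits, which is what pushes the bands below the visible threshold.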
- AnyOldName3
- Posts: 2678
- Joined: 26 Nov 2015, 03:25
Re: support 10-bits per channel
Don't shaders usually work with floats, then scale them to be between 0 and 1 (clamping if necessary for the intended exposure value), gamma correct them, multiply by 255, and then convert to int? If this is the case, even if the input data (e.g. textures) isn't HDR, you just change the multiplication to 1023, and bam, it works. Support for higher colour depth textures can be added later. The shading system is already creating values different to the original colours in the textures, so those with the hardware might as well use them.
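The output stage described above can be sketched in Python (a simplification: a plain power-law gamma rather than the exact sRGB transfer function, and the `encode` helper is made up for illustration). The point is that moving from 8-bit to 10-bit output only changes the final multiplier:

```python
# Sketch of the described output stage: a linear float colour channel is
# clamped to [0, 1], gamma-corrected, then scaled to the target bit depth.
# Note: uses a simple power-law gamma, not the exact sRGB curve.

def encode(linear_value, bits=8, gamma=2.2):
    """Clamp, gamma-encode, and quantize one colour channel."""
    v = min(max(linear_value, 0.0), 1.0)   # clamp to [0, 1]
    v = v ** (1.0 / gamma)                 # gamma correction
    return round(v * (2 ** bits - 1))      # x255 for 8-bit, x1023 for 10-bit

print(encode(0.5, bits=8))    # 8-bit code for linear mid-grey
print(encode(0.5, bits=10))   # same colour, ~4x finer quantization grid
```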
Re: support 10-bits per channel
Shaders do work on a normalized 0...1 floating point scale. And since color scaling and modulation is part of the rendering pipeline, an increased output color depth can be utilized even if the input textures are still 24-bit color, as long as the rendering buffers have enough precision.
Fun fact, though. While textures are typically 8 bits per color, they're also generally intended as sRGB color space, whose transfer curve is nonlinear (roughly a gamma/power curve). So for proper blending in linear color space, you'll need a rendering precision that's greater than 8bpc anyway to "unpack" such colors (converting 8-bit sRGB to 8-bit linear will cause massive precision loss). The rendered output then needs to be converted back to sRGB for output. Neither Morrowind nor OpenMW does that, though; they treat the input and output color spaces as linear, so while fully lit textures come through correctly (two errors canceling each other out), any in-between blending isn't working on the correct scale.
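That precision loss is easy to demonstrate with a small Python sketch (illustration only, using the standard sRGB decoding formula): decode every 8-bit sRGB code to linear light, requantize back to 8 bits, and count how many distinct linear codes survive. The nonlinear curve packs many codes into the dark end, where they collapse together at 8-bit linear precision.

```python
# Sketch of the precision loss described above: decoding 8-bit sRGB into
# 8-bit *linear* values collapses many dark shades into the same code,
# which is why linear-space blending buffers need more than 8bpc.

def srgb_to_linear(c):
    """Standard sRGB electro-optical transfer function (input in [0, 1])."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

# Decode all 256 sRGB codes, then requantize to 8-bit linear.
linear_codes = {round(srgb_to_linear(i / 255) * 255) for i in range(256)}
print(len(linear_codes))  # noticeably fewer than 256 distinct codes survive
```

With a 16-bit or floating-point intermediate buffer the round trip is lossless, which is the usual fix in linear-workflow renderers.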