
Why are there sRGB/RGB modes in the ASTC encoder?

In the ASTC encoder we can use -cl to compress an RGB texture and -cs to compress an sRGB texture, but why does the encoder need to be told whether a texture is sRGB or RGB? From the astcenc source I can see some differences in processing, but I don't understand why they are needed.
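
For reference, the two compressor invocations in question look like this (a sketch; the file names, the 6x6 block size, and the -medium preset are placeholders):

    astcenc -cl input.png output.astc 6x6 -medium   # linear (RGB) LDR
    astcenc -cs input.png output.astc 6x6 -medium   # sRGB LDR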

(1) The encoder tries to fit the texels to a line, which depends on the data distribution, so fitting an sRGB texture or an RGB texture should be no different.

(2) In the ASTC spec's CEM (Color Endpoint Mode) table there are only LDR and HDR modes, no LDR_sRGB mode, so sRGB and RGB textures share the same LDR CEMs.

  • For compression, you don't really need it, although it does influence rounding in error calculations.

    The reason we have it is that the decoder aims to be accurate to the hardware implementation, and for that, linear and sRGB round decompressed outputs at different points in the pipeline: linear uses the full 16-bit result, whereas sRGB uses the top 8 bits with rounding based off the 9th bit (see the sketch at the end of this reply).

    "In the ASTC spec's CEM table there are only LDR and HDR modes, no LDR_sRGB mode, so sRGB and RGB textures share the same LDR CEMs."

    The sRGB-ness of a texture is a property of the texture format set via the graphics API; it is not stored in the per-block encoded data (see the Vulkan example below).
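
    A minimal sketch of the two rounding behaviours described above, assuming v16 is the 16-bit interpolated color value; the function names and exact arithmetic are illustrative, not the actual astcenc implementation:

        #include <cstdint>
        #include <cstdio>

        // Linear LDR: the full 16-bit interpolation result is kept and
        // only normalized to a float at the end of the pipeline.
        static float decode_linear(uint16_t v16)
        {
            return static_cast<float>(v16) / 65535.0f;
        }

        // sRGB LDR: only the top 8 bits survive; the value is rounded
        // using the 9th bit (bit 7 of the 16-bit value) and clamped.
        static uint8_t decode_srgb(uint16_t v16)
        {
            uint32_t r = (static_cast<uint32_t>(v16) + 0x80u) >> 8;
            return static_cast<uint8_t>(r > 0xFFu ? 0xFFu : r);
        }

        int main()
        {
            // The same interpolated value rounds differently per mode.
            uint16_t v16 = 0x1280;
            printf("linear: %f\n", decode_linear(v16)); // ~0.072266
            printf("srgb:   %u\n", decode_srgb(v16));   // 19 (0x12 rounded up)
        }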

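    As a concrete illustration of that last point, in Vulkan the choice is made per texture format rather than per block. The helper function here is hypothetical, but the two VkFormat values are the real ASTC formats for a 4x4 block:

        #include <vulkan/vulkan.h>

        // The per-block payload is identical in both cases; only the
        // format chosen at image creation tells the hardware decoder
        // whether to apply the sRGB rounding and transfer function.
        VkFormat pick_astc_4x4_format(bool srgb)
        {
            return srgb ? VK_FORMAT_ASTC_4x4_SRGB_BLOCK
                        : VK_FORMAT_ASTC_4x4_UNORM_BLOCK;
        }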
