
ASTC, difference between LDR and HDR

I'm looking into the details of ASTC and planning to develop it in hardware. The spec, version 1.0, says it has an LDR and a Full profile. What is the difference between the LDR and HDR modes? What is the Full profile, then?

How does each mode process data? What is the input data format of LDR mode? Does HDR accept only 32-bit IEEE 754 floating-point numbers? How many partitions can a block have? And how about decoding? The spec says decoding has to be bit exact. If a lot of floating-point operations are required, is bit-exact decoding possible, given that fixed-point implementations only approximate floating-point calculations?

  • Ben,

    The three profiles for ASTC are supersets of each other, as follows (see the sketch after the list):

    • LDR profile supports only 2D LDR (8-bit color) encoding modes.
    • HDR profile adds HDR, so 2D LDR and HDR (16-bit color) encoding modes are supported.
    • Full profile adds 3D, so 2D and 3D, LDR and HDR encoding modes are supported.
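
    As a purely illustrative sketch of that superset relationship (the enum and function names below are my own, not from the specification), the profile rules can be expressed as a capability check in C:

        #include <stdbool.h>

        typedef enum {
            ASTC_PROFILE_LDR,   /* 2D, 8-bit LDR modes only     */
            ASTC_PROFILE_HDR,   /* adds 2D HDR (16-bit color)   */
            ASTC_PROFILE_FULL   /* adds 3D block support        */
        } astc_profile;

        /* True if a block using the given features decodes under profile p. */
        static bool astc_mode_supported(astc_profile p, bool is_hdr, bool is_3d)
        {
            if (is_3d && p != ASTC_PROFILE_FULL) return false;
            if (is_hdr && p == ASTC_PROFILE_LDR) return false;
            return true;
        }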

    For input data, the evaluation codec (available from ASTC Evaluation Codec) takes 8-bit UNORM values for LDR, input as an image, usually in TGA, BMP, GIF or PNG format. For HDR, we take 16-bit pixel values in KTX or DDS formats. The encoder itself works with 16-bit IEEE 754 floats, which are stored in the ASTC data in a pseudo-logarithmic internal format.
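
    To clarify the "32-bit IEEE 754" point from the question: HDR content is handled as 16-bit (half-precision) IEEE 754 floats, laid out as 1 sign bit, 5 exponent bits and 10 mantissa bits. A minimal reference decoder for that standard layout is sketched below (this shows plain FP16 only; the pseudo-logarithmic internal encoding is a separate, spec-defined integer format):

        #include <math.h>
        #include <stdint.h>

        /* Decode a 16-bit IEEE 754 half-float bit pattern to double.
         * Layout: 1 sign bit, 5 exponent bits, 10 mantissa bits. */
        static double half_to_double(uint16_t h)
        {
            int sign = (h >> 15) & 0x1;
            int exp  = (h >> 10) & 0x1F;
            int man  =  h        & 0x3FF;
            double s = sign ? -1.0 : 1.0;

            if (exp == 0)        /* zero or subnormal */
                return s * ldexp((double)man, -24);
            if (exp == 31)       /* infinity or NaN */
                return man ? NAN : s * INFINITY;
            return s * ldexp(1.0 + man / 1024.0, exp - 15);
        }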

    A block may have from 1 to 4 partitions, each with a separate set of color endpoints. In addition, there is the option to specify a second set of weights for one channel of the image data. This allows more flexible encoding for textures with uncorrelated channels, such as X+Y for normal mapping, or L+A or RGB+A for decals.
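
    As a hypothetical sketch of the per-block degrees of freedom just described (the field names are illustrative, not taken from the specification):

        #include <stdint.h>

        typedef struct {
            int      partition_count;     /* 1..4 partitions per block       */
            int      partition_index;     /* seed selecting the partition
                                             pattern (when count > 1)        */
            uint16_t endpoints[4][2][4];  /* per partition: two RGBA color
                                             endpoints, expanded to 16 bits  */
            int      dual_plane;          /* nonzero: a second weight grid   */
            int      plane2_component;    /* which channel (e.g. A) uses the
                                             second set of weights           */
        } astc_block_config;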

    The requirement for bit-exactness was requested by the content developers, as it makes it very much easier to qualify content for multiple platforms if the output is guaranteed. The decoder is specified very exactly using integer operations on the internal representation of HDR data, which synthesise the floating-point operations. This allows us to specify the exact bit-patterns delivered to the filtering stage of the texture pipeline. After that, of course, we have to place our trust in the filter implementation.
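
    For a flavour of how integer-only decode sidesteps the fixed-point versus floating-point mismatch raised in the question, here is a sketch of weight interpolation in the style of the ASTC decode step, assuming endpoints expanded to 16 bits and weights expanded to the range 0..64:

        #include <stdint.h>

        /* Integer-only interpolation between two expanded endpoints.
         * Everything is adds, multiplies and shifts, so every conforming
         * implementation produces identical bit patterns. */
        static uint16_t interpolate(uint16_t e0, uint16_t e1, uint16_t w /* 0..64 */)
        {
            return (uint16_t)(((uint32_t)e0 * (64 - w)
                             + (uint32_t)e1 * w + 32) >> 6);
        }

    Since no floating-point arithmetic is involved at this stage, there is no approximation to diverge between implementations, which is what makes the bit-exactness guarantee practical.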

    I hope that this is helpful.
