Issue while decoding a U16 decimal value to its equivalent hex in an Intel HEX file (armcc compiler)

I am changing the value of a macro that is passed as a parameter to a function. The function parameter is of type UNSIGNED16. When I change the value of this macro within the U8 range, the change is clearly visible and easy to understand in the generated Intel HEX file.
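A minimal sketch of the setup, just for illustration (the macro, function, and typedef names here are placeholders, not my actual identifiers):

```c
/* Sketch only -- the real macro, function, and typedef names differ. */
typedef unsigned short UNSIGNED16;    /* 16-bit unsigned type */

#define MY_MACRO_VALUE  1000u         /* value I change, e.g. from 1000 to 300 */

static volatile UNSIGNED16 g_param;   /* stand-in for whatever the function does */

static void some_function(UNSIGNED16 param)
{
    g_param = param;                  /* the U16 value ends up in the image */
}

int main(void)
{
    some_function(MY_MACRO_VALUE);    /* macro passed as the U16 parameter */
    return 0;
}
```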

For example, if I change the value from 2 to 255, I can see 0x02 change to 0xFF in the hex file.

But when I change the value to something beyond the U8 range, I am not able to trace the equivalent value in the generated Intel HEX file.

For example, if I change the value from 1000 to 300, I can see the bytes 4f f4 80 70 change to 40 f2 01 10 in the generated hex file (data is stored in little-endian format). Since 1000 is 0x03E8 and 300 is 0x012C, I expected to be able to spot byte patterns like e8 03 or 2c 01, but I cannot find them in the changed bytes.
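
To show exactly what I was looking for, here is a small host-side check (standard C, not part of my target code; print_le_bytes is just an illustrative helper) that prints the little-endian byte pattern I expected to find for each value:

```c
/* Host-side illustration only: print the little-endian byte pattern of a U16. */
#include <stdio.h>

static void print_le_bytes(unsigned short value)
{
    printf("%u -> %02x %02x (little endian)\n",
           (unsigned)value,
           (unsigned)(value & 0xFFu),          /* low byte first  */
           (unsigned)((value >> 8) & 0xFFu));  /* high byte second */
}

int main(void)
{
    print_le_bytes(2u);     /* prints: 2 -> 02 00    */
    print_le_bytes(255u);   /* prints: 255 -> ff 00  */
    print_le_bytes(1000u);  /* prints: 1000 -> e8 03 */
    print_le_bytes(300u);   /* prints: 300 -> 2c 01  */
    return 0;
}
```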