Why does data type conversion not raise an error?

Hi, I have this code:

void main(void)
{
    data unsigned char a = 0;
    data unsigned char b = 0;
    data unsigned int  c = 1000;

    b = c;

    while (1) {
        a += 1;
    }
}

And it compiles with no errors or warnings. Why is this? Why does it not raise an error when the 16-bit value c is stored in the 8-bit variable b? INTPROMOTE is turned off, so that is not it.

Yours, confused.

Robbie Martin.

  • Why should it?

    If this were considered an error, how would a compiler be able to use the normal fgetc() etc. to read characters from a file and assign the value to a character? fgetc() et al. return an int, not a char.
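    As a sketch of why this matters in everyday C (the file name here is hypothetical), the idiomatic fgetc() loop assigns an int result to a char in exactly the way the question asks about:

    ```c
    #include <stdio.h>

    /* fgetc() returns int so it can represent every possible character
       value AND the out-of-band EOF marker in the same return type. */
    int main(void)
    {
        FILE *f = fopen("input.txt", "r");  /* hypothetical file name */
        if (f == NULL)
            return 1;

        int ch;                     /* must be int to distinguish EOF */
        while ((ch = fgetc(f)) != EOF) {
            char c = (char)ch;      /* int -> char: the conversion in question */
            putchar(c);
        }
        fclose(f);
        return 0;
    }
    ```

    If that int-to-char assignment were a hard error, every such loop would need a cast just to compile.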

    The compiler could potentially note that the destination is too small and issue a warning about the loss of significant bits. However, the standard says in 5.1.2.3, paragraph 10:

    "EXAMPLE 2: In executing the fragment
    char ch1, ch2;
    /* ... */
    c1 = c1 + c2;
    the 'integer promotions' require that the abstract machine promote the value of each variable to int size and then add the two ints and truncate the sum. [...]"
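    Applied to the code in the question, the unsigned-to-unsigned conversion simply keeps the low 8 bits, i.e. the value modulo 256 — a minimal sketch using standard C types rather than the C51 `data` specifier:

    ```c
    #include <stdio.h>

    int main(void)
    {
        unsigned int  c = 1000;   /* 1000 = 0x03E8 */
        unsigned char b = c;      /* keeps the low byte: 0xE8 = 232 */

        printf("%u\n", b);        /* prints 232 (1000 mod 256) */
        return 0;
    }
    ```

    This truncation is well defined for unsigned types, which is another reason it is a (possible) warning rather than an error.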

    How fun would it be to work with characters if the compiler on one hand is required to convert characters to int, and on the other hand would always require a typecast before the assignment (just to acknowledge that it is performing the truncation the standard requires) to suppress a warning?

    Having a compiler that may deviate from the standard and not promote characters to int is a trade-off to reduce code size on the tiny 8051 processor. But the trade-off has to be as compatible as possible, since you want the code to behave the same as long as an overflow doesn't occur. Requiring a typecast when using the C51 compiler, but no typecast when using a standards-compliant compiler, would be quite strange, don't you think?
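    That said, if you want the truncation to be visible in the source, an explicit cast (and, where it helps readability, an explicit mask) documents the intent without changing the result — a sketch:

    ```c
    #include <stdio.h>

    int main(void)
    {
        unsigned int  c = 1000;
        unsigned char b, b2;

        b = (unsigned char)c;            /* "yes, I mean to drop the high byte" */

        /* equivalent, and arguably more explicit about which bits survive: */
        b2 = (unsigned char)(c & 0xFFu);

        printf("%u %u\n", b, b2);        /* both print 232 */
        return 0;
    }
    ```

    Both forms compile to the same narrowing store; the cast just tells the reader (and any lint tool) that the truncation is intentional.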

