Enums and Ints


Bug, or just weird subtlety of ANSI C when it comes to enums and ints?

typedef enum
    {
    es16_1, es16_2, es16_3,

    es16_Force16 = 0x7fff   /* intended to force a 16-bit enum */
    } EnumS16;

typedef enum
    {
    eu16_1, eu16_2, eu16_3,

    eu16_Force16 = 0xffff   /* intended to force a 16-bit enum */
    } EnumU16;


With C51 v7.02b, sizeof(EnumS16) == 2, as you would expect. However, sizeof(EnumU16) == 1.
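A quick way to see this (just a sketch; on the 8051 target, printf needs the usual putchar retargeting, and the typedefs above are assumed to be in scope):

#include <stdio.h>

void show_sizes(void)
    {
    /* Reportedly prints 2 and 1 under C51 7.02b. */
    printf("sizeof(EnumS16) = %u\n", (unsigned)sizeof(EnumS16));
    printf("sizeof(EnumU16) = %u\n", (unsigned)sizeof(EnumU16));
    }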

Is this a compiler bug, or a consequence of ANSI C's integer/enum rules? Enumeration constants are, after all, a kind of int (as opposed to unsigned int). It would be reasonable, then, to reject the explicit value 0xffff as unrepresentable and issue a compiler error. But the code compiles. So you might consider 0xffff the hard way to write -1 -- which is still 16 bits wide. On the other hand, perhaps it really is treated as -1 and stored as 0xff in a one-byte integer, which the enum's range would otherwise allow. On the third hand, ANSI C doesn't provide for variable-sized enums at all (though any decent embedded compiler supports them), so perhaps the interpretation is simply up to Keil.
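To make that second interpretation concrete (purely hypothetical -- this assumes the compiler reads the constant as -1 and packs the enum into one byte):

EnumU16 e = eu16_Force16;       /* intended value: 0xffff */

/* Under the one-byte interpretation, e holds only a single byte;
   whether widening gives back 0xffff or 0x00ff then depends on
   whether that byte is treated as signed or unsigned. */
unsigned int u = (unsigned int)e;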

I'd think the second example rates at least a warning message; it surprised me, anyway.
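One defensive workaround (a sketch, not verified against C51): force the width with a value that is representable as a signed 16-bit int, and back it up with a C90-style compile-time size check.

typedef enum
    {
    eu16_1, eu16_2, eu16_3,

    eu16_Force16 = 0x7fff       /* fits in a signed 16-bit int, so the
                                   enum reliably needs two bytes */
    } EnumU16;

/* Compile-time check: the array size is illegal (-1) if the
   enum is not two bytes wide, so compilation fails loudly. */
typedef char assert_EnumU16_is_2_bytes[(sizeof(EnumU16) == 2) ? 1 : -1];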
