
difference between short and char

Hi all,

What is the difference between the data types char and short? According to the book, they are both 8 bits and have the exact same range.

Thanks,
Steve

  • oops, my bad.....

    I meant to say the difference between short and int. They are both 16 bits. Is it just a semantic difference?

    Thanks
    Steve


  • Which book? My C51 compiler manual PDF (version 7.02) says that a short is 16 bits.

    ANSI C requires a short to be at least 16 bits, and a long to be at least 32.

    In any event, if a short were 8 bits wide, it would be the same as a char -- just as in actuality, a "short" is the same as an "int" on this particular compiler.
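
    A quick way to see what your own compiler actually does is to print the sizes. Here's a minimal sketch (it assumes a hosted compiler with printf available, so it isn't C51-specific; the numbers you get depend on the toolchain):

    #include <stdio.h>

    int main(void)
    {
        /* sizeof reports widths in bytes; per the C51 manual figures above,
           short and int should both come out as 2 (i.e. 16 bits) */
        printf("char=%u short=%u int=%u long=%u\n",
               (unsigned)sizeof(char), (unsigned)sizeof(short),
               (unsigned)sizeof(int), (unsigned)sizeof(long));
        return 0;
    }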

    Most programmers I know adopted a habit of typedef'ing specific integer sizes as appropriate for their platform, and then using those typedefs instead. Something along the lines of:

    typedef unsigned char U8;
    typedef signed char S8;
    typedef unsigned short U16;
    typedef signed short S16;
    typedef unsigned long U32;
    typedef signed long S32;

    That way you always know what you're dealing with, and don't have as much trauma moving from platform to platform. Of course, there's lots of variation: uint8, UBYTE / UWORD / ULONG, and so on.

    The ISO C99 standard also has a bunch of new integer types in <inttypes.h>, which serve the same purpose, among others. (intptr_t, for instance: an integer guaranteed to be wide enough to hold a pointer.)
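
    For illustration, a minimal sketch of those types in use (assuming a hosted C99 toolchain rather than C51 itself; <inttypes.h> pulls in <stdint.h>, which is where the exact-width names live):

    #include <inttypes.h>
    #include <stdio.h>

    int main(void)
    {
        uint8_t  flags = 0x3Cu;             /* exactly 8 bits, unsigned  */
        int16_t  delta = -1234;             /* exactly 16 bits, signed   */
        uint32_t ticks = 100000UL;          /* exactly 32 bits, unsigned */

        int      value = 42;
        intptr_t addr  = (intptr_t)&value;  /* integer wide enough to round-trip a pointer */

        printf("flags=%u delta=%d ticks=%lu addr=%p\n",
               (unsigned)flags, (int)delta, (unsigned long)ticks, (void *)addr);
        return 0;
    }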

  • "Of course, there's lots of variation: uint8, UBYTE / UWORD / ULONG, and so on."

    Absolutely - after all, they're just names!

    However, things like "Word", "DWord", "Long", etc. have the definite disadvantage that they are ambiguous: they do not explicitly state the size of the object - which is, after all, the whole point of the exercise!

    Therefore I always use and recommend (as a 'Search' will show!) U8, S16, etc. - as they show both size and signedness clearly, explicitly, and succinctly.
    What more could you want?!

  • That's kinda what I thought... just semantics, but I just wanted to make sure. Thanks for the help.

    Steve


  • I also am not fond of the UBYTE / UWORD style. "WORD", in particular, is really problematic. It ought to mean the natural bus width of the machine -- probably the data bus -- but a lot of people use it to mean "16 bits", even on eight- or 32-bit machines. (And then they proceed to use DWORD to mean "32 bits", since it's twice a "WORD".) Then you stack that usage on top of some hardware device that has a different word width, and whose documentation and/or sample code also keeps using the word "WORD". Just too confusing, especially since the whole point of the series of typedefs is to give you a way to specify a known, exact width.

    C99 has standardized a scheme in <inttypes.h> that serves this purpose -- uint8_t, etc. I'm still trying to talk myself into learning to use it; I've gotten too settled in "my" way. Besides, I dislike the "_t" ANSI-ism. (I know it's a type name, okay? That's why it comes in front of the variable name or inside the cast... You don't have to bludgeon me about the head and shoulders with the fact.)
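
    For what it's worth, a minimal sketch of that scheme in use (hosted C99 compiler assumed); the main thing <inttypes.h> adds over <stdint.h> is the matching printf/scanf format macros:

    #include <inttypes.h>
    #include <stdio.h>

    int main(void)
    {
        uint8_t  status = 0xA5u;
        int16_t  offset = -512;
        uint32_t count  = 1000000UL;

        /* PRIx8 / PRId16 / PRIu32 expand to the correct conversion
           specifiers for each exact-width type */
        printf("status=0x%" PRIx8 " offset=%" PRId16 " count=%" PRIu32 "\n",
               status, offset, count);
        return 0;
    }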

    As always, it boils down to whatever works for you.