
Define strings as unsigned

Hello,

I'm using a graphical display and have a printf()-style function which expects an unsigned char pointer: void LCD_DrawString(unsigned char* s);

Calling LCD_DrawString("Hello World"); generates a PC-Lint warning because the string constant is treated as signed by default.

Is there a modifier in C to treat constants like that as unsigned by default, the way the U suffix does for integer constants (e.g. 1000U)?

I'd like to avoid a cast each time I call this function just to please Lint, but on the other hand I don't want to disable this check in Lint.
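
For illustration, the cast I'm writing at every call site right now looks roughly like this (just a sketch):

    LCD_DrawString((unsigned char *)"Hello World");   /* cast only to keep Lint quiet */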

Regards,
Marcus

  • Seems appropriate to me to cast inside the function, where you suddenly decide you want to treat a "character" as an integer for a lookup index. Something like:

    void LCD_DrawString (char* string)
        {
        /* inside the drawing loop: */
        lut[(U8)string[i]];        /* explicit cast: char -> unsigned lookup index */
        }
    

    is not really burdensome or hard to read.

    That way, the function interface is accurate -- it expects strings, which is to say pointers to char. And the internal code is also type-correct, in that it explicitly notes the conversion of a character to an unsigned integer that happens to be its ASCII code value.

    There's no such thing as a "signed character" or an "unsigned character". What does a negative 'A' look like? Reverse field, light on a dark background? Or worse, a negative '-'? Is that a '+'?

    I much prefer to typedef 8-bit integer types ("U8" and "S8" for unsigned and signed) to keep the distinction between a character and an 8-bit integer type. Text characters can then just be declared as "char", without specifying signedness. Function signatures will automatically match no matter what the compiler's default for char is.
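
    A minimal sketch of that arrangement (the typedef names match the post; the loop, the const qualifier, and the comments are just illustrative):

        typedef unsigned char U8;   /* 8-bit unsigned integer */
        typedef signed   char S8;   /* 8-bit signed integer   */

        void LCD_DrawString (const char* s)   /* strings stay plain char */
            {
            while (*s != '\0')
                {
                U8 index = (U8)*s++;          /* explicit: character -> unsigned lookup index */
                /* ... fetch the glyph for 'index' from the font table and draw it ... */
                }
            }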

  • The macro sounded simple at first glance but isn't really a nice style. It doesn't seem to work with variable parameter lists either (see the sketch at the end of this post).

    I also defined my own types for signed and unsigned 8-, 16- and 32-bit integers to be independent of varying architectures.
    What bothers me is that when I started writing my code I used my 8-bit unsigned type for strings. The compiler just accepted it without any warning. And for the use of strings it doesn't matter whether they default to signed or unsigned, so I never questioned it.

    I think your suggestion is indeed the best solution. Treating the standard type char as the type for strings and characters (without caring about signed or unsigned) somehow never came to my mind :)
    Thx
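
    P.S. The kind of cast macro I mean looked roughly like this (a guess at its shape -- the original suggestion isn't quoted here):

        #define LCD_DRAW_STRING(s)   LCD_DrawString((unsigned char *)(s))   /* hypothetical wrapper name */

        /* Fine for a single-argument call, but a printf-style variadic  */
        /* function can't be wrapped this way without variadic macros.   */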