
Strange behaviour of 89C51RD with printf()

Hello,

I have an application consisting of an Atmel 89C51RD µC, 32K of external SRAM, and a memory-mapped I/O device at address 8000h (an SPC3 Profibus controller). Since the µC has 1K of integrated xdata RAM, I tried to save the cost of the external RAM. I have done the following to get it running:

1. Modified startup.asm to enable the XRAM. I have tested this with a simple test program; this works.
2. In the target options I specify xdata starting at 0000h with a length of 0400h (the 1K). I didn't specify the SPC3, which is explicitly placed at address 8000h with the _at_ keyword. Memory model: LARGE. (A rough sketch of this setup follows the list.)
3. It didn't work.
4. I then put some printf statements into the program (in main()) to locate the point of failure via the serial port, which I connected to a PC running a terminal program. And then it worked (!).
5. The problem is that whenever I remove these "debug" statements, it stops working. "Not working" means the controller resets, even though I have turned the watchdog off. The strange thing is the following: I have one working version with some printfs, and I know the point at which the program resets. If I remove a printf that comes after this point, it also stops working (!). I suspect some kind of memory overlap, a stray pointer, or simply a misconfiguration of the compiler or linker, but I really have no idea at the moment.
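
For reference, here is roughly what steps 1 and 2 look like in C51 terms (just a sketch - the AUXR address, the EXTRAM bit position and the SPC3 block size are assumptions based on typical '51 derivatives, not taken from the datasheets):

    #include <reg51.h>

    sfr AUXR = 0x8E;                          /* assumed SFR address of AUXR           */

    /* SPC3 mapped into xdata at 8000h and placed with _at_ (step 2);
       0x100 is only a placeholder for the real register block size */
    volatile unsigned char xdata spc3[0x100] _at_ 0x8000;

    /* ordinary variables land in the on-chip 1K XRAM (0000h..03FFh)
       because the memory model is LARGE and XDATA starts at 0000h  */
    unsigned int xdata counter;

    void enable_xram (void)                   /* step 1 is really done in startup.asm; */
    {                                         /* shown in C only for illustration      */
        AUXR &= ~0x02;                        /* EXTRAM = 0 -> internal XRAM is used   */
    }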

If someone has experience with using the internal xdata of the 89C51RD, or an idea, I would be thankful for a little help.

Thanks.

  • It seems that you have not initialized a local variable.
    I hope this can help you.

  • Possibly not related to your problem, but check the thread "Weird behaviour of 'printf' with unsigned char":
    http://www.keil.com/forum/msgpage.asp?MsgID=624

    Note that the Keil implementation of printf requires the arguments to be the correct size to match the format specifiers: eg, if you specify "%d" then you must provide a 16-bit value to go with it; if you only supply a char, printf will still take 16 bits making that and all further actual parameter values rubbish!

    Either cast the value to [unsigned] int, or use 'B' to specify a single byte in the format specifier.
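
    For example (just a sketch, with a made-up variable 'val'):

        #include <stdio.h>

        unsigned char val = 100;                   /* hypothetical variable */

        void show (void)
        {
            printf ("%d\n", val);                  /* risky: without integer promotion only one
                                                      byte is passed, so this and every following
                                                      argument prints rubbish                    */
            printf ("%d\n", (unsigned int) val);   /* OK: the cast widens the value to 16 bits   */
            printf ("%bu\n", val);                 /* OK: the 'b'/'B' modifier tells printf to
                                                      fetch only a single byte                   */
        }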

  • Note that the Keil implementation of printf requires the arguments to be the correct size to match the format specifiers: eg, if you specify "%d" then you must provide a 16-bit value to go with it;

    Are there other compilers that DON'T require this? :-)

    Jon

  • Well, if you leave ANSI integer promotion enabled (bad for performance), then the compiler will promote your char to int before passing it as a parameter, and printf will work as expected.

    As with any variadic function, you should explicitly cast any argument that does not match the type specifier (e.g., a char passed for a %d should be cast to int).
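
    A quick illustration of that rule (sketch only; 'buf' and 'ch' are made-up names):

        #include <stdio.h>

        char xdata buf[20];
        char ch = 'A';

        void demo (void)
        {
            /* cast explicitly so the actual argument matches the %d specifier,
               whether or not ANSI integer promotion is enabled               */
            sprintf (buf, "code=%d", (int) ch);
        }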

    - Mark

  • I'm not criticising Keil on this one; it's all quite natural and is well documented in the manuals. But it can still be a trap for the unwary (yes, I've fallen for it).

    I think other compilers - especially on "larger" target machines - get away with it by passing parameters on the stack and always pushing at least a "word" [1], thereby automatically "padding" a byte out to full width.

    Big/Little-endian could also make a difference.

    Of course, 'C' assumes everything is 'int' unless told otherwise. So an explicit cast or enabling integer promotion would "fix" it, as Mark said;
    or you can specify a single byte in the format specifier by using 'B'.

    [1] That's "word" in the generic sense of the machine's basic word size; eg,
    16 bits for an 808x;
    32 bits for a 386.