Hi,
I'm using uVision IDE V4.53.0.0 and the sprintf() function behaves strangely... my piece of code:
_disable_irq();
len = sprintf((char*)myBuf, "%+2.2fV %+1.3fA", 9.13788, -0.004069);
_enable_irq();
This will create a string in myBuf:
"+9.14V -0.004A"
This code is called every 500 ms. Suddenly, after a minute or so, it generates:
"-7265617330774872000000/0/0000000.000000000/0/000000000.00V +17574296777337024000/0/0000.000A"
where /0/ stands for about 40 zeros.
Also, 'len' contains a value of 66 (instead of 15), which is the position of the first '.' in the resulting string.
As all interrupts are disabled, the variables cannot be overwritten by some interrupt handler.
Does anybody know what could cause this?
Thanks,
Henk
Are there any other calls to sprintf() that do not disable interrupts? Maybe sprintf() relies on some global variables that are already in an unstable state. Or maybe the floating-point code somewhere is in an unstable state when you get to this code sequence. Or your code may have overwritten some global variables used by the C run-time library.
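For example (purely hypothetical names), something like this elsewhere in the project would be enough to break the assumption that the protected call above is safe:

#include <stdio.h>

/* Hypothetical helper - the point is only the unprotected call. */
void LogTemperature(float temp)
{
    char msg[24];
    sprintf(msg, "T=%+2.1fC", temp);   /* no _disable_irq() around this one */
    /* ...msg is sent out over the UART here... */
}

If an interrupt (or another task) preempts this call while it is inside the floating-point formatting code, any state shared with your protected call could already be inconsistent by the time your code runs.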
Stack overflow?
I increased the stack by 512 bytes, but the problem still exists.
I modified the code so that when the conversion fails, it immediately re-executes the sprintf() call, but it still returns the same erroneous string.
The only cause I can think of is that the sprintf() function declares some static variables which are used for conversion and that my program somehow overwrites one or more of these static variables.
So for now I have quit using sprintf() for float values and have written my own conversion function, which runs rock solid.
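For illustration, a conversion roughly along these lines is enough for my purpose (a simplified sketch, not my exact code):

#include <stdint.h>

/* Simplified sketch, not the actual function.  Converts a float to text
   with a fixed number of decimals using integer arithmetic only, so it
   does not depend on the library's %f machinery.  Works for the small
   value ranges used here (it would overflow for very large values). */
static int float_to_str(char *out, float val, unsigned dec)
{
    char tmp[12];
    int n = 0, t = 0;
    uint32_t scale = 1, fixed, ip, fp, d;
    unsigned i;

    if (val < 0.0f) { out[n++] = '-'; val = -val; }
    else            { out[n++] = '+'; }             /* forced sign, like "%+f" */

    for (i = 0; i < dec; i++) scale *= 10;

    fixed = (uint32_t)(val * (float)scale + 0.5f);  /* scaled and rounded */
    ip = fixed / scale;                             /* integer part       */
    fp = fixed % scale;                             /* fractional part    */

    do { tmp[t++] = (char)('0' + ip % 10); ip /= 10; } while (ip);
    while (t) out[n++] = tmp[--t];

    if (dec) {
        out[n++] = '.';
        for (d = scale / 10; d; d /= 10)            /* zero-padded digits */
            out[n++] = (char)('0' + (fp / d) % 10);
    }
    out[n] = '\0';
    return n;
}

float_to_str(buf, 9.13788f, 2) then gives "+9.14", and the caller appends the unit character.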
Maybe 512 wasn't enough...?
A simple way to catch a stack overflow is to put a data-access breakpoint just beyond the "expected" end of the stack; then, if that breakpoint ever "fires", you know you've got an overflow.
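Another option is to do it in software: fill the stack area with a known pattern at start-up and then check, from time to time, how much of the pattern is still untouched. A rough sketch (Stack_Mem and STACK_WORDS are only examples; your startup code defines the real stack area, which grows downward on Cortex-M):

#include <stdint.h>

#define STACK_FILL   0xDEADBEEFu
#define STACK_WORDS  256u                    /* example: 1 KB stack        */
extern uint32_t Stack_Mem[STACK_WORDS];      /* example: lowest stack word */

/* Fill the unused part of the stack with the pattern.  Call this very
   early; the address of a local variable is used as a rough stack
   pointer so the part already in use is not touched. */
void stack_fill(void)
{
    uint32_t marker;
    uint32_t *p;
    for (p = Stack_Mem; p + 16 < &marker; p++)
        *p = STACK_FILL;
}

/* Number of words at the bottom of the stack that were never written;
   a value near 0 means the stack has (almost) overflowed. */
uint32_t stack_headroom(void)
{
    uint32_t n = 0;
    while (n < STACK_WORDS && Stack_Mem[n] == STACK_FILL)
        n++;
    return n;
}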
len = sprintf((char*)myBuf,"%+2.2fV %+1.3fA",9.13788,-0.004069);
Why do you have to cast the buffer pointer?
What is that hiding...?
Except, of course, if the stack overflows by a giant leap right across your trap, instead of walking nice and slowly into it...
That's a good question. Let me explain:
I can't remember whether a char or an int is signed or unsigned by default; I think it depends on the compiler. It also depends on the platform whether an int is 16-bit or 32-bit.
So I decided to replace all standard variable types with typedefs that name the signedness and the bit width. The typedefs I use are: u08, u16, u32 for unsigned; s08, s16, s32 for signed;
so my u08 is of type uint8_t.
myBuf is declared as u08 (uint8_t).
The compiler generates a warning when a u08 buffer is passed to sprintf(), so I cast it to the expected char*.
Correction:
All the typedefs I use are: u08, u16, u32 for unsigned; s08, s16, s32 for signed; pu08, pu16, pu32 for pointers to unsigned; ps08, ps16, ps32 for pointers to signed;
so my pu08 is of type uint8_t*.
myBuf is declared as pu08 (uint8_t*).
Therefore I cast it to the requested char*.
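Put together, the situation looks roughly like this (a trimmed-down sketch; the function name and buffer size are just examples):

#include <stdint.h>
#include <stdio.h>

typedef uint8_t   u08;
typedef uint8_t  *pu08;

static u08  buffer[64];
static pu08 myBuf = buffer;      /* the buffer is reached through a pu08 */

void format_readings(void)
{
    int len;
    /* sprintf() expects a char*, so the uint8_t* is cast to silence the
       pointer-signedness warning from the compiler. */
    len = sprintf((char*)myBuf, "%+2.2fV %+1.3fA", 9.13788, -0.004069);
    (void)len;
}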
I had the same problem. I use RTX, and for all tasks that call sprintf() with floats I now specify a user-defined stack of 1 kByte, after checking the stack usage and finding that it can sometimes grow beyond 512 bytes.
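With the classic RTX kernel API that looks roughly like this (task names and the delay value are just examples; CMSIS-RTOS RTX would use osThreadDef with a stack size instead):

#include <RTL.h>

static U64 print_stack[1024/8];          /* 1 kByte user stack, 8-byte aligned */

__task void print_task(void)
{
    for (;;) {
        /* the sprintf() calls with floats go here */
        os_dly_wait(50);                 /* e.g. 500 ms with a 10 ms tick */
    }
}

__task void init_task(void)
{
    /* give the printing task its own 1 kByte stack instead of the default */
    os_tsk_create_user(print_task, 1, &print_stack, sizeof(print_stack));
    os_tsk_delete_self();
}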