Working with a Freescale Kinetis K60. With file optimization set to level 1, "optimize for time" on, and cross-module optimization off, we see a RAM variable that gets corrupted. The variable is passed by value as an argument to a function, and when the function returns, it has changed unexpectedly. Note that this parameter is not a pointer. The problem does not occur when optimization is off. Here is the code:
AoVal = (uint16_t) par1;
if (LimitAoSet(Console, AoVal)) {          // AoVal is not changed here
    Ux_putline(Console, "AO1 set ");
    Ux_putdec2ctz(Console, 2, (int)AoVal); // AoVal gets corrupted here
    SetAo1(AoVal);
}
Why does this happen?
Is Ux_putdec2ctz() a function, or a #define'd macro? And what does the prototype look like?
Ux_putdec2ctz is not a macro. It is a function. The prototype: uint8_t Ux_putdec2ctz (uint8_t comport, uint8_t dotpos, int c);
I did increase the heap size, the RTX stack, and the task-specific stack size. This made no difference, so I think I have ruled that out. The only way to avoid the problem is to compile at optimization level 0 (-O0). At optimization level 'default' the problem persists.
Here are the disassembly listings with and without optimization. There are branches to other code, so I'm not sure the clue is to be found here.
Disassembly listing with optimization level 1:

  1360:       case 100:   /* set ao1, analog1 output (10uA steps) */
0x0000EFC0 F7FFBB2B  B.W      0x0000E61A
  1361:         AoVal = (uint16_t) par1;
0x0000EFC4 B2AC      UXTH     r4,r5
  1362:         if (LimitAoSet(ConsPort, AoVal)) {
0x0000EFC6 4621      MOV      r1,r4
0x0000EFC8 982D      LDR      r0,[sp,#0xB4]
0x0000EFCA F7FAFC17  BL.W     LimitAoSet (0x000097FC)
0x0000EFCE 2800      CMP      r0,#0x00
0x0000EFD0 F43FAB23  BEQ.W    0x0000E61A
  1363:           Ux_putline(ConsPort, "AO1 set ");
0x0000EFD4 A15F      ADR      r1,{pc}+4   ; @0x0000F154
0x0000EFD6 982D      LDR      r0,[sp,#0xB4]
0x0000EFD8 F7F9FD3E  BL.W     Ux_putline (0x00008A58)
  1364:           Ux_putdec2ctz(Console, 2, AoVal);
0x0000EFDC 4826      LDR      r0,[pc,#152]  ; @0x0000F078
0x0000EFDE 4622      MOV      r2,r4
0x0000EFE0 2102      MOVS     r1,#0x02
0x0000EFE2 7800      LDRB     r0,[r0,#0x00]
0x0000EFE4 F7FAFBB0  BL.W     Ux_putdec2ctz (0x00009748)
  1365:           Ux_putline_eol(ConsPort, " mA");
0x0000EFE8 A15D      ADR      r1,{pc}+4   ; @0x0000F160
0x0000EFEA 982D      LDR      r0,[sp,#0xB4]
0x0000EFEC F7F9FD94  BL.W     Ux_putline_eol (0x00008B18)
  1366:           SetAo1(AoVal); }
  1367:       break;

Disassembly listing with optimization level 0:

  1360:       case 100:   /* set ao1, analog1 output (10uA steps) */
0x00011C14 E57D      B        0x00011712
  1361:         AoVal = (uint16_t) par1;
0x00011C16 9834      LDR      r0,[sp,#0xD0]
0x00011C18 B280      UXTH     r0,r0
0x00011C1A 902B      STR      r0,[sp,#0xAC]
  1362:         if (LimitAoSet(ConsPort, AoVal)) {
0x00011C1C 992B      LDR      r1,[sp,#0xAC]
0x00011C1E 9833      LDR      r0,[sp,#0xCC]
0x00011C20 F7F9FF08  BL.W     LimitAoSet (0x0000BA34)
0x00011C24 B180      CBZ      r0,0x00011C48
  1363:           Ux_putline(ConsPort, "AO1 set ");
0x00011C26 A1CB      ADR      r1,{pc}+2   ; @0x00011F54
0x00011C28 9833      LDR      r0,[sp,#0xCC]
0x00011C2A F7F8FE09  BL.W     Ux_putline (0x0000A840)
  1364:           Ux_putdec2ctz(Console, 2, AoVal);
0x00011C2E 2102      MOVS     r1,#0x02
0x00011C30 48CB      LDR      r0,[pc,#812]  ; @0x00011F60
0x00011C32 7800      LDRB     r0,[r0,#0x00]
0x00011C34 9A2B      LDR      r2,[sp,#0xAC]
0x00011C36 F7F9FDAD  BL.W     Ux_putdec2ctz (0x0000B794)
  1365:           Ux_putline_eol(ConsPort, " mA");
0x00011C3A A1CA      ADR      r1,{pc}+2   ; @0x00011F64
0x00011C3C 9833      LDR      r0,[sp,#0xCC]
0x00011C3E F7F8FE60  BL.W     Ux_putline_eol (0x0000A902)
  1366:           SetAo1(AoVal); }
0x00011C42 982B      LDR      r0,[sp,#0xAC]
0x00011C44 F007F870  BL.W     SetAo1 (0x00018D28)
  1367:       break;
That would suggest R4 is getting corrupted by Ux_putline(). You could step over that call in the debugger and see whether R4 changes. Then you'd drill into the code for Ux_putline() to understand why it violates the ABI.
R4 holds AoVal in the optimized version; in the non-optimized version it's kept on the stack, which is why only the optimized build shows the corruption.
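For illustration: under the AAPCS (the ARM procedure call standard), r4-r11 are callee-saved, so any function that uses them must restore the caller's values before returning. A hand-written or non-conformant routine that uses r4 as scratch without saving it would destroy the compiler's cached copy of AoVal. A minimal sketch (the names and bodies are hypothetical, not taken from the actual Ux_putline):

    ; Hypothetical AAPCS-violating routine: clobbers the caller's r4
    broken_putline:
        MOV   r4, r0        ; uses r4 as scratch without saving it
        ; ... loop over the string using r4 ...
        BX    lr            ; returns with the caller's r4 destroyed

    ; AAPCS-conformant version: save and restore callee-saved registers
    fixed_putline:
        PUSH  {r4, lr}      ; preserve r4 (and lr for the return)
        MOV   r4, r0        ; r4 is now safe to use as scratch
        ; ... loop over the string using r4 ...
        POP   {r4, pc}      ; restore r4 and return

Stepping over the Ux_putline() call with r4 in the register window, as suggested above, should make this kind of violation immediately visible.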