
Optimization problem with Cortex-M3

Hi,

I'm starting a project with the new STM32 Cortex-M3 from ST, and I found some strange results while debugging.

The code used is this:

#define X_PORT GPIOB
#define SWITCH_X_SEL_B0 GPIO_Pin_1
#define SWITCH_X_SEL_B1 GPIO_Pin_2
#define SWITCH_X_SEL_B2 GPIO_Pin_8
unsigned char IO_read;

        /* read I current set */
        IO_read = (unsigned char)GPIO_ReadInputDataBit(X_PORT,SWITCH_X_SEL_B0);
        IO_read = IO_read + (unsigned char)(GPIO_ReadInputDataBit(X_PORT,SWITCH_X_SEL_B1) << 1);
        IO_read = IO_read + (unsigned char)(GPIO_ReadInputDataBit(X_PORT,SWITCH_X_SEL_B2) << 2);

The port B I/O configuration is OK, and the debug window shows me that all these bits are 1.

The problem occurs when I set the optimization level to 2 or 3: the last bit is always read as 0.
If I add an "if" statement, the variable IO_read shows as undefined in the debugger, and if I add a temporary variable just to shift the I/O read result before the sum, that line doesn't generate any code at all.
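
Just to illustrate, here is roughly what that temp-variable attempt looks like, with a volatile qualifier added (the volatile is only a guess on my part at what might stop the compiler from dropping the read; I haven't verified it):

        volatile unsigned char tmp;

        /* force the pin read into a volatile temporary so the
           optimizer cannot eliminate it (untested guess) */
        tmp = (unsigned char)GPIO_ReadInputDataBit(X_PORT, SWITCH_X_SEL_B1);
        IO_read = IO_read + (unsigned char)(tmp << 1);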

Setting the optimization level to 1, everything works. I think the optimizer is assuming that all three bits are contiguous (in order).
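
As a workaround I may try reading the whole port once and masking the pins myself. A sketch of what I mean, assuming the library's GPIO_ReadInputData() works as documented:

        /* read all of port B in one access, then mask the
           individual select pins (GPIO_Pin_x are bit masks) */
        unsigned short port_val = GPIO_ReadInputData(X_PORT);

        IO_read  = (port_val & SWITCH_X_SEL_B0) ? 0x01 : 0x00;
        IO_read |= (port_val & SWITCH_X_SEL_B1) ? 0x02 : 0x00;
        IO_read |= (port_val & SWITCH_X_SEL_B2) ? 0x04 : 0x00;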

Thanks.

Jose Paulo Remor