In a little piece of test code, I have the following:
#include <stdbool.h>
#include "cmsis_os2.h"  /* osStatus_t, osKernelInitialize(), osOK */
/* the device-specific HAL header (e.g. stm32f4xx_hal.h) declares HAL_Init() and NVIC_SystemReset() */

int main(void)
{
    osStatus_t osStat;

    HAL_Init();

    osStat = osKernelInitialize();
    if (osStat != osOK) {
        NVIC_SystemReset();
    }

    while (true);

    return 0;
}
Very simple.
When compiling with -O0, the generated assembly for the relevant part looks like:
main:
.Lfunc_begin1:
        .loc    2 31 0                  @ ../main.c:31:0
        .fnstart
        .cfi_startproc
@ %bb.0:
        .save   {r7, lr}
        push    {r7, lr}
        .cfi_def_cfa_offset 8
        .cfi_offset lr, -4
        .cfi_offset r7, -8
        .pad    #8
        sub     sp, #8
        .cfi_def_cfa_offset 16
        movs    r0, #0
        str     r0, [sp, #4]
.Ltmp1:
        .loc    2 34 2 prologue_end     @ ../main.c:34:2
        bl      HAL_Init
        .loc    2 36 11                 @ ../main.c:36:11
        bl      osKernelInitialize
        .loc    2 36 9 is_stmt 0        @ ../main.c:36:9
        str     r0, [sp]
.Ltmp2:
        .loc    2 37 6 is_stmt 1        @ ../main.c:37:6
        ldr     r0, [sp]
.Ltmp3:
        .loc    2 37 6 is_stmt 0        @ ../main.c:37:6
        cbz     r0, .LBB1_2
        b       .LBB1_1
.LBB1_1:
.Ltmp4:
        .loc    2 39 3 is_stmt 1        @ ../main.c:39:3
        bl      _ZL18__NVIC_SystemResetv
But with -O1 or higher optimization:
main:
.Lfunc_begin1:
        .fnstart
        .cfi_startproc
@ %bb.0:
        .loc    2 34 2 prologue_end     @ ../main.c:34:2
        bl      HAL_Init
.Ltmp1:
        .loc    2 36 11                 @ ../main.c:36:11
        bl      osKernelInitialize
.Ltmp2:
        @DEBUG_VALUE: main:osStat <- undef
        .loc    2 39 3                  @ ../main.c:39:3
        bl      _ZL18__NVIC_SystemResetv
All of a sudden, the comparison after the call to osKernelInitialize is gone, and NVIC_SystemReset() is called unconditionally. The while(true) is only there temporarily, and I am aware that it can be optimized away, since an infinite loop with no side effects is undefined behavior, but I do expect the compiler to handle the preceding if-statement properly first.
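For completeness, here is a minimal sketch of the workaround I have in mind (the volatile flag and its name are my own addition, not part of the original code, and this assumes the same CMSIS/HAL environment as above): giving the loop an observable side effect should keep the optimizer from removing it, and, if my reading of the situation is right, should also keep the branch that leads into it alive.

#include <stdbool.h>
#include "cmsis_os2.h"  /* same CMSIS-RTOS2 environment as the snippet above */

int main(void)
{
    HAL_Init();

    if (osKernelInitialize() != osOK) {
        NVIC_SystemReset();
    }

    /* Reading a volatile object is an observable side effect, so this is
       no longer a side-effect-free infinite loop and cannot be deleted;
       that should also force the preceding comparison to survive. */
    volatile bool keep_spinning = true;
    while (keep_spinning) { /* spin */ }

    return 0;
}

Putting __NOP() from the CMSIS core header in the loop body should serve the same purpose in practice, since it compiles to a volatile inline-asm statement.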