
error messages versus opt level

I have noticed that whether I get an error message sometimes depends on the optimization level. For example, I had a statement that tried to store a value into code space. At opt level 0, I got an error saying it couldn't convert the lvalue, which I understood. However, at opt level 8, I got no error, and in fact the 'optimizer' simply left out the offending line. Why should it do this? If it is an error, it should be reported, not silently thrown away.
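
A minimal sketch of the kind of statement I mean (the names here are made up, not from our actual code; it assumes the Keil C51 'code' memory qualifier):

    /* cp points into code (ROM) space, so the pointee cannot be
       written at run time; at opt level 0 the compiler flags the
       assignment as an unmodifiable lvalue. */
    unsigned char code *cp;

    void store (unsigned char t)
    {
        *cp = t;    /* expected: unmodifiable lvalue error */
    }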

Parents
  • We tried to duplicate your problem, but no matter which optimization level or compiler version we use, we get the correct error message. Your case must be different from ours. How can we duplicate it?

    Our test file is as follows:

    stmt level    source
    
       1          struct  {
       2            unsigned char      v;
       3            unsigned char code *ptr;
       4          } s;
       5          
       6          void main (unsigned char t)  {
       7   1        *((unsigned char * )s.ptr) = t;
       8   1      }
    *** ERROR C183 IN LINE 7 OF Y.C: unmodifiable lvalue
    


Children
  • Well, I am puzzled. I tried your simple program and got the error at both opt level 0 and opt level 8. I am running v2.20a, by the way. So I tried making it a little more like our real program. This is what I did:
    #define BYTE int

    struct {
        unsigned char v;
        void code *ptr;
    } s;

    void chooseOne(BYTE a)
    {
        int i;

        switch (a)
        {
        case 1:
            i = 1;
            break;
        case 2:
            i = 2;
            break;
        case 3:
            *((BYTE *)s.ptr) = a;
            break;
        case 4:
            i = 4;
            break;
        default:
            break;
        }
    }

    void main (void)
    {
        int a = 3;
        chooseOne(a);
    }
    I set up the target options the same as in our real program and got the same error at both opt levels. I don't know why I don't get the error at opt level 8 with our actual code. I have asked the colleague who originally wrote it to look into it. At least he was able to reproduce my results on his computer, so it is not just my machine.

  • #define BYTE int

    Just a note: the above is quite disgusting and misleading. This is exactly what typedefs are for, and ints on many platforms are more than a byte (PIC excepted). What is wrong with
    typedef int TwoBytes;
    If you need to see whether the typedef is "defined", you can still simply define something like:

    typedef int TwoBytes;
    #define TWO_BYTE_TYPE
    I know I should just keep my mouth shut, and I'm sorry for butting in, but this misuse of a #define made me spill my beer - another crime.
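
    To make the point concrete, here is a quick sketch (the sizes assume an 8051 target under Keil C51, where an int is 2 bytes):

    #define BYTE int        /* misleading: "BYTE" really names a 2-byte type */
    typedef int TwoBytes;   /* honest: the name matches the size */

    void size_check (void)
    {
        /* both are 2 on the 8051, despite what "BYTE" suggests */
        unsigned char n1 = sizeof (BYTE);
        unsigned char n2 = sizeof (TwoBytes);
    }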

    - Mark

  • Sorry, but we need an example that allows us to duplicate your initial problem. If you have a more complex example, you may want to send it to:

    support.intl@keil.com