
compiler optimizations

Hello,

1. What is this style of initialization? What is it used for?

unsigned char const event_strings[11] = {
~0x3f, ~0x06, ~0x5b, ~0x4f, ~0x66, ~0x6d, ~0x7d, ~0x07, ~0x7f, ~0x6f, ~0x00};

2. What are the drawbacks / overheads on the controller if we use the volatile qualifier? Can we apply volatile to structure members and bdata variables as shown below:

Eg 1:

typedef struct dtmf_scan_block{
  unsigned char state;
  unsigned char user;
  unsigned char s_r;
  unsigned char r_id;
}DTMF_SCAN_BLOCK;

DTMF_SCAN_BLOCK volatile xdata dtmf[NO_DTMF];

Eg2:

unsigned char volatile bdata dtvar;
sbit dt0        = dtvar^0;
sbit dt1        = dtvar^1;
sbit dt2        = dtvar^2;
sbit dt3        = dtvar^3;

Eg3:

typedef struct extention_data
{
        unsigned char volatile dgt_buf[40];
        unsigned char volatile how[16];
        unsigned char call_privacy[3];
        unsigned char id;
}EXTN_DATA;

3. I am comparing a bit-type variable with an unsigned char variable in many places in my code. Should I use a typecast?

4. Is it necessary to turn off compiler optimizations (level 9) for the hardware driver initialization code in my project? I am initializing drivers for the MT8816 switch array, the GM16C550 UART, the DS1380 RTC, the MT8888 DTMF IC, etc.

Please advise.

  • Rather than writing "if (mybit == TRUE)", which requires the compiler to promote the bit to a byte (at least)

    I suspect that as 'bit' is a non-standard type the compiler is allowed to roll its own behaviour.

    I found the following interesting for a variety of reasons (8.02, optimisation level 0):

    ----- FUNCTION main (BEGIN) -----
     FILE: 'bar.c'
       10: void main(void)
       11: {
       12:  bit a;
       13:
       14:
       15:  if(a)
    00000F 300002            JNB     a,?C0001?BAR
       16:  {
       17:          Foo();
    000012 1126              ACALL   Foo
       18:  }
    000014         ?C0001?BAR:
       19:
       20:  if(a==1)
    000014 300002            JNB     a,?C0002?BAR
       21:  {
       22:          Foo();
    000017 1126              ACALL   Foo
       23:  }
    000019         ?C0002?BAR:
       24:
       25:  if(a==2)
    000019 300002            JNB     a,?C0003?BAR
       26:  {
       27:          Foo();
    00001C 1126              ACALL   Foo
       28:  }
    00001E         ?C0003?BAR:
       29:
       30:  if(a==93845938)
    00001E 300002            JNB     a,?C0005?BAR
       31:  {
       32:          Foo();
    000021 1126              ACALL   Foo
       33:  }
    000023         ?C0004?BAR:
    000023         ?C0005?BAR:
       34:
       35:    while(1);
    000023 80FE              SJMP    ?C0005?BAR
       36: }
    000025 22                RET
    ----- FUNCTION main (END) -------
    

  • I suspect that as 'bit' is a non-standard type the compiler is allowed to roll its own behaviour.

    True enough, though a sense of consistency might compel the compiler authors to treat a bit like other integers, or perhaps a bitfield.

    But considering the instruction set of the 8051, I'm not sure how you'd compare a bit to a byte without either promoting the bit, or testing the bit and then the byte. Either sequence takes several instructions (unless there's a neat trick I'm overlooking), whereas if (bit) compiles directly to a JB/JNB.