This discussion has been locked.

35% smaller, and 14% faster code!

There is a new 8051 C compiler in beta test that beats Keil's compiler by 35% in code size and 14% in speed on the Dhrystone benchmark. And there is no need to select a memory model or use special keywords to control data placement.

More details here: www.htsoft.com/.../silabs8051beta

Parents
  • bresk
    'theraded'
    nefficient

    If you write code with the same cavalier manner in which you write your responses, then I can understand your dependence on debugging tools; however, even a lowly optimizing compiler normally picks up typographical errors like these.

    Also, 'debuggability'; whose dictionary do you use?


Children
  • Also, 'debuggability'; whose dictionary do you use?
    None; I just meant to emphasize that by 'debuggability' I was by no means suggesting that anything could not be debugged, but was referring to the ability to do efficient debugging (with an ICE).

    Re cavalier manner ... typographical errors
    I am 'redecorating' and thus have the keyboard in my lap. Also, a person should be able to 'read through' a few errors; I would never expect a computer to do so.

    "To consider their code so superior that debuggability should not be a concern is a typical attitude of amateurs and hobbyists"
    Re "amateur - hobbyist": I have produced working systems (not flawless in all cases, but with very few flaws in all cases) for more years than I am willing to admit; let me just say, some were before the microcontroller even existed.

    Erik

  • "I have produced working systems (not flawless in all cases, but with very few flaws in all cases) for more years than I am willing to admit, let me just say, some were before the microcontroller even existed."

    Likewise, but I prefer to be more rounded when it comes to saying what is best: there is no perfect 'one size fits all' solution, and I would have expected someone with your claimed experience to appreciate that fact. I do not, in general, see the requirement to use an ICE, or even consider it, when initially developing code. If there is an awkward problem, then I get it out; otherwise I prefer to study the code.

    End.

  • I do not, in general, see the requirement to use ICE or even consider it when initially developing code. If there is an awkward problem, then I get it out
    And what are you going to do then if the optimizer has 'threaded' your code and you cannot set a 'working' breakpoint?

    1) there is no issue "when initially developing code"; my sole point is optimizer 'threading' vs. breakpoints.
    2) debugging what is not "an awkward problem" needs no special measures.
    3) if your method does not allow "an awkward problem" to be debugged in a reasonable time, you will miss some deadlines.

    Erik

  • I've seen too many well qualified graduates who insist on going straight for an ICE when they see a problem, rather than looking for logic errors in their code.

    There's no real substitute for thinking and no excuse for laziness.

  • I've seen too many "well qualified" graduates who insist on going straight for an ICE when they see a problem, rather than looking for logic errors in their code.
    You forgot the quotes. I have seen many who were "well qualified" as far as academia goes, but not in the real world.

    There's no real substitute for thinking and no excuse for laziness.
    I wholeheartedly agree.

    As to your first point, that was not an ICE matter but a scope matter. Many, many moons ago, when scopes were slower, our best hardware guy and the undersigned 'universalist' started hunting a "once a day" hit. Although software was blamed (I had recoded the process to run 4 times faster), I had no doubt it was hardware. Finally we decided to use the "DNA scope" and, looking through the schematic of a peripheral with, as far as I remember, ~400 TTL gates, found a possible pulse that the scope could not see. We removed the possibility of that pulse and the unit worked. When we approached the original designer, he stated "that can not happen" and pointed to typical values in the datasheet. AAARGH!

    Now, as far as "going straight for an ICE": the ICE is never going to tell you what is wrong, but it can be very helpful in finding where to look. E.g. when two processors communicate (I currently have 8 running in tandem), the ICE can tell you whether the problem is in transmission or in reception. My main 'beef' with those who go "straight for an ICE" is not their debugging method, but that those types often "insert a fix" instead of actually finding the root problem by "looking for logic errors in their code". I have always stated that bugs should NEVER be fixed; they should be removed.

    Erik

  • Mr. Black,
    Normally I don't intervene in such discussions (or should I say, exchanges), and Erik certainly does not require my help, but I find your comments to be simply childish:

    bresk
    'theraded'
    nefficient

    If you write code with the same cavalier manner in which you write your responses, then I can understand your dependence on debugging tools; however, even a lowly optimizing compiler normally picks up typographical errors like these.

    Also, 'debuggability'; whose dictionary do you use?

    Dude, if you cannot deal with Erik's arguments with effective replies, just don't bother...

  • "...if you cannot deal with Erik's arguments with effective replies, just don't bother..."

    Trouble is, I don't see any valid argument, just a fixation on one particular set of priorities with regard to optimization and debugging.

    Just because he requires a limited form of optimization does not mean it is the right one.

    It would appear that he is in a minority and that his requirement is not justifiable; otherwise compiler writers would already support it.

    Going back to dedicated TTL circuit debugging really does not help to emphasise the argument. And yes, I have done my share of TTL circuit debugging, so I feel I know enough to say: it does not have a great deal of similarity to compiler optimizations.

    Valid arguments - Phooey.

  • Going back to dedicated TTL circuit debugging really does not help to emphasise the argument. And yes, I have done my share of TTL circuit debugging, so I feel I know enought to say - It does not have a great deal of similarity to compiler optimizations.
    Well, if you reread the post you will see that the above was in support of YOUR argument that thinking is required. That thinking is required has, of course, nothing to do with optimization; what we are discussing is debugging and the hurdles that the 'threading' optimization puts in its way.

    Erik