
35% smaller, and 14% faster code!

There is a new 8051 C compiler in beta test that beats Keil's compiler by 35% in code size and 14% in speed on the Dhrystone benchmark. And there is no need to select a memory model or use special keywords to control data placement.

More details here: www.htsoft.com/.../silabs8051beta

Parents
  • I do not, in general, see the requirement to use ICE or even consider it when initially developing code. If there is an awkward problem, then I get it out
    and what are you going to do then if the optimizer has 'threaded' your code and you cannot set a 'working' breakpoint?

    1) there is no issue "when initially developing code"; my sole point is optimizer 'threading' vs breakpoints
    2) debugging what is not "an awkward problem" needs no special measures
    3) if your method does not allow "an awkward problem" to be debugged in a reasonable time, you will miss some deadlines.

    Erik

Reply

Children
  • I've seen too many well qualified graduates who insist on going straight for an ICE when they see a problem, rather than looking for logic errors in their code.

    There's no real substitute for thinking and no excuse for laziness.

  • I've seen too many "well qualified" graduates who insist on going straight for an ICE when they see a problem, rather than looking for logic errors in their code.
    you forgot the quotes. I have seen many who were "well qualified" as far as academia goes, but not in the real world.

    There's no real substitute for thinking and no excuse for laziness.
    I wholeheartedly agree.

    as to your first point, this is not about the ICE, but the scope. Many, many moons ago, when scopes were slower, our best hardware guy and the undersigned 'universalist' started hunting a "once a day" hit. Although software was blamed (I had recoded the process to run 4 times faster), I had no doubt it was hardware. Finally we decided to use the "DNA scope" and, looking through the schematic of a peripheral with, as far as I remember, ~400 TTL gates, found a possible pulse that the scope could not see. We removed the possibility of that pulse and the unit worked. When we approached the original designer, he stated "that can not happen" and pointed to typical values in the datasheet. AAARGH!

    now, as far as "going straight for an ICE": the ICE is never going to tell you what is wrong, but it can be very helpful in finding where to look. E.g. when 2 processors communicate (I currently have 8 running in tandem), the ICE can tell you whether the fault is in transmission or reception. My main 'beef' with those "going straight for an ICE" is not their debugging method, but that those types often "insert a fix" instead of actually finding the root problem by "looking for logic errors in their code". I have always stated that bugs should NEVER be fixed; they should be removed.

    Erik