
Using the C51 v3.4 compiler with Windows XP

Hello.
I'm trying to obtain a *.hex file using WXP and C51 Compiler V7.0 or later that is identical to the one obtained using W98 and C51 Compiler V3.4. I have the following *.bat file:
… SET c51bin=c:\c51_v340\bin
SET C51LIB=C:\C51_v340\LIB
SET C51INC=C:\C51_v340\INC

c:\c51_v340\bin\fmake

But when I execute this *.bat file from DOS, a "stack overflow" error occurs: the command "c:\c51_v340\bin\fmake" is not accepted (a sketch that bypasses fmake follows after this post).
I also tried to compile with uVision, changing the tool folders, but the program doesn't respond.
I can do it using VMware to run a W98 virtual machine, but I want to know if I can do it without that application.
Has anybody tried something like this? Can anybody help me?
Thanks!
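
For reference, a minimal sketch of a batch file that calls the tools directly instead of going through fmake; the tool names C51, BL51 and OH51 are those of later C51 releases (the v3.40 DOS package may name them differently, e.g. L51), and MAIN.C is only a placeholder source file:

SET C51BIN=C:\C51_V340\BIN
SET C51LIB=C:\C51_V340\LIB
SET C51INC=C:\C51_V340\INC
REM Compile, link/locate, then convert the absolute object file to Intel HEX
%C51BIN%\C51 MAIN.C
%C51BIN%\BL51 MAIN.OBJ TO MAIN
%C51BIN%\OH51 MAIN HEXFILE(MAIN.HEX)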

  • In general we have found that the batch process works much better than the IDE

    I was referring to the really old stuff.

  • I do not understand the meaning of 'fixed up', but based on the full sentence I have to GUESS that it means 'finding a bug'.

    Uhm, sorry about that :-) What I meant was code that is "kept alive" in some kind of life-support system or other rather critical medical device, compiled/linked using a whole new toolset, simply because the libraries should be "identical" but improved.

    As said above, using a newer compiler has almost always, when I have been in such a situation, revealed errors in the code.

    For that reason some people/companies decide that it's a lower risk to modify existing (old) code using the -original- tools, so that at least the largest part of the code is binary identical and all test results are still valid. (A hex-file comparison, sketched at the end of this reply, is enough to verify that.) Introducing a new compiler/linker could, and often will, introduce new problems that would require all test cases to be repeated, or that might not even show up with the existing test cases.

    "identical hex files - IMHO totally irrelevant, answered above

    Not if you want to be able to make a statement about software quality. If your argument is going to be "just retest everything, no matter what the cost/duration"... well... the pointy-haired boss doesn't agree.

    I can not accept the philosophy "if an error does not hurt, it is not an error"; experience shows that it WILL, eventually, 'hurt'.

    I agree. Some project leaders, marketers or budget-determining people don't :-)

    I also agree that keeping up to date with the latest toolsets is a good thing, but it has to be done at the right time and in the right way (test cases, etc.). Sometimes that's not an option (timing, cost, knowledge, technical restrictions, risks), and that's why people have to maintain old code -with its original toolset-.
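
    A minimal sketch of that hex-file check, assuming OLD.HEX is the released build and NEW.HEX is the rebuild (placeholder names):

    REM Verify that the rebuilt hex file is byte-identical to the released one
    FC /B OLD.HEX NEW.HEX
    IF ERRORLEVEL 1 ECHO Hex files differ - full retest required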


  • >> In general we have found that the batch process
    >>works much better than the IDE

    > AMEN and amen
    I was referring to the really old stuff.

    In my opinion, make+shell+options gives me a much better feeling of, and control over, what's being done than an IDE. But I guess that's a personal preference. The command line on Windows does have its challenges, though :-) (see the small sketch below).
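
    As a small illustration of that control, a batch file can stop at the first failing step; a sketch, reusing the hypothetical tool names from the earlier sketch and assuming the tools report failures through their exit codes:

    REM Stop as soon as a step reports a non-zero exit code
    %C51BIN%\C51 MAIN.C
    IF ERRORLEVEL 1 GOTO FAIL
    %C51BIN%\BL51 MAIN.OBJ TO MAIN
    IF ERRORLEVEL 1 GOTO FAIL
    %C51BIN%\OH51 MAIN HEXFILE(MAIN.HEX)
    ECHO Build OK
    GOTO END
    :FAIL
    ECHO Build step failed - see the messages above
    :END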

  • For that reason some people/companies decide that it's a lower risk to modify existing (old) code using the -original- tools, so that at least the largest part of the code is binary identical and all test results are still valid.

    Actually, in the distant past I did just that. What 'converted' me was a case where (using old tools with old code), after adding a minuscule additional feature, an intermittent problem that could in no way be related to the added feature started popping up. It turned out that a coding error in the original code (revealed by using the new tools) was the cause. Having learned that existing coding errors (which have a tendency to show their ugly face when an 'unrelated' addition is made) will often be revealed by new tools, I 'converted': whatever the reason, whatever the original toolset, always use the current tools.

    The fact is that if the code can not 'handle' being assembled/compiled/linked with current tools, it has a bug.

    The argument "if we keep the original code the same, we do not need to test everything" is TOTAL BALONEY, just see the example above where the problem had no relation to the added code, just to adding code.

    Erik

  • Actually, in the distant past I did just that. [...] It turned out that a coding error in the original code (revealed by using the new tools) was the cause.
    The fact is that if the code can not 'handle' being assembled/compiled/linked with current tools, it has a bug.
    The argument "if we keep the original code the same, we do not need to test everything" is TOTAL BALONEY

    I'm not sure if it's TOTAL baloney. Personally I think keeping as much original, tested code the same as possible is -in general- a lower risk. But I agree that running everything through the latest toolset never hurts. Both a change made with the old toolset -or- the introduction of a new toolset can introduce bugs...

    It probably also depends on the amount of change you're going to make. If most of the app changes anyway, it might be a good moment to switch toolsets. If it's "just" a very minor change (if there is such a thing as a minor change in a very complex (legacy) system), keeping the old toolset might be a wise decision, with an additional check using the latest tools.

    I just added a discussion item for our team meeting :-)

    Regards,
    Joost Leeuwesteijn

  • Hello again.
    Definitively, WXP doesn't work with v3.40, so the only way is to work with virtual machines. Other old compilers, like the one for the C16x, can do it, but this is the information that Keil's technical services gave me.
    Erik, thanks for your time.
    Thanks, Joost Leeuwesteijn, for your help; I think you understood perfectly what I needed and what the real problem is. Perhaps a forum is not the best way to find solutions. The only thing left to say is that Bea is a Spanish woman's name :-)

    So, the problem can only be solved by using new compilers (and obtaining different *.hex files) or by using virtual machines to run old operating systems on top of newer ones.

  • I'm not sure if it's TOTAL baloney. Personally I think keeping as much original, tested code the same as possible is -in general- a lower risk

    "lower risk", that may be, but 'risk' is not the issue.
    Reliance on "tested code still being "tested" without retesting after ANY addition IS TOTAL BALONEY.

    I can give a very simple example (I could, probably, come up with hundreds). In a system with internal and external code memory, with the two running at somewhat different speeds, a small addition to some routine pushes a timing loop in an unrelated routine - in the code that does not "need to be tested" - into the other memory area, and its timing changes.

    Erik

  • I don't think people are suggesting that you can make a change with no testing at all. What they are suggesting is that if most of the object code doesn't change (in a binary sense), then you can be very confident you've introduced no new bugs there, regardless of the test results.

    You're assuming that just because a product passes the test suite, there are no bugs. In reality, test suites are often highly fallible and fail to detect many bugs.

    So, new code that is only slightly different from the old and passes regression 100% is still less likely to have new bugs than new code that's completely different from the old and also passes the same regression suite 100%. The old code has been passing the real-world test suite of actual operation, while the new code has not.

    Testing is necessary but not sufficient to ensure quality.

    The second step after this one is to realize that if the old code doesn't change, then for a given amount of finite testing effort, you can concentrate that effort -- not exclusively, but preferentially -- on the part of the code that did change, thus making it more likely that you find the new bugs. You may not want to do so -- maybe you'd rather look for old bugs too -- but at least you have a choice. If every byte in the hex file changes, then you have to retest everything evenly.

  • So, just when you thought you'd got your Jurassic Keil happily running on the latest PC - Microsoft go and move the goalposts with another new OS:

    Yes, XP is now old-hat; Vista is here!

    www.microsoft.com/.../

    So now we can have another few years of exciting "How can I make C51 v0.000001 work with Vista" threads...

    "Oh it all makes work for the working men to do..."