Use C51 v3.4 compiler with Windows XP.

Hello.
I'm trying to obtain, using Windows XP and C51 Compiler V7.0 or higher, a *.hex file like the one that was obtained using Windows 98 and C51 Compiler V3.4. I have the following *.bat file:
… SET c51bin=c:\c51_v340\bin
SET C51LIB=C:\C51_v340\LIB
SET C51INC=C:\C51_v340\INC

c:\c51_v340\bin\fmake

But I get a "stack overflow" error when I execute this *.bat from a DOS prompt; the instruction "c:\c51_v340\bin\fmake" is not accepted.
I tried to compile using uVision, changing the tool folders, but the program doesn't respond.
I can do it using VMware running Windows 98, but I want to know whether I can do it without that application.
Did anybody try something like this? Can anybody help me?
Thanks!

  • Thank you all very much.
    I know that these tools are very old, but new compilers haven't got the same libraries, so the *.hex files are different.

    What is the problem? The libraries supplied with new compilers match those.

    I need to use older compilers.
    the above does not explain why

    If I can do it through uVision, it's perfect, but I'm afraid that uVision can't work with old compilers.
    uVision with an old compiler would give you no advantages, unless you absolutely insist (again, WHY) on an IDE.

    My e-mail address is bea.ruiz at yahoo.es.
    I'll gladly send you what I suggested, but until you show me that the above "need to" is 'true' (not as opposed to 'lying', but as opposed to 'absolutely needed') I will not know what you really need.

    Erik

  • Thanks Erik!
    We are selling a product with a program developed in old-style C, and it is not very good code. It has delays made from simple loops, without using timers (see the sketch below). The company does not want to develop new code for this product, but it's possible that we will have to make a small revision, so we want to be sure that the assembly produced by new versions of the compiler is the same as the old one.
    I am testing a new version of the program made with the new compiler, but can anybody assure me that the generated assembly is "exactly" the same?
    I know that it's a bit strange, but it's not my decision….
    Bye!
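    A minimal sketch, in plain C, of the kind of "delay made from a simple loop" described above (hypothetical code, not taken from the actual project). Its duration depends entirely on the instruction sequence the compiler happens to emit for the loop body, which is why a different compiler version changes both the timing and the generated *.hex:

        /* Software delay built from nested counting loops; no timer is used.  */
        /* The delay is simply "however long the generated code takes", so it  */
        /* changes whenever the code generator or optimiser changes.           */
        void delay_by_loop(unsigned int outer)
        {
            unsigned int i, j;

            for (i = 0; i < outer; i++)
            {
                for (j = 0; j < 1000u; j++)
                {
                    /* empty body: no observable side effect to preserve */
                }
            }
        }

    Tuning the loop counts by trial and error against one particular compiler build is exactly what ties the resulting *.hex (and the product's timing) to that build.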

  • I am testing a new version of the program made with the new compiler, but can anybody assure me that the generated assembly is "exactly" the same?

    Let me go out on a limb: if it is C, the .hex will most definitely NOT be the same.

    Erik

  • You can either:
    1. struggle to make the old tools work in a modern environment;

    2. try to maintain an environment contemporary with the old tools on which to run them.

    It should still be possible to boot any PC into true MS-DOS mode - so the 2nd option might be the simpler.

    I trust you are making clear to the company that neither of these is a no-cost option!

  • Thanks everybody for your help and your time.
    I'm sorry, Erik. I was asking for your help; I already know most of the things that you are saying. If you remember, my first message asked "Did anybody try something like this? Can anybody help me?" If you can't do it, it doesn't matter; that's enough, thanks.
    And thanks, Andy: you are suggesting two things that I'm already doing, and I'm asking whether someone has done this before me.

  • Reinhard>"the solution is to understand what 'fmake' does and setup the project in uVision."

    Andy>Or, understand what 'fmake' does, and set up your XP Command Shell appropriately.

    We're using an older (3.12) C16X compiler on XP. The compiler and assembler work OK (fortunately!), with a few quirks (resizing command-prompt windows, spontaneous aborts with unspecified exit codes, no long filenames, etc.).
    If your compiler etc. are working OK, isn't that enough? fmake sounds like a make tool; we're using the latest GNU make (compiled for Windows) and it works fine. Just a thought: that might save you the VMware or convert-to-IDE trouble.

    Another thought: could it be that it crashes on the long paths/filenames with spaces, etc., used in XP? We had to take that into account in our versioning system.

    Regards,
    Joost Leeuwesteijn

  • The compilers will almost certainly have new (improved?) code optimisation, which will potentially affect any software timing loops in embedded code (see the sketch below).

    The improvements in compiler performance may also permit programming errors in the original stable source code to surface, requiring additional development time to fix those previously undiscovered bugs (features?).
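    As a hedged illustration of that point (hypothetical code, plain C): an empty busy-wait loop has no observable side effect, so an improved optimiser is free to shorten it or remove it entirely, whereas declaring the counter volatile forces every iteration to be executed. Two functions that look almost identical in the source can therefore behave very differently after a tool upgrade:

        /* May legally be optimised down to nothing: no side effects to keep.  */
        void delay_fragile(void)
        {
            unsigned int i;

            for (i = 0; i < 50000u; i++)
            {
            }
        }

        /* volatile forces the compiler to perform every access to i, so the   */
        /* loop survives optimisation (its exact duration still varies with    */
        /* the generated code).                                                */
        void delay_kept(void)
        {
            volatile unsigned int i;

            for (i = 0; i < 50000u; i++)
            {
            }
        }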

  • OP> Thank you all very much. I know that these tools are very old, but new compilers haven't got the same libraries, so the *.hex files are different.

    Erik> What is the problem? the libraries supplied with new compilers match those

    OK... So Erik, you don't mind being fixed up by some back-in-production-after-several-years medical device which contains software that has just been recompiled/linked using new libraries supplied with new compilers simply because they "match" the old ones?

    OP> I need to use older compilers.

    Erik> the above does not explain why

    Sure it does. He needs identical .hex files for the original code to make sure no new side-effects are added by "new and improved"(tm) tools and libraries.

    He just wants to reproduce identical binaries from old code, so his company can give at least some statement about the software quality (or large parts of it if they do have to make some revisions), and is asking for some help...

  • OK... So Erik, you don't mind being fixed up by some back-in-production-after-several-years medical device which contains software that has just been recompiled/linked using new libraries supplied with new compilers simply because they "match" the old ones?
    I do not understand the meaning of 'fixed up', but based on the full sentence I have to GUESS that it means 'finding a bug'.
    No, I do not mind. If I am to 'enhance', 'modify' or whatever some code, my first action is to verify it with a recent toolset. If there are stupidities like timing loops in C, I immediately fix those (see the sketch below). Then I verify that everything works, and, actually, most bugs I find in that process DO exist in the 'original previously compiled' code. Then, and only then, do I do the requested modifications/enhancements. It is amazing to see the progress in the compiler when you see the errors the new version catches that (probably; who knows what the original developer did) snuck past the older compiler.

    OP> I need to use older compilers.

    Erik> the above does not explain why

    Sure it does. He needs identical .hex files for the original code to make sure no new side-effects are added by "new and improved"(tm) tools and libraries.
    "identical hex files - IMHO totally irrelevant, answered above

    He just wants to reproduce identical binaries from old code, so his company can give at least some statement about the software quality (or large parts of it if they do have to make some revisions), and is asking for some help...
    As said above, using a newer compiler has almost always, when I have been in such a situation, revealed errors in the code. I cannot accept the philosophy "if an error does not hurt, it is not an error"; experience shows that it WILL, eventually, 'hurt'.

    Erik
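    A minimal sketch of the kind of timer-based fix mentioned above ("If there are stupidities like timing loops in C, I immediately fix those"), assuming a classic 8051 target built with Keil C51, a 12 MHz crystal (1 µs machine cycle) and timer 0 free for this purpose; the header name and the reload value are assumptions, not details taken from the original project:

        #include <reg51.h>                   /* standard 8051 SFR declarations */

        /* Delay of roughly 'ms' milliseconds using timer 0 in 16-bit mode.    */
        /* The duration comes from the timer reload value, not from how the    */
        /* compiler translates a counting loop, so it survives tool upgrades.  */
        void delay_ms(unsigned int ms)
        {
            while (ms--)
            {
                TMOD = (TMOD & 0xF0) | 0x01; /* timer 0, mode 1 (16-bit)       */
                TH0  = 0xFC;                 /* 65536 - 1000 = 0xFC18:         */
                TL0  = 0x18;                 /* 1000 machine cycles = 1 ms     */
                TF0  = 0;                    /* clear overflow flag            */
                TR0  = 1;                    /* start timer 0                  */
                while (!TF0)                 /* wait for overflow              */
                {
                }
                TR0  = 0;                    /* stop timer 0                   */
            }
        }

    Whether timer 0 is really free, and what the crystal frequency actually is, would of course have to be checked against the real hardware.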

  • We run into this a lot.

    For that reason we either keep an old computer running, or run the older operating system in a virtual machine.

    This method has allowed us to keep using the old Keil versions intact.

    In general we have found that the batch process works much better than the IDE when we have to delve into ancient projects.

  • In general we have found that the batch process works much better than the IDE

    AMEN and amen

    Erik

  • In general we have found that the batch process works much better than the IDE

    I was referring to the really old stuff.

  • I do not understand the meaning of 'fixed up', but based on the full sentence I have to GUESS that it means 'finding a bug'.

    Uhm, sorry about that :-) What I meant was "kept alive" by some kind of life-support system or other rather critical medical device, compiled/linked using a whole new toolset simply because the libraries should be "identical" and improved.

    As said above, using a newer compiler has almost always, when I have been in such a situation, revealed errors in the code.

    For that reason some people/companies decide that it's a lower risk to modify existing (old) code using the -original- tools, so that at least the largest part of the code is binary identical and all test results are still valid. Introducing a new compiler/linker could/will introduce new problems that would require all test cases to be repeated OR might not even show up using the existing test cases.

    "identical hex files - IMHO totally irrelevant, answered above

    Not if you want to be able to make a statement about software quality. If your argument is going to be "just retest everything, no matter what the cost/duration"...well...the pointy haired boss doesn't agree.

    I cannot accept the philosophy "if an error does not hurt, it is not an error"; experience shows that it WILL, eventually, 'hurt'.

    I agree. Some project leaders, marketers or budget-determining people don't :-)

    I also agree that keeping up to date with the latest toolsets is a good thing, but it has to be done at the right time and in the right way (test cases, etc.). But sometimes that's not an option (timing, cost, knowledge, technical restrictions, risks), and that's why people have to maintain old code -with their old toolsets-.


  • >> In general we have found that the batch process
    >> works much better than the IDE

    > AMEN and amen
    I was referring to the really old stuff.

    In my opinion, make + shell + options gives me a much better feeling of, and control over, what's being done than an IDE. But I guess that's a personal preference. The command line on Windows does have its challenges, though :-)

  • For that reason some people/companies decide that it's a lower risk to modify existing (old) code using the -original- tools, so that at least the largest part of the code is binary identical and all test results are still valid.

    Actually, in the distant past, I did just that. What 'converted' me was a case where (using old tools with old code), after adding a minuscule additional feature, an intermittent problem that could in no way be related to the added feature started popping up. It turned out that a coding error in the original code (revealed by using new tools) was the cause. Having learned that existing coding errors (which have a tendency to show their ugly face when an 'unrelated' addition is made) will often be revealed by using new tools, I 'converted': whatever the reason, whatever the original toolset, always use the current tools.

    The fact is that if the code cannot 'handle' being assembled/compiled/linked with current tools, it has a bug.

    The argument "if we keep the original code the same, we do not need to test everything" is TOTAL BALONEY; just see the example above, where the problem had no relation to the added code, just to adding code.

    Erik

  • Actually, in the distant past, I did just that. [...] It turned out that a coding error in the original code (revealed by using new tools) was the cause.
    The fact is that if the code cannot 'handle' being assembled/compiled/linked with current tools, it has a bug.
    The argument "if we keep the original code the same, we do not need to test everything" is TOTAL BALONEY

    I'm not sure it's TOTAL baloney. Personally, I think keeping as much of the original, tested code the same is -in general- a lower risk. But I agree that running everything through the latest toolset never hurts. Either a change made with the old toolset -or- the introduction of a new toolset can introduce bugs...

    It probably also depends on the amount of change you're going to make. If most of the app changes anyway, it might be a good moment to switch toolsets. If it's "just" a very minor change (if there is such a thing as a minor change in a very complex (legacy) system), keeping the old toolset might be a wise decision, with an additional check using the latest tools.

    I just added a discussion item for our team meeting :-)

    Regards,
    Joost Leeuwesteijn