I do not know how familiar you are with RTCA/DO-178B, but it is a traceable, documented system for software development that assures reliability and instills trust in software developed to perform tasks for commercial airlines. For 'Level A' software you must not only test every case of every decision point in your software; you also have to break down the compiled code, compare it to the machine code, and ensure that it is traceable and that there is no 'dead' code or anything that might lead to an undesired path, etc... Also, the tools used to create and test the software must be certified. This brings me to my conundrum. A lot of embedded software was written for a certain processor, the 8051, and as a result a lot of off-the-shelf software exists for that chip. Rockwell had chosen the Archimedes C-51 Development Kit V1.12B to compile their code. The specifics of this kit are:
Archimedes Compiler V5.1
Archimedes Linker/Locater V3.52
Archimedes Assembler V5.02
Some time ago Rockwell created the Software Control Library and archived the toolkit into that library for future use. Then a system that had been compiled with that hardware/software combination came back needing an update. When they pulled the software out of the archive they found that a single file had been corrupted, making the software useless; the corrupted file is part of the Compiler V5.1. The company Keil bought out Archimedes and upgraded the software a while back, but no longer has ANY copies of the older versions. If we cannot find a replacement for that file, then we have to purchase a newer version of the compiler and go through a large amount of testing to validate it, then write the update, probably less than 10 lines of code changes, recompile it, and go through a large amount of testing all over again before the update can be certified to go back on the plane. A great deal of work could be avoided if we could get a copy of the FILES.1 for the Compiler V5.1. Does anyone have an old copy floating around, or know of someone who might have access to that file?
What we would actually prefer is a copy of the original software. This would not be expected to be free; they are willing to compensate. We have the actual file with its corruption, and we could compare the corrupted copy to a replacement and see how different they are. I believe the original is truncated. The FILES.1 is actually the Archimedes form of a zip file with a CRC number, so if the file provided is not the proper file, the installer will refuse to parse it and halt installation. IF the compiler installs fully without error, THEN we can be pretty confident the FILES.1 is proper.

After the compiler was installed, the original code would be compiled, and if everything was kosher the compiled executable would be identical. Again, if they were not identical then we would not be able to use this compiler without further testing, but if they were identical then changes could be made for the software update with confidence.

The RTCA/DO-178B is a very complete methodology that has worked very well for tracing changes and the software development process. The idea is that software compiled with the Archimedes V5.1 compiler has been tested and shown to behave as expected. IF we need to, we can use a new compiler, and it would probably generate comparable code, but that is not enough by itself: the compiler would need to be tested fully, and the entire compiled package would have to be tested to show that the new, possibly slightly different machine code behaves as expected. Whereas if we have the original compiler, we can show by simple file comparison that the other parts of the software, previously tested, have not changed, and we only need to test what has changed.
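To illustrate the comparison step described above (this is only an illustration, not part of any certified toolchain; the file names are assumptions), a byte-level diff like the following sketch would show whether the archived FILES.1 is a truncated prefix of a good copy, and whether two compiled executables are bit-identical:

```python
import hashlib
from pathlib import Path

def sha256(path):
    """Digest of a whole file, handy for a quick identical/not-identical check."""
    return hashlib.sha256(Path(path).read_bytes()).hexdigest()

def compare_files(a, b):
    """Return (identical, offset_of_first_difference_or_None).

    If one file is a truncated prefix of the other, the reported offset
    equals the shorter file's length.
    """
    da, db = Path(a).read_bytes(), Path(b).read_bytes()
    if da == db:
        return True, None
    for i in range(min(len(da), len(db))):
        if da[i] != db[i]:
            return False, i
    # No differing byte found: the shorter file is a prefix of the longer one.
    return False, min(len(da), len(db))
```

If the corrupted FILES.1 turned out to be a clean prefix of a candidate replacement, that would support the truncation theory; the same byte comparison applies to the recompiled executable versus the archived one.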
I feel your pain - having once been on the receiving end of a failed backup system myself! :-(
"The RTCA/DO 178b is a very complete methodology..."
Looks like there is a hole in the system in confirming that archived data can be restored intact to a "virgin" system.
Don't forget to archive multiple copies of the development hardware too. Nothing like trying to load a program onto a current PC that originally ran on an old PC, only to find out that the new PC is not capable of running the old program, or the old PC won't run anymore. I've been bit by that one before.
Of course, no matter how well you plan and prepare, Mr. Murphy still reigns supreme.
Sigh...
1) Murphy was an optimist.

2) I have found it futile to "keep the tools used when making". So often, you have to find a Win95 box or some such just to TRY to run them. Then you have to 'relearn' how those tools worked. Then you have to use sneakernet to get it to the programmer, then the format does not fit the ICE, then you have to reconfigure something else, then .... True, it is a pain to have to 'adapt' existing stuff to current tools, but I have had cases of "updating the updated" where I used expressions not fit for print about me not 'adapting' at the first update.
Erik
If we cannot find a replacement for that then we have to purchase a newer version of the compiler and go through a large amount of testing to validate it and then write the update, probably less than 10 lines of code changes, recompile it, and go through a large amount of testing all over again before the update can be certified to be put back on the plane
If you are in an environment where 'validation' is a must, then you have to go through the whole, total, and complete process if even one comma is changed. Thus, what is the difference between upgrading the tools or not?
If upgrading the tools 'creates a bug', that bug was there in the first place, however much you 'validated'.
If the compiler is validated, then we only have to re-test the objects that have changed. And we would not need to test the compiler; it has already been tested. If we go with a different compiler, even if we made no changes to the source, what are the chances that even one of the object files would be the same? They may be soooo close, but using a different register, and we would have to test them all over again. By reusing the original compiler we only need to test the parts that have changed; by showing that the other object files are unchanged we can show that they have already been tested. Hopefully someone out here has a copy of, or knows where to find a copy of, the Archimedes C-51 Development Kit V1.12B.
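The "only re-test what changed" argument rests on showing which object files differ between the archived build and the new one. As a rough sketch of that bookkeeping (the directory layout and the `*.obj` extension are assumptions, and this is of course no substitute for the formal process):

```python
import hashlib
from pathlib import Path

def hash_objects(build_dir, pattern="*.obj"):
    """Map each object file's name to a SHA-256 digest of its contents."""
    return {p.name: hashlib.sha256(p.read_bytes()).hexdigest()
            for p in sorted(Path(build_dir).glob(pattern))}

def changed_objects(old_build, new_build):
    """Names of object files in new_build that are missing from, or differ
    from, the corresponding file in old_build."""
    old, new = hash_objects(old_build), hash_objects(new_build)
    return sorted(name for name in new if old.get(name) != new[name])
```

Every name NOT in the returned list is bit-identical to the previously tested object, which is exactly the evidence needed to limit re-testing to the changed parts.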
"we only have to re-test the objects that have changed."
The method might say that, but it is a flawed assumption.
If one object changes, it could affect the location of all the other objects after the final link, and that could bring to light some latent bug (e.g., something position-dependent that shouldn't be) that has just never happened to manifest before.
How likely is that? I don't know - but how certain do you need to be at 50,000 feet...
The Archimedes Compiler V5.10 is identical to the Keil C51 V5.10. We can still supply this version (but you should contact our distributors for it).
Any similarities or equivalents to the linker/locator or the assembler?