I was sitting here discussing PWB artwork with my coworkers. One of them is a rookie software engineer, and the topic was the risks associated with fabricating a PWB. The first spin has the highest probability of containing an error (duh), whether a mistake in the electrical engineering design, or an 'oops' such as the wrong transistor pattern, or an unconnected net caused by a misspelling.
The cost of those mistakes is not only the PWB fabrication price but also the delay to the schedule.
Just as with software, there are many, many opportunities to make a mistake when designing the electronics and sending the design to a Board House for fabrication.
To get the rookie to understand the 'risks' involved in having the board fabricated, I made the analogy that it is like writing your code, double-checking it, triple-checking it, and then sending it out to a "Compiler House" for them to compile, link, and send back the finished .HEX file... vacuum-sealed in packs of ten.
The "Compiler House" won't inform you of the errors and warnings, nor will it provide those helpful hints about what kind they are. To determine whether the build is correct, you effectively have to buzz the board, populate it, power it up, stimulate it, and then detect any errors. The warnings are harder to detect, though that depends on the thoroughness of your test suite.
So, when you hit "compile" or "build-all" and begin that iterative process of resolving the errors and warnings, it is like spinning a board each time. The new spin doesn't cost much, just a few minutes of time... and now it's down by half! ...only 17 errors and 12 warnings to go!
I think that many people rely too heavily on this process of having the tools spew out a list of where the code-monkey's brain got lazy.
But, at the same time, we code-monkeys shouldn't spend as much time on error-free code as we would for a [costly] board spin. Letting the tools identify these simple errors is just too cheap to pass up.
I think this over-reliance on the tools to tell you everything you did wrong (well, not everything) leads to sloppy coding behaviors and practices. We should spend a bit more time making sure the code compiles "more cleanly" the first time around, to develop our coding skills instead of letting them atrophy.
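For illustration (a made-up fragment, not from any real project), here is the kind of lazy slip a single compile pass flags for free:

    #include <stdio.h>

    int main(void)
    {
        int total = 0;    /* the first draft forgot this initializer;
                             the compiler warned about it */
        int i;

        for (i = 0; i < 10; i++)    /* the first draft typed 'totl' in the
                                       line below; flagged as undeclared */
            total += i;

        printf("total = %d\n", total);
        return 0;
    }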
--Cpt. Vince Foster
2nd Cannon Place
Fort Marcy Park, VA
We should spend a bit more time making sure the code compiles "more cleanly" the first time around, to develop our coding skills
Excellent idea. Having eradicated the 'iterative compile' programming technique, we could develop our language skills and move away from the 'iterative debug' technique as well.
The new project starts tomorrow ... fire up the ICE!
Why use an ICE? If the downloaded application doesn't work, just think for a while. Then apply the needed changes and download again.
If the downloaded application doesn't work, just think for a while. Then apply the needed changes and download again.
In many cases that is the right way.
Why use an ICE?
When the above fails.
The issue in debugging is to use whatever method gets you there the fastest, not to mention the cases where only one method exists. One absolute case for the ICE is when an external source behaves like a teenager and "does something it is not supposed to do". When working with external events, the printf method (which I abhor) will often upset the applecart by wreaking havoc with the timing.
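A sketch of the alternative (every name here is invented, not from any particular library): log one byte per event into a ring buffer in the hot path, and printf it later when the timing no longer matters.

    #include <stdint.h>
    #include <stdio.h>

    #define TRACE_SIZE 64   /* power of two, so the wrap is a cheap mask */

    static volatile uint8_t trace_buf[TRACE_SIZE];
    static volatile uint8_t trace_head;

    /* Storing one byte costs a few instructions; printf costs thousands. */
    #define TRACE(ev) (trace_buf[trace_head++ & (TRACE_SIZE - 1u)] = (uint8_t)(ev))

    /* Stand-in for the time-critical handler of the external event. */
    static void on_external_event(uint8_t code)
    {
        TRACE(code);
        /* ... the real, timing-sensitive work ... */
    }

    int main(void)
    {
        uint8_t i;

        for (i = 1; i <= 5; i++)    /* simulate a burst of events */
            on_external_event(i);

        /* Dump at leisure, when upsetting the timing no longer matters. */
        for (i = 0; i < trace_head; i++)
            printf("event %d\n", trace_buf[i]);
        return 0;
    }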
Erik
Yes, an ICE can be good to have.
My comment was a response to the ironic "The new project starts tomorrow ... fire up the ICE!"
I don't think the ICE should be the first tool to grab for whenever something doesn't work, so there is no need to fire it up until it really is needed.
The day we leave our brains at home, our magic tools will not help us save the day.
The good guys managed to write really advanced programs and then key them in as binary, octal or hexadecimal numbers on a control panel. A fool may have any number of tools and still not manage to produce an advanced program.
Nothing wrong with tools to help avoid monotonous work, but we must not assume that it is the tools that are the final answer to a problem.
but we must not assume that it is the tools that are the final answer to a problem.
but we must not assume that it is the tools that are the final answer to all problems.
The issue in debugging is to use whatever method gets you there the fastest, not to mention the cases where only one method exists.
I agree. However, taking one step back, it is best to minimise the number of occasions where debugging is necessary in the first place. Many (most?) of the bugs solved by many (most?) programmers, using whatever debugging technique, are the result of the programmer having made a coding mistake rather than of a bug in either the hardware or the compiler.
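One cheap habit in that direction (a sketch; the buffer and names are invented): state your assumptions as asserts, so a coding mistake announces itself on the first run instead of costing a debugging session.

    #include <assert.h>
    #include <string.h>

    static char buf[32];

    /* Copy 'len' bytes into the fixed buffer. The asserts document the
       assumptions; a caller who breaks them is caught at once, with a
       file and line number, before any debugger enters the picture. */
    static void store(const char *src, size_t len)
    {
        assert(src != NULL);           /* a real pointer was passed     */
        assert(len <= sizeof(buf));    /* the caller honoured the limit */
        memcpy(buf, src, len);
    }

    int main(void)
    {
        store("hello", 6);    /* fine: 6 bytes, including the NUL */
        /* store("hello", 100); would abort with file and line,
           instead of silently corrupting memory */
        return 0;
    }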
I won't bother to repeat for the umpteenth time how I think this situation can be fixed, or at least greatly improved.
My first mentor, Howard Taylor, had an expression that I still use: "typographical errors", describing things like using r1 when it should be r2 (yes, that was in the assembler-only days).
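The C-era version of the same slip (an invented fragment) compiles without a murmur, which is exactly why no tool hands you a list of these:

    struct point { int x, y; };

    /* Scale a 2-D point: a classic 'typographical error'. */
    void scale(struct point *p, int k)
    {
        p->x = p->x * k;
        p->y = p->x * k;    /* meant p->y * k: r1 for r2, in C clothing */
    }

Only a test, or a board that misbehaves, reveals it.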