I was sitting here discussing PWB artwork with my coworkers. One of the coworkers is a rookie software engineer, and the topic was the risks associated with fabricating a PWB. The first spin has the highest probability of an error (duh), whether from a mistake in the electrical engineering design, or an 'oops' such as the wrong transistor pattern, or a non-connected net due to a misspelling.
The cost of those mistakes is not only the PWB fabrication price, but also the delay to the schedule.
Just like software, there are many, many opportunities to make a mistake when designing the electronics and sending it to a Board House for fabrication.
To get the rookie to understand the risks involved in having the board fabricated, I made the analogy: "It is like writing your code, double-checking it, triple-checking it, and then sending it out to a 'Compiler House' for them to compile, link, and send back the finished .HEX file... vacuum-sealed in packs of ten."
The "Compiler House" won't inform you of the errors and warnings nor will they provide those helpful hints as to the types of errors and warnings. In order to determine if it was built correctly, you have to effectively buzz the board, populate the board, power-it up, stimulate it, and then detect any errors. Those warnings are harder to detect, but that depends upon the thoroughness of your test suite.
So, when you hit "compile" or "build-all" and begin that iterative process of resolving errors and warnings, it is like spinning a board each time. The new spin doesn't cost much: just a few minutes of time, ...and now it's down by half! ...only 17 errors and 12 warnings to go!
I think that many people rely too heavily on this process of having the tools spew out a list of where the code-monkey's brain got lazy.
But, at the same time, we code-monkeys shouldn't spend as much effort striving for error-free code as we would for a [costly] board spin. It is just too cheap not to let the tools identify these simple errors.
I think this over-reliance on the tools to tell you everything you did wrong (well, not everything) leads to sloppy coding behaviors and practices. We should spend a bit more time making sure our code compiles "more cleanly" the first time around in order to develop our coding skills, instead of letting them atrophy.
--Cpt. Vince Foster 2nd Cannon Place Fort Marcy Park, VA
Hope you didn't take offence.
After a small temper tantrum, and my medication... nope. None taken.
Your tantrums are fast. 1 hour 16 min for the tantrum, the medicine and the soothing calmness.
We should be thankful that a number of lint rules have been added to the compilers. Most of today's compilers can also produce multiple meaningful errors, thanks to smart recovery when the parser reaches a missing parenthesis or semicolon. The old bare-bones YACC-based compilers would always explode into a billion errors at the tiniest syntax slip.
By renaming variables, structure members, data types, or function names, we can let the compiler help us catch every place that needs rewriting when we refactor code.
Placing constants on the left side of equality comparisons also improves the compiler's error detection: mistyping == as = then becomes a hard error instead of a silent assignment.