Let me tell you a story about a guy named Jed...
A long long time ago (pre-ANSI C), in a galaxy far far away, I worked for a company that had to develop internal "C" coding standards. "Jed" worked on one aspect of the standard while I worked on another. We held weekly meetings to reconcile our differences. In attendance, we had other professionals for simple sanity checking and to gain insights from different points of view.
Chris was one of our attendees and was a very experienced software veteran who had plenty of code in various satellite systems orbiting our planet today. By then, Chris was in upper management and graced us with his wisdom when he could.
Well, during one of our weekly meetings, "Jed" and I got into a simple disagreement about a rule on header files. We were at an impasse, so we waited for Chris to arrive and had him make the final decision; about five of us professional engineers were in the room.
When Chris arrived, he heard the arguments, and quickly announced that I was right. (Hence, Jed was wrong).
Well, Jed freaked out and wanted to take the guy outside and teach him a lesson! ... Jed was red-faced, quickly stood up, even took a step toward Chris, and said "Chris, let's just step outside and settle this! I am right and you don't know what you're talking about!" etc. etc.
The other attendees and I were duly impressed over Jed's technique of handling technical disagreements. Especially with upper management.
Instead of considering that he *might* be wrong, Jed leaped into the confrontation method of getting his way. Bullies do this because they lack the brain-power to reason through a disagreement. It is a childish trait.
Children are at a huge disadvantage when arguing with "an adult" (or somebody who is much smarter than they are), and they become very frustrated by their strong desire to assert themselves and their inability to win the mental sparring. They get physical and/or verbally abusive. Some people outgrow this, and some don't.
I think Jed showed his 'abilities' quite well. I find that this is true with so many people on so many subjects. I've seen this behavior many times over. I've seen it here on this forum.
When an "Original Poster" asks a question and people try to answer it (after much refinement of the OP's question), you get these side-bar posts where somebody will start attacking another poster's efforts. And I mean 'attack', not augment or refine.
I don't have a problem with correcting or clarifying others, or even the occasional sprinkling of sarcasm, but when it ALWAYS devolves into some vindictive vitriol between a brisling poster and the rest of 'us,' I wonder if it is out of ignorance, malice, or some twisted form of self-entertainment. All three of which are adolescent behaviors. (en.wikipedia.org/.../Adolescence)
Since the regular players here are detail oriented and thus they are savvy enough to know who I'm talking about, I don't think I have to name names.
He is critical enough to figure it out himself, so I would expect the offender to read this and ask himself whether he is demonstrating Ignorance, Malice, or Entertainment, or being an adult and providing a constructive post, before he posts.
And, I hope his "Mea Clupea" (en.wikipedia.org/.../Mea_culpa) will be a silent one, because I'm kind of tired of reading his Hostile Postings (HP).
</rant> --Cpt. Vince Foster 2nd Cannon Place Fort Marcy Park, VA
Erik: "please define 'raw'".
My definition of raw memory structures is transmitting or storing data in the exact format that the compiler puts the data in memory. Much of that format is defined by the compiler vendor, not by you or the person responsible for the other side of a communication link. And since the compiler vendor has the full right to change that definition, you cannot own a document that correctly documents the data format used.
Me: "Transmitted or stored data should be described by a 100% complete document" Erik: "it is, of course, how else could i use it?"
It can't be 100% documented if it relies on mechanisms that the compiler vendor may change between different releases of the compiler, or that are likely to fail if the source code is built with another vendor's compiler.
To be 100% documented, the document must specify the actual location of every single bit. And the source code must make sure that the information is really placed at that bit position, not just placed there by chance because the current compiler happens to choose that location through some private design decision.
There is no problem using any byte order for a transmission, as long as you have a document that says little-endian is always used, or that bit 0x40 of the third transmitted byte (before any endian byte-swapping has been performed) in message xx specifies which of two possible endian alternatives is used. Just relying on memcpy() will not enforce the required endianness. If you know that your processor has the correct byte order, memcpy() may do the job when writing, but what happens if the code is run on a different processor?
Transmitting bit fields (as opposed to manually handled flags) will always be borked, since you can't write documentation that takes into account possible future changes of a compiler.
If the other side is transmitting a raw bit-field, then you have to try to deduce the current location of these fields, while living with the knowledge that a changed compiler on the other end may require you to rewrite your side of the communication. If the coder (or technical lead) on the other side of the communication link was a fool, you will have to suffer, since both sides will - by implication - be non-portable.
Using bit fields inside code gives cleaner code. But a lot of developers intentionally choose to assign the bits manually, just to avoid the extra work of writing conversion functions "flags_to_native" and "native_to_flags" when they need to share information, or store the information on a medium where it may later be read by an application built with another compiler or for another architecture.
and I doubt the endianness will change there.
and there are no bitfields, just #defines of 'masks'
Erik
Dave wrote: "Whatever y'all do... please don't stop. It's always refreshing to find these lengthy discourses, and to take the time to read them." I second his post. Even the extra flaming. I have noticed that when the flaming gets 'out of hand', the thread will disappear, so I guess that some moderator must occasionally look at the posts.
Erik writes:
defines 'sprat' as a *** fishy thing.
Well gents, I submit this with apologies to the more erudite and entertaining suggestions/comments of this post. But I see what appears to me to be a basic problem in most posts, other than the lack of registration (which I'm voting for adopting). Most posts suffer severely from what might be called sarchasm. Also, this apparent confusion is exaggerated by the very common dopeler effect of multiple posters. The spellings are correct in my dictionary.
Definitions: Sarchasm: The gulf between the author of sarcastic wit and the person who doesn't get it. Dopeler effect: The tendency of stupid ideas to seem smarter when they come at you rapidly from several directions. Bradford
if you do not "Wrap code in preprocessor directives" it can never be portable by the "sardine standard"
No. I said that wrapping code in preprocessor directives indicates that it is not portable. The preprocessor directives do not make non-portable code portable. Please try to understand this.
On one hand you claim efficiency is not important; on the other you claim that the "sardine standard code" is not inefficient.
No. Please re-read my comments on efficiency. The world is not black and white - there is only worth in making code efficient if it needs to be efficient, otherwise it is better to write it with portability and clarity as the foremost priorities. If you want to consider this further please take a look at my response to your comments about malloc() further down this thread.
I care about efficiency where efficiency is the most important factor. With sensible design, however, it rarely is.
so, "sensible design" makes efficiency irrelevant. I have always thought that "sensible design" was efficient design.
There are many factors to consider in a sensible design - development cost, maintainability, reliability to name but a few. Perhaps you could do with broadening your horizons a little?
You are very confused.
here you go again making statements about me that you have no background for making.
Well, either you are confused or are deliberately misunderstanding a quite simple point.
Any more thoughts on whether my apparently baseless assertion about your knowledge of the 'C' standard remains baseless?
I stated (clearly to all but you) that using 'bit' was an example of where adapting code to a particular processor/compiler would affect efficiency.
An example, according to your favoured definition, is representative of a group or type. You are now arguing that what you presented as an example is not representative of what you meant. This makes no sense.
WOW, only a small insult at the end, I think this one deserves a civil answer. if you do not "Wrap code in preprocessor directives" it can never be portable by the "sardine standard"
No. I said that wrapping code in preprocessor directives indicates that it is not portable. The preprocessor directives do not make non-portable code portable. Please try to understand this. now I am (I confess) totally lost. If you cannot use preprocessor directives to make code portable, how can you then make it portable, say, between little-endian and big-endian situations, not to mention between different-length ints?
Please re-read my comments on efficiency. The world is not black and white - there is only worth in making code efficient if it needs to be efficient, otherwise it is better to write it with portability and clarity as the foremost priorities. even in a situation where code does not need to be efficient I would go, at least, half the way in making it efficient. I have had to do an almost total rewrite of an inherited project that was "efficient enough" till an added feature was requested. I do disagree with your implication that efficient code cannot be reasonably portable and fully maintainable.
There are many factors to consider in a sensible design - development cost, maintainability, reliability to name but a few. Perhaps you could do with broadening your horizons a little? in addition to what I replied to the previous quote, which applies here as well: re "development cost, maintainability, reliability" - again, you make the assumption that because I mention one point, I am ignorant of the remaining. I do my utmost to make what leaves my desk or my lab efficient, maintainable and reliable. As for development cost, that is a function of anticipated production volume. I hope with the above to have clarified that I do not need "broadening your horizons a little". If you still ASS U ME so, that will be your problem.
Erik (real name)
Erik: "now I am (I confess) totally lost. If you cannot use preprocessor directives to make code portable, how can you then make it portable, say, between little-endian and big-endian situations, not to mention between different-length ints?"
Vince: "I don't think ANY embedded professional would expect to write in pure 'C'."
I might manage a truly magnificently portable program slightly larger than "Hello World" if I target a Posix platform.
In this case, we are arguing (at least mainly) about embedded development.
So, our programs are not portable. They can't be. But we can affect the cost and time needed to port them to another platform.
Littering the code with huge amounts of #ifdef will not help, since the cost and time needed to maintain such a monster would be gruesome. But it is a direct approach, and hence the first way a beginning developer learns how to write "portable" code - or in this case, code compilable for multiple platforms.
A better way is to see which parts of the software are very closely tied to the hardware, and separate those into separate files. It will not be "drivers", but it is a beginning. When speed is important, the full functionality may be moved to a target-specific file. When speed isn't as important, the code may be split into two layers - one that worries about the bits, and one that worries about decisions.
In some cases, you may then need a few #ifdef in one or two header files to select which set of target-specific functions to use. In some situations you may use the project manager and specify which source files to include in the build. Most often you will be stuck with something in-between.
So a "portable" program may contain #ifdef, and often has to. But the #ifdef used will not be sprinkled around. They may define an UNPACK32(p) macro that knows in which order the four bytes should be loaded, allowing the send_config()/receive_config() functions to manage without caring about the required byte order. Or you may have buffer_target1.c, buffer_target2.c, buffer_targetx86.c, ... that contain the implementations of get_u32(ptr) functions, letting the project select the correct file for your active target.
In the end, the magical factor isn't whether the code contains #ifdef, but how the code has been partitioned and how/where any #ifdef are used. The maintenance cost isn't much affected by a target.h file containing one or a hundred #ifdef. But if I have 50 source files, each with 5-20 #ifdef, then I suddenly have a lot of source files to worry about.
Any more thoughts on whether my apparently baseless assertion about your knowledge of the 'C' standard remains baseless? 1) it is baseless if you mean 'none', not if you mean "can be recited completely without the document". 2) what concern of yours is it whether I read the C standard as my bedside literature or use it as a reference? 3) 'assertion' - I appreciate your choice of word: www.thefreedictionary.com/assertion "a positive statement, usually made without evidence". However, some of your statements in this regard have been quite a bit more than 'assertions'.
you are missing the one I refer to Mr. Sprat: "Wrapping code in preprocessor directives doesn't make it portable - in fact, it makes it clear that it is non-portable."
So, our programs are not portable. They can't be. But we can affect the cost and time needed to port them to another platform. BRAVO! BRAVISSIMO!
Littering the code with huge amounts of #ifdef will not help, since the cost and time needed to maintain such a monster would be gruesome. BRAVO! BRAVISSIMO!
But if I have 50 source files, each with 5-20 #ifdef, then I suddenly have a lot of source files to worry about. I inherited one of those and the only way I could get it to a level where I could maintain it was to make it far less 'portable' and, as usual, a port never was requested.
your post is even more interesting for me as I am (still) working on porting C167 code to an ARM. I can port most of the following modules without significant changes: queue(s), synchronization primitives, timers, trace buffer, scheduling logic. There are no preprocessor dependencies - all I did was change most of the basic types used to unsigned long to get rid of some inefficient assembly. I am still struggling with the target-specific features that really make the difference - interrupt handling is so different, and apart from that, I just HATE making target-specific modules that are a salad of many "target specific" functions.
There are no preprocessor dependencies - all I did was to change that is what has beaten a dead horse to a pulp in this thread: the contention that you can port without either having preprocessor directives or making changes.
"...I do not give a hoot about portability..."
"that is what has beaten a dead horse to a pulp in this thread"
Looking back through this thread, it would appear that YOU were the first one to mention it !!??
Why, oh why, oh why then do you insist on writing about it so much?
Before you get on that high horse (yet) again, realise that I just wrote a rhetorical question.
the correct (and complete) quote would be
all I did was to change most of the basic types used to unsigned long to get rid of some inefficient assembly
the nature of the modules that I mentioned is that they are platform independent in the sense that they are pure C implementations. I did put some effort into porting them, but the implementation details did not change much. What are you suggesting? That I rewrite (and test...) them all to satisfy my crazed desire to cut off 0.5 microseconds here and there?!
Erik: "you are missing the one I refer to"
Mr. Sprat: "Wrapping code in preprocessor directives doesn't make it portable - in fact, it makes it clear that it is non-portable."
No, I wasn't actually missing it. I was more pondering what to do with it. The question here is what "wrapping" means.
Use of #define in itself doesn't ruin an otherwise acceptable program. Standard guarding of header files is an example where we are "expected" to use #define statements.
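For reference, the standard header-guard idiom looks like this; the guard macro name and the declaration are placeholders, and the only rule is that each header's guard macro is unique across the project:

```c
/* my_driver.h -- hypothetical header.
 * The guard ensures the declarations are seen at most once per
 * translation unit, even if the header is #included repeatedly. */
#ifndef MY_DRIVER_H
#define MY_DRIVER_H

void driver_init(void);

#endif /* MY_DRIVER_H */
```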
Abuse of #define, on the other hand, can (and tends to) produce unreadable code.
The bad thing is that it can be a fine line between the two.
If kept together where I expect to find them, then I, personally, see them as useful. If spread out, I see them as evil.
Maybe Jack uses the word "wrapped" to mean "sprinkled with". Maybe his tolerance level is lower than mine. Some people are of the opinion that the preprocessor should be totally banished. enum and const declarations allow a large number of #define to be replaced with direct language constructs.
Build systems that allow different targets to have different include directories, and that allow easy selection of which groups of source files should be included in the build, can remove the need for a lot of (maybe all) #ifdef.
But in the end we fall back to the problem of "portable". Exactly what is the definition of "portable" in the current context?