In the book "TCP/IP Lean" [1], the author states: "I have used #define in preference to typedef because compilers use better optimisation strategies for their native data types." Is this true of the Keil C51 compiler? I.e., will C51 generate better-optimised code from source using
#define U8 unsigned char
rather than
typedef unsigned char U8;
Thanks, Jon.

This is consistent with my limited knowledge of compiler design. The compiler knows how to deal with objects and expressions having particular characteristics, with size and "type" being among those characteristics. It should not make any difference whether a compiler learns about an object having one of the "native" types directly through a macro expansion or by "looking back" to resolve a typedef. Two objects having the same "root type" should be treated the same. Of course, I could be wrong entirely. ;-) --Dan Henry