I have a structure, and I would like to compute the offsets to members within it.
I've attempted this numerous ways, all of which fail to generate a CONSTANT offset; instead the compiler insists on generating code for what is essentially the subtraction of two constants. (mutter) So
typedef unsigned short uint16_t;

#define SENSOR_TYPE_OFFSET \
    ((uint16_t)((char *)&ralph.constants.some_stuff.silly_putty) - \
     (uint16_t)((char *)&ralph))
generates lots of cute code, but not a constant offset to the member of the type that ralph is. So, are there any suggestions for beating the compiler into doing what I want? I would rather not compute the offsets by hand. I suppose I could also add up the sizeof() of each member, but either way is annoyingly messy and very ugly.
Actually, ANSI C has a macro for that. It's called offsetof(), and it's defined in stddef.h. Hopefully, Keil implemented it to generate efficient code.
Adding up individual sizeof() is very, very, very bad!
Does Keil see the full declaration of your pointer? What type of pointer is it? A generic pointer, or XDATA etc?
Just a couple of complementing comments: Per's assertion that "Adding up individual sizeof() is very, very, very bad!" is based on portability issues: the compiler might insert padding into your structure to make it more comfortable to work with. If so, adding up element sizes might leave your program very dead. Always use something like offsetof, which is a macro that looks like this (for Keil):
#define offsetof(t, memb) ((__CLIBNS size_t)__INTADDR__(&(((t *)0)->memb)))
Notice how 0 is cast to a pointer to the proper type, and the member's address is then taken relative to it.