This is the error message I get. The project is for the NXP P80C51MC2. The code size is around 400K bytes. This occurs after I add another instance of a structure to the code. I have about 450 instances of this structure; I can take out one of them and the problem goes away. This project is several years old and we add code to it on occasion. The error occurs during the compile of the C file that includes the header file with the structures. I have tried splitting the header file up, but that did not help. Any ideas?
I am using Keil 8.05a on Win2000.
We are thinking of moving to a more powerful processor; we just have a lot of time and code invested in this one. The current project is a descendant of an earlier one. We are definitely considering an ARM in 16-bit mode. We don't really need 16 bits, just the speed. Any suggestions?

If you want to stay with the '51 and want speed, the "speed demon" in the '51 family is the SiLabs F12x/F13x, which is a one-clocker running at 100 MHz, i.e. 100 times faster than a plain vanilla '51 running at 12 MHz.

I looked at the limits. The one I do not understand is. Keep in mind that I have 450 instances of this structure already.

You said approximately 450; could it be 511?
Erik
We already use the SiLabs part F122. It is VERY fast (100 MIPS), but has only 128K bytes. We are using over 400K now.
Actually 434 instances in a header file and 132 instances in a C file. The instances in the C file are declarations of pointers passed as parameters in a function. That would make it 566.
"but only 128K bytes"

In my opinion, anything that uses more than 64K of code memory does not belong on the '51 (thus I use the '132).
HOWEVER, it seems that much of your 'code' is actually data (strings etc.); have you considered an external data ROM? I have 2 MByte of external data flash on one of my designs. If speed (e.g. fetching a string for display) is not of the essence, you could even use a serial flash.
In theory, you can process the header with another compiler (a normal 32-bit gcc, for example) and create a pre-chewed hex dump of raw data for the Keil linker to add in.
The 32-bit program could then create an array of 434 offsets into the data, which can be typecast into pointers to the structures.
Not the slightest bit portable, but it wouldn't take more than a couple of hours to a day to write such a program, and it shouldn't be too hard to modify your code to not use the individual objects, but instead typecast from a pointer into the big magic block of data together with an offset from the lookup table.
The gcc compiler will not run into trouble, and with a bit of cleverness when preprocessing the data, you may even manage to shrink the information slightly.
I really can't see why you would still need 132 pointers in the C file after such an approach.
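The blob-plus-offset-table scheme described above might be sketched like this in C. Everything here is invented for illustration: the `Record` layout, the blob contents, and the `get_record` helper are assumptions, and in the real scheme the blob and offset table would be emitted by the host-side program. The cast also assumes the generator wrote the bytes with the target's endianness and packing.

```c
#include <stdint.h>

/* Hypothetical structure standing in for the real one. */
typedef struct {
    uint16_t id;
    uint16_t value;
} Record;

/* In the real scheme this would be generated offline by the 32-bit
 * host program and linked in as one raw constant block. */
static const uint8_t data_blob[] = {
    0x01, 0x00, 0x64, 0x00,   /* Record 0: id=1, value=100 (little-endian) */
    0x02, 0x00, 0xC8, 0x00,   /* Record 1: id=2, value=200 */
};

/* Offset lookup table generated alongside the blob
 * (434 entries in the real case, 2 here). */
static const uint16_t offsets[] = { 0, 4 };

/* Typecast blob + offset into a structure pointer. */
static const Record *get_record(unsigned idx)
{
    return (const Record *)(const void *)(data_blob + offsets[idx]);
}
```

`get_record(1)->value` would then read the second record's value straight out of the blob, with no per-instance object ever declared to the Keil compiler.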
Actually 434 instances in a header file and 132 instances in a C file.
This bears hints of possible foul play, by itself. You might be mixing up definitions and declarations.
There should never, ever, be 434 instances of anything in a single header file. Certainly not 434 definitions. Definitions go to the C file. And if those 132 instances in the C file are definitions, there should be a header file with the 132 declarations to go with that. Not 434.
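The usual split between declarations and definitions looks roughly like this in C (the file names and the `Record` type here are only illustrative, not the poster's actual code):

```c
/* records.h -- declarations only: every translation unit that
 * includes this learns the type and that the object exists,
 * but the header itself allocates no storage. */
typedef struct {
    int id;
    int value;
} Record;

extern Record rec_table[2];   /* declaration, not a definition */

/* records.c -- the single definition; storage is allocated here.
 * (In the real file this line would follow #include "records.h".) */
Record rec_table[2] = { {1, 100}, {2, 200} };
```

With that split, including the header from many C files is harmless, because only `records.c` ever defines the objects.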
Not always. I use the header file not to declare public access but to break files up into manageable chunks. Since header files are included inline with the code, it makes a nice way of cutting down the file you look at. Maybe it's not ANSI C, but it makes my life easier. Only one file includes this header.