I am developing a handheld data logger. The CPU is an STM32F10x and the storage chip is a 16 MByte SPI flash.
Information about the flash chip:
Sector size: 4096 bytes
Programming page size: 256 bytes
Currently I am using a simple file system and appending each acquired record to a binary file, though the data does not necessarily have to be stored as a file.
The structure of the record:
typedef struct strRecord {
    char     Info[56];
    uint32_t TimeStamp;
    uint32_t SomeNumber;
    // ... some other data
} Record;
The length of the record is fixed. Before a record is appended to the binary file, the system must check whether a record with the same Info (the first member of the structure) already exists in the file. If such a record exists, the new record is merged into it; otherwise, a new record is appended. There is no relation between the records; they arrive in random order.
Currently I use an exhaustive approach: read the file from beginning to end and compare each record. This only works while the file contains a few hundred records or fewer.
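For reference, the lookup is currently essentially this (a simplified sketch; read_record and record_count are placeholder names for my file-system access, not real APIs):

#include <stdint.h>
#include <string.h>
#include <stdbool.h>

/* Record is the struct shown above. These two helpers are
 * placeholders for my file-system access layer. */
extern bool     read_record(uint32_t index, Record *out);   /* false = read failed */
extern uint32_t record_count(void);

/* Returns the index of the record with matching Info, or -1 if not found. */
int32_t find_record(const char info[56])
{
    Record rec;
    for (uint32_t i = 0; i < record_count(); ++i) {
        if (!read_record(i, &rec))
            break;                          /* unexpected end of file */
        if (memcmp(rec.Info, info, sizeof rec.Info) == 0)
            return (int32_t)i;              /* merge into this record */
    }
    return -1;                              /* not found: append a new record */
}

In the worst case every insert reads the entire file, which is why it degrades so quickly.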
The problem is: as the file grows, the query becomes very slow. The system should work with tens of thousands of records.
I have attempted to port an older SQLite version to the system, but I only have about 140 KB of flash and 48 KB of RAM to spare. I could only trim SQLite down to about 200 KB, so the attempt failed.
Since the record contains a fixed-length string, maybe some trivial hash-table-style structure would rescue me, but all the string hash-table implementations I know of keep the table in memory.
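What I have in mind is roughly the following (only a sketch of the idea; NUM_BUCKETS, the index-entry layout, and the FNV-1a hash are my own assumptions, not a tested design):

#include <stdint.h>

#define NUM_BUCKETS 1024u   /* power of two, so the modulo is a simple mask */

/* One index entry per record, kept in a reserved flash region:
 * the record's offset in the data file plus the offset of the next
 * entry in the same bucket. 0xFFFFFFFF marks the end of a chain,
 * which is convenient because erased flash reads as 0xFF. */
typedef struct {
    uint32_t RecordOffset;
    uint32_t NextEntryOffset;
} IndexEntry;

/* FNV-1a over the fixed-length Info field, reduced to a bucket number. */
static uint32_t hash_info(const char info[56])
{
    uint32_t h = 2166136261u;
    for (int i = 0; i < 56; ++i) {
        h ^= (uint8_t)info[i];
        h *= 16777619u;
    }
    return h & (NUM_BUCKETS - 1u);
}

A lookup would then hash Info, read that bucket's head entry from the reserved flash area, and walk the chain, so only the few records in one bucket are read instead of the whole file. What I do not know is how to lay such a structure out on the flash, given the 4096-byte erase sectors.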
Could anyone give me a hint on how to implement a hash table on a simple file system or on raw flash?