Hello, I'm new to the 8051 and I am hoping someone has had to solve a similar problem to this before, although this is not an 8051 specific problem. Can someone give me assistance on how to convert a 12-digit binary coded decimal value to hexadecimal on the 8051? The BCD value is packed into 6 consecutive bytes. Any help would be appreciated. Regards, Stephen McSpadden.
The best way is the direct one: multiply each digit by its decimal weight and sum the results. For a 12-digit value, BINARY_VALUE = FIRST_DIGIT * 10^11 + SECOND_DIGIT * 10^10 + THIRD_DIGIT * 10^9 + ... + TWELFTH_DIGIT * 1, with each power of ten held as a binary constant. Regards, M.
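As a rough portable sketch of that direct method (not 8051-specific; the function name, and the assumption that the digits have already been unpacked one per byte with the most significant first, are mine for illustration):

```c
#include <assert.h>

/* Direct method: multiply each unpacked BCD digit by its decimal
   weight and accumulate. digits[0] is the most significant of n
   digits; walking from the least significant end lets the weight
   simply be scaled by ten each step. */
unsigned long bcd_digits_to_bin(const unsigned char *digits, unsigned char n)
{
    unsigned long weight = 1;
    unsigned long value  = 0;
    unsigned char i;

    for (i = n; i != 0; i--) {
        value  += digits[i - 1] * weight;
        weight *= 10;
    }
    return value;
}
```

Note that on a real 8051 you would likely replace the general multiplications with a table of precomputed binary powers of ten, since the part has only an 8x8 hardware multiply.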
The following code converts the array of packed BCD digits in the variable bcd to a 32-bit binary value in the variable hex. Of course, a 32-bit value cannot hold the full range of 12 decimal digits; this code just shows how to go about the conversion.
void main( void )
{
    data unsigned char bcd[6] = { 0x00, 0x00, 0x12, 0x34, 0x56, 0x78 };
    data unsigned char loop;
    unsigned char data *p;
    data unsigned long hex;

    hex  = 0;
    loop = sizeof( bcd );
    p    = &bcd[0];

    do
    {
        hex = hex * 10;
        hex = hex + ( *p >> 4 );    /* high nibble */
        hex = hex * 10;
        hex = hex + ( *p & 0x0F );  /* low nibble  */
        p++;
    } while( --loop != 0 );
}
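For what it's worth, here is a sketch of the same nibble-by-nibble loop widened to cover all 12 digits (999,999,999,999 needs 40 bits). This uses unsigned long long, which C51 itself does not provide, so treat it as a host-side or reference version; the function name is my own:

```c
#include <assert.h>

/* Convert `bytes` packed-BCD bytes (two digits per byte, most
   significant byte first) to binary, accumulating into a 64-bit
   value so the full 12-digit range fits. */
unsigned long long packed_bcd_to_bin(const unsigned char *bcd, unsigned char bytes)
{
    unsigned long long value = 0;
    unsigned char i;

    for (i = 0; i < bytes; i++) {
        value = value * 10 + (bcd[i] >> 4);    /* high nibble */
        value = value * 10 + (bcd[i] & 0x0F);  /* low nibble  */
    }
    return value;
}
```

On the 8051 proper you would instead keep the accumulator as a multi-byte value (e.g. 5 or 6 bytes) and do the multiply-by-ten as shift-and-add across the bytes.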
Graham has presented such a nice, elegant algorithm for converting BCD to binary. Should we have a competition to see who can write the smallest version measured using the compiler listing file? Jon
I don't think you want to be issuing a challenge like that to Graham! ;-) Are you excluding solutions using #pragma ASM... ?
Thanks to Graham and everyone who took the time to reply. I'm afraid I didn't see the reply before I implemented my own solution but it was interesting to compare results. (And no, it hasn't taken this amount of time to implement it!! ;) ) Again, apologies for the length of time it has taken me to acknowledge your help. Regards, Stephen McS.