
Converting 4 digits inline ASCII to 16bit Hex/Dec

Dear Experts,

I need some quick reference code in assembly that is extremely compact (code size is a tight constraint right now, which is why assembly is the last resort). It should convert an incoming serial line of ASCII digits into a binary value for computation. Roughly 4 ASCII digits, e.g. 31h 32h 33h 34h, should be read as 1234 decimal, i.e. 4D2 hex.

Appreciate some really good helping hands!

Cheers,
James

Parents
  • You're under the false assumption that the CPU uses "hex codes" for actual computations. It doesn't. Numbers are just that: numbers. They are stored in memory in a way that some humans prefer looking at in a hexadecimal representation --- but that representation is not the actual number.

    You're re-inventing atoi(), which see.
