
Converting 4-digit inline ASCII to 16-bit Hex/Dec

Dear Experts,

I need some quick reference code in assembly that is extremely compact (code size is a big constraint now, which is why assembly is the last resort) to convert an incoming serial line of ASCII digits into a hex value for computation. Roughly 4 ASCII digits, e.g. 31h 32h 33h 34h, to be read as 1234 decimal or 4D2 hex.

Appreciate some really good helping hands!

Cheers,
James

  • What's wrong with:

    U16 total = 0;
    while (chars_to_process--)   /* decrement so the loop terminates */
       {
       total = total * 10 + (getchar() & 0x0f);
       }
    

    I doubt coding that particular loop in assembly will save you a lot of bytes of code, but feel free.

    More error checking, or support for different input formats will cost more, of course.

    There's also atoi(), strtoul(), and sscanf(), depending on what functions you already use in your application.
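
    For completeness, here is a minimal self-contained sketch of the same idea, assuming exactly four digits arrive on the serial line and using a hypothetical serial_getc() read routine in place of getchar(); the range check on each character is the only error handling:

    /* Sketch only: serial_getc() is a hypothetical blocking read of one
       character from the serial port; substitute your own input routine. */
    extern char serial_getc(void);

    unsigned int ascii4_to_u16(void)
       {
       unsigned int total = 0;
       unsigned char i;

       for (i = 0; i < 4; i++)
          {
          char c = serial_getc();
          if (c < '0' || c > '9')          /* stop on anything that isn't a digit */
             break;
          total = total * 10 + (c - '0');  /* shift previous digits up one decimal place */
          }
       return total;                       /* "1234" -> 1234 decimal, 0x4D2 hex */
       }

    On a typical small 8-bit target an unsigned int is 16 bits wide, which comfortably holds the 9999 maximum of four decimal digits.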

  • Thanks Drew!

    I forgot that just multiplying the running total by 10 and adding each digit places the digits correctly by position, e.g. ((1*10 + 2)*10 + 3)*10 + 4 = 1234 = 4D2 hex.

    cheers,
    James

    You're under the false assumption that the CPU uses "hex codes" for actual computations. It doesn't. Numbers are just that: numbers. They are stored in memory in a way that some humans prefer to view in hexadecimal representation, but that representation is not the actual number.

    You're re-inventing atoi(), which see.
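
    If the digits have already been collected into a NUL-terminated buffer, the standard-library route mentioned above looks roughly like this (a minimal sketch; the buffer contents are only an example):

    #include <stdio.h>
    #include <stdlib.h>

    int main(void)
       {
       char buf[] = "1234";               /* the ASCII bytes 31h 32h 33h 34h */
       unsigned long value = strtoul(buf, NULL, 10);

       printf("%lu = 0x%lX\n", value, value);   /* prints: 1234 = 0x4D2 */
       return 0;
       }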