Dear Erik!
What do I have to do to receive a 32-bit result if I multiply two 16-bit variables? For example:

unsigned int a;
unsigned int b;
unsigned long c;

c = a * b;

Why do I get a 16-bit result?
This is covered by the C standard's usual arithmetic conversions, which are always worth a closer look now and then: the multiplication is carried out in the type of the operands (here unsigned int, 16 bits on your target), and the result is only widened to unsigned long afterwards, when the truncation has already happened.
What do you think happens if at least one of a or b is also 32 bits long?