#include <REGX52.h>

unsigned char i;
float x;

void main()
{
    x = 0;
    for (i = 1; i <= 30; i++)
    {
        x += 1.3;
    }
}
This should ideally produce x = 39.
But in the debugger the value is shown as x = 38.999999. What could be the issue?
I guess this is why the library en.wikipedia.org/.../C_mathematical_functions includes round()
No, the round() function has more general uses.
But this is why the normal printf() prints floating-point numbers with fewer significant digits than the full float mantissa can represent - the last bits of the mantissa are often garbage.
It's just that cancellation errors and the numerical stability of the formulas used affect how many of the least significant bits are garbage, so printf() can't know how many "safe" digits remain in a floating-point number.
Not even double or extended-precision floating-point arithmetic can guarantee a fixed number of "safe" digits if a numerically unstable formula is used. Bad cancellation can throw away all significant digits in a single step.