atof returning a different floating value

Hi,

I'm making an application that uses GPS input to calculate distances on the golf course, so I have been playing with parsing NMEA data. However, I have discovered that if you convert a string like "3.14" to a double value and then split it into two integers (one for the integer part and one for the decimals), it comes back as 3.13.

Why is that?

char str[] = "3.14";
int piR = 0;
double piD = 0;
int dec = 0;


void setup(){
 Serial.begin(4800);
  }
  
void loop(){
  piR = atoi(str);
  piD = atof(str);
  dec = 100*(piD-piR);
  
  Serial.print(piR);
  Serial.print(".");
  Serial.println(dec);
  delay(1000);
}

This is not unexpected in the world of computers; it comes down to rounding errors and the format in which floats are stored.

You can slog through the maths and see exactly why, but it does happen, and if it is an issue you have to fix it by how you handle the numbers rather than by trying to fix the compiler.
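
If you want to see it on the board, a quick diagnostic (just a sketch, not from the original post) is to print the parsed value and the intermediate result with extra decimal places; Serial.print takes a second argument giving the number of decimal places to show.

char str[] = "3.14";

void setup() {
  Serial.begin(4800);
  double piD = atof(str);             // on AVR boards, double is the same 32-bit float
  int piR = atoi(str);
  double scaled = 100 * (piD - piR);  // the value that gets truncated into dec

  Serial.println(piD, 7);             // the nearest representable value, not exactly 3.14
  Serial.println(scaled, 7);          // something just under (or over) 14
  Serial.println((int)scaled);        // assigning to an int truncates towards zero
}

void loop() {}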

It is particularly important to recognize that comparing two floating point numbers for equality is almost always a bad idea.

Something like

double piD = 3.14;

if(piD == 3.14)
{
   Serial.print("piD is equal to pi");
}

is unlikely to work, because piD may actually be stored as 3.139999 or some other value that is very close to, but not exactly equal to, 3.14.
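
The usual workaround (a minimal sketch, not from the original posts) is to compare against a small tolerance instead of testing for exact equality; fabs() comes from math.h, which the Arduino core pulls in for you:

void loop() {
  double piD = 3.14;

  // treat the two values as equal if they differ by less than a small tolerance
  if (fabs(piD - 3.14) < 0.0001)
  {
     Serial.print("piD is close enough to pi");
  }
}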

" piR = atoi(str); "
" piD = atof(str); "

Ascii to Integer!

Try with atol (ascii to Long)

Had fits getting proper return values and printing floats with atoi & itoa....
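
For what it's worth, atol() is a drop-in replacement when the integer part might overflow an int, but like atoi it stops at the decimal point, so the fraction is still dropped. A minimal sketch (not from the original posts):

char str[] = "3.14";

void setup() {
  Serial.begin(4800);
  long piL = atol(str);   // parses the leading digits into a long: 3
  Serial.println(piL);    // still prints 3; the ".14" is ignored, just as with atoi
}

void loop() {}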

Almost ALL of the "automatic" conversions between floats and integer types in C are simple truncation, so if piD is stored as 3.1399999, then 100*(piD - piR) comes out as 13.999999, which truncates to 13 and prints as 3.13 (for example). To get the values you expect, you probably have to do a significant amount of (+ 0.5) in order to get rounding instead.
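
Applied to the sketch above (a minimal sketch, not from the thread itself), that means adding 0.5 before the truncating assignment; the round() macro in the Arduino core does the same thing:

void loop() {
  piR = atoi(str);
  piD = atof(str);
  dec = 100 * (piD - piR) + 0.5;   // adding 0.5 before the int assignment rounds to the nearest whole number
  // or: dec = round(100 * (piD - piR));

  Serial.print(piR);
  Serial.print(".");
  Serial.println(dec);             // now prints 3.14
  delay(1000);
}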