I have an Arduino-controlled aircraft that logs its GPS position at 1-second intervals.
After the flight I load the log file into an Excel spreadsheet and do some processing to see how far it has flown. If I only want the straight-line distance, I can simply compute the distance between the take-off and landing points.
That method won't work if the aircraft lands where it took off, so instead I compute the distance flown in each 1-second interval and sum those distances over the whole flight. The problem is that I seem to accumulate what I assume are rounding errors as the individual distances get smaller. If I sum the distance flown every 10 seconds, the overall total over-reads by about 15%; if I reduce the period to every 5 seconds, it over-reads by about 30%; and if I reduce the period to once a second, the errors just get too big.
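For reference, here is a minimal sketch of the segment-summing approach I describe above, assuming the log rows are (latitude, longitude) pairs in decimal degrees at 1 Hz. The function names and the `step` parameter are my own illustration, not part of any existing code; the distance formula is the standard haversine great-circle distance.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two fixes in decimal degrees."""
    R = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

def track_length_m(fixes, step=1):
    """Sum the segment distances between every `step`-th logged fix.

    With a 1 Hz log, step=10 corresponds to the 10-second sampling
    described above, step=5 to 5-second sampling, and so on.
    """
    pts = fixes[::step]
    return sum(haversine_m(*a, *b) for a, b in zip(pts, pts[1:]))
```

One thing worth noting about this approach: each GPS fix carries a few metres of random position noise, so every segment in the sum picks up a spurious distance of roughly that magnitude regardless of how far the aircraft actually moved. Shortening the interval adds more segments (and so more noise) without adding real distance, which may be what is driving the over-read rather than rounding error as such.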
So, in summary: is there any way to accurately measure the distance covered by an aircraft over a meandering flight?