# GPS cumulative errors

Hi,

I have an Arduino-controlled aircraft that logs its GPS position at 1-second intervals.

After the flight I load the log file into an Excel spreadsheet and do some processing to see how far it has flown. If the flight were a straight line, I could just work out the distance between the take-off and landing points and that would be the total.

This method won't work if the aircraft lands where it took off, so instead I work out the distance flown in each 1-second interval and add these up over the whole flight. The problem is that I seem to accumulate what I assume are rounding errors as the individual distances get smaller: if I look at the distance flown every 10 seconds, the total over-reads by about 15%; every 5 seconds, it over-reads by about 30%; and at once a second the errors just get too big.

So, in summary, is there any way of accurately measuring the distance of an aircraft over a meandering flight?
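For reference, the per-interval summation described above can be sketched outside Excel like this. This is a simulation, not the poster's data: the noise level (2 m 1-sigma), the airspeed (5 m/s) and the straight northward track are all assumptions, chosen just to show how per-fix noise inflates the summed distance more at shorter intervals.

```python
# Sketch: summing per-interval haversine distances over simulated noisy
# GPS fixes. The noise level, speed and track are assumed values.
import math
import random

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points (degrees)."""
    R = 6371000.0  # mean Earth radius, metres
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

random.seed(1)
deg_per_m = 1.0 / 111320.0  # rough degrees-of-latitude per metre
# Simulated flight: 300 fixes at 1 Hz, flying north at 5 m/s.
true_track = [(52.0 + i * 5 * deg_per_m, 0.0) for i in range(300)]
# Add assumed ~2 m 1-sigma position noise to each fix.
noisy_track = [(lat + random.gauss(0, 2.0) * deg_per_m,
                lon + random.gauss(0, 2.0) * deg_per_m)
               for lat, lon in true_track]

def total_distance(track, step=1):
    """Sum the point-to-point distances, sampling every `step` fixes."""
    pts = track[::step]
    return sum(haversine_m(*pts[i], *pts[i + 1]) for i in range(len(pts) - 1))

for step in (1, 5, 10):
    print(f"every {step:2d} s: noisy sum {total_distance(noisy_track, step):7.1f} m, "
          f"true sum {total_distance(true_track, step):7.1f} m")
```

With noise of the same order as the per-second step, the 1-second sum over-reads badly, and the effect shrinks as the sampling interval grows, matching the pattern described above.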

It would be instructive to do your 1 second log with the aircraft on the ground and not moving. GPS is only good for a few meters and that error wanders around. You need a faster plane. 8^)

The issue is that there is a position measurement error associated with each sample interval so your computed distance is zigzagging around the true flight path.

Given that you are reasonably certain the aircraft follows a smooth trajectory, you can do some form of curve fitting through a sequence of points. In Excel this might be based on the LINEST function.

For example, you might take 6 data points, compute a quadratic fit through them (independently for each axis, (x, y, z) or (lat, lon, alt)), then use the fitted coefficients to take the distance between the fitted 3rd and 4th points as the linear distance traveled during that second.
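The windowed quadratic-fit idea above could be sketched with numpy rather than LINEST (the window size, noise level and 5 m/s test track below are assumptions, not anything from the actual logs):

```python
# Sketch of the sliding-window quadratic fit: fit each axis against time
# over a 6-point window, then measure the step between the fitted
# 3rd and 4th points. Window size and test data are assumed values.
import numpy as np

def smoothed_step_lengths(xyz, window=6):
    """xyz: (N, 3) array of positions in metres, sampled at 1 Hz.
    Returns per-second distances estimated from local quadratic fits."""
    t = np.arange(window)
    steps = []
    for i in range(len(xyz) - window + 1):
        seg = xyz[i:i + window]
        # Fit x(t), y(t), z(t) independently with degree-2 polynomials.
        fitted = np.column_stack([
            np.polyval(np.polyfit(t, seg[:, k], 2), t) for k in range(3)
        ])
        # Distance between the fitted 3rd and 4th points (indices 2 and 3).
        steps.append(np.linalg.norm(fitted[3] - fitted[2]))
    return np.array(steps)

# Quick check on a noisy straight line at 5 m/s, 60 fixes.
rng = np.random.default_rng(0)
truth = np.column_stack([5.0 * np.arange(60.0), np.zeros(60), np.zeros(60)])
noisy = truth + rng.normal(0, 2.0, truth.shape)  # assumed 2 m 1-sigma noise

raw = np.linalg.norm(np.diff(noisy, axis=0), axis=1).sum()
smooth = smoothed_step_lengths(noisy).sum()
print(f"raw sum: {raw:.0f} m, smoothed sum: {smooth:.0f} m")
```

Note the smoothed sum covers slightly fewer steps than the raw one (the fit needs a full window around each step), so it is a sketch of the idea rather than a drop-in replacement.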

Precision real time navigation systems do something like this via a technique called Kalman filtering, which is a non-trivial topic. Some GPS receivers can provide a filtered navigation solution.

As MrMark said, each point has its own error. Adding (or subtracting) the points also adds their errors. You can reduce the error by applying a "model" (a "filtered navigation solution") that makes assumptions about the real-world behavior of the platform. Some GPS devices allow configuring whether the platform is stationary, a pedestrian, a car or an aircraft. Curve approximation is another way, but it doesn't incorporate physics constraints the way a platform model does.

Adding an IMU and/or compass is another possibility. The magnetic orientation, acceleration and rotation can be used to reduce the error on each point. Those devices have their own set of errors, of course.

Are you logging a `float` for the latitude and longitude? You could be losing significant digits. Depending on your parser, it may have a long integer form of the lat/lon. Save that instead, and scale it in Excel.
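To see why a 32-bit `float` matters here: a float carries roughly 7 significant decimal digits, and at a latitude around 52° its resolution is a few tenths of a metre, which is significant at these step sizes. A quick round-trip through a 32-bit float shows the loss (the coordinate is a made-up example):

```python
# Demonstrate precision lost by storing a latitude in a 32-bit float.
# The coordinate below is a hypothetical fix, not real data.
import struct

def to_float32(x):
    """Round-trip a Python float (64-bit) through a 32-bit float."""
    return struct.unpack('f', struct.pack('f', x))[0]

lat = 52.1234567                   # hypothetical fix, degrees
lat32 = to_float32(lat)
error_deg = abs(lat - lat32)
error_m = error_deg * 111320.0     # metres per degree of latitude, roughly
print(f"float32 stores {lat32!r}, error ~ {error_m:.2f} m")
```

A few tenths of a metre of quantisation error per fix gets added on top of the GPS's own noise, which is why logging the receiver's integer lat/lon (or the raw text string) and scaling later is the safer choice.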

Thanks for everyone's suggestions.

Are you logging a float for the latitude and longitude? You could be losing significant digits.

I copy the lat/long data from the GPS straight to the log file as a text string.

I can understand how the distance between two close points may be (mis)calculated with an error of a few centimeters. But I had assumed that if I'm measuring the distance between several hundred such points, the error for each one would be plus or minus a few centimeters; instead it seems that the error is always positive and never negative.

I think the error is more than "a few centimeters". Again, just place your plane on the ground and record the GPS every second to see how much error you get.

If total distance is the end goal and you have an accelerometer/speedometer logging velocity:

avg velocity × time of flight = distance traveled

Fulliautomatix: But I had assumed that if I'm measuring the distance between several hundred such points, the error for each one would be plus or minus a few centimeters; instead it seems that the error is always positive and never negative.

The error that accumulates is effectively "miss distance" which is always a non-negative value. "Miss distance" is the radial distance between the position reported by the GPS and the true position of the sensor regardless of the direction of the error.

Statistically, the error in measurement might be well represented as a zero mean Gaussian distribution while the corresponding miss distance is a Rayleigh distribution.
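This is easy to verify with a quick simulation: zero-mean Gaussian errors on each axis give a miss distance r = sqrt(ex² + ey²) that follows a Rayleigh distribution, whose mean is sigma·sqrt(pi/2), always greater than zero. (The 2 m sigma below is just an assumed value.)

```python
# Zero-mean Gaussian errors per axis -> Rayleigh-distributed miss distance.
# The mean miss distance is sigma * sqrt(pi/2), never zero.
import math
import random

random.seed(42)
sigma = 2.0  # assumed 1-sigma per-axis GPS error, metres

miss = [math.hypot(random.gauss(0, sigma), random.gauss(0, sigma))
        for _ in range(100_000)]
mean_miss = sum(miss) / len(miss)

print(f"mean miss distance: {mean_miss:.2f} m "
      f"(theory: {sigma * math.sqrt(math.pi / 2):.2f} m)")
```

So even though each axis error averages to zero, the radial error never does, which is exactly why the summed path length only ever over-reads.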

INTP: If seeing distance is the end goal and you have an accelerometer/speedometer logging velocity

avg velocity × time of flight = distance traveled

While the math makes sense, generally the bias and drift errors of inertial sensors are such that this approach doesn't work well in practice. If the accelerometer/speedometer are GPS based then the core problem hasn't really changed.

I think the error is more than "a few centimeters". Again, just place your plane on the ground and record the GPS every second to see how much error you get.

I already have this data - from when the aircraft has been on the ground prior to take-off. The GPS is very consistent and stable; it's only if the logger is left on for hours that any drift becomes noticeable.

avg velocity × time of flight = distance traveled

True, but I only log airspeed, so that only holds if the wind speed is zero.

The error that accumulates is effectively "miss distance" which is always a non-negative value. "Miss distance" is the radial distance between the position reported by the GPS and the true position of the sensor regardless of the direction of the error.

True. But looking at the distance between two consecutive points, you might hope that the miss distance would be off by the same amount for both fixes - i.e. if both points are 20 cm out to the north, then the distance between the two should still come out right.