Understanding a point in polygon code

Hi, I've recently found this point-in-polygon code on a website that I can no longer find. I was wondering if someone could explain the code and, if possible, break it down for me so I can understand just how a single line is created between two points. I am using this for a geofenced RC car, but I don't understand it, so I can't troubleshoot or change anything without breaking my whole code.

Here is the code; please just assume each variable in it is already declared.

```
if ((longitude[i] < testy && longitude[j] >= testy || longitude[j] < testy && longitude[i] >= testy)
    && (latitude[i] <= testx || latitude[j] <= testx)) {
  c ^= (latitude[i] + (testy - longitude[i]) / (longitude[j] - longitude[i]) * (latitude[j] - latitude[i]) < testx);
}

return c;
}
```

Your snippet is missing tons of info ...

If you want to know more, you should read about the Jordan curve theorem and the ray casting algorithm.

The idea is based on counting the number of times a ray from your test point crosses the polygon boundary. If you have a test point (Px, Py), a possible ray is to test either along the X axis (Py constant, any x in one direction) or the Y axis (Px constant, any y in one direction).
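Here is a minimal sketch of that idea in plain C++ (not your exact code; the function and array names are just illustrative). It assumes the polygon corners are passed in as two arrays of latitude/longitude, listed in order around the fence:

```
// Ray-casting sketch: shoot a horizontal ray from the test point and count
// how many polygon edges it crosses. An odd number of crossings = inside.
// lat[]/lon[] hold the n polygon corners, listed in order around the fence.
bool pointInPolygon(float testLat, float testLon,
                    const float lat[], const float lon[], int n) {
  bool inside = false;
  // (j, i) walks every edge: j trails i by one corner and wraps from the
  // last corner back to the first, so the polygon is closed automatically.
  for (int i = 0, j = n - 1; i < n; j = i++) {
    // Only an edge that straddles the test latitude can be hit by the ray.
    if ((lat[i] > testLat) != (lat[j] > testLat)) {
      // Longitude where this edge crosses the test latitude.
      float lonAtLat = (lon[j] - lon[i]) * (testLat - lat[i])
                       / (lat[j] - lat[i]) + lon[i];
      if (testLon < lonAtLat) {
        inside = !inside;  // every crossing flips inside/outside
      }
    }
  }
  return inside;
}
```

The `j = i++` trick is what makes the loop visit each edge exactly once, including the closing edge from the last corner back to the first; your snippet does the same counting job with the XOR.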

Typical algorithms use a comparison to flip a boolean to count the number of hits, but you will find the optimization technique with XOR in this article (third code sample).

Usually one first adds a bounding box test, looking at the min and max of each vertex's X and Y. If the point you test is not within this encompassing rectangle, then the point is outside the polygon.


Thanks for the response.

I was just wondering, though: which part of the code actually defines the min and max for each testx and testy?

Just sit down with a piece of graph paper and work your way through it.

This bit of the test determines whether the testy longitude is between those of points i and j:

longitude[i]< testy && longitude[j]>=testy

Thanks, that helped. Now that I have worked that part out, I'm just having one other problem.

Here is the full sample of code, but when it prints latitude[i], latitude[j], longitude[i] and longitude[j], all I get is 0.000000 for all of them. Can anyone tell me what I'm doing wrong?
```
void testInPoly() {

  int i, j = polyCorners - 1;

  c = false;
  for (i = 0; i < polyCorners; i++) {

    Serial.print("latitude[i] = ");
    Serial.print(latitude[i], 6);
    Serial.print("\n");
    Serial.print("latitude[j] = ");
    Serial.print(latitude[j], 6);
    Serial.print("\n");
    Serial.print("longitude[i] = ");
    Serial.print(longitude[i], 6);
    Serial.print("\n");
    Serial.print("longitude[j] = ");
    Serial.print(longitude[j], 6);
    Serial.print("\n");

    if ( ((latitude[i] > testy) != (latitude[j] > testy)) && (testx < (longitude[j] - longitude[i]) * (testy - latitude[i]) / (latitude[j] - latitude[i]) + longitude[i]) ) {
      c = !c;
      Serial.print("is this happening?");

      return c;
    }
  }
}
```

This is my output, and I'm also for some reason getting 8 outputs instead of 4.

Position: lon: 151.350616
lat: -32.795494 Num Satelites: 10
10 Satellites Found
testx = 151.350616
testy = -32.795494
latitude[i] = 0.000000
latitude[j] = 0.000000
longitude[i] = 0.000000
longitude[j] = 0.000000
latitude[i] = 0.000000
latitude[j] = 0.000000
longitude[i] = 0.000000
longitude[j] = 0.000000
latitude[i] = 0.000000
latitude[j] = 0.000000
longitude[i] = 0.000000
longitude[j] = 0.000000
latitude[i] = 0.000000
latitude[j] = 0.000000
longitude[i] = 0.000000
longitude[j] = 0.000000

Well... what did you store in your array??

You get two runs of the for loop, so twice the prints.

In each of the arrays I have 4 values, one for each lat and long of the four corners of the polygon.

The code you gave is testing to see if the object (testx, testy) is within a RECTANGLE, +/- a bit.

It does not test for the object being within a polygon!

All rectangles are polygons - BUT not all polygons are rectangles!

Mark

I understand what you are saying, but then how do I define it as a polygon for the point to be tested inside?

Did you read the links I pointed at in the first answer?

A polygon is defined by a sequential list of vertexes (some also say vertices, in the same way the plural of index can be indices...).

So your polygon is defined in an array of (X,Y) coordinates that you have to pre-fill before calling the function. They will define the “border” of your hit area.
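For example (the coordinates below are made up, just to show the shape of the data; use your real fence corners, and make sure the arrays are filled before the test function runs, otherwise they stay at 0.000000):

```
// Made-up example fence: four corners listed in order around the polygon.
// Replace these values with the real coordinates of your geofence.
const int polyCorners = 4;
float latitude[polyCorners]  = { -32.7950, -32.7950, -32.7962, -32.7962 };
float longitude[polyCorners] = { 151.3500, 151.3515, 151.3515, 151.3500 };
```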

If you have only 4 vertexes you have a quadrilateral (four edges and four vertexes); some quadrilaterals have a specific name depending on the geometric properties of the vertexes relative to each other (square, rhombus, parallelogram, trapezoid, rectangle, ...).

If you have a plain rectangle with sides parallel to the axes of your coordinate system, then checking whether a point is within this rectangle does not need complex iteration: find the min and max of the X and Y of your 4 corners and check whether your hit point is between those min and max values.
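A quick sketch of that min/max check (again just illustrative names, with the corners passed in as arrays like above):

```
// Axis-aligned bounding-box check: find the min/max latitude and longitude
// of all corners and test whether the point falls between them. This is
// exact for an axis-aligned rectangle and a cheap pre-test for any polygon.
bool pointInBoundingBox(float testLat, float testLon,
                        const float lat[], const float lon[], int n) {
  float minLat = lat[0], maxLat = lat[0];
  float minLon = lon[0], maxLon = lon[0];
  for (int i = 1; i < n; i++) {
    if (lat[i] < minLat) minLat = lat[i];
    if (lat[i] > maxLat) maxLat = lat[i];
    if (lon[i] < minLon) minLon = lon[i];
    if (lon[i] > maxLon) maxLon = lon[i];
  }
  return testLat >= minLat && testLat <= maxLat
      && testLon >= minLon && testLon <= maxLon;
}
```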