Start video from Arduino with sensor

I have got an Arduino UNO and a Sharp 2Y0A02 IR distance sensor.
I also have the following Processing code.

// Using integration with GLGraphics for fast video playback.
// All the decoding stages, until the color conversion from YUV
// to RGB are handled by gstreamer, and the video frames are
// directly transferred over to the OpenGL texture encapsulated
// by the GLTexture object.
// You need the GLGraphics library (0.99+) to use this functionality:
// http://glgraphics.sourceforge.net/

import processing.opengl.*;
import codeanticode.glgraphics.*;
import codeanticode.gsvideo.*;
import processing.video.*; //Camera


Capture video; //Camera

GSMovie mov;
GLTexture tex;

int fcount, lastm;
float frate;
int fint = 3;


float x[] = new float[10000]; //Camera
float y[] = new float[10000]; //Camera 
color c[] = new color[10000]; //Camera
boolean show[] = new boolean[10000]; //Camera
int sum = 0; //Camera
color trackColor; //Camera

void setup() {
  size(640, 480, GLConstants.GLGRAPHICS);
  //frameRate(90);
  
  mov = new GSMovie(this, "station.avi");
  
  video = new Capture(this,width,height,30); //Camera
  
  // Use texture tex as the destination for the movie pixels.
  tex = new GLTexture(this);
  mov.setPixelDest(tex);
  
  // This is the size of the buffer where frames are stored
  // when they are not rendered quickly enough.
  tex.setPixelBufferSize(10);
  // With delPixelsWhenBufferFull(true), new frames that arrive while
  // the buffer is full are deleted forever, which can lead to dropped
  // frames:
  tex.delPixelsWhenBufferFull(false);
  // With false, they are kept by gstreamer and will be sent again
  // later. This avoids losing any frames, but increases the memory
  // used by the application.
  
  mov.loop();
  
  noStroke();
  
  trackColor = color(255,255,255);
  
  smooth();
}

void draw() {
  
  if (video.available()) //Camera
  {
    video.read(); //Camera
  }
  
      // Using the available() method and reading the new frame inside draw()
      // instead of movieEvent() is the most effective way to keep the 
      // audio and video synchronization.
      if (mov.available())
      {
        mov.read();
    
        // putPixelsIntoTexture() copies the frame pixels to the OpenGL texture
        // encapsulated by the tex object. 
        if (tex.putPixelsIntoTexture())
        {
          // Calculating height to keep the aspect ratio (cast to float
          // to avoid integer division).
          float h = width * float(tex.height) / tex.width;
          float b = 0.5 * (height - h);

          image(tex, 0, b, width, h);
      
          /*String info = "Resolution: " + mov.width + "x" + mov.height +
                        " , framerate: " + nfc(frate, 2) + 
                        " , number of buffered frames: " + tex.getPixelBufferUse();*/
        
          fill(0);
          //rect(0, 0, textWidth(info), b);
          fill(255);
          //text(info, 0, 15);

          fcount += 1;
          int m = millis();
          if (m - lastm > 1000 * fint)
          {
            frate = float(fcount) / fint;
            fcount = 0;
            lastm = m; 
          }
        }
      }

  // Mirror the x axis; drawing at -mov.width keeps the movie on screen.
  scale(-1.0, 1.0);
  image(mov, -mov.width, 0);

  // Before we begin searching, the "world record" for closest color is set to a high number that is easy for the first pixel to beat.
  float worldRecord = 500; 

  // XY coordinate of closest color
  int closestX = 0;
  int closestY = 0;

  // Begin loop to walk through every pixel
  for (int x = 0; x < video.width; x ++ ) 
  {
    for (int y = 0; y < video.height; y ++ ) 
    {
      int loc = x + y*video.width;
      // What is current color
      color currentColor = video.pixels[loc];
      float r1 = red(currentColor);
      float g1 = green(currentColor);
      float b1 = blue(currentColor);
      float r2 = red(trackColor);
      float g2 = green(trackColor);
      float b2 = blue(trackColor);

      // Using the dist() function to compare the current color with the
      // tracked color by euclidean distance.
      float d = dist(r1, g1, b1, r2, g2, b2);

      // If current color is more similar to tracked color than
      // closest color, save current location and current difference
      if (d < worldRecord) 
      {
        worldRecord = d;
        closestX = x;
        closestY = y;
      }
    }
  }

  // We only consider the color found if its color distance is less than 40.
  // This threshold is arbitrary; adjust it depending on how accurate you
  // require the tracking to be.
  if (worldRecord < 40) 
  { 
    // Draw a circle at the tracked pixel 
    fill(trackColor);
    strokeWeight(30.0);
    stroke(0);
    ellipse(closestX,closestY,32,32);  
  
    // Two options: record a white point while a key is pressed,
    // otherwise record a black one.
    if (keyPressed)
    {
      x[sum] = closestX;
      y[sum] = closestY;
      c[sum] = color(255);
      show[sum] = true;
    }
    else
    {
      x[sum] = closestX;
      y[sum] = closestY;
      c[sum] = color(0);
      show[sum] = true;
    }

    // Advance the write index, wrapping before it can run past the array.
    sum++;
    if (sum >= 10000)
    {
      sum = 0;
    }
  }
  
  // Fade the painted surface slightly each frame.
  noStroke();
  fill(255, 1);
  rect(0, 0, -width, height);

  for(int i = 1; i < sum; i++) 
  {
    //strokeWeight(random(1,10));
    //line(x[i],y[i],x[i-1],y[i-1]);
    if (show[i] == true)
    {
      stroke(c[i]);
      point((x[i]-640), y[i]);
    }
  }
}

I need your help.

I want the Arduino to do the following:

  1. When a person enters the room and passes to the right of the sensor, the video starts playing.
  2. when they fled from space and spent WRESTLING from the sensor next to it playing a musical.

How can I do that? I don't know Arduino or Processing programming very well.

What has that Processing code got to do with your project?
Nothing I can see.

When a person enters the room and passes to the right of the sensor, the video starts playing.

So you need an Arduino sketch to send a message through the serial port to processing.
The processing sketch needs to receive this message and trigger the playing of the movie.
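The Arduino side might look something like this minimal sketch. The 2Y0A02 outputs an analog voltage that rises as an object gets closer, so reading it and comparing against a threshold is enough to detect presence. Note the assumptions: the sensor on pin A0, the raw threshold of 300, the hysteresis margin, and the one-character protocol ('1' = detected, '0' = gone) are all things you would tune and calibrate for your own room, not fixed values.

```cpp
// Sketch (assumptions: Sharp 2Y0A02 analog output on A0, threshold
// found by experiment). Sends '1' over serial when something comes
// within range, '0' when it moves away again.

const int sensorPin = A0;   // analog output of the 2Y0A02
const int threshold = 300;  // raw ADC value (0..1023); calibrate this
bool present = false;       // last state we reported to Processing

void setup() {
  Serial.begin(9600);
}

void loop() {
  int raw = analogRead(sensorPin);  // higher reading = object closer

  // Only send a byte when the state changes, so Processing is not
  // flooded with messages. The -50 gives a little hysteresis so a
  // noisy reading near the threshold does not toggle rapidly.
  if (!present && raw > threshold) {
    present = true;
    Serial.write('1');   // person detected -> start the video
  } else if (present && raw < threshold - 50) {
    present = false;
    Serial.write('0');   // person gone -> play the music
  }

  delay(50);
}
```

Open the Serial Monitor first and just watch the raw values (print them with `Serial.println(raw)`) to find a sensible threshold for your room before wiring it up to Processing.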

and spent WRESTLING from the sensor next to it playing a musical.

??????????

My guess is that you mean resting and music, but what you said was very funny.

I think that you do not know enough about computers to attempt this project at this stage.
Get the sensor and just write a sketch to read it and see how that works.
Then learn how to code in Processing, how to get it to play a video.
Only then will you be in a position to connect up the two systems and understand what is going on.
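For the Processing half, the receiving side could be sketched roughly like this, as additions to the existing sketch (the `mov` variable already exists there). These are assumptions, not a finished implementation: the Arduino is taken to be the first serial port, the baud rate must match the Arduino sketch, and playing the music uses the Minim library that ships with Processing with a hypothetical file "music.mp3" in the sketch's data folder.

```java
// Additions to the existing sketch.
import processing.serial.*;
import ddf.minim.*;

Serial port;
Minim minim;
AudioPlayer music;

// In setup(), after mov = new GSMovie(...):
//   port = new Serial(this, Serial.list()[0], 9600); // first port assumed
//   minim = new Minim(this);
//   music = minim.loadFile("music.mp3");             // file in data/

// Called by Processing whenever a byte arrives from the Arduino.
void serialEvent(Serial p) {
  int msg = p.read();
  if (msg == '1') {          // person detected by the sensor
    mov.play();
  } else if (msg == '0') {   // person has left
    mov.pause();
    music.rewind();
    music.play();
  }
}
```

Printing `msg` inside `serialEvent()` while you wave a hand in front of the sensor is a quick way to confirm the two programs are actually talking before you add the video and music.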

Quote
and spent WRESTLING from the sensor next to it playing a musical.
??????????
Wrestling - Wikipedia
Musical theatre - Wikipedia

I wanted to say music.