Lanc Controller with Arduino...

There is tons of info on how to control a camera with an Arduino, but I want to do the reverse: use a Lanc-based remote control to control an Arduino.

I want to take a Lanc Zoom Controller and have it send commands to an Arduino. I think my problem is with emulating a video camera so the Lanc Zoom Controller will sync with the Arduino and send its commands. The communication is 8-bit serial over a single line. I'm creating a pulse to send binary to the remote and listening for a binary response. The first two bytes of each frame are commands from the Lanc remote, the next two are not used, and the last four are a response from the camera with recording info or timecode.

Right now I can tell that the remote control isn't getting the pulse correctly. The controller has a red LED on it that blinks when it has power but no sync; when the controller is plugged into a camera, the LED is off. I've used the same wiring to send record and zoom commands to a video camera with no problems.

Thanks for any help or thoughts. I'm new to a lot of this.
dan.

Here's one of the main sites that I'm getting most of my info from:

I'm using this build

And here is an extremely simplified version of my code. For right now I'm trying to emulate the basics of a video camera.

#define cmdPin 7
#define lancPin 11
#define recButton 2    // start/stop recording button (pin number assumed; unused in this simplified sketch)
int bitDuration = 104; // duration of one LANC bit in microseconds



void setup() {

  Serial.begin(9600); //open the serial port
  pinMode(lancPin, INPUT); //listens to the LANC line
  pinMode(cmdPin, OUTPUT); //writes to the LANC line
  pinMode(recButton, INPUT); //start-stop recording button
  digitalWrite(cmdPin, HIGH);
  bitDuration = bitDuration - 8; //Writing to the digital port takes about 8 microseconds so only 96 microseconds are left till the end of each bit

}
// Send one all-zero Lanc byte: a low start bit, eight low data bits,
// then release the line briefly as a stop bit.
void sendZeroByte() {
  digitalWrite(cmdPin, LOW);        // start bit (0)
  delayMicroseconds(bitDuration);
  for (int i = 0; i < 8; i++) {
    delayMicroseconds(bitDuration); // data bit 0 (line stays low)
  }
  digitalWrite(cmdPin, HIGH);       // stop bit (1)
  delayMicroseconds(10);
}

void loop() {
  for (int b = 0; b < 8; b++) {     // a Lanc frame is 8 bytes
    sendZeroByte();
  }
  digitalWrite(cmdPin, HIGH);       // hold the line high between frames
  delayMicroseconds(10);
  delay(6);                         // inter-frame gap
}

What protocol is the serial link to the Lanc using?

Mark

  1. I think the problem is that the circuit and code you copied emulates a Lanc controller, and you want to emulate a video camera. So the code needs to wait for a command from the controller, and send a suitable response.

  2. The interface circuitry is more complicated than necessary. All that is needed is one I/O pin with a 4K7 pullup to +5V and a 100 ohm series resistor to the Lanc signal. The open-collector output can be emulated by switching the pin mode (or data direction register) to Output to send a Low, and to Input to send a High.

DC42 -
That worked perfectly! I was able to build out my code to emulate a video camera, and I'm printing out the commands from the Lanc controller like a champ. It's great how simply the circuit works using a single line! Thanks a million for the help!

I do have another question for anyone though:
I'm storing the binary response in an array, but I'm new to Arduino sketches / C++. How do I dump my array into a long string or one 8-bit integer? I have done a lot of searching, but maybe I'm not searching the right terms.

Here is how I'm capturing the bits:

// Start Reading Byte 0 
    pinMode(lancPin, INPUT);
    delayMicroseconds(bitDuration/2); 
    for (int i=7; i>-1; i--) {
      lancBit[i] = digitalRead(lancPin);  // Read bits into array 
      delayMicroseconds(bitDuration); 
    }

// Start Reading Byte 1
    pinMode(lancPin, INPUT);
    delayMicroseconds(bitDuration/2); 
    for (int i=15; i>7; i--) {
      lancBit[i] = digitalRead(lancPin); // Read bits into array
      delayMicroseconds(bitDuration);
    }

The Arduino community really seems to have a lot of people who care.
Thanks again for all of the help!

From your code I deduce that you are receiving a 16-bit word in the following bit order: 7 6 5 4 3 2 1 0 15 14 13 12 11 10 9 8. Here is how to read all the bits into a 16-bit unsigned integer:

// Start Reading Byte 0 
    uint16_t recValLo = 0;
    pinMode(lancPin, INPUT);
    delayMicroseconds(bitDuration/2); 
    for (int i=0; i < 8; i++) {
      recValLo <<= 1;
      if (digitalRead(lancPin) == HIGH)
      {  
          recValLo |= 1;
      } 
      delayMicroseconds(bitDuration); 
    }

// Start Reading Byte 1
    uint16_t recValHi = 0;
    pinMode(lancPin, INPUT);
    delayMicroseconds(bitDuration/2); 
    for (int i=0; i < 8; i++) {
      recValHi <<= 1;
      if (digitalRead(lancPin) == HIGH)
      {  
          recValHi |= 1;
      } 
      delayMicroseconds(bitDuration); 
    }

    uint16_t recVal = (recValHi << 8) | recValLo;  // must be 16 bits wide, or the high byte is lost

The 2 loops are identical apart from the variables they use, so you should consider writing a function and calling it twice.