Arduino-to-Arduino SPI Problem

I have two Arduinos connected via SPI, using the SPI.h library. I want to turn on an LED on the slave Arduino from the master Arduino's Serial Terminal. That is, if I type a "1" in the master terminal, the slave LED turns on.

My SPI connection seems to be working okay, because the slave Serial Terminal responds with the appropriate "Slave receive = " message. The problem is that the slave LED doesn't turn on when the data is received from the master Arduino.

Suggestions?

Master Arduino Code:

#include <SPI.h>

byte byteRead;

void setup(void)
{
  Serial.begin(9600);

  digitalWrite(SS, HIGH); // turn Slave Select OFF
  // SPI.begin() puts the SCK, MOSI, and SS pins into output mode,
  // sets SCK and MOSI LOW and SS HIGH,
  // then puts the SPI hardware into Master mode and turns SPI on.
  SPI.begin();
  // slow down the Master's clock speed
  SPI.setClockDivider(SPI_CLOCK_DIV8);
}

void loop(void)
{
  // Send the byte entered in the Serial Terminal

  /* check if data has been sent from the Serial Terminal: */
  while (Serial.available() > 0) {
    /* read the most recent byte */
    byteRead = Serial.read();  // read Serial Terminal input
    digitalWrite(SS, LOW);     // turn Slave Select ON
    Serial.println(byteRead);  // send ASCII (decimal) equivalent of byteRead to the master terminal
    Serial.write(byteRead);    // send binary (raw byte) equivalent of byteRead to the master terminal
    Serial.println();
    SPI.transfer(byteRead);
  }

  // disable Slave Select
  digitalWrite(SS, HIGH); // turn Slave Select OFF
  delay(1000);            // 1 second delay between transmissions
}

Slave Arduino Code:

#include <SPI.h>

byte c = 0;
volatile boolean process_it;
int ledPin = 2;

void setup(void)
{
  Serial.begin(115200); // debugging
  pinMode(ledPin, OUTPUT);
  digitalWrite(ledPin, LOW);
  Serial.println("Start");

  // set MISO pin as output
  pinMode(MISO, OUTPUT);
  // turn on SPI in slave mode:
  // set the SPI Control Register's Enable bit (bit 6) using bitwise OR (|=)
  // and the _BV (Bit Value) macro. The Enable bit enables SPI operation as a whole.
  SPCR |= _BV(SPE);
  // get ready for an interrupt
  process_it = false;
  // now turn on interrupts
  SPI.attachInterrupt();
}
// *** SPI interrupt service routine ***
ISR (SPI_STC_vect)
{
  byte c = SPDR; // grab byte from SPI Data Register
  process_it = true;
} // end of interrupt routine SPI_STC_vect

void loop(void)
{
  // only do something if a complete transmission has occurred
  if (process_it)
  {
    Serial.print("Slave Receive =  "); // print received byte to slave terminal
    Serial.println(c);

    if (c == 1) // if received byte == "1", turn on LED
    {
      digitalWrite(ledPin, HIGH);
      delay(2000);
    }
    else
    {
      digitalWrite(ledPin, LOW);
    }

    process_it = false;
  }
}

You're sending an ASCII character to the slave and comparing it to 1, not ASCII '1'.

0x31 != 0x01

if(c == '1') //If received byte =  "1", turn on LED

Try this?

-jim lee

Thanks for the replies.

I think I found one problem.

In the Slave code header, the variable c is defined: byte c = 0;
In the first part of the ISR, I again define c as a byte:

// ***SPI interrupt service routine *****
ISR (SPI_STC_vect)
{
byte c = SPDR; //grab byte from SPI Data Register

If I change the line to c = SPDR; (without the byte datatype), the sketch works okay. Either c == '1' or c == 49 will turn on the LED.

Does byte c = SPDR; reset c to 0 in spite of the "= SPDR"?

Thank you, Delta_G, for the prompt (five minutes!) and informative answer.

It also explains why I was seeing "Slave Receive = 0" in the Slave terminal for the value of c in the main loop. That's because it was still the 0 value assigned to the global variable c in the initial declaration section.