ATtiny85 8-DIP vs 8-SOIC

I have been working on a project that uses an ATtiny85 to light some LEDs using a remote control. I'm using a TSOP2238 to receive the signal.

I followed the instructions I found at https://www.instructables.com/id/Attiny-IR-library/

I was able to make it work as I wanted, but then I decided to replace the (big) ATtiny85 8-DIP with a (little) ATtiny85 8-SOIC.

I used

to connect it to my Arduino, and I put the same code on the new chip. The code was not working.

To try to find out what was happening, I used this code:

#include <IRremote.h>

int RECV_PIN = 3;
int red = 0;
int green = 1;

bool r = false;
bool a = false;

IRrecv irrecv(RECV_PIN);
decode_results results;
unsigned long entrada;   // last received code (results.value is 32 bits wide)
char binary[33] = {0};   // up to 32 binary digits plus the terminating null


void setup() {
 pinMode(red, OUTPUT);
 pinMode(green, OUTPUT);
 irrecv.enableIRIn();
 digitalWrite(red, HIGH);
 digitalWrite(green, HIGH);
 delay(2000);
 digitalWrite(red, LOW);
 digitalWrite(green, LOW);   
}

void loop() {
 if (irrecv.decode(&results)) {
   entrada = results.value;
   ultoa(results.value, binary, 2);   // ultoa instead of itoa: results.value is 32 bits, int is only 16 on AVR
   
   digitalWrite(red, HIGH);
   delay(results.value);   // crude check: the LED stays on for a time proportional to the received value
   digitalWrite(red, LOW);    
   
   //begin sequence light on
   digitalWrite(red, HIGH);
   digitalWrite(green, HIGH);
   delay(1000);
   digitalWrite(red, LOW);
   digitalWrite(green, LOW);
   //begin sequence light off

   for (byte i = 0; binary[i] != '\0'; i++) {   // walk the converted string instead of a fixed 33 slots
     if (binary[i] == '0') {
       digitalWrite(red, HIGH);
       delay(1000);
       digitalWrite(red, LOW);
       delay(500);
     }
     else if (binary[i] == '1') {
       digitalWrite(green, HIGH);
       delay(1000);
       digitalWrite(green, LOW);
       delay(500);
     }
   }
   //end sequence light on
   digitalWrite(red, HIGH);
   digitalWrite(green, HIGH);
   delay(1000);
   digitalWrite(red, LOW);
   digitalWrite(green, LOW);    
   //end sequence light off
   irrecv.resume();
 }
 delay(100);
}

I used that code to "decode" the code received from the remote control. I convert the result value to a binary string and blink the LEDs, red for a 0 and green for a 1.

The code works fine on the 8-DIP, but not on the 8-SOIC.

A new code is detected [irrecv.decode(&results) returns 1] and the program begins the "lighting" process, but results.value is zero. The delay(results.value) is effectively delay(0), because the red LED flash is not visible, and between the begin sequence and the end sequence there is only a single red blink (one 0 value).

Does anyone have an idea what is happening? Is there any difference between the two chips?

Thanks for your time; any help is welcome.

(deleted)

This is what I was thinking when I decided to swap one for the other.

Is there any way to check whether the code inside the two chips is the same, or to clone the contents of one chip to the other?

Thanks for your time

gibernet:
used this code

Please see #7 in the forum guide.

Did you use "burn bootloader" with the dip chip to set clock to internal 8MHz, but forget to do this for the soic? They default to internal 1MHz.

First, thanks for the advice. I have added the code tags to the post.

PaulRB:
Did you use "burn bootloader" with the dip chip to set clock to internal 8MHz, but forget to do this for the soic? They default to internal 1MHz.

Does that mean the code will not work at 1 MHz?

I will repeat all the steps with a new chip and make sure to burn it with the clock set to internal 8 MHz.

Thanks for your time.

gibernet:
I have added the code tags to the post.

+1 Karma

gibernet:
Does that mean the code will not work at 1 MHz?

It might work at 1 MHz. But if you upload code with the board setting at 8 MHz while the chip's fuses are set to 1 MHz, or the board setting at 1 MHz while the chip is set to 8 MHz, then it probably will not work.
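
The board-setting side of that mismatch can at least be caught at compile time. This is only a sketch of the idea, assuming the ATtiny core defines F_CPU from the Tools > Clock selection (standard cores do); it stops an accidental 1 MHz build, but it cannot tell you what the fuses on the chip are actually set to:

// Guard against compiling for the wrong clock (assumes the core sets F_CPU
// from the board's clock menu). IRremote derives its timing from F_CPU, so a
// mismatch with the real chip clock corrupts the received values.
#if F_CPU != 8000000UL
#error "Clock is not set to 8 MHz internal - check Tools > Clock and Burn Bootloader"
#endif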

Thanks for your help.

You were right, PaulRB. I have done the whole process again from the beginning, taking care that the clock setting for the "burn" is the same as for the "upload", and everything works.

Thanks a lot for your time and knowledge.