ATtiny13 + MicroCore issue

Hey guys,
I'm trying to build a simple code generator with an ATtiny13 microcontroller. I originally designed and tested the project on an Arduino Uno, with the intention of porting it to the ATtiny13 with the MicroCore library (GitHub - MCUdude/MicroCore: A light-weight Arduino hardware package for ATtiny13).

Basically, all the microcontroller has to do is look up a value from a byte array and change the state of an output pin to match, once every millisecond. The next millisecond it moves to the next byte in the array and changes the output to suit. The byte array is 32 bytes long (1 value for each bit; the data stream I want to output is just the same 4 bytes repeated over and over). Once it reaches the end of the array, it starts again from the start.
It also checks the state of an input pin (high or low), and if it is found to be low for a certain number of code loops, it stops outputting the code.

I have written the code and it works perfectly on an Uno with an ATmega328. However, after compiling it for the ATtiny13, burning the bootloader (which only sets fuses on the ATtiny13), and programming with an Uno as ISP programmer, the ATtiny13 outputs gibberish data. The fuses all report as being written correctly, and the compiled data uploads to the ATtiny13, but it doesn't work properly. Sometimes it outputs gibberish for a few seconds and then gets stuck high, like it's in an endless loop or something.

I have stripped the code down to find the problem. It no longer includes checking the state of the input pin. It is meant to simply output 8 different values, 1 per millisecond, over and over. Again, this works perfectly on the Uno but not on the ATtiny.

#define DATA_RATE 1000 //microseconds between data bits (1000 = 1 bit per millisecond)
#define DATA_LENGTH 7 //length of data packet in bits (0 based)
#define PIN_OUTPUT 4 //Output pin
const byte data[] = {1, 0, 1, 0, 1, 1, 1, 0}; //1 = HIGH, 0 = LOW
byte currentDataBit = 0;
unsigned long lastDataTime = 0;
//int loopCount = 0;

void setup() {
  //Serial.begin(115200);
  pinMode(PIN_OUTPUT, OUTPUT);
  lastDataTime = micros();
}

void loop() {
  //loopCount++;
  if (micros() > (lastDataTime + DATA_RATE)) {
    lastDataTime = micros();
    digitalWrite(PIN_OUTPUT, data[currentDataBit]);
    currentDataBit++;
    if (currentDataBit > DATA_LENGTH) {
      currentDataBit = 0;
    }
    //Serial.println(loopCount);
    //loopCount = 0;
  }
}

To use this with MicroCore I had to enable the micros() function in the core_settings.h file. The MicroCore GitHub page says micros() is a working function.

I am using the 9.6 MHz internal oscillator of the ATtiny. The Uno (ATmega328) running at 16 MHz, with the loopCount parts of the code uncommented, reports 140 loops between each change of data (140 loops per millisecond), even with the overhead of writing to serial for me to see. I would assume the code is not too fast for the ATtiny.

I have attached an image showing the output measured with an oscilloscope (Picoscope 4425). You can see the ATmega328 outputting the same correct thing over and over. Whatever the ATtiny is outputting is not what the code I've written says it should be.

The above (cut-down) code compiles for the ATtiny and reports using 598 bytes (58%) of storage space and 17 bytes (26%) of dynamic memory.
The complete code (with the extra output data, input-pin checking, etc.) reports 712 bytes (69%) of storage space and 44 bytes (68%) of dynamic memory.

I'm thinking maybe I should be using an interrupt on the ATtiny, instead of constantly checking micros(), but I've got no experience with them.

Thanks in advance

I tried the code you posted above on my ATtiny13 dev board and hooked my scope to it. Seems like it works just fine!

Here's a thing you can try: delete the previous version you have installed, then go over to the MicroCore GitHub repo and do a manual install. That way you'll get the latest updates to the MicroCore repo.

Hi,

Thanks for your help.

I've done a manual install of MicroCore, and the above code now compiles to 590 bytes of storage (instead of the 598 bytes it was using), so something has changed.

I'll upload it soon and see how it goes, but this looks promising.

I only installed MicroCore recently (about a month ago), so unless you have changed something within that time, maybe the Arduino Boards Manager was installing an old version of something?

I'm still having the identical problem.

However, I think I've found what it is. If I change the clock speed to 1.2 MHz (which I imagine is what the chips ship with), it works fine on the oscilloscope.

Maybe Arduino as ISP isn't burning the fuses correctly, even though it says it is?

Maybe you have a bad chip? It's true that 1.2 MHz is the default value. micros() will not work correctly if the actual clock doesn't match what the program was compiled for.

Do you have a reasonably fast signal generator you can check with? I often use a separate generator (set to a 9.6 MHz sine wave) when I'm working on timing-critical applications for the ATtiny13.

I don't know if your programmer is causing the issue, but I highly recommend using a dedicated programmer such as a USBasp or USBtinyISP. I always have these two close at hand when working on AVRs, and they are very reliable.

OK, I've confirmed the Arduino as ISP is at least part of the problem.
I've tried several chips with the same result (I bought 10 of them from China).

I've programmed the chip with my MiniPro TL866 programmer. When I initially read the chip, it showed the fuses were not set to what the Arduino IDE said they were (the low fuse (lfuse) was something like $3A; the high fuse and lock looked to be programmed correctly).

I used an online fuse calculator and set the fuses to Low: 6A, High: FB, Lock: FF. I then exported the above test code as a compiled binary and used the MiniPro to program it.
It worked perfectly, and the chip read back as I had programmed it, so I assume it's working correctly at 9.6 MHz.

However, using the same method doesn't work for my full code. I guess it might be too much for the ATtiny. My full code is as follows:

#define DATA_RATE 1000 //microseconds between data bits
#define DATA_LENGTH 31 //length of data packet (0 based)
#define PIN_OUTPUT 4 //Output pin to drive data line
#define PIN_INPUT 3 //Input pin pulled low to disable data
#define LOOPS_TO_DISABLE 100 //how many consecutive low input loops to disable data
const bool data[] = {1,0,1,0,1,0,1,1,0,0,1,0,1,0,1,1,0,0,1,1,0,0,1,0,1,1,0,0,1,1,0,0}; //1 = HIGH, 0 = LOW
int currentDataBit = 0;
unsigned long lastDataTime = 0;
byte inputState;
byte inputFrames = 0;


void setup() {
  // put your setup code here, to run once:
  pinMode(PIN_OUTPUT, OUTPUT);
  pinMode(PIN_INPUT, INPUT);
  lastDataTime = micros();
}

void loop() {
  // put your main code here, to run repeatedly:
  inputState = digitalRead(PIN_INPUT);
  if (inputState == HIGH) {
    if (micros() > (lastDataTime + DATA_RATE)) {
      lastDataTime = micros();
      digitalWrite(PIN_OUTPUT, data[currentDataBit]);
      currentDataBit++;
      if (currentDataBit > DATA_LENGTH) {
        currentDataBit = 0;
      }
    }
  } else {
    inputFrames++;
    if(inputFrames >= LOOPS_TO_DISABLE){
      digitalWrite(PIN_OUTPUT, LOW);
      lastDataTime = micros();
      inputFrames = LOOPS_TO_DISABLE;
    }
  } 

}

Maybe I'm running out of memory, or the digital reads are taking too long?
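On the memory question: that 32-entry bool table gets copied into SRAM at startup, and the ATtiny13 only has 64 bytes of SRAM total. A common trick (sketched below for avr-gcc/Arduino; untested on this exact setup) is to keep the table in flash with PROGMEM and read entries back with pgm_read_byte():

```cpp
#include <avr/pgmspace.h>

// Table stays in flash instead of being copied into the 64-byte SRAM
const bool data[] PROGMEM = {1,0,1,0,1,0,1,1,0,0,1,0,1,0,1,1,
                             0,0,1,1,0,0,1,0,1,1,0,0,1,1,0,0};

// Read one entry back out of flash
static inline bool dataBit(uint8_t i) {
  return pgm_read_byte(&data[i]);
}
```

The output line would then become something like `digitalWrite(PIN_OUTPUT, dataBit(currentDataBit));` instead of indexing the array directly.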

OK, so I've improved my code to use less memory, and it now uploads and works almost correctly.
It is outputting the correct 4 bytes of data, as it should.

#define DATA_RATE 1000 //microseconds between data bits
#define DATA_LENGTH 4 //length of data packet (in bytes)
#define PIN_OUTPUT 4 //Output pin to drive data line
#define PIN_INPUT 3 //Input pin pulled low to disable data
#define LOOPS_TO_DISABLE 200 //how many consecutive low input loops to disable data
const byte data[] = {0xAB, 0x2B, 0x32, 0xCC}; //from first transmitted to last
byte currentBit = 8; //Start at MSB first
byte currentByte = 0;
unsigned long lastDataTime = 0;
byte inputState;
byte inputFrames = 0;

void setup() {
  // put your setup code here, to run once:
  pinMode(PIN_OUTPUT, OUTPUT);
  pinMode(PIN_INPUT, INPUT);
  lastDataTime = micros();
}

void loop() {
  // put your main code here, to run repeatedly:
  inputState = digitalRead(PIN_INPUT);
  if (inputState == HIGH) { 
    //OK to transmit
    inputFrames = 0;
    if (micros() > (lastDataTime + DATA_RATE)) {
      //Time for next bit
      lastDataTime = micros();
      digitalWrite(PIN_OUTPUT, bitRead(data[currentByte], currentBit-1));      
      currentBit--;
      if (currentBit < 1) {
        //Reached the end of this byte
        currentBit = 8;
        currentByte++;
        if (currentByte >= DATA_LENGTH) {
          //reached the end of complete data packet
          currentByte = 0;
        }
      }
    }
  } else {
    inputFrames++;
    if(inputFrames >= LOOPS_TO_DISABLE){
      digitalWrite(PIN_OUTPUT, LOW);
      lastDataTime = micros();
      inputFrames = LOOPS_TO_DISABLE;
    }
  }
}

The only issue is the speed at which it's outputting.
With DATA_RATE set at 1000 microseconds (1 ms), the actual data rate measured with the oscilloscope is 1 bit per 9.2 ms (about 9 times slower than it should be). If I set DATA_RATE to these values I get this output:
1000 µs → 9.2 ms
500 µs → 5.21 ms
200 µs → 3.9 ms
100 µs → 2.6 ms
50 µs → 2.6 ms
10 µs → 2.6 ms
I guess once it gets down to the low values I'm running out of clock cycles to get everything done in time, so it runs at the minimum possible rate.

I'm guessing my fuses are still not set correctly and it's still running at 1.2 MHz. What should the fuses actually be set to? I'll try to program it with my MiniPro TL866 if Arduino as ISP still fails. I'm not sure now whether it was actually Arduino as ISP or my code that was the problem earlier on.
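For reference (worth double-checking against the ATtiny13 datasheet): the factory low fuse is 0x6A, which selects the 9.6 MHz internal oscillator but with CKDIV8 programmed, i.e. a 1.2 MHz system clock. Clearing CKDIV8 for a true 9.6 MHz clock gives a low fuse of 0x7A; the high fuse can stay at the default 0xFF. With avrdude that would look something like this (the `-c usbasp` programmer is just an example, swap in whatever you use):

```shell
# ATtiny13: 9.6 MHz internal oscillator, CKDIV8 cleared
avrdude -c usbasp -p attiny13 -U lfuse:w:0x7a:m -U hfuse:w:0xff:m
```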

So the issue was incorrectly set fuses, as I thought. Things are getting pretty close now. I'm no expert, but I'm learning a lot as I go and having fun.

The program is now outputting the correct signal at roughly 1 ms per bit. I believe the problem is that "roughly" was good enough on the 16 MHz ATmega328, but isn't quite good enough on the ATtiny.
Some bits are just over 1.1 ms in duration. I've measured with the scope, and the time to transfer 1 byte (8 bits) was actually 9.2 ms, so by the time I get towards the end of the byte my timing is out by over 1 bit. By the end of the 4-byte packet it would be out by roughly 4 or 5 bits.

I'm thinking now I'll rewrite the code to output data using a CTC timer interrupt. I've read some examples but haven't got any experience with them. I've also read there might be an issue with the core library using the interrupt (Flashing an LED with Attiny13: Timer, Watchdog timer and Sleep – Arduino, ESP8266, ESP32 & Raspberry Pi stuff). Am I going to have issues using MicroCore and a CTC timer interrupt?
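The ATtiny13 only has one timer (Timer0), so if MicroCore's micros() is enabled it may already claim it; check core_settings.h and the MicroCore docs before adding your own Timer0 ISR, or drop to plain avr-gcc style code. As a rough sketch of the CTC approach at 9.6 MHz (register and vector names from the ATtiny13 datasheet; the packet contents and PB4 pin choice mirror the code above, but this is an untested outline, not a drop-in replacement):

```cpp
#include <avr/io.h>
#include <avr/interrupt.h>

// 4-byte packet, sent MSB first, one bit per interrupt (1 kHz)
const uint8_t packet[] = {0xAB, 0x2B, 0x32, 0xCC};
volatile uint8_t byteIdx = 0;
volatile uint8_t bitIdx  = 7;

ISR(TIM0_COMPA_vect) {            // fires every 1 ms
  if (packet[byteIdx] & (1 << bitIdx))
    PORTB |= _BV(PB4);            // output pin high
  else
    PORTB &= ~_BV(PB4);          // output pin low
  if (bitIdx-- == 0) {            // after bit 0, wrap to next byte
    bitIdx = 7;
    if (++byteIdx >= sizeof(packet)) byteIdx = 0;
  }
}

int main(void) {
  DDRB  |= _BV(PB4);              // PB4 as output
  TCCR0A = _BV(WGM01);            // CTC mode, TOP = OCR0A
  OCR0A  = 149;                   // 9.6 MHz / 64 / (149+1) = 1 kHz
  TCCR0B = _BV(CS01) | _BV(CS00); // prescaler /64, start timer
  TIMSK0 = _BV(OCIE0A);           // enable compare-match A interrupt
  sei();                          // global interrupts on
  for (;;) { }                    // everything happens in the ISR
}
```

Because the compare match fires on a fixed hardware schedule, the bit period no longer depends on how long the main loop takes, which should eliminate the cumulative drift entirely.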