ATtiny85, Arduino as ISP, changed clock prescaler, can't upload code

Hi. I was fooling around with an ATtiny85 using an Arduino as ISP. I was testing different sleep modes, using the WDT interrupt to wake it up, etc., when I started messing with the clock prescaler in order to further reduce current draw. The programming seemed to work, as my LED now blinks at a very low frequency, but I can no longer upload new code :( The reason I was so careless was that I thought the MCU acts as a slave while it's being programmed, so that the system clock speed would have no impact on the actual uploading, but I guess I was wrong. I tried changing the clock speed in the boards.txt file, but that had no effect. I also tried writing #define F_CPU=31250 in the sketch, but that didn't do much either. This is the error message I get:

avrdude: please define PAGEL and BS2 signals in the configuration file for part ATtiny85
avrdude: Yikes! Invalid device signature. Double check connections and try again, or use -F to override this check.

Is it possible to save my chip? It's important for me to lower the current draw as much as possible, and I might even consider using the WDT oscillator as the system clock, so I don't want these uploading issues :(

Get yourself an AVR ISP so you can get communication control back via the SPI lines and reset your fuses.

Plecto: Is it possible to save my chip?

Yes. You will be using this... https://code.google.com/p/arduino-tiny/downloads/list?can=2&q=tinyisp ...instead of ArduinoISP.

• Open _TinyISP_BuildOptions.h
• Add this line...

#define PROGRAMMER_SPI_CLOCK SLOW

• Upload TinyISP to your Arduino
• Try again to change the ATtiny85 fuses

Thanks for the reply. I'm struggling to see how the fuses can be a problem; I'm pretty sure I haven't changed any of them. I'm also wondering what the difference between a real programmer and the arduino is: isn't it just a matter of sending some bits to the attiny? Are the bits from a real ISP any better than the ones from an arduino?

Yes. You will be using this... https://github.com/Coding-Badly/TinyISP/archive/master.zip ...instead of ArduinoISP.

• Open _TinyISP_BuildOptions.h
• Add this line...

#define PROGRAMMER_SPI_CLOCK SLOW

• Upload TinyISP to your Arduino
• Try again to change the ATtiny85 fuses

I downloaded it and added the line you mentioned, but how do I upload tinyISP to my arduino? I'm also curious about what the big deal with these fuses is. If the fuses can be set by sending bytes through MISO, why can't any piece of software pull this off? I assume it's not a question of processing power or hardware; it's not like the bits have to be ultra fancy with colors on them, right?

Plecto:
I’m struggling to see how the fuses can be a problem, I’m pretty sure I haven’t changed any of them.

They may not be. If you did not change the fuses, then your changing the prescaler is the problem. In either case, the target’s clock rate is very likely the culprit.

I’m also wondering what the difference between a real programmer and the arduino is, isn’t it just a matter of sending some bits to the attiny?

Changing the bit-rate on a “real programmer” is typically easier.

I downloaded it and added the line you mentioned, but how do I upload tinyISP to my arduino?

How did you upload ArduinoISP?

I assume that it’s not a question of processing power or hardware, it’s not like the bits have to be ultra fancy with colors on them, right?

It’s a question of clocking. The target (your ATtiny85) has to have three (or more) clock transitions to be able to process each bit coming in from the programmer. Assuming the number you posted is accurate (31250 cycles per second), the bit rate has to be reduced to about 10K per second.

Alright, so I’ve uploaded tinyISP and I’m managing to upload code to my attiny chip again, so the problem is solved :slight_smile: I want to know what actually went on here, though.

It’s a question of clocking. The target (your ATtiny85) has to have three (or more) clock transitions to be able to process each bit coming in from the programmer. Assuming the number you posted is accurate (31250 cycles per second), the bit rate has to be reduced to about 10K per second.

You are talking about the bit rate of the actual uploading from my arduino? As said in the opening post, isn’t the attiny a slave device during uploading? Isn’t it just tagging along at whatever frequency the arduino is uploading the code at, regardless of what its own clock frequency is? Also, what did tinyISP do that arduinoISP couldn’t? One other issue that popped up was that I can no longer change the clock prescaler :frowning: I think my code is correct:

CLKPR = 0x40;
CLKPR = 0x02;

Quote from the datasheet:

To avoid unintentional changes of clock frequency, a special write procedure must be followed to change the CLKPS bits:

  1. Write the Clock Prescaler Change Enable (CLKPCE) bit to one and all other bits in CLKPR to zero.
  2. Within four cycles, write the desired value to CLKPS while writing a zero to CLKPCE.

Plecto:

It's a question of clocking. The target (your ATtiny85) has to have three (or more) clock transitions to be able to process each bit coming in from the programmer. Assuming the number you posted is accurate (31250 cycles per second), the bit rate has to be reduced to about 10K per second.

You are talking about the bit rate of the actual uploading from my arduino?

Correct.

As said in the opening post, isn't the attiny a slave device during uploading?

It is. But it is not a simple slave. It must frame each command (four bytes) and then execute that command. In order to execute commands, it has to have a running clock. In order to receive data, that clock has to be running at three or more times the ISP bit rate.

Isn't it just tagging along at whatever frequency the arduino is uploading the code at, regardless of what its own clock frequency is?

No.

Also, what did tinyISP do that arduinoISP couldn't?

Slower bit rate.

One other issue that popped up was that I can no longer change the clock prescaler :( I think my code is correct:

Before spending a great deal of time trying to get the processor running as slowly as possible, bear in mind that there is a valid argument for running the processor as fast as possible. The idea is to wake, work as fast as possible to minimize the awake time, then sleep as deeply as possible.

I have a few handheld gadgets that run from AA batteries. The devices are used once to a few times a day for a few minutes at a time. I typically change batteries less than once per year. None of them are clocked below 1 MHz.

It is. But it is not a simple slave. It must frame each command (four bytes) and then execute that command. In order to execute commands, it has to have a running clock. In order to receive data, that clock has to be running at three or more times the ISP bit rate.

I'm beginning to get the picture, but I still have some questions if you don't mind. Why is it executing these four-byte commands? Isn't it supposed to start executing commands after the uploading is complete and the program has started running? Can't the data just be stored during upload so the CPU can have a look at it afterwards?

You said that the system clock has to be 3-4 times faster than the ISP bit rate. The system clock speed can be pretty darn slow if I read the datasheet correctly; choosing the WDT oscillator as the system clock with a prescaler of 256 comes to mind :D So how is the ISP bit rate set when dealing with projects that require very low system clock speeds? Perhaps low clock speeds are rather uncommon?

MISO is also connected; there have to be some acknowledge bits sent back to the ISP, right? Can't the MCU say "hold on" between each four-byte transfer so it can keep up?

Before spending a great deal of time trying to get the processor running as slowly as possible, bear in mind that there is a valid argument for running the processor as fast as possible. The idea is to wake, work as fast as possible to minimize the awake time, then sleep as deeply as possible.

I have a few handheld gadgets that run from AA batteries. The devices are used once to a few times a day for a few minutes at a time. I typically change batteries less than once per year. None of them are clocked below 1 MHz.

That's a good point. The device I'm making right now is very simple. It's going to wake up, light one IR LED (with a carrier signal) for 300us, then another IR LED for 300us, and then fall asleep for 16ms. I guess it's possible to just write the output HIGH, sleep, wake up after 400us and then write it LOW again, unless the GPIOs are written LOW automatically. The 38kHz carrier signal is supplied by a timer, so this will keep running; it might be a good idea to lower the clock speed so that the timer interface uses less power, though? Or perhaps this won't make much of a difference compared to the power consumption of the LED.

Plecto: Why is it executing these four-byte commands?

Because you told it to. The commands are things like "write page to Flash" and "write fuse byte". They are upload instructions.

Isn't it supposed to start executing commands after the uploading is complete and the program has started running?

Of course. But those are not the "commands" we are discussing.

Can't the data just be stored during upload so the CPU can have a look at it afterwards?

Something has to be responsible for going through the steps necessary to write to Flash. In the case of AVR processors, that something is the processor itself.

So how is the ISP bit rate set when dealing with projects that require very low system clock speeds?

Very slow.

Perhaps low clock speeds are rather uncommon?

Not mentioned very often on this forum.

MISO is also connected; there have to be some acknowledge bits sent back to the ISP, right?

Sort of. The command itself is echoed back with any expected data. You can find the details in the datasheet.

Can't the MCU say "hold on" between each four-byte transfer so it can keep up?

Only if it can communicate with the programmer. And, as I stated earlier, the only way it can communicate with the programmer is when the target is running three times (or more) faster than the programmer.

Plecto: That's a good point. The device I'm making right now is very simple. It's going to wake up, light one IR LED (with a carrier signal) for 300us, then another IR LED for 300us, and then fall asleep for 16ms. I guess it's possible to just write the output HIGH, sleep, wake up after 400us and then write it LOW again...

Sounds reasonable.

...unless the GPIOs are written LOW automatically.

No.

The 38kHz carrier signal is supplied by a timer, so this will keep running; it might be a good idea to lower the clock speed so that the timer interface uses less power, though?

You need to focus first on getting the thing working at a reasonable clock speed (like 1 MHz or 8 MHz). As you reduce the clock speed, it will become increasingly difficult to generate a clean carrier and correctly modulate it.

Or perhaps this won't make much of a difference compared to the power consumption of the LED.

Quite possibly.

You need to focus first on getting the thing working at a reasonable clock speed (like 1 MHz or 8 MHz). As you reduce the clock speed, it will become increasingly difficult to generate a clean carrier and correctly modulate it.

Yeah, that's true. I've been testing a little bit with different sleep modes, trying to get the power consumption as low as possible. While it's in power-down mode, the consumption is minuscule; the datasheet states 0.1uA. To make the timer work, I have to set the sleep mode to idle, but doing this only reduces power consumption from about 6mA to 2.5mA. I've turned off the ADC as well as written PRR=0xFF (thus turning off all the timers as well as the USI and ADC clocks). In idle mode, the CPU clock is stopped. If the clock to all the peripherals has also stopped, what is drawing 2.5mA? I know the BOD is still operational, but that isn't drawing much.

With idle mode selected and one of the timers still operational, it's back to drawing 6mA (my cheap DMM isn't quite sure), so I don't quite understand what's going on. Without putting the MCU to sleep, it will draw around 6mA regardless of whether the timer is on or not. When putting it to idle sleep, the MCU will draw 2.5mA without the timer and 6mA with the timer turned on :(

Be sure to disable the ADC before you set PRR, otherwise it doesn't power down the ADC. That 2.5mA might easily be that.

  // This needs to be done *before* writing to PRR
  ACSR = ADMUX = ADCSRA = 0;

If you spend most of your time asleep then lowering the clock speed might actually increase overall power consumption because the program takes longer to run when it wakes up.

Be sure to disable the ADC before you set PRR, otherwise it doesn’t power down the ADC. That 2.5mA might easily be that.

I've disabled the ADC before setting the PRR. Writing "ACSR = ADMUX = ADCSRA = 0;" had no effect. The datasheet states that the analog comparator can't be used if the PRADC bit in PRR is set; there is no mention of having to turn the comparator off before setting PRR. The ADMUX register sets the ADC voltage reference, the adjustment of the bits in the data register, and the ADC multiplexer; it doesn't turn off any clocks or anything. The datasheet states that the ATtiny85 should use about 1.2mA in idle mode with a clock speed of 8 MHz and Vcc=5V. What I don't understand is how the clock speed can make a difference in idle mode if every peripheral that uses a clock is turned off. This further makes me wonder why idle mode draws way more current than power-down mode; if every peripheral is turned off, what's the difference between the two?

Here’s my code btw:

#include <avr/sleep.h>

void setup() {
  pinMode(0, OUTPUT);
  // Timer0: fast PWM with TOP = OCR0A (WGM02:0 = 7), toggle OC0A on match,
  // non-inverting PWM on OC0B, prescaler = 1024
  TCCR0A = (1<<COM0A0) | (1<<COM0B1) | (1<<WGM01) | (1<<WGM00);
  TCCR0B = (1<<WGM02) | (1<<CS02) | (1<<CS00);
  OCR0A = 255;
  OCR0B = 127;
  ADCSRA = ADCSRA & 0b01111111;  // clear ADEN: disable the ADC
  ACSR = 0b10000000;             // set ACD: disable the analog comparator
  PRR = 0xFF;                    // cut clocks to the timers, USI and ADC
  set_sleep_mode(SLEEP_MODE_IDLE);
  sleep_enable();
  sleep_cpu();
}

void loop() {}  // never reached, but required for the sketch to compile

Plecto: The datasheet states that the analog comparator can't be used if the PRADC bit in PRR is set; there is no mention of having to turn the comparator off before setting PRR.

See description of PRR, Bit 0:

Bit 0 – PRADC: Power Reduction ADC
Writing a logic one to this bit shuts down the ADC. The ADC must be disabled before shut down.

The ADC, yes, but the comparator? The comparator and ADC have different control registers, and it doesn't state that the comparator has to be turned off before setting PRADC. Anyway, the only thing that impacts current draw is turning off the ADC; messing with the comparator doesn't seem to do much.