Bit-banging issues: OK @ 16 MHz, not at 8

I've written some bit-banging code which basically just sets a pin high to send a one and low to send a zero. It does this at 100 Hz. The code works great on my Uno, which has an ATmega328 running at 16 MHz. However, it doesn't work well on a plain ATmega168 running at 8 MHz - most bits arrive OK, but every few dozen bits there's a repeated bit. I think (perhaps incorrectly?) that the issue is with the timing - right now I'm calling delayMicroseconds(period in us), but I suspect there is a better way to get the timing right. Is there? Are there any techniques for writing timing-critical code for Arduino?

Thanks!

I can see no reason for the CPU speed to affect your bit-banging code, apart from the obvious one that it would run at half the speed.

How about you post this code? It's pretty hard to comment on nothing. USE CODE TAGS PLEASE.

Are there any techniques for writing timing-critical code for Arduino?

Yes, but at 100 Hz you are not even close to needing such techniques unless there are some timing constraints we are not aware of.
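For the record, when you do need them, the usual trick is to clock the bits from a hardware timer interrupt instead of delay calls, so loop and call overhead never accumulates into the bit period. A minimal sketch, assuming an ATmega328 @ 16 MHz and a 100 Hz bit rate (the pin and the byte source are illustrative, not from your code):

// Sketch only: jitter-free 100 Hz bit clock from Timer1 in CTC mode.
volatile uint8_t txData;
volatile uint8_t txBits = 0;

void setup()
{
    pinMode(4, OUTPUT);
    noInterrupts();
    TCCR1A = 0;
    TCCR1B = _BV(WGM12) | _BV(CS11) | _BV(CS10); // CTC mode, /64 prescaler
    OCR1A = (F_CPU / 64 / 100) - 1;              // compare match at 100 Hz
    TIMSK1 = _BV(OCIE1A);                        // enable compare-match interrupt
    interrupts();
}

ISR(TIMER1_COMPA_vect) // fires exactly once per bit period
{
    if (txBits) {
        digitalWrite(4, txData & 0x01); // put the next bit on the pin
        txData >>= 1;
        txBits--;
    }
}

void loop() {} // the main code only has to load txData/txBits to send a byte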


Rob

Sorry, I didn’t post the code because it’s so simple, but here it is anyway -

#define sbi(a,b) (a) |= (1<<(b))
#define cbi(a,b) (a) &= ~(1 << (b))
...
void transmitByte(uint8_t data)
{
    for(int i=0; i<8; i++) {
        if (data & 0x01)
            sbi(PORTD, 4);
        else
            cbi(PORTD, 4);
    }
    delayMicroseconds(period in us);
    data >>= 1;
}

I agree - I had assumed that an 8 MHz micro should have no trouble bit banging at 100 Hz. I’m not sure what would explain the difference in performance between the 328 and the 168, though.

How about some real code - you know, stuff that actually compiles.

As Graynomad said, there is no obvious reason why it should run into trouble, so it must be something you are missing. With you giving us so little of what you are doing, it is not obvious to us either.

#define sbi(a,b) (a) |= (1<<(b))
#define cbi(a,b) (a) &= ~(1 << (b))

void setup()
{
    pinMode(4, OUTPUT);
    for(;;) {
        preamble();
        transmit((uint8_t *) "QWERTYUI");
        delay(100);
    }
}

void loop () {}

void transmit(uint8_t *data)
{
    while(*data)
        transmitByte(*data++);
}

void preamble()
{
    for(int i=0; i<8; i++) {
        sbi(PORTD, 4);
        delayMicroseconds(10000);
        cbi(PORTD, 4);
        delayMicroseconds(10000);
    }
}

void transmitByte(uint8_t data)
{
    for(int i=0; i<8; i++) {
        if (data & 0x01)
            sbi(PORTD, 4);
        else
            cbi(PORTD, 4);
    }
    delayMicroseconds(10000);
    data >>= 1;
}

Transmits fine on an Uno, with a 328 @ 16 MHz. The same code, compiled for the LilyPad and running on a plain 168 @ 8 MHz, works but with many more bit errors.

edit - I should clarify what I mean by “bit error”. The bits are being sent wirelessly; when the receiver sees the preamble, it starts recording 64 bits at 100 Hz. So when the 328 is sending, the receiver gets all the bits OK. With the 168, the receiver doesn’t - typically, it looks like the same bit is (erroneously) received twice in a row. Which I took to mean the sender wasn’t actually sending at quite 100 Hz.

The bits are being sent wirelessly

Drip drip drip,
the information drips in. So you didn't think that was relevant?
Well, it is.

How are you sending and receiving this data?

By the way what is that for loop supposed to be doing in the setup function?

Sending by turning a magnetic field off/on, receiving on an Android device by detecting the field changes.

The loop repeatedly sends that eight-character string, waiting 100 ms between sends.

Wow, I did not see that!

Why are you keeping things a secret? In order to help we need full information.

The problem now is obvious: the transfer function of your wireless link has a poor low-frequency response.

The cure? Well, unless you are willing to provide full information on your wireless device, I can't tell.

Yes:

    delayMicroseconds(10000);

}

You cannot pass large values to delayMicroseconds(). I think the upper limit is about 16000 on a 16 MHz system and about 8000 on an 8 MHz system.

[ Actually I'm wrong, I think it's 32000 for 8 MHz systems in fact, so probably not the problem, but it's something that can catch you out ]
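If you do need a delay that long, a safe pattern is to hand the whole milliseconds to delay() and keep delayMicroseconds() well inside its accurate range. A sketch, with the helper name made up:

// Hedged sketch: split a long delay so delayMicroseconds() only ever
// sees a value under 1000, safely below any of the limits above.
void longDelayUs(unsigned long us)
{
    delay(us / 1000);             // whole milliseconds via delay()
    delayMicroseconds(us % 1000); // remainder is always < 1000 us
}

So delayMicroseconds(10000) would become longDelayUs(10000).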

Doh, I’m slow today:

Yes:

#define sbi(a,b) (a) |= (1<<(b))
#define cbi(a,b) (a) &= ~(1 << (b))

void transmitByte(uint8_t data)
{
    for(int i=0; i<8; i++) {  // loop 8 times for one bit without delay
        if (data & 0x01)
            sbi(PORTD, 4);
        else
            cbi(PORTD, 4);
    }
    delayMicroseconds(10000);
    data >>= 1;  // pointless shift of a dead value
}

Suggest:

void transmitByte(uint8_t data)
{
    for(int i=0; i<8; i++)
    {
        if (data & 0x01)
            sbi(PORTD, 4);
        else
            cbi(PORTD, 4);
        delayMicroseconds(10000);
        data >>= 1;
    }
}

You're still faster than us, took me a while to see what you changed :-)


Rob

What about the receive side of this protocol? I see you have a hard-coded baud rate; did you remember to update the receiver?

Given the placement of the delay instruction, it's hard to see how this worked at any speed, despite the topic title.

Why didn't you just use normal serial-style signalling? I.e. the signal goes high to indicate a carrier. A low-going start-bit edge is provided on each byte for the receiver to sync on - the receiver takes its first bit sample 1.5 bit times after this edge, then one bit time between samples. Finally, end with one or more stop bits to assert continued carrier (no carrier == possible baud rate mismatch). I've bit banged several times (e.g. an embedded device pulsing a message out on its LED for a webcam to read), but I prefer to stick to tried and true signalling methods.
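To make that concrete, a transmit frame in that style might look like this (a sketch only - TX_PIN and BIT_US are assumptions, and the receiver would need matching changes):

#define TX_PIN 4
#define BIT_US 10000 // assumed 100 Hz bit rate

// Sketch of a UART-style frame: idle high (carrier), one low start
// bit, 8 data bits LSB first, one high stop bit to reassert carrier.
void sendFrame(uint8_t data)
{
    digitalWrite(TX_PIN, LOW);  // start bit: the edge the receiver syncs on
    delayMicroseconds(BIT_US);
    for (uint8_t i = 0; i < 8; i++) {
        digitalWrite(TX_PIN, (data >> i) & 0x01); // data bit, LSB first
        delayMicroseconds(BIT_US);
    }
    digitalWrite(TX_PIN, HIGH); // stop bit / continued carrier
    delayMicroseconds(BIT_US);
}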

Ugh, very sorry about that - the computer I'm coding this on doesn't have internet access, so I transcribed the code here and of course made a mistake: the delayMicroseconds line and the data >>= 1 line should be inside the for loop. The correct code is what MarkT suggested.

Thanks for the suggestion about a maximum value for delayMicroseconds - I hadn't realized the limit would be different based on the micro's clock speed, so I'll check that out.

Grumpy Mike, why would the transfer function of the wireless link behave differently when the data is being sent by an 8 MHz micro vs. a 16 MHz one? In both cases the data should be sent at the same rate, and as far as I understand that's all the receiver should care about.

DonMilne, the issue is that the receiver's maximum sampling rate is 100 Hz - so when I receive a start bit, it wouldn't be possible to read the next bit 1.5 bit times after that edge; it would have to be either 1 or 2 bit times after that edge. Right? I could very well be misunderstanding how that would work.

No, the receiver has a clock from which the sample rate is derived, but they are usually not the same. E.g. a UART typically uses a clock which is 16x the bit rate (sample rate).
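Roughly, the 16x scheme looks like this on the receive side (illustrative pseudo-receiver only; readLine() is a made-up function returning the line level, and this routine is assumed to be called once per 16x clock tick):

// Sketch: a 16x-oversampling receiver finding bit centres.
volatile uint8_t phase, bitCount, rxByte;
volatile bool receiving = false;

void onTick16x() // assumed to run at 16x the bit rate
{
    if (!receiving) {
        if (readLine() == LOW) { // falling start-bit edge
            receiving = true;
            phase = bitCount = rxByte = 0;
        }
        return;
    }
    phase++;
    // First sample 24 ticks (1.5 bit times) after the edge, then every 16.
    if (phase == 24 + 16 * bitCount) {
        rxByte |= (readLine() == HIGH) << bitCount;
        if (++bitCount == 8)
            receiving = false; // byte done; stop bit follows
    }
}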

The discussion is moot if you don't control the receive side of the protocol too.

Yeah, this would be much easier if I could control the sampling frequency on the receive side.

Anyway, I got it working. I uploaded a program on the Arduino and plain micro to just generate a square wave, with the Arduino using the period which I knew worked for transfering data. Viewed both signals on an oscilloscope, and adjusted the period of the plain micro until the frequency roughly matched that of the Arduino, then used that period for transferring data, and it worked nicely. Ended up using delay(10); delayMicroseconds(250);.

Thanks all for inputs/suggestions.

As you appear to be able to control the code on both the TX & RX devices, maybe you could use Manchester coding - then you will have no potential problems with long runs of 0 or 1 bits, as the transmitted data will provide both clock and data. I think people have written Manchester libraries for Arduino to do the heavy lifting for you; a sketch of the basic idea is below.

And for general background: http://en.wikipedia.org/wiki/Self-clocking_signal
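For example, a single Manchester-encoded bit could be banged out like this (a sketch only; the polarity convention and HALF_BIT_US are assumptions, reusing the sbi/cbi macros and PORTD bit 4 from the code above):

#define HALF_BIT_US 5000 // assumed: half of the ~10 ms bit period

// Sketch of Manchester encoding (IEEE convention): every bit is a
// mid-bit transition, so the receiver always gets a clock edge even
// in long runs of identical bits.
void manchesterBit(uint8_t bit)
{
    if (bit) {
        cbi(PORTD, 4); // first half low...
        delayMicroseconds(HALF_BIT_US);
        sbi(PORTD, 4); // ...rising edge mid-bit encodes a 1
    } else {
        sbi(PORTD, 4); // first half high...
        delayMicroseconds(HALF_BIT_US);
        cbi(PORTD, 4); // ...falling edge mid-bit encodes a 0
    }
    delayMicroseconds(HALF_BIT_US);
}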