IR-Receiver and I2C Master polling at the same time

Hi there,

I have a project which makes use of both I2C and the IRremote library. Both parts work on their own without any issues, BUT if I combine them there are problems decoding the IR signal correctly.

Any ideas how I can handle that together?

void loop() {
  //### IR-RECEIVER ###################################
  if (irrecv.decode(&results)) {
    Serial.println(results.value, HEX);
    irrecv.resume(); // Receive the next value
  }

  if (millis() - I2CTimer > 100) {
    I2CTimer = millis();
    //### I2C ###########################################
    Wire.requestFrom(8, 1);    // request 1 byte from slave at address 8

    while (Wire.available()) { // slave may send less than requested
      char c = Wire.read();    // receive a byte as character
    }
    //###################################################
  }
}

So this should receive IR signals and poll the I2C clients for any data to handle... I tested restricting the I2C polling to every 100 ms, but that doesn't help. As long as the I2C polling is active, most of the IR codes are not decoded correctly. The IR codes use the NEC protocol with 32-bit packets.

I guess it is an interrupt or timer collision? Is there any solution to fix that problem? I would like to prioritise IR over I2C.

Or am I doing it totally wrong?

Thanks for any help!!!

The IRremote library polls the IR signal every 50µs, using a timer interrupt. These interrupts have precedence over the I2C interrupts, so I could understand problems with receiving I2C data, but not with IR. I never tried using I2C and IR together, though.

With the default I2C clock of 100 kHz it will take 80 µs to receive a byte. Your I2C read loop, once started, reads any number of bytes. Have you tried also limiting the number of bytes read in one transfer?
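For reference, the 80 µs figure is just 8 data clocks at 10 µs each; with the ninth (ACK/NAK) clock a full byte is 90 µs on the bus. A small host-side helper to check that arithmetic (`i2cByteMicros` is my own name, not a Wire API):

```cpp
#include <cassert>

// Microseconds to clock one byte over I2C at the given SCL frequency.
// includeAck adds the 9th (ACK/NAK) clock that follows the 8 data bits.
double i2cByteMicros(double sclHz, bool includeAck) {
    double clockUs = 1e6 / sclHz;          // one SCL cycle, e.g. 10 us at 100 kHz
    return (includeAck ? 9 : 8) * clockUs; // 8 data bits, optionally + ACK
}
```

So even at the default clock, a 1-byte transfer occupies the bus for well under a millisecond.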

The I2C master asks for only 1 byte per request.

I'm using the following IR-Library, not sure if there are differences between different libs:
http://z3t0.github.io/Arduino-IRremote/

If IR is prioritised over I2C, that would be fine. But IR reception seems to be broken while talking over I2C.

The signal I'm receiving is completely messed up when I2C is active within the loop. If I deactivate I2C within the loop, IR is fine.

I'm relatively new to timers and interrupts, so I'm not sure what's happening within the libraries in detail. I already thought about using another Arduino as master and using the IR receiver as a client to reduce some polling, but that would not be the preferred way :frowning:

Or are my 32-bit packets over IR too long?

Thanks for your help!

I2C accepts as many bytes as the slave sends, and the slave sends until it reaches some command-specific limit, or a NAK from the master. If you have a dumb I2C register that can do nothing but send a byte, your loop will terminate only when the slave doesn't respond quickly enough - here a matter of the I2C clock and the polling loop time. Did you already try to output the bytes, or the number of bytes, read from I2C?
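To make the loop robust against a chatty slave, you can bound it explicitly. A sketch of the pattern, checked on the host - `MockWire` here is just a hypothetical stand-in for Wire's receive buffer, not the real library:

```cpp
#include <cstdint>
#include <deque>

// Hypothetical host-side stand-in for Wire's receive buffer (illustration only).
struct MockWire {
    std::deque<uint8_t> buf;
    int available() const { return static_cast<int>(buf.size()); }
    uint8_t read() { uint8_t b = buf.front(); buf.pop_front(); return b; }
};

// Read at most maxBytes, even if the slave keeps the buffer topped up,
// so the polling loop cannot run longer than one bounded transfer.
int boundedRead(MockWire &w, uint8_t *dst, int maxBytes) {
    int n = 0;
    while (w.available() > 0 && n < maxBytes) {
        dst[n++] = w.read();
    }
    return n;
}
```

On the Arduino itself the same `n < maxBytes` condition would simply be added to the `while (Wire.available())` loop.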

But an IR clock mismatch, between sender and receiver, can also cause problems. Did you try different IR senders and receivers already, or another Arduino? I vaguely remember that I already tried to optimize the NEC timing, for my devices.
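For what it's worth, the nominal NEC timing gives a fixed frame length for a standard 32-bit command, because address and command are sent together with their inverses (so exactly 16 ones and 16 zeros). A quick host-side calculation (`necFrameMicros` is my own helper name, using the commonly published NEC figures):

```cpp
// Nominal length in microseconds of a standard 32-bit NEC frame
// (address + ~address + command + ~command = 16 ones, 16 zeros).
double necFrameMicros() {
    const double unit    = 562.5;           // basic NEC time unit, us
    const double leader  = 9000.0 + 4500.0; // 9 ms leader mark + 4.5 ms space
    const double zeroBit = 2 * unit;        // 562.5 us mark + 562.5 us space
    const double oneBit  = 4 * unit;        // 562.5 us mark + 1687.5 us space
    return leader + 16 * zeroBit + 16 * oneBit + unit; // + final stop mark
}
```

That comes out around 68 ms per frame, so 32-bit packets as such are not too long; the question is only whether the 50 µs sampling keeps running undisturbed for that whole time.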

You also can try my modified IRremote library.

My IR sender is another Arduino, and the I2C client is another Arduino as well.

I will test your modified IR library; perhaps I'll have more luck using that one.

Thanks so far... I will come back after testing.

Multitasking with the IR library and almost anything else is just crap, IMHO :smiling_imp:

My personal preference is to move the entire IR software to a t85 as a dedicated receiver:
Dedicated t85 IR receiver

Ray

If your remote uses the NEC protocol, I wrote a library here which does not use timers.
It uses one interrupt pin (2 or 3 on an UNO).

http://forum.arduino.cc/index.php?topic=317625.0

I think I found the problem :o :confused:

SoftwareSerial was still active in my sketch, and that seems to collide with the IR receiver. I sent something via SoftwareSerial whenever the I2C master received a signal... so it looked like I2C caused the problem >:(

So now I change my question to: is it feasible to use the IR receiver AND SoftwareSerial, or is there no chance to use both at the same time?

Thx for all your help and solutions!

SoftwareSerial disables interrupts during read and write :frowning:
AltSoftSerial doesn't block interrupts, but needs another timer.

Hi FaVo,
I had the same issue. I created a UWP (Universal Windows Platform) app that runs on a Raspberry Pi 3. This app controls different things (like a small smart home). I also wanted to control my TV via IR without my standard TV remote. It's not implemented in my app yet, but the app will be controlled via voice commands (in the past there was no German language support for Windows IoT, but I think it has been implemented by now (I will find out soon)). That means, for example: my TV will turn the volume down if I say something like "Xbox leiser" ("Xbox, quieter" - I will find a proper name for my "smart home" in future :slight_smile: ).
I'm sure that it is possible to send IR signals with a Raspberry and Windows IoT. But to be honest, I don't know how (I'm a mechanical engineer :stuck_out_tongue: ). For other reasons I had already set up the I2C communication between my Raspberry (with UWP) and an Arduino (an advantage of the Arduino is the analog GPIOs). And implementing IR functionality with an Arduino is pretty simple (because there are existing libraries).
I used the IRremote library without success (same issues with timing, clock...). It took me a little while, but I found out that the IRLib library works pretty well for this purpose! Please see my post in the Microsoft MSDN Forum: LINK.
There you can find the Arduino code that receives (in this case) one byte and translates it (switch/case) to the corresponding IR hex code (which I had decoded beforehand).

Regards René

PS: Sorry for any typos; some letters may have been auto-corrected because of the wrong language setting...