Issues sending IR signals with IRremote

I'm trying to send and receive IR signals between two ATtiny85s. The receivers I use are VS1838B modules, which demodulate a 38 kHz carrier.

Transmitter sends a Sony signal in a loop:

#include <Arduino.h>
#include <IRremote.h>

IRsend _irSend;

void setup()
{
  OSCCAL = 0x8A; // calibrated value for this chip's internal oscillator
}

void loop()
{
  // Send the 12-bit Sony code 0xA90 three times; the Sony protocol
  // expects the frame to be repeated
  for (int i = 0; i < 3; i++)
  {
    _irSend.sendSony(0xa90, 12);
    delay(40);
  }
  delay(5000); // 5 second delay between each signal burst
}

Receiver prints any decoded signal over bit-banged serial:

#include <Arduino.h>
#include <IRremote.h>
#include <SoftwareSerial.h>

SoftwareSerial _serial(-1, PB2); // TX-only software serial on PB2
IRrecv _irRecv(PB0);             // VS1838B output wired to PB0
decode_results _results;
char message[32];

void setup() {
  OSCCAL = 0x4F; // calibrated value for this chip's internal oscillator
  _serial.begin(9600);
  _irRecv.enableIRIn();
}

void loop() {
  if (_irRecv.decode(&_results)) {
    sprintf(message, "Received %lu in %d", _results.value, _results.decode_type);
    _serial.println(message);
    _irRecv.resume(); // Receive the next value
  }
}
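
When decode_type comes back as -1, the raw timings usually say more than the decoded value. Here's a drop-in replacement for loop() above that dumps them; this assumes IRremote 2.x, where rawbuf holds the mark/space durations in 50 µs ticks (USECPERTICK):

void loop() {
  if (_irRecv.decode(&_results)) {
    // rawbuf[0] is the gap before the frame; entries 1..rawlen-1 are the
    // alternating mark/space durations
    for (int i = 1; i < _results.rawlen; i++) {
      _serial.print(_results.rawbuf[i] * USECPERTICK);
      _serial.print(' ');
    }
    _serial.println();
    _irRecv.resume();
  }
}

If the dumped marks are consistently stretched or squashed relative to Sony's nominal 600/1200 µs pulses, that points at clock drift on one side.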

Now, I have a working remote, and I've confirmed the receiver decodes Sony and NEC signals from it as expected. However, when I point my transmitter at the receiver, the receiver spills out garbage readings with protocol -1 (unknown).

I've tried replacing the IR LED with a visible LED. To my surprise, the light pulsed constantly instead of showing the expected "burst, then 5 second pause" pattern.
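
To take IRremote (and its timer setup) out of the equation entirely, the 38 kHz carrier can also be bit-banged directly. A minimal test sketch, assuming the LED sits on PB1 and the chip runs at 8 MHz:

#include <Arduino.h>

void setup() {
  pinMode(PB1, OUTPUT);
}

void loop() {
  // ~38 kHz carrier: toggle every ~13 us (writing 1 to a PINB bit
  // toggles the output pin on AVR)
  for (uint16_t i = 0; i < 7600; i++) { // ~100 ms burst
    PINB = _BV(PB1);
    delayMicroseconds(12); // slightly under 13 us to allow for loop overhead
  }
  delay(1000);
}

With an accurate clock this should give a dim ~100 ms flash once a second; if the pause is missing here too, the problem is the chip's timing rather than the library.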

Edit: I've also tried uploading a blink sketch on the same pin (PB1) to confirm the chip's timing works as expected.
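
The blink test was essentially the stock Blink example, something like:

#include <Arduino.h>

void setup() {
  pinMode(PB1, OUTPUT);
}

void loop() {
  digitalWrite(PB1, HIGH);
  delay(1000);
  digitalWrite(PB1, LOW);
  delay(1000); // a visibly correct 1 s on / 1 s off cadence
}

A correct-looking blink only validates millisecond-scale timing, though; the 38 kHz carrier needs the oscillator to be accurate at the microsecond scale, which is what the bit-banged test above exercises.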

Can anyone point out where I went wrong? Thanks.

WattsThat:

  1. Which core are you using?
  2. What crystal/clock setup?
  3. Have you actually calibrated the r/c clocks on both processors? How?

WattsThat:

  1. Which core are you using?
  2. What crystal/clock setup?
  3. Have you actually calibrated the r/c clocks on both processors? How?
  1. I've tried both damellis's and SpenceKonde's (ATTinyCore) cores.
  2. Internal 8 MHz oscillator, no crystal.
  3. With the serial-based Tiny Tuner (I don't have a scope). The code was tested both with and without the custom OSCCAL values set.
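
For anyone curious what the Tiny Tuner step amounts to: conceptually it's a sweep like the hypothetical sketch below (PB2 and 9600 baud match my receiver wiring). SoftwareSerial computes its bit timing from F_CPU, so only OSCCAL values near the true calibration produce readable text on the terminal:

#include <Arduino.h>
#include <SoftwareSerial.h>

SoftwareSerial _serial(-1, PB2); // TX-only, same wiring as the receiver

void setup() {
  _serial.begin(9600);
}

void loop() {
  // Sweep OSCCAL; the lines that arrive readable on the terminal
  // bracket the right calibration value for this chip
  for (uint8_t cal = 0x40; cal <= 0xA0; cal++) {
    OSCCAL = cal;
    delay(5); // let the oscillator settle
    _serial.print("OSCCAL=0x");
    _serial.println(cal, HEX);
  }
  delay(2000);
}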