Sending IR signal to Samsung TV failed

#include <Arduino.h>
#include <IRremote.hpp>

#define IR_SEND_PIN 3

void setup() {
    Serial.begin(9600);
    IrSender.begin(IR_SEND_PIN);
    Serial.println("IR sender ready");
}

const uint16_t rawData[67] = {  // 67 entries: header pair + 32 bit pairs + stop mark
  4500, 4450,
  600, 1650, 600, 1600, 650, 1600, 600, 500,
  650, 500, 600, 500, 650, 450, 650, 500,
  600, 1650, 600, 1600, 650, 1600, 650, 450,
  650, 500, 600, 500, 650, 500, 600, 500,
  600, 1650, 600, 1650, 600, 1600, 650, 500,
  600, 500, 600, 500, 650, 500, 600, 500,
  650, 450, 650, 500, 600, 500, 650, 1600,
  600, 1650, 600, 1600, 650, 1600, 650, 1600,
  600
};


void loop() {
    uint16_t samsungAddress = 0x7; // standard address for Samsung TVs
    uint8_t samsungCommand = 0x7;  // example: Power
    uint8_t repeats = 0;

    IrSender.sendSamsung(samsungAddress, samsungCommand, repeats);
    //IrSender.sendRaw(rawData, sizeof(rawData) / sizeof(rawData[0]), 38);
    Serial.println("Samsung Power gesendet");
    delay(5000);
}


I am trying to send an IR signal to my Samsung TV. Both options (sendSamsung and sendRaw) failed. I validated the rawData, the protocol and the command beforehand with a TSOP receiver. I also tested with a normal LED and confirmed that it flashes.

Can you successfully send commands to some other device which has an IR receiver?

If not, how do you know this is a programming issue? It could be a problem with your circuit.

You should post a schematic.

No, I couldn't. I connected a 1k resistor from pin 3 of my ATmega chip to the IR LED, and from the LED to ground.

That's going to give a very weak signal. Did you try the circuit right up against the receiver on the TV?

This works! Thanks very much. Didn't know the signal was so weak.

You could reduce the value of the resistor to say 220Ω to increase the LED brightness and hence the range.

Some IR LEDs can be run at even higher currents, but you would need a transistor or MOSFET to switch the current.
Check the datasheet of your LED to find what current it can handle, or give us the details of your LED.

Or if you don't have transistors on hand, you can use 3 IR LEDs in series with a 68Ω resistor for roughly 3× the output power.

So you are happy with this situation, now that you understand what the problem is? If not, why did you mark the topic as solved?

Assuming your ATmega is powered by 5V, and the IR LED has a typical forward voltage of 1.2V, the instantaneous current that flows when a pulse is sent out would be around (5-1.2)/1K = 3.8mA.

Most IR LEDs can handle 20mA of continuous current, so you can certainly try a resistor of (5-1.2)/20 = 0.19K or 190 ohms (choose the next highest available value). The higher the current, the brighter the light and the better the range.

If you have the model/part number of the IR LED, you can search online for its datasheet to find its instantaneous maximum current. Let's say that is 100mA. In that case you could use a resistor of around (5-1.2)/100 = 0.038K or 38 ohms.

I've sometimes wondered about boosting the signal of a Samsung TV remote by simply attaching an IR photosensor to the remote, maybe with a logic IC and some high-powered IR LEDs.

You can bounce the existing remote output off ceilings etc., so my idea was to simply flood the room with TV remote signals. If you are lazy, like me, and can't get up to point the remote at the TV, you park a separate booster somewhere and you can stay seated.

Before anyone asks, it's a cluttered room with obstacles. Why don't I move the obstacles? Another day, another chore.

Never got round to trying it.

Make that 100mA.
Common remotes pulse with 200mA, sometimes more.
20mA pulsed will only bridge about 1-2 metres.
Leo..

I just need some signal as output for further analysis in another tool. It's not important in my case that it is weak. That's why I marked this as solved. Thank you.

Agreed, but an ATmega pin can't source/sink that much, and a transistor would be needed. The ATmega could source/sink 20mA no problem, which is why I suggested that.

Aside: I wonder how come 3~5mm IR LEDs can handle 100mA continuous. That would burn out most 3~5mm visible LEDs. The forward voltage is lower (~1.2V IR vs. 1.8V for red LEDs) which certainly helps. At 100mA, an IR LED would dissipate 120mW. A red LED dissipating 120mW would have a continuous current of 67mA, more than double the usual max rating of 20~30mA.

I've been wondering the same. Maybe just there's no market for high power 5mm indicator LEDs, so they can be cheaply made and rated for 20-30mA.

Vishay, for example, has 5mm 50mA visible LEDs with max power dissipation ratings (135mW), similar to their 5mm 100mA IR-LEDs (160mW).

Try Googling for "strawhat LEDs".
Leo..

I've got some 8mm "straw hat" LEDs rated at 0.5W. You have to be quite careful with them to avoid burning them out, which is quite easy to do. For example you can test them on breadboard at lower current, but breadboard doesn't allow for much heat dissipation. On PCB, you would need to provide large pads, ground plane etc to help dissipate the heat they generate. Also use a constant current circuit to drive them, otherwise the magic smoke soon escapes.

100mA (average) through a 5mm IR diode is only 0.15 watt.

The trick of giving a remote control more range is more LEDs, not so much higher currents.
A choir is louder than a single singer.
Leo..