Lower voltage using resistors - calculate the variation?

Hi, I have an e-ink display that works with 3.3V. I thought about powering it from 5V through two resistors in series, R1 = 540 ohm and R2 = 1000 ohm, which would give me about 3.24V at the junction. But the display has a power consumption rating of 0.017 - 40 mW and a voltage tolerance of 2.4 - 4V. How can I calculate how much the voltage will vary depending on the current drawn? Is this a usable solution? Thanks

Your display is a load in parallel with the 1000 ohm resistor.

Put simply: 40 mW at 3.3V is about 12 mA. 12 mA through the 540 ohm resistor would drop roughly 6.5V across it, more than the whole supply voltage, so there would be essentially no voltage left for the display.
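To put numbers on the original question (how much the divider output varies with load), replace the divider with its Thevenin equivalent and subtract the drop the load current causes across that source resistance. A minimal sketch, assuming a 5V supply and modelling the display as a constant-current load at 3.3V:

[code]
#include <cstdio>

int main() {
    // Assumed values: a 5 V supply feeding the 540 / 1000 ohm divider
    const double Vin = 5.0, R1 = 540.0, R2 = 1000.0;

    // Thevenin equivalent of the divider as seen by the display
    const double Vth = Vin * R2 / (R1 + R2);   // ~3.25 V with no load
    const double Rth = R1 * R2 / (R1 + R2);    // ~351 ohm source resistance

    // Display current at both ends of its 0.017 - 40 mW rating (taken at 3.3 V)
    const double powers_mW[] = {0.017, 40.0};
    for (double P : powers_mW) {
        double I = (P / 1000.0) / 3.3;         // load current in amps
        double Vout = Vth - I * Rth;           // divider output under that load
        printf("P = %6.3f mW -> I = %7.4f mA -> Vout = %6.2f V\n",
               P, I * 1000.0, Vout);
    }
    return 0;
}
[/code]

The output only stays near 3.25V while the display draws microamps; at the 40 mW end the formula goes negative, which just means the divider has collapsed and the display would see far less than its 2.4V minimum.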

Don't use this approach; use a 3.3V power source.

Use a [u]voltage regulator[/u].

Check the dropout voltage before choosing a voltage regulator. For example, if a 3.3V regulator has a 1V dropout rating, it will drop out of regulation with less than 4.3V in: you'll get less than the rated 3.3V out and it will no longer be regulating.
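As a quick sanity check on that headroom arithmetic (using the same example figures, not any particular part):

[code]
#include <cstdio>

// True if the supply leaves enough headroom: Vsupply >= Vout + Vdropout.
// Take Vdropout from the regulator's datasheet at your actual load current.
bool staysInRegulation(double Vsupply, double Vout, double Vdropout) {
    return Vsupply >= Vout + Vdropout;
}

int main() {
    // A 3.3 V regulator with a 1 V dropout, as in the example above
    printf("5.0 V in: %s\n", staysInRegulation(5.0, 3.3, 1.0) ? "regulating" : "out of regulation");
    printf("4.0 V in: %s\n", staysInRegulation(4.0, 3.3, 1.0) ? "regulating" : "out of regulation");
    return 0;
}
[/code]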

Voltage dividers (a pair of resistors) are ONLY good for low-power, low-current "signals" or "reference voltages". They are NOT for power!

To say this again: you use a voltage divider, for example, when you want to measure a voltage on an analog pin that is at a HIGHER potential than the analog reference voltage, or above the maximum rated voltage of that pin on the controller.
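For instance, here is a sketch of that measuring use case, assuming an Arduino-style board with a 5V reference and 10-bit ADC; the divider values (10k over 5k, good for roughly 0-15V) and the pin are only illustrative:

[code]
// Hypothetical example: scale a higher voltage down with a divider,
// read it on an analog pin, then undo the division in software.
const float R_TOP = 10000.0;    // from the voltage being measured to the pin
const float R_BOTTOM = 5000.0;  // from the pin to GND
const int SENSE_PIN = A0;

void setup() {
  Serial.begin(9600);
}

void loop() {
  int raw = analogRead(SENSE_PIN);                        // 0..1023
  float vPin = raw * 5.0 / 1023.0;                        // voltage at the pin
  float vMeasured = vPin * (R_TOP + R_BOTTOM) / R_BOTTOM; // undo the divider
  Serial.println(vMeasured);
  delay(500);
}
[/code]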

You would use one as the feedback divider that sets the output voltage of an adjustable regulator.
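For example, an LM317-style adjustable regulator sets its output from that divider as Vout ≈ 1.25V × (1 + R2/R1); the resistor values below are just illustrative:

[code]
#include <cstdio>

int main() {
    // LM317-style adjustable regulator: Vout ~ Vref * (1 + R2/R1),
    // ignoring the small ADJ-pin current. Vref is about 1.25 V.
    const double Vref = 1.25;
    const double R1 = 240.0;  // between OUT and ADJ
    const double R2 = 390.0;  // between ADJ and GND
    printf("Vout ~ %.2f V\n", Vref * (1.0 + R2 / R1));  // about 3.28 V
    return 0;
}
[/code]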

You would never use one to supply power to an active circuit component.