So I put a strip of close to 1000 WS2812B LEDs in my shop and then built a cool (IMHO) demo controller for it. You can see the result here:
Anyway, the problem: it all worked well when it was temporary wiring with speaker-wire pigtails twisted together, but I was only getting 3.9V at the point where I inject power on the far end, since it's a long run of 16AWG wire. So for that and other reasons, I undertook to improve the wiring for the final "permanent" installation.
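Just to put rough numbers on that drop (a back-of-the-envelope sketch with a made-up run length and current, since I didn't measure either), 16AWG copper is roughly 4 ohms per 1000 ft and the current has to go out and back:

```python
# Back-of-the-envelope voltage drop in the supply wiring.
# The run length and current below are hypothetical examples, not measurements.
OHMS_PER_FT_16AWG = 0.004  # ~4 ohms per 1000 ft of 16AWG copper

def wire_drop(run_ft, amps):
    # Current flows out on +5V and returns on ground, so count the round trip.
    return amps * (2 * run_ft * OHMS_PER_FT_16AWG)

print(f"{wire_drop(25, 5):.2f} V lost")  # ~1.0 V, which would turn 5V into ~4V at the strip
```

So seeing roughly 4V at the far end of a run like that is about what you'd expect.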
The more I did, the worse it got - the LED data signal would get corrupted, a lot of the LEDs would light up white, which would push my power supply past 10A, and other bad mojo.
So I moved the power supply closer, 2' from the strip instead of 12', and it got worse still. I turned it up to 5.5V to boost it a little for the run: no change, or worse. Different supply: no change. Different ESP32 board: no change. No matter what I did, I couldn't get it to work right the way it did when it was all sloppy and hacky!
Then for some reason I turned the voltage DOWN, which I guess ended up replicating my original test setup. With the sloppy wiring I was sending 5V and getting 4.0V at the strip; now I had the supply right next to it and it didn't work... unless I turned the output down to 4.0V.
Running the power supply at 4V, everything works great, but why???? It's a 5V strip!
I'm a software guy, so I don't have much knowledge here, but my only guess is that some bias or delta between the data voltage and the supply voltage is helping? Maybe bringing the supply voltage down a bit made the data signal more "readable" to the strip? That's a stretch. But I'm driving it from an ESP32, which I think only puts out 3.3V on the data line, so I bet a level converter on that line might help?
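For what it's worth, here's the rough arithmetic behind that guess. I'm assuming the WS2812B only counts the data line as "high" when it's at least 0.7 × its own supply voltage (that's the datasheet figure as I understand it) and that the ESP32 GPIO tops out around 3.3V:

```python
# Sketch of the "data high threshold" guess: the WS2812B's minimum logic-high
# voltage scales with its supply (0.7 * VDD, per the datasheet as I read it),
# while the ESP32 data line stays at ~3.3V regardless of the strip's supply.
DATA_HIGH_V = 3.3  # assumed ESP32 output-high level

for supply_v in (5.5, 5.0, 4.0):
    required = 0.7 * supply_v
    verdict = "OK" if DATA_HIGH_V >= required else "too low / marginal"
    print(f"VDD={supply_v:.1f}V -> needs data high >= {required:.2f}V -> 3.3V is {verdict}")
```

If that's right, it would explain why raising the supply made things worse and dropping it to 4V made everything work, and why a 3.3V-to-5V level shifter should let me run the strip at 5V again.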
Anyone have any guesses as to what causes the phenomenon I'm seeing here? And should I put a 3.3V-to-5V step-up level converter on the output from the ESP32 to the LED strip?