NeoPixel Tube Light

Hi there,

I'm new to the forum - dusting off my electronics box after a very long hiatus. I've got a little project to start with, but would really appreciate some steering from experienced heads - especially on the fundamental set-up.

I'm planning to make a basic tube light - embedding 2m of individually addressable strip LEDs in a long translucent tube. (I'm competent with analogue electronics but fairly new to Arduino and microcontrollers.)

The light would have two functions.

A) Emit a pleasant, static colour of light (colour can be changed)
B) Emit custom animated light patterns - generated either in code or derived from an image/video file, e.g. fire, graphic equaliser etc. Something to experiment with.

For light quality/pastel tones I'd like to use RGB+W LEDs. This seems ideal for (A), but perhaps makes (B) a little trickier. I've found plenty of project examples online for controlling strip lights, especially using 24-bit RGB. But my worry is that if I start using 32-bit RGB+W, it's going to get complicated.
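For concreteness, the jump from 24-bit to 32-bit just means each pixel carries a fourth (white) byte. Here's a minimal sketch of packing/unpacking such a value, using the W-R-G-B byte order that Adafruit's NeoPixel `Color(r, g, b, w)` helper uses - the function names here are my own illustration:

```cpp
#include <cstdint>

// Pack one RGBW pixel into a 32-bit word, white channel in the
// top byte (same layout as Adafruit_NeoPixel::Color(r, g, b, w)).
uint32_t packRGBW(uint8_t r, uint8_t g, uint8_t b, uint8_t w) {
  return ((uint32_t)w << 24) | ((uint32_t)r << 16) |
         ((uint32_t)g << 8)  |  (uint32_t)b;
}

// Unpack the individual channels again.
uint8_t whiteOf(uint32_t c) { return (c >> 24) & 0xFF; }
uint8_t redOf(uint32_t c)   { return (c >> 16) & 0xFF; }
uint8_t greenOf(uint32_t c) { return (c >> 8) & 0xFF; }
uint8_t blueOf(uint32_t c)  { return c & 0xFF; }
```

So any animation code that works on 24-bit values mostly just needs one extra byte per pixel carried through, plus a decision about what to do with the white channel.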

This Amy Goodchild example:
[Learn to control LEDs with Fadecandy and Processing - YouTube] - which uses the FadeCandy chip + Processing to generate effects - shows the kind of ease/fluidity I'm after. I'd love to use Processing for pixel mapping. Unfortunately that chip only works with 24-bit RGB strips, not 32-bit RGB+W.

So the cross-roads I'm at:

Option 1: Go with RGBW (NeoPixel 144px/m strip) and use an Arduino Uno to emulate the functionality/dithering offered by the FadeCandy chip (if required). The white LED can be ignored for animations derived from video files. Would Processing (perhaps running on a Raspberry Pi) still be viable for generating the pixel mapping/animations?

Option 2: Stick with RGB (does the white even make that big a difference?) - RGB strip + FadeCandy as per the example. Or the FadeCandy may be superfluous with the more modern DotStars, so my old 2009 Arduino (Duemilanove) could be substituted.

Any guidance would be much appreciated!

Hi and welcome to the forum. Here is how you post a link, with the “insert link” icon:

Thanks Paul!

I don't have any experience of RGBW leds, fadecandy or Processing, so can't help much with those questions.

I can say that pre-defined RGB patterns are beginner-to-intermediate difficulty. Patterns based on audio data would be quite an advanced project. Patterns based on image/video data would be very advanced and probably well beyond the capabilities of your Duemilanove.

I can speak to using RGBW WS2812B-style modules.

I am working on a project to create sparkling treasure in a treasure chest, with lights underneath faceted plastic gems. My experience is that they are very easy to program; I had never worked with pixel lights until now. Adafruit provides a good NeoPixel library, and there are some good tutorials for it as well. An Arduino can drive single-wire (data-only, no clock) strands directly, whereas from my research a Raspberry Pi will run into problems driving them without a separate clock connection - something to consider. You can still run such lights from a Raspberry Pi, but from my investigation it may require a controller unless your LEDs have a separate clock.

The added white pixel delivers really pleasant pastel colors when mixed with the RGB, if that is the look you're after. The link you provided shows some quite complex control - taking camera image data and using it to turn pixels on and off. I haven't tackled anything that complex.

I easily programmed the NeoPixels to randomly apply colors from a set palette to each of the LEDs in my strip. The color pattern changes every time the Arduino is powered up. The code then causes a random pixel to turn white for a brief moment and then go back to its original color.
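That sparkle logic is simple enough to sketch. Here it is as plain C++ so it can be checked off-hardware - the palette values and function names are my own illustration; on a real Arduino the pixel array writes would go through the Adafruit library's `strip.setPixelColor()` and `strip.show()` calls:

```cpp
#include <cstdint>
#include <cstdlib>
#include <cstddef>

const int NUM_PIXELS = 30;
// Hypothetical pastel palette, packed 0x00RRGGBB
const uint32_t PALETTE[] = {0x4000FF, 0xFF2080, 0x20FF40, 0xFFA000};
const size_t PALETTE_SIZE = sizeof(PALETTE) / sizeof(PALETTE[0]);

uint32_t pixels[NUM_PIXELS];

// Give every pixel a random color from the palette.
void randomisePixels() {
  for (int i = 0; i < NUM_PIXELS; ++i)
    pixels[i] = PALETTE[rand() % PALETTE_SIZE];
}

// Flash one random pixel white, then restore its color.
// On hardware you'd call strip.show() and delay() between the steps.
void sparkleOnce() {
  int i = rand() % NUM_PIXELS;
  uint32_t saved = pixels[i];
  pixels[i] = 0xFFFFFF;  // white
  // strip.show(); delay(50);  // (hardware only)
  pixels[i] = saved;     // back to its original color
}
```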

The Adafruit examples demonstrate all kinds of patterns and fades, so achieving the look in the video is certainly possible. The complexity in the video is in converting video images to drive the pixel behavior.

Much appreciate the feedback on my project.

I've decided to go with RGBW NeoPixels, driven via a UNO board.

Generation of pixel patterns would be done using Processing on a laptop and uploaded as required - I think the flash memory on the UNO will be more than enough for the data (for 288 pixels in the 2 metre string).

So the hardware is on its way, I will put up another post when I have some progress to report.

d_nova:
I’ve decided to go with RGBW NeoPixels, driven via a UNO board.

A Nano is a more practical version (and cheaper, especially clones) for most projects. If you have a UNO, keep it for simple experiments, or use with a “shield”. :grinning:

d_nova:
Generation of pixel patterns would be done using Processing on a laptop and uploaded as required - I think the flash memory on the UNO will be more than enough for the data (for 288 pixels in the 2 metre string).

Each frame will require 288 * 4 = 1152 bytes.
So to run a single LED over the length of the strip (one frame per pixel position) you would need:-
288 * 1152 = 331776 bytes of memory, a lot more than you have on an Uno
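As a sanity check, that arithmetic in plain C++ (assuming 4 bytes per RGBW pixel and one frame per pixel position for the sweep):

```cpp
#include <cstdint>

const uint32_t NUM_PIXELS = 288;     // 2 m of 144 px/m strip
const uint32_t BYTES_PER_PIXEL = 4;  // R, G, B, W

// Memory for one complete strip state.
uint32_t bytesPerFrame() { return NUM_PIXELS * BYTES_PER_PIXEL; }

// Memory to pre-store a "single LED sweeps the strip" animation,
// one full frame per pixel position.
uint32_t bytesForFullSweep() { return NUM_PIXELS * bytesPerFrame(); }
```

That's roughly 1.1 KB per frame and over 300 KB for the naive sweep - against the Uno's 2 KB of SRAM.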

How many frames are you looking to have in an animation?
You might consider storing the frames in a more compact way, like run-length encoding or frame differences.
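A sketch of what the run-length idea could look like - this is my own illustrative (count, value) byte scheme, not a standard wire format. It pays off when long stretches of the strip share one color, as in the sweep example:

```cpp
#include <cstdint>
#include <cstddef>
#include <vector>

// Run-length encode a frame buffer as (count, value) byte pairs.
std::vector<uint8_t> rleEncode(const std::vector<uint8_t>& data) {
  std::vector<uint8_t> out;
  size_t i = 0;
  while (i < data.size()) {
    uint8_t value = data[i];
    uint8_t count = 1;
    while (i + count < data.size() && data[i + count] == value && count < 255)
      ++count;
    out.push_back(count);
    out.push_back(value);
    i += count;
  }
  return out;
}

// Expand the (count, value) pairs back into the original buffer.
std::vector<uint8_t> rleDecode(const std::vector<uint8_t>& enc) {
  std::vector<uint8_t> out;
  for (size_t i = 0; i + 1 < enc.size(); i += 2)
    out.insert(out.end(), (size_t)enc[i], enc[i + 1]);
  return out;
}
```

A mostly-uniform frame with one lit pixel compresses to a handful of bytes; a fully random frame compresses poorly, so frame differences may suit the animated patterns better.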

Thanks again for comments!

I apologise for a mistake I made - I wrote flash memory when I meant SRAM for storing the dynamic data, and there's only 2 KB of that on the UNO. So I'm hoping to store one "frame" at a time on the UNO and upload each fresh one from the laptop running the Processing code.

I'm sure I'll hit limitations with serial link speed and memory capacity, but hope it'll still be enough to create a dynamic pattern on the NeoPixel strip.

So I'm hoping to store one "frame" at a time in the UNO and upload each fresh one from the laptop running the Processing code.

And how fast do you think that will be?
More to the point how fast do you need it to be?
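To put rough numbers on it - my own back-of-envelope, assuming classic 8N1 serial framing (ten bits on the wire per byte) and 115200 baud:

```cpp
const double BAUD = 115200.0;
const double BITS_PER_BYTE = 10.0;   // 8 data bits + start + stop (8N1)
const int FRAME_BYTES = 288 * 4;     // 1152 bytes per full RGBW frame

double bytesPerSecond()  { return BAUD / BITS_PER_BYTE; }
double framesPerSecond() { return bytesPerSecond() / FRAME_BYTES; }
```

That works out at about 10 frames per second before any protocol overhead, and on the Uno the USB-to-serial bridge adds its own latency - which is why a board with native USB helps.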

You could consider the Arduino Pro Micro (not to be confused with the Pro Mini). It has a little more RAM, but more importantly a native USB port, so serial comms run at USB speed (you still have to set a baud rate in Serial.begin(), but it has no effect).

But a lot also depends on how you are going to encode the frame data - as well as on the format, like the frame differences I mentioned before.