Lost on how to control multiple LED strips with a MEGA

I characterized the execution time of meteorRain() and it appears to take 1.34 seconds to complete the meteor effect for each strip. That was with a delayMicroseconds(250) delay; with delay(1) the time was only marginally longer at 1.4 seconds, indicating that the delays are not the main time consumer right now.
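For reference, this kind of timing can be captured with micros() around the call. A minimal sketch of the approach; the meteorRain() signature here is just a placeholder, since the real one presumably takes meteor size, decay, and speed parameters:

```cpp
#include <Arduino.h>

// Placeholder for the actual effect under test.
void meteorRain() { /* ... */ }

void setup() {
  Serial.begin(115200);
}

void loop() {
  unsigned long t0 = micros();
  meteorRain();
  // Unsigned subtraction gives the correct elapsed time even if
  // micros() rolls over mid-measurement.
  unsigned long elapsed = micros() - t0;
  Serial.print(F("meteorRain() took "));
  Serial.print(elapsed / 1000.0, 2);
  Serial.println(F(" ms"));
  delay(1000);
}
```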

With the fade portion commented out, the execution time drops to 1.08 seconds (with delayMicroseconds(250)).

The real time consumer is calling FastLED.show() on every iteration. When that's commented out and the fade logic re-enabled, the execution time drops to 278 ms. (With the fade logic also commented out it drops to ~20 ms.)

As a sanity check: you have 12 strips at 37 LEDs per strip, or 444 LEDs. Each LED requires 24 bits of data, so 444 x 24 = 10656 bits must be sent every time .show() is called. At a nominal effective clock of 800 kHz, clocking out 10656 bits takes 13.32 ms per .show() call. This happens 37 x 2 = 74 times per strip, giving a nominal per-strip time of just about 1 second (0.986 s), not counting any overhead in FastLED.

You might be able to make things run faster with a more aggressive fade value or a smaller meteor size, but you're not going to get around the basic, underlying floor of ~1 second just to clock the data out for one strip.

It would be nice if the FastLED library offered something like .show( uint8_t pin ), so you could clock out only the strip of interest, but I don't think it does.
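For what it's worth, if my reading of the library is right, FastLED does expose per-controller output: each addLeds<>() call registers a CLEDController, FastLED[i] retrieves the i-th one, and showLeds() pushes only that controller's data. A hedged sketch of what that could look like (pin numbers and chipset are assumptions for illustration):

```cpp
#include <FastLED.h>

#define NUM_STRIPS 12
#define NUM_LEDS   37

CRGB leds[NUM_STRIPS][NUM_LEDS];

void setup() {
  // The data pin must be a compile-time template constant, hence one
  // explicit addLeds<>() per strip. Registration order defines the
  // index used by FastLED[i] below.
  FastLED.addLeds<WS2812B, 22, GRB>(leds[0], NUM_LEDS);
  FastLED.addLeds<WS2812B, 23, GRB>(leds[1], NUM_LEDS);
  // ... one addLeds<>() per remaining strip ...
}

// Clock out only strip i: 37 x 24 = 888 bits, roughly 1.1 ms,
// instead of the ~13.32 ms needed for all 444 LEDs.
void showStrip(uint8_t i) {
  FastLED[i].showLeds(FastLED.getBrightness());
}
```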