Another way to control an RGB LED

I've written a program to control an RGB LED and everything worked. The problem comes when I use a union so that the color can be set as a single hexadecimal number instead of separate variables:

#define RED 11
#define GREEN 10
#define BLUE 9

union Color {
  unsigned int color;
  struct {
    unsigned char R;
    unsigned char G;
    unsigned char B;
    unsigned char A;
  };
};

void setup() {
  // put your setup code here, to run once:
  pinMode(RED, OUTPUT);
  pinMode(GREEN, OUTPUT);
  pinMode(BLUE, OUTPUT);
}

void loop() {
  // put your main code here, to run repeatedly:
  Color c1;
  c1.color = 0x00FFFF;
  color_RGB(c1);
}

void color_RGB(Color color) {
  analogWrite(RED, color.R);
  analogWrite(GREEN, color.G);
  analogWrite(BLUE, color.B);
}

But curiously, when I try to make the RGB LED show white, or any color that uses blue, the LED doesn't light up. I also get the following warning:

warning: large integer implicitly truncated to unsigned type [-Woverflow]

c1.color = 0xFFFFFF;

Does anyone know why? Thank you.

P.S.: the unsigned char A would be for alpha, but it makes no sense in this case; it's only there because I copied this code from another, non-Arduino program.

You are trying to union a struct that contains 4 bytes with an unsigned int that is only 2 bytes on an AVR-based board. Union members don't have to be the same size, but assigning to the 2-byte color member only writes the first two bytes of the union (R and G on a little-endian chip), so B is never set. And 0xFFFFFF doesn't fit in 16 bits, which is what the truncation warning is telling you.

Oh, I thought an int on Arduino was 4 bytes... Thanks.

Depends on which Arduino.

On the Arduino Uno (and other ATMega based boards) an int is a 16-bit (2-byte) value but on the Arduino Due, an int is a 32-bit (4-byte) value.

A long int (or, better for you, an unsigned long) is 32 bits.

Because of a very short-sighted decision by Kernighan and Ritchie, every C compiler is allowed to decide for itself what size an int should be. To deal with this, there is a header (stdint.h, already pulled into your sketch by default) that defines fixed-width types named uint8_t, uint16_t, and uint32_t.