10 Watt high power LEDs.

The resistor on the base does! (Ugh)

Have you calculated power dissipation in the transistor?

Without seeing a schematic I'm not exactly following what you are doing, but linearly controlling or current-limiting a 10 W or 3 W LED with a 2N2222 "feels like" you are going to fry the transistor. With a 12 V supply, the current-limiting resistor (or transistor) dissipates more power than the LED itself.
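To put rough numbers on that, here's a quick back-of-the-envelope calculation. I'm assuming a 3 W LED with a typical forward voltage of about 3.4 V at about 0.7 A on a 12 V supply; your actual LED specs will differ, so plug in your own numbers:

```python
# Rough power budget for linearly driving a 3 W LED from 12 V.
# All part values are assumed typical figures, not from your circuit.
v_supply = 12.0
v_led = 3.4      # assumed forward voltage of a 3 W white LED
i_led = 0.7      # assumed forward current

p_led = v_led * i_led                 # power actually lighting the LED
p_drop = (v_supply - v_led) * i_led   # power burned in resistor + transistor

print(f"LED: {p_led:.1f} W, resistor/transistor: {p_drop:.1f} W")
# LED: 2.4 W, resistor/transistor: 6.0 W
```

So the resistor/transistor combination has to burn off around 6 W, while a 2N2222 is only rated for roughly half a watt. Even if most of that lands in the resistor, whatever share the transistor takes can easily exceed its rating.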

You can only (safely) pass the maximum-rated current through a transistor when it's in saturation (with nearly zero voltage across it). So in a switching application (such as PWM) you CAN use the maximum current & voltage ratings. In linear applications, you have to stay within the power rating (current × voltage).
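Here's the switching-vs-linear difference in numbers. Both the current and the voltages below are illustrative assumptions (Vce(sat) for a 2N2222 is a few tenths of a volt at moderate current, per the datasheet), not measurements from your circuit:

```python
# Same collector current, two operating modes (example values).
i = 0.5          # collector current in amps (assumed)
v_ce_sat = 0.4   # rough saturation voltage at this current (assumed)
v_ce_lin = 6.0   # example voltage dropped when regulating linearly

p_switching = i * v_ce_sat   # transistor fully on: tiny dissipation
p_linear = i * v_ce_lin      # transistor as a variable resistor: big dissipation

print(f"switching: {p_switching:.1f} W, linear: {p_linear:.1f} W")
# switching: 0.2 W, linear: 3.0 W
```

Same current, 15× the heat in the transistor, which is why PWM is the usual answer for LEDs this size.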

Also, the gain of a transistor tends to increase with temperature. Without a feedback-control circuit you'll get more current through the transistor & LED as the temperature rises. Depending on where you are on the voltage/current curve, this can result in thermal runaway: more current = more heat = even more current = even more heat, until the transistor dies. (In other cases you'll get more current but less voltage and everything will stabilize, or something else will limit the current, or something else will die first.)
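That feedback loop can be sketched as a toy simulation. Every number here is an illustrative assumption (current rising ~1%/°C at fixed base drive, a 175 °C junction limit, and the two thermal resistances), so treat it as a cartoon of the mechanism, not a model of your circuit:

```python
# Toy model of the thermal feedback loop: hotter junction -> more current,
# more current -> more dissipation -> even hotter junction.
def settle(theta, i0=0.1, v_ce=6.0, tc=0.01, steps=60):
    """theta: junction-to-ambient thermal resistance in degC/W (assumed)."""
    t_rise = 0.0
    for _ in range(steps):
        i = i0 * (1 + tc * t_rise)   # assumed ~1 %/degC current increase
        t_rise = theta * i * v_ce    # dissipation heats the junction
        if t_rise > 175:             # past a typical max junction temperature
            return None              # thermal runaway
    return round(i, 3)

print(settle(theta=50))    # good heatsinking: loop gain < 1, settles -> 0.143
print(settle(theta=300))   # bare transistor: loop gain > 1 -> None (runaway)
```

Whether it stabilizes or runs away comes down to the loop gain: if each degree of temperature rise causes enough extra current to raise the temperature by more than another degree, it diverges. Heatsinking (lower thermal resistance) or an emitter resistor (current feedback) pushes the loop gain below 1.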