Do I need any resistance to control the current to the LED?
Yes, you should have something to control the current. As slipstick says, with high-power LEDs (1W or more) a special constant-current (controlled-current) power supply/driver is normally used. (That's not an easy thing to build yourself.)
With regular little LEDs we use a series resistor and drop roughly half the voltage across the resistor and half across the LED. We use Ohm's Law to calculate the required resistance from the (approximate) voltage across the resistor and the required current (which is the same through both series components). Then, with the current limited/controlled, the voltage across the LED "magically" falls into place.
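Here's a minimal sketch of that calculation, assuming hypothetical example values (a 5 V supply, an LED with a ~2 V forward drop per its datasheet, and a 20 mA target current):

```python
# Series-resistor calculation sketch. The supply, LED voltage, and current
# below are example values, not from any particular part.

V_SUPPLY = 5.0    # supply voltage (V)
V_LED    = 2.0    # approximate LED forward voltage (V), from the datasheet
I_LED    = 0.020  # desired LED current (A), i.e. 20 mA

# The resistor drops whatever voltage the LED doesn't, and the same
# current flows through both series components (Ohm's Law: R = V / I).
v_resistor = V_SUPPLY - V_LED
r = v_resistor / I_LED

print(f"Resistor drops {v_resistor:.1f} V")
print(f"Required resistance: {r:.0f} ohms")   # 150 ohms in this example
```

In practice you'd round up to the next standard resistor value, which slightly reduces the current rather than exceeding it.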
A series resistor is inefficient: with roughly the same voltage and the same current as the LED, it has to dissipate about the same power as the LED. That's not a problem with a regular little LED, but with a high-power LED the same method requires a high-power resistor and wastes power, so it's not often done. (You don't actually have to drop equal voltages across both components, but it "works better" with more voltage across the resistor.)
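To see the wasted power, continuing the same assumed example numbers (3 V across the resistor, 2 V across the LED, 20 mA through both), P = V × I for each series component:

```python
# Power dissipated in each series component, using the example values above.
I = 0.020               # series current (A)
p_resistor = 3.0 * I    # 0.06 W turned into heat in the resistor
p_led      = 2.0 * I    # 0.04 W in the LED

print(f"Resistor: {p_resistor * 1000:.0f} mW, LED: {p_led * 1000:.0f} mW")
# At 20 mA this waste is negligible. Scale the same ratio up to a 1 A
# high-power LED and the resistor would have to burn off several watts,
# which is why constant-current drivers are used instead.
```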
LEDs (like all diodes) are non-linear and they are "current operated". The current spec is "exact". The voltage rating is approximate and varies from part to part and with temperature. Their effective resistance changes inversely and drastically as the voltage changes. A slight over-voltage can result in over-current and a fried LED or power supply. A slight under-voltage will disproportionately dim the LED because the current falls off steeply, multiplying the effect of the lower voltage.
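To get a feel for how steep that non-linearity is, here's a rough sketch using the ideal diode equation with made-up parameters (not from any datasheet); the exact numbers don't matter, only how fast the current changes for small voltage changes:

```python
import math

# Illustrative only: exponential diode equation with assumed parameters,
# chosen so the "nominal" point lands in a plausible LED range.
I_S = 1e-18    # assumed saturation current (A)
N   = 2.0      # assumed ideality factor
V_T = 0.0259   # thermal voltage at ~25 C (V)

def led_current(v):
    """Ideal diode equation: I = Is * (exp(V / (N*Vt)) - 1)."""
    return I_S * (math.exp(v / (N * V_T)) - 1.0)

for v in (1.9, 2.0, 2.1):
    print(f"{v:.1f} V -> {led_current(v) * 1000:.0f} mA")
# Roughly: 1.9 V -> ~9 mA, 2.0 V -> ~60 mA, 2.1 V -> ~400 mA.
# A 0.1 V change moves the current by close to an order of magnitude,
# which is why you regulate the current, not the voltage.
```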
LEDs are the opposite of how almost everything else works... With most "things" you apply a constant voltage and the current "falls into place". With LEDs you apply constant current and the voltage "falls into place". Regular diodes are similarly non-linear but the (forward) voltage drop is a fraction of a volt and there is normally something else controlling/limiting the current.