ACS712 current sensor issue

See, I recently bought an ACS712 30 A current sensor. I got a program which I simulated in Proteus with a virtual Arduino, and it gave satisfactory results. But when I built the hardware, I used a 50 ohm resistor across a 12 V AC supply, which should be enough to drive about 0.24 A. First I measured with zero input and observed readings around 0.05 A (ideally it should be 0 A). Then, when I switched on the supply, it jumped to 0.112 A, and after 2 or 3 seconds went back to 0.05 A.

I thought it was due to the AC, so later I used a 7805 IC for a DC test. I connected the sensor in series with a generic LED (2.2 V) and turned on the supply. The LED turned on, but the current shown in the Arduino serial monitor was still around 0.05 A.

What do you expect? 50 mA is quite a high value for an LED. Why do you use a 30 A current sensor to measure currents in the mA range? The ACS712 has a resolution of only a few tens of mA when used with the Arduino ADCs.

Does that mean I should only use it for currents above 1 A?

It means that you can't display the current from a 30 A sensor with three decimal places.
Trying to print more resolution than you have will result in jumps in the readout.
Add some smoothing/averaging code, and restrict printing to one decimal place.
Leo..