I'm testing a radio remote control with a voltmeter to see what happens when I push a button on the control, and I saw the voltage drop by 0.4 V when I push it.
When I try to do the same with an analog input on the Arduino, I can't see where the variation is, because the reading is always oscillating with no constant value.
What am I doing wrong? How can I measure voltage on the analog port the way a voltmeter does?
Hello Hackaro, and everyone else around here (this is my first post).
The way a remote control works is not usually based solely on a specific voltage. What it puts out when you push a button is a high-frequency encoded digital signal (this is true for both infrared and radio remotes).
As an example, let's say the remote sent out a repeating pattern of 11011, where 1 is 5 V and 0 is 0 V. A multimeter would read this as 4 V, because the 5 V is on 80% of the time and 0 V is on 20% of the time (four 1s out of five pulses, 4/5 = 80%), and 80% of 5 V is 4 V. This is also how Pulse Width Modulation (PWM) works. What your multimeter is reading is the average value of the signal.
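
If all you need is for the Arduino to show roughly the same average your multimeter shows, you can average a lot of analogRead() samples. This is only a sketch of the idea; pin A0, the sample count, and a signal that shares a ground with the Arduino and stays between 0 and 5 V are my assumptions:

// Rough sketch: read the signal many times and print the average in volts,
// which should sit near what the multimeter displays.
const int SIGNAL_PIN = A0;   // example pin, use whichever one you wire up
const int NUM_SAMPLES = 200; // more samples = smoother average

void setup() {
  Serial.begin(9600);
}

void loop() {
  long total = 0;
  for (int i = 0; i < NUM_SAMPLES; i++) {
    total += analogRead(SIGNAL_PIN);   // 0..1023 maps to 0..5 V
  }
  float averageVolts = (total / (float)NUM_SAMPLES) * (5.0 / 1023.0);
  Serial.println(averageVolts);
  delay(200);
}

The individual readings will still jump around; it's the average over many of them that settles down, which is basically what the multimeter is doing for you.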
So the real question is: Are you trying to duplicate the remote control or are you trying to use the Arduino for accurate AC signal analysis?
I have a micro remote-controlled car. It's very simple, just 4 push buttons to control 2 little motors. I disconnected the 4 wires of this remote control circuit, except for the battery wires.
Now what I want is to reuse that circuit for my own projects. What have I done so far? I just used the multimeter to check the difference between the voltage with no button pressed and the voltage with a button pressed, and I saw a small variation.
I want to use my Arduino as a bridge from the remote control to my project.
I didn't think of PWM, because what I measured is the voltage that controls the motor, after it has passed through the whole circuit.
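
What I had in mind on the Arduino side is roughly this (only a sketch: A0, the sample count, and the 0.2 V threshold are guesses on my part, and it assumes the circuit shares a ground with the Arduino and its output stays between 0 and 5 V):

// Record the "no button pressed" level at startup, then watch for the
// ~0.4 V drop that shows up when a button on the remote is pressed.
const int SENSE_PIN = A0;          // example pin
const int NUM_SAMPLES = 100;       // average many samples to smooth the oscillation
const float DROP_THRESHOLD = 0.2;  // volts below the baseline that counts as "pressed"

float baselineVolts = 0.0;

// Read the pin many times and return the average in volts
float readAverageVolts() {
  long total = 0;
  for (int i = 0; i < NUM_SAMPLES; i++) {
    total += analogRead(SENSE_PIN);
  }
  return (total / (float)NUM_SAMPLES) * (5.0 / 1023.0);
}

void setup() {
  Serial.begin(9600);
  baselineVolts = readAverageVolts();   // level with no button pressed
  Serial.print("Baseline: ");
  Serial.println(baselineVolts);
}

void loop() {
  float volts = readAverageVolts();
  if (baselineVolts - volts > DROP_THRESHOLD) {
    Serial.print("Button press detected, reading: ");
    Serial.println(volts);
  }
  delay(50);
}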