I'm developing test equipment for measuring hydraulic pressure (0-400 bar) and linear distance. I've managed the software part and can get the Arduino to log to Excel.
Now I have to figure out the hardware part. I'm a bit new to this so please excuse my lack of knowledge.
The sensor gives out a 4-20 mA signal. How do I connect that to the 0-5 V analog input on the Arduino? I've been given the tip to put a 250 ohm resistor in the signal loop and measure the voltage drop across it.
Can someone please give me a wiring diagram, or tell me whether or not the diagram attached here is suitable?
Scale down your drawing and maybe one can actually see it. Also, most 4-20 mA transmitters (including yours, according to the second datasheet) use only two wires to the sensor, labelled + and -. You have to create a current loop consisting of a voltage source (10 V to 30 V DC; 24 V DC maximum above 110 °C) and a 250 ohm resistor. The wiring goes from the negative terminal of the voltage source to the resistor, from the resistor to the - terminal of the transmitter, and from the + terminal of the transmitter back to the positive terminal of the voltage source. For the Arduino, wire the negative (ground) of the voltage source to an Arduino ground pin, and the other end of the resistor (the junction with the transmitter's - terminal) to an analog input pin. The measurement voltage at the Arduino would then be 1-5 V DC, corresponding to 0-100% of the pressure measurement range.
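To put the software side of that in one place, here is a minimal Arduino sketch for reading the 1-5 V drop across the 250 ohm resistor and converting it to pressure. The analog pin (A0), the 5 V ADC reference and the 0-400 bar span are assumptions based on what has been said in this thread, so adjust them to your actual setup.

```cpp
// Minimal sketch: read the 1-5 V drop across a 250 ohm loop resistor on A0
// and convert it to pressure. Assumes a 0-400 bar transmitter, the default
// 5 V ADC reference, and the wiring described above (resistor between the
// supply negative and the transmitter's - terminal, sense point to A0).

const float ADC_REF_V   = 5.0;    // Arduino Uno default analog reference
const float SHUNT_OHMS  = 250.0;  // precision loop resistor
const float P_FULLSCALE = 400.0;  // bar at 20 mA

void setup() {
  Serial.begin(9600);
}

void loop() {
  int   counts  = analogRead(A0);                 // 0..1023
  float volts   = counts * ADC_REF_V / 1023.0;    // voltage across resistor
  float loop_mA = volts / SHUNT_OHMS * 1000.0;    // 4..20 mA expected
  // 4 mA (1 V) = 0 bar, 20 mA (5 V) = 400 bar
  float bar     = (loop_mA - 4.0) * P_FULLSCALE / 16.0;

  Serial.print(loop_mA);
  Serial.print(" mA  ");
  Serial.print(bar);
  Serial.println(" bar");
  delay(500);
}
```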
You have not said exactly which model of 3100 transducer you intend to use. You mention 4-20 mA, but I believe that is because you are unaware that the three-wire device is a voltage-output device, whereas the 4-20 mA device is a two-wire device. You talk about 4-20 mA (two wires), but your drawing shows three wires (voltage output).
Until you define exactly which you intend to use, we might be giving false guidance.
We will give you the correct answer to your question but only when you give sufficient information on which to base the answer.
Another thing about this: how accurate can I expect the readings to be? The pressure sensor has its own accuracy of 0.25%, but what about the Arduino and reading the voltage accurately? I have to calibrate the system against some reference equipment so the readings match that pressure. But how is the repeatability?
Bear in mind that the bottom 20% of the 0-5 V range is "lost", since the scale starts at 4 mA, which produces 1 V across the 250 ohm resistor.
Yes, it is a pain, but that's transducers for you.
Well, there was a valid reason for the 'live zero' 4-20 mA standard being designed that way. It allows a system to tell the difference between a valid zero reading and a broken current-loop wire, which would also result in a zero reading. The main advantage of current-loop instrumentation is that the sensor can be hundreds or even thousands of feet away from the measuring point with no loss of accuracy, unlike voltage sensors, where long cable runs can result in voltage drop causing measurement errors.
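To show what that live zero buys you in practice, here is a sketch of how the Arduino code could flag a broken loop instead of reporting it as 0 bar. The 3.6 mA fault threshold is an assumption (a common convention, not something from your transmitter's datasheet), and the pin and scaling match the earlier example.

```cpp
// Live-zero fault check: a loop current well below 4 mA means a broken
// wire or failed transmitter, not a genuine 0 bar reading.
// The 3.6 mA threshold is an assumed value; pick one to suit your sensor.

const float FAULT_THRESHOLD_mA = 3.6;

void setup() {
  Serial.begin(9600);
}

void loop() {
  float volts   = analogRead(A0) * 5.0 / 1023.0;   // 250 ohm shunt on A0
  float loop_mA = volts / 250.0 * 1000.0;

  if (loop_mA < FAULT_THRESHOLD_mA) {
    Serial.println("Loop fault: check wiring and supply");
  } else {
    float bar = (loop_mA - 4.0) * 400.0 / 16.0;    // 4-20 mA -> 0-400 bar
    Serial.print(bar);
    Serial.println(" bar");
  }
  delay(500);
}
```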
Yeah, I know that, but I was just advising about the elevated zero in case it was not immediately appreciated by Frode. I'm old enough to go back to the good old days of 3-15 psi, then 10-50 mA, and finally 4-20 mA. I couldn't be bothered with today's virtual signals all running down the same wire and dependent upon multiple single points of failure.
I'm old enough too.
I worked for 27 years in an oil refinery (Chevron) until retirement, and we went through all that stuff. When I left, they had so far resisted going to fieldbus and other digital-only instrumentation loops, but I'm sure they have started by now, as the instrument and control system vendors had been investing millions and constantly badgering us to upgrade to their latest stuff.
I too had big reservations about the possible failure modes, where a given plant can lose a million dollars a day in lost profits if shut down because of instrumentation/control system failures. It could be a very stressful job at times, but the pay and benefits were first class.
The accuracy of your primary element (the sensor) is specified by the manufacturer in their datasheet as 0.25% of full scale. The term "full scale" is important: if the range is 0-400 bar, then the accuracy is 0.25% of that, which is ±1 bar. If your measurement is only 10 bar on a 400 bar unit, the true value could be anywhere between 9 bar and 11 bar, which is actually 10% of the measured value. While this doesn't sound very good, think of it as an analogue gauge ranged 0-400 bar: how good do you think your eye and judgement would be when the gauge is reading only 10 bar?
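To make that concrete, here are a few lines you could run once in setup() to see how the fixed ±1 bar band looks as a percentage of the reading at different pressures. The 0.25% and 400 bar figures are the ones from this thread; the example readings are just illustrative.

```cpp
// Percent-of-full-scale error expressed as percent of the actual reading.
// 0.25% of 400 bar = a fixed +/-1 bar band at any point on the scale.

const float FULL_SCALE_BAR = 400.0;
const float ACCURACY_FS    = 0.25;                                  // % of full scale
const float BAND_BAR       = FULL_SCALE_BAR * ACCURACY_FS / 100.0;  // = 1 bar

void setup() {
  Serial.begin(9600);
  float readings[] = {10.0, 50.0, 200.0, 400.0};   // example pressures, bar
  for (int i = 0; i < 4; i++) {
    Serial.print(readings[i]);
    Serial.print(" bar: +/-");
    Serial.print(BAND_BAR);
    Serial.print(" bar = ");
    Serial.print(100.0 * BAND_BAR / readings[i]);
    Serial.println(" % of reading");
  }
}

void loop() {}
```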
Where most folk fall down is the 250 ohm resistor. You need to get a precision unit, typically 0.1% tolerance or better, which you will not find in your local junk shop. If you have any friends in industry, they might be able to "release" one for you.
The Arduino ADC is, I think, a 10-bit device, which gives a resolution of 1 part in 1024, or roughly 0.1%.
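For what that resolution means in engineering units: the 10-bit ADC gives 1024 counts over 0-5 V, but because the live-zero signal only spans 1-5 V, only about four fifths of those counts cover the 0-400 bar range. A rough calculation, using the same assumed figures as above:

```cpp
// Rough resolution calculation for a 10-bit ADC reading a 1-5 V,
// 0-400 bar signal. Figures assumed from the discussion above.

void setup() {
  Serial.begin(9600);

  const float ADC_COUNTS   = 1024.0;                        // 10-bit converter
  const float VOLTS_PER_CT = 5.0 / ADC_COUNTS;
  const float SIGNAL_SPAN  = 4.0;                           // 1-5 V carries 0-400 bar
  const float COUNTS_USED  = SIGNAL_SPAN / VOLTS_PER_CT;    // ~819 counts
  const float BAR_PER_CT   = 400.0 / COUNTS_USED;           // ~0.49 bar per count

  Serial.print("Counts spanning 0-400 bar: ");
  Serial.println(COUNTS_USED);
  Serial.print("Resolution: ");
  Serial.print(BAR_PER_CT);
  Serial.println(" bar per count");
}

void loop() {}
```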
All of these errors need to be considered as cumulative. They may well cancel, but maybe not. So at best your system accuracy will be somewhere in the order of 0.5%. Repeatability in the short term should be somewhere around 0.3%, but the sensor has poorish thermal stability, so if the ambient changes from, say, 0 °C to 30 °C you might, at best, expect system accuracy to be no better than 1%.
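For anyone who wants to see where a figure like 0.5% could come from, here is one way of stacking the individual contributions: either a worst-case straight sum, or root-sum-square, which is how independent errors are often combined. The individual percentages are the ones quoted in this thread; treat them as ballpark figures, not gospel.

```cpp
// Two common ways of combining the error contributions mentioned above:
// straight (worst-case) sum and root-sum-square for independent errors.
// The individual figures are the ballpark values quoted in this thread.

#include <math.h>

void setup() {
  Serial.begin(9600);

  const float SENSOR_PCT   = 0.25;   // transmitter accuracy, % of full scale
  const float RESISTOR_PCT = 0.10;   // 0.1% precision loop resistor
  const float ADC_PCT      = 0.10;   // ~1 part in 1000 ADC resolution

  float worstCase = SENSOR_PCT + RESISTOR_PCT + ADC_PCT;
  float rss = sqrt(SENSOR_PCT * SENSOR_PCT +
                   RESISTOR_PCT * RESISTOR_PCT +
                   ADC_PCT * ADC_PCT);

  Serial.print("Worst case: ");
  Serial.print(worstCase);
  Serial.println(" % of full scale");
  Serial.print("Root-sum-square: ");
  Serial.print(rss);
  Serial.println(" % of full scale");
}

void loop() {}
```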
I'm sure the mathematicians will challenge these numbers, but I'm equally sure you will appreciate that "exactness" is an impossible target to achieve. Life is a compromise whereby "near enough" has to be "good enough" (unless, of course, you have unlimited funds).