How can I read resistance values to micro-ohm accuracy?

I know it's been a few months since this topic was posted, but I have a question right along these same lines, and I'm hoping you all can help. I'm not sure if I should start a new topic or post here...

Here's my issue: I need to heat up a wire of a very specific length to a very specific temperature, and I'm doing that using resistive heating with a low-voltage, high-current power supply. Currently I'm using an infrared imaging source to monitor the temperature, then turn off the power supply when the proper temperature is reached. It works, but it's a little crude and I'd like a little more fine control. I figure if I can measure the resistance, I can feed that into a calibrated lookup table on the Arduino, which will determine what my current needs to be to achieve that specific temperature, then adjust the power supply accordingly.
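
To show what I mean by the lookup table, here's roughly the interpolation step I'm picturing (the table values are placeholders; the real numbers would come from my calibration runs):

```cpp
// Rough idea of the lookup step (calibration values are placeholders).
// Each entry maps a measured resistance (ohms) to the supply current (amps)
// that should hold the wire at the target temperature.
struct CalPoint { float ohms; float amps; };

const CalPoint calTable[] = {
  {0.006f, 95.0f},
  {0.010f, 80.0f},
  {0.020f, 60.0f},
  {0.040f, 40.0f},
  {0.060f, 25.0f},
};
const int calCount = sizeof(calTable) / sizeof(calTable[0]);

// Linear interpolation between the two nearest calibration points.
float currentForResistance(float r) {
  if (r <= calTable[0].ohms) return calTable[0].amps;
  for (int i = 1; i < calCount; i++) {
    if (r <= calTable[i].ohms) {
      float f = (r - calTable[i - 1].ohms) / (calTable[i].ohms - calTable[i - 1].ohms);
      return calTable[i - 1].amps + f * (calTable[i].amps - calTable[i - 1].amps);
    }
  }
  return calTable[calCount - 1].amps;
}
```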

My resistivity calculations say that the resistance of my specimen will be as low as 0.006 ohms and as high as 0.06 ohms. But, from what I understand, as the wire is heated the resistance will change, so I'd like to measure the resistance during the actual heating process.

I built the circuit mentioned in ptingzon's original link as a test (and to gain understanding). I used 1.1 V as my reference to improve the resolution, a constant current of 0.024 A out of the LM317, and the four-wire Kelvin method to measure a resistor... it works great for these relatively high resistance measurements, but as you all surmised, it's just not sensitive enough to meet my needs.
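
For reference, this is more or less the test sketch I used (0.024 A is the LM317's constant current; with the 1.1 V reference one ADC count is roughly 1 mV, which is why it runs out of resolution at milliohm levels):

```cpp
const float SENSE_CURRENT = 0.024;  // constant current from the LM317 (A)
const float VREF          = 1.1;    // internal reference (V), worth calibrating

void setup() {
  Serial.begin(9600);
  analogReference(INTERNAL);        // 1.1 V reference on a Uno/ATmega328
}

void loop() {
  // Four-wire measurement: A0 reads the voltage at the specimen's sense
  // leads, so lead and contact resistance drop out of the result.
  int raw = analogRead(A0);
  float volts = raw * VREF / 1023.0;
  float ohms  = volts / SENSE_CURRENT;

  Serial.print("R = ");
  Serial.print(ohms, 4);
  Serial.println(" ohm");
  delay(500);
}
```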

Now to the crux of my question... If I replace the LM317's 0.024 A with a constant current of 100 A, how can I input the voltage measurement into the Arduino? Do I just measure across the load and not worry about the high amperage? Should I tie the grounds together (power supply and Arduino), or should I use the Arduino like a floating ground? I'm confused because the Arduino isn't "pulling the amperage"; the power supply is a constant source. So would the Arduino only pull what's necessary... or does the whole 100 A get "pushed" into the Arduino?

I would appreciate any help you all could offer.

I've split the topic because this seems a bit different (you are heating the wire).

or does the whole 100 A get "pushed" into the Arduino?

Current doesn't get pushed into the Arduino any more than the power station pushes its gigawatt output into your light globe, making it explode with a brilliant incandescent display.

However I think your notion of measuring the temperature by trying to deduce what the current is doing is flawed. It's like trying to work out how fast your car is going by measuring the fuel rate into the engine.

You must be able to get a suitable sensor that directly measures the temperature of the wire.

If your current source is a constant 100A DC, you can measure the voltage over the specimen directly with the Arduino.
There should be 6 volt max across a 0.06 ohm specimen (100 A * 0.06 ohm).
You will even have to use a voltage divider to drop that to <5 volt.
Make sure there is always a ~10k resistor between the source and the analogue input, to protect the pin.
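
Something like this for the readout (the 2:1 divider and the numbers are only an example; measure the actual resistors you use):

```cpp
// Sketch of the readout with a 2:1 divider (example values, not a final design).
// Specimen+ -> [10k] -> A0 -> [10k] -> GND, so 6 V max at the specimen
// becomes 3 V max at the pin, and the top 10k also protects the input.
const float DIVIDER_RATIO  = 2.0;    // (Rtop + Rbottom) / Rbottom
const float SUPPLY_CURRENT = 100.0;  // constant current through the wire (A)

void setup() {
  Serial.begin(9600);
}

void loop() {
  float pinVolts  = analogRead(A0) * 5.0 / 1023.0;  // default 5 V reference
  float wireVolts = pinVolts * DIVIDER_RATIO;
  float wireOhms  = wireVolts / SUPPLY_CURRENT;

  Serial.print(wireVolts, 3);
  Serial.print(" V  ");
  Serial.print(wireOhms * 1000.0, 2);
  Serial.println(" milliohm");
  delay(200);
}
```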
Leo..

If your current source is a constant 100A DC, you can measure the voltage over the specimen directly with the Arduino.

That is until the wire breaks due to the heat and the full voltage blasts out your Arduino's analogue input.

Good point.
A 10k resistor should protect the input up to 15 volt (1 mA pin current).
I don't expect a 100 A supply will have a higher output voltage than that.
But it may be wise to also add a Schottky diode between the input pin and +5 volt.
Leo..

@Nick Gammon: Thank you for that analogy, I understand it perfectly now! After I read your reply, it completely clicked. Regarding direct measurement: attaching a contact probe is difficult in my application for a few reasons, but the biggest problem is that I'm using an automated process where the lengths of the wires change quickly. Trying to re-establish physical contact after each change would add layers of complexity. This is the main reason I am using the infrared camera.

@Wawa and Grumpy_Mike: If I limit the power supply to a max of 5 V output, that would solve both issues... right?

I would still use a 10k resistor between the source and the analogue input.
Overvoltage, and therefore overcurrent through the internal pin protection diodes, is damaging.
The Arduino measures with a high input resistance (~100 megohm), so contact resistance is not likely an issue (carbon contacts?).

You also need accurate current measurements to be able to calculate the temp.
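
Once you have volts and amps (so ohms), the temperature follows from the wire's temperature coefficient. A rough linear sketch, with alpha set to roughly copper as a placeholder:

```cpp
// Linear R-vs-T approximation: R = R0 * (1 + alpha * (T - T0)).
// alpha below is roughly right for copper; use the value for your wire alloy.
const float ALPHA = 0.0039;  // per degree C (copper, approximate)
const float T0    = 20.0;    // temperature (C) at which R0 was measured

// r0 = cold resistance measured at T0, r = resistance measured while hot.
float wireTemperature(float r, float r0) {
  return T0 + (r / r0 - 1.0) / ALPHA;
}
```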
Leo..

just thinking out loud, so pardon my ramblings.....

You have a power supply that you can control very accurately.
You have a wire whose temperature you can monitor very accurately.

If you have one fixed length of wire and one unknown length,
and you put x amps of current through each,

how would the known length react compared to the unknown length?
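
In code terms the comparison would just be a ratio; a minimal sketch, with placeholder numbers standing in for the measured reference wire:

```cpp
// Sketch of the comparison idea: for the same alloy and gauge, resistance
// scales with length, so the ratio of resistances is the ratio of lengths.
const float R_KNOWN   = 0.012;  // placeholder: reference wire resistance (ohm)
const float LEN_KNOWN = 100.0;  // placeholder: reference wire length (mm)

// Estimate the unknown wire's length from its measured resistance.
float estimateLength(float rUnknown) {
  return LEN_KNOWN * (rUnknown / R_KNOWN);
}
```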

Hi,
I agree with Nick, you cannot deduce the temperature of the wire by calculating the amount of energy you are feeding it.
Ambient temp, humidity and air movement will all contribute, and with resistances that low, the contact resistance of the terminals comes into play.

You need to use temperature as your feedback, so a non-contact sensor; you might have to look for an IR thermometer with some PC comms.
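
A rough sketch of that feedback loop, assuming an MLX90614 non-contact IR sensor and the Adafruit library (the target temperature and enable pin are placeholders):

```cpp
#include <Wire.h>
#include <Adafruit_MLX90614.h>   // assumes an MLX90614 IR sensor on I2C

Adafruit_MLX90614 mlx = Adafruit_MLX90614();

const float TARGET_C      = 180.0;  // placeholder target temperature
const int   SUPPLY_ENABLE = 7;      // placeholder pin gating the supply

void setup() {
  pinMode(SUPPLY_ENABLE, OUTPUT);
  mlx.begin();
}

void loop() {
  // Simple on/off control on the measured (output) temperature.
  float t = mlx.readObjectTempC();
  digitalWrite(SUPPLY_ENABLE, t < TARGET_C ? HIGH : LOW);
  delay(100);
}
```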

Tom.... :slight_smile:

Hi, again.
I take no responsibility for this item.

But it may be a way to get your wire surface temperature.

Unfortunately it's an Instructables link, but it is instructive.

Tom..... :slight_smile:

We had another lengthy thread along similar lines recently. Someone wanted to control output by monitoring the input. Basically it is an idea that won't fly.

Look, it's like trying to monitor how happy your cat is by measuring how much food you feed it.

You have to monitor the output, not the input.

If the cat jumps on your lap and purrs, it's happy, regardless of the food input.

Gentlemen, I will politely disagree.

I have yet to see a temperature sensor that measures temperature.
We see a change in one thing that was caused by a change in temperature, such as the expansion of a metal or the current through a resistor. Can anyone honestly say that an infrared temperature sensor is not measuring electrons, which represent the speed of other electrons, which in turn infer temperature?
In every case, some artificial device is used to represent some change, and we measure that.
That, my friends, is the world we live in.

Mercury expands based on temperature; we measure the expansion with a ruler. No temperature is used.
A bi-metal strip has two bonded metals that expand at different rates; we measure the movement.
An IR sensor measures electrons in the device, and that device 'sees' the speed of electrons at a distance; we infer the speed of the remote electrons by counting local electrons. No temperature is used.

A Wheatstone bridge is based on having known devices and comparing them to the unknown.

The OP said a specific length of wire heated to a specific temperature.
He also said it changes rapidly.

Sounds like a production line making various lengths of wire and then testing individual ones.

Also, using my crystal ball... the OP wants to get close, so the final test can be done faster.
If I am reading the OP correctly, and the wire has 'x' ohms, he can set the power supply to 'y*1.784' so that the wire will be pretty close to the value and testing will be more rapid.

That said, one length, tested, would give a close approximation. Using the results of that test will help the final test.

With that, the final powering/heating/testing could be much faster.
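
A sketch of that pre-set step; alpha, the target temperature and the power figure are placeholders that a calibration run would replace:

```cpp
// Sketch of the pre-set idea: from the cold resistance and the target
// temperature, estimate the hot resistance and a starting supply current,
// which the final measurement/feedback step can then trim.
const float ALPHA     = 0.0039;  // temp coefficient per degree C (alloy-specific, placeholder)
const float T_AMBIENT = 20.0;    // C
const float T_TARGET  = 180.0;   // placeholder target temperature (C)
const float P_TARGET  = 30.0;    // placeholder: power that holds the wire at T_TARGET (W)

float presetCurrent(float rCold) {
  float rHot = rCold * (1.0 + ALPHA * (T_TARGET - T_AMBIENT));
  return sqrt(P_TARGET / rHot);  // I = sqrt(P / R)
}
```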

Hi,
IR method...

support.fluke.com/raytek-sales/Download/Asset/IR_THEORY_55514_ENG_REVB_LR.PDF

Tom.... :slight_smile:

I need to heat up a wire of a very specific length to a very specific temperature,

...

but the biggest problem is that I'm using an automated process where the lengths of the wires change quickly.

When you said "very specific length" I read that as "a certain specific length", not "the lengths change quickly". Next thing you'll be saying is that the temperatures change quickly too, eh?

What are the temperature range and dimensions of the wire?

Measuring temperature by determining the resistance of the wire is fully possible. This is what RTDs do. The problem here is the VERY low resistance of the wire. I have to think this through, but it should be possible to control the temperature by setting the feeding impedance to a specific value (maybe negative).

Metal wire RTDs such as platinum RTDs have hundreds of ohms at high temperature, not a fraction of a milliohm. You got your order of magnitude a bit off.

http://www.omega.com/temperature/pdf/rtdspecs_ref.pdf

You usually apply a small fixed current and watch the voltage.
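
In sketch form (values illustrative; a real PT100 front end would normally go through an amplifier or a dedicated converter rather than straight into the ADC):

```cpp
// RTD-style conversion: drive a small fixed current through the sensor,
// read the voltage, and convert R to temperature with the linear PT100
// approximation R = R0 * (1 + 0.00385 * T).
const float SENSE_CURRENT = 0.001;  // 1 mA excitation keeps self-heating low
const float R0 = 100.0;             // PT100 resistance at 0 degrees C

float rtdTemperature(float measuredVolts) {
  float r = measuredVolts / SENSE_CURRENT;
  return (r / R0 - 1.0) / 0.00385;
}
```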