I am clearly not getting this.
I'm trying to calibrate this sensor. I don't know why, as I don't actually need it... but I hate being beaten.
After digging into all the parts, I've worked out that this is a PIR sensor controlled by a PIC.
There is an XTR117 4-20mA generator that is connected to both the PIC and an LM35-LP temp sensor.
From what I can deduce, the temp sensor's output varies the XTR117's 4-20mA output with temperature. Fine. That part actually seems to work.
When the PIR triggers, the PIC appears to pull this input high, which I presume holds the XTR117 output at 20mA.
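For anyone checking my reasoning: the XTR117 datasheet gives its transfer function as I_OUT = 100 × I_IN, so holding the loop at 20mA just means forcing about 200µA into the input pin. A quick back-of-envelope (nothing here is measured from the board, it's just the datasheet gain):

```python
# XTR117 basics, per the TI datasheet: I_OUT = 100 * I_IN.
# So pinning the loop at 20 mA only requires ~200 uA into the IIN pin.

GAIN = 100  # XTR117 fixed current gain

for i_out_mA in (4.0, 20.0):
    i_in_uA = i_out_mA / GAIN * 1e3  # mA of loop current -> uA of input current
    print(f"{i_out_mA:4.1f} mA loop current  <-  {i_in_uA:5.0f} uA into IIN")
```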
So, the PCB has a 3-pin connector (J3) that is the calibration point.
This consists of 3 pins:
A: GND
B: The output of the sensor
C: The input to the XTR117 IC.
RV2 is a multi-turn 500k pot on the input to the XTR117 IC.
In normal operation, B & C are shorted. With these terminals shorted, the mA reading on the 4-20mA output changes with temperature as expected (but uncalibrated).
SO.. on the back of the PCB is written:
With test jig connected to J3,
set RV2 to read 6mA with 100mV
injected. Test for 18mA with 300mV injected.
Place link between B & C when complete.
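Worth noting: those two calibration points sit on a straight line through zero, with a slope of (18 − 6)mA / (300 − 100)mV = 60mA per volt. Assuming the voltage-to-current mapping really is linear (which the XTR117 topology suggests), that gives an expected reading for any injected voltage, handy for sanity checks at intermediate points:

```python
# The two cal points (100 mV -> 6 mA, 300 mV -> 18 mA) imply a line
# through zero with slope 60 mA/V. Assumption: the mapping is linear.

def expected_mA(v_injected_mV, slope_mA_per_V=60.0):
    """Expected loop current for a given injected calibration voltage."""
    return slope_mA_per_V * v_injected_mV / 1e3

for mv in (100, 200, 300):
    print(f"{mv} mV injected -> {expected_mA(mv):.0f} mA expected")
# 100 mV -> 6 mA, 200 mV -> 12 mA, 300 mV -> 18 mA
```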
So, this is where I am clearly going wrong.
I made a simple voltage divider from a 5V supply that produces roughly 100mV and 300mV.
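For what it's worth, this is roughly the divider arrangement (the resistor values below are hypothetical, mine were just whatever I had to hand). It also works out the Thevenin output impedance at each tap, since if the injection point loads the divider the voltage will sag, and I can't rule that out yet:

```python
# Tapped divider from 5 V giving ~300 mV and ~100 mV.
# 5V -> R1 -> (300 mV tap) -> R2 -> (100 mV tap) -> R3 -> GND.
# Resistor values are hypothetical, not my actual build.

VSUPPLY = 5.0
R1, R2, R3 = 9_400.0, 400.0, 200.0

total = R1 + R2 + R3
v_300 = VSUPPLY * (R2 + R3) / total  # 0.300 V
v_100 = VSUPPLY * R3 / total         # 0.100 V

def parallel(a, b):
    """Resistance of two resistors in parallel."""
    return a * b / (a + b)

# Thevenin output impedance at each tap: what the load on J3 sees.
z_300 = parallel(R1, R2 + R3)  # ~564 ohms
z_100 = parallel(R3, R1 + R2)  # ~196 ohms

print(f"300 mV tap: {v_300*1e3:.0f} mV, Zout ~ {z_300:.0f} ohms")
print(f"100 mV tap: {v_100*1e3:.0f} mV, Zout ~ {z_100:.0f} ohms")
```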
I inject 100mV across pins A and C and get zip. The output of the 4-20mA IC doesn't change at all when I adjust RV2.
What am I doing wrong? Is it my janky 100mV/300mV injection circuit?
Think I may have to admit defeat here!