Voltage drop on alligator clips?!

I'm wondering if anyone has experienced this before. I'm currently losing 0.3 volts for every alligator clip I use (measured the drop across each clip with a multimeter). I've never had this happen before, so I'm wondering if it could be symptomatic of a bigger problem.

The setup is a board designed to run off 12V. I have 13.8V coming off a power supply and into a 12V regulator (which I've verified works properly). From the regulator's output I go to the board.

There are 4 alligator clips to complete the circuit, and I'm seeing a loss of about 1.5 volts. Doesn't seem to add up. Could it be that my alligator clips were not meant to handle 12V? They're somewhat thin. I haven't had these issues with a 5V supply.

It would depend on what current your circuit is pulling, the interface between the clips and the circuit, and the wiring to the clips. Personally, with my circuits pulling a few tens of milliamps, I don't notice any drop with croc clips, but no doubt there would be a drop if you were pulling a much higher current. Good ol' Ohm's law... :wink:
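
To put numbers on that, here's a minimal sketch of the Ohm's-law arithmetic; the ~0.5 ohm per-clip contact resistance and the test currents are assumed values for illustration, not measurements from this thread:

```python
# Quick Ohm's-law sanity check: V = I * R.
def clip_drop(current_a, resistance_ohm):
    """Voltage dropped across one clip connection."""
    return current_a * resistance_ohm

# Assumed value for illustration: ~0.5 ohm of contact resistance per clip.
for current_a in (0.02, 0.35, 0.70):   # 20 mA, 350 mA, 700 mA
    drop_mv = clip_drop(current_a, 0.5) * 1000
    print(f"{current_a * 1000:.0f} mA -> {drop_mv:.0f} mV per clip")
```

At 20 mA the drop is a negligible 10 mV, but at 700 mA the same clip loses 350 mV, which is why the problem only shows up at higher currents.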

Could it be that my alligator clips were not meant to handle 12V?

Nope. It's not the voltage you need to worry about: it's the current. There's a real resistance across the connection to the alligator clip, but it should be pretty small: well under an ohm, close to zero. But if the clips (or what you're clipping them to) are dirty or corroded, you could easily wind up with two or three ohms of contact resistance. If your project is drawing significant current (like, say, running a motor or a big solenoid), that resistance could cause a noticeable voltage drop.
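
Turning Ohm's law around shows what contact resistance would explain the 0.3 V per clip you're seeing; the candidate load currents below are assumptions, since the actual current hasn't been stated yet:

```python
# Work backwards from the observed 0.3 V drop per clip: R = V / I.
# Candidate currents are assumptions, not figures from the thread.
observed_drop_v = 0.3
for current_a in (0.1, 0.5, 1.0):
    r_ohm = observed_drop_v / current_a
    print(f"at {current_a:.1f} A, each clip contact would be ~{r_ohm:.1f} ohm")
```

Anything in the sub-ohm range is entirely plausible for a thin or slightly dirty clip, so a moderate load current would explain the observation without anything being broken.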

When you look at connectors designed for high reliability and/or high current capacity, you'll see that they tend to have multiple/large contact points that slide along each other as you plug them in. That helps scrape off dirt and corrosion, and decreases the chance that one dirty spot will preclude/degrade all contact. And now you know why :wink:

Ran

I have 13.8V coming off a power supply and into a 12V regulator ...

I would check the minimum input voltage required for the 12V regulator to operate correctly. The 1.8V difference (13.8 − 12) may not be sufficient to give you a stable 12V output (if you have a reverse-polarity diode on the input, the difference may be as low as 1.1V). If the difference is less than the minimum given in the regulator's datasheet, the output voltage may drop below 12V even for modest loads.
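
As a rough check, assuming a 78xx-style linear regulator with a typical ~2 V dropout (an assumption; the part isn't named in the thread, so the datasheet is the real authority):

```python
# Headroom check for the regulator. The 2.0 V dropout is typical of a
# 7812-class linear regulator (assumed part; check your datasheet), and
# the 0.7 V only applies if a reverse-polarity diode is actually fitted.
V_IN = 13.8
V_OUT = 12.0
DROPOUT_V = 2.0      # assumed typical dropout for a 7812-class regulator
DIODE_DROP_V = 0.7   # series protection diode, if present

headroom_v = V_IN - DIODE_DROP_V - V_OUT
print(f"headroom: {headroom_v:.1f} V, required: {DROPOUT_V:.1f} V")
print("enough for a stable 12 V output?", headroom_v >= DROPOUT_V)
```

With the diode in place the headroom is only 1.1 V against a 2 V requirement, so the regulator would fall out of regulation before the clips even enter the picture.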

I'm drawing about 700mA of current, and the resistance on the clips is about 1 ohm, so I guess it is somewhat realistic. Thanks for the advice. I may just have to adjust the voltage to account for the resistance, or use shorter wires with better contact.

The project being powered is an array of LEDs (2 boards of 7x12, 168 LEDs in total).

One board draws 350mA and I'm losing 300mV, so that seems to correspond with the ~1 ohm on the clips.
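
For the record, the arithmetic behind that check, using only the figures reported above:

```python
# Cross-check with the reported figures: R = V / I.
drop_v = 0.300       # 300 mV lost
current_a = 0.350    # one board drawing 350 mA
r_ohm = drop_v / current_a
print(f"implied clip resistance: {r_ohm:.2f} ohm")   # ~0.86 ohm, near the measured ~1 ohm
```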