This is not a high volume thing.
If you're hiring me to do commercial design, you need a lesson in business.
EDIT - Misclicked solution button.
Use the socket strips mentioned in reply #15. Then you don't have to stock all the different sizes. You have to be careful to align them right during soldering though...
Ehhhhhh. Maybe.
Solder them with the IC inserted and they will "self-align".
Heat cycles can cause the chips to lift up in their sockets.
Back in the 70's I had a very early Apple ][
(I actually paid extra to get it soldered vs. a kit).
It had sockets for many of the components and it was a total pain: about every few months the machine would get unreliable. It was a combination of oxidation between the chip pins and the socket connections, and the chips lifting up.
It was particularly true of the memory chips.
So every few months I had to lift up and re-seat many of the chips on the motherboard. Occasionally I even had to remove the memory chips and clean the legs.
For reliability, soldering is much better than sockets.
Sockets should only be used if there is an anticipated need for chip replacement.
And even then, only if de-soldering for replacement is not practical.
--- bill
That's assuming you are following ESD procedures.
There's an important difference between turned-pin sockets and the cheaper type:
Turned-pin sockets are more expensive to make, but offer a much higher contact pressure because the contact takes place on the four corners of the IC pin, whereas on the cheaper type the contact takes place across the whole surface of the pin. Whilst the latter might sound good, it isn't: it reduces the contact pressure sufficiently that the contact is no longer airtight, and corrosion can occur on the pin and contact surfaces.
The contact force on the turned pin type is concentrated at the corners, and is sufficient for the contact to be considered airtight. I've never known turned-pin sockets to give bad connections due to corrosion.
As @bperrybap says, thermal cycling can cause the IC to walk out of those cheaper sockets; I've never found that to happen with the turned pin sockets.
Sockets are brilliant during prototyping, but you won't find them in any production equipment unless the IC needs to be replaceable for some reason (e.g. the processor chip on a motherboard).
Sockets add cost and reduce reliability. OK for hobbyists, but not for serious applications.
I have never had this happen, even with a radio repeater control system that ran in an unheated barn for 20 years. If this is a problem then it is probably due to over-straightening of the IC pins before insertion. The pins should be slightly bent outwards to provide more grip.
Seat one side just in, and use a ruler to spring the other side parallel as you insert it.
I tend to use a pair of snipe nose pliers held upside down to spring the other side into place. I slide the pliers down the length of the IC.
I dislike the more expensive turned-pin sockets: while they make a better contact the first time, they are not good for multiple insertions because of the distortion they cause at the corners of the pins.
That would presume you wanted to put the same IC back into a socket, as opposed to replacing a dead chip. The only plausible case for that would seem to be removing an MCU to re-program it, where I emphasise the "re". But Arduinos use ICSP or bootloaders, so that should not be necessary. And programmers should always use ZIF sockets.
Further to my comment in #30, have you ever used those IC inserter/remover tools which hold the legs parallel? (I don't think I ever had one. I have a basic remover clip, somewhere.)
No, not the only case. I occasionally do this in the debug phase as well.
I always use ZIF sockets for reprogramming a chip, but a ZIF socket is often too bulky to have on the target device; this is, of course, when an ISP solution is not possible.
However, sometimes it is necessary to remove a chip while debugging a project. For example, suppose there is no signal on the input of a chip when there should be one. This could be caused by a fault with the device generating the signal or with the chip receiving it; one possible cause could be a fault where, say, the receiving chip's input is a short circuit. Removing that chip will show which one is at fault, the sender or the receiver. Depending on the result you could end up reinserting the same chip back in the socket.
There are other such debugging situations, like when you want to remove a chip to break the output chain of a circuit, or even to isolate one pin of a device by removing the chip, bending the pin out at right angles and reinserting it. Then, after investigation, you remove the chip, bend the pin back and reinsert it. You can't use that trick too many times, as the pin will break after being bent like this a few times.
Then there's the case where you have run out of a particular chip and want to borrow one from an old project. I have ended up putting the original chip back into the old project when it was needed for, say, an exhibition and there was a supply shortage of the original chip.
I did try one I seem to recall, about 30 years ago or so, but it didn’t seem worth the money so I never bought one.
Yes, that's the case for me, too. I use sockets routinely while developing and debugging a circuit.