This makes a lot of sense: per Ohm's law, 5 V / 0.02 A works out to 250 ohms. I'll proceed with 250 ohm resistors as "current limiting resistors" for each pin.
I am, however, really curious why other resources (most of what I've found online) seem to recommend 100 ohms instead. Between 5 V and ground, a 100 ohm resistor would pass 0.05 A (50 mA), which exceeds a pin's absolute maximum of 40 mA (yes, the recommended current is 20 mA). Am I misunderstanding something, or are all of these resources incorrect at the same time?
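To make the arithmetic concrete, here is a quick sanity check I put together (my own sketch, not from any of those resources), applying I = V / R to both resistor values against the 40 mA absolute-maximum and 20 mA recommended per-pin limits mentioned above:

```c
#include <stdio.h>

/* Ohm's-law check: I = V / R for a 5 V pin driving a resistor
 * straight to ground. The 40 mA / 20 mA figures are the per-pin
 * absolute-maximum and recommended limits discussed above. */
int main(void) {
    const double v_pin = 5.0;                 /* pin voltage, volts */
    const double resistors[] = { 100.0, 250.0 };

    for (int i = 0; i < 2; i++) {
        double i_ma = v_pin / resistors[i] * 1000.0;  /* current, mA */
        const char *verdict = i_ma > 40.0 ? "exceeds 40 mA absolute max"
                            : i_ma > 20.0 ? "exceeds 20 mA recommended"
                            : "within limits";
        printf("R = %3.0f ohm -> I = %2.0f mA (%s)\n",
               resistors[i], i_ma, verdict);
    }
    return 0;
}
```

This prints 50 mA for the 100 ohm case (over the absolute maximum) and 20 mA for the 250 ohm case.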
Here are a few examples that say 100 ohm:
Well, don't get me started :) but in my opinion:
1) There are a lot of people pretending to know more than they actually do. Especially on the internet.
2) There are a lot of people copying each other without fully understanding.
3) It's a precautionary measure, so in many cases the protection never actually gets exercised.
4) The current spec for the chip has more caveats and conditions than most do, and it is tiresome to go through them all and understand the complete picture. So many people stop reading at the 40mA number.
5) In general (again, don't get me started on social analysis, though...) I see more of an attitude of "if it works, don't fix it", or "see, it works, what could possibly go wrong...", and other such lazy thinking than I am used to seeing in a professional environment (or among people who approach recreational pursuits in a really intellectual way). It isn't just here; it's everywhere.
Also realize that a resistor is not the only way to guarantee that the current is limited. Most circuits get on just fine without it. But it is up to the designer to limit it in some way.
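If you do go the resistor route, the designer's calculation is just Ohm's law rearranged. A minimal sketch (my own illustration; the helper name is hypothetical) solving for the smallest series resistance that respects a given current budget:

```c
#include <stdio.h>

/* Hypothetical helper (my own illustration, not from the datasheet):
 * rearranging Ohm's law, R = V / I gives the smallest series
 * resistance that keeps the pin current at or below a chosen budget. */
static double min_resistance_ohms(double v_pin, double i_budget_amps) {
    return v_pin / i_budget_amps;
}

int main(void) {
    /* A 5 V pin checked against the two limits discussed above. */
    printf("20 mA recommended  -> R >= %.0f ohm\n",
           min_resistance_ohms(5.0, 0.020));   /* 250 ohm */
    printf("40 mA absolute max -> R >= %.0f ohm\n",
           min_resistance_ohms(5.0, 0.040));   /* 125 ohm */
    return 0;
}
```

Note that even the 40 mA budget calls for at least 125 ohms, which is another way of seeing why a bare 100 ohm resistor between 5 V and ground is cutting it too close.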