Capacitors don't have a specific "capacity" in the sense of a fixed amount of charge. They store charge in proportion to the voltage across them. The capacitance is that ratio: so many coulombs of charge per volt.

A theoretical "ideal" capacitor has no limit on the charge it can store or the voltage it can withstand.

Real capacitors have a voltage rating, which indicates the voltage at which the dielectric will break down or degrade. It is generally recommended that you choose a capacitor with a voltage rating substantially above the actual voltage you expect it to experience. I don't know where this "95% capacity" notion comes from exactly; I suppose it depends on the specific application.

Suppose you have a 100 microfarad capacitor rated at 300 volts. It will store 100 micro-coulombs of charge for each volt it is charged to. If you charge it to 200 volts, it will hold 20,000 micro-coulombs of charge.
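To make the arithmetic concrete, here is a quick sketch of the Q = C × V relationship in Python, using the numbers from the example above (the variable names are my own):

```python
# Charge stored in a capacitor: Q = C * V
C = 100e-6   # capacitance in farads (100 microfarads)
V = 200.0    # voltage across the capacitor in volts

Q = C * V    # charge in coulombs
print(Q)                  # 0.02 coulombs
print(Q * 1e6)            # 20000.0, i.e. 20,000 micro-coulombs
```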

You measure the amount of charge in a capacitor by measuring the voltage: since Q = C × V, the voltage tells you how much charge is stored.