I want some advice from the real electronics experts here.
I saw this interesting video by Martin Lorton about measuring capacitance:
I decided to reproduce his test results myself.
I used his test circuit:
Feeding in a 1 kHz signal (square wave) at 1 V peak-to-peak, we see this:
Zooming in on one cycle, we find the point where the voltage reaches 63.2% (that is, 632 mV).
We are choosing 63.2% because this is:
1 - e^-1 = 0.63212
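A quick sanity check of that number in Python (the charging equation in the comment is just the standard RC step response, nothing specific to the video):

```python
import math

# Standard RC step response: the capacitor charges from 0 V toward V_in as
#   v(t) = V_in * (1 - exp(-t / (R * C)))
# At t = R * C (one time constant) the exponential term is exactly e^-1, so
# the capacitor has reached this fraction of the input voltage:
print(1 - math.exp(-1))   # 0.6321205588285577, i.e. ~63.2% (632 mV of a 1 V step)
```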
Switching to the "time" cursor, we measure how long it takes to charge to this point:
From the Wikipedia article, we read:
The RC time constant, the time constant (in seconds) of an RC circuit, is equal to the product of the circuit resistance (in ohms) and the circuit capacitance (in farads), i.e. T = R * C.
Thus C is equal to T / R
I measured the time as 47 µs, so the capacitance was:
C = 0.000047 / 1000
In other words, 47 nF, which is in fact the capacitor I used. So far so good, the theory agrees with practice. And I've learnt a bit more about capacitors.
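Writing the same arithmetic out in Python (the values are just the 1 kΩ resistor from the test circuit and the 47 µs measured above):

```python
R = 1_000                 # ohms: the series resistor in the test circuit
t_to_63_percent = 47e-6   # seconds: measured time to reach 63.2% of the input

C = t_to_63_percent / R   # C = T / R
print(f"C = {C * 1e9:.0f} nF")   # -> C = 47 nF
```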
Now the interesting thing is what happens if we increase the frequency, say to 3 kHz. The original waveform (the 1 kHz one) is saved in the background in white.
Now it looks like the capacitor barely has time to discharge before recharging again.
And if we increase to 8 kHz:
Now the capacitor is neither fully charging nor fully discharging.
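Rather than just eyeballing the scope, here is a rough calculation of how far the capacitor can swing at each frequency. This is standard RC theory rather than anything from the video: for a 50% duty-cycle square wave in steady state, the end of each charging half-cycle has to line up with the start of the following discharging half-cycle, which gives the formula in the sketch below.

```python
import math

R = 1_000        # ohms: the series resistor in the test circuit
C = 47e-9        # farads: the capacitor under test
TAU = R * C      # time constant, 47 us

def swing_fraction(freq_hz):
    """Steady-state peak-to-peak swing across the capacitor, as a fraction
    of the square wave's peak-to-peak amplitude (50% duty cycle)."""
    half_period = 1.0 / (2.0 * freq_hz)
    a = math.exp(-half_period / TAU)
    return (1.0 - a) / (1.0 + a)

for f in (1_000, 3_000, 8_000):
    print(f"{f / 1000:.0f} kHz: capacitor swings over {swing_fraction(f) * 100:.1f}% of the input")
```

That prints roughly 100% at 1 kHz, 94% at 3 kHz and 58% at 8 kHz, which matches what the scope shows: at 1 kHz the capacitor gets all the way up and all the way down each cycle, while at 8 kHz it only manages a bit over half of the input swing.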
I presume the frequency at which this starts to happen has a name. Is it the "cut-off frequency"?
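In case it helps anyone answering: if the quantity I'm after is the corner frequency of the RC low-pass, the textbook formula is f_c = 1 / (2 * pi * R * C), which for these values comes out at roughly 3.4 kHz:

```python
import math

R = 1_000    # ohms
C = 47e-9    # farads

f_c = 1 / (2 * math.pi * R * C)   # textbook RC low-pass corner frequency
print(f"f_c = {f_c:.0f} Hz")      # -> f_c = 3386 Hz
```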