The importance of oscilloscope calibration. Or not?

I understand that precision instruments need calibration: high-digit voltmeters, frequency counters, etc. What I don't understand is whether oscilloscopes need it. I see calibration dates and "calibrate by" dates on oscilloscopes, but oscilloscopes seem to me to be basically non-precision, "big picture" diagnostic equipment, and I don't understand why calibration is needed unless the drift is massive.

Basically there are two axes: time and voltage. Time is based on a crystal and is unlikely to ever drift more than a few hundred ppm. An oscilloscope is not considered to be a frequency counter, and a drift of that amount is just going to move a line in a waveform left or right, probably by less than an actual pixel on the screen for most measurements. And since every measurement gets shifted if the timebase is off, who cares? All the relative measurements are the same as if measured with a perfect timebase.

Same thing for voltage. The voltage being off a few percent doesn't really matter, does it? Is anyone doing a high-precision voltage measurement with an oscilloscope? Most scopes can't even resolve a true 8 bits (256 positions) on the voltage axis anyway. What matters is the overall change of the waveform over time, and you get that from the overall shapes of the waveforms. Or am I missing something here?
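
To put the "few hundred ppm" timebase argument in numbers, here's a back-of-envelope sketch. The figures (a 100 ppm crystal error, 10 divisions at 1 ms/div, 1000 horizontal pixels) are hypothetical, just to show the scale of the effect:

```python
# Back-of-envelope check of the "few hundred ppm" timebase argument.
# All numbers are illustrative assumptions, not specs of any real scope.

ppm_error = 100
screen_time_s = 10 * 1e-3          # 10 divisions at 1 ms/div
pixels_across = 1000               # horizontal resolution of the display

# Absolute time error accumulated across the full screen width.
time_error_s = screen_time_s * ppm_error / 1e6

# How far a feature at the right edge of the screen would shift.
pixel_shift = pixels_across * ppm_error / 1e6

print(f"time error across screen: {time_error_s * 1e6:.1f} us")
print(f"worst-case pixel shift:   {pixel_shift:.2f} px")
```

With these numbers the worst-case shift is about a tenth of a pixel, which is the point being made: at a few hundred ppm, timebase error is invisible on screen.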

If you work in a company with an ISO9000 label, calibration is mandatory, whether needed or not. If you use it for hobby work, don't bother. If somewhere in between, do something in between...

nilton61:
If you work in a company with an ISO9000 label, calibration is mandatory, whether needed or not. If you use it for hobby work, don't bother. If somewhere in between, do something in between...

I will accept your corporate-standards explanation without further questioning, because all it requires is that I accept the existence of blind corporate stupidity. But on the second half of your answer, "If somewhere in between, do something in between...", I have to ask why? What do you gain? In what way can an oscilloscope get out of calibration such that it causes a materially misleading reading?

The only thing I can think of is something like a dirty pot inside of the scope.

nilton61:
If you work in a company with an ISO9000 label, calibration is mandatory, whether needed or not.

I was about to post the same, but I was going to word it like this rather:

If you work in a company with an ISO9000 label, calibration is mandatory, unless the instrument is marked "for indication only, uncalibrated" or words to that effect.

JoeN:
it requires is that I accept the existence of blind corporate stupidity.

It's neither blind nor stupid. You may understand the workings of a 'scope and know that it is infallible. Your client may have no such arcane knowledge, and the only way he can be sure that the service you supply to him is reliable is to see some evidence from an independent authority.

JoeN:
I understand that precision instruments need calibration - high digit voltmeters, frequency counters, etc. What I don't understand is if oscilloscopes need this.

Oscilloscopes are precision instruments, too.

(And they have a lot more components in them than a multimeter - more to go wrong.)

JimboZA:

JoeN:
it requires is that I accept the existence of blind corporate stupidity.

It's neither blind nor stupid. You may understand the workings of a 'scope and know that it is infallible. Your client may have no such arcane knowledge, and the only way he can be sure that the service you supply to him is reliable is to see some evidence from an independent authority.

Worse: Some third party may have access to precision equipment and use your readings against your client in a court case.

My desk scope is clearly labelled "For indication only", but lab bench scopes are calibrated annually.

A scope is the only way to measure pulsed voltages or the timing of non-repetitive pulses. So if you need to measure these things, which you do a lot, then a scope is the only way.

For example, how can you measure the baud rate of an asynchronous serial signal without one?
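
As a sketch of that example: measure the width of a single bit cell with the scope's cursors, take the reciprocal, and snap to the nearest standard rate. The 104 µs figure below is an assumed cursor reading, chosen because it corresponds to 9600 baud:

```python
# Sketch: turning a scope cursor measurement into a baud rate.
# The measured bit time is an assumed example value, not real data.

measured_bit_time_s = 104e-6      # cursor measurement of one bit cell

baud = 1.0 / measured_bit_time_s  # baud rate is the reciprocal of bit time

# Snap to the nearest standard rate, since the measurement is inexact.
standard_rates = [1200, 2400, 4800, 9600, 19200, 38400, 57600, 115200]
nearest = min(standard_rates, key=lambda r: abs(r - baud))

print(f"measured: {baud:.0f} baud, nearest standard rate: {nearest}")
```

This also illustrates the calibration point: a timebase off by even 1% would still snap to the right standard rate, so for this job an uncalibrated scope is plenty.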

JoeN,

I think I understand what you are getting at.

Measurement is either "relative" or "absolute".

In the engineering world we require measurement to be "absolute" and referenced against "national" standards.

In the domestic world of home electronics, "relative" is generally good enough for practical purposes, viz. is this voltage greater than, equal to, or less than that one?

In the "home brew" system everything is relative and it matters little whether the volt you measure is a "true" volt or not.

We all own test gear, which no doubt came without "genuine" certification and has never been recertified since the day it was bought. It matters little as long as it is good enough for the purpose for which it is intended. And "good enough" is the operative phrase.

I must have the best part of a dozen test meters, a couple of signal generators, a couple of frequency counters and a couple of 'scopes. None are "calibrated" but they are all "near enough" for practical purposes and for home dabbling that is good enough. The volt my power supply puts out may not be a "national standard" volt but it is the volt I use and the volt that powers my devices.

As others have clearly indicated, calibrated test equipment is essential in the industrial environment but that's not the world most of us "play" in. So, if your cheap (or expensive) equipment isn't calibrated but meets your needs, then all well and good.

A simple example: measuring amplitude. The scope has a variable (var) pot; if it has been adjusted, the volts-per-division setting is no longer valid. A change of probe can also cause problems.

A scope should be checked by the user before every use to counteract this.

Timebase: things can drift. If you want to be confident about your measurement, a timebase standard can be used.

Some scopes have a measure function but that circuitry can drift too.

Scopes have large bandwidth and generally contain a substantial amount of analogue circuitry. Drift is possible there.

Also, as a hobbyist, the scope is the most expensive bit of kit I have. It's the only instrument I can use for accurate measurement of time-varying signals, as other precision instruments would be too expensive.

Edit:
I would add, it's possible to use a badly uncalibrated scope to make precision measurements as long as you have a suitable signal to compare against. A properly calibrated scope is much more convenient to use, however.

Most oscilloscopes aren't particularly precise in voltage measurements(*), but time measurements are another matter: 5+ digits of precision in measuring frequency isn't uncommon, and perhaps 4 digits in measuring time intervals with a delayed timebase.

In practice quartz crystals are very reliable and timebase drift isn't really
going to happen in a modern scope.

(*) Digital scopes are often sampled at 8-bit resolution for instance. The analog
circuitry before the sampler is perhaps the weak link - the gain/attenuation
network and anti-alias/equalization filters are crucial and exposed to the
most abuse.
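
The rough numbers behind that 8-bit footnote, assuming a hypothetical scope whose signal fills a 2 V full-scale input range:

```python
# Rough arithmetic behind the "8-bit resolution" point.
# The full-scale value is an assumed example, not a real scope spec.

bits = 8
full_scale_v = 2.0                # e.g. 250 mV/div over 8 divisions

levels = 2 ** bits                # 256 discrete codes
lsb_v = full_scale_v / levels     # smallest distinguishable voltage step

# Best-case relative resolution of a full-scale signal.
relative = 1 / levels

print(f"LSB: {lsb_v * 1e3:.2f} mV ({relative:.2%} of full scale)")
```

So even before any analogue front-end error, the quantisation floor is already around 0.4% of full scale, which is why nobody treats a scope as a precision voltmeter.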

MarkT:
In practice quartz crystals are very reliable and timebase drift isn't really
going to happen in a modern scope.

(*) Digital

No, but the CRT scan is analogue and subject to external influence, leading to non-linearity across the screen.

I have seen a scope display a different waveform just from being rotated 90 degrees to show someone the measured signal. They can be worse in the field, depending on what is around.

That was a high-end Tek scope as well.

CRT scopes are soooo 20th century.

My latest scope is digital, display is a big (8") LCD screen. Seems more like running a dedicated program on a laptop, only with real buttons to push & knobs to turn vs clicking/dragging icons with a mouse.

Grin. Yes, at 20th-century prices.

I have a 20th-century pension.

Well, you do what you gotta do to support your addiction hobby 8)

Boardburner2:

MarkT:
In practice quartz crystals are very reliable and timebase drift isn't really
going to happen in a modern scope.

(*) Digital

No, but the CRT scan is analogue and subject to external influence, leading to non-linearity across the screen.

I have seen a scope display a different waveform just from being rotated 90 degrees to show someone the measured signal. They can be worse in the field, depending on what is around.

That was a high-end Tek scope as well.

Decent ones had mu-metal screening covering the length of the CRT to prevent external magnetic fields from influencing the display.

Yes.

The field scopes were noticeably worse. I suspect the shields were somehow to blame.