I used to think a lot about the distortion (harmonic and IM) on mains power lines, because I was involved in "carrier-current AM broadcasting" starting around high school. As some of you know, "carrier-current" involves putting the AM (MW) broadcast RF signal on the AC mains power lines rather than on a proper transmitting antenna. The advantage was that you could get decent coverage over a limited range, because the RF does not pass through the mains step-down distribution transformers. It was great for non-licensed low-power broadcasters. The FCC strongly encouraged it.

But the AC mains "circuit" is a rather dirty and nonlinear one, leading to lots of IM (intermodulation) distortion between the 60 Hz mains voltage and the much weaker RF carrier. The result was quite a lot of HUM added to your transmitted signal. I think it was unavoidable.

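To make the mechanism concrete, here is a minimal Python sketch (my own illustration, not anything we measured back then): a 60 Hz "mains" wave plus a weak carrier is pushed through an assumed weak square-law/cubic nonlinearity, and IM sidebands pop up at fc +/- 60 Hz and fc +/- 120 Hz. The 100 kHz carrier and the coefficients are made-up values chosen only to make the products visible; an AM detector would hear those sidebands as hum.

# Sketch only: a memoryless nonlinearity acting on "mains + carrier"
# creates IM sidebands spaced 60 and 120 Hz around the carrier.
import numpy as np

fs = 1_000_000                            # sample rate, Hz
t = np.arange(0, 0.5, 1/fs)

mains = np.sin(2*np.pi*60*t)              # 60 Hz mains, normalized to 1
carrier = 0.01*np.sin(2*np.pi*100e3*t)    # weak carrier, scaled down to 100 kHz

x = mains + carrier
y = x + 0.05*x**2 + 0.02*x**3             # assumed weak quadratic/cubic nonlinearity

Y = np.abs(np.fft.rfft(y)) / len(y)
f = np.fft.rfftfreq(len(y), 1/fs)

# print the strongest spectral lines within +/-400 Hz of the carrier
band = (f > 100e3 - 400) & (f < 100e3 + 400)
for fi, yi in sorted(zip(f[band], Y[band]), key=lambda p: -p[1])[:7]:
    print(f"{fi:10.1f} Hz  {20*np.log10(yi/Y[band].max()):7.1f} dBc")

Running it shows the carrier at 0 dBc, the square-law products at fc +/- 60 Hz, and the cubic products at fc +/- 120 Hz.
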
I'm sure every electronic power supply added to it.

I'm pretty sure all the fluorescent lamps did too. In a school environment, fluorescent lamps were everywhere. Since you can see them flicker, they must be doing something nonlinear at a 2 x 60 = 120 Hz rate, adding lots of HUM to your signal. A quick sanity check on that 120 Hz figure is sketched below.

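Here's the sanity check, with a deliberately crude lamp model (my assumption, not real lamp physics): if the light output simply follows the magnitude of the 60 Hz voltage, the FFT shows energy at 120 Hz and its multiples, and essentially nothing at 60 Hz.

# Toy model: lamp output tracks |mains voltage| -> even harmonics only
import numpy as np

fs = 12000
t = np.arange(0, 1.0, 1/fs)
light = np.abs(np.sin(2*np.pi*60*t))   # crude full-wave model of lamp output

L = np.abs(np.fft.rfft(light)) / len(t)
f = np.fft.rfftfreq(len(t), 1/fs)
for fi in (60, 120, 240, 360):
    print(f"{fi} Hz: {L[np.argmin(np.abs(f - fi))]:.4f}")
# prints ~0 at 60 Hz, and nonzero lines at 120/240/360 Hz
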
I did not have the means to see the effect on a sine wave, and honestly I didn't care to, because I figured the distortion could be small enough to be invisible on a 'scope and still cause too much hum.

But I believe there are more extreme cases where the mains 60 Hz waveform might have something like 25% distortion. Those are the kinds of waveforms you actually might want to look at on a 'scope. 25% THD looks pretty bad.

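To put a number on that, here is a small sketch (made-up harmonic levels, not a measurement) that builds a 60 Hz wave with 25% THD and then recovers the THD from an FFT, using THD = sqrt(V2^2 + V3^2 + ...) / V1.

# Illustration only: synthesize 25% THD, then measure it back
import numpy as np

fs, dur = 60_000, 1.0
t = np.arange(0, dur, 1/fs)
w = 2*np.pi*60*t

# 3rd and 5th harmonics at 20% and 15% (made-up levels):
# sqrt(0.20^2 + 0.15^2) = 0.25, i.e. 25% THD
v = np.sin(w) + 0.20*np.sin(3*w) + 0.15*np.sin(5*w)

V = np.abs(np.fft.rfft(v)) * 2/len(t)        # peak amplitude per spectral line
f = np.fft.rfftfreq(len(t), 1/fs)
amp = lambda fi: V[np.argmin(np.abs(f - fi))]

fund = amp(60)
harm = np.sqrt(sum(amp(60*n)**2 for n in range(2, 10)))
print(f"THD = {100*harm/fund:.1f}%")         # -> 25.0%
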
To me, it hardly matters whether you use a 'scope or an FFT analyzer to see it. Either way, you need to safely extract that signal from the power wires without also messing with it. So the problem of safety is the same.

Andy