Can an amplified antenna actually be worse than a non-amped one?

Not usually… but that doesn’t mean an amplifier can’t hurt. An amplified antenna was “the way to go” in the days before HDTV, but most of the reasons for using one have disappeared. It’s important to know when to use an amplifier and when using one can actually work against you.

Today’s television signals are digital, and digital signals behave differently from analog ones. That difference matters when you add an amplifier. When you amplify a signal, you amplify the whole signal… and that means you’re also amplifying things you don’t want.

With analog signals, even a “noisy” signal would usually give you something once you amplified it. Analog TV tuners work a bit like your ears… turning up the volume makes it easier to pick out tiny details. Digital tuners don’t work that way. The signal-to-noise ratio matters much more in digital broadcasting, because once noise drowns out the ones and zeroes, the tuner gets nothing at all.

Here’s the part you need to know… amplifiers not only amplify noise, they also add noise of their own. You can use complicated digital filters to reduce the noise and try to recover as much signal as possible, but that only works so well. So if you amplify a very weak digital signal, you can actually make it worse.
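To put rough numbers on that idea, here is a minimal Python sketch. It uses the standard definition of an amplifier’s noise figure (the amount, in dB, by which the amplifier degrades the signal-to-noise ratio) and assumes a tuner needs roughly 15 dB of SNR to decode reliably; the exact threshold and the 3 dB noise figure are illustrative assumptions, not figures from this article.

```python
# Sketch: why amplifying a weak digital signal can make it worse.
# Assumptions (not from the article): amplifier noise figure of 3 dB,
# and a decode threshold of roughly 15 dB SNR at the tuner.

DECODE_THRESHOLD_DB = 15  # assumed minimum SNR for a stable picture

def snr_after_amp(snr_in_db, noise_figure_db):
    """A perfect amplifier would boost signal and noise equally, leaving
    SNR unchanged. A real amplifier adds its own noise, so by definition
    the output SNR drops by (roughly) the amplifier's noise figure."""
    return snr_in_db - noise_figure_db

# Strong signal that occasionally fades: plenty of margin remains.
strong = snr_after_amp(30, 3)
print(strong, strong >= DECODE_THRESHOLD_DB)   # 27 True

# Very weak signal near the "digital cliff": the amplifier's own noise
# pushes it below threshold, and the picture disappears entirely.
weak = snr_after_amp(16, 3)
print(weak, weak >= DECODE_THRESHOLD_DB)       # 13 False
```

The point of the sketch: the amplifier never raises SNR, so it can only spend margin you already have. With a strong signal that margin is cheap; with a barely-decodable signal, it’s everything.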

The best time to use an antenna amplifier is when the signal is strong but occasionally fades, such as on rainy days. An amplifier will also help you take a strong signal and distribute it to multiple rooms, since splitting a signal weakens it.

About the Author

Stuart Sweet
Stuart Sweet is the editor-in-chief of The Solid Signal Blog and a "master plumber" at Signal Group, LLC. He is the author of over 9,000 articles and longform tutorials including many posted here. Reach him by clicking on "Contact the Editor" at the bottom of this page.