Yes, it can be difficult. I’ve written some really long tutorials on cable length, but maybe you just want a quick and easy set of numbers. So here’s the 5-minute guide:
First, look at your splitters. You’ll lose about 3dB for every 1×2 splitter, 7dB for every 1×4, and 8dB for every 1×8. These are generic numbers, but they’re pretty good for planning purposes.
At antenna frequencies, a decent cable will lose about 3dB per 100 feet. At satellite frequencies it’s more like 4dB per 100 feet.
So, add up the losses from your splitters and cable runs. Say you’re running an antenna system with a 1×4 splitter, and the longest total run is 300 feet (from antenna to TV). That’s 7dB for the splitter and 9dB for the cable, 16dB in all. You need an amplifier that gives you at least 16dB of boost to compensate. If you’re running a 14dB amplifier, that’s not enough.
Here’s another example: say you’re using DISH, or non-SWM DIRECTV, and the total run from the dish to the receiver is 400 feet. At 4dB per 100 feet, you’ve lost 16dB of signal, which is too much for your average 14dB amplifier to compensate for.
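If you’d rather let a script do the adding, here’s a minimal sketch of the same arithmetic in Python. The function name and the tables are just my own stand-ins for the generic numbers above, not anything official:

```python
# Generic loss figures from this guide (not measured values).
SPLITTER_LOSS_DB = {2: 3, 4: 7, 8: 8}            # 1x2, 1x4, 1x8 splitters
CABLE_LOSS_DB_PER_100FT = {"antenna": 3, "satellite": 4}

def total_loss_db(splitter_outputs, run_feet, system):
    """Splitter loss plus cable loss for the longest run, in dB.

    splitter_outputs: 2, 4, 8, or None if there's no splitter in the path.
    system: "antenna" or "satellite" (picks the per-100-foot cable loss).
    """
    splitter = SPLITTER_LOSS_DB[splitter_outputs] if splitter_outputs else 0
    cable = CABLE_LOSS_DB_PER_100FT[system] * run_feet / 100
    return splitter + cable

# Antenna example: 1x4 splitter plus a 300-foot run -> 7 + 9 = 16dB.
print(total_loss_db(4, 300, "antenna"))        # 16.0

# Satellite example: 400-foot run -> 16dB, more than a 14dB amp can cover.
print(total_loss_db(None, 400, "satellite"))   # 16.0
```

Compare the result against your amplifier’s rated gain: if the loss is bigger, the amp can’t make up the difference.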
DIRECTV SWM is a whole other beast, partly because of the whole-home signal and partly because you can’t use amplifiers after the SWM in most cases. If you’re not able to do a lot of measuring, the best thing to do is limit SWM cable runs to under 150 feet.
If your cable run is too long for the amplifier you’re using, it’s possible to add another amplifier, but remember that there comes a point where the signal is just too weak to amplify cleanly. In cases like that, a second dish or antenna is your best choice.