If you connect the antenna ports of two radios together and transmit from one into the other, that would be bad, right? Just how bad would it be and what could you do differently?
Before I dig in, you might ask yourself why on Earth this question even arises.
Consider having two radios and one antenna. You couldn't use a T-piece to connect two radios to the antenna unless both were receivers. So, after connecting and disconnecting coax for a decade, you might decide to use a two-position coaxial switch instead. Set the switch to one port and the first radio is connected to the antenna, flick it to the other port and you've just avoided swapping coax between radios.
I'll point out that in most cases a coaxial switch can be used to connect multiple antennas to one radio, or in reverse, connect multiple radios to one antenna.
When you do start looking for a switch, it would be good to test that at no point does it connect any two switching ports together, potentially causing the magic smoke to escape from your radio.
A less obvious issue is that a coaxial switch has a property called isolation. It's a measure of how much of a signal leaks between ports, and you'll see the isolation, or cross-talk, of a switch described in decibels, or dB.
If you recall, a dB is a relative measure. It expresses one thing in comparison with something else, in our case, the amount of signal going into one port compared with the amount of signal leaking through to a disconnected port.
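To put a rough sketch of that in code, here's one way you might turn a ratio of two powers into a dB figure in Python; the function name is just for illustration.

    import math

    def ratio_to_db(power, reference):
        """Express the ratio of two powers (in the same units) in decibels."""
        return 10 * math.log10(power / reference)

    # A signal a thousandth the strength of the reference is 30 dB down.
    print(ratio_to_db(0.001, 1.0))  # -30.0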
You'd think that in a perfect switch none of the signal would leak through, but it turns out that a switch responds differently at different frequencies, even one specifically designed for switching radio frequencies. It might be that a 1 kHz signal is completely isolated, but a 1 GHz signal is not, which is why when you look at the specifications of a coax switch, you'll see something like "greater than 70 dB isolation at 200 MHz". It's worth noting that isolation generally improves as the frequency goes down, so that figure describes the worst case: at 200 MHz there's 70 dB of isolation, and at lower frequencies there's more, sometimes much more.
If you were to transmit into this switch with 5 Watts at 200 MHz, the amount of signal that can leak through would be 70 dB less than 5 Watts.
You might recall that you can convert Watts to dBm to allow you to do some interesting calculations. As with other dB scales, it's a comparison with something else, in this case 1 milliwatt, and 5 Watts is the equivalent of 37 dBm. This means that if you had a switch with 70 dB isolation, you'd start with a 37 dBm transmission, subtract 70 dB of isolation and end up with a -33 dBm signal leaking through. That's the same as 0.0005 milliwatts, or half a microwatt. In other words, your 5 Watt transmission leaks through your coax switch to the tune of 0.0005 milliwatts.
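If you'd like to follow along with that arithmetic, here's a minimal Python sketch; the function names are mine, not from any particular library.

    import math

    def watts_to_dbm(watts):
        """Convert a power in Watts to dBm, decibels relative to 1 milliwatt."""
        return 10 * math.log10(watts * 1000)

    def dbm_to_milliwatts(dbm):
        """Convert a dBm figure back to milliwatts."""
        return 10 ** (dbm / 10)

    tx_dbm = watts_to_dbm(5)    # about 37 dBm
    leak_dbm = tx_dbm - 70      # 70 dB of isolation leaves about -33 dBm
    print(round(tx_dbm), round(leak_dbm), dbm_to_milliwatts(leak_dbm))
    # prints 37, -33 and roughly 0.0005 milliwatts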
Is that enough to damage your radio?
Well, that depends on the radio, but let's put some numbers against it.
S9 on VHF and UHF was defined in 1981 as -93 dBm, assuming your radio has a 50 Ohm impedance.
So, our leaking signal, -33 dBm, is 60 dB higher than S9. You'd report it as a 60 over 9 contact, a tad excessive, but not unheard of. So by that metric, you should be fine.
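If you want to check that difference yourself, a minimal sketch using the figures above:

    # S9 above 30 MHz is defined as -93 dBm.
    s9_dbm = -93
    leak_dbm = -33
    print(leak_dbm - s9_dbm)  # 60, in other words 60 dB over S9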
Many, but not all, radios specify the maximum radio frequency, or RF, power that they can handle. For example, according to the documentation, both the NanoVNA and an Icom IC-706 can each handle a 20 dBm, or 100 milliwatt, signal without damage. That means that your -33 dBm signal shouldn't do any damage to either device.
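To put a number on the safety margin, assuming the 20 dBm maximum input level quoted above:

    # Compare the leaked signal against a stated 20 dBm maximum safe input.
    max_safe_dbm = 20
    leak_dbm = -33
    print(max_safe_dbm - leak_dbm)  # 53 dB of headroom before reaching the limit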
I'm off to see what the isolation is for cheap 12V relays to see if I can construct a cost effective, modular, remote control antenna switch with lightning detection.
What are you building next?
I'm Onno VK6FLAB