Have you ever asked yourself a question that turned out to be a rabbit hole so deep you could spend a lifetime exploring and likely never come out the other end?
I did. Yesterday.
What's a Volt?
This came about when I started exploring how to measure the power output of my WSPR, or Weak Signal Propagation Reporter, beacon. According to the specifications, the output level is 23 dBm, or 200 milliwatts.
If you read the fine print, you'll discover that the power output actually varies a little depending on which band you're on. For my specific transmitter it says that the output on the 10m band is 22 dBm, or 158 mW.
That comes with a disclaimer that individual transmitters can vary by about 1 dB. So, on 10m, my output could be anywhere between 21 and 23 dBm, or between 125 and 200 mW. With my attenuator connected, the output could be between 12 and 20 mW, and that's assuming that my attenuator is exactly 10 dB, which it isn't.
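If you want to check those sums yourself, the conversion from dBm to milliwatts is ten raised to the power of the dBm value divided by ten. Here's a little Python sketch of that back-of-the-envelope arithmetic, using the band and attenuator numbers I just mentioned:

```
# Convert a power level in dBm to milliwatts: mW = 10 ** (dBm / 10)
def dbm_to_mw(dbm):
    return 10 ** (dbm / 10)

# The specified 10m output with the stated 1 dB spread
for dbm in (21, 22, 23):
    print(f"{dbm} dBm is {dbm_to_mw(dbm):.0f} mW")

# The same spread behind a nominal 10 dB attenuator
for dbm in (21, 22, 23):
    print(f"{dbm - 10} dBm is {dbm_to_mw(dbm - 10):.1f} mW")
```

Give or take some rounding, that's the 125 to 200 mW and 12 to 20 mW I just described.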
Measuring anything means comparing it against something else. To give you a physical example: if you look at a tape measure, the distance between the marks is determined in the factory. The machine that prints the lines is configured to make the lines just so. In that factory there will be a specific master tool that determines how far apart the lines are. That tool is called "the standard". The process of lining up the standard with the machine making the lines is called "calibration".
If you build a house on your own with just that tape measure, everything should work out fine, but if you have a mate help you and they bring their own tape measure, from a different factory, their lines might not quite match yours and the fun begins.
If you don't believe me, as I've said previously, pull out all the tape measures and rulers around your house and see just how much variation there is.
In my house, well, my CNC, there's a standard that came with my micrometer kit. It specifies physically how long 25mm is. I also have a 50mm and a 75mm standard. When I compare the 75mm with the 50mm and 25mm together, they're the same within one hundredth of a millimetre. It's likely that it's better than that, but I'm still learning how to hold a micrometer and not have it overheat and stretch while I'm measuring. Yes, temperature changes the size of things.
The point is, in my CNC world, my current standard sits in my micrometer box. At some time in the future I might want to improve on that, but for now it's fine.
The standard that I have was at some point calibrated against another standard. That standard was in turn calibrated against another standard and so on. Eventually you end up with the SI unit of 1 metre as defined by the International System of Units. In case you're wondering, it's defined as the length of the path travelled by light in vacuum during a time interval of 1/299,792,458 of a second. One second is in turn defined in terms of the unperturbed ground-state hyperfine transition frequency of the caesium-133 atom. I know, right, rolls right off the tongue. I can't help myself, that frequency is 9,192,631,770 Hz.
Oh, this system is also subject to change. In 2019 four of the seven SI base units were redefined in terms of natural physical constants, rather than relying on a human artefact like the standard kilogram. This is an ongoing process. For example, in 1960, the metre was redefined as a certain number of wavelengths instead of a physical bar in a vault in Paris, and there was also not just one bar, there were 30. National Prototype Metre Bar no. 27, made in 1889, was given to the United States and served as the standard for defining all units of length in the US between 1893 and 1960 - yes, perhaps surprisingly, the USA is metric, really. One inch used to be defined as "three grains of barley, dry and round, placed end to end lengthwise", but since 1959 it has been defined as exactly 2.54 centimetres or 0.0254 metres.
Back to power output on my beacon transmitter. Assuming for a moment that I had an actual tool available to measure this, I'd still be comparing my tool against another standard.
Let's imagine that I could measure the power output of my beacon with an oscilloscope. When the oscilloscope says 1 Volt per division, how do I know that it really is? If you start reading the calibration steps, you'll discover that they state that you need to connect your scope to a reference, another word for standard, and that's if you're lucky. Some documents just wave their hands in the air and say something like "push the auto calibrate button".
The Volt is defined as the electric potential between two points of a conducting wire when an electric current of one Ampere dissipates one Watt of power between those points. The Ampere definition involves counting elementary charges moving past a point in a second, roughly six billion billion of them, a number with nineteen digits. Not to mention that there's also a definition of how much an elementary charge is. You get the point, this is a rabbit hole.
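To give you a feel for the counting involved, here's a quick Python sketch using the fixed value of the elementary charge to work out how many of them pass a point every second at one Ampere:

```
# The elementary charge is fixed by definition at 1.602176634e-19 Coulomb
ELEMENTARY_CHARGE = 1.602176634e-19

# One Ampere is one Coulomb per second, so this is how many
# elementary charges pass a point each second
charges_per_second = 1 / ELEMENTARY_CHARGE
print(f"{charges_per_second:.3e} elementary charges per second")
```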
So, now let's pretend that I have a calibrated oscilloscope. Let's say that it's calibrated to within 1 dB. Cool. So I plug in my beacon and measure, what?
I'll end up with a reading that's plus or minus 1 dB of "reality". In my case, perhaps I read 22.5 dBm. That means that it could be as low as 21.5 dBm or as high as 23.5 dBm, or between 141 and 224 mW. So, it's within specifications, great, but I still don't know what the actual output power is.
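If you'd like to follow along, here's that same dBm to milliwatt arithmetic as a Python sketch, using the hypothetical 22.5 dBm reading and the 1 dB calibration uncertainty from my example:

```
# A hypothetical 22.5 dBm reading with a 1 dB calibration uncertainty
reading_dbm = 22.5
uncertainty_db = 1.0

def dbm_to_mw(dbm):
    return 10 ** (dbm / 10)

low = dbm_to_mw(reading_dbm - uncertainty_db)    # 21.5 dBm
high = dbm_to_mw(reading_dbm + uncertainty_db)   # 23.5 dBm
print(f"somewhere between {low:.0f} and {high:.0f} mW")
```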
Another way to look at this is to use a measurement to determine if the power is within specification or not. I'm guessing that Harry already did that test before he put my beacon in the box and shipped it to me.
Long story short, I'm no closer to knowing just how much power is coming out of my beacon, but I'm still working on finding a friend with a calibrated tool that might give me something a little more precise than fail or pass.
You know that there's a saying about turtles all the way down? I think it's rabbits myself.
I'm Onno VK6FLAB