Into the deep end indeed. Of course there is likely no practical need to know our output power down to the milliwatt, and no doubt the software can be configured to display as many significant figures as we'd like. The question, at least in this particular case, is how many we *should* be displaying. Since we talk about test equipment here, I thought it would be a worthwhile exercise.
I got home from work today and read the article, then got the extras from the ARRL "in-depth" website. I'm extra confused now, since I tentatively think that there is a bug. I'll get to that later.
I'm going to start by using 4 significant figures in the measurements. Let's say we're only looking at a pure sine wave, which I think is a reasonable abstraction. We turn up our ultra-accurate QRP rig and put exactly 1 watt (30 dBm) into the 50 ohm dummy load. That should be 7.071 V rms, or 10.00 V peak. The diode/capacitor detector circuit should read peak voltage. From there the voltage is reduced to fit the 0-5 V Arduino input range: a voltage divider divides by 29, so the voltage at the ADC would be 0.3448 V. Each step of the Arduino ADC is 4.883 mV. That math gives 70.62 counts, but the ADC output is an integer, so round to 71. The code averages 30 readings to smooth the result. It then squares the ADC reading, divides by the load resistance, and multiplies by a "Calibration Offset" of 0.008704. The grand total is 877.5 milliwatts. Since that "Calibration Offset" value is meant to be used for calibration, in this example we'll use 0.009919, since that makes our numbers work: 71 squared, divided by 50, times 0.009919 is 1.0000.
The issue I started with is possible error, and the displayed 4 digit value in watts. What are the possible errors? The diode drop. The tolerance of the resistors in the voltage divider network. The temperature coefficient of the load resistors. The linearity of the Arduino ADC. The accuracy of the Arduino's ADC reference voltage. The Arduino Uno's stated ADC absolute accuracy is ±2 LSB.
So let's say that everything else is perfect and we only consider the ADC absolute accuracy. That means our ADC could have read anywhere from 69 to 73. Those values translate to 0.9445 W to 1.057 W. That's only two significant figures at most.
On the higher end of the scale the ADC absolute accuracy will still impact our values (because we're squaring the voltage). 100 watts in is 100.0 volts peak; do the above math and you get between 98.32 watts and 99.44 watts. Again, two significant figures. Add to this resistor errors, reference errors, and non-linearity of the ADC, and I think that the 4 significant digit display is misleading and doesn't reflect reality.
The potential bug is in the CalculateWatts subroutine. This code is passed the binary value of the ADC. The code then adds a constant "DIODEVOLTAGEDROP", which is defined as 0.7. I understand that they were trying to correct the peak voltage reading by compensating for the diode voltage drop, but at this point in the code the value is not in volts; it is a raw ADC count. One count at the ADC is 4.883 mV, and after the divide-by-29 network that corresponds to about 0.1416 V of peak voltage at the detector. So adding 0.7 to the count compensates for only about 0.1 V. To properly correct for a 0.7 volt diode drop they should add about 5 counts (roughly 0.708 V), or better yet, convert the count to volts first and then add the drop. I could be dead wrong, so if others out there who are more code savvy than me could correct me I'd appreciate it.
I'd love to hear what you folks think about this, as well as the potential bug in the software.
Thanks for reading.
Mike M.
KU4QO