IIRC, Hans has already stated that the calibration must be done
at full power, which makes sense: it is a way of measuring, and
compensating for, the non-linearity inherent in the final
amplifier design. That would necessarily mean the calibration
must be done at the normal operating voltage.
Hans has also suggested that the calibration isn't strictly
necessary, as he has included a set of default constants that are
pretty good in all of the cases he has tried, but that calibration
will definitely improve the radio's performance... which, for most
hams, means it is, well, essential...
Hans has also stated that his QMX finals easily pass the engineer's
calibrated-finger test (they run cool to the touch), which would
seem to lend some credence to those whose measurements suggest the
BS170's Vds rating is too low to operate safely on the lower bands.
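For a rough sense of that voltage concern, here is a back-of-the-envelope sketch. It assumes a class-E output stage (which may not match the actual QMX topology), the classic class-E result that the peak drain voltage is roughly 3.56 times the supply, the rule of thumb that a severe antenna mismatch can roughly double that peak, and the 60 V Vds(max) from the BS170 data sheet:

```python
# Back-of-the-envelope drain-voltage estimate for a class-E PA.
# All numbers here are assumptions, not measurements from this thread.
BS170_VDS_MAX = 60.0        # volts, absolute maximum from the BS170 data sheet
CLASS_E_PEAK_FACTOR = 3.56  # classic class-E peak-drain-voltage factor (~3.562 * Vdd)

def peak_drain_voltage(vdd, mismatch_factor=1.0):
    """Estimated peak drain voltage (volts) for supply vdd,
    scaled by an optional antenna-mismatch factor."""
    return CLASS_E_PEAK_FACTOR * vdd * mismatch_factor

for vdd in (9.0, 12.0):
    matched = peak_drain_voltage(vdd)
    mismatched = peak_drain_voltage(vdd, mismatch_factor=2.0)
    verdict = "exceeded" if mismatched > BS170_VDS_MAX else "OK"
    print(f"Vdd={vdd:4.1f} V: matched peak ~{matched:5.1f} V, "
          f"severe-mismatch peak ~{mismatched:5.1f} V, rating {verdict}")
```

Under these assumptions, a matched stage at 12 V stays under the 60 V rating (~43 V peak), but a severe mismatch can push the peak well past it, which is one way a final can die without ever running hot.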
Not to disparage those making such claims, but in the RF world,
"measuring" is "changing"... but you gotta start somewhere.
Transistor manufacture is still a variable enough process that most
transistors manufactured will exceed the standards shown on the
data sheets.
Alas, most is not all.
-Chuck Harris, WA3UQV
On Tue, 25 Mar 2025 06:57:02 -0700 "Joel W9JFK via groups.io"
<w9jfk@...> wrote:
I had the same thing happen... QMX+ is on the bench with the BS170's
removed, and one of them indeed tested bad. While waiting on my
replacement BS170's I am carefully cleaning out the holes. I really
don't want to deal with this again.
Would reducing the operating voltage during the calibration help
prevent it happening again? Does the calibration depend on the
actual voltage used in order to be accurate? I normally operate at
12V from a low-dropout precision regulator. If I calibrated at,
say, 9V, would the calibration still be accurate for use at the
normal 12V?
Just thinking out loud... trying hard not to do a second transistor
transplant. Not sure the patient would survive, lol...
73,
Joel W9JFK