
A technique for testing a DUT with reduced power


 

There was a discussion in another group about what to do if the output level of the NanoVNA is too high for the device under test (DUT). You can reduce the power level via a console command, but that may not lower it enough. I suggested a technique where a quality SMA attenuator is attached to the NanoVNA and the calibration is done after the attenuator. In effect you are "calibrating out" the attenuator. This technique does introduce some inaccuracy into measurements such as return loss, SWR and impedance, and the more attenuation you use the worse it gets. When measuring S21, for insertion or transmission loss, dynamic range will be reduced by the amount of attenuation used. This method has been discussed in this group before, but I thought some actual test results might be of interest.
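The trade-off can be put into rough numbers. Here is a minimal sketch; the -10 dBm source level, the -90 dBm effective noise floor, and the DUT return loss are illustrative assumptions, not measured NanoVNA specifications:

```python
# Rough link budget for an S11 measurement through a pad that has been
# "calibrated out".  The reflected signal passes through the attenuator
# twice (outbound and return), so the margin above the receiver noise
# floor shrinks by twice the pad value.

def reflection_margin_db(source_dbm, pad_db, dut_return_loss_db, noise_floor_dbm):
    """Margin (dB) between the reflected signal and the noise floor."""
    reflected_dbm = source_dbm - pad_db - dut_return_loss_db - pad_db
    return reflected_dbm - noise_floor_dbm

# Illustrative numbers: -10 dBm source, DUT with 20 dB return loss,
# -90 dBm effective noise floor.
for pad in (0, 10, 20):
    print(pad, reflection_margin_db(-10, pad, 20, -90))   # 60, 40, 20 dB margin
```

Each 10 dB of pad costs 20 dB of margin on reflection measurements, which matches the observation that the results "get worse" as attenuation increases.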

In the test below I measured a 20m dipole in the standard way and took a screenshot. Then I attached a 10 dB attenuator to the NanoVNA. Calibration was performed by attaching the SOL loads with the attenuator in place. Then the dipole was measured and the side-by-side results are posted below. For the second measurement I used a 20 dB attenuator. You can see the measurements get worse but may still be useful if trying to get an estimate of things like input impedance vs. frequency.

I have used this method to test active filters, amplifiers and the front ends of receivers when I do not want to overload them.

Roger


 

Thanks for sharing - I had plans to do this very same thing for testing bandpass filters that use an LNA, so as to prevent overloading. My interpretation of the effect shown with the attenuators added is that the detected result appears to be down closer to the noise floor of the device... not a major game changer - you can still get the results you're looking for when testing for the cutoff of a bandpass filter or similar device...


 

In the new firmware you can reduce the output power under CALIBRATE->POWER (you also need to recalibrate if you change it).

Or use the 'power' console command,

or use NanoVNA-App and set the power in it.

In any case, you need to recalibrate for the new power value!


 

Some further comments on my post:

- The attenuator method may be required in cases where the output of the NanoVNA is too high for the device being tested and will either damage or overload it. This is the case when testing preamplifiers or the input of radio receivers. With NanoVNAs the lowest output possible is about -10 to -13 dBm with 2 mA of Si5351 drive current, which is quite high for many DUTs.

- In this example only CH0 is being used. An antenna would normally not be tested with this method. But I used an antenna for this example only because it shows a wide range of complex impedance (R and X) and SWR.

- If the gain versus frequency of an amplifier is being measured, then 2 attenuators may be required. One on CH0 with enough attenuation to not overload the amplifier input, and another on the input of CH1 so that the maximum level is kept below 0 dBm. For example, if the NanoVNA output is 0 dBm, the amplifier has a gain of 30 dB, and the maximum amplifier input is -20 dBm, you would need a 20 dB attenuator on CH0 and a 10 dB one on CH1. You would do the SOL, isolation and through calibrations with both in place.
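The pad-sizing arithmetic above can be sketched as a small calculation (the function name and the 0 dBm CH1 limit are illustrative assumptions, not from any NanoVNA documentation):

```python
# Sketch of the pad-sizing arithmetic for an amplifier gain measurement:
# pad CH0 down to the amplifier's safe input level, then pad CH1 down so
# the amplifier's output stays at or below the receiver's limit.

def size_pads(vna_out_dbm, amp_gain_db, amp_max_in_dbm, ch1_max_dbm=0.0):
    """Return (CH0 pad, CH1 pad) in dB."""
    ch0_pad = max(0.0, vna_out_dbm - amp_max_in_dbm)   # protect the amplifier input
    amp_out = vna_out_dbm - ch0_pad + amp_gain_db      # level leaving the amplifier
    ch1_pad = max(0.0, amp_out - ch1_max_dbm)          # keep CH1 at or below its limit
    return ch0_pad, ch1_pad

# The worked example: 0 dBm out, 30 dB gain, -20 dBm max amplifier input.
print(size_pads(0.0, 30.0, -20.0))   # (20.0, 10.0)
```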

Roger


 

I had a similar concern and determined that on my S-A-A-2 the output level is NOT adjustable from the touch screen menu system. I only have WinXP so I can't run the software that's available, which might let me change the output level from -10dBm.

I can add one or two 10dB SMA attenuators at the input of the preamp, but only for measuring the gain. I can leave attenuators in place when calibrating the THRU parameter and the LOGMAG screen will correctly display the actual preamp gain without me having to do the math.

Unfortunately I can't reduce the input level to the preamp when measuring return loss. If a pad is installed at the input of the preamp, the NanoVNA will end up measuring a combination of the pad's return loss and the preamp's input impedance. The particular preamp I'm testing seems to want a stimulus level of -30dBm; anything higher than about -21dBm causes the preamp's input return loss to become very poor, going from around 20dB to 8dB. The only way to get around this is to reduce the NanoVNA's output level, and that's not possible with the S-A-A-2 from its touch screen menu system.

Testing this particular preamp with a real HP VNA or a spectrum analyzer / tracking generator / return loss bridge produces a proper return loss value as long as the input level is at or below -21dBm. The preamp is designed to amplify very small signals in the microvolt range, not signals around 0.1V. Strong signals upset the bias of the preamp, which alters the input return loss and produces output compression.

It would be nice if the NanoVNAs had the ability (via the touch screen menu system) to set the CH0/Port1 signal level, in 5 or 10dB steps, from -30dBm to 0dBm, or even -40dBm to -10dBm, or even just a selection of -10dBm or -30dBm, or "normal" and "reduced/attenuated" for the less technical users.


 

On 10/15/20 3:20 AM, Bob M. wrote:
Unfortunately I can't reduce the input level to the preamp when measuring return loss. If a pad is installed at the input of the preamp, the NanoVNA will end up measuring a combination of the pad's return loss and the preamp's input impedance. The particular preamp I'm testing seems to want a stimulus level of -30dBm; anything higher than about -21dBm causes the preamp's input return loss to become very poor, going from around 20dB to 8dB. The only way to get around this is to reduce the NanoVNA's output level, and that's not possible with the S-A-A-2 from its touch screen menu system.
It would be nice if the NanoVNAs had the ability (via the touch screen menu system) to set the CH0/Port1 signal level, in 5 or 10dB steps, from -30dBm to 0dBm, or even -40dBm to -10dBm, or even just a selection of -10dBm or -30dBm, or "normal" and "reduced/attenuated" for the less technical users.

That would require a step attenuator (hardware change), and would reduce the SNR (raise the uncertainty) of the measurement as well.

If you leave the pad on Port 1 (Ch0), and do the cal, why wouldn't it work? You're still measuring the S11 of the amplifier.


 

If you put a 10dB pad on the CH0/Port1 and then try to calibrate it, the "open" won't be an open circuit, it'll be whatever parallel resistance the pad exhibits to ground. Similarly the "short" won't be a short circuit; it'll be whatever series resistance is in the pad. A 50 ohm termination should cause the pad to produce something that resembles 50 ohms.

When trying to measure the input return loss of the preamp, the signal coming out of the NanoVNA is reduced by 10dB. The reflected signal due to the RL of the preamp is also attenuated by 10dB on its way back to the NanoVNA. So you end up with a measured RL that already includes an extra 20dB, not the actual preamp's RL.
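That double-transit effect, in numbers (a sketch assuming an ideal matched pad and no calibration done with the pad in place):

```python
# Without recalibrating, the pad's two-way loss is simply added to the
# return loss the instrument reports.

def measured_rl_db(actual_rl_db, pad_db):
    """RL reading when an uncalibrated pad sits between port and DUT."""
    return actual_rl_db + 2 * pad_db   # outbound + return trip through the pad

print(measured_rl_db(8, 10))   # 28: a preamp with 8 dB RL reads as 28 dB
```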

Perhaps adding some attenuation to the outgoing NanoVNA signal could be compensated for by increasing the sensitivity to the incoming NanoVNA signal, to deal with the SNR issue. I didn't have such a problem using a SA/TG/RLB at -30dBm coming out of the TG.


 

On 10/16/20 4:55 AM, Bob M. wrote:
If you put a 10dB pad on the CH0/Port1 and then try to calibrate it, the "open" won't be an open circuit, it'll be whatever parallal resistance the pad exhibits to ground. Similarly the "short" won't be a short circuit; it'll be whatever series resistance is in the pad. A 50 ohm termination should cause the pad to produce something that resembles 50 ohms.
Actually, not quite...
The pad is no different than losses *inside* the VNA - in the reflection bridge for instance. Taking the Open as an example, the calibration process "measures" the response when the connector is open. With the 10 dB pad, and a perfect bridge, the ratio between the stimulus and the reflected signal is -20dB. So that now becomes set to a calibrated value of 0dB (with the phase as measured).

The whole point of the calibration process is to take all these factors out mathematically.

Consider that reflection bridge on Port 1 - There's a stimulus signal, the UUT port, and a measurement port to a receiver that is (mostly) the reflected power from the UUT. And, either a power divider for the stimulus, or a forward power port on the bridge (if you're using directional couplers, for instance).


The bridge and power divider aren't ideal. The stimulus doesn't put out constant power. Not all the reflected power shows up at the measurement port. But you can measure the (complex) ratio between the forward signal and the reflected signal.

And then you put some known UUTs out there - a 0 degree reflection, a 180 degree reflection, and no reflection. You can now solve for the gain and leakage terms in your bridge/divider.

This is no different than calibrating with the standards at the end of a piece of coax. The loss and phase shift of the coax gets taken into account in the calibration.
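The solve described above can be sketched with the standard one-port three-term error model. This is a generic textbook formulation, not the NanoVNA firmware's actual code, and the bridge imperfections below are made-up numbers:

```python
# One-port SOL calibration sketch using the three-term error model:
#   Gm = e00 + e01 * Ga / (1 - e11 * Ga)
# e00 = directivity, e11 = source match, e01 = reflection tracking.

def measure(Ga, e00, e11, e01):
    """Forward model: what the raw (uncalibrated) receiver reports."""
    return e00 + e01 * Ga / (1 - e11 * Ga)

def solve_terms(m_open, m_short, m_load):
    """Solve e00, e11, e01 from ideal Open (+1), Short (-1), Load (0)."""
    e00 = m_load              # load: Ga = 0, so Gm = e00 directly
    A = m_open - e00          # =  e01 / (1 - e11)
    B = m_short - e00         # = -e01 / (1 + e11)
    e11 = (A + B) / (A - B)
    e01 = A * (1 - e11)
    return e00, e11, e01

def correct(Gm, e00, e11, e01):
    """Invert the model to recover the actual reflection coefficient."""
    return (Gm - e00) / (e01 + e11 * (Gm - e00))

# Hypothetical imperfect bridge: leakage, mismatch, and loss.
e00, e11, e01 = 0.05 + 0.02j, 0.1 - 0.05j, 0.7 + 0.1j
terms = solve_terms(measure(1, e00, e11, e01),
                    measure(-1, e00, e11, e01),
                    measure(0, e00, e11, e01))
Ga_true = 0.3 + 0.2j                    # some DUT reflection coefficient
Ga_rec = correct(measure(Ga_true, *terms), *terms)
print(abs(Ga_rec - Ga_true) < 1e-12)    # True: the errors are calibrated out
```

A pad ahead of the DUT just changes the values of the three error terms; the same solve-and-correct step removes it along with the bridge's own imperfections.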


When trying to measure the input return loss of the preamp, the signal coming out of the NanoVNA is reduced by 10dB. The reflected signal due to the RL of the preamp is also attenuated by 10dB on its way back to the NanoVNA. So you end up with a RL that's already 20dB, not the actual preamp's RL.

If you've calibrated the NanoVNA with the 10 dB pad in, that attenuation is already factored into the reading. In VNA speak, you've moved the "reference plane" of the measurement outside the attenuator. That's the beauty of VNA calibration.
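For the simplest case of an ideal, perfectly matched pad, the reference-plane move reduces to dividing out a single complex scale factor measured on the known Open (a sketch; real pads also need the full SOL solve to handle their residual mismatch):

```python
# An ideal matched pad just scales the raw reflection ratio by its two-way
# voltage loss.  Calibrating with the pad in place measures that scaling on
# the known Open and divides it back out, moving the reference plane past
# the pad.

pad_db = 10
two_way = 10 ** (-2 * pad_db / 20)    # 0.1: a 20 dB round trip as a voltage ratio

def raw(gamma):
    """Uncalibrated reflection reading seen through the pad."""
    return two_way * gamma

cal_open = raw(1.0)                   # the "open" seen through the pad
gamma_dut = 0.33                      # e.g. an antenna with RL around 9.6 dB

print(round(raw(gamma_dut) / cal_open, 12))   # 0.33: the pad drops out
```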



Perhaps adding some attenuation to the outgoing NanoVNA signal could be compensated for by increasing the sensitivity to the incoming NanoVNA signal, to deal with the SNR issue. I didn't have such a problem using a SA/TG/RLB at -30dBm coming out of the TG.

The tracking generator / spectrum analyzer approach (or a sweeper and oscilloscope with a detector) doesn't have the mathematical calibration process of the VNA. The key difference is that with the typical sweeper scheme you don't get a phase measurement, so you can't fully calibrate out the effects of the bridges and test set. There *are* amplitude-only VNAs, but they require a bit more sophistication in the math and implementation - look up 6-port analyzers.

Once you can make phase measurements, do the underlying stepped or swept measurement automatically, record the results, and then do the math, you can do real Vector Network Analysis.

That was the huge change when VNAs were first invented - the math provided a systematic way to manage all those things, by creating a model of the losses and leakages, and then solving for them using measurements on known standards. It's really something that requires a computer to do (yes, you can do it by hand, but boy is it tedious - typically by hand, you'd do the measurements and calculations for one frequency).