So perhaps this is a naive perspective, but I would think that having AGC enabled in a spectrum analyzer would always be sub-optimal during measurement sweeps, because the changing gain/attenuation adds yet more variables to the calibration between samples within a single sweep.

Having the ability to do a "pre-sweep" of sorts to define the maximum/optimum gain settings for a given input could be valuable (along the lines of an "Auto" button for the analyzer, to pre-set the gain balance), but throughout a single sweep I would expect the most consistent results to come from a fixed gain setting. Obviously a fixed gain setting throughout the sweep may result in a lower overall dynamic range, but that seems like a tradeoff that makes sense.

Of course, all of my spectrum analyzer experience is with older units that are happy to have 70-80 dB of dynamic range under ideal conditions!

Josh, KB8NYP

On Tue, Feb 18, 2020 at 6:15 PM Jerry Gaffke via Groups.Io <jgaffke=[email protected]> wrote: I don't yet have working hardware, so can't verify your results.
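The "pre-sweep" idea above could be sketched roughly as follows. This is only an illustration of the concept, not any particular analyzer's firmware: `measure_peak_dbfs`, `pre_sweep_gain`, and `CLIP_MARGIN_DBFS` are all hypothetical names, standing in for whatever the actual front end provides. The routine does a quick scan at each candidate gain, then picks the highest gain that keeps peaks safely below clipping, so the real measurement sweep can run at one fixed gain.

```python
# Hypothetical sketch of a "pre-sweep" auto-gain routine: choose one
# fixed front-end gain for the whole measurement sweep instead of
# letting AGC vary the gain from sample to sample.

CLIP_MARGIN_DBFS = -6.0  # keep peaks at least 6 dB below full scale (assumed margin)

def measure_peak_dbfs(gain_db: float) -> float:
    """Placeholder: return the peak level (in dBFS) observed during a
    quick scan of the band with the front end set to gain_db."""
    raise NotImplementedError("replace with the analyzer's real measurement call")

def pre_sweep_gain(candidate_gains_db, peak_fn=measure_peak_dbfs):
    """Return the highest candidate gain whose quick-scan peak stays
    below the clip margin; fall back to the lowest gain if every
    setting overloads."""
    for gain in sorted(candidate_gains_db, reverse=True):
        if peak_fn(gain) <= CLIP_MARGIN_DBFS:
            return gain  # best sensitivity without overload
    return min(candidate_gains_db)  # everything clips; use minimum gain
```

The fixed gain returned here would then be held constant for the entire sweep, trading some dynamic range for sample-to-sample calibration consistency, as described above.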