I also had another look at the block diagram of the Tegam 1830A in application note 217. I've amended it to suit operation with the 478A sensor, as this sensor has one end grounded, so parts of the Tegam circuit aren't needed here. See the diagram below.

Assuming I understand it correctly, I think the bridge balances when the voltage across the 200R resistor Rref (circled in pink) is the same as the voltage measured at S+ (circled in sky blue).

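If that reading of the balance condition is right, the power arithmetic is simple: with the servo holding the thermistor at the same resistance as Rref, the DC power in the thermistor is just V²/R, and the RF power falls out of the difference between the RF-off and RF-on states. A minimal sketch, assuming this interpretation (the function names are mine, not from the Tegam docs):

```python
R_REF = 200.0  # ohms, nominal Rref

def thermistor_power(v_splus, r_ref=R_REF):
    """DC power dissipated in the thermistor at bridge balance.

    Assumes the feedback loop holds the thermistor resistance equal
    to r_ref, so P = V^2 / R with V measured at the S+ node.
    """
    return v_splus ** 2 / r_ref

def rf_power(v_off, v_on, r_ref=R_REF):
    """RF power by DC substitution: bias power with RF off minus with RF on."""
    return (v_off ** 2 - v_on ** 2) / r_ref
```

For example, if the S+ voltage drops from 4.0 V (RF off) to 3.0 V (RF on), that corresponds to 35 mW of substituted RF power.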
I think it relies on being able to measure the 200R resistance of Rref to better than 0.1% accuracy, and the two differential amplifiers need to measure all the relevant voltages with any offsets correctly trimmed out. Those offsets have to be trimmed for both the RF-off and RF-on cases, and this has to hold over the full 30mW input power range, where the voltages change quite a bit between RF off and RF on.

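A quick back-of-envelope check of how those two error sources feed through, assuming the simple P = V²/R relationship from above (pure arithmetic, no Tegam-specific constants):

```python
def power(v, r):
    """DC power dissipated in a resistance r at voltage v."""
    return v ** 2 / r

# 30 mW into a 200R thermistor corresponds to about 2.449 V.
V_FS = (0.030 * 200.0) ** 0.5
p_nom = power(V_FS, 200.0)

# A 0.1% error in the measured Rref value feeds through roughly 1:1
# into the computed power.
rel_err_r = abs(power(V_FS, 200.0 * 1.001) - p_nom) / p_nom  # ~1e-3

# A 10 uV untrimmed offset at full scale is a much smaller relative
# error, since P goes as V^2 and V is large here.
rel_err_v = abs(power(V_FS + 10e-6, 200.0) - p_nom) / p_nom  # ~8e-6
```

The flip side is that the same fixed offset becomes relatively much larger at the bottom of the power range, where the measured voltages are small, which is presumably why the offsets need trimming for both the RF-off and RF-on states.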
It also has to measure the temperature compensation bridge in a similar way, but I guess the offset issue is less critical there as long as it is stable. However, any tens-of-uV drift measured by the system should be due to thermistor temperature changes, not to the summing of offset drifts in any of the four differential amplifiers when operating in compensated mode.

I don't have much experience designing precision instrumentation/differential amplifiers, but the gain of each amplifier will also have to be precisely defined (using precision resistors?) and all offsets carefully managed. The ADC used to measure these voltages also needs adequate accuracy, and any non-linearity across its range must be corrected.

I think this is why the Tegam manual shows various offset and gain calibration constants. Maybe this stuff is trivial to manage with modern amplifiers and components, but there seems to be an awful lot to keep under control here.

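The usual way such constants get applied is a per-channel offset-then-gain correction ahead of the power calculation. An illustrative sketch only, in the spirit of the constants the manual lists; the channel names and values below are made up, not taken from the Tegam documentation:

```python
# Hypothetical per-channel calibration constants (illustrative values).
CAL = {
    "s_plus": {"offset_v": 12e-6, "gain": 1.00013},
    "rref":   {"offset_v": -8e-6, "gain": 0.99991},
}

def corrected_volts(channel, raw_volts):
    """Subtract the stored offset, then divide out the stored gain error."""
    c = CAL[channel]
    return (raw_volts - c["offset_v"]) / c["gain"]
```

Every amplifier and ADC range in the chain would need its own pair of constants, which is where the long list in the manual presumably comes from.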
By contrast, with the HP 432A system, once the mount resistance is measured to <= 0.1% accuracy using a 4-wire bench DMM, the DC substitution accuracy is all about how good the external DVM is. Obviously, the mismatch uncertainty and sensor efficiency are also important, but these factors affect the 432A and Tegam meters equally.

A decent external 6.5 digit DVM should be good enough for most 432A users, but Keysight seem to recommend using a $$$ 3458A 8.5 digit DVM for the N432A. Maybe this has the required linearity and accuracy to get the most out of the system? I don't see how the Tegam meter can compete here, but maybe I'm missing something. Either way, any differences are likely to be so tiny that only places like NIST, and maybe Keysight, need to be concerned about this.

I would choose the N432A over the Tegam meter for various reasons, even if the accuracy is the same, or if one is ever so slightly better than the other. But then again, I think I prefer the HP 432A for home use because it just needs a decent 6.5 digit DVM to operate it with DC substitution. No need to return it for an annual calibration :)