Since you asked what we think - I think this:
For amateur radio use, there is no reason that I know of to have high accuracy in the measurement of cable impedance.
It may be fun for an intellectual exercise, but practically, all we need to know is if the coax is 50-ohm, 75-ohm, 92-ohm, etc., with enough accuracy that we can see that it is a good cable. And DiSlord's current algorithm gives at least this level of accuracy.
We combine that measurement of impedance with a loss measurement and perhaps an electrical and physical length measurement to evaluate a cable's suitability for our purpose.
Even the best coax cables are not 'exact' in their impedance, and in fact, due to manufacturing tolerances, the impedance may vary by a couple of percent along the length of the cable. A 50-ohm cable that measures an impedance of 49.1 ohms is just fine. These variations and minor differences from an 'exact' 50 ohms or 75 ohms make almost no difference in actual use of the cable for any practical purpose. (The VSWR resulting from the difference between 49.1 and 50 ohms is only 1.018:1.)
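For anyone who wants to check that 1.018:1 figure, it comes straight from the standard reflection-coefficient formula. A minimal Python sketch (the function name is mine, not from any particular tool):

```python
def vswr_from_impedance(z_cable, z_system=50.0):
    """VSWR caused by a cable of characteristic impedance z_cable
    in a z_system environment (both assumed purely resistive)."""
    # magnitude of the reflection coefficient at the mismatch
    gamma = abs(z_cable - z_system) / (z_cable + z_system)
    return (1 + gamma) / (1 - gamma)

print(f"{vswr_from_impedance(49.1):.3f}")  # -> 1.018
```

Plug in any measured impedance and you can see how little a percent or two of variation matters in practice.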
The 'loss' calculation for the cable is of much more practical use than impedance, since it determines power delivered to the load, extra loss due to SWR, etc. But even that loss measurement only needs to be 'reasonably' accurate, maybe to tenths of a dB, for almost any practical amateur radio application. And when building matching stubs with coax, we have to measure the electrical length anyway, and that measurement already includes any variation due to imperfect impedance. So again, I don't put any value on highly accurate cable impedance measurements.
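As a rough illustration of the stub arithmetic (my own sketch, with an assumed velocity factor, since the real answer comes from measuring the actual cable):

```python
SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def quarter_wave_stub_length_m(freq_hz, velocity_factor):
    """Approximate physical length of a quarter-wave coax stub.
    velocity_factor is a nominal cable property (e.g. ~0.66 for
    solid-polyethylene RG-58); the measured value is what counts."""
    return velocity_factor * SPEED_OF_LIGHT / (4.0 * freq_hz)

# Nominal starting point for a 14.2 MHz stub in VF-0.66 coax:
print(f"{quarter_wave_stub_length_m(14.2e6, 0.66):.2f} m")  # -> 3.48 m
```

A calculation like this only gives you a starting length; trimming against the measured electrical length absorbs the cable's real velocity factor and any impedance variation along the way.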
Stan KC7XE