I set up a test circuit using TTL components and a 74141 Nixie driver to
count from 0 to 9. My HV supply is adjustable up to 200V. With the
supply at 181V and a 26K anode resistor, I get good illumination of all
digits: the loaded voltage measured after the resistor is ~136V, and the
current measured between the anode resistor and the tube is 1.7-1.8 mA.
If I crank the supply to 200.0V, the current between the resistor and
the tube cathodes rises to ~2.5mA and the digits look sharper and
brighter; the voltage after the resistor is then ~137V.

My questions are:

1. To get maximum tube longevity, should I measure the voltage before
or after the anode resistor?
2. Am I measuring the current correctly between the anode resistor and
the tube cathode?

The data sheet for this tube states: firing voltage no more
than 170V; working current for the digits 2.5-4.5 mA; stated longevity
no less than 10,000 hours if voltage ≤ 200V and anode current per digit
≤ 2.5 mA. But there are 8,760 hours in a year, so at that rating the
tubes would fail before two years of continuous use. Others have stated
that the lifetime of these tubes can be a decade; I assume that comes
from running them at lower currents and voltages.

3. What voltage and current would most of you run these tubes at?
4. Does it matter which voltage and anode resistor combination I use,
as long as the cathode current doesn't exceed 2.5 mA?
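For what it's worth, my numbers check out against Ohm's law across the anode resistor: the current is just (supply voltage minus anode voltage) divided by the resistance. A quick sketch using my measured values (the 2.0 mA target in the last line is only an example figure, not a recommendation):

```python
# Ohm's law across the anode resistor of a Nixie tube circuit.
R_ANODE = 26_000.0  # my anode resistor, in ohms

def anode_current_ma(v_supply: float, v_anode: float, r: float = R_ANODE) -> float:
    """Current through the anode resistor, in mA.

    v_supply: HV supply voltage before the resistor.
    v_anode:  voltage measured after the resistor (at the tube anode).
    """
    return (v_supply - v_anode) / r * 1000.0

def resistor_for_current(v_supply: float, v_anode: float, i_ma: float) -> float:
    """Anode resistor (ohms) that would give i_ma at the same anode voltage."""
    return (v_supply - v_anode) / (i_ma / 1000.0)

# My two operating points:
print(anode_current_ma(181.0, 136.0))  # ~1.73 mA, matching my 1.7-1.8 mA reading
print(anode_current_ma(200.0, 137.0))  # ~2.42 mA, close to my ~2.5 mA reading

# Example: resistor for 2.0 mA at the 181V setting, assuming the tube
# still holds ~136V across itself:
print(resistor_for_current(181.0, 136.0, 2.0))  # 22500 ohms
```

So the meter readings are internally consistent, which makes me think the measurement method itself is sound; the open question is what operating point to pick.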