The required voltage rating of a cathode driver has been debated in this forum a few times. There are two opinions that I know of:

- The driver needs to be rated at the full anode-supply voltage.
- The driver only needs to be rated at (anode supply voltage - voltage drop across the nixie tube).

Based on my knowledge of semiconductor physics, it is fine to use a bipolar (NPN) cathode driver rated at a lower voltage than the anode supply, because the bipolar breakdown mechanism is non-destructive as long as the current is limited. When an NPN is off, meaning its base terminal is grounded (NOT open...), the collector leakage current sees no current gain, so the transistor stays off. I suspect an OPEN base could result in some visible glow, depending on the magnitude of the collector leakage current Ico. This is why legacy drivers like the K155 or 7441 can drive a nixie that requires about 160 V even though the IC itself can handle only 50 V.

If you are using a MOSFET as your cathode driver, which is typical in ICs, then you need a driver rated at the full anode-supply voltage, because a MOSFET's construction is sensitive to excess voltage (gate-oxide breakdown) as well as excess current. Long-term overstress of the oxide will cause reliability problems and lead to eventual failure.
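To make the argument concrete, here is a back-of-envelope sketch of why the underrated bipolar driver survives. All component values (supply, striking and maintaining voltages, breakdown rating, anode resistor) are illustrative assumptions typical of a small nixie, not figures from this thread:

```python
# Sanity check: a ~50 V bipolar driver switching a nixie on a much higher
# anode supply.  All values are illustrative assumptions, not measurements.

V_SUPPLY = 180.0    # anode supply voltage (V)       -- assumed
V_STRIKE = 170.0    # tube striking voltage (V)      -- assumed
V_MAINTAIN = 140.0  # tube maintaining voltage (V)   -- assumed
V_BVCEO = 50.0      # driver collector breakdown (V), 7441-class rating
R_ANODE = 15_000.0  # anode current-limiting resistor (ohms)

# Driver ON: the lit tube drops ~V_MAINTAIN and the saturated driver only
# sees a volt or so, so the 50 V rating is never challenged while conducting.
# The anode resistor sets the tube current.
i_on_mA = (V_SUPPLY - V_MAINTAIN) / R_ANODE * 1e3
print(f"ON-state tube current: {i_on_mA:.2f} mA")

# Driver OFF: with the tube dark, leakage pulls the collector toward the
# supply until the junction avalanches and clamps near V_BVCEO.  The tube
# then sees only V_SUPPLY - V_BVCEO, which is below the striking voltage,
# so the off cathode stays dark and the avalanche carries only leakage.
v_tube_off = V_SUPPLY - V_BVCEO
print(f"OFF-state voltage across tube: {v_tube_off:.0f} V "
      f"({'below' if v_tube_off < V_STRIKE else 'ABOVE'} striking voltage)")

# Even if the tube somehow struck, the anode resistor limits the avalanche
# current; with these numbers the loop cannot sustain conduction at all,
# since V_MAINTAIN + V_BVCEO exceeds the supply.
i_worst_mA = max(0.0, (V_SUPPLY - V_MAINTAIN - V_BVCEO) / R_ANODE) * 1e3
print(f"Worst-case sustained avalanche current: {i_worst_mA:.2f} mA")
```

The takeaway matches the post: the off-state collector is clamped by non-destructive avalanche, and the anode resistor bounds any current through it, so the driver's rating only has to exceed (supply minus tube drop), not the full supply.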