I'm curious whether there is any definitive data showing the relationship between the amount of sputtering from a pulsed current versus a constant current in a Nixie. Example: one tube runs at 3 mA constant current, while another runs at a 50% duty cycle with 6 mA current pulses. In theory, depending on the tube's response time, the apparent brightness might be similar. If the relationship were linear, there would be no difference in the amount of sputtering generated over equal time periods. If it were nonlinear, perhaps exponential, the sputtering in the pulsed example could be many times greater over the same period. Obviously, this impacts tube life, cathode poisoning, and possibly other factors.
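Just to make the arithmetic concrete, here's a quick sketch. It assumes a simple power-law model where sputter rate goes as I^n; the exponent n is purely hypothetical here (that's exactly the number real data would have to pin down), and the function name is my own.

```python
# Hypothetical illustration only: assume sputter rate scales as I^n,
# where the exponent n is an assumed parameter, NOT measured data.

def relative_sputter(i_ma, duty, n):
    """Time-averaged relative sputter under the assumed I^n model."""
    return duty * i_ma ** n

for n in (1.0, 1.5, 2.0, 3.0):
    constant = relative_sputter(3.0, 1.0, n)  # 3 mA, 100% duty
    pulsed = relative_sputter(6.0, 0.5, n)    # 6 mA pulses, 50% duty
    print(f"n = {n}: pulsed/constant sputter ratio = {pulsed / constant:.2f}")
```

Under this (assumed) model, n = 1 gives identical totals for the two drive methods, while n = 2 makes the pulsed tube sputter twice as much and n = 3 four times as much over the same period, so the question really hinges on what the true exponent (or curve shape) is.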
Is there any factual data that can shed some light (no pun) on this? It would be useful in discussions about the pros and cons of drive method. I'm guessing that there must have been some study during the production years of Nixies, although it may have pertained more to the density of sputtering from a fixed area at different current rates with different materials, as opposed to a discussion of pulsed drive (which had not yet come of age).
