On 2021-12-03 02:51, Robin wrote:
If you put your detector in a well-grounded Faraday cage, it may eliminate most radio interference produced by sparking.
Use metal (not nylon) fly wire for the Faraday cage (or at least for a window, if you prefer the whole cage to be made of metal sheet). The space between the wires is small enough to shield most EM below about 150 GHz, but alpha, beta, or gamma should get through easily. I suggest you add a little credit-card-sized microprocessor to the detector that can run on batteries for a few hours and can easily be included in the Faraday cage, with no protruding wires. The microprocessor can log the counts and the time and store them on a microSD card for later use.
(Protruding wires would act as an antenna for the EM, defeating the purpose of the Faraday cage.)

BTW, to eliminate the radon, just make the experiment portable and take it elsewhere. Also let the detector run for a while before the experiment starts, so that you get a good indication of the average background radiation.
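
For what it's worth, a logger like the one Robin describes above only takes a few lines of MicroPython. The following is just a rough sketch under assumed conditions (an ESP32-class board, the detector's pulse output wired to an arbitrarily chosen GPIO 4, and the log written to the board's own flash filesystem; mounting an actual microSD card is port-specific), so pin numbers and filenames are placeholders, not tested values:

```python
# Rough MicroPython sketch for a battery-powered count logger (untested).
# Assumes an ESP32-class board with the detector's pulse output on GPIO 4.
import time
from machine import Pin

PULSE_PIN = 4          # placeholder: whichever GPIO the detector pulse output is wired to
LOG_INTERVAL_S = 10    # how often to write a line to the log

counts = 0

def on_pulse(pin):
    # IRQ handler: just increment the counter, keep it as short as possible
    global counts
    counts += 1

pulse = Pin(PULSE_PIN, Pin.IN, Pin.PULL_UP)
pulse.irq(trigger=Pin.IRQ_FALLING, handler=on_pulse)

# Append "seconds_since_boot,counts_in_interval" lines to a CSV file.
# On a board with a microSD slot you would mount the card first (port-specific)
# and write to /sd/counts.csv instead.
while True:
    time.sleep(LOG_INTERVAL_S)
    n, counts = counts, 0   # snapshot and reset (tiny race window, fine for this purpose)
    with open("counts.csv", "a") as f:
        f.write("{},{}\n".format(time.time(), n))
```

Left running on a battery pack inside the cage, with no wires crossing the shield, something like this would record counts versus time for later retrieval.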

My Geiger detector was apparently immune to the sparking; it never showed anything that could be attributed to it. On the other hand, it did seem sensitive to radioactive dust: on one occasion I managed to increase the already somewhat high background signal by a factor of three just by putting it in front of a 120 mm fan in a closed room. After enclosing it in a sealed plastic box, I never saw anything with it during the tests. I don't have the Geiger counter anymore, in any case.

The CMOS/CCD webcam detector could possibly benefit from being put in a sealed box inside a Faraday cage; whether it would be able to see much more than background radiation is the question. The low sensitivity (counts per unit of time) is a problem, and variations due to temperature are also an issue. When it did not malfunction outright, proximity to the plasma electrolysis cell increased the number of false detections due to sensor noise.
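
Just to illustrate what I mean by low sensitivity: the webcam approach basically comes down to covering the sensor, grabbing frames, and counting pixels that jump well above the dark-frame noise floor. A rough sketch of that idea (not my original script; the camera index, threshold and frame count are arbitrary, and a real run would need dark-frame calibration at the same sensor temperature):

```python
# Rough sketch of a covered-webcam "hit" counter (illustrative only).
# Requires opencv-python and numpy; camera index, threshold and duration are arbitrary.
import cv2
import numpy as np

CAM_INDEX = 0        # placeholder camera index
THRESHOLD = 60       # pixel value treated as a candidate hit (well above typical dark noise)
N_FRAMES = 1000      # how many dark frames to examine

cap = cv2.VideoCapture(CAM_INDEX)
hits = 0

for _ in range(N_FRAMES):
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Count bright pixels in an otherwise dark frame as candidate radiation hits.
    # Hot pixels and temperature-dependent noise will inflate this figure unless a
    # dark-frame reference taken at the same temperature is subtracted first.
    hits += int(np.count_nonzero(gray > THRESHOLD))

cap.release()
print("candidate hits:", hits)
```

With a sensor that small, even a genuine source only adds a handful of hits per minute, which is why temperature drift and electrical noise can easily swamp the signal.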

I thought in the past about using a Faraday cage, but in the end, partly because of the very low-budget nature of the tests, I simply "embraced" such emissions and tried to find the conditions that maximized them. In practice this mostly meant using higher voltages (typically up to 72 V in my case, which is unsustainable for more than short periods with KOH at or close to room-temperature saturation because of the violence of the reaction), although other parameters also have an effect, as mentioned earlier. It seems, for example, that the hotter the cathode, the higher the emissions, which makes sense on an intuitive level (stronger thermionic emission). Cathode materials that do not oxidize easily also seemed to work better.

Some authors have suggested that the electromagnetic emission itself is the result of novel processes occurring in the plasma/spark reaction, so simply measuring the EMI seemed like a very simple way to gauge, and then maximize, those processes. My tests were therefore mostly focused on lowering the voltage at which the plasma reaction could start and on increasing the amount of EMI generated.
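
In case anyone wants to try something similar with a bit more rigor than I did: since only relative changes matter here, "measuring the EMI" can be as crude as logging broadband RF power near the cell over time. A hedged sketch of one way to do that, assuming an RTL-SDR dongle and the pyrtlsdr package (this is not what I actually used; the center frequency, gain and sample counts are arbitrary):

```python
# Rough sketch: log relative broadband RF power near the cell as a proxy for EMI.
# Assumes an RTL-SDR dongle and the pyrtlsdr package; frequency/gain values are arbitrary.
import time
import numpy as np
from rtlsdr import RtlSdr

sdr = RtlSdr()
sdr.sample_rate = 2.048e6   # Hz
sdr.center_freq = 100e6     # Hz, arbitrary; in practice pick an otherwise quiet band
sdr.gain = 'auto'

try:
    with open("emi_log.csv", "a") as log:
        while True:
            samples = sdr.read_samples(256 * 1024)
            # Mean power of the IQ samples in dB (relative, uncalibrated).
            power_db = 10 * np.log10(np.mean(np.abs(samples) ** 2))
            log.write("{:.1f},{:.2f}\n".format(time.time(), power_db))
            log.flush()
            time.sleep(1)
except KeyboardInterrupt:
    pass
finally:
    sdr.close()
```

Time-stamping the RF power alongside the cell voltage and current would make it easier to see which parameters actually drive the emissions.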

I never seriously tried to measure excess heat. Evaporation calorimetry is not straightforward because much of the electrolyte is efficiently aerosolized from the cathode region, which can give the impression of far more heat being generated than is actually the case. Measuring the temperature at a single point can also give misleading results due to thermal stratification or gradients in the electrolyte (highly likely with cathodic plasma electrolysis).
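
To make the aerosol problem concrete: evaporation calorimetry equates the mass loss with evaporated water and credits it with the latent heat of vaporization, so any liquid carried away as droplets gets counted as if it had been boiled off. A toy calculation (all numbers made up for illustration, not measurements) shows how quickly that inflates the apparent heat:

```python
# Toy numbers to illustrate how aerosolized electrolyte inflates evaporation calorimetry.
# All values are made up for illustration, not measurements.
L_VAP = 2.26e6          # J/kg, latent heat of vaporization of water (approx., near 100 C)

mass_loss = 0.050       # kg lost from the cell during a run
aerosol_fraction = 0.4  # assumed fraction of that loss carried away as droplets, not vapor
energy_in = 1.0e5       # J of electrical input over the same run

apparent_heat = mass_loss * L_VAP                            # if all mass loss were evaporation
true_evap_heat = mass_loss * (1 - aerosol_fraction) * L_VAP  # only the genuinely evaporated part

print("apparent 'excess':", apparent_heat - energy_in, "J")
print("after aerosol correction:", true_evap_heat - energy_in, "J")
```

With these made-up numbers an apparently positive energy balance turns negative once the droplet loss is accounted for, which is the kind of pitfall that makes this approach hard to trust.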

Cheers, BA
