-----Original Message-----
From: Rich Murray
Subject: 3 puzzling long runs with H-Ni on thin Ni strips at 350-750 K, some heat, gamma, S Focardi et al 2004:
http://www.lenr-canr.org/acrobat/FocardiSevidenceof.pdf

These results, and even the inconsistency between them, are fully explainable by what can be called "Bethe-fusion" of dense hydrogen, such as occurs in our sun:

P + P --> D + positron + neutrino

This would be a QM version (lower probability, with tunneling) occurring under 2D conditions of high density and low heat, instead of the high density and high heat of our sun. If Ni nanoparticles had been used instead of Ni strips, there would have been less inconsistency.

Obviously the neutrino in this reaction is not easily detectable, and it sets the stage for variability, since it can carry away much (even most) of the energy, but in a variable proportion: 0.26 MeV is the average energy carried away by the neutrino, but that figure can vary widely based on unplanned factors. Thus small changes in the Focardi parameters - such as the surface morphology of the strip, or minor contaminants in the nickel - can result in large variations in apparent results: even a swing from little observable excess energy to lots of it.

When most of the positrons are vectored into reciprocal space, there is little radiation in 3-space, but with small changes, more can show up. "Reciprocal space" is being used here in an expansive way, to include another dimension (or fractal) as well as Dirac's epo field - that is, "k-space", the space in which the Fourier transform of a spatial function is represented. A Fourier transform takes us from "real space" to reciprocal space or vice versa, but both are equally real in their final effects. When the reaction produces excess heat in 3-space, positron annihilation will more likely be in evidence - thus gamma radiation.

Overriding it all, if the reactions we are dealing with are QM-based and divorced from the expectations of thermonuclear fusion, is that a probability field develops for various small changes, based on initial conditions, and Focardi is not likely to be aware of them.
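As a quick sanity check on that 0.26 MeV average figure, here is a short sketch (my own illustration, not anything from the Focardi paper) computing the total energy budget of the p + p reaction from standard rest masses; the neutrino's variable share is bounded by this Q-value:

```python
# Illustrative sketch: Q-value of p + p --> d + e+ + nu,
# using standard particle rest masses in MeV/c^2 (CODATA/PDG values).
m_p = 938.272   # proton
m_d = 1875.613  # deuteron
m_e = 0.511     # positron (same rest mass as the electron)

# Total energy released, shared (variably) between the positron,
# the neutrino, and a little nuclear recoil:
Q = 2 * m_p - m_d - m_e
print(f"Q = {Q:.3f} MeV")  # ~0.420 MeV
```

The neutrino can take anywhere from nearly zero up to nearly the full ~0.42 MeV on any given event; the ~0.26 MeV figure is only the average over many events, which is exactly why run-to-run variability is built in.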
Moreover, if we base everything on Dirac, the epo field, and some kind of ZPE force utilized in a step-wise process which first converts hydrogen to "pycno", a denser allotrope, then the energy deficit caused by the first step (densification) might force positrons into a hidden vector (Dirac reciprocal space) as makeup for energy already extracted, and that probability becomes persistent via entanglement, as JS Brown suggests.

Jones

The Fourier transform is well described in the Wiki entry as a mathematical operation that decomposes a signal into its constituent frequencies. Its reality, as a practical matter, is more contentious. This is reminiscent of the neutrino, which was merely invented as a mathematical convenience when first introduced - and has since been proved to be real. The Fourier transform of a musical chord is a mathematical representation of the amplitudes of the individual notes that make it up, and thus seems "unreal" in one sense - yet we can use the transform to recreate the identical musical chord with a proper device, so it has derivative reality. The original signal depends on time, and is therefore called the time-domain representation of the signal, whereas the Fourier transform depends on frequency and is called the frequency-domain representation. As we can expect - but I will save him the trouble - Fran Roarty will be quick to report that this is how and where "time distortion" due to cavity QED and relativistic effects can enter the picture.
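The musical-chord point is easy to make concrete. The sketch below (my own example, using NumPy; the C-major frequencies are just illustrative) decomposes a synthetic chord into its note frequencies with the FFT, then rebuilds the identical signal with the inverse transform - both domains carry the same information:

```python
import numpy as np

# Build one second of a synthetic "musical chord" from three sine waves.
fs = 8000                          # sample rate, Hz (illustrative choice)
t = np.arange(fs) / fs             # 1 second of sample times
notes = [261.63, 329.63, 392.00]   # C4, E4, G4 - a C-major triad
chord = sum(np.sin(2 * np.pi * f * t) for f in notes)

# Frequency-domain ("reciprocal space") representation:
spectrum = np.fft.rfft(chord)
freqs = np.fft.rfftfreq(len(chord), 1 / fs)

# The three strongest bins sit at the note frequencies:
peaks = sorted(np.round(freqs[np.argsort(np.abs(spectrum))[-3:]]))
print(peaks)  # ~[262, 330, 392] Hz

# Inverse transform recovers the original time-domain chord:
rebuilt = np.fft.irfft(spectrum, n=len(chord))
print(np.allclose(rebuilt, chord))  # True
```

So the frequency-domain picture is "unreal" only in the sense that no one hears a spectrum directly; as the round trip shows, nothing is lost going between the two representations.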

