I need to summarise my factoids on this test in one place, as opposed to the scattered remarks I've made thus far. I'm focusing here on the pulsed regime, which constituted the bulk of the test time.
1. There is controversy over where exactly the power measurements were made: on the input (mains) side or on the output (device) side of the control box? Recall that mains was 3-phase and the device drive was single-phase. I will assume here that the measurement was on the mains side, and thus 3-phase.

2. The report shows the device temperature varying synchronously with the pulses, up to a small phase lag. This is the expected behaviour.

3. The report states that, in the pulse ON state, the input and output powers are identical (~810 W), up to measurement error. This implies that the chief component of any jiggery-pokery would have to happen during the pulse OFF state.

4. In the pulse OFF state, the only power draw reported is that of the control box (~110 W). Even under the maximally conservative assumption that 100% of this power reaches the device (and is therefore not consumed within the control box), the report still calculates a healthily over-unity COP.

Putting all this together, there appear to be only two candidates for deception:

A) The mains feed contains a DC offset, and/or contains RF power above roughly 60 kHz; in either case it would be undetectable to the meter.

B) Something in the control box makes up the difference.

B) seems unlikely because it would require batteries, and Hartman states that the box was much lighter than that would demand. No battery technology exists that is that light, and/or occupies so little volume, and could make up the total energy difference as measured over 100+ hours. Therefore, the only workable theory of possible deception is A).

Andrew
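To make point 4 concrete, here is a minimal sketch of the "maximally conservative" accounting. Only the 810 W and 110 W figures come from the report; the duty cycle and average output power used in the example call are hypothetical placeholders, since the report's exact values aren't reproduced here.

```python
# Conservative COP accounting: assume ALL off-state control-box
# power reaches the device (none is consumed inside the box).

P_ON = 810.0    # W, input power during pulse ON (from the report)
P_OFF = 110.0   # W, control-box draw during pulse OFF (from the report)

def conservative_cop(p_out_avg, duty_cycle):
    """COP under the maximally conservative input assumption.

    p_out_avg  -- time-averaged thermal output power, W (hypothetical)
    duty_cycle -- fraction of time the pulse is ON (hypothetical)
    """
    # Time-averaged input: ON-state power weighted by duty cycle,
    # plus the full control-box draw during the OFF fraction.
    p_in_avg = duty_cycle * P_ON + (1.0 - duty_cycle) * P_OFF
    return p_out_avg / p_in_avg

# Hypothetical example: 50% duty cycle, 810 W average output.
# Average input = 0.5*810 + 0.5*110 = 460 W, so COP = 810/460 ≈ 1.76.
print(round(conservative_cop(810.0, 0.5), 2))
```

The point of the sketch is simply that with an off-state draw this small, the averaged input falls well below the ON-state figure, which is how the report arrives at an over-unity COP even under this worst-case allocation of the control-box power.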

