One of my stupid plumber ideas is to add a tank after the flowmeter to avoid problems with back pressure...
A similar solution in electricity is to use a third-party "online" UPS to recondition the AC. Another idea is to use old-tech instruments, including old electric meters and water meters...

2014-05-13 23:34 GMT+02:00 Jed Rothwell <[email protected]>:

> Susanna Gipp <[email protected]> wrote:
>
>> did anybody have the chance to read this?
>>
>> http://matslew.wordpress.com/2014/05/12/defkalion-demo-proven-not-to-be-reliable/
>>
>> kinda unbelievable how cheap the trick they used to fool Gamberale was
>
> This trick is so cheap, and so transparent, I doubt it was a trick. My gut feeling is that it was a stupid mistake. It would be mind-boggling if a "trick" like that would work on an engineer or scientist. I have heard that the people from NI took one look at the shoddy setup and told Defkalion they were no longer invited to NI Week.
>
> I do not know how long it took Gamberale to discover this problem, but . . .
>
> While I do not mean to boast here, it would take me 10 minutes to discover the flow rate is wrong by a margin as large as this. The first thing I do when checking flow calorimetry is measure the inlet and outlet temperatures with a hand-held thermocouple, and then I measure the flow rate with a stopwatch and a graduated cylinder. (Or a carafe and a weight scale.) This is not rocket science! It is easy.
>
> I have done this several times at various labs. As I recall, I found large errors during Patterson's demo, during one of Gene's experiments, at Hydrodynamics, and at two other places I do not recall. That is why I do not trust flow meters. The darn things get clogged up, or they run backwards, as Gamberale described. They are the Achilles' heel of flow calorimetry. You can't trust them until you verify them. You need to keep checking them throughout the experiment. I recall the user manual for one of them specifically said you should test the instrument by collecting water in a graduated cylinder. It is just common sense.
>
> As I said, when you measure the flow rate manually, the answer is approximate. If the flow meter says 1.16 L/min, and you get somewhere between 0.9 and 1.2 L, you are good to go. You know the thing is working right. Actually, though, with a little practice and several tries, you can get closer than that. You need to do this several times during the course of a test to be sure the flow rate is not fluctuating significantly.
>
> Try this at home! You do not even need a flow meter. Turn on the tap and measure the flow of water several times. You will see that the variation is small. Flush the toilet and see if you can measure the difference from the drop in water pressure.
>
> When the output is steam, you use a bucket of cold water to sparge the steam. Then you measure the increase in weight and temperature. It amounts to the same thing as measuring a flow of liquid water with a graduated cylinder.
>
> If I had been at Defkalion's test and they said "no, you are not allowed to measure the flow rate" I would have told them: "Then I must assume you people are frauds, and I will take the next plane home and tell everyone that." That is more or less what I told Patterson when he refused to let me make my own measurements. He thought about it and changed his mind.
>
> Rossi told me I would not be allowed to make measurements so I did not go. I suppose Defkalion uninvited me three times after they realized I meant to actually measure things.
>
> I could not make manual measurements of high-precision equipment such as SRI's. You can't monkey with that. Fortunately, people like McKubre, Storms and Miles are professionals who use redundant instruments and they check everything to a fare-thee-well. This is described in their papers. Still, if I were to visit them I would check the flow rate if I could. You cannot as easily check the performance of a Seebeck calorimeter. The blue Thermonetics Seebeck calorimeter in Ed's lab belongs to me, so I guess I should believe it.
>
> - Jed
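
For what it is worth, the cross-check Jed describes is just back-of-the-envelope arithmetic. Here is a minimal sketch in Python, with made-up illustrative numbers (the 1.05 L collected, the temperatures, and the ~25% tolerance are assumptions roughly matching Jed's 0.9-1.2 L example, not data from any real test):

    # Flow calorimetry sanity check: compare the flow meter reading to a
    # manual stopwatch-and-graduated-cylinder measurement, then compute power.
    # All numbers below are illustrative, not from any real test.

    C_WATER = 4186.0   # J/(kg*K), specific heat of liquid water
    RHO_WATER = 1.0    # kg/L (close enough near room temperature)

    def manual_flow_rate(volume_collected_l, seconds):
        """Flow rate in L/min from a timed graduated-cylinder collection."""
        return volume_collected_l / (seconds / 60.0)

    def flow_meter_ok(meter_l_per_min, manual_l_per_min, tolerance=0.25):
        """True if the manual check agrees with the meter within ~25%."""
        return abs(meter_l_per_min - manual_l_per_min) <= tolerance * meter_l_per_min

    def output_power_watts(flow_l_per_min, t_in_c, t_out_c):
        """P = m_dot * c * dT for liquid-phase flow calorimetry."""
        m_dot = flow_l_per_min * RHO_WATER / 60.0   # kg/s
        return m_dot * C_WATER * (t_out_c - t_in_c)

    # Example: meter says 1.16 L/min; we collected 1.05 L in 60 s by hand.
    manual = manual_flow_rate(1.05, 60.0)
    print(flow_meter_ok(1.16, manual))              # True -> the meter is plausible
    print(output_power_watts(manual, 20.0, 25.0))   # ~366 W for a 5 C rise

The steam case is the same bookkeeping: weigh the sparge bucket before and after, and add the latent heat of the condensed steam to the sensible heat picked up by the bucket water.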

