Since when does 6 C correspond with 42.8 F?
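
For what it's worth, here is a quick Python sketch of the two possible readings of that conversion; the 42.8 F figure only comes out if 6 C is treated as an absolute thermometer reading rather than as a rise above a baseline:

    # Standard Celsius/Fahrenheit conversions, shown for both cases.
    def c_to_f_absolute(c):
        # Convert an absolute Celsius reading to Fahrenheit.
        return c * 9.0 / 5.0 + 32.0

    def c_to_f_difference(dc):
        # Convert a Celsius temperature *difference* to Fahrenheit degrees.
        return dc * 9.0 / 5.0

    print(c_to_f_absolute(6.0))    # 42.8 -- a thermometer reading of 6 C
    print(c_to_f_difference(6.0))  # 10.8 -- a rise of 6 C above a baseline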





Sent from Windows Mail





From: CB Sites
Sent: Sunday, August 24, 2014 7:12 PM
To: vortex-l@eskimo.com





Jojo, I really think you miss the point.  Let's assume for a moment that the 
global average temperature rose 6 C above the historical average.  That is 42.8 
degrees Fahrenheit!  You and the deniers have got to understand what that 
means.  It means the extinction of life as we know it.  I know you deniers 
think mankind will somehow survive.  To be honest, I think that is doubtful.  
Economic systems will not survive, food supplies will fail, and warring 
political systems will doom the planet.



I really don't need to say much more; reality will take control and play out 
future events that the deniers will bitch about all the way to the extinction 
of man.










On Sun, Aug 24, 2014 at 9:38 PM, David Roberson <dlrober...@aol.com> wrote:


Eric, I realize how complex the problem these guys are facing must be.  That 
complexity is the root cause of their difficulties.  You have listed several 
good points and I will take them into consideration.

 

My main issue with the current models is that new processes and interactions 
are frequently being uncovered that modify the behavior of the models in a 
significant manner.  I ask: what would be the output of a model at the end of 
this century if all of the pertinent factors, known and unknown, were taken 
into consideration?  The recent acknowledgement of a new factor that allows for 
a 30-year pause in temperature rise is not an issue to be taken lightly.  It 
also leaves me concerned that there are likely more of these factors that 
remain hidden as of today.

 

I suspect you have relied upon curve-fitting routines in the past and realize 
that enough variables can be chosen and adjusted to match any set of input data 
as closely as desired, as long as that data is sparse.  You also probably 
realize that a high-order polynomial fit yields coefficients that vary 
depending upon the order of the polynomial chosen.  Many combinations of 
coefficients will fit the input/output data over a restricted range.  The 
problem shows up once you use those different coefficients to project the curve 
forward into unknown future points.
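
A small Python sketch of that effect, using made-up numbers rather than any 
real data: several polynomial orders reproduce the same sparse points almost 
equally well, yet project to very different values beyond the fitted range.

    # Illustrative only: sparse synthetic data, not a climate record.
    import numpy as np

    x = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
    y = np.array([0.1, 1.1, 1.9, 3.2, 3.9, 5.1])   # roughly linear, slightly noisy

    for order in (1, 3, 5):
        coeffs = np.polyfit(x, y, order)           # least-squares fit of the chosen order
        max_residual = np.max(np.abs(np.polyval(coeffs, x) - y))
        future = np.polyval(coeffs, 10.0)          # project beyond the sampled range
        print(order, max_residual, future)
    # Every order fits the sampled points closely; the projected values at
    # x = 10 differ enormously because the high-order coefficients are
    # unconstrained outside the fitted range.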

 

We are now clearly witnessing an example of the type of problem I am speaking 
of.  The old data apparently matched the functional relationship the modelers 
had chosen to an excellent degree until the pause.  They were confident that no 
pause would appear, and many suggested they would be worried if a pause lasted 
for more than about 5 years.  As we know, that time period came and went, the 
pause continued, and many of these guys were forced to seek an explanation.

 

Now, after several more years of unexpected pause, they have come up with their 
best explanation: the 30-year Atlantic current cycle.  Where was this cycle 
included during the long hockey-stick period?  Some might consider that the 
high rate of heating during the earlier period came about due to added heating 
from this same cycle.  That certainly makes sense to me.

 

So I cannot help but question predictions that have been based upon a defective 
model.  Furthermore, how confident can you possibly be that these guys now have 
all the important factors included within their models?  The proof can only be 
demonstrated by the performance of the models over a period of time in which 
they show reasonable results compared to the real world.  We are seeking 
knowledge of the world's climate 100 years from now as we make plans to counter 
the expected dangers.  It is nonsense to trust a model for this purpose when it 
does not work even 20 years into the future.  The past fits are trivial and can 
always be obtained by curve fitting.  The future fit reveals how well the model 
actually performs.  That is where they are lacking.

 

Eric, when I design an electrical network that is then built and tested, I 
expect it to perform as my model predicts.  If I measured results that were 
seriously in error, I would not recommend the circuit to others for the same 
application with its known problems.  Instead, I would dig deeper into the 
model and the devices until the measured results matched the model fairly well.  
I have in fact done this on several occasions.  Only then is the model useful 
for generating predictions of value.

 

Dave

 

 

 

 


-----Original Message-----
From: Eric Walker <eric.wal...@gmail.com>
To: vortex-l <vortex-l@eskimo.com>


Sent: Sun, Aug 24, 2014 4:51 pm
Subject: Re: [Vo]:global warming?








On Sun, Aug 24, 2014 at 12:43 PM, David Roberson <dlrober...@aol.com> wrote:




Eric, I suppose the difference between your beliefs and mine amounts to my 
expectation that climate scientists should be held to a high standard, as is 
required in most other endeavors.  You apparently are willing to give them a 
free pass since you have a gut feeling that they are right to some degree.




I don't think anyone is arguing for giving climate scientists a free pass for 
anything they want to do, any more than we would argue here for giving 
physicists a free pass to endlessly pour money into ITER or the National 
Ignition Facility; certainly not me.  I'm arguing for humility before expertise 
gradually developed in understanding a wicked problem.  We can question policy 
and funding decisions that are based on uncertain conclusions.  But stepping in 
and saying that we (the general public) are in as good a position to weigh the 
data as capable climate scientists is to lose a sense of proportion in the face 
of the amount of time and effort that must be expended to discern signal from 
noise in a complex domain.




Without such humility, we are prone to a little bit of unintentional hubris.  
It is similar to making the following statements as members of the general 
public:

- What you electrical engineers are saying about instantaneous power is bunk.  
I know that if the sine and the cosine fluctuate too rapidly, they'll jam 
together like the keys on a typewriter and throw the power out of whack.
- Making a practical quantum computer is not as hard as you guys make it out to 
be, for I have built one out of an erector set and rubber bands and know 
something about the basic principles involved.
- Moore's law is not at all insurmountable.  The electrical engineers are 
simply failing to see that if you add in some refrigeration lines, the 
temperature will be sufficiently decreased to allow a continued exponential 
increase in circuit density.  This is simple thermodynamics.

This is probably what we sound like to people who have studied climate science 
when we interject with our analyses without having spent years of our lives 
trying to understand the nuances of the problem.  One hesitates to do something 
similar in the context of LENR, and only does so because almost no one who has 
the proper qualifications is willing to undergo the stigma that will attach to 
anyone in physics who publicly examines LENR.




The overfitting of a model to a set of data is a generally known risk, and ways 
of avoiding it are taught in undergraduate courses.  If we do not give climate 
scientists the benefit of the doubt on this one, we will be proceeding from an 
assumption that they're incompetent.
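
One standard classroom safeguard, sketched here in Python with invented 
numbers: hold back part of the record, fit only on the remainder, and judge 
each candidate model by its error on the held-out portion rather than on the 
points it was fitted to.

    # Illustrative hold-out validation on synthetic data (not a climate record).
    import numpy as np

    rng = np.random.default_rng(0)
    x = np.linspace(0.0, 10.0, 40)
    y = 0.5 * x + rng.normal(0.0, 0.3, x.size)     # noisy linear trend

    train = x < 7.0                                 # fit on the early portion only
    test = ~train                                   # score on the later, unseen portion

    for order in (1, 6):
        coeffs = np.polyfit(x[train], y[train], order)
        train_err = np.mean((np.polyval(coeffs, x[train]) - y[train]) ** 2)
        test_err = np.mean((np.polyval(coeffs, x[test]) - y[test]) ** 2)
        print(order, train_err, test_err)
    # The high-order fit typically scores better on the training points and far
    # worse on the held-out points -- the usual signature of overfitting.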




In trying to understand what climate scientists are doing, I would draw an 
analogy to using our knowledge of radioactive decay half-lives to understand 
how much of a radionuclide will exist after a certain amount of time.  Because 
the process is a stochastic one, the knowledge of the half-life is close to 
useless in predicting whether an individual nucleus will decay at a certain 
time.  But over a period of time, the half-life will allow one to calculate the 
amount of the original radionuclide remaining to within a high degree of 
precision.  I doubt that this ability was acquired overnight.  It probably took 
a few years of trial and error to empirically tease out the exponential decay 
relation.  But even when they were working with less-than-reliable models, I'm 
guessing they were able to discern the general trend.
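
As a concrete instance of that aggregate calculation, here is a short Python 
snippet with generic numbers (no particular isotope intended):

    # Fraction of a radionuclide remaining after time t, given its half-life.
    def fraction_remaining(t, half_life):
        return 0.5 ** (t / half_life)

    print(fraction_remaining(3.0, 1.0))   # 0.125 -- after 3 half-lives, 1/8 remains
    # No single nucleus is predictable, but for a large sample the remaining
    # fraction tracks this curve to high precision.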




Another analogy to what climate scientists are trying to do is that of a 
mechanical engineer attempting to predict the temperature of an engine that has 
been running for a certain period of time.  It is probably difficult to predict 
the temperature at a specific thermocouple at an instant in time beyond a 
certain broad range.  But I'm guessing that it's not too hard to anticipate the 
average temperature across the thermocouples once one has become familiar with 
the operating characteristics of the engine in question.  Climate scientists 
are doing something similar, but at a stage analogous to when the laws of 
thermodynamics were less well understood.  Nonetheless, general trends can be 
discerned.
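
A toy Python illustration of that point, with invented numbers for a 
hypothetical engine: any one thermocouple reading scatters widely from run to 
run, while the average across probes is much more predictable.

    # Invented numbers: 1000 runs of 8 thermocouples on a hypothetical engine.
    import numpy as np

    rng = np.random.default_rng(1)
    true_mean = 90.0                                             # nominal temperature, deg C
    readings = true_mean + rng.normal(0.0, 15.0, size=(1000, 8))

    print(readings[:, 0].std())          # spread of a single probe (about 15)
    print(readings.mean(axis=1).std())   # spread of the 8-probe average (about 15 / sqrt(8))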




I would not at all be surprised if the relevant time ranges for useful 
predictions in climate change models were on the order of decades.  Each system 
being modeled has its own range of times within which statements are relevant.  
In some nuclear decays, the relevant time scales are on the order of 10^-20 to 
10^-8 seconds.  I would be surprised, in fact, if climate scientists were able 
to bring model predictions to within less than tens of years, given the great 
amount of latency involved for changes to show up in the system.




As for climate scientists adjusting their models periodically in the face of 
new facts, I am reminded of a quote attributed to Keynes, who was responding to 
a similar complaint:  "When my information changes, I alter my conclusions. 
What do you do, sir?"




Eric
