Thanks James!

Hmm, this explains a lot.

I learned a good deal of statistics as an EE, trying to follow in the
footsteps of Norbert Wiener. In practice, most EE applications start
with the model that a signal exists, and that it has one of two (or N,
typically 2**N) values. So the decision isn't "how likely is this
observation given the existence of the signal?" but "which of these
signals is more likely given the observation?"

"Frequentism" doesn't really enter into it in stochastic systems engineering.

Deciding whether the climate has any sensitivity to CO2 at all always
struck me as pretty much meaningless. We know that greenhouse gases
are important. Why shouldn't *extra* greenhouse gases be important? So
I was always baffled by the emphasis on the "detection" question.
Clearly for me the policy-relevant questions were always continuous,
not boolean. There really is no sensible null hypothesis without
throwing away a lot of extant knowledge.

"Do you believe in global warming" is and has always been a question
of very low utility. Attempting to answer validates the ill-posed
question.

On the other hand, I think that calling it a "Bayesian/frequentist"
distinction makes it all seem too subtle and even pretentious in a
public discussion. The jargon threw me off, and I actually think this
way! It is important to try to make these issues accessible.

Many people with little statistical background can understand the
concept of the right question vs the wrong question.

The attribution question as usually phrased is "assuming there were a
bunch of trials with no forcing, would fewer than 5% of them show as
much signal as we have now?", with the "signal" in turn often boiled
down to global mean surface temperature (throwing away lots of
information). Despite Roger's incorrect analysis, the answer to this
question is somewhere between 95% and 99.999999999% by
now, but it's the wrong question!

A better question, if we must identify the problem as "global
warming", is: given the evidence of a warming signal, what is the most
likely proportion of it that is due to anthropogenic forcing of the
atmosphere, as opposed to (1) free modes, (2) natural forcing, and (3)
measurement error? Denialists should note that nothing a priori
constrains this proportion to be less than 100%; there may be natural
cooling or a cooling bias in the measurements.

There are other useful questions besides that one, but it's a start;
it seems like a much sounder question to pose in consideration of the
observational record.
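To make the contrast concrete for readers without a statistics background, here is a toy sketch (emphatically not a climate model) of the two questions side by side. All the numbers (observed trend, unforced variability, measurement error, model-predicted forced trend) are made-up illustrative values, and the flat prior over the anthropogenic fraction is an assumption chosen for simplicity:

```python
# Toy contrast between the "detection" question (a tail probability
# under a no-forcing null) and the "attribution" question (a posterior
# over the anthropogenic fraction of the observed signal).
# All numbers below are illustrative, not real climate data.
import math

obs = 0.18          # observed warming trend, deg C / decade (made up)
null_sd = 0.06      # spread of trends across unforced trials (made up)
meas_sd = 0.05      # measurement error on the trend (made up)
sd = math.hypot(null_sd, meas_sd)   # combined uncertainty

# Question 1 (detection): assuming no forcing, how rarely would a
# trial show as much signal as we observe? One-sided Gaussian tail.
z = obs / sd
p_null = 0.5 * math.erfc(z / math.sqrt(2))

# Question 2 (attribution): model the observation as
#   obs = f * anthro_expected + noise,
# and compute a grid posterior over the fraction f with a flat prior.
# The grid deliberately extends past 1.0, since nothing a priori
# constrains f below 100% (the natural part may be a cooling).
anthro_expected = 0.20              # forced trend from models (made up)
fs = [i / 100 for i in range(-50, 201)]   # f from -0.5 to 2.0
like = [math.exp(-0.5 * ((obs - f * anthro_expected) / sd) ** 2)
        for f in fs]
total = sum(like)
post = [w / total for w in like]
mean_f = sum(f * w for f, w in zip(fs, post))
p_over_100 = sum(w for f, w in zip(fs, post) if f > 1)

print(f"P(signal this large | no forcing): {p_null:.4f}")
print(f"posterior mean anthropogenic fraction: {mean_f:.2f}")
print(f"P(fraction > 100%): {p_over_100:.2f}")
```

The first number answers the boolean question and, as argued above, is nearly useless for policy once detection is long settled; the second gives a continuous answer, and its posterior naturally assigns weight to fractions above 100%.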

mt

--~--~---------~--~----~------------~-------~--~----~
You received this message because you are subscribed to the Google Groups 
Global Change ("globalchange") newsgroup. Global Change is a public, moderated 
venue for discussion of science, technology, economics and policy dimensions of 
global environmental change. 

Posts will be admitted to the list if and only if any moderator finds the 
submission to be constructive and/or interesting, on topic, and not 
gratuitously rude. 

To post to this group, send email to [email protected]

To unsubscribe from this group, send email to [EMAIL PROTECTED]

For more options, visit this group at 
http://groups.google.com/group/globalchange
-~----------~----~----~----~------~----~------~--~---