Yes, Dr. Greenberg, I concede your point. In one's immediate research
one must go far beyond having faith in the publishing process.
        By the way, do journals keep accurate data on their rejection rates,
on re-submission rates, etc.?  This would be the sort of information that
could be used to distinguish between legitimate journals and journals with
political agendas.
      However, at least in part, my remarks were directed toward our
acceptance of work well outside our field.  I would like to hold intelligent
opinions on climate change, for instance, without having to understand all
the climatology, meteorology, oceanography, paleontology, modeling, etc.
that truly enlightened opinions are based on.  I would like to believe that
the voodoo-sounding stuff and the particle zoo that physicists talk about are
well-founded in theory and experiment, but I don't understand their
mathematics and I never will.  So when physicists say they have found the
top quark, or that there ought to be a Higgs boson, I have to take that on
faith, or perhaps, as Dave Raikow suggested in an earlier post, we should
call it "confidence."  Condidence that those guys know what they're talking
about, that their journal editors and reviewers aren't nuts or corrupt,
confidence that their mathematics isn't black magic.
        As an aside, here's a question I put to you: Is this confidence I'm
talking about very different, at the psychological level, from the
confidence that a tribal person might have in the magical powers of his/her
shaman?  "Well, no *I *don't understand how his curse makes my milk go sour,
but *he *understands.  He has worked with some of the best shamans arround.
His father slew seven leopards.  There was a circle around the moon on the
night he was born."  This is a rather whimsical set of credentials, but is
the psychology by which they might be accepted in a tribal context any
different from that in effect when laymen in our society accept our
scientific credentials?

          Martin

2009/7/8 Jonathan Greenberg <greenb...@ucdavis.edu>

> Martin:
>
>   I certainly hope most scientists don't rely on "faith" in the peer review
> process to determine if a paper is valid or not.  I've always treated
> peer-review as just setting a low-end of reliability -- e.g. the paper isn't
> AWFUL if it made it into this journal, and is at least worthy of me reading
> it -- the better the journal, typically, the higher the bar, but no journal
> comes close to being infallible.  If you've reviewed for mid- to upper-tier
> journals, you'll know that the vast majority of submissions are terrible --
> we throw out a LOT of bad research.  Since science requires repeatability of
> results, if a paper is absolutely novel and brand new, I will ALWAYS spend a
> LOT more time reading through it than if it's basically confirming what a lot
> of other papers have confirmed -- peer review + repetition of results =
> higher reliability.
>   Personally, I disagree with the statement "The problem is that no
> individual has enough time, knowledge, and
> background to know if the scientific method is being properly applied by all those
> who claim to be doing so."  If you are citing a paper or using a paper to
> guide your own research, as a scientist you should be reading the paper
> carefully enough to decide whether or not it is scientifically grounded --
> if you are just pulling out "facts" from the abstract and discussion, you
> aren't really doing your job.  This type of behavior WILL catch up with you,
> eventually -- if you are basing your own research on an assumption of
> validity of someone else's work simply because that work made it into a
> journal, and that work proves to be in error, you are essentially shooting
> yourself in the foot down the road.
> --j
>
>
> Martin Meiss wrote:
>
>>      I find this exchange very interesting, and it points up a major
>> problem caused by the burgeoning of scientific knowledge and the limitations
>> of the individual.  As scientists, we believe (have faith) that the
>> scientific method is the best means of arriving at truth about the natural
>> world.  Even if the method is error-prone in some ways, and is subject to
>> various forms of manipulation, it is historically self-correcting.
>>       The problem is that no individual has enough time, knowledge, and
>> background to know if the scientific method is being properly applied by all those
>> who claim to be doing so. We hear someone cite a suspicious-sounding fact
>> (i.e., a fact that doesn't correspond to our perhaps-erroneous
>> understanding), and we want to know if it is based on real science or
>> pseudo-science.  So what do we do?  We ask if the supporting research
>> appeared in a peer-reviewed journal (i.e., has this been vetted by the
>> old-boys network?).  This sounds a little like the response of the people
>> who first heard the teachings of Jesus.  They didn't ask "How do we know
>> this is true?"  They asked "By whose authority do you speak?"
>>        These two questions should never be confused, yet the questions "Did
>> it appear in a peer-reviewed journal?" and "Is that journal REALLY a
>> peer-reviewed journal?" skate perilously close to this confusion.  We are
>> looking for a short-cut, for something we can trust so we don't have to be
>> experts in every branch of science and read every journal ourselves.  I
>> don't know the answer to this dilemma, and perhaps there is none, but we
>> should be looking for something better than "Does this have the stamp of
>> approval of people who think like I do?"  We should be looking for something
>> that is not just an encoding of "Does this violate the doctrine of my
>> faith?"  The pragmatic necessity of letting others decide whether certain
>> research is valid should be no excuse for relaxing our personal vigilance
>> and skepticism.  Otherwise, we fall into the same trap that ensnares the
>> religionists who are trying to undermine science because it threatens their
>> faith.
>>
>>                 Martin M. Meiss
>>
>>
>> 2009/7/8 Kerry Griffis-Kyle <kerr...@yahoo.com>
>>
>>
>>
>>> I am teaching a Sophomore/Junior level evolution course at Texas Tech
>>> (where a significant proportion of my students believe evolution is
>>> anti-God).  One of the activities I have them do is take three creationist
>>> claims about science and use the peer-reviewed scientific literature to
>>> find evidence to support or refute the claim.  It makes them really think
>>> about the issues; and if they follow the directions, it does a better job
>>> than any of my classroom activities at convincing them that the claims
>>> against evolution are just a bunch of hooey.  Unfortunately, there are
>>> journals claiming peer-review status that are not.  It can be very
>>> frustrating.
>>>
>>> Like Raphael, I also wonder if there is a good source the students can use
>>> as a rubric for telling if a journal article is peer-reviewed.
>>>
>>> *****************************
>>> Kerry Griffis-Kyle
>>> Assistant Professor
>>> Department of Natural Resources Management
>>> Texas Tech University
>>>
>>> --- On Tue, 7/7/09, Raphael Mazor <rapha...@sccwrp.org> wrote:
>>>
>>>
>>> From: Raphael Mazor <rapha...@sccwrp.org>
>>> Subject: [ECOLOG-L] "real" versus "fake" peer-reviewed journals
>>> To: ECOLOG-L@LISTSERV.UMD.EDU
>>> Date: Tuesday, July 7, 2009, 5:03 PM
>>>
>>>
>>> I've noticed a number of cases lately where groups with a strong political
>>> agenda (on topics like climate change, evolution, stem cells, or human
>>> health) cite "peer reviewed" studies in journals that are essentially
>>> fabricated for the purpose of advancing a specific viewpoint.
>>>
>>> What's a good way to tell when a journal is baloney? Of course, it's easy
>>> for a scientist in his or her own field to know when a journal is a sham,
>>> but how can we let others know it's obviously fake? For example, are only
>>> "real" journals included on major abstract indexing services?
>>>
>>> -- <><><><><><><><><>
>>> Raphael D. Mazor
>>> Biologist
>>> Southern California Coastal Water Research Project
>>> 3535 Harbor Boulevard, Suite 110
>>> Costa Mesa, CA 92626
>>>
>>> Tel: 714-755-3235
>>> Fax: 714-755-3299
>>> Email: rapha...@sccwrp.org
>>>
>>
> --
>
> Jonathan A. Greenberg, PhD
> Postdoctoral Scholar
> Center for Spatial Technologies and Remote Sensing (CSTARS)
> University of California, Davis
> One Shields Avenue
> The Barn, Room 250N
> Davis, CA 95616
> Cell: 415-794-5043
> AIM: jgrn307, MSN: jgrn...@hotmail.com, Gchat: jgrn307
>
