I have to disagree with Annette (and others) on this one (I almost always agree with Annette :-) ). I was the chair of the IRB where I taught before, and now I serve on the college-wide IRB.
I think this is exactly an example of an IRB gone wild. I think this is an example that contributes to the perception that IRBs are overstepping their bounds as the "ethics police" and unduly interfering with research (especially minimal-risk research). I know Tricia Keith-Speigel is doing research on this as we speak - perhaps she can chime in.
I am aware that IRBs can choose to set the bar higher than the federal regulations (which this certainly is an instance of). However, I think that is dangerous here for several reasons (in no particular order):
-many measures used by researchers (incl. me) do not have reliability and validity data. There would be no way to provide evidence of reliability and validity because it does not exist (beyond "these are the measures that I've used before or that other people use").
-often you ask new questions that you simply have to write yourself.
-in order to collect data on reliability and validity you often have to ask questions that are "bad"
-a very important educational function is for students to write items themselves and do the best they can. They then realize (just like we do) that the items are probably not that great. Having the IRB serve a policing function for such research is not only unrealistic; I do not think that is the job of the IRB.
-IRBs are rarely experts in the research "you" do. How can they reasonably make a judgment about the quality of measures?
-according to the federal guidelines, anonymous, minimal-risk research is EXEMPT from review. Unless the IRB has chosen to set a higher bar than the federal guidelines, most research that is done by students is never reviewed (again, if one follows the regs).
Just some thoughts before teaching my last class of the semester (yeah!)
Marie


Annette Taylor, Ph. D. wrote:
As chair of our IRB I have sometimes done the same thing, especially if the 
measures send up a red flag somehow. If the measures are reliable and valid 
then this is an extremely easy task. If they are not, then even in a minimum 
risk study you are abusing your participants if you are asking them to give up 
their time and energy on a useless task.

As a psychologist I find I am more mindful of such issues than my colleagues 
from other disciplines. Some of them--especially from the 'hard' sciences--seem 
clueless about tests even having reliability and validity. I don't think the 
IRB has gone wild at all. It is doing its job. This should be easily 
accomplished and easily remedied.

If that was the only thing problematic with the proposal, I would have marked it 
as 'approved pending modifications' and then, usually within a day of getting the 
requested information, gotten back to the researcher and told them it 
was approved. At least at our school it is not a big hassle. Over the years of 
doing such things I find most researchers end up grateful for the heads-up on a 
problem with their studies--it boils down to "it's not what you say but how you 
say it".

Annette

Quoting [EMAIL PROTECTED]:

  
Our relatively new IRB has sent back a proposal from a colleague. The IRB 
refuses to evaluate the proposal without the author addressing issues of 
RELIABILITY and VALIDITY of measures. I find this to be a bit scary. While I 
feel that the IRB is properly charged with evaluating the risk to participants 
using a given method, I do not feel that the IRB has any place evaluating the 
appropriateness of the method beyond the evaluation of risk...especially in 
cases with minimum risk. My contention is that the reliability and validity of 
measures should be outside the purview of the IRB unless risk levels exceed 
minimum and a cost/benefit decision must be discussed.

Thoughts? Can anyone help me out here?


---
You are currently subscribed to tips as: [EMAIL PROTECTED]
To unsubscribe send a blank email to [EMAIL PROTECTED]



Annette Kujawski Taylor, Ph. D.
Department of Psychology
University of San Diego 
5998 Alcala Park
San Diego, CA 92110
[EMAIL PROTECTED]


-- 
*********************************************
Marie Helweg-Larsen, Ph.D.
Associate Professor of Psychology
Dickinson College, P.O. Box 1773
Carlisle, PA 17013
Office: (717) 245-1562, Fax: (717) 245-1971
Webpage: www.dickinson.edu/~helwegm
*********************************************
