That analogy was not intended, and I’m merely trying to point out that AI is 
here to stay and is expected to become a more useful tool in our industry as it 
has been proven to be in the medical profession.


From: Douglas Nix <> 
Sent: Tuesday, October 10, 2023 1:03 PM
Subject: Re: [PSES] AI & Regulatory Compliance


You cannot compare ChatGPT to a medical AI that has been tailored for a 
specific task like medical image analysis. That’s like saying a 1968 Mini 
Cooper and a 1968 Dodge Charger are comparable because they are both cars. 


Using ChatGPT to summarize a paper, produce an abstract from uploaded text, or 
produce a set of points as a starting point is perfectly fine. You are giving 
the software the specific input material from which to generate the output. You 
cannot ask it research questions and expect a valid response, because ChatGPT 
has no parameters for correctness. It only wants to give you a 
plausible-sounding answer. It will give you an authoritative answer with no 
reference to anything resembling truth.


Doug Nix <>

(519) 729-5704


"All animals except man know that the ultimate joy of life is to enjoy it."  -- 
Samuel Butler



On Oct 10, 2023, at 14:15, Ralph McDiarmid <> wrote:


Physicians have used AI (expert systems) in their offices for many years as a 
tool to help them diagnose a problem more accurately and with greater speed. 
It’s a tool to speed productivity, and that’s how I use it today.


This message is from the IEEE Product Safety Engineering Society emc-pstc 
discussion list.