-----Original Message-----
From: Doctor Plum <[EMAIL PROTECTED]>
To: Doctor Plum <[EMAIL PROTECTED]>
Sent: Sat, 19 Apr 2008 12:18 am
Subject: [ctrl] The Government Is Trying to Wrap Its Mind Around Yours

Imagine a world of streets lined with video cameras that alert authorities to 
any suspicious activity. A world where police officers can read the minds of 
potential criminals and arrest them before they commit any crimes. A world in 
which a suspect who lies under questioning gets nabbed immediately because his 
brain has given him away.



Though that may sound a lot like the plot of the 2002 movie "Minority Report," 
starring Tom Cruise and based on a Philip K. Dick short story, I'm not talking 
about science fiction here; it turns out we're not so far away from that world. 
But does it sound like a very safe place, or a very scary one?



It's a question I think we should be asking as the federal government invests 
millions of dollars in emerging technology aimed at detecting and decoding 
brain activity. And though government funding focuses on military uses for 
these new gizmos, they can and do end up in the hands of civilian law 
enforcement and in commercial applications. As spending continues and 
neurotechnology advances, that imagined world is no longer the stuff of science 
fiction or futuristic movies, and we postpone at our peril confronting the 
ethical and legal dilemmas it poses for a society that values not just personal 
safety but civil liberty as well.



Consider Cernium Corp.'s "Perceptrak" video surveillance and monitoring system, 
recently installed by Johns Hopkins University, among others. This technology 
grew out of a project funded by the Defense Advanced Research Projects Agency 
-- the central research and development organization for the Department of 
Defense -- to develop intelligent video analytics systems. Unlike simple video 
cameras monitored by security guards, Perceptrak pairs its cameras with 
intelligent computer-vision software. It uses algorithms to analyze streaming video 
and detect suspicious activities, such as people loitering in a secure area, a 
group converging or someone leaving a package unattended. Since installing 
Perceptrak, Johns Hopkins has reported a 25 percent reduction in crime.
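
To make "intelligent video analytics" a bit more concrete, here is a minimal, 
hypothetical sketch in Python of one rule such a system might apply -- flagging 
someone who lingers too long in a restricted area. It is purely illustrative and 
is not Cernium's Perceptrak algorithm; the detections, zone names and threshold 
are all assumptions.

    # Illustrative sketch only -- not Cernium's actual Perceptrak system.
    # Assumes an upstream tracker already produces (timestamp, track_id, zone)
    # detections; this rule flags anyone who lingers in a restricted zone.

    LOITER_SECONDS = 120  # hypothetical alert threshold

    def find_loiterers(detections, restricted_zone="loading_dock"):
        """detections: (timestamp_sec, track_id, zone) tuples, sorted by time."""
        first_seen = {}   # track_id -> time it entered the restricted zone
        alerts = set()
        for t, track_id, zone in detections:
            if zone != restricted_zone:
                first_seen.pop(track_id, None)   # left the zone; reset the timer
                continue
            start = first_seen.setdefault(track_id, t)
            if t - start >= LOITER_SECONDS:
                alerts.add(track_id)
        return alerts

    # Example: track 7 stays in the loading dock for three minutes.
    stream = [(0, 7, "loading_dock"), (60, 12, "lobby"),
              (90, 7, "loading_dock"), (180, 7, "loading_dock")]
    print(find_loiterers(stream))   # -> {7}

Detecting a group converging or an unattended package would follow the same 
pattern: simple rules evaluated over object tracks, not any understanding of 
intent.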



But that's only the beginning. Police may soon be able to monitor suspicious 
brain activity from a distance as well. New neurotechnology may soon be able to 
detect a person who is particularly nervous or in possession of guilty knowledge 
or, in the more distant future, a person thinking, "Only one hour until the bomb 
explodes." Today, the science of detecting and decoding brain 
activity is in its infancy. But various government agencies are funding the 
development of technology to detect brain activity remotely and are hoping to 
eventually decode what someone is thinking. Scientists, however, wildly 
disagree about the accuracy of brain imaging technology, what brain activity 
may mean and especially whether brain activity can be detected from afar.



Yet as the experts argue about the scientific limitations of remote brain 
detection, this chilling science fiction may already be a reality. In 2002, the 
Electronic Privacy Information Center reported that NASA was developing brain 
monitoring devices for airports and was seeking to use noninvasive sensors in 
passenger gates to collect the electronic signals emitted by passengers' 
brains. Scientists scoffed at the reports, arguing that doing what NASA proposed 
would require physically attaching electroencephalogram (EEG) electrodes to the 
scalp.



But that same year, scientists at the University of Sussex in England adapted 
the same technology they had been using to detect heart rates at distances of 
up to 1 meter, or a little more than three feet, to remotely detect changes in 
the brain. And while scientific limitations to remote EEG detection still 
exist, clearly the question is when, not if, these issues will be resolved.



Meanwhile, another remote brain-activity detector, which uses light beamed 
through the skull to measure changes in oxygen levels in the brain, may be on 
the way. Together with the EEG, it would enhance the power of brain scanning. 
Today the technology consists of a headband sensor worn by the subject, a 
control box to capture the data and a computer to analyze it. With the help of 
government funding, however, that is all becoming increasingly compact and 
portable, paving the way for more specific remote detection of brain activity.



But don't panic: The government can't read our minds -- yet. So far, these 
tools simply measure changes in the brain; they don't detect thoughts and 
intentions.



Scientists, though, are hard at work trying to decode how those signals relate 
to mental states such as perception and intention. Different EEG frequencies, 
for example, have been associated with fear, anger, joy and sorrow, as well as 
with cognitive states such as a person's level of alertness. So when you're 
stopped for speeding and terrified because you're carrying illegal drugs in the 
trunk of your car, EEG technology might enable the police to detect your fear or 
increased alertness. This is not so far-fetched: Some scientists can already 
tell from brain images taken in the lab whether a test subject was envisioning a 
tool, such as a hammer or a screwdriver, or a dwelling, and can predict whether 
the subject intended to add or subtract numbers. Just last 
month, scientists announced a new study aimed at decoding visual imagery in the 
brain.
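
As a rough illustration of the measurement step behind such claims, the sketch 
below estimates power in the standard EEG frequency bands (delta, theta, alpha, 
beta) from a raw trace using the common Welch spectral estimate; reduced 
alpha-band power, for instance, is often taken as a crude marker of higher 
alertness. The signal here is synthetic, and mapping band power onto emotions or 
intentions is exactly the contested part.

    # Rough sketch of EEG band-power estimation on a synthetic signal.
    # Interpreting these numbers (fear, alertness, intent) is the
    # scientifically contested step, not the arithmetic below.
    import numpy as np
    from scipy.signal import welch

    FS = 256  # assumed sampling rate in Hz
    BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

    def band_powers(eeg, fs=FS):
        freqs, psd = welch(eeg, fs=fs, nperseg=fs * 2)  # power spectral density
        return {name: np.trapz(psd[(freqs >= lo) & (freqs < hi)],
                               freqs[(freqs >= lo) & (freqs < hi)])
                for name, (lo, hi) in BANDS.items()}

    # Synthetic one-minute trace: a 10 Hz (alpha) rhythm buried in noise.
    t = np.arange(0, 60, 1 / FS)
    trace = 20e-6 * np.sin(2 * np.pi * 10 * t) + 5e-6 * np.random.randn(t.size)
    print(band_powers(trace))  # alpha-band power should dominate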



Although brain-based lie-detection technology has been quite controversial and 
has been tested only on a limited basis, early researchers have claimed high 
accuracy at detecting deception. But there's a problem: Most brain-based 
lie-detection tests assume that lying should result in more brain activity than 
truth-telling because lying involves more cognition. So these lie-detection 
methods may fail in sociopaths or in individuals who believe in the falsehood 
they're telling.



Whether such technology will be effective outside the laboratory remains to be 
seen, but the very fact that the government is banking on its future potential 
raises myriad questions.



Imagine, for example, a police officer approaching a suspect based on 
Perceptrak's "unusual activity" detection. Equipped with remote 
neural-detection technology, the officer asks her a few questions, and the 
detection device deems her responses to be deceptive. Will this be enough 
evidence for an arrest? Can it be used to convict a person of intent to commit 
a crime? Significant scientific hurdles remain before neurotechnology can be 
used that way, but given how fast it's developing, I think we must pause now to 
ask how it may affect the fundamental precepts of our criminal justice system.



Americans have been willing to tolerate significant new security measures and 
greater encroachments on civil liberties after the terrorist attacks of Sept. 
11, 2001. Could reports of significant crime reduction such as that seen by 
Johns Hopkins, or incidents such as the student shootings last year at Virginia 
Tech or more recently at Northern Illinois University, be enough to justify the 
use of pre-crime technology? Could remote neural monitoring together with 
intelligent video analytics have prevented those tragedies? And if they could, 
should they be allowed to?



These are just some of the questions we must ask as we balance scientific 
advances and the promise of enhanced safety against a loss of liberty. And we 
must do it now, while our voices still matter. In a world where private 
thoughts are no longer private, what will our protections be?







http://www.washingtonpost.com/wp-dyn/content/article/2008/04/11/AR2008041103296_pf.html



http://groups.yahoo.com/group/doctorplum/