This stuff goes on all the time, and is a combination of ignorance and a type 
of disregard. Back in the 90s my wife and I paid a lot of money to get 
portraits made at a fancy place here in Atlanta called Olan Mills. Olan Mills 
was the place people used for weddings and other important events. Having them 
do your pictures indicated you had taste--and some money. We were greeted by a 
young white girl who proceeded to snap test shots of us. After a while it 
became clear she was having some trouble. Finally she said, "Sorry it's taking 
so long, but it's just harder to photograph black skin." 
I asked why, and she said because our skin doesn't reflect as much light. I 
asked a buddy (white) who is a photographer, and he agreed with her. Aside from 
my confusion--it would seem to me that really pale skin could be just as 
problematic for reflecting too much light versus the background--I was very 
angry. Bottom line is there's a range of skin colors in this world, and any 
photog worth her salt would be able to make the adjustments as needed. And even 
if I did buy that black skin made picture taking that much harder, my wife and 
I are both medium toned, not anywhere close to the blackness of, say, Wesley 
Snipes. I asked the girl what she'd do for really dark-skinned people, and she 
obliviously replied, "Oh, they're even harder!" 
How, I wondered, did the mostly white photogs for National Geographic get those 
stunning pictures of Africans and Middle Easterners all those years if it was 
so tough? Unfortunately we'd pre-paid, so we had to suffer through the 
sessions. The pictures ended up looking washed out, as the girl obviously set 
the light too bright. I was done at that point with Olan Mills. 
A few years later I did family pics for my entire family as a gift. This time I 
spoke directly to black folk and discovered the vast majority of them use 
JCPenney. So I took my family there, and was pleasantly surprised to be greeted 
by a young black girl. I asked her about the skin color thing and she laughed 
knowingly and said "It's no more difficult than really white skin. You just 
have to know what you're doing". There were six of us, ranging from the light 
end of medium ("yellow" as some say), to me and my wife in the middle, to my 
darker-skinned brother and dad. The pics came out perfectly, everyone's skin 
looked great, lighting was perfect, and it took far less time than with Olan 
Mills. 
Reminded me again that some people in this world have the luxury of thinking 
that what they represent is the norm, and don't often pay the price for that 
narrow-minded world view. 


----- Original Message ----- 
From: "Mr. Worf" <hellomahog...@gmail.com> 
To: scifinoir2@yahoogroups.com 
Sent: Saturday, January 30, 2010 2:20:18 AM GMT -05:00 US/Canada Eastern 
Subject: [scifinoir2] Another racist camera? 







Face-Detection Cameras: Users' Racism Charges Explained 
Time.com

AP photo: Models show Nikon's Coolpix digital cameras, including the 
black-body Coolpix P5100, as they are unveiled in Seoul, South Korea, 
Thursday, Aug. 30, 2007. 
By ADAM ROSE Adam Rose – Fri Jan 22, 5:45 am ET 

When Joz Wang and her brother bought their mom a Nikon Coolpix S630 digital 
camera for Mother's Day last year, they discovered what seemed to be a 
malfunction. Every time they took a portrait of each other smiling, a message 
flashed across the screen asking, "Did someone blink?" No one had. "I thought 
the camera was broken!" Wang, 33, recalls. But when her brother posed with his 
eyes open so wide that he looked "bug-eyed," the messages stopped. 

Wang, a Taiwanese-American strategy consultant who goes by the Web handle 
"jozjozjoz," thought it was funny that the camera had difficulties figuring out 
when her family had their eyes open. So she posted a photo of the blink warning 
on her blog under the title, "Racist Camera! No, I did not blink... I'm just 
Asian!" The post was picked up by Gizmodo and Boing Boing , and prompted at 
least one commenter to note, "You would think that Nikon, being a Japanese 
company, would have designed this with Asian eyes in mind." (See Techland's top 
10 gadgets of 2009.) 

Nikon isn't the only big brand whose consumer cameras have displayed an 
occasional - though clearly unintentional - bias toward Caucasian faces. Face 
detection, which is one of the latest "intelligent" technologies to trickle 
down to consumer cameras, is supposed to make photography more convenient. Some 
cameras with face detection are designed to warn you when someone blinks; 
others are programmed to automatically take a picture when somebody smiles - a 
feature that, theoretically, makes the whole problem of timing your shot to 
catch the brief glimpse of a grin obsolete. Face detection has also found its 
way into computer webcams, where it can track a person's face during a video 
conference or enable face-recognition software to prevent unauthorized access. 

The principle behind face detection is relatively simple, even if the math 
involved can be complex. Most people have two eyes, eyebrows, a nose and lips - 
and an algorithm can be trained to look for those common features, or more 
specifically, their shadows. (For instance, when you take a normal image and 
heighten the contrast, eye sockets can look like two dark circles.) But even if 
face detection seems pretty straightforward, the execution isn't always smooth. 

Indeed, just last month, a white employee at an RV dealership in Texas posted a 
YouTube video showing a black co-worker trying to get the built-in webcam on an 
HP Pavilion laptop to detect his face and track his movements. The camera 
zoomed in on the white employee and panned to follow her, but whenever the 
black employee came into the frame, the webcam stopped dead in its tracks. "I 
think my blackness is interfering with the computer's ability to follow me," 
the black employee jokingly concludes in the video. "Hewlett-Packard computers 
are racist." (See the 50 best inventions of 2009.) 

The " HP computers are racist" video went viral, with almost 2 million views, 
and HP, naturally, was quick to respond. "Everything we do is focused on 
ensuring that we provide a high-quality experience for all our customers, who 
are ethnically diverse and live and work around the world," HP's lead 
social-media strategist Tony Welch wrote on a company blog within a week of the 
video's posting. "We are working with our partners to learn more." The post 
linked to instructions on adjusting the camera settings, something both 
Consumer Reports and Laptop Magazine tested successfully in Web videos they put 
online. 

Still, some engineers question how a webcam even made it onto the market with 
this seemingly glaring flaw. "It's surprising HP didn't get this right," says 
Bill Anderson, president of Oculis Labs in Hunt Valley, Md., a company that 
develops security software that uses face recognition to protect work computers 
from prying eyes. "These things are solvable." Case in point: Sensible Vision, 
which develops the face-recognition security software that comes with some Dell 
computers, said their software had no trouble picking up the black employee's 
face when they tested the YouTube video. 

YouTube commenters expressed what was on a lot of people's minds. "Seems they 
rushed the product to market before testing thoroughly enough," wrote one. "I'm 
guessing it's because all the people who tested the software were white," wrote 
another. HP declined to comment on their methods for testing the webcam or how 
involved they were in designing the software, but they did say the software was 
based on "standard algorithms." Often, the manufacturers of the camera parts 
will also supply the software to well-known brands, which might explain why HP 
isn't the only company whose cameras have exhibited an accidental prejudice 
against minorities, since many brands could be using the same flawed code. TIME 
tested two of Sony's latest Cyber-shot models with face detection (the DSC-TX1 
and DSC-WX1) and found they, too, had a tendency to ignore camera subjects with 
dark complexions. 

But why? It's not necessarily the programmers' fault. It comes down to the fact 
that the software is only as good as its algorithms, or the mathematical rules 
used to determine what a face is. There are two ways to create them: by 
hard-coding a list of rules for the computer to follow when looking for a face, 
or by showing it a sample set of hundreds, if not thousands, of images and 
letting it figure out what the ones with faces have in common. In this way, a 
computer can create its own list of rules, and then programmers will tweak 
them. You might think the more images - and the more diverse the images - that 
a computer is fed, the better the system will get, but sometimes the opposite 
is true. The images can begin to generate rules that contradict each other. "If 
you have a set of 95 images and it recognizes 90 of those, and you feed it five 
more, you might gain five, but lose three," says Vincent Hubert, a software 
engineer at Montreal-based Simbioz, a tech company that is developing 
futuristic hand-gesture technology like the kind seen in Minority Report. It's 
the same kind of problem speech-recognition software faces in handling unusual 
accents. 

And just as the software is only as good as its code and the hardware it lives 
in, it's also only as good as the light it's got to work with. As HP noted in 
its blog post, the lighting in the YouTube video was dim, and, the company 
said, there wasn't enough contrast to pick up the facial shadows the computer 
needed for seeing. (An overlit person with a fair complexion might have had the 
same problem.) A better camera wouldn't necessarily have guaranteed a better 
result, because there's another bottleneck: computing power. The constant flow 
of images is usually too much for the software to handle, so it downsamples 
them, or reduces the level of detail, before analyzing them. That's one reason 
why a person watching the YouTube video can easily make out the black 
employee's face, while the computer can't. "A racially inclusive training set 
won't help if the larger platform is not capable of seeing those details," says 
Steve Russell, founder and chairman of 3VR, which creates face recognition for 
security cameras. 

The blink problem Wang complained about has less to do with lighting than the 
plain fact that her Nikon was incapable of distinguishing her narrow eye from a 
half-closed one. An eye might only be a few pixels wide, and a camera that's 
downsampling the images can't see the necessary level of detail. So a 
trade-off has to be made: either the blink warning would have a tendency to 
miss half blinks or a tendency to trigger for narrow eyes. Nikon did not 
respond to questions from TIME as to how the blink detection was designed to 
work. 

Why these glitches weren't ironed out before the cameras hit Best Buy is not 
something that HP, Nikon or Sony, when contacted by TIME, were willing to 
answer. Perhaps in this market of rapidly developing technologies, consumers 
who fork over a few hundred dollars for the latest gadget are the test market. 
A few years ago, speech-recognition software was teeth-gnashingly unreliable. 
Today, it's up to 99% accurate. With the flurry of consumer complaints out 
there, most of the companies seem to be responding. HP has offered instructions 
on how to adjust its webcam's sensitivity to backlighting. Nikon says it's 
working to improve the accuracy of the blink-warning function on its Coolpix 
cameras. (Sony wouldn't comment on the performance of its Cyber-shot cameras 
and said only that it's "not possible to track the face accurately all the 
time.") Perhaps in a few years' time, the only faces cameras won't be able to 
pick up will be those of the blue-skinned humanoids from Avatar. 

Read "Sony's Robot-Cam: Partying Without a Photographer." 

-- 
Celebrating 10 years of bringing diversity to perversity! 
Mahogany at: http://groups.yahoo.com/group/mahogany_pleasures_of_darkness/ 


