<https://www.businessinsider.com/apple-threatened-to-kick-facebook-off-app-store-human-trafficking-2021-9>
Apple threatened to kick Facebook off its App Store after a 2019 BBC
report <https://www.bbc.com/news/technology-50228549> detailed how human
traffickers were using Facebook to sell victims, according to The Wall
Street Journal
<https://www.wsj.com/articles/facebook-drug-cartels-human-traffickers-response-is-weak-documents-11631812953>.
The paper viewed company documents showing that a Facebook investigation
team was tracking a human trafficking market in the Middle East whose
organizers were using Facebook's services. What appeared to be
employment agencies were advertising domestic workers whom they could
supply against their will, per the Journal.
The BBC published a sweeping undercover investigation of the practice,
prompting Apple to threaten to remove Facebook from its store, the paper
said.
An internal memo shows that Facebook was aware of the practice even
before then: a Facebook researcher asked in a report dated 2019, "was
this issue known to Facebook before BBC inquiry and Apple escalation?,"
per the Journal.
Underneath the question, the answer reads: "Yes. Throughout 2018 and H1
2019 we conducted the global Understanding Exercise in order to fully
understand how domestic servitude manifests on our platform across its
entire life cycle: recruitment, facilitation, and exploitation."
Apple and Facebook did not immediately respond to requests for comment.
The Wall Street Journal on Thursday also reported that Facebook's AI
content-moderation systems cannot detect most of the languages
<https://www.businessinsider.com/facebook-content-moderation-ai-cant-speak-all-languages-2021-9>
used on the platform, a capability the company needs if it is going to
monitor content in the foreign markets into which it has expanded.
The paper found that human moderators often don't speak the languages
used in those markets, leaving a blind spot in the company's efforts to
crack down on harmful content.
One result was that drug cartels and human traffickers were able to use
the platform to conduct their business, per the Journal.
_______________________________________________
nexa mailing list
[email protected]
https://server-nexa.polito.it/cgi-bin/mailman/listinfo/nexa