Why YouTube's New Plan to Debunk Conspiracy Videos Won't Work


YouTube continues to try to figure out ways to battle false conspiracy
videos that rank highly on YouTube -- sometimes even into the top
trending lists -- and that can spread to ever more viewers via
YouTube's own "recommended videos" system. I've offered a number of
suggestions for dealing with these issues, most recently in "Solving
YouTube's Abusive Content Problems -- via Crowdsourcing."

YouTube has now announced a new initiative that they're calling
"information cues" -- which they hope will address some of these
problems.

Unfortunately, this particular effort (at least as being reported
today) is likely doomed to be almost entirely ineffective.

The idea of "information cues" is to provide false conspiracy YouTube
videos with links to Wikipedia pages that "debunk" those conspiracies.
So, for example, a video claiming that the Florida student shooting
victims were actually "crisis actors" would presumably show a link to
a Wikipedia page that explains why this wasn't actually the case.

You probably already see the problems with this approach.

We'll start with the obvious elephant in the room. The kind of viewers
who are going to believe these kinds of false conspiracy videos are
almost certainly going to say that the associated Wikipedia articles
are wrong, that they're planted lies. FAKE NEWS!

Do we really believe that anyone who would consider giving such videos
even an inch of credibility is going to be convinced otherwise by
Wikipedia pages? C'mon! If anything, such Wikipedia pages may actually
serve to reinforce these viewers' beliefs in the original false
conspiracy videos!

Not helping matters at all is that Wikipedia's reputation for 
accuracy -- never all that good -- has been plunging in recent years, 
sometimes resulting in embarrassing Knowledge Panel errors for Google in 
search results.

Any Wikipedia page that is not "protected" -- that is, where the
ordinary change process has been locked out -- is subject to endless
content editing wars -- and you can bet that any editable Wikipedia
pages linked by YouTube from false conspiracy videos would become
immediate high-visibility targets for such attacks.

If there's one thing that research into this area has already shown
quite conclusively, it's that the people who believe these kinds of
garbage conspiracy theories are almost entirely unconvinced by any
factual information that conflicts with their preexisting points of
view.

The key to avoiding the contamination caused by these vile, lying,
false conspiracy videos is to minimize their visibility in the
YouTube/Google ecosystem in the first place.

Not only should they be prevented from ever getting into the trending
lists, they should be deranked, demonetized, and excised from the
YouTube recommended video system. They should be immediately removed
from YouTube entirely if they contain specific attacks against
individuals or other violations of the YouTube Terms of Service and/or
Community Guidelines. These actions must be taken as rapidly as
possible with appropriate due diligence, before these videos are able
to do even more damage to innocent parties.

Nothing less can keep such disgusting poison from spreading.

Lauren Weinstein (lau...@vortex.com): https://www.vortex.com/lauren 
Lauren's Blog: https://lauren.vortex.com
Google Issues Mailing List: https://vortex.com/google-issues
Founder: Network Neutrality Squad: https://www.nnsquad.org 
         PRIVACY Forum: https://www.vortex.com/privacy-info
Co-Founder: People For Internet Responsibility: https://www.pfir.org/pfir-info
Member: ACM Committee on Computers and Public Policy
Google+: https://google.com/+LaurenWeinstein
Twitter: https://twitter.com/laurenweinstein
Tel: +1 (818) 225-2800