And so, the digital book burning begins. Facebook is rolling out a tool it claims
will be “Addressing Hoaxes and Fake News.”
One would imagine this would include fake news like WMDs, Benghazi videos,
Syrian chemical weapons, 2016 election polls, and so much more.
As expected, all the news outlets that helped spread destructive and
deceptive news stories like the ones above will
be considered, de facto, legitimate
news outlets, and “won’t be able to be flagged”
as fake news.
This will obviously leave all the “other” narratives (the ones the US
government does not want people to read) to be flagged as “fake
news”, and subsequently punished by Facebook.
Flagged stories will be reviewed by Facebook researchers and sent on to
third-party fact-checking organizations for further
verification… where, once “researched”, stories marked as fake
will be punished.
Here too, one wonders how much genuine checking will take place
considering that these “researchers” will be bombarded with tens
of thousands of flagged articles daily, until it ultimately becomes a
rote move to simply delete anything flagged as false by enough
disgruntled readers before moving on to the next article, while in
the process not touching the narrative spun by the liberal
“legitimate news outlets”, the ones who would jump at the
opportunity to have dinner with Podesta in hopes of becoming Hillary
Clinton’s public relations arm.
“We believe in giving people a voice and that we cannot become arbiters
of truth ourselves, so we’re approaching this problem carefully,” Adam
Mosseri, Facebook’s vice president of News Feed, said in a blog
post. So what Facebook will do is give the voice to all those
others who praise any article they agree with, and slam and flag as
“fake news” anything they disagree with. At least no book burning
will be involved.
The Facebook VP promised that “we’ve
focused our efforts on the worst of the worst, on the clear hoaxes
spread by spammers for their own gain, and on engaging both our
community and third party organizations.”
Not only that, but Facebook’s algorithm, which decides what gets the most
prominence in News Feed, will also be tweaked, one would assume to
give more prominence to the above-mentioned “legitimate news
outlets”… such as WaPo and the NYT.
How will the algo determine if a story is potentially fake? If a story is
being read but not shared, Mosseri said, that may be a sign it’s
misleading. Which in turn means that clickbait articles are about to
explode at the expense of thoughtful, long-read pieces which the
current generation of Facebook readers has no time for.
“We’re going to test incorporating this signal into ranking, specifically
for articles that are outliers, where people who read the article are
significantly less likely to share it,” he said.
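The signal Mosseri describes could, in principle, be approximated along these lines. This is a minimal sketch, assuming a simple share-to-read ratio compared against the average; every name and threshold here is hypothetical, not Facebook’s actual implementation:

```python
# Hypothetical sketch of the "read but not shared" signal described
# above. All function names and thresholds are illustrative only.

def share_read_ratio(reads, shares):
    """Fraction of readers who went on to share the article."""
    return shares / reads if reads else 0.0

def flag_outliers(articles, threshold=0.25):
    """Return articles whose share/read ratio is far below average.

    `articles` maps an article name to a (reads, shares) pair.
    An article is an outlier if its ratio falls below `threshold`
    times the mean ratio across all articles.
    """
    ratios = {name: share_read_ratio(r, s)
              for name, (r, s) in articles.items()}
    mean = sum(ratios.values()) / len(ratios)
    return [name for name, ratio in ratios.items()
            if ratio < threshold * mean]

articles = {
    "story_a": (10_000, 900),    # widely shared
    "story_b": (12_000, 1_100),  # widely shared
    "story_c": (15_000, 30),     # read a lot, almost never shared
}
print(flag_outliers(articles))  # → ['story_c']
```

In this toy version only the article that is read heavily but almost never shared would be demoted; a real ranking system would obviously use far more signals than one ratio.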
It gets better: the next step in Facebook’s plan to rid the site of
fake news involves sending flagged stories to third-party
fact-checking organizations, which include Snopes, Politifact, and
Factcheck.org. As the recent election showed, these are just as
biased as the so-called “fake news” sites; they simply cover
their partiality under the cloak of being objective, which they
conflate with being “factual.”
A group of Facebook researchers will initially have the responsibility
of sifting through flagged stories and determining which ones to send
to the fact-checking organizations. If a story is determined to be fake,
it will be flagged as disputed and include a link explaining
why. As for the punishment: flagged stories can still be shared, but
readers will be warned in advance, and they’ll be more likely to
appear lower in News Feed. These
stories also won’t be able to be promoted or turned into ads.
And while the narrative has since shifted away from “fake news”, following the disastrous
WaPo report on “Russian Propaganda” outlets, which ultimately
crushed the credibility of its author,
and has been replaced with the “Putin hacked the election”
narrative, the quiet push to silence non-compliant voices continues.
Supposedly, the team at Facebook has made it clear they don’t want censorship
on the site and that these new tools are just part of the evolving
process of combating misinformation. And yet, crowdsourced censorship
is precisely what Facebook has just rolled out.
In reality, what will end up happening is that one half of Facebook users will
flag what they read from one half of the media as fake, and vice versa,
while millions of users will simply leave the now censorship-endorsing
social network in disgust.
From Mosseri’s blog post:

A few weeks ago we previewed
some of the things we’re
working on to address the issue of fake news and hoaxes. We’re
committed to doing our part and today we’d like to share some
updates we’re testing and starting to roll out.
We believe in giving people a voice and that we cannot become arbiters
of truth ourselves, so we’re approaching this problem carefully.
We’ve focused our efforts on the worst of the worst, on the clear
hoaxes spread by spammers for their own gain, and on engaging both
our community and third party organizations.
Our work falls into the following four areas. These are just some of the
first steps we’re taking to improve the experience for people on
Facebook. We’ll learn from these tests, and iterate and extend them
over time.
Easier Reporting
We’re testing several ways to make it easier to
report a hoax if you see one on Facebook, which you can do by
clicking the upper right hand corner of a post. We’ve relied
heavily on our community for help on this issue, and this can help us
detect more fake news.
Flagging Stories as Disputed
We
believe providing more context can help people decide for themselves
what to trust and what to share. We’ve started a program to work
with third-party fact checking organizations that are signatories of
the International Fact Checking Code of Principles.
We’ll use the reports from our community, along with other signals,
to send stories to these organizations. If the fact checking
organizations identify a story as fake, it will get flagged as
disputed and there will be a link to the corresponding article
explaining why. Stories that have been disputed may also appear
lower in News Feed.
It will still be possible to share these stories, but you will see a
warning that the story has been disputed as you share it. In addition,
once a story is flagged, it can’t be made into an ad and promoted, either.
Informed Sharing
We’re always looking to improve News Feed by
listening to what the community is telling us. We’ve found that if
reading an article makes people significantly less likely to share
it, that may be a sign that a story has misled people in some way.
We’re going to test incorporating this signal into ranking,
specifically for articles that are outliers, where people who read
the article are significantly less likely to share it.
Disrupting Financial Incentives for Spammers
We’ve found that a lot of
fake news is financially motivated. Spammers make money by
masquerading as well-known news organizations, and posting hoaxes
that get people to visit their sites, which are often mostly ads.
So we’re doing several things to reduce the financial incentives.
On the buying side we’ve eliminated the ability to spoof domains,
which will reduce the prevalence of sites that pretend to be real
publications. On the publisher side, we are analyzing publisher sites
to detect where policy enforcement actions might be necessary.
It’s important to us that the stories you see on Facebook are authentic
and meaningful. We’re excited about this progress, but we know
there’s more to be done. We’re going to keep working on this
problem for as long as it takes to get it right.