If you use Facebook, this could be important information
Comments from Michael Green
The following message was just sent to the inbox of CNET News Senior Writer on Google and security, Seth Rosenblatt:
Interesting revelations in your article on CNET. Mark Zuckerberg apologizes for the recently revealed research study. Why? "We messed up." Meaning, he got caught.
Are you aware of another, equally evil Facebook research study? For over 8 months now, numerous FB users, including me, have noticed that we can no longer set friends' posts to appear on our feed in the usual modes of All, Most, and Only Important. That space is whited out in the drop-down, and it is NOT a JavaScript issue (it appears on multiple browsers, too).
So many people have been affected that there is an FB page about it. Many of these people wrote to FB and, through back doors, I also reached out to high-level management. No response.
Why would FB do that? Well, we know some things with certainty. They are indeed tweaking their feed algorithms. They are control freaks, and their algorithms are all-important. On the more sinister side of the equation, FB is also in bed with the corporate world, not to mention the "sanitized news world" of the corporate state. One doesn't need to be a conspiracy freak, but think about that for a moment.
For some time, it has seemed to me that what FB might be doing is using an ever-expanding group (and it is ever-expanding) of research subjects to test user reaction to FB's exerting more control than ever over our feeds.
The only control left to me is follow/don't follow, and because I cannot parse the friends setting (or even see it anymore), I'm not following 90% of my friends at all. That hurts relationally, and it also keeps a lot of important stories away from me. Soon, all control over our feeds will devolve to Mr. Zuckerberg and his Facebook crew.
I have written to a few journalists about this, but no one has yet replied. Now that we know this is the sort of thing FB does, perhaps that will change. Hopefully, you will be interested.
Facebook's mood study: How you became the guinea pig
That controversial research into how posts affect users' emotions is just the latest in a long line of privacy flaps -- and apologies -- for the social networking giant
CNET, 2 July 2014
When news spread over the weekend that Facebook had manipulated its news feed to study how social media posts affect people's emotions, the real surprise was that anyone was that surprised.
The study (PDF), published in the Proceedings of the National Academy of Sciences and conducted by Facebook researcher Adam Kramer, Jeffrey Hancock of Cornell University, and Jamie Guillory of the University of California at San Francisco, found that people tended to reflect the emotional state of their friends' posts.
So if your friends wrote happy posts, your posts in turn tended to be happier. Friends posting about a bad day at work would tend to bring you down.
The disclosure triggered a sharp backlash and elicited an attempt by Facebook to seek forgiveness -- one in a long line of mea culpas the company has issued over the years. Yet anyone paying close attention to the boilerplate disclaimers that tech companies regularly publish might have seized upon a couple of seemingly innocent-sounding phrases tucked away in the company's data use policy that spoke volumes.
Among other things, Facebook says quite clearly in the published document that it might use the information it gathers from people on its social network "for internal operations, including troubleshooting, data analysis, testing, research and service improvement."
The tandem phrases "data analysis" and "research" appear to be unique to Facebook's user legalese. They do not appear in Google's terms of service or privacy policy, while "research" does appear in Yahoo's privacy policy but not its terms of service. LinkedIn is open in its privacy policy about the research it conducts on its users.
Google and Yahoo did not respond to a request for comment on whether they perform similar research on their users.
Line crossed?
However, privacy advocates warn that use agreements are carefully worded by design.
"There
is no word in any privacy policy that is not there for a reason. If
something is missing, then it's missing for a reason," said
Brian Pascal, an attorney and privacy researcher at the University of
California Hastings College of Law in San Francisco. He added that
while there may or may not be a "practical impact" by
Facebook's specific policy phrasing, "it's certainly
interesting."
"It's
one thing for Facebook to A/B
test some advertising structure," he said, referring to
internal tests that websites frequently conduct to determine what
resonates with visitors -- some people see one set of ads, some see
another. "It's another to tweak their News Feed to manipulate
[users'] emotional state."
That goes to the heart of the latest argument about whether Facebook crossed a red line. It's one thing to test whether Facebook users search more, thus generating more revenue, when presented with more or fewer links. For many people, though, it's quite something else when the company tests whether users' emotional state can be altered artificially.
Carey Sebera, an attorney who has worked on Facebook privacy cases, said that while the research may not have violated the law or even the company's own policies, Facebook ought to have been more ethical. She voiced a concern raised by other critics: legal documents don't equate with morality, and something that's legal isn't necessarily ethical.
Responding to the blowback, Kramer posted a brief response defending the project as one of many attempts by Facebook "to provide a better service" and "never to upset anyone."
"Nobody's
posts were 'hidden,' they just didn't show up on some loads of Feed,"
he wrote on Sunday.
"I
can understand why some people have concerns about it, and my
coauthors and I are very sorry for the way the paper described the
research and any anxiety it caused," Kramer subsequently wrote.
"In hindsight, the research benefits of the paper may not have
justified all of this anxiety."
That response did little to mollify critics after they learned that none of the 689,003 people experimented on in the study (less than 0.06 percent of Facebook's 1.2 billion users) was aware of their participation. While that's considered standard operating procedure in business, Princeton University professor and privacy expert Ed Felten noted "the large gap" between the ethical standards of industry practice and those of the research community for studies involving human subjects.
"Industry
practice allows pretty much any use of data within a company, and
infers consent from a brief mention of 'research' in a huge
terms-of-use document whose existence is known to the user,"
according to Felten. But if people voluntarily give Facebook their
data and the company operates within its own terms of service and
privacy policy, the upshot is that Facebook can do with that
information what it likes.
This is not the first time Facebook has run experiments on its users. In 2010, the company created an "I Voted" button, not unlike a "like" button, that displayed who among a user's friends had indeed voted. Facebook said in 2012 that it believed more than 300,000 voters turned out at the polls as a result of the study.
Facebook's actions are also bound to raise more questions about the power that large Internet companies wield and what they're doing with user data. It's part of a larger struggle within Silicon Valley over consumer privacy: what's the proper balance between enticing customers to use a service and packaging the information they generate for advertisers to better target ads at those same users?
Act first, apologize later
It's been a bumpy path. For instance, when Google in 2010 launched one of its first social-networking efforts, called Buzz, the company came under fire for weak privacy settings. Chief among them was a default setting that published a list of the names that Gmail users most frequently emailed or chatted with. Most recently, a European court has forced Google to allow users to request that certain information about them be removed from the company's databases, including search results.
In 2012, Facebook settled with the Federal Trade Commission over charges that the company deceived customers by telling them they could keep information on the network private, but then allowing it to be made public. Facebook promised to offer users a "clear and prominent notice" when their information is shared beyond standard privacy settings. Whether or not Facebook violated that agreement with this research experiment remains unclear.
There have been other instances when Facebook inadvertently fanned privacy fears among its many users.
In 2006, the introduction of the news feed without sufficient privacy controls brought a public apology from CEO Mark Zuckerberg, who acknowledged "we really messed this one up." Zuckerberg had to apologize again a year later after the debut of a controversial product called Beacon, which let Facebook friends see what you were doing on partner websites. Zuckerberg said Beacon was conceived as a way "to let people share information across sites with their friends," but he acknowledged that Facebook "simply did a bad job with this release, and I apologize for it." (Facebook subsequently offered a way to opt out of Beacon before dumping the project entirely.)
That congenital tone-deafness has Felten urging more stringent ethical rules on scientific research that would require specific, fully informed consent from the subject, including the right to opt out of a study at any point without penalty.
Considering how many people now use Facebook, the company sees clear benefits in more research about their likes and dislikes. The company has often acted first, then apologized and amended its policies after public outcry, and Pascal thinks this situation will be no different.
"A
definite first step is [for Facebook] to become much more transparent
about how they decide to conduct research on their users," he
said. "The truth is that Facebook has access to data that nobody
else has. The answer can't be that Facebook must give up on its
research. What we want is some degree of accountability and
transparency when they do undertake research."