Author Topic: An academic's perspective on the Facebook users emotion manipulation study.  (Read 4502 times)

elagache

  • Global Moderator
  • Storm
  • *****
  • Posts: 6494
    • DW3835
    • KCAORIND10
    • Canebas Weather
  • Station Details: Davis Vantage Pro-2, Mac mini (2018), macOS 10.14.3, WeatherCat 3
Dear WeatherCat Netizens,

I'm sure most of you have heard about the 2012 study, conducted by Facebook's data science team together with academic researchers (including one at the University of California, San Francisco), that deliberately manipulated the news feeds of almost 700,000 Facebook users.  The news has generated considerable controversy, but careful thought about the matter has been harder to come by.

A good example of the response from the Internet culture is this piece on "Good Morning Silicon Valley:"


How do you Like it now? Facebook may have messed with your mind

While the rest of the article isn't quite so naive, it starts with a very puzzling assertion:

Quote
It could've happened to you: Facebook experimented with manipulating users' emotions through their News Feeds. The research - which found that emotions can be contagious - involved almost 700,000 people, a small percentage of the social network's more than 1 billion users.  Among the biggest questions: Who cares?

There are two more balanced articles to be found.  The first is a report by BusinessWeek:

Facebook's Emotional Manipulation Test Was Unethical - and So Is the Rest of Social Media

This piece argues that manipulating the users of Facebook is basically unavoidable given the Facebook profit model.  In order to demonstrate that ads on the system work, the Facebook staff need to try to extract the greatest effect from the information users experience.

This second piece from the Register makes a persuasive argument that the researchers in this case did not have a legitimate claim of implied consent:

FACEBOOK 'manipulated emotions' in SECRET EXPERIMENT on users

I wanted to offer some perspective on this as someone who had to file a human-subjects protocol as part of PhD work.  Whenever researchers expect to interact with human subjects beyond purely passive observation in public settings, they are expected to produce a document that is reviewed by a human-subjects committee to ensure that the work won't be destructive.

Human-subjects protocols basically consist of two thrusts: consent and harm management.  The second component is reasonably clear, but keep in mind that some research is extremely invasive and risky.  Testing new medications to cure cancer might kill the patient.  Researchers doing this sort of work are expected to monitor the patients medically and provide medical aid if the "cure" turns out to be potentially lethal.

That brings us back to the first thrust: consent.  The example of the new cancer drug is a good way to imagine this aspect of a human-subjects protocol.  It isn't enough to inform a patient that they are taking a new drug; the researchers are expected to go through all the risks.  Consent means the research subject understands what they are about to go through, the risks involved, and the options for dealing with those risks as the experiment proceeds.  Consent isn't merely asking for permission, it is about informing the subject thoroughly about what might happen - AND - allowing them a way out if they don't like what is being proposed.

Whether or not the legalese of the Facebook user contract appears to open the door for such work from a legal standpoint, it is beyond any doubt that the researchers in this case did not obtain legitimate consent from these users from a research-protocol point of view.  The Facebook user contract contains no language warning users with mental conditions that they might be made to cope with circumstances that are exaggerated or outright fictitious.  According to the CDC, 1 in 10 adults in the United States suffers from depression.  So something on the order of 70,000 individuals in this study were likely suffering from depression at the time.  Not only did the researchers of this study have no control over how depressed Facebook users would react to this manipulation, they couldn't even identify the potential sufferers.  It should be clear that this is an extremely dangerous research practice, and I do hope these researchers will face disciplinary action for their part in it.
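For what it's worth, the 70,000 figure above is simple back-of-envelope arithmetic; here is the calculation spelled out as a sketch (the rounded participant count and the 1-in-10 CDC rate are taken from the post, and since the cohort was not strictly U.S. adults this is only a rough estimate, not an exact count):

```python
# Rough estimate of how many study participants may have been depressed.
# Assumptions (both from the post, not from the study itself):
#   - almost 700,000 users in the study, rounded to 700,000
#   - CDC figure of roughly 1 in 10 U.S. adults suffering from depression
study_participants = 700_000
depression_rate = 1 / 10

likely_affected = int(study_participants * depression_rate)
print(likely_affected)  # → 70000
```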

Unfortunately, there is no academic committee overseeing Facebook's own research and development staff.  As the BusinessWeek article points out, Facebook is conducting experiments of this type all the time, and for a reason that is difficult to refute: without revenue, the company will die.  While the need to survive cannot be easily dismissed, the point about depressed users (and sufferers of all other sorts of mental illness) still applies.  Could Facebook be exacerbating the health conditions of its users without even being in a position to detect it?

Mark Zuckerberg may well be headed to replace Bill Gates as the cyber-CEO everybody loves to hate, but in this case I think the issues go beyond any one company and raise questions that have loomed since the dot-com crash.  In short: "Is it reasonable to expect so much from the Internet for free?"

When Good Morning Silicon Valley asks "Among the biggest questions: Who cares?", we should all get worried.  Alas, the people who should be most worried may not be able to worry at all: those who are "addicted" to the Facebook way of interacting with the Internet.  Once you have invested too much into a particular sort of Internet infrastructure, it may be all but impossible to extricate yourself from it - no matter how evil it might become.  Worse still, is Facebook even evil for trying to survive?  The root of this evil lies with the users themselves, who insist they can have their Facebook for free when everybody knows: "there is no such thing as a free lunch."

Sincerely, Edouard

Blicj11

  • Storm
  • *****
  • Posts: 3941
    • EW3808
    • KUTHEBER6
    • Timber Lakes Weather
  • Station Details: Davis Vantage Pro2 Plus | WeatherLinkIP Data Logger | iMac (2019), 3.6 GHz Intel Core i9, 40 GB RAM, macOS Ventura 13.6 | Sharx SCNC2900 Webcam | WeatherCat 3.3 | Supportive Wife
Once again, a thoughtful perspective, Edouard. In my former life I worked for a biotech firm that put several drugs through human clinical trials. If a biotech or big pharma firm ignored consent and risk like Facebook did, the regulatory agencies (whether in the US or western Europe) would shut them down. There are no such agencies for the Internet because it's "free." But as you point out, it ain't so free after all. Somebody is paying for it somewhere. In the beginning I did not worry much about privacy issues; now I am becoming a bit paranoid. My nature is to trust most people, but it is becoming increasingly difficult to do so. For what it's worth, I do trust most of the WeatherCat crowd. :o
Blick


Bull Winkus

  • Storm
  • *****
  • Posts: 782
    • EW0095
    • KARHORSE2
    • WU for Horseshoe Bend, Arkansas
  • Station Details: Davis Wireless Vantage Pro 2, iMac 24"
What Blick said. Hear, hear!

For my part, "Who cares?" means: who cares about their results? What a ridiculous exercise - proving a notion that didn't need proving! It's my opinion that someone was just trying to impress the boss, pitching the power to manipulate as a value-added proposition to upsell advertising. With a manipulation bonus, advertisers might indeed pay more. In the aggregate, that could add another billion to the annual balance sheet and a few billion to the company's gross market capitalization.

Thanks for posting, Edouard. I may have to stir the pot over at FB by sharing some of those articles.

Herb

elagache

  • Global Moderator
  • Storm
  • *****
  • Posts: 6494
    • DW3835
    • KCAORIND10
    • Canebas Weather
  • Station Details: Davis Vantage Pro-2, Mac mini (2018), macOS 10.14.3, WeatherCat 3
Dear Blick, Herb, and WeatherCat social observers,

Thanks for your kind words once more.  I do remember being very nervous about the human-subjects protocol for the pilot study of what I hoped would become my dissertation project.  As it turned out, I had nothing to worry about.  You see, my study didn't manipulate my subjects at all . . . they didn't learn anything!!!

Quote
For my part, "Who cares?" means: who cares about their results? What a ridiculous exercise - proving a notion that didn't need proving!

Unfortunately, while this Facebook study didn't demonstrate anything exciting, I do fear we are entering a new period of propaganda and social manipulation of a kind not seen since the 1930s.  I've come to realize that world leaders were stunned by how the American public reacted to the 9/11 terrorist attacks, almost more than by the event itself.  Since 2001, there has been a subtle but detectable change in the way news is presented and how governments disseminate information.  One common trick, for example, is to release bad news (like poor economic data) on a Friday afternoon.  That way the journalists have little time to assess it, and the public will be distracted by the weekend's events.  By midday Monday, the bad news has become old news and doesn't have as much impact.

The news media has also undertaken a sort of self-censorship.  For example, coverage of extremely ferocious crimes is now downplayed, based on the rationale that too much press exposure might encourage copycat crimes.  This situation is particularly disturbing because it all happens behind closed doors, outside the realm of public scrutiny.  Certainly we don't want to encourage violent crime, but we do have a right to an unbiased understanding of our world - even if the news is very pessimistic.

This is an extremely slippery slope for a democracy to be sliding down.  Trust in politicians and journalists is near historic lows.  Even if well-intentioned, "public calming" efforts could very easily backfire.  Increasing distrust and helplessness is likely to push the desperate among us to rebel - some in violent ways.

So this Facebook study should concern us all, as much for the other studies we don't know about as the moral failings of this particular effort.

Sincerely, Edouard

Bull Winkus

  • Storm
  • *****
  • Posts: 782
    • EW0095
    • KARHORSE2
    • WU for Horseshoe Bend, Arkansas
  • Station Details: Davis Wireless Vantage Pro 2, iMac 24"
Very salient points, Edouard. As may have been said before, when learned people start to worry, we all should worry. This rings especially true for Orwellian-style government manipulation, where the common man is unable even to recognize the influences at work.

Herb

elagache

  • Global Moderator
  • Storm
  • *****
  • Posts: 6494
    • DW3835
    • KCAORIND10
    • Canebas Weather
  • Station Details: Davis Vantage Pro-2, Mac mini (2018), macOS 10.14.3, WeatherCat 3
Dear WeatherCat good Netizens,

The fallout from the Facebook users emotion manipulation study continues.  This morning InformationWeek ran an editorial with some interesting recommendations:

http://www.informationweek.com/software/social/facebook-mood-manipulation-10-bigger-problems/a/d-id/1279124?page_number=1

My point of view is more radical than this, but there are some good complaints and some suggestions that should be taken seriously by companies that are rushing headlong into the "cloud."

cheers, Edouard  [cheers1]

Blicj11

  • Storm
  • *****
  • Posts: 3941
    • EW3808
    • KUTHEBER6
    • Timber Lakes Weather
  • Station Details: Davis Vantage Pro2 Plus | WeatherLinkIP Data Logger | iMac (2019), 3.6 GHz Intel Core i9, 40 GB RAM, macOS Ventura 13.6 | Sharx SCNC2900 Webcam | WeatherCat 3.3 | Supportive Wife
The InformationWeek article is thoughtful and well-written. It was worth the read. Thanks for sharing. The Facebook study didn't prove anything that we didn't already know, but it did raise my personal level of concern about companies manipulating my own account on their sites. Caveat utilitor (let the user beware).
Blick