That Facebook Research and Academia


I have waited a few weeks for the Internet to react to the Facebook research that was revealed and caused a big buzz (again) about privacy. The short summary: Facebook manipulated the news feeds of thousands of its users, without their knowing consent, in order to do some research. They wanted to know if they could have an effect on people’s behavior in the network.


Oh wait - that was back in 2010 when they were looking at U.S. voting patterns in the midterm elections. That story was told in 2012 by Nature magazine. Not much of a public reaction. No real outcry about questionable ethics.

But this latest study that Facebook conducted was co-designed by researchers at Cornell University. This research examined how positive or negative language spreads in social networks. If you see more negative comments and news, do you become more negative yourself in your posts?

This time there were two negative reactions from the public and the press. First, in this year following the NSA and Snowden revelations, there was a vocal outcry about whether Internet users should be informed about experiments that test human behavior. (Facebook likes to point out that users did "allow" the study by agreeing to the terms of service.)

The second concern was that a university played a role in the research design.

What were the results of the research? Users who saw fewer positive posts were less likely to post something positive, and vice versa, but the effect was small and faded as the days passed. That sounds like common sense, right? Actually, earlier research had seemed to indicate the opposite: when people saw a stream of happy, positive news feed items from friends, they felt more negative about their own lives.

Researchers in academia are used to having research approved first by an Institutional Review Board. Did that happen at Cornell? A data scientist at Facebook conducted the actual research. He collaborated on the design and the subsequent analysis with a Cornell researcher and his former postdoc. But because the Cornell researchers did not participate in the data collection, the university's IRB concluded that the study did not require the oversight it would usually apply to human-subjects research.

The research study was published in early June in the respected journal Proceedings of the National Academy of Sciences.

The revelations about the NSA snooping had a split reaction. Some people saw Snowden as a hero whistle-blower alerting us to wrongdoing and wanted changes to be made in what was allowed. Others saw him as dangerous because he revealed a kind of surveillance that the government needs to carry out to protect us.

The Facebook/Cornell research certainly doesn't come anywhere near the complexity or seriousness of the NSA case. Nevertheless, some people want to see this kind of research controlled or stopped and our online privacy protected better. A smaller number think that this is part of the price of using the Net and social media.

My conclusion? This kind of social research will continue. BUT it will be done with your approval (even if you don't read the fine print before clicking that AGREE button), and it is unlikely to be made public. It will be kept private and will not be published. And colleges will be much more careful about entering into research collaborations with corporations - especially those that operate online.





33 Ethicists Defend Facebook’s Controversial Mood Study

A group of bioethicists wrote in a column that Facebook’s controversial study of mood manipulation was not unethical, and harsh criticism of it risks putting a chill on future research. The article was written by six ethicists, joined by 27 others.

