Friday, April 13, 2018

How Facebook could reinvent itself – 3 ideas from academia

What will he decide to do? AP Photo/Andrew Harnik
Jeff Inglis, The Conversation

Facebook CEO Mark Zuckerberg’s testimony in front of Congress, following disclosures of personal data being misused by third parties, has raised the question of how and whether the social media company should be regulated. But short of regulation, the company can take a number of steps to address privacy concerns and the ways its platform has been used to disseminate false information to influence elections.

Scholars of privacy and digital trust have written for The Conversation about concrete ideas – some of them radical breaks with its current business model – that the company could put into practice right away.

1. Act like a media company

Facebook plays an enormous role in U.S. society and in civil society around the world. The leader of a multiyear global study of how digital technologies spread and how much people trust them, Tufts University’s Bhaskar Chakravorti, recommends the company accept that it is a media company, and therefore

“take responsibility for the content it publishes and republishes. It can combine both human and artificial intelligence to sort through the content, labeling news, opinions, hearsay, research and other types of information in ways ordinary users can understand.”

2. Focus on truth

Facebook could then, perhaps, embrace the mission of journalism and watchdog organizations, and as American University scholars of public accountability and digital media systems Barbara Romzek and Aram Sinnreich suggest,

“start competing to provide the most accurate news instead of the most click-worthy, and the most trustworthy sources rather than the most sensational.”

3. Cut users in on the deal

If Facebook wants to keep making money from its users’ data, Indiana University technology law and ethics scholar Scott Shackelford suggests

“flip[ping] the relationship and having Facebook pay people for their data, [which] could be [worth] as much as US$1,000 a year for the average social media user.”

The multi-billion-dollar company has an opportunity to find a new path before the public and lawmakers weigh in.

Jeff Inglis, Science + Technology Editor, The Conversation

This article was originally published on The Conversation. Read the original article.

Thursday, April 5, 2018

Understanding Facebook's data crisis: 5 essential reads

What will Mark Zuckerberg say to Congress? AP Photo/Noah Berger
Jeff Inglis, The Conversation

Most of Facebook’s 2 billion users have likely had their data collected by third parties, the company revealed April 4. That follows reports that 87 million users’ data were used to target online political advertising in the run-up to the 2016 U.S. presidential election.

As company CEO Mark Zuckerberg prepares to testify before Congress, Facebook is beginning to respond to international public and government criticism of its data-harvesting and data-sharing policies. Many scholars around the U.S. are discussing what happened, what’s at stake, how to fix it, and what could come next. Here we spotlight five examples from our recent coverage.

1. What actually happened?

A lot of the concern has arisen from reporting that indicated Cambridge Analytica’s analysis was based on profiling people’s personalities, based on work from Cambridge University researcher Aleksandr Kogan.

Media scholar Matthew Hindman actually asked Kogan what he had done. As Hindman explained, “Information on users’ personalities or ‘psychographics’ was just a modest part of how the model targeted citizens. It was not a personality model strictly speaking, but rather one that boiled down demographics, social influences, personality and everything else into a big correlated lump.”

2. What were the effects of what happened?

On a personal level, this scale of data collection – particularly for the 50 million Facebook users who had never consented to having their data collected by Kogan or Cambridge Analytica – was distressing. Ethical hacker Timothy Summers noted that democracy itself is at stake:

“What used to be a public exchange of information and democratic dialogue is now a customized whisper campaign: Groups both ethical and malicious can divide Americans, whispering into the ear of each and every user, nudging them based on their fears and encouraging them to whisper to others who share those fears.”

3. What should I do in response?

The backlash has been significant, with most Facebook users expressing some level of concern over what might be done with personal data Facebook has on them. As sociologists Denise Anthony and Luke Stark explain, people shouldn’t trust Facebook or other companies that collect massive amounts of user data: “Neither regulations nor third-party institutions currently exist to ensure that social media companies are trustworthy.”

4. What if I want to quit Facebook?

Many people have thought about, and talked about, deleting their Facebook accounts. But it’s harder than most people expect to actually do so. A communications research group at the University of Pennsylvania discussed all the psychological boosts that keep people hooked on social media, including Facebook’s own overt protestations:

“When one of us tried deactivating her account, she was told how huge the loss would be – profile disabled, all the memories evaporating, losing touch with over 500 friends.”

5. Should I be worried about future data-using manipulation?

If Facebook is that hard to leave, just think about what will happen as virtual reality becomes more popular. The powerful algorithms that manipulate Facebook users are not nearly as effective as VR will be, with its full immersion, writes user-experience scholar Elissa Redmiles:

“A person who uses virtual reality is, often willingly, being controlled to far greater extents than were ever possible before. Everything a person sees and hears – and perhaps even feels or smells – is totally created by another person.”

And people are concerned now that they’re too trusting.

Jeff Inglis, Science + Technology Editor, The Conversation

This article was originally published on The Conversation. Read the original article.