Facebook’s news feed experiments have sparked a debate about ‘informed consent’. Photograph: DADO RUVIC/REUTERS

MPs say they need 'serious discussion' with social networks over users' data

This article is more than 9 years old

All-party committee demands clearer terms and conditions, and suggests global ‘kitemark’ for responsible services

Social networks should simplify their terms and conditions, to ensure that their users fully understand how their personal data will be collected and used, MPs have concluded.

A report by the Commons science and technology committee calls for the British government to work with the Information Commissioner’s Office (ICO) on new guidelines for how social media companies should explain their data collection policies to users.

The MPs criticised the “opaque, literary style” of many social networks’ terms and conditions documents, suggesting that they “are drafted for use in American court rooms” rather than for non-lawyers to understand, and thus give their informed consent to however that company plans to use their personal data.

“We doubt that most people who agree to terms and conditions understand the access rights of third parties to their personal data. The terms and conditions currently favoured by many organisations are lengthy and filled with jargon,” claimed the report.

‘Facebook’s ‘negative emotions’ experiment highlighted concerns’

The report follows a series of controversies around how social networks and apps are using – or misusing – personal data collected from their users. That includes Facebook’s infamous experiment to manipulate the stories shown in 689,000 users’ news feeds, to see if it could affect their emotional state.

The researchers working on the study argued that it was legal, because people agreeing to Facebook’s data use policies when signing up for the social network had given informed consent for this kind of research.

“Facebook’s experiment with users’ emotions highlighted serious concerns about the extent to which ticking the terms and conditions box can be said to constitute informed consent when it comes to the varied ways data is now being used by many websites and apps,” said Andrew Miller MP, chair of the science and technology committee.

In Facebook’s written evidence to the committee, however, the company argued that “the highly prescriptive nature of explicit consent requirements could actually result in more intrusive mechanisms to ask for consent for specific activities”, suggesting that “inundating people with tick boxes and warnings” may trivialise the issue of consent rather than reinforcing it.

‘This report is not drawn up from a position of mistrust’

Miller told the Guardian that the report was not drawn up “from a position of distrust” of the social networks, but called for more dialogue between these companies, regulators and their users.

“There’s a place somewhere for a serious discussion to occur between the key players to say ‘how can we actually make this work better, more effectively, and more fairly for the consumer?’ but at the same time not wreck the business model for these businesses,” said Miller.

However, he admitted that “these companies are sometimes their own worst enemy” in their reluctance to engage in the debate around their data collection and privacy policies, which may fuel theories that the lack of clarity around these policies is intrinsic to their business models.

“The conspiracy theorists, I think, are wrong, but these companies don’t always help themselves, and so some of these stories gather momentum,” said Miller.

“My solution would be let’s create a real dialogue between social media companies, independent experts and government perhaps acting as no more than a referee. I don’t think in the first instance I would reach for the regulatory stick. I’d want to persuade somebody to set up a really good, top-practice model that other people are then encouraged to adopt.”

Calls for new international kitemark for responsible use of data

The committee’s report came four months after the ICO published its own Big Data and Data Protection report calling for companies processing personal data – including social networks – to ensure they were operating within existing data protection laws.

The new report calls for an “internationally recognised kitemark” that could be used to show which social media platforms were treating their users’ data responsibly, but also suggested that the government should be setting better standards through its own services, citing the controversial care.data plans to collect and share medical records of NHS patients.

“Whilst we support the government in encouraging others to meet high standards, we expect it to lead by example. The government cannot expect to dictate to others, when its own services, like care.data, have been found to be less than adequate,” concluded the MPs, calling for the government to explain how it plans to audit those services for their use of personal data.

Data protection expert Dr David Erdos, of the University of Cambridge, told the Guardian that any crackdown on personal data and informed consent runs the risk of “lots of things being wrapped up in the same framework”.

He pointed to a “vast difference” between companies like Facebook and Twitter collecting personal data to profile individuals for commercial uses, and less individualised forms of research, whether by companies using anonymised analytics to improve their services, or academic research “for the public benefit”.

Current data protection laws are adequate – but not enforced

Erdos also suggested that the key challenge in this area is that while existing data protection legislation is strong, enforcement of those laws is weak, and so lack of compliance from online companies is “vast”.

He pointed to another report involving the ICO from September 2014: a global survey of more than 1,200 mobile apps that found 85% of them failed to clearly explain how they were collecting, using and disclosing users’ personal information.

“At the moment, we have quite stringent laws – possibly overly stringent in some cases – but really limited enforcement,” he said. “Perhaps what needs to happen is to make the standards more realistic, and to do more enforcement, to make it clear that those standards aren’t just paper-based: they live in the real world.”

Miller agreed that enforcement is a major challenge. “There’s a real enforcement problem. The whole of this area is fraught with difficulties. There aren’t quick fixes to these things, but where you are getting front-end problems, as we’re talking about in our report, are there practical ways of picking off some of those issues? I think there are,” he said.

“I’d certainly encourage the social media companies not to put up the shutters, but to start to help promote a dialogue about practical solutions that people buy into. We need to have a serious discussion around something that is manifestly a problem now, and likely to become an even bigger problem as more of us buy in to different forms of social media.”
