Facebook whistleblower claims checked for breach of UK law
The data-privacy watchdog has written to a Facebook whistleblower, requesting her full evidence to see whether the technology company has broken UK law.
The information commissioner, Elizabeth Denham, says she wants to analyse the documents from a UK perspective, particularly relating to children.
Former Facebook employee Frances Haugen claimed the social-media company hid how it used data "behind walls".
But Facebook founder Mark Zuckerberg rejected her allegations.
"Most of us just don't recognise the false picture of the company that is being painted," he said.
'Take action'
Ms Denham, who is stepping down next month, told BBC News: "We're looking very closely about what is publicly available right now from Frances's testimony - but I've also written to her to ask for access to the full reports of her allegations.
"Because what I want to do with that information is analyse it from the UK's perspective - are these harms applicable in the UK, especially through the lens of children?
"We have rolled out a new children's code, which specifies design consideration to protect kids online.
"I want to see if these allegations point to any contravention of UK law and then I will take action."
The whistleblower claimed Facebook's products could pose a risk to children's mental health and stoke divisions in society.
"Facebook's closed design means it has no real oversight," Ms Haugen, who used to work on the company's algorithmic products, told a US Senate committee.
"Only Facebook knows how it personalises your feed for you.
"Facebook hides behind walls that keep researchers and regulators from understanding the true dynamics of their system."
Ms Haugen is to give evidence to the UK Parliament's online safety bill committee on 25 October.
Elizabeth Denham has done far more than most to rein in big tech. But she is alarmed at the mismatch in power between democracies and Silicon Valley - and efforts to politicise the regulator she leaves.
For the past five years, stories about the harms caused by social media giants have followed a wearisome and familiar pattern.
First, scandal erupts. Next, the cry goes up: Something Must Be Done! (See also: Up With This We Will Not Put!)
Then, reflecting headlines and noise on (ironically) social media, a meeting is convened, for which ambassadors of the tech giants are summoned.
Finally… not much happens.
This pattern has been seen over and over and over again.
And the common theme in these stories is how some big societal harm often falls between regulators, with Ofcom, the Advertising Standards Authority and the Competition and Markets Authority all sticking, understandably, to their remits.
Elizabeth Denham, the information commissioner who steps down next month, has been something of an antidote to this pattern.