Facebook has given a full-throated defence of end-to-end message encryption, despite admitting that the security feature helps “bad people do bad things.”
Days after WhatsApp cofounder Jan Koum quit Facebook amid reports he disagreed with plans to collect user data on the messaging service, Facebook reassured users that encryption isn’t going anywhere anytime soon.
WhatsApp users have enjoyed end-to-end message encryption since 2016, while Facebook Messenger users can opt in to the feature. End-to-end encryption secures messages so that no one other than the sender and recipient — not even Facebook — can read them.
Facebook’s defence of encryption will be welcomed by those concerned about data breaches, but it is likely to frustrate governments and law enforcement agencies that believe access to messages could be key to preventing and investigating terror attacks and other crimes.
Only last year, the British government demanded access to WhatsApp following the Westminster terror attack, in which attacker Khalid Masood was active on the service just two minutes before driving into pedestrians.
“We need to make sure our intelligence services have the ability to get into situations like encrypted WhatsApp,” former UK Home Secretary Amber Rudd said at the time.
Gail Kent, a former executive at the British National Crime Agency and now Facebook’s global public policy lead on security, admitted that she understood the frustration, having seen the debate from both sides of the fence.
“I hear from government officials who question why we continue to enable end-to-end encryption when we know it’s being used by bad people to do bad things. That’s a fair question,” she said in a blog post.
But she said there would be a clear trade-off without encryption — a trade-off Facebook is not prepared to make. Kent explained: “It would remove an important layer of security for the hundreds of millions of law-abiding people that rely on end-to-end encryption.”
She added that a “backdoor” for law enforcement agencies to access messages of just those suspected of wrongdoing would be impossible to create without it being “discovered and exploited by bad actors.”
Facebook does hand over some “limited personal information” from WhatsApp users when hit with legal requests from public agencies. Kent admitted that this has proved “controversial,” but said it is a compromise when there is a need for both “secure ways to communicate and strong safeguards against everyday threats.”