Meta has begun rolling out automatic encryption for all Facebook and Messenger chats, the company has announced.

Messages and calls protected by end-to-end encryption (E2EE), which has been an option since 2016, can be read only by the sender and recipient.

Under the changes, Meta will no longer have access to the contents of what users send or receive, unless one user in a chat chooses to report a message to the company.
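In practice, this means a message is encrypted on the sender’s device and can only be decrypted with a private key held on the recipient’s device, so any server relaying it sees only scrambled data. The short Python sketch below, written with the open-source PyNaCl library, is purely an illustration of that principle; it is not Meta’s implementation, and the names used are hypothetical.

from nacl.public import PrivateKey, Box  # pip install pynacl

# Each party generates a key pair; the private key never leaves their device.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts using her private key and Bob's public key.
sending_box = Box(alice_key, bob_key.public_key)
ciphertext = sending_box.encrypt(b"See you at 7?")

# A server passing `ciphertext` along sees only opaque bytes; only Bob,
# holding his private key, can decrypt it.
receiving_box = Box(bob_key, alice_key.public_key)
print(receiving_box.decrypt(ciphertext))  # b'See you at 7?'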

The new features will be available immediately, but the company said it would take some time for end-to-end encryption to be rolled out to the platform’s more than one billion users.

Users will receive a prompt to set up a recovery method to restore their messages once the transition is completed.

However, critics, including UK police and government figures, have claimed the rollout will make it harder to detect child sexual abuse on the platform.

Loredana Crisan, head of Messenger, wrote in a post announcing the change: “The extra layer of security provided by end-to-end encryption means that the content of your messages and calls with friends and family are protected from the moment they leave your device to the moment they reach the receiver’s device.

“This means that nobody, including Meta, can see what’s sent or said, unless you choose to report a message to us.”

She added: “This is the biggest set of improvements to Messenger since it was first launched in 2011.

“I’m proud of what Messenger has become: a fast and reliable service, with enjoyable features and strong safety tools, and now with the added privacy and security of end-to-end encryption.”

Apps including iMessage, Signal and WhatsApp already protect the privacy of messages with E2EE.

Simon Bailey, a former police chief constable who was national lead for child protection at the National Police Chiefs’ Council, accused Meta of a “complete loss of social and moral responsibility” over the plans.

John Carr, secretary of a coalition of UK children’s charities focused on internet safety, called the move “utterly unconscionable”.

Their comments came after the head of the National Crime Agency, Graeme Biggar, said introducing end-to-end encryption on Facebook would be like “consciously turning a blind eye to child abuse”.

The then home secretary Suella Braverman alleged that Facebook Messenger and Instagram direct messages were the platforms of choice for online paedophiles, telling the BBC that “we are arresting in this country about 800 perpetrators a month, we are safeguarding about 1,200 children a month from this evil crime”.

Meta said it had worked with outside experts, academics, advocates and governments to identify risks to “ensure that privacy and safety go hand-in-hand”.

It said: “When E2EE is default, we will also use a variety of tools, including artificial intelligence, subject to applicable law, to proactively detect accounts engaged in malicious patterns of behaviour instead of scanning private messages.”

The firm also announced that it would add a number of new features, including the ability to edit messages for up to 15 minutes after they have been sent.

It will also give users the ability to control whether people who send them messages receive “read receipts” telling them a message has been read.

NSPCC chief executive Sir Peter Wanless said: “By starting to roll out end-to-end encryption on their services, Meta are choosing to turn a blind eye to crimes against children we know to be proliferating on their platforms. Where is their duty of care to children in taking this step?

“Without telling us how they will spot such activity in future, we can only conclude they are happy to allow groomers to exploit young people at will on their services, instead of enabling abusers to be spotted and punished. This flies in the face of the priority the public attaches to basic child safety online.”

Susie Hargreaves, chief executive of the Internet Watch Foundation, said: “We are outraged Meta has chosen to prioritise the privacy of paedophiles over the safety of our children. We strongly urge other platforms not to follow this dreadful example.

“This catastrophic decision to encrypt messaging services, without demonstrating how protection for children won’t be weakened, will lead to at least 21 million reports of child sexual abuse going undetected. Meta is effectively rolling out the welcome mat for paedophiles.

“It is now up to Ofcom to show its teeth and demonstrate it is serious about protecting the privacy and safety of some of the most vulnerable people in our society.”

Source: Independent
