A few weeks ago, Facebook Messenger introduced new rules around message forwarding to limit the spread of misinformation on its platform. Speaking today at TechCrunch Disrupt 2020, Facebook Messenger VP Stan Chudnovsky offered more detail about how Facebook views its role in fighting the spread of misinformation and other harmful content across its messaging platform, while still balancing the idea that Messenger is meant to be a platform for private and, at times, encrypted conversations (aka “secret” messages).
Chudnovsky explained that Facebook’s goal is to make Messenger feel like the digital equivalent of a private conversation between friends and family in the living room. But he also acknowledged that with the rise of digital tools and new mediums, Facebook needs to be cognizant of how those tools can be abused.
“Messenger is obviously a private means of communicating. And we want to make sure it is private. This is a very important priority for us,” Chudnovsky started. But when users begin forwarding messages at scale, Messenger is no longer about having a private conversation; it becomes a tool for one-to-many information sharing, he explained.
“This is…more like a public broadcast,” he said.
Facebook first announced last year that it was “adding friction” to message forwarding for Messenger users in Sri Lanka, limiting how many times a particular message could be shared. The limit was set at five people or groups. Those same rules have now expanded across the Messenger platform, with the same forwarding limit of five people or groups.
The new limits, the exec continued, aim to stop this spamming behavior. “Certain pieces of information cannot be forwarded too many times…that’s something that we think is really going to help in stopping the spread of the misinformation, especially in the times that we are in right now,” he added.
Chudnovsky also noted that because of how Messenger is connected to Facebook, when false information gets flagged by Facebook’s partnered fact-checkers, that same warning about the information’s inaccuracy can be inserted into any Messenger conversations, to alert users who may have been sent the misleading or otherwise harmful content.
“That doesn’t violate privacy at all because it all comes through the same big pipelines,” he pointed out.
Facebook’s website that details how its fact-checking program works doesn’t yet include a mention of Messenger, only Facebook and Instagram.
One thing Facebook won’t consider is putting an end to link-sharing entirely, Chudnovsky said.
“I think those things are core to the internet,” Chudnovsky said of link-sharing and forwarding. “[Completely banning] the ability for people to exchange information on the internet defeats the purpose of [the] internet itself,” he said.