Telegram has quietly updated its policy to let users report private chats to its moderators, after founder Pavel Durov was arrested in France last month over “crimes committed by third parties” on the platform. The messaging app, used by nearly 1 billion people each month, has long had a reputation for minimal moderation of user interactions.
On Thursday night, Telegram began rolling out changes to its moderation policy. The company now lets users flag illegal content for moderators via a “Report” button available in all Telegram apps. It has also provided an email address for automated takedown requests, to which users can send links to content that needs attention.
It is unclear how this change will affect Telegram’s response to requests from law enforcement agencies in the future. The company has previously agreed to share some information about its users as required by court orders.
Durov’s arrest in France is tied to an investigation into offenses including the distribution of child sexual abuse material, drug trafficking, and fraudulent transactions. Following his arrest, Durov criticized the action on his Telegram channel, calling it a mistake to charge a CEO for crimes committed by others on the platform.
He argued that the established practice for countries unhappy with an internet service is to take legal action against the service itself, not its management. Durov warned that if entrepreneurs are held personally liable for the misuse of their products, innovation will suffer. TechCrunch has contacted Telegram for comment on the policy changes.