August 28th, 2024

Telegram repeatedly refuses to join child protection schemes

Telegram has declined to join child protection initiatives run by NCMEC and the IWF, raising concerns about its effectiveness in combating child sexual abuse material despite its claims that it moderates harmful content.


Telegram, the messaging app founded by Pavel Durov, has consistently declined to participate in international child protection initiatives aimed at detecting and removing child abuse material online. The app is not affiliated with the National Center for Missing & Exploited Children (NCMEC) or the Internet Watch Foundation (IWF), both of which collaborate with numerous online platforms to combat such content.

Durov, who has been arrested in France over alleged content moderation failures relating to drug trafficking and child sexual abuse material, maintains that Telegram actively moderates harmful content. However, the platform's absence from established programs such as NCMEC's CyberTipline raises doubts about its effectiveness in addressing child sexual abuse material (CSAM). Reports indicate that Telegram has ignored multiple requests from NCMEC to join its efforts, and the IWF has noted that Telegram's lack of membership limits the app's ability to proactively identify and remove confirmed CSAM.

Telegram also does not publish regular transparency reports, unlike other social networks, which release data on content removed in response to law enforcement requests. This lack of transparency and cooperation with child protection organizations has drawn criticism, especially as Telegram remains popular in regions with significant problems of online abuse.

- Telegram refuses to join child protection programs run by NCMEC and the IWF.

- Pavel Durov, Telegram's CEO, is under investigation in France for content moderation failures.

- The app claims to moderate harmful content but lacks participation in established reporting systems.

- Telegram does not publish regular transparency reports on content removal.

- The platform's inaction raises concerns about its effectiveness in combating child sexual abuse material.

10 comments
By @squarefoot - 5 months
Here's a cartoon I found years ago that says it better than a thousand words. I've posted it half a dozen times, and it keeps being spot on.

https://starecat.com/content/wp-content/uploads/control-of-i...

By @southernplaces7 - 5 months
I guess when you label your mendacious, snooping, encryption-breaking, backdoor sneak schemes created for the sake of easier mass surveillance as "child protection measures", moral alchemy turns them into wholesome good programs that only monsters would object to.
By @codedokode - 5 months
Note that only US companies are required by (US, I guess) law to join these programs.

Also note this part:

> IWF said that the company did remove CSAM once material was confirmed but said it was slower and less responsive to day-to-day requests.

So in the end Telegram removed the content.

I think it would be better if Telegram used the hash lists; however, they should rely on manual review rather than removing content automatically, because this is a US platform that could theoretically be misused to remove legal content that the US government doesn't like.
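
For readers unfamiliar with the mechanism: hash-list schemes distribute digests of confirmed material, and platforms compare uploads against them. Below is a minimal sketch of the match-then-review flow this comment proposes. Every name in it is hypothetical, and real deployments use perceptual hashes (e.g. PhotoDNA) that survive re-encoding, not plain SHA-256, which only catches byte-identical files.

    import hashlib
    from pathlib import Path

    # Hypothetical list of digests of confirmed material, as distributed
    # by organizations like the IWF or NCMEC. SHA-256 is an illustrative
    # simplification; real systems use perceptual hashing.
    KNOWN_BAD_HASHES: set[str] = set()  # populated from the provider's list

    review_queue: list[Path] = []

    def check_upload(path: Path) -> None:
        """Compare an upload against the hash list and queue any match
        for manual review, rather than removing it automatically."""
        digest = hashlib.sha256(path.read_bytes()).hexdigest()
        if digest in KNOWN_BAD_HASHES:
            # Per the comment above: a human moderator confirms the
            # match before takedown, so a bad entry on the list cannot
            # silently remove legal content.
            review_queue.append(path)

The design point is in the last branch: matching only flags, and a human decision sits between the hash list and any removal.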

By @MichaelRo - 5 months
The easiest thing is to say that you are protecting children. End-to-end encryption is the technology that ensures that when your wife sends you the shopping list on WhatsApp, tavarish militsiyan cannot eavesdrop and see that you ran out of toilet paper and liquid soap. But they must see! What if you accidentally dropped some pedoporno on that list? It's for the kids' protection!

So with this attack on Telegram's encryption, the EU definitely didn't want to see what political opponents are doing or who's organizing which protest, so it could be undermined before it happens. We're just hunting pedophiles, what's your problem?

By @akomtu - 5 months
From the French government's perspective, Telegram is a worldwide web of underground tunnels that are inaccessible to the government. And the government, being a paranoid control freak, gets really upset when you hide something from it.
By @rKarpinski - 5 months
> Another norm that Telegram does not conform to in the usual way

Seems beyond a "norm" if your CEO is jailed for not "conforming"

By @notinmykernel - 5 months
Backdooring a social media platform to undermine encryption "in the name of children"... It's for child safety alone. Sure. ::insert eye roll::
By @amy-petrik-214 - 5 months
Hot take: the way to end CSAM (child sexual abuse material), aka CP (child pornography), for those who don't know the acronyms, is to legalize AI-generated CSAM. No child is harmed. As we see, the internet is already full of AI slop, so there can be no question that an infinite amount of CSAM slop could be generated. The reason this would be good is that for anyone looking to profit from or market the REAL material, there would no longer be a market to sell it into, nor a market for giving it away for free to earn some sort of fucked up pedo kudos. For people who are actual victims and whose material was shared, it would become a drop in a vast ocean.