> So in your view it's perfectly ok to run an internet service where you don't check for CSAM?

Well, that's quite the assumption. The commenter you replied to said nothing of the sort. And yet this is the first conclusion you jump to? Is this how you operate in real life, at your job?

Telegram does moderate for CSAM. The claim that it does not is completely unsubstantiated. You can find CSAM across Meta's products. Does that mean they do not check for CSAM? No.



They refuse to take action even when confronted with it directly. That's why Durov is a disgusting human being.

"the app had gained a reputation for ignoring advocacy groups fighting child exploitation.

Three of those groups, the U.S.-based National Center for Missing & Exploited Children (NCMEC), the Canadian Centre for Child Protection and the U.K.-based Internet Watch Foundation, all told NBC News that their outreach to Telegram about child sexual abuse material, often shorthanded as CSAM, on the platform has largely been ignored."



