Complicity - Administration of an online platform to enable an illegal transaction, in an organised gang,
- Refusal to communicate, at the request of the authorised authorities, the information or documents necessary for carrying out and exploiting interceptions authorised by law,
- Complicity - Possession of images of a minor of a child-pornographic nature,
- Complicity - Dissemination, offer or making available, in an organised gang, of images of a minor of a pornographic nature,
- Complicity - Acquisition, transport, possession, offer or sale of narcotics,
- Complicity - Offer, sale or making available, without legitimate reason, of equipment, instruments, programs or data designed or adapted to gain access to or attack an automated data-processing system,
- Complicity - Fraud in an organised gang,
- Criminal association with a view to committing a crime or an offence punishable by at least 5 years of imprisonment,
- Laundering of the proceeds of crimes or offences committed in an organised gang,
- Provision of cryptology services ensuring confidentiality functions without a declaration of conformity,
- Provision of a means of cryptology not exclusively ensuring authentication or integrity-control functions without prior declaration,
- Import of a means of cryptology not exclusively ensuring authentication or integrity-control functions without prior declaration.
Involved with CSAM means having a platform that openly hosts CSAM and not doing anything about it despite many warnings and requests over the years. Do you disagree that that is illegal? See my examples of others being charged...
> Involved with CSAM means having a platform that openly has CSAM and not doing anything about it despite many warnings and requests in years
You'd have to show that he knew about specific instances and declined to intervene.
I'm not saying that's unlikely. But I haven't seen it shown, and I suspect part of what the French police are trying to get is that evidence of wilful inaction versus gross negligence.
So in your view it's perfectly OK to run an internet service where you don't check for CSAM? You're entitled to that view, I suppose, but it's a minority view. Allowing CSAM on your platform is illegal in most places, including the US, where the law that protects Google, Facebook et al. has a carve-out for exactly this sort of material. And in all of those jurisdictions, the public is not going to be able to see the CSAM or examine the substance of the allegations, either before or during trial.
Every social network or file-sharing site I've been aware of has a Trust and Safety department for just this reason, even X. The executives don't want to go to jail.
> So in your view it's perfectly ok to run an internet service where you don't check for CSAM?
Well, that's quite the assumption. The commenter you've replied to said nothing like this. And yet this is your first conclusion?? Is this how you operate in real life, at your job?
Telegram does moderate for CSAM. The claim that it does not is completely unsubstantiated. You can find CSAM across Meta's products. Does that mean they do not check for CSAM? No.
They ignore taking action when confronted with it. That's why Durov is a disgusting human being.
"the app had gained a reputation for ignoring advocacy groups fighting child exploitation.
Three of those groups, the U.S.-based National Center for Missing & Exploited Children (NCMEC), the Canadian Centre for Child Protection and the U.K.-based Internet Watch Foundation, all told NBC News that their outreach to Telegram about child sexual abuse material, often shorthanded as CSAM, on the platform has largely been ignored."
Those are the charges and the core argument. Do you disagree?
"Telegram is a key component of the ecosystem of individuals trading and selling child sexual abuse materials, and is the only major platform to implicitly allow the exchange of CSAM on private channels, many of which are not end-to-end encrypted." Stamos is now chief information security officer at cybersecurity company SentinelOne.
A June report from the Stanford Internet Observatory found that Telegram was the only major platform not to forbid illegal material in private channels and chats. “Telegram has also been observed by SIO as failing to perform even basic content enforcement on public channels, with instances of known CSAM being detected and reported by our ingest systems,” the report said.
Jean-Michel Bernigaud, secretary general of Ofmin, a French police agency focused on preventing violence against minors, said in a LinkedIn post Monday that Durov’s arrest was related to the app’s inability to deal with offensive content against minors. “At the heart of the case is the absence of moderation and cooperation on the part of the platform,” Bernigaud said, “especially in the fight against child sex crimes.”
Yes, of course I disagree. This is one claim among many. Specifically, this appalling characterization and accusation you made:
> Worse, Durov was involved with CSAM material and did nothing about it. For this, he is immoral and in my opinion, a disgusting human being.
You have no idea who Durov is. You have a handful of claims by French authorities without evidence and you immediately jump to loathing this person and reviling them as a "disgusting human being"? Shame on you.
I'm glad we agree that if the claims are true, he would be considered an appalling and awful figure. We'll see what the court decides, that's how justice works.
I trust the French government and several global monitoring CSAM agencies more than I trust a random Twitter or HN user.
You keep good company: the French government continues to shelter the actually convicted sexual predator, Roman Polanski, and sheltered Iran's first Supreme Leader, Ayatollah Khomeini.
You didn't provide a link to the SIO report, but I assume this is it: [1]. The report is mostly dedicated to teenagers trying to find ways to sell self-filmed content. You cherry-picked the claims against Telegram to make the allegations look more serious than they are, and didn't mention that there are more serious claims against Western platforms.
This is a quote from the beginning of the report:
> Large networks of accounts, putatively operated by minors, are openly advertising self-generated child sexual abuse material (SG-CSAM) for sale.
(By the way, this might be because it is very difficult to find a legitimate job if you are a teenager without any natural skills and talents. Why doesn't the government do anything to change this? Where are teenagers from poor families supposed to get money from?)
> Instagram is currently the most important platform for these networks, with features that help connect buyers and sellers

> Instagram’s recommendation algorithms are a key reason for the platform’s effectiveness in advertising SG-CSAM.

> Twitter had an apparent regression allowing CSAM to be posted to public profiles, despite hashes of these images being available to platforms and researchers.
Can we expect to see Musk and Zuckerberg in the same jail with Durov then? Or justice doesn't apply to everyone equally?
Note also that the report gives the following recommendations in its conclusion:
> When an account is identified as selling SG-CSAM, disabling the account should be accompanied by messaging to the seller to attempt to discourage recidivism. This messaging might include:

> The fact that this content is widely illegal and can result in prosecution; being a minor does not prevent legal consequences
So basically, what the report suggests is not to do something to help teenagers from poor families find a legitimate job, but to threaten them with a jail term for selling their own photos. So American!
> A June report from the Stanford Internet Observatory found that Telegram was the only major platform not to forbid illegal material in private channels and chats.
If you read the report, this means that Telegram's ToS do not explicitly forbid posting illegal material in private groups. But do you need to explicitly forbid what is already forbidden by law?
The report contains a further claim, though:

> It further states that “All Telegram chats and group chats are private amongst their participants. We do not process any requests related to them.”
This is alarming, but it is not exactly how it works, because you can actually report messages even in one-to-one private chats (for example, if you get spam from a new contact), and the sender can get blocked. I never received illegal material from contacts (only spam), so I don't have experience reporting it.
> Telegram has also been observed by SIO as failing to perform even basic content enforcement on public channels, with instances of known CSAM being detected and reported by our ingest systems
If you read further, by "failing to perform basic content enforcement" they mean that Telegram doesn't check posted images against a database of known CSAM hashes, and they imply that Telegram is obliged to do this. However, I am not sure the law requires this.
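For context, "checking against a CSAM database" usually means comparing a fingerprint of each upload to a shared list of hashes of known material. A minimal sketch of the exact-match variant is below; note that real deployments use perceptual hashes such as Microsoft's PhotoDNA (which survive re-encoding and cropping), and the hash sets are distributed under strict access control. The known-hash entry here is just the SHA-256 of a placeholder byte string, purely for illustration.

```python
import hashlib

# Hypothetical known-bad hash set. Real systems use perceptual hashes
# shared by clearinghouses like NCMEC, not plain SHA-256 of example data.
KNOWN_HASHES = {
    # SHA-256 of the placeholder b"foo", standing in for a real entry
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def sha256_hex(data: bytes) -> str:
    """Exact-match fingerprint of an uploaded file's bytes."""
    return hashlib.sha256(data).hexdigest()

def should_block(upload: bytes, known_hashes: set = KNOWN_HASHES) -> bool:
    """Return True if the upload's fingerprint is in the known-bad set."""
    return sha256_hex(upload) in known_hashes
```

The debate in the thread is essentially whether a platform is legally required to run a check like this on every upload, or merely expected to by advocacy groups.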
Now I want to comment on other vaguely written claims.
> Telegram is a key component of the ecosystem of individuals trading and selling child sexual abuse materials,
What makes Telegram a "key component"? Did Durov design Telegram and add features with the primary intent of making the sale of illegal materials easier? This sounds implausible.
> At the heart of the case is the absence of moderation
Does he mean a lack of pre-moderation (reviewing every message before posting) or a lack of response to reports? There definitely is moderation in Telegram, so the "absence of moderation" doesn't ring true to me. It would be good if they presented more details instead of vague words.
> absence of ... cooperation
"cooperation" is a vague word. Maybe France just wants to be able to read all messages in private groups under an excuse of fighting crime? This would be a completely different story then.
No social network is perfect, but talk to me when Twitter or Meta ignore CSAM agencies' repeated requests. I'll be waiting. Otherwise, Telegram is complicit.
"the app had gained a reputation for ignoring advocacy groups fighting child exploitation.
Three of those groups, the U.S.-based National Center for Missing & Exploited Children (NCMEC), the Canadian Centre for Child Protection and the U.K.-based Internet Watch Foundation, all told NBC News that their outreach to Telegram about child sexual abuse material, often shorthanded as CSAM, on the platform has largely been ignored."
Ignoring reports of illegal material is one thing; ignoring invitations to join US-based programs, or to cooperate with them, which Telegram is not required to do by law, is a different thing. The article for some reason doesn't clearly state which it means; the author uses vague, ambiguous wording instead, like politicians do.
The article mentions a 2023 SIO report [1] on minors trying to earn money by selling their own photos online; the report mentions Telegram, but notes that Instagram and Twitter are worse:
> Instagram is currently the most important platform for these networks, with features that help connect buyers and sellers

> Instagram’s recommendation algorithms are a key reason for the platform’s effectiveness in advertising SG-CSAM.

> Twitter had an apparent regression allowing CSAM to be posted to public profiles, despite hashes of these images being available to platforms and researchers.
Yet, for some strange reason Musk and Zuckerberg are not under investigation.
Note that the report also doesn't give governments any recommendations to help minors legally earn the money they need, which would address the root issue.
Source?