this post was submitted on 02 Apr 2025
641 points (98.8% liked)

Technology

[–] Lightsong@lemmy.world 84 points 1 day ago (23 children)

1.8m users, how the hell did they run that website for 3 years?

[–] danny161@discuss.tchncs.de 24 points 1 day ago (5 children)

That’s unfortunately (not really sure) probably the fault of Germany’s approach to this. It usually doesn’t take these websites down but tries to find the people behind them and arrest them. The argument is: they will just use a backup and start a “KidFlix 2” or something like that. Some investigations show that this is not the case and that deleting is very effective. The German approach also completely ignores the victims’ side. They have to deal with old men masturbating to footage of them getting raped online. Very disturbing…

[–] drmoose@lemmy.world 4 points 8 hours ago (2 children)

I used to work in netsec, and unfortunately governments everywhere still suck at hiring security experts.

That being said, hiring here is extremely hard - you need to find someone with below-market salary expectations who is willing to work on such an ugly subject. Very few people can do that. I do believe money fixes this, though. Just pay people more, and I'm sure every European citizen wouldn't mind a 0.1% tax increase for a more effective investigation force.

[–] Geetnerd@lemmy.world 2 points 6 hours ago

Discovery of this kind of thing is as old as civilization.

Someone runs their mouth, or you catch someone with incriminating evidence on them. Then you lean on them to tell you where to go.

[–] Ledericas@lemm.ee 1 points 6 hours ago (1 children)

They probably make double or triple that in the private sector; I doubt the govt can match that salary. FB probably paid even more, before they started using AI to sniff out CP.

[–] drmoose@lemmy.world 1 points 6 hours ago

I'm a senior dev, and tbh I'd take a lower salary for the right cause, though having to work with this sort of material is probably the main bottleneck here. I can't imagine how the people working on this can even fall asleep.

[–] Maeve@midwest.social 1 points 8 hours ago

And yet there are cases like Kim Dotcom, Snowden, Manning, Assange...

[–] TheProtagonist@lemmy.world 21 points 20 hours ago* (last edited 17 hours ago)

I think you are mixing two different aspects of this and of similar past cases. In the past there was often a problem with takedowns of such sites, because German prosecutors did not regard themselves as being in charge of takedowns if the servers were somewhere overseas. Their main focus was to get the admins and users of those sites and put them in jail.

In this specific case they were observing the platform (together with prosecutors from other countries, in an orchestrated operation) to gather as much data as possible about its structure, payment flows, admins, and users before moving into action and making arrests. The site was taken down in the process.

If you blow up and delete such a darknet service immediately upon discovery, you may get rid of it (temporarily), but you might not catch many of the people behind it.

[–] recall519@lemm.ee 11 points 20 hours ago (1 children)

This feels like one of those things where couch critics aren't qualified. There's a pretty strong history of three-letter agencies using this strategy successfully against other organized crime industries.

[–] Geetnerd@lemmy.world 1 points 6 hours ago

Like I stated earlier, someone was caught red-handed and snitched to get a lesser sentence.

[–] taladar@sh.itjust.works 9 points 21 hours ago (2 children)

Honestly, if the existing victims have to deal with a few more people masturbating to the existing video material, and in exchange it leads to fewer future victims, it might be worth the trade-off - but it is certainly not an easy choice to make.

[–] Geetnerd@lemmy.world 2 points 6 hours ago* (last edited 6 hours ago) (3 children)

Well, some pedophiles have argued that AI-generated child porn should be allowed, so that no real humans are harmed or exploited.

I'm conflicted on that. Naturally, I'm disgusted and repulsed. I AM NOT ADVOCATING IT.

But if no real child is harmed...

I don't want to think about it, anymore.

[–] ZILtoid1991@lemmy.world 4 points 4 hours ago (1 children)

The issue is, AI is often trained on real children, sometimes even on real CSAM (allegedly), which makes the "no real children were harmed" part not necessarily 100% true.

Also, since AI can generate photorealistic imagery, it muddies the waters for the real thing.

[–] Geetnerd@lemmy.world 1 points 4 hours ago

I didn't think about that.

The whole issue is abominable and odious.

[–] misteloct@lemmy.world 1 points 4 hours ago (1 children)

Somehow I doubt allowing it actually meaningfully helps the situation. It sounds like an alcoholic arguing that a glass of wine actually helps them not drink.

[–] Geetnerd@lemmy.world 1 points 3 hours ago

I agree.

There's no helping actual pedophiles. That's who they are.

[–] Ledericas@lemm.ee 3 points 6 hours ago (1 children)

That is still CP, and distributing CP still harms children; eventually they want to move on to the real thing, as porn no longer satisfies them.

[–] yetAnotherUser@discuss.tchncs.de 4 points 14 hours ago* (last edited 14 hours ago) (1 children)

It doesn't though.

The most effective way to shut these forums down is to register bot accounts that scrape links to the clearnet direct-download sites hosting the material and then report every single one.

If everything posted to these forums were deleted within a couple of days, their popularity would falter. And victims much prefer having their footage deleted to letting it stay up for years to catch a handful of site admins.

Frankly, I couldn't care less about punishing the people hosting these sites. It's an endless game of cat and mouse and will never be fast enough to meaningfully slow down the spread of CSAM.

Also, these sites don't produce CSAM themselves. They just spread it - most of the CSAM exists already and isn't made specifically for distribution.

[–] taladar@sh.itjust.works 2 points 11 hours ago (1 children)

Who said anything about punishing the people hosting the sites? I was talking about punishing the people uploading and producing the content - the ones doing the part that is orders of magnitude worse than anything else about this.

[–] yetAnotherUser@discuss.tchncs.de 2 points 7 hours ago (1 children)

I'd be surprised if many "producers" are caught. From what I have heard, most uploads on those sites are reuploads, because that's orders of magnitude easier.

Of the 1,400 people caught, I'd say maybe 10 were site administrators and the rest passive "consumers" who didn't use Tor. I wouldn't get my hopes up that many of those caught ever committed child abuse themselves.

I mean, 1,400 identified out of 1.8 million really isn't a whole lot to begin with.

[–] taladar@sh.itjust.works 1 points 7 minutes ago

If most are reuploads anyway, that kills the whole argument that deleting things works, though.
