this post was submitted on 10 Jun 2025
324 points (98.5% liked)

Memes

[–] AbnormalHumanBeing@lemmy.abnormalbeings.space 71 points 2 weeks ago (3 children)

Meanwhile, the POV bots should be getting:

(I have to set one up for my Fediverse stuff one of these days as well)

[–] wise_pancake@lemmy.ca 38 points 2 weeks ago (2 children)

I keep seeing this on serious sites and it makes me happy

[–] bigBananas@feddit.nl 9 points 2 weeks ago (3 children)

Such a weird thing that it essentially discriminates against Mozilla-based browsers, though. I'd expect bots to follow the most-used approach. So no, this does not make me happy... although the anime girl kind of does.

[–] Anafabula@discuss.tchncs.de 17 points 2 weeks ago

It doesn't discriminate against Mozilla-based browsers. It checks whether the User-Agent string contains "Mozilla".

For historical reasons, every browser (and every piece of software pretending to be a browser) has "Mozilla" in its User-Agent string.

This is a User-Agent string for Google Chrome on Windows 10:

Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/137.0.0.0 Safari/537.36
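A quick sketch of why that check matches everything (hypothetical code, not Anubis's actual source): any mainstream browser passes a plain "Mozilla" substring test, because they all carry the legacy prefix.

```python
def looks_like_browser(user_agent: str) -> bool:
    """Return True if the User-Agent claims to be a browser.

    Every major browser still ships the legacy "Mozilla/5.0" prefix,
    so a simple substring check catches them all.
    """
    return "Mozilla" in user_agent

chrome_ua = ("Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
             "AppleWebKit/537.36 (KHTML, like Gecko) "
             "Chrome/137.0.0.0 Safari/537.36")

print(looks_like_browser(chrome_ua))     # True: Chrome carries the prefix
print(looks_like_browser("curl/8.7.1"))  # False: plain curl does not
```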
[–] nickwitha_k@lemmy.sdf.org 4 points 2 weeks ago

Is it blocking you? I use Gecko almost exclusively at this point and haven't had an issue yet.

[–] tastemyglaive@lemmy.ml 4 points 2 weeks ago

What's the problem with Gecko browsers exactly? The only issue I have is disabling JShelter for new domains.

[–] Dreaming_Novaling@lemmy.zip 8 points 2 weeks ago

At first I was only getting it on some proxy and Fediverse services, and didn't think much of it because I assumed it was just something small projects used instead of Cloudflare/Google. But now I've been seeing it on more "official" websites, and I'm happy about it after taking the time to read its GitHub page.

I especially love it since I don't have to cry over failing 30 "click the sidewalk" captchas in a row for daring to use a VPN + uBlock + Librewolf to look at a single page of search results. I can sit on my ass for 5 sec and breeze through, assured that I'm not a robot 🥹
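The reason it only costs a few seconds is that Anubis hands the browser a proof-of-work challenge: hash until you find a nonce that meets a difficulty target. A minimal sketch of that general pattern (the shape of proof-of-work, not Anubis's exact parameters or wire format):

```python
import hashlib

def solve(challenge: str, difficulty: int) -> int:
    """Client side: brute-force a nonce so that SHA-256(challenge + nonce)
    starts with `difficulty` hex zeroes. Generic proof-of-work sketch,
    not Anubis's actual scheme."""
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
        if digest.startswith("0" * difficulty):
            return nonce
        nonce += 1

def verify(challenge: str, nonce: int, difficulty: int) -> bool:
    """Server side: a single hash checks what took the client thousands."""
    digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
    return digest.startswith("0" * difficulty)
```

Raising the difficulty makes the client-side search exponentially more expensive while verification stays one hash, which is why it costs a human a few seconds but a scraper fleet a fortune.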

[–] brbposting@sh.itjust.works 14 points 2 weeks ago

It was created by Xe Iaso in response to Amazon's web crawler overloading their Git server, as it did not respect the robots.txt exclusion protocol and would work around restrictions.

Jeff wouldn’t do that!

[–] RVGamer06@sh.itjust.works 9 points 2 weeks ago

Even a Wikipedia page lmfao

[–] NotProLemmy@lemmy.ml 8 points 2 weeks ago* (last edited 2 weeks ago)

If humans can't view the page, neither can bots.