this post was submitted on 03 Jul 2025
846 points (95.0% liked)

196

 
[–] qaz@lemmy.blahaj.zone 3 points 6 days ago (1 children)

This image has characteristics of generative AI, but I'll allow it considering the importance of the subject (and because I didn't catch it before)

[–] IAmNorRealTakeYourMeds@lemmy.world 8 points 6 days ago (2 children)

Yeah, this is definitely a skeleton:

As much as I dislike AI, it's becoming accepted as the norm. The genie is out of the bottle; we're stuck with it now.

[–] isVeryLoud@lemmy.ca 2 points 5 days ago (2 children)

I find genAI imagery extremely uncanny and creepy, and I can't condone the use of a system whose creators yearn for the day when companies won't have to pay human creators anymore and can simply funnel those funds directly into the pockets of giant corporations instead.

Additionally, commercial-scale generative AI is already destroying the environment in communities across the world due to its power use.

It's not something I can accept or condone, and I will continue to shame people for facilitating this transfer of wealth and the destruction of our environment.

[–] Hackworth@sh.itjust.works 2 points 5 days ago* (last edited 5 days ago)

Speaking as a professional "creative": art and commerce are antithetical. I'll be happy to see the relationship end.

[–] IAmNorRealTakeYourMeds@lemmy.world 2 points 5 days ago (1 children)

Yeah, you're not wrong.

The funny thing about AI:

I've seen artists use AI to make amazing art, but it takes them hours, and the AI is more of a really fancy brush.

A non-artist, however, can churn out slop with no effort at all.

The key is still love and hard work.

That's what separates soulless slop from work where someone just used some AI to fill in a bit of background.

Uncreative slop is the problem, and AI makes it trivial to create.

[–] isVeryLoud@lemmy.ca 2 points 5 days ago* (last edited 5 days ago)

To be clear, I'm perfectly OK with ethically trained (open-source weights and dataset) generative AI being used locally on a small scale; I think generative AI is a double-edged technology, like anything else.

It should never be the end product, but simply a tool.

In this picture, you can see the skeleton is weird and the other images and text are a bit wonky; these elements should have been touched up by a human. This is what I consider slop: raw AI output has a look and feel that makes it immediately identifiable, and it's up to the artist to touch it up and adjust the colours. Again, it should never be the final product. Something as simple as the text should probably have been created normally.

I'm against Meta, Google, OpenAI, Anthropic, MidJourney, etc.'s use of generative AI for the reasons stated above, but small scale genAI on your local device? Go for it.

[–] dgdft@lemmy.world 2 points 6 days ago* (last edited 6 days ago) (1 children)

I challenge you or anyone else who thinks this is AI to try to duplicate the image using any standard gen AI tooling. Please post what you get, I fucking dare you.

This is 100% crappy vector art thrown together into a crappy infographic by hand, and that thing on the bottom of the skeleton is called a pelvis.

[–] IAmNorRealTakeYourMeds@lemmy.world 1 points 5 days ago (1 children)

Those icons are definitely AI-made.

The text and composition, that's by a human. But those icons? AI.

[–] dgdft@lemmy.world 1 points 5 days ago* (last edited 5 days ago)

Ah that’s fair, I can see where you’re coming from on that. Those icons could 100% be generated with AI given the right prompting.

To me, they look way more like stock assets, given how generic the symbols are and how consistent the styling is. The "army guard" icon is kinda sus because of the stick "gun", but that can be read as deliberate ambiguity to appease potential corporate customers who don't want gun depictions in their vector stock images; same deal with the generic "six-point star".

You'd also think they'd have chosen a more detailed depiction of "isolation & surveillance" than a megaphone, or of "fear & control" than a lightning bolt in a head. If any of the accompanying text had been included in the prompt to generate these images, the output would've been completely different.

[–] Duke_Nukem_1990@feddit.org 69 points 1 week ago (8 children)
[–] smock@sh.itjust.works 1 points 5 days ago (1 children)

This seems like a pretty harmless use of AI: it doesn't hurt artists or graphic designers, it just saves some time creating an image that helps OP communicate more effectively. You can argue about the environmental impact of AI in general, but for one image?

I don't understand blanket hatred of ALL AI; there are some cases in which it is more useful than harmful.

[–] Duke_Nukem_1990@feddit.org 1 points 5 days ago

I don't hate all AI; I hate genAI and LLMs.

[–] RobotZap10000@feddit.nl 50 points 1 week ago (7 children)

That """"human"""" skeleton in the fourth item gave it away immediately. Now that I look at it further, "Isolation & Surveillance" and a picture of a megaphone??? "Fear as a tool of control" with a lightning bolt in someone's head??? Did OP even read their slop before vomiting it here?

[–] Duke_Nukem_1990@feddit.org 31 points 1 week ago (1 children)

Also the color of the background. For some reason genAI uses that a lot.

[–] Trainguyrom@reddthat.com 15 points 1 week ago (2 children)

Yeah, I've seen so much AI slop with the yellow tinge. It's kinda hilarious that we're watching AI model collapse in real time, but the bubble keeps growing.

[–] obinice@lemmy.world 16 points 1 week ago (4 children)

What's wrong with the skeleton? It's stylised, of course, as these sorts of icons tend to be, but generally correct: pelvis, spine, ribs, head, etc.

The megaphone seems like a very good way to evoke the image of an abusive overseer controlling the camp's prisoners with modern-day technology, an effective image for a section on monitoring and control, no?

There is no standardised symbol for fear within a person's mind, so again, a stylised symbol showing a lightning bolt is fine, especially given that it's likely there on purpose: think shocks. Shocks of a different kind, which you might receive under an evil, oppressive prison camp system (imagine the sudden shock in one's mind as a guard shouts or lashes out at you; I would certainly consider symbolising that in this manner).

It's as if you've never looked at anything anyone's made with simple clipart and the like before, and assume everything must be extremely deep and custom designed by experts?

Even if this were made with the help of AI, I don't see the message being any less valid just because the person didn't go and download an image editor to a PC, learn how to use it, learn how to import SVG icons and research the most appropriate ones, build the image and export it appropriately, etc.

Not everybody is as skilled or capable as you or I may be at producing something that we might consider simple. Heck, some people only have a smartphone; not everybody has the luxury of owning a PC and proper software, nor the time or inclination to learn such tools.

The message in this image is conveyed very well, and is relevant to the current fascist regime's actions in the USA (and indeed is a universally important message).

If you want to suggest it's bad (or "slop", as you so evocatively put it) just because you don't like the image creator that was used to put it together, well, that's a weird hill to die on, to be honest.

You'd better hope your country never duplicates the USA's slide into fascism, or you yourself may one day end up in a camp... or worse. How quick would you be then to attack the people trying to raise awareness of these human rights abuses, I wonder?

[–] superb@lemmy.blahaj.zone 2 points 6 days ago

Someone owns stock in AI 💀💀

[–] brucethemoose@lemmy.world 2 points 6 days ago* (last edited 6 days ago) (1 children)

that’s a weird hill to die on, to be honest.

Welcome to Lemmy (and Reddit).

Makes me wonder how many memes are "tainted" with old-school ML from before generative AI was common vernacular, like edge enhancement, translation and such.

A lot? What's the threshold before it's considered bad?

[–] superb@lemmy.blahaj.zone -1 points 6 days ago (1 children)

Well, those things aren't generative AI, so there isn't much of an issue with them.

[–] brucethemoose@lemmy.world 2 points 6 days ago* (last edited 6 days ago) (1 children)

What about 'edge-enhancing' NNs like NNEDI3? Or GANs that absolutely 'paint in' inferred details from their training? How big does a model have to be before it becomes 'generative'?

What about a deinterlacer network that's been trained on other interlaced footage?

My point is that there is an infinitely fine gradient through time between good old MS Paint/bilinear upscaling and ChatGPT (or locally runnable txt2img diffusion models). Even now, there's an array of modern ML-based 'editors' that are questionably generative and that most people probably don't even know are working in the background.

[–] superb@lemmy.blahaj.zone -2 points 6 days ago* (last edited 6 days ago) (1 children)

I'd say if there is training beforehand, then it's "generative AI".

[–] brucethemoose@lemmy.world 2 points 6 days ago* (last edited 6 days ago)

Not a great metric either, as models with simpler output (like text embedding models, which just produce vectors used to score 'similarity', or machine vision models that recognize objects) are extensively trained.

Another example is NNEDI3, a very primitive edge enhancer. Or LanguageTool's tiny 'word confusion' model: https://forum.languagetool.org/t/neural-network-rules/2225
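
To make that distinction concrete, here's a minimal Python sketch (assuming the sentence-transformers library and its public "all-MiniLM-L6-v2" checkpoint, neither of which comes from this thread) of an extensively trained model whose only user-facing output is a similarity score, with nothing generated at all:

    # Rough sketch: a heavily trained model that is not "generative".
    # Assumes the sentence-transformers package and the public
    # "all-MiniLM-L6-v2" checkpoint are available locally.
    from sentence_transformers import SentenceTransformer, util

    model = SentenceTransformer("all-MiniLM-L6-v2")  # trained on huge text corpora

    # Encode two phrases into fixed-size vectors; nothing new is synthesized.
    vectors = model.encode(["isolation and surveillance", "a megaphone icon"])

    # The only output the user ever sees is a single similarity number.
    print(float(util.cos_sim(vectors[0], vectors[1])))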

[–] Duke_Nukem_1990@feddit.org 15 points 1 week ago (7 children)

What's wrong with the skeleton is that it has a second head where an ass should be.

[–] superb@lemmy.blahaj.zone 2 points 6 days ago

The rib cage is also way too long, and wtf is that bone under the ass head?

[–] jpeps@lemmy.world 1 points 6 days ago

I think it is a pelvis on the left, as others have said. I have to admit, though, I thought I was looking at two skulls, probably because I was biased to read from left to right, so I just accepted the left one as a skull, and the right one actually does look like a skull. My first thought, though, was that it was an abstract depiction of overcrowding, and that it was intentional to show two skeletons pushed close together.

[–] huppakee@feddit.nl 41 points 1 week ago* (last edited 1 week ago) (5 children)

This might get me a lot of downvotes, but when AI 'draws' text it generates each individual letter, which makes them a bit wiggly and often not on a straight line. The fact that these are all grammatically correct sentences, all on perfectly straight lines, gives me the impression this isn't raw output. It could be that the image was generated and the text added on top later, though, but even the most advanced AI generators aren't this consistent with text.

[–] qaz@lemmy.blahaj.zone 3 points 6 days ago

AFAIK some new "AI" image generators also utilize LLMs to generate text overlays.

[–] Caketaco@lemmy.dbzer0.com 13 points 1 week ago (3 children)

Could be that just the icons are AI-generated, or that the whole image was fed through an AI upscaler/enhancer to sharpen it.

[–] Bebopalouie@lemmy.ca 64 points 1 week ago (14 children)

Oh, you mean the Alligator Auschwitz that was just built in Florida?

[–] weeeeum@lemmy.world 38 points 1 week ago (1 children)

"Well they aren't concentration camps, but even if they are, they aren't that bad, but even if they are that bad, those people deserve it, and even if they were innocent, there are a lot who aren't, and even if they are all innocent, we need to exterminate brown people"

[–] StThicket@reddthat.com 0 points 6 days ago (2 children)

Yeah, can't argue with stupid

[–] weeeeum@lemmy.world 2 points 5 days ago

It's not stupidity, it's just their excuse to hide their agenda.

"Whatever, we need to exterminate brown people" is not stupid, just plain evil.

[–] theneverfox@pawb.social 36 points 1 week ago (4 children)

This is fucking stupid. It's when you concentrate a group in one place. It's that fucking simple. And it's always horrible.

[–] daniskarma@lemmy.dbzer0.com 16 points 1 week ago

Concerts stress me out too, with all those people concentrated in a small place, but I think I would be more stressed in Auschwitz.

[–] jupyter_rain@discuss.tchncs.de 18 points 1 week ago

Information about concentration camps on TikTok because of recent events. Did not have this on my bingo card at all.

[–] Diplomjodler3@lemmy.world 17 points 1 week ago

6/6. Nobody does fascism like Donnie does! Everybody says so. Beautiful fascism!
