this post was submitted on 28 Jul 2025
66 points (94.6% liked)

Futurology

[–] givesomefucks@lemmy.world 35 points 1 day ago (3 children)

This list included words like “underscore,” “comprehend,” “bolster,” “boast,” “swift,” “inquiry,” and “groundbreaking,” in addition to “delve” and “meticulous.” The researchers then tracked the frequency of these words in over a million YouTube videos and podcast episodes from before and after ChatGPT’s launch.

Sounds more like YouTube "content producers" are likely using AI to generate the words they read aloud.

That is something I've noticed on lots of scientific content. You start realizing the person saying the words doesn't actually understand it, and whoever (or whatever) wrote it doesn't really understand it either.

Extrapolating social media content into everyday human speech is just fucking ridiculous though, especially when it's so heavily censored. Real people don't talk like that in person.
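The frequency-tracking methodology quoted above can be sketched in a few lines. This is a hypothetical illustration, not the researchers' actual code: the marker list is an illustrative subset from the quote, and the cutoff date and function names are assumptions.

```python
import re

# Illustrative subset of the study's marker words (from the quoted list)
MARKERS = {"delve", "meticulous", "underscore", "comprehend", "bolster",
           "boast", "swift", "inquiry", "groundbreaking"}

def marker_rate(transcript: str) -> float:
    """Return marker-word occurrences per 1,000 words of a transcript."""
    words = re.findall(r"[a-z']+", transcript.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w in MARKERS)
    return 1000 * hits / len(words)

def compare(corpus: list[tuple[str, str]], cutoff: str = "2022-11-30"):
    """corpus: list of (ISO date, transcript) pairs.
    Returns average marker rates (before_cutoff, after_cutoff).
    ISO date strings compare correctly as plain strings."""
    pre = [marker_rate(t) for d, t in corpus if d < cutoff]
    post = [marker_rate(t) for d, t in corpus if d >= cutoff]
    avg = lambda xs: sum(xs) / len(xs) if xs else 0.0
    return avg(pre), avg(post)
```

A rise in the post-cutoff average relative to the pre-cutoff one is the kind of signal the study reports; as the comments below note, it says nothing about *why* the words appear more often.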

[–] Mac@mander.xyz 1 points 1 day ago

Fun fact: did you know Veritasium is partially owned by a VC firm?

Probably unrelated, right?

[–] Rhaedas@fedia.io 2 points 1 day ago (1 children)

People using a word without paying attention to or understanding its meaning is not a new phenomenon born of LLM usage. The first thing that came to mind is the regional use of "could care less". There are plenty more examples, and they easily predate even the internet. I do agree that widespread use of LLMs can inject word choices that might be new to some people and show up as a spike overall, but remember that LLMs are trained on word-use frequency from human writing. Besides, language is constantly evolving, so any stimulus is going to affect the direction it takes.

[–] givesomefucks@lemmy.world 2 points 1 day ago (1 children)

It's not that they don't know what a specific word means, it's that they don't know what they're talking about; they're just reading a script.

Then noticing that who/whatever wrote the script also doesn't know anything.

You're talking about something completely different from what I was in the comment you replied to.

[–] Rhaedas@fedia.io 1 points 1 day ago

I took what you said as implying more than just that all YT performers read scripts blindly. If that's all you meant, fair enough. I was just expanding on the thread overall: the internet has helped spread and change language in its short lifetime, and now that's being spurred by lots of people using the same tool, one that behaves in certain ways, creating increases where they might not normally have occurred. They're termed YT "influencers" for a reason; their viewpoint and focus, as well as any reading of a script without understanding, can have a large effect quickly, and if their viewers are already exposed to the same tool and seeing the same stuff, that reinforces the evolution even more.

But again, this isn't a new thing, just far faster because of how it propagates.

[–] Lugh@futurology.today 2 points 1 day ago

Sounds more like YouTube “content producers” are likely using AI to generate the words they read aloud.

I've noticed this too, and it sounds like an example of what Marshall McLuhan was talking about when he said "The Medium is the Message". The form of a medium (e.g., TV, print, digital) has a more profound effect on society than the actual content it carries.

[–] Broadfern@lemmy.world 18 points 1 day ago (1 children)

AI writes like academic nerds -> wider populace uses AI more -> wider populace uses vocabulary beyond a third grade reading level -> anyone with the misfortune of having been a nerd before all this gets accused of being/using AI

I’ve had to intentionally trim down my vocabulary and use of punctuation to try and remain visibly human online. It sucks.

[–] drspod@lemmy.ml -1 points 1 day ago (1 children)

AI writes like academic nerds

In what branch of academia? SEO optimization?

[–] phdepressed@sh.itjust.works 5 points 1 day ago* (last edited 1 day ago)

Scientific articles use a lot of this type of writing to try and be like "hey guys this is really important and no one else has done it yet"/"hey this is important and we did it better than those other guys". "novel" as in new was extremely prominent for a while.

Saying that the research you're doing is important, unique, new, or faster/better is necessary to make people agree to fund or publish it. Using more "complex" words lets you do so with a chance of not sounding like every other paper or article, but then the goalposts shifted, so now you have to sound like that just to be considered mediocre; not using that wording can get you judged as "less", and therefore less able to get funding or publish.

Edit: biomedical areas especially, but other scientific disciplines too

[–] Stillwater@sh.itjust.works 21 points 1 day ago (1 children)

I already used most of these words

[–] SpaceNoodle@lemmy.world 11 points 1 day ago (1 children)
[–] Stillwater@sh.itjust.works 16 points 1 day ago

Thank you for your correction! I appreciate your input and will take it into account. If you have any further insights or questions, feel free to share!

[–] falidorn@lemmy.world 9 points 1 day ago (2 children)

These are all normal words…

[–] CluckN@lemmy.world 5 points 1 day ago

Found the AI

[–] Jarix@lemmy.world 1 points 1 day ago* (last edited 1 day ago)

That's not the point though.

We are ceding the future of our own language, an incredibly important part of how we form thoughts and express ideas to ourselves and other people, to machines that are invading our private spaces more and more. Like an ooze you can't hold back, it will eventually get inside your own mind.

We have designed these machines for this purpose, intentionally and unintentionally.

It won't improve, enhance, or expand the ability to think of things from different perspectives; it will contract it. It will also allow us to communicate with more people, but over time it will erode the individuality of language into a homogenized pattern of speaking, and therefore of thinking.

It's absolutely fucking disgusting {to me anyways}

And this isn't the start of that process, but it is absolutely going to be one of the most significant chapters for our species, if it proliferates the way it does in my mind's eye. People are already asking machines how to be better humans... And people are just okay with this. It's fucking Looney Tunes. And what's really crazy is... it will actually help a lot of people in some ways. They will, in those ways, be better off.

It's definitely got some chilling connotations.

Jesus I don't want to live in this world anymore.

We have no fucking self control.

We have no hope

Fuck this

This reminds me of that study done once where a baby chimp was raised with a baby human to see how intelligent the primate could become. The study ended when the human baby started thinking it was a chimp, not the other way around. Same energy.

[–] jjjalljs@ttrpg.network 10 points 1 day ago (2 children)

Hypothesis: Stupider people with weaker senses of self are more likely to use chatgpt. That kind of person is also more likely to adopt the language of others. There are many such people.

Thus, the language used on average changes.

I expect smarter people with stronger identities won't have much change.

[–] Lugh@futurology.today 9 points 1 day ago* (last edited 1 day ago) (1 children)

Stupider people with weaker senses of self are more likely to use chatgpt.

No. AI use correlates with being younger and more educated.

Characteristics of ChatGPT users from Germany: implications for the digital divide from web tracking data

[–] jjjalljs@ttrpg.network 4 points 1 day ago (1 children)

Interesting. Thanks for the link. I realize I didn't really define "smart", so I'm not sure education is a good proxy for it, but this does make my hypothesis look more doubtful

[–] Lugh@futurology.today 3 points 1 day ago

I think you can find ethically good, bad and gray uses for AI.

The top commenter here mentions YouTube content creators using it. Most of them are on YT to make money, so it's a rational, smart choice to let AI do your writing if it makes you more efficient and means you can earn more.

[–] Skua@kbin.earth 4 points 1 day ago (1 children)

It's pretty normal to adopt the language of others, isn't it? I can't pretend to be all that knowledgeable on linguistics, but that just sounds like what a dialect is

[–] jjjalljs@ttrpg.network 1 points 1 day ago

To an extent, but think about how some people are super eager to get in on the latest in-joke and drive it into the ground. You see it on forums sometimes, where someone will come up with a bit and then some people just want to keep repeating it forever. They're the kind of person who relentlessly said "I'm on a boat" in 2010.

[–] Petter1@discuss.tchncs.de 8 points 1 day ago

I suspect that this style of speaking now shows up more on YouTube because more of the young people who trained the AI with their social media activity are now the ones uploading YouTube videos 🤔

ChatGPT:

•	Spurious

Implies that the observed relationship is false or coincidental rather than genuine. Often used in the phrase “spurious correlation.” – E.g. “They’ve drawn a spurious link between AI and slang evolution.”

And/or

•	Fallacious

Indicates reasoning that’s logically unsound—i.e. based on invalid inferences. – E.g. “The authors’ fallacious argument mistakes youth-driven word-choice for AI-driven change.”

[–] BlueLineBae@midwest.social 2 points 1 day ago

It's always funny when I get an email at work that starts with "Dear ____, I hope this email finds you well."