this post was submitted on 15 Apr 2025
1285 points (96.0% liked)
memes
you are viewing a single comment's thread
I hate that we call any algorithm that extracts information from data "AI." If people consider something like linear regression (a plain supervised model) to be "AI," then the term stops meaning much of anything. Hell, even neural networks are just a shit ton of additions and multiplications.
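To make that last point concrete, here's a rough sketch (NumPy, with made-up layer sizes, purely illustrative) of everything a tiny neural network does on a forward pass: it really is just multiplies and adds, plus a max() for the nonlinearity.

```python
import numpy as np

# A tiny two-layer network's entire forward pass: multiply, add, max, repeat.
# Layer sizes and random weights are made up purely for illustration.
rng = np.random.default_rng(0)

x = rng.normal(size=4)                           # input vector
W1, b1 = rng.normal(size=(8, 4)), np.zeros(8)    # first layer weights and bias
W2, b2 = rng.normal(size=(2, 8)), np.zeros(2)    # second layer weights and bias

h = np.maximum(W1 @ x + b1, 0.0)                 # multiplications, additions, ReLU
y = W2 @ h + b2                                  # more multiplications and additions
print(y)
```

Training just repeats the same kind of arithmetic with derivatives thrown in.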
All computing is just shit tons of math operations.
That's the "artificial" part of "artificial intelligence", so I'm not really sure what you expect AI to look like.
I'm not a big fan of LLMs and I don't think they're intelligent, but if you're disqualifying them based on using math, then nothing is ever going to satisfy you.
I agree, and "computing" is a great umbrella term for all math operations. There's a reason you used the term LLM instead of AI: LLM better describes what you're referring to. The name reflects the function, or at least the most defining characteristic, of the thing it names.
The way people throw around "Artificial Intelligence" feels wrong to me. The words suggest these models are conscious or sentient, which they're not, so the term ends up being misleading.
So while it’s not technically wrong to use "AI" as a catch-all for anything data-driven, I don’t think it’s nearly as useful or accurate as more specific terms like LLM.
Also, when I hear the term AI, it's usually from people who have no idea what they're actually talking about. It's always in the vague, buzzword-y context of "we need to AI our processes," even though, realistically, most systems already have some form of "AI" baked in.
Edit: It's just a huge buzzword that's starting to lose meaning to me: https://www.youtube.com/watch?v=-qbylbEek-M&t=26s
I don't like this take, because intelligence isn't defined as "human" and AI is not "artificial human." Saying linear regression is not AI is the most pseudo-intellectual thing ever at this point. We get it, you saw a guy on Twitter say it, but do you even know what it means, and that it's just that guy's opinion?
When I hear "Artificial Intelligence," I picture a computer with sentience, something that thinks and perceives like an animal or human. But instead, the term is being thrown around to describe basic models like linear regression, which clearly don't think for themselves. Even if it's meant as a shorthand for all algorithms, calling every math operation "AI" cheapens the meaning and blurs the line between true intelligence and simple computation.
Basically, AI research is the study of the mathematics of knowledge, learning, and reasoning. Models aren't just "linear regression," and you could just as well say linear regression is a beautifully simple way to find patterns in high-dimensional space; it's just word choice. You can hate on the hype all you want, but it isn't going away, because it was here long before the hype.
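To give that "patterns in high-dimensional space" line something concrete, here's a minimal sketch (NumPy, with made-up sample count, dimension, and noise level) of ordinary least squares recovering a hidden linear pattern from noisy 50-dimensional data:

```python
import numpy as np

# Minimal sketch: ordinary least squares finding a hidden linear pattern
# in 50-dimensional data. Sample count, dimension, and noise are made up.
rng = np.random.default_rng(1)

n, d = 200, 50
X = rng.normal(size=(n, d))            # 200 points in a 50-dimensional space
true_w = rng.normal(size=d)            # the hidden pattern
y = X @ true_w + 0.1 * rng.normal(size=n)

w_hat, *_ = np.linalg.lstsq(X, y, rcond=None)   # minimize ||X w - y||^2
print(np.max(np.abs(w_hat - true_w)))           # small: the pattern was recovered
```

The closed-form least-squares solve keeps the example short; the point is only that even a simple linear model genuinely extracts structure from high-dimensional data, whatever label you want to put on it.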