this post was submitted on 08 Jun 2025
1322 points (97.1% liked)

Microblog Memes

8121 readers
3129 users here now

A place to share screenshots of microblog posts, whether from Mastodon, Tumblr, ~~Twitter~~ X, KBin, Threads or elsewhere.

Created as an evolution of White People Twitter and other tweet-capture subreddits.

Rules:

  1. Please put at least one word relevant to the post in the post title.
  2. Be nice.
  3. No advertising, brand promotion or guerrilla marketing.
  4. Posters are encouraged to link to the toot, tweet, etc. in the description of posts.

founded 2 years ago
top 50 comments
[–] supersquirrel@sopuli.xyz 24 points 6 days ago (1 children)

In 30 years the world will be an ecological wasteland from all the energy we spent pursuing dumb shit hype like "AI".

[–] Tryenjer@lemmy.world 5 points 6 days ago* (last edited 6 days ago) (1 children)

It seems we are heading towards the Fallout timeline.

[–] Noodle07@lemmy.world 4 points 6 days ago

That would be the best-case scenario

[–] bizza@lemmy.zip 21 points 6 days ago

He tweeted, with a Ghibli-slop avatar

[–] menas@lemmy.wtf 21 points 6 days ago (2 children)

Running LLMs in 30 years seems really optimistic

[–] TriflingToad@sh.itjust.works 5 points 6 days ago (3 children)

How so? They can't make locally-run LLMs shit, and I assume hardware isn't going to get any worse

[–] frezik@midwest.social 5 points 6 days ago* (last edited 6 days ago) (1 children)

There are local LLMs, they're just less powerful. Sometimes, they do useful things.

The human brain uses around 20W of power. Current models are obviously using orders of magnitude more than that to get substantially worse results. I don't think power usage and results are going to converge enough before the money people decide AI isn't going to be profitable.
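A rough back-of-the-envelope version of that comparison, as a minimal Python sketch. The 20 W brain figure is from the comment above; the accelerator wattage and seconds-per-query are purely illustrative assumptions, not measurements of any real model.

```python
# Back-of-the-envelope comparison of brain power draw vs. energy per LLM query.
# All LLM-side numbers below are assumptions for illustration only.

BRAIN_POWER_W = 20.0      # rough figure often cited for the human brain
GPU_POWER_W = 700.0       # assumed draw of one high-end accelerator under load
SECONDS_PER_QUERY = 5.0   # assumed time one query keeps that accelerator busy

energy_per_query_j = GPU_POWER_W * SECONDS_PER_QUERY        # joules per answer
brain_equivalent_s = energy_per_query_j / BRAIN_POWER_W     # seconds of brain runtime per answer

print(f"One query ~ {energy_per_query_j:.0f} J, "
      f"enough to run a brain for ~{brain_equivalent_s:.0f} s "
      f"({GPU_POWER_W / BRAIN_POWER_W:.0f}x the brain's instantaneous draw)")
```

Under those assumed numbers the gap is roughly one to two orders of magnitude per query, which is the kind of spread the comment is pointing at; swap in your own figures to see how sensitive the conclusion is.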

[–] jj4211@lemmy.world 3 points 6 days ago (1 children)

The power consumption of the brain doesn't really tell us anything about what we should expect to expend on LLMs... Our brains are not just a biological implementation of what LLMs do.

[–] frezik@midwest.social 6 points 6 days ago (2 children)

It gives us an idea of what's possible in a mechanical universe. It's possible an artificial human-level consciousness and intelligence will use less power than that, or maybe somewhat more, but it's a baseline that we know exists.

[–] spicehoarder@lemm.ee 4 points 6 days ago

You're making a lot of assumptions. One of them is that the brain is more efficient in compute per watt than our current models; I'm not convinced that's true, especially for specialized applications. Even if we brought power usage below 20 watts, the reason we currently use more is because we can, not because each model is becoming more and more bloated.

[–] Tryenjer@lemmy.world 3 points 6 days ago* (last edited 5 days ago)

Yeah, but an LLM has little to do with a biological brain.

I think Brain-Computer Interfaces (BCIs) will be the real deal.

[–] Buddahriffic@lemmy.world 3 points 6 days ago (1 children)

I was thinking in a different direction, that LLMs probably won't be the pinnacle of AI, considering they aren't really intelligent.

[–] menas@lemmy.wtf 2 points 6 days ago

Assuming there would be enough food for the people maintaining and fixing that hardware, I'm not confident that we will have enough electricity to run LLMs at massive scale

[–] WorldsDumbestMan@lemmy.today 2 points 6 days ago

It literally runs on my phone, and is at least decent enough at pretending to care that you can vent to it.
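For context, on-device inference like the comment describes typically looks something like the sketch below. This assumes the llama-cpp-python bindings and a small quantized GGUF model already on disk; the model file name is a placeholder, not a specific recommendation.

```python
# Minimal sketch of running a small quantized model locally (phone or laptop)
# via the llama-cpp-python bindings. "tiny-model.Q4_K_M.gguf" is a placeholder
# for whatever small GGUF model you have available.
from llama_cpp import Llama

llm = Llama(model_path="tiny-model.Q4_K_M.gguf", n_ctx=2048)

out = llm("I had a rough day today. ", max_tokens=64)
print(out["choices"][0]["text"])
```

A heavily quantized model in the 1–3 B parameter range is usually what makes this feasible on phone-class hardware, at the cost of the quality gap the rest of the thread is arguing about.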

[–] DragonAce@lemmy.world 10 points 6 days ago
[–] BlessedDog@lemmy.world 5 points 6 days ago (4 children)

This guy's name translates to something like "Matt Cock"
