this post was submitted on 03 Mar 2025
570 points (96.9% liked)

Technology

[–] BlameTheAntifa@lemmy.world 10 points 6 days ago (1 children)

There’s no need for huge, expensive datacenters when we can run everything on our own devices. SLMs and local AI are the future.

[–] yarr@feddit.nl 5 points 6 days ago (1 children)

This feels kinda far-fetched. It's like saying "well, we won't need cars, because we'll all just have jetpacks that we use to get around." I totally agree that eventually a useful model will run on a phone; I disagree it's going to be soon enough to matter to this discussion. To give you some idea of the scale: DeepSeek is a recent model with 671B parameters, while devices like phones are running 7-14B models. So eventually what you say will be feasible, but we have a ways to go.
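To put very rough numbers on that gap, here's a back-of-the-envelope sketch (weights only; it ignores KV cache and activations, and ignores that DeepSeek is a mixture-of-experts, so not all 671B parameters are active per token):

```python
# Rough memory needed just to hold model weights, by parameter count and precision.
def weight_memory_gb(params_billion: float, bytes_per_param: float) -> float:
    """GB required to store the weights alone."""
    return params_billion * 1e9 * bytes_per_param / 1024**3

for name, params in [("phone-class 7B", 7), ("phone-class 14B", 14), ("DeepSeek-scale 671B", 671)]:
    for precision, nbytes in [("fp16", 2), ("int4", 0.5)]:
        print(f"{name:>20} @ {precision}: ~{weight_memory_gb(params, nbytes):7.1f} GB")
```

A 7B model quantized to int4 fits in a few GB, which a phone can hold; a 671B model needs hundreds of GB even quantized.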

[–] BlameTheAntifa@lemmy.world 1 points 6 days ago

The difference is that we’ll just be running small, specialized, on-demand models instead of huge, resource-heavy, all-purpose models. It’s already being done. Just look at how Google and Apple are approaching AI on mobile devices. You don’t need a lot of power for that, just plenty of storage.
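It genuinely is already being done on consumer hardware. Here's a minimal sketch using the llama-cpp-python bindings; the model file name is a placeholder for any small quantized GGUF model, and the parameters are illustrative:

```python
# Minimal sketch: run a small quantized model entirely on-device, no datacenter involved.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/small-instruct-q4.gguf",  # placeholder: any small quantized GGUF model
    n_ctx=2048,    # modest context window keeps memory use low
    n_threads=4,   # a few CPU cores is enough for a small model
)

out = llm("Explain in one sentence why small on-device models are useful:", max_tokens=64)
print(out["choices"][0]["text"])
```

The trade-off is exactly the one above: you give up the huge all-purpose model and spend storage on a handful of small, specialized ones.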

[–] RizzRustbolt@lemmy.world 2 points 6 days ago

Poor ELIZA, she's going to have to start hitting the corners again.

[–] Petter1@lemm.ee 0 points 6 days ago

I read it more as them being confident they can make running LLMs less resource-intensive 🤔

[–] Blue_Morpho@lemmy.world 122 points 1 week ago (5 children)

Cancelling new data centers because DeepSeek has shown a more efficient path isn't proof that AI is dead, as the author claims.

Fiber buildouts were cancelled back in 2000 because multimode made existing fiber more efficient. The Internet investment bubble popped. That didn't mean the Internet was dead.

[–] scarabic@lemmy.world 3 points 5 days ago

This is a good point. It’s never sat right with me that LLMs require such overwhelming resources and cannot be optimized. It’s possible that innovation has been too fast to worry about optimization yet, but all this BS about building new power plants and chip foundries for trillions of dollars and whatnot just seems mad.

[–] contrafibularity@lemmy.world 76 points 1 week ago (3 children)

yeah, genai as a technology and field of study may not disappear. genai as an overinflated product marketed as the be all end all that would solve all of humanity's problems may. the bubble can't burst soon enough

[–] scarabic@lemmy.world 1 points 5 days ago* (last edited 5 days ago)

Sometimes the hype bubble bursts and then the product eventually grows to be even larger than the hype. But you never know how connected hype actually is to any realistic timeline. Hype can pop like a cherry tree flowering on the first sunny day of spring, thinking summer has arrived, only to get drenched by another few weeks of rain. And as stupid as that cherry tree is, summer will eventually arrive.

[–] ShepherdPie@midwest.social 19 points 1 week ago

Exactly. It's not as if this tech is going in the dumpster, but all of these companies basing their multi-trillion-dollar market cap on it are in for a rude awakening. Kinda like how the 2008 housing market crash didn't mean that people no longer owned homes, but we all felt the effects of it.

[–] frezik@midwest.social 2 points 1 week ago

Historically, the field of AI research has gone through boom and bust cycles. The first boom was during the Vietnam War with DARPA dumping money into it. As opposition to the Vietnam War grew, DARPA funding dried up, and the field went into hibernation with only minor advancement for decades. Then the tech giant monopolies saw an opportunity for a new bubble.

It'd be nice if it could be funded at a more steady, sustainable level, but apparently capitalism can't do that.

[–] FooBarrington@lemmy.world 46 points 1 week ago (14 children)

I'm gonna disagree - it's not like DeepSeek uncovered some upper limit to how much compute you can throw at the problem. More efficient hardware use should be amazing for AI since it allows you to scale even further.

This means that MS isn't expecting these data centers to generate enough revenue to be profitable, and they're not willing to bet on further advancements that might make them profitable. In other words, MS doesn't have a positive outlook for AI.

Exactly. If AI were to scale like the people at OpenAI hoped, they would be celebrating like crazy, because their scaling goal was literally infinity. Seriously, the plan OpenAI had a year ago was to scale their AI compute to be the biggest energy consumer in the world, with many dedicated nuclear power plants just for their data centers. So if they don't grab onto any and every opportunity for more energy, it means they have lost faith in their original plan.

[–] Kazumara@discuss.tchncs.de 29 points 1 week ago (2 children)

Fiber buildouts were cancelled back in 2000 because multimode made existing fiber more efficient.

Sorry but that makes no sense in multiple ways.

  • First of all, single mode fiber provides orders of magnitude higher capacity than multi mode.

  • Secondly, the modal patterns depend on the physics of the cable, specifically its core diameter: single mode fiber has a 9 micrometer core, multi mode 50 or 62.5 micrometers. So you can't change the light modes on existing fiber (quick numbers below).

  • Thirdly, multi mode fiber existed first, so it couldn't be the improvement. And single mode fiber was already becoming the way forward for long-distance transmission in 1982, with the first transatlantic cable using it laid in 1988. So it couldn't be the improvement of 2000 either.

You must mean something else entirely.
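To put quick numbers on the core-diameter point (step-index approximation, illustrative nominal values rather than any particular fiber's spec sheet):

```python
# Normalized frequency: V = (2 * pi * a / wavelength) * NA, with a = core radius.
# A fiber guides only one mode when V < 2.405; above that, mode count grows ~ V^2 / 2.
import math

def v_number(core_diameter_um: float, na: float, wavelength_um: float) -> float:
    a = core_diameter_um / 2  # core radius in micrometers
    return (2 * math.pi * a / wavelength_um) * na

for label, diameter, na in [("9 um single-mode core", 9.0, 0.12),
                            ("50 um multimode core", 50.0, 0.20)]:
    V = v_number(diameter, na, wavelength_um=1.55)
    modes = 1 if V < 2.405 else round(V**2 / 2)
    print(f"{label}: V ≈ {V:.2f} -> ~{modes} guided mode(s)")
```

The mode structure is baked into the glass geometry, which is why it can't be changed in software on already-buried cable.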

[–] sploosh@lemmy.world 14 points 1 week ago

I think they conflated multimode with DWDM.

[–] b3an@lemmy.world 3 points 1 week ago

Yeah, you echo my thoughts actually. That efficiency could be found in multiple areas, including DeepSeek, and perhaps also that some other political things may be a bit more uncertain.

[–] ohshittheyknow@lemmynsfw.com 69 points 1 week ago (5 children)

AI is a tool, like a hammer. Useful when used for its purpose. Unfortunately every tech company under the sun is using it for the wrong fucking thing. I don't need AI in my operating system or my browser or my search engine. Just let it work on protein folding, chemical synthesis and other more useful applications. Honestly can't wait for the AI hype to calm the fuck down.

[–] Petter1@lemm.ee 1 points 6 days ago

It works pretty well as a research/learning tool at my job. I learned a lot very fast using AI as a tool in my browser.

[–] AbsoluteChicagoDog@lemm.ee 27 points 1 week ago

You forgot mass surveillance. It's great at that.

[–] Tattorack@lemmy.world 19 points 1 week ago (2 children)

The only way it's going to die down is if it gets replaced with the next tech bro buzzword.

The previous one was "smart", and it stuck around for a very long time.

[–] Sturgist@lemmy.ca 1 points 6 days ago

Seems you've forgotten NFTs. Swear to fuck, if some firm thought their customers really were that dumb, they'd have claimed every bottle of their milk had an integrated NFT and that the lactose protein was backed by the Blockchain™.

[–] Cryophilia@lemmy.world 18 points 1 week ago (1 children)
[–] cyberpunk007@lemmy.ca 6 points 1 week ago (1 children)

Preach it. I've been so sick of the AI hype, rolling my eyes any time a business advertises it and in some cases just moving on. I don't care about your glorified chat bot or search engine.

[–] drmoose@lemmy.world 2 points 1 week ago

It'll balance out. I'm old enough to remember plenty of web tech being this way, from Flash to Bluetooth to the Cloud.

[–] sundrei@lemmy.sdf.org 52 points 1 week ago

There's been talk for a while that "AI" has reached a point where merely scaling up compute power is yielding diminishing returns; perhaps Microsoft agrees with that assessment.
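For what it's worth, the Chinchilla-style scaling fit from Hoffmann et al. (2022) makes that diminishing-returns point concrete. The constants below are the paper's approximate published values, and this is purely an illustrative sketch, not a prediction:

```python
# Chinchilla-style loss fit: L(N, D) ≈ E + A / N**alpha + B / D**beta
# (approximate constants from Hoffmann et al. 2022; illustrative only)
E, A, B, alpha, beta = 1.69, 406.4, 410.7, 0.34, 0.28

def predicted_loss(n_params: float, n_tokens: float) -> float:
    return E + A / n_params**alpha + B / n_tokens**beta

tokens = 1.4e12  # fixed training-token budget, roughly Chinchilla-scale
for n in [7e9, 70e9, 700e9, 7e12]:
    print(f"{n/1e9:>7.0f}B params -> predicted loss ≈ {predicted_loss(n, tokens):.3f}")
```

Each extra 10x in parameters buys a smaller loss improvement than the last, while the compute bill keeps growing 10x.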

[–] simplejack@lemmy.world 35 points 1 week ago (7 children)

My guess is that, given Lemmy’s software developer demographic, I’m not the only person here who is close to this space and these players.

From what I’m seeing in my day to day work, MS is still aggressively dedicated to AI internally.

[–] jj4211@lemmy.world 15 points 1 week ago

That's compatible with a lack of faith in profitable growth opportunity.

So far they have gone big with what I'd characterize as more evolutionary enhancements to tech. While that may find some acceptance, it's not worth quite enough to pay off the capital investment in this generation of compute. If they overinvest and hope to eventually recoup by not upgrading, they are at severe risk of being superseded by another company that saved some expenditure to have a more modest, but more up-to-date, compute infrastructure.

Another possibility is that they predicted a huge boom of other companies spending on Azure hosting for AI stuff, and they are predicting those companies won't have the growth either.

[–] theneverfox@pawb.social 7 points 1 week ago

I think DeepSeek shook them enough to realize what should have been obvious for a while: brute force doesn't beat new techniques, and spending the most might not be the safest bet.

There are a ton of new techniques being developed all the time to do things more efficiently, and if you don't need a crazy context window, in many use cases you can get away with much smaller models that don't need massive datacenters.
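Rough sketch of why the context window is the expensive part: KV-cache memory grows linearly with context length. The model shape below is roughly 7B-Llama-like and purely illustrative (real deployments shrink this with grouped-query attention and quantization):

```python
# Per-token KV cache ≈ 2 (K and V) * n_layers * n_kv_heads * head_dim * bytes_per_elem
def kv_cache_gb(context_len, n_layers=32, n_kv_heads=32, head_dim=128, bytes_per_elem=2):
    per_token = 2 * n_layers * n_kv_heads * head_dim * bytes_per_elem
    return context_len * per_token / 1024**3

for ctx in [2_048, 8_192, 32_768, 131_072]:
    print(f"context {ctx:>7}: ~{kv_cache_gb(ctx):5.1f} GB of KV cache (fp16, no GQA)")
```

Keep the context modest and the cache stays in the low single-digit GB range, which is exactly why smaller-context, smaller-model setups don't need massive datacenters.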

[–] homesweethomeMrL@lemmy.world 5 points 1 week ago (2 children)

I am sure the internal stakeholders of Micro$oft's AI strategies will be the very last to know. Probably as they are instructed to clean out their desks.


Why would a company or government use Azure or Windows if MS is compromising it with AI?

Pick a lane

[–] turnip@sh.itjust.works 3 points 1 week ago* (last edited 1 week ago)

Because investors expect it, whether it generates profit or not. I guess we will see how it changes workflows, or whether people continue to do things like they always have.

[–] shads@lemy.lol 28 points 1 week ago (2 children)

Hmm, not meaning to get my conspiracy hat on here but do we think this could relate to the fact that Microsoft now has a quantum computing chip that they can hype to their investors to show they have the next big thing in the bag?

AI has served its purpose and is no longer strategically necessary?

Since they are only spending investors' money, it doesn't matter if they burn billions on leading the industry down the wrong path, and now they can let it rot on the vine and rake in the next round of funding while the competition scrambles to catch up.

[–] balder1991@lemmy.world 1 points 6 days ago (1 children)

How would that be a conspiracy? If the AI bubble eventually bursts, I'm sure Microsoft won't want to be among the last ones to leave.

[–] shads@lemy.lol 1 points 6 days ago

Had a big thing written out, but didn't like it when I read it back. So keeping it simple: I equivocated to try to deflect some of the potentially rough replies from the cultists who have already drunk the Kool-Aid.

[–] _stranger_@lemmy.world 5 points 1 week ago

Literally IBM a decade ago. AI->Quantum

[–] misk@sopuli.xyz 17 points 1 week ago (3 children)

"Microsoft stopped building AI data center infrastructure, therefore Microsoft signals that there's not enough demand" is a valid point in itself, but not enough to merit a blog post that's this long.

I'm getting the impression that minor fame and success went to Ed Zitron's head, because he now brags about those word counts and other pretentious shit on BlueSky constantly.

[–] gmtom@lemmy.world 13 points 1 week ago (1 children)

Yeah I mean, when has Microsoft of all companies ever been wrong about the future of technology.......

[–] cyberpunk007@lemmy.ca 3 points 1 week ago

Hmmm, let me just bring this up on Internet Explorer on my Windows Phone.

[–] WhatSay@slrpnk.net 6 points 1 week ago (1 children)

It's not like Microsoft has their finger on the pulse of technology advancement; they only got involved with AI to seem relevant, and now it's not worth doing anymore.

[–] balder1991@lemmy.world 5 points 6 days ago

I was thinking this. Microsoft got a stake in OpenAI and has been paying them with cheap credits to run on their data centers. I guess they're starting to worry that once the house of cards collapses, they'll be the ones picking up the pieces of any over-investment.

Maybe, thanks to tariffs, importing components made overseas will become cost-prohibitive versus any expected gains from further development of LLMs/AI. Or perhaps, in addition, an expected economic downturn has caused them to re-evaluate large investments in the immediate future. Or maybe they think AI is dumb.

[–] Darkcoffee@sh.itjust.works 3 points 1 week ago

I had a feeling this was coming.
