this post was submitted on 13 Aug 2025
444 points (95.5% liked)

Technology

(page 2) 50 comments
[–] Zak@lemmy.world 12 points 4 days ago

The study is based on having LLMs decide to amplify one of the top ten posts on their timeline or share a news headline. LLMs aren't people, and the authors have not convinced me that they will behave like people in this context.

The behavioral options are restricted to posting news headlines, reposting news headlines, or being passive. There's no option to create original content, and no interventions centered on discouraging reposting. Facebook has experimented with limits to reposting and found such limits discouraged the spread of divisive content and misinformation.
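The basic idea of a reshare limit is easy to sketch. Everything below — the cap value, the class, the function names — is an illustrative assumption, not any platform's actual implementation:

```python
from dataclasses import dataclass
from typing import Optional

MAX_RESHARE_DEPTH = 2  # illustrative cap; the limits actually tested aren't public

@dataclass
class Post:
    author: str
    text: str
    reshare_depth: int = 0  # 0 = original content

def try_reshare(post: Post, by: str) -> Optional[Post]:
    """Allow a reshare only while the chain is still shallow."""
    if post.reshare_depth >= MAX_RESHARE_DEPTH:
        return None  # chain too deep: the user must compose an original post
    return Post(author=by, text=post.text, reshare_depth=post.reshare_depth + 1)

original = Post("alice", "divisive headline")
hop1 = try_reshare(original, "bob")    # allowed, depth 1
hop2 = try_reshare(hop1, "carol")      # allowed, depth 2
hop3 = try_reshare(hop2, "dave")       # blocked: returns None
```

A cap like this directly targets the long reshare cascades that spread divisive content, which is exactly the kind of intervention the study's action space leaves out.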

I mostly use social media to share pictures of birds. This contributes to some of the problems the source article discusses. It causes fragmentation; people who don't like bird photos won't follow me. It leads to disparity of influence; I think I have more followers than the average Mastodon account. I sometimes even amplify conflict.

[–] tacosanonymous@mander.xyz 9 points 4 days ago

Neat.

Release the Epstein files, then burn it all down.

Social media was a mistake, tbh

[–] General_Effort@lemmy.world 9 points 4 days ago (1 children)

The original source is here:

https://arxiv.org/abs/2508.03385

Social media platforms have been widely linked to societal harms, including rising polarization and the erosion of constructive debate. Can these problems be mitigated through prosocial interventions? We address this question using a novel method – generative social simulation – that embeds Large Language Models within Agent-Based Models to create socially rich synthetic platforms. We create a minimal platform where agents can post, repost, and follow others. We find that the resulting following-networks reproduce three well-documented dysfunctions: (1) partisan echo chambers; (2) concentrated influence among a small elite; and (3) the amplification of polarized voices – creating a “social media prism” that distorts political discourse. We test six proposed interventions, from chronological feeds to bridging algorithms, finding only modest improvements – and in some cases, worsened outcomes. These results suggest that core dysfunctions may be rooted in the feedback between reactive engagement and network growth, raising the possibility that meaningful reform will require rethinking the foundational dynamics of platform architecture.
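For concreteness, the setup the abstract describes can be caricatured in a few lines: agents post, repost, or stay passive given a feed of recent posts, and engagement grows the follow network. The LLM decision step is stubbed with a toy heuristic here; every name and number is an assumption for illustration, not the authors' code.

```python
import random

random.seed(0)  # reproducible toy run

class Agent:
    def __init__(self, name: str, leaning: float):
        self.name = name
        self.leaning = leaning   # partisan position on a -1.0 .. 1.0 axis
        self.follows = set()

def llm_decide(agent, feed):
    """Stub for the LLM call: amplify the most agreeable recent post."""
    if not feed:
        return ("post", None)
    best = min(feed, key=lambda p: abs(p["leaning"] - agent.leaning))
    if abs(best["leaning"] - agent.leaning) < 0.5:
        return ("repost", best)
    return ("pass", None)  # stay passive

agents = [Agent(f"a{i}", random.uniform(-1, 1)) for i in range(10)]
posts = []
for step in range(5):
    for agent in agents:
        feed = posts[-10:]  # the "top ten posts on their timeline"
        action, target = llm_decide(agent, feed)
        if action == "post":
            posts.append({"author": agent.name, "leaning": agent.leaning})
        elif action == "repost":
            posts.append({"author": agent.name, "leaning": target["leaning"]})
            if target["author"] != agent.name:
                agent.follows.add(target["author"])  # engagement drives network growth
```

Even this caricature tends to produce like-follows-like clustering, which is the reactive-engagement/network-growth feedback loop the paper points at.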

[–] zeropointone@lemmy.world 8 points 4 days ago (3 children)

Fixing social media is like fixing guns so they can't hurt or kill anyone anymore. Both have been designed for a very particular purpose.

[–] paraphrand@lemmy.world 6 points 4 days ago (6 children)

Lemmy is social media. So is Mastodon. So is PeerTube. And so is everything else in the fediverse.

So I wouldn’t compare social media to a gun, across the board.

[–] roguetrick@lemmy.world 6 points 4 days ago* (last edited 4 days ago)

Preprint journalism fucking bugs me because the journalists themselves can't actually judge whether anything is worth discussing, so they just look for clickbait shit.

This methodology to discover what interventions do in human environments seems particularly deranged to me though:

We address this question using a novel method – generative social simulation – that embeds Large Language Models within Agent-Based Models to create socially rich synthetic platforms.

LLM agents trained on social media dysfunction recreate it unfailingly. No shit. I understand they gave them personas to adopt as prompts, but prompts cannot and do not override training data, as we've seen over and over. LLMs fundamentally cannot maintain an identity from a prompt. They are context engines.

Particularly concerning are the silo claims. LLMs riffing on a theme over extended interactions, because the tokens keep coming up that way, is expected behavior. LLMs are fundamentally incurious and even more prone than humans to locking into one line of text, as the lengthening conversation reinforces it.

Validating what the authors describe as a novel approach might be more warranted than drawing conclusions from it.

[–] kibiz0r@midwest.social 5 points 4 days ago (1 children)

Because how to use it is baked into what it is. Like many big tech products, it’s not just a tool but also a philosophy. To use it is also to see the world through its (digital) eyes.

[–] Feyd@programming.dev 5 points 4 days ago

Let's just pretend nothing after MySpace ever happened

[–] AceFuzzLord@lemmy.zip 3 points 4 days ago

I mean, I feel like just shutting it down would solve at least some problems. Shuttering it all, video sharing platforms included.

Not a solution most people would agree on, but it's an idea.

[–] General_Effort@lemmy.world 5 points 4 days ago (1 children)

I'm not surprised. I am surprised that the researchers were surprised, though.

Bridging algorithms seem promising.

The results were far from encouraging. Only some interventions showed modest improvements. None were able to fully disrupt the fundamental mechanisms producing the dysfunctional effects. In fact, some interventions actually made the problems worse. For example, chronological ordering had the strongest effect on reducing attention inequality, but there was a tradeoff: It also intensified the amplification of extreme content. Bridging algorithms significantly weakened the link between partisanship and engagement and modestly improved viewpoint diversity, but it also increased attention inequality. Boosting viewpoint diversity had no significant impact at all.
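The contrast between engagement ranking and a bridging ranker can be sketched in a toy way, assuming (as the quoted summary suggests) that bridging algorithms up-rank content approved across partisan lines. Scoring by the minimum approval across groups is one simple way to do that — an illustration, not necessarily the paper's method:

```python
def bridging_score(approvals_left: float, approvals_right: float) -> float:
    """Content must appeal to both sides to rank highly."""
    return min(approvals_left, approvals_right)

def engagement_score(approvals_left: float, approvals_right: float) -> float:
    """Naive engagement ranking: total approval, sides don't matter."""
    return approvals_left + approvals_right

# (left approval, right approval) — made-up numbers for illustration
posts = {
    "partisan rallying cry": (0.9, 0.1),
    "shared local news":     (0.5, 0.45),
}

by_engagement = max(posts, key=lambda p: engagement_score(*posts[p]))
by_bridging = max(posts, key=lambda p: bridging_score(*posts[p]))
print(by_engagement)  # partisan rallying cry
print(by_bridging)    # shared local news
```

The min() makes one-sided content rank poorly no matter how intense its in-group engagement, which is why such rankers weaken the partisanship–engagement link — at the cost, per the quote, of increased attention inequality.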

[–] Cocopanda@lemmy.world 3 points 4 days ago

Getting banned from Facebook, after a decade of clapping back against racists, has been the best thing in my life. So glad to be out of there. Just wish I could have saved my pics first.

[–] Korkki@lemmy.ml 4 points 4 days ago

The dream was that social media would help revitalize the public sphere and support the kind of constructive political dialogue that your paper deems "vital to democratic life." That largely hasn't happened.

Their idea is basically that people need to be told the same things to believe so that democracy can work as it's supposed to, and that social media is disrupting that with all the conspiracy shit, flame wars, and polarization of opinions. The issue is that this common idea was fomented by the boomer generation. They grew up in a really quite anomalous post-war world, when for the first time in human history there was a basically monolithic mass media that people watched AND trusted, AND the system provided more for the masses than it does now. That led to high societal inclusion and high social cohesion, which in turn fed back into the prosperity. Now we have a fragmented information sphere, things are shit, the political center is hated by most, and radicalism is once again rising.

However, so-called democracy, or collective decision-making in general, does not itself rely on people not believing crazy shit, on everyone being fed the best possible validated information, or, god forbid, on nobody having unorthodox ideas of their own, developing factionalism, or holding a totally different reading of reality. Those things make the process smoother and help avoid violence, but that "smoothness of process" boomers have come to expect is also why society in wider terms is politically stagnant and rotting. People seem to live in different realities because, in a sense, we do: our economic realities can be wildly different and decoupled from the mainstream narrative. It never had to get this bad, but social media is only a venting mechanism, not the cause of the growing divides. The division and general anguish in society are real IRL; they just take all kinds of irrational and counterproductive forms online. The problem isn't really that people are factional and can't agree with each other; it's that nobody can agree any longer with the monolithic, unpopular political center that is holding on to power for dear life.

Good thing is, you don't need to use it. Bad thing is, it affects reality.
