[–] catloaf@lemm.ee 98 points 5 days ago (2 children)

And it looks more like a machine-translation error than anything else. Per the article, there was a dataset in which bad OCR produced two instances of the phrase. Then, more recently, the bad phrase got associated with a typo: in Farsi, the words for "scanning" and "vegetative" are extremely similar. So when some Iranian authors used an LLM to translate their paper into English, it treated "vegetative electron microscope" as a valid term (because it appeared in its training data) and assumed that's what they meant.
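(As a rough illustration of just how close those two Farsi words are, here's a minimal sketch. The spellings below are assumptions drawn from other reporting on this incident, not from the article itself; if they're right, the two words differ by a single character.)

```python
# Minimal sketch: how close are the Farsi words for "scanning" and "vegetative"?
# Assumed spellings, for illustration only: روبشی = "scanning", رویشی = "vegetative".

def edit_distance(a: str, b: str) -> int:
    """Plain Levenshtein distance via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(
                prev[j] + 1,               # deletion
                curr[j - 1] + 1,           # insertion
                prev[j - 1] + (ca != cb),  # substitution
            ))
        prev = curr
    return prev[-1]

print(edit_distance("روبشی", "رویشی"))  # -> 1: a single-character difference
```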

It's not that the entire papers were being invented from nothing by ChatGPT.

[–] wewbull@feddit.uk 24 points 5 days ago (1 children)

> It's not that the entire papers were being invented from nothing by ChatGPT.

Yes it is. The papers are the product of an LLM. Even if the user only thought it was translating, the translation hasn't been reviewed and has errors. The causal link between what goes into an LLM and what comes out is not certain, so if nobody checks the output, it could just be a technical-sounding lorem ipsum generator.

[–] Tobberone@lemm.ee 1 points 4 days ago

That's an accurate name for the new toy, but not as fancy as "AI", I guess, because we know that anything that comes out is gibberish made up to look like something intelligent.

[–] criitz@reddthat.com 10 points 5 days ago (2 children)

It's been found in many papers, though. Do they all have such excuses?

[–] catloaf@lemm.ee 9 points 5 days ago

From the article, it sounds like they were all from Iran, so yes.

[–] BussyCat@lemmy.world 7 points 5 days ago

It's probably fairly common to translate articles using ChatGPT, since it is a large language model, so that does seem likely.