What about using LLMs to convert legal language in contracts etc. into basic English that is more accessible to the lay person?
LLMs are bad even at faithfully condensing news articles into shorter ones, so I'd assume that in a significant percentage of conversions the dumbed-down contract will deviate from the original.
First, we provide legal advice to businesses, not individuals, which means the questions we deal with tend to be even more complex and varied.
Additionally, I am a former professional writer myself (not in English, of course, but in my native language). Yet even I often find myself using complicated language when dealing with legal issues, because the matters tend to be very nuanced. "Dumbing down" something without understanding it very, very well creates a huge risk of getting it wrong.
There are, of course, people who are good at expressing legal information in layperson's terms, but those people have usually studied their topic very intensively first. When a chatbot explains something in "simple" language, its output usually contains serious errors that are easy for experts to spot, because the chatbot operates on the basis of stochastic rules and does not understand its subject at all.
sure sounds like a great way to get bad advice full of holes
LLMs continue to be abysmal at fine detail, and that matters a lot with law