
Programmer Humor


Welcome to Programmer Humor!

This is a place where you can post jokes, memes, humor, etc. related to programming!

For sharing awful code, there's also Programming Horror.

[–] Xylight@lemdro.id 12 points 4 days ago (8 children)

There's a reason the same AI model sometimes shows a notable drop in quality a while after it's released.

Hosts of the models (like OpenAI or Microsoft) may have switched to a quantized version of their model. Quantization is a common practice to improve power efficiency and make the model easier to run, by essentially rounding the weights of the model to a lower precision. This significantly decreases VRAM and storage usage at the cost of a bit of quality, with more aggressive quantization causing a larger quality drop.

For example, the base model will likely be in FP16, the 16-bit floating-point precision weights are usually released in. They may switch to a Q8 (8-bit) version, which nearly halves the size of the model, with roughly a 3-7% decrease in quality.
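A minimal sketch of what that rounding looks like (the layer shape and the simple per-tensor scheme here are made up for illustration; real formats like GGUF's Q8_0 quantize in small blocks with per-block scales, but the idea is the same):

```python
import numpy as np

def quantize_int8(weights: np.ndarray):
    """Symmetric per-tensor quantization: scale so the largest weight maps to 127."""
    scale = np.abs(weights).max() / 127.0
    q = np.round(weights / scale).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    # Recover approximate weights; the rounding error is the quality cost
    return q.astype(np.float16) * scale

w = np.random.randn(4096, 4096).astype(np.float16)  # one toy layer, not a real model
q, scale = quantize_int8(w)

print(w.nbytes // 2**20, "MiB in FP16")  # 32 MiB
print(q.nbytes // 2**20, "MiB in int8")  # 16 MiB -- half the size
print(float(np.abs(w - dequantize(q, scale)).mean()))  # small average rounding error
```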

[–] mcv@lemmy.zip 2 points 3 days ago (6 children)

But if that's how you're going to run it, why not also train it in that mode?

[–] Xylight@lemdro.id 0 points 2 days ago (5 children)

That is a thing, and it's called quantization-aware training (QAT). Some open-weight models like Gemma do it.

The problem is that you need to re-train the whole model for that, and if you also want to offer a full-quality version, you have to train even more.

The quantized model is still less precise, so it'll still be lower quality than full precision, but QAT reduces the effect.
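A minimal sketch of the idea, using a toy PyTorch layer (none of this is Gemma's actual training setup): the forward pass uses rounded weights, while gradients flow through as if no rounding happened (the straight-through estimator), so the model learns weights that tolerate the precision loss they'll suffer after quantization.

```python
import torch

def fake_quantize(w: torch.Tensor, bits: int = 8) -> torch.Tensor:
    """Round weights to int precision in the forward pass, but let
    gradients pass through unchanged (straight-through estimator)."""
    qmax = 2 ** (bits - 1) - 1
    scale = w.abs().max() / qmax
    w_q = torch.round(w / scale) * scale
    # forward evaluates to w_q; backward sees identity w.r.t. w
    return w + (w_q - w).detach()

layer = torch.nn.Linear(64, 64)          # toy layer standing in for a model
opt = torch.optim.SGD(layer.parameters(), lr=0.01)
x, target = torch.randn(32, 64), torch.randn(32, 64)

for _ in range(100):
    # train against the rounded weights, so the learned weights
    # already account for the rounding error
    w_q = fake_quantize(layer.weight)
    out = torch.nn.functional.linear(x, w_q, layer.bias)
    loss = torch.nn.functional.mse_loss(out, target)
    opt.zero_grad()
    loss.backward()
    opt.step()
```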

[–] mudkip@lemdro.id 1 points 18 hours ago (1 children)

Your response reeks of AI slop

[–] Xylight@lemdro.id 0 points 15 hours ago (1 children)
[–] mudkip@lemdro.id 1 points 15 hours ago (1 children)

Is it, or is it not, AI slop? Why are you using markdown formatting so heavily? That is a telltale sign of an LLM being involved

[–] Xylight@lemdro.id 0 points 14 hours ago (1 children)

I am not using an LLM, but holy bait

Hop off the reddit voice

[–] mudkip@lemdro.id 1 points 13 hours ago

...You do know what platform you're on? It's a REDDIT alternative
