i guess it comes down to a philosophical question
no, it doesn't, and it's not a philosophical question.
the software simply has no cognitive capabilities.
LLMs know nothing. literally. they cannot.
that atleson person is an absolutely first-class sneerer.