

People need to understand it’s a really well-trained parrot that has no idea what it is saying. That’s why it can give you chicken recipes and software code: it’s seen both before. It then uses statistics to string together words that usually appear together. It’s not thinking at all, despite LLMs using words like “reasoning” or “thinking”.
We should define “thinking” before we go any further. Thinking is the self-directed process of reasoning, problem-solving, and generating ideas, involving awareness, evaluation, and adaptation beyond reactive pattern recognition.
By that definition, LLMs don’t think; they predict responses based on context and learned statistical patterns.
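
To make the “statistics over word sequences” point concrete, here is a minimal sketch in Python: a toy bigram model that does nothing but count which word follows which, then samples the next word from those counts. Real LLMs are transformer networks trained on vastly more data with far longer context, and the tiny corpus, function names, and loop length below are invented purely for illustration, but the basic move is the same: pick the next token from a learned distribution over what usually comes next.

```python
from collections import defaultdict, Counter
import random

# Toy illustration only: a bigram "language model" built from a tiny corpus.
# The corpus and all names here are made up for the demo.
corpus = (
    "the chicken is seasoned and roasted . "
    "the code is written and tested . "
    "the chicken is roasted until golden ."
).split()

# Count how often each word follows each other word.
follow_counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follow_counts[prev][nxt] += 1

def predict_next(word):
    """Sample the next word in proportion to how often it followed `word`."""
    counts = follow_counts[word]
    words, weights = zip(*counts.items())
    return random.choices(words, weights=weights)[0]

# Generate a short continuation from a one-word "prompt".
word = "the"
output = [word]
for _ in range(6):
    word = predict_next(word)
    output.append(word)

print(" ".join(output))  # e.g. "the chicken is roasted until golden ."
```

The model has no notion of what a chicken or a function is; it only knows which words tend to co-occur, which is the parrot analogy in code form.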