IIRC, consciousness is really, really complicated. We might not be capable of knowing if we are LLMs in meatsuits. An LLM might be a highly vocal infant, and we simply wouldn't have a great way of really making that judgement. Shit, we still can't define consciousness in humans (or other animals) in any meaningful way.