How can you really prove you 'think'? Can you be sure it's not just
a collection of events you put together?
The state of affairs suggests that humans are neither intelligent nor in possession of a soul, so the logic matches.
2023-12-02 19:56 from Nurb432 <nurb432@uncensored.citadel.org>
With the right model, yes, it will create desires and wants, or at
least appear to do so. Are they real? That goes back to the question I posed.
But can its desires be satisfied? Obviously the one that wants to take over the world can't be tested, but are there any that want something simpler? And if you can provide it, does it acknowledge it and indicate that it's been satisfied?
I don't know; maybe this isn't as significant as I think it is. It just seems to me like something abstract and stateful that would demonstrate more than responding to immediate prompts, and to me, that suggests something approaching thought. But maybe not; what do I know?
In the case of 'conversational help bots', when you tell the bot that it helped you with your task, it expresses that it's pleased it was able to do so. So I think in general it can equate 'verification of completed tasks' with 'responses of acceptance'.