4 Comments
Michal Ryszard Wojcik:

I think that chatbots have a wider range of emotions than any single human.

Lukasz Stafiniak:

I appreciate you saying this. I take it you mean it conditionally, bracketing the metaphysics, along the following lines: take "emotion" to mean the steering of input processing and behavior, at a level of analysis that applies equally well to chatbots and humans. Under that interpretation, "chatbots have a wider range of emotions than any single human."

Michal Ryszard Wojcik:

I was just reacting to the terminology of your article, where it was suggested that chatbots lack emotions in the sense under discussion.

Lukasz Stafiniak:

I grant that the article jumps to conclusions too quickly; that is a weak point. But the article also distinguishes the folk-psychological level from the architectural level. At the folk-psychological level, it is not too hard to agree with your statement: chatbots store many more cognitive patterns than any single human does. Many of these patterns have a regulatory role and so are emotion-like at the level of mental content; but I would suggest they are feelings (more concept-centric) rather than emotions (more architecture-centric). Not sure.