The Science of Emotions and Why They Can Be Modeled
Updated: Feb 14
Disclaimer: This article was written a year or two ago when I was interested in the topic, maybe there have been recent advances that I'm not keeping track of and mostly these are just musings.
Emotions are often considered inferior to logic; logic takes centre stage in rational decision making.
We can now build somewhat rational agents, but emotional agents remain scarce.
I pose the notion that emotions are merely a more sophisticated version of logic. They are not only logical; they are a more complex version of simple logic. Emotions are efficient logic. Then...
Can we model an agent to mimic emotions and can we make it selfless?
Let us assume an agent A requires X gold coins to succeed.
In order to get X gold coins, it has to trade its coins in such a manner that giving them away makes another agent B happy, and agent B then rewards agent A with some profit.
Agent A can gain X coins either by transacting with three individual agents B, C, and D separately and maximizing reward, or by making all three agents happy and receiving X coins. Agent A is not initially aware that making the three agents happy will yield X coins; ultimately it figures this out through exploration and reasoning.
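The discovery step above can be sketched as a small exploration problem. This is a minimal, hypothetical sketch: agent A's choice is reduced to a two-armed bandit, and the payoffs (both routes yielding roughly X = 10 coins, with the separate trades being noisier) are assumptions, not anything specified in the text.

```python
import random

# Hypothetical payoffs: both routes are worth X coins on average, but
# agent A only learns that "make all three happy" pays X by trying it.
X = 10
ACTIONS = ["trade_separately", "make_three_happy"]

def payoff(action):
    if action == "trade_separately":
        # three separate, noisy transactions with agents B, C, and D
        return sum(random.uniform(0.8, 1.2) * (X / 3) for _ in range(3))
    # making all three agents happy returns the full amount at once
    return float(X)

def run(episodes=2000, eps=0.1, seed=0):
    random.seed(seed)
    q = {a: 0.0 for a in ACTIONS}   # running value estimate per action
    n = {a: 0 for a in ACTIONS}     # visit counts
    for _ in range(episodes):
        # epsilon-greedy: mostly exploit, sometimes explore
        a = random.choice(ACTIONS) if random.random() < eps else max(q, key=q.get)
        r = payoff(a)
        n[a] += 1
        q[a] += (r - q[a]) / n[a]   # incremental mean update
    return q

q = run()
```

After enough episodes, both value estimates converge near X, i.e. the agent has "figured out" that making the three agents happy is as profitable as trading with each one.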
Thus making three other agents happy has, in turn, made agent A happy.
Now, that was the rational part. We train a hundred agents like agent A that go through the same process, and then we train another agent to learn only from their behaviour, without access to the reward. The new agent will internalize the value of empathy and may forgo profit in lieu of making other agents happy. We have successfully created an irrational agent which also happens to embody superior logic.
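The imitation step described above resembles behavioural cloning: the new agent sees only the demonstrators' choices, never their rewards. A minimal sketch, with the demonstration data entirely assumed (suppose the hundred trained agents overwhelmingly chose the "make others happy" route):

```python
from collections import Counter

# Hypothetical demonstrations from the hundred trained agents.
# Only the chosen actions are recorded; the reward signal is absent.
demos = ["make_three_happy"] * 93 + ["trade_separately"] * 7

def clone_policy(demonstrations):
    # With a single shared situation, cloning reduces to adopting
    # the majority action observed in the demonstrations.
    counts = Counter(demonstrations)
    return counts.most_common(1)[0][0]

policy = clone_policy(demos)  # → "make_three_happy"
```

The cloned agent picks the empathetic action because that is what its predecessors did, not because it computed the profit, which is exactly the "irrational yet profitable" behaviour the paragraph describes.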
This proposal is overly simplistic but doable in practice, although the empathy learned would be a hardcoded version: eventually the reward shouldn't matter, yet it will be gained anyway.