So, if you were creating an AI, how would you program emotion? It's a very confounding idea, but I'm sure it can be done.
>>38398370
>how would you program emotion?
I wouldn't.
Actual AI researcher here.
All emotions and instincts are desires to DO something.
You don't just feel an emotion and that's it, it makes you want to DO something.
Every action a human takes, every word, every sentence, happens thanks to some sort of emotion or instinct.
Nothing anybody does is unique or actually *them*. It's all driven by emotions and instincts.
So how would you program emotion? I don't think you'd have to.
All you need is a good enough ML model: let it optimize by running through a lot of situations, rewarding desired behavior and punishing undesired behavior.
With that method, you'd get instincts: things the agent does out of reflex, because it *evolved* the desire to do them.
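Not claiming this is anyone's actual setup, but the reward/punish evolution idea can be sketched as a toy genetic algorithm. Everything here (the gene, the stimulus task, the mutation size) is made up for illustration: agents with a single "reflex" gene get rewarded for responding to a stimulus correctly, and the population evolves toward that reflex without it ever being programmed in.

```python
import random

random.seed(0)  # fixed seed so the toy run is reproducible

def act(gene, stimulus):
    # A reflex: respond to the stimulus scaled by an evolved gene.
    return gene * stimulus

def fitness(gene, trials=50):
    # Reward matching the stimulus; being off-target lowers the
    # score, which is the "punishment" for undesired behavior.
    total = 0.0
    for _ in range(trials):
        s = random.uniform(-1, 1)
        total -= abs(s - act(gene, s))  # ideal reflex has gene == 1
    return total / trials

def evolve(pop_size=30, generations=40):
    pop = [random.uniform(-2, 2) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]
        # Offspring are mutated copies of the fittest "instincts".
        pop = survivors + [g + random.gauss(0, 0.1) for g in survivors]
    return max(pop, key=fitness)

best = evolve()
print(best)  # converges near gene = 1.0, the "desired" reflex
```

The point is just that the final behavior was never written down anywhere; only the reward structure was.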
Emotions however are a bit trickier. I believe emotions are learned through the environment, not evolved. That's the difference.
So what you need is sort of a meta-ML model that learns by itself without genetic algorithms. I want to one day find and test a suitable model like that.
With a model like that, instead of evolving behaviors directly, you evolve the model itself: you create complex situations that force it to learn from past experiences, not just run on evolved instincts.
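To make the instinct/learning distinction concrete, here's a hypothetical sketch of "evolve the learner, not the behavior". The environment picks a new hidden target every lifetime, so no fixed reflex can win; what gets evolved is a learning rate, i.e. how the agent updates from its own experience within a lifetime. All names and parameters are my own invention for the example.

```python
import random

random.seed(1)  # fixed seed so the toy run is reproducible

def lifetime_reward(lr, steps=30):
    # A new target each lifetime: instincts alone can't track it.
    target = random.uniform(-1, 1)
    estimate = 0.0
    total = 0.0
    for _ in range(steps):
        error = target - estimate
        total -= abs(error)      # punished for being off-target
        estimate += lr * error   # within-lifetime learning step
    return total / steps

def evolve_learner(pop_size=20, generations=30):
    # The "gene" is the learning rate, not the behavior itself.
    pop = [random.uniform(0, 1) for _ in range(pop_size)]
    score = lambda lr: sum(lifetime_reward(lr) for _ in range(5))
    for _ in range(generations):
        pop.sort(key=score, reverse=True)
        survivors = pop[: pop_size // 2]
        pop = survivors + [min(max(lr + random.gauss(0, 0.05), 0.0), 1.0)
                           for lr in survivors]
    return max(pop, key=score)

best_lr = evolve_learner()
print(best_lr)  # evolution favors agents that learn fast within a lifetime
```

Evolution here only shapes *how* the agent learns; what it actually does in each lifetime comes from its own experience, which is the "meta" part.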
Realistically though, what you'd end up with is a sort of smart AI pet that does everything it's told, and everything it can to please you.
>>38398994
Hi, do you have any AI books you'd recommend?
>>38399128
Books? No. I don't believe learning by reading is as effective as actually trying to implement things yourself.
I do recommend the edX AI course, however. It focuses on "traditional" AI, with examples from game AI.