I can say with 100% certainty that LLMs do not have emotions. Emotions require comprehension, not blindly responding to whatever prompt gets fed in.
Emotions aren't evolutionary baggage; they're tools evolution uses to change our behavior without messing with our logic.
I'm... pretty sure this isn't just wrong, but staggeringly, incredibly wrong? Plenty of our neurological structures and reactions (including but far from limited to emotional responses) are just... actively maladaptive, and as far as we're aware they were maladaptive even in our earlier evolutionary history, just not intensely enough to meaningfully shift selection pressures. They'll cheerfully screw with logic and everything else, 'cause evolution doesn't actually give a damn (to the extent a process gives a damn about anything) about any of that. They're not tools, they're accidents that didn't kill enough of us for people to stop being born with them, ha.
In any case, they're 110% evolutionary baggage in a lot of situations. Our neurology piggybacks that shit onto all sorts of things completely unrelated to how the responses likely developed originally, and often in ways that are incredibly unhelpful, sometimes literally lethally so over longer periods, given how persistent stress strips years off our lifespans, 'cause it's a goddamn mess like that. See basically everything about our anxiety and stress responses outside of actually life-threatening situations, heh.
A lot of people don't seem to get that the human body, as a product of evolution, is actually very, very, very unoptimized.
I'm starting to get a strong feeling that AI is the new dot-com: a useful technology that's overhyped and will bankrupt a lot of people.
That's what Euchre, KT, and I have been saying since this whole thing started. The bubble will pop and blow over in due time; we'll benefit from what good there is in it while most of the excesses get... sidelined.