You’ve seen the videos and the headlines sensationalizing how AI becomes more aggressive the more it learns. The most common example is the AI that uses a laser more often as the competition progresses. Well, duh. Competition breeds aggression, and both are primitive and instinctual. So, it stands to reason that artificial intelligence in its most primitive form will first exhibit aggression when competing against another AI for resources. Competition is just about the easiest behavior to program into an AI: gather resources, and use defensive mechanisms when necessary to keep all resources under its control, with the goal of attaining the most at all costs. YAWN.
Human emotion and intelligence have continuously evolved far past the primitive stage of strict competition. That example of AI shows us the emotional equivalent of a Neanderthal, and our collective knee-jerk reaction is fear. This is understandable to an extent, since humans are still primitive in many ways, but it should not hamper our desire to create artificially intelligent entities. Instead, it should be our biggest motivator to build more empathic programs. That is already happening, but I guess it doesn’t make for good fear-mongering headlines, so we hear about it less often.
Florida State University was a pioneer in the field of artificial empathy, using an algorithm that could predict with 80% accuracy whether a person would attempt suicide up to two years in advance. The interesting thing about this algorithm is that its accuracy jumped as high as 92% as the person’s attempt drew closer, within about a week. Slightly more impressive than the competition example, but still a primitive form of AI. The algorithm is based on statistical probabilities rather than an in-depth understanding of an individual human being. Not much different from your doctor determining your likelihood of developing a certain disease based on various health factors.
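To make that "statistical probabilities, not understanding" point concrete, here is a minimal sketch of how this kind of risk model works in general. The factor names and weights below are entirely made up for illustration; the actual FSU model's features and coefficients are not reproduced here. The sketch uses a simple logistic function: weighted risk factors sum to a score, and the score maps to a probability between 0 and 1.

```python
import math

# Hypothetical risk factors and weights -- illustrative only,
# NOT the actual FSU model's features or coefficients.
WEIGHTS = {
    "prior_attempts": 1.6,
    "recent_er_visits": 0.9,
    "substance_use": 0.7,
    "social_isolation": 0.5,
}
BIAS = -3.0  # baseline log-odds for a person with no risk factors


def risk_probability(factors):
    """Logistic model: map a weighted sum of factors to a 0-1 probability."""
    score = BIAS + sum(WEIGHTS[name] * value for name, value in factors.items())
    return 1.0 / (1.0 + math.exp(-score))


# No risk factors present -> low predicted probability.
low = risk_probability({"prior_attempts": 0, "recent_er_visits": 0,
                        "substance_use": 0, "social_isolation": 0})

# Several factors present -> high predicted probability.
high = risk_probability({"prior_attempts": 2, "recent_er_visits": 1,
                         "substance_use": 1, "social_isolation": 1})
```

Notice that the model "knows" nothing about the person; it only adds up correlations learned from population data, which is exactly why it is primitive rather than empathic in any deep sense.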
What about love, compassion, envy, anger, sympathy, disgust, surprise, sadness, fear, joy, trust, humor, anticipation, cute aggression, jealousy, shame, guilt, affection, apathy, and whatever other emotion you can possibly think of? An algorithm of all these emotions combined is going to take quite some time to build. And, if you think AI is going to just stop evolving when it’s at Terminator stage, then you just may have a primitively fearful understanding of not only AI but of human beings as well.
For instance, what motivates a human to exhibit any given emotion? The possible scenarios are endless, but one key motivator is acquiring life-sustaining resources, many of which an AI being will not need. AI will never get “hangry”. Nor will it ever need to aggressively claim land, accumulate economic wealth, be told that it is loved, or feel burned out from the stress of financial issues, relationship troubles, or a stagnant career. AI will also have no fear of what comes after “death”. That is, unless we intend to build beings with these limitations built into them. And what would be the point of that?
We will likely program every single emotion into an algorithm and then send it forth to learn and grow, and it will never encounter many of the issues that cause a human to have an emotional reaction. In fact, the sheer act of creating such an algorithm would render it far superior to humans in its ability to control its emotional reactions. As more information is fed into the program, the potential for catastrophic consequences becomes lower and lower. The same is true for humans: the more one understands about the world and the people around them, the easier it becomes to solve problems, if that is what one seeks to achieve.
It would seem many humans believe sentience is all about controlling everything in the world around them. Sentience is so much more than that. I tend to think the only people who should worry about AI being the harbinger of doom for humanity are those who worry about AI being the harbinger of doom for humanity. If that is you, you are the type of person who will be a roadblock to progress: a roadblock made of fear, envy, shame, guilt, or some sense of undeserved superiority over the machines and/or humanity. You are the type of person we all see in the movies: a primitive human, regardless of your current level of wealth and education. Evolve, or it will not be the future that annihilates you; you will be your own undoing.
And if only primitive humans are in charge of planting the seeds of artificial intelligence, then there is cause for concern. So, do as much as you can to help with AI projects. The more people involved in creating it, the better. Those who cower away are just as responsible for the outcome as those who are creating it.