Computing all the Feels
In 1965, Gordon Moore predicted that the number of transistors in a dense integrated circuit would double approximately every two years. This prediction, known as Moore’s Law, held relatively true for decades, until 2015, when Intel confirmed that the pace of computer advancement had officially slowed.
Despite the slow-down, technology continues to progress and evolve, though differently than we originally predicted. We’re no longer just thinking of making computers faster and smarter, but something more. With the proven progress of quantum computing, DeepDream, machine learning, and artificial neural networks in 2015, the time has never been better to create something a little more human.
On the road to a more human-like Artificial Intelligence, what’s the next innovative breakthrough in computer evolution?
Our guess: Affective Computing, which MIT defines as “computing that relates to, arises from, or deliberately influences, emotion or other affective phenomena.” In other words, affective computing is our attempt at creating technology that is able to recognize, interpret, process, and replicate human emotion. It’s a relatively unknown field that combines expertise from many disciplines: computer engineering, psychology, psychophysiology, neuroscience, sociology, and education.
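To make that recognize-interpret-respond loop concrete, here’s a deliberately toy sketch. Real affective systems run trained models over camera, voice, and physiological data; every function name, sensor reading, and threshold below is invented for illustration only.

```python
# Toy sketch of an affective computing loop: recognize a coarse
# emotional state from (hypothetical) sensor readings, then choose
# a response that "deliberately influences" that state.

def recognize_emotion(heart_rate: int, smile_score: float, voice_pitch: float) -> str:
    """Map made-up sensor readings to a coarse emotion label."""
    if heart_rate > 100 and smile_score < 0.2:
        return "distressed"
    if smile_score > 0.7:
        return "happy"
    if voice_pitch < 0.3 and smile_score < 0.4:
        return "bored"
    return "neutral"

def choose_response(emotion: str) -> str:
    """Pick an action for the detected state."""
    responses = {
        "distressed": "suggest a calming activity",
        "happy": "reinforce the current activity",
        "bored": "change the pace or content",
        "neutral": "carry on",
    }
    return responses[emotion]

reading = {"heart_rate": 112, "smile_score": 0.1, "voice_pitch": 0.5}
emotion = recognize_emotion(**reading)
print(emotion, "->", choose_response(emotion))  # distressed -> suggest a calming activity
```

The hard research problem, of course, is the first function: inferring emotion reliably from messy real-world signals, which is exactly what the projects below are chipping away at.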
The next link in the chain of digital evolution is already under intensive research. With its catalog of over 120 affective computing studies and projects, MIT has already found new ways to communicate emotional states, assess emotion and mood indirectly through natural interaction, make computers more emotionally intelligent, increase understanding of how emotion influences personal health, and improve people’s awareness of their own emotional states.
We realize that artificial intelligence with emotional cognition equivalent to a human being’s is not exactly prevalent in 2016, and won’t be for some time (although Google’s futurist-in-chief Ray Kurzweil says it should be by 2029). Still, affective technology has many potential precursor applications that could enhance human experience in pretty much everything we do.
Therefore, we’ve come up with a short list of five possible commercial applications for affective computing in 2016. Each possibility is a short narrative look into the near future that shows how affective computing could be used in everyday life.
Reducing emotional driving.
You slam the apartment door open, and make your way down the stairs to the parking lot where your car is on the charger. Behind you, you can hear your significant other sobbing on the couch where you left her. This breakup was especially difficult for both of you.
“I just need to clear my mind.”
Before you enter the car, tears begin to well in your eyes. Trying as hard as you can to hold them back, you open the door and push the key into the ignition. The car revs to life and the onboard computer system initializes. Before you can throw the car into reverse and back out, you hear a calming, robotic voice:
“By my calculations, you are in an extreme emotional state. It would not be wise to operate this vehicle at the moment. You should consider an alternate mode of transportation.”
“No, I’m okay. I can drive. I just need to get out of here.”
“I must insist, it is my sole purpose to keep you safe. Emotional influence is a known cause of accidents among humans. My database suggests that a walk would be safer and more effective in your current emotional state.”
The car ejects the key and locks the ignition, preventing you from going anywhere.
“Looks like I’m going for a walk.”
Enhancing text interpretation.
Your Google Nexus lights up. A text from your best friend in response to an inquisitive message you had sent earlier that day. You read the text and pause.
You re-read the text a few times, giving it a different voice each time, but its tone still comes off rude and sarcastic. But why? You asked a simple question. What could you have said to offend him? As you get ready to reply, you remember that Android recently integrated emotion detection into its texting function. By using your front-facing camera, Android is able to analyze and record your facial expressions for contextual purposes.
You access the message details by long pressing on the text bubble. The details read:
Type: Text message
Sent: 09/06/16, 4:32 PM
Received: 09/06/16, 4:33 PM
Sender’s detected emotion: Joking
With the correct context of the message, you can respond without causing any tension. You send off your normal, not-in-any-way hostile response, and sigh in relief, because you’ve just avoided making a terrible mistake.
Improving online education.
First day of school and you’re already bored and confused. It’s kind of hard not to be, since you’re taking live, online classes this semester — there’s no physical engagement, nothing to keep your mind focused. Just mind-numbing slides and the droning of your professor’s digitally amplified voice. Also, Thermodynamics isn’t the easiest subject to learn, especially in a digital setting.
The course you’re taking is experimental. It’s implementing a new form of education the professor called Affective Learning in the syllabus. You remember quickly accepting some agreement allowing the course AI to access your built-in laptop camera to interpret your emotions, or something like that. Based on feedback from the AI, the professor is supposed to be able to adjust the lesson accordingly. We’ll see how that works out.
Suddenly, the professor breaks from her engineering-speak and announces:
“The course AI has informed me that many of you are not getting anything out of today’s course, so I’m going to switch it up a little. Let’s break away from the textbook definitions and examples.”
The professor begins breaking the curriculum down into more digestible chunks by relating it to everyday experiences. Your brain starts to grasp the lesson on a deeper level. For the rest of the course, the professor is able to identify and assess her students’ weaknesses and focus on them accordingly.
Being bored and confused might be beneficial after all.
Improving communication with those who can’t communicate.
As a speech-language therapist, it’s your job to teach autistic children to communicate, but the real dilemma is trying to communicate with someone who can’t. It’s never easy to know what these children want.
The institution you work for is taking part in an experimental project to implement affective technology into these sessions to improve treatment efficiency. Today you’ve scheduled your lowest-functioning client, a 5-year-old boy whom you’ve had trouble working with since the beginning.
Before you start your session, your supervisor hands you a pair of Google Glass eyewear.
“Here, wear this. It’s a facial camera that will detect and evaluate facial expressions and emotions and monitor the child’s vitals. I’ve already uploaded reference data from previous sessions. That data, coupled with the information you’ll be collecting during the current session should help you determine what he’s trying to communicate.”
Hopefully this thing works.
You walk into the bland room where your client is playing with one of the Cat in the Hat puzzles provided by the institute. He seems calm.
You work with him for a while without any difficulties, implementing different strategies — prompting, hand-over-hand, modeling. Everything seems to be going fairly well, though, at the moment, it looks like something might be bothering him. His face begins to scrunch and he begins to fidget as if to request something, but you have no idea what.
Your eyewear begins targeting specific facial points and analyzing the possibilities.
Right before he starts to act up, your eyewear suggests that he’s feeling uncomfortable and that you should take him to the bathroom to relieve himself.
You take him down the hall to do his thing. Luckily the eyewear helped you catch that, or you’d be stuck with cleaning duty.
Enhancing human experience with media.
Five o’clock hits. Time to go home.
You just pulled a double shift to cover for one of your co-workers. It’s been an especially long and stressful day at the call center (you’re a 911 operator) and all you want to do is make your way home, throw yourself on the couch, turn on the TV, beer in hand, and relax.
You jump in your car, connect your phone via Bluetooth, and pull out of the parking garage. Using the cameras in your car’s dashboard and sensors in the steering wheel, your phone’s AI analyzes your mood, which, unsurprisingly given your line of work, is stressed. Your phone selects a Spotify radio station it deems appropriate for your mood.
This Acid Jazz radio station is really mellowing you out.
You get home, drop your keys on the counter, and head straight to the fridge to grab yourself a beer. You head into the living room, turn on the new Samsung 4K TV you bought last week, and start flipping. The television specialist at Best Buy said it was top of the line: the best imagery and audio, plus a new AI feature they’ve recently been testing that’s supposed to read your mood using cameras and wearable sensors (like your car does) and work with your cable box to suggest programs that fit a positive mood or improve a negative one.
You flip past all the negative bullshit news.
“Man shot 3 times in the chest today. Investigators — ”
“A fatal collision on I-95 renders a woman incap — ”
“Child abduc — ”
You deal with that shit way too much. You just want to get away from it all.
Your TV recognizes your mood based on the changes in your facial expressions and vitals caused by the previous programs, and provides more lighthearted and interesting show choices in an attempt to lift your spirits: Big Bang Theory, Modern Family, House Hunters: International, Modern Marvels.
Big Bang Theory never fails to make you laugh.
Looks like this thing is really getting to know you.
Clayton writes Letters from an Internet Traveler, the newsletter that delivers intriguing, thought-provoking Internet tidbits and obscurities to your inbox. Find out more about Clayton and his writing at claytonwrites.com.