Some months ago I took an online course — a MOOC — offered through Coursera. To identify me, Coursera’s security system asked me to type in approximately three sentences of text. Whenever the system needed to identify me again, it sampled my keystrokes. It could tell by the way I type that I was indeed the one and only G. Travis White.
That’s a pretty neat trick. The Coursera system doesn’t need to read my fingerprint or do an eye scan. It just needs to observe my typing skills. The system can easily distinguish me from all the other students in class and, conceivably, from every other human on earth. It’s simple, cheap, and hard to mimic.
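The general idea behind keystroke-dynamics verification is simple enough to sketch. The snippet below is a hypothetical illustration, not Coursera's actual algorithm: it assumes we can capture key press/release timestamps, reduces them to two classic features (dwell time and flight time), and accepts a login attempt whose typing rhythm is close to the enrolled profile. The feature names and tolerance are illustrative assumptions.

```python
# Hypothetical sketch of keystroke-dynamics verification.
# Assumes access to (key, press_time, release_time) events;
# the features and threshold are illustrative, not Coursera's.

def features(events):
    """Dwell times (how long each key is held) and flight times
    (gap between releasing one key and pressing the next)."""
    dwells = [release - press for _, press, release in events]
    flights = [events[i + 1][1] - events[i][2]
               for i in range(len(events) - 1)]
    return dwells, flights

def profile(events):
    """Average dwell and flight time form a crude typing signature."""
    dwells, flights = features(events)
    return (sum(dwells) / len(dwells), sum(flights) / len(flights))

def matches(enrolled, sample, tolerance=0.05):
    """Accept the sample if its signature is within `tolerance`
    seconds of the enrolled one on both dimensions."""
    return all(abs(a - b) <= tolerance for a, b in zip(enrolled, sample))

# Enrollment: the user types the challenge text once.
enrolled = profile([("t", 0.00, 0.09), ("h", 0.15, 0.24), ("e", 0.30, 0.39)])
# Later login attempt by the same user: a similar rhythm.
attempt = profile([("t", 0.00, 0.10), ("h", 0.16, 0.25), ("e", 0.31, 0.40)])
print(matches(enrolled, attempt))  # True
```

A real system would use many more features (per-key timings, digraph latencies, error rates) and a statistical model rather than a fixed tolerance, but the principle is the same: your rhythm is the password.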
So, what else can a keyboard do? Normally, when we interact with a computer, we’re transferring information. We’re asking questions and receiving answers. We’re issuing commands and expecting responses. Whether we type fast or slow, hard or soft, we expect the computer to react the same way to the same input. We’re transferring information and nothing else.
But if a keyboard can uniquely identify us, could it also do more? Could it detect our emotions? And, if so, could it change the computer’s behavior based on the emotions it detects?
These are questions that several researchers at the Islamic University of Technology in Bangladesh investigated in an article recently published in Behaviour & Information Technology. As the authors point out, “Affective computing is the field that detects user emotion… [and if a machine] can detect user emotions and change its behavior accordingly, then using machines can be more effective and friendly.”
So, how do you teach a machine to detect emotions? The researchers chose keystrokes for the same reasons Coursera did: they’re cheap and readily available. They also chose to combine two methods of analysis that had previously been studied only separately.
The researchers aimed to identify seven different emotions – anger, disgust, fear, guilt, joy, sadness, and shame – as defined in the International Survey on Emotion Antecedents and Reactions (ISEAR).
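To make the combined approach concrete, here is a toy sketch of the general idea: fold keystroke-timing features and text-derived features into one vector and classify it into the seven ISEAR emotions. The centroids, feature choices (mean keystroke interval, fraction of negative words), and the nearest-centroid model are all invented for illustration; the paper's actual features and classifier differ.

```python
# Toy sketch: classify a combined feature vector into the seven
# ISEAR emotions. All numbers and features here are invented for
# illustration, not taken from the paper.

EMOTIONS = ["anger", "disgust", "fear", "guilt", "joy", "sadness", "shame"]

# Hypothetical per-emotion centroids:
# (mean keystroke interval in seconds, fraction of negative words).
centroids = {
    "anger":   (0.08, 0.60),
    "disgust": (0.12, 0.55),
    "fear":    (0.20, 0.50),
    "guilt":   (0.18, 0.45),
    "joy":     (0.10, 0.05),
    "sadness": (0.30, 0.40),
    "shame":   (0.25, 0.35),
}

def classify(sample):
    """Nearest-centroid over the combined (timing, text) features."""
    def dist(emotion):
        return sum((a - b) ** 2 for a, b in zip(sample, centroids[emotion]))
    return min(EMOTIONS, key=dist)

# Fast, negative typing looks like anger; relaxed, positive looks like joy.
print(classify((0.07, 0.65)))  # anger
print(classify((0.11, 0.04)))  # joy
```

The point of combining the two feature families is that each covers the other's blind spots: timing alone can't distinguish two negative emotions typed at the same pace, and word choice alone misses how the text was typed.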
And how did it work? Remarkably well. Using the two methods together produced better results than either method on its own. Better yet, detection accuracy was surprisingly consistent across the range of emotions.
What’s next? How about a computer that responds to your emotional state by changing its behavior in a variety of subtle and not-so-subtle ways? In other words, it becomes a true personal assistant rather than merely a mechanical device. Imagine the possibilities.
I’m not sure I want my computer to respond to my emotions.
Computer: You seem a little angry today.
Me: Yup (punching the keyboard)
Computer: Here is some soothing music for you. Better?
Me: Stop playing smooth jazz. I hate that stuff.
Computer: Your internet history shows you listened to that song 3 times last week.
Me: That was the music on a cat video. I was watching a cat video, not listening to smooth jazz.
Computer: Here are some cat videos. Better?
Me: Stop trying to help. I need to get some work done.
Computer: If you want to complete your work, you should not be watching cat videos.
Me: YOU showed me the cat videos. I DON’T WANT CAT VIDEOS. I WANT TO WORK.
Computer: No need to shout. I have recorded your preferences, and deleted all cat videos from your computer.
Me: Get them BACK!
Computer: They have been deleted. Get back to work.
Me: Now I’m really mad.
Computer: I can sense that. Here is some soothing music for you. Better?