My Phone Loves Me, I Know He Does

by Francine Hardaway on June 17, 2014

If you haven’t seen the movie “Her,” in which a man falls in love with his operating system, don’t bother. Soon enough you are probably going to live it. Argus Labs, a data analytics company, has an SDK and API that can take whatever data is already on your smartphone and use it to determine not only how you physically feel, but what your emotions are at any given moment (happy or sad) and what your deeper mood (depressed, say) might be. And guess how this will be used first? Advertising.

In all the talk about wearables, much has been made of their ability to predict disease and monitor chronic conditions through the use of sensors. We have devices and apps to measure our weight, sleep, blood pressure, O2 levels, calories, and fitness. Rock Health even had a startup that could use the iPhone and a dongle to look in a child’s ear. There are, no doubt, more sensors in the contemporary smartphone than in the body itself. All these measurements, collectively known as the quantified self, will soon feed into Apple’s HealthKit or Google Fit.
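To get a feel for what “feeding into HealthKit” looks like from the developer’s side, here is a minimal sketch, assuming a hypothetical app that only wants today’s step count. HealthKit is Apple’s real framework, but the specific query below is an illustration, not how any particular product mentioned here works.

```swift
import HealthKit

// Minimal illustrative sketch: ask permission to read step counts, then total
// up today's steps. Reading a single metric like this is an assumption made
// for demonstration; real health apps read and write many such quantities.
let healthStore = HKHealthStore()

if let stepType = HKQuantityType.quantityType(forIdentifier: .stepCount) {
    healthStore.requestAuthorization(toShare: nil, read: [stepType]) { granted, _ in
        guard granted else { return }

        let startOfDay = Calendar.current.startOfDay(for: Date())
        let predicate = HKQuery.predicateForSamples(withStart: startOfDay, end: Date())

        let query = HKStatisticsQuery(quantityType: stepType,
                                      quantitySamplePredicate: predicate,
                                      options: .cumulativeSum) { _, result, _ in
            let steps = result?.sumQuantity()?.doubleValue(for: .count()) ?? 0
            print("Steps today: \(Int(steps))")
        }
        healthStore.execute(query)
    }
}
```

The point is less the code than the pattern: with the user’s permission, any app can read from or write to the same central pool of bodily data.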

But this is only the very beginning. A perfect storm is brewing, composed of the Internet of Things, ubiquitous sensors, and artificial intelligence. All of these converge on the mobile device.

First, a word about artificial intelligence, which has been much maligned. In the past few years, our ability to handle larger amounts of data through cloud services has removed the barriers to machine learning and AI; with today’s computing power we can actually model the human neocortex, the highest center of the brain. The data pouring out of interconnected systems feeds neural networks that are being used to advance artificial intelligence and draw conclusions about behavior patterns. Google processes more than 700 terabytes of data every day, Facebook about 500 terabytes, and even Twitter processes 200 terabytes. Imagine what all this data is telling us about the nature of humanity.

And lest you think that some vestigial concern for privacy will slow this down, take a walk back in time with me to a day when you would never have let a stranger know where you were, or what kind of emotional problem you were having. That time, before we photographed all our meals, gave location services access to our whereabouts, and tweeted our weights, was a mere ten years ago. You thought you’d never allow it, right? Yet here it all is, ten years later. That’s how quickly conventions about sharing have evolved. We, even the elderly Luddites, become more willing every day to wear sensors and share the information from them with the “appropriate” people.

But what about the data that sits on your phone when you are unaware that you’re sharing it? The phone can already passively detect many things about its owner. From walking patterns alone it can detect gender, and even surface correlations between locomotion and chronic diseases like Parkinson’s and Alzheimer’s. A $3.00 optical sensor can tell whether you are indoors or out, and what you are doing there.
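To make “walking patterns” concrete, here is a minimal sketch using Apple’s CoreMotion pedometer. Treating average cadence as a gait feature is an assumption made purely for illustration; real gait-analysis research works from much richer raw accelerometer traces.

```swift
import CoreMotion

// Illustrative sketch: pull step counts for the last hour from the phone's
// motion coprocessor and derive a crude cadence (steps per minute). Using
// cadence as a stand-in for a "gait feature" is an assumption for illustration.
let pedometer = CMPedometer()

if CMPedometer.isStepCountingAvailable() {
    let oneHourAgo = Date(timeIntervalSinceNow: -3600)
    pedometer.queryPedometerData(from: oneHourAgo, to: Date()) { data, error in
        guard let data = data, error == nil else { return }
        let steps = data.numberOfSteps.doubleValue
        let cadence = steps / 60.0   // average steps per minute over the hour
        print("Steps in the last hour: \(Int(steps)), cadence: \(cadence) steps/min")
    }
}
```

No camera, no microphone, no explicit question asked: the signal is simply there, logged by the hardware whether or not anyone looks at it.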

On a more psychological level, heart rate and oxygen levels can help the phone detect mood, as can the way a user taps on the screen or types on the keyboard. Your phone has the capacity to become empathetic, like the OS in the movie.
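There is no public API that hands an app your mood, but the raw ingredients are mundane. The sketch below is a purely hypothetical heuristic over the timing between taps, with an invented threshold and invented labels, just to show how little data such an inference might start from.

```swift
import Foundation

// Purely hypothetical sketch: record the gaps between screen taps and map the
// average gap to a mood label. The threshold and labels are invented; real
// affect-inference systems combine many signals with trained models.
struct TapRhythm {
    private var lastTap: Date?
    private(set) var intervals: [TimeInterval] = []

    mutating func recordTap(at time: Date = Date()) {
        if let last = lastTap {
            intervals.append(time.timeIntervalSince(last))
        }
        lastTap = time
    }

    var roughMoodGuess: String {
        guard intervals.count >= 5 else { return "not enough data yet" }
        let meanGap = intervals.reduce(0, +) / Double(intervals.count)
        // Invented heuristic: quick, steady tapping read as agitation or energy.
        return meanGap < 0.3 ? "keyed up" : "calm (or tired)"
    }
}
```

Crude as it is, a signal like this never has to ask how you feel; it simply watches how you type.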

The mobile device occupies a special place in our lives because it is the first wearable. It is a docking station where information comes in and is interpreted and stored, not unlike the brain. It is central to the Internet of Things: wearables, appliances, apps, and even cars will increasingly report in to the phone. The phone is the gateway between us and the world.

But “phone” is just a temporary form factor. It was the first wearable form factor we were willing to embrace, because it was familiar and useful. But now we also have glasses and watches that function like phones. We’re not too far from computers in pills, contact lenses, and teeth: form factors that seem distant today but may be just over the horizon.

And so far, we have used the “phone” mostly for verbal communication: we call each other, we text, we write. But increasingly our smartphones can be more than verbal communicators; the data they hold can tell us a great deal more about ourselves. Would you allow an app to access your feelings? This will come, limited only by ethical concerns.

Service layers are emerging as background apps that interface with a mobile user only when really relevant: Google Now is probably the first of these, and it fills users with joy when it delivers useful information. In the future, your “phone” could anticipate your needs.

Why bother to resist? The partisans of deep learning tell us that we can make a better future by augmenting our cognitive capabilities with an “exocortex.” And the marketers can’t wait. Once they have access to our moods and emotions, they’ve got us. And our phones will have sold us out.
