Advice to My Smart Phone
Until recently our friends were the ones who knew us best, and perhaps even knew what was best for us. But nowadays that role is being claimed by our smart phones: our new best friends. Loaded with sensors and apps, they know which shoes, books, and music we might like. And a new generation of apps is also helping us to lead physically and mentally healthier lives.
We’ve grown familiar with joggers being motivated by their running apps. But now there are even apps to assist you in things like raising your newborn child. And for people who aren’t motivated by kind words and inspirational messages, apps that work with wearables like the Pavlok bracelet use electric shocks to get you off the couch.
Over the past two years I’ve studied the rise of apps aimed at monitoring and improving our personal lives. They signal the emergence of a new type of coach: the digital lifestyle coach, or e-coach. The e-coach differs in many ways from its analog predecessor: its approach is data-driven, its monitoring regime is continuous, and its feedback is real-time and (hopefully) right on time. The promise of the e-coach is that with its objective data, clever algorithmic analysis, and personalized feedback it will help me change my behavior for the better.
So I’ve been asking myself: would I listen to an app that tells me to change my behavior? Would I trust the advice of a digital, data-driven coach coming to me from the smart phone in my pocket? In a modern society that constantly bombards me with information and options, it might even be nice to have a companion app that tells me what choices to make. But before we move on to such a future, there are a few things the e-coach seriously needs to improve. So for now, let’s switch roles, and let me give the emerging digital coach a few words of advice.
Be honest about your imperfections
A smart phone is a wonder of modern technology. We carry around more computing power in our pockets than NASA used to put a man on the moon. And yet not everything our phones calculate is accurate. This goes for apps that monitor our behavior as well: what they measure isn’t always correct, and neither is the way they analyze it and translate it into advice.
Getting accurate measurements of human behavior is tricky. For example, apps and wearables have a hard time interpreting some movements and activities. Popular activity trackers tend to underestimate the activity levels (such as calories burned or distance walked) of people who walk slowly, like pregnant women or obese people. Activities with more subtle movements, like yoga, are also tough to measure. One user we interviewed during our research said that his heart rate monitoring wristwatch would only work when it was strapped so tightly onto his wrist that it was uncomfortable and left a mark on his skin—and even then the data it provided was incomplete and inaccurate. So he stopped using it.
My own experience with sleep tracking ended in a similar way. I had set out to create a beautiful dataset on my sleeping behavior, but setting the tracker to the appropriate tracking mode required a series of taps on a wristband that for me turned out to be difficult to remember or execute properly in the dark right before dozing off. Some nights I failed to set the tracker correctly. On other nights, the data didn’t seem to capture what actually happened, for example, when I was awakened by the roar of a passing motorcycle. So I wasn’t getting the clean and complete dataset I had imagined, and I didn’t know whether I could trust the data enough to make decisions about altering my sleeping behavior. The tracker eventually ended up in a drawer—and that, according to a survey by Endeavour Partners, is the fate of more than 50 percent of trackers.
Regardless of improvements in technology, it seems likely that the apps and gadgets designed to help me improve myself will continue to make plenty of errors for the foreseeable future. But then again, so do my human friends. So how can I best take advantage of what my imperfect smart phone has to offer?
Research in robotics might provide an answer. Studies have shown that people are more inclined to trust technology if it communicates clearly and honestly about its limitations. In flight-simulator experiments, for example, pilots trust and collaborate with a system more effectively if it informs them when it is unsure of its own judgment.
But being honest about imperfection is one thing; if they want us to trust them, apps are also going to have to do a better job explaining why they give the advice that they do. When your best friend tells you to take it easy, you can ask him why he thinks you should do so, and based on what he says you can decide whether or not to follow his advice. Most of the time technology is unable to provide such an explanation. If your stress monitor thinks you are stressed out, but you don’t feel stressed at all, you can’t ask your stress monitor how it came to its conclusion. Maybe it misinterpreted the data? Or maybe you are not accurately sensing your own stress.
At Delft University of Technology, researchers are working on the design of self-explaining agents: software that can give users the reasons for its actions. Applied to digital coaches, such a design could show users how a piece of advice was constructed. For instance, the app could display the measurements that led it to conclude that the user was stressed, along with the reasoning behind its recommendation to take a walk to ease that stress. For me, honesty is the way to go, even if that means a “smart” app must admit that it doesn’t know everything.
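What might such honest, self-explaining advice look like in practice? Here is a minimal sketch; the inputs, thresholds, and rules are invented for illustration, since the Delft research describes the principle rather than a particular design.

```python
# A sketch of a self-explaining stress coach. All inputs and thresholds
# below are hypothetical; a real system would use validated stress models.
from dataclasses import dataclass, field

@dataclass
class Advice:
    recommendation: str
    evidence: list = field(default_factory=list)   # the measurements used
    reasoning: list = field(default_factory=list)  # why they led to the advice

def stress_advice(heart_rate: int, hrv_ms: int) -> Advice:
    advice = Advice(recommendation="No signs of stress in today's data.")
    if heart_rate > 90:
        advice.evidence.append(f"resting heart rate {heart_rate} bpm (threshold: 90)")
        advice.reasoning.append("An elevated resting heart rate can indicate stress.")
    if hrv_ms < 30:
        advice.evidence.append(f"heart rate variability {hrv_ms} ms (threshold: 30)")
        advice.reasoning.append("Low heart rate variability is associated with stress.")
    if len(advice.evidence) == 2:
        advice.recommendation = "You seem stressed. A short walk might help."
    elif len(advice.evidence) == 1:
        # Be honest about uncertainty instead of overclaiming.
        advice.recommendation = ("One indicator points to stress, but the signal "
                                 "is weak. How do you feel?")
    return advice
```

The crude stress model is beside the point; what matters is that every recommendation carries its evidence and reasoning with it, so the user can push back on the advice the way they would with a human coach.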
Stop talking behind my back
The digital coach is data-driven: in the process of monitoring and giving feedback, it collects a continuous stream of data about our behavior. This data can be intimate in the most literal sense, as became clear when Fitbit users’ logged sexual activity showed up in Google search results. When you trust an app to collect and analyze intimate data, you want to be sure it is handled with appropriate care and confidentiality. Doctors and coaches are bound to confidentiality by law or professional codes. But our apps and data-gathering smart phones are not.
Not surprisingly, many people worry about how wearables and health apps handle their personal data. And those worries seem justified, because software is usually not a neutral advisor: your app might appear to have your best interests in mind even as it sneaks out the back door to sell your personal data.
Evidon Research (now Ghostery Enterprise) found that 12 well-known health apps distributed data to as many as 76 different third parties. Research by the Federal Trade Commission showed that the data health apps and wearables spread around includes not just anonymized activity patterns but also usernames, email addresses, unique device identifiers, and other highly personal information.
A patent filed by wearables manufacturer Jawbone offers some insight into where all this data might eventually end up. The company has developed what it calls a “lifeotype”: a master profile that combines data from different apps, wearables, and external sources to create a complete picture of someone’s lifestyle. The patent describes how simple data points (“life bits”), such as readings from an activity tracker, can be analyzed to conclude that someone leads a sedentary lifestyle (a “life byte”). Combining this information with other data about eating patterns, and perhaps medical history, yields a lifestyle profile. Such a lifeotype can tell us that a person eats too much sugar, is slightly obese, exercises little, and is at risk of developing diabetes in the coming years.
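The aggregation the patent describes is easy to sketch. The class names, thresholds, and inference rules below are my own, invented for illustration; the patent defines the concept, not code.

```python
# An illustrative sketch of the "life bits" -> "life bytes" -> "lifeotype"
# pipeline. All metrics, thresholds, and rules here are hypothetical.
from dataclasses import dataclass

@dataclass
class LifeBit:
    source: str   # e.g. "activity_tracker"
    metric: str   # e.g. "steps_per_day"
    value: float

def to_life_bytes(bits):
    """Condense raw data points (life bits) into lifestyle conclusions (life bytes)."""
    conclusions = {}
    steps = [b.value for b in bits if b.metric == "steps_per_day"]
    if steps and sum(steps) / len(steps) < 5000:
        conclusions["activity"] = "sedentary lifestyle"
    sugar = [b.value for b in bits if b.metric == "sugar_grams_per_day"]
    if sugar and sum(sugar) / len(sugar) > 50:
        conclusions["diet"] = "high sugar intake"
    return conclusions

def lifeotype(life_bytes_from_all_sources):
    """Merge life bytes from many apps and wearables into one master profile."""
    profile = {}
    for conclusions in life_bytes_from_all_sources:
        profile.update(conclusions)
    # The combination reveals more than any single source:
    if (profile.get("activity") == "sedentary lifestyle"
            and profile.get("diet") == "high sugar intake"):
        profile["risk"] = "elevated risk of developing diabetes"
    return profile
```

The unsettling part is the last step: conclusions drawn from the merged profile can be far more sensitive than anything a single app ever measured.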
This might be acceptable if we had total control over our own data, but the ways in which data from digital coaching apps are traded, sold, and re-used remain largely opaque. A colleague of mine, who is diabetic, tried to find out what happens to the data from her wearable insulin pump when she uploads it into the cloud. By studying the fine print of the privacy policy and making several calls to the service provider, she learned that data she thought were used only for telecare purposes were also used, after being anonymized, for research and profiling. But she was unable to find out exactly how the data were being analyzed or who was doing the analysis. This worries her, because the cloud service, which she used to pay for but is now free, encourages users to upload their Fitbit data as well, suggesting to her that the costs of the service are now covered by monetizing her data.
These sorts of concerns are not a solid basis for a trusting relationship between human and app. If we really want to benefit from a digital coach, we have to be able to trust it with our personal data. Giving users clear, transparent choices about how their data will and will not be used can pave the way for a healthier, more trusting relationship.
I recently talked to someone at an insurance startup that uses data from driving behavior to establish personalized premiums. The use of personal data in insurance is always a touchy subject, but this company is managing to make it work. They give their users clear information about what data are being collected and how the information will and will not be used, as well as the controls to manage their data and even delete the information after the premiums are calculated. Their customers are very supportive of this type of transparent data use, and both sides benefit from the openness. I think the same approach would work for a digital coach.
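In code, such a policy could look something like the sketch below. The company’s actual system isn’t public, and the field names and purposes here are hypothetical; the point is that permitted uses are enumerated up front, everything else is refused by default, and the raw data can be deleted.

```python
# A sketch of explicit, user-controlled data use. All names are hypothetical.
from dataclasses import dataclass, field

@dataclass
class DataPolicy:
    collected: set = field(default_factory=lambda: {"speed", "braking", "mileage"})
    allowed_uses: set = field(default_factory=lambda: {"premium_calculation"})

    def permits(self, purpose: str) -> bool:
        # Anything not explicitly consented to is refused by default.
        return purpose in self.allowed_uses

@dataclass
class DrivingRecord:
    policy: DataPolicy
    samples: list = field(default_factory=list)

    def use_for(self, purpose: str) -> list:
        if not self.policy.permits(purpose):
            raise PermissionError(f"No user consent for: {purpose}")
        return self.samples

    def delete_raw_data(self):
        # The user can erase the raw data once the premium is calculated.
        self.samples.clear()
```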
Just let me be me
My final word of advice to future digital coaches would be to respect that people are different, that there isn’t a one-size-fits-all approach to being healthy and living well. Health apps promote a certain image of health and well-being. Usually that image is based on some set of guidelines about how much exercise you should get, and how much fruit and how many vegetables you should eat. But for some, a good life might not entail strict compliance with an app’s exercise or dietary standards; it would instead allow for a looser interpretation of such general rules, leaving more room for the social aspects of dining with friends, baking cookies with your kids, or the enjoyment of sloth.
Time magazine reports on an app aimed at kids that helps them manage their eating habits using a simple traffic-light feedback system. High-calorie foods such as ice cream get a red-light classification; foods that should be consumed in moderation, such as pasta and whole wheat bread, are yellow; and things you can eat as much of as you want, such as broccoli, are green (which of course raises the question of whether kids want to eat any of the green-light foods at all). The article reports on a young girl who, since she started using the app, now sees the world in red, yellow, and green. “Everything,” she says, “corresponds to how many red lights you eat.” While providing a useful tool for managing a diet, the app also instills a very specific perspective on food—a perspective informed by calories rather than other qualities of food, a perspective that focuses on the evaluation of food as “good” or “bad” rather than its social aspects, and a perspective that makes eating something you succeed or fail at.
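The mechanism is simple enough to sketch in a few lines. The classifications and tallying logic here are illustrative; the article doesn’t spell out the app’s actual rules.

```python
# A minimal sketch of a traffic-light food classifier. The food list and
# its color labels are invented for illustration.
from collections import Counter

FOOD_LIGHTS = {
    "ice cream": "red",            # high calorie: eat rarely
    "pasta": "yellow",             # eat in moderation
    "whole wheat bread": "yellow",
    "broccoli": "green",           # eat as much as you want
}

def day_summary(foods_eaten: list) -> Counter:
    """Tally a day's meals into red/yellow/green counts."""
    return Counter(FOOD_LIGHTS.get(food, "unknown") for food in foods_eaten)

print(day_summary(["ice cream", "pasta", "broccoli", "ice cream"]))
# Counter({'red': 2, 'yellow': 1, 'green': 1})
```

Seen this way, it is easy to understand how a child ends up counting “how many red lights you eat”: the color tally is the only thing the system remembers about a meal.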
By promoting certain actions and discouraging others, a digital coach presents a view of what is good and what parameters such judgments ought to be based on. Can a digital coach—or the tech company or government agency behind it—determine that for me? What is good for one person doesn’t automatically work for another. A digital coach should value personal autonomy, but the current generation of digital coaches doesn’t seem to recognize that there is no one formula for well-being, and they still have a long way to go in terms of allowing users to define their own goals.
One interesting exception is the Dutch Foodzy app, which tracks what you eat but refrains from telling you what you should eat or not eat. Foodzy users can earn badges for healthy as well as unhealthy behavior. You can get awards for stuffing away fruit and vegetables, but you can also become the King of the BBQ, or claim a hangover badge by consuming a certain amount of alcohol. Foodzy encourages healthful eating, but it doesn’t try to be a dictator about it.
Samsung called one of its smart phones a “Life Companion,” an appropriate description of a device that assists us in almost everything we do. But a real companion has to be reliable and honest, it must have integrity, and it should respect my personal values. These attributes, by the way, are part of the professional code that human coaches must live up to. Our pocket-companions still have a long way to go before they can earn our trust.
Jelte Timmer ([email protected]) is a researcher for the technology assessment department of the Rathenau Institute in the Netherlands. This article is based on the report Sincere Support: The Rise of the E-Coach.