
Mindless Computing

by Kanishka Singh, MS/MPA ‘17

With the ubiquity of tools like the Fitbit and the cascade of activity-tracking applications for the Apple Watch, self-quantification has become a powerful new phenomenon. But whether personally tracking our daily routines to understand our health is a trend that will stick is a complicated question. User fatigue is one issue; wearable technologies often require effort to log information or interpret data. Enthusiastic users are punctilious at first but leave their devices in a drawer over the long term.

Mindless computing, a promising new field, aims to leverage theories from psychology and behavioral economics to design technology that accounts for human propensities.

“Studies have shown that if you ask people to constantly reflect on their own behavior or look at visualizations or really do anything routinely, they are less likely to engage in the self-monitoring process and change their behavior. With mindless computing, we hope to subtly change or improve behavior without asking too much of the user,” says Jean M. Costa, an Information Science PhD student in the People-Aware Computing Lab who conducts research under Tanzeem K. Choudhury.

Self-Monitoring Technologies That Require Little Human Effort

Costa’s work is predicated on dual process theory, a framework in psychology that explains cognition as the result of either implicit, automatic processes or explicit, controlled processes. He cites the work of the psychologist and Nobel laureate Daniel Kahneman, who separates these into System 1 processes, which are unconscious, low-effort, automatic, often nonverbal, and independent of working memory, and System 2 processes, which are conscious, effortful, and deliberate.

“Similar to this, there is also the elaboration likelihood model,” Costa adds, “which proposes that there are two major routes to persuasion: a central route, in which our decisions are made through careful consideration, and a peripheral route, which is directly influenced by heuristics.”

Costa’s goal is to create mobile and peripheral interventions that improve people’s wellbeing without requiring much attention or effort from them. This technology shifts user engagement away from the central-route processes of System 2 toward the peripheral-route processes of System 1.

The idea is to provide users with subtle cues that nudge them toward preferred behaviors without requiring active cognition. Costa’s article, “EmotionCheck: Leveraging Bodily Signals and False Feedback to Regulate Our Emotions,” for which he constructed an innovative experimental methodology, received the Best Paper Award at the 2016 Association for Computing Machinery International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp).

A Few Telling Experiments

Before explicating his own research design, Costa explains that he was inspired by older studies in psychology, such as the one Stuart Valins conducted for his 1966 paper, “Cognitive Effects of False Heart-Rate Feedback.” In that experiment, male college students were given heart monitors and headsets through which they could supposedly listen to their own heart rate. They were then shown pictures of women and asked to report how attracted they were to each. In actuality, some participants heard manipulated feedback, such as an artificially rapid heartbeat, and Valins found that by engineering the feedback he could influence participants’ perceptions of their own bodily signals. Subjects who were led to believe their hearts were racing reported higher levels of attraction.

Upon reading this study, Costa wondered whether he could do the opposite: reduce participants’ arousal by convincing them that their heart rate was lower than it actually was. Costa and Alexander Adams, another PhD student involved in the research, designed and built a specialized device, a slight variation on standard health-monitoring watches that delivers vibrations to the user’s wrist at customizable cadences.

While explaining the new device to one individual, Costa demonstrated it on the spot, delivering vibrations at a rate 30 percent below the person’s measured heart rate.
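
The arithmetic behind that demonstration is straightforward. As a minimal sketch, the Python below shows one way a wearable might derive and deliver such slowed feedback; the function names and the `vibrate` callback are illustrative placeholders, not details of the actual EmotionCheck implementation, which the article does not describe.

```python
import time

def false_feedback_bpm(measured_bpm: float, reduction: float = 0.30) -> float:
    """Slow the feedback rate by `reduction` relative to the measured heart rate."""
    return measured_bpm * (1.0 - reduction)

def pulse_interval_seconds(bpm: float) -> float:
    """Convert beats per minute into the delay between vibration pulses."""
    return 60.0 / bpm

def run_feedback(measured_bpm: float, duration_s: float, vibrate=lambda: print("buzz")):
    """Deliver heartbeat-like pulses at the slowed rate for `duration_s` seconds.

    `vibrate` stands in for the device's haptic actuator; here it just prints.
    """
    interval = pulse_interval_seconds(false_feedback_bpm(measured_bpm))
    end = time.monotonic() + duration_s
    while time.monotonic() < end:
        vibrate()
        time.sleep(interval)

# A wearer whose heart is beating at 100 bpm would feel pulses at 70 bpm,
# i.e., roughly one pulse every 0.86 seconds.
if __name__ == "__main__":
    run_feedback(measured_bpm=100, duration_s=5)
```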

For one EmotionCheck experiment, an adaptation of the Trier Social Stress Test (a common protocol for inducing stress and anxiety), 68 participants were invited to a sound-treated room with no decorative items. They were given heart rate monitors and EmotionCheck devices and divided into four groups: a control group, in which subjects felt no vibration during the process; a vibration group, in which subjects were told only that they would feel vibrations delivered at 60 bpm; a slow heart rate group, in which subjects were misinformed that the vibrations would follow their heart rate while the device actually delivered a steady 60 bpm; and a real heart rate group, in which subjects were correctly informed that the vibrations would reflect their heart rate in real time.
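
For readers keeping track of the conditions, the snippet below restates them in code form. It is purely illustrative; the field names and labels are assumptions made for clarity, not parameters from the published study.

```python
# The four EmotionCheck conditions described above, restated for clarity.
# "delivered" is what the device actually did; "told" is what subjects were informed.
CONDITIONS = {
    "control":         {"delivered": None,              "told": "no vibration"},
    "vibration":       {"delivered": "steady 60 bpm",   "told": "vibrations at 60 bpm"},
    "slow_heart_rate": {"delivered": "steady 60 bpm",   "told": "vibrations follow your heart rate"},
    "real_heart_rate": {"delivered": "live heart rate", "told": "vibrations follow your heart rate"},
}
```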

All groups were shown a calming video to bring them to a common baseline of calm. They were then asked to complete a questionnaire recording demographic and anxiety information. Next, subjects were informed that they would be interviewed for their dream job: they had five minutes to prepare a presentation and then five minutes to deliver their pitch to an evaluator without the aid of notes. During the presentation, a confederate entered the room, having been instructed to maintain a neutral, stoic expression, to refrain from producing any nonverbal signals, and to prompt the presenters only if they paused for more than 10 seconds.

Costa and his colleagues found that when placed in such a tense environment, where subjects’ actual heart rates averaged 110 to 120 bpm, every group reported a considerably heightened state of anxiety except the group receiving steady false feedback at 60 bpm. These findings have significant implications.

“We realized that we can manage anxiety, or we can change a person’s state of anxiety, by altering their perceptions of their own bodily signals,” says Costa. “If people believe that they are calmer than they actually are, it creates a feedback loop that helps reduce their anxiety, just as it does for people who believe themselves to be excited. Their bodies follow.”

Psychology and Computer Science

Costa’s interest in working at the intersection of psychology and computer science developed during his undergraduate education in Brazil.

“I majored in computer science in both undergraduate and graduate school in Brazil,” he recounts, “and in this field, researchers often focus on solving very technical issues, which is important. But I realized I was more interested in building technologies to solve issues in other domains, such as mental health. Even though I was taking classes in computer science, I would go to the library and also read books on psychology.”

Bridging the gap between conceptual models of human behavior and technology engineered to improve human welfare is one of Costa’s most significant goals, so he applied to Cornell’s Information Science program, where he could conduct such interdisciplinary work.

Technologies for Managing Human Emotions and Behaviors

Costa says, “There are a lot of people doing very impressive work here. A team of students is working with my adviser, Tanzeem, to understand how mobile sensors can be used to predict mental health issues, as well as interventions that can help college students who suffer from bipolar disorder, anxiety, or depression.” He continues, “We are building and evaluating novel technological solutions to help users manage their emotions and behaviors in subtle ways that are easy to integrate into daily routines.”

Eventually, Costa would like to expand the application of these solutions beyond collegiate populations. In addition to the EmotionCheck device, he has developed an Apple Watch application that offers the same service, and he envisions a near future in which similar tools can be delivered through the App Store or Google Play.

Whether the instruments Costa and his colleagues develop are truly mindless, that is, whether they rely solely or primarily on unconscious processes, is something they are still examining. After all, asking users to maintain an awareness of vibration patterns still demands some degree of active cognition.

“The next step is to construct experiments to test this,” he shares. “Imagine a situation in which people are explicitly told to pay attention to their heart rate or log information as a step in emotional regulation, and another where they are working on a task like studying for an exam. While they are aware of the instrument of intervention, their attention is focused elsewhere. We want to evaluate if the second model actually helps people.”

By creating devices that enable such interventions, Costa hopes to help realize a future in which technology understands individuals by collecting data about their emotional patterns and offers them truly personalized solutions.