The health app paradox is that apps meant to improve our health can sometimes do more harm than good. Can recent advancements in Affective Computing resolve this paradox?
Apps designed to help maintain your mental and physical health aim to transform you into a better version of yourself, don’t they?
But have you ever found yourself annoyed by untimely and overly optimistic notifications, such as “You’re doing great, fantastic progress today!” or “Keep up the pace”? Especially when you receive them during moments of sadness or when you’re feeling down.
Or perhaps you’ve caught yourself skipping the log for a missed workout out of guilt, just to avoid well-intentioned but somewhat stern comments like “Consistency is the key to success, don’t give up!” or “I still believe in you!”.
All of this can intensify your frustration, add unnecessary stress to your life, and even contribute to depression and various other disorders.
This issue is discussed in more detail in this BBC article.
The good news
This issue has gained public attention and resonated within society. Scientists have studied it extensively, and even the UK government has tried to tackle it through legislation.
Afterward, developers of health-related apps began making changes to their product designs. For example, Under Armour, the company behind the MyFitnessPal app, has recognized this concern and has taken steps to address the potential misuse of the app. They have put in place “specific safeguards to reduce its attraction for individuals attempting to use it to enable detrimental eating behaviors” as detailed in their response to Refinery29.
However, are these measures sufficient? Experts generally agree that simply adding warnings about the necessity of consulting a doctor before using the app may not be adequate.
What else can be done apart from adding these warnings or other elements to the app’s design?
I believe the solution lies in utilizing the latest advancements in emotional AI, also known as Affective Computing.
What are the capabilities of AI for recognizing human emotional states?
But what can AI offer to address the psychological issues users face with health-related apps? A natural starting point: one likely trigger of the problem is a rise in stress levels during interactions with the app.
Hold on, you might say: if even those closest to you sometimes can’t tell whether your “I’m fine!” is sincere, how could a soulless machine manage it?
To answer this question, let’s figure out which stress indicators, on an emotional, biological, and cognitive level, we can currently measure.
To avoid overwhelming you, I’ll list the top 10:
- Level of stress hormones (cortisol and adrenaline) in the blood.
- Heart rate (pulse) and blood pressure.
- Rate and frequency of respiration.
- Electrocardiogram (ECG) for analyzing heart activity.
- Changes in subjective self-reports (questionnaires and surveys).
- Analysis of blood biomarkers associated with inflammation, such as cytokines.
- Electroencephalogram (EEG) for studying brain activity.
- Blood glucose level.
- Skin temperature and changes in skin color (peripheral vascular response).
- Electrodermal activity (measuring skin’s sweat response).
It’s interesting that just a few of these markers can expose even the best poker face through a regular camera. It’s possible to read emotional states by analyzing changes in breathing, heart rate, and slight facial color changes (invisible to the naked eye).
You can learn more about this in Rosalind Picard’s interview on Lex Fridman’s podcast at the 30-minute mark.
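If you’re curious what that looks like under the hood, here is a minimal sketch (in Python) of the simplest form of remote photoplethysmography: average the green channel over a face crop in each video frame, then pick the dominant frequency in the heart-rate band. The `face_frames` input and the band limits are my own illustrative assumptions; real systems add face tracking, motion compensation, and far more careful signal processing.

```python
import numpy as np

def estimate_heart_rate(face_frames, fps=30.0):
    """Rough rPPG sketch: estimate pulse (in BPM) from a sequence of face crops."""
    # Blood-volume changes show up most strongly in the green channel.
    signal = np.array([frame[..., 1].mean() for frame in face_frames], dtype=float)
    signal -= signal.mean()

    # Find the dominant frequency within a plausible heart-rate band (0.7-4 Hz, i.e. 42-240 BPM).
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    spectrum = np.abs(np.fft.rfft(signal))
    band = (freqs >= 0.7) & (freqs <= 4.0)
    peak = freqs[band][np.argmax(spectrum[band])]
    return peak * 60.0  # beats per minute
```

A few seconds of reasonably still video is enough for this naive version to produce a plausible number; the interesting engineering is in making it robust to movement and lighting.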
I hope I’ve convinced you that machines can sometimes understand us better than those closest to us.
But let’s return to our problem and try to answer the question: how can we make use of these technologies to minimize all the harm that health apps can inflict?
Stress measurement technologies and their practical applications
Let me be clear: our regular smartphones aren’t yet capable of accurately measuring all the necessary biomarkers. So for stress monitoring, an additional device is needed to collect the data.
The recent COVID-19 pandemic saw a surge in depression and suicide, sparking significant interest in stress research within the scientific community.
Psychologists, medical professionals, and neurobiologists are actively developing diagnostic and management methods for stress, along with strategies to bolster mental health during crises.
As a result, we now have access to a diverse range of stress-measuring technologies.
Unfortunately, most of these technologies only work in laboratory or specially created conditions. They are designed for use by scientists in specific research studies.
And, to be frank, many of these devices don’t look very user-friendly. It’s hard to imagine handing a device with a bunch of wires to an average user and saying, “Hey, buddy, wear these wires all day.”
Just the look of such a device could cause not only stress but real panic. Right?
But what options do we have as everyday creators of digital products?
Let’s start by outlining what we are looking for.
I’ll stick to the tradition of listing ten points:
- Convenience and comfort in use: The device for measuring biomarkers should be comfortable for 24/7 use.
- Accessibility and price: The technology should be accessible to a wide audience and reasonably priced.
- 24-hour monitoring: The sensor should continuously monitor biomarkers around the clock.
- Integration and compatibility: The technology should seamlessly integrate with our applications.
- Privacy and security: Ensuring user data protection and guaranteeing its confidentiality are essential.
- Personalization: Technology should adapt to individual user characteristics and needs for more accurate analysis.
- Adaptation to real-life conditions: The technology should be adapted for measurement in real-life situations.
- Analytics and interpretation: The system should provide analytics and interpretation of results so that users can better understand their emotional state.
- Diverse data sources: Technology should use a variety of physiological and psychological parameters for a more complete analysis of stress.
- Electrodermal activity (EDA) sensor: An EDA sensor in the device is the key requirement for the most precise measurement of stress levels.
A small comment regarding point 10. During my research, I discovered that one of the factors that determines the accuracy of stress level measurements is the presence of an EDA sensor in the device. As an example, let’s consider this study, which states that “The result showed that EDA could classify the stress level with over 94% accuracy. This system could help people monitor their mental health during overworking, leading to anxiety and depression because of untreated stress.”
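To make the idea behind point 10 more concrete, here is a minimal sketch of what an EDA-based stress classifier could look like: compute a few simple features over a window of skin-conductance samples and feed them to an off-the-shelf classifier. The features, thresholds, window format, and labels are illustrative assumptions on my part, not a reproduction of the cited study’s method.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def eda_features(window):
    """A few simple features from one window of skin-conductance samples (in microsiemens)."""
    diffs = np.diff(window)
    return [
        np.mean(window),                  # tonic (baseline) level
        np.std(window),                   # overall variability
        np.max(window) - np.min(window),  # range within the window
        int(np.sum(diffs > 0.01)),        # rough count of upward deflections (phasic responses)
    ]

def train_stress_classifier(windows, labels):
    """windows: list of 1-D EDA sample arrays; labels: 0 = calm, 1 = stressed (hypothetical)."""
    X = np.array([eda_features(w) for w in windows])
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(X, labels)
    return clf

# Usage sketch:
# clf = train_stress_classifier(training_windows, training_labels)
# is_stressed = clf.predict([eda_features(new_window)])[0]
```

Real systems use richer feature sets and careful labelling protocols, but the pipeline shape is roughly this: windowed EDA signal in, stress estimate out.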
So I have shortlisted three devices, which I am happy to introduce to you.
For simplicity, I’ll give a brief description of each device along with its main pros and cons.
EmbracePlus. Scientists consider this the gold standard for monitoring physiological data in the market. It was created by Empatica in partnership with Professor Rosalind Picard, who is a pioneer in Affective Computing. The price is currently unknown, and the product is not available for purchase yet. However, it’s expected to start at around 1500 euros based on the pricing of the previous model (Embrace 4).
Pros: The smallest and most accurate wearable device to date, combining PPG, accelerometer, gyroscope, temperature, and EDA sensors.
Cons: Currently, the device’s main downside is its unavailability for purchase.
Nowatch. Starting at €447. The device is equipped with Philips EDA Biosensing Technology, which monitors changes in sweat gland activity by measuring skin conductance.
Pros: In addition to its sleek design, users can access real-time data through the iOS or Android app, which provides insights and tips for a balanced lifestyle. The watch includes multiple sensors, such as PPG, EDA, accelerometer, temperature, and barometer.
Cons: It lacks integration capabilities with third-party applications. In fairness, it’s worth mentioning that integration with the Health app is planned for the near future, but that’s all for now.
Mindfield eSense Skin Response. €169. It’s a compact sensor that utilizes your smartphone or tablet’s microphone input (compatible with Android and Apple iOS devices) to measure skin conductance.
Pros: In terms of pricing, the Mindfield eSense Skin Response generally offers a more budget-friendly option when compared to some other stress-monitoring devices on the market.
Cons: The biggest downside is its uncomfortable design, with numerous wires that make it impractical for 24/7 wear.
Well, it’s clear that we don’t have any winners here, as each of the three devices has critical drawbacks that prevent any of them from being used to enhance the level of empathy in digital products at the moment.
Practical Application
But let’s imagine that EmbracePlus is already available on the market, or that NoWatch can now be integrated with third-party applications, or perhaps an even better device has appeared, surpassing the ones we already know.
How could we use them in that case? I can offer you a few hypotheses.
Here is a list of possible improvements for the previously mentioned MyFitnessPal (a small code sketch follows the list):
- Personalized recommendations: By tracking a user’s stress levels, these apps can tailor their recommendations. For example, during periods of high stress, they can suggest methods to relieve tension rather than focusing on calorie intake.
- Emotional eating awareness: Stress often triggers emotional overeating. The app can send a gentle real-time prompt or suggest alternative ways to deal with stress, reducing the likelihood of emotional overeating.
- Recommendations during illness (I’m missing these options in all the health apps I’m currently using): MyFitnessPal can tailor workout recommendations, monitor hydration and nutrition, track symptoms, and monitor progress to support the user’s health and recovery during illness.
- The most important thing, in my opinion, is the identification of critical states: By analyzing emotional data, the app can identify signs of more serious psychological discomfort, such as depression or anxiety, and offer timely help, including referrals to specialists.
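Here is a tiny, purely hypothetical sketch of how the first two ideas could work: a wearable supplies a stress estimate, and the app chooses the tone and content of its notifications from the user’s current state instead of a fixed script. All thresholds and messages are made up for illustration.

```python
from dataclasses import dataclass

@dataclass
class UserState:
    stress_level: float   # 0.0 (calm) to 1.0 (highly stressed), estimated from wearable data
    is_ill: bool = False  # e.g. elevated skin temperature or self-reported symptoms

def pick_notification(state: UserState, missed_workout: bool) -> str:
    """Choose the message tone from the user's current state instead of a fixed script."""
    if state.is_ill:
        return "Rest comes first today. Log how you feel and skip the workout guilt-free."
    if state.stress_level > 0.7:
        # High stress: drop calorie and consistency nudges, offer tension relief instead.
        return "Looks like a tense day. A five-minute breathing break might help."
    if missed_workout:
        return "One missed workout changes nothing. Pick it up whenever you're ready."
    return "Nice progress today, keep it up!"

# Example: pick_notification(UserState(stress_level=0.8), missed_workout=True)
```

The point is not these specific messages, but that timing and tone become functions of the user’s state rather than a one-size-fits-all script.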
Well, I hope this will be enough to help fuel your imagination and come up with a few more ideas on how to improve the health app you personally use.
I’d be delighted if you share your ideas in the comments to this post.
The conclusion
So, what answer can I give to the question I posed at the beginning of the article: “Can the health app paradox be resolved using emotional AI?”
My answer is: it definitely can!
As we have seen, artificial intelligence has the capability to sensitively respond to changes in our emotional state, including monitoring stress levels during interactions with various applications. This opens the door to a more empathetic and user-centered approach to health and well-being.
But we definitely need at least a couple more years before we have a chance to bridge the gap between advanced sensor technology and convenient, affordable devices.
Perhaps scientists will still be able to fit all the sensors they need into, say, a small and elegant ring?
For example, I wear an Oura ring all the time, which gives me almost no discomfort (except for a couple of asanas in yoga, where you have to cross your fingers). But, unfortunately, Oura is not yet able to track my emotional state. Plus the health recommendations and analytics are still very generalised and unspecific.
But I believe that anything is possible! Particularly considering the rapid pace of AI development.
Can you believe that ChatGPT was released only six months ago? Yeah, neither can I.
So stay tuned for the latest news on empathy in digital products!
Thank you for your attention and for reading to the end ❤
The list of information sources for this post, and simply an interesting reading list for leisurely study:
- 🔬 Research: Calorie counting and fitness tracking technology: Associations with eating disorder symptomatology. Courtney C Simpson & Suzanne E Mazzeo.
- 🔬 Research: “It’s Definitely Been a Journey”: A Qualitative Study on How Women with Eating Disorders Use Weight Loss Apps. Elizabeth V Eikey & Madhu C. Reddy.
- 📰 News: Calorie counting apps ‘can exacerbate eating disorders’. BBC News.
- 📰 News: The UK Has An Eating Disorder Epidemic. How Do We Stop It? Refinery29.
- 🎙️ Interview: Rosalind Picard: Affective Computing, Emotion, Privacy, and Health. Lex Fridman Podcast #24.
- 🔬 Research: Best practices for stress measurement: How to measure psychological stress in health research. Alexandra D Crosswell & Kimberly G Lockwood.
- 🔬 Research: Electrodermal Activity for Measuring Cognitive and Emotional Stress Level. Osmalina Nur Rahma, Alfian Pramudita Putra, Akif Rahmatillah, Yang Sa’ada Kamila Ariyansah Putri, Nuzula Dwi Fajriaty, Khusnul Ain, and Rifai Chai.