In the field of healthcare, the COVID-19 pandemic has exposed some fundamental vulnerabilities while simultaneously highlighting opportunities to remedy them through innovation.
The issue of ensuring people have access to medical advice while under lockdown has had two parallel effects. On the one hand, people under lockdown have been visiting “Dr. Google,” whose advice is of course influenced by ad spend and, according to the latest research, is almost always wrong. On the other hand, telehealth is increasingly being adopted for remote consultations, providing patients with a lifeline for qualified medical advice from a real doctor while under lockdown.
Although telehealth can be a lifesaver for patients, particularly when compared to the dubious advice of Dr. Google, not all telehealth solutions are created equal. Whether a synchronous consultation happens by chat, phone or video, the basic unit of interaction is still 10 to 15 minutes of a doctor’s time, so it does little to reduce the load on healthcare systems or the cost of that interaction. And because they’re often cumbersome to access, utilization rates for telehealth apps are still quite low.
Data Ethics By Design
We developed Abi, a micro-consultation service, to bridge the gap between those two approaches. On the one hand, Abi uses sophisticated artificial intelligence (AI) and machine learning to make each medical interaction highly efficient. On the other hand, that interaction is always with a real doctor from the user’s home country, in order to establish a basis of trust and to drive adherence to the advice provided.
Quite radically, we have chosen to make the service available inside the world’s most popular chat apps. The Abi micro-consultation service is available inside of WhatsApp, Telegram, Viber and even SMS. This unique approach raises a number of questions around privacy and security.
As we all know, the GDPR calls for data protection by design. That’s very important, but we actually think that’s not quite enough. We are taking it a step further and thinking about data ethics by design. When we talk about data and data protection, we often think about the collection of data. When it comes to Abi, it’s actually a data exchange as we’re both collecting data from users and providing users with data. We like to think about the ethics of data exchange, and how that underpins our service. There are three key pillars to consider: that the service is accessible, that the service is secure, and that it is trustworthy.
Bridging The Digital Divide
If you think healthcare is a right, then healthcare innovators need to consider the digital divide. When we started, it was important for us to include SMS as a channel of communication. While an app could reach two or three billion people in the best-case scenario, adding SMS can reach upwards of six billion. This relates to questions of language and culture too, so we ensure that people have access to doctors from their home country and that they’re able to interact with the service in their own language.
But one of the big issues with telemedicine is that, although it may be accessible, it’s not accessed. There are extraordinarily low utilization rates for most digital health services, particularly those that are app-based. Our approach is to integrate our service into existing chat channels like WhatsApp and SMS, and we see utilization rates that are 10 to 20 times higher than traditional telemedicine apps. This is not just a good design choice, it’s a good ethical choice as well. To make the service not only accessible but accessed, you need to meet people where they actually are.
Security As A Service
People are already using chat apps to interact in all kinds of ways, including with physicians. The inspiration for our service was to look at existing user behavior, and then try to build a service around that behavior. If you are lucky enough to have family members or friends who are doctors, this is probably the way you are interacting with them already.
But the question is not just whether the channel is already secure; it’s how we keep it secure. That’s a question of what additional steps you can take. For example, we don’t track our users, we don’t drop cookies, and we collect only the minimum amount of data necessary to deliver the service. This also has implications for data ethics: we are not a data business; we are a service business. And because every interaction is mediated, the user and the doctor are never directly in touch with each other.
Would You Trust A Robot?
I don’t think it’s news to anyone that we are experiencing a crisis of trust globally. Levels of trust in media, politicians, and experts are at an all-time low. At the same time, everybody with a Twitter account pretends to be an expert on every topic under the sun. But we have felt from the beginning of our service that it is an ethical imperative to prioritize expertise, and that expertise matters particularly when it comes to healthcare. It’s not just a question of whether information is trustworthy; it’s a question of whether it’s trusted. Do people actually believe and accept the information that’s provided? Do they adhere to it and actually change their behavior as a result?
We asked our doctors to estimate, out of 10 cases, how many times they are saving people an in-person visit to the doctor. On average, our doctors think they’re doing so in seven out of 10 cases. Abi users agree: they say that trusted medical advice has helped them save a visit to the doctor in more than 70% of cases. This alignment speaks to a high level of trust in real doctors, and frankly, people just aren’t ready for robot doctors to replace real ones, whatever the quality of that particular technology.
By adhering to the three pillars of access, security and, above all, trust, telehealth can be not only an emergency service during the pandemic but an essential service for the future of healthcare.
Kim-Fredrik Schneider is the Co-Founder and CEO of Abi Global Health.