Sometimes a crisis can lead to experimentation, innovation, and evolution.
A good example is the emergence of the sharing economy after the financial crisis. Brands like Uber and Airbnb, which have revolutionized the way we travel, didn’t exist before the recession.
Some commentators think the Covid-19 crisis will have the same kind of impact on our lives.
After all, in some ways the pandemic has already sped up IoT trends this year. As more people work remotely, the need for 5G technology has felt more pressing. And the smart speaker is starting to look like an even better investment when living rooms double up as offices.
However they arrive, all successful innovations have something in common: they appeal to us in an emotional way. The founders of Uber and Airbnb knew this. They were selling freedom as much as they were selling more affordable transport and accommodation.
And fulfilling an emotional need leads us nicely to an app that we read about recently called Woebot. It sounds as if it were designed for the pandemic and the post-Covid-19 world. It offers help with mental health in a world that feels more anxious than ever. But it can also be used from home, without having to come into contact with the virus.
We read about Woebot, and we wondered. Could apps like this be a pointer to the future? To a world where virtual assistants like Siri become more like companions that can offer us advice?
And if so, what would the security implications be of an app that holds so much personal information about us?
Even before the coronavirus crisis, the digital mental health and wellness market was booming. Take the meditation app, Headspace, for example. At the beginning of the year, it had more than 2 million paid subscribers. The app had been downloaded more than 62 million times across 190 countries.
But the global lockdown has put mental health firmly under the microscope. In May, The Guardian reported that psychiatrists were expecting services to be overwhelmed by a ‘tsunami’ of sickness triggered by the crisis.
And this makes sense. 2020 has seen levels of anxiety and uncertainty far beyond what you’d expect in a typical year. Even as lockdown measures are eased, it’s unlikely that everybody who wants to see a therapist will be able to do so as soon as they’d like. Others simply can’t afford one.
In this context, Woebot’s emergence seems particularly timely. It offers free cognitive behavioral therapy to help those suffering from anxiety, depression, and other mental health issues.
Woebot and other apps like it have been given a boost by the Food and Drug Administration (FDA) in the US. In April, the FDA suspended some of its rules for digital therapeutic devices so that more people could benefit during the lockdown.
Decisions like this can turn a niche product into a mainstream one. Time will tell whether that happens with Woebot, but there’s no denying that its creators have tapped into an emotional need. Just like Uber and Airbnb before them.
It’s a need that might grow in the coming months, too. Remote working was a trend before the pandemic, but like other trends it has been fast-tracked. It’s hard to imagine everybody returning to the old ‘9 to 5’ routine, even after restrictions are eased. Of all the things people miss about the pre-Covid-19 world, two hours on the train each day isn’t one of them. But for some people, the social interactions of the office were vital. For those who live alone, a future where we all work remotely might sound like a lonelier world.
Could technology fill a void in a more remote, individualistic world? In the near future, might we be having more meaningful conversations with Alexa and Siri? Rather than simply switching our Spotify playlist or telling us the weather forecast, could they become actual companions?
Your virtual companion
Alexa wakes you with some specially chosen music based on your Spotify data. While you slept, it collated data from your wearable device and the other sensors that have surrounded you these past few days. Then its algorithms got to work.
As you get out of bed, Alexa jokes with you about the dream you can never make sense of. It asks you how you’re feeling about the investor meeting you have booked in for 10am. Then it prepares a personalized 10-minute meditation designed to focus your mind.
AI might not be there quite yet, but the intention to create virtual assistants that can help in more meaningful ways certainly is.
The Samsung subsidiary STAR Labs is investing heavily in its Neon project, although some have criticized it as mostly hype. According to the company, Neons are being designed to hold conversations with users while displaying “emotions and intelligence.”
But if virtual assistants like Neons or Alexa did evolve to become virtual companions, what would that mean for your personal data? After all, we’re talking about an AI knowing more about you than your partner or, well, even you do.
These virtual companions would combine data about your physical wellbeing with insights about your mental health. They’d know your most intimate thoughts, beliefs, and desires.
The present already gives you a pretty good sense of how much the virtual companion of the future would know about you. After all, you’re already targeted with ads based on your interests. You talk about coffee with a friend, and hours later you see an ad for LA’s best filter coffee on Instagram. That’s not a coincidence.
A tempting prospect for hackers
For years, commentators have hypothesized about exactly how much big tech companies like Facebook and Amazon know about you. Might they even know that you’re going to break up with your boyfriend before you do?
If they did, that would be pretty valuable data. Not least for brands who might hope to profit from your changing lifestyle.
But it’s likely that other, more malicious actors would be interested in this data, too. Hackers already attack apps with the hope of extracting valuable user information. This could be bank account information from a mobile banking app. Or it could be someone’s medical records from a healthcare app.
So it’s easy to imagine how attractive a target the app behind our virtual companions would be. Hackers could attempt to reverse engineer it, then distribute a fake version and trick people into downloading it. Or they might carry out a man-in-the-middle attack, hijacking the communication channel between the end user and the virtual companion. From there, they could ask their own personal questions and learn about the end user’s habits and movements for a given week.
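To illustrate why the hijack scenario above is taken so seriously, the standard defense against it is certificate pinning: the app refuses to talk to any server whose TLS certificate doesn’t match a fingerprint shipped inside the app itself. Here’s a minimal sketch in Python; the pinned hash is a placeholder, and a real mobile app would use its platform’s networking stack rather than raw sockets:

```python
import hashlib
import socket
import ssl

# Hypothetical SHA-256 fingerprint of the backend's certificate.
# In a real app this value ships inside the (protected) app binary.
PINNED_SHA256 = "d4735e3a265e16eee03f59718b9b5d03019c07d8b6c51f90da3a666eec13ab35"

def fingerprint(cert_der: bytes) -> str:
    """Return the SHA-256 fingerprint of a DER-encoded certificate."""
    return hashlib.sha256(cert_der).hexdigest()

def is_pinned(cert_der: bytes, pinned: str = PINNED_SHA256) -> bool:
    """True only if the presented certificate matches the pinned fingerprint."""
    return fingerprint(cert_der) == pinned

def connect_pinned(host: str, port: int = 443) -> ssl.SSLSocket:
    """Open a TLS connection, aborting if the server cert isn't the pinned one."""
    ctx = ssl.create_default_context()
    sock = ctx.wrap_socket(socket.create_connection((host, port)),
                           server_hostname=host)
    cert_der = sock.getpeercert(binary_form=True)
    if not is_pinned(cert_der):
        sock.close()
        raise ssl.SSLError("certificate does not match pin: possible MITM")
    return sock
```

The point of the pin check is that even an attacker holding a valid CA-signed certificate for their interception proxy would fail it, which is why hijacking the channel is much harder against an app that pins, and why attackers often try to strip the check out of the app binary instead.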
It was the threat of exactly this kind of attack that led Germany’s telecommunications watchdog to order the destruction of a toy doll a few years ago. Regulators had found a weakness in its design that allowed hackers to speak to a child directly.
Here’s the thing: bad actors like uncertainty. They know that vulnerable, anxious people are more likely to open a bogus email and download a fake app. That’s one of the reasons there have been so many cyber attacks during the Covid-19 crisis. It’s also why the apps governments are releasing to track and trace the virus are at risk.
And sadly it’s something else that crossed our minds when we read about Woebot. While apps like it have the potential to help people through challenging periods in their life, they’ll need to be protected from outside threats. Because times of crisis don’t only offer opportunities to innovators and creators. They also create new avenues for attackers keen for the uncertainty to continue.
At Licel we protect apps for brands around the world. We keep their end users’ data safe, which means people can enjoy apps the way developers intended.
Find out more about us.