“Look at this.”
“I think it’s another spam message.”
A friend in London had just received an SMS asking her to move the date of her second Moderna vaccine forward a few weeks due to supply issues.
Since the beginning of this year, Sarah explained, she’d received an average of 2-3 texts per week just like this one. Some told her a package she’d been expecting wouldn’t be delivered until she paid a small fee. Others were from the tax office, reminding her about a return that was due. And more recently, she’d been seeing lots of messages about Covid-19 tests and vaccines.
“I’ve been really close to clicking on these links before. After all, I am normally waiting for a package of some kind - I shop quite a lot when I work from home! I also work freelance, so the tax return one nearly got me, too. And now I’m getting all these messages about the vaccine just when I’m super keen to be fully vaccinated.”
In the end Sarah went directly to the NHS website and realised that the text she’d received was probably genuine. But she’s not alone in being sceptical of messages that ping on her phone. Especially those that follow the now familiar template:
“There’s something urgent you need to do to solve a problem - click on this link now to fix it.”
Sarah hasn’t fallen for one of these scams, but two of her friends have. And in both cases one moment of distraction caused a month or so of distress.
When the phone was our safe place
According to Deloitte, almost half of us have been scammed by a phishing message while working from home during the pandemic.
And when you consider the history of the smartphone and the way we feel about our phones in general, it’s easy to understand why we’re more vulnerable to cyber attacks that target us there.
In just a decade the smartphone has completely changed our behaviour. In that time it has almost become an extension of our bodies. A kind of second brain at the tip of our fingers that both connects us to the world and allows us to detach ourselves from it.
Notifications informing us of messages, likes, and comments elicit strong emotions that can change our mood if only for a minute or two. It’s not the same as receiving an email on your laptop.
That’s because for a long time we’ve associated our phones with our friends and family. The people we care about and trust.
The phone has been a safe place for people to joke with one another, take a selfie, send photos from an old holiday, or share a link to an album they can’t stop listening to. This is the subconscious association we have when we think of our phones.
Psychologically this is important because it helps to explain why so many phishing campaigns are sent via SMS. It also tells us why so many of them succeed.
We’ve associated emails with scams for a lot longer. We feel like we can spot them there more easily. Lots of them automatically land in our junk folder anyway.
With text messages it’s different. There is no junk folder.
Because our phones are where we tend to hear from people we trust, a hacker who tries to get our attention there is already at an advantage compared with one who interrupts us on our laptop.
What’s more, we tend to be more alert when we’re on our laptop. Whether we’re working or we’re booking a night in a hotel, we typically have a more focused mindset. So we’re less likely to let a bogus email catch us out.
When we’re on our mobile, we’re often doing other things at the same time. We might be checking the weather forecast on the way out of the house. Or maybe we’re commuting into the office and receive a message just before the train reaches our stop.
Deferring to authoritative voices
If you’re distracted and not concentrating fully, it’s very easy to mistake a phishing message for the real thing. You click on a link and before you know it you’ve filled out a form with sensitive personal information. Then you forget about it and get on with your day.
This is exactly what happened to Emmeline Hartley, who made the brave decision to admit to falling for a phishing scam earlier this year in a Twitter post. The day after entering her details via a link in a bogus text, she received a call - apparently from her bank - telling her that her account had been compromised and advising her to move her money into a safe account.
Though she was suspicious at the time, she just about believed the person at the other end of the line. The money that she transferred was lost.
Some might be critical of Emmeline for trusting the caller. Indeed many have been in the replies to her tweet. But we’re all capable of a lapse of judgement like this. Especially after what has been the most stressful year many of us have ever lived through.
We’ve deferred to authorities more than ever during the Covid-19 crisis to give us some clarity and direction in a very strange time.
Cybercriminals know this. They know that our emotional attachment to our phones and our habit of multitasking makes us more likely to respond to an SMS. And they know that many of us are constantly expecting parcels and are keen to return to the lives we knew before the coronavirus arrived.
This knowledge forms the core pillar of their social engineering strategy.
Mobile-centered attacks are getting more sophisticated
In Emmeline’s case, the link she clicked on led to a fake call from her bank the following day. But often the act of clicking the link in an SMS phishing message can result in malware spreading throughout your phone and potentially outwards to your contacts.
This malware is designed to give the attacker access to your private accounts and services. It can allow an intruder to use your mobile for unauthorised purposes or to disclose personal data. And it can even erase or change information on your phone.
At the time of writing, news of the Pegasus hack is dominating newspaper headlines. Pegasus is being called the most powerful piece of spyware ever developed. It can turn your phone into a 24-hour surveillance device.
The Pegasus hack is yet another example of a sad shift. The phone used to be a vital form of escape and communication for persecuted people, but now it’s often becoming a silent spy in the palm of their hands.
Pegasus is also a reminder that cyberattacks that target our phones are getting more sophisticated by the day.
Earlier this year we read an article in Vice Magazine about a hacker using a service called Sakari, which helps businesses do SMS marketing and mass messaging. The article revealed that a security hole at Sakari allowed an attacker using the service to reroute messages from a victim’s phone to their own.
From there it was quite simple to hack into other accounts associated with that phone number. The bad actor sent login requests to apps like Bumble and WhatsApp, then easily accessed the accounts by having password reset messages sent to the rerouted number.
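The logic of that takeover can be shown in a toy model. Everything here is hypothetical (the account store, the `send_reset_code` and `reset_password` helpers, the phone number); the point is simply that when account recovery is keyed on a phone number, whoever receives the SMS owns the account:

```python
import secrets

# Toy model: an account recovery flow keyed only on a phone number.
accounts = {"+447700900123": {"owner": "victim", "password": "hunter2"}}
sms_inbox = {}  # number -> last SMS delivered; rerouting changes who reads it

def send_reset_code(number):
    """Generate a 6-digit reset code and 'deliver' it over SMS."""
    code = f"{secrets.randbelow(10**6):06d}"
    accounts[number]["pending_code"] = code
    sms_inbox[number] = f"Your reset code is {code}"

def reset_password(number, code, new_password):
    """Anyone who can quote the SMS code gets to set a new password."""
    if accounts[number].get("pending_code") == code:
        accounts[number]["password"] = new_password
        return True
    return False

# The attacker reroutes the victim's SMS to an inbox they control,
# reads the code, and resets the password without any other secret.
send_reset_code("+447700900123")
stolen_code = sms_inbox["+447700900123"].split()[-1]
reset_password("+447700900123", stolen_code, "attacker-owned")
print(accounts["+447700900123"]["password"])  # → attacker-owned
```

No malware, no password cracking: the only weakness exploited is that the SMS channel itself was treated as proof of identity.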
Is it time we stopped using SMS for two-factor authentication?
The article in Vice seemed to spark a wider debate among cybersecurity analysts about whether it’s time to rethink our relationship with our phone number and what we use it for.
Brian Krebs certainly thinks so. In a blog post responding to the Vice piece, he said that phone numbers were never designed to be identity documents. Yet somehow that’s exactly what they’ve become.
He used the post to advise his readers to remove their mobile numbers from their online accounts where possible, and not to use them for multi-factor authentication.
This is advice that is repeated in a number of publications from recent years. The UK’s National Cyber Security Centre (NCSC) also says that while there are lots of reasons why SMS might be useful for businesses - not least its convenience - it isn’t the most secure of ecosystems.
One of the NCSC’s suggestions is to use iOS and Android push notifications for authentication instead. Indeed, Apple and Google have both recently made on-device prompts their go-to verification method rather than SMS and voice calls.
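App-generated codes are another way to keep the second factor off the SMS network entirely. As an illustration (not something the NCSC prescribes), here is a minimal sketch of the TOTP algorithm from RFC 6238, the scheme behind most authenticator apps, using only Python’s standard library:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, t=None, digits=6, step=30):
    """Generate an RFC 6238 time-based one-time password."""
    key = base64.b32decode(secret_b32, casefold=True)
    # The moving factor is the number of 30-second steps since the epoch.
    counter = int((time.time() if t is None else t) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    # Dynamic truncation (RFC 4226): pick 4 bytes at an offset from the digest.
    offset = digest[-1] & 0x0F
    code = (struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF)
    return str(code % (10 ** digits)).zfill(digits)

# RFC 6238 test secret "12345678901234567890", base32-encoded:
secret = "GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ"
print(totp(secret, t=59))  # → 287082 (RFC 6238 test vector for T=59)
```

Because the shared secret never leaves the device after enrolment, rerouting a victim’s SMS gains the attacker nothing here.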
Clearly it was a challenge to get people on board to set up two-step verification in the first place. And that probably helps to explain why SMS was chosen as the default channel for it. People are familiar with SMS. We’ve been using it for 20 years or so.
This, then, is a key challenge for the coming years. Yes, we almost certainly do need to transition away from a reliance on SMS. But not if it means putting people off setting up their accounts securely altogether.
What mobile app developers can do to ease end user anxiety
Perhaps an even greater challenge is for us to collectively reclaim the mobile phone as a device of fun and convenience rather than one that makes us tense up when we hear it ping.
Hackers prey on vulnerability. And there’s a lot of that going spare these days. We’re living through a pandemic that has caused a global mental health crisis. Pretty much the last thing people need on top of everything else is to feel anxious whenever they receive an SMS.
Here at Licel we’ve written a lot about the value of embracing security by design principles. And we think the bedrock of working this way is to have empathy for your end user.
If you’re involved in mobile app development, then now more than ever it’s vital that you put yourself in your end user’s shoes.
Think about their day-to-day. How are they going to use your app? And where? This is quite a common consideration when trying to get the UX just right. But it’s much less common to do so with a view to making the end user’s experience more secure.
Can you educate your end users about how you’ll communicate with them? If they know that you’ll never send them an SMS and ask them to click on a link, then they can safely ignore any bogus messages from people pretending to be you.
Also have a think about how you use SMS as a business. Is it appropriate for you to use it as part of a two-factor authentication procedure? Could you use push notifications instead?
Do you need your users to create a password or can they log into your app with biometrics?
And finally, do you even need to collect your end user’s mobile number? Ask yourself: Why do you really need it?
Chances are your competitors aren’t asking themselves these important questions from the beginning of app development. If you do, then your end users will appreciate it. They might even tell their friends about how cool and transparent you are.
Let’s teach people to recognise the threat
We’re at a bit of a turning point in our relationship with our mobile phones. Don’t get us wrong, they can still be a fun place to be. The millions of people dancing on TikTok each day and sharing cat memes are testament to that.
But somewhere in the distance - right in the periphery of our vision - there’s a bad actor waving at us. Trying to get us to look in his direction instead.
At the very least we think it’s worth having a conversation about how to make sure people can recognise him.
When they can, they’ll ignore him and turn the other way.
Take a look at our security by design principles to get a better understanding of how to develop mobile apps more safely.