Why making Android App Bundles mandatory has sparked a debate about trust

From this month, mobile app developers uploading a new app to Google Play have to switch to Android App Bundles for distribution.

And it’s fair to say not everybody is happy about it.

Because while App Bundles do make life easier - particularly for end users - there’s a growing concern among app developers about security.

Most of the worry centers on Google Play Signing. Under the old APK upload method, developers signed their apps with their own private keys. But with App Bundles, Google manages the signing.

This has led some developers to question whether their app’s code might be tampered with or extra classes added.

Google has sought to reassure them. The company has even introduced an initiative called Code Transparency - itself not without its detractors - to tighten up the signing process.

Yet a steady hum of reservation remains.

Dig a little deeper, beyond the controversy around Google Play Signing, and some wider issues linked to trust emerge.

In this article, we’ll bring those issues to the surface. And we’ll ponder what the ideal Play Store security ecosystem might look like.

So, let’s get started.


The main App Bundles controversy is Google Play Signing

First things first - Google has got a lot right with App Bundles.

There was a genuine problem to be solved in the shape of thousands of different Android devices and configurations. Many of the resources that come in a typical universal APK are redundant on any given device. And these redundant resources take up lots of space on a device.

The great thing about Android App Bundles is that each APK generated from a bundle is tailored to a specific device configuration. So, storage space is saved. What’s more, downloads are faster - an important benefit when you bear in mind that not everyone enjoys a superfast internet connection.
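
Developers control which dimensions Google Play is allowed to split on. As a minimal sketch, assuming a standard Android Gradle Plugin setup, the module-level build.gradle.kts declares the splits like this:

```kotlin
// Module-level build.gradle.kts (Android Gradle Plugin DSL).
// Each enabled split lets Google Play generate and serve APKs
// containing only what a particular device actually needs.
android {
    bundle {
        language {
            // One resource split per language the user has enabled.
            enableSplit = true
        }
        density {
            // One resource split per screen density.
            enableSplit = true
        }
        abi {
            // One native-library split per CPU architecture.
            enableSplit = true
        }
    }
}
```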

If the App Bundle initiative stopped there, you suspect there would be widespread support for it. It’s the insistence on using Google Play Signing that has caused the controversy.

Before Android App Bundles, mobile app developers would sign their APK with a private key that they were responsible for looking after. Though a flexible approach, it wasn’t without its pitfalls. Developers sometimes lost their keys or checked a private key into a public repository. Keys could even be stolen.
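
For context, that self-managed setup typically looked something like the following Gradle Kotlin DSL sketch. The keystore path, alias, and environment variable names here are placeholders; real credentials should never be checked into version control.

```kotlin
// Module-level build.gradle.kts: the classic self-managed signing setup.
// The developer alone holds the keystore and its passwords.
android {
    signingConfigs {
        create("release") {
            storeFile = file("keystore/release.jks")
            storePassword = System.getenv("KEYSTORE_PASSWORD")
            keyAlias = "release-key"
            keyPassword = System.getenv("KEY_PASSWORD")
        }
    }
    buildTypes {
        getByName("release") {
            signingConfig = signingConfigs.getByName("release")
        }
    }
}
```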

With Google Play Signing, Google takes this responsibility away from developers. Instead, it is Google who manages the distribution key used to sign the APKs end users receive. This key is kept private from the developer.

In an article published on Medium last year, Google Android Developer Advocate Wojtek Kaliciński acknowledged that this is a departure from the norm. He said he understood that developers might feel they’re relinquishing too much control over their app. This wider idea of a loss of control is an interesting one that we’ll come back to later in this article. But if we focus on the Google Play Signing controversy for now, the main point of contention relates to security.

Google is clearly a highly reputable company that cares about protecting mobile apps. But if the last year has taught us anything, it’s that anyone can suffer a security breach.

Furthermore, as Google will be looking after the keys for all apps uploaded to the Play Store, bad actors know they only have to attack one place to steal thousands of keys.

Another issue that developers have raised is that the Play Signing method means that somebody at Google could potentially inject or modify code in the app. This is something that simply wasn’t possible with the old APK model: any change made after the developer signed the APK would invalidate the signature, so people would notice. With Play Signing, Google owns the signing key, so only they would know if changes were made.

Google has insisted they won’t modify code within apps. But what seems to be bothering people is the simple fact that they can do so.


Questions around Code Transparency

We should make it clear here that this isn’t only a Google issue. If you publish an APK file on the Amazon Appstore, Amazon also injects classes that alter the way the app will work on someone’s device. Apple regularly modifies the code of your app on its App Store, too.

As for Google, they’ve announced Code Transparency partly as a response to concerns about Play Signing. Code Transparency uses a second signing key to make sure the APK delivered by the Play Store matches the one the developer built. The difference is that this key is held only by the developer.

But Code Transparency has only quietened some of the noise around Play Signing. Indeed, in some cases it has resulted in more questions than answers.

One of these questions focuses on the fact that the Android OS currently has no way of verifying the Code Transparency file upon installation. It is the developer’s responsibility to make sure that all DEX and native code files in the downloaded APKs have the correct corresponding hashes in the Code Transparency file. And since Android doesn’t provide any built-in way to verify the Code Transparency file during runtime, developers have to do the checks themselves. They have to download the APKs from the Play Store and test them using bundletool.
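
To give a feel for what those checks involve, here is a simplified Kotlin sketch of the comparison that bundletool’s check-transparency command performs under the hood: hashing every DEX entry in a downloaded APK and comparing the results against the hashes the developer expects. (In reality the expected hashes come from the Code Transparency file itself, a signed JWT whose parsing and signature verification we omit here.)

```kotlin
import java.io.File
import java.security.MessageDigest
import java.util.zip.ZipFile

// Hash every DEX entry in a downloaded APK with SHA-256 and compare
// against the hashes the developer expects (entry name -> hex hash).
// Parsing the signed Code Transparency file, the real source of the
// expected hashes, is omitted for brevity.
fun verifyDexHashes(apk: File, expectedHashes: Map<String, String>): Boolean {
    ZipFile(apk).use { zip ->
        for (entry in zip.entries()) {
            if (!entry.name.endsWith(".dex")) continue
            val digest = MessageDigest.getInstance("SHA-256")
            zip.getInputStream(entry).use { input ->
                val buffer = ByteArray(8192)
                while (true) {
                    val read = input.read(buffer)
                    if (read < 0) break
                    digest.update(buffer, 0, read)
                }
            }
            val actual = digest.digest().joinToString("") { "%02x".format(it) }
            if (expectedHashes[entry.name] != actual) return false
        }
    }
    return true
}
```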

There is of course a risk here that integrity checks will just not be done. No checks at install time, no checks during runtime, and quite possibly none in the developers' own time. And so the app is left vulnerable to tampering.

But even if the developer takes their responsibility seriously and does everything right, the app will ultimately be out of their hands. This is the deeper concern for many. Whoever has signing authority (be it legitimately or illegitimately) over the Android App Bundle uploaded to the Play Store has the power to remove the Code Transparency file and then re-sign the app. If this were done, there would be no way for the original developer to verify its integrity anyway.

Additional layers of security are therefore needed to protect both developers and users from possible interference. These layers include a signing key held solely by the developer (as with Code Transparency). But on top of this, adequate encryption of both code and resources is needed, as well as integrity checks performed automatically during runtime. That way both developers and users get peace of mind that the app is exactly as it was intended to be. This is a crucial feature of RASP: Runtime Application Self-Protection. With these checks, the app itself routinely verifies that none of its code and resources have been modified since the developer signed it off.
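
As an illustration of the runtime end of this, here is a minimal Kotlin sketch of one such self-check: comparing the fingerprint of the certificate the installed app was actually signed with against a value pinned at build time. A real RASP layer verifies far more than this (code, resources, and the runtime environment), and the pinned hash below is a placeholder.

```kotlin
import android.content.Context
import android.content.pm.PackageManager
import android.os.Build
import java.security.MessageDigest

// Placeholder: the SHA-256 fingerprint of the developer's certificate,
// embedded at build time.
const val PINNED_CERT_SHA256 = "replace-with-your-certificate-fingerprint"

// Compare the certificate the installed APK was signed with against
// the pinned fingerprint. A mismatch suggests the app was re-signed.
fun signingCertificateMatches(context: Context): Boolean {
    val pm = context.packageManager
    val signatures = if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.P) {
        pm.getPackageInfo(
            context.packageName, PackageManager.GET_SIGNING_CERTIFICATES
        ).signingInfo.apkContentsSigners
    } else {
        @Suppress("DEPRECATION")
        pm.getPackageInfo(context.packageName, PackageManager.GET_SIGNATURES).signatures
    }
    return signatures.any { sig ->
        val hash = MessageDigest.getInstance("SHA-256").digest(sig.toByteArray())
        hash.joinToString("") { "%02x".format(it) } == PINNED_CERT_SHA256
    }
}
```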

And it is possible to combine all of this with the Code Transparency approach, by using the Code Transparency file as another block in the structure of integrity control. Code Transparency, in other words, can be part of a solution to potential problems with Google Play Signing. Even if it is not the whole solution.

For that, what is needed is a more complete approach to app distribution, where the security and integrity of the app can be checked and proven at every stage. So, when the developer uploads it to the app store, when the app store publishes it, when the end user downloads and installs it, and of course routinely on start-up and during runtime. This creates a full chain of trust. A coherent system of verification with real transparency for developers and users.


Why the Android App Bundles controversy matters so much to developers

Speaking of trust, it seems appropriate to talk about a theme we’ve covered a lot here at Licel in recent months: the growing importance of mobile apps to business success.

The more you read about Android App Bundles and Code Transparency, the more you can’t help noticing a disconnect. Mobile apps can be deeply emotional things for developers. After all, we’re talking about a project that might have taken them months or even years to complete. The end result is an app that could become critical to their hopes and dreams for the future.

Many businesses now exist that have forgone a physical store in favor of an app.

Think about challenger banks like Starling and Monzo, for example. In other words, mobile apps are now sometimes the most important asset a business has.

Put yourself in the shoes of a developer at one of these businesses and it’s easy to understand why they might be reluctant to give up so much control. Yes, there were some downsides to looking after their own private key. But at the end of the day it was their responsibility. They were accountable.

To some developers the shift to App Bundles and Play Signing is akin to dropping off a parcel at Google that contains information integral to everything they want to achieve as a business. It stands to reason that they might feel anxious on the way home, wondering what will happen to it and whether it’s in safe hands.

As we said earlier, the chances of any kind of breach at a company like Google or Apple are remote. Yet when the asset being protected is so vital, even a minuscule possibility is hard for developers to accept.


The undercurrent driving the waves

This is the disconnect that is so hard to ignore. Namely the gap between the emotional, financial, and reputational value invested in an app by an individual developer and the security measures in place in Silicon Valley. It’s a disconnect that leads us to ask an important question:

Is it reasonable or right to delegate your cybersecurity needs to businesses like Google, Apple, and Amazon?

This is a particularly pertinent question when you bear in mind that when these companies talk about security, what they tend to be referring to is user privacy. But protecting user privacy using built-in OS protection alone is impossible. Robust protection should incorporate many different security layers, from Integrity Control through to Device Attestation.
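
To make that last layer concrete: at the time of writing, the built-in route to device attestation on Android is the SafetyNet Attestation API, which asks Google Play services for a signed verdict on the device’s integrity. A minimal Kotlin sketch, with the nonce source and API key as placeholders:

```kotlin
import android.content.Context
import com.google.android.gms.safetynet.SafetyNet

// Request a signed JWS verdict about the device from Google Play
// services. The nonce must be unique per request (ideally issued by
// your server), and apiKey is a placeholder for a key from the
// Google Cloud console.
fun requestDeviceAttestation(context: Context, nonce: ByteArray, apiKey: String) {
    SafetyNet.getClient(context)
        .attest(nonce, apiKey)
        .addOnSuccessListener { response ->
            // The JWS should be forwarded to your server and verified there.
            val jws = response.jwsResult
            println("Attestation verdict received: ${jws.take(32)}...")
        }
        .addOnFailureListener { e ->
            println("Attestation failed: ${e.message}")
        }
}
```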

Another topic we’re fond of exploring here at Licel is the balance between convenience and security. Oftentimes it feels like the scales tip in favor of the former rather than the latter. Might this also be the case with Android App Bundles? The suspicion is that a more seamless user experience has been prioritized at the expense of Integrity Control.

All this helps to explain why there is a wider conversation about trust right now. After all, at the same time that developers are debating the merits of the shift to App Bundles, Fortnite creator Epic Games’ battle with Google and Apple is making headlines.

The undercurrent driving the more obvious waves we’ve covered so far in this article might be a growing resentment of the power and control that the Silicon Valley giants wield. From the outside it can appear they’re writing the rulebook on mobile apps. Not only where they must be downloaded from, but also how they must be protected.


A commitment to security

It’s often forgotten that smartphone apps have only been with us for around a decade. To some extent we’ve all had to set the rules as we go.

But the sense you get from reading blogs from developers who challenge the App Bundles and Play Signing requirement is that they feel sidelined. Impotent.

This seems like a bit of a missed opportunity for big players like Google, Apple, and Amazon, all of whom have a chance to play a key role as cybersecurity educators in the coming years.

The mobile phone has become even more important to people’s everyday lives during the Covid-19 pandemic. It’s a trend that hasn’t gone unnoticed by cybercriminals, which explains why they’ve flooded people’s phones with social engineering attacks.

This has brought an increased awareness of cybersecurity to ordinary people. But at the same time, we’re not about to pack our phones away under our beds and go back to how life was before. Instead, we’ll simply expect the businesses that develop the apps we rely on to secure those apps thoroughly.

In the near future a commitment to security is likely to be a key metric for consumers alongside value for money and trust.

Truly caring about security is a cultural thing. Forward-thinking businesses are now starting to realize the benefits that come from embracing security by design principles. From thinking about security first and functionality second.

But we’re not there yet. If the majority of developers don’t take security seriously enough, then perhaps we shouldn’t expect Apple and Google to do so, either.


Making app store processes more secure

Our goal with this article isn’t to lambast businesses like Google, Apple, and Amazon. Far from it. Instead, it’s to start an open conversation about whether we’re collectively taking mobile security seriously enough. To ponder ways we might be able to work together to make sure bad actors don’t gain even more ground.

So, with that in mind, what do we see as the ideal approach to security for app stores?

At Licel we often speak with our clients about creating a chain of trust. That means across all steps in app development and distribution. From the developer, to the app store, to the end user.

What does that mean in practice?

Well, to start with, there should be a full PKI in place to verify that the certificate is genuine and that it hasn’t expired, much as you can verify the full chain with SSL pinning. This should be the base for any kind of signing. If you can’t verify the original certificate or the full chain, you can’t trust it.
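
The SSL pinning analogy can be made concrete. With OkHttp, for example, a client refuses a TLS connection unless the server’s certificate chain contains a pinned public key (the hostname and pin below are placeholders):

```kotlin
import okhttp3.CertificatePinner
import okhttp3.OkHttpClient

// The connection is rejected unless the server's certificate chain
// contains the pinned public key. Hostname and pin are placeholders.
val client = OkHttpClient.Builder()
    .certificatePinner(
        CertificatePinner.Builder()
            .add("api.example.com", "sha256/AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA=")
            .build()
    )
    .build()
```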

Ideally, a developer should also be able to verify app integrity at each stage of distribution with their public key certificate. So, after the app store processes the application, once it’s downloaded to the device, and later when it’s running on the device.
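
Mechanically, each of those checks boils down to verifying a signature over the delivered artifact with the developer’s public key. A minimal sketch using the standard java.security API; the file sources are placeholders, and reading whole files into memory is for brevity only:

```kotlin
import java.io.File
import java.security.Signature
import java.security.cert.CertificateFactory

// Verify a detached signature over an artifact using the public key
// from the developer's X.509 certificate. Where the signature and
// certificate come from at each stage of distribution is up to the
// ecosystem; the file arguments here are placeholders.
fun verifyArtifact(artifact: File, signatureFile: File, certFile: File): Boolean {
    val cert = certFile.inputStream().use {
        CertificateFactory.getInstance("X.509").generateCertificate(it)
    }
    val verifier = Signature.getInstance("SHA256withRSA").apply {
        initVerify(cert.publicKey)
        update(artifact.readBytes())
    }
    return verifier.verify(signatureFile.readBytes())
}
```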

To really build trust, companies like Google or Apple need to note any modifications they make to files and classes. And developers should be able to verify these with the app store’s public key. Code Transparency is a good start, but the fact that it isn’t a mandatory measure right now means some developers won’t be aware of it.

Finally, to combat the threat of tampering and reverse engineering, we recommend that the app be uploaded in an encrypted format.