Here’s another great PBL case study from my PGCert cybersecurity course, based on this article. In this instance the focus was less on the details of the flaw itself than on the circumstances of how it was revealed by the researchers (way too soon) and handled by HTC (way too late). Now, I’m not an expert in the circumstances of this case, but this is a summary of the general points I took from it and successfully presented.

HTC’s shouldawouldacoulda

Of everything HTC could have done, having a vulnerability reporting and disclosure policy tops the list: a publicly available statement of how researchers should communicate with HTC when they discover an issue and how HTC will deal with it (one is available now, but perhaps wasn’t at the time). Such a policy usually grants legal immunity to researchers who report their findings responsibly, and provides secure contact details such as an email address and a PGP key, since both parties will want the information exchanged confidentially. It may include timelines giving the researcher an idea of when to expect return contact from the company and when they can disclose their findings to the public with the company’s endorsement. Sometimes there’s also a ‘hall of fame’ recording the company’s gratitude to researchers who have helped improve its products in the past. HTC should have made every effort to follow up swiftly on vulnerability reports, not only because any reported issue could already be affecting their customers, but also to avoid precisely the circumstances that arose here, where researchers felt they weren’t heard and so went public after an unreasonably short amount of time.
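Today a policy like this is often advertised via a `security.txt` file (RFC 9116) served at `/.well-known/security.txt`. A minimal sketch, with entirely hypothetical contact details, might look like:

```
Contact: mailto:security@example.com
Encryption: https://example.com/pgp-key.txt
Expires: 2026-12-31T23:59:59Z
Policy: https://example.com/security-policy
Acknowledgments: https://example.com/hall-of-fame
Preferred-Languages: en
```

The `Contact`, `Encryption`, `Policy`, and `Acknowledgments` fields map directly onto the elements described above: a secure reporting channel, a PGP key for confidential exchange, the disclosure rules, and the ‘hall of fame’.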

HTC could have raised their profile further by running a ‘bug bounty’ program. This goes beyond simply accepting vulnerability reports by offering financial incentives for them, with the aim of encouraging skilled researchers to actively hunt for vulnerabilities on the company’s behalf. Researchers must abide by the relevant disclosure policy to be eligible for the reward, so HTC would have been able to ensure a reasonable disclosure period. If running such a program in-house were not feasible for any reason, there are reputable vendors that run fully managed bug bounty programs on behalf of large organisations, where operating the program and verifying reported findings are part of the service.

HTC should have had security as a core pillar of their software design policy. Without this in place, the guiding principles seem to have been:

  • ‘Security through obscurity’: HTC placed sensitive data in a location where it was unsafe but the assumption was that it would not be a problem because nobody would think to look for it there.
  • ‘Convenience over safety’: HTC presumably had their app logging data to an unprotected location because it was easier to extract from there without requiring higher-level access.
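The ‘convenience over safety’ trade-off has a close analogue in ordinary file permissions. The following is a minimal, hypothetical Python sketch (not HTC’s actual code) contrasting a world-readable log file with one restricted to its owner; the log contents and paths are invented for illustration:

```python
import os
import stat
import tempfile

# Fix the umask so the permission bits below are deterministic.
os.umask(0o022)
log_dir = tempfile.mkdtemp()

# Insecure: world-readable log file -- any local user (or, by analogy,
# any app granted the right permission) can read whatever is written here.
insecure_path = os.path.join(log_dir, "insecure.log")
fd = os.open(insecure_path, os.O_WRONLY | os.O_CREAT, 0o644)
with os.fdopen(fd, "w") as f:
    f.write("location=51.5,-0.1 user=alice\n")

# Safer: owner-only permissions, set atomically at creation time.
secure_path = os.path.join(log_dir, "secure.log")
fd = os.open(secure_path, os.O_WRONLY | os.O_CREAT, 0o600)
with os.fdopen(fd, "w") as f:
    f.write("location=51.5,-0.1 user=alice\n")

# Compare the permission bits of the two files.
for path in (insecure_path, secure_path):
    mode = stat.S_IMODE(os.stat(path).st_mode)
    print(f"{os.path.basename(path)}: {oct(mode)}")
```

The convenient option works everywhere with no access control; the safe option costs one extra thought at design time, which is exactly the discipline a security-first design policy enforces.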

HTC’s vulnerability policy should have treated eliminating risk to customers as the primary concern. By providing an immediate hotfix allowing users to fully remove the HTC app that was placing them at risk, they could have bought time to securely redesign the app and re-release it in a future update. Every moment of delay put their customers at risk, since proof-of-concept exploit code was already ‘in the wild’. As the app was not core to the functionality of the device, this should have been a no-brainer of a decision, halting the risk to both HTC’s reputation and their customers’ private data.

HTC’s vulnerability policy should have outlined immediate engagement with stakeholders:

  • Customers – in order to communicate the exact nature of the risk they face, what HTC are doing to reduce or mitigate the threat, and what actions customers should take in the interim.
  • Researchers – in order to ensure HTC correctly understood the nature of the vulnerability and the operation of its exploit.
  • Google – in order to provide them with examples of exploit code so they could screen Store apps, avoiding the risk of a customer installing a malicious app that exploits the vulnerability.

Along with many other software businesses, HTC must contend with commercial forces. As their resources are finite, when they receive a vulnerability notification they must weigh the investment needed to fix it against the risk of its exploitation. That risk determination could be based on a framework such as CVSS, which evaluates aspects like the exploitability of the flaw and the impact on customers if it were exploited. The company could then build their patch response policy around this, with response timelines targeted at specific severity ratings. A well-documented policy built on such a metric ensures that decisions are consistent and that, if necessary, the company can later explain and defend its rationale.
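As a concrete illustration of such a rating, the CVSS v3.1 base score combines an exploitability sub-score with an impact sub-score. A simplified Python sketch, covering only the Scope: Unchanged case (metric weights taken from the CVSS v3.1 specification):

```python
import math

# CVSS v3.1 metric weights (Scope: Unchanged case only).
WEIGHTS = {
    "AV": {"N": 0.85, "A": 0.62, "L": 0.55, "P": 0.2},   # Attack Vector
    "AC": {"L": 0.77, "H": 0.44},                        # Attack Complexity
    "PR": {"N": 0.85, "L": 0.62, "H": 0.27},             # Privileges Required
    "UI": {"N": 0.85, "R": 0.62},                        # User Interaction
    "C":  {"H": 0.56, "L": 0.22, "N": 0.0},              # Confidentiality
    "I":  {"H": 0.56, "L": 0.22, "N": 0.0},              # Integrity
    "A":  {"H": 0.56, "L": 0.22, "N": 0.0},              # Availability
}

def roundup(x: float) -> float:
    """CVSS 'Roundup': smallest one-decimal value >= x."""
    return math.ceil(x * 10) / 10

def base_score(av, ac, pr, ui, c, i, a):
    """Simplified CVSS v3.1 base score (Scope: Unchanged only)."""
    iss = 1 - (1 - WEIGHTS["C"][c]) * (1 - WEIGHTS["I"][i]) * (1 - WEIGHTS["A"][a])
    impact = 6.42 * iss
    exploitability = (8.22 * WEIGHTS["AV"][av] * WEIGHTS["AC"][ac]
                      * WEIGHTS["PR"][pr] * WEIGHTS["UI"][ui])
    if impact <= 0:
        return 0.0
    return roundup(min(impact + exploitability, 10))

# e.g. a network-reachable, low-complexity flaw with high impact
# across confidentiality, integrity and availability:
print(base_score("N", "L", "N", "N", "H", "H", "H"))  # prints 9.8
```

A patch response policy could then be expressed in terms of these ratings, e.g. scores of 9.0+ trigger an emergency hotfix while lower scores wait for a scheduled update (the exact thresholds being a business decision).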

If I owned a vulnerable device

Sadly a sledgehammer is out as the first port of call, but at the very least I’d remove or disable the vulnerable app, as that would immediately eliminate the vulnerability. Since HTC did not support this response, it would likely require significant technical ability on the user’s part to gain root access to the phone and delete the APK, so it’s not realistically a recommendation most HTC customers could follow. The alternative is to reset the device, removing any non-system apps until a fix is available; third-party apps installed after the exploit was made public could be malicious regardless of the apparent trustworthiness of the developer. In the absence of any protective mechanism, the only response HTC customers can take is to avoid the risk by not installing any new apps. Users who have concerns about their security, or who have an inherently high-risk profile, would be better served by moving to a more secure handset.

My Verdict

HTC have not acted in the best interests of their customers: they failed to design a system that would pass a reasonable standard of security, and they failed to respond adequately to a legitimate report of a major security issue, both in advance of its public release and subsequently. While it is certainly true that no system can be fully secure, their design was careless, and claims of taking security seriously while ignoring an issue of this magnitude ring false. The only message they’ll hear is that of customers voting with their feet and moving to a handset with a proven track record of support.

The problem is wider than this single example from HTC; other vendors have similarly introduced vulnerabilities on their Android handsets via their custom bundled code. Vendors rely on this code to distinguish their brand or provide feature differentiation, but it is not subject to the same degree of scrutiny as the core Android OS and can therefore introduce vulnerabilities. This is understandable at least to the degree that these handsets are often sold with minimal overheads to appeal to the price-conscious consumer, and therefore receive threadbare development and support. Google’s own Pixel phone does not suffer from this issue, as it does not run a non-standard UI. In addition, Google runs a bug bounty program and has already paid bounties to researchers who found issues with the Pixel. The Pixel therefore presents a more solid security choice while remaining on the Android platform.