Blog: Internet Of Things

Could ‘Right to Repair’ heighten the risk for IoT and smart devices?

Ken Munro 12 Jul 2017

The right to tinker with and repair devices – or the ‘Right to Repair’ – could be enshrined in EU law if the report ‘on a longer lifetime for products: benefits for consumers and companies’ is passed into legislation. It’s part of a movement that’s gaining ground stateside too, where bills have been proposed in 12 states as farmers lobby for the right to service their own John Deere tractors.

The Right to Repair is great news for those of us who resent having to take our phones and laptops to manufacturer-endorsed repair shops and pay extortionate sums for a screen replacement, for example. It’s also good news for the environment, which is why the report has won the support of Greenpeace, as it aims to prevent the practice of ‘planned obsolescence’, whereby devices are designed to last only as long as the warranty period.

But it’s not all good news

But what bearing could the right to repair have on the security and safety of products around us? How does it impact connected cars, smart phones, the Internet of Things, and anything around us that runs software or firmware?

Could the right to repair actually make security and safety worse?

In the automotive sector, the mood directly contradicts that of the regulators, with manufacturers claiming there is a very definite need to clamp down on access to in-car systems. They argue more control is needed and that the current practice of allowing independent garages to service vehicles, protected under the EU Block Exemption Regulation (BER), is not viable in the future as we move towards connected cars.

The manufacturers claim that to maintain security and safety, they need strict controls around the firmware that runs on the 50+ ECUs in your vehicle.

Just because they’re money-minded, it doesn’t make them wrong

From a sceptical perspective, this sounds like the manufacturers are trying to go back to the pre-2003 days, when independent servicing of your car would void the manufacturer warranty. But there is a degree of truth in their claims.

From the safety side, with so many complex devices interacting to make life-or-death decisions tens of times a second, it’s easy to see why manufacturers don’t want third parties modifying their ECUs. Manufacturers are already struggling with code quality issues – see the Toyota unintended acceleration findings – and compounding this with third-party code or aftermarket ECUs makes things even more challenging.

Imagine if a serious car accident involved a vehicle with custom ECU code. How do you determine which ECU caused the problem? Who is liable? Investigating accidents could suddenly become a lot more complex.

From the security side, several high-profile hacks – including the Miller and Valasek remote Jeep takeover and the Tencent Tesla hack – have involved delivering malicious firmware updates. How do you protect a vehicle from malicious firmware updates whilst allowing the owner to make changes?

The manufacturers also have intellectual property concerns. Autonomous vehicles require significant investment to develop, and the output is considered a trade secret. The real-time nature of self-driving vehicles means that this sensitive code must be inside the vehicle, potentially allowing an attacker to access it. How do you allow users to update the firmware without leaking all the details to competitors?

It is both naïve and foolhardy to assume that you will develop a system entirely free of bugs, free of vulnerabilities, and with all the functionality your customers need. For this reason, it is essential that you have a means to update firmware.

Over-The-Air updates will handle everything, right?

When cars only had a few significant ECUs, it was perfectly valid to perform these upgrades during servicing, by swapping the entire unit, changing a ROM chip, or connecting a programmer. As cars have become more and more complex, and the viable threats against them have multiplied, it has become necessary to deploy over-the-air or customer-deployed firmware updates. Without this, updates cannot be delivered in a timely manner.

These updates must be delivered securely and without impacting user experience. Secure boot, signed firmware updates, encrypted communication channels, code signing, and dual firmware images can all be used. Whilst it is conceptually easy, implementing this across a fleet of vehicles is extremely challenging.
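
To make the first couple of those measures concrete, here is a minimal sketch of how a signed firmware update might be verified before being staged into the inactive slot of a dual-image layout. It is an illustration only, assuming an Ed25519 vendor key; the key value, file names and the staging helpers (stage_to_inactive_slot, mark_slot_pending_boot) are hypothetical, not any manufacturer’s actual update mechanism.

```python
# Minimal sketch: verify a detached signature over a firmware image before
# staging it into the inactive slot of a dual-image (A/B) layout.
# Assumptions: an Ed25519 vendor public key is baked into the updater and a
# detached signature ships alongside the image. The key value and staging
# helpers below are placeholders, not a real vendor's mechanism.

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PublicKey

VENDOR_PUBLIC_KEY = b"\x00" * 32  # placeholder: the real 32-byte vendor key


def stage_to_inactive_slot(image: bytes) -> None:
    """Hypothetical helper: write the image to the unused firmware bank."""
    raise NotImplementedError


def mark_slot_pending_boot() -> None:
    """Hypothetical helper: set a boot-once flag so the bootloader falls
    back to the known-good slot if the new image fails to come up."""
    raise NotImplementedError


def verify_and_stage(image_path: str, sig_path: str) -> bool:
    """Stage the image only if it carries a valid vendor signature."""
    with open(image_path, "rb") as f:
        image = f.read()
    with open(sig_path, "rb") as f:
        signature = f.read()

    key = Ed25519PublicKey.from_public_bytes(VENDOR_PUBLIC_KEY)
    try:
        key.verify(signature, image)
    except InvalidSignature:
        # Never stage an unsigned or tampered image.
        return False

    stage_to_inactive_slot(image)
    mark_slot_pending_boot()
    return True
```

Even something this simple hides hard problems at fleet scale: key storage and rotation, rollback protection, and recovering vehicles whose update was interrupted mid-write.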

For the most part, manufacturers are neither ready nor able to provide this level of firmware security. We regularly see firmware update mechanisms that introduce their own critical vulnerabilities. Complicating this with methods to allow users or independent garages to update firmware is likely to result in further weakened security in the real world.

And it isn’t just cars – washing machines, fridges, kettles and cuddly toys will all rely on secure firmware updates. Whilst these may not have as many finely balanced parts as a car, many of the same concerns exist.

Some in security say that “once you have physical access, it’s game over”. Whilst this implies that you should keep attackers away from your devices in the first place, it doesn’t mean that we should abandon all physical protections. We have come to expect our devices to withstand a degree of local attack – after all, you don’t expect to come back to your locked phone after 5 minutes and find that someone has installed malware on it.

Secure firmware update methods

As it stands, Android phones are probably the best example of something that allows secure firmware updates whilst still allowing certain users to load their own software. Many devices allow this to happen by “unlocking the bootloader”, where a PKI-based challenge/response is performed between the phone and manufacturer. A challenge is read from the phone, provided to the manufacturer (generally via a website) and then a response is entered back into the phone. This process should also destroy the keys used to access encrypted storage on the device. Once this is done, custom firmware can be loaded onto the device. This provides a degree of protection to the user.
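
The flow described above could look something like the sketch below: the phone emits a challenge bound to its identity, the manufacturer returns a signed response, and the phone verifies it, wipes its storage keys and only then unlocks. This is a simplified illustration of the general idea, not any vendor’s actual protocol; the manufacturer key and the wipe_storage_keys/set_bootloader_unlocked helpers are hypothetical.

```python
# Simplified illustration of a challenge/response bootloader unlock:
# the device issues a challenge, the manufacturer signs it, and the device
# verifies the response before wiping storage keys and unlocking.
# Not any vendor's real protocol; the key and helpers are placeholders.

import os

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PublicKey

MANUFACTURER_PUBLIC_KEY = b"\x00" * 32  # placeholder 32-byte Ed25519 key


def wipe_storage_keys() -> None:
    """Hypothetical helper: destroy the keys protecting encrypted storage."""
    raise NotImplementedError


def set_bootloader_unlocked() -> None:
    """Hypothetical helper: persist the unlocked state."""
    raise NotImplementedError


def generate_challenge(device_id: bytes) -> bytes:
    """Bind a fresh random nonce to this device, so a signed response
    cannot be replayed against a different handset."""
    return device_id + os.urandom(16)


def apply_unlock(challenge: bytes, response_signature: bytes) -> bool:
    """Unlock only if the manufacturer signed this exact challenge."""
    key = Ed25519PublicKey.from_public_bytes(MANUFACTURER_PUBLIC_KEY)
    try:
        key.verify(response_signature, challenge)
    except InvalidSignature:
        return False

    wipe_storage_keys()
    set_bootloader_unlocked()
    return True
```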

Implementing a challenge/response mechanism like this across all devices – especially those without any meaningful user interface – isn’t easy. Convincing manufacturers to do this – or similar – is going to be an uphill struggle.

Apple have taken another stance. They forbid any third-party firmware. Exploits must be used to allow changes to be made, which makes the process more difficult than for Android phones. Again, this provides users with a level of assurance that their devices haven’t been tampered with. But they often come under fire for their policies, both in terms of firmware and obtaining parts and documentation.

One option could be to provide a physical switch to put the device into a state where the bootloader is unlocked. This would keep those who want to repair happy, but it would introduce significant security issues. Compromise in the supply chain would be much easier, as would local compromise once the device is installed.

A major problem with a trivial unlock is that consumers simply won’t understand the security implications of doing so. We see Android phones unlocked in order to access app stores offering free cracked apps, and then the user wonders why they’ve been compromised by malware from those stores. How do we distinguish between those who can safely unlock a phone for the purpose of repair or tinkering and those who can’t?

…and another thing

There are further complications. Aspects of the right to repair concern access to documentation (such as schematics) and the ability to determine how products work. Whilst this may be possible for mechanical parts and components, how does this work for firmware?

The bulk of firmware consists of compiled binaries. Whilst these can be reverse engineered to determine how they operate, this can be an extremely time-consuming process. Fixing a bug in firmware with access only to the compiled code could be virtually impossible. What is the solution to this?

Should the source and toolchain for all firmware be published for consumers? Maybe in an ideal world, but the clear majority of vendors aren’t willing to open up to this level.

Is this legislation already redundant?

What is perhaps most concerning about all this is that the proposed EU legislation does not mention firmware, software, or security at all. Has there been no consideration of them at all? Are we about to pass into law something that is already outdated? Further, does it conflict with EU plans to mandate security for IoT?