The global automotive industry is changing, and it will never go back to the way it was. Organizations that can't keep up with the rapid pace of evolution risk being left behind as their more nimble, more advanced counterparts race ahead.
With the June 2020 release of new automotive cybersecurity regulations from the UNECE's WP.29 forum, we saw a glimpse of the shape this new world is taking from a regulatory standpoint. The line between car and computer becomes more blurred with each passing year, making it all the more important for automotive organizations to understand not only how to remain compliant with current legislation, but how to stay ahead of future changes, too.
But security is often seen as a burden - a box to tick in order to progress with more value-adding activities. We believe security is more than a burden: it can also be a source of revenue.
Under the WP.29's new legislation, automotive manufacturers are being asked to recognize that their vehicles are as much computers as cars - which means the door to these 'computers' is open to new and potentially dangerous risks.
The June ruling also signifies a shift in the industry: the simple recognition that cars are now heavily at risk of cyber attack and related new forms of tampering suggests that more regulations will follow in the future. We've seen that with data protection regulations, and we expect to see it with automotive as well.
At the time of writing, the WP.29's cybersecurity rules impact only class M and N vehicles, although the group has signaled that classes O, R, S and T are under consideration.
The released document outlines provisions for both organizations and their vehicles.
As you can see, the changes are both extensive and specific - and most importantly, they apply to each vehicle's entire lifespan. This puts a whole new onus on manufacturers to protect their vehicles for years, potentially decades, after they roll out of the factory.
While organizations will likely have to scramble to ensure they are compliant with these new regulations, there are a host of other technological - and indeed ethical - considerations to factor in that will shape future regulations, too.
Again, those who aren't able to keep up risk being left behind. Everything we discuss below is a conversation that leaders must be having right now - and not just within their organizations, but with governing bodies, subject matter experts, customers and even competitors.
Artificial intelligence, machine learning, cloud computing, big data: all are increasingly involved in the workings of new vehicles (particularly autonomous vehicles), and they pose not just cybersecurity concerns but privacy concerns as well.
The more cameras and microphones a vehicle contains, and the more information it stores (e.g. for personalizing the car, or for connecting accounts such as Google or Spotify), the greater the privacy risk - and the temptation for bad actors.
Customers are already wary of their private information being misused, which means that on top of the new WP.29 regulations there are also considerations around the ethics of collecting and storing data: how much do you collect, how do you use it, and where is it stored?
Restrictions are only getting tighter.
Cars are technological organisms containing more than 100 million lines of code. From a security point of view, the issue is that if someone gains access to the car's data, or finds a way to manipulate how its systems work, they could potentially alter the behavior - and safety - of the vehicle.
Take Tesla as a small example. Earlier in 2020, researchers found that a Tesla driving autonomously could be tricked into changing its behavior (stopping suddenly, swerving, etc.) by placing fake road signs or fake obstacles, such as people, in the car's path - for instance, by displaying a fake road sign on an electronic billboard, or by using a drone to project a fake image onto a wall.
Then, of course, there are the ethics of artificial intelligence, which academics have been debating for some time - without reaching any firm conclusions.
For example, which is more ethical: a self-driving car that deems one accident to be less of a risk than another and therefore chooses that accident, or a human driver who has lost control and has no choice in the matter (making it a 'true' accident)? In the former case, a team of engineers decided to take one life over another.
Some governments, such as Germany's, have tried to establish rules on this matter, but the debate rages on.
Finally, there's liability. In the event of an accident involving an autonomous car, who is liable? Who is at fault?
To put this question into context, consider third-party insurance. Around the world, third-party insurance typically requires there to be an "at-fault driver". But if the car is autonomous and causes an accident, there is no driver. So is the manufacturer at fault? A software developer?
Again, regulations are piecemeal, but will more than likely tighten over time.
It's understandable that all of these new cybersecurity regulations and ethical talking points may seem like a burden, perhaps even a guaranteed way to lose revenue. But this is a losing attitude - in reality, these regulations can be viewed as a unique opportunity to get ahead.
Leaders must view security not as something in opposition to the business, but as an asset that requires investment. And their security teams must embody this optimistic spirit. Security may take time and, yes, money to establish - especially if an organization is already quite far behind - but once in place it can become a genuine source of revenue.
Consider the marketing element. Customers are sensitive about their personal information, and many are wary of technology in general. Being able to say that your vehicles are not only compliant but go above and beyond regulations to ensure maximum security, privacy and protection is a powerful marketing tool for selling new vehicles (especially if competitors can't say the same).
Then consider the cost-saving element. Security as an afterthought can cost - big. The longer you wait in the development pipeline to find and deal with errors, the more expensive those errors become: consider product recalls, reputational harm, or simply the time spent paying teams to comb back through line after line of code looking for bugs. Integrating security into your processes - developing new systems with security in mind and checking them in real time - helps you spot errors faster and therefore deal with them sooner, saving money. This requires a change in development mindset, and it may even require a team restructure, but it's an investment that pays off in the end.
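To make the idea of "checking in real time" concrete, here is a minimal, purely illustrative sketch of a shift-left style check: a small script that scans source code for hardcoded credentials before the code ever reaches the build pipeline, so the error is caught at the cheapest possible moment. The patterns and function name below are our own hypothetical example, not part of the WP.29 rules or any specific tool.

```python
import re

# Hypothetical example patterns for credentials that should never be
# committed to a vehicle software repository.
SECRET_PATTERNS = [
    re.compile(r"password\s*=\s*['\"][^'\"]+['\"]", re.IGNORECASE),
    re.compile(r"api[_-]?key\s*=\s*['\"][^'\"]+['\"]", re.IGNORECASE),
]

def find_hardcoded_secrets(source: str) -> list[str]:
    """Return the offending lines so a developer can fix them immediately,
    rather than after a recall or an audit."""
    findings = []
    for lineno, line in enumerate(source.splitlines(), start=1):
        if any(pattern.search(line) for pattern in SECRET_PATTERNS):
            findings.append(f"line {lineno}: {line.strip()}")
    return findings

sample = 'timeout = 30\npassword = "hunter2"\n'
print(find_hardcoded_secrets(sample))  # → ['line 2: password = "hunter2"']
```

In practice a check like this would run automatically on every commit (for example, as a pre-commit hook or a pipeline gate), which is exactly the "security built into the process" posture described above: the mistake is surfaced seconds after it is made, not months later.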
You don't have to go through this process alone.
We have decades of experience in cybersecurity, and our teams are experienced in a variety of fields, including automotive. Contact us today for a free maturity consultation: we'll talk through your requirements and your current state in relation to the new regulations, and work together to help your company not only achieve compliance, but also win the competitive advantage that security can offer.