November 23, 2024

Hot Custom Car (software?)

I’ve found Tim’s bits on life post-driving interesting. I’ve sometimes got a one-track mind, though- so what I really want to know is whether I’ll be able to hack on that self-driving car. I mentioned this to Tim, and he said he wasn’t sure either- so here is my crack at it.

We’re not very good at making choices like this. Historically, liability constrained software development at large institutions (the airlines had a lot of reasons not to let people hack on their airplanes), and benign neglect was sufficient to regulate hacking of personal software- if you hacked your PC or toaster, no one cared, because it had no impact (a form of Lessig’s regulation by architecture). The net result was that we didn’t need to regulate software very much, we got lots of innovation from individual developers, and we stayed bad at making choices like ‘how should we regulate people’s ability to hack?’

Individuals are now beginning to own hackable devices that can also harm the neighbors, though, so the space in between large institution and isolated hacker is filling up. For example, the FCC regulates your ability to modify your own wireless devices, so that you can’t interfere with other people’s spectrum. And some of Prof. Jonathan Zittrain’s analysis suggests that we might even want to regulate PCs, since they can now frequently be vectors for spam and viruses. Tim and I are normally fairly anti-regulation, and pro-open source, but even we are aware that cars running around all over the place driven by potentially untested code might also fit in this gap- and be more worthy of regulation.

So what should happen? Should we be able to hack our cars (more than we already do), and if so, under what conditions?

It’d help if we could better measure the risks and benefits involved. Unfortunately, probably because we regulate software so rarely, our metrics for assessing the risks and benefits of software development aren’t very good. One such metric is Prof. Zittrain’s ‘generativity’; Dan Wallach’s proposal to measure the ‘O(n)’ of potential system damage is another. Neither is a perfect fit here, but that only confirms that we need more such tools in our software policy toolkit.

This lack of tools shouldn’t stop us from some basic, common-sense analysis, though. On the pro side, the standard arguments for open source apply, though perhaps not as strongly as usual, since many casual hackers might be discouraged at the thought of hacking their own car. We probably would want car manufacturers to pool their safety expertise, which would be facilitated by openness. Finally, we might also want open code for auditing reasons- with millions of lives on the line, this seems like a textbook case for wanting ‘many eyes’ to take a look at the code.

If we accept these arguments on the ‘pro’ hacking side, what then? First, we could require that the car manufacturers use test-driven development, and share those tests with the public- perhaps even allowing the public to add new tests. This would help avoid serious safety problems in the ‘original’ code, and home hackers might be blocked from loading new code into their cars unless the code was certified to have passed the tests. Second, we could take the consequences very seriously- ‘driving’ with bad code could be treated similarly to DUI. Third, we could make sure that the safety fallbacks (emergency brake systems, etc.) are in separate, redundant (and perhaps only mechanical?) unhackable systems. Having such systems is going to be good engineering whether the code is open or not, and making them unhackable might be a good compromise. (Serious engineers, instead of compsci BAs now in law school, should feel free to suggest other techniques in the comments.)
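
As a rough sketch of that first idea, here is what a shared, public test gate might look like in Python. Everything here is invented for illustration- the controller interface, the ‘FULL_BRAKE’ command, and the crude stopping-distance rule are assumptions, not anyone’s real spec:

    import unittest

    class BrakingContractTests(unittest.TestCase):
        """Public tests any replacement braking module must pass before loading."""

        def setUp(self):
            # Stand-in for the module a home hacker wants to load.
            self.controller = CandidateController()

        def test_full_brake_when_obstacle_is_close(self):
            # At 30 m/s with an obstacle 5 m ahead, demand maximum braking.
            command = self.controller.decide(speed_mps=30.0, obstacle_m=5.0)
            self.assertEqual(command, "FULL_BRAKE")

        def test_no_phantom_braking_on_clear_road(self):
            command = self.controller.decide(speed_mps=30.0, obstacle_m=500.0)
            self.assertNotEqual(command, "FULL_BRAKE")

    class CandidateController:
        """A trivial stub so the sketch runs; a real module would replace this."""
        def decide(self, speed_mps, obstacle_m):
            # Brake hard whenever the gap is inside a crude stopping distance.
            return "FULL_BRAKE" if obstacle_m < speed_mps * 0.5 else "CRUISE"

    if __name__ == "__main__":
        # The car's loader would accept the module only if this suite exits cleanly.
        unittest.main()

Who maintains that suite- the manufacturer, a regulator, or the public- is the real policy question.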

Bottom line? First, we don’t really know- we just have pretty poor analytical tools for this type of problem. But if we take a stab at it, we can see that there are some potential solutions that might be able to harness the innovation and generativity of open source in our cars without significantly compromising our safety. At least, not compromising it any more than the already crazy core idea 🙂

[picture is ‘Car Show 2’, by starmist1, used under the CC-BY license.]

Comments

  1. do you really want cars on the road that are completely drive by wire?
    we have throttle by wire. if it fails, you can’t accelerate; you’re stuck at idle.
    what happens when the brake by wire fails? do the brakes just not work, or do they automatically apply?
    what happens when the steer by wire fails? does it wind up turning in one direction, or does it just go in a straight line?
    by wire on critical systems on cars is not a good idea.
    yes, airplanes have most everything by wire, but they require a lot of maintenance. few people maintain their cars now; probably 70 percent of the cars on the road right now are low on oil.
    by wire steering & brakes on a car i own, that is not going to happen.
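
    One conventional answer to these failure-mode questions is a fail-safe default: when command signals stop arriving, the actuator falls back to a predefined safe state instead of holding its last order. A toy Python sketch, with the 100 ms timeout and the ‘wheels straight’ default purely illustrative:

        import time

        SIGNAL_TIMEOUT_S = 0.1  # illustrative: treat 100 ms of silence as a failure

        class SteerByWire:
            """Falls back to 'wheels straight' if steering commands stop arriving."""

            def __init__(self):
                self.last_command = time.monotonic()
                self.angle_deg = 0.0

            def command(self, angle_deg):
                self.last_command = time.monotonic()
                self.angle_deg = angle_deg

            def actuate(self):
                if time.monotonic() - self.last_command > SIGNAL_TIMEOUT_S:
                    return 0.0  # fail safe: straight ahead, never hard over
                return self.angle_deg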

  2. 1. I think the code in cars should be modular. Radio, GPS, etc. should be separate.
    2. I think the interactions between modules should be documented with APIs and diagrams. For example, if you have an HVAC module you may hack it, but you must be aware that it should respond to a signal to warm up the engine when it is freezing out there before you can start the engine. Critical signals must be clearly labeled (engine failure, failure to deploy airbags, etc.).
    3. Critical safety systems should require signed code (TPM?). After all, it is the manufacturer’s liability we are talking about here.
    4. Tamper-proof change logs. After all, if you crash because your mod did not allow the car to stop in time, you should not be able to sue the manufacturer.
    5. It would be nice if the manufacturers could have a “Virtual Car” where you could test your code before entering it into your car. I envision a system where you can submit code to the virtual car with the provision that test results when you plug it into the real car will be fed back to the manufacturer. If you can create a module that saves 10% gas, you should be able to sell it, or the manufacturer should know about it and contact you if they want to include it in the next car update.
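
    A sketch of what item 2’s documented module boundary might look like in Python- the HVAC example and the warm-up signal follow the comment, but the API and signal names themselves are invented:

        from abc import ABC, abstractmethod

        # Hypothetical critical signals the published spec says no module may ignore.
        CRITICAL_SIGNALS = {"ENGINE_WARMUP_REQUIRED", "ENGINE_FAILURE", "AIRBAG_FAULT"}

        class HvacModule(ABC):
            """Documented API: hack the implementation, keep the contract."""

            @abstractmethod
            def set_cabin_temperature(self, celsius: float) -> None: ...

            def on_signal(self, signal: str) -> None:
                # Custom modules may not silently drop labeled critical signals.
                if signal in CRITICAL_SIGNALS:
                    self.handle_critical(signal)

            @abstractmethod
            def handle_critical(self, signal: str) -> None:
                """E.g. warm the engine block before start when it is freezing."""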

  3. It seems to me that the easiest way to do this would be to isolate the code that can potentially kill people (as opposed to the bits that merely control the radio etc). It can still be open source, and the tests should be freely available, but cars should refuse to run it unless it’s signed with a key owned by an independent regulator.

    Ideally, it should be possible for anyone to get their modifications signed – in fact, it ought to be possible to automate the process entirely. You submit your custom binary, the service runs all the standard tests against it (and they’d need to be a very thorough set of tests indeed), and if it passes all of them, signs it for you. When new tests are developed, the key can be updated such that new cars will refuse to run old code until it’s certified against the new set of tests.
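
    A minimal Python sketch of that automated test-and-sign service; a real regulator would use asymmetric signatures, and the shared-secret HMAC here is only a stand-in to keep the example short:

        import hashlib
        import hmac

        REGULATOR_KEY = b"held-only-by-the-independent-regulator"

        def certify(binary: bytes, run_test_suite) -> str | None:
            """Sign a submitted build only if the full standard test suite passes."""
            if not run_test_suite(binary):
                return None  # uncertified: cars will refuse to run it
            return hmac.new(REGULATOR_KEY, binary, hashlib.sha256).hexdigest()

        def car_will_load(binary: bytes, signature: str) -> bool:
            expected = hmac.new(REGULATOR_KEY, binary, hashlib.sha256).hexdigest()
            return hmac.compare_digest(expected, signature)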

  4. Bryan Feir says

    I’m reminded of a bit in an SF story in which vehicular homicide while drunk was considered first-degree murder. The line of thought went as follows:
    – All cars of this time had autopilots and safety features to prevent somebody from driving while impaired.
    – Disabling those safety features was difficult, requiring a fair bit of detail work.
    – Having demonstrated one was capable of reasoning by disabling the system, one had to then deliberately impair that capability by drinking.
    – Therefore, any ‘accidents’ while manually driving impaired were considered premeditated.

  5. I can imagine a compartmentalized model, whereby parts of the car system are open and relatively hackable, but other parts — the safety-critical parts — are not.

    This is similar to how many PC hardware platforms now contain a TPM, which can be used to perform hack-/reverse-engineering-resistant operations while the rest of the OS remains hackable.

    Don’t some cars already contain hack-resistant speed/acceleration-limiting hardware? I seem to recall hearing about something along those lines…
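
    That compartmentalized model might reduce to something as simple as a partition map checked at boot- the partition names and the verify_with_tpm hook in this Python sketch are hypothetical:

        # Hypothetical partition map for a compartmentalized car computer.
        PARTITIONS = {
            "radio":      {"hackable": True},
            "navigation": {"hackable": True},
            "braking":    {"hackable": False},  # gated on TPM-backed verification
            "steering":   {"hackable": False},
        }

        def may_boot(partition: str, image: bytes, verify_with_tpm) -> bool:
            """Load open modules freely; gate safety-critical ones on verification."""
            if PARTITIONS[partition]["hackable"]:
                return True  # owner's code, owner's choice
            return verify_with_tpm(image)  # e.g. measured boot against a sealed key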

  6. Am I allowed to build or modify a car and drive it on the road at the moment? I expect yes, but even if not, there is nothing physically preventing me from doing it. It is legislation that dictates I can’t remove my brake discs and then drive on the road. In fact, in the UK, the requirement is that cars must periodically pass an MOT to assess roadworthiness:
    http://www.direct.gov.uk/en/Motoring/OwningAVehicle/Mot/DG_4022112
    I assume other countries have similar requirements.

    The only difference with code is that the barriers to entry are potentially much lower for an individual.

    I think a good solution would be to produce a standard test suite (and probably a standard car API), and legislate that every piece of code running on a car conforms to said test suite. Essentially extending the requirements of the MOT to the software that drives it.
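
    Extending the MOT analogy, a periodic ‘software MOT’ might check both the inspection date and whether the installed code was certified against the current version of the standard suite; the versioning scheme in this Python sketch is invented:

        from datetime import date

        CURRENT_SUITE_VERSION = 7  # bumped whenever mandatory tests are added

        def passes_software_mot(certified_version: int, last_inspection: date) -> bool:
            """Fail the 'software MOT' if the installed code predates the
            current test suite or the annual inspection window has lapsed."""
            if certified_version < CURRENT_SUITE_VERSION:
                return False  # must re-certify against the new suite
            return (date.today() - last_inspection).days <= 365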