April 24, 2014


Maybe "Open Source" Cars Aren't So Crazy After All

I wrote last week about the case for open source car software and, lo and behold, BMW might be pushing forward with the idea- albeit not in self-driving cars quite yet. ;)

Tangentially, I put “open source” in scare quotes because the car scenario highlights a new but important split in the open source and free software communities. The Open Source Initiative’s open source definition allows use of the term ‘open source’ to describe code which is available and modifiable but not installable on the intended device. Traditionally, the open source community assumed that if source was available, you could make modifications and install those modifications on the targeted hardware. This was a safe assumption, since the hardware in question was almost always a generative, open PC or OS. That is no longer the case- as I mentioned in my original car article, one might want to sign binaries so that not just anyone could hack on their cars, for example. Presumably even open source voting machines would have a similar restriction.
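To make the signing idea concrete, here is a minimal sketch of how a locked-down device might refuse rebuilt firmware even when the source is fully available. The digest allowlist and firmware strings are invented for illustration; real devices use public-key signatures checked in a boot ROM rather than a bare hash list, but the effect on a would-be modifier is the same.

```python
import hashlib

# Hypothetical digests the manufacturer burned into the bootloader.
# Only images whose hash appears here will be allowed to run.
APPROVED_DIGESTS = {
    hashlib.sha256(b"official firmware v1.0").hexdigest(),
}

def bootloader_accepts(image: bytes) -> bool:
    """Return True only if the image matches a manufacturer-approved build."""
    return hashlib.sha256(image).hexdigest() in APPROVED_DIGESTS

official = b"official firmware v1.0"
modified = b"official firmware v1.0 + my patch"

print(bootloader_accepts(official))   # True: the vendor's build boots
print(bootloader_accepts(modified))   # False: your rebuilt binary is rejected
```

The point is that nothing in this check depends on whether the source code was published: the code can meet the OSI definition while the device still rejects every binary you build from it.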

Another example appears to be the new ‘google phone’ (G1 or Android). You can download several gigs of source code now, appropriately licensed, so that the code can be called ‘open source’ under the OSI’s definition. But apparently you can’t yet modify that code and install the modified binaries on your own phone.

The new GPL v3 tries to address this issue by requiring (under certain circumstances) that GPL v3’d code be installable on devices with which it is shipped. But to the best of my knowledge no other license yet requires this, and the v3 is not yet widespread enough to put a serious dent in this trend.

Exactly how ‘open’ such code is remains up for discussion. It meets the official definition, but the inability to actually do much with the code seems likely to limit the growth of constructive community around the software for these types of devices- phones, cars, or otherwise. This issue bears keeping in mind when thinking about openness for source code of closed hardware- you will certainly see ‘open source’ tossed around a lot, but in this context, it may not always mean what you think it does.

Comments

  1. rp says:

    Isn’t this just taking the Linux kernel policy to its logical conclusion? If the people who control actual installation are reasonably responsive, a review layer may well not slow things down too much for a community to develop.

    (And note that “actual installation” is a fuzzy term when you have emulators and jailbreak code; what you really mean is that members of the developer community can’t engage in mass distribution to members of the non-developer user community without passing through the folks who sell the hardware or over-arching software.)

    Now I’m wondering what a BMW emulator would look like.

  2. John Millington says:

    You can do that with Open Source, but it sure ain’t Free Software. So much for charges of pedantry against RMS for making the distinction. Now the distinction matters, and is right in everyone’s face.

  3. Anonymous says:

    Most software openness and compatibility in automobiles has come from manufacturers having their arms twisted (through law); taking your car to the dealer for service is a major profit center.

  4. Anonymous says:

    Merely being able to read the code that runs on my car, even if I can’t modify it, would be useful.

  5. Anonymous says:

    There’s a real benefit to being able to download, read, analyze, and fiddle with the code that runs our devices. At the same time, there’s a very real risk associated with letting the users of those devices install modified versions of the software that controls them.

    Perhaps the best example here is voting machines. Our society will absolutely benefit if we can all have a look at the source code to the software that runs the tools we vote with. Complete transparency here is, IMO, one of two absolute requirements for creating trustworthy electronic voting machines. The other is verifiable nonmodification. It does no good to show people the code if they can’t be sure that the code you show them is the code that’s running on the machines.

    Another great example is the lead-in for this fine article: automobiles. I’m sure we’d all love to be able to tweak the software that runs our cars for better performance, better economy, more features, etc. You might even feel confident enough in your tweaks to drive your tweaked car at highway speeds. But you know that guy who works down the hall from you who never checks for nil and who can’t tell the difference between O(n) and O(n^3)? Would you feel comfortable driving next to his tweaked car at highway speeds?

    • John Millington says:

      “Would you feel comfortable driving next to his tweaked car at highway speeds?”

      I’m not saying that user-modifiable auto software doesn’t come with risks, but we already have those risks right now. That same guy already has the power to put weird stuff into his fuel tank, the power to drive with only one lug nut on each wheel, and he even has the power to forget to change his brake pads when they start to squeak. There are so many ways to negligently maintain a car; what’s one more? You might as well just hold people responsible for the ill effects of what they do, instead of going to a lot of trouble to try to prevent 1% of it.

      • Anonymous says:

        How do you hold a lone well-intentioned programmer responsible for a bug which results in a car crash that costs, say, four lives? Do you take his house, all his money, and his freedom? That seems justified, but also very harsh considering that most of us make simple, honest mistakes on a daily basis. Do you expect his insurance company to pay some enormous settlement? If that happens, you can expect all affordable insurance policies to specifically forbid modifying any software in your vehicle, which gets you back to essentially the same closed system we have now. Do you have the government impose licensing and quality control standards for automotive software developers? I don’t think anyone wants to go there.

        Large companies such as automotive manufacturers have the sorts of resources (time, money, labor, talent, systems) required to ensure a level of safety that individuals generally cannot afford. And since they sell millions of vehicles, they have vastly greater incentive to make sure their software works safely. They also have the ability to amortize the cost of mission-critical software over millions of vehicles so that the cost per vehicle isn’t prohibitive. And if something does go wrong, they also have very deep pockets.

        What we’re really talking about here is whether or not there’s a place for regulation in the software industry, and who if anyone should provide that regulation. Closed-source software is one form of manufacturer-imposed regulation. Non-modification of devices is another. The point of regulation is usually to encourage adherence to some standard of quality and to make it possible for customers/consumers/users to trust a product. As far as I can tell, open-source software and non-modifiable devices both promote that goal by making it possible to know exactly what software a device is running. Closed-source software and user-modifiable devices work against that goal by making it impossible to know what software a device is running.

        I don’t care if you reprogram the audio system in your vehicle. But at a time when manufacturers are experimenting with drive-by-wire, crash avoidance, active stabilization, etc., I don’t think it’s in the driving public’s interest for individuals to be able to customize their own (or someone else’s!) vehicle’s control systems.

        • Anonymous says:

          Maybe there should be a “Good Samaritan” law for cases where a programmer works on a car’s software without being financially rewarded?

  6. Dana Cline says:

    I doubt this will ever happen, but I want the ability to program how my car’s UI behaves. What gauges, where, and what style. I want to program alarms (speed, gas level, temp, oil pressure, etc). I want the ability to program all the buttons.

    Finally, I want this information stored on a USB stick so I can move it from car to car…

  7. Andrew says:

    This is largely an issue about hardware and operating systems.
    And Linux is never going to convert to GPLv3.
    So most of this conversation has no short or medium term relevance.

    Don’t get me wrong, I am a card-carrying FSF member, because of the political maneuvering that RMS and the FSF do to make sure that both Free Software and Open Source software are possible and not handicapped by politics.

    I suspect that Android itself is agnostic about the signing of binaries, but the manufacturer has required that binaries be signed.

  8. Richard says:

    Clearly those people who race saloon cars on the track will want the ability to modify their own car’s code and should not be prevented from doing so by manufacturer-imposed protection measures.

    Equally clearly these cars will not be road legal (or insurable for road use), in exactly the same way as race-modified cars aren’t today.

    No one physically stops me from using my personal lathe and milling machine to make components for the brake system of my car – although there may be legal issues if I use it on the road.

    Mods to the brake system are potentially as dangerous as mods to the software system so the situation should be the same:

    There should be no physical constraints on loading software – there should be no legal constraints on using (testing) such a vehicle in private – but there should be legal constraints on using such a vehicle in public.

    I must admit to being somewhat concerned about the assumptions made by many participants in this debate - the assumption that anything that is new and “could” be dangerous should be illegal or the subject of DRM-like measures by default.

    Historically this has NEVER been the case (or most of the technological advances of the last 200 years would not have happened).

    One of my hobbies is the building and flying of model aircraft powered by homebuilt model gas turbines with (in my case) a homebuilt and programmed computer engine control system.
    In the wrong hands and in the wrong place these could be just as dangerous as a software modified car – but of course they are not legally constrained by default.