September 24, 2018

Thoughts on California’s Proposed Connected Device Privacy Bill (SB-327)

This post was authored by Noah Apthorpe.

On September 6, 2018, the California Legislature presented draft legislation to Governor Brown regarding security and authentication of Internet-connected devices. This legislation would extend California’s existing reasonable data security requirement—which already applies to online services—to Internet-connected devices.  

The intention of this legislation to prevent default passwords and other egregious authentication flaws is admirable, especially given the extent of documented security vulnerabilities in Internet of Things (IoT) devices. Many such vulnerabilities, including default passwords and cleartext data transmissions (e.g., in IoT toys and medical devices discovered by researchers at Princeton), stem from lazy developer practices resulting in devices with minimal (or nonexistent) security or privacy protections. Such flaws could be addressed with well-known best practices that would not place an excessive burden on device manufacturers. With this context in mind, we applaud California’s effort to mandate minimal security and privacy protections for connected devices.

Unfortunately, as critics have pointed out, the wording of the proposed legislation is imprecise, especially regarding “reasonable security features.” Rather than reiterate this criticism, we point out some additional technical limitations of the proposed legislation, focusing on cases where the current language does not properly address security flaws as intended. We hope that these examples will help inform future improvements to this or other IoT security and privacy legislation.

1798.91.04.b.1: “The preprogrammed password is unique to each device manufactured.”
Mandating unique passwords for each device still leaves room for passwords that are easily guessable. For example, a manufacturer could assign consecutive integers as passwords to its devices and be in compliance, yet an attacker could easily enumerate such passwords. Related problems have already occurred. In 2015, TP-LINK routers were shipped with unique WiFi passwords derived from the hardware (MAC) addresses of each device, making it trivial for anyone who observed traffic from one of these routers (and thus its broadcast MAC address) to derive the WiFi password. Much research has gone into generating secure passwords, and its takeaways should be incorporated into any default-password prevention law. Ultimately, additional criteria on preprogrammed unique passwords are needed for this bill to provide the intended security benefits.
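To make the contrast concrete, here is a minimal Python sketch (our illustration, not drawn from the bill or from any manufacturer’s actual provisioning code) of a per-device password that is unique yet predictable versus one that is unique and cryptographically random:

```python
import secrets
import string

def weak_password_from_mac(mac: str) -> str:
    # Predictable scheme in the spirit of the TP-LINK flaw: the "unique"
    # password is just the last 8 hex digits of the MAC address, which the
    # device broadcasts to anyone in radio range.
    return mac.replace(":", "").lower()[-8:]

def strong_factory_password(length: int = 16) -> str:
    # Cryptographically random per-device password, generated at the factory
    # and printed on the device label; unique and not derivable from anything
    # an attacker can observe over the network.
    alphabet = string.ascii_letters + string.digits
    return "".join(secrets.choice(alphabet) for _ in range(length))

if __name__ == "__main__":
    mac = "50:C7:BF:12:34:56"            # hypothetical device MAC address
    print(weak_password_from_mac(mac))   # recoverable by any nearby observer
    print(strong_factory_password())     # requires physical access to the label
```

Both schemes technically satisfy “unique to each device manufactured,” which is exactly why additional criteria on how preprogrammed passwords are generated are needed.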

1798.91.04.b.2: “The device contains a security feature that requires a user to generate a new means of authentication before access is granted to the device for the first time.”
This alternative requirement leaves open the possibility that devices or device features that users never access will not receive unique, secure passwords or other new means of authentication. For example, many users never access the administrative interface of their home router (which typically has a different authentication method than the WiFi network itself). If the first login attempt comes from an attacker, the attacker, rather than the legitimate owner, could end up setting the new authentication credentials.

Additionally, this requirement does not address the potential for devices to employ otherwise insecure authentication systems that still generate new credentials upon first access. As an analogy, just because an Internet user creates a new password for each online account does not mean that each of these passwords is necessarily secure. Again, additional criteria on the authentication system are needed.

1798.91.05.b: “‘Connected device’ means any device, or other physical object that is capable of connecting to the Internet, directly or indirectly, and that is assigned an Internet Protocol address or Bluetooth address.”
This extends the purview of the proposed legislation to PCs, laptops, smartphones, servers, and any other computing device that can access the Internet, even though the law is being promoted specifically as a protection for IoT devices. While we are in favor of additional security and privacy protections for all networked devices, this broad scope may make it harder for future versions of the legislation to add more specific requirements that apply only to IoT, mobile, or other subcategories of devices.

1798.91.06.a: “This title shall not be construed to impose any duty upon the manufacturer of a connected device related to unaffiliated third-party software or applications that a user chooses to add to a connected device.”
This leaves unaddressed the complex ecosystem of third-party software that is incorporated into IoT and other Internet-connected devices before they reach the user. For example, how does this law apply to Android smartphones, for which the hardware is made by one manufacturer, the operating system by another (Google), and pre-loaded mobile applications by still other companies? The hardware manufacturer is unlikely to implement any authentication visible to the user. Instead, the operating system provides authentication for user access to the phone (usually via a PIN or a fingerprint), while individual apps may have their own authentication measures. Future versions of the bill should clearly specify which criteria apply to the wide range of third-party operating systems, applications, and software libraries that provide core security, privacy, and authentication features for many devices.

Final Thoughts
Legal requirements that connected devices avoid well-known and easily fixed security flaws, such as default passwords, are long overdue. Yet, such legislation must recognize the complexity of cybersecurity issues. The above examples demonstrate the type of technical “gotchas” that can hide in well-meaning regulatory language. As California SB-327 or related legislation proceeds, we hope that legislators will consult with academic or industry researchers who have spent considerable effort developing, testing, and refining security and privacy solutions in a wide variety of technology contexts.

Internet of Things in Context: Discovering Privacy Norms with Scalable Surveys

by Noah Apthorpe, Yan Shvartzshnaider, Arunesh Mathur, Nick Feamster

Privacy concerns surrounding disruptive technologies such as the Internet of Things (and, in particular, connected smart home devices) have been prevalent in public discourse, with privacy violations from these devices occurring frequently. As these new technologies challenge existing societal norms, determining the bounds of “acceptable” information handling practices requires rigorous study of user privacy expectations and normative opinions towards information transfer.

To better understand user attitudes and societal norms concerning data collection, we have developed a scalable survey method for empirically studying privacy in context.  This survey method uses (1) a formal theory of privacy called contextual integrity and (2) combinatorial testing at scale to discover privacy norms. In our work, we have applied the method to better understand norms concerning data collection in smart homes. The general method, however, can be adapted to arbitrary contexts with varying actors, information types, and communication conditions, paving the way for future studies informing the design of emerging technologies. The technique can provide meaningful insights about privacy norms for manufacturers, regulators, researchers and other stakeholders.  Our paper describing this research appears in the Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies.

Scalable CI Survey Method

Contextual integrity. The survey method applies the theory of contextual integrity (CI), which frames privacy in terms of the appropriateness of information flows in defined contexts. CI offers a framework to describe flows of information (attributes) about a subject from a sender to a receiver, under specific conditions (transmission principles).  Changing any of these parameters of an information flow could result in a violation of privacy.  For example, a flow of information about your web searches from your browser to Google may be appropriate, while the same information flowing from your browser to your ISP might be inappropriate.

Combinatorial construction of CI information flows. The survey method discovers privacy norms by asking users about the acceptability of a large number of information flows that we construct automatically using the CI framework. Because the CI framework effectively defines an information flow as a tuple (attributes, subject, sender, receiver, and transmission principle), we can automate the construction of information flows by defining a range of values for each parameter and generating a large number of flows from combinations of those values.
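As a rough illustration (the parameter values and question phrasing below are ours for this post, not the exact ones used in the study), generating every combination takes only a few lines of Python:

```python
from itertools import product

# Illustrative parameter values; the actual study used longer lists of
# devices, information types, recipients, and transmission principles.
senders = ["a sleep monitor", "a smart thermostat"]
attributes = ["its owner's location", "its usage patterns"]
recipients = ["its manufacturer", "its owner's Internet service provider"]
principles = [
    "if its owner has given consent",
    "if the information is used for advertising",
    None,  # null transmission principle: a control flow with no condition
]

def flow_to_question(sender, attribute, recipient, principle):
    # Render one CI information flow (subject = the device owner) as a survey prompt.
    prompt = f"{sender} records {attribute} and sends it to {recipient}"
    if principle is not None:
        prompt = f"{prompt} {principle}"
    return prompt[0].upper() + prompt[1:] + "."

flows = list(product(senders, attributes, recipients, principles))
for flow in flows:
    print(flow_to_question(*flow))
print(len(flows), "information flows generated")  # 2 * 2 * 2 * 3 = 24
```

Less intuitive combinations can then be filtered out before the remaining flows are turned into survey questions, as noted in the figure caption below.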

Applying the Survey Method to Discover Smart Home Privacy Norms

We applied the survey method to 3,840 IoT-specific information flows involving a range of device types (e.g., thermostats, sleep monitors), information types (e.g., location, usage patterns), recipients (e.g., device manufacturers, ISPs) and transmission principles (e.g., for advertising, with consent). A total of 1,731 Amazon Mechanical Turk workers rated the acceptability of these information flows on a 5-point scale from “completely unacceptable” to “completely acceptable”.

Trends in acceptability ratings across information flows indicate which context parameters are particularly relevant to privacy norms. For example, the following heatmap shows the average acceptability ratings of all information flows with pairwise combinations of recipients and transmission principles.

Average acceptability scores of information flows with given recipient/transmission principle pairs. For example, the top left box shows the average acceptability score of all information flows with the recipient “its owner’s immediate family” and the transmission principle “if its owner has given consent.” Higher (more blue) scores indicate that flows with the corresponding parameters are more acceptable, while lower (more red) scores indicate that the flows are less acceptable. Flows with the null transmission principle are controls with no specific condition on their occurrence. Empty locations correspond to less intuitive information flows that were excluded from the survey. Parameters are sorted by descending average acceptability score for all information flows containing that parameter.
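The aggregation behind a figure like this is a simple group-by. The sketch below uses hypothetical column names and toy data (assuming each 5-point rating has been coded numerically, e.g., from -2 for “completely unacceptable” to 2 for “completely acceptable”) to compute the per-pair averages with pandas:

```python
import pandas as pd

# Toy long-format responses: one row per (respondent, information flow) rating.
# Column names and values are hypothetical, not the study's actual schema.
responses = pd.DataFrame([
    {"recipient": "its owner's immediate family",
     "principle": "if its owner has given consent", "rating": 2},
    {"recipient": "its manufacturer",
     "principle": "if its owner has given consent", "rating": 1},
    {"recipient": "its manufacturer",
     "principle": "if the information is used for advertising", "rating": -2},
    {"recipient": "its owner's ISP",
     "principle": "if the information is used for advertising", "rating": -1},
    # ... one row per rating in the real dataset
])

# Average acceptability for each recipient/transmission-principle pair,
# i.e., the matrix plotted in the heatmap described above.
heatmap_data = responses.pivot_table(
    index="recipient", columns="principle", values="rating", aggfunc="mean"
)
print(heatmap_data)
```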

These results provide several insights about IoT privacy, including the following:

  • Advertising and Indefinite Data Storage Generally Violate Privacy Norms. Respondents viewed information flows from IoT devices for advertising or for indefinite storage as especially unacceptable. Unfortunately, advertising and indefinite storage remain standard practice for many IoT devices and cloud services.
  • Transitive Flows May Violate Privacy Norms. Consider a device that sends its owner’s location to a smartphone, and the smartphone then sends the location to the device manufacturer’s cloud server. This device initiates two information flows: (1) to the smartphone and (2), transitively, to the manufacturer’s cloud server. Although flow #1 may conform to user privacy norms, flow #2 may violate norms (see the sketch after this list). Manufacturers of devices that connect to IoT hubs or smartphones (often made by different companies), rather than directly to cloud services, should avoid having these devices send potentially sensitive information with greater frequency or precision than necessary.
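To illustrate the transitive case in CI terms, the toy sketch below (with made-up acceptability scores, not data from our survey) checks the same attribute flowing to two different recipients against norms separately:

```python
from typing import NamedTuple, Optional

class Flow(NamedTuple):
    sender: str
    attribute: str
    recipient: str
    principle: Optional[str]

# Made-up average acceptability scores keyed by recipient, loosely echoing the
# point above: the first hop (to the owner's smartphone) may conform to norms,
# while the second hop (to a cloud server) may not.
norm_scores = {
    "its owner's smartphone": 1.2,
    "its manufacturer's cloud server": -0.4,
}

def conforms_to_norms(flow: Flow, threshold: float = 0.0) -> bool:
    # Treat a flow as norm-conforming if its average rating is non-negative.
    return norm_scores.get(flow.recipient, 0.0) >= threshold

hop1 = Flow("a location tracker", "its owner's location", "its owner's smartphone", None)
hop2 = Flow("the smartphone", "its owner's location", "its manufacturer's cloud server", None)

for flow in (hop1, hop2):
    verdict = "conforms to norms" if conforms_to_norms(flow) else "may violate norms"
    print(f"flow to {flow.recipient}: {verdict}")
```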

Our paper expands on these findings, including more details on the survey method, additional results, analyses, and recommendations for manufacturers, researchers, and regulators.

We believe that the survey method we have developed is broadly applicable to studying societal privacy norms at scale and can thus better inform privacy-conscious design across a range of domains and technologies.

Announcing IoT Inspector: Studying Smart Home IoT Device Behavior

By Noah Apthorpe, Danny Y. Huang, Gunes Acar, Frank Li, Arvind Narayanan, Nick Feamster

An increasing number of home devices, from thermostats to light bulbs to garage door openers, are now Internet-connected. This “Internet of Things” (IoT) promises reduced energy consumption, more effective health management, and living spaces that react adaptively to users’ lifestyles. Unfortunately, recent IoT device hacks and personal data breaches have made security and privacy a focal point for IoT consumers, developers, and regulators.

Many IoT vulnerabilities sound like the plot of a science fiction dystopia. Internet-connected dolls allow strangers to spy on children remotely. Botnets of millions of security cameras and DVRs take down a global DNS service provider. Surgically implanted pacemakers are susceptible to remote takeover.

These security vulnerabilities, combined with the rapid evolution of IoT products, can leave consumers at risk, and in the dark about the risks they face when using these devices. For example, consumers may be unsure which companies receive personal information from IoT appliances, whether an IoT device has been hacked, or whether devices with always-on microphones listen to private conversations.

To shed light on the behavior of smart home IoT devices that consumers buy and install in their homes, we are announcing the IoT Inspector project.

Announcing IoT Inspector: Studying IoT Security and Privacy in Smart Homes

Today, at the Center for Information Technology Policy at Princeton, we are launching an ongoing initiative to study consumer IoT security and privacy, in an effort to understand the current state of smart home security and privacy in ways that ultimately help inform both technology and policy.

We have begun this effort by analyzing more than 50 home IoT devices ourselves. We are working on methods to help scale this analysis to more devices. If you have a particular device or type of device that you are concerned about, let us know. To learn more, visit the IoT Inspector website.

Our initial analyses have revealed several findings about home IoT security and privacy.
