November 21, 2024

Web Browser Security User Interfaces: Hard to Get Right and Increasingly Inconsistent

A great deal of online commerce, speech, and socializing supposedly happens over encrypted protocols. When using these protocols, users supposedly know what remote web site they are communicating with, and they know that nobody else can listen in. In the past, this blog has detailed how the technical protocols and legal framework are lacking. Today I’d like to talk about how secure communications are represented in the browser user interface (UI), and what users should be expected to believe based on those indicators.

The most ubiquitous indicator of a “secure” connection on the web is the “padlock icon.” For years, banks, commerce sites, and geek grandchildren have been telling people to “look for the lock.” However, the padlock has problems. First, user studies have shown that, despite all of the imploring, many people just don’t pay attention. Second, when they do pay attention, the padlock often gives them the impression that the site they are connecting to is the real-world person or company that it claims to be (in reality, it usually just means that the connection is encrypted to “somebody”). More generally, many people think that the padlock means that they are “safe” to do whatever they wish on the site without risk. Finally, there are some tricky hacker moves that can make it appear that a padlock is present when it actually is not.

A few years ago, a group of engineers invented “Extended Validation” (EV) certificates. As opposed to “Domain Validation” (DV) certs, which simply verify that you are talking to “somebody” who owns the domain, EV certificates actually verify real-world identities. They also typically cause some prominent part of the browser to turn green and show the real-world entity’s name and location (e.g., “Bank of America Corporation (US)”). Separately, the World Wide Web Consortium (W3C) recently issued a final draft of a document entitled “Web Security Context: User Interface Guidelines.” The document describes web site “identity signals,” saying that the browser must “make information about the identity of the Web site that a user interacts with available.” These developments highlight a shift in browser security UI from simply showing a binary padlock/no-padlock icon to showing richer information about identity (when it exists).
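
To make the distinction concrete, here is a minimal sketch, using only Python’s standard library, of pulling a site’s certificate and inspecting its subject; the DV-versus-EV difference shows up in whether a validated organization name is present. The hostname is just an illustration.

    import socket, ssl

    # Fetch a site's certificate and inspect the subject. A DV cert
    # typically names only the domain (commonName); EV (and OV) certs
    # also carry the validated organization's name and location.
    host = "www.bankofamerica.com"   # illustrative host
    ctx = ssl.create_default_context()
    with ctx.wrap_socket(socket.create_connection((host, 443)),
                         server_hostname=host) as s:
        cert = s.getpeercert()

    subject = dict(item[0] for item in cert["subject"])
    print(subject.get("commonName"))        # the validated domain
    print(subject.get("organizationName"))  # None for a typical DV cert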

In the course of trying to understand all of these changes, I made a disturbing discovery: different browser vendors are changing their security UIs in different ways. Here are snapshots from some of the major browsers:

As you can see, all of the browsers other than Firefox still have a padlock icon (albeit in different places). Chrome now makes “https” and the padlock icon green regardless of whether the certificate is DV or EV (see the debate here), whereas the other browsers reserve the green color for EV only. The confusion is made worse by the fact that Chrome appears to contain a bug in which the organization name/location (the only indication of EV validation) sometimes does not appear. Firefox chose to use the color blue for DV even though one of their user experience guys noted, “The color blue unfortunately carries no meaning or really any form of positive/negative connotation (this was intentional and the rational[e] is rather complex)”. The name/location from EV certificates appears in different places, and the method of coloring elements also varies (Safari in particular colors only the text, and does so in dark shades that can sometimes be hard to discern from black). Some browsers also make (different) portions of the URL a shade of gray in an attempt to emphasize the domain you are visiting.

Almost all of the browsers have changed these elements in recent versions. Mozilla has been changing Firefox’s user interface particularly aggressively, with the most dramatic change being the removal of the padlock icon entirely as of Firefox 4. Here is the progression of changes to the UI when visiting DV-certified sites:

By stepping back to Firefox 2.0, we can see a much more prominent padlock icon in both the URL bar and in the bottom-right “status bar,” along with an indication of which domain is being validated. Firefox 3.0 toned down the color scheme of the lock icon, making it less attention-grabbing, and removed it from the URL bar. It also removed the yellow background that the URL bar would show for encrypted sites, and introduced a blue glow around the site icon (“favicon”) if the site provided a DV cert. This area was named the “site identification button,” and is either grey, blue, or green depending on the level of security offered. Users can click on the button to get more information about the certificate, presuming they know to do so. At some point between Firefox 3.0 and 3.6, the domain name was moved from the status bar (and away from the padlock icon) to the “site identification button.”

In the soon-to-be-released Firefox 4, the padlock icon is removed altogether. Mozilla actually removed the “status bar” at the bottom of the screen completely, and the padlock icon with it. This has caused consternation among some users, and generated about 35k downloads of an add-on that restores some of the functionality of the status bar (but not the padlock).

Are these changes a good thing? On the one hand, movement toward a more accurately descriptive system is generally laudable. On the other, I’m not sure whether there has been any study about how users interpret the color-only system — especially in the context of varying browser implementations. Anecdotally, I was unaware of the Firefox changes, and I had a moment of panic when I had just finished a banking transaction using a Firefox 4 beta and realized that there was no lock icon. I am not the only one. Perhaps I’m an outlier, and perhaps it’s worth the confusion in order to move to a better system. However, at the very least I would expect Mozilla to do more to proactively inform users about the changes.

It seems disturbing that the browsers are diverging in their visual language of security. I have heard people argue that competition in security UI could be a good thing, but I am not convinced that any benefits would outweigh the cost of confusing users. I’m also not sure that users are aware enough of the differences that they will consider it when selecting a browser… limiting the positive effects of any competition. What’s more, the problem is only set to get worse as more and more browsing takes place on mobile devices that are inherently constrained in what they can cram on the screen. Just take a look at iOS vs. Android:

To begin with, Mobile Safari behaves differently from desktop Safari. The green color is even harder to see here, and one wonders whether the eye will notice any of these changes when they appear in the browser title bar (this is particularly evident when browsing on an iPad). Android’s browser displays a lock icon that is identical for DV and EV sites. Windows Phone 7 behaves similarly, but only when the URL bar is present, and the URL bar is automatically hidden when you rotate your phone into landscape mode. BlackBerry shows a padlock icon inconspicuously in the top status bar of the phone (the same area as your signal strength and battery status). BlackBerry is unique in showing an unlocked padlock icon on non-encrypted sites, something I don’t remember seeing in desktop browsers since Netscape Navigator (although maybe it’s a good idea to re-introduce some positive indication of “not encrypted”).

Some of my more cynical (read: realistic) colleagues have said that, given the research showing that most users don’t pay attention to this stuff anyway, trying to fix it is pointless. I am sympathetic to that view, and I think that making more sites default to HTTPS, encouraging adoption of standards like HSTS, and working on standards to make it easier to encrypt web communications are probably lower-hanging fruit. There nevertheless seems to be an opportunity here for some standardization amongst the browser vendors, with a foundation in actual usability testing.
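
For readers unfamiliar with HSTS: it is just an HTTP response header that tells returning browsers to refuse plain-HTTP connections to the site. Here is a minimal sketch using Python’s standard library; in practice you would set the header in your web server’s configuration, and browsers only honor it when it arrives over a valid HTTPS connection.

    from http.server import HTTPServer, SimpleHTTPRequestHandler

    # Minimal sketch: attach an HSTS header to every response, telling
    # browsers to use HTTPS only for this host for the next year.
    class HSTSHandler(SimpleHTTPRequestHandler):
        def end_headers(self):
            self.send_header("Strict-Transport-Security",
                             "max-age=31536000; includeSubDomains")
            super().end_headers()

    HTTPServer(("", 8443), HSTSHandler).serve_forever()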

Usable security irony

I visited Usable Security (the web page for the 2007 Usable Security workshop) today to look up a reference, but the link I followed was actually to the SSL version of the page. Guess what?

Secure Connection Failed

usablesecurity.org uses an invalid security certificate.
The certificate expired on 12/29/08 12:21 AM.

(Error code: sec_error_expired_certificate)

  • This could be a problem with the server’s configuration, or it could be someone trying to impersonate the server.
  • If you have connected to this server successfully in the past, the error may be temporary, and you can try again later.

How many other web sites out there have the same problem? Using SSL all the time is clearly a good thing from a security perspective. There’s a performance issue, of course, but then there’s this usability problem from the server admin’s perspective: certificates expire, and somebody has to notice before the users do.
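
This particular failure is also cheap to catch. Here is a rough sketch, in Python’s standard library, of the kind of periodic check a server admin could run; the hostname is just the one from the anecdote, and a real monitor would alert rather than print.

    import socket, ssl
    from datetime import datetime, timezone

    # Connect, validate the certificate, and report time to expiry.
    # (If the cert has already expired, the handshake itself fails
    # with a verification error, which is also worth alerting on.)
    host = "usablesecurity.org"
    ctx = ssl.create_default_context()
    with ctx.wrap_socket(socket.create_connection((host, 443)),
                         server_hostname=host) as s:
        cert = s.getpeercert()

    expires = datetime.fromtimestamp(
        ssl.cert_time_to_seconds(cert["notAfter"]), tz=timezone.utc)
    days_left = (expires - datetime.now(timezone.utc)).days
    print(f"{host}: certificate expires in {days_left} days")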

On the future of voting technologies: simplicity vs. sophistication

Yesterday, I testified before a hearing of Colorado’s Election Reform Commission. I made a small plug, at the end of my testimony, for a future generation of electronic voting machines that would use crypto machinery for end-to-end / software-independent verification. Normally, the politicos tend to ignore this and focus on the immediately actionable stuff (e.g., current-generation DREs are unacceptably insecure; optical-scan is the best thing presently on the market). Not this time. I got a bunch of questions asking me to explain how a crypto voting system can be verifiable, how you can prove that the machine is behaving properly, and so forth. Pretty amazing. What I realized, however, is that it’s really hard to explain crypto machinery to non-CS people. I did my best, but it was clear from conversations afterward that a few minutes of Q&A did little to give them any confidence that crypto voting machinery really works.

Another of the speakers, Neal McBurnett, was talking about doing variable sampling-rate audits (as a function of how close the tally is). Afterward, he lamented to me, privately, how hard it is to explain basic concepts like what it means for something to be “statistically significant.”

There’s a clear common theme here. How do we explain to the public the basic scientific theories that underlie the problems that voting systems face? My written testimony (reused from an earlier hearing in Texas) includes links to papers, and some people will follow up. Others won’t. My big question is whether we have a research challenge to invent progressively simpler systems that still have the right security properties, or whether we have an education challenge to explain that a certain amount of complexity is worthwhile for the good properties that can be achieved. (Uglier question: is it a desirable goal to weaken the security properties in return for greater simplicity? What security properties would you sacrifice?)

Certainly, with our own VoteBox system, which uses a variation on Benaloh’s voter-initiated ballot challenge mechanism, one of the big open questions is whether real voters, who just want to cast their votes and don’t care about the security mechanisms, will be tripped up by the extra question at the end that’s fundamental to the mechanism. We’re going to need to run human subject tests against these aspects of the machine design, and if they fail in practice, it’s going to be a trip back to the drawing board.
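
For readers who haven’t seen the mechanism: the machine must commit to the encrypted ballot before learning whether the voter will cast it or challenge it, and a challenged ballot is opened so anyone can check that it encrypted what the voter intended. Here is a toy sketch in Python; the hash commitment merely stands in for the machine’s real encryption, and names like commit and receipt are illustrative, not VoteBox’s actual interfaces.

    import hashlib, secrets

    # Toy stand-in for the machine's encryption of a ballot: a hash
    # commitment over the vote and the machine's randomness.
    def commit(vote, r):
        return hashlib.sha256(f"{vote}:{r}".encode()).hexdigest()

    # 1. The machine commits to the ballot *before* the voter decides.
    r = secrets.token_hex(16)
    receipt = commit("yes", r)

    # 2. The voter then chooses: CAST the committed ballot, or CHALLENGE.
    #    On a challenge the machine must reveal (vote, r), so anyone can
    #    recompute the commitment; a machine that encrypted the wrong
    #    vote gets caught. A challenged ballot is spoiled, not counted.
    assert commit("yes", r) == receipt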

[Sidebar: I’m co-teaching a class on elections with Bob Stein (a political scientist) and Mike Byrne (a psychologist). The students are a mix of Rice undergrads, most of whom aren’t computer scientists. I experimentally built a lecture that began by teaching just enough number theory to explain how El Gamal cryptography works and how it allows for homomorphic vote tallying. Then I described how VoteBox uses this mechanism, and wrapped up with an explanation of how to do Benaloh-style challenges. I left out a lot of details, like how you generate large prime numbers, or how you construct NIZK proofs, but I seemed to have the class along with me for the lecture. If I can sell the idea of end-to-end cryptographic mechanisms to undergraduate non-science students, then there may yet be some hope.]
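
To give a flavor of what that lecture covers, here is a toy sketch of exponential El Gamal with homomorphic tallying, in Python with deliberately tiny parameters; a real system uses keys of 2048+ bits, a prime-order subgroup, threshold decryption, cryptographic randomness, and the NIZK proofs mentioned above.

    import random

    # Toy parameters: p = 1019 is a safe prime and g = 2 generates
    # the multiplicative group mod p. Illustration only.
    p, g = 1019, 2

    x = random.randrange(2, p - 1)   # tallying authority's secret key
    h = pow(g, x, p)                 # public key

    def encrypt(vote):
        """Encrypt g^vote, so that ciphertexts multiply to sums."""
        r = random.randrange(1, p - 1)
        return pow(g, r, p), (pow(g, vote, p) * pow(h, r, p)) % p

    def combine(ciphertexts):
        """Componentwise product: encrypts the sum of all the votes."""
        a, b = 1, 1
        for c1, c2 in ciphertexts:
            a, b = (a * c1) % p, (b * c2) % p
        return a, b

    def decrypt_tally(ct, max_votes):
        c1, c2 = ct
        gm = (c2 * pow(c1, p - 1 - x, p)) % p   # recovers g^(total)
        for t in range(max_votes + 1):          # tiny discrete log
            if pow(g, t, p) == gm:
                return t

    votes = [1, 0, 1, 1, 0]   # five ballots, three "yes"
    print(decrypt_tally(combine(encrypt(v) for v in votes), len(votes)))  # 3

The property that matters for elections is that no individual ballot is ever decrypted; only the product of all the ciphertexts (the encrypted tally) is.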