Should the Web be Encrypted?
The Electronic Frontier Foundation is a nonprofit organization founded in 1990. It first released HTTPS Everywhere as a beta version in June 2010. Last week, it released the 1.0 version, which adds support for hundreds of additional websites, using carefully crafted rules to switch from HTTP to HTTPS.
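At their core, HTTPS Everywhere rulesets are regular-expression rewrites that map HTTP URLs onto their HTTPS equivalents (the real rulesets are XML files with target, rule, and exclusion elements). A minimal sketch of the rewrite step in Python, using a hypothetical rule for `example.com`:

```python
import re

# A simplified stand-in for an HTTPS Everywhere rule: a "from" regex and
# a "to" replacement. Real rulesets carry extra machinery (target hosts,
# exclusions, securecookie directives) that this sketch omits.
RULES = [
    (re.compile(r"^http://(www\.)?example\.com/"), "https://example.com/"),
]

def rewrite(url):
    """Return the HTTPS version of url if a rule matches, else url unchanged."""
    for pattern, replacement in RULES:
        if pattern.match(url):
            return pattern.sub(replacement, url)
    return url
```

For example, `rewrite("http://www.example.com/search")` yields `"https://example.com/search"`, while URLs with no matching rule pass through untouched.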
Earlier this year, two research papers reported the observation of strange phenomena in the Domain Name System (DNS) at several US ISPs. On these ISPs' networks, some or all traffic to major search engines, including Bing, Yahoo! and (sometimes) Google, is being directed to mysterious third party proxies.
EFF Senior Staff Technologist Peter Eckersley explained:
"Without HTTPS, your online reading habits and activities are vulnerable to eavesdropping, and your accounts are vulnerable to hijacking. Today's Paxfire revelations are a grand example of how things can go wrong. EFF created HTTPS Everywhere to make it easier for people to keep their user names, passwords, and browsing histories secure and private."
HTTPS Everywhere 1.0 encrypts connections to Google Image Search, Flickr, Netflix, Apple, and news sites like NPR and the Economist, as well as dozens of banks. HTTPS Everywhere also includes support for Google Search, Facebook, Twitter, Hotmail, Wikipedia, the New York Times, and hundreds of other popular websites.
The EFF's Firefox extension can protect people who use Google, DuckDuckGo, or StartingPage for their searches, but not Bing or Yahoo! users, because those search engines do not support HTTPS.
Last year Dan Kaminsky wrote an essay on HTML5 security, referencing flaws in how browsers handle HTTP alongside HTTPS:
Robert “RSnake” Hansen and Josh Sokol’s “HTTPS Can Byte Me”. Their point is that the HTTP version of a site actually has quite a bit of control over the credentials presented to the HTTPS version of a site — and that this control, while not overwhelming, is a lot more powerful, and troubling, than expected.
- If you have a site with a wildcard certificate, and that site has an XSS attack reachable irrespective of Host header […]
- Since HTTP sites can write cookies that will be reflected to HTTPS endpoints, and since cookies can be tied to certain paths, and since servers will puke if given too long cookies, an HTTP attacker can “turn off” portions of the HTTPS namespace by throwing in enormous cookies.
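The second bullet's denial of service can be sketched with simple size arithmetic: an HTTP-side attacker sets a few path-scoped cookies large enough that the browser's Cookie header on later HTTPS requests to those paths exceeds the server's per-header limit and the requests are rejected. The names below are hypothetical; the 8190-byte limit is Apache's default `LimitRequestFieldSize`.

```python
# Hypothetical illustration: cookies an attacker-controlled HTTP page
# could set for paths on the same domain's HTTPS site.
ATTACK_COOKIES = {f"junk{i}": "x" * 4000 for i in range(3)}

def cookie_header_size(cookies):
    """Byte length of the Cookie request header a browser would send."""
    header = "; ".join(f"{name}={value}" for name, value in cookies.items())
    return len("Cookie: " + header)

# The stuffed header comfortably exceeds a typical 8 KB per-header limit,
# so HTTPS requests carrying these cookies fail at the server.
assert cookie_header_size(ATTACK_COOKIES) > 8190
```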
Ultimately, these findings have increased my belief that we need the ability to mark sites as SSL-only, so they simply don’t have an HTTP endpoint to corrupt. The melange of technologies, from HTTPS Everywhere, to Strict-Transport-Security, to the as-yet unspecified DNSSEC Strict Transport markings, becomes ever more important.
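Strict-Transport-Security, mentioned above, works by having a site send a response header that instructs the browser to use only HTTPS for that host for a stated period (the mechanism was later standardized as RFC 6797). A small sketch of parsing such a header's directives, assuming only the two directives defined by the spec:

```python
def parse_hsts(header_value):
    """Parse a Strict-Transport-Security header value into a policy dict.

    Directives are separated by ';'. 'max-age' (seconds to remember the
    HTTPS-only policy) is required; 'includeSubDomains' is optional.
    """
    policy = {"max_age": None, "include_subdomains": False}
    for directive in header_value.split(";"):
        directive = directive.strip()
        if directive.lower().startswith("max-age="):
            policy["max_age"] = int(directive.split("=", 1)[1])
        elif directive.lower() == "includesubdomains":
            policy["include_subdomains"] = True
    return policy
```

For example, `parse_hsts("max-age=31536000; includeSubDomains")` tells the browser to treat the host (and its subdomains) as HTTPS-only for one year.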
The Web, having just turned 20, shows signs of fatigue, and its core technologies seem increasingly unable to cope with sophisticated attacks. With the rapid growth of Web APIs, and the out-of-control proliferation of pseudo-standard ways to secure Web protocols, the bulk of our data is also at stake. Is the Web about to become encrypted? What's your take on it?