
Is OAuth 2.0 Bad for the Web?


Composite applications went from being a curiosity to mainstream in less than five years. One of the key architectural issues when building composite applications is the double authentication required to access a particular service, along with the corresponding authorization rules: in general we need to authenticate both the (composite) application invoking the service and the user of that application, while being able to define access rules for both the application and the user. This is particularly difficult in client/server middleware environments, including HTTP, which have been built for decades on the premise that the "client" represents a single identity: either a software client or a user, but not both.

A typical composite application today uses popular APIs (such as Twitter, Facebook or Google) through which it requires access to user-specific data. In the early days, composite application developers had to ask users for their credentials to each service in order to access that data and mash it up in a valuable way. Jeff Atwood once compared this process to asking for the keys to someone's house just to get at the address book. In April 2007, a small group of people founded the OAuth project under the initiative of Blaine Cook, Chris Messina, Larry Halff and David Recordon.

OAuth is a protocol layered on top of HTTP and HTTPS that enables a user of both a composite application and a service to grant the application partial access to the service. It is based on a three-party trust federation: user-application, user-service and application-service. The group quickly published the OAuth 1.0 specification, which people started to use. OAuth 1.0 may have been published too quickly and drew criticism. It was soon followed by a competing proposal, WRAP (Web Resource Authorization Protocol), which became a profile of OAuth after a rapid standardization effort. Since then, the OAuth working group has been working on OAuth 2.0.

One of the most visible uses of OAuth is Twitter, which decided to make it mandatory across its APIs as of this month (September 2010), consequently killing its support for basic authentication. Michael Calore explains:

Twitter’s move mirrors a broader trend on the social web, where basic authentication is being ditched for the more secure OAuth when services and applications connect user’s accounts.

Many web sites, such as iCodeBlog, provided tutorials to help developers quickly update their applications. And even though OAuth 2.0 is still a draft, it is already supported by Facebook, which is to date the largest implementation of the OAuth protocol and a key stakeholder in the specification.

It looks as though, for once, the industry has developed a broad consensus to solve an important problem. Yet Eran Hammer-Lahav published some criticisms of the latest direction of the specification, which dropped signatures and cryptography in favor of "bearer tokens". By Eran's own admission, "cryptography is unforgiving": developers can easily make mistakes in the steps they take to encrypt or sign a message. The idea of dropping cryptography came from the WRAP initiative:

At the heart of the WRAP architecture is the requirement to remove any cryptography from the client side. The WRAP authors observed how developers struggled with OAuth 1.0 signatures and their conclusion was that the solution is to drop signatures completely. Instead, they decided to rely on a proven and widely available technology: HTTPS (or more accurately, SSL/TLS). Why bother with signatures if instead the developer can add a single character to their request (turning it from http:// to https://) and protect the secret from an eavesdropper. Much of the criticism that followed focused on the fact that WRAP does not actually require HTTPS. It simply makes it an option. This use of tokens without a secret or other verification mechanism is called a bearer token [which is very similar to a cookie]. Whoever holds the token gains access. If you are an attacker, you just need to get hold of this simple string and you are good to go. No signatures, calculations, reverse engineering, or other such efforts required.
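To make the contrast concrete, here is a minimal sketch (in Python, with made-up endpoint, token and secret values) of the two request styles: a bearer-token call, whose only protection is the TLS channel it travels over, and a simplified OAuth 1.0-style signed call, where an HMAC signature binds the request to secrets that never appear on the wire. This is an illustrative sketch, not a spec-compliant implementation.

```python
import base64
import hashlib
import hmac
import time
import uuid
import urllib.parse
import urllib.request

API_URL = "https://api.example.com/v1/messages"   # hypothetical endpoint

# OAuth 2.0 bearer token: the token string is the whole credential.
# Anyone who observes the header value can replay it; HTTPS is the only protection.
def bearer_request(access_token):
    req = urllib.request.Request(API_URL)
    req.add_header("Authorization", "Bearer " + access_token)
    return req

# Simplified OAuth 1.0-style signed request (HMAC-SHA1 sketch, not spec-exact).
# The consumer and token secrets stay on the client; only the signature is sent.
def signed_request(consumer_key, consumer_secret, token, token_secret):
    params = {
        "oauth_consumer_key": consumer_key,
        "oauth_token": token,
        "oauth_nonce": uuid.uuid4().hex,
        "oauth_timestamp": str(int(time.time())),
        "oauth_signature_method": "HMAC-SHA1",
        "oauth_version": "1.0",
    }
    # The signature base string binds the HTTP method, the exact URL and the parameters.
    normalized = "&".join(f"{k}={v}" for k, v in sorted(params.items()))
    base_string = "&".join([
        "GET",
        urllib.parse.quote(API_URL, safe=""),
        urllib.parse.quote(normalized, safe=""),
    ])
    key = consumer_secret + "&" + token_secret
    digest = hmac.new(key.encode(), base_string.encode(), hashlib.sha1).digest()
    params["oauth_signature"] = base64.b64encode(digest).decode()
    header = "OAuth " + ", ".join(
        '{}="{}"'.format(k, urllib.parse.quote(v, safe="")) for k, v in params.items()
    )
    req = urllib.request.Request(API_URL)
    req.add_header("Authorization", header)
    return req
```

The price of the signed variant is exactly the complexity the WRAP authors wanted to remove; the price of the bearer variant is that the token itself becomes the secret.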

The supporters of this model argue as follows: since most services use cookie-based authentication, adding stronger mechanisms would not make things more secure, because an attacker would simply target the weakest point. Eran's concerns, actually, are not about OAuth today but about the impact the specification will have in five years, when an inherently more secure protocol will be needed. First, the argument will again be that since OAuth 2.0 is the weakest point, there is no need to implement stronger security mechanisms. Second, the reason OAuth works in today's environment is that the APIs are fairly significant to the clients, and most API endpoints are declared statically in the client's code or configuration and thoroughly tested before the application is released. Overall, there is therefore little risk that a token will be sent to an unfriendly destination.

Subbu Allamaraju, author of the RESTful Web Services Cookbook, explained in a private note:

If a client application sends a request to an erroneous address ("mail.exmple.org" instead of "mail.example.org"), the rogue server at "mail.exmple.org" now has the client access token and can access its mail. Of course, in the case of browsers, the browser developer is responsible for not leaking cookies by implementing the same origin policy. OAuth 2.0 client developers will share the same responsibility.
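Under the bearer-token model that responsibility translates into a check the client itself must perform before attaching the token. Here is a minimal sketch, assuming a hypothetical allowlist of hosts the client trusts with its token:

```python
from urllib.parse import urlparse

# Hypothetical allowlist of hosts this client is willing to send its access token to.
TRUSTED_TOKEN_HOSTS = {"mail.example.org", "api.example.org"}

def attach_bearer_token(url, access_token, headers=None):
    """Attach the bearer token only if the destination host is explicitly trusted.

    With bearer tokens the client is the last line of defence: a typo such as
    "mail.exmple.org" would otherwise hand the token to whoever owns that domain.
    """
    host = urlparse(url).hostname
    if host not in TRUSTED_TOKEN_HOSTS:
        raise ValueError("refusing to send access token to untrusted host: %s" % host)
    headers = dict(headers or {})
    headers["Authorization"] = "Bearer " + access_token
    return headers
```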

Yet, Eran believes that the Web needs to be more secure to support more dynamic scenarios and a world of standard APIs implemented by a variety of services:

As soon as we try to introduce discovery or interoperable APIs across services, OAuth 2.0 fails. Because it lacks cryptographic protection of the tokens (there are no token secrets), the client has to figure out where it is safe to send tokens. OAuth reliance on the cookie model requires the same solution – making the client apply the security policy and figure out which servers to share its tokens with. The resource servers, of course, can ask for tokens issued by any authorization server. For example, a protected resource can claim that it requires an OAuth access token issued by Google when in fact, it has nothing to do with Google (even though it might be a Google subdomain). The client will have to figure out if the server is authorized to see its Google access token. Cookies have rules regarding which cookie is shared with which server. But because these rules are enforced by the client, there is a long history of security failures due to incorrect sharing of cookies. The same applies to OAuth 2.0.

He concludes:

Any solution based on client side enforcement of a security policy is broken and will fail. OAuth 1.0 solves this by supporting signatures. If a client sends a request to the wrong server, nothing bad happens because the evil server has no way of using that misguided request to do anything else. If a client sends an OAuth 2.0 request to the wrong server (found via discovery), that server can now access the user’s resources freely as long as the token is valid.
Without discovery, smaller companies will have a harder time getting their services accessible.
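The claim that a misdirected signed request is essentially harmless can be illustrated with a small, hypothetical sketch: the signature is bound to the exact URL the client signed, and the secrets needed to compute it never appear in the request, so the rogue host can neither replay the request against the real server nor sign new ones. The hosts and values below are made up for illustration.

```python
import base64
import hashlib
import hmac
from urllib.parse import quote

# Illustrative HMAC-SHA1 signing helper (same simplified scheme as the earlier sketch).
def sign(method, url, params, consumer_secret, token_secret):
    normalized = "&".join(f"{k}={v}" for k, v in sorted(params.items()))
    base_string = "&".join([method, quote(url, safe=""), quote(normalized, safe="")])
    key = consumer_secret + "&" + token_secret
    digest = hmac.new(key.encode(), base_string.encode(), hashlib.sha1).digest()
    return base64.b64encode(digest).decode()

params = {"oauth_token": "abc", "oauth_nonce": "n1", "oauth_timestamp": "1284000000"}

# Signature the client mistakenly sent to the look-alike host.
leaked = sign("GET", "https://mail.exmple.org/inbox", params, "consumer-secret", "token-secret")

# Signature the real server would require for the same call; the attacker
# cannot compute it because it never learned the secrets.
needed = sign("GET", "https://mail.example.org/inbox", params, "consumer-secret", "token-secret")

print(leaked == needed)   # False: the leaked signature does not validate at the real host
```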

Composite applications are rapidly becoming a key vector of innovation, adding value to otherwise plain data such as tasks, friends or TV guides. At the same time, OAuth is poised for rapid adoption because it solves an acute problem and has gained momentum in the industry with the support of Facebook, Twitter and others. As often happens in the standardization process, we are now at a crossroads, and our industry has to choose one path or the other: do we support simpler security mechanisms that allow a larger group of developers to build these composite applications, or do we implement stronger ones that would allow other developers to build more services that interoperate and compete with existing ones? Where do you stand?
