Overcoming RESTlessness

Key Takeaways

  • For the last couple of years, there has been a growing anti-REST sentiment in the software development community. However, the alternative technologies often arose within particular contexts, and each presents strengths and weaknesses relative to specific use cases.
  • The rise of REST was itself fuelled by a false dichotomy, with SOAP playing the role of bogeyman. Whereas SOAP attempted to provide a method of tunneling through the protocols of the web, the REST approach embraced them. 
  • Instead of seeking to replace REST, the software engineering industry should seek to evolve by building on the maturity of the REST ecosystem while exploiting the technological strengths of the new protocols.

New API protocols like GraphQL, gRPC, and Apache Kafka have risen in popularity as alternatives to REST-inspired HTTP APIs. This article argues that the strength of the REST paradigm cannot be reflected in one-on-one protocol comparisons. Instead of seeking to replace REST, the software engineering industry should seek to evolve by building on the maturity of the REST ecosystem while exploiting the technological strengths of the new protocols.

On protocols, paradigms, and false dichotomies...

Fellow Vancouverite Tim Bray's recent blog post, "Post-REST", garnered a great deal of attention in our industry, and with good reason. As the usage of web APIs has grown, so has skepticism about whether REST [1] is the ideal communication convention for web APIs. In addition to its initial scope of open Web communication, REST is now also being used for supplying web application data, providing inter-microservice communication, facilitating infrastructure administration and automation, and even for asynchronous patterns like messaging, event distribution, and streaming.

Tim's piece does a good job of outlining REST's usage, its perceived limitations, and some emerging protocol alternatives (GraphQL and gRPC), and of speculating on the future of web API communication. While I agree with the piece in general, I feel there is more to be said on this topic. Namely, rather than just looking at what might replace REST, I would like us to consider how the strengths of REST can be synthesized with these new protocols' innovations to come up with evolutionary alternatives for communication in distributed software ecosystems.

New protocols, old battle lines, false dichotomies

For the last couple of years, there has been a growing anti-REST sentiment in the software development community. Numerous articles have been posted that gripe about REST's limitations, and then present an alternative protocol or approach for communication.

The pattern "REST is bad at x, so use y instead" can be seen in pieces that support GraphQL, gRPC, asynchronous communication, and even more obscure options. The arguments go something like this:

  • GraphQL is better than REST because it allows the API consumer to control what data it receives, and allows the API provider to aggregate resources on the server-side (see the sketch after this list)
  • gRPC (plus protocol buffers) is better than REST because it is type-safe, has optimized performance through binary serialization, and is able to exploit the capabilities of HTTP/2
  • Asynchronous communication (AMQP, Kafka, etc.) is better than synchronous REST communication because it reduces blocking and thread usage, and thereby increases service autonomy
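
To make the first of these arguments concrete, here is a minimal sketch of the difference in request shape, written in TypeScript against hypothetical endpoints: a typical REST call returns whatever representation the server defines, while a GraphQL query names exactly the fields the consumer wants and can pull in a related resource in the same round trip.

    // Minimal sketch of the request-shape difference (hypothetical endpoints).

    // REST: GET a resource; the server decides which fields come back.
    async function getUserRest(id: string): Promise<unknown> {
      const res = await fetch(`https://api.example.com/users/${id}`);
      return res.json(); // the full user representation, needed or not
    }

    // GraphQL: the consumer names the fields it wants and aggregates a
    // related resource (the user's orders) in a single request.
    async function getUserGraphql(id: string): Promise<unknown> {
      const query = `
        query ($id: ID!) {
          user(id: $id) {
            name
            email
            orders { id total }
          }
        }`;
      const res = await fetch("https://api.example.com/graphql", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ query, variables: { id } }),
      });
      return res.json(); // only the requested fields
    }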

Each of these approaches arose within a particular context. GraphQL was created by Facebook as part of its re-working of the Facebook mobile application. It is the over-the-wire communication method used in conjunction with the Relay and React Native JavaScript frameworks, essentially a means of serving up ad hoc data to the app. Unsurprisingly, many public proponents of GraphQL have a bias toward data centricity and JavaScript. gRPC and protocol buffers emerged from Google's internal usage, and followed a similar path to the public as the Kubernetes container orchestration project. Predictably, much of the advocacy for gRPC centres on communication between container-based applications. Exclusively asynchronous communication is often promoted within the context of reactive systems or event sourcing. It makes sense that approaches designed for these specific contexts would have some advantages within them over the more generic REST approach.

In defense of REST, it is tempting to take these criticisms at face value and come up with counterpoints like these:

  • For the GraphQL case, nothing in the REST paradigm prevents consumer choice or resource aggregation (it has merely been a common practice to use static interfaces on single resources), and there is plenty of evidence suggesting that constraining consumer choice has its own benefits; see the sketch after this list
  • For gRPC, runtime optimization is unlikely to be the primary bottleneck in most distributed architectures, and gRPC's embedded library requirement -- not to mention protobuf's enumerated structure -- can lead to unforeseen issues
  • For async, there is absolutely a need to include event-based scenarios, but those are likely in addition to synchronous patterns like queries and commands
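
As an illustration of that first counterpoint, nothing stops a REST-style API from letting the consumer shape the response. One common convention (sparse fieldsets and inclusion of related resources, as defined in specifications like JSON:API) does this through query parameters; the sketch below assumes a hypothetical API with simplified parameter names.

    // Hedged sketch: a REST request that constrains the returned fields and
    // expands a related resource via query parameters (hypothetical API).
    async function getUserShaped(id: string): Promise<unknown> {
      const params = new URLSearchParams({
        fields: "name,email", // consumer chooses the fields it receives
        expand: "orders",     // provider aggregates the related resource
      });
      const res = await fetch(`https://api.example.com/users/${id}?${params}`);
      return res.json();
    }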

However, in my opinion these counterarguments do not tell the whole story. [2] Software engineering is a restless industry, and we frequently oversimplify our problems in order to justify overly simplistic solutions. We like to label the "burning platform" so we can incentivize people to leap to some new safety. Real-time processing is good because batch is bad. Microservices are good because monoliths are bad. Using REST as the bogeyman on a context-by-context basis, as in the arguments above, creates a series of false dichotomies. Rather than looking at what REST lacks in each of the above scenarios, perhaps we should examine a different question: how did REST become the default communication approach for component-to-component network hops in distributed computing? Let's go back to the beginning.

REST's origins, rise, and ubiquity

REST (Representational State Transfer) was defined in a chapter of Roy Fielding's 2000 Ph.D. dissertation, "Architectural Styles and the Design of Network-based Software Architectures". The meta-purpose of the dissertation was to "define a framework for understanding software architecture ... to guide the architectural design of network-based application software". REST was included as the example architectural style that codified the design principles of the World Wide Web, with an emphasis on evolvability, scalability, and generality of interfaces. Compared to the contexts for the new approaches listed above, REST's initial problem space was broad.

Broad as it was, the idea of using the Web for network-based sharing of data and services beyond the browser was a popular one. Software developers quickly seized on Fielding's work and put it into practice. [3] The rise of REST was itself fuelled by a false dichotomy, with SOAP playing the role of bogeyman. Whereas SOAP attempted to provide a method of tunneling through the protocols of the web, the REST approach embraced them. This notion of REST being "of the web, not just on the web" made it a more intuitive choice for software engineers already building web-based solutions.

As the SOAP and WS-* ecosystem became more complicated, the relative simplicity and usability of REST won out. Over time, JSON replaced XML as the de facto data format for web APIs for similar reasons. As the usage of the web computing paradigm expanded to new scenarios -- enterprise application integration, cloud provisioning, data warehouse querying, IoT -- so did the adoption of REST APIs.

Now, if one were to examine each specific usage scenario, there might be some weaknesses in REST's applicability, or some alternative communication approach that would seem more ideal. But that examination would ignore the power that comes from REST's universality. Because of REST's ubiquity, web developers who were used to making AJAX calls could intuitively grasp how to use AWS' APIs for provisioning cloud infrastructure; developers of web-based social networks could quickly lay down the plumbing for mobile applications; enterprise developers had a well-known way of making newly decomposed microservices talk to each other. Software engineering is a field where barriers to delivery are often more human than machine. Well-understood approaches provide value, often having a bigger impact on delivery time than technologically-optimized niche solutions.

This universality has also created robustness in the REST ecosystem. Swagger -- now OpenAPI -- arose organically as a metadata specification to help developers document, design, and consume APIs. OAuth provides a scalable, transferable framework for delegated authorization (and, via OpenID Connect, authentication). [4] "API Management" emerged as a set of capabilities -- rate limiting, dynamic routing, caching, etc. -- that proved commonly useful when providing REST APIs. The comprehensiveness of the REST paradigm and the maturity of its ecosystem represent the greatest value of REST as an approach to network-based communication in a software system. In all likelihood, this ubiquity comes more from REST being "how the Web works" than from any one technical detail.

Communication in the post-Web paradigm

The impact of the Web on software engineering cannot be overstated. In parallel with the rise of REST, the software engineering world also saw the advent of open source, Agile, DevOps, Domain-Driven Design, and microservice architecture. Each of these movements was enabled by the Web, and collectively they have amplified the importance of the human element in software delivery. Along with the flexibility and expediency offered by cloud computing, a new paradigm of software engineering has emerged, characterized by continuously running, continuously evolving, loosely coupled applications and services. While Tim Bray called his article "Post-REST", perhaps this new paradigm could be called "post-Web". And since the characteristics of this paradigm align with the original principles of Fielding's REST, it would not make sense to discard REST and start from scratch. On the other hand, it would be equally naïve to ignore two decades of technological innovations.

So how can REST's value evolve in this new paradigm? An increasing number of organizations are adopting an "API First" approach to software development; that is, emphasizing the importance of designing the machine interfaces in their applications and services to the same extent as UIs, and using those APIs to decouple the development efforts of teams responsible for different domains. OpenAPI often plays an important role in this methodology, as the implementation-agnostic interface specification. In accordance with the post-Web paradigm, this benefits the various people involved in building or modifying the software system. There is already a project underway -- AsyncAPI from Fran Mendez -- that aims to bring this same value to event-based interactions. Along the same lines, Mike Amundsen and Leonard Richardson introduced the ALPS specification to capture the semantics of network-based application interactions. Efforts like these help to address the design-time challenges of building distributed systems.
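
As a small illustration of the "API First" idea, here is a hedged sketch (all names hypothetical) of how a shared, implementation-agnostic contract lets provider and consumer teams work in parallel: both sides code against the same agreed interface, and the consuming team can build against a stub long before the providing team ships. In practice, such an interface would typically be generated from an OpenAPI document rather than written by hand.

    // Hypothetical contract shared by the provider and consumer teams,
    // in practice generated from an OpenAPI document.
    interface Order {
      id: string;
      total: number;
      status: "pending" | "shipped";
    }

    interface OrdersApi {
      getOrder(id: string): Promise<Order>;
      listOrders(customerId: string): Promise<Order[]>;
    }

    // The consumer team can develop and test against a stub immediately,
    // independent of the provider team's delivery schedule.
    const stubOrdersApi: OrdersApi = {
      async getOrder(id) {
        return { id, total: 0, status: "pending" };
      },
      async listOrders() {
        return [];
      },
    };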

There are opportunities to extend REST's value in the cloud native runtime as well. The move to microservices has introduced network boundaries where inter-process communication (IPC) used to take place. These physical boundaries may be projections of business domain boundaries by design, in order to achieve the people benefits discussed above. However, there is an implied runtime tradeoff with additional network latency and the potential for partial failures in the service call chain.

The service mesh pattern addresses these issues for container-based systems, featuring a "sidecar" service proxy that handles all network-based communication between application components. The service mesh topology means that IPC has been re-introduced between the application containers and their associated sidecars. Nonetheless, developers of application containers still need to specify network protocols in their code, since the service proxies do not typically change transport protocols for proxied messages.

Should these application developers be dealing with protocol handling at all? Could they instead deal with abstract service requests (queries, commands, events) and have the service proxy handle the protocol mapping, transcoding, and transmission? These questions should be explored, and the universal design-time understanding of REST APIs could offer a starting point for abstraction. These are just two areas where REST's ubiquity can be leveraged to help solidify the post-Web paradigm.
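
As a purely speculative illustration of that second question (this is not a feature of any existing service mesh), the sketch below imagines what application code might look like if it expressed abstract queries, commands, and events against a hypothetical client interface, leaving a sidecar-style proxy to map each request onto HTTP, gRPC, or a message broker. All names are illustrative.

    // Hypothetical, protocol-agnostic service requests; a sidecar proxy, not
    // the application, would decide how each request travels (REST/HTTP,
    // gRPC, or an event broker).
    type Query   = { kind: "query";   service: string; name: string; params: Record<string, unknown> };
    type Command = { kind: "command"; service: string; name: string; payload: Record<string, unknown> };
    type Event   = { kind: "event";   topic: string;   payload: Record<string, unknown> };

    interface ServiceClient {
      query(q: Query): Promise<unknown>; // synchronous request/response
      send(c: Command): Promise<void>;   // synchronous, state-changing
      publish(e: Event): Promise<void>;  // asynchronous, fire-and-forget
    }

    // Application code stays protocol-free:
    async function placeOrder(client: ServiceClient, customerId: string) {
      const customer = await client.query({
        kind: "query", service: "customers", name: "getCustomer", params: { id: customerId },
      });
      await client.send({
        kind: "command", service: "orders", name: "createOrder", payload: { customer },
      });
      await client.publish({
        kind: "event", topic: "order-placed", payload: { customerId },
      });
    }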

A less RESTless future

Hyped technology trends often tout how they can replace the old approach with a new one. In reality, evolution in software engineering usually happens in layers. Each new innovation sets the foundation for a new set of innovations to follow. New API protocols like GraphQL, gRPC and Kafka will replace the use of resource-based, JSON-encoded, HTTP-transmitted messages in some distributed scenarios. However, REST's legacy in the evolution of distributed systems should be less about its implementation details, and more about the traits that have led to its ubiquity: providing a framework for universal connectivity, decoupling service consumers from providers, emphasizing usability and accessibility. It is these characteristics that can make REST -- originally defined as the architectural style of the Web -- foundational for the post-Web paradigm of software engineering.

Footnotes

  1. The intent of this article is not to debate the definition of REST. The term "REST" will be used to refer both to Fielding's original definition and to CRUD-style APIs over HTTP.
  2. For a practical analysis of the REST/gRPC/GraphQL decision context, read this blog post from Phil Sturgeon.
  3. ...and very early on, the desire to distill REST down to CRUD over HTTP was already being promoted.
  4. Going back to the dichotomy with SOAP, this organic standards development contrasts with the explosion of W3C and OASIS standards that appeared during the peak of the Web Services boom in the early 2000s.

Thanks to Mike Amundsen, Erik Wilde, Irakli Nadareishvili, and Ronnie Mitra for locating the "history of REST" resources listed in this article.

About the Author

Matt McLarty is an experienced software architect who leads the API Academy team for CA Technologies, a Broadcom company. He works closely with organizations on designing and implementing innovative, enterprise-grade API and microservices solutions. Matt has worked extensively in the field of integration and real-time transaction processing for software vendors and clients alike. Matt recently co-authored the O'Reilly books Microservice Architecture and Securing Microservice APIs.
