Hypermedia is an enabler for a better architecture, Einar Høst claimed in his presentation at the recent DDD Europe 2020 conference in Amsterdam. In his talk he described the architecture challenges at NRK TV, the TV streaming service at the Norwegian public broadcaster, and how they migrated their monolithic architecture into a more modular design and implemented hypermedia in their Player API.
Høst, working for NRK TV, started by noting that TV streaming is a competitive and rapidly evolving domain. Since linear broadcasting is declining, online TV streaming services become more important, and as a public broadcaster NRK must take on more of the responsibility, which means that stability becomes increasingly important.
The Player API serves metadata and manifests to all the different clients people use to watch TV. Høst groups the clients into two categories: progressive clients, like browsers, mobile phones and others that are updated frequently to include new features, and legacy clients, for instance clients running on smart TVs, which are rarely upgraded. These radically different deployment cycles have an impact on the architecture.
Back in 2016 they had a monolithic API serving the clients. It was running in two Azure data centres, backed by a relational database and various background jobs. The complexity was overwhelming, which made it hard to reason about the system. Changes had unforeseen effects and caused intricate failure modes. All these problems resulted in a fear of change and stagnation.
Høst claims that the main reason they ended up with such a complex monolith was the entity – for NRK TV, the TV show. A TV show has many valid aspects depending on the context it is viewed in, and the entity inherits the complexity from all the involved domains, turning into a monolith, or a big ball of mud.
To be able to discuss a TV show from different perspectives, they started a functional decomposition of the monolith with a Domain-Driven Design (DDD) mindset, using bounded contexts. One example of a context is the catalogue, where the focus is on describing the media content. Another one is the playback context, with a focus on playing the actual content. Other contexts include recommendations, personalization and search. The main reason for this decomposition was to contain complexity, but also to be able to focus on different critical features in different contexts. With one large context everything becomes equally critical; it is, for example, not possible to have redundancy in one part but not in the rest of the application.
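As a minimal sketch of what such a decomposition means for the model, the same TV show can be represented differently in each bounded context. The TypeScript interfaces below are hypothetical and heavily simplified; they are not NRK's actual models.

    // Catalogue context: describes the media content (hypothetical, simplified).
    interface CatalogueProgramme {
      id: string;
      title: string;
      description: string;
      imageUrl: string;
    }

    // Playback context: what is needed to actually play the content.
    interface PlayableProgramme {
      id: string;
      manifestUrl: string;         // streaming manifest for the player
      subtitleLanguages: string[];
      availableUntil?: string;     // ISO 8601 timestamp
    }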
After this decomposition they started to use bounded contexts as service boundaries, and when they built new functionality, they also created new services. An API gateway was introduced as an architectural seam, enabling them to route requests to different endpoints. With this in place they started to use the strangler pattern, gradually moving functionality from the monolith to services.
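The routing behind such a seam can be as simple as a path-prefix table. The sketch below is an assumption about how a gateway like this could be wired, not a description of NRK's setup; the paths and hostnames are made up.

    // Hypothetical route table for a gateway acting as a strangler seam.
    // Migrated paths point at new services; everything else still falls
    // through to the monolith.
    const routes: Array<{ prefix: string; target: string }> = [
      { prefix: "/playback",  target: "https://playback.internal.example" },
      { prefix: "/catalogue", target: "https://catalogue.internal.example" },
      { prefix: "/",          target: "https://monolith.internal.example" },
    ];

    // Pick the first backend whose prefix matches the incoming request path.
    function resolveBackend(path: string): string {
      const route = routes.find(r => path.startsWith(r.prefix));
      return (route ?? routes[routes.length - 1]).target;
    }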
From a client perspective this decomposition into services is not interesting. End users in particular want a coherent story that allows them to navigate across boundaries. They therefore started to recompose from a client's perspective and used hypermedia to let a user move seamlessly between services.
Høst defines hypermedia as media with hyperlinks and emphasizes that he is not talking about REST. They are just using links, where each link includes a relation describing what the link refers to. These links show the possible ways a client can navigate from one resource to another, with the relation describing the relationship between the two resources. There should be a link for each reasonable next step, thus offering a client the things it can do next. Together these links connect resources to form coherent narratives that a client can follow to achieve the desired goal. Links also enable support for multiple paths through the API and more than one way to reach the goal.
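In practice this means a client navigates by relation names instead of hard-coded URLs. As a rough sketch, assuming a simplified resource shape where each link carries a relation and a target URL:

    // Generic link shape: a relation name plus the target URL.
    interface Link { rel: string; href: string; }
    interface Resource {
      links: Link[];
      [property: string]: unknown;
    }

    // Follow the link with the given relation from the current resource.
    async function follow(resource: Resource, rel: string): Promise<Resource> {
      const link = resource.links.find(l => l.rel === rel);
      if (!link) {
        throw new Error(`No link with relation "${rel}" on this resource`);
      }
      const response = await fetch(link.href);
      return (await response.json()) as Resource;
    }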
As hypermedia format they use the Hypertext Application Language (HAL), and Høst notes that the reason is that it's lightweight and allows them to add links gradually, which is important for them since they are in the process of shrinking the monolith.
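In HAL, the links live under the reserved _links property, with each relation pointing to a link object containing an href. The document below, written as a TypeScript constant, is a hypothetical example; the relations and paths are invented for illustration and are not NRK's actual API.

    // Hypothetical HAL document for an episode resource.
    const episode = {
      title: "Episode 3",
      durationInSeconds: 1740,
      _links: {
        self:            { href: "/catalogue/episodes/episode-3" },
        playback:        { href: "/playback/manifests/episode-3" },
        "next-episode":  { href: "/catalogue/episodes/episode-4" },
        recommendations: { href: "/recommendations/episode-3" },
      },
    };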
For Høst and the teams, versioning of their APIs has not been an issue since they normally don't replace individual endpoints. Instead they add new and improved narratives using links, with link relations describing the new narratives. Clients can then gradually move over from the old narratives to the new ones. Since the clients have very different deployment rates, it's important that they can switch over at their own pace. By tracking which narratives are used, it's possible to remove links once they are no longer in use.
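A new narrative can simply appear as an additional link relation next to the old one. The snippet below is an assumed illustration of that idea; the relation names and paths are made up.

    // Hypothetical resource evolving without versioned endpoints.
    const programme = {
      _links: {
        self:     { href: "/catalogue/programmes/some-programme" },
        // Old narrative, kept for legacy clients with slow deployment cycles.
        manifest: { href: "/playback/legacy-manifest/some-programme" },
        // New, improved narrative under a new relation; clients switch over
        // at their own pace, and the old link is removed once tracking shows
        // it is no longer followed.
        playback: { href: "/playback/manifests/some-programme" },
      },
    };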
Looking at the present situation, Høst notes that the monolith is shrinking; they now have independent deployments and can use standard HTTP caching techniques. All new resources use links, and many of the old ones have been retrofitted with links. They also have a completely new playback solution, and he specifically notes a new personalization solution that uses the Orleans virtual actor framework.
The slides from Høst’s presentation are available for download. Most presentations at the conference were recorded and will be published during the coming months.