Doubles Delivery Performance Using DORA Metrics and Micro Frontends

An engineering team in the company’s fintech business unit implemented a series of improvements across the backend and the frontend of its platform and was able to double its delivery performance, as measured by DORA metrics. Additionally, the team used the Micro Frontends (MFE) pattern to break up the monolithic frontend application into multiple smaller apps that could be deployed separately.

A new engineering team, formed at the company in mid-2022, was made responsible for several processes within the finance domain. The team inherited a portion of the wider platform architecture, consisting of a monolithic frontend application written in Perl and JavaScript (using the Vue framework), as well as a Java backend service with dependencies on many other microservices.

The Team’s Area Of Ownership (Source: Engineering Blog)

The team soon discovered that making changes to the existing codebases and deploying them into production was risky and time-consuming. Wanting to improve the delivery frequency, the team decided to adopt a customized version of the DevOps metrics proposed by DORA to track key performance indicators for its delivery process. Engineers started recording the delivery-speed metrics (deployment frequency and lead time for changes) to establish a baseline. For reliability and stability, they opted for bespoke metrics, namely service availability and the number of open defects, instead of the corresponding DORA metrics (change failure rate and mean time to recovery). Based on the team’s measurements, between March and November 2023 the key delivery-speed metrics improved twofold while quality and availability remained stable.
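The article does not describe the team's actual tooling, but the two delivery-speed metrics it tracked are straightforward to derive from deployment records. The sketch below (with made-up data and a hypothetical `Deployment` record) shows one common way to compute them: deployment frequency as deployments per day over an observation window, and lead time for changes as the median time from a change's first commit to its arrival in production.

```java
import java.time.Duration;
import java.time.LocalDateTime;
import java.util.List;

public class DoraSpeedMetrics {

    // One production deployment: the commit time of its oldest change,
    // and the time the deployment reached production.
    public record Deployment(LocalDateTime firstCommitAt, LocalDateTime deployedAt) {}

    // Deployment frequency: deployments per day over the observed window.
    public static double deploymentsPerDay(List<Deployment> deployments, long windowDays) {
        return (double) deployments.size() / windowDays;
    }

    // Lead time for changes: median hours from first commit to production.
    public static double medianLeadTimeHours(List<Deployment> deployments) {
        double[] hours = deployments.stream()
                .mapToDouble(d -> Duration.between(d.firstCommitAt(), d.deployedAt())
                        .toMinutes() / 60.0)
                .sorted()
                .toArray();
        int n = hours.length;
        return n % 2 == 1 ? hours[n / 2] : (hours[n / 2 - 1] + hours[n / 2]) / 2.0;
    }

    public static void main(String[] args) {
        List<Deployment> deployments = List.of(
                new Deployment(LocalDateTime.of(2023, 3, 1, 9, 0),
                               LocalDateTime.of(2023, 3, 2, 9, 0)),   // 24h lead time
                new Deployment(LocalDateTime.of(2023, 3, 3, 9, 0),
                               LocalDateTime.of(2023, 3, 3, 21, 0))); // 12h lead time
        System.out.println("Deployments/day: " + deploymentsPerDay(deployments, 7));
        System.out.println("Median lead time (h): " + medianLeadTimeHours(deployments));
    }
}
```

Taking the median rather than the mean keeps the lead-time figure from being skewed by an occasional long-running change, which is why many teams report it this way.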

The Overview of Improvement in Delivery Metrics (Source: Engineering Blog)

Throughout the period under observation, the engineers gradually improved the code quality of the Java backend service. They also moved towards smaller merge requests (aka pull requests) so that code reviews were less painful and were prioritized among team members. Furthermore, the developers improved the deployment process, gradually reducing the number of manual verification steps and relying more heavily on improved automated testing. Finally, they switched to fully automated deployments, reducing the deployment time from 40 minutes to 4 minutes.

Egor Savochkin, a senior engineering manager at the company, describes the approach the team took to reduce the risk of introducing issues while making changes, and to improve the code along the way:

The team adopted the Boy Scout rule to improve the code quality by refactoring and test automation while not stopping all the feature work. While implementing changes or fixing defects, you also strive to improve the code around. This does not need to be a huge improvement. This may be as simple as adding unit tests to the classes you touched or making small refactorings to fight code smells.
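A minimal, hypothetical illustration of the Boy Scout rule described above: while touching a class to fix a defect, the developer also extracts a small helper to remove duplicated rounding logic and covers it with a test. `InvoiceFormatter` and its logic are invented for this sketch; plain `assert` statements stand in for a real test framework such as JUnit.

```java
import java.math.BigDecimal;
import java.math.RoundingMode;

public class InvoiceFormatter {

    // Before the clean-up this rounding logic was duplicated inline at every
    // call site; extracting it removes the code smell and makes it testable.
    public static BigDecimal roundToCents(BigDecimal amount) {
        return amount.setScale(2, RoundingMode.HALF_UP);
    }

    public static String format(BigDecimal amount, String currency) {
        return currency + " " + roundToCents(amount).toPlainString();
    }

    // Small tests added "in passing"; run with: java -ea InvoiceFormatter
    public static void main(String[] args) {
        assert roundToCents(new BigDecimal("10.005")).equals(new BigDecimal("10.01"));
        assert format(new BigDecimal("3.1"), "EUR").equals("EUR 3.10");
        System.out.println("all checks passed");
    }
}
```

The point is the scope: the improvement is confined to code the developer was already touching, so it ships with the feature work rather than blocking it.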

The frontend side also saw improvements after the team opted to split the monolithic app into micro frontends (MFEs), although the gains didn’t materialize as quickly as hoped. After adjusting the code review process and reducing the dependency on external expert approvals, the code review time went down to 8 minutes. With quick code reviews, the team moved to smaller, more frequent deployments, reducing the deployment time to 1 hour.

Previously, InfoQ also reported how eBay was able to improve its delivery metrics significantly after reworking the View Item page.
