Limitations of Technical Debt Quantification: Do You Rely on These Numbers?


Here is a snippet of a conversation during a conference lunch:

Participant A: What are you going to attend next?

Participant B: I am thinking of attending this talk about technical debt.

Participant A: Oh, technical debt… we already have the “XYZ” tool that quantifies the exact amount of technical debt as the effort (in person-hours) required to repay it. It even gives us nice graphs and visualizations. So I don't need to attend this talk!

This conversation triggered an interesting question in my mind: by using currently available technical debt tools, is it possible to know exactly how much technical debt the software has in terms of cost and effort? Is it possible for such a quantification to be accurate and complete in all aspects?

The quantification provided by currently available technical debt quantification tools and frameworks is incomplete and inaccurate. Why? Consider the following reasons:

  • Lack of clear consensus on the dimensions of technical debt: Technical debt has various dimensions, including code debt, design debt, architecture debt, documentation debt, and test debt. There is no well-accepted scope defining the boundaries of technical debt; for instance, many people believe that known defects also contribute to technical debt. The present set of technical debt quantification tools does not explicitly state the scope of the technical debt dimensions they have adopted. This leads to a misconception: the user of the tool believes that the picture painted by the tool is complete and accurate, and hence that there is no need to bother about technical debt beyond the information the tool provides.
  • Poor support for identifying and quantifying various technical debt dimensions: Currently available technical debt quantification tools focus only on a few dimensions, such as code debt and, to some extent, design debt and test debt. Such tools do not provide comprehensive support for detecting issues pertaining to other dimensions, such as architecture debt or documentation debt. In fact, the comprehensiveness of the supported dimensions is also questionable! For instance, how many design debt issues (or design smells) do such tools identify and report? Although such tools support a set of design rules (which may lead to design smell detection), those rules are just a handful. Further, dealing with false positives (i.e., false alarms) generated by the underlying analysis tools is inherently difficult.
  • Generalized absolutization: Technical debt quantification tools convert the identified issues into the absolute effort required to fix them. However, the effort required to fix a technical debt issue (especially a design or architecture related issue) varies with the context of the issue (severity, scope, platform, and skills, to name a few). Generalized absolutization thus produces estimates that are far from reality. Hence, such quantification is at best an approximation and must be treated as such (and not as something carved in stone); the sketch after this list makes the point concrete.
  • Missing interest component: Any type of debt has two components: principal and interest. The currently available technical debt quantification tools focus on (and compute) the principal associated with a technical debt issue and ignore the interest component, which makes the quantification far from reliable. A few researchers (such as Ariadi Nugroho et al. 1 and Vallary Singh et al. 2) have proposed approaches to compute the technical debt interest accumulated over time along with the principal; however, the current set of tools has not yet incorporated these approaches into their technical debt computation. The interest component – especially the added difficulty in comprehension – is difficult (if not impossible) to quantify accurately; the sketch after this list illustrates this point as well.
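To make the last two points concrete, here is a minimal sketch in Java of how a typical quantification tool arrives at its headline number. The class name, issue names, and effort constants are all hypothetical; no specific tool is reproduced here, only the general pattern of mapping each detected issue to a fixed remediation constant and summing those constants as the principal.

```java
import java.util.List;
import java.util.Map;

// Hypothetical sketch of how many quantification tools compute "technical
// debt"; the issue names and effort constants are invented for illustration.
public class NaiveDebtQuantifier {

    // Fixed remediation effort per issue type, in person-minutes. This is
    // "generalized absolutization": one constant per issue type, regardless
    // of severity, scope, platform, or team skills.
    private static final Map<String, Integer> REMEDIATION_MINUTES = Map.of(
            "LongMethod", 20,
            "DuplicatedCode", 30,
            "MultifacetedAbstraction", 120);

    // The reported debt is simply the sum of the constants: the principal
    // only. The interest that accrues while an issue stays in the code base
    // (harder comprehension, workarounds, new code piling on top) never
    // enters the computation. Issue types the tool does not model (e.g.,
    // architectural smells) silently contribute zero.
    static int totalDebtMinutes(List<String> detectedIssues) {
        return detectedIssues.stream()
                .mapToInt(issue -> REMEDIATION_MINUTES.getOrDefault(issue, 0))
                .sum();
    }

    public static void main(String[] args) {
        List<String> issues =
                List.of("LongMethod", "LongMethod", "MultifacetedAbstraction");
        System.out.println("Reported debt: " + totalDebtMinutes(issues)
                + " person-minutes");
        // Both "LongMethod" findings cost the same here, even if one sits in
        // dead code and the other in the hottest module of the product.
    }
}
```

Note how the two limitations compound: the fixed constants misstate the principal, and the interest never appears in the number at all.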

Let us discuss a couple of examples to illustrate why currently available technical debt quantification tools are incomplete and inaccurate:

  1. Example 1: Consider the case of a large enterprise software product that is in its maintenance phase. The original design of the software followed a strict layering style (i.e., a layer can access only the layer immediately below it). An architect who had recently joined the project identified hundreds of layering violations throughout the code base by using an architectural analysis tool. The smells included the “skip call” smell, wherein a layer directly accesses a lower layer (instead of going through the layer immediately below it), and the “back call” smell, wherein a layer calls a layer above it (which is prohibited in the layering style); both are illustrated in the sketch below. The team was already using a technical debt quantification tool as part of its quality-monitoring dashboard. However, that tool did not cover the architecture debt dimension, and hence the extent of technical debt it showed was misleading.
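For readers who have not met these architectural smells, the following minimal Java sketch (the layer classes are hypothetical) shows both violations in a three-layer design where Presentation may call only Service, and Service may call only DataAccess:

```java
// Hypothetical strict three-layer design: Presentation -> Service -> DataAccess.
// Each layer may call only the layer immediately below it.

class Presentation {
    private final Service service = new Service();
    private final DataAccess dataAccess = new DataAccess();

    void render(int id) {
        service.fetch(id);          // OK: the layer immediately below
        dataAccess.loadRecord(id);  // "skip call" smell: bypasses the Service layer
    }

    static void showError(String message) {
        System.err.println(message);
    }
}

class Service {
    private final DataAccess dataAccess = new DataAccess();

    String fetch(int id) {
        return dataAccess.loadRecord(id);  // OK: the layer immediately below
    }
}

class DataAccess {
    String loadRecord(int id) {
        if (id < 0) {
            // "back call" smell: a lower layer reaching up to a layer above it
            Presentation.showError("invalid id: " + id);
        }
        return "record-" + id;
    }
}
```

Both violating calls are perfectly legal Java, so a code-level analyzer finds nothing wrong with them; only a tool that knows the intended layering can flag them, which is why a quantification tool without architecture-debt support reports zero debt for all of these violations.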

  2. Example 2: Consider quantifying technical debt in an order processing system. Let us assume that complex tax-related calculations that are unrelated to order processing were added to an existing class named Order in one of the earlier releases, as a shortcut to meet the release deadline, with a plan to fix the design later. This is the “prudent and deliberate” 3 case of technical debt; a sketch of the resulting class follows.
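A minimal Java sketch of the situation described (the class shape, field names, and tax rules are invented for illustration):

```java
import java.math.BigDecimal;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

// Hypothetical sketch of the Order class after the deadline-driven shortcut:
// order processing and unrelated tax calculation live in one abstraction.
class Order {
    // Order-processing state.
    private final List<String> lineItems = new ArrayList<>();
    private BigDecimal total = BigDecimal.ZERO;

    // Tax-related state squeezed in "temporarily" to meet the release deadline.
    private final Map<String, BigDecimal> taxRateTable = Map.of(
            "EU", new BigDecimal("0.20"),
            "US", new BigDecimal("0.08"));

    // Core responsibility: order processing.
    void addLineItem(String item, BigDecimal price) {
        lineItems.add(item);
        total = total.add(price);
    }

    // Unrelated responsibility: complex tax calculations that have nothing
    // to do with order processing.
    BigDecimal computeSalesTax(BigDecimal amount, String region) {
        return amount.multiply(taxRateTable.getOrDefault(region, BigDecimal.ZERO));
    }

    BigDecimal computeCrossBorderTax(BigDecimal amount, String from, String to) {
        BigDecimal rate = taxRateTable.getOrDefault(to, BigDecimal.ZERO)
                .subtract(taxRateTable.getOrDefault(from, BigDecimal.ZERO));
        return amount.multiply(rate.max(BigDecimal.ZERO));
    }
}
```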

    Now, code analyzers don't generally detect this problem. Design analyzers are likely to detect it if the class becomes highly in-cohesive (because tax-related responsibilities that do not belong in the Order class were added to it). For instance, Designite 6 detects such a design smell as “Multifaceted Abstraction” 4; a toy version of the kind of cohesion check involved follows below. The important aspect to realize is that the identified technical debt issue will contribute to the total technical debt of the software only if the technical debt quantification tool you are using detects design smells directly or indirectly (with the help of other tools) and accounts for them as part of the technical debt.
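To see how a design analyzer might flag this, here is a toy LCOM-style cohesion check (this is not Designite's actual algorithm, just the general idea): treat two methods as related when they touch a common field, and report how many disconnected groups the methods of the sketched Order class fall into. Real tools extract the method-to-field map from the code; here it is hardcoded.

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.HashSet;
import java.util.Iterator;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.Set;

// Toy LCOM-style cohesion check over the Order sketch above. Methods that
// share no fields fall into disconnected groups, hinting that the class
// serves more than one responsibility.
public class CohesionCheck {
    public static void main(String[] args) {
        // Which fields each method of the sketched Order class touches.
        Map<String, Set<String>> fieldsUsedBy = new LinkedHashMap<>();
        fieldsUsedBy.put("addLineItem", Set.of("lineItems", "total"));
        fieldsUsedBy.put("computeSalesTax", Set.of("taxRateTable"));
        fieldsUsedBy.put("computeCrossBorderTax", Set.of("taxRateTable"));

        List<Set<String>> groupFields = new ArrayList<>();   // fields per group
        List<Set<String>> groupMethods = new ArrayList<>();  // methods per group

        for (Map.Entry<String, Set<String>> e : fieldsUsedBy.entrySet()) {
            Set<String> fields = new HashSet<>(e.getValue());
            Set<String> methods = new HashSet<>();
            methods.add(e.getKey());

            // Merge with every existing group that shares at least one field.
            Iterator<Set<String>> fit = groupFields.iterator();
            Iterator<Set<String>> mit = groupMethods.iterator();
            while (fit.hasNext()) {
                Set<String> gf = fit.next();
                Set<String> gm = mit.next();
                if (!Collections.disjoint(gf, fields)) {
                    fields.addAll(gf);
                    methods.addAll(gm);
                    fit.remove();
                    mit.remove();
                }
            }
            groupFields.add(fields);
            groupMethods.add(methods);
        }

        // Prints 2 disconnected groups: order processing vs. tax calculation,
        // the kind of signal behind a "Multifaceted Abstraction" finding.
        System.out.println(groupMethods.size() + " responsibility group(s): "
                + groupMethods);
    }
}
```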

    What is the severity of the smell, and how much effort will it take to refactor it to repay the technical debt? Both depend on many aspects: the size of the existing class, the size and complexity of the misplaced functionality (i.e., the tax-related calculations), the extent to which the rest of the code base uses the tax-related features provided by the Order class, whether a class that handles tax-related responsibilities already exists, the availability of unit tests for the Order class, and so on. Well, this doesn’t look simple at all! One plausible shape of the repayment is sketched below.
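For illustration only, here is one plausible repayment under the simplest possible assumptions (no existing tax class, all callers easy to update): an “extract class” refactoring that moves the tax responsibility out of Order. In a real code base, each factor listed above would complicate this.

```java
import java.math.BigDecimal;
import java.util.Map;

// One possible repayment of the principal: the tax responsibilities move to
// their own abstraction, and Order keeps only order processing. Every caller
// that used the tax methods on Order must now be found and updated -- one of
// the context-dependent costs a fixed remediation constant cannot capture.
class TaxCalculator {
    private final Map<String, BigDecimal> taxRateTable = Map.of(
            "EU", new BigDecimal("0.20"),
            "US", new BigDecimal("0.08"));

    BigDecimal computeSalesTax(BigDecimal amount, String region) {
        return amount.multiply(taxRateTable.getOrDefault(region, BigDecimal.ZERO));
    }

    BigDecimal computeCrossBorderTax(BigDecimal amount, String from, String to) {
        BigDecimal rate = taxRateTable.getOrDefault(to, BigDecimal.ZERO)
                .subtract(taxRateTable.getOrDefault(from, BigDecimal.ZERO));
        return amount.multiply(rate.max(BigDecimal.ZERO));
    }
}
```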

    The work item for refactoring the smell may sit in the backlog for many years. Meanwhile, the developers maintaining the software find it difficult to comprehend why tax-related functionality is in the Order class. They have to work around the problem, and they keep adding new code in and around the tax-related functionality, which makes it even harder to refactor in future versions. This is the interest component of the incurred technical debt (the principal being the original shortcut of adding tax-related functionality to the existing Order class). Technical debt quantification tools (all the current ones that we know of) ignore this interest part when quantifying technical debt. Even if a quantification tool were to consider the interest part, it is extremely difficult (if not impossible) to accurately quantify the added difficulty of comprehending and dealing with the unrelated tax-related functionality being part of the Order class.

Still not convinced? Try running different technical debt quantification tools on the same code base and cross-check their quantifications – you’ll be surprised how vastly different they are!

Important note: We do not want to offend any tool vendor, and hence we have consciously decided not to mention any technical debt quantification tools by name. We use technical debt quantification tools and like them, but we also want to caution readers about their limitations – especially, not to rely blindly on the numbers they generate.

Summary

Technical debt serves well as a metaphor for communicating the consequences of poor design and the need for continual refactoring. But when we try to measure and quantify technical debt, it becomes as difficult as precisely measuring software productivity or quantifying software quality!

Of course, tools quantify some aspects of technical debt, assign monetary values (or parameterize such quantification), help us understand the extent of technical debt, and provide us with a means to track our progress in repaying debt. However, the cost and effort estimates they provide should be taken with caution.

Since technical debt is about taking shortcuts and making sub-optimal design decisions, the debt incurred is inherently difficult to quantify precisely and completely (unlike financial debt, in which the amount of the debt and the interest are unambiguous to compute). Treat the effort and cost figures that technical debt quantification tools produce as estimates, and only estimates, not as absolute truth.

References

1 Ariadi Nugroho, Joost Visser, and Tobias Kuipers. 2011. An empirical model of technical debt and interest. In Proceedings of the 2nd Workshop on Managing Technical Debt (MTD '11). ACM, New York, NY, USA, 1-8. DOI=10.1145/1985362.1985364

2 Vallary Singh, Will Snipes, and Nicholas Kraft. 2014. A Framework for Estimating Interest on Technical Debt by Monitoring Developer Activity Related to Code Comprehension. In Sixth International Workshop on Managing Technical Debt (MTD 2014), held at the International Conference on Software Maintenance and Evolution (ICSME 2014).

3 Martin Fowler. “Technical Debt Quadrant”. (last accessed 26-May-2015)

4 Girish Suryanarayana, Ganesh Samarthyam, and Tushar Sharma. 2014. “Refactoring for Software Design Smells: Managing Technical Debt”. Morgan Kaufmann/Elsevier. ISBN 978-0128013977.

5 Alexandra Szynkarski. “Technical Debt Debate with Ward Cunningham & Capers Jones”. (last accessed 25-May-2015)

6 Designite – A Design Quality Assessment Tool. (last accessed 26-May-2015)

About the Authors

Tushar Sharma is currently a Technical Expert at the Research and Technology Center, Siemens Technology and Services Pvt. Ltd., Bangalore, India. His work at Siemens involves research, consulting, and training on topics related to software design, refactoring, design smells, code and design quality, design patterns, and change impact analysis. He actively pursues research interests in software design and refactoring, which have resulted in several patents, research papers, and tools. He has an MS (by research) degree in Computer Science from the Indian Institute of Technology-Madras (IIT-M), Chennai, India, where he specialized in design patterns and refactoring. He has co-authored two books: "Oracle Certified Professional Java SE 7 Programmer Exams 1Z0-804 and 1Z0-805", published by Apress in February 2013, and "Refactoring for Software Design Smells: Managing Technical Debt", published by Morgan Kaufmann in November 2014. He is an IEEE Senior Member. His Twitter handle is @Sharma__Tushar.

Ganesh Samarthyam has 12+ years of working experience in the IT industry. He is currently a corporate trainer and independent consultant based in Bangalore. He worked in the "Software Architecture and Development" team at Siemens (Corporate Research and Technologies, Bangalore) for 6+ years. Before Siemens, he worked in Hewlett-Packard’s C++ compiler team in Bangalore for 4.5 years, and served as a member of the ANSI/ISO C++ standardization committee (JTC1/SC22/WG21) from 2005 to 2007, representing HP. He is an IEEE Software Engineering Certified Instructor. He has authored or co-authored many articles, research papers, and books; his latest book is “Refactoring for Software Design Smells: Managing Technical Debt”, published by Morgan Kaufmann/Elsevier in November 2014. For more information, visit his website or his LinkedIn page. His Twitter handle is @GSamarthyam.
