Integrated ALM Tools Are Fundamental to Success
Modern business processes are increasingly integrated with each other: delivery, manufacturing, and accounting work together to both deliver increased business value and reduce costs to the customer. Over the last 40 years, automation has enabled business processes to flow, integrating specialist tools and practices into a holistic value chain. Customers expect to give their information once, then assume your help desk knows everything about them and their product, and that marketing will be direct and personal. However, the processes used to create and maintain those automated systems are far more fragmented and disconnected than the systems they support. The typical software delivery project captures requirements numerous times, describes tests in multiple places, loses track of what is in a particular build, and often requires a large amount of analysis to know who is doing what and why. Application Lifecycle Management (ALM), the business process of application delivery, is much less mature than the business processes it supports. Software has moved from ancillary to mission critical, requiring software companies to deliver a higher-quality product quickly and directly to the customer. This means software delivery organizations and teams must start thinking about how best to integrate their delivery disciplines, creating a holistic, integrated ALM approach.
Integration is hard to value, but without it your projects will fail
How many times have you sat in a meeting where everyone is confused while talking about similar, yet different, things? Or a meeting whose first 30 minutes are spent working out what the meeting is about? This problem grows when the meeting comprises people from different departments, geographies, or organizations. The reality of modern software delivery is that teams are made up of people from a variety of places: development, the PMO, the business, and outsourced testing organizations. Each of these groups leverages a unique set of tools and employs its own practices and process methods, all optimized for that group rather than for the end-to-end process. Collaboration across groups always seems to be the responsibility of the group currently running that stage of the project. For example, when the project is mostly run by development, they take responsibility for sharing the stories and tasks they are working on with the group. When testing is the focus, defects and issues are the currency used for discussion. Because one team is driving a phase, they tend to ignore all prior work and often exclude content from other groups, leaving ad-hoc processes, spreadsheets, and wikis to fill in the gaps. The result: unproductive, frustrating meetings; but more importantly, issues that should be resolved are often missed, leading to poor software, increased defects, and costly late projects. Getting everyone "integrated" is undermined by:
- Ownership – It is easy to define who is responsible for improving one discipline, but who is responsible for the interaction between two disciplines? For example, who would be responsible for improving the relationship between development and testing? Without clear ownership, improvement is hard.
- Geographical, organizational, and political boundaries – The reality of any large organization is organizational structures evolve and are supported by both managerial and political boundaries. Breaking down these barriers is often very difficult when you are pursuing something as intangible as collaboration.
- Measurement – Software delivery has a history of poor measurement, but even the limited measurement used often focuses on one discipline such as testing, development, or planning. Integration by its very nature is hard to measure, and without a clear measure it is hard to focus or improve.
- Inertia – Change is never easy—it is always uncertain, and most workers are biased against it due to negative experiences of ineffective changes coming from management.
Governance and compliance rely on an integrated view
For organizations of any size, being aware at all times of what happened, to whom, when, and who was involved is crucial for auditability and compliance. As work assignments flow through departments, software products, processes, and people, it becomes increasingly difficult to reconcile a change to an action or set of actions. For example, a security requirement on the system manifests itself in code, tests, work items, defects, working software, and operational tickets. If that requirement changes during the process, tracing when and why the change happened and what its impact was is often difficult. If the change were localized to the code alone, it could cause compliance issues. Even organizations that are not required to comply with a particular government mandate should have a clear measure of impact, traceability, and process control to manage their processes. Traceability is undermined by:
- Disconnect between the tools storing the information. Each tool stores an artifact, often providing clear mechanisms around history and version management. As work moves through the different departments, the relationship between artifacts becomes disconnected.
- Managing traceability is a huge overhead expense. The time it takes to document a relationship between two items is not too onerous, but keeping these matrices and links updated is difficult.
- The traceability meta-model. When working on a project and trying to deliver software on a deadline, building documentation seems like it should be low on the priority list. Cross-team and cross-artifact documentation is even harder to justify. By not taking sufficient time to understand the relationships between the work products and documentation elements, it is difficult to provide traceability, let alone know if you need it at all. Each project is different, and with the addition of Agile methods, the project may follow its own process determined by the problem space and team.
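The traceability meta-model described above can be made concrete with a small sketch. The artifact kinds, identifiers, and link structure below are hypothetical illustrations, not a standard ALM schema; the point is that once relationships are recorded explicitly, impact analysis becomes a simple graph traversal instead of a manual reconciliation exercise.

```python
from dataclasses import dataclass, field

# Hypothetical minimal traceability model: every artifact (requirement,
# test, defect, build, ...) is a node with links to related artifacts.
@dataclass
class Artifact:
    kind: str
    ident: str
    links: list = field(default_factory=list)

def impact(artifact):
    """Return the identifiers of every artifact transitively linked to
    the given one, i.e. everything a change must be reconciled against."""
    seen = {artifact.ident}
    stack = [artifact]
    while stack:
        for other in stack.pop().links:
            if other.ident not in seen:
                seen.add(other.ident)
                stack.append(other)
    return seen - {artifact.ident}

# Example: a security requirement covered by a test, which found a defect.
req = Artifact("requirement", "REQ-17")
test = Artifact("test", "TST-42", links=[req])
defect = Artifact("defect", "DEF-9", links=[test])
req.links.append(test)
test.links.append(defect)

print(sorted(impact(req)))  # artifacts affected by a change to REQ-17
```

Keeping such a model current is exactly the overhead the bullets above describe; the argument of this article is that the links should be captured by integrated tooling rather than by hand.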
It All Comes Down to the Numbers
An often-cited IDC research study states that searching for and not finding information costs a company on average $3,300 per employee per year. This is of course greatly affected by the type of information and the employee's loaded cost: a C-level executive trying to find a key piece of data for a report is much more costly than a temp trying to locate an email. The IDC study focuses on finding the right information and its effect on productivity in general, so it also ignores the powerful added value of collaboration between two groups that traditionally did not see each other's information. For example, by connecting the developer and the tester, existing requirements can be re-evaluated, presenting a potential opportunity for a smarter solution. Daniel Moody and Peter Walsh discuss the increased value of shared information in their study "Measuring the Value of Information: An Asset Valuation Approach". They argue that, in general, sharing of information tends to multiply its value: the more people who use it, the more economic benefit can be extracted from it. The IDC study also ignores the legal and compliance requirements that information carries in many organizations. For example, a system change that enables security fraud but was not correctly documented could lead to litigation and heavy fines.
In general, the costs of unintegrated data can be grouped into several categories:
- Data access to the right information – Instead of relying on the IDC number, it is possible to calculate the impact of not having the right information in other ways. For instance, sampling a team over a period of time and studying how long it takes them to find the correct requirement, defect, code element, build, and release information would accomplish this.
- Aggregating information – To expose the multiplier, review the amount of time spent building spreadsheets or emails that aggregate information such as defect lists, requirement status, or task lists.
- The 'Ah-Ha' value – This is much harder to quantify, but can be approached by reviewing integrated projects and identifying particular changes or smarter decisions that integration made possible.
- Traceable information – If you have X information, what is related to it? For example, if you have the defect, you should be able to find the requirements and build that describe the cause and effect. The lack of traceability information can be thought of as a compliance cost: if the information is not there and anything goes wrong, the organization cannot provide evidence of compliance and is liable.
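The first two categories lend themselves to a back-of-the-envelope calculation along the lines of the sampling approach suggested above. The sketch below is a hypothetical model, not the IDC methodology; the minutes lost, loaded rate, and team size are placeholder inputs you would replace with your own measurements.

```python
# Hypothetical cost model: time spent hunting for requirements, defects,
# builds, and release information, scaled to a team for a working year.
def annual_search_cost(minutes_lost_per_day, loaded_hourly_rate,
                       team_size, working_days=230):
    """Yearly cost, in currency units, of unintegrated information access."""
    hours_per_year = minutes_lost_per_day / 60 * working_days
    return hours_per_year * loaded_hourly_rate * team_size

# Example inputs: 25 minutes/day searching, $70/hour loaded cost, 10 people.
cost = annual_search_cost(25, 70, 10)
print(f"${cost:,.0f} per year")
```

Even with modest assumptions like these, the per-employee figure lands in the same order of magnitude as the IDC average, which is why sampling your own team is usually persuasive enough.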
Why The Time is Now for an Integrated View
It is clear that software projects create large amounts of information. Each discipline has its own process and tools creating information. Add in composite applications, where each application is itself consumed by other applications, and the value and complexity of information only grows. The inherent cost of not having the right integrated information is huge, not only in terms of lost productivity, but also in terms of potential litigation for safety-critical systems that could result in physical harm. To better manage your integrated application information, application development professionals should:
- Build a good understanding of the information – Creating an information model of application development information seems like a waste of time to many people. However, a better understanding of the key artifacts and their relationships over time is crucial for building the ALM processes that implement that model. As software delivery becomes more important to a business, the number of management reports grows. Reporting can help drive the definition of the information requirements for ALM, and provides a great set of requirements for any such endeavor.
- Look to software delivery tools to capture that key information – Tools provide support for the key disciplines of development but rarely support the end-to-end business process. The material described in the information model needs to be supported, which requires a clear understanding of what is captured in those tools and extending the data as necessary.
- Automate integration – Spreadsheets, emails, and whiteboards are great ways of communicating how one concept affects another. Without a clear automation approach to integration, it is difficult to maintain or manage these relationships. Integrations should be acquired or built between the tools as defined by the information model.
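The third recommendation, automating integration, can be sketched in miniature. The two "stores" below are stand-ins for real tool APIs (for example, a defect tracker and a test management tool); the record shape and the one-way sync rule are illustrative assumptions, not how any particular product works.

```python
# Hypothetical one-way synchronization between two tool data stores,
# each modeled here as a list of dict records keyed by "id".
def sync(source, target, key="id"):
    """Copy records that exist only in source, and reconcile records
    whose fields have drifted, so target mirrors source."""
    index = {rec[key]: rec for rec in target}
    for rec in source:
        if rec[key] not in index:
            target.append(dict(rec))        # create the missing record
        elif index[rec[key]] != rec:
            index[rec[key]].update(rec)     # reconcile changed fields

tracker = [{"id": "DEF-1", "status": "open"},
           {"id": "DEF-2", "status": "closed"}]
test_tool = [{"id": "DEF-1", "status": "new"}]

sync(tracker, test_tool)
print(test_tool)  # test_tool now mirrors the tracker's records
```

A production integration would of course handle conflicts, deletions, and authentication against real APIs, but the core idea is the same: the relationships defined in the information model drive which records flow between which tools.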
The majority of key business processes within companies have been enhanced and automated. This optimization has led to corporate data models, warehouses, integration technology, and middleware platforms. Software delivery is an increasingly vital business process, but compared to traditional processes such as shipping, customer relationship management, and accounting, its associated infrastructure and process discipline are much less mature. It is not hard to justify integrating customer information with accounting, but connecting requirements to testing has always been much harder to justify, instead relying on manual processes and the tacit knowledge of the team. As software becomes more important and systems become composite, the productivity and quality of those applications is reduced by not having integrated information, and direct business value is subsequently reduced. The time is right for Integrated Application Lifecycle Management, and it is crucial for long term application, project, and business success.
About the Author
Dave West is the Chief Product Officer at Tasktop. In this capacity, he engages with customers and partners to drive Tasktop's product roadmap and market positioning. As a member of the company's executive management team, he is also instrumental in building Tasktop into a transformative business that is driving major improvements in the software industry. As one of the foremost industry experts on software development and deployment, West has helped advance many modern software development processes, including the Unified Process and Agile methods. He is a frequent keynote speaker at major industry conferences and a widely published author of articles and research reports, along with his acclaimed book, Head First Object-Oriented Analysis and Design, which helped define new software modeling and application development processes. He led the development of the Rational Unified Process (RUP) for IBM/Rational. After IBM/Rational, West returned to consulting and managed Ivar Jacobson Consulting for North America. For four years he served as vice president and research director at Forrester Research, where he worked with leading IT organizations and solutions providers to define, drive, and advance Agile-based methodology and tool breakthroughs in the enterprise.