Agile and standard metrics: the perfect complement to add strategic value


The software industry is associated with the most innovative methods and technologies, but it is also an artisanal craft. As Bill Gates once said, “software is a great combination between artistry and engineering.” From the outside, two software solutions with the same functionalities may look similar, but on the inside they can be quite different. One may perform better or be easier to maintain in the future, provide the same functionalities with less code, be less prone to error, or simply be built more cleverly.

Such aspects revolve around the technical knowledge of the teams who create solutions and the quality and clarity of requirements. But more than anything, they revolve around two magic words, which dictate not only how complex IT solutions are once they’ve been created, but also how IT projects are managed: “common sense”. It may sound simple, but this common sense is not always easy to achieve, especially when people make decisions or take actions that are not based on technical criteria, or when there are conflicts of interest, knowledge gaps, or vague and ill-defined requirements or concepts. What seemed like brilliant IT products (internal or external) become mediocre ones.

It is even more interesting when you add the word “excellence” to these two words; indeed, this is crucial: “common sense excellence”. Delivering the best quality and the best product for customers, ensuring that they gain the maximum possible value from an IT solution (external if you are an IT company, or internal if you are in the IT department), and offering this at the best possible price are strategic objectives that must be made specific and measurable (e.g. by using SMART goals: specific, measurable, achievable, relevant and time-bound).

It is essential to keep in mind that information technology must deliver the best possible business value and competitive differentiators for companies. IT projects are just temporary processes that help create, maintain or enhance products. At the same time, IT products can provide a business with strategic value, or simply help it to complete processes more quickly or at lower cost. So it is imperative to measure and monitor this value, and to have clear and concrete targets.

Agile: the dawning of a new age in project management, started at a ski resort

In February 2001, seventeen software development thinkers met up in the Snowbird ski resort in Utah, USA, and created the “Agile Manifesto” for software development. By the end of that year, the Agile Alliance had been set up as a non-profit organisation with the aim of promoting agile software development based on the so-called Agile Manifesto.

This Agile Manifesto laid down principles, or commandments. In essence, it is common sense and excellence in one, although for some in the world of project management one or two things may be missing. The manifesto shifts thinking away from traditional waterfall concepts towards iterative methods. Now famous, it includes twelve high-level guidelines that are essential for:

  • providing value for the customer (“our highest priority is to satisfy the customer through early and continuous delivery of valuable software”)
  • underscoring the importance of correctly managing requirements and their changes, so that the product matches real customer needs (“welcome changing requirements, even late in development”; “deliver working software frequently”)
  • working correctly (“business people and developers must work together daily throughout the project”; “build projects around motivated individuals”)
  • gaining commitment on all levels (“the sponsors, developers and users should be able to maintain a constant pace indefinitely”)
  • highlighting the importance of “continuous attention to technical excellence and good design” and “simplicity, the art of maximising the amount of work not done”

These words may be nothing new from a “common sense excellence” perspective, but sometimes it is important to remind ourselves of the crucial things in life in order to keep focusing on key issues.

The “CHAOS Report”, issued by the Standish Group, is a fascinating study produced annually for more than twenty years based on the analysis of thousands of IT projects around the world, from small projects to extremely big ones. The main focus of this periodical report is to analyse the reasons why projects succeed and fail.

We can safely say that history tends to repeat itself. In the most recent report, the three major reasons for project success are: user involvement, executive sponsorship, and emotional maturity (team behaviours, skills, etc.). Turning back the clock two decades, the reasons in the nineties were: user involvement, executive management support and a clear statement of requirements. Looking at things from the opposite angle, the top factors resulting in project cancellation were: incomplete requirements, lack of user involvement and a lack of resources. The actual technology involved in a project is entirely different from twenty years ago (apart from the mainframe world), but the main causes of the success or failure of a project are almost the same.

One clear (but not new) finding from this report is that the ratio of IT projects cancelled, completed but over-budget, delayed or with fewer functionalities than expected is much higher in very large or large IT projects than in small ones. The success of an IT project is inversely proportional to project size. Small projects are typically described as a success, whereas extremely large projects are more likely to be described as “cancelled”. We could say that there are clear overlaps between the Agile Manifesto and the CHAOS Report, linked to the “divide and rule” principle of agile methods: break projects down into smaller parts based on user functionalities (user stories), prioritise them and deliver them quickly and regularly (iterations). Another finding in recent years is that the percentage of successful projects emerging from agile methods is much higher than with waterfall projects.

There can be no doubt that the agile concept has resulted in the dawning of a new age in project management, and it has changed how IT projects are managed.

Missing golden metrics

It is extremely important to make a distinction between the concepts of “projects” and “products” and to manage and measure each of these concepts appropriately. This is a simple idea, but this clear distinction is not always made, not just in small companies, but also in big ones. In fact, it is not uncommon to see that the IT “product” (that is being created or enhanced) is not even measured. Instead, metrics focus solely on how the project is going, or sometimes even on how a contract is going. The point is that similar software solutions, products or enhancements can be achieved under different types of contracts, or with waterfall or agile methods (in small iterations, where customers can see and check results quickly). Nonetheless, whether waterfall or agile methods are used, whether work is based on a fixed-price contract or a time & material approach, whether it is one big project or divided into dozens of small projects, ultimately the final output is a software product. This software product must deliver value to the company at a tangible cost (regardless of methodology, contractual pricing, the number of contracts, or even the number of projects).

So if it is essential to distinguish between a product and a project, and to emphasise that these are two completely different things, then it is no less important to distinguish between effort (time invested) and size (functionalities provided to the customer). Nobody (not even people less familiar with such topics) should fall into the trap of assuming that investing more effort in a project means that the product created is greater in scope, or vice versa. Maybe this will be the case, but maybe not. Whatever the nature of a project, I have never witnessed a project that does not monitor or control effort or costs, but I have seen far too many projects where size is overlooked.

Story points are the most widely used method to estimate the effort required to develop a story. They address the volume of work to be carried out, its complexity, and the risk or uncertainty of a task. But it is important to bear in mind that, as such, they are a somewhat arbitrary measure. It is safe to say that even within the same company, two different teams may arrive at two different numbers of story points for the same work, because for one team it may be an easy task whereas for the other it may be extremely complex. Even so, this metric can be helpful for estimating, planning and tracking stories or for future forecasting, albeit only from the perspective of the individual story.
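This team-relative use of story points can be illustrated with a minimal sketch. The sprint figures below are purely hypothetical; the point is that the resulting velocity and forecast are only meaningful within one team's own history, never across teams or companies.

```python
import math

# Hypothetical sprint data: story points completed per sprint by ONE team.
# Story points are team-relative, so these numbers carry no meaning
# outside this team's own history.
completed_points = [21, 18, 24, 20]

# Velocity: average points delivered per sprint, used for forecasting.
velocity = sum(completed_points) / len(completed_points)  # 83 / 4 = 20.75

# Forecast: how many sprints a 95-point backlog would take for this team.
backlog_points = 95
sprints_needed = math.ceil(backlog_points / velocity)  # ceil(4.58) = 5
```

Another team might score the same backlog at 150 points and report a different velocity, yet deliver identical functionality in the same time, which is exactly why these numbers cannot serve as a standard product metric.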

Measured against the “thirteen software metrics” criteria developed by Capers Jones, story points fall short on many counts: they are not standardised (every story is different), they are highly ambiguous, they lack adequate published data, they lack tools for new and legacy projects, they have no conversion rules to related metrics, they cannot deal with all deliverables, they do not support all kinds of software, and they do not support reusable artefacts. In the end, the most typical agile metrics fit only four of the thirteen criteria.

Agile metrics, and even traditional project metrics, create new possibilities when combined with a product concept: measuring the product with standard methods and using those measurements as the basis for most strategic metrics. If this is not done, all subsequent metrics will provide arbitrary, ad hoc numbers. In fact, one and the same IT product, irrespective of whether it was based on an agile or waterfall approach, will have the same size. The methods used to manage a project (or contract) are simply mechanisms, applied in order to create products or enhance existing ones. Even within the same framework, standard international metrics make it possible to compare products created using agile methods with products created using different methods or under different contract types.

The perfect coexistence

Something that is complementary to something helps make things complete or perfect. Based on recognised ISO standards, the metric “application size” is an ideal and indeed necessary complement to agile metrics. It represents a magic combination of product metrics and project management metrics. The customer receives an IT product, not a project. When we ourselves are customers buying products in a supermarket, we consider what a product will be like (or we already know, if we are using it), and we assess the quantity we will receive, the quality, the price and other features such as brand reputation. As a rule, however, we are not in the slightest bit interested in how the company makes this product or manages its internal projects (a black-box process we know nothing about anyway), or in its processes, tools or the machines used to manufacture the product. As customers, we basically place emphasis on the product we receive and its price. We may even go further and compare the price per kilo or litre as a yardstick for assessing products based on size. The same generally applies when we buy something tangible like standard software. So why, as users or customers, would we place so much emphasis on how an IT product is developed, or think about things from a project management perspective, and not measure the product itself?
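The price-per-kilo analogy translates directly to software once a product has a measured size. The sketch below assumes purely illustrative costs and sizes (in function points, though any ISO functional size unit would work) for two hypothetical products built under different methods:

```python
# Hypothetical comparison of two delivered products by "price per unit
# of size", analogous to price per kilo in a supermarket. All figures
# are illustrative; sizes are in function points.
products = {
    "product_a": {"cost": 480_000, "size_fp": 1_200},  # built with agile
    "product_b": {"cost": 350_000, "size_fp": 700},    # built with waterfall
}

# Cost per function point: a comparable figure regardless of how each
# product was built, contracted, or split into projects.
cost_per_fp = {
    name: p["cost"] / p["size_fp"] for name, p in products.items()
}
# product_a comes to 400.0 per FP, product_b to 500.0 per FP.
```

With such a figure in hand, two products can be compared internally, against suppliers, or against external benchmarks, without ever mentioning the words agile or waterfall.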

The standard approach to measuring size is IFPUG FSM (functional size measurement), a recognised ISO/IEC method that is considered the most widely used internationally and has spawned other standards. It is worth distinguishing between the internationally acknowledged ISO/IEC methods and other, non-standard methods created by private companies. The ISO/IEC methods are IFPUG (International Function Point Users Group), FiSMA (Finnish Software Measurement Association), Nesma (Nederlandse Software Metrieken Associatie), COSMIC (Common Software Measurement International Consortium) and Mk II (UK Software Metrics Association); all are sponsored by non-profit organisations which, despite being potential competitors or rivals, are closely linked and engage in cooperation. We can say that the more a method is applied and the more universal it becomes, the more useful it is as a point of reference for comparison and benchmarking.
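To make functional size less abstract, here is a simplified sketch of an unadjusted IFPUG-style count. The complexity weights are the standard published IFPUG weights; the component counts for the example application are purely illustrative, and a real count requires a trained analyst classifying each elementary process and logical file.

```python
# Standard IFPUG complexity weights per component type.
WEIGHTS = {
    "EI":  {"low": 3, "avg": 4,  "high": 6},   # external inputs
    "EO":  {"low": 4, "avg": 5,  "high": 7},   # external outputs
    "EQ":  {"low": 3, "avg": 4,  "high": 6},   # external inquiries
    "ILF": {"low": 7, "avg": 10, "high": 15},  # internal logical files
    "EIF": {"low": 5, "avg": 7,  "high": 10},  # external interface files
}

def unadjusted_fp(counts):
    """Sum weighted components: {type: {complexity: occurrences}}."""
    return sum(
        n * WEIGHTS[comp][cx]
        for comp, by_cx in counts.items()
        for cx, n in by_cx.items()
    )

# Illustrative (invented) profile of a small application.
example = {
    "EI":  {"low": 5, "avg": 3},
    "EO":  {"avg": 4},
    "EQ":  {"low": 2},
    "ILF": {"avg": 2},
    "EIF": {"low": 1},
}
size = unadjusted_fp(example)  # 15 + 12 + 20 + 6 + 20 + 5 = 78 FP
```

Unlike story points, the same application counted by two trained analysts should arrive at very nearly the same size, which is what makes the metric usable for benchmarking.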

By using such approaches to product metrics, initiatives move beyond the isolated project level and it becomes possible to measure the product itself and make meaningful comparisons, internally or externally, even against international benchmarks (without needing to mention words such as agile, waterfall, incremental, etc.). Perhaps the approaches used in waterfall projects will follow the example of agile methods, so that different projects or stories start to be looked at as if they were a product. By combining project and product information, we can uncover fascinating insights and potentially exciting conclusions. We would reach a point where, from a strategic, product-owner or C-level executive point of view, the word “product” becomes much more important than the word “project”. What actual product will I receive and what will it cost?

There may be a perception that agile project metrics work at a different speed or pursue a different purpose; that they are completely different from traditional concepts, estimation processes and measurements. One might also wonder whether there is sometimes a conflict, whether overt or less obvious, between conventional IT project management approaches and new methods. It is clear that there are different camps of thought, and some people are willing to defend different positions and specialities, sometimes for technical reasons or based on common sense, but perhaps also simply for personal reasons or motivated by what a group or team has to gain.

What is essential is that the output of work is measured properly. As highlighted, irrespective of the approach taken for a project (or projects), the methods used to arrive at a solution, or how the project was managed, the output must be measured in terms of quality, productivity and price. Agile methods must not be seen as an ecosystem in their own right, but rather as an interesting tool that enables us to build products and apply the Agile Manifesto more effectively, which is tantamount to applying common sense in managing projects.