Australia-based construction collaboration technology vendor Aconex says it is now “providing its online information management service to $100 billion worth of construction and engineering projects” (I’m not sure if this is US dollars or Australian dollars; in British pounds sterling, this would be around £50bn or £40bn respectively).
According to its news release: “Global uptake of the company’s web-based solution… doubled in 2006 for the third consecutive year”. Without giving any figures other than claiming it is currently “servicing clients in more than 40 countries”, Aconex claims it is the “world’s largest provider of web collaboration to the construction and engineering industries”.
As previously pointed out, other vendors have also claimed to be the biggest, or to have achieved market or even global leadership, including CTSpace (see De Kieviet’s Gambit – more CTSpace ‘spin’) and Autodesk.
I have been monitoring the construction collaboration market for some years, and there are few (if any) reliable metrics to support such claims, partly because just about every potential measurement has drawbacks. One fundamental issue relates to each vendor’s ability to collate accurate data on its system’s use: a vendor delivering its applications on an on-demand or Software-as-a-Service basis will find it easier to gather information than one providing enterprise applications which are then hosted by its customers.
Let’s take a look at some of the most often-quoted metrics:
- total capital value of projects – adding up project capital values for all schemes can give an impressive total value, but favours firms which have been around for a few years, or which have won roles on major schemes, particularly mega-projects in boom markets such as Dubai or China.
- total number of projects – can be skewed by schemes which include a large number of small individual projects (for example, a bank re-branding programme might be counted as one major project or regarded as, say, 500 separate sites).
- total number of users – can be distorted by including all users since the vendor’s first project some years ago. Such historic figures may include many people who no longer use the system; they may include people who were registered to use the system but never actually used it; they may include some users more than once – for example, if the individual moved to another company; or they may omit users who share a single login (a particular problem if the application is licensed on a per-seat basis).
- total number of companies – like the total number of users, this may be increased by including companies which no longer use the system, or which were registered but never used it.
- total number of customers – particularly in the early days, there was a tendency by some firms to regard all user firms as customers. Of course, if end-users have to pay per-seat to use the system on a project, each company could be regarded as a customer, but some vendors prefer to deliver the service on a per-project basis, dealing with the organisation commissioning the project (or that organisation’s representative) and placing no limits on the number of end-users in the project team.
- total number of logins, drawings or documents – again, care should be taken to understand what period is covered by the totals. A historic figure may sound impressive, but may not reflect more recent trends. (At BIW, to overcome such issues, we have started to focus on how many new users have registered in the past calendar year, and how many registered users have logged in during the same period; numbers of annual logins, or totals for documents or drawings published during the previous 12 months are also useful indicators of recent system use. See also Needed: a new NCCTP standard: how to account for extranet ‘traffic’ and Statistics, statistics.)
- number of countries of operation – does this mean countries where the vendor has offices, where actual projects are being carried out, or where clients or end-users are based?
- total number of staff – some businesses claim to be the biggest simply because they have most employees, but, particularly where vendors deliver solutions in other areas in addition to collaboration, care needs to be taken to focus on those involved in developing, delivering and supporting the vendor’s collaboration solution(s) – perhaps using full-time equivalents to allow for individuals whose role is split.
- annual revenue or turnover – like the previous metric, does the total relate purely to collaboration, or to all products and services? Also, depending on what licensing and accounting practices are applied, some vendors may include up-front fees paid for licences covering periods longer than a year, while others will only account for fees actually paid during the year in question. On-demand vendors may also have an additional metric in their statistical armoury: future order book, giving the value of services for which they have orders but which they have yet to invoice. And, of course, the long-term health of a vendor will depend on its ability to deliver its applications and/or services at a profit. (See also BIW’s growth continues – but what about the others?)
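The gap between historic totals and recent activity (the problem noted above for user numbers and logins) is easy to illustrate. The sketch below is purely hypothetical – the user names, timestamps and 12-month window are my own assumptions, not any vendor’s actual reporting method – but it shows why counting distinct users active in the past year gives a very different picture from an all-time registration total.

```python
from datetime import datetime, timedelta

# Hypothetical login records: (user_id, login_timestamp).
# In practice these would come from a vendor's own audit logs.
logins = [
    ("alice", datetime(2004, 3, 1)),
    ("alice", datetime(2006, 11, 20)),
    ("bob",   datetime(2005, 6, 15)),   # registered early, now inactive
    ("carol", datetime(2006, 9, 2)),
    ("carol", datetime(2006, 12, 30)),
]

reporting_date = datetime(2007, 1, 1)          # assumed reporting date
one_year_ago = reporting_date - timedelta(days=365)

# Historic headline figure: everyone who has ever logged in.
all_time_users = {user for user, _ in logins}

# More meaningful figure: distinct users active in the past 12 months.
active_users = {user for user, ts in logins if ts >= one_year_ago}

print(len(all_time_users))  # 3 – the impressive-sounding historic total
print(len(active_users))    # 2 – actual recent use
```

Even this tiny dataset shows the historic total overstating current use by 50%; on a system with years of accumulated registrations, the divergence would be far larger.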
In my view, financial performance is the most critical measure of a vendor’s success. We need to look behind the headline-grabbing claims and get some clear information (preferably detailed audited profit-and-loss accounts) to see if the financial picture is as rosy as the headlines suggest. Even the most impressive bank of statistics counts for nothing if the business is not making enough money to survive. We would also be able to make some meaningful comparisons if all the leading vendors began to release statistics compiled to consistent standards, and financial information collated to recognised accounting standards. However, I won’t be holding my breath waiting for it to happen.