An interoperability code of practice for technologies in the built and managed environment is set to be launched on Monday 17 April in London.
Poor interoperability of information has been a perennial problem for professionals in architecture, engineering, construction and asset operation and maintenance for decades. When I wrote my book, “Construction Collaboration Technologies: An Extranet Evolution”, in 2005, I looked forward to a rosy collaborative future encompassing BIM, real-time collaboration, mobile technologies, and rich integration with back-office and other systems. There has been a lot of progress on the first three, but, sadly, much less on how information can be exchanged between systems.
Impacts of poor interoperability
At the time, I noted a 2004 report by the US National Institute of Standards and Technology which estimated that poor interoperability cost US businesses a massive $15.8bn a year – then between one and two per cent of the construction industry’s annual turnover. Extrapolating that figure to the UK, I calculated that poor integration was likely to be costing UK businesses between £0.8bn and £1.6bn per annum. The International Alliance for Interoperability (now buildingSMART International) was busy developing Industry Foundation Classes (IFC), and there was talk of vendor-neutral data standards, but my 2005 optimism was gradually eroded.
In an Extranet Evolution post in October 2007, for example, I noted a survey saying “interoperability costs add 3.1% to a typical project budget”. In 2008, I wrote about Autodesk and Bentley plans to improve interoperability between their proprietary design formats (Bentley and Autodesk target interoperability). But in 2009 US practitioners were suggesting efforts to improve interoperability were “falling apart” (USACE contract requirements expose BIM interoperability shortcomings). And UK efforts through the Network for Construction Collaboration Technology Providers (NCCTP) to create a data exchange standard between SaaS collaboration platforms largely failed (July 2009 post: Collaboration vendors unveil (old) plans for deeper interoperability).
Fast forward 10 years and the picture was no rosier. I talked frequently to industry practitioners who were frustrated that the technologies they used often prevented reliable exchanges of information with their clients or other collaborators. I mused about ‘connected data environments’ in January 2020, then talked to John Egan at BIMLauncher about data exchange standards to connect the various applications, platforms or technology ecosystems (Open CDEs and BIMLauncher). The issue of poor interoperability also featured prominently in 2020’s ‘Autodesk Open Letter’ conversations (see Design firms demand change at Autodesk, and More designer unrest about Autodesk).
Addressing poor interoperability
However, change was afoot. To start addressing some of the challenges hampering effective information management in the UK AECO sector, a BIM Interoperability Expert Group (BIEG) had been established. It gathered evidence and produced a March 2020 report that identified some practical enablers of interoperability. This report – and the ensuing evolution of the BIEG into the Government & Industry Interoperability Group (the GIIG, of which I was a member) – successfully stimulated considerable discussion of interoperability at the Digital Construction Week (DCW) trade show in London in November 2021 (post) and at DCW in May 2022.
The GIIG works with the public and private sectors to improve their ability to exchange and use information, ensuring that the information is independent of the technologies used to deliver it (a definition captured in the November 2021 GIIG Glossary – available here).
In 2022, the group conducted a survey into the impacts of poor interoperability. As well as reporting extensive experience of added time and cost and reduced quality of information due to poor interoperability, respondents provided anecdotal feedback:
- One organisation reported an increase in staff turnover: frustration over the amount of time spent manipulating files was cited in exit interviews.
- Another highlighted duplication of work, estimating it had spent around £1.24 million over seven years on repeat surveys and reproducing information.
- A third organisation had been commissioned simply to transfer information from one proprietary platform to another. As the respondent said: “There should be no need for us to do this.”
Delivering Valuable Data: an interoperability code of practice
Working with cross-industry stakeholders, the GIIG appointed a 16-strong working group, which I chaired, and which was tasked with drawing up a code of practice to establish and promote good interoperability in the technologies used in planning, design, construction and asset management. This code may be used to demonstrate whether technology products support interoperability.
In the Construction Playbook and the TIP Roadmap to 2030, the UK Government is calling for the sector to deliver better whole-life outcomes. To do this, we need to work more effectively, to share information between supply chain partners and deliver it to asset owners in forms they can use throughout the life cycle of their assets. Increasingly, clients are seeking to manage their information independently of the technology used to create it, but this is currently challenging to deliver. The Code of Practice tackles these challenges, setting out some core principles and providing supporting technical recommendations to software developers and to the wider industry.
“Delivering Valuable Data” is being launched on Monday 17 April 2023 at the Institution of Civil Engineers in London. Come along to learn about the Code of Practice, how it can support procurement to improve data delivery, and opportunities for its future development. Join policymakers, public and private asset owners, technology providers and industry supply chain professionals to learn about the benefits the code will deliver and how you can participate in its future. Sign up for the free launch (and a networking lunch) here.