Sometimes one’s reading and writing can throw up some fortunate coincidences. Take this morning, for example.
- Via TenLinksDaily, I followed a link to an AECcafe.com article by Susan Smith: The Road to AEC Project Execution Success.
- Later, my feed-reader pointed me to a blog posting (and associated PowerPoint presentation) by Stress-Free’s David Harrison: Making digital collaboration “more betterer”.
- I also got round to reading an AECbytes newsletter from 7 October in which John Tobin writes about atomicBIM.
- Last, but by no means least, I opened an email from Gustavo Lima of Cannon Design (following up on my 13 December 2007 post: Collaboration: standards, viewing, plus industry education).
All four strands, in different ways, argue that Building Information Modelling (BIM), on its own, will not solve some of the fundamental collaboration challenges involved in delivering projects in the built environment.
BIM and mitigating risk
In her article, Susan reviews findings from a Newforma-funded research project undertaken by Bruce Jenkins of Spar Point Research, entitled “Mitigating Risk in AEC Project Execution: Perspectives from Principals, Counsel and Insurers”, for which Jenkins interviewed 11 people. The risk-averse nature of the construction industry is clearly described:
“Fear of liability and strong aversion to risk exposure were cited as the primary cause of the industry’s declining productivity. Lack of communication between different parties involved in projects has been ensured by the compartmentalizing of responsibility. The maintenance of business processes and project execution processes that further isolation between the different parties has become the norm as it allows each party to shift accountability to others in the asset-creation value chain. This situation, in which fear is wagging the dog’s tail, so to speak, has characterized the industry and fragmented the value chain, which in turn causes more things to go wrong in the end.”
The study then suggests that greater exposure to liability could in fact reduce the likelihood and severity of problems on projects as, instead of avoiding liability and shifting risk, professionals look at controlling risk. Human aspects of project delivery were found to be critical to avoiding errors and omissions:
“a corporate approach and culture that valued “training and sensitizing” their employees to identifying risk and then staying on task with a process – rapid detection of a project going off track, and quick and targeted response to remedy the situation. In a nutshell – project execution.”
While tools such as BIM are said to address project execution issues, the study identified:
“uncertainty as to whether project execution can be addressed by digital technologies, while at the same time, respondents shared a strong sense of the need for these solutions and have definite ideas about what they should be able to accomplish”.
It is clear that there are people and process issues that need to be addressed as well as technology to support better collaboration (regular readers will know that this is a recurring theme in this blog – eg: Another good BIM viewpoint and People, processes and technology and the afore-mentioned 13 December 2007 post).
Interviewees suggested that BIM and other better automated tools will make it easier for AEC firms and clients to resolve their differences or avoid conflicts in the first place, highlighted untenable contracts as an issue, and criticised existing risk management practices (“finger-pointing [is] more attractive than problem-solving,” said one). One principal (Rich Nitzsche of Perkins+Will) said the single most important factor in reducing risk in AEC projects is:
“Transparent communications between all parties. A clear rendering of risk & reward, shared risk and benefit, and the elimination of adversarial roles is essential in achieving this. It sounds utopian, but it’s the sort of ideal that will embrace design assist, interdisciplinary coordination and other positive behaviors, one that will put a high premium on problem solving and none on blamecasting.” [my emphasis]
Lawyer Chris Noble felt that BIM and other automated tools had to be integrated with wider improvements:
“It appears that this will be a very helpful factor, especially if these technological developments are accompanied by more collaborative and productive project delivery structures and procedures.”
Digital architectural collaboration
In his excellent post, David Harrison says he recently gave a presentation at Victoria University on his PhD work on digital architectural collaboration. He says that “whilst BIM is an excellent productivity tool it does not address many of the industry’s collaboration issues – in fact in many respects it compounds them”, before describing BIM as a “significant yet problematic collaboration technology” (I love his slide “BIM is not a golden bullet”). For example, David references the 2007 McGraw Hill SmartMarket report (see my 27 October 2007 post Software incompatibility bar to interoperability) to highlight how BIM does not address industry issues about use and re-use of data, and, I dare say, he probably has the same people and process issues identified above in mind too.
Rather than delving into how we might address these problems, he outlines the need for an overriding set of digital collaboration principles, developing “Seven principles of the Project Information Cloud” (an allusion, perhaps, to current debates about ‘cloud computing’):
4. Modular design
5. Information awareness
6. Context sensitivity
7. Evolutionary semantics
Through application of these principles, David hopes we can evolve a collaborative vocabulary and establish Project Information Clouds within architectural projects: “These unbounded information clouds will link significant amounts of project data into intelligent, loosely joined, knowledge-bases.”
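For illustration only, here is one way David’s “loosely joined” cloud of project data might be modelled: small addressable pieces of data that link to one another by URI, possibly across many servers. Every name below is hypothetical, a sketch of the idea rather than anything drawn from his actual work.

```python
class CloudNode:
    """A small piece of project data that links to related pieces by URI."""
    def __init__(self, uri, data, links=None):
        self.uri = uri            # globally unique address of this piece
        self.data = data          # the payload (properties, geometry refs, etc.)
        self.links = links or []  # URIs of related nodes, possibly elsewhere

class ProjectInformationCloud:
    """A registry of nodes; in practice these could live on many servers."""
    def __init__(self):
        self._nodes = {}

    def publish(self, node):
        self._nodes[node.uri] = node

    def resolve(self, uri):
        return self._nodes.get(uri)

    def related(self, uri, depth=1):
        """Follow links outward from one node, gathering its neighbourhood."""
        seen, frontier = set(), [uri]
        for _ in range(depth):
            next_frontier = []
            for u in frontier:
                node = self.resolve(u)
                if node is None or u in seen:
                    continue
                seen.add(u)
                next_frontier.extend(node.links)
            frontier = next_frontier
        return [self._nodes[u] for u in seen]
```

A wall published at one address could then link to a fire-rating specification published at another, and a consumer would assemble only the neighbourhood of data it needs rather than downloading a monolithic model.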
In this context, I found myself thinking, once again, about BIMaaS (see my 26 August 2008 post AEC’s Software-plus-Services player to deliver BIMaaS? – which also links to a previous article by David) and CADaaS, and the utopian (? – that word again) notion of a building model being constructed from information held across an immense constellation of different hyperlinked sources.
This brings me neatly to atomicBIM….
This has nothing to do with the nuclear industry; rather it is a suggestion, in an article entitled atomicBIM: Splitting Data to Unleash BIM’s Power, from John Tobin (author of the Proto-building article I wrote about in May 2008) that instead of creating ever larger, difficult-to-access BIM models, we should be conceptualising and structuring the BIM environment for quick and easy access:
“We could imagine an arrangement where BIM is comprised of many tiny pieces of data. We can call this atomicBIM—that is, BIM in small, discrete pieces of data. An atomized information structure would provide granularity and rapid access so that subsets of BIM information could be more easily accessed without a massive download.”
For John, such an atomicBIM approach requires that atoms of the BIM remain intact as they are passed from application to application, and that there will be authoring software and an integrative setting:
“Authoring software will populate a BIM environment with atoms, or add discipline-specific information to those atoms. … ultimately BIM authoring is unlikely to be the sole preserve of architects, engineers and other building designers; it will expand to include property managers, financiers, estimators, suppliers, procurers—in fact, anyone whose day-to-day job deals with the built environment.”
On the integration environment, John says:
“It is likely that the BIM “model” most people will interact with will be a static repository of atoms rather than a live interactive design environment. Each participant’s interactive BIM authoring software will produce or add atoms of data, placing them into this static context; the BIM environment will be the “ether” which manages those atoms. … In an ideal arrangement, atoms will be able to be continuously “checked out” of the BIM repository, and authored anew with updated information.”
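John’s check-out/check-in description can be sketched in a few lines: a static repository of atoms, each carrying a version number so that a stale update (made against an atom someone else has since changed) is rejected rather than silently overwritten. This is a minimal sketch under my own assumptions; the class, method names and versioning scheme are hypothetical, not taken from any shipping BIM product.

```python
import copy
import itertools

class AtomRepository:
    """A static 'ether' of BIM atoms that authoring tools check out and in."""
    def __init__(self):
        self._atoms = {}            # atom_id -> (version, payload)
        self._ids = itertools.count(1)

    def add(self, payload):
        """An authoring tool populates the environment with a new atom."""
        atom_id = next(self._ids)
        self._atoms[atom_id] = (1, payload)
        return atom_id

    def check_out(self, atom_id):
        """Fetch one atom only -- no massive download of the whole model."""
        version, payload = self._atoms[atom_id]
        return version, copy.deepcopy(payload)

    def check_in(self, atom_id, base_version, payload):
        """Author the atom anew; reject updates based on a stale copy."""
        current_version, _ = self._atoms[atom_id]
        if base_version != current_version:
            raise ValueError("atom changed since check-out; re-fetch and merge")
        self._atoms[atom_id] = (current_version + 1, payload)
```

In this picture an architect might add a wall atom, an estimator check it out and attach cost data, and a facilities manager later enrich the same atom again, each touching only the pieces their discipline cares about.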
The foremost candidate to manage “BIM atoms” is, of course, the Industry Foundation Classes (IFC) standard promoted by buildingSMART (formerly the IAI). The IFC is no longer just a way of overcoming poor interoperability; it has become a powerful means of managing geometric and associated data in a uniform way. The integration environment (the “ether”) is also being developed: some approaches build on the IFC, aggregating different models, while ‘model server’ environments tend towards centralised databases (John mentions Oracle in this context) that can be accessed by multiple team members.
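As it happens, an IFC file exchanged in its usual STEP physical-file form (ISO 10303-21) is already a flat text file of individually numbered entity instances, which hints at the granularity John describes. The rough sketch below indexes those entities by type; it is not a real IFC parser (resolving references, property sets and geometry needs a proper toolkit), just an illustration of how atom-like the underlying format is.

```python
import re
from collections import defaultdict

# A STEP data line looks like:  #12=IFCWALL('GlobalId',#5,'Wall-001',...);
ENTITY_RE = re.compile(r"#(\d+)\s*=\s*(IFC\w+)\s*\((.*)\)\s*;\s*$")

def index_ifc(step_text):
    """Map each IFC entity type to the instance ids found in the file."""
    index = defaultdict(list)
    for line in step_text.splitlines():
        match = ENTITY_RE.match(line.strip())
        if match:
            entity_id, entity_type, _args = match.groups()
            index[entity_type].append(int(entity_id))
    return index
```

Given such an index, a thin client could request just the `IFCWALL` instances it needs, much as the atomicBIM argument envisages.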
Looking further ahead, John says atomicBIM will streamline BIM workflow:
- “Extracting slices of data and processing them in any numbers of authoring applications
- Enabling the use of thin-client devices for lean, efficient access to large datasets
- Easing interoperability and aggregation of data from multiple sources.”
However, he is under no illusion about the scale of the technical challenge. “What is needed is a reassessment of our vision for the eventual BIM model,” he says, before continuing:
“… transitioning to a granular form of BIM will be a wrenching, but important re-alignment for the evolution of BIM. … Though our current BIM solutions have served us well in the last decade, they may not be setting us up for future success. In particular, they have not created scalable, open or granular access to the information we create during design activities. The concept of atomicBIM will help us structure that information in a much more manageable way.” (my emphasis)
(Update (5 November 2008): I just came across Matt’s Dezignstuff blog posting about CAD in the Cloud. He, and his commenters, make the valid point about the scale of the processing power and bandwidth required to run what we currently understand as CAD applications, but then Matt discusses “distributed computing” – another concept that could be applied to the idea of atomicBIM.)
Interoperability between systems
Finally, in his email to me, Gustavo reiterated David’s theme of poor interoperability between systems, succinctly re-affirming why project team members still need to look at people and process issues (eg: contracts) rather than just technology. He writes:
“I’m still looking for the holy grail of collaboration between all these systems. Hardly a day goes by without me having to mediate in a project where one of the parties in the contract wants us to use their system and they can’t understand why we wouldn’t want to do it.”
While industry practitioners remain focused on working within existing approaches to technologies, structures and processes, we will not see a ‘great leap forward’ of the kind envisaged when the BIM concept first began to emerge. What I think is needed is a grand vision – perhaps a development of the principles proposed by David Harrison and John Tobin, maybe with a pinch of Web 2.0 thinking and a healthy measure of ideas from the sustainability movement – and the emergence of, in effect, a new Cluetrain Manifesto* for the AEC or built environment world (this manifesto idea also cropped up in discussions before and during the day at the recent Be2camp 2008 in London).
(* Cluetrain co-author David Weinberger also wrote Small Pieces Loosely Joined – echoed in David Harrison’s idea of ‘intelligent, loosely joined, knowledge-bases’. You can see a quick ClueTrain Review here from Michael Specht – thanks to Mel for the link.)