Big Data Drives the Production System

Say “production system” to a builder and they probably think “prefabrication”. Say it to a systems thinker and it’ll probably mean much more: rapid prototyping, comparing and optimizing several whole-system solutions – not just a wall system, but the whole building – starting in the planning stages.

McKinsey & Company’s report, Reinventing Construction Through A Productivity Revolution, gives seven solutions for construction’s decades-old problems. McKinsey rightly claims that the biggest impact on productivity comes from thinking about construction as a production system. (1)

Yet, confining the production system to off-site and on-site productivity improvements significantly limits its potential. We need a comprehensive production system for the whole project, from beginning to end. And we need a systems approach to big data to pull it off. Then we can finally reverse decades of declining productivity, along with its excess waste, time and costs. (2)

The Toyota Production System (TPS) models the systems approach for us. Early in product development, Toyota’s set-based concurrent engineering (SBCE) includes prototyping of multiple design and manufacturing concepts – concurrently. Toyota optimizes the whole car as a complete system of attributes that align with customer interests: fuel efficiency, ride comfort, noise, first costs, service and maintenance costs, resale value, as well as styling.

Likewise, we can now analyze the building as a complete system, instead of a disconnected assembly of component parts. This allows us to optimize the whole building according to its purpose, and the customer’s needs. A systems approach focuses the entire building team on attributes like functionality, operability, performance, quality, durability, long-term value, as well as aesthetics and cost-effectiveness. Measuring results in terms like facility cost per “exam room” or “patient visit” directly ties the facility to the customer’s “business case”.

Currently, building production is too far removed from the purpose of the building, and the customer’s needs. Complexity and fragmentation frustrate all attempts to integrate the two. A big data system changes all of this. It networks the facility program, design, procurement, production, and performance into a system centered around the facility’s purpose and functions.

Gartner defines big data as “high volume, high velocity, and/or high variety information assets that require new forms of processing to enable enhanced decision making, insight discovery and process optimization.”

Objective knowledge, acquired through big data analysis, drives a comprehensive production system. Cost-effective, high-performance buildings result.

Our earlier article, A Big Data Breakthrough in Construction, demystified big data analysis. We learned that a tight system, or nucleus, of critical data dramatically reduces a facility to fewer than 200 data points, bringing vital structure to the otherwise unmanageable array (hundreds of thousands) of data points. Detailed data points, previously scattered throughout a variety of disassociated applications, spreadsheets, processes and so on, roll up into this nucleus of critical data. Powerful knowledge and insight into the production system follow.
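As a rough illustration of this roll-up, here is a minimal sketch in Python. The groupings and field names are illustrative assumptions, not the actual critical-data schema.

```python
# A minimal sketch of the "nucleus of critical data" roll-up described above.
# The keys and sample values are illustrative assumptions.
from collections import defaultdict

# Detailed data points as they might sit scattered across estimates,
# schedules, and spreadsheets: (critical_data_point, value)
detailed_points = [
    ("cost.foundations", 412_000),
    ("cost.foundations", 88_500),      # e.g. a separate sitework estimate line
    ("cost.exterior_wall", 1_240_000),
    ("program.exam_rooms", 12),
    ("program.exam_rooms", 4),         # an addition from a later revision
]

# Roll the thousands of detailed points up into the small nucleus.
nucleus = defaultdict(float)
for key, value in detailed_points:
    nucleus[key] += value

for key, total in sorted(nucleus.items()):
    print(f"{key}: {total:,.0f}")
```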

Figure 1 shows how the data system works outward from the nucleus of critical data to the detailed data regions. Buildings become “known” as a complete, integrated and virtual system of critical information.


The building as a system is centered on its purpose (e.g. hospital). Departments (e.g. surgery, patient beds, radiology) and functions within each department (e.g. CT scanning, ultrasound, exam) further define the all-important purpose of the facility. Constraints (location, owner type, shell type, etc.) and standards (quality, durability, energy, etc.) make up the other critical attributes. Given the functions, constraints and standards, big data enables prediction and value analysis of the facility’s program, design parameters, schedule and costs.
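A minimal sketch of that facility-as-a-system structure, in Python. The classes and sample values are illustrative assumptions, not a published schema.

```python
# A sketch of the building-as-a-system data model described above.
# Field names and sample values are assumptions for illustration.
from dataclasses import dataclass, field

@dataclass
class Function:
    name: str          # e.g. "CT scanning", "exam"
    quantity: int      # how many units the facility serves

@dataclass
class Department:
    name: str                                   # e.g. "radiology"
    functions: list[Function] = field(default_factory=list)

@dataclass
class Facility:
    purpose: str                                # the aim of the system, e.g. "hospital"
    departments: list[Department] = field(default_factory=list)
    constraints: dict[str, str] = field(default_factory=dict)  # location, shell type...
    standards: dict[str, str] = field(default_factory=dict)    # quality, energy...

clinic = Facility(
    purpose="medical center",
    departments=[Department("radiology",
                            [Function("CT scanning", 2), Function("exam", 14)])],
    constraints={"location": "Columbus, OH", "shell_type": "steel frame"},
    standards={"quality": "institutional", "energy": "ASHRAE 90.1"},
)
```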

A systems approach to big data, when centered on the facility’s purpose and functions, enables a comprehensive production system.

Experiencing Big Data Analysis

A new comprehensive (data-driven) production system revolutionizes the construction process. It does this by measuring the effectiveness of our program, design, and production decisions and actions against the building’s purpose and functions. (3) Finally, we have the means to objectively measure, and therefore improve, the building process from beginning to end.

Figures 2 and 3 display the comparative data results from a series of completed, real-world, healthcare projects. Each facility was produced by a different combination of architect and builder. The Figure 2 scatter chart shows the relative Program and Plan Efficiency (vertical axis) compared to Specification and Production Efficiency. (4) The lower-left quadrant represents the most efficiently designed and most cost-effective projects.


The data points displayed as outlined diamonds represent the total building results. The dots represent the building elements (foundations, exterior wall, etc.). The wide range of variation displayed in this chart indicates the enormous waste that plagues the building process.

See, for example, the two projects represented by the orange and gold diamonds at the top of the chart. The program and plan for these projects (vertical axis) are over 20% higher per function (exam, patient room, etc.) than the market average, and over 50% higher than the project represented by the green diamond near the bottom of the chart. Even though all three projects were produced within 2% of the market average unit cost ($/GSF), there is at least a 50% difference in total first cost per function served. But the implication extends far beyond first costs. The orange and gold projects tax the owner and occupants with heating, cooling, operating and maintaining significantly more building throughout its life.
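A minimal sketch, in Python with assumed numbers, of the arithmetic at work here: two projects can sit within a couple percent of the market-average unit cost while differing by 50% or more in cost per function.

```python
# Two hypothetical projects with nearly the same unit cost ($/GSF) but very
# different building area per exam room. All numbers are assumptions for
# illustration, not data from Figures 2 and 3.

market_unit_cost = 450.0          # $/GSF, market average (assumed)

projects = {
    "orange": {"gsf_per_exam": 1_550, "unit_cost": 455.0},  # inefficient program
    "green":  {"gsf_per_exam": 1_000, "unit_cost": 447.0},  # efficient program
}

for name, p in projects.items():
    cost_per_exam = p["gsf_per_exam"] * p["unit_cost"]
    unit_delta = p["unit_cost"] / market_unit_cost - 1
    print(f"{name}: {unit_delta:+.1%} vs market $/GSF, "
          f"${cost_per_exam:,.0f} per exam room")
```

With these assumed inputs, both projects land within 2% of the market unit cost, yet the orange project costs roughly 58% more per exam room.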

Figure 3 displays the data differently. It shows the gross building area per function (exam room in this case) on the right bar chart, and the construction cost per function on the left bar chart. The top white bar represents the “market average” prediction based on the various attributes (constraints, standards, and program). The bottom colored bar represents the actual results. This example of real-world projects normalizes the results to the same construction period and location.


Note that cost per standard exam (left chart) follows the program and plan efficiency (right chart) very closely. Again, excessive building area translates into higher total first costs – and it’s likely to result in higher life-cycle cost when measured against the facility’s function.

Imagine having this kind of information in the planning stages, rather than at the end of the project when it’s too late.

A comprehensive production system results from a systems approach to big data–creating a powerful knowledge resource. This knowledge enriches our other innovations–BIM, VDC, IPD and other Lean practices. This knowledge helps reverse decades of declining productivity, excess waste and time, and cost overruns. It then ushers in a high-performance, high-value building age.

_______________________________

End Notes:

(1) From McKinsey’s Reinventing Construction paper: “The biggest impact on productivity would come from moving toward thinking about construction as a production system, where possible encouraging off-site manufacture, minimizing on-site construction through the extensive use of pre-cast technology, assembling panels in factories and then finishing units on-site.” The seven solutions cited in the report include:

  1. Reshape regulation and raise transparency.
  2. Rewire the contractual framework.
  3. Rethink design and engineering processes.
  4. Improve procurement and supply-chain management.
  5. Improve on-site execution.
  6. Infuse digital technology, new materials, and advanced automation.
  7. Reskill the workforce.

(2) As reported in studies by the Construction Industry Institute, American Planning Association, and Department of Commerce.

(3) Building Catalyst is a new technology that lets us experience the power of big data. Through a systems approach to big data, Catalyst performs predictive modeling, set-based prototyping, target value setting/tracking, comparative analysis and much more.

(4) The Program and Plan Efficiency compares the total building scope (gross building area plus building envelope) to the net departmental/functional area. The Specification and Production Efficiency compares the direct building cost to the total scope.

Big Data: It’s About the Power of Knowledge

Big data analytics breaks through our complex and fragmented construction state. Big data replaces our subjective, opinion-laden planning and decision-making approaches with powerful knowledge-based resources and processes. Big data enriches every innovation intended to improve the construction process: BIM, Lean, IPD, Green Building, and so on.

Who wouldn’t want to know the answers to questions like these?

  • Can we know where the cost overrun comes from? budget? program? design? due diligence issues? construction? Was it the sitework? foundations? shell? interiors? MEP? general conditions? fees? schedule? How can we avoid or catch these problems earlier?
  • Can we check a contractor’s estimate–or our own estimate–against reliable, objective standards? Can we compare them, for example, to the “market average” or our own sampling of project experiences–by line item?
  • Can we know the likely project outcome with greater certainty? Is there some way to model and compare multiple building concepts well before investing in design?
  • Can we know the impact of the biggest decisions before we implement them?
  • Can we establish target values impartially, and keep them free from tampering?
  • Can we track and analyze the project details as we go? How can we know if we’re steering the design and delivery toward desired outcomes (e.g. target values)?
  • Can we compare the effect of BIM, Lean, IPD, Green Building, Prefabrication etc. against traditional methods?
  • Can we make the project known and measurable based on the occupants’ and/or owner’s business case?

Impartial, big-data analysis provides the “knowledge” that answers questions like these. That’s not all. It paves the way for new automated, and easy-to-learn, modeling and analysis tools that take a fraction of the time, effort and cost required by conventional methods and tools.

Moneyball inspired us because it wrapped scientific discovery into an engaging story involving our love of baseball. Some might say construction’s story of scientific discovery isn’t as engaging, but the stakes are so much higher. After all, baseball’s just a game. Still, Moneyball provides a great illustration of real science overcoming pseudo-science. Good data trumping bad. Construction professionals who “get” Moneyball can now get objective knowledge by the same principles, starting with planning and preconstruction. All it takes is a willingness to transition to Deming-like systems-thinking. (1) That’s the goal of our earlier articles:

  • Freeing Construction from Industrial-era Thinking applies systems theory and data science to acquire knowledge. Deming’s PDCA (Plan → Do → Check → Act) cycle transformed manufacturing, and can do it for construction too.
  • The Problem with Dismissing Big Data explains big data and its power when applied to construction. It makes the case for big data-related technology – including integrated BIM – becoming a transformational force in construction.
  • A Big Data Breakthrough in Construction tells how a “nucleus of critical data” brings structure to the hundreds of thousands of otherwise unmanageable data points spread across a myriad of fragmented applications, processes and files.

Together, the articles explain that objective knowledge requires scientific discovery. Scientific discovery of complex buildings requires big data; big data requires a system of interdependent causes that determine outcomes; a complex system requires a system aim. The system aim–or facility’s purpose–is defined by its departments (e.g. ASC, radiology, clinic) and, more precisely, by the functions within each department (e.g. CT scanning, ultrasound, exam). The facility’s functions provide the fundamental and vital building block for scientific discovery through big data analytics. Figure 1 graphically depicts this system with its nucleus of critical data.

Figure 1 – Nucleus of Critical Data Central to a Data Governance Model

The knowledge transformation going on in construction is all about systems theory and data science. Through a tight system of critical data, we can readily predict and compare projects to yield remarkable knowledge and insight.

Relationship between Knowledge and Impartial Analysis

Big data raises the knowledge bar. The highest knowledge, though, comes via impartial analysis from outside of ourselves. In other words, we can finally stop breathing our own exhaust. Transformed, optimized building processes require impartial analysis. Per Deming:

“…knowledge generally comes from outside of the system… A system cannot understand itself without help from outside the system, because prior experiences will bias objectivity, preventing critical analysis of the organization. Critical self-examination is difficult without impartial analysis from outside the organization.

“Optimization of a system can occur when all interconnecting components are orchestrated to achieve the organization’s goal (e.g. system aim or purpose). The people, free of fear and competition within the system can band together for optimization of the system.

“No one component may seek its own reward without destroying the balance of the system. Each component is obligated to contribute its best to the system as a whole. In all negotiations the results must be win/win.” (2)

Integrated Project Delivery (IPD), Virtual Design and Construction (VDC), and other strategies seek to “band together for the optimization of the system”. But without “impartial analysis from outside the organization”, objective standards (e.g. target values) can’t be established. Today’s IPD may guard against cost overruns, but it can’t prove added value against an objective standard. True value improvement will take impartial analysis from within and from outside an organization.

So, where does this impartial analysis come from? It comes from our own past projects, and from the marketplace. Our past project data is vital because it’s more knowable to us. But it’s the data sampling from the broader marketplace that’s most intriguing and beneficial. This data comes from our peers (perhaps including our competitors) from around the country. This poses another issue. How do we perform impartial analysis from information produced by our peers and competitors, and still maintain the privacy of our own data?

Muscular, cloud-based big data analytics is how it’s done. We combine the knowledge from in-house and marketplace data that’s been processed through the scientific method. Again, that comes from the PDCA cycle of predictions that are validated by real-world experience. (3) Users submit their data to the pool of other market data. No one user can access that of another, unless it’s shared with them. All users benefit from the knowledge of the “market average and range of variation”. Data analysts and cost specialists routinely compile and apply the PDCA cycle until the predicted-to-actual results are aligned. These predictions aren’t just for cost, but for program, scope (design parametrics), schedule and so on.
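Here is a minimal sketch of the market comparison this describes, assuming a pooled sample of cost-per-exam-room values. A production system would add anonymization and access control; the numbers below are made up.

```python
# A sketch of the "market average and range of variation" computed from a
# pooled sample of completed projects. Sample values are assumptions.
import statistics

# Cost per exam room from pooled, completed projects (assumed):
market_samples = [138_000, 152_000, 145_500, 171_000, 149_000, 160_500]

market_avg = statistics.mean(market_samples)
market_sd = statistics.stdev(market_samples)

our_project = 176_000   # our predicted cost per exam room (assumed)
z = (our_project - market_avg) / market_sd

print(f"Market average: {market_avg:,.0f} (±{market_sd:,.0f})")
print(f"Our project sits {z:+.1f} standard deviations from the market average")
```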

It all boils down to this: knowledge acquired from recorded, completed project data (via impartial analysis) leads to powerful and verifiable project success and process improvement. Organizations that properly compile and apply the most past project data win!

Next up? Disciplined Data Governance–becoming a well-oiled, fine-tuned, high-performance building team.

In the meantime, feel free to contact the author to learn how to transition your organization through a systems approach with big data analysis.

_______________________________

End Notes:

  1. W. Edwards Deming is cited throughout these articles as a pioneer in quality and process improvement. Deming, a physicist and statistician, was a key figure in transforming manufacturing, starting in post World War II Japan. His principles will also transform construction.
  2. Taken from Dr. Barbara Berry’s paper, “There is a Relationship Between Systems Thinking and W. Edwards Deming’s Theory of Profound Knowledge”.
  3. The scientific method applied to construction is explained in an earlier article, “Freeing Construction from Industrial-era Thinking”. See below for the link.

A Big Data Breakthrough in Construction

Construction’s innovators have long sought solutions to reverse decades of poor productivity, excessive waste and energy consumption, and cost overruns. CM, Design/Build, Partnering, Value Engineering, and TQM are a few from the last generation. Our generation’s solutions include BIM, VDC, Lean (IPD, TVD, CBA, LPS), and Green Building. (1)

So why don’t we experience measurable improvement? It’s simple. We’ve not found a way to measure these solutions’ impact. Without objective standards to measure against, “on time and under budget” claims aren’t sufficient. They’re based on subjective measures. No objective measurement, no measurable improvement.

The movie Moneyball tells the story of data science’s before-and-after effects. The Oakland A’s replaced subjective analysis (using bad data) with objective analysis (using good data) to achieve the success of the New York Yankees at a fraction of the cost.

Data science puts us on a similar path of success–producing higher performance, more cost-effective facilities that are driven by owner and occupant measures of success.

How can this be achieved in a field as complex as construction? Our earlier Moneyball articles pose an answer: combine a systems approach with big data analysis. This article gets into the nitty-gritty of how it actually works.

Today’s buildings are measured by hundreds of thousands of data points, scattered throughout a myriad of fragmented applications and processes. Integrated big data analytics solves this through a breakthrough that enables a facility to be measured and analyzed by a nucleus of integrated critical data. This critical data comprises fewer than 200 data points on most projects. Through a systems approach, a facility becomes known as an integrated, complete and virtual system of critical data. Through big data analysis of that critical data, a facility is modeled, measured and analyzed objectively. The analysis draws comparisons to other projects within the organization and, more importantly, to the real-world market beyond the organization.

The systems approach coupled with big data analytics provides a powerful knowledge resource. It forms a reliable basis for decision making and process improvement–first to attain more certain outcomes, and then to optimize facility performance and value.

A big data breakthrough enables a facility to be measured and analyzed by a nucleus of integrated critical data. 

New forms of practical data-driven modeling and management combine a systems approach with big data analysis. Statistical analysis empowers data contributors with remarkable knowledge and depth of insight, while maintaining organization and project privacy. This is possible when tools are: (1) knowledge-based with regard to the facility’s specific purpose, functions and other key attributes, and (2) validated by real-world data from completed projects. Already, data-driven planning and analysis tools are used to compile this data in healthcare, education, commercial, parking, residential, hospitality, industrial, recreation, government, and so on. The 2017 Healthcare Facilities Report gives an example of the insight big data analytics provides. (2) This report is available to Construction Science Forum members.

Sufficient samplings of past projects can yield remarkable predictive modeling and analysis on future projects. More specifically, the benefits include:

  • Objectively predicting a range of likely outcomes starting at project inception, when the highest impact decisions are made.
  • Optimizing those outcomes through rapid, set-based, planning and prototyping.
  • Establishing impartial target values, and methodically steering the results to more certain outcomes.
  • Advancing innovations by measuring and comparing (and thereby increasing) the impact of BIM, VDC, Lean, Green Building, Prefabrication, and so on.
  • Solving age-old problems of declining productivity and excessive waste by providing means of measurement (we can’t improve what we can’t measure). (3)

From early planning and programming to conceptual design stages, data-driven tools provide stand-alone predictive (statistical) modeling, prototyping and analysis of the facility’s program, scope, schedule and costs. Starting in the conceptual or schematic design stages, information interoperability/exchange can take place between data-driven modeling and other technology applications. These include: design-driven BIM, estimating, scheduling, energy modeling, accounting and management applications.

Figure 1 illustrates how data-driven modeling and analysis helps steer the project outcomes from a wide range of variation–toward the approved target value with greater certainty.

Figure 1 – Steering a Project to More Certain Outcomes (from Building Catalyst)

Let’s take a closer look at how the systems approach and big data are possible. We do this by understanding the building as a system with an aim. The system aim, or building purpose, is at the center of the nucleus of critical data.

Understanding the System and its Aim

World-renowned process-improvement pioneer W. Edwards Deming defines a system as “a network of interdependent components working together…to accomplish the aim of the system.”

The “aim of the system”, as far as construction’s concerned, is a facility’s purpose (e.g. a medical center). Facility purpose is first and foremost among all structured data, informing downstream predictions.

Figure 2 – Facility Structured Data as a System organized around a Purpose

The facility’s purpose – or the aim of the system – is defined by departments (e.g. ASC, radiology, clinic) and, more precisely, by the functions within each department (e.g. CT scanning, ultrasound, exam).

It’s the functions that must drive the building process–and now can. The composition of functions fully defines the purpose, and establishes a direct connection to the owner’s “business case”. The facility’s functions provide the only way to establish objective standards to measure against. The facility’s functions, therefore, provide the only way to objectively predict outcomes, and achieve measurable improvement.

Among causes (or inputs), the other structured data is grouped by constraints and standards. Constraints include location, number of floors, building shell type, and so on. Standards include quality class, durability, energy/environment, and so on.

Among effects (or outcomes), the structured data is grouped by program, parameters, schedule and costs. Data-driven modeling/analysis tools predict outcomes, based on the structured data causes/inputs. A range of variation also exists, due largely to the unstructured data (e.g. decisions made). Now, through big data, the impact of those decisions can be measured impartially (the subject of the next article).
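As a rough sketch of predicting effects from structured causes, consider the toy model below. The unit rates and adjustment factors are illustrative assumptions standing in for statistically validated market data.

```python
# A toy prediction of one outcome (direct cost) from the structured causes
# described above: functions, constraints, and standards. All rates and
# multipliers are assumptions for illustration.

# Market-derived cost per function (assumed):
rate_per_function = {"exam": 95_000, "CT scanning": 1_400_000, "ultrasound": 310_000}

# Multipliers derived from constraints and standards (assumed):
constraint_factor = {"high_seismic": 1.08, "urban_site": 1.05}
standard_factor = {"institutional_quality": 1.10}

def predict_cost(functions, constraints, standards):
    """Predict direct cost from functions, adjusted by constraints/standards."""
    base = sum(rate_per_function[f] * qty for f, qty in functions.items())
    for c in constraints:
        base *= constraint_factor.get(c, 1.0)
    for s in standards:
        base *= standard_factor.get(s, 1.0)
    return base

cost = predict_cost({"exam": 14, "CT scanning": 1},
                    ["urban_site"], ["institutional_quality"])
print(f"Predicted direct cost: ${cost:,.0f}")
```

The range of variation around a prediction like this reflects the unstructured causes (decisions made), which is exactly what the impartial analysis measures.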

Understanding the Nucleus of Critical Data

As noted above, there exists a nucleus of critical data that dramatically reduces the facility to fewer than 200 data points of information assets. It provides structure to the otherwise unmanageable hundreds of thousands of detailed data points scattered throughout a variety of applications, processes and files.

Each of the myriad detailed data points rolls up into one or more of the critical data points within a data management system, as illustrated in Figure 3.

Figure 3 – Data Management Overview

The first tier of detail provides information shared between the owner and their design and building team. The outer tiers of detail involve information exchanged with the trade contractors, suppliers, manufacturers, and so on.

Within the nucleus, the fewest possible data points are compiled and processed to give the greatest knowledge about the facility. The nucleus is the command center of information for the project.

Figure 4 shows the nucleus of critical data as an integrated system of all eight information groups working together to accomplish the aim of the system–the facility purpose.

Figure 4 – Data Governance overview and the Nucleus of Critical Data

This nucleus of critical data makes data-driven modeling/analysis possible, and enables us to:

  • Tie the facility to the owner’s business case, and make decisions accordingly
  • Consistently predict, analyze and manage a project from inception to completion
  • Establish an integrated BIM system from inception to completion
  • Provide objective standards to measure detailed data against
  • Provide objective standards to measure process improvements against
  • Increase transparency and certainty of outcomes, especially for IPD projects

Upcoming articles address the importance of impartial analysis and disciplined data management across projects and organizations.

In the meantime, if you haven’t already, you can learn more by reading our most recent article, The Problem with Dismissing Big Data.

Revised January 30, 2017

_______________________________

End Notes:

(1) Abbreviations applied: CM (construction management), TQM (total quality management), BIM (building information modeling), VDC (virtual design and construction), IPD (integrated project delivery), TVD (target value development), CBA (choosing by advantages), LPS (last planner system)

(2) Building Catalyst www.buildingcatalyst.com, provides this knowledge-based, data-driven modeling and analysis capability.

(3) 2003 and 2004 Studies by the Construction Industry Institute (CII), Department of Commerce and American Planning Association find the construction industry plagued with excessive waste, declining productivity and cost overruns on most projects. These studies remain unrefuted today.

The Problem With Dismissing Big Data

Last May the World Economic Forum released their survey on the Future of Construction. (1) It found Integrated BIM to be the most-likely, highest-impact technology in construction’s future. Big Data Analytics, however, was at the bottom of the heap. It’s considered one of the least-likely, lowest-impact technologies. If this survey accurately represents industry thinking, we’re in big trouble. We’re dismissing the very technological innovation most critical to construction’s future success. Why are we dismissing big data? First, we really don’t understand what big data is. Second, we haven’t experienced its power.

Big data, however, is critical to construction–providing the only objective knowledge basis from which to make decisions about a facility. It’ll do this in three ways:

  • By providing the means to measure (we can’t improve what we can’t measure) our age-old problems of excessive waste and poor productivity. (2)
  • By enabling reliable prediction, tracking, control, and therefore stabilization of ongoing projects.
  • By measuring and comparing the impact of innovations like BIM, Lean, IPD, VDC, Green Building, prefabrication, etc. to traditional methods.

So, what is big data anyway? Per Gartner, big data is “high-volume, high-velocity and/or high-variety information assets that demand cost-effective, innovative forms of information processing that enable enhanced insight, decision making, and process automation.”

Construction’s “high-variety information assets” demand new forms of data science and processing. Why? In order to objectively predict, measure and evaluate any facility concept/solution, process improvement or technological innovation. Data science forms the knowledge basis required for “enhanced insight, decision making, and process automation.”

Moneyball tells this story so well. The New York Yankees relied on subjective analysis (using unproven metrics), and a lot of money, to win baseball games. The Oakland A’s, for lack of money, relied on data science and big data analytics. These teams won the same number of games, but it cost the Yankees $1.4 million per win, compared to the A’s $260 thousand. (3)

So too, in our case. Big data-enabled technologies can now move facility owners and/or professional building teams (large or small) out of the fog of subjective analysis (using unproven metrics) into clear, data-driven objective analysis (using validated metrics). Facility planning, estimating and analysis is possible at a fraction of the cost and time that traditional methods require.

Out of the fog of subjective analysis (using unproven metrics) into clear, data-driven objective analysis (using validated metrics). 

A process improvement culture can thrive inside complex systems (like construction) if properly motivated and equipped. Motivation comes from within (management can inspire or frustrate it, however). Equipping comes from the outside, starting with new knowledge formation. Big data modeling and analysis from scientifically validated data is critical to knowledge acquisition. Read our recent post, Freeing Construction from Industrial-era Thinking, for a simple explanation of how the scientific method enables big data analytics in construction.

Big data requires two conditions. The first is a consistently applied, well-structured data schema across all facility development disciplines (planning, programming, design, estimating, scheduling, procurement and management). Structured data must be consistent across facilities (projects) and organizations. The second condition is a systems approach to information and acquisition of knowledge. The systems approach establishes a network of interdependent information assets or attributes, measurable per the structured data schema. In other words, a facility must be “known” as a complete, virtual, and integrated system of information assets (attributes).

Taking a closer look at a systems approach to facility information helps explain how it leads to and enables a wide range of process improvements–starting in the early planning stages and moving toward an Integrated BIM System.

Systems Approach

W. Edwards Deming, a world-renowned pioneer in quality and process improvement, brought systems theory and the scientific method to manufacturing. The result was revolutionary. Our industry’s waited long enough. It starts with Deming’s understanding of a system: “a network of interdependent attributes…working together to accomplish the aim of the system.”

Figure 1 is a graphical representation of Deming’s System, as it applies to construction. The first of the eight information categories, the facility’s purpose, represents Deming’s aim of the system. (4) The facility purpose and building type are often synonymous (e.g. hospital, medical center, office, or school).

Figure 1 – The Deming System Applied to Construction

The facility purpose is more fully known by the composition of its functions, the second category. The functions are typically organized by department, or business unit. A cancer treatment department, for example, will include some combination of key functions: infusion units, linear accelerator, exams, operating and/or special procedure units.

Besides the facility’s purpose and functions, two other vital attribute categories determine a facility’s outcomes: constraints and standards. Constraints include the location, number of floors, project type, shell type, and so on. Standards include quality class, durability (service life), energy, LEED, operability, security and key demands. Key demands include climate and soil capacity. Seismic and hurricane demands must be considered where applicable.

The systems approach also includes four outcome categories: program, parameters, production (schedule) and costs. The program (department, space/room hierarchy) outcomes generally align with (or map to) the OmniClass Schedules. The parameters are a digital representation of the facility’s characteristics–like site work and building quantities. The parameter and cost outcomes follow UniFormat II standards.
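A minimal sketch of outcome data keyed to UniFormat II elements. The element codes follow the published UniFormat II pattern; the quantities and costs are assumed for illustration.

```python
# Parameter and cost outcomes organized by UniFormat II building elements,
# as the paragraph above describes. Values are assumptions.
outcomes = {
    "A10 Foundations":        {"quantity_sf": 24_000, "cost": 510_000},
    "B20 Exterior Enclosure": {"quantity_sf": 31_500, "cost": 1_980_000},
    "B30 Roofing":            {"quantity_sf": 24_000, "cost": 420_000},
    "D30 HVAC":               {"quantity_sf": 48_000, "cost": 2_350_000},
}

total = sum(element["cost"] for element in outcomes.values())
print(f"Direct building cost: ${total:,.0f}")
```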

This systems approach provides the objective knowledge platform needed to predict, measure, compare, track and manage facilities to successful conclusions. For collaborative teams, it also aims to create an Integrated BIM System.

Integrated BIM System

What is BIM? Is it what it was intended to be? The National BIM Standard (and buildingSMART alliance) definition starts this way: “A BIM is a digital representation of a facility’s physical characteristics”. BIM, as it’s practiced today, fits this simple definition to a tee. We call it design-driven BIM. The original vision for BIM, however, implies far more. The full National BIM Standard definition confirms that: “BIM is a digital representation of a facility’s physical and functional characteristics. A BIM is a shared knowledge resource of information forming a reliable basis for decisions about a facility during its life-cycle; from inception, onward.”

Design-driven BIM comes into play only after the highest impact (facility) decisions have already been made. In other words, today’s design-driven BIM cannot inform the most important decisions. In contrast, the systems approach enables a function-based and data-driven BIM. Here, BIM meets the full National BIM Standard definition. Profound decision-making implications result, as shown with the adaptation of the MacLeamy Curve in Figure 2.

Figure 2 – MacLeamy Curve adapted to design-driven vs. data-driven BIM

Data-driven BIM supports high-impact, low-cost decision making and process improvements. Data-driven BIM’s benefits include (1) predictive modeling, (2) set-based prototyping, (3) high-definition benchmarking, (4) impartial target setting and tracking, and (5) purpose-driven planning.

Integrated BIM results from consistently structured data exchange between data-driven and design-driven BIM solutions.

Fortunately, the World Economic Forum’s survey is wrong with regard to big data analytics. Construction’s big data future is now, and ready to impact many projects breaking ground in 2017.

If interested in big data ideas and information exchange, consider joining the Construction Science Forum.

End Notes:

1. World Economic Forum publication Shaping the Future of Construction: A Breakthrough in Mindset and Technology included the Survey for the Future of Construction.

2. 2003 and 2004 studies by the Construction Industry Institute (CII), Department of Commerce and American Planning Association find the construction industry plagued with excessive waste, declining productivity and cost overruns on most projects. These studies remain unrefuted today.

3. YouTube clip of Red Sox owner John Henry offering Billy Beane a job as general manager: The Job Offer

4. This graphic is limited to the facilities development. The complete facility system of ten categories includes lifecycle information needed to predict, measure and analyze the total cost of ownership.

Freeing Construction from Industrial-era Thinking

“There’s got to be a better way @#%!!!?” Who hasn’t been frustrated with the lack of planning and preconstruction tools and information? We need knowledge-based technologies and processes to more effectively predict and guide facility outcomes. The Construction Science Forum focuses on systems theory and data science to do all this – and free us from the industrial-era thinking that stymies us.

Process-improvement pioneer W. Edwards Deming started it all when he brought systems theory and data science to manufacturing and service industries. (1) Major League Baseball overcame its own version of medieval (industrial-era) thinking, to the delight of Oakland A’s fans. Per “Moneyball”, A’s general manager Billy Beane led a movement of data science in baseball. This past fall, Cubs president of baseball operations Theo Epstein brought the world championship to Chicago after a 108-year dry spell–another data science success. (2)

Data science drove Deming’s work in manufacturing. Data science also drove baseball’s transformation. Data science will now drive construction’s transformation.

A similar movement in construction has been tough to pull off, partly because of the complex, variable nature of our industry. Despite great innovations like BIM, Lean and IPD, we’re still stuck in industrial-era thinking, decision making and fragmented processes. This has resulted in continued excessive waste, declining productivity and cost overruns on too many projects. (3)

Systems-thinking is now replacing industrial-era thinking in construction. Muscular and agile, big data enables construction to adopt Deming-like systems theory and scientific knowledge formation. (4) Manufacturing and baseball have replaced subjective analysis (using bad data) with objective analysis (using good data). The same is now possible for construction, starting in the early planning stages. Building producers are now able to acquire the knowledge needed to methodically predict, steer and, finally, optimize project outcomes.

All we need is our own generation of Beanes and Epsteins, i.e., practitioners ready to apply data science to facility planning, design, delivery and process improvement. For this reason, we’ve formed the Construction Science Forum. Members exchange information, experience, and ideas through a pilot initiative. We (at Building Catalyst) provide the data analysis. Forum members just provide the key project data that’s easily extracted from selected drawings and actual cost data. No biggie.

Transitioning the construction process toward systems theory and scientific knowledge formation takes just a few simple steps:

  • Gaining a basic understanding of systems theory and scientific knowledge formation as it relates to construction.
  • Applying a tight system of the most critical data that gives meaning to the hundreds of thousands of otherwise unmanageable data points across a myriad of fragmented tools and processes.
  • Rapidly compiling and analyzing historical facility data to gain knowledge for use in predicting and improving future project results.
  • Applying this knowledge to current and proposed projects to methodically predict and steer projects to more certain outcomes.
  • Applying the knowledge toward process (and technology) integration and improvement, and breakthrough innovation.

Step 1 – Gaining a Basic Understanding of Systems Theory and Scientific Knowledge Formation

Looking “behind the curtain” helps. Getting complex buildings to be predictable and methodically guided toward desired results is possible. You don’t have to understand the complexity of the databases or algorithms. It does help, though, to understand how research analysts approach data-model creation, testing, validation, and so on.

We use Deming’s scientific method. It has four iterative steps: Plan → Do → Check → Act. Figure 1 shows this process adapted to predicting and steering project outcomes. It’s the secret to abandoning subjective analysis (using bad data) and replacing it with objective analysis (using good data).

Figure 1 – The Deming Circle for Learning and Continuous Process Improvement

For example, during the planning and conceptual estimating stages, the HVAC system outcomes are more reliably predicted and analyzed. We do this by modeling the facility based on the attributes that impact outcomes. For some projects, the building envelope loads drive most of the HVAC scope and costs. Loads for the exterior components (roof vs. insulated opaque walls vs. glazed walls and so on) vary widely based on interacting factors: exterior surfaces, climate, energy standards, etc. For other projects, the internal functions are the major drivers of the HVAC scope and cost. Surgical areas, for example, obviously pose entirely different loads than do primary care areas. Likewise between science labs and classrooms. The interactions between facility attributes (including the varying exterior and interior loads) must be accounted for. Without the scientific approach and cycle, there’s no way to predict and budget the HVAC, especially in the planning stages when the highest impact decisions take place. This example applies for all site and building elements that make up the whole facility.
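A toy version of this HVAC example, with assumed load factors (not engineering values), shows how envelope and functional drivers combine in a prediction:

```python
# A sketch of HVAC scope driven by both envelope loads and internal
# functional loads, as described above. The load factors per square foot
# are illustrative assumptions only.

envelope_load = {"roof": 18, "opaque_wall": 6, "glazed_wall": 28}   # Btu/hr per sf (assumed)
function_load = {"surgery": 120, "exam": 45, "classroom": 35}       # Btu/hr per sf (assumed)

def hvac_load(envelope_areas, function_areas):
    """Combine exterior (envelope) and interior (functional) load drivers."""
    env = sum(envelope_load[k] * sf for k, sf in envelope_areas.items())
    fun = sum(function_load[k] * sf for k, sf in function_areas.items())
    return env + fun

load = hvac_load(
    {"roof": 24_000, "opaque_wall": 18_000, "glazed_wall": 9_000},
    {"surgery": 6_000, "exam": 12_000},
)
print(f"Estimated peak load: {load / 12_000:,.0f} tons")  # 12,000 Btu/hr per ton
```

A real model would also account for the interactions between these attributes (climate acting on glazing, energy standards acting on everything); this sketch only adds the independent drivers.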

Stuck in our industrial-era thinking, we can really only guess at costs based on bad data (i.e. outdated, whole-building comparables).

Adopting the scientific method allows us to continuously study facilities using the above PDCA analysis and learning cycle, as the sketch following this list illustrates. That is:

  • Forming predictions and algorithms based on real causes (inputs) that determine outcomes;
  • Compiling actual detailed real-world results;
  • Checking, comparing the real-world results against the predictions and studying the patterns;
  • Revising assumptions, related algorithms and (especially) interactions.
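Here is a minimal sketch of that cycle applied to a single unit rate. The revision rule (nudging the rate toward observed results) is an illustrative assumption, standing in for real statistical model fitting.

```python
# A toy PDCA loop for one predictive assumption: cost per exam room.
# All numbers are assumed for illustration.

rate_per_exam = 95_000   # Plan: current predictive assumption

completed_projects = [   # Do: real-world results as they are recorded
    {"exams": 10, "actual_cost": 1_030_000},
    {"exams": 16, "actual_cost": 1_590_000},
]

for project in completed_projects:
    predicted = rate_per_exam * project["exams"]    # Plan: form the prediction
    actual = project["actual_cost"]                 # Do: observe the result
    error = (actual - predicted) / predicted        # Check: compare and study
    rate_per_exam *= 1 + 0.5 * error                # Act: revise the assumption
    print(f"error {error:+.1%} -> revised rate ${rate_per_exam:,.0f}")
```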

Is it possible for any particular construction company or integrated building team to use this approach? Possibly. But, is it advisable? No. A major Deming tenet is that a system (e.g. facility) or process cannot be understood by itself, without help from outside the system. Why? Because prior experiences will bias objectivity, preventing critical analysis and self-examination. Impartial analysis from outside the system or process is necessary. (5) Therefore, even the largest construction companies should compile and analyze their project data objectively. Their data analysis must be measured against an objective standard (e.g. sampling of projects throughout the marketplace) while maintaining confidentiality.

Now we’ve come full circle. Our industry has a great deal of complexity and variability. Because of that, new forms of big data modeling and analysis are necessary to apply systems theory and scientific knowledge formation. That is why we (Building Catalyst) have created a new BIM technology and process. Construction Science Forum members interested in gaining knowledge through the recording and analysis of completed projects can have a free subscription to Building Catalyst. Forum Members can also use Catalyst to perform predictive modeling and analysis on a trial/pilot basis for no charge.

Next Steps

In the next post we will explore how selected facility data and other information is structured and networked into a system of interdependent attributes. For the first time in history, a facility’s program, design parameters, schedule and cost can be predicted and analyzed based on the actual attributes that determine outcomes (how novel is that?).

In the meantime, to learn more, visit www.buildingcatalyst.com, or consider joining the Construction Science Forum, if you haven’t already.

End Notes:

  1. Worldwide, Deming is considered the father and standard-bearer for quality and process improvement in manufacturing and service industries.
  2. Baseball’s use of scientific methods is captured by the Moneyball revolution. See this post: Moneyball For Construction: Can science deliver more certain outcomes while driving higher value and profitability?
  3. Studies produced by the American Planning Association (APA), the Construction Industry Institute (CII) and Department of Commerce reveal cost overruns and excessive waste on most construction projects, along with decades of declining productivity.
  4. Building Catalyst (www.buildingcatalyst.com) research analysts and software developers provide this big data modeling and analysis capability.
  5. Relevant work by Deming is summarized in this short paper by Dr. Barbara Berry: There is a Relationship Between Systems Thinking and W. Edwards Deming’s Theory of Profound Knowledge.