By Sebastien Frenette, BIM Director, Engineer, M.Sc.A., Provencher Roy

Some of the most frequently asked questions when we talk about BIM are: "How can it support multidisciplinary work?", "How can data and models work together to support multidisciplinary design teams?", and "What is data centralization, and how can it support the decision-making of architectural and engineering teams?" As Advocate Architects designing a major new hospital, it was critical for us to implement a data-centric approach to drive the design teams, architects, and engineers. We utilized a tool that allowed us to consolidate user needs in a structured and organized way, bringing added-value information into our 3D models, including our client's list of more than 70,000 units of medical equipment across 6,750 rooms. We managed to import data into our 3D models and drive them with accurate, real-time information. This data was made available seamlessly to designers, programming teams, modellers, and stakeholders, ensuring the client's design requirements were met.

INTRODUCTION

The hospital, located in Vaudreuil-Soulanges, Quebec, will have 404 beds, 11 operating rooms, and an emergency room that can accommodate 41 stretchers. Work will begin in 2022 and be completed in 2026. The project represents an overall investment of CAD $1.7 billion and is being carried out in accordance with the governance framework for major public infrastructure projects. For years we had been looking for an opportunity to experiment with a process that would reduce the gap between the modelling teams and the programming teams by reconciling program data through a centralized, data-driven approach. This project was in line with our aspirations and allowed us to achieve that goal, resulting in a high-value initiative for all stakeholders involved.

PROBLEM DEFINITION

Two roles, two solitudes. The first role belongs to the programming teams, who work closely with the project's development teams (owner, users, etc.) to capture needs and requirements and translate them into cohesive programming data. The other belongs to the modelling teams, who must model the project according to the latest program data in order to provide the visual and technical support that facilitates decision-making and buy-in. The two roles are intimately linked, since each feeds the other: one with graphical data (the 3D models), the other with non-graphical data (the programming data). This need for collaboration was exacerbated by the scope of this massive project, which required managing spaces, medical equipment, finishes, doors, hardware, lighting equipment, electrical outlets, and more. While 3D has been used for several years, access to defined, structured, exploitable program data remained deficient in its organization and its means of dissemination, leaving modelling teams without high-quality information delivered in a timely and consistent manner.

HIGH-LEVEL SOLUTION 

The better the information, the better the project. Our goal was for the program data to drive the design teams: architects and engineers could rely on a tool that consolidated user needs in a structured and organized way, bringing this added-value information into our 3D models. We had to manage not only the programming data but also all the medical equipment. To do so, we implemented dRofus, a powerful and world-renowned software platform that allowed us to structure all the data in an exploitable way in order to manage the functional program and the rooms, and to manage the distribution of the various equipment and its technical requirements. Since the owner already had their own data management platform for thousands of pieces of medical equipment, we needed to find a way to bring their data into an exploitable state through our tools. We aimed to connect our two platforms rather than impose one, in order to respect the technical expertise and software needs of each stakeholder. We started by creating a data transfer protocol between our platforms so that we could transfer data on a weekly basis. This protocol consisted of:

  1. Defining the technical data transferred from database A to B, or from B to A;
  2. Establishing a schedule for synchronizing the information; and
  3. Tracking the progress of the information under our respective responsibilities.
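As an illustration, the field mapping and direction of such a protocol can be sketched in a few lines. The field names, record structures, and sync function below are hypothetical, invented for this sketch; they are not the actual schemas or tooling used on the project.

```python
# Hypothetical sketch of a weekly data-transfer protocol between two
# equipment databases ("A" = owner platform, "B" = design database).
# Field names and record structures are illustrative only.

# 1. Define which technical fields flow in which direction.
FIELD_DIRECTIONS = {
    "equipment_code": "A->B",  # owner assigns codes; the design side consumes them
    "width_mm":       "A->B",
    "height_mm":      "A->B",
    "length_mm":      "A->B",
    "weight_kg":      "A->B",
    "room_number":    "B->A",  # placement is decided in the design model
    "priority_class": "B->A",  # e.g. criticality P0..P4
}

def sync(record_a: dict, record_b: dict) -> tuple[dict, dict]:
    """Apply one synchronization pass: copy each field in its agreed
    direction, leaving every other field under its owner's control."""
    a, b = dict(record_a), dict(record_b)
    for field, direction in FIELD_DIRECTIONS.items():
        if direction == "A->B" and field in a:
            b[field] = a[field]
        elif direction == "B->A" and field in b:
            a[field] = b[field]
    return a, b

# 2. A weekly schedule would call sync() for every shared record;
# 3. each party remains responsible for the fields it originates.
```

The point of the explicit direction table is that neither platform overwrites data the other party is responsible for, which is what makes the shared-responsibility arrangement in the protocol enforceable.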

The transfer protocol allowed us to solidify this unprecedented level of collaboration and build trust within a high-value information-sharing environment. We were then able to work on our primary objective: the data driving the design.

SOLUTION DETAILS

The primary step was to consolidate the information in the way it would be exploited, for many uses: no information was captured as a simple, single-text-box entry. Instead, information was structured and organized so that we could manipulate it, export it, or import it into our production models. Not all the data was imported into our 3D production models, however; that was never the intention. The goal was to bring in the added-value data we wanted to manipulate for a specific use: for a room, its name, number, technical sheet number, or finishes; for medical equipment, its code, number, and dimensions (to ensure its layout), its electromechanical needs, or its load calculations in the case of certain critical equipment, which we classified from P0 to P4.

First, we connected the rooms in our informational database with our 3D models. No room was placed in our model unless it appeared in the list of rooms in our database, limiting errors in the program and allowing us to improve it by catching errors and omissions. Then, since we worked with equipment lists in our database, we could model and place only equipment specified in the program, once again limiting errors and omissions. No equipment was forgotten or lost.

From the medical equipment data, we also took the dimensions to model each item automatically at a relatively low level of detail, though sufficient to coordinate it. Specifically, we created a parametric object in the form of a box based on the overall dimensions, with another, nested box for clearances. The facility's medical equipment was automatically generated in our modelling tool in this generic format. This level of detail was sufficient for the project stage, allowing us to make design decisions based on the client's needs and optimize the program while supporting our design teams in coordinating with the project stakeholders.
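The room and equipment checks described above reduce, in essence, to set comparisons between the model and the program database. A minimal sketch follows; the function and the room numbers are invented for illustration and are not the project's actual validation tooling.

```python
# Hypothetical sketch: validate that every room placed in the 3D model
# exists in the program database, and flag program rooms not yet modelled.
# Room numbers are illustrative only.

def validate_rooms(model_rooms: set[str], program_rooms: set[str]):
    """Return (rooms modelled but absent from the program,
               rooms programmed but not yet modelled)."""
    not_in_program = model_rooms - program_rooms
    not_modelled = program_rooms - model_rooms
    return not_in_program, not_modelled

program = {"1.001", "1.002", "2.104"}
model = {"1.001", "2.104", "2.999"}  # 2.999 was placed by mistake

extra, missing = validate_rooms(model, program)
# extra   -> rooms to remove from the model or add to the program
# missing -> rooms still to be modelled
```

The same comparison applies unchanged to equipment codes, which is how a database-driven workflow guarantees that nothing is forgotten or lost.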
We then improved the modelling of some critical equipment, which always remained attached to the database. At this point, dRofus supported us in several ways, including:

  1. Ensuring that all specialized equipment synchronized from the owner's database to ours corresponded in real time; and
  2. Providing the dimensions of the client's equipment (height, width, length, weight) and its electromechanical needs.

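Using the synchronized dimensions, the generic box-with-clearance object described earlier can be sketched as a small data structure. The field names and the clearance value below are hypothetical placeholders, not the project's actual parameters.

```python
# Hypothetical sketch: turn the equipment dimensions received from the
# owner's database into a generic parametric box plus a nested clearance
# box, as a low-level-of-detail placeholder for coordination.
from dataclasses import dataclass

@dataclass
class EquipmentBox:
    code: str
    width_mm: float
    length_mm: float
    height_mm: float
    clearance_mm: float = 600.0  # illustrative service clearance per side

    def envelope(self) -> tuple[float, float, float]:
        """Overall box taken directly from the database dimensions."""
        return (self.width_mm, self.length_mm, self.height_mm)

    def clearance_envelope(self) -> tuple[float, float, float]:
        """Outer box adding the access/service clearance on plan."""
        return (self.width_mm + 2 * self.clearance_mm,
                self.length_mm + 2 * self.clearance_mm,
                self.height_mm)

# Illustrative item: a large imaging unit (invented code and dimensions).
scanner = EquipmentBox("EQ-IMG-01", 2100, 3200, 1900)
```

Because the object is driven entirely by database fields, a weekly synchronization that changes a dimension automatically updates the placeholder geometry, keeping the model coordinated with the owner's data.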
BUSINESS BENEFITS

The business benefit of such an initiative is an increase in overall project quality, based on the delivery of high-value information that unlocks collaboration and enhances coordination. We faced many challenges on human and technical levels, but we managed to import data into our 3D models and drive the models with accurate, real-time information. This data was made available seamlessly to designers, programming teams, modellers, and stakeholders, ensuring the client's design requirements were met. Access to information is probably one of the most critical elements of a project delivery process, since all stakeholders and actions rely on data; the lack of data, or its fragmentation, can have catastrophic effects on production time, project quality, and even trust between the stakeholders involved. Implementing a technology that can centralize and organize the entire data flow is critical to success. Projects are getting bigger, more complex, and must be delivered in record time.

Data flow management is probably one of the most underestimated elements of productivity improvement, and yet probably the missing link to a fully mature BIM ecosystem. It is not a matter of imposing software or a process on a third party, but of how the ecosystems can be tied together to share data at the right time, whether that data is graphical or non-graphical, in a structured and usable way. Data loss, misinformation, and poor quality are well documented in the industry as sources of lost efficiency and quality; integrating such an approach mitigates this risk. Beware of the pitfalls: the intention is not to impose, but to structure and harmonize the data so it can be shared efficiently, avoiding wasted time and keeping the focus on producing high-quality projects. Isn't that the greatest promise of BIM?
