Game Engines in the Construction Industry

Real-time visualization technologies in AEC

By Yiğitcan Karanfil, Design Technology Manager; Mark Cichy, Director of Design Technology; and Tim Lazaruk, Design Technology Manager, all of HOK Canada

The construction industry has seen a surge in tools that utilize game engines. Real-time visualization technologies have enabled AEC professionals to work more efficiently and have fostered new ways of conveying narrative, enhancing coordination, and aggregating data.

Game Engine Technology and its Interaction with the Industry

This trend is on the rise, and these technologies can unlock opportunities in the AEC industry. Video games are created using a digital toolkit called a “game engine” – an environment specifically built to accelerate visual exploration. Designers, engineers, and constructors use tools such as ArchiCAD, Revit, and/or Rhinoceros to create architectural designs, while game developers use game engines to create their end product: closed-package entertainment software, like video games. Some of these engines, such as Unreal Engine (by Epic Games) and Unity, are available to any developer, while others are developed as proprietary in-house technologies, such as id Tech (id Software) and Frostbite (Electronic Arts). While their primary purpose is entertainment, they possess the ability to process large amounts of data, simulate light and physics, and visualize and animate objects in real-time – features added to engines over time to increase the sense of realism and elevate the experiential qualities of gameplay. Coincidentally, many of these technologies and features also benefit the AEC industry.

Many game engine-based tools developed specifically for AEC have become ubiquitous. The most prominent application has been architectural and construction visualization. Tools such as Twinmotion (based on Epic’s Unreal Engine), Lumion, and Enscape have seen dramatic performance improvements thanks to the rapid advancement of GPU technology in the last few years. These applications can generate images almost instantly with impressive visual fidelity. While they might not produce fully “photo-realistic” renders, the images or videos are produced in a fraction of the time required by traditional rendering tools. They enable the production of rendered images in high volume and allow errors to be caught on the spot rather than hours into the render process. Game engines also provide the distinct ability to present models directly in virtual reality (VR). While VR is heavily marketed to the general public as a gaming solution, design professionals have been utilizing this technology to create immersive presentations for clients and/or to experience design spaces within collaborative virtual environments.

Game engines have now been streamlined to the point where a few clicks are sufficient to engage in the collaborative exploration of a design model – with just a few extra steps, one can put on an immersive headset and evaluate revisions and options in VR. VR headsets, gaming laptops, and external graphics processing unit (GPU) enclosures have become commonplace in AEC offices.

It is practical to utilize game engines for developing AEC software because they come pre-packaged with many of the features required for our uses. Navigation, physics, graphical representation, and data-processing technologies have already been developed and optimized, and they are designed to be user-friendly for creators. Additionally, they can be more cost-effective for the end-user, as gaming hardware has become increasingly commoditized.

Research into the explicit use of game engines for analysis is still in its early stages – many publicly funded and academic initiatives are making significant gains, however. Artificial-light and daylight analysis is mostly addressed through visualization tools, and many possibilities remain to be explored for wind, rainfall, structural (statics), and building-performance analysis. It is feasible to use game engines to derive quantitative metrics, but it will take time before this level of integration becomes ubiquitous and accessible within gaming platforms.

HOK is utilizing the tools discussed in this article (some publicly available on the market, others yet to be released) to explore multi-phase design states and objectives. We are constantly looking for new ways to leverage game engine technologies throughout our process. The Centre Block Rehabilitation project has been one such opportunity to explore some of these possibilities and develop innovative solutions that suit a unique and robust set of requirements.

Project Summary

  • Owner: Public Services and Procurement Canada (PSPC)
  • Name: Centre Block Rehabilitation Project
  • Size: 543,580 sq. ft. / 50,500 sq. m. (Parliament building)
  • Location and Address: Parliament Hill, Ottawa, Canada
  • Scope: Centre Block is one of Canada’s most iconic buildings and houses the seat of government for the country, including the House of Commons and Senate Chamber. The scope of the Centre Block Rehabilitation (CBR) project includes the comprehensive restoration of Centre Block and its integrated Peace Tower, along with the completion of the Parliamentary Welcome Centre Complex, and over 25 enabling and 40 investigative sub-projects.

The CENTRUS BIM Team (HOK/A49/WSP), in collaboration with the Carleton Immersive Media Studio (CIMS), is leveraging game engine technology to bring project data together in an interactive and immersive environment for all project stakeholders. The objective of the streamable assets project is to develop processes that allow digital assets (point clouds, BIM content, and other forms of data) to be streamed into Unreal Engine (UE). The purpose of the initiative is to create a platform that allows users to quickly and efficiently visualize multiple data sets within a single environment without the friction of moving between file formats and software platforms or managing large file sizes. The first phase of ongoing development involves procedures for streaming assets into UE. Beyond this, the team is working to develop a heads-up display (HUD) system for desktop/online use as well as virtual and augmented reality applications.

To meet the needs of architectural designers, historical conservators, and the client, the Unreal Engine streamable assets application is designed to serve multiple functions: (1) to support Centre Block BIM models and metadata, photogrammetry, and point clouds; (2) to provide the ability to view and compare new versus existing data; (3) to view and compare different phases of design; (4) to track historical assets and their locations in real-time; and (5) to be accessible via an interactive online portal.

Our visualization specialists are fulfilling stakeholder requirements by building explicit features into our interactive application. These features include: (1) crop boxes for slicing selected objects for comparative analysis – sectioning the building in exterior and plan views; (2) bounding boxes, used with the IFC model format, to target specific heritage assets; (3) a timeline tool that reveals content states via an interactive timeline, providing the ability to view data from a specific date or phase; (4) time-of-day controls to dynamically change the lighting in any given scene; (5) a room tour feature that guides the user around specific spaces, pointing out updates and key features; and (6) a save-user-settings feature that retains data such as repository location so that the user does not need to readjust settings on every login.
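To make the timeline idea concrete, the sketch below shows one simple way such a filter could work: each asset carries the date range during which its state applies, and the viewer displays only the assets valid at the selected date. The data model, field names, and sample entries are hypothetical illustrations, not the application’s actual implementation.

```python
# Hypothetical sketch of a timeline filter: each asset records the dates during which its
# state applies, and the viewer shows only the assets valid at the selected date.
# Field names and sample entries are illustrative, not the application's actual schema.
from dataclasses import dataclass
from datetime import date
from typing import List, Optional

@dataclass
class TimedAsset:
    name: str
    valid_from: date                  # first date this state of the asset applies
    valid_to: Optional[date] = None   # None means the state is still current

def assets_at(assets: List[TimedAsset], when: date) -> List[TimedAsset]:
    """Return the assets whose validity window contains the selected date."""
    return [a for a in assets
            if a.valid_from <= when and (a.valid_to is None or when <= a.valid_to)]

demo = [
    TimedAsset("existing_masonry_scan", date(2019, 1, 1)),
    TimedAsset("temporary_partition", date(2019, 6, 1), date(2021, 3, 31)),
    TimedAsset("proposed_interior_fitout", date(2022, 1, 1)),
]
print([a.name for a in assets_at(demo, date(2020, 7, 1))])  # scan + partition only
```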

Future versions of the Streamable Assets application will be streamed in real-time to any device with a web browser (mobile included). Users will send commands through the website, and the application will allow them to collaborate accordingly. Our team is working closely with Epic Games, and in late 2021 our application will formally adopt Epic’s Unreal Engine 5. This will allow our application to support (1) significantly more complex models; (2) much denser point cloud data sets; (3) denser photogrammetric scenes – multiple photogrammetry meshes loaded and visible at the same time; and (4) much more sophisticated lighting and environmental effects.

In 2018, Epic Games, the creators of Unreal Engine 4 (UE4), created an Enterprise team whose sole purpose is to support the AEC industry. Multiple open formats are now integrated into the engine and can be imported directly via a dedicated interoperability plugin known as Datasmith. This allows multiple types of data to live and interact inside one application. The most notable supported formats include Industry Foundation Classes (.IFC), GL Transmission Format (.GLTF), Wavefront OBJ (.OBJ), and Autodesk Filmbox (.FBX). The CBR Streamable Assets application takes advantage of these interoperable file types to get our data (solid models, meshes, point clouds, etc.) into the UE4 environment. Many authoring software platforms can export IFC, FBX, and GLTF, allowing flexibility when importing data into UE4 while retaining the geometric, material, and geo-location accuracy that is needed.
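For readers curious what a scripted Datasmith import can look like, the short sketch below uses Unreal’s editor Python API (run from inside the Unreal Editor) to bring a Datasmith scene into a project. The file path and destination folder are placeholders, and this is a generic illustration rather than the actual CBR import routine, which applies project-specific options.

```python
# Minimal sketch of a scripted Datasmith import using Unreal's editor Python API.
# Run from inside the Unreal Editor; paths below are placeholders, not CBR assets.
import unreal

def import_datasmith_scene(udatasmith_path, destination="/Game/ImportedScenes"):
    # Parse the exported .udatasmith file into an in-memory Datasmith scene
    scene = unreal.DatasmithSceneElement.construct_datasmith_scene_from_file(udatasmith_path)
    if scene is None:
        raise RuntimeError(f"Could not read Datasmith file: {udatasmith_path}")
    # Import the scene's geometry, materials, and metadata into the given content folder
    return scene.import_scene(destination)

import_datasmith_scene("D:/exports/sample_model.udatasmith")
```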

LiDAR point clouds are now supported and can be imported directly into UE4. Supported formats include .LAS, .LAZ, .PTS, .TXT, and .XYZ. On the CBR Streamable Assets project, we are using automation solutions to batch-decimate point clouds, making them easier for UE4 to handle while maintaining performance expectations.
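The specific decimation tooling is not described in this article, but the idea can be sketched in a few lines: thin an ASCII point cloud by keeping only every Nth point before it is handed to UE4. The folder names and the keep-one-in-ten factor below are illustrative assumptions, not the project’s actual settings.

```python
# Minimal sketch: thin an ASCII point cloud (.xyz / .txt, one "x y z ..." record per line)
# by keeping every Nth point. Folder names and the decimation factor are illustrative only.
from pathlib import Path

def decimate_xyz(src: Path, dst: Path, keep_every: int = 10) -> int:
    """Write every Nth point from src to dst; return the number of points kept."""
    kept = 0
    with src.open() as fin, dst.open("w") as fout:
        for i, line in enumerate(fin):
            if i % keep_every == 0 and line.strip():
                fout.write(line)
                kept += 1
    return kept

for cloud in Path("scans").glob("*.xyz"):          # batch over a folder of scans
    out = Path("decimated") / cloud.name
    out.parent.mkdir(exist_ok=True)
    n = decimate_xyz(cloud, out, keep_every=10)    # keep roughly 1 point in 10
    print(f"{cloud.name}: kept {n} points")
```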

Our applications have three main data types that require project-level integration: (1) solid models are exported from BIM and are built from mathematical curves that game engines do not support; converting these models to triangles and polygons creates highly complex, dense 3D meshes that impact real-time performance. To alleviate this, we developed a new optimization framework leveraging custom tools inside Autodesk 3DS Max. This tooling uses Autodesk’s software-integration APIs, in conjunction with MAXScript (the native language of 3DS Max), to read and translate BIM data from Revit while preserving its metadata, materials, and textures. Multiple model-optimization procedures are applied automatically, including the creation of the secondary UV channels required for proper static shadow creation inside UE4; (2) the photogrammetric model is meshed using RealityCapture, optimized using Houdini FX, and finally imported into UE4 using the .GLTF format; and (3) point clouds are imported into UE4 via an internal plugin, which batch-decimates and converts files into the .XYZ file type using our automation platform.
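As a rough companion to step (3), the sketch below shows one way a text point-cloud format such as Leica .PTS (a point-count header followed by “x y z intensity r g b” records) could be rewritten as plain .XYZ records. PTS variants differ, and this is a simplified stand-in for the internal plugin described above, not its actual implementation.

```python
# Simplified sketch: rewrite a Leica-style .PTS file (header line = point count, then
# "x y z intensity r g b" records) as plain .XYZ "x y z r g b" records.
# Illustration only; the project's internal UE4 plugin is not shown here.
from pathlib import Path

def pts_to_xyz(src: Path, dst: Path) -> int:
    converted = 0
    with src.open() as fin, dst.open("w") as fout:
        next(fin, None)                      # skip the point-count header line
        for line in fin:
            parts = line.split()
            if len(parts) >= 7:              # x, y, z, intensity, r, g, b
                x, y, z = parts[0:3]
                r, g, b = parts[4:7]
                fout.write(f"{x} {y} {z} {r} {g} {b}\n")
                converted += 1
    return converted

pts_to_xyz(Path("room_101.pts"), Path("room_101.xyz"))  # file names are placeholders
```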

Our team has had some challenges manipulating and working with the sheer scale of the data sets being imported and loaded at run-time. Currently, the experience cannot support all data being loaded at once, and since the density of data will only increase, additional optimization and data-structuring measures will be required. Some performance-related issues will be addressed by future software revisions; we are also investigating additional measures to ensure that we take full advantage of dynamically loading various streams of content.

Most of our data is separated into corridors and rooms. Subsequently, all data from each room needs to be combined into a single file that the main application can read. We are leveraging a format proprietary to UE4 to dynamically load content. This comes with some limitations, takes a long time to create, and requires a developer to coordinate and manage the complexity. We are gradually moving away from closed formats in favour of adopting the Datasmith file type. Epic is continuously expanding its data schema and anticipates that it will be adopted by industry-standard applications such as Rhino and 3DS Max.
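To make the per-room grouping concrete, here is a deliberately simplified sketch of how assets under a room folder could be gathered into a single manifest that a loader streams on demand. The real pipeline uses a UE4-proprietary container rather than JSON, so the layout, folder names, and extensions below are assumptions for illustration only.

```python
# Hypothetical sketch: gather every supported asset under a room folder into one manifest
# record so a loader can stream that room's content on demand. The production pipeline
# uses a UE4-proprietary format, not this JSON layout; paths and extensions are illustrative.
import json
from pathlib import Path

ASSET_TYPES = {".udatasmith": "bim", ".gltf": "photogrammetry", ".xyz": "pointcloud"}

def build_room_manifest(room_dir: Path) -> dict:
    """Collect every supported asset under one room folder into a single record."""
    manifest = {"room": room_dir.name, "assets": []}
    for f in sorted(room_dir.rglob("*")):
        kind = ASSET_TYPES.get(f.suffix.lower())
        if kind:
            manifest["assets"].append({"type": kind, "path": str(f)})
    return manifest

rooms = [build_room_manifest(d) for d in Path("data/rooms").iterdir() if d.is_dir()]
Path("rooms_manifest.json").write_text(json.dumps(rooms, indent=2))
```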

Our greatest technical challenge remains the limitations of currently available hardware. Our BIM data includes tens of thousands of models, each with its own metadata. The photogrammetric models captured from each room and space in Centre Block are composed of tens of millions of polygons. Complex point clouds contain tens of millions of points. The rendering power needed to display all of these complex data types simultaneously depends on significant advances in GPU hardware and potentially AI.

Game engine technologies provide fast rendering capabilities that facilitate the development and iteration of new ideas, concepts, and design explorations. They also provide a necessary bridge to new and expanding technologies like virtual and augmented reality applications. Their adoption shows no signs of slowing down, and that momentum translates into better integration between software applications.

The “Direct Link” between specialized AEC applications and game engines will continue to grow and expand. Feedback, comments, and changes made inside the game engine will ultimately be saved and applied to the host application in real-time and logged as a new iteration of design content. We are already seeing this trend in our own research and development efforts.

New streaming technologies will continue to grow and expand. State-of-the-art workstations will process complex applications and stream their content in real-time to any internet-connected device, with support for gesture-driven interaction included.

These new technologies will alleviate most of the current technical challenges, while the rest will be addressed by newer, more advanced hardware and software – like the recently announced Unreal Engine 5, said to include multiple advancements in real-time rendering technologies and the capability to support hundreds of millions of polygons and points. Gaming technology is advancing exponentially and offers substantial gains and opportunities to the AEC industry globally.
