
The shape of BIMs to come


Predicting the future may be a mug’s game, but increasingly, it’s one we all must play as we place our bets on the technologies reshaping AEC.


With a multitude of both established software companies and ambitious start-ups looking to transform how the AEC industry uses data in its projects, how on earth do users plan for what comes next? It’s a topic I’ve spent many hours contemplating and one I’m regularly forced to re-evaluate with every new start-up that comes along.

In this article, it’s my intention to dig down a little into the technology trends I feel will have the most significant impacts on the AEC sector, as well as look at some of the development work currently in progress. And along the way, I’ll be channelling my inner Mystic Meg to make some predictions, based on what I see in the market.

But first, let me take a step back to around 2015 or 2016, when I was researching an article on AEC tech stacks. This involved me interviewing industry design managers about the challenges they were facing. What came across loud and clear was their general frustration with the lack of innovation in architectural BIM. Revit development was seemingly flatlining, costs were rising and, for many, establishing good practices when it came to BIM and producing clean, structured data felt out of reach.

Solid productivity improvement is a perennial goal for most firms, of course, along with unlocking value from the gigabytes of data they create. Other challenges include eliminating bottlenecks from BIM workflows, keeping on top of evolving hardware, software and skills requirements, finding ways to liberate and integrate data caught up in silos, not to mention staying one step ahead of ISO rules, terms and definitions.



But at that time, in the mid-2010s, it seemed as if every vendor was channelling the bulk of its R&D budget into building cloud-based systems to deliver PDF drawings to contractors, rather than fleshing out existing design tools and introducing new ones that might help customers tackle their own pressing challenges head-on.

Fast-forward to today, and much has changed. We now have a host of new BIM design tools in open development, many of which have attracted venture capital funding and some that are yet to emerge from stealth mode. And almost everything that is emerging is, first and foremost, cloud-based.

Shifting to cloud

The shift to the cloud has been all-encompassing. In response to a 2020 open letter from Revit customers expressing concern over the product’s future roadmap, Autodesk confirmed that there would be no ground-up rewrite of its industry-dominating BIM modeller along the lines of its current desktop-based incarnation.


Instead, Autodesk CEO Andrew Anagnost expressed the view that he didn’t want to “create a faster horse”. The future of all the company’s solutions, he insisted, would be “cloud-first”.

Since then, the company has delivered the first instalment of its AEC cloud-based vision, Autodesk Forma, aimed at architectural conceptual design. And, over time, Autodesk’s other desktop apps will be rewritten as cloud-first applications, with both data and apps residing in Autodesk’s growing cloud infrastructure.

While all this seems relatively new, the company had been working towards this point for some years before launching Autodesk Fusion in 2013. Fusion was the company’s first pure cloud-based application and part of a wider vision to rewrite all of Autodesk’s common and core software functions as web services that could form the basis of new web-based applications. This vision, initially named Forge and since rebranded as Autodesk Platform Services, was a seriously bold long-term move: a colossal bet by the company that the next platform would be the cloud.

Other vendors seem to agree. In the past eighteen months or so, I have not seen a single new AEC application from any developer that could be construed as a desktop application, in the sense that it can wholly run on a local workstation, without an Internet component or connection.

In software palaeontology terms, we are undoubtedly in the early ‘cloudicene’ period. If you are creating a new application for commercial launch within the AEC space, it’s almost certainly cloud-centric and only available via subscription.

At the same time, there are still a lot of legacy applications that have yet to be replaced or rewritten and customers probably don’t realise this yet. The process of shifting 100% to the cloud could take a decade for many firms. So is there a chance that we might have a more hybrid desktop / cloud future?

It’s an interesting question. In the mechanical CAD (MCAD) industry, we have seen two new cloud-based applications developed and released at great cost with a view to challenging the supremacy of Dassault Systèmes’ desktop Windows-based Solidworks.

Both have failed to achieve their much-stated aim of usurping this market-leading product. The first is Onshape, created by Solidworks founder Jon Hirschtick, which was sold to PTC in 2019 for $470 million, thus joining PTC’s stable of desktop applications.

The other contender, as previously mentioned, is Autodesk Fusion. This is still in development but continues to play second fiddle to the company’s desktop-based Autodesk Inventor.

Maturity has certainly been an issue when it comes to the feature sets of both contenders — especially in an established market full of power users. The flexibility the cloud offers has so far not been sufficiently compelling to beat the depth of functionality inside many desktop products.

In my view, MCAD is different to AEC, and it’s perfectly possible that the same problems won’t thwart the ambitions of cloud-based AEC applications in the same way. However, the risk remains that it will take new cloud apps a lot longer than expected to oust desktop BIM tools. Firms such as Graphisoft are rearchitecting their BIM software to run in a hybrid manner, where data and processing runs either online or locally on the desktop. Ultimately, the deciding factor could be a generational shift in AEC, as younger workers join the industry.

I, for one, am not happy renting a computer in the sky when I have a perfectly good one in front of me — and one, I might add, that doesn’t levy micro-charges for access and compute. Less King Canute, more King Compute.



Files to databases

One knock-on effect of the shift to the cloud is in the way data is structured. File-based workflows have been the way we have worked forever, but they’re susceptible to loss, corruption and duplication. Users create data stored on a local drive, usually in a proprietary CAD file format such as DWG or RVT. These core design files are then used to generate hundreds of document files, like PDFs, which also need managing across projects, networks and users.

But as data is increasingly kept in the cloud, it needs to be structured differently. It needs to be ‘streamable’ for lightweight transmission. It needs to be atomised for better dynamic sharing, providing users with just the data that’s relevant to the task in hand, rather than sending everything plus the kitchen sink across the Internet. BIM files, in particular, can grow large and unwieldy in a short space of time. Project data needs to expand rather than collect in individual pools, unique to each application’s schema. Data silos are traps that inhibit collaboration. These pools need to become unified data lakes of spatially-related project data. All this is happening.
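To make that contrast concrete, here is a minimal sketch of what an atomised, query-based exchange might look like, as opposed to pulling a whole model file across the wire. The host, endpoint, project ID and property names are all hypothetical, invented purely for illustration.

```python
# Hypothetical sketch: requesting only the objects a task needs from a
# cloud project database, instead of downloading a multi-gigabyte BIM file.
# The service, endpoints and field names below are invented for illustration.
import requests

API = "https://api.example-aec-cloud.com/v1"        # hypothetical service
HEADERS = {"Authorization": "Bearer <access-token>"}

# Ask for just the doors on Level 3, with only the properties we care about
query = {
    "project": "P-1042",
    "filter": {"category": "Doors", "level": "Level 3"},
    "fields": ["id", "type", "fireRating", "width"],
}
doors = requests.post(f"{API}/elements/query",
                      json=query, headers=HEADERS).json()["elements"]

# The response is kilobytes of relevant data, not gigabytes of model
for door in doors:
    print(door["id"], door["type"], door.get("fireRating", "unrated"))
```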

Autodesk and Bentley Systems are both promoting unified database structures. In the case of Autodesk, it’s Autodesk Docs. For Bentley, it’s iTwin. Autodesk’s approach is proprietary and resides in the company’s cloud. Bentley’s iTwin, by contrast, has been made open and is portable.

Other BIM software developers, such as Vectorworks, are integrating database connectivity to augment their file-based workflows. Graphisoft takes a unique hybrid approach, where data can be either local or in the cloud. It looks like BIM file formats as we know them will become either merely transactional, or legacy. As Autodesk’s Anagnost has said: “Files are dead things working.”

There are some efforts within the industry to create open data frameworks, so that AEC practices can remain in control of their own data without having to rely on, and possibly get trapped by, a commercial cloud platform (see the ‘Openness’ section below).

API access

In a cloudy future, ‘sending data around’ will be a last resort. Design applications will instead ‘come’ to where the data is stored and be accessed via APIs (application programming interfaces) that grant access to perform tasks on permitted data.

Historically, a software developer would write a specific application that sat on top of desktop Revit to access the BIM data. In the future, and in some cases right now, new start-ups are writing applications that reside in the cloud and simply connect to a customer’s data stores to perform tasks. If the data is in a file on a desktop machine, plug-ins extract the data and send it to the cloud for processing. In a cloud-centred process, seamless connectivity between applications is possible.
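As a rough illustration of that extract-and-send pattern, here is a minimal sketch. The extraction helper and the analysis service are hypothetical inventions; a real plug-in would use the authoring tool’s own API (the Revit API, for instance) to read the local model.

```python
# Hypothetical sketch of the extract-and-send pattern: a desktop plug-in
# pulls data out of a local BIM model and posts it to a cloud service for
# processing. extract_rooms() and the job endpoints are invented here.
import time
import requests

SERVICE = "https://api.example-analysis.com/v1"      # hypothetical service
HEADERS = {"Authorization": "Bearer <access-token>"}

def extract_rooms(path: str) -> list[dict]:
    """Stand-in for a plug-in export; a real one would call the authoring
    tool's API to read room data out of the local model file."""
    return [{"name": "Office 101", "area_m2": 24.5, "glazing_ratio": 0.4}]

# 1. Extract locally, 2. submit a processing job in the cloud
rooms = extract_rooms("C:/projects/tower/model.rvt")
job = requests.post(f"{SERVICE}/daylight-jobs", json={"rooms": rooms},
                    headers=HEADERS).json()

# 3. Poll until the heavy compute (on someone else's machine) is done
while True:
    status = requests.get(f"{SERVICE}/daylight-jobs/{job['id']}",
                          headers=HEADERS).json()
    if status["state"] == "done":
        print(status["results"])
        break
    time.sleep(5)
```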

However, one of the issues with cloud-based anything is that it involves somebody else’s computer. There is often a charge associated with hosting data, as well as micro-transaction charges for API calls and data transmissions between cloud servers. In addition to subscriptions for tools, there will be some kind of token payment system for usage, possibly per API call. Usage will be metered.

Openness

We live in exciting times. For the majority of BIM history, the only open standard was IFC (Industry Foundation Classes), which has been seen as a lowest-common-denominator outcome. In fairness, it has suffered along the way from some poorly executed export implementations by software vendors.

Now it seems as if we are on the cusp of being spoiled for choice, with 3D formats coming out of our ears. USD (Universal Scene Description) has been taken up by key industry players, who are working with the Khronos Group to harmonise it with the glTF scene format. There are many emerging metaverse standards for 3D, such as Cesium’s 3D Tiles for 3D geospatial content. Data mobility is set to dramatically increase as the barriers to sharing fall.
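For a tiny taste of what vendor-neutral data looks like in practice, the sketch below writes a crude wall placeholder to USD using Pixar’s usd-core Python package. It is illustration only: the prim path, dimensions and the idea of modelling a wall as a scaled cube are my own assumptions, and real AEC payloads would carry far richer semantics.

```python
# Minimal sketch: authoring a trivial element in USD, an open scene format
# that any USD-aware tool can read. Requires usd-core (pip install usd-core).
from pxr import Usd, UsdGeom

stage = Usd.Stage.CreateNew("wall.usda")             # .usda is human-readable
wall = UsdGeom.Cube.Define(stage, "/Building/Wall")  # crude wall placeholder
wall.CreateSizeAttr(1.0)                             # unit cube...
UsdGeom.XformCommonAPI(wall.GetPrim()).SetScale(
    (4.0, 0.3, 2.7))                                 # ...scaled to 4 x 0.3 x 2.7 m
stage.GetRootLayer().Save()                          # readable by any USD tool
```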

At its Autodesk University (AU) event last year, the company announced a significant interoperability deal with Trimble, Ansys and Nemetschek, which is soon to be officially ratified. Previously, Autodesk had signed up to use the IFC libraries from the Open Design Alliance (ODA) and build out comprehensive file translation web services for its Forma offering.

I now believe that Autodesk is very serious about driving openness and interoperability in the AEC space, which may seem counterintuitive when the company has benefited from DWG and RVT lock-in for decades. But it was reassuring to hear Anagnost say at the AU press briefing: “It’s not our data, it’s the customer’s data.”

This is significant. This is a ‘Berlin Wall coming down’ moment for the industry — and there’s no David Hasselhoff around to spoil it by singing. All the key software firms agree that data needs to flow between applications and that proprietary approaches work against everyone’s best interest. I hope the promise of this becomes reality.



Ecosystems versus cloud APIs

Autodesk has benefitted from generating its own product ecosystems, bundling them up first as Suites and then as Collections. While its apps might not all work together seamlessly, AutoCAD, Navisworks, 3ds Max, Revit, Forma, Docs, Civil 3D and Recap are all actively in use by customers. By moving to the cloud, embracing openness and championing APIs over files and applications, Autodesk opens the door for customers to de-bundle as they move off the desktop, building a tech stack of best-in-class cloud apps and services from multiple vendors and subscribing only to the Autodesk components they actually need.

With proprietary file locks looking like a thing of the past, tools and services will be much more fluid than they are today. Firms will need to think more data-centrically, while managing their tech stack of cloud services. With data fluidity, we should become less concerned about being tied to specific point authoring applications.

Artificial intelligence

Yes, AI is deeply overhyped. Yes, it’s a marketing checkbox for any software firm that has integrated ChatGPT or Midjourney into its product. That said, AI is going to have a long-lasting impact on the AEC industry.

Companies such as SWAPP are attempting to use AI to automate processes such as transforming simple sketches into detailed BIM models, or slashing drawing production times from months to minutes. AI is already being used in analysis tools and in generative design. Architects are deploying it to write Python and other code, extract data from files, heal models, optimise sketching, assess energy performance and reverse-engineer BIM objects out of dumb point clouds.

Structural engineers are analysing dams, bridges and tunnels, running AI over videos to identify cracks. Companies like Augmenta are wiring up all electrical components in a BIM model.

And this is just the start. The fact is that buildings and infrastructure are recipes; they are patterns. Structures obey physical laws and materials have inherent properties. AI design automation is an inevitability, but it is likely to occur in association with human interaction, albeit the interactions of fewer humans than might previously have been needed to complete a project.

While there will be generic AIs for building design, the focus for large practices will be the development of their own in-house design AI, trained on past projects and capturing and reusing the knowledge of generations of employees.

Footwear firm Adidas has an in-house AI that contains thousands of photos of every trainer [sneaker] the company has ever made. Designers can use the tool to interact and sketch, with the AI suggesting designs with details from previous generations of footwear.

In the long term, the simpler the building type, the more variation on a theme is possible and the easier it will be for AI to become an expert design and engineering system, bypassing current project phases and going straight to 1:1 digital fabrication.



Automation

Taking a step back, before we make the whole industry redundant, one of the current hot topics for development is the creation of truly automated drawings. If firms analysed the hours, software and skills required to create documents, they would see that these collectively represent a considerable slice of operational costs. Even if the first generation of this kind of technology could only automate 50% of the documentation production, the savings would be huge.

There are several firms already working to realise this, namely SWAPP, Bentley Systems, Graphisoft and Graebert. The rules these systems work with might be derived from scans of past layouts, or configured manually via checkboxes.

Graebert already has an early version of an automated drawing production tool out, which could be offered as a generic SaaS service or licensed to developers as an in-application component. As a cloud service, you would simply upload models of any origin and get back drawing sets to your specifications, or use automation with APIs to ensure a constant supply of up-to-date drawings derived from specific project BIM models.
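To give a feel for how such a service might be driven over an API, here is a minimal hypothetical sketch of the upload, request and download loop. The endpoints and specification fields are invented for illustration; they are not Graebert’s actual API.

```python
# Hypothetical sketch of driving an automated drawing-production service:
# upload a model, request a drawing set to a specification, poll, download.
# The service, endpoints and spec fields are invented for illustration.
import time
import requests

SERVICE = "https://api.example-drawings.com/v1"      # hypothetical service
HEADERS = {"Authorization": "Bearer <access-token>"}

# 1. Upload a model of any origin (IFC here)
with open("tower.ifc", "rb") as f:
    model = requests.post(f"{SERVICE}/models", files={"file": f},
                          headers=HEADERS).json()

# 2. Request a drawing set to our office specification
spec = {"sheets": ["plans", "sections"], "scale": "1:100",
        "titleBlock": "office-standard", "format": "pdf"}
job = requests.post(f"{SERVICE}/models/{model['id']}/drawing-sets",
                    json=spec, headers=HEADERS).json()

# 3. Poll until the sheets are generated, then download the set
while requests.get(f"{SERVICE}/jobs/{job['id']}",
                   headers=HEADERS).json()["state"] != "done":
    time.sleep(10)
result = requests.get(f"{SERVICE}/jobs/{job['id']}/result", headers=HEADERS)
with open("tower-drawings.pdf", "wb") as out:
    out.write(result.content)
```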

The problem here for our current BIM tools is that their key output is coordinated drawing sets. If AI can do a much better job of automating more of that work, it makes half the functionality of BIM packages redundant. Modelling could be done in anything: Rhino, SketchUp, BlenderBIM. In the future, will we still need monolithic modelling-to-drawing BIM solutions? The race has already started.

Those who want to explore automation now can use Speckle’s impressive new automation capability to reduce time-intensive tasks. By combining Speckle’s connectors, which work inside today’s BIM tools, with its APIs and SDK, it’s possible to create automated workflows based on key triggers: for example, running an analysis or a QA/clash-detection pass whenever a changed design is uploaded to Speckle.
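As a flavour of what that looks like in code, here is a minimal sketch using specklepy, Speckle’s Python SDK. It fetches the latest commit on a stream and runs a trivial QA rule over the received objects. The server, stream ID, token and the ‘height’ rule are placeholder assumptions, and Speckle’s hosted automation environment wires up the commit trigger for you, so treat this as the shape of the idea rather than a production function.

```python
# Minimal sketch with specklepy: fetch the latest commit on a stream and
# run a trivial QA rule over the received objects. The stream ID, token
# and the 'height' check are placeholders for illustration.
from specklepy.api.client import SpeckleClient
from specklepy.api import operations
from specklepy.transports.server import ServerTransport

client = SpeckleClient(host="app.speckle.systems")   # or your own server
client.authenticate_with_token("<personal-access-token>")

stream_id = "<stream-id>"
latest = client.commit.list(stream_id, limit=1)[0]   # most recent commit

# Receive the commit's root object graph from the server
transport = ServerTransport(client=client, stream_id=stream_id)
root = operations.receive(latest.referencedObject, transport)

def walk(obj):
    """Yield every nested Speckle Base object in the received graph."""
    yield obj
    for name in obj.get_dynamic_member_names():
        value = getattr(obj, name, None)
        items = value if isinstance(value, list) else [value]
        for item in items:
            if hasattr(item, "get_dynamic_member_names"):  # nested Base
                yield from walk(item)

# Trivial QA rule: flag wall-like objects that carry no height property
issues = [obj.id for obj in walk(root)
          if "Wall" in getattr(obj, "speckle_type", "")
          and getattr(obj, "height", None) is None]
print(f"{len(issues)} wall objects have no height property")
```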

The Speckle team is also playing with applications of AI, and recently demonstrated the ability to use ChatGPT to ‘talk to’ BIM objects in a model.

BIM tools

With mature desktop applications such as Revit having their functionality tagged for transition to the cloud, there is a sense of opportunity for new players to come to market. In other words, they want to act fast, while Autodesk is distracted with managing the shift. But applications are sticky, and users don’t like to see professional knowledge they worked hard to attain being devalued. Even if there were no new competition, Autodesk’s new cloud-based tools would still face opposition from ingrained, desktop-centric, file-based Revit users.

In competition for the BIM market are Snaptrude, Arcol and Qonic, with at least two others in stealth and still to come to market. These cloud-first applications are now some way into their BIM development journeys. Qonic focuses somewhere in between architecture and construction BIM but offers no built-in 2D. Snaptrude is intent on mirroring Revit’s core feature set, from concept to docs. Arcol is very much targeting SketchUp and early-stage design. All aspire to replace incumbent BIM players.

Autodesk will be fighting a rear-guard action, adding a small number of new capabilities to Revit over time, while fleshing out its cloud-first Forma offering. That is, unless it decides to acquire again, as opposed to build.

The Autodesk AEC group has form here, having acquired Softdesk in 1996 to build AutoCAD Architectural Desktop, and then buying Charles River Software, better known as Revit Technology Corporation, in 2002.

I think most of us in the industry were surprised to see Snaptrude exhibiting at Autodesk University 2023. Was this out of a deep interest in the younger company by Autodesk or a reflection of Autodesk’s new openness policy? In the past, long-time trusted third-party developers have been ‘uninvited’ from AU for introducing one new feature which overlaps Autodesk functionality — so this was an unexpected appearance.

When you consider these attempts to automate the production of entire drawing sets, or to enable AI to build detail models, I think it’s clear that the nature of what we expect a ‘BIM modeller’ to do is set to change radically in the coming years. We need to become less application-centric and more data-centric. For now, the applications we use, and the formats they write, dictate the ecosystem we build and which tools and services we tend to buy. If we are to achieve true openness and fluidity of data, we must shed these historical constraints. With SaaS cloud apps, API integration, open data and more application choice, we have the potential to craft best-in-class ‘Mr. Potato Head’ tech stacks.



Hardware

The rise of AI is also going to impact our computers. Processors in personal workstations and cloud servers are starting to feature dedicated AI neural silicon. AMD, Nvidia and Intel have GPUs with AI-optimised cores.

There’ll also be a new generation of Accelerated Processing Units (APUs) that combine CPU, GPU and Neural Processing Unit (NPU) on a single piece of silicon. These will offer incredible firepower, with plenty of options to offload processing to a variety of dedicated cores, while using less energy than previous generations. AMD and Intel have the technology to do this in-house. Nvidia has teamed up with ARM; so far, it has only delivered integrated CPU/GPU chips for the datacentre, but there are rumours of a forthcoming desktop chip that can run Windows.

The big question is: where will the majority of the processing take place? With software and data looking likely to go to the cloud, will firms have to pay to rent hardware and maintain constant connectivity? To date, GPUs in the cloud have been expensive. Cloud might not always be the right solution, but for AI, the hardcore processing must be done where the data resides.

Extended reality

Thanks to gaming and Meta, virtual reality (VR) headsets have become accessible for any firm, and they offer plenty of immersive benefits for designers. Headsets like the Varjo XR-4 represent the current state of the art and are used in lots of product and automotive design houses, but they don’t come cheap. We are all waiting for a proper look at the Apple Vision Pro, which is currently shipping in the US. While, again, it’s not cheap, this first-generation headset looks set to deliver the wide-angle, high-resolution mixed reality experience that we have been hoping for. Although the app culture has yet to be established, we expect to see AEC use cases for the Apple headset within the first few months of 2024. This is the year that extended reality, or xR, finally starts to deliver.

Conclusion

In computing history, changes of platform, such as Unix to DOS and DOS to Windows, have always been moments at which market-leading applications were at their most vulnerable. This is one of the key reasons why Autodesk jumped with both feet into cloud application development: to get ahead of the game with Fusion in the MCAD space. The question remains: why did it fail? Was it that customers weren’t ready for cloud-based applications? Were feature-rich desktop applications still more compelling? Even when offered at the crazy original subscription price of $50 per month, with $10,000-worth of CNC machining functionality thrown in, Fusion still didn’t make a dent in the Solidworks installed base.

So why will a cloud transition be different in AEC? The first thing I’d point out is that the leading player in AEC is actively looking to make that technology transition itself, rather than passively allowing itself to be usurped. Autodesk’s Forge (now APS) platform for cloud software development has matured, and while Autodesk has been cutting its teeth with Fusion, it has also spent years developing web services like Autodesk BIM 360 and Autodesk Construction Cloud. Having delivered Forma, Autodesk can work on migrating its customers’ BIM data to the new unified database, ultimately relieving Revit of the file burden, with an intermediate stage in which Revit becomes a much faster thick client.

At the same time, Covid and working from home forced many firms to adopt cloud infrastructure, and they have come to appreciate the benefits as a result. Products such as Figma, a collaborative tool for interface design, have already demonstrated that collaborative workflows translate well into web-based tools. The issue is that Autodesk wasted a lot of time trying to come up with a cloud reimagining of its BIM authoring tools (as seen in Projects Quantum and Plasma), and this has created space for new developers to try to get there first.

This is the first time in twenty years that new BIM modellers have been coming to market, mainly inspired by the Figma concept. The stickiness of Revit, its maturity versus the immaturity of the start-ups, and good old user inertia will all play major roles here. But I believe that open formats, user-controlled data frameworks, AI tools and other web services, such as cloud-based automatic documentation, pose potential existential threats to the historic BIM workflow that Revit has defined, regardless of whether it resides in the cloud or on the desktop. It’s the most interesting time for AEC tools in decades.


The voice of the software industry

It wouldn’t be fair if we were the only ones to give predictions for the future of BIM, so we asked some leading start-ups to share their thoughts too.


Altaf Ganihar: CEO and founder, Snaptrude

BIM has always been a promised land that hasn’t arrived. Today, BIM is treated more as a process of delivery than a process of design implementation and execution, and that’s where the true potential of BIM has gone unrealised.

We talk about design to construction being a non-linear workflow, yet traditional BIM tools enforce a more linear approach to execution. With the transition to cloud-based database models, we might see that changing: moving away from static, proprietary, siloed files to interconnected datasets that are more fluid, accessible and updatable in real-time. More importantly, this enables collaboration (synchronous as well as asynchronous) with varied degrees of access control. And if it is presented in a modern, intuitive user experience (UX), we might see everyone from design boards in offices to construction sites collaborating on a stream of data residing in a database, contextually presented to the right stakeholders with the relevant amount of detail.

To explain this further: for sustainability studies, one needs data in a certain format (a planar representation); for construction, one needs to create and assign submittals; and all of these could be derived from the parent source of data and represented with the relevant amount of detail. And, of course, there is AI. Once we start seeing centralised data pools, we will see what modern machine-learning algorithms can deliver.


Tiemen Strobbe: Head of product and co-founder, Qonic

There’s no one-size-fits-all answer to the question ‘What is the shape of BIM to come?’ The answer will most likely include collaboration. BIM technology should be cloud-first to enhance digital collaboration at scale, leading to more streamlined flows of information, higher mobility (in the field), and access through different platforms (browser, mobile and desktop).

Another part of the answer lies in performance. Regardless of the size and complexity of the BIM, users should not have to adapt themselves to the limitations of the technology. The tools of the future should be comfortable working at scale and remain performant down to a construction level of detail. Openness matters too: technology built on open standards, connecting with the other tools and workflows already in use, and available via powerful APIs, so anyone can contribute new logic, styles or approaches.

A final part of the answer lies in a user-friendly interface. Today’s tools should be easy and intuitive enough to use that all project architects, contractors and field engineers are able to extract maximum benefit from them.

These are all valid answers, and we strongly believe that future BIM technology should be collaborative, scalable, fast, open and accessible for everyone. That said, we consider these characteristics to be the minimum requirements for BIM technology to be viable in the market: so-called ‘table stakes’.

The real differentiator is enabling a model-centric approach, where the model drives processes ranging from concept design up to the delivery of the building and facilities management.

This kind of model-based workflow requires next-generation modelling technology, enabling the finest detailing of components. Think of tools to automate the modelling of real-life building systems (wall assemblies, manufacturable products, etc.) and of details and connections (such as sills, insulation stones, roof caps, etc.). This technology will offer enough depth to enable an AEC process ‘from cradle to grave’, finally making BIM live up to its promises.


Richard Harpham: Co-founder, Skema

Roughly 20 years ago, in an internal Autodesk meeting, I was presenting a concept for how we would serve a design, construction and operations BIM environment, where the visual ‘model’ entity and context would be handled in a similar way to a web page. We would design in a spatial context, with real-world physics, attached to all the necessary metadata, without the need for single monolithic files to contain all the BIM data. Basically, we would develop a structured spatial protocol, leveraging the internet, that could operate with HTML-like communication standards. The Web would then transcend its text- and image-based limitations to become a true context-driven decision environment. For me, this has always seemed an inevitability, not a possibility.

Now, after the last two or three decades of file-based BIM design solutions, the first sign of an immersive spatial web environment is starting to emerge as HSTP (Hyperspace Transaction Protocol), the next logical step in application-level protocols for objects and activities in spaces. With HSTP, we might serve a web of connected spaces, with spatial domains securely managing access to immersive and temporal context. Some are calling this Web 3.0; some call it the ‘metaverse’. First movers Meta, Magic Leap and Apple are already investing billions in these ideas. Apple’s $3,500 spatial headset with few apps seems nuts right now, but so did a $500 iPhone with no keyboard in 2007.

In AEC, we have already coined a term for spatial assets: the Digital Twin. But this has become a clumsy description for a concept that is not yet fully worked out. For many, the Digital Twin has been seen as a deliverable that emerges from the lifecycle of delivering a built asset, which can then be used for efficient operations. But if a spatial Web 3.0 becomes a reality, then a Digital Twin is no longer a deliverable; it is a real-time reflection of every moment during the building process. More a Digital Mirror than a Digital Twin. The amount of contextual decision data this would provide would be staggering, and probably far too much for an individual to view and parse, which is where AI comes in.

An integration of Artificial Intelligence (AI) leveraging ‘Games-Theory-Physics’ with Spatial-Temporal Resolution (HSTP) capabilities could offer architects unprecedented tools and insights for conceiving, developing and managing architectural projects. AI algorithms, coupled with HSTP spatial data, could process vast amounts of information in real-time, so architects utilising next-gen BIM software could gain deeper insights into site conditions, environmental factors and user behaviour. This facilitates more informed decision-making during the design phase, optimising building placement, orientation and overall sustainability. For example, a timber beam placed in a design would know it obeys real-world rules such as mass, inertia, bending moments, life expectancy and relative impact on cost and carbon, and the AI would constantly assess and offer the optimum solution or alternatives to the designer.

Maybe most importantly, the dynamic nature of HSTP spatial AI introduces the temporal dimension to BIM design, allowing architects to analyse how spaces evolve over time. This temporal understanding enables architects to create designs that not only respond to current conditions but also adapt to changing requirements. BIM models might simulate and visualise how buildings will perform under various scenarios, providing architects with a comprehensive understanding of the long-term implications of their designs.

Couple this with the already emerging capabilities of AI BIM solutions to remove the repetitive tasks of modelling and document creation, such as those from my current company, Skema Inc., and architects will be able to focus on the more creative and complex aspects of design. This not only increases efficiency but also reduces the likelihood of errors, leading to higher-quality designs.

One of my favourite movies is The Fifth Element, which plays on the idea that beyond earth, water, air and fire, a transcendent fifth element from medieval science called æther fills the universe beyond the terrestrial sphere. Maybe the integration of AI with HSTP into BIM software could exemplify an æther for architects to redefine the practice of architecture, with BIM 3.0 solutions that create more intelligent, adaptive and sustainable built environments.


Johan Hanegraaf: Co-founder, Arkio


For us, the future of BIM is about enabling more stakeholders to access and contribute to the same information model. Instead of recreating models and documents for every project phase, the same data can be used across the full building lifecycle.

We’ve seen how cloud sharing and real-time visualisation have made existing BIM tools more accessible, but most design applications are still isolated and not easy to work with.

The current design and coordination process still relies heavily on mail, chat and paper, which moves collaboration away from the latest models. If we can experience and design models together in real time, we can streamline the design process drastically. To enable working from any device and prevent duplicated work, model data should be streamed from and to all other design tools and databases.

Spatial computing devices like the Apple Vision Pro and Meta Quest 3 allow us to step away from traditional 2D interfaces and deliverables and bring information models into the real world. These immersive devices offer more natural eye and hand interactions with 3D models and simplify collaboration compared to traditional tools. Artificial intelligence will further enhance the design process, providing inspiration, assistance and automation for these spatial design workflows. We can’t wait to push these technologies further and move BIM out of the constraints of flat screens and into the era of spatial computing!

