In 2016, Autodesk announced Project Quantum, described as a platform technology for “evolving the way BIM works, in the era of the cloud, by providing a common data environment”. The project then went dark, but now it’s back, renamed Project Plasma.
At Autodesk University in 2016, the then senior vice president of products at Autodesk, Amar Hanspal, gave the AEC keynote and slightly opened the kimono on a new technology the company was developing to tackle crippling issues the industry faces because of its long history of working in a federated way – replicating poor collaboration between firms in the digital realm, leaving some of the benefits of BIM on the table.
Before this keynote there was considerable debate internally at Autodesk as to whether Quantum should have been exposed so early in its development. However, the backdrop to this was that many mature Revit customers were asking: what was next for Revit and where was development going?
Autodesk’s best Revit customers were concerned by the lack of updates to the core application. The move to ‘Suites’ and subscription had seemingly dissipated development into incremental updates across a wide number of applications, when the heart of most firms’ BIM efforts centred on Revit and collaborative workflows.
Revit is over 20 years old. While it has seen significant re-engineering, the core is still limited to running on a single CPU core, the database quickly swells in size and detail, and it suffers from an old graphics pipeline that is hard to accelerate in an increasingly GPU-rich world.
As Autodesk has moved to the cloud in a major way, the desktop applications really need to store the data on the cloud, in BIM 360, to benefit from Autodesk’s increasing array of cloud services, such as document management, analysis, collaboration, as well as a growing number of cloud-based applications from third-party developers.
Autodesk’s products also have a long history of not being able to share data as well as you would expect, coming from the same company. And, looking ahead to a world of digital fabrication, there are fundamental problems with BIM tools optimised to produce co-ordinated, symbolic 2D drawings, as opposed to driving CNC machines and robots, which require 1:1 modelling.
Quantum, as it was pitched, was an elegant solution to many of these challenges in moving forward to the next generation of BIM. You had to admire the decision to basically relook at the whole industry, how it works (or doesn’t work) and to realise that yet another monolithic application is really not going to map to the current federated industry workflows that silo data, are eminently file-based and ultimately damage the flow of data.
Autodesk had decided to look at a data-centric approach that could include current workflows, alleviate some of the pain by providing headroom to Revit, connect teams and address the growing use of multiple applications in designers’ tool sets.
Digital fabrication is coming to AEC, and not just at the high end. Factories are being built everywhere to prepare to modularise, pre-fabricate and utilise automated digital fabrication methods. BIM data, at 1:100 or 1:50, cannot drive this, not without being remodelled in a Mechanical Computer Aided Design (MCAD) application such as Inventor or Solidworks.
By adding high levels of detail to BIM models, databases swell up and quickly become unmanageable. Again, with Quantum, Autodesk’s solution introduced a novel approach, where BIM models would ‘hand off’, at set interface points, components that needed to be manufactured by better-adapted CAD systems. This meant that different professionals in the workflow could all have different versions of the same model, but they were connected by a common platform. What’s more, live geometry could be pumped around the system in real time for teams to see the model in various levels of detail.
This was indeed a brave new world and it seemingly was a very slick way of introducing a change in the way BIM would touch every player in a federated AEC project. Unfortunately, company politics intervened, Hanspal left Autodesk after the company’s board chose Andrew Anagnost as new CEO, and news on Quantum went dark.
Roll forward to Autodesk University 2018 and, in one of those corridor conversations that happen by chance, AEC Magazine learned that Quantum had indeed survived but had, in a way, been a victim of its own success. The technology was deemed so useful that the company decided to take a broader view of its potential across all products and verticals (e.g. AEC, manufacturing) and so paused to bring more internal stakeholders into its development as a platform technology. The net result is Project Plasma.
Earlier this year, AEC Magazine had the opportunity to talk with Autodesk’s chief software architect, Jim Awe, about the name change and the company’s vision for Plasma and its capabilities.
Awe explained, “What happened was, the idea of Quantum, and doing automated workflows in a trusted way, gained momentum. As we talked to customers, they definitely supported that idea and that was crucial to moving project collaboration forward. Then when we started to talk to other people around the company, we found that the manufacturing group were facing the exact same problem.
“Autodesk has traditionally been a design-based company and the goal of most of its products, like Revit, was to produce construction documents and then ‘throw them over the wall’ and have someone figure out how to make it. It turns out, manufacturing had the same issue, that they design everything upfront and then had to figure out which tools on the factory floor were going to make different parts of the assembly. They did the same thing and would break up the model, divide it up between multiple people who would figure out how to make their part in the process.
“This is a strategic shift. It’s not good enough to just design something, you have to be able to make it and in order to do that, you need a workflow that goes throughout the project life-cycle. This technology [Plasma] should be in the platform and so it became a much bigger, more elaborate, effort. We are taking our time to get it right because it’s so important.”
We asked Awe, how the platform worked, “The best analogy which describes what we’re trying to do here, is if you look at what Apple did with iOS. Apple has a platform, where you can build an App and plug into well-known services of the iOS system and then construct a mobile workflow. The App hands off the GPS location to maps, and photos are integrated into other workflows, all on your mobile device. We need to integrate enough of the pieces so that when you try and move data from, say Revit to a fabrication phase, you are moving the appropriate amount of data and the person on the other end knows where to find it and absorb it into their workflow.
“Carrying on with the Apple iOS analogy, we will definitely build some of our own applications plugged into this and we fully expect some of our Forge partners to hook into Plasma in lots of interesting ways.
“All the professional and IP boundaries are maintained with the data that is flowing between collaborators; it’s tracked, it’s scoped, so you’re not just sending the entire model across. In most instances it’s more likely to be a subset of it.”
Autodesk is calling this data exchange a form of a ‘data contract’, so users decide what limited data they want to share with each project participant and the system tracks and approves the exchange, as it goes through a series of ‘gates’. This means control is maintained by each of the originators in a project, sharing data which can’t be edited, so while the structural engineer can see the architectural elements, she can’t change components that the architect is responsible for and vice versa.
Data Contracts and Escrow
While Quantum went through some changes in the quiet period, Autodesk has come back with more detail of the mechanics and concepts of how it works. In many respects, it’s a combination of common data environment (some Autodeskers call this a Unified Data Environment) with transactional intelligence that is perhaps akin to digital currencies or banking systems.
The two core concepts in Plasma are the Data Contract and Escrow. A Data Contract could be defined as packaging up data for, say, the curtain wall fabricator – the gridlines and some other relevant information about the curtain wall – but not sending the rest of the model. Escrow is the neutral place through which all data passes, tracking every exchange.
The other piece of the puzzle is the necessary plug-ins for different applications. These plug-ins know how to push and pull data from the applications, as defined by user-defined Data Contracts. For instance, Revit will have a plug-in to pull data out of Revit and to receive data back from other project participants.
Autodesk will create these for all of its own relevant applications, but it will also make some for the most used non-Autodesk products and will provide toolkits for any developer to enable Plasma transactions.
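Since Plasma’s actual API has not been published, the Data Contract and Escrow mechanics described above can only be sketched. The Python below is purely illustrative: every class, field and method name (`DataContract`, `Escrow`, `package`, `exchange`) is an assumption, not Autodesk’s implementation.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of Plasma's Data Contract / Escrow concepts.
# All names are illustrative assumptions, not a real Autodesk API.

@dataclass
class DataContract:
    sender: str
    receiver: str
    scope: set          # which parts of the model may be shared

    def package(self, model: dict) -> dict:
        # Only the contracted subset of the model ever leaves the sender
        return {k: v for k, v in model.items() if k in self.scope}

@dataclass
class Escrow:
    ledger: list = field(default_factory=list)  # every exchange is tracked

    def exchange(self, contract: DataContract, model: dict) -> dict:
        payload = contract.package(model)
        self.ledger.append((contract.sender, contract.receiver, sorted(payload)))
        return payload

# The architect shares only gridlines and curtain-wall data with the fabricator
model = {"gridlines": [1, 2, 3], "curtain_wall": {"bays": 12}, "interior_walls": {}}
contract = DataContract("architect", "fabricator", {"gridlines", "curtain_wall"})
escrow = Escrow()
received = escrow.exchange(contract, model)
# 'interior_walls' never passes through Escrow; the exchange is logged
```

The point of the sketch is the scoping: the receiver gets a subset defined by the contract, while the Escrow keeps the auditable record of who sent what to whom.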
This is very similar to how financial systems work and we wondered if Autodesk was looking at using blockchain in this ecosystem. Awe replied, “We have discussed blockchain, and the Escrow part of the process could definitely use blockchain, but we have not decided to do that just yet. It may be overkill for what we’re trying to achieve with Plasma.
“There are new emerging technologies that may do what blockchain does in a more straightforward and lightweight way. Amazon, for example, just announced what it calls the Ledger Database, which basically behaves like blockchain. We are currently exploring how to secure the legal part of the system.
“If you look at the progression of how we’re trying to deliver this, the theory is the legal part. The implementation, so far, has focused on the data interoperability piece, which is what and how do you get the data automated in and out of the application. How do you construct a workflow that will spin up all the right compute nodes and send the proper notifications? The mechanism is now in place but we have not figured out the legal side as to how these transactions don’t get tampered with. We also have to work out what legal exposure Autodesk has in providing this service!
“This is more than just technology, it’s Autodesk looking at who is going to be responsible for that Escrow. There are a couple of options; Autodesk could decide to take responsibility, or maybe create a spin-off entity, or we could put the onus on the owner, or concede it to something like blockchain and have a technology solution, where nobody needs to be responsible as a technology is taking care of it. But for right now the mechanisms we have in place will work for any of those scenarios, but we haven’t decided yet how to administer it. That can come a little bit later as for now, we need to make sure that in our tests, the data flows in a reliable, automated way.”
Autodesk is currently trialling Plasma with a lot of internal prototypes, mostly with Revit and other Autodesk products, such as Civil 3D, Inventor and Fusion. Perhaps oddly, the development team is especially excited about the potential of links to Excel, where it is finding a lot of potential workflows, because users can do workflow authoring themselves, as opposed to relying on workflows authored by developers.
Most workflows for Plasma will require programs, either applications developed by Autodesk, or third-party developers, but end users can develop their own tools in products such as Excel and Dynamo which can process extracted data and export it out via a Data Contract or vice versa. Autodesk has developed a number of internal examples where they prototyped a little bit of logic that lives outside of Revit.
With data being submitted by federated users, the issue of quality management and standards across the system might be a concern. We asked Awe how quality checking of model data could be achieved. Awe responded, “That’s where the contract helps, because now you’re asking for specific data to adhere to that contract. We tested the contract system on a simple problem, wall framing boundaries, the framing for interior walls. The customer we were working with said they sometimes receive Revit designs which might have a single wall component, rising up seven storeys! The architect obviously thought that was the easiest way of modelling it for his version of the truth, but it makes no sense for those making interior walls from Revit models.
“In a wall framing boundary-based contract, it would state you are not allowed to produce a seven-storey wall, you have to deliver it chunked-up in an expected way so that the structural application that is receiving the data can further process it into manufacturable panels.
“We call these checking mechanisms validators, where any unexpected data coming through the gate will raise a flag. Another example would be, if I’m working on fabricating curtain wall panels, I wouldn’t expect those panels to be below ground, so maybe there is a rule that checks that those incoming panels don’t have a negative elevation and that they match the floor-to-floor heights of the building.
“We think this is going to be similar to how we build software development pipelines, basically running regression tests and other processes when design changes are submitted, and eliminating human error that is introduced when people have to manually process or re-model that data. Once it is initially set up, it would all be automated and the validators and other processes kick in each time updated design data passes through a gate.
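Awe’s validator examples – the seven-storey wall and the below-ground curtain wall panel – can be imagined as simple rule functions run each time data passes through a gate. The sketch below is an illustrative mock-up, not Plasma code; the function names and the 4 m per-storey threshold are assumptions.

```python
# Hypothetical validator sketch for the checks Awe describes.
# Names and thresholds are illustrative assumptions, not Autodesk's API.

MAX_WALL_HEIGHT_M = 4.0   # assumed rule: walls must be chunked per storey

def validate_wall(wall: dict) -> list:
    """Return flags raised by an interior wall; empty list if it passes."""
    flags = []
    if wall["height_m"] > MAX_WALL_HEIGHT_M:
        flags.append("wall spans multiple storeys - deliver chunked per floor")
    return flags

def validate_panel(panel: dict) -> list:
    """Return flags raised by a curtain-wall panel; empty list if it passes."""
    flags = []
    if panel["elevation_m"] < 0:
        flags.append("curtain-wall panel below ground level")
    return flags

def run_gate(items: list, validator) -> dict:
    # Any unexpected data coming through the gate raises a flag,
    # keyed by the index of the offending item
    return {i: validator(item) for i, item in enumerate(items) if validator(item)}

walls = [{"height_m": 3.0}, {"height_m": 21.0}]   # the 'seven-storey wall'
wall_flags = run_gate(walls, validate_wall)        # flags index 1 only

panels = [{"elevation_m": 6.0}, {"elevation_m": -1.5}]
panel_flags = run_gate(panels, validate_panel)     # flags index 1 only
```

As the article notes, this resembles regression testing in a software pipeline: the rules run automatically on every submission rather than relying on a human spotting the seven-storey wall.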
“Initially we will seed template libraries with contracts for our own workflows, such as between Revit and Civil 3D. But, there will always be unique workflows where the customer has to do it. Initially we expect there will be enterprising, tech savvy customers who are doing this and we hope that they will contribute some of their contracts to the community, and over time hopefully the community will promote them to be de facto standards. Also, third-party developers are likely to come up with workflows of their own.”
Asynchronous vs Synchronous
Plasma can work in two ways, asynchronous workflows and synchronous workflows. One is on user demand, the other is truly dynamic.
The asynchronous methodology is more akin to current workflows, just without all the horrendous data wrangling. With the Plasma architecture connecting applications in workflows via plug-ins, the apps know absolutely nothing about each other; they only know how to read the Data Contracts through the Escrow system. Designers edit their models independently of each other until they decide to push changes through the Escrow service. Designers get notified when there is a change, and the user simply opens the gate to allow that change into their work environment. Users have complete control over whether to open that gate or not.
In a synchronous workflow, the opposite is true, and the gate is kept open and the designer sees live updates in their workspace from project participants who have contracted to share work parcels. In demos, this seems like an amazing capability, but it could be too dynamic for its own good! Humans will probably need to embrace a different work methodology for real time collaborative workspaces. The good news is that users can flip at will between asynchronous and synchronous work states. Awe agrees, “My guess is that most workflows that cross an app boundary or cross a discipline boundary, will choose an asynchronous workflow. And users will only react when you reach certain milestones.”
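The gate behaviour in the two modes can be sketched as a queue that either applies incoming updates immediately (synchronous) or holds them as notifications until the user opens the gate (asynchronous). This is an illustrative model of the behaviour described, under assumed names; it is not Plasma’s implementation.

```python
from collections import deque

# Illustrative sketch of asynchronous vs synchronous gates.
# Class and method names are assumptions, not Autodesk's API.

class Gate:
    def __init__(self, synchronous: bool = False):
        self.synchronous = synchronous
        self.pending = deque()   # updates held while the gate is closed
        self.workspace = {}      # the receiving designer's view of shared data

    def receive(self, update: dict):
        if self.synchronous:
            self.workspace.update(update)   # live: applied immediately
        else:
            self.pending.append(update)     # async: user is only notified

    def open_gate(self):
        # The user chooses when contracted changes enter their environment
        while self.pending:
            self.workspace.update(self.pending.popleft())

gate = Gate(synchronous=False)      # asynchronous: gate starts closed
gate.receive({"beam_7": "moved"})   # held as a pending notification
gate.open_gate()                    # user lets the change in on demand
```

Flipping between the two work states, as the article describes, would then just be toggling `synchronous` (and flushing the queue).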
While users don’t have to give all of their geometry in the process, it should be noted that once geometry is shared in Escrow, there is a permanent record of it and you can’t take it back.
So, it seems that Plasma is transactional or live, but offers incredible fluidity to the design process, with increased granularity and a control mechanism. In fact, Plasma is a living, breathing view of a model’s development. It’s possible to tag the Data Contract with a status such as ‘work in progress’ or a Milestone, but also to allow designers to exchange data more frequently. If a user hits Undo in Revit, it will only undo the work they have done in their own model, but users can roll back to a previous version of the Data Contract. Users can see what effect received versions of the Data Contract had on their model. It’s possible to play through all of the decisions in the lifecycle of a design, and even go back to a previous version of the Data Contract and start modelling from there, or replay through all the data that came through those gates.
According to Awe, Branching and Merging are built into the underlying database technology, “We do use it in the Escrow system and the contract definition,” he says. “However, if users want to have that same technology in a core application like Revit, then that’s a different story. Revit wasn’t built that way. Revit could start making use of that facility in the database and we’ve done some experiments, but we haven’t decided if we are going to go back and re-engineer those applications.”
Speeding up Revit?
When Quantum was first discussed, one of its aims was to take the load off Revit. As it has become Plasma, a platform technology, there seems to be less emphasis on this. Awe explained, “All things are possible! The way we’ve specifically approached this, is that if we have to stitch together a workflow across the project ecosystem, you have to be able to include all tools that already exist, without major modifications.
“We have enabled AutoCAD with a simple ARX plug-in, so it too can participate, but those other apps which we have built from scratch on this platform have lots of additional new capabilities. Revit too can still participate without any changes. The theory is that every application out there that’s available today has to be able to connect to the workflow. As you adopt more and more of these new data platform features, they get richer and richer, but it’s not required. It’s up to us and up to customer demand as to how much Revit evolves, but it doesn’t have to evolve at all to have just bare minimum participation.
“Right now, it’s a much bigger challenge to decouple workflows from Revit, because Revit is already doing that design coordination between structure and the rest of the building. But if you play the theory out, ten years down the road, ideally Revit would also be able to decouple every system in the building and say I don’t need to model the entire thing myself, I just need to coordinate with someone else who’s modelling that piece.

“You could basically divide Revit up into more specialised modellers for the specific systems but still coordinate between the different disciplines. But right now, Revit is taking on that responsibility and doing all the coordination itself. Plasma enables the single version of the truth. However, it is essentially distributed. Each app, each persona in the ecosystem is able to maintain the model that makes the most sense to them and then communicates the part that needs to feed a collective sense of the overall model for co-ordination.”
We asked Awe how Plasma impacted current Revit development. He responded, “The Revit team are being very aggressive in trying to stay up-to-date with everything that we are doing in the data platform.”
It seems the Revit team is continuing to work on a number of experiments with regard to data granularity within the database, but as it stands there is a lot of business data trapped inside the Revit database, as it was designed to be a multi-discipline repository. The question appears to be: how much data could safely be removed from Revit, and how much must remain to fulfil the business logic?
Awe says Revit is going to continue to evolve and the company is committed to keeping Revit fresh and ‘architecturally sound’ as Plasma moves forward. Awe commented that the Revit team is learning a lot from the AutoCAD team, which is the most aggressively re-architected platform within Autodesk. He said, “They are constantly figuring out what they need to do next and not afraid to update the technology.”
Reading between the lines, Revit’s demise is a long way off, if ever. Stage one will be connecting it to the Plasma workflow. At that point, the need for Revit to be the sole point of coordination lessens, as the data, the version of the truth, becomes distributed and supports varying levels of detail within various applications, even stored in multiple formats.
It’s also clear that having a single product that can edit components supplied by all disciplines, as Revit does now, comes with some risks. Plasma will be all about designers with different roles maintaining their data in whatever system suits them, while submitting controlled work packets to the federated team, probably asynchronously.
While Autodesk has talked mainly about the benefits of plugging in different Autodesk products, like Inventor, and imagines plugging into common, competitive tools like McNeel Rhino, there appears to be no reason why Plasma couldn’t work just as well between different Revit users. And this is exciting, as it potentially unifies an industry that has struggled with data wrangling ever since the dawn of BIM.
We certainly got the impression that, long-term, Revit’s replacement was more likely to be multiple discipline-specific applications, native to the Plasma way of working, which could be cloud, mobile or desktop based. We are so conditioned to defining ourselves or our jobs by the modelling software we use. In a data-centric approach, the authoring tool is no longer the star; it’s the dynamism of the data and the intelligence that’s embedded in the system. Nobody describes themselves by the web browser they currently use, it’s just the Internet.
With this data-centric approach, it will be much quicker and easier for developers to create small applications that perform discrete tasks within the many disciplines; these could be analysis applications, smart sensor data readouts or façade analysis tools. In the past, developers would have had to create tools to plug into applications (like Revit) to access the data that only those applications could load, but Plasma and Autodesk’s development environment Forge cut out the middleman.
Forge developers have been buzzing about using Revit.IO, a new component held in the cloud and available to application developers. It enables Revit functionality to be applied to Revit models on services such as BIM 360. We asked if this was a special cloud version of Revit. Awe explained, “It’s really just a headless, user interface-less Revit that runs on a server. So it’s not a new version of Revit, but it does allow Forge developers to load Revit models and access Revit functions for cloud-based workflows, which is very useful.”
If you are wondering how this collaboration platform sits alongside Autodesk’s BIM 360 platform, Awe explained, “BIM 360 is currently acting as the data platform for a subset of the project lifecycle. But as we get more ambitious about Design-to-Make workflows, they will start to encompass more activities than are appropriate for BIM 360 to surface directly. BIM 360 will likely get ‘jacked up’ a little bit and the platform is going to get richer underneath. This will allow users to bounce data off Plasma, some of which will show up in BIM 360 and interact with BIM 360, but it doesn’t mean you go to BIM 360 to do everything, it just means BIM 360 is a large window into project data and workflow.”
With an advanced collaborative workflow, and perhaps a requirement for broader technical knowledge and programming resources, we wondered who the typical Plasma customer would be. Awe explained, “We don’t think this technology is limited to the high-end architects like Zaha Hadid Architects and Foster + Partners.
“When buildings are difficult to define and construct, sure, this technology will help, but we’ve seen many simple examples of collaboration between designer and fabricator where huge improvements were made in efficiency and quality because they had a feedback loop. That doesn’t usually happen in a typical ‘paper waterfall workflow’. This means you are going to have people in the supply chain who have a much clearer view of the designs early on and can contribute much more within the process. It’s going to be down to the cleverness of the firms to embrace new technology.”
While Quantum was first aired in 2016, in 2019 development seems to be progressing, but there is a lot to do. Awe said that Autodesk is not committing to any delivery dates, and with a new VP of Cloud Platform, Sam Ramji, there appears to have been a re-evaluation of what the company has developed so far, the system architecture, and what will and won’t be delivered.
Autodesk wants to avoid pre-announcing early technology and return to being more conservative. In fact, internally Autodesk has qualms about letting project names slip into the open. Awe said that Autodesk refers to this style of data-centric workflow across the project lifecycle as a ‘Plasma Workflow’, hence the project name, but we have been told it will likely appear as just a capability of the Forge Data Platform in the future. For the time being, the company has done a lot of internal prototypes with Plasma Workflow and it has a number of construction firms who are willing to try it out on simple workflows, like wall framing and layout, experimenting with ways to remove the paper from the process.
In terms of performance optimisation, it’s also very early days. We asked what kind of performance recent tests had given for multiple users. Awe explained, “So far with all our tests, the size of the data is small. Pushing out to individual contracts is not that big a problem, the real test will be when you want to aggregate all the data together, from across the system.
“For example, if you wanted to do something like clash detection, or view the project in its entirety, every participating system would have to provide a display mesh, and some application would need to be optimised to load and work with a much heavier set of data.”
Our initial hopes of Quantum being the next generation of Revit have, in some ways, drowned in the sea of automated collaboration. However, we now realise that the whole concept of applications, especially desktop ones, is somewhat moot in the world of cloud-based workflows. Not unlike the end of The Avengers: Infinity War, we see desktop apps and workflows turning to dust in slow motion.
Project Plasma is a total rethink of data flows in a cloud-enabled world. It’s Autodesk recognising that one application cannot be infinitely expanded to solve all upstream and downstream problems in digital design workflows – a shift from its previous life, when Autodesk tried to drive every nail with an AutoCAD-shaped hammer.
Plasma retains the elegance of both catering to current workflows and toolsets, while providing collaboration through a common data environment that is highly user-controlled and flexible in its usage. Ultimately, monolithic applications that try to be everything to everyone, and attempt to hold the co-ordination in a single database on a desktop, will only continue to silo project data and will never solve the collaboration issue.
We get the feeling that Plasma is still years off from meeting its prime objectives, but some elements of collaboration and exchange will be available earlier than the complete system. Over time, and when it makes sense, new applications will appear for performing discrete and industry-specific functions, removing the need for Revit to handle coordination and freeing it to concentrate on authoring. It’s also great to hear Autodesk is looking to remove the need for drawings where design to digital fabrication makes sense.
In software terms we commonly talk about the next generation; with Quantum / Plasma being developed from the data level upwards, it would make more sense to look at Autodesk developing an environment for a new species.