Last month’s cover story on the trend in EULA metastasis certainly provoked a wide range of responses. It’s quickly becoming a regular boardroom-level topic that extends far beyond our small corner of the commercial software world. Martyn Day provides an update
End-user licence agreements, or EULAs, have long been overlooked or outright ignored. Many software users view them as ‘just’ legal small print, worded to indemnify all parties and conveying basic rules such as, ‘Don’t steal this software’. They tend to be updated every few years, with minor changes made, mainly to reflect newer product offerings or services.
Then along came AI, and some vendors got pushy. EULAs began to contain clauses stating that the vendor might take its pound of flesh in the form of data, because real-world customer data has a much greater value to them than synthesised data. This upset many in the AEC industry and has become an ongoing worry.
This was the main thrust of the ‘Contract killers’ article in the September/October 2025 issue of AEC Magazine and we’ve received a great many responses on the subject since it was published. Importantly, we’re keeping the conversation going.
For example, during the panel session I hosted at SpeckleCon 2025 recently, I asked Vignesh Kaushik, Gensler’s principal and regional design technology director for Asia Pacific, about his current worries and concerns. Without a pause, he responded: “EULAs.”
An organisation the size of Gensler, with so many employees and important projects underway, simply cannot afford to get caught up in these data grabs. Despite having locked down IT infrastructure so that users cannot install maverick applications, some employees have still managed to evade these protections. When a new tool appears on the firm’s technology inventory, the worry is not so much the application itself, but the legal entitlement to data that a customer grants to the software company that built it.
There is clearly a significant disconnect between what legal teams believe they’ve signed and what technology now makes possible. Put simply, the AEC industry is woefully underprepared for negotiating the new terrain of AI and data ownership. Representatives of large firms with hefty legal teams have told me they have pushed back and renegotiated or even edited EULAs, since the alternative was having the software thrown out of their estate entirely.
For Autodesk, restrictions on AI training using such broad terms were indeed added to its Terms of Use in 2018, not 2025, as we originally thought. It’s just that in 2018, nobody was checking software agreements as deeply as they are today.
It wasn’t until the AI revolution really started hitting that the industry woke up to the terms of some of these EULAs, as well as the whole wider problem of data ownership in a world where confidential company data is sent to an external software developer’s servers for AI to do its thing.
Hunting down AI terms in agreements has become an international pastime. While it now seems that Autodesk’s original intention was more modest than its wording suggests, and will hopefully be redefined soon, there is an obvious line where Autodesk (in fact, all software developers) don’t want the business logic of their commercial design tools stolen by AI, especially by third parties.
But it is true that almost every commercial software licence or EULA does contain a clause prohibiting copying, modifying, reverse engineering, decompiling, or disassembling the software.
To some extent, then, AI training on data could fall under reverse engineering. However, reverse engineering isn’t what customers intend when training on their own BIM data.
Bill of rights
Quick off the mark, in response to our ‘Contract killers’ article, Motif CEO Amar Hanspal published the company’s own Bill of Rights, a manifesto arguing that design practitioners deserve more than legacy software vendors have been willing to offer. In its essence, this Bill of Rights states that customers should fully own their data and outputs, not be subject to opaque clauses that quietly hand over training or derivative rights to the platform.
Motif commits to open standards and open APIs as the default, rejecting the notion that proprietary lock-in is the status quo. Pricing must be fair and transparent, with no hidden fees, forced bundles or surprise hikes. Users should pay only for what they use and only when they use it. Privacy and security should be non-negotiable.
What began as an investigation into a single clause has become a broader movement to reassert some balance in the relationship between vendors and the industry that relies on them
Significantly, Motif promises that no customer data will be sold or misused, and the workings of any embedded AI features will remain transparent. Continuity features strongly. Projects and data must remain accessible and usable over time, even as technology evolves. Users must have access to product roadmaps and to company leadership and be able to influence decisions.
Meanwhile, in the UK and Australia, a new group is bringing together architects, digital directors, technologists, legal minds, enterprise customers and this magazine to work out what exactly has just happened and what to do about it.
The reaction to the EULA issue has been discussed in detail, looking at examples of what happens when AI ambitions collide with outdated contractual scaffolding. The idea that all future architectural work will probably start inside BIM systems with layers of AI to define the design seems strategically risky, if that negates firms’ ability to train on their own data. Interoperability would be the first casualty of AI-era business models.
The group’s conversation widened, inevitably, to what rights AEC firms should reasonably expect from the tools that now facilitate almost every act of design. Firms want more clarity on how data is used, the freedom to move information between platforms and transparency around AI training. They also want fair conditions for developers who extend or integrate with these tools, in the form of predictable, versioned contractual terms that don’t shift silently. The idea of a ‘Tech Stack Bill of Rights’ emerged almost naturally from the discussion.
While the meeting revolved around contractual terms, the most revealing conversations were cultural. People are beginning to realise that AI isn’t just another tool. It has the potential to reshape the economics of architectural practice.
If AI can generate, refine and automate significant portions of design, then the value of the data that feeds it – whether this pertains to families, details, project history, even naming conventions – increases dramatically.
Software vendors know this. Lawyers, on the other hand, often do not, which is how we end up with terms that were written half a decade ago dictating the rules of engagement for technologies that barely existed at the time but are of great interest today.
There was further discussion on how there is a lack of transparency around enterprise agreements. Several participants noted that non-disclosure agreements (NDAs) prevent them from sharing pricing structures, token models and contractual differences between standard EULAs and enterprise agreements. This opacity isn’t simply inconvenient; it becomes dangerous when AI is added to the mix, because firms cannot know what rights they’ve relinquished or what limitations they’ve accepted.
Tangible outcomes
By the end of the meeting, there were several tangible outcomes. The decision was made to establish a first draft of a Bill of Rights and then to build a repository of historical industry EULAs so that changes can be tracked.
I will be following up with developers to understand what boundaries are reasonable, rather than idealistic. Interest is already spreading beyond the initial group, with firms such as Bentley Systems and Graphisoft picking up on the reaction to our article.
What began as an investigation into a single clause has become a broader movement to reassert some balance in the relationship between vendors and the industry that relies on them. AI has made design data far more valuable than the contracts around it ever anticipated.
The question now is whether the sector can establish clear principles before the next generation of tools, and their accompanying EULAs, decide those principles for us.
In our original ‘Contract killers’ article we incorrectly stated that Autodesk added restrictions on AI training to its Terms of Use in 2025, when in fact it was 2018. A corrected version of the article can be found here, along with responses from Autodesk and D5 Render