EULA

Rewriting the rules


In late 2025, we highlighted a trend for troubling language around AI and customer data ownership contained in end-user licence agreements. Following customer feedback, key AEC software developers have listened and some are taking action, writes Martyn Day


In October 2025, AEC Magazine published ‘Contract Killers’. This article focused on how end-user licence agreements, or EULAs, are shifting power away from users, in favour of shareholders, through a range of troubling terms and conditions. Our conclusion: vendor access rights are expanding and, in the process, starting to look more akin to a form of customer behavioural control.

Using Autodesk’s terms of use and Acceptable Usage Policy (AUP) as an example, we highlighted a broad clause that, if taken literally, would prohibit users from training their own AI models using output from an Autodesk application. The practical implications, at least as they have been interpreted by many readers, were stark: companies experimenting with AI on their own BIM models, drawings or design data could technically be found in breach of contract.

Some two months later, we corrected one key point: the contentious AI-training terms in Autodesk’s terms and conditions weren’t a new 2025 addition but had been present in contracts since May 2018. For that reason, some executives at Autodesk were perplexed by the delayed reaction to this clause among customers. After all, seven years had passed since its first inclusion.



The fact remains that the industry didn’t pay much attention to AI clauses in EULAs until generative AI started to make a significant impact in products coming onto the market. In short, Autodesk had, perhaps inadvertently, written AI-training prohibitions into its standard licence language many years before generative AI became operationally relevant to most customers.

Customers naturally reached out for clarification, because investigations and fines around licence non-compliance are never far from the minds of design IT directors. Either way, the clause appeared woefully misaligned with how technologically advanced customers actually use their own data, not to mention with how the industry expects to innovate with AI in future. We spoke to representatives of numerous firms already working in this way: training classifiers to detect modelling errors, building cost-prediction tools from historical project data, or running machine-learning scripts over Revit exports. Several reported pausing or re-scoping internal AI projects while their legal teams worked to interpret software vendors’ terms and conditions.

The debate widened further when Nathan Miller, principal at digital design agency Proving Ground, posted a detailed breakdown of Autodesk’s AUP on LinkedIn in late 2025.

Supported by
AI content is independently produced by the AEC Magazine editorial team. HP and NVIDIA support the creation of this content, but all opinions and coverage remain editorially independent.

Heated debate

If you take the wording literally, Miller concluded, it could be interpreted as a ban on legitimate AEC AI use cases. His post ricocheted through the online computational design and BIM communities, triggering an unusual degree of open discussion about licence terms, customer rights and vendor power. What had previously been a debate over an obscure legal clause was soon seen as a symptom of a far wider and more serious industry issue.

Autodesk’s position at this point was that the intent of the clause was to prevent its customers from reverse-engineering Autodesk products or building competitors. Intent, however, is different from legal validity. As lawyers have pointed out, it’s the words on the page that determine what is permitted and what is prohibited, not internal intent. The gap between how Autodesk said the clause should be read, and how it could be read, lies at the heart of this conflict.

Either way, Autodesk materially revised its terms of use, AUP and Autodesk Platform Services (APS) terms of service on 8 December 2025, albeit with limited fanfare outside of legal and developer circles.

The company positioned this update as part of a broader modernisation of its legal framework intended to support connected workflows, platform services and what it referred to as “agentic automation”. In practice, it also represented a direct response to the industry backlash over AI training rights.

As it stands, the revised AUP, together with Autodesk’s accompanying terms change guidance, now makes clear that users may train machine learning or AI models on their own data, provided they are not using Autodesk offerings to copy, reverse-engineer or replicate Autodesk’s own product functionality.

It also clarifies that prohibitions on scraping and mining apply to Autodesk content and content licensed by Autodesk, not to customer-generated data. This is a subtle but important shift. For the first time, Autodesk’s public terms draw a clear line between protecting vendor intellectual property and enabling customer-driven AI workflows.

Nathan Miller, who had helped catalyse the debate, acknowledged the change in a December LinkedIn post, describing the new wording as “much needed specificity” compared with the earlier blanket phrasing. While he did not claim victory, he noted that the clarification materially altered how the terms could be read by customers and their lawyers.

For AEC firms experimenting with AI, the practical implications are significant. Internal AI training workflows that rely on a company’s own project data now sit on far firmer contractual ground. Firms can train models to analyse historical BIM data, automate repetitive tasks or develop predictive tools, as long as those models are not being used to recreate Autodesk product features or to compete directly with Autodesk’s software.

The update also reinforces, more clearly than before, that customers own their content and may reuse it for AI training, subject to the reverse-engineering limitation and other standard contractual constraints.

In effect, Autodesk has narrowed the scope of what it considers impermissible. The restriction now targets a specific category of behaviour — using AI on Autodesk tools to rebuild Autodesk product functionality — rather than a broad class of AI activity. For most internal innovation teams and computational design groups, that is the distinction that matters.

Grey areas

The welcome change does not resolve everything, however. Grey areas remain between what the EULA now says and what customers perhaps feel they should be entitled to do.

The most obvious is definitional. What exactly counts as “reverse engineering” or “replicating functionality” in an AI context? If a firm trains a model to automate a modelling task that Revit already performs manually, is that replicating Autodesk functionality or merely automating a workflow?

The answer is legal, not technical, and will likely only be tested if a dispute arises. If an AI model is trained on data from one specific product, such as Revit, then over time it cannot help but learn the logic of the tool that created the data. Researchers we have spoken to who train AI on synthetic BIM models tell us that, eventually, their models learn the logic of the tool used to generate them.

The second grey area concerns commercialisation. While internal AI training is now explicitly tolerated, the boundary between internal use and external product development remains blurred, especially when APIs, cloud services and third-party tools enter the picture. That uncertainty matters to the start-ups, consultants and software vendors that build tools on top of Autodesk’s ecosystem. It also raises the risk that future innovation will be shaped as much by contractual interpretations as by technical possibilities.

Positive signals

It’s encouraging that Autodesk engaged with customers and made necessary changes to its EULA. Other developers, such as D5 Render, have also updated their EULAs following conversations with staff and customers. Both are positive signals.

But it is also a reminder of how much practical power large software vendors now exercise through licence language. These are words that customers rarely read until terms and conditions collide with real-world workflows. With advances in AI and an agentic future, and with customers having access to software on demand, the relationship between vendors and customers is going to change significantly in coming years.

So Autodesk’s December revision to its terms and conditions should not be read as the end of the debate, but more as its formal beginning. It is a meaningful step towards aligning legal language with contemporary AI practice, but one that also exposes how ill-suited traditional EULAs are to a world in which software outputs are no longer just files, but training data for entirely new classes of tools.

Over the next decade, the dance between software vendors, their customers and third-party developers will accelerate as agentic AI reshapes workflows, software development and the economic logic that surrounds what customers are actually paying for.

The unresolved question is not whether EULAs will continue to change, but whose interests those changes ultimately serve: shareholders seeking to defend product moats, or customers seeking to innovate freely on top of their own data.

Useful links

AEC Magazine – Contract Killers

AEC Magazine – 28 days later

Autodesk Terms of Use

Autodesk APS Terms of Service


What about APIs and developer rights?

Alongside the AI-training clause, another sensitive issue raised by Autodesk’s earlier terms concerned APIs and developer rights.

Before the December update, Autodesk Platform Services (APS) and Autodesk Developer Network terms were spread across multiple documents, some of which appeared to tether API usage primarily to internal business purposes. Developers worried this could constrain commercial third-party tools built on Autodesk data and services.

Nathan Miller also highlighted this issue in LinkedIn discussions, arguing that the wording created uncertainty around whether developers could legally build and sell products that relied on Autodesk APIs, even when customers explicitly authorised access to their own data.

The December 2025 update consolidates and refreshes these terms under a unified APS framework. Autodesk now more clearly distinguishes between internal use, commercial applications and prohibited activities such as scraping Autodesk-owned content or reverse-engineering Autodesk services.

The APS terms are also explicitly aligned with the updated Acceptable Use Policy, reinforcing that restrictions apply to Autodesk’s own intellectual property, not to customer-generated data accessed via APIs.

That said, the developer position is still not entirely frictionless. The line between an internal automation tool and a commercial product remains legally delicate, and the practical meaning of “replicating Autodesk functionality” in an API-driven ecosystem is still open to interpretation. For developers building serious businesses on top of Autodesk’s platform, careful legal review of both the APS terms and the AUP remains essential.

