Contract killers: how EULAs are shifting power from users to shareholders


Most architects and engineers never read the fine print of software licences. But today’s End User Licence Agreements (EULAs) and Terms of Use reach far beyond stating your installation rights. Software vendors are using them to claim rights over your designs, control project data, limit AI training, and reshape developer ecosystems — shifting power from customers to shareholders. Martyn Day explores the rapidly changing EULA landscape


The first time I used AutoCAD professionally was about 37 years ago. At the time I knew a licence cost thousands of pounds and was protected by a hardware dongle, which plugged into the back of the PC.

The company I worked for had been made aware by its dealer that the dongle was the proof of purchase and if stolen it would cost the same amount to replace, so we were encouraged to have it insured. This was probably the first time I read a EULA and had that weird feeling of having bought something without actually owning it. Instead, we had just paid for the right to use the software.

Back then, the main concern was piracy. Vendors were less worried about what you did with your drawings and more worried about stopping you from copying the software itself. That’s why early EULAs, and the hardware dongles that enforced them, focused entirely on access.

The contract was clear: you hadn’t bought the software, you had bought permission to use it, and that right could be revoked if you broke the rules.

As computing spread through the 1980s and 1990s, so did the mechanisms of digital rights management (DRM). Dongles gave way to serial numbers, activation codes and eventually online licence checks tied to your machine or network. Each step made it harder to install the software without permission, but the scope was narrow. The EULA told you how many copies of the software you could run, what hardware it could be installed on, and that you could not reverse-engineer it.

What it didn’t do was tell you what you could or could not do with your own work. Your drawings, models and outputs were your business. The protection was wrapped tightly around the software, not around the data created with it. That boundary is what has changed today.


The rising power of the EULA

As software moved from standalone desktop products to subscription and cloud delivery, the scope of EULAs began to widen. No longer tied to a single boxed copy or physical medium, licences became fluid, covering not just installation but how services could be accessed, updated and even terminated.

The legal fine print shifted from simple usage restrictions to broad behavioural rules, often with the caveat that terms could be changed unilaterally by the vendor.

At first the transition was subtle. Subscription agreements introduced automatic renewals, service-level clauses and restrictions on transferring licences. Cloud services added layers of terms around uptime, data storage, and security responsibilities. What once was a static contract at the point of sale evolved into a living document, updated whenever the vendor saw fit. And in the last five to seven years, these updates have become more frequent.

Software firms now have an extraordinary new power: the ability to reshape the customer relationship through the EULA itself. Where early agreements were about protecting intellectual property against piracy, modern ones increasingly function as business strategy tools. They dictate not just who can access the software, but how customers interact with their data, APIs, and even with third-party developers. The fine print is no longer just about access control; it has become a mechanism of control.

Profound changes

The most striking shift in recent years is that EULAs have moved beyond software access and into the realm of customer data. What you produce with the tools (models, drawings, schedules, and outputs) has become strategically valuable to the software developers – as valuable as the software itself. Vendors now see customer data as fuel for analytics, training, and new AI services. The contract language has followed suit, and there are varying degrees of land grab going on.


This year alone we have seen two firms – Midjourney and D5 Render – attempt to change their EULAs to automatically lay claim to perpetual rights to access and use customer-created data (mainly AI renderings), as well as the right to pass on lawsuits to customers if any of those images infringe copyright and are subsequently used by the software vendor to train its AI models. Many of the pure-play AI firms will lay claim to your firstborn given half a chance.

Autodesk

Closer to home, Autodesk provides a recent sharp example. Its current Acceptable Use Policy (we think updated in May 2025) contains a blanket term forbidding customers from using “any Offering or related Output” to train machine learning or AI systems. In practice, this implies that even if you create designs entirely in-house, you may not lawfully use that output to train and develop your own AI models on your own data. Autodesk alone holds the right to decide if, when, or how your data can be used for such purposes.

This is a profound change. Historically, your files were yours: a Revit model or AutoCAD drawing was protected only by your own governance. Now the licence agreement dictates not only how the software runs, but also how you can use the fruits of your own labour.

Autodesk’s licensing language creates a subtle but important tension between ownership and control. In its License and Services Agreement (effectively the EULA), Autodesk reassures customers with familiar phrases such as “You own Your Work” and “Your Content remains yours.” On the surface, this means that the models, drawings, and other outputs you create belong to you, not Autodesk. However, when you move into Autodesk’s General Terms of Use and the Acceptable Use Policy (AUP), the definition of what you can do with that work becomes more constrained.

May Winfield, global director of commercial, legal and digital risks at global engineering consultancy Buro Happold, suggests this goes further: Autodesk’s Acceptable Use Policy’s purported restrictions on customer outputs may even conflict with copyright laws in certain jurisdictions, where authors automatically own their creations unless they expressly transfer or license those rights. The question becomes: if copyright law guarantees authorship, but Autodesk contractually limits permitted uses, which prevails?

In these documents, Autodesk introduces the term “Output,” meaning any file or result generated using its software. The AUP explicitly prohibits customers from using “any Offering or related Output in connection with the training of any machine learning or artificial intelligence algorithm, software, or system.” In practice, this means that even though Autodesk concedes ownership of your designs, it contractually restricts you from applying them in one of the most strategically valuable ways: training your own AI models.

I know many of the more progressive AEC firms that attend our NXT BLD event are training their own in-house AI based on their Revit models, Revit-derived DWGs and PDFs. With no caveats or carve-outs for customers, they now have the Sword of Damocles hanging over their data. As worded, the broad use of the word ‘Output’ could theoretically even apply to an Industry Foundation Classes (IFC) file exported from Revit, as it’s an output from Autodesk’s product stack, meaning you are not even allowed to train AI on an open standard!

Legally, the company has not taken your intellectual property; instead, it has ring-fenced its permitted uses in a very specific way. This creates what I’m calling a “legal DRM moat” around your data. Autodesk positions itself as the arbiter of how your data can be exploited, leaving you in possession of your files but without full freedom to decide their fate. The fine print ensures Autodesk maintains leverage over emerging AI workflows, even while telling customers their data still belongs to them. And the one place where this restriction doesn’t apply is within Autodesk’s cloud ecosystem, now called Autodesk Platform Services (APS). Only last month at Autodesk University, Autodesk was showing AI training on data within its own cloud.





Knock-on risks for consultants

Winfield also points out that Autodesk’s broad claims over “outputs” may have knock-on consequences for customer–client agreements. Most design and consultancy contracts require the consultant to warrant that deliverables are original and fully owned by them. If a vendor asserts ownership rights through its licence terms, that warranty could be undermined. The risk goes further: consultancy agreements often contain indemnities, requiring the designer to protect the client against copyright breaches or claims. If a software vendor were to allege ownership or misuse under its EULA, a client might look to recover damages from the consultant. This creates a potential double exposure — liability to the vendor, and liability to the client.

Possible reasons

I’ve been trying to work out what Autodesk’s rationale for this move might be. I imagine Autodesk would dispute that the intention of this clause is to restrict the innovation of its customers. Autodesk frames these measures as safeguards — designed to protect its intellectual property, shield confidential data, and ensure AI use happens within environments it can govern and support. However, it does this against a backdrop of extensive messaging around openness and interoperability.

It has been suggested that these strong AI rules may have been added following concerns that ChatGPT, Gemini or one of the major AI firms would unlock the design knowledge hidden in Autodesk software’s outputs, challenging Autodesk’s hold on its customers.

The quick solution would be for Autodesk to refine the language in its Terms of Use, removing the implied broad ban on customers creating their own trained AIs on their own design data, irrespective of the software that produced it.

Nathan Miller, a digital design strategist and developer from Proving Ground, has run a series of posts on LinkedIn highlighting these new limitations.

While it was certainly a topic hotly commented on, the only Autodesk-related person to add their thoughts was Aaron Wagner of reseller Arkance, who commented:

“I don’t think the common interpretation is accurate to the spirit of that clause. Your data is your data and the way you use it is under your own discretion. Of course, you should always seek legal counsel to refine any grey areas.

“This statement to me reads that the clause is from a standpoint of Autodesk wanting to protect its products from being reverse engineered and hold themselves free of liability of sharing private information, but model element authors can still freely use AI/ML to study their own data / designs and improve them.”

Here, Buro Happold’s Winfield explained, “Contract interpretation is generally not impacted by spirit of a clause – if the drafting is clear, it is not changed by the assertion of a different intention? Unless there are contradictions in other clauses and copyright law then it all needs to be read together and squared up to be interpreted in a workable way? It may be the “outputs” in the clause needs to qualify / clarify its intentions, if different from the seemingly clear drafting if read alone?”

The apparent sweeping ban on AI training using any output from Autodesk software has not gone unnoticed by major customers. Autodesk already has a reputation for running compliance audits and issuing fines when licence breaches are discovered, so the presence of this clause in an updated, binding contract has raised alarm.

The fear is simple: if the restriction exists, it can be enforced. Several design IT directors have already told their boards that, on a strict reading of Autodesk’s updated terms, their firms are probably now out of compliance – not for piracy, but for training their own AI models on their own project data.

Some of the commenters on Miller’s original LinkedIn post reported that they had raised the issue with Autodesk execs in meetings. By and large, these execs had not heard of the EULA changes and said they would go and find out more.

Other vendors

Looking around at what other firms have done here, their EULAs do include clauses about AI training on data, but these always appear to relate to protecting IP or preventing the reverse engineering of commercial software – not broad prohibitions.

Adobe has explicit rules around its Firefly generative AI features and the company’s Generative AI User Guidelines forbid customers from using any Firefly-generated output to train other AI or machine learning models. However, in product-specific terms, Adobe defines “Input” and “Output” as your content and extends the same protections to both.

Graphisoft has so far left customer data largely unconstrained in terms of AI use. Bentley Systems sits somewhere in between, allowing AI outputs for your use but prohibiting their use in building competing AI systems. The standard Allplan EULA / licence terms do not appear to contain blanket prohibitions on using output for AI training.

Meanwhile, Autodesk’s blanket ban on AI training using outputs from its software, combined with an exception for its own cloud ecosystem, appears to effectively grant the company a monopoly over how design data can fuel AI. Customers are free to create, but if they wish to train internal AI on their own project history, the contract shuts the door — unless that training happens inside Autodesk’s APS environment. The effect is to funnel innovation into Autodesk’s platform, where the company retains commercial leverage.

This mirrors tactics used in other industries. Social media platforms, for example, restrict third-party scraping to ensure AI training occurs only within their walls – although in that instance the third party would be using data it does not own.


In finance, regulators have intervened to stop institutions from controlling both infrastructure and the datasets flowing through them. Europe’s Digital Markets Act directly targets such gatekeeping, while US antitrust agencies are scrutinising restrictive contract terms that entrench platform dominance.

For the AEC sector, the potential impact of the restrictions in Autodesk’s Acceptable Use Policy is clear: it risks concentrating AI innovation inside Autodesk’s ecosystem, raising barriers for independent development and narrowing customer choice.

Proving is difficult

How Autodesk might enforce this AI training ban is an open question. Traditional licence audits can detect unlicensed installs or overuse, but proving that a customer has trained an AI on Autodesk outputs is far more complex. Still, Autodesk file formats (DWG, RVT, etc.) do contain unique structural fingerprints that could, in theory, be detected in a trained model’s weights or outputs – for example, if an AI consistently reproduces proprietary layering systems, metadata tags, or parametric structures unique to Autodesk tools.

Autodesk could also monitor API usage patterns: large-scale systematic exports or conversions may signal that datasets are being harvested for training. Another possible avenue is watermarking — embedding invisible markers in outputs that survive export and could later be detected.
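To make the fingerprinting idea concrete, here is a deliberately toy Python sketch. The token list and scoring method are entirely hypothetical stand-ins (nothing here reflects an actual Autodesk audit tool); the point is simply to show how consistently reproducing vendor-specific structures in a model’s output could, in principle, be measured.

```python
# A toy illustration of the "structural fingerprint" idea discussed above.
# The fingerprint tokens below are hypothetical stand-ins for the proprietary
# layer names, metadata tags or parametric structures a vendor might look for.
from typing import Iterable

HYPOTHETICAL_FINGERPRINTS = {"A-WALL-FULL", "RVT:FamilyInstance", "ACAD_XDICT"}

def fingerprint_score(generated_samples: Iterable[str]) -> float:
    """Return the fraction of generated samples containing a fingerprint token."""
    samples = list(generated_samples)
    if not samples:
        return 0.0
    hits = sum(
        any(token in sample for token in HYPOTHETICAL_FINGERPRINTS)
        for sample in samples
    )
    return hits / len(samples)

# If an in-house model's generations score far above what generic training
# data would explain, a vendor might argue (rightly or wrongly) that the
# model was trained on its "Output".
samples = ["Layer: A-WALL-FULL, h=2700", "Generic wall, h=2700"]
print(fingerprint_score(samples))  # 0.5
```

In practice, any such test would face false positives, since industry-standard naming conventions appear in plenty of non-Autodesk workflows, which is precisely why enforcement remains an open question.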

APIs, APS and developers

Autodesk is also making significant changes to other areas of its business – changes that could have a big impact on those that develop or use complementary software tools. Autodesk’s API and Autodesk Platform Services (APS) ecosystem has long been central to the company’s success, enabling customers and commercial third parties to extend tools like Autodesk Revit and Autodesk Construction Cloud (ACC).

But what was once a relatively open environment is now being reshaped into a monetised, tightly governed platform — with serious implications for customers and developers.

Nathan Miller of Proving Ground points out that virtually every practice he has worked with relies on open-source scripts, third-party add-ins, or in-house extensions. These are the utilities that make Autodesk’s software truly productive. By introducing broad restrictions and fresh monetisation barriers, Autodesk risks eroding the very ecosystem that helped drive its dominance.

The most visible change is the shift of APS into a metered, consumption-based service. Previously bundled into subscriptions, APIs will now incur line-item costs for common tasks such as model translations, batch automations and dashboard integrations.

A capped free tier remains, but high-value services like Model Derivative, Automation and Reality Capture will now be billed per use. For firms, this means operational budgets must now account for API spend, with the risk of projects stalling mid-delivery if quotas are exceeded or unexpected charges are triggered.

Autodesk has also tightened authentication rules. All integrations must be registered with APS and use Autodesk-controlled OAuth scopes. These scopes, which define the exact permissions an app has, can be added, redefined or retired by Autodesk — improving security, but also centralising control over what kinds of applications are permitted.
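For developers wondering what this looks like in code, here is a minimal Python sketch of the two-legged OAuth step that every registered APS integration now passes through. The endpoint and scope strings follow Autodesk’s public APS documentation at the time of writing, but since scopes are defined, and can be retired, by Autodesk, treat the details as illustrative rather than definitive.

```python
# A minimal sketch of two-legged OAuth against Autodesk Platform Services (APS).
# Endpoint and scope names follow Autodesk's public documentation at the time
# of writing; as noted above, Autodesk can add, redefine or retire scopes.
import base64
import requests

APS_TOKEN_URL = "https://developer.api.autodesk.com/authentication/v2/token"

def get_two_legged_token(client_id: str, client_secret: str, scopes: list[str]) -> str:
    """Request a server-to-server access token limited to the given scopes."""
    credentials = base64.b64encode(f"{client_id}:{client_secret}".encode()).decode()
    response = requests.post(
        APS_TOKEN_URL,
        headers={
            "Authorization": f"Basic {credentials}",
            "Content-Type": "application/x-www-form-urlencoded",
        },
        data={
            "grant_type": "client_credentials",
            # Every permission the app holds is an Autodesk-defined scope string
            "scope": " ".join(scopes),
        },
        timeout=30,
    )
    response.raise_for_status()
    return response.json()["access_token"]

# Example: a read-only integration. Each metered call made with this token,
# such as a Model Derivative translation job, is now a billable, auditable event.
# token = get_two_legged_token(CLIENT_ID, CLIENT_SECRET, ["data:read"])
```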

Perhaps the most profound change is not technical, but contractual. Firms can still create internal tools for their own use. But turning those into commercial products — or even sharing them with peers — now requires Autodesk’s explicit approval. The line between “internal tool” and “commercial app” is no longer a matter of technology but of contract law. Innovation, once free to circulate, is now fenced in.

This changing landscape for software development is not unique to Autodesk. Dassault Systèmes (DS), which is strong in product design, manufacturing, automotive, and aerospace, has sparked controversy by revising its agreements with third party developers for its Solidworks MCAD software. DS is demanding they hand over 10% of their turnover along with detailed financial disclosures. Small firms fear such terms could make their businesses unviable.

Across the CAD/BIM sector, ecosystems are being re-engineered into revenue streams. What were once open pipelines of user-driven innovation are narrowing into gated conduits, designed less to empower customers than to deliver shareholder returns.

Why all this matters

The stakes are high for both customers and developers. For customers, the greatest risk is losing meaningful control over their design history. Project files, BIM models and CAD data are no longer just records of completed work; they are the foundation for future AI-driven workflows. If licence agreements prevent firms from using their own outputs to train AI, they forfeit the ability to build unique, in-house intelligence from their past projects. The value of their data, arguably their most strategic asset, is redirected into the vendor’s ecosystem. The result is growing dependence: firms must rely on vendor tools, AI models and pricing, with fewer options to innovate independently or move their data elsewhere.

For software developers, the risks are equally severe. Independent vendors and in-house innovators who once built add-ons or utilities to extend core CAD/BIM platforms now face new costs and restrictions. Revenue-sharing models, such as Dassault Systèmes’ 10% royalty scheme, threaten commercial viability, especially for smaller firms. When API use is metered and distribution fenced in by contract, ecosystems shrink. Innovation slows, customer choice narrows, and vendor lock-in grows.

AI is the existential threat vendors don’t want to admit. Smarter systems could slash the number of licences needed on a project, deliver software on demand, and let firms build private knowledge vaults more valuable than off-the-shelf tools. Vendors see the danger: EULAs are now their defensive moat, crafted to block customers from using their own data to fuel AI. The fine print isn’t just about compliance — it’s about making sure disruption happens on the vendor’s terms, not those of the customer.

This trajectory is not inevitable. Customers and developers can push back. Large firms, government bodies and consortia hold leverage through procurement. They can demand carve-outs that preserve ownership of outputs and guarantee the right to train AI. Developers, too, can resist punitive revenue-sharing schemes and press for fairer terms. Only collective action will ensure innovation remains in the hands of the wider AEC community, not locked in vendor boardrooms.

The tightening of EULAs and developer agreements is not happening in a vacuum. In Europe, new regulations like the Digital Markets Act (DMA) and the Data Act could directly challenge these practices. The DMA targets “gatekeepers” that restrict competition, while the Data Act enshrines customer rights to access and use data they generate, including for AI training. Clauses banning firms from training AI on their own outputs may sit uncomfortably with these principles.

In the US, antitrust law is less settled but moving in the same direction. The FTC has signalled increased scrutiny of contract terms that suppress competition, and restrictions such as Autodesk’s AI-output ban or Solidworks’ 10% developer royalty could draw attention.

For customers and developers, this creates negotiating leverage. Large firms, government clients, and consortia can push for carve-outs citing regulatory rights, while developers may resist punitive revenue-sharing as disproportionate. Yet smaller players face a harder reality: challenging vendors risks losing access to platforms that underpin longstanding businesses.

A Bill of Rights?

With so many software firms busily updating their business models, EULAs and terms, the one group that is standing still and taking the full force of this wave is customers. A constructive way forward could be the creation of a Bill of Rights for AEC software customers — a simple but powerful framework that customers could insist their vendors sign up to and be held accountable against. The goal is not to hobble innovation, but to ensure it happens on a foundation of fairness and trust, so that this month’s ‘we have updated our EULA’ notice cannot transgress a set of core principles.

At its heart we’re suggesting five core principles:

Data Ownership – a statement that customers own what they create; vendors cannot claim control of drawings, models, or project data through the fine print.

AI Freedom – guarantees that firms may use their own outputs to train internal AI systems, preserving the ability to innovate independently rather than relying solely on vendor-driven tools.

Developer Fairness – ensures that APIs remain open, with transparent and non-punitive revenue models that allow third-party ecosystems to thrive.

Transparency – requires vendors to clearly disclose when and how customer data is used in their own AI training or analytics.

Portability – commits vendors to interoperability and open standards, so that customers are never locked into one ecosystem against their will.

Such a Bill of Rights would not prevent Autodesk, Bentley Systems, Nemetschek or Trimble from building profitable AI services or new subscription tiers. But it would establish clear boundaries: vendors innovate and capture value, but not at the expense of customer autonomy. For customers, developers, and ultimately the built environment itself, this would restore balance and accountability in a market where the fine print has become as important as the software itself.

AEC Magazine is now working with a group of customers, developers and software vendors to see how this could be shaped in the coming months.

Conclusion

EULAs are no longer obscure boilerplate legalese, tucked at the end of an installer. They have become the front line in a new battle, not over software piracy, but over who controls the data, workflows, and ecosystems that shape the future of design.

Autodesk’s apparent prohibition on AI training with customer outputs and Dassault Systèmes’ demand for a slice of developer revenues illustrate just how quickly the ground is shifting. Contracts are no longer just protective wrappers around software; they are strategic levers which can be used to lock in customers and monetise ecosystems.

This should concern everyone in AEC. Customers risk losing the ability to use their own project history to innovate, while mature developers face sudden, new revenue-sharing models that could undermine entire businesses. Left unchallenged, the result will be less competition, less innovation, and greater dependency on a handful of large vendors whose first loyalty is to shareholders, not users.

The only path forward I see is collective action. Customers and developers must push back, demand transparency, insist on long-term contractual safeguards, and possibly unite around a shared Bill of Rights for AEC software. The question is no longer academic: in the age of AI, do you own your tools and your data — or does your vendor own you?


Explainer #1 – EULA vs Terms of Use: what’s the difference?

At first glance, a EULA (End User Licence Agreement) and Terms of Use can look like the same thing. In practice, they operate at different levels — and together form the legal framework that governs how customers engage with software and cloud services.

The EULA is the traditional licence agreement tied to desktop software. It explains that you do not own the software itself, only the right to use it under certain conditions. Typical clauses cover installation limits, restrictions on copying or reverse-engineering, and confirmation that the software is licensed, not sold.

The Terms of Use apply more broadly to online services, platforms, APIs and cloud tools. They include acceptable use rules, data storage and sharing conditions, API restrictions, and often a right for the vendor to change the terms unilaterally.

One unresolved issue is how to interpret contradictions. If the EULA states ‘you own your work’ but the Acceptable Use Policy restricts what you can do with that work, and neither agreement specifies which takes precedence, which clause governs? In practice, customers may only discover the answer in the event of a dispute — an unsettling prospect for firms relying on predictable rights.


Explainer #2 – Why is data the new goldmine?

As the industry moves into an era defined by artificial intelligence and machine learning, customer content has become more than just the product of design work: it is now the raw material for training and insight.

BIM and CAD models are no longer viewed solely as deliverables for projects, but as vast datasets that can be mined for patterns, efficiencies, and predictive value. This is why software vendors increasingly frame customer content as “data goods” rather than private work.

With so much of the design process shifting to cloud-based platforms, vendors are in a powerful position to influence, and often restrict, how those datasets can be accessed and reused.

The old mantra that “data is the new oil” captures this shift neatly: just as oil companies controlled not only the drilling but also the refining and distribution, software firms now want to control both the pipelines of design data and the AI refineries that turn it into intelligence.

What used to be customer-owned project history is being reconceptualised as a strategic asset for software vendors themselves, and EULAs and Terms of Use are the contractual tools that allow them to lock down that value.


Explainer #3 – Autodesk’s EULA Shift

What changed?

Autodesk’s Acceptable Use Policy (AUP) bans AI/ML training on any “output” from its software — including models, drawings, exports, even IFCs — unless done within Autodesk’s APS cloud.

Why it matters

Customers risk losing the ability to train internal AI on their own design history. Strict licence audits mean firms could be flagged non-compliant even without intent.

Legal experts warn the AUP’s broad claims over “outputs” may conflict with copyright law, which in many jurisdictions gives authors automatic ownership of their creations.

Consultants could face knock-on risks if client contracts require them to warrant full ownership of deliverables — raising potential indemnity exposure.

Autodesk gains leverage by funnelling AI innovation into its paid ecosystem.

The big picture

This move mirrors gatekeeping strategies in other tech sectors, where platforms wall off data to consolidate control. Regulators in the EU (Digital Markets Act, Data Act) and US antitrust bodies are increasingly scrutinising such practices.


Explainer #4 – Developers at risk

What changed?

Autodesk has overhauled Autodesk Platform Services (APS): APIs are now metered, consumption-based, and gated by stricter terms. While firms can still build internal tools, sharing or commercialising scripts now requires Autodesk’s explicit approval.

Why it matters

Independent developers face new costs and quotas for integrations that were once bundled into subscription fees. In-house teams must now budget for API usage, turning process automation into an ongoing operational cost.

Quota limits mean projects risk disruption if thresholds are unexpectedly exceeded mid-delivery.

The contractual line between “internal tool” and “commercial app” is now defined by Autodesk, not developers.

Innovation that once flowed freely into the wider ecosystem is fenced in, with Autodesk deciding what can be shared.

The big picture

Across the CAD/BIM sector, developer ecosystems are being monetised and restricted to generate shareholder returns. What were once open innovation pipelines are narrowing into vendor-controlled platforms, threatening the independence of smaller developers and reducing customer choice.


Recommended viewing: May Winfield @ NXT DEV


At AEC Magazine’s NXT DEV event this year, May Winfield, global director of commercial, legal and digital risks for Buro Happold, presented “EULA and Other Agreements: You signed up to what?”, where she invited the audience to reconsider the contracts they’ve implicitly accepted.

How many users digest the fine print of EULAs and AI tool terms? Winfield warns that their assumptions often misalign with contractual reality and highlights key clauses that tend to lurk in user agreements: ownership of content, usage rights, and liability limitations.

The presentation serves as a practical reminder: what you think you own or can do might be constrained by what you signed up to — underscoring the urgency for users, developers, and governance bodies to delve into EULAs and demand clarity.

■ Watch @ www.nxtaec.com
