
Denmark's Deepfake & Likeness Copyright Reforms (2026): What Businesses, Creators and Rights‑holders Must Do Now

By Global Law Experts

Denmark is poised to become one of the first countries in the world to grant individuals a copyright‑like right over their own face and voice, a direct legislative response to the explosion of AI‑generated deepfakes. The Denmark deepfake law 2026 package comprises two distinct but interconnected changes: a proposed amendment to the Danish Copyright Act that would create new civil rights over personal likeness, and a set of fee increases at the Danish Patent and Trademark Office (DKPTO) that took effect on 1 January 2026. Together, these reforms compel every business that creates, distributes or hosts AI‑generated content in Denmark to revisit contracts, content pipelines and intellectual property budgets.

This guide sets out the exact compliance steps that in‑house counsel, platform operators, creators and IP managers need to take, and when to take them.

Executive Summary: What Changed and the Immediate Compliance Decision

Two developments define Denmark’s 2026 IP landscape. First, the Danish government has tabled a draft amendment to the Copyright Act that would grant every natural person an exclusive, transferable right to control AI‑generated reproductions of their face, voice and other identifiable personal features. Second, the DKPTO adjusted fees for patents, trademarks and designs, effective 1 January 2026, increasing the cost of maintaining IP portfolios across the board.

The central compliance question for every affected organisation is: What exact changes must we make to contracts, workflows and IP calendars to comply with Denmark’s 2026 deepfake/likeness reforms and DKPTO fee changes?

Decision‑makers should prioritise four immediate actions:

  • Audit all AI‑generated and synthetic content currently in production or distribution for Danish‑market exposure.
  • Update influencer, talent and licensing contracts to include explicit likeness‑consent clauses, provenance metadata obligations and AI‑labelling requirements.
  • Recalculate IP renewal budgets and update automated reminder systems to reflect the new DKPTO fee schedule.
  • Map obligations under both the Danish draft and EU AI Act Article 50 to ensure labelling, consent and disclosure requirements are met simultaneously.

What the Danish 2026 Reforms Actually Say: Denmark Deepfake Law 2026 Legislative Summary

The draft amendment, which the European Parliament’s think‑tank has characterised as the “Danish approach to copyright and deepfakes,” proposes inserting a new neighbouring right into Denmark’s Copyright Act. This right is civil in nature and is distinct from existing criminal prohibitions on fraud, identity theft or defamation. Its core mechanism gives every identifiable natural person the exclusive right to authorise or prohibit the creation, reproduction and making‑available of AI‑generated depictions of their likeness, including facial features, voice and other biometric identifiers.

Draft Text Highlights: Denmark Copyright Reform 2026

According to analysis published by the European Parliament and commentary from Danish law firm Plesner, the proposed right has several defining features:

  • Scope. The right covers AI‑generated or AI‑manipulated content that depicts a person’s likeness with sufficient fidelity to be identifiable. It is not limited to commercial uses.
  • Right‑holder. The right vests in the depicted individual (or their estate for a defined post‑mortem period). It is transferable and licensable.
  • Exceptions. The draft contemplates exceptions for news reporting, satire, parody and artistic expression, broadly mirroring existing copyright limitation categories, though the precise contours remain subject to parliamentary debate.
  • Consent requirement. Any use falling outside an exception requires the prior, informed consent of the depicted person. Consent must be specific to the nature and context of the AI‑generated content.
  • Remedies. Infringement would give rise to injunctive relief, damages and takedown orders through civil proceedings.

Some academic and practitioner commentators, notably in analysis published by Wolters Kluwer, have questioned whether the copyright framework is the right legal vehicle for these protections, arguing that a standalone personality‑rights statute might better fit the policy objective. Industry observers expect this debate to continue during parliamentary consideration.

Timeline of Legislative Stages

  • 1 January 2026: DKPTO fee changes take effect (patent, trademark and design fees). Implication for businesses: update renewal budgets and calendars; consider prepayments where allowed.
  • Mid‑2026 (expected): parliamentary adoption or finalisation of Denmark’s draft likeness/deepfake amendment. Implication: begin the compliance programme for the new rights: consent capture, content labelling and licence templates.
  • August 2026 onwards: EU AI Act implementation windows; Article 50 transparency obligations roll out in phases alongside codes of practice. Implication: ensure labelling and metadata are compatible with Article 50 requirements.

As of April 2026, the draft has not yet received final parliamentary adoption. Businesses should nonetheless treat compliance preparation as urgent: once the law passes, AI‑generated content already in production or circulation could expose organisations that delayed action.

How Deepfake Legislation Denmark Interacts with the EU AI Act (Article 50)

Denmark’s proposed likeness right does not exist in a vacuum. The EU AI Act, Regulation (EU) 2024/1689, imposes its own transparency obligations on deployers of AI systems. Article 50 specifically requires that persons deploying AI systems which generate synthetic audio, image, video or text content must ensure that outputs are marked in a machine‑readable format and disclosed as artificially generated or manipulated. These obligations apply regardless of national copyright or personality‑rights frameworks.

Labelling vs Consent: Two Separate Obligations

The critical distinction for compliance teams is that the Danish draft and Article 50 impose different, cumulative obligations:

  • Article 50 (EU AI Act): Requires disclosure and labelling of synthetic content, a transparency duty owed to the public.
  • Danish draft amendment: Requires consent from the depicted individual before creating or distributing identifiable AI content, a rights‑based duty owed to the person whose likeness is used.

Obtaining consent from a subject does not discharge the labelling obligation under Article 50, and labelling content as AI‑generated does not satisfy the consent requirement under the Danish proposal. Deployers must satisfy both.

Practical Conflicts and Priority Rules

Where Danish law and EU regulation overlap, EU law takes precedence on matters within its scope (transparency and labelling). The Danish neighbouring right, however, operates in a space the AI Act does not directly regulate: the individual’s right to control their own likeness. Early indications suggest that compliance teams should build parallel but co‑ordinated workflows: one track for Article 50 labelling and metadata embedding, and a separate track for obtaining and recording individual consent under the Danish rules.

Immediate Actions for Businesses and Platforms: Denmark Deepfake Law 2026 Checklist

Waiting for final adoption is not a defensible strategy. The operational changes required are substantial enough that a head start is essential. The following checklist is prioritised by risk.

Content Moderation and Labelling SOP

  1. Inventory all AI‑generated content. Catalogue every piece of synthetic media (images, video, audio, text) that depicts or could depict an identifiable person and is distributed to or accessible from Denmark.
  2. Implement machine‑readable labelling. Embed C2PA (Coalition for Content Provenance and Authenticity) or equivalent metadata in all AI‑generated outputs at the point of creation. This addresses Article 50 requirements and creates an auditable trail.
  3. Add visible disclosure. Where content is displayed to end‑users, include a clear on‑screen label (e.g., “This content was generated using AI”), standard practice under emerging codes of practice.
  4. Build a consent database. Create a centralised register recording who consented, to what specific use, on what date, and with what scope limitations. Link each consent record to the relevant content asset.
  5. Train content‑review teams. Ensure moderators can identify AI‑generated likenesses and know the escalation path for content that lacks valid consent.
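The Consent Register described in steps 4 and 5 can be sketched as a small data structure. The following Python sketch is illustrative only; the field names (`subject_name`, `permitted_use`, and so on) are assumptions and should be adapted to each organisation's records system, not treated as a prescribed schema.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional


@dataclass
class ConsentRecord:
    """One entry in the centralised Consent Register (step 4 above)."""
    subject_name: str                    # depicted individual
    content_asset_id: str                # link to the relevant content asset
    permitted_use: str                   # specific use consented to
    consent_date: date
    expiry_date: Optional[date] = None   # None = open-ended until withdrawn
    withdrawn_on: Optional[date] = None  # set when consent is revoked

    def is_valid(self, on: date) -> bool:
        """Consent is valid if it has not been withdrawn and has not expired."""
        if self.withdrawn_on is not None and on >= self.withdrawn_on:
            return False
        if self.expiry_date is not None and on > self.expiry_date:
            return False
        return True


class ConsentRegister:
    """Maps each content asset to the consent records that cover it."""

    def __init__(self) -> None:
        self._records: dict[str, list[ConsentRecord]] = {}

    def add(self, record: ConsentRecord) -> None:
        self._records.setdefault(record.content_asset_id, []).append(record)

    def has_valid_consent(self, content_asset_id: str, on: date) -> bool:
        return any(r.is_valid(on) for r in self._records.get(content_asset_id, []))
```

Linking every consent record to a content asset identifier is what makes the register auditable: moderators (step 5) can resolve "does valid consent exist for this asset today?" with a single lookup.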

Incident Response and Notice Procedures

  1. Establish a takedown SOP. Define clear timelines for receiving, assessing and acting on takedown requests from individuals whose likeness has been used without consent; industry observers expect 48‑to‑72‑hour response windows to become the practical benchmark.
  2. Designate a responsible officer. Appoint a named individual (or team) to handle deepfake‑related complaints, with authority to order immediate content removal pending investigation.
  3. Preserve evidence. When removing content, preserve a forensic copy and all associated metadata. This protects the platform in the event of subsequent litigation.
  4. Notify upstream providers. If the infringing content originated from a third‑party AI tool or vendor, notify that vendor promptly and review the contractual indemnity position.
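The SOP window in step 1 is easy to enforce in incident‑tracking tooling with a simple deadline check. A minimal Python sketch follows, assuming a 72‑hour window (the upper end of the benchmark noted above; this is an operational parameter, not a statutory deadline):

```python
from datetime import datetime, timedelta

# Assumed SOP parameter: 72 hours, per the 48-to-72-hour benchmark
# discussed above. Not a statutory deadline.
SOP_WINDOW = timedelta(hours=72)


def takedown_deadline(received_at: datetime) -> datetime:
    """Deadline by which a logged takedown request must be actioned."""
    return received_at + SOP_WINDOW


def is_overdue(received_at: datetime, now: datetime) -> bool:
    """True once the SOP window for a request has elapsed."""
    return now > takedown_deadline(received_at)
```

Wiring `is_overdue` into a daily report gives the responsible officer (step 2) an automatic escalation trigger for requests approaching or past the window.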

Contracts, Licences and Influencer/Creator Clauses: Updating IP Policy in Denmark

The Danish reform will reshape standard contract language across the creative and technology sectors. Every agreement involving the creation, licensing or distribution of AI‑generated content depicting identifiable persons should be reviewed and, in most cases, amended. The following sample clauses are provided for discussion purposes only; they are non‑binding examples and should be adapted with qualified legal counsel.

Sample Clause 1: Likeness Consent

“The Talent hereby grants to [Company] a non‑exclusive, revocable licence to use the Talent’s likeness (including facial features, voice and other biometric identifiers) in AI‑generated or AI‑manipulated content, solely for the purposes described in Schedule [X]. This licence is limited to [specified media/channels/territories] and shall expire on [date] unless renewed in writing. The Talent retains the right to withdraw consent upon [notice period] written notice, whereupon the Company shall remove all affected content within [timeframe].”

Sample Clause 2: AI‑Output Ownership and Labelling

“All AI‑generated content produced under this Agreement shall be clearly labelled as artificially generated in accordance with Article 50 of Regulation (EU) 2024/1689 and any applicable Danish legislation. The Producer shall embed machine‑readable provenance metadata at the point of creation and shall not distribute AI‑generated content depicting any identifiable person without documented evidence of valid consent held in the Consent Register.”

Licence Checklist

When negotiating or reviewing any licence involving AI‑generated content and personal likeness, verify that the following elements are addressed:

  • Specificity of consent. Does the agreement describe exactly how the likeness will be used, in what media, and for how long?
  • Revocability. Can the individual withdraw consent, and what is the contractual mechanism for removal?
  • Indemnity. Does the content producer indemnify the platform/distributor against claims arising from use without valid consent?
  • Labelling obligations. Are both parties’ obligations under Article 50 and Danish law expressly allocated?
  • Provenance metadata. Is there a contractual requirement to embed and preserve metadata throughout the content lifecycle?
  • Post‑mortem rights. If the draft provides for post‑mortem protection, does the agreement address estate or successor rights?

DKPTO Fee Increase 2026: Money, Scheduling and Patent/Trademark Operational Changes

Alongside the deepfake reforms, the DKPTO adjusted its fee schedule effective 1 January 2026. The fee changes affect patents, trademarks and designs and represent the second consecutive annual adjustment following the 2025 patent fee revision. IP portfolio managers should treat this as an operational priority: failing to update renewal calendars and budgets risks missed deadlines or unexpected cost overruns.

Patent Renewal Fees Denmark 2026: Calendar and Cost Impact

The DKPTO published a comprehensive fees schedule covering all patent, trademark and design actions. Key areas of increase include patent annual renewal fees (which escalate by year of patent life), trademark application and renewal fees, and design registration fees. The full schedule is available in the DKPTO’s official fees PDF.

Operational steps for patent renewal fees Denmark 2026:

  1. Download the current fee schedule from the DKPTO and compare line‑by‑line against your existing renewal calendar.
  2. Update automated docketing systems with the new fee amounts to prevent underpayment.
  3. Recalculate portfolio maintenance budgets; for large portfolios, even modest per‑patent increases compound significantly.
  4. Assess prepayment opportunities. Where the DKPTO permits advance payment, evaluate whether paying ahead offers savings relative to anticipated future adjustments.
  5. Notify renewal agents and outside counsel of the new schedule and confirm they have updated their own systems.
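The budget recalculation in step 3 is simple arithmetic once old and new fees are tabulated by renewal band. The sketch below is hypothetical: the band names and DKK amounts are invented for illustration and must be replaced with the figures in the official DKPTO fee schedule.

```python
# Hypothetical figures for illustration only -- substitute the amounts
# published in the official DKPTO fee schedule.
OLD_RENEWAL_FEES_DKK = {"year_5": 1420, "year_10": 2950, "year_15": 4600}
NEW_RENEWAL_FEES_DKK = {"year_5": 1500, "year_10": 3100, "year_15": 4850}


def budget_delta(portfolio: dict[str, int]) -> int:
    """Extra DKK per year for a portfolio, given counts of patents per fee band.

    portfolio maps a fee band (e.g. "year_10") to the number of patents
    falling in that band this renewal cycle.
    """
    return sum(
        (NEW_RENEWAL_FEES_DKK[band] - OLD_RENEWAL_FEES_DKK[band]) * count
        for band, count in portfolio.items()
    )
```

Running the comparison band by band, rather than as a flat percentage, mirrors the line‑by‑line review recommended in step 1, since patent renewal fees escalate by year of patent life.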

Cost Impact Modelling

For organisations holding more than 20 Danish patents, the cumulative effect of escalating annual fees warrants a formal cost‑benefit review. Early indications suggest that some rights‑holders are using the fee increase as a trigger to prune low‑value patents from their portfolios, reducing maintenance costs while concentrating resources on strategically important filings.

Enforcement, Remedies and Liability Map

Understanding who can bring a claim, against whom and for what remedies is essential for risk assessment. The Denmark deepfake law 2026 reforms create a layered enforcement landscape.

Remedies and Timelines

  • Civil injunctions. The depicted individual (or their licensee/estate) may seek an interim or permanent injunction ordering removal of infringing content.
  • Damages. Compensation for economic loss and, depending on final legislative text, potentially for non‑economic harm (reputational damage, emotional distress).
  • Takedown orders. Courts or designated administrative bodies may order platforms to remove or disable access to specific content.
  • Criminal exposure. The draft reform itself is civil, but existing Danish criminal statutes on fraud, identity theft and certain forms of image abuse may apply concurrently where deepfakes are used for criminal purposes.

Cross‑Border Enforcement Considerations

Content hosted outside Denmark but accessible to Danish users may still trigger liability. Platforms relying on non‑Danish hosting as a shield should note that the Brussels Regulation (for civil claims within the EU) and the EU AI Act’s cross‑border enforcement mechanisms may both apply. Industry observers expect Danish courts to assert jurisdiction where the depicted individual is domiciled in Denmark and the content is targeted at or accessible within the Danish market.

Case Studies: Three Hypothetical Scenarios

The following scenarios illustrate how the reforms could play out in practice.

  • Scenario 1: Influencer deepfake advertisement. A Danish cosmetics brand uses AI to generate a video of a well‑known influencer endorsing its product, without the influencer’s consent. Under the proposed law, the influencer could seek an injunction and damages. The brand must also address its Article 50 labelling failure. Immediate action: remove the content, obtain retrospective consent or replace with consented material, and implement AI labelling on all outputs.
  • Scenario 2: Platform repost of manipulated political video. A social media platform operating in Denmark hosts a user‑uploaded video that uses AI to alter a politician’s speech. The politician files a takedown request. Immediate action: process the request within the platform’s SOP timeline, preserve metadata, remove the content and notify the uploader of the legal basis for removal.
  • Scenario 3: AI tool generating celebrity voice for advertising. A SaaS company offers an AI voice‑cloning tool that clients use to generate advertisements featuring the voice of a Danish celebrity. The tool provider and the advertising client both face potential liability. Immediate action: the tool provider should implement consent‑verification gates; the advertising client must hold documented consent and embed provenance metadata.

Practical Annexes and Templates

The following resources consolidate the guidance above into ready‑to‑use operational tools. These are provided as starting points and should be tailored to each organisation’s specific circumstances with qualified legal advice.

Takedown and Consent Flowchart (summary):

  1. Receive complaint or identify potentially infringing AI‑generated content.
  2. Log the report in the incident register with timestamp and complainant details.
  3. Assess whether the content depicts an identifiable person and whether valid consent exists in the Consent Register.
  4. If no valid consent exists: remove or disable access to the content within the defined SOP timeline.
  5. Preserve a forensic copy and all associated metadata.
  6. Notify the content creator/uploader and any upstream AI tool providers.
  7. If consent is disputed: escalate to legal counsel and document the dispute resolution process.
  8. Log the outcome and update internal policies if a systemic gap is identified.
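Steps 3, 4 and 7 of the flowchart reduce to a small decision function. A minimal Python sketch follows, assuming the surrounding incident‑response tooling handles logging, preservation and notification (steps 2, 5, 6 and 8):

```python
from enum import Enum, auto


class Action(Enum):
    REMOVE_AND_PRESERVE = auto()   # step 4: no valid consent on file
    ESCALATE_TO_COUNSEL = auto()   # step 7: consent is disputed
    NO_ACTION = auto()             # no identifiable person, or valid consent


def assess_report(depicts_identifiable_person: bool,
                  consent_on_file: bool,
                  consent_disputed: bool) -> Action:
    """Steps 3, 4 and 7 of the takedown flowchart as a decision function."""
    if not depicts_identifiable_person:
        return Action.NO_ACTION
    if consent_disputed:
        return Action.ESCALATE_TO_COUNSEL
    if not consent_on_file:
        return Action.REMOVE_AND_PRESERVE
    return Action.NO_ACTION
```

Checking the disputed branch before the consent lookup reflects step 7: a contested consent record should go to counsel rather than be treated as either valid or absent.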

Compliance checklist (condensed):

  • AI content inventory completed and catalogued
  • Machine‑readable labelling (C2PA or equivalent) embedded at creation
  • Visible AI‑generated disclosure added to user‑facing content
  • Centralised Consent Register established and populated
  • Contracts updated with likeness‑consent, labelling and indemnity clauses
  • Takedown SOP drafted, approved and communicated to relevant teams
  • DKPTO fee schedule updated in docketing and renewal systems
  • IP portfolio budget recalculated for 2026 fee levels
  • Renewal agents and outside counsel notified of fee changes
  • Cross‑functional training delivered to content, legal and compliance teams

Need Legal Advice?

This article was produced by Global Law Experts. For specialist advice on this topic, contact Kim Larsen, a member of the Global Law Experts network.

Sources

  1. Danish Patent & Trademark Office, New Fees from 1 January 2026
  2. DKPTO, Fees of the Danish Patent and Trademark Office per 1 January 2026 (PDF)
  3. European Parliament, The Danish Approach to Copyright and Deepfakes
  4. EU Artificial Intelligence Act, Article 50 (Transparency Obligations)
  5. Plesner, Personal Identity Meets Copyright: Denmark Moves to Regulate Deepfakes
  6. Wolters Kluwer, Deepfake Bills in Denmark and the Netherlands: Right Idea, Wrong Legal Framework?
  7. Dennemeyer, A New Sense of Self: Denmark’s Copyright Amendment Against Deepfakes
  8. The Guardian, Deepfakes: Denmark Copyright Law and Artificial Intelligence

FAQs

What is Denmark's proposed law on deepfakes and personal likeness (2026)?
The draft proposes a copyright‑like neighbouring right over a person’s face, voice and other biometric identifiers. It would give individuals the exclusive right to authorise or prohibit the creation and dissemination of AI‑generated depictions of their likeness. The right is civil in nature and includes exceptions for news reporting, satire and artistic expression. Businesses should audit content pipelines and consent practices now, ahead of expected parliamentary adoption in mid‑2026.
What are the DKPTO fee changes from 1 January 2026?
The DKPTO adjusted fees for patents, trademarks and designs from 1 January 2026. Patent annual renewal fees, trademark application and renewal fees, and design registration fees have all increased. IP managers should download the current DKPTO fee schedule, update docketing systems and recalculate portfolio budgets. For large portfolios, the cumulative increase may justify a strategic review of which rights to maintain.
Does the proposed law ban deepfakes outright?
Not categorically. The proposed amendment creates civil rights to control dissemination, not an outright ban on creation. However, existing Danish criminal statutes covering fraud, identity theft and certain image‑abuse offences may apply where deepfakes are used for criminal purposes. Each case must be assessed on its specific facts.
What should contracts include to address AI likeness use?
Contracts should include explicit, written consent for likeness use in AI contexts; specify the scope, duration and revocability of that consent; require provenance metadata and labelling under Article 50 of the EU AI Act; and include indemnities allocating liability for non‑compliant use. Sample clauses are provided above for discussion purposes.
How does the Danish draft interact with EU AI Act Article 50?
The two regimes impose separate, cumulative obligations. Article 50 of the EU AI Act (Regulation (EU) 2024/1689) requires deployers to label synthetic content and embed machine‑readable metadata, a transparency duty. The Danish draft requires individual consent, a rights‑based duty. Obtaining consent does not discharge the labelling obligation, and labelling does not satisfy the consent requirement. Deployers must comply with both.
How will the new likeness right be enforced?
Enforcement is primarily civil. Depicted individuals (or licensees/estates) may seek injunctions, damages and takedown orders through the Danish courts. Administrative enforcement mechanisms may also apply where specified in the final legislation. Criminal sanctions under existing statutes remain available for identity theft, fraud or image‑abuse offences.
Are platforms liable for user‑uploaded deepfakes?
Liability depends on the platform’s knowledge, its notice‑and‑removal processes and whether it took proactive measures. Platforms should update terms of service, implement clear takedown workflows with documented timelines, and preserve provenance metadata. Prompt removal upon notice significantly reduces exposure.
Is labelling required even where the depicted person has consented?
Yes. Labelling obligations under Article 50 of the EU AI Act apply to all AI‑generated synthetic content, regardless of whether the depicted individual consented. Consent and labelling are independent obligations; organisations must co‑ordinate both within their workflows.
