Japan’s Act on the Protection of Personal Information (APPI) entered a new phase on 7 April 2026, when the Cabinet approved a comprehensive amendment package that rewrites the rules for organisations using personal data to train artificial-intelligence models. For General Counsel, Data Protection Officers, CTOs and AI product leads operating in or transferring data through Japan, the changes impose immediate obligations around consent, pseudonymisation, cross-border transfers and vendor contracts. This guide, written for information technology lawyers practising in Japan and the compliance teams that work alongside them, sets out exactly what changed, who must act, and how to reach compliance within weeks rather than months.
It provides the practical checklists, contract clauses and decision frameworks that the short firm alerts dominating the current landscape do not offer.
Key takeaways at a glance:
- The Cabinet approved the APPI amendment package on 7 April 2026; the new provisions enter into force on promulgation, expected in mid-2026.
- “AI training use” becomes a named processing purpose, with a conditional consent exemption available where data is pseudonymised to the amended Article 41 standard and a documented balancing test is satisfied.
- Cross-border transfers gain a fourth mechanism (PIPC-approved SCCs/BCRs), and every category of transfer now requires a documented “transfer impact record”.
- Maximum administrative fines double to ¥100 million for serious breaches of the consent and transfer provisions.
This article covers the Information Technology practice area as it applies to Japan, and is designed as a single reference point for enterprise compliance programmes navigating the 2026 reforms.
The APPI amendment package approved by the Cabinet on 7 April 2026 represents the most significant overhaul of Japan’s data protection framework since the 2020 revision cycle. The amendments respond directly to the explosion of generative AI and large-language-model development, areas where the existing APPI provisions left significant interpretive gaps. The package touches four pillars: definitions, consent and lawful-basis rules, cross-border transfer mechanisms, and enforcement powers.
The amendment broadens the definition of “personal information” to capture certain structured inference data, outputs derived from personal data that, when combined with other datasets, can re-identify an individual. It also formalises the concept of “AI training use” as a distinct processing purpose under the Act, requiring organisations to specify this purpose in their utilisation notices. A new consent exemption has been introduced for AI model development, but it is conditional: the data must be pseudonymised to the standard set out in the amended Article 41 provisions, and the processing must satisfy a legitimate-interest-style balancing test that weighs the benefit of the AI application against the impact on data subjects.
Penalties have been materially increased. The PIPC can now impose administrative monetary penalties of up to ¥100 million on corporations for serious breaches of the consent and transfer provisions, up from the previous ¥50 million threshold. Individual criminal liability remains in place for wilful misuse of personal data.
| Date | Amendment / Guidance | Immediate Action Required |
|---|---|---|
| 7 April 2026 | Cabinet approval of APPI amendment package | Legal teams: review published amendment text and identify all affected processing activities within 7–14 days |
| Mid-2026 (promulgation date per legislative schedule) | New consent exemption, transfer requirements and penalty provisions enter into force | Operational teams: complete data inventory and begin DPIA/AIA updates; procurement: trigger vendor contract reviews |
| Ongoing 2026 | Digital Agency and PIPC supplementary guidance expected on AI-specific compliance | Monitor regulator publications; update internal compliance programme as guidance is issued |
Industry observers expect the PIPC to issue detailed implementation guidance, including worked examples for AI training scenarios, within 60 to 90 days of promulgation. Organisations should treat the promulgation date as a hard compliance deadline and not wait for supplementary guidance before beginning their implementation programmes.
The core question for AI product teams is straightforward: do we still need consent to use personal data for model training? The answer under the 2026 APPI amendment is nuanced: consent remains the default rule, but a new exemption creates a lawful pathway for training use without individual consent, provided specific conditions are met. Understanding those conditions is the difference between a compliant AI programme and an enforcement action.
Under the APPI, the primary lawful basis for handling personal information remains the data subject’s consent (Article 23). The 2026 amendment does not change this baseline. If an organisation collects personal data directly from individuals for the express purpose of training an AI model, prior consent specifying that purpose is still required. The utilisation purpose notice, typically included in a privacy policy or terms of service, must now explicitly reference “AI model development and training” as a named purpose if the organisation intends to use the data in this way.
For data already collected under existing privacy notices that do not mention AI training, organisations face a consent gap. The amendment does not grandfather pre-existing collections. This means enterprises must either obtain fresh consent, anonymise the data to the point where the APPI no longer applies, or qualify for the new consent exemption discussed below. Early indications suggest that the PIPC will scrutinise retroactive reliance on the exemption more closely than prospective reliance.
The new consent exemption for AI training data is structured as a two-stage test:
1. Pseudonymisation. The data must be pseudonymised to the standard set out in the amended Article 41 provisions before it enters the training environment.
2. Balancing test. The organisation must conduct and document a legitimate-interest-style assessment weighing the benefit of the AI application against the impact on data subjects.

Decision framework (text flowchart):
Does the existing consent name AI training as a purpose? → Yes: proceed under consent. → No: Can the data be pseudonymised to the Article 41 standard? → No: obtain fresh consent or anonymise. → Yes: Does the documented balancing test favour the processing? → Yes: rely on the exemption and retain the documentation. → No: obtain fresh consent.
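Teams building internal compliance tooling sometimes encode this kind of gating logic so that every training run is checked the same way. The sketch below is illustrative only: the field names, the 1–5 scoring, and the simple greater-than comparison are assumptions for demonstration, not statutory terms, and a real balancing test requires documented qualitative analysis.

```python
from dataclasses import dataclass

@dataclass
class TrainingUseAssessment:
    """Inputs to the two-stage consent-exemption test (illustrative field names)."""
    has_explicit_consent: bool         # existing consent names AI training as a purpose
    pseudonymised_to_article41: bool   # Stage 1: Article 41-standard pseudonymisation
    benefit_score: int                 # Stage 2: assessed benefit of the AI use (1-5)
    subject_impact_score: int          # Stage 2: assessed impact on data subjects (1-5)

def lawful_basis_for_training(a: TrainingUseAssessment) -> str:
    """Return a lawful-basis label, mirroring the decision flow in the text."""
    if a.has_explicit_consent:
        return "consent"  # default rule: consent covering the purpose
    if not a.pseudonymised_to_article41:
        return "no basis: obtain consent, pseudonymise, or anonymise"
    # Crude numeric stand-in for the documented balancing test
    if a.benefit_score > a.subject_impact_score:
        return "consent exemption (retain balancing-test documentation)"
    return "no basis: balancing test does not favour the processing"
```

In practice the output of such a check would feed a review queue rather than authorise processing automatically; the exemption turns on documented human judgment, not a score.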
Even where the consent exemption applies, the APPI’s general data minimisation and purpose limitation principles remain in force. Organisations should adopt the following strategies to de-risk their AI training data pipelines:
- Pseudonymise at ingestion, before data reaches the training environment, and hold re-identification keys separately.
- Collect and retain only the fields genuinely needed for the training objective.
- Document the balancing test for each training use and keep it with the DPIA/AIA record.
- Test trained models for memorisation of personal data before release, and maintain audit logs of access to training datasets.
Sample notice wording for updated privacy policies:
“We may use pseudonymised versions of your information for the purpose of developing and improving AI models and machine-learning systems. Where we do so, we apply technical measures to prevent your identification and conduct an impact assessment to ensure that this use does not disproportionately affect your rights. You may contact [DPO contact] for further information or to exercise your rights under the Act on the Protection of Personal Information.”
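To make the “technical measures to prevent your identification” in the notice above concrete, here is a minimal sketch of one pseudonymisation step: replacing direct identifiers with keyed hashes, with the key held outside the training environment. This is an illustrative technique, not a statement of what the Article 41 standard requires; whether keyed tokenisation alone suffices depends on the full technical and organisational context and on forthcoming PIPC guidance.

```python
import hmac
import hashlib

# Assumption: this key is stored and rotated outside the training environment,
# so the training dataset alone cannot be used to re-identify individuals.
SECRET_KEY = b"rotate-me-and-store-separately"

def pseudonymise(value: str) -> str:
    """Replace a direct identifier with a keyed HMAC-SHA256 token."""
    return hmac.new(SECRET_KEY, value.encode("utf-8"), hashlib.sha256).hexdigest()

def pseudonymise_record(record: dict, identifier_fields: set) -> dict:
    """Tokenise identifier fields; pass other fields through unchanged."""
    return {
        key: pseudonymise(val) if key in identifier_fields else val
        for key, val in record.items()
    }

# Hypothetical record: only "email" is treated as a direct identifier here
record = {"email": "taro@example.com", "prompt_text": "..."}
clean = pseudonymise_record(record, {"email"})
```

Keyed hashing keeps tokens stable across runs (the same email maps to the same token), which preserves dataset joins while removing the identifier itself from the training corpus.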
AI model training rarely stays within national borders. Training runs on cloud infrastructure hosted overseas, datasets are shared with multinational research teams, and pre-trained models are fine-tuned on servers in multiple jurisdictions. The 2026 APPI amendment significantly tightens cross-border data transfer rules, making this one of the highest-priority areas for information technology legal teams in Japan to address.
Under the existing APPI, cross-border transfers of personal data required one of three conditions: (a) transfer to a country recognised by the PIPC as having an equivalent level of data protection; (b) the recipient having established a system conforming to PIPC standards; or (c) the data subject’s prior consent. The 2026 amendment adds a fourth mechanism, transfer under approved standard contractual clauses (SCCs) or binding corporate rules (BCRs) endorsed by the PIPC, and simultaneously imposes new due-diligence documentation requirements on all mechanisms.
Specifically, organisations must now prepare and maintain a “transfer impact record” (移転影響記録) for each category of cross-border transfer. This record must document the destination country, the legal basis relied upon, the recipient’s data-protection measures, and any government-access risks in the destination jurisdiction. The practical effect for AI teams is that every cloud-provider arrangement involving personal data (including pseudonymised data that has not been fully anonymised) must be reviewed and documented.
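One way to keep transfer impact records consistent across teams is to capture them as structured data in a central register. The schema below is a hypothetical sketch mapping the four documentation elements named above to fields; it is not an official PIPC template, and the field names are the author's assumptions.

```python
from dataclasses import dataclass, field, asdict
from datetime import date

@dataclass
class TransferImpactRecord:
    """Illustrative schema for a transfer impact record (移転影響記録)."""
    data_category: str                  # category of personal data transferred
    destination_country: str
    legal_basis: str                    # e.g. "equivalent-standard system", "consent"
    recipient_safeguards: list = field(default_factory=list)
    government_access_risks: str = ""   # summary of destination-jurisdiction risks
    reviewed_on: date = field(default_factory=date.today)

# Hypothetical entry for a Japan-to-US cloud training flow
record = TransferImpactRecord(
    data_category="pseudonymised user prompts",
    destination_country="United States",
    legal_basis="equivalent-standard system",
    recipient_safeguards=["encryption at rest", "access logging"],
    government_access_risks="assessed; summary held in compliance annex",
)
# asdict(record) yields a plain dict suitable for a register or audit export
```

Structuring the record this way also makes it trivial to query the register when the PIPC requests documentation for a particular destination or legal basis.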
The following comparison table summarises the transfer mechanisms available under the 2026 APPI and their practical implications for AI model training:
| Mechanism | Typical Use Case | Pros / Cons |
|---|---|---|
| Adequacy decision (PIPC-recognised country) | Transfers to EU/EEA, UK, and other recognised jurisdictions for collaborative model training | Pro: No additional contractual layer required. Con: Limited list of recognised countries; does not cover major cloud regions (e.g., certain US data centres). |
| Equivalent-standard system at the recipient | Transfers to a corporate affiliate or processor with certified data-governance controls | Pro: Flexible, can be applied to any jurisdiction. Con: Requires detailed due diligence and documentation of the recipient’s system; new transfer impact record obligation adds burden. |
| PIPC-approved SCCs / BCRs | Multinational enterprises with centralised AI training infrastructure | Pro: Provides legal certainty once approved; familiar to organisations operating under GDPR. Con: Approval process timeline uncertain; the PIPC has not yet published template SCCs as of May 2026. |
| Data subject consent | Direct collection from individuals who are informed of the overseas transfer | Pro: Immediately available. Con: Impractical at scale for training datasets; consent must be specific as to the destination country and purpose. |
For most enterprise AI programmes, the “equivalent-standard system” mechanism will remain the primary route for cloud-based training, supplemented by the new SCCs once the PIPC publishes its templates. Legal teams should begin preparing transfer impact records now, prioritising transfers to jurisdictions without adequacy decisions. The OECD Cross-Border Privacy Rules (CBPR) framework, in which Japan participates, may also provide supporting evidence of a recipient’s data-protection standards, although it does not alone satisfy the APPI’s requirements.
Compliance with the 2026 APPI amendment is not a single event; it is a phased programme. The following roadmap breaks the work into 30-, 90- and 180-day horizons, with clear ownership assignments. This section serves as the AI compliance checklist that information technology legal teams in Japan and their clients can use to track progress.

First 30 days: review the published amendment text, identify all affected processing activities, and complete a training-data inventory (GC, DPO).
Days 31–90: update DPIAs/AIAs and balancing-test documentation, specify the pseudonymisation pipeline, and begin vendor contract redlines (DPO, CTO, procurement).
Days 91–180: complete transfer impact records for all cross-border flows, roll out updated consent flows and privacy notices, and stand up memorisation testing and audit logging (CTO, CPO, DPO).
| Role | Primary Responsibility | Key Deliverable |
|---|---|---|
| General Counsel (GC) | Legal interpretation of APPI amendment; board reporting on compliance risk | Legal opinion on consent exemption applicability; updated privacy notices |
| Data Protection Officer (DPO) | Programme coordination; DPIA/AIA oversight; regulator liaison | Balancing-test documentation; transfer impact records; training register |
| Chief Technology Officer (CTO) | Pseudonymisation architecture; access controls; memorisation testing | Technical specification for pseudonymisation pipeline; audit log system |
| Chief Product Officer (CPO) | Ensuring product-level consent flows and user notices reflect new requirements | Updated consent UX; product-level privacy impact assessment inputs |
Data protection in Japan now demands that contracts governing AI model training contain precise, APPI-aligned provisions. The following sample clauses are designed for inclusion in data processing agreements (DPAs), master services agreements and cloud infrastructure contracts. Each clause should be adapted to the specific transaction and reviewed by qualified IT lawyers in Japan before execution.

Sample clause 1 (AI training purpose limitation): “The Processor shall process Personal Data solely for the purposes specified in Schedule [X], and shall not use Personal Data, in original or pseudonymised form, to train, fine-tune or evaluate any AI model except as expressly authorised in writing by the Controller.”

Sample clause 2 (pseudonymisation standard): “Prior to any authorised AI training use, the Processor shall pseudonymise Personal Data to the standard required by Article 41 of the APPI and the technical specification in Schedule [Y], and shall hold any re-identification keys separately from the training environment.”

Sample clause 3 (cross-border transfers): “The Processor shall not transfer Personal Data outside Japan without the Controller’s prior written approval, and shall provide the information necessary for the Controller to prepare and maintain a transfer impact record for each approved transfer.”

These clauses reflect the specific requirements introduced by the 2026 APPI amendment. Procurement teams should use them as a starting point for redlining and negotiate them alongside the technical schedules that specify pseudonymisation standards, audit procedures and memorisation testing protocols.
The PIPC is Japan’s primary data protection regulator and the body responsible for interpreting and enforcing the APPI. Following the Cabinet’s approval of the amendment package on 7 April 2026, the PIPC is expected to issue detailed implementation guidance, including worked examples for common AI training scenarios, in the months following promulgation. The Digital Agency, which leads Japan’s digital transformation strategy, has separately published guidance on responsible AI development that supplements the APPI framework with operational standards for data governance, algorithmic transparency and human oversight.
Enforcement risk is real and growing. The doubling of the maximum administrative fine to ¥100 million, combined with the PIPC’s expanded authority to issue corrective orders, signals a regulatory posture that prioritises deterrence. Industry observers expect the PIPC to pursue early enforcement actions in high-profile sectors, particularly fintech, healthcare AI and consumer-facing generative-AI applications, to establish precedent. Organisations should prepare templated responses for PIPC inquiries and designate a regulatory-affairs liaison.
Parallel regulatory frameworks also apply. The Telecommunications Business Act governs data handling by communications service providers, and sector-specific guidelines from the Financial Services Agency (FSA) and the Ministry of Health, Labour and Welfare (MHLW) impose additional requirements on AI deployments in regulated industries. A comprehensive compliance programme must map these overlapping obligations.
Scenario A: SaaS vendor training on user-generated content. A SaaS platform wants to use customer-uploaded documents to fine-tune a proprietary language model. The platform’s existing terms of service do not mention AI training. Action: Update terms to include AI training as a specified purpose. For historical data, assess whether pseudonymisation to the Article 41 standard is feasible and conduct the balancing test. If either condition is not met, obtain fresh, specific consent before using the data.
Scenario B: Multinational model trained on cross-border data. A global enterprise trains a single foundation model using datasets collected across Japan, the EU and the United States, with training infrastructure hosted in the US. Action: Prepare a transfer impact record for the Japan-to-US data flow. Assess the US provider’s data-protection system against PIPC standards. Execute updated DPA clauses (see sample language above). Monitor for PIPC-approved SCCs that could simplify the transfer basis.
Scenario C: M&A migration of user data for model retraining. A company acquires a Japanese startup and plans to migrate the startup’s user database to retrain the acquirer’s AI models. Action: Conduct a consent-basis audit of the target’s dataset as part of due diligence. Assess whether the original collection purposes cover AI training. If not, plan for either consent remediation or pseudonymisation before migration. Include APPI compliance warranties and indemnities in the acquisition agreement.
The 2026 APPI amendment transforms Japan’s data protection landscape for AI development. Organisations that delay compliance risk administrative fines of up to ¥100 million, reputational damage, and the operational disruption of having training pipelines shut down by PIPC corrective orders. The consent exemption, while welcome, demands rigorous pseudonymisation, documented balancing tests and robust cross-border transfer controls; none of these can be implemented overnight.
Engaging qualified information technology lawyers in Japan early, before promulgation rather than after enforcement, is the most effective risk-mitigation step available. Begin with the 30-day checklist above, prioritise your training-data inventory, and ensure that vendor contracts reflect the new requirements. The compliance window is narrow, and the regulatory environment will only intensify as the PIPC and Digital Agency release further guidance throughout 2026.
This article was produced by Global Law Experts. For specialist advice on this topic, contact Noboru Kitayama at Mori Hamada & Matsumoto, a member of the Global Law Experts network.
Global Law Experts is dedicated to providing exceptional legal services to clients around the world. With a vast network of highly skilled and experienced lawyers, we are committed to delivering innovative and tailored solutions to meet the diverse needs of our clients in various jurisdictions.