
AI Liability: Court Decisions in Morocco (Explained)

9anon AI Team · 9 min read


In the rapidly evolving digital landscape of 2026, artificial intelligence (AI) has transitioned from a futuristic concept to a foundational element of Moroccan industry, healthcare, and public administration. Imagine a scenario where an autonomous diagnostic tool in a Casablanca clinic misinterprets a radiology report, leading to an incorrect surgical procedure. Or consider a high-frequency trading algorithm used by a financial firm in Casablanca Finance City that executes a series of erroneous trades, causing significant market distortion and private financial loss. Who is held responsible? Is it the developer who wrote the code, the company that deployed the machine learning model, or is the AI itself a legal entity?

As Morocco continues its "Digital Morocco 2030" strategy, these questions are no longer academic. They are pressing legal realities. Navigating the complexities of AI liability requires a deep understanding of how traditional Moroccan civil and administrative laws intersect with emerging digital regulations. This article provides a comprehensive analysis of the legal framework governing AI liability in Morocco, the role of the judiciary, and how current laws are adapted to address the unique challenges of algorithmic harm.

The Legal Framework: Key Pillars of AI Liability in Morocco

Morocco does not yet have a single "AI Code." Instead, liability is determined through a sophisticated interplay of existing statutes, international treaties, and sector-specific decrees. The legal foundation for AI liability in Morocco rests on several key pillars.

1. The Code of Obligations and Contracts (Dahir des Obligations et des Contrats - DOC)

The DOC remains the bedrock of civil liability in Morocco. When an AI system causes harm, legal practitioners look first to Article 77 and Article 78.

  • Article 77 establishes the general principle of delictual liability: any act committed by a person which, without the authority of law, causes harm to another, obliges the perpetrator to compensate for the damage.
  • Article 78 extends this to negligence or imprudence. In the context of AI, this applies to "human-in-the-loop" scenarios where a human operator fails to supervise an AI system correctly.

Furthermore, Article 88 of the DOC is critical for AI liability. It establishes "strict liability" for damages caused by things under one's custody (la garde des choses). As AI is legally classified as a "thing" or an instrument, the owner or user (the custodian) is often held liable for its actions regardless of proven fault, unless they can prove the harm resulted from an external cause, force majeure, or the victim's own fault.

2. Law No. 09-08: Protection of Personal Data

AI systems thrive on data. Law No. 09-08 relating to the protection of individuals with regard to the processing of personal data is central to AI liability. Under this law, the National Commission for the Protection of Personal Data (CNDP) oversees how algorithms process Moroccan citizens' information. If an AI system causes harm through a data breach or discriminatory profiling, the "Data Controller" is held liable under its provisions.

3. Law No. 31-08: Consumer Protection

For AI products sold to the public (such as smart home devices or AI-driven apps), Law No. 31-08 provides the framework for product liability. It ensures that providers are liable for defects in their products that cause safety issues or financial loss to consumers.

4. Institutional Mandates and International Cooperation

Recent legislative updates, such as Law No. 74.19 (as cited in Reference 2), which reorganizes the Academy of the Kingdom of Morocco, emphasize the importance of scientific partnerships and the "exchange of expertise" to achieve national goals. This is vital for AI, as Moroccan courts often look to international standards (like the EU AI Act) when interpreting local liability gaps. Additionally, the Ministry of Foreign Affairs, under Article 1 of the Decree (Reference 4), coordinates international legal cooperation, ensuring Morocco stays aligned with global AI governance norms.

5. Sector-Specific Regulations

In specialized fields, other laws apply. For example, Law No. 12.06 regarding standardization and Law No. 17.97 regarding industrial property (Reference 6) govern the technical benchmarks that AI systems must meet. Failure to adhere to these standards can be used as evidence of "fault" in a liability lawsuit.

Practical Guide: Navigating an AI Liability Claim in 2026

If you or your business has suffered harm due to an AI system in Morocco, the path to redress involves specific procedural steps and documentation.

Step 1: Identifying the Proper Jurisdiction

The nature of the defendant determines the court:

  • Civil/Commercial Courts: If the dispute is between private parties (e.g., a faulty AI software contract), the case is filed in the ordinary civil courts or, where it arises between merchants or from commercial activity, in the Commercial Courts.
  • Administrative Courts: If the harm was caused by an AI system used by a public institution (e.g., an automated tax assessment or a public school placement algorithm), the case falls under the jurisdiction of the Administrative Courts, following the principles of Administrative Law in Morocco.

Step 2: Gathering Evidence (The "Black Box" Challenge)

AI liability cases often fail due to a lack of evidence regarding how the algorithm reached a decision. In 2026, Moroccan courts increasingly allow for:

  • Expert Testimony: Appointment of a court-certified IT expert to conduct an "algorithmic audit."
  • Log Files: Requesting the "Data Controller" to provide system logs and training data parameters under Law No. 09-08.
  • Impact Assessments: Documents showing whether the company performed a risk assessment before deploying the AI.

Step 3: Filing the Lawsuit

Under the modernization of the Moroccan judicial system, many filings can now be done through the "e-Court" platform. For a detailed walkthrough of this process, refer to the E-Filing Guide: Moroccan Courts in 2026.

Required Documents and Timelines

  • Statement of Claim: Detailing the specific harm and the link to the AI's output.
  • Proof of Custody: Evidence that the defendant was the owner or operator of the AI (relevant for Article 88 DOC).
  • Timeline: Under Article 106 of the DOC, delictual liability claims must be brought within 5 years from the moment the victim became aware of the harm and of the party liable for it, and in all cases within 20 years of the event.

Costs

Litigation costs include court fees (usually a percentage of the claim value), expert fees (which can be substantial in AI cases), and attorney fees. However, the Judicial System Modernization in Morocco 2026 has introduced measures to reduce costs through digital procedures.

Key Provisions Explained: How the Law Interprets AI Actions

To understand AI liability, one must look at how Moroccan judges interpret specific articles of the law when applied to non-human actors.

The Concept of "Autonomous Fault"

In traditional law, "fault" requires human intent or negligence. However, AI acts autonomously. Moroccan legal scholarship is moving toward the concept of "Technical Fault." If an AI deviates from its intended purpose or violates safety standards (even without direct human intervention), the "custodian" is liable under Article 88 of the DOC.

Data Protection and Algorithmic Bias

Article 7 of Law No. 09-08 requires that personal data be processed "fairly and lawfully." If an AI system used for hiring or credit scoring in Morocco shows bias against a protected group, this is considered a violation of "fair processing." The liability here is strict; the company cannot simply say "the machine decided." They are responsible for the machine's "logic."

Contractual Liability vs. Tort Liability

If you purchase an AI service, your rights are governed by the contract. However, if that AI causes harm to a third party (someone not involved in the contract), it becomes a "tort" (delict).

  • Contractual: Governed by the terms of service.
  • Tort: Governed by Articles 77-106 of the DOC.

The Role of the Ministry of Foreign Affairs and International Law

As per Article 26 of the Decree (Reference 7), the Directorate of Legal Affairs and Treaties oversees international agreements. This is crucial because many AI providers are foreign entities. If a Moroccan citizen sues a US-based AI company, the court must navigate international private law and treaties that Morocco has ratified. The Ministry plays a pivotal role in "tracking and handling issues related to international law and international justice" (Reference 7).

Common Mistakes & How to Avoid Them

Navigating AI liability is a minefield for the unprepared. Here are the most common pitfalls encountered in Moroccan courts:

1. Suing the AI Itself

A common misconception is that the AI can be sued. In Morocco, AI has no "legal personality." You must identify the Legal Person (the company) or the Natural Person (the owner) behind the machine. Attempting to sue a "bot" will result in an immediate dismissal of the case.

2. Failure to Document the "Chain of Custody"

Under Article 88 of the DOC, you must prove who had "control" over the AI at the time of the harm. If a company outsourced its AI management to a third-party cloud provider, identifying the correct defendant is complex. Always review service level agreements (SLAs) to determine who holds the "custodial" responsibility.

3. Ignoring Data Protection Compliance

Many businesses deploy AI without consulting the CNDP. If an AI-related harm occurs and the business is found to be in violation of Data Protection Law: Personal Data in Morocco, the court is much more likely to award "punitive" damages and the CNDP may impose separate administrative fines.

4. Over-Reliance on "Force Majeure"

Defendants often argue that AI behavior is "unpredictable" and therefore constitutes force majeure. Moroccan courts are increasingly rejecting this. If a risk is inherent to the technology (like "hallucinations" in Large Language Models), it is considered a known risk that the operator should have mitigated, not an "unforeseeable" event.

Conclusion with Key Takeaways

As we progress through 2026, the Moroccan judiciary is becoming increasingly sophisticated in handling AI-related disputes. While specific "AI Laws" are still in the drafting phase, the existing framework of the Code of Obligations and Contracts, combined with Data Protection Law 09-08, provides a robust basis for seeking justice. Whether you are a developer, a business owner, or a consumer, understanding the shift from "fault-based" to "custody-based" liability is essential.

The integration of AI into the Moroccan economy—supported by institutions like the Academy of the Kingdom (Reference 2) and the Investment and Export Development Agency (Reference 6)—must be balanced with legal accountability. As the "Digital Court" becomes the standard, the speed of justice is increasing, but the burden of proof regarding algorithmic transparency remains a challenge for plaintiffs.

Key Summary Points:

  • AI is treated as a "thing" under the custody of a legal person, making the custodian liable for damages under Article 88 of the DOC.
  • Data protection violations involving AI can lead to significant liability under Law 09-08.
  • Expert technical audits are now a standard part of Moroccan AI litigation.
  • International cooperation and treaties are vital for disputes involving foreign AI providers.


Frequently Asked Questions

Can an AI system be sued directly in Morocco?
No, AI does not have legal personality under Moroccan law. You must sue the company or individual who owns, operates, or developed the AI system.

Who is liable when an AI medical device causes harm?
Liability usually falls on the healthcare facility as the 'custodian' of the technology under Article 88 of the DOC, or potentially the manufacturer if a product defect is proven under Law 31-08.

Does Morocco have a dedicated AI law?
As of 2026, Morocco uses a combination of the Code of Obligations and Contracts, Data Protection Law (09-08), and Consumer Protection Law, though specific AI-focused regulations are currently being drafted.

What is the CNDP's role in AI liability?
The CNDP (National Commission for the Protection of Personal Data) ensures AI systems comply with Law 09-08. Violations of privacy by an AI can result in administrative fines and serve as evidence in civil liability lawsuits.

What does strict liability mean for AI owners?
Strict liability (Article 88 DOC) means the owner of an AI system can be held responsible for damages it causes even if the owner didn't intend to cause harm or act negligently.

Can an AI operator invoke force majeure when the system malfunctions?
Generally, no. Moroccan courts tend to view AI risks as inherent to the technology, meaning the operator is responsible for implementing sufficient safeguards and cannot easily claim force majeure.


Have More Legal Questions?

Consult 9anon AI now and get accurate, instant answers about your legal situation in seconds.