Why AI vendor due diligence determines compliance or liability

An SME tests a new AI solution for marketing copy generation. The interface is convincing, the price is right — and amid the pilot enthusiasm, the first real job arrives: customer data needs to be processed. What follows surprises many decision-makers: a data protection review, a data processing agreement, proof of data storage location, disclosure of sub-service providers. The vendor responds hesitantly or with generic references to its terms and conditions. The product is live — and so is the compliance gap.

This scenario is not an isolated case. It repeats itself daily across German mid-sized companies — with ChatGPT Enterprise accounts, Microsoft Copilot roll-outs, AI-powered CRM add-ons, or AI interfaces in ERP systems. The problem is not bad intent but absent structure: no standardised checklist, no clear ownership, no defined process before the roll-out. This article closes that gap.

AI vendor due diligence for SMEs — six assessment dimensions at a glance
Fig. 1: The six assessment dimensions in AI vendor due diligence — from legal basis and DPA through certifications to model training on customer data.

AI vendor due diligence: the 6 assessment dimensions

A robust vendor assessment is not a one-off formality but a structured evaluation across six concrete dimensions. Each addresses a different risk type — legal, technical, or operational.

1. Legal basis and Data Processing Agreement (DPA)

As soon as an AI vendor processes personal data on your company's behalf, it is a data processor under GDPR Article 28. A DPA is then not optional but a legal requirement. Check: does the vendor provide a DPA — or does it attempt to classify the processing as its own business activity, positioning itself as a controller rather than a processor? The latter is common with US-based vendors and shifts GDPR responsibility in an opaque way.

The DPA must meet the minimum content requirements of Article 28(3) GDPR: subject matter, duration, nature and purpose of processing, the type of personal data and categories of data subjects must be explicitly specified. Standard texts in terms-and-conditions annexes are often insufficient. Have the DPA reviewed by your legal department or an external data protection advisor before the roll-out proceeds.
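The Article 28(3) minimum-content check described above can be turned into a simple internal review gate. A minimal sketch — the element labels are our own shorthand for review checklists, not statutory wording:

```python
# Internal DPA completeness gate based on the Art. 28(3) GDPR minimum
# content listed above. Element names are illustrative shorthand.
REQUIRED_DPA_ELEMENTS = {
    "subject_matter",
    "duration",
    "nature_and_purpose",
    "type_of_personal_data",
    "categories_of_data_subjects",
}

def missing_dpa_elements(found: set[str]) -> set[str]:
    """Return the Art. 28(3) elements a reviewed DPA fails to specify."""
    return REQUIRED_DPA_ELEMENTS - found

# Example: a DPA draft that omits the categories of data subjects
draft = {"subject_matter", "duration", "nature_and_purpose", "type_of_personal_data"}
gaps = missing_dpa_elements(draft)  # non-empty set → escalate to legal review
```

Any non-empty result means the DPA draft goes back to the vendor or to your legal counsel before roll-out.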

2. Data storage location and data residency

Where are your data physically stored and processed? For companies in regulated sectors — healthcare, finance, public sector contractors — this question is often binding. GDPR permits third-country transfers only under specific conditions: an adequacy decision (e.g. for the UK or Japan), Standard Contractual Clauses (SCCs), or Binding Corporate Rules.

Many AI vendors process data in US data centres by default. This is not inherently illegal — but it is only GDPR-compliant if the SCCs are correctly implemented and a Transfer Impact Assessment (TIA) has been conducted. Ask specifically: in which regions are your production and backup systems located? Is an EU-only option available? How are potential access requests by US authorities (CLOUD Act) addressed technically and contractually?

3. Sub-processors and supply chain transparency

No AI vendor runs its infrastructure entirely in-house. Language models run on cloud infrastructure (AWS, Azure, GCP), monitoring tools transmit telemetry data, and support systems access ticket contents. Every such service provider is a potential sub-processor — and you as the controller need to be informed about them.

Under GDPR Article 28(2), the data processor needs your approval before engaging sub-processors. This happens either through specific individual authorisation or a general authorisation with a notification obligation upon changes. Check: does the vendor maintain a current, publicly accessible list of its sub-processors? Does it proactively notify of changes? Are the sub-processors themselves contracted in a GDPR-compliant manner?
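The sub-processor monitoring described above is easy to operationalise: keep a snapshot of the vendor's published list from your last review and diff it on a schedule. A minimal sketch — the vendor and sub-processor names are invented for illustration:

```python
# Detect changes in a vendor's published sub-processor list by diffing
# it against a locally stored snapshot from the last review.
def diff_subprocessors(snapshot: set[str], current: set[str]) -> dict[str, set[str]]:
    """Return sub-processors added or removed since the last review."""
    return {
        "added": current - snapshot,
        "removed": snapshot - current,
    }

# Example with invented names
snapshot = {"AWS (eu-central-1)", "Zendesk", "Datadog"}
current = {"AWS (eu-central-1)", "Zendesk", "New Analytics GmbH"}

changes = diff_subprocessors(snapshot, current)
# A non-empty "added" set triggers a re-review under Art. 28(2) —
# and, with a general authorisation, starts your objection window.
```

The same diff also catches silent removals, which can matter if a sub-processor held data that now needs a confirmed deletion.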

4. Security certifications: SOC 2, ISO 27001 and BSI C5

Certifications are not a guarantee of absolute security, but they demonstrate that the vendor manages information security systematically and submits to external audits. Three standards are particularly relevant for German SMEs:

  • ISO 27001: International standard for information security management. Covers organisational and technical measures.
  • SOC 2 Type II: US attestation standard based on the Trust Services Criteria — security, availability, processing integrity, confidentiality, and privacy. Type II confirms that controls were effective over an extended period, not just at a single point in time.
  • BSI C5: Cloud Computing Compliance Criteria Catalogue from the German Federal Office for Information Security — particularly relevant for federal agencies and their contractors.

Ask for current audit reports (no older than 12 months), verify the scope, and check whether the specific service you use is actually covered by the certified scope. A certificate for "Product A" does not automatically extend to "Product B" from the same vendor.
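Both checks from the paragraph above — report freshness and scope coverage — can be encoded in your review tooling. A minimal sketch, with the 12-month threshold taken from the recommendation above and product names invented:

```python
from datetime import date

MAX_REPORT_AGE_DAYS = 365  # "no older than 12 months", per the recommendation above

def report_is_current(issued: date, today: date) -> bool:
    """True if the audit report is at most 12 months old."""
    return (today - issued).days <= MAX_REPORT_AGE_DAYS

def scope_covers(certified_products: set[str], product_in_use: str) -> bool:
    """A certificate for 'Product A' does not extend to 'Product B'."""
    return product_in_use in certified_products

# Example: a SOC 2 report from mid-2024 reviewed at year end
fresh = report_is_current(date(2024, 6, 1), today=date(2024, 12, 1))
in_scope = scope_covers({"Product A"}, "Product B")  # False → not covered
```

Passing both checks is necessary, not sufficient — you should still read the auditor's exceptions section, not just the cover page.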

5. Deletion and access rights

GDPR guarantees data subjects the right to erasure (Article 17) and the right of access (Article 15). As the controller, you are obligated to fulfil these rights towards data subjects — and for that you need the technical support of your vendor.

Clarify contractually and technically: what retention periods apply to processing logs, prompts and outputs? Is there a mechanism to delete individual records or conversations on request? How quickly does the vendor respond to access requests? Are backups included in the deletion process? Vendors that provide no clear answers to these questions significantly increase your own liability exposure.

6. Model training on customer data

Are your inputs, prompts, or generated content used to further train the vendor's AI model? For many consumer products, the answer is yes — for enterprise tiers it is often no, but not always clearly communicated. A vendor that uses customer data for training effectively appropriates that data and can no longer be classified as a pure data processor. Review the terms and DPA for explicit opt-out clauses, or proactively request written confirmation that your data is used solely for contract fulfilment and not for model training.
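Taken together, the six dimensions form a single pass/fail record per vendor. A minimal sketch of such an internal record, with dimension names taken from the headings above and the vendor name invented:

```python
from dataclasses import dataclass, field

# The six assessment dimensions, named after the section headings above.
DIMENSIONS = (
    "Legal basis and DPA",
    "Data storage location and data residency",
    "Sub-processors and supply chain transparency",
    "Security certifications",
    "Deletion and access rights",
    "Model training on customer data",
)

@dataclass
class VendorAssessment:
    vendor: str
    results: dict[str, bool] = field(default_factory=dict)

    def record(self, dimension: str, passed: bool) -> None:
        if dimension not in DIMENSIONS:
            raise ValueError(f"Unknown dimension: {dimension!r}")
        self.results[dimension] = passed

    def approved(self) -> bool:
        """Roll-out approval requires an explicit pass on every dimension."""
        return all(self.results.get(d) is True for d in DIMENSIONS)
```

The deliberate design choice: an unanswered dimension counts as a fail, so a vendor cannot be approved by omission.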

The 4-step due diligence process in practice

An AI vendor review is not a one-off questionnaire but a process with clearly defined phases. We recommend the following sequence, which can be standardised internally as a reusable template:

Phase 1 — Pre-qualification (days 1–3): Based on publicly available information, determine whether the vendor meets basic prerequisites: is a DPA available? Is a data protection officer named? Are certifications listed? Tools such as the vendor website, trust centres, and G2 reviews provide initial indicators.

Phase 2 — Questionnaire & document review (days 4–10): Send a structured questionnaire covering the six assessment dimensions. Request current audit reports, a DPA draft, and the sub-processor list. Not every vendor will provide all documents immediately — that itself is a quality signal.

Phase 3 — Legal review (days 10–20): The DPA and data transfer agreements are reviewed by your legal department or an external data protection consultant. Negotiate clauses that do not meet GDPR requirements. Document the outcome of this review for future audit purposes.

Phase 4 — Approval and ongoing monitoring (from day 21): Only after a positive review does the official roll-out approval take place. Thereafter: annual repetition of the review, proactive monitoring of vendor change notifications, and inclusion of the vendor in your internal Records of Processing Activities (Article 30 GDPR).
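The four-phase timeline above can be mirrored in a simple lookup for project tracking. A minimal sketch — the day windows are taken from the text; since day 10 appears in both phase 2 and phase 3 there, this sketch resolves the hand-over day to the earlier phase:

```python
# The four due diligence phases with their day windows as described above.
PHASES = [
    ("Pre-qualification", 1, 3),
    ("Questionnaire & document review", 4, 10),
    ("Legal review", 10, 20),
    ("Approval & ongoing monitoring", 21, None),  # open-ended
]

def phase_on_day(day: int) -> str:
    """Return which due diligence phase a given project day falls into."""
    for name, start, end in PHASES:
        if day >= start and (end is None or day <= end):
            return name  # first match wins on the day-10 hand-over
    raise ValueError(f"Day {day} is before the process starts")
```

The day counts are a target, not a guarantee — a vendor that takes weeks to produce its DPA draft stretches phase 2 and, as noted above, that delay is itself a quality signal.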

AI vendor due diligence for SMEs — 4-step process: pre-qualification, document review, legal review, approval
Fig. 2: The four-step due diligence process — from pre-qualification and document review through legal review to official roll-out approval.

This process sounds demanding — and it is, the first time. But once established as an internal template, it can be completed significantly faster for each subsequent AI vendor. Organisations that skip this step frequently pay the price at their first audit, first data protection incident, or first enquiry from a supervisory authority — at a point when the room for manoeuvre is considerably smaller.