GDPR applies to AI agent memory — most developers ignore this

When developers think about GDPR compliance, they typically focus on their database, their analytics tools, and their marketing emails. They don't think about their AI agent's memory layer — but they should. Every fact an AI agent stores about a user is, by definition, personal data under GDPR if it relates to an identified or identifiable natural person.

The GDPR definition of personal data (Article 4) is intentionally broad: "any information relating to an identified or identifiable natural person." An agent memory that stores "user prefers Python over JavaScript" is personal data if it's associated with an identifiable user ID. An agent that stores "user works at Acme Corp, Paris, team of 10" is storing personal data. Vector embeddings of personal statements are personal data — they are a transformed representation of personal information, not anonymized.

Vector embeddings are personal data. A common misconception is that because embeddings are not human-readable, they are anonymized. The EDPB (European Data Protection Board) has clarified that pseudonymization does not exempt data from GDPR obligations when the original data can be reconstructed or the subject identified through other means. Embeddings derived from personal statements fit that description, so treat them as personal data.

The practical implication: your memory API provider is a data processor under GDPR Article 4(8), because it processes personal data on your behalf. If your users are the data subjects, you are the controller; if you serve business customers whose users are the data subjects, you typically act as a processor and the memory API provider as a sub-processor. Either role triggers specific obligations: documented processing, restricted data flows, support for the right to erasure, and a Data Processing Agreement with the memory API provider.

What counts as personal data in agent memory

In practice, almost everything an AI agent usefully stores about a user will qualify as personal data. Here are the categories with examples:

Direct identifiers

Names, email addresses, phone numbers, and account or user IDs: anything that points directly at a person.

Behavioral and preference data

Facts like "user prefers Python over JavaScript" or "user responds best to concise answers," once they are linked to a user ID.

Business context linked to an individual

Facts like "user works at Acme Corp, Paris, team of 10": the detail describes the company, but it makes the person identifiable.

Special category data (extra protection required)

If your agent stores health-related information, political opinions, religious beliefs, or trade union membership — even incidentally mentioned in conversation — this is special category data under Article 9, requiring explicit consent and stricter processing conditions.
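Special category data is easy to store by accident, because it arrives mid-conversation rather than through a form field. A minimal sketch of a pre-storage screen, with hypothetical keyword lists (real detection needs a proper classifier, not substring matching):

```python
# Illustrative sketch: flag a candidate memory that may touch GDPR Art. 9
# special categories before storing it. The keyword lists are hypothetical
# placeholders, far too crude for production use.
SPECIAL_CATEGORY_HINTS = {
    "health": ["diagnosis", "medication", "disability", "therapy"],
    "political": ["votes for", "party member", "political opinion"],
    "religious": ["religion", "church", "mosque", "synagogue"],
    "trade_union": ["union member", "trade union"],
}

def flag_special_category(fact: str) -> list[str]:
    """Return the Art. 9 categories a fact may touch (empty list if none)."""
    text = fact.lower()
    return [
        category
        for category, hints in SPECIAL_CATEGORY_HINTS.items()
        if any(hint in text for hint in hints)
    ]
```

A flagged fact can then be routed to an explicit-consent flow, or simply not stored at all.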

GDPR stops applying only at true anonymization. If memories are stored without any user identifier and with no possibility of re-identification, they may fall outside GDPR's scope. In practice, this is nearly impossible for an agent memory layer: the memory is valuable precisely because it's associated with a specific user. Do not rely on anonymization as a compliance strategy for memory.

Right to erasure: Article 17 and what it means for memory APIs

GDPR Article 17 grants users the right to request deletion of all personal data you hold about them, subject to limited exceptions. The exceptions that might apply to agent memory are narrow: compliance with a legal obligation, the establishment, exercise, or defence of legal claims, or archiving in the public interest. None of these typically applies to B2B SaaS agent memory.

For practical purposes: when a user requests erasure, you must delete all their memories from your memory layer. This must be complete — not just "soft deleted" with a flag, but genuinely removed from storage and any backups within your retention schedule.
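The backup caveat above has a timing consequence worth making explicit: live data can be deleted immediately, but copies in rolling backups only disappear as the retention window expires. A small illustrative calculation, assuming a hypothetical 30-day backup retention schedule:

```python
from datetime import date, timedelta

def erasure_effective_date(request_date: date, backup_retention_days: int) -> date:
    """
    Live data is deleted immediately, but the last backup taken before the
    request still contains the user's data; erasure is only complete across
    all storage once that backup has aged out of the retention window.
    """
    return request_date + timedelta(days=backup_retention_days)

# Example: with a 30-day backup retention schedule, a request received on
# 2026-01-10 is fully effective on 2026-02-09.
effective = erasure_effective_date(date(2026, 1, 10), 30)
```

Documenting this date in your erasure confirmation to the user avoids over-promising instant deletion from backups.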

The compliance requirements for erasure:

- Respond without undue delay, and at most within one month of the request (Article 12(3)).
- Delete everything derived from the user's data, including vector embeddings, not just the raw memory text.
- Propagate the deletion to any sub-processors that hold copies of the data.
- Keep an audit record that the request was received and fulfilled, without retaining the personal data itself.

The right to erasure is frequently tested. EU data protection authorities regularly audit companies' erasure procedures. A memory layer that doesn't support atomic per-user deletion is a compliance liability. Verify that your memory API supports complete agent deletion before you ship to production.

Data residency: why US-hosted memory APIs are a legal risk for EU companies

GDPR Chapter V restricts transfers of personal data to countries outside the EEA (European Economic Area) unless one of the following conditions is met:

- An adequacy decision by the European Commission covers the destination country (Article 45).
- Appropriate safeguards are in place, such as Standard Contractual Clauses or Binding Corporate Rules (Article 46).
- A narrow derogation applies, such as explicit consent for the specific transfer (Article 49).

For a B2B SaaS company with EU end users, using a US-hosted memory API (Mem0, US-based vector databases, US-region OpenAI) means:

- Every memory write is an international data transfer that requires a Chapter V mechanism.
- You are likely relying on the EU-US Data Privacy Framework, whose predecessors (Safe Harbor, Privacy Shield) were both invalidated by the Court of Justice of the EU.
- Falling back to Standard Contractual Clauses requires a transfer impact assessment for each provider.

EU-hosted memory eliminates this entire category of risk. Data stays within the EEA; no Chapter V transfer mechanism is needed; your legal basis for processing doesn't depend on a fragile diplomatic framework.
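A lightweight way to enforce residency in code is a region guard that refuses to write personal data outside the EEA. A sketch with hypothetical region identifiers (check your provider's actual region codes before adopting anything like this):

```python
# Illustrative region guard. The region strings are hypothetical; real
# providers each have their own region naming scheme.
EEA_REGIONS = {"eu-frankfurt", "eu-paris", "eu-dublin", "eu-stockholm"}

def assert_eea_region(region: str) -> None:
    """Raise before any personal data is written to a non-EEA region."""
    if region not in EEA_REGIONS:
        raise RuntimeError(
            f"Region {region!r} is outside the EEA: a GDPR Chapter V "
            "transfer mechanism would be required before storing personal data"
        )

assert_eea_region("eu-frankfurt")  # passes silently
```

Calling this once at client initialization turns a legal requirement into a deploy-time failure instead of a compliance incident.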

Kronvex's approach: EU-only storage, per-agent deletion, TTL

Kronvex was built with GDPR compliance as a first-class requirement, not an afterthought. The design decisions:

- EU-only storage: all memories are stored and processed within the EEA, so no Chapter V transfer mechanism is needed.
- Per-agent deletion: a single call removes an agent and every memory associated with it, supporting complete Article 17 erasure.
- TTL on memories: each memory can carry a time-to-live, so data is retained only as long as necessary (Article 5(1)(e) data minimization).

Data Processing Agreements (DPA) — who needs one and when

Under GDPR Article 28, if you engage a data processor (any third party that processes personal data on your behalf), you must have a written contract — a Data Processing Agreement — that covers specific mandatory terms: purpose and duration of processing, nature of the processing, type of personal data and categories of data subjects, and the controller's obligations and rights.

You need a DPA with Kronvex (or any memory API) if:

- The memories you store relate to identified or identifiable natural persons (in practice, almost always).
- You act as a controller for your own users, or as a processor for your customers' users.

For B2B SaaS companies serving EU business customers whose employees use your AI agent: you almost certainly need a DPA. Your customers' employees are natural persons; their preferences and work context stored in agent memory are personal data.

DPA vs. standard terms: Many API providers include DPA terms in their standard ToS. Review the actual terms rather than assuming coverage. A valid Article 28 DPA must include specific provisions: sub-processor lists, audit rights, deletion obligations, security measures, and international transfer mechanisms if applicable.
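When reviewing a provider's terms, the mandatory Article 28 provisions can be tracked as a simple completeness check. A sketch with hypothetical term labels (the labels are placeholders, not a legal taxonomy):

```python
# Illustrative checklist of Article 28(3) mandatory DPA terms, expressed
# as hypothetical labels for a review workflow.
REQUIRED_DPA_TERMS = {
    "processing_purpose_and_duration",
    "nature_of_processing",
    "data_types_and_subject_categories",
    "sub_processor_list",
    "audit_rights",
    "deletion_obligations",
    "security_measures",
}

def missing_dpa_terms(covered_terms: set[str]) -> set[str]:
    """Return the mandatory terms the reviewed DPA does not cover."""
    return REQUIRED_DPA_TERMS - covered_terms
```

A non-empty result means the standard ToS alone does not give you a valid Article 28 DPA, and a separate agreement is needed.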

Practical checklist for GDPR-compliant AI agent memory

- Map what personal data your agent stores, including facts inferred from conversations.
- Treat vector embeddings as personal data, not as anonymized derivatives.
- Screen for special category data (Article 9) before storage and require explicit consent for it.
- Verify your memory provider stores data in the EEA, or that a valid Chapter V transfer mechanism is in place.
- Confirm the provider supports complete per-user deletion, including backups within the retention schedule.
- Set TTLs so memories expire when no longer necessary (Article 5(1)(e)).
- Implement Article 15 data export and Article 17 erasure flows end to end.
- Sign an Article 28 DPA with the provider and review its sub-processor list.

Code: implementing erasure via the Kronvex API

Python — Complete GDPR erasure flow
from kronvex import Kronvex
from datetime import datetime
import logging

kv = Kronvex(api_key="kv-your-key")
logger = logging.getLogger(__name__)

def handle_erasure_request(
    user_id: str,
    request_id: str,
    requested_by: str
) -> dict:
    """
    Handle a GDPR Article 17 erasure request.
    Returns a compliance record suitable for audit logging.
    """
    started_at = datetime.utcnow().isoformat()
    agent = kv.agent(user_id)

    # 1. Enumerate memories before deletion (for audit record)
    try:
        memories = agent.list_memories()
        memory_count = len(memories)
    except Exception as e:
        logger.warning(f"Could not enumerate memories for {user_id}: {e}")
        memory_count = -1  # Unknown count

    # 2. Delete agent and all associated memories
    try:
        agent.delete()
        deletion_status = "SUCCESS"
        error_detail = None
    except Exception as e:
        deletion_status = "FAILED"
        error_detail = str(e)
        logger.error(f"Erasure failed for user {user_id}: {e}")
        raise

    completed_at = datetime.utcnow().isoformat()

    # 3. Build audit record (do NOT store this in a system that retains PII)
    audit_record = {
        "request_id": request_id,
        "user_id": user_id,  # Pseudonymize this in your audit log if possible
        "requested_by": requested_by,
        "started_at": started_at,
        "completed_at": completed_at,
        "memories_deleted": memory_count,
        "status": deletion_status,
        "processor": "kronvex",
        "error": error_detail,
        "gdpr_article": "17",
        "compliance_note": "Agent and all associated memories deleted from EU-hosted storage"
    }

    logger.info(f"Erasure request {request_id} completed: {deletion_status}")
    return audit_record


# Example usage
audit = handle_erasure_request(
    user_id="user_42",
    request_id="erasure-req-2026-001",
    requested_by="user_42@email.com"
)
print(audit)
Python — Data export (Article 15 right of access)
import json
from datetime import datetime

def export_user_data(user_id: str) -> dict:
    """
    Export all memories for a user (GDPR Article 15 right of access).
    Returns a structured export suitable for delivery to the data subject.
    """
    agent = kv.agent(user_id)

    try:
        memories = agent.list_memories()
        memory_data = [
            {
                "id": m.id,
                "content": m.content,
                "created_at": m.created_at,
                "expires_at": m.expires_at,
                "access_count": m.access_count
            }
            for m in memories
        ]
    except Exception as e:
        memory_data = []
        # Log the error but don't fail the export request
        logger.error(f"Memory enumeration failed during export for {user_id}: {e}")

    export = {
        "export_generated_at": datetime.utcnow().isoformat(),
        "data_controller": "Your Company Name",
        "data_processor": "Kronvex (kronvex.io)",
        "processing_location": "EU (Frankfurt, Germany)",
        "user_id": user_id,
        "memory_count": len(memory_data),
        "memories": memory_data,
        "rights_notice": (
            "You have the right to request correction (Art. 16) or "
            "deletion (Art. 17) of this data. Contact privacy@yourcompany.com"
        )
    }

    return export


# Deliver as JSON to the user
export = export_user_data("user_42")
print(json.dumps(export, indent=2, default=str))
Python — Automatic TTL for data minimization
def store_memory_with_minimization_policy(
    agent,
    fact: str,
    category: str = "general"
) -> None:
    """
    Store a memory with TTL based on data minimization policy.
    Only retain data for as long as necessary (GDPR Art. 5(1)(e)).
    """
    TTL_POLICY = {
        "permanent": None,       # Core preferences, identity
        "long_term": 365,        # Technical stack, business context
        "medium_term": 90,       # Active projects, goals
        "short_term": 30,        # Tactical context, current evaluations
        "ephemeral": 7,          # Temporary states
        "general": 180,          # Default: 6 months
    }

    ttl_days = TTL_POLICY.get(category, TTL_POLICY["general"])
    agent.remember(fact, ttl_days=ttl_days)


# Usage
agent = kv.agent("user_42")
store_memory_with_minimization_policy(
    agent, "User speaks French", "permanent"
)
store_memory_with_minimization_policy(
    agent, "User evaluating vendor X for Q3 decision", "short_term"
)
store_memory_with_minimization_policy(
    agent, "User currently comparing PostgreSQL vs MySQL", "ephemeral"
)