Compliance risk

The hidden GDPR risk in your photo archive

Your organization probably has thousands of photos containing faces. Under GDPR, every one of them is personal data. Most compliance programs overlook this entirely.

The scale of the problem

The numbers most organizations have never calculated

€30.5M – largest GDPR fine for facial recognition violations

60% – of organizations report increased data subject access requests

~50% – of brand and marketing images contain identifiable faces

0 – organizations (in our experience) that have fully audited this

Think about your own organization. Your website has team pages, event galleries, blog posts with author photos, press releases, customer stories. Your internal systems have even more: SharePoint folders full of event photos, marketing asset libraries, HR profile pictures, training materials.

Every face in every one of those photos is personal data under GDPR. If someone asks you to find all their photos, or to delete them, you need to be able to do that across every system. Can you?

Why compliance programs miss this

Photos are not structured data

GDPR compliance tools focus on databases, CRMs, and email. Photos are binary blobs — they do not show up in data mapping exercises unless someone specifically looks for them. And most people do not.

Photos accumulate silently

Nobody decides to create a massive face data archive. It happens gradually — an event here, a team photo there, a marketing campaign. Over years, you end up with thousands of photos and no inventory.

Photos are not searchable

You can search text. You cannot search inside photos. Without face recognition, there is no practical way to answer "which photos contain this person?" — so the question never gets asked.

Enforcement reality

Regulators are paying attention to biometric data

Biometric data enforcement is accelerating across Europe. Supervisory authorities have issued significant fines for mishandling facial recognition data, failing to conduct impact assessments, and processing biometric data without adequate legal basis.

The Swedish DPA fined a school for using facial recognition for attendance without a proper legal basis. The Italian DPA fined Clearview AI €20 million for processing biometric data of Italian residents. The French CNIL has issued multiple fines related to biometric data processing.

These cases involved organizations that actively deployed face recognition. But the underlying principle applies equally to organizations that passively hold face data without proper governance — the obligations exist regardless of whether you are actively processing the images.

Key risk areas

Incomplete DSAR responses

Missing photos in a response can be treated as non-compliance

No Record of Processing

Article 30 requires documenting photo processing activities

Incomplete erasure

Deleting some photos but missing others is still a violation

Missing legal basis

Photos published without consent or documented legitimate interest

What to do about it

Three steps to get your arms around this

Follow our face data audit guide for a complete walkthrough of each step.

1. Find out what you have

Start with your website — it is the most visible and easiest to scan. Then move to internal systems. Until you inventory what exists, compliance is guesswork.
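A first pass over your website can be as simple as listing every image a page references. The sketch below uses only Python's standard library; the page fragment and the idea of filtering by face detection afterwards (e.g. with a face detection library) are illustrative assumptions, not part of any specific tool.

```python
from html.parser import HTMLParser

class ImageCollector(HTMLParser):
    """Collects the src and alt of every <img> tag as a first-pass inventory."""
    def __init__(self):
        super().__init__()
        self.images = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            a = dict(attrs)
            self.images.append({"src": a.get("src", ""), "alt": a.get("alt", "")})

def list_images(html: str):
    parser = ImageCollector()
    parser.feed(html)
    return parser.images

# Hypothetical fragment of a team page
page = '<div><img src="/team/anna.jpg" alt="Anna, CTO"><img src="/logo.svg" alt=""></div>'
# list_images(page) returns both entries; a real audit would then run
# face detection on each image to keep only photos containing people
```

This only inventories what a page links to; internal systems (SharePoint, asset libraries) need their own crawls, but the principle is the same: enumerate first, classify second.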

2. Document and classify

For each source, document the legal basis for processing, the categories of people appearing, and the retention policy. Add photo processing to your Record of Processing Activities.
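A Record of Processing Activities entry can be modeled as a simple structured record. This is a minimal sketch; the field names mirror Article 30 requirements loosely and are not a prescribed schema.

```python
from dataclasses import dataclass, field

@dataclass
class PhotoSource:
    """One entry per photo source in the Record of Processing Activities (illustrative fields)."""
    name: str
    legal_basis: str              # e.g. "consent", "legitimate interest"; "" if undocumented
    categories_of_people: list = field(default_factory=list)
    retention: str = "undefined"

def undocumented(sources):
    """Sources with no documented legal basis: the gaps to close first."""
    return [s.name for s in sources if not s.legal_basis]

inventory = [
    PhotoSource("Website team page", "consent", ["employees"], "while employed"),
    PhotoSource("Event gallery 2019", "", ["attendees"]),
]
# undocumented(inventory) returns ["Event gallery 2019"]
```

Even a spreadsheet with these four columns beats having no record at all; the point is that every source gets a documented basis and retention policy.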

3. Build the operational capability

You need the ability to find all photos of a specific person and to respond to access and erasure requests. This is where tooling like Ansikt makes the difference between a theoretical compliance posture and an operational one.
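Under the hood, "find all photos of this person" typically means comparing face embeddings: a model turns each detected face into a numeric vector, and faces of the same person land close together. The toy sketch below uses 3-dimensional vectors and a 0.6 distance threshold purely for illustration; real systems use model-produced embeddings (often 128 dimensions or more) and a threshold tuned for that model.

```python
import math

def euclidean(a, b):
    """Euclidean distance between two embedding vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def matches(query_embedding, photo_embeddings, threshold=0.6):
    """Photo IDs whose face embedding is within the distance threshold.
    The 0.6 threshold is an illustrative assumption, not a universal constant."""
    return [photo_id for photo_id, emb in photo_embeddings.items()
            if euclidean(query_embedding, emb) <= threshold]

# Toy 3-d embeddings standing in for real model output
photos = {"event_042.jpg": [0.1, 0.2, 0.3], "team_007.jpg": [0.9, 0.8, 0.7]}
query = [0.12, 0.21, 0.33]
# matches(query, photos) returns ["event_042.jpg"]
```

With an index of embeddings for every photo in every system, a DSAR or erasure request becomes a single lookup instead of a manual trawl through folders.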

Find out what is hiding on your website

You probably have no idea how many face photos are publicly visible right now. Our free scan shows you the exposure you didn't know existed. Two minutes. No signup. Just a URL.