A permanent platform boundary

Zero Biometric Data.

We do not use facial recognition. We do not hash faces. We do not store or process biometric identifiers. We believe there is no safe way to handle biometric data in consumer photo delivery.

This page explains our design position and threat model. It is not legal advice.
The argument

Why we refuse biometrics in guest photo delivery

We are not saying biometric systems are always malicious. We are saying they create a class of risk that is irreversible in consumer workflows. If something goes wrong, you cannot meaningfully rotate a face the way you rotate a link, code, or password. Our platform stays safer by never entering that category.

Point 1

Biometrics are not rotatable

Links, codes, passwords
If exposed, you revoke, regenerate, and move on.
Biometric identifiers
If exposed, there is no reliable re-issue. That is the core asymmetry.
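To make the asymmetry concrete, here is a minimal sketch in Python. The names are hypothetical, not our actual API; the point is that a leaked code is one function call away from being replaced, and there is no equivalent call for a face.

    # A minimal sketch of rotatable credentials (hypothetical names).
    import secrets

    def issue_access_code(gallery: dict) -> str:
        """Attach a fresh, unguessable code to a gallery."""
        gallery["access_code"] = secrets.token_urlsafe(8)
        return gallery["access_code"]

    def rotate_access_code(gallery: dict) -> str:
        """If a code leaks: revoke it, issue a new one, move on."""
        gallery.setdefault("revoked", []).append(gallery.pop("access_code", None))
        return issue_access_code(gallery)

    # There is no rotate_face(). That is the entire asymmetry.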
Point 2

Hashing is not an escape hatch

To hash, you must first extract
Any system that can match later must create a stable template now. A template that can match a person is still a biometric identifier.
Stability is the danger
A matching template is designed to persist across time, devices, and contexts, which is exactly why its exposure cannot be undone.
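A small Python sketch shows why, using hypothetical placeholder values. A cryptographic hash of two different photos of the same person never matches, so a system that can match later must keep a stable, similarity-preserving template instead, and that template is itself a biometric identifier.

    # Hypothetical sketch: why "hashing" does not defuse matching.
    import hashlib

    # Two photos of the same person yield unrelated digests, so a
    # true cryptographic hash can never power "find by face".
    photo_a = b"pixels of the same person, photo 1"
    photo_b = b"pixels of the same person, photo 2"
    assert hashlib.sha256(photo_a).digest() != hashlib.sha256(photo_b).digest()

    # A system that CAN match later must compare stable templates:
    #     match = distance(embed(photo_a), embed(photo_b)) < threshold
    # embed() and threshold are placeholders. The stable, comparable
    # template they imply is the biometric identifier.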
Point 3

Consent breaks in real venues

Mixed scenes are normal
Families, groups, minors, crowds, school events, public venues.
Opt-outs create exceptions forever
Every opt-out creates an exception lane that must be honored across every feature, forever, and that lane is where accidental processing happens.
What we do instead

Simple access, clear boundaries

Guests find galleries using links and access codes, and studios control sharing rules. This keeps the platform rotatable, auditable, and predictable.
Gallery links
Fast, familiar, shareable. Controlled by the studio.
Access codes
Low friction for high-volume programs, with no enrollment required.
Studio-managed sharing
Studios decide how access works. FYP enforces controls.
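As an illustrative sketch (hypothetical field names and URL, not our actual schema), every piece of this model is a plain credential the studio can set, expire, or replace:

    # Hypothetical sketch of studio-managed sharing rules.
    from datetime import datetime, timedelta, timezone
    import secrets

    def create_share(rules: dict) -> dict:
        """The studio picks the rules; the platform enforces them."""
        return {
            "link": "https://example.com/g/" + secrets.token_urlsafe(6),
            "access_code": secrets.token_urlsafe(4) if rules.get("require_code") else None,
            "expires_at": datetime.now(timezone.utc)
                + timedelta(days=rules.get("valid_days", 30)),
        }

    # Every field is revocable and auditable. Nothing here encodes a person.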
Industry signal

This risk has a history, even at big platforms

We are not building a legal case here. This is a product risk signal. When biometric pipelines meet consumer reality, the failure modes are expensive, heavily scrutinized, and hard to unwind.

Examples below are provided for context only. They are not legal advice, and they are not an exhaustive list.
Example

Illinois (BIPA) lawsuits and settlements

Illinois’ Biometric Information Privacy Act became a major litigation driver and produced large reported settlements, including a widely reported $650M settlement of claims tied to Facebook’s photo-tagging feature.

Example

State enforcement actions

Some disputes are not “just class actions.” State-level enforcement can produce very large outcomes: Texas, for example, announced a reported $1.4B settlement with Meta over claims related to facial recognition.

Example

Scraping and repurposing concerns

Facial datasets can be built from scraped images, then repurposed. Litigation around companies accused of scraping faces highlights how quickly a “helpful feature” becomes a broader surveillance concern.

Our takeaway

We refuse the category

The safest way to avoid biometric disputes, edge cases, and exception routing is not to build the biometric pipeline at all. We choose rotatable access methods that can be audited and repaired when something goes wrong.

Visual model

Risk does not scale evenly

These charts are illustrative. They do not use customer data, and they are not claims of absolute measurement. They exist to show why we treat biometric identifiers as a different risk class.

Illustrative risk profile
Biometric identifiers vs link/code access
Interpretation: biometric systems concentrate irreversible risk in fewer failure modes (breach, misuse, repurposing).
Rotatability test
What you can reset after exposure
Interpretation: links, codes, and passwords can be rotated quickly. Biometric identifiers cannot be meaningfully re-issued.
Threat model

How biometric risk enters a photo workflow

Biometric flow (what we refuse)
Capture face data, create templates, store templates, match templates later. Each step increases the blast radius if anything goes wrong.
Capture
A face is scanned in a consumer context.
Template
A biometric representation is created.
Storage
Templates must be stored and protected.
Matching
Templates are used for retrieval decisions.
Blast radius grows
A password leak can be mitigated. A biometric leak is persistent and hard to meaningfully undo.
Our platform is designed to avoid this entire branch, permanently.
What we commit to

Clear commitments

No facial recognition features
We do not offer “find by face” workflows for guests.
No hashing of faces
We do not create or store face templates, hashes, or biometric identifiers.
Studios control access
Links, codes, passwords, and sharing are studio-managed by design.
Clarity over cleverness
We prefer predictable guest flows that scale without biometric risk.
Talk to us about studio delivery
Guests: if you need help finding your gallery, use Guest Help on the main site.
FAQ

Common questions

Short answers for operators and partners.

What about hashing or templating faces instead of storing them?
Hashing and templating do not remove biometric sensitivity. If it can match a person later, it is still a biometric identifier. Our position stays simple: we do not collect, store, or process biometric identifiers.
How do guests find their photos without facial recognition?
Guests typically arrive via a link or access code from the studio or event. That path is fast, familiar, and works across devices without enrolling a face. Studios retain control of sharing and password rules.
How long will this policy last?
Permanent. This is a platform boundary. We are building guest access and studio delivery that do not require biometric shortcuts.
Do studios still control who can access a gallery?
Yes. Studios manage links, access codes, passwords, and sharing rules. That keeps control with the operator and keeps the platform out of biometric territory.
Couldn’t you collect biometric data and just tag it as sensitive?
You can, but then your entire pipeline must treat that tag as a permanent compliance rule across every transformation: resizing, exports, indexing, search, AI tooling, support workflows, analytics, backups, and vendor tools. One missed handler becomes accidental processing. We avoid the category entirely.
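To make that contrast concrete, here is a hypothetical sketch of the boundary we choose instead: refuse biometric fields at a single ingestion point, so no downstream handler has a tag to forget.

    # Hypothetical sketch: one choke point instead of N careful handlers.
    BIOMETRIC_KEYS = {"face_template", "face_embedding", "face_hash"}

    def ingest(record: dict) -> dict:
        """Biometric keys never enter the pipeline in the first place."""
        banned = BIOMETRIC_KEYS & record.keys()
        if banned:
            raise ValueError(f"refused biometric fields: {sorted(banned)}")
        return record  # resize, export, search, and backups never see them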