We do not use facial recognition. We do not hash faces. We do not store or process biometric identifiers. We believe there is no safe way to handle biometric data in consumer photo delivery.
We are not saying biometric systems are always malicious. We are saying they create a class of risk that is irreversible in consumer workflows. If something goes wrong, you cannot meaningfully rotate a face the way you rotate a link, code, or password. Our platform stays safer by never entering that category.
In real venues, consent is not universal. The moment one person opts out, your workflow becomes a permanent exception engine. That exception engine is exactly where accidental processing and compliance failures happen.
Mary uploads a selfie to find her photos. She "consents" to the scan. She thinks the system is only looking for her.
Steve is in the background of 40 photos. He never used the app. He never consented.
Face-finding AI doesn't work in a vacuum. To "match" Mary, the system must first compute a biometric template for every face in the photographer's upload, Steve's included.
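The mechanics make this unavoidable. Here is a minimal, hypothetical sketch (the names and the toy `embed()` stub are illustrative; we build none of this) showing that any matcher must template every gallery face before it can find the one consenting user:

```python
# Hypothetical sketch: matching ONE opted-in user still requires
# processing EVERY face in the upload. The embed() stub stands in
# for a real face-embedding model; it is not our implementation.

def embed(face_pixels: bytes) -> tuple:
    # Stand-in for a biometric template. A real matcher computes
    # something like this for each detected face, consented or not.
    return (len(face_pixels), sum(face_pixels) % 997)

def find_matches(selfie: bytes, gallery_faces: list[bytes]) -> list[int]:
    query = embed(selfie)                    # Mary's template: she consented
    matches = []
    for i, face in enumerate(gallery_faces): # Steve's face is in here too
        template = embed(face)               # computed BEFORE any consent check
        if template == query:                # naive exact-match comparison
            matches.append(i)
    return matches
```

The consent check, if one exists at all, can only happen after the biometric processing has already occurred. That ordering is the structural problem.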
We are not building a legal case here. This is a product risk signal. When biometric pipelines meet consumer reality, the failure modes are expensive, high scrutiny, and hard to unwind.
Illinois’ Biometric Information Privacy Act became a major litigation driver and produced large reported settlements, including Facebook’s widely reported $650M settlement over photo-tagging related claims.
Some disputes are not “just class actions.” State-level enforcement can produce very large outcomes: Texas, for example, announced a reported $1.4B settlement with Meta over facial recognition related claims.
Facial datasets can be built from scraped images, then repurposed. Litigation around companies accused of scraping faces highlights how quickly a “helpful feature” becomes a broader surveillance concern.
The safest way to avoid biometric disputes, edge cases, and exception routing is not to build the biometric pipeline at all. We choose rotatable access methods that can be audited and repaired when something goes wrong.
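What "rotatable" means in practice can be shown in a few lines. This is a simplified sketch under assumed names (`Gallery`, `rotate`), not our production code: access is a random credential tied to a gallery, not to anyone's body, so a leak is repairable.

```python
# Minimal sketch of a rotatable access credential, the risk category
# we choose instead of biometrics. Names are illustrative.
import secrets

def new_access_code() -> str:
    # Random and unguessable, and tied to a gallery, not a person.
    return secrets.token_urlsafe(16)

class Gallery:
    def __init__(self) -> None:
        self.code = new_access_code()

    def can_access(self, code: str) -> bool:
        # Constant-time comparison to avoid timing leaks.
        return secrets.compare_digest(code, self.code)

    def rotate(self) -> str:
        # The repair path: if a code leaks, mint a new one and the
        # old one stops working. A leaked face can never be rotated.
        self.code = new_access_code()
        return self.code
```

Rotation is the whole point: the incident response for a leaked code is one function call, while the incident response for a leaked biometric template has no equivalent.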
These charts are illustrative. They do not use customer data, and they are illustrations of risk categories rather than measurements. They exist to show why we treat biometric identifiers as a different risk class.
Short answers for operators and partners.