Perception Consent

The internet has Do Not Track. Physical space has nothing.

For two decades, the privacy conversation has been about the data we generate online. Cookies, trackers, opt-outs, consent banners. The framework is imperfect, but it exists. You can express a preference. Sometimes systems honor it.

When an autonomous vehicle drives past your home, it captures you, your children, and your license plate. When a delivery robot rolls down your sidewalk, it records every face it passes. When a humanoid robot enters a coffee shop, it perceives everyone inside. None of these systems ask. None offer an opt-out. There is no protocol for saying no — or even not this, not now.

The standard defenses — that asking is too hard, that public space carries no expectation of privacy, that the data is needed for safety — are deferrals, not answers. The hard part is solvable. The public-space argument was written for human observers, not fleets with perfect memory. And collecting data is not the same as keeping it.

Every machine that perceives you should give you the chance to opt in or opt out. A two-way privacy handshake between humans and autonomous systems is not science fiction. It is the next layer of the internet, and it is closer than the industry admits.

Perception consent is the principle that people in physical space should be able to express preferences about how machines perceive, record, and learn from them — and that machines should be designed to read and honor those preferences. It is Do Not Track for physical space. It is the missing layer in every regulatory conversation about AI in public.
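To make the analogy concrete, here is a minimal sketch of what a machine-readable perception preference could look like, in the spirit of the Do Not Track header. Everything in it is hypothetical: the field names, the `PerceptionPreference` structure, and the broadcast mechanism are illustrative assumptions, not an existing standard.

```python
import json
from dataclasses import dataclass, asdict

# Hypothetical payload a person's device might broadcast (e.g., over a
# short-range radio beacon). Field names are illustrative, not a standard.
@dataclass
class PerceptionPreference:
    subject_id: str        # ephemeral, rotating identifier
    allow_detection: bool  # may the system notice me at all?
    allow_recording: bool  # may raw footage of me be retained?
    allow_training: bool   # may my data be used to train models?
    expires_unix: int      # preference valid until this timestamp

def encode(pref: PerceptionPreference) -> bytes:
    """Serialize a preference for broadcast."""
    return json.dumps(asdict(pref)).encode()

def should_retain(payload: bytes, now_unix: int) -> bool:
    """Robot-side check: retain footage only if recording is allowed
    and the preference has not expired. Default-deny on any failure."""
    try:
        pref = PerceptionPreference(**json.loads(payload))
    except (ValueError, TypeError):
        return False  # unreadable signal: treat as a "no"
    if now_unix > pref.expires_unix:
        return False  # stale preference: re-ask, don't assume
    return pref.allow_recording

pref = PerceptionPreference("anon-7f3", True, False, False, 1_900_000_000)
print(should_retain(encode(pref), now_unix=1_750_000_000))  # False: recording declined
```

The design choice worth noting is default-deny: a signal the machine cannot parse, or one that has expired, reads as a refusal rather than permission. That mirrors how consent works between people, and it is the inverse of how ad-tech interpreted Do Not Track.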

The framework will be built. The question is who builds it, and whether the people being perceived have any say in how.


Griffin Leggett is an inventor with a patent-pending portfolio covering autonomous system privacy, perception consent protocols, and AV robustness testing. Contact: griffin@griffin-leggett.com