Unless you spend all of your time in a cave, you’ve probably been photographed without your knowledge.
Surveillance cameras are becoming the norm in public spaces. In the United Kingdom, for example, there are an estimated 4.2 million cameras (some even with facial-recognition capability) that passively capture photos of nearly the entire nation, every moment of the day.
At the same time, back in the States, we’re seeing the rise of self-surveillance — otherwise known as sousveillance — and whether you like it or not, people are photographing you (accidentally or otherwise) with cheaper and cheaper portable cameras, and ever-smarter phones.
Cameras are becoming ubiquitous: action cameras like GoPro’s HERO are gaining consumer interest, and wearable, automatic lifelogging cameras are making their debut. With them comes a shift that will change how we think about public space, and how we manage our public personas.
Do You Own Your Face?
In early November, Narrative (formerly Memoto) will begin shipping their long-awaited Narrative Clip. Other newcomers like Perfect are also seeking to uplevel the game with Glassware apps. As an avid lifelogger, I find it an exciting leap forward in wearables, one that will add new context to my lifelog. But when these cute little cameras are clipped to lapels around the globe, and the tsunami of images begins flowing, you need to ask yourself this question: Who owns your face?
The somewhat unnerving answer is: not you. Probably not anyone else either, unless you’re a very famous model with a very restrictive contract. There are laws governing the use of your recognizable image for commercial purposes, and if you’re an extra on a film, you’ll have to sign a waiver allowing the use of your image, even if you aren’t paid for it.
Here’s a short personal anecdote about what can happen when your image is set loose in the world. In the not-so-distant past, I was working on some lifestyle videos for a certain Software Giant. During one shoot, there was a last-minute decision to cast me in the role of “the girlfriend” in a touching scene that involved pointing a remote at a TV. We shot two versions of this ten-second masterpiece; I signed the waiver and went on with my life.
The scene appeared, as planned, in an interactive demo. A few months later it was used as part of a presentation at a global conference. And around Christmas it was shown on a giant screen in Times Square. For two years, it was recycled over and over by the Software Giant. And why not? Who wants to pay to shoot another couple pointing a remote at a screen?
That video was not shot in public, and I did sign the waiver. This story is fairly typical in terms of what happens when you waive the rights to your image. But what about your public image? When you’re in public anyone else can see you, without paying for the privilege. Likewise, any lens can capture you and what you’re doing.
Can someone post a photo of you online without your permission? Have you seen Facebook? Yes, they can. What if that photo was taken without your knowledge? As attorney Ruth Carter states in this blog post, “You have no expectation of privacy in anything you do in public.”
However, the rise of visual data is not necessarily a bad thing. In addition to producing rich photo lifelogs and visual records of important events, ubiquitous photography also creates a historical archive unlike anything we’ve had before — it’s truly an exciting time. If the images can be parsed, tagged, and archived safely, we will be able to look back at almost any moment in time.
The Shift in Public Behavior
It’s becoming much more difficult to get away with behaving badly in public. If you’re the kind of person who does incriminating or embarrassing things, the likelihood that evidence of your actions will surface is rising fast. For some people, the presence of cameras anywhere and everywhere may cause them to act differently — better, cooler, more like the person they wish they were.
More likely, we’ll simply get used to being photographed and go on doing what we do. As Narrative founder Oskar Kalmaru says in this interview, “We’ve already seen that in the testing we’ve done that users and people around the users very quickly adapt to [the camera].”
Social media has made us accustomed to living at least somewhat in the public eye. Now the public is getting more eyes. It’s probably best to assume that if you’re in public, you’re on Candid Camera.
The Narrative founders encourage transparency. If we live in a world where our authentic lives are captured digitally — by cameras, apps, and sensors — we are creating records of who we really are. These can be used as mirrors, to reflect on whether we’re living the lives we want to live. They also become valuable repositories of knowledge for better understanding the human condition, if we’re willing to be transparent.
For many, that’s not an easy leap of faith to make. Given the recent data privacy scares, the idea of data transparency may seem dangerous. The good news is, good stewardship of data and protection of privacy are issues getting attention in both the public and private sectors. In order to succeed, those who create, store, and use this data will have to behave ethically.
Personally, I’m ready to be on Candid Camera. I think the value of lifelogging outweighs the risk. I’ve spoken to many lifeloggers and quantified-selfers who feel the same. Technology is revolutionizing the way we perceive ourselves and our world. The more honest that perception becomes, the better off we all are.
Kitty Ireland is a data curator at A.R.O., makers of Saga, the essential lifelogging app. Ireland oversees the development of new data interfaces, and is tackling the challenge of making your personal data friendly, useful, and more consumable.