Dear Republicans, beware: Big Brother is watching you

In Tampa, Florida, just outside the building where the Republican National Convention is taking place, vigilant observers are perched high above, working day and night to spot suspicious activity. They are not police officers – they are surveillance cameras, equipped with “behavior recognition” technology that constantly studies each person to determine whether he or she is the next security threat. By “learning” patterns of behavior, these devices can monitor large crowds and alert authorities, within seconds, when something out of the ordinary occurs.

High-tech security measures might be expected at large politically charged gatherings. But cameras capable of real-time, sophisticated data mining are starting to appear everywhere.

It may soon no longer be necessary to have a human being actively monitoring the screens. Computers will be able to do a better job, and at a fraction of the cost. Legal protections against surveillance cameras currently focus on where a camera can be placed; that focus will shift to what types of analysis a camera can perform, and for what purpose.

The reason for the quick adoption of these cameras is simple: Human beings are not good at attentively watching large amounts of video for very long. In the United States, an estimated 30 million surveillance cameras generate more than 4 billion hours of footage every week. At best, only a small portion of this footage will ever be reviewed. London, for example, has close to 500,000 surveillance cameras, yet they have helped police solve only three percent of street robberies.

Instead of merely helping to solve crimes after they have happened, advances in camera technology can spot problems as they occur. On Liberty Island, home to one of the nation’s most famous landmarks, surveillance camera data are brought together and analyzed to detect when somebody abandons a bag or tries to stay on the island after hours. The technology can even alert police when a fight appears imminent. Across the bay, in Manhattan, surveillance cameras can track a person’s general description. If there is a report about a suspicious person wearing a red shirt, for example, every person wearing a red shirt in sight of any of the area’s thousands of cameras can be displayed together—in an instant.

It’s not just law enforcement that has taken note of this. Retail outlets such as Macy’s, Babies “R” Us, and CVS have installed systems in some of their stores that can spot shoppers who do unusual things — such as remove many items from a shelf at once, open a case that is normally locked, or walk suspiciously through the aisles. Pathmark grocery stores have implemented similar technology that quickly alerts managers to potential shoplifting and employee fraud as they take place.

These systems are programmed to assume that everybody is a potential shoplifter, terrorist, or criminal. Beyond undermining the presumption of innocence, this raises serious questions about privacy. The idea of a person closely watching our movements is unsettling. Does it “feel” different if the watcher is just a computer rather than a human being?

WikiLeaks cables released earlier this month revealed widespread use by local and federal agencies in the U.S. of TrapWire, a technology that aggregates incident reports and camera feeds to try to detect potential terrorist threats. Understandably, there was uproar over the lack of public disclosure. The same capabilities are being used in other parts of the world to combat dissent. In China, security cameras are commonly used to count the number of people in crosswalks and alert the authorities if a crowd forms at an unusual time — which could be a sign of an unsanctioned protest. Around the world, companies like Sony, Kraft, and Adidas are also installing cameras to target ads to consumers based on their physical features.

The last two decades have largely settled the question of where a security camera can be placed. The promise of increased safety has trumped the right to remain anonymous. As the cost of monitoring technology drops and advanced surveillance becomes even more affordable, not having behavioral detection systems in place will come to be seen as a danger and a liability.

So far, there has been little consequence to this because nothing is usually done with the footage. But that is going to change. Concerns will undoubtedly arise over how these datasets can be combined with personally identifiable information to track not only our locations and activities, but our moods and emotional states. You can expect these to be the next privacy battles in the courts. One would expect the Republicans—who often consider themselves to be the defenders of free speech and liberty—to lead the charge against these technologies.

Meanwhile, back at the Convention in Tampa, cameras have been working overtime alongside police officers to make sure that things run smoothly. If the protests turn violent, as they did at the 2008 Convention in St. Paul, the authorities will now know when and where to react. Big Brother will tell them. It will be interesting to see how the need for domestic security is balanced against individual rights and our need for privacy.

Tarun Wadhwa is a research associate at Singularity University researching how advancing technologies can be used to solve public policy issues. This story was produced in cooperation with Singularity University partner site Singularity Hub.

[Top image via publicintelligence.net]

This story originally appeared on Forbes.com.