
The flood of news stories on the data-collection and online behavioral advertising (“OBA”) practices of search engines, mobile apps, brand advertisers, and social networks is giving many people a very distinct feeling: the creeps. Whether the stories are about concerns over Facebook sharing its users’ profile details with advertisers, Google bypassing default browser settings, or Target figuring out a teenager is pregnant before her parents do, the natural reaction is to picture the companies’ employees as shadowy, green-eyed peepers crouching in the darkness.

The curious thing about OBA practices is that it’s difficult to identify the direct “harm” they cause. Courts have struggled with this issue in many privacy lawsuits, and plaintiffs often fail because they can’t show legally cognizable harm. For their part, regulators are clearly unsettled by OBA, but even after getting comments from dozens of interested parties for a 2009 report, the FTC was unable to articulate whether or how OBA directly harms consumers. Academics have done interesting research on the pros and cons of OBA, but that research has not yet translated into any consensus on acceptable practices.

Consequently, industry leaders, privacy advocates, and regulators have not established normative rules based on the harm caused by different forms of OBA. Instead, they have focused on creating a comprehensive “notice and choice” regime, under which consumers are meant to see how their data is used and choose whether to allow such use. This is fine, assuming companies participate, but it ignores a critical real-world problem. Today’s news of the California Attorney General’s agreement with major app-enabling companies to alleviate data-collection concerns simply with more privacy policy announcements is the latest example of these efforts (in my view, largely ineffectual). When it comes to OBA, most consumers are disadvantaged by what experts call “knowledge asymmetry”: even if companies tell consumers exactly what data they’re collecting and how they’re using it, most people don’t have the expertise to understand the full implications. This reality challenges the notion of “informed consent” and suggests that “notice and choice” are not enough.

But in the absence of concrete harm, how do we distinguish OBA practices that are benign from those that are unacceptably intrusive? Unfortunately, uproar over the latest privacy outrage tends to blur these distinctions. There are, however, at least seven factors that stand out as significant “creepiness” indicators. OBA that scores high on any of these factors should be scrutinized carefully and, at a minimum, industry leaders should consider establishing guidelines that discourage such practices.


Creep Factor No. 1: Linking behavioral data with unique identifiers

One of the most powerful ways to deliver targeted ads to consumers is to assign a unique identifier to individuals and track their online behavior across multiple sites, platforms, and apps. However, as Apple found when its use of UDIDs (Unique Device Identifiers) resulted in a public outcry, this is also one of the practices consumers find most disturbing. Although Apple is eliminating the use of UDIDs from its development platform, app developers (and their marketing executives) are pushing hard to find alternatives. Some mobile marketing companies advocate the use of MAC addresses in lieu of UDIDs. Others have proposed an open source UDID alternative. Setting aside security concerns associated with some of the UDID alternatives (MAC addresses? Really?), the problem with these alternatives is they aren’t really any less unnerving than the technology they seek to replace.
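To see why stand-ins for UDIDs don’t solve the underlying problem, consider this minimal Python sketch. The hashing scheme and salt here are hypothetical, not any vendor’s actual implementation; the point is that hashing a hardware identifier like a MAC address still yields a stable value that follows the device everywhere.

```python
import hashlib

def pseudonymous_device_id(mac_address: str, salt: str = "app-vendor-salt") -> str:
    """Derive a 'pseudonymous' identifier by hashing a MAC address.

    Illustrative only: the salt and truncation are made up for this sketch.
    """
    digest = hashlib.sha256((salt + mac_address.lower()).encode("utf-8")).hexdigest()
    return digest[:16]

# The output is deterministic: the same device always produces the same
# identifier, so it is just as trackable across any apps that share the
# scheme as the raw MAC address would be.
id_a = pseudonymous_device_id("00:1A:2B:3C:4D:5E")
id_b = pseudonymous_device_id("00:1a:2b:3c:4d:5e")
assert id_a == id_b  # same device, same identifier, every time
```

Replacing one persistent unique identifier with another, hashed or not, leaves the cross-app tracking capability fully intact.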

Creep Factor No. 2: Detail and scope of data collection

Most people have some tolerance for “being watched.” After all, we’re social creatures, and we understand that, at some level, others will observe what we do and try to gain advantages from what they learn. But there’s a point at which data collection can make consumers feel like they’re trapped in a kind of Orwellian Panopticon. For example, if a data collection practice is both broad (i.e., relating to behavior in multiple contexts, like emailing, texting, web browsing, and voice calling) and granular (i.e., capturing details of the behavior, as in keystroke-logging), expect a sharp rise in the sale of tin-foil hats, because consumers will do anything to avoid this kind of practice. Just ask companies like Phorm and NebuAd, who partnered with Internet Service Providers a couple of years ago to use deep-packet inspection technology to deliver targeted ads to users. If you want to know how that story ends, you can read all about it in the transcripts of the congressional hearing.

Creep Factor No. 3: OBA based on “negative” assumptions

It’s hard to envision how regulators would address this issue, since it’s inherently subjective, but it’s still relevant. OBA is all about making assumptions based on known features of the consumer. However, these assumptions can have negative, positive, or neutral connotations. If the underlying assumptions are negative, consumers will likely find this intrusive. For example, if I’m a marathon runner, I’m perfectly fine getting targeted ads promoting the latest workout app. If I’m a pudgy couch potato…not so much. (I’m a 42-year-old attorney who spends most of his day sitting in front of a computer monitor, so you can guess which scenario I identify with.) Consumers are much more likely to find OBA based on negative assumptions (e.g., you’re fat and need to work out) intrusive, not to mention tacky.

Creep Factor No. 4: Sensitivity of data

There’s a reason the ancient penalty for peeping Toms was gouging out their eyes. Some data is so sensitive that, even if it’s anonymized, consumers will not tolerate its collection and use. For a notably disconcerting example, read the Wall Street Journal’s reporting on Nielsen Co.’s practice of scraping a private online forum for discussion threads from people suffering from emotional disorders. Nielsen was monitoring what consumers were saying about various pharmaceutical products on the forum. The information Nielsen collected wasn’t tied to individuals and wasn’t used for direct marketing purposes. But when the story broke, you could almost hear consumers sharpening their stakes.

Creep Factor No. 5: Impact on operability

This is one issue that courts view as a legally cognizable harm. If data collection and tracking technology significantly impacts the operability of users’ computers or mobile devices, as in the case of spyware, adware, and malware, the sense of intrusion can be overwhelming. Consumers will run, not walk, away from these kinds of practices.

Creep Factor No. 6: Ease of opting out

Zombie cookies are one example of this issue. They’re HTTP cookies that are automatically recreated (I prefer the word “respawned”—much creepier) after users attempt to delete them. This technology can make it virtually impossible for users to opt out of being tracked. Any company using zombie cookies to collect or monetize sensitive information is about as wholesome as John Hinckley, Jr.
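The respawning trick is simple to model. In this hypothetical Python sketch (the store names and logic are illustrative), a tracking ID set in an ordinary cookie is also mirrored to a secondary store that survives when the user clears cookies — historically a Flash local shared object — and each visit repairs whichever copy is missing:

```python
class Browser:
    """Toy model of a browser with two storage locations."""
    def __init__(self):
        self.cookies = {}          # what the user can see and delete
        self.secondary_store = {}  # survives "clear cookies"

def track(browser: Browser, new_id: str) -> str:
    # On every page load the tracker checks both stores and restores
    # whichever copy is missing — this is what makes the cookie "undead".
    tid = browser.cookies.get("uid") or browser.secondary_store.get("uid") or new_id
    browser.cookies["uid"] = tid
    browser.secondary_store["uid"] = tid
    return tid

b = Browser()
first = track(b, "user-123")   # first visit: ID assigned and mirrored
b.cookies.clear()              # the user deletes their cookies...
respawned = track(b, "user-456")
assert respawned == first      # ...but the old ID comes right back
```

Because the user-facing delete action only touches one of the two copies, opting out requires knowing about (and clearing) every backing store the tracker uses.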

Creep Factor No. 7: Lack of notice

Online apps and services may provide various types of notice to users about what’s being done with their data, but it’s safe to say that any OBA data-collection practice conducted with absolutely no consumer notice is seriously disturbing. A good example of this is a practice called “device fingerprinting.” Device fingerprinting creates a unique identifier for computers, cell phones, and other devices based on a combination of externally observable characteristics like installed font styles, clock settings, and TCP/IP configuration. In addition to being problematic because it creates a persistent, unique identifier (see “Creep Factor No. 1”), this information is collected “passively,” and in most instances users can’t even detect that it’s happening.
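A hypothetical Python sketch of the basic idea: a handful of passively observable traits (the attribute names below are illustrative examples of signals a fingerprinting script might read; real systems use dozens more) are combined and hashed into a single stable identifier, with no cookie ever being set.

```python
import hashlib
import json

def device_fingerprint(attributes: dict) -> str:
    """Combine passively observable device traits into one identifier.

    Illustrative sketch: serializing with sorted keys makes the result
    deterministic regardless of the order the traits were collected in.
    """
    canonical = json.dumps(attributes, sort_keys=True)
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()[:16]

fp = device_fingerprint({
    "user_agent": "Mozilla/5.0 (Windows NT 6.1; rv:10.0)",
    "installed_fonts": ["Arial", "Calibri", "Comic Sans MS"],
    "timezone_offset": -480,
    "screen": "1920x1080x24",
})
# No identifier is ever stored on the device — yet the combination of
# traits is often distinctive enough to re-identify the same machine on
# a later visit, and the user has nothing to inspect or delete.
```

Because nothing is written to the device, there is no artifact for the user to find, which is precisely why fingerprinting fails the notice test.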

There are undoubtedly many other “Creep Factors,” but I’ve tried to identify the worst of them. The point is that not all data collection and OBA poses the same threat to consumers’ sense of personal privacy. By identifying specific practices likely to be viewed as intrusive, industry leaders, trade organizations, and regulatory bodies may find it easier to determine the level of notice required, or whether some practices should be prohibited outright. These criteria may also be useful for companies developing OBA and tracking technologies who want to build sustainable businesses. After all, nobody likes a creep.

Slade Cutter is a licensed attorney, Certified Information Privacy Professional, and member of the Mobile Marketing Association’s Consumer Best Practices Committee. His firm, Cutter Law, provides general counsel and compliance consulting services to companies in the interactive media and e-commerce spaces.

[Credit for top image: Zurijeta/Shutterstock]
