Updated at 6:14 p.m. Pacific time.
Facebook already stores tons of data. Its data warehouse for analytics alone measures 300 petabytes. And in just a couple of months, it looks like that stockpile could mushroom.
The social network with more than 1.15 billion monthly active users could start grabbing data on “minute user interactions,” including how long a user’s cursor spends on certain areas of the Facebook site and whether the news feed is currently displayed on a mobile device, the Wall Street Journal reported today.
Multiply those little flows of data making their way to Facebook data centers by all those users, and you realize Facebook analysts could have lots more data on hand to evaluate. Those data analysts could take their insights over to product managers and work with them to play up, optimize, or retire user-facing elements and then run A/B tests to check their revisions before rolling out the changes to everyone.
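As a rough sketch of the kind of aggregate comparison such analysts might run, here is a minimal example that groups hover durations by A/B variant and averages them. The field names, sample data, and the function itself are hypothetical illustrations, not a description of Facebook's actual pipeline:

```python
from statistics import mean

# Hypothetical interaction logs: (user_id, variant, hover_ms).
# These records and field names are illustrative only.
events = [
    ("u1", "A", 320), ("u2", "A", 150), ("u3", "A", 410),
    ("u4", "B", 520), ("u5", "B", 480), ("u6", "B", 610),
]

def mean_hover_by_variant(logs):
    """Group hover durations by A/B variant and return the mean per variant."""
    by_variant = {}
    for _, variant, hover_ms in logs:
        by_variant.setdefault(variant, []).append(hover_ms)
    return {v: mean(durations) for v, durations in by_variant.items()}

print(mean_hover_by_variant(events))
```

In practice a real experiment would also need a significance test and far more data before informing a product decision, but the aggregation step looks roughly like this.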
Presumably analysts could also figure out what users are doing in different countries, states, and even cities (after all, users already share that data with Facebook) and perhaps adjust the product for different geographical regions as they see fit. For example, if Graph Search isn’t getting much attention from users on desktop computers in Florida, maybe it would make sense to deploy the tool on mobile devices to users in other states first and improve the product. That way, it could make a bigger splash when it finally does debut in Florida.
It would be a little silly to think Facebook would use clickstream data exclusively for product development. The data could provide advertisers with better metrics on engagement. If users spend more time rolling their cursors over an ad this week than they did last week, well, that could validate the power Facebook has as an advertising platform. Indeed, Facebook analytics chief Ken Rudin said improving ad targeting could be a use case for the expanded data warehouse.
To be fair, Rudin — who will speak at VentureBeat’s DataBeat 2013 conference in Redwood City, Calif., on Dec. 4-5 — said in the Wall Street Journal interview that he “can’t promise that it will roll out.” But if it does happen, Facebook will have to think even harder about how it can pack the most data into its already large infrastructure footprint.
By the way, if Facebook is able to pull off this new level of data collection without raising a ruckus among privacy advocates, the practice could become more common on other websites. You can already hear data-analytics vendors foaming at the mouth.
After this article was published, a Facebook spokesperson sent the following statement to VentureBeat: “Like most websites, we run numerous tests at any given time to ensure that we’re creating the best experience possible for people on Facebook. These experiments look at aggregate trends of how people interact with the site to inform future product decisions. We do not share this information with anyone outside of Facebook and we are not using this information to target ads.”
At the end of the day, it looks like the company wants to improve its own product with the data Rudin talked about — and simultaneously avoid getting dinged on the privacy front, by saying it wouldn’t let the data go elsewhere. Ideally, the improvements would delight users so much that they would push aside concerns about allowing Facebook itself to get the data in the first place.