Join top executives in San Francisco on July 11-12, to hear how leaders are integrating and optimizing AI investments for success. Learn More
You know those “Consider the environment: Think before you print” signatures that show up on company emails? They’ve led modern society astray. While keeping emails in the virtual world certainly saves paper, storing and transmitting the data behind those emails comes at an often overlooked — but increasingly high — environmental cost.
Of course, we’re not just keeping email data alive. The amount of data powering our digital society’s work and play is huge, ranging from that “environmentally friendly” email to digital security camera footage to IT infrastructure information. The list keeps growing.
We increasingly store data in a mysterious place called the cloud. In fact, by 2025, we’ll store half of the world’s data in cloud servers. But as ethereal as the cloud sounds, it’s powered by very material data centers.
The increasing need for data storage capacity
Data centers house the information making our digital economy possible. And the need for these data centers is growing.
In the next three years, the world will create and consume an estimated 181 zettabytes of data, almost double our current volume. Fueling this meteoric rise in data volume are more people coming online and more companies migrating their assets to the cloud.
According to the U.N., nearly half of the global population, or 3.7 billion people, still can't regularly access the internet. But those 3.7 billion people will likely start connecting as time goes on. In the corporate world, too, there's room for tremendous growth in digital activity, as the cloud currently stores only 60% of all corporate data.
The data center industry is taking note. This sector will grow by a CAGR of almost 22% to reach nearly $616 billion by 2026, according to Technavio estimates.
But at what cost?
The climate change cost of data centers
Once an email is read — and not printed — it’s often out of sight, out of mind. But there’s a significant, albeit hidden, carbon cost to consuming this data.
One key factor is data center cooling. To function, data centers must stay cool, around 68 to 71 degrees Fahrenheit. But servers generate heat which, without proper temperature control, can bring them down, causing expensive downtime and service interruptions. So, data centers need to either reside in cool climates or have round-the-clock, infallible temperature control systems.
While naturally cold regions could offer a more environmentally friendly option for data center cooling, data localization laws and sector-specific regulatory requirements often dictate that data remain in certain locations.
Other economic forces are also at play. Exhibit A: The U.S. has the most data centers in the world with 2,670 facilities. And the world’s largest cluster of these data centers is in balmy Virginia’s “Data Center Alley.” This area’s location doesn’t exactly offer a cool climate, hence the need for cooling systems.
Most data centers rely on artificial cooling systems of two kinds: liquid and air cooling. Liquid cooling systems pipe cold water through server racks and tend to be more efficient, but they rely on watersheds, many of which are already under stress from climate change. Air cooling blows cold air using massive fans and consumes enormous amounts of electricity.
The environmental price tag is high. A recent MIT report found that a single data center can consume as much electricity as 50,000 homes. This enormous energy consumption gives the cloud a carbon footprint larger than that of the notorious airline industry.
Yet, the data center industry remains under-scrutinized. The environmental impacts of other industries — the airline industry included — tend to occupy environmentalists’ time and attention.
It’s the tech industry’s turn to take action
The environmental cost of data storage is an inconvenient truth. So inconvenient, in fact, that much of the tech industry chooses to hide or simply ignore its enormous — and growing — carbon footprint.
It's no longer an option to throw our hands in the air because our digital economy can't operate without data centers. The industry must acknowledge its contribution to climate change and embrace its responsibility to do better. In fact, companies doing business in Europe will soon have to report their carbon footprints, including emissions from their use of the cloud.
While the data center market must innovate at the data centers themselves to reduce power and water consumption, the rest of the industry should rethink its intrinsically wasteful, “the more data, the merrier” mindset.
- Stop using email for data communications
Email was designed for human-to-human communication; its features serve that purpose well but make it an incredibly inefficient mechanism for machine-to-machine communication. Worse, some businesses are subject to governance regulations requiring emails to be retained for many years. Ditch email and use modern communication methods such as webhooks and REST APIs.
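To make the overhead concrete, here is a minimal sketch comparing the same machine-generated alert as a bare JSON webhook payload versus the same data wrapped in email framing. The event fields, addresses, and subject line are all hypothetical, invented for illustration; the point is only that email's human-oriented headers and framing add bytes (and retention obligations) that machine-to-machine delivery doesn't need.

```python
import json
from email.mime.text import MIMEText

# Hypothetical alert event a monitoring tool might emit.
event = {"service": "checkout", "status": "degraded", "latency_ms": 900}

# Machine-to-machine: a compact JSON body, ready to POST to a webhook endpoint.
webhook_body = json.dumps(event)

# The same event wrapped as an email gains headers and MIME framing
# that exist purely for human readers and mail servers.
msg = MIMEText(json.dumps(event))
msg["Subject"] = "ALERT: checkout degraded"
msg["From"] = "alerts@example.com"
msg["To"] = "ops@example.com"
email_body = msg.as_string()

print(f"webhook payload: {len(webhook_body)} bytes")
print(f"email payload:   {len(email_body)} bytes")
```

The size gap only widens in practice, since a retained email is typically stored with additional routing headers, and often in multiple mailboxes at once.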
- Consider structured (versus unstructured) data
Unstructured data, also referred to as qualitative data, lacks a predefined data model. Since most data tools can't analyze unstructured data, the industry uses data lakes to preserve the raw data. The problem: Data stored in its native format is rarely deleted, so it sits in data lakes forever, on the off chance someone in some nebulous future will need it.
Structured, or quantitative, data takes more upfront work to organize and predefine before sending to storage. But this approach offers a significant benefit: Technology can extract the operationally significant context from the data and dispose of the rest. This slightly more labor-intensive process eliminates meaningless data storage.
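As a rough sketch of that upfront work, the snippet below extracts only the operationally significant fields from a raw log line and discards the rest before storage. The log format, field names, and regular expression are illustrative assumptions, not a standard; real pipelines would define the schema to match their own sources.

```python
import re

# Hypothetical raw log line as it might land, verbatim, in a data lake.
raw = ('2023-04-01T12:00:00Z host=web-1 level=ERROR msg="payment timeout" '
       'trace=abc123 extra_debug="...large diagnostic blob..."')

# Predefined schema: keep timestamp, host, level, and message only.
pattern = re.compile(r'(\S+) host=(\S+) level=(\S+) msg="([^"]*)"')

m = pattern.match(raw)
record = {
    "ts": m.group(1),
    "host": m.group(2),
    "level": m.group(3),
    "msg": m.group(4),
}

# Only the structured record is stored; the debug blob is dropped.
print(record)
```

The trade-off is exactly the one described above: defining the schema takes effort, but everything outside it, here the bulky debug blob, never reaches long-term storage at all.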
- Eliminate data lakes
Data lakes are central repositories for both structured and unstructured data. Unfortunately, many organizations see their data lakes’ size as a badge of honor, when what really matters is the data’s value. Many organizations have, indeed, created a scalability problem for themselves. Instead of focusing on how to effectively extract meaning from and store less data, the industry invests its people-power in storing greater volumes of data.
- Rethink customer contracts
Some industry regulations mandate data retention; however, tech vendors encourage many other customers not beholden to these regulations to store copious amounts of largely useless data. Why? The tech vendors’ business models rely on the amount of data storage a company “requires.” The more data a customer stores, the higher the revenue generated.
Our society can’t afford to simply not print emails and forget about the rest. While regulatory change will require the collective effort of public and private interests, individual companies must embrace their own responsibility — now. They must abandon wanton data storage and take incremental steps toward creating a more sustainable digital existence.
Richard Whitehead is a DevNetwork advisory board member, ONUG co-chair and CTO at Moogsoft.