Head over to our on-demand library to view sessions from VB Transform 2023. Register Here

Cloud footprints are exploding, as is the volume of data stored within them. 

And, due to its low cost, simplicity, reliability and flexibility (among other factors), the public cloud — or a hybrid or multicloud model incorporating it — is the option of choice. 

But everything has its disadvantages: notably, experts say, the rapid growth of workloads in the public cloud can open security gaps.

“Organizations are experiencing an explosion of data on their public cloud environments,” said Dan Benjamin, CEO and cofounder of Dig Security. This results in “an extended data attack surface that can lead to a breach or compliance failure.”



Data: Dynamic, complex — and ever-vulnerable

With the public cloud model, all servers, storage, hardware, software and other supporting infrastructure are owned and managed by the provider, and those resources are shared with other organizations, or "tenants."

As of 2022, more than 60% of all corporate data was stored in the cloud, up from 30% in 2015, and the share has continued to grow as organizations look to improve reliability and agility.

This year, revenue in the public cloud market is projected to reach $525.6 billion, registering a compound annual growth rate (CAGR) of nearly 14%. Undoubtedly, the market will only continue to grow (and at an accelerated pace), exceeding $881 billion by 2027. 

And, Gartner estimates that, by 2026, public cloud spending will exceed 45% of all enterprise IT spending, up from less than 17% in 2021.

But, Benjamin pointed out that high-profile security incidents such as the Uber and LastPass breaches have proven how vulnerable cloud data stores are, even for organizations that understand cybersecurity and invest in data protection.

“Data is dynamic and complex,” said Benjamin. “It lives in various forms and is constantly being collected, so it is ever-changing across the public cloud.” 

Cloud environments are often part of complex ecosystems that include more than one public cloud provider and on-premises infrastructure, he explained. Also, many organizations simultaneously run multiple software-as-a-service (SaaS) applications, virtual machines (VMs), containers and cloud instances, adding more layers of abstraction. 

As data travels between these assets, discovering it and mapping its flows becomes challenging, and organizations can easily lose control of it, he said.

Hiding in the shadows

As organizations move quickly and deliver faster to production, they give a lot of power to areas other than IT or DevSecOps, explained Shira Shamban, CEO and cofounder of cloud security company Solvo.

And, “they create, unintentionally of course, shadow data that doesn’t follow security best practices,” she said. 

Shadow data is data that is not actively managed or governed by IT teams. It can include snapshots, backups and copies of data used for development and testing purposes, Benjamin explained. It primarily exists in spreadsheets, local copies of databases, emails, presentations and on personal devices. 

Security controls and policies are often not applied to this data, making it more difficult to track, manage and monitor. This also leaves it susceptible to unauthorized access and exfiltration, said Benjamin. 
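The detection side of this problem can be illustrated with a minimal sketch: compare a cloud data-store inventory against the governance metadata IT expects every managed resource to carry. The inventory records and tag names ("owner", "managed-by") below are hypothetical placeholders; in practice the inventory would come from a cloud provider's API or an asset-management tool.

```python
# Minimal sketch: surface "shadow" data stores by flagging inventory entries
# that lack the governance tags marking actively managed data.
# The record shapes and tag names here are illustrative assumptions.

def find_shadow_data(inventory):
    """Return the names of resources missing any required governance tag."""
    required_tags = {"owner", "managed-by"}
    return [
        resource["name"]
        for resource in inventory
        if not required_tags.issubset(resource.get("tags", {}))
    ]

inventory = [
    {"name": "prod-customers-db", "tags": {"owner": "data-eng", "managed-by": "it"}},
    {"name": "dev-db-snapshot-2023-01", "tags": {}},           # untagged test copy
    {"name": "backup-reports.xlsx", "tags": {"owner": "ops"}}, # partially tagged
]

print(find_shadow_data(inventory))  # ['dev-db-snapshot-2023-01', 'backup-reports.xlsx']
```

The snapshot and the partially tagged backup are exactly the kinds of copies Benjamin describes: created legitimately, then drifting outside the set of resources security teams track.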

This poses significant risk from both security and compliance perspectives, he said. A lapse in compliance could result in fines and reputational damage, while a weakened data security posture exposes organizations on several levels: diminished customer trust, legal fees and IP theft.

In particular, the nature of the public cloud “makes it easy to spin up a new data store, but difficult for security teams to monitor the contents of that data store,” said Benjamin. “As such, organizations must change the way they think about data security.”

A complex data environment

Across the board, protecting cloud data is both critical and challenging — no matter whether private, public, hybrid or multicloud, experts say. 

And, the most common attacks in the cloud are no different from common attacks on-premises, said Shamban: typically, credential theft. The attack vectors unique to the cloud involve misconfiguration of cloud technology. 

Benjamin agreed that there are a variety of ways to infiltrate the cloud environment; attackers commonly exploit software vulnerabilities, leaked credentials or misconfigured access. But, regardless of how the environment is infiltrated, he said, the objective is always either to steal or sabotage the data for financial or other gain. 

“This is what makes focusing on protecting data so important and effective,” said Benjamin. 

Visibility is critical

There are many tools that organizations use to protect themselves; one common one is cloud security posture management (CSPM). This identifies and remediates risk through automated visibility, continuous monitoring, threat detection and remediation workflows. It searches for misconfigurations across diverse cloud environments and infrastructure, including SaaS, infrastructure-as-a-service (IaaS) and platform-as-a-service (PaaS). 
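At its core, the misconfiguration scanning CSPM tools perform amounts to evaluating resource configurations against a rule set. The sketch below shows the idea in simplified form; the resource records and rules are illustrative assumptions, not any vendor's actual policy format.

```python
# Minimal sketch of a CSPM-style misconfiguration scan: evaluate each resource
# configuration against a set of rules and report every violation.
# Resource fields and rules here are illustrative, not a real policy schema.

RULES = [
    ("public access enabled", lambda r: r.get("public_access", False)),
    ("encryption at rest disabled", lambda r: not r.get("encrypted", True)),
    ("access logging disabled", lambda r: not r.get("logging", True)),
]

def scan(resources):
    """Return (resource name, finding) pairs for every rule a resource violates."""
    return [
        (res["name"], finding)
        for res in resources
        for finding, violated in RULES
        if violated(res)
    ]

resources = [
    {"name": "bucket-a", "public_access": True, "encrypted": True, "logging": True},
    {"name": "bucket-b", "public_access": False, "encrypted": False, "logging": False},
]

for name, finding in scan(resources):
    print(f"{name}: {finding}")
```

Real CSPM products run thousands of such checks continuously across accounts and providers; the point of the sketch is that the underlying mechanic is rule evaluation over configuration state, which is why novel shadow data stores, invisible to the inventory, slip past it.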

Gartner, for its part, recently introduced the idea of data security posture management (DSPM). 

According to Patrick Hevesi, Gartner VP analyst, this includes several components: 

  • Compliance assessment
  • Risk identification
  • Operational monitoring
  • DevSecOps integration
  • Policy enforcement
  • Threat protection

As Benjamin explained, this approach can work alongside a similarly new concept of data detection and response (DDR), which (as its name would suggest) provides real-time monitoring, detection and response. 
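One way to picture a DDR-style detection rule is as a baseline comparison over an access-event stream: flag any principal whose read volume in the current window far outstrips its historical norm. The event shape, baseline figures and threshold factor below are illustrative assumptions.

```python
# Minimal sketch of a DDR-style detection: flag principals whose read volume
# in the current window is far above their historical baseline, a common
# signal of data exfiltration. Thresholds and event shapes are assumptions.

from collections import Counter

def detect_exfiltration(events, baseline, factor=10):
    """Return principals reading at `factor`x or more their usual rate."""
    reads = Counter(e["principal"] for e in events if e["action"] == "read")
    return sorted(
        principal
        for principal, count in reads.items()
        if count >= factor * baseline.get(principal, 1)
    )

baseline = {"app-service": 50, "analyst": 5}  # typical reads per window
events = (
    [{"principal": "app-service", "action": "read"}] * 60  # near baseline
    + [{"principal": "analyst", "action": "read"}] * 75    # 15x baseline
)

print(detect_exfiltration(events, baseline))  # ['analyst']
```

Where DSPM answers "what data do we have and how is it configured," a rule like this addresses the "response" half: it fires as the anomalous access is happening, rather than after a posture scan.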

“Increasingly, there is a heightened awareness of the risks and a movement toward better governance and monitoring over data assets,” he said. “Capabilities for DSPM, cloud data loss prevention (DLP), and DDR can help organizations meet the challenges head-on.”

A mix of tools, culture

Ultimately, organizations must train their devops and R&D teams to have security “ingrained in their mindset,” said Shamban. They must also be equipped with the right tools to help automate some of their daily decision-making and remediation tasks, as this will free up their time for more complex projects.

“We can’t stop using the cloud, and that’s why we should learn how to use it more efficiently and more securely,” she said. 

Benjamin agreed, acknowledging that enterprises aren’t going to abandon the public cloud due to its numerous advantages.

“Cloud computing enables unparalleled flexibility, performance and velocity,” he said. 

And ultimately, “the risks should not discourage organizations from using public clouds,” said Benjamin.
