This week was a busy one at AWS re:Invent, with AWS rolling out a multitude of new services: data governance and sharing, business intelligence, supply chain management, security, AI/ML, and spatial simulation tools and capabilities.
Here are some of the most significant announcements from AWS’ annual conference.
Simulating real-world scenarios

Dynamic 3D simulations help organizations across industries such as transportation, robotics, and public safety understand possible real-world outcomes and train for them. Use cases include determining new workflows for a factory floor, running through different response scenarios for natural disasters, or modeling different road-closure combinations.
But complex spatial simulations require significant compute resources, and it can be a difficult, expensive process to integrate and scale simulations with millions of interacting objects across compute instances.
To help customers build, operate and run large-scale spatial simulations, AWS has rolled out AWS SimSpace Weaver. The fully managed compute service allows users to deploy spatial simulations to model systems with many data points, such as traffic patterns across a city, crowd flows in a venue, or layouts of a factory floor. These can then be used to perform immersive training and garner critical insights, according to AWS.
Users can run simulations with more than a million entities (people, cars, traffic lights, roads) interacting in real time. “Like an actual city, the simulation is an expansive ‘world’ in itself,” according to AWS.
When a customer is ready to deploy, SimSpace Weaver automatically sets up the environment, connects up to 10 Amazon EC2 instances into a networked cluster, and distributes the simulation across the instances. The service then manages the network and memory configuration, replicating and synchronizing data across the instances to create a single, unified simulation that multiple users can interact with and manipulate in real time, said AWS.
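AWS has not published the internals of this distribution step, but the core idea, splitting a simulated world into spatial cells and assigning entities to workers by location, can be sketched in a few lines of Python. Everything below (the grid scheme, the round-robin cell-to-worker mapping) is illustrative, not SimSpace Weaver's actual algorithm:

```python
from collections import defaultdict

def assign_partitions(entities, world_size, grid_dim, num_workers):
    """Assign each entity to a worker based on the spatial cell it occupies.

    entities: list of (entity_id, x, y) tuples
    world_size: width/height of the square simulation world
    grid_dim: the world is split into grid_dim x grid_dim cells
    num_workers: number of compute instances sharing the load
    """
    cell_size = world_size / grid_dim
    partitions = defaultdict(list)
    for entity_id, x, y in entities:
        # Clamp coordinates to the world bounds, then find the cell.
        col = min(int(x // cell_size), grid_dim - 1)
        row = min(int(y // cell_size), grid_dim - 1)
        cell = row * grid_dim + col
        # Map cells to workers round-robin so load spreads evenly.
        partitions[cell % num_workers].append(entity_id)
    return dict(partitions)

# Four entities in a 100x100 world, split into a 4x4 grid across 2 workers.
entities = [("car-1", 5, 5), ("car-2", 95, 95),
            ("light-1", 50, 5), ("ped-1", 5, 95)]
print(assign_partitions(entities, 100.0, 4, 2))
```

Because neighboring entities land on the same worker, most interactions stay local to one instance, and only entities crossing cell boundaries need cross-instance synchronization, which is why this kind of spatial decomposition scales.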
Customers include Duality Robotics, Epic Games and Lockheed Martin; the latter worked with AWS to develop a San Francisco earthquake recovery demo to illustrate ways that first responders might organize an aid relief mission.
“We need to be able to simulate at real-world scale to trust that the insights we gain from simulation are transferable back to reality,” said Lockheed Martin virtual prototyping engineer Wesley Tanis.
Working with AWS, they were able to simulate more than a million objects “at a continental scale,” he said, “giving us real-world insight to increase our situational preparedness and planning across a wide range of scenarios, including natural disasters.”
Better data handling
Today’s organizations collect petabytes — even exabytes — of data spread across multiple departments, services, on-premises databases and third-party sources.
But before they can unlock the full value of this data, administrators and data stewards need to make it accessible. At the same time, they must maintain control and governance to ensure that data can only be accessed by the right person and in the right context.
The new Amazon DataZone service was launched to help organizations catalog, discover, share and govern data across AWS, on-premises and third-party sources.
“Good governance is the foundation that makes data accessible to the entire organization,” said Swami Sivasubramanian, vice president of databases, analytics, and ML at AWS. “But we often hear from customers that it is difficult to strike the right balance between making data discoverable and maintaining control.”
Using the new data management service’s web portal, organizations can set up their own business data catalog by defining their data taxonomy, configuring governance policies and connecting to a range of AWS services (such as Amazon S3 or Amazon Redshift), partner solutions (such as Salesforce and ServiceNow), and on-premises systems, said Sivasubramanian.
ML is used to collect and suggest metadata for each dataset; after catalogs are set up, users can search and discover assets via the Amazon DataZone web portal, examine metadata for context and request access to datasets. The new tool is integrated with AWS analytics services — Amazon Redshift, Amazon Athena, Amazon QuickSight — so that consumers can access them in the context of their data project.
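The cataloging workflow described above can be illustrated with a minimal sketch: a toy catalog that stores metadata per dataset and supports keyword search over names and tags. The class and field names here are hypothetical and unrelated to the actual Amazon DataZone API:

```python
from dataclasses import dataclass, field

@dataclass
class CatalogEntry:
    name: str
    source: str                 # e.g. "s3", "redshift", "salesforce"
    owner: str
    tags: list = field(default_factory=list)

class DataCatalog:
    """Toy business data catalog: register datasets, search by keyword."""

    def __init__(self):
        self._entries = []

    def register(self, entry):
        self._entries.append(entry)

    def search(self, keyword):
        # Match the keyword against dataset names and metadata tags.
        kw = keyword.lower()
        return [e for e in self._entries
                if kw in e.name.lower()
                or any(kw in t.lower() for t in e.tags)]

catalog = DataCatalog()
catalog.register(CatalogEntry("sales_2022", "redshift", "finance",
                              ["revenue", "quarterly"]))
catalog.register(CatalogEntry("web_clickstream", "s3", "marketing",
                              ["events"]))
print([e.name for e in catalog.search("revenue")])
```

In DataZone, the metadata this sketch hardcodes would be collected and suggested by ML, and a search hit would lead to an access request rather than direct reads, but the catalog-then-discover flow is the same.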
As Sivasubramanian put it, the new service “sets data free across the organization, so every employee can help drive new insights to maximize its value.”
Safe data sharing
Similarly, to derive critical insights, organizations often want to complement their data with that of their partners. At the same time, though, they must protect sensitive consumer information and reduce or eliminate the sharing of raw data.
This often means sharing user-level data and then trusting that partners will fully adhere to contractual agreements.
Data clean rooms can help address this challenge, as they allow multiple parties to combine and analyze their data in a protected environment where participants are unable to see each other’s raw data. But clean rooms can be difficult to build, requiring complex privacy controls and specialized data movement tools.
AWS Clean Rooms aims to ease this process. Organizations can now quickly create secure data clean rooms and collaborate with any other company in the AWS Cloud.
According to AWS, customers choose the partners they want to collaborate with, select their datasets, and configure restrictions for participants. They have access to configurable data access controls — including query controls, query output restrictions, and query logging — while advanced cryptographic computing tools keep data encrypted.
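A clean room's core guarantee, that only aggregates over a safe minimum group size leave the room, can be sketched as follows. This is a conceptual illustration of query output restrictions, not AWS Clean Rooms' implementation, and the threshold value is arbitrary:

```python
def clean_room_query(rows, group_key, value_key, min_group_size=3):
    """Return only aggregated results, suppressing groups too small to be safe.

    Neither party sees the other's raw rows; only counts and sums for
    groups at or above the minimum size are released, a typical
    clean-room output restriction.
    """
    groups = {}
    for row in rows:
        groups.setdefault(row[group_key], []).append(row[value_key])
    return {k: {"count": len(v), "total": sum(v)}
            for k, v in groups.items() if len(v) >= min_group_size}

# Conceptually joined data contributed by two partners.
rows = [
    {"region": "west", "spend": 10}, {"region": "west", "spend": 20},
    {"region": "west", "spend": 30}, {"region": "east", "spend": 5},
]
print(clean_room_query(rows, "region", "spend"))
```

Note that the "east" group is suppressed entirely: releasing an aggregate over a single row would reveal that row, which is exactly what the minimum group size prevents.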
“Customers can collaborate on a range of tasks, such as more effectively generating advertising campaign insights and analyzing investment data, while improving data security,” said Dilip Kumar, VP of AWS applications.
Proactively acting on security data
Organizations want to detect and respond to security risks quickly so they can take swift action to secure data and networks. But the data they need for analysis is often spread across multiple sources and stored in a variety of formats.
To ease this process, AWS customers can now use Amazon Security Lake. The service automatically centralizes security data from cloud and on-premises sources into a purpose-built data lake in a customer’s AWS account.
Security analysts and engineers can then aggregate, manage and optimize large volumes of disparate log and event data to enable faster threat detection, investigation, and incident response, according to AWS.
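Security Lake normalizes incoming data to the Open Cybersecurity Schema Framework (OCSF). The gist of such normalization, mapping differently shaped log records onto one common shape so they can be queried together, can be sketched like this; the field names are simplified stand-ins, not actual OCSF attributes:

```python
def normalize_event(raw, source):
    """Map differently shaped log records onto one common schema."""
    if source == "vpc_flow":
        return {"time": raw["start"], "src_ip": raw["srcaddr"],
                "dst_ip": raw["dstaddr"], "action": raw["action"].lower()}
    if source == "firewall":
        return {"time": raw["ts"], "src_ip": raw["client"],
                "dst_ip": raw["server"], "action": raw["verdict"].lower()}
    raise ValueError(f"unknown source: {source}")

# Two records with different native shapes, normalized to one schema.
events = [
    normalize_event({"start": 1700000000, "srcaddr": "10.0.0.1",
                     "dstaddr": "10.0.0.2", "action": "ACCEPT"}, "vpc_flow"),
    normalize_event({"ts": 1700000060, "client": "10.0.0.3",
                     "server": "10.0.0.2", "verdict": "DROP"}, "firewall"),
]
print([e["action"] for e in events])
```

Once every source speaks the same schema, a single query can correlate, say, all denied connections to one destination IP across firewalls and flow logs, which is the faster-investigation payoff the service promises.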
“Customers tell us they want to take action on this data faster to improve their security posture, but the process of collecting, normalizing, storing and managing this data is complex and time consuming,” said Jon Ramsey, vice president for security services at AWS.
Addressing supply chain complexity
In recent years, supply chains have experienced unprecedented supply and demand volatility, a trend amplified by widespread resource shortages, geopolitical instability, and natural events.
Such disruptions put pressure on businesses to plan for potential supply chain uncertainty and respond quickly to changes in customer demand while keeping costs down.
But when businesses inadequately forecast supply chain risks such as component shortages, shipping port congestion, unanticipated demand spikes, or weather disruptions, they can face excess inventory costs or stockouts, which in turn lead to poor customer experiences.

The new AWS Supply Chain helps simplify this process by combining and analyzing data across multiple supply chain systems. Businesses can observe operations in real time, quickly identify trends, and generate more accurate demand forecasts, according to AWS.
“Customers tell us that the undifferentiated heavy lifting required in connecting data between different supply chain solutions has inhibited their ability to quickly see and respond to potential supply chain disruptions,” said Diego Pantoja-Navajas, VP of AWS supply chain.
The new service is based on nearly 30 years of Amazon.com logistics network experience, according to the company. It uses pretrained ML models to understand, extract and aggregate data from ERP and supply chain management systems. Information is then contextualized in real time, highlighting current inventory selection and quantity at each location.
ML insights show potential inventory shortages or delays, and users are alerted when risks emerge. Once an issue is identified, AWS Supply Chain provides recommended actions — moving inventory between locations, for instance — based on percentage of risk resolved, the distance between facilities, and the sustainability impact, according to AWS.
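The ranking described above can be sketched as a weighted score over the three factors AWS names: share of risk resolved, distance between facilities, and sustainability impact. The weights, field names, and scoring formula below are hypothetical, not AWS Supply Chain's actual model:

```python
def rank_transfers(candidates, w_risk=0.6, w_distance=0.3, w_sustain=0.1):
    """Rank candidate inventory moves, best first.

    Each candidate carries: pct_risk_resolved (0-1, higher is better),
    distance_km (lower is better), co2_kg (lower is better).
    """
    # Normalize distance and emissions against the worst candidate so
    # all three factors land on a comparable 0-1 scale.
    max_dist = max(c["distance_km"] for c in candidates) or 1
    max_co2 = max(c["co2_kg"] for c in candidates) or 1

    def score(c):
        return (w_risk * c["pct_risk_resolved"]
                + w_distance * (1 - c["distance_km"] / max_dist)
                + w_sustain * (1 - c["co2_kg"] / max_co2))

    return sorted(candidates, key=score, reverse=True)

candidates = [
    {"from": "DC-Reno", "to": "DC-Austin", "pct_risk_resolved": 0.9,
     "distance_km": 2700, "co2_kg": 1200},
    {"from": "DC-Dallas", "to": "DC-Austin", "pct_risk_resolved": 0.7,
     "distance_km": 300, "co2_kg": 150},
]
print(rank_transfers(candidates)[0]["from"])
```

Here the nearby, low-emission Dallas transfer outranks the Reno one even though it resolves less of the risk, showing how the distance and sustainability terms can override raw risk reduction.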
“As supply chain disruptions continue for the foreseeable future, companies need to stay focused on balancing cost efficiency, sustainability and relevancy across their supply networks to support growth,” said Kris Timmermans, global supply chain and operations lead at Accenture (an AWS Supply Chain customer). “Executing a cloud-based digital strategy can enable an agile, resilient supply chain that is responsive to market changes and customer demands.”
Also this week at AWS re:Invent, AWS announced five new database and analytics capabilities, five new capabilities for its business intelligence tool Amazon QuickSight, and eight new Amazon SageMaker capabilities.
VentureBeat's mission is to be a digital town square for technical decision-makers to gain knowledge about transformative enterprise technology and transact.