Startups and tech giants alike are vying for a slice of the burgeoning internet of things (IoT) market, and Amazon is in pole position with an estimated 34% of IoT developer market share. Its lengthy list of IoT services includes IoT Core, which lets connected devices interact with cloud apps, and IoT Greengrass, which extends Amazon Web Services to edge devices so they can act locally on the data they generate. There’s also the analytics service IoT SiteWise; the application builder IoT Things Graph; and the cybersecurity suite IoT Device Defender, to name a few others.
To get a sense of the IoT landscape through Amazon’s lens just over two months out from the company’s annual AWS re:Invent conference, we spoke with CTO Werner Vogels earlier this week in a phone interview. Conversation topics ranged from the challenges involved in device deployment to the privacy concerns that arise as data from IoT devices is collected and processed.
Here’s a transcript of our interview, which has been edited for length and clarity.
VentureBeat: Could you talk about the state of the IoT space today and why it’s such an important part of AWS’ business? It’d be great if you could address the hybrid cloud paradigm, some interesting use cases there, or relevant AWS services and customers you’d like to highlight.
Werner Vogels: Many of our customers literally deploy hundreds of thousands of sensors. Woodside, a large energy company in Australia, has 200,000 sensors to support, [each of which] generates huge amounts of data. [Some are on] drilling platforms hundreds of miles out to sea, where connectivity is not always stable.
These scenarios often call for IoT Greengrass, our IoT environment that can operate independently of the cloud. [Customers like Woodside] … can with AWS not only observe what’s happening now, but predict what’s going to happen a week in advance. For them, it’s really important to be able to anticipate maintenance on those manufacturing operations.
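The edge independence Vogels describes comes down to a store-and-forward pattern: buffer readings locally while the cloud link is down, then flush them when it returns. The sketch below is an illustrative toy, not the Greengrass API; the `EdgeBuffer` class, its 10,000-reading bound, and the injected `publish` callback are all assumptions for the example.

```python
import collections

class EdgeBuffer:
    """Store-and-forward sketch: readings are buffered locally while the
    cloud link is down and flushed in order once it returns."""

    def __init__(self, maxlen=10_000):
        # Bounded queue: if an outage outlasts local storage,
        # the oldest readings are dropped first.
        self.pending = collections.deque(maxlen=maxlen)

    def record(self, reading, cloud_up, publish):
        if cloud_up:
            # Drain the backlog first so readings arrive in order.
            while self.pending:
                publish(self.pending.popleft())
            publish(reading)
        else:
            self.pending.append(reading)
```

On a real device, `cloud_up` would be a connectivity probe and `publish` an MQTT publish call; both are injected here so the buffering logic stands alone.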
In these IoT scenarios, it’s not just a matter of IoT — it’s IoT plus intelligent processing, so that machine learning can be applied to get insights that improve safety and efficiency. A lot of processing happens in the cloud because most of [AI model training] is very compute-intensive, but processing often happens at the edge as well.
If you look at the Amazon.com fulfillment centers, for example, we have over 200,000 Kiva robots running around. They can’t always rely on centralized control to steer them; they need to be autonomous and operate by themselves.
Massive, heavy compute will [have a place] in the cloud for model training and things like that. However, those workloads aren’t real-time critical most of the time. For real-time critical operations, models must be moved onto edge devices.
VentureBeat: I’m glad you mentioned the robotics use case because AWS RoboMaker [Amazon’s cloud robotics service for deploying and managing intelligent machines] has gained quite a lot of traction in just a few years. And Amazon internally has robots — fulfillment center robots, as you mentioned, but also Amazon Prime Air drones and even Scout.
Vogels: Yeah, the drones are a really good example. Amazon drones have a diversity of sensors because, as it turns out, certain [objects] can’t be detected with sonar. They need to be able to operate in completely autonomous mode — to arrive in somebody’s backyard in the place where they should be landing and detect potential hazards on device rather than in the cloud.
VentureBeat: You noted a second ago that some workloads have to be performed in the cloud because of the amount of data involved. AWS a while ago announced a product called Inferentia, an inference chip that delivers high inferencing performance and supports AI frameworks like Google’s TensorFlow and Facebook’s PyTorch. Can you talk about scenarios where a customer might want to use Inferentia?
Vogels: Advancements in AI [research] have maintained lockstep with the development of [kits] and devices that can accelerate model execution, but we’ve also seen significant investments in software. For instance, AWS recently announced SageMaker Neo, which is targeted toward IoT devices that have a much smaller memory footprint.
It’s a combination of software and hardware that will [advance the state of the art]. Inferentia will play a role in that, but I think software like SageMaker Neo will drive things forward as well.
VentureBeat: That’s a great segue into the next topic I’d like to discuss, which is data privacy. Edge computing is one way to ensure a level of privacy, depending on the application and data involved. How can the average business ensure that data isn’t transmitted to a server people don’t want it transmitted to?
Vogels: I always consider security on the one hand and privacy on the other. Privacy — what is acceptable to share, what is not acceptable to share — is often much more of a societal or individual decision.
On the subject of security, AWS IoT Device Defender is a platform dedicated to managing the security capabilities of devices and the environment around them. That includes device encryption and data encryption, which I think are crucial. Corporations should have full control over where their devices can communicate … and make sure that their devices have strong identity.
The scenarios that we’ve seen in the past year — [compromised] home automation devices running a very open version of Linux — are scenarios that absolutely should not happen. That’s why Amazon FreeRTOS, our IoT operating system, offers very strong identity and encryption in combination with IoT Device Defender to provide control over where data can flow and where it can’t.
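The "strong identity" Vogels mentions typically means a per-device X.509 certificate presented over mutual TLS. A minimal sketch with Python's standard `ssl` module follows; the file paths are hypothetical, and a real AWS IoT deployment would use the AWS device SDKs rather than hand-built contexts.

```python
import ssl

def device_tls_context(ca_path, cert_path, key_path):
    """Mutual-TLS sketch: the device verifies the broker against a known
    CA and presents its own certificate/key pair as its identity."""
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    ctx.load_verify_locations(ca_path)        # trust only this CA for the broker
    ctx.load_cert_chain(cert_path, key_path)  # the device's unique identity
    ctx.verify_mode = ssl.CERT_REQUIRED       # refuse unauthenticated peers
    ctx.check_hostname = True                 # endpoint name must match its cert
    return ctx
```

`PROTOCOL_TLS_CLIENT` already defaults to certificate verification; the explicit settings document intent. A device without a valid key pair simply cannot complete the handshake, which is the property Device Defender builds on.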
VentureBeat: But is it fair to say that, were a company like Amazon to deploy, for example, an on-device English language model to Echo devices, it’d be a boon for privacy because processing would happen locally instead of in the cloud?
Vogels: Consumer devices need very strong controls. In the Echo case we’re talking about, there is a mute button on top to disable microphones. With that, we need to make sure that the device has very limited capabilities in the data sense. We need to make sure it only listens for the wake word, and that the data that’s being collected and processed to improve the device is what the consumer wants to share.
It’s not just a matter of developers operating ethically or … things like that. Customers need to have control.
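The local wake-word gating described above can be modeled as a simple filter: nothing captured before the wake word ever leaves the device. This is an illustrative sketch only; the word list and text-based matching are assumptions (a real device matches on audio features, not transcribed words).

```python
WAKE_WORDS = {"alexa"}  # hypothetical wake-word list for the sketch

def gate_audio(words):
    """Return only what was said after the wake word; everything before
    it (or everything, if no wake word occurs) stays on the device."""
    for i, word in enumerate(words):
        if word.lower() in WAKE_WORDS:
            return words[i + 1:]
    return []  # no wake word heard: nothing is transmitted
```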
VentureBeat: Privacy is an important part of machine learning model training — not just inference at the edge, but training the models that run at the edge. Could you talk about Amazon’s approach with respect to privacy-preserving techniques? Do you offer AWS customers services that take advantage of, say, federated learning?
Vogels: At Amazon from day one, security and privacy have always been a [top] concern. There’s no business without security to protect customers’ data, and we have very strong controls [around this]. Your data is your data, and we operate internally with a least-privilege model, so we’re continually reviewing developers’ permissions to determine the minimum set of privileges they actually need to do their job. No engineer has old-fashioned root privileges — they have limited or no access to customer data.
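The least-privilege model reads naturally as default-deny: an action is permitted only if it appears in a principal's explicitly granted minimum set. A toy sketch follows; the grant table and action names are hypothetical, and this is not how AWS IAM actually evaluates policies.

```python
def is_allowed(grants, principal, action, resource):
    # Default deny: permitted only if the (action, resource) pair is in
    # the principal's explicitly granted minimum set.
    return (action, resource) in grants.get(principal, set())

# Hypothetical per-engineer minimum grant sets.
GRANTS = {
    "alice": {("logs:Read", "service-x")},
}
```

Trimming a grant set shrinks what an engineer can touch without changing any code paths, which is what makes continual permission review practical.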
There’s a whole different area that’s still very much in the research stage at AWS and Amazon, and that’s [identifying] bias in machine learning. We want to move this area forward so our customers can ensure that both their data and models are fair. On a related note, there are GDPR conditions where customers can remove their data not only from storage, but also from models that have been built using that data.
VentureBeat: Right, and it depends on the type of data we’re talking about — you’d want particularly strong controls around health care data, for example. Transitioning a bit, I’d like to talk about connectivity, which is another important piece of the IoT puzzle. Amazon not long ago announced Sidewalk, a project to develop a wireless protocol that’s low power and low bandwidth but high range. Clearly you as a company are invested in this — could you explain its importance?
Vogels: We have nothing to announce, but let me take another angle there without talking about protocols. One of the enabling technologies in IoT is 5G. That’s not only because of its higher speeds, but because of the massively parallel device management it makes possible.
From my point of view, the important part isn’t necessarily the better bandwidth, but the fact that you can have so many more devices connected while maintaining that bandwidth. With the advent of 5G, I think what you’ll see is that many more concurrent connections can be maintained. And for all the smart IoT operations that have been built or are being built, that’s extremely important.
VentureBeat: We’re running up against time here, but I did want to ask about AWS’ developer hardware business — specifically kits like AWS DeepLens. Is this an area AWS still considers critical? It seems to be a burgeoning field, what with Google’s Coral and Nvidia’s Jetson Nano.
Vogels: Absolutely. We strongly believe in the notion that builders build, and for that they need to have the capabilities to build. Having a nicely self-contained device that easily connects to SageMaker is an enabler for customers to start focusing on the algorithms they want to build or the data they want to process, and what we’ve seen is massive innovation happen.
We’re looking at builders to make sure they have the right tools — not necessarily to build a production system, but to become really familiar with the capabilities. That’s the whole story behind SageMaker. It gave developers a really good machine learning pipeline that didn’t exist before.
The same is true of DeepRacer, the driverless car racing competition we’ve been running for a year. It’s all about reinforcement learning — basically, how the machinery itself decides to balance long- and short-term goals. Most developers don’t have access to an autonomous car, but a really small car can really help them think about what data they need. We also built a simulator, so that they can get these same capabilities … without having to procure a real-world track.
DeepRacer is important for another reason — autonomous cars are a scenario where compute is shifting from the cloud to the edge. It can’t always be assumed that autonomous cars will maintain a connection to the cloud, because that would risk lives.