

By 2022, more than 14 billion connected devices are expected to be generating a volume of data we could never have imagined. When it comes to Internet of Things (IoT) sensors and data collectors, more is more.

The size of modern datasets can leave leaders of large-scale IoT deployments unsure where to begin analyzing and interpreting data for business benefit. Just like anything else, having a billion of something is helpful only if you know what to do with it.

We throw the word “scalable” around a lot, but at the end of the day, it’s just companies and leaders wrestling with the question: “Will my system or platform be able to handle the data at hand and a projected increase?”

Here are the common challenges faced and what you should look out for when evaluating your platform.


The cardinality challenge

Cardinality is the number of distinct values in a dataset, which can range from as low as two to hundreds of millions.

High cardinality has always been a pain point for data processing, as latency and cardinality are directly correlated in standard databases. As you can imagine, the datasets often seen in large-scale IoT deployments like industrial, manufacturing, or automation scenarios can have extremely high cardinality. Consider, for example, an IoT deployment with 5,000 devices, each having 100 sensors across 100 warehouses, leading to a cardinality of 50 million. In addition, the metadata commonly paired with time series data can quickly fuel this fire.
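The cardinality arithmetic in the warehouse example above can be sketched in a few lines. The figures are the illustrative ones from the article; the key point is that each unique combination of device, sensor and location becomes its own time series, so the counts multiply:

```python
# Illustrative deployment figures from the example above (assumptions, not
# measurements): devices, sensors per device, and warehouses.
devices = 5_000
sensors_per_device = 100
warehouses = 100

# Each unique (warehouse, device, sensor) combination is a distinct time
# series, so the total cardinality is the product of the three counts.
cardinality = devices * sensors_per_device * warehouses
print(cardinality)  # 50000000
```

Note how any additional metadata tag (say, a firmware version with 10 distinct values) would multiply the result again, which is why metadata "quickly fuels this fire."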

To ensure that your systems perform well enough to support the real-time analytics and monitoring that are now essential to industrial use cases, you need to be sure that your database management system will not get bogged down as the cardinality of your data increases. Only systems that can resolve this pain point and guarantee consistent latency for data queries — even as the number of tables in your database increases exponentially — can be considered future-ready and prepared to meet your business needs.

Don’t box yourself in

Developers and data scientists in automation, manufacturing and other parts of the industrial sector are constantly evolving, and so should the technology that powers their enterprises. The most powerful lesson that industrial leaders can take away is choosing to be agile. Nothing hurts like taking apart an architecture or infrastructure that ceases to serve the company’s needs or, worse, being locked into a system that prevents you from moving forward.

Because the data is so complex, platforms should be user-friendly. Your data platform should simplify your business, not add another layer of complications. It’s also valuable to look at open-source projects that don’t tie you to a specific vendor or service provider or box you out with legacy constraints. And because the data is infinite, choosing a cloud-native system is the most beneficial way to stay agile. The cloud — whether public, private or hybrid — is the future that allows you to utilize elastic storage, computing and network resources.

How to shop for scale in an expanding IoT

Because there are technical challenges to scaling for IoT, leaders should have a strategic vision of the capabilities they will need. There are several things to consider.

First, the platform’s foundation needs a simple architecture to reduce maintenance costs and be able to scale to meet projected business growth. For IoT especially, the platform must be able to ingest millions of data points rapidly, enabling solid data analytics with SQL support. In addition, the system should have outstanding concurrency support since more and more users will access the system for data analytics, including batch and real-time data analysis.
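The ingestion pattern described above — batching writes rather than committing point by point, then running standard SQL analytics over the result — can be sketched as follows. This is a minimal illustration using SQLite as a stand-in for a purpose-built time-series database; the schema and batch size are assumptions for the example, not recommendations:

```python
import random
import sqlite3
import time

# SQLite stands in here for a real time-series database; the principle of
# batched ingestion followed by SQL analytics is the same.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE readings (ts REAL, device_id INTEGER, sensor_id INTEGER, value REAL)"
)

# Build one batch of simulated sensor readings.
BATCH_SIZE = 10_000
batch = [
    (time.time(), random.randrange(5_000), random.randrange(100), random.random())
    for _ in range(BATCH_SIZE)
]

# Ingest the whole batch in one transaction: amortizing per-commit overhead
# across thousands of rows is what makes high ingest rates feasible.
with conn:
    conn.executemany("INSERT INTO readings VALUES (?, ?, ?, ?)", batch)

# Standard SQL analytics over the ingested points.
count, avg_value = conn.execute(
    "SELECT COUNT(*), AVG(value) FROM readings"
).fetchone()
print(count)  # 10000
```

Concurrency support then means many users can run queries like the final `SELECT` simultaneously, on both freshly ingested and historical data, without degrading ingest throughput.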

Scaling must be done gradually, not by flipping a switch. The architecture not only needs to handle the data; it needs to handle it well. That means every component — connectivity, processing, storage and organization — must be firing on all cylinders before a scaling project can be called a success.

Leadership involves taking charge not only of the issues of today but being prepared for the challenges of tomorrow. To guide your enterprise forward, you need systems and architectures that can grow with your business and support your products now and in the future.

Jeff Tao is founder and core developer of TDengine.
