
In 2018, a group of organizations from all over the world will begin construction of the largest radio telescope ever built: the Square Kilometre Array (SKA).

With one million square meters of collecting area and enough optical fiber to wrap around the Earth twice, this marvel of modern engineering will be sensitive enough to detect airport radar on a planet 50 light years away. SKA will also generate 700 terabytes of data every second, equivalent to roughly 35 times the data stored in the Library of Congress. At full capacity, the SKA’s aperture arrays are expected to produce 100 times more data than the entire Internet. It doesn’t take a rocket scientist to realize that such a deluge of information creates a big data problem, perhaps the biggest we have ever encountered.
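The scale of these figures is easier to grasp with a quick back-of-the-envelope check. The 700 TB/s rate is the figure quoted above; the roughly 20 TB size for the Library of Congress is an assumption implied by the 35× comparison, not an official number:

```python
# Back-of-the-envelope scale check for SKA's quoted data rate.
# 700 TB/s is the rate cited in the article; the ~20 TB Library of
# Congress estimate is simply what the 35x comparison implies.

TB = 10**12   # bytes in a terabyte (decimal)
EB = 10**18   # bytes in an exabyte

ska_rate = 700 * TB                    # bytes generated per second
library_of_congress = ska_rate // 35   # implied LoC size: 20 TB

seconds_per_day = 86_400
daily_volume_eb = ska_rate * seconds_per_day / EB

print(f"Implied Library of Congress size: {library_of_congress // TB} TB")
print(f"Data generated per day: {daily_volume_eb:.2f} EB")
# -> Implied Library of Congress size: 20 TB
# -> Data generated per day: 60.48 EB
```

At that rate the array would produce more data in a single day than the entire Internet moved in its early decades, which is what makes the storage and processing problem so acute.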

Solving this big data problem for the space industry requires innovation in data storage, processing, and access (or visualization) technologies, which, in turn, creates ample opportunities for startups and large data-crunching companies.

A few major factors will drive exponential growth in the volume of data falling on us from the skies over the next couple of decades: the increasing pace of commercial satellite deployment, the implementation of faster communication technology, and the onset of interplanetary missions.

The Growing “Orbital Economy” and Deep Space Exploration

The dwindling cost of launches and the democratization of the satellite market are going to result in unprecedented growth of orbital activity. Based on announced plans by various companies and space programs, between 2,000 and 2,750 cube- and nano-sats will be launched by the end of this decade (the Goddard Space Flight Center lists 2,271 satellites currently in orbit). Most of the new spacecraft will have commercial applications, particularly in Earth observation. Earth observation means images and video, often multi-spectral or even 3D, which are some of the heaviest “packages” in terms of data volume.

Historically, the single largest barrier keeping the space data floodgates closed has been the ability to transmit the collected information back to Earth. Most current space missions use radio frequency to transfer data, which is a relatively slow approach: NASA’s typical deep space explorer sends back data on the order of megabytes per second, while Earth-orbiting spacecraft typically manage gigabytes per second. In the future, however, the space industry is expected to start switching to a new type of optical (or laser) communications that will significantly increase download speeds and could mean a 1,000-fold surge in the volume of data.
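A rough sketch shows what a 1,000-fold jump means in practice. The 10 MB/s radio-frequency baseline below is an illustrative round number, not a figure from any specific mission; the 1,000× multiplier is the one cited above:

```python
# Illustrative downlink-time comparison: radio frequency vs. optical.
# The 10 MB/s RF baseline is an assumed round number for illustration;
# the 1,000x optical speedup is the multiple cited in the text.

MB = 10**6
TB = 10**12

payload = 1 * TB                 # one terabyte of imagery to downlink
rf_rate = 10 * MB                # assumed RF link: 10 MB/s
optical_rate = rf_rate * 1000    # projected optical link

rf_hours = payload / rf_rate / 3600
optical_seconds = payload / optical_rate

print(f"RF downlink: {rf_hours:.1f} hours")                # ~27.8 hours
print(f"Optical downlink: {optical_seconds:.0f} seconds")  # 100 seconds
```

The same terabyte of imagery that ties up a radio link for more than a day comes down in under two minutes, which is why optical links are expected to open the floodgates.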

In the last few years, both national space programs and private companies have made a number of big announcements regarding their plans for ambitious interplanetary missions: China is reportedly plotting a moon colony, SpaceX is well on track for a manned mission to Mars — especially given the latest contract award from NASA — and Planetary Resources is planning to prospect and mine near-Earth asteroids for water and platinum-group metals by the end of this decade. (Disclosure: Planetary Resources is one of my portfolio companies.)

Eric Anderson, the co-founder of Planetary Resources, estimates that the “planetographic” data available just in our own solar system dwarfs the amount of geographical data we have on Earth by three orders of magnitude.

Data Storage and Management

Amazon and NASA have recently launched the NASA Earth Exchange (NEX) platform, a collaboration and analytical tool that combines state-of-the-art supercomputing, Earth system modeling, workflow management, and NASA remote-sensing data. With NEX, users can explore and analyze large Earth science data sets, run and share modeling algorithms, collaborate on new or existing projects, and exchange workflows and results within and among other science communities. For now, NEX works primarily with data sets for climate, vegetation, and the Landsat global land survey. However, the platform ultimately serves as a strong showcase for what cloud computing technologies can do for the space industry.

In the meantime, we see a number of players testing new business models by bringing the concepts of the sharing economy into the geo-business: mobilizing underused assets (satellite constellations, UAVs, and other aerial imaging platforms) and essentially creating a new revenue channel for data owners. The concepts of a “virtual satellite constellation” and a “geo-AppStore” are becoming more and more of a reality. In the past year, we have seen a number of cloud-based platforms, such as ArcGIS by Esri and the CloudEO Store, that bring together data providers, software developers, and service providers in an online marketplace where customers can search for geospatial products to fit their needs in secure SaaS-based environments. (Disclosure: CloudEO is one of my portfolio companies.)

Even hardware innovators are recognizing the importance of opening up their platforms to greater collaboration. Silicon Valley-based Planet Labs, which raised more than $60 million from groups like DFJ, OATV, and Yuri Milner, is promising to release its developer API (application programming interface) by the end of this year.


Visualization is the other important aspect of making geospatial data useful to the end customer. Whether you are a farmer looking to assess how soil moisture content affects vegetation levels across your fields or a government agency trying to identify deforestation patterns and illegal logging operations, the way data is analyzed and presented can be pivotal to the end result.
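For the farmer’s use case above, one standard analysis is the Normalized Difference Vegetation Index (NDVI), computed per pixel from a satellite image’s near-infrared and red bands. A minimal sketch follows; the reflectance values are invented sample pixels, not real imagery:

```python
# NDVI (Normalized Difference Vegetation Index): a standard measure of
# vegetation health computed from near-infrared (NIR) and red reflectance.
# NDVI = (NIR - Red) / (NIR + Red); values near 1 indicate dense, healthy
# vegetation, while values near 0 indicate bare soil or rock.
# The pixel values below are invented sample reflectances, not real data.

def ndvi(nir: float, red: float) -> float:
    """Return NDVI for one pixel; 0.0 if both bands are zero."""
    denom = nir + red
    return (nir - red) / denom if denom else 0.0

# Sample pixels: (NIR reflectance, red reflectance)
pixels = [(0.50, 0.30), (0.45, 0.08), (0.20, 0.18)]

for nir, red in pixels:
    print(f"NIR={nir:.2f} Red={red:.2f} -> NDVI={ndvi(nir, red):+.2f}")
```

A real pipeline would run this band math over millions of pixels per scene, then render the result as a color-coded field map, which is exactly the kind of analysis-plus-presentation workflow the platforms above compete on.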

Spanish startup CartoDB recently offered a unique approach to visualization. Instead of focusing on the base maps like Google Earth does, it focuses on the data and application layers on top. Moreover, by using an open-source approach, CartoDB has attracted more than 50,000 users to its platform, and they are constantly contributing to the quality and quantity of available data and applications on the platform. The result has been thousands of beautiful maps that are useful across a number of industries, from real estate and banking to healthcare and natural resources. Investors showed their confidence in the company’s approach with an $8 million Series A round earlier this month.

The market for geographic information systems (GIS) is estimated at $2.5 billion, the data visualization market stands at $4.2 billion, and location-based services stand at $7.5 billion. No wonder Google has been actively building on top of its platform by acquiring complementary assets such as Skybox Imaging and Titan Aerospace earlier this year. By combining satellite and drone imagery with its computing power and content delivery capabilities, Google has a chance to build the first fully vertically integrated GIS service and perhaps take the Google Earth platform live someday.

While it does seem more glamorous to be launching rockets and building space stations, the truth of the matter is that major dollars will still be made on Earth by data crunchers converting space bytes into beautiful maps and infographics that any one of us can use.

Ilya Golubovich is Managing Partner at I2BF Global Ventures.
