Today people all around the world will be celebrating the 50th anniversary of one of humanity’s greatest technological achievements: landing on the Moon.

Technology has undergone immense change since 1969. The computer systems and software that took Neil Armstrong, Buzz Aldrin, and Michael Collins to our nearest celestial neighbor pale in comparison to the smartphones we carry around in our pockets today.

Fifty years on, as we set our sights on a return to the Moon, as well as future human spaceflight to Mars and beyond, what are the innovations that will get us there?

Openness and collaboration

Research institutions and national labs across the globe are pouring hundreds of thousands of research hours into every conceivable aspect of space science. And, overwhelmingly, the high performance computing (HPC) systems used for this research run open source software.

In fact, 100% of the current TOP500 supercomputers run on some form of Linux.

Therefore, it’s likely that the future of space exploration will be built on the open source philosophy of knowledge sharing and collaboration among researchers and developers. Success will depend on the adoption of open technologies to stimulate collaboration among nations, as well as advances in the field of AI and machine learning.

Although these are ambitious objectives that could take several years to fully realize, we are already seeing great progress: open source software is already running in space, AI and machine learning are being used in spacecraft communications and navigation, and the number of commercial companies interested in the space economy is growing.

The Spaceborne computer example

It is probably no coincidence that HPE selected its own high-density Apollo servers when building the International Space Station’s Spaceborne Computer in 2017, bringing one teraflop of computational power from Earth to space. The Spaceborne Computer is an off-the-shelf system installed in a special enclosure; it uses non-hardened hardware and software and is controlled by an open source operating system.

Computing systems on spacecraft are usually highly specialized and specifically hardened to protect against cosmic rays, g-forces, and other environmental hazards. However, since the first manned spaceflight in 1961, opinions have gradually changed regarding how much hardening and protection the hardware really needs. If humans can sustain severe environmental changes for extended periods of time, shouldn’t computer hardware be able to as well?

HPE and NASA originally planned Spaceborne’s mission to be a year-long experiment, which is approximately the amount of time it would take a spacecraft to reach Mars. The goal was to run compute- and data-intensive applications in the hostile environment of outer space and determine the effects of solar radiation on the systems while they were running. On June 4 this year, after spending 615 days on board the ISS and traveling nearly 228 million miles, the Spaceborne Computer was successfully returned to Earth by a SpaceX Dragon spacecraft.

The outcomes of the Spaceborne project will help scientists find new ways of using off-the-shelf hardware in space without the need for expensive and bulky protective shielding or other hardening techniques. Spaceborne’s success also suggests that commodity computers running standard operating systems and software could be used on missions that transport humans to Mars. These machines could then be delivered to the surface of the Red Planet and deployed by scientists and ground personnel to conduct research and experiments.

Open source hardware and infrastructure

We could speculate that computer hardware will follow the same pattern as software, and that open source design principles, like those used in RISC-V, will help create the processors that serve as the brains of a spacecraft or landing module.

Lowering the barrier to entry for electronic design is one of the main goals of an initiative launched by DARPA, which aims to cost-share research with the microelectronics community and usher microsystems into a new age of innovation. In doing so, DARPA is helping, to a certain extent, to open-source hardware designs.

Similarly, we need a fundamental shift in the way we approach computing infrastructure — just like commoditization and standardization transformed supercomputers from proprietary to more open designs.

Some of the computer systems on board the International Space Station are 20 to 25 years old. Once they are in space, they usually stay there. The computers we use on Earth today are thousands of times more powerful than the computers running in space.

This is where, for extended space missions, the idea of composable infrastructure becomes very interesting. Composable infrastructure treats compute, storage, and network devices as pools of resources that can be provisioned as needed and in real time, depending on what different workloads require.

The approach is not unlike a public cloud in that resource capacity is requested and provisioned from a shared pool. However, composable infrastructure sits on-premises in an enterprise data center. Or, in this case, on board the spacecraft.

As you take a spacecraft from Earth to orbit, and then on to distant planets, its purpose and computational needs change progressively. For instance, in a colonization effort such as Mars One, once the module lands, it is not going to leave Mars. Onboard computing systems therefore need to take the form of a “portable cloud” that is self-aware and able to intelligently reconfigure itself down to basic elements like CPUs, memory, and storage. They also need to run general-purpose operating systems and orchestration software.
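
To make the idea more concrete, here is a minimal sketch, in Python, of how such a composable pool might be provisioned in flight and then recomposed after landing. Everything in it (the ComposablePool class, the resource figures, and the workload names) is hypothetical and invented purely for illustration; it does not represent any real orchestration API.

```python
# Hypothetical sketch of a composable resource pool that reallocates
# compute, memory, and storage as a mission moves through phases.
# All class, workload, and resource names are invented for illustration.

from dataclasses import dataclass, field


@dataclass
class ComposablePool:
    """A shared pool of hardware resources that workloads draw from."""
    cpus: int
    memory_gb: int
    storage_tb: int
    allocations: dict = field(default_factory=dict)

    def provision(self, workload: str, cpus: int, memory_gb: int, storage_tb: int) -> None:
        """Carve out resources for a workload if the pool can satisfy the request."""
        if cpus > self.cpus or memory_gb > self.memory_gb or storage_tb > self.storage_tb:
            raise RuntimeError(f"not enough free resources for {workload}")
        self.cpus -= cpus
        self.memory_gb -= memory_gb
        self.storage_tb -= storage_tb
        self.allocations[workload] = (cpus, memory_gb, storage_tb)

    def release(self, workload: str) -> None:
        """Return a workload's resources to the pool, e.g. when a mission phase ends."""
        cpus, memory_gb, storage_tb = self.allocations.pop(workload)
        self.cpus += cpus
        self.memory_gb += memory_gb
        self.storage_tb += storage_tb


# In transit, most capacity goes to navigation and life support.
pool = ComposablePool(cpus=64, memory_gb=512, storage_tb=100)
pool.provision("navigation", cpus=32, memory_gb=256, storage_tb=10)
pool.provision("life-support", cpus=16, memory_gb=128, storage_tb=5)

# After landing, the same hardware is recomposed for surface science.
pool.release("navigation")
pool.provision("surface-science", cpus=40, memory_gb=300, storage_tb=60)
print(pool.allocations)
```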

Extending human capabilities with AI

The most valuable currency in the world for humans is time, especially when it comes to solving problems. Machine learning and artificial intelligence are transforming industries by allowing humans to spend their time focusing on high-value problems.

In space, these technologies could truly be transformative, as computers could collect, analyze, and act upon data acquired during flight without having to involve a human. It is incredibly expensive to send someone into space — NASA announced in June that it would open the International Space Station to private individuals at roughly $35,000 per night per astronaut (in addition to the cost of the flight).
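
As a toy illustration of what acting on flight data without a human in the loop might look like, the Python sketch below screens telemetry readings against a rolling baseline and triggers a local response when something drifts out of range. The sensor name, thresholds, and the respond() hook are all assumptions made up for this example, not any real flight-software interface.

```python
# Toy sketch of onboard telemetry screening: flag readings that drift far
# from the recent baseline without waiting for instructions from the ground.
# The sensor name, thresholds, and respond() hook are invented for illustration.

from collections import deque
from statistics import mean, stdev


class AnomalyMonitor:
    def __init__(self, window: int = 50, threshold_sigma: float = 3.0):
        self.history = deque(maxlen=window)     # most recent readings
        self.threshold_sigma = threshold_sigma  # how far from baseline counts as anomalous

    def observe(self, value: float) -> bool:
        """Return True if the new reading deviates strongly from the rolling baseline."""
        anomalous = False
        if len(self.history) >= 10:
            mu, sigma = mean(self.history), stdev(self.history)
            if sigma > 0 and abs(value - mu) > self.threshold_sigma * sigma:
                anomalous = True
        self.history.append(value)
        return anomalous


def respond(sensor: str, value: float) -> None:
    """Placeholder for an autonomous action, e.g. throttling a subsystem or raising an alert."""
    print(f"anomaly on {sensor}: {value:.2f} (handled locally, no ground contact required)")


monitor = AnomalyMonitor()
readings = [20.0 + 0.1 * i for i in range(60)] + [35.0]  # slow drift, then a sudden spike
for reading in readings:
    if monitor.observe(reading):
        respond("cabin_temperature", reading)
```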

If you take away the need for a computer technician or engineer (one of the many hats an astronaut needs to wear) through composable infrastructure and AI, you make room for more specialists instead of expecting astronauts to be jacks-of-all-trades. From a mission perspective, that means you can send more explorers and scientists with essential skills to colonize Mars, for example.

Embracing innovation and breaking away from tradition is what helped humanity land on the Moon. Getting to the Red Planet and beyond will require a fundamental shift in the way we deploy off-the-shelf, modular, and self-learning computer infrastructure. We will need to re-evaluate the way we design software and hardware. And an entire ecosystem of companies will need to work together to push the very limits of what is currently thought possible. It sounds like a tall order, but it was this spirit, this willingness to push the boundaries, that made the Apollo missions a success 50 years ago.

Yan Fisher is Global Evangelist, Emerging Technologies, Red Hat.