
AI Technologies in Space

As if space missions are not exciting enough, enter Artificial Intelligence! Artificial Intelligence (AI) has already demonstrated significant advantages in faster autonomous decision-making, broader information (Big Data) analysis, and more rapid trend analysis. Add neural networks and machine learning (ML) to the increasing number of satellite sensors and the complexity of satellite missions, and it is easy to imagine AI’s inevitability in space. We are, in fact, at the cusp of an AI revolution in space.

Why AI – Increased Data

Space sensors, like cameras and imagers, are developing in two ways: they are becoming smaller without losing functionality, and they are streaming more data at higher data rates. Where 25 years ago a commercial satellite might have had a visible camera and an infrared (IR) imaging sensor taking still pictures, today it is common for a commercial satellite to have at least three streaming imaging sensors operating in the ultraviolet (UV), visible and IR spectrums.

Because there are more options, satellite operators have added data compression, processing, and transmission algorithms that select combinations of imaging sensors to create a composite image onboard the satellite. The commands that control the satellite are not instantaneous, however, and in some cases require waiting until the next orbit – a minimum of 25 to 30 minutes.

AI will enable autonomous satellite operations. The satellite operator can tell the satellite what they are interested in and let the satellite analyze its own sensor data to determine the best camera pointing and combination of spectral bands. The result will be a composite image or video that best fits all of the operator’s requirements.
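To make this concrete, the sketch below shows what such onboard selection logic could look like: the operator supplies per-band priorities, and the satellite weights each co-registered spectral frame by that priority and an onboard quality estimate before compositing. The band names, the variance-based quality metric, and the function names are illustrative assumptions, not Aitech’s actual flight software.

```python
# Hypothetical sketch of onboard band selection: score each spectral band
# against an operator tasking and build a weighted composite frame.
# Band names, weights, and the quality heuristic are illustrative only.
import numpy as np

def sharpness(frame: np.ndarray) -> float:
    """Rough spatial-detail metric: variance of pixel intensities."""
    return float(frame.var())

def select_composite(frames: dict[str, np.ndarray],
                     tasking: dict[str, float]) -> np.ndarray:
    """Weight each co-registered band by operator interest x image quality."""
    weights = {}
    for band, frame in frames.items():
        interest = tasking.get(band, 0.0)   # operator-supplied priority
        quality = sharpness(frame)          # onboard quality estimate
        weights[band] = interest * quality
    total = sum(weights.values()) or 1.0
    # Normalized weighted sum of the co-registered frames
    return sum((w / total) * frames[b] for b, w in weights.items())

# Example: the operator cares most about IR (e.g., storm structure at night)
frames = {b: np.random.rand(512, 512) for b in ("uv", "visible", "ir")}
composite = select_composite(frames, tasking={"uv": 0.1, "visible": 0.3, "ir": 0.6})
```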

Space Data Put into Practice

Hurricane from Space

Some examples include focusing on the eye of a hurricane and measuring the wind speed at several distances from the eye, or observing how ships move across the ocean so their routes can be optimized for winds and currents along the way. Another use of AI technology in space is producing the best possible imagery, video and renderings to observe devastating monsoon flooding in real time in countries that lack robust communications infrastructure.
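As a simple illustration of the hurricane example, the following sketch averages wind speed in concentric rings at increasing distances from the eye, given a gridded wind-speed field derived from satellite imagery. The data source, grid resolution and function names here are assumptions for illustration only.

```python
# Hypothetical sketch: mean wind speed in rings around the hurricane eye,
# given a gridded wind-speed field and the eye's pixel location.
import numpy as np

def radial_wind_profile(wind_speed: np.ndarray,
                        eye_rc: tuple[int, int],
                        ring_width_px: int = 10,
                        n_rings: int = 8) -> list[float]:
    """Average wind speed in concentric rings centered on the eye."""
    rows, cols = np.indices(wind_speed.shape)
    dist = np.hypot(rows - eye_rc[0], cols - eye_rc[1])   # distance in pixels
    profile = []
    for i in range(n_rings):
        ring = (dist >= i * ring_width_px) & (dist < (i + 1) * ring_width_px)
        profile.append(float(wind_speed[ring].mean()) if ring.any() else float("nan"))
    return profile

# Example with synthetic data standing in for imagery-derived wind speeds (m/s)
field = np.random.rand(256, 256) * 20 + 30
print(radial_wind_profile(field, eye_rc=(128, 128)))
```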

Take this to the next level, critical military operations for example, and we’re talking about a satellite swarm organized to autonomously render both friendly and enemy movements as 3D video. Not only can AI maintain the relative positions of each satellite in real time, but the AI engines in the swarm can also autonomously reconfigure a specific satellite’s mission profile to adapt to changing mission requirements. AI empowers the satellite operator to enhance the mission objectives.

AI Enabling Technologies

Machine learning and high-performance computing are converging to enable AI. Performing these compute-intensive functions in real time takes enormous processing power, so general-purpose computing on graphics processing units (GPGPU) is typically used for AI applications. The first terrestrial GPGPU implementations became available in about 2001 and employ high-performance programming interfaces such as OpenGL, DirectX and NVIDIA’s CUDA. Since then, GPGPUs have exploded in functional capability.
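As a simple illustration of GPGPU offload, the sketch below blends two simulated sensor frames on the GPU using the CuPy library on a CUDA-capable device; it is a generic example, not Aitech flight software.

```python
# Minimal GPGPU illustration using CuPy (assumes a CUDA-capable GPU);
# generic example only, not flight code.
import numpy as np
import cupy as cp

# Simulated sensor frames in host (CPU) memory
visible = np.random.rand(2048, 2048).astype(np.float32)
infrared = np.random.rand(2048, 2048).astype(np.float32)

# Move the data to the GPU and blend the two bands there
vis_gpu = cp.asarray(visible)
ir_gpu = cp.asarray(infrared)
composite_gpu = 0.6 * vis_gpu + 0.4 * ir_gpu   # elementwise math runs as CUDA kernels

# Copy the result back to host memory for downlink or further processing
composite = cp.asnumpy(composite_gpu)
```

The same offload pattern scales to much heavier workloads, such as ML inference, which is where the parallelism of hundreds of CUDA cores pays off.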

NASA’s Low-Earth Orbit Flight Test of an Inflatable Decelerator (LOFTID) is one example of these modern space advancements. The successful LOFTID mission demonstrated an inflatable heat shield as a viable braking system, deploying a large inflatable aeroshell (a deployable structure with a flexible heat shield) before re-entering the atmosphere. LOFTID proved the inflatable heat shield concept, which is also suitable for future missions to Mars, Venus, and Saturn’s moon Titan.

NASA LOFTID
NASA’s Low-Earth Orbit Flight Test of an Inflatable Decelerator (LOFTID)

Successful GPGPU Implementations

While passing through polar orbit – the highest-radiation low-Earth orbit – and then re-entering Earth’s atmosphere, six of Aitech’s GPGPU-based S-A1760 Venus AI supercomputers collected visible and IR video data and transmitted the video, along with telemetry, to a ground station and other nearby assets. The S-A1760 was the first implementation of GPGPU processing in a space environment.

Further, many of the autonomous data decisions made onboard LOFTID relied on the units’ powerful GPGPU computing to achieve mission success. The S-A1760 systems also provided tracking information so the aeroshell could be recovered quickly after completing its mission.

The heart of Aitech’s S-A1760 is the NVIDIA Jetson TX2i system-on-module (SoM), which features the Pascal architecture with 256 CUDA cores and delivers 1 TFLOPS of processing performance. It is size, weight and power (SWaP) optimized to limit its impact on satellite buses. This supercomputer is the most powerful and smallest radiation-characterized GPGPU rated for space flight in low-Earth orbit (LEO) and near-Earth orbit (NEO) applications. Aitech is currently radiation testing its next-generation GPGPU-based space systems, which will offer even greater processing power.
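For readers curious how such a module reports its capabilities, the sketch below queries basic CUDA device properties through CuPy’s runtime bindings. The multiprocessor count and compute capability come from the driver, while the 128 cores-per-SM figure typical of Pascal-class GPUs is an assumption of this sketch.

```python
# Query basic CUDA device properties via CuPy's runtime bindings.
# The cores-per-SM value for Pascal-class GPUs is assumed, not queried.
import cupy as cp

props = cp.cuda.runtime.getDeviceProperties(0)
sm_count = props["multiProcessorCount"]
cores_per_sm = 128                      # typical for Pascal-class GPUs (assumed)

print("Device:            ", props["name"].decode())
print("Compute capability:", f'{props["major"]}.{props["minor"]}')
print("SM count:          ", sm_count)
print("Approx. CUDA cores:", sm_count * cores_per_sm)
```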

The Future of Space-Rated Electronics

The future of space electronics will see a continued, dramatic shift from traditional computing in Earth-bound data centers to accelerated computing in orbit, and Aitech will continue to lead the way. Today, our GPGPUs are meeting the call for higher processing capability, enabling AI in the next generation of intelligent spacecraft and delivering more intuitive computing and increased system performance throughout space.