We can’t escape AI conversations these days, can we? AI may not be everywhere yet, but talk about it is—in magazines, blogs and product roadmaps in every industry. Opportunities for AI value abound in satellite networks, across both space and ground segments. And while AI is raising more questions than answers so far, a few broad technology strokes are becoming clear, especially in the ground network.
One bit of clarity is that to support AI in network operations, software environments beat hardware. Another is that interoperability will be key.
Space networking use cases range from predictive maintenance and improved reliability to automating operations for profitability-driven network management.
Security is another. For example, AI can actively repel and defend by watching for, and quickly reacting to, anomalous traffic into and out of the ground monitor-and-control (M&C) system. That use case highlights the first of those broad-stroke conclusions: virtualized, orchestrated elements enable more organic and dynamic processes, such as on-the-fly updates and fixes, which are not only critical for cybersecurity but especially needed for automating operations.
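To make the anomalous-traffic idea concrete, here is a minimal sketch of one common approach: flagging traffic samples that fall far outside a rolling baseline. The window size, threshold, and traffic values are illustrative assumptions, not part of any real M&C product.

```python
# Hypothetical sketch: flag anomalous traffic rates with a rolling z-score.
# Window, threshold, and the sample feed are invented for illustration.
from collections import deque
from statistics import mean, stdev

def make_detector(window=60, threshold=3.0):
    """Return a checker that flags samples far outside the recent baseline."""
    history = deque(maxlen=window)

    def check(bytes_per_sec):
        anomalous = False
        if len(history) >= 10:  # wait for a minimal baseline
            mu, sigma = mean(history), stdev(history)
            if sigma > 0 and abs(bytes_per_sec - mu) / sigma > threshold:
                anomalous = True
        history.append(bytes_per_sec)
        return anomalous

    return check

check = make_detector()
for rate in [100, 102, 98, 101, 99, 103, 97, 100, 102, 99, 101, 5000]:
    if check(rate):
        print(f"anomaly: {rate} B/s")  # triggers on the 5000 B/s spike
```

A production system would use richer features than a single rate, but the pattern is the same: learn a baseline from recent behavior, then react quickly to deviations.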
One reason (among many) why terrestrial network operators are ahead of satellite operators when it comes to their AI plans is that most have already moved far down the software-defined and cloud-enabled paths.
The second broad conclusion, the need for interoperability, is even more important and fits hand in glove with virtualization. Here’s why.
It’s common knowledge that AI is only as good as its training data. For network operations that means capturing the correct data from disparate elements and systems, all with the right labeling and correlations, and in consistent formats for teaching the AI what you want it to learn.
Purpose-built hardware is usually proprietary, reporting a fixed set of information in a fixed format. Other systems may be able to massage and harmonize that data, but they can only do so much, and only after the fact. With virtualized functions, on the other hand, all information can be exposed as required and can evolve over time into formats that are more easily digestible for the AI. In many ways, standards are all about this kind of supply-chain compatibility.
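As a small illustration of the harmonization problem, consider mapping telemetry from two elements that report power differently onto one common schema. The field names and both vendor formats are invented for this example.

```python
# Illustrative sketch: normalize disparate telemetry records into one
# consistent, AI-friendly schema. Both input formats are hypothetical.
def normalize(record):
    """Map vendor-specific telemetry onto a common schema."""
    if "pwr_w" in record:  # hypothetical vendor A: watts as a string
        return {"element": record["id"],
                "power_watts": float(record["pwr_w"]),
                "timestamp": record["ts"]}
    if "powerConsumption" in record:  # hypothetical vendor B: milliwatts
        return {"element": record["deviceName"],
                "power_watts": record["powerConsumption"] / 1000.0,
                "timestamp": record["time"]}
    raise ValueError("unknown telemetry format")

a = normalize({"id": "gw1-amp", "pwr_w": "42.5",
               "ts": "2024-01-01T00:00:00Z"})
b = normalize({"deviceName": "gw2-amp", "powerConsumption": 41500,
               "time": "2024-01-01T00:00:00Z"})
print(a["power_watts"], b["power_watts"])  # 42.5 41.5
```

With hardware that only emits vendor A's fixed format, this translation layer is the best you can do, and only after the fact; with virtualized functions, the element itself can be evolved to emit the common schema directly.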
For example, suppose you wanted to train your AI to automatically meet unique network throughput needs while optimizing overall power consumption. Say you have three gateways, each differentiated for diverse environmental factors, and all with differing levels of capability and cost. In addition to external data such as weather, you'll need a wide variety of component, system, network and business information, including very specific and accurate power consumption data and the current configuration for each application. Acquiring this data from some hardware may not even be possible, and without rigorous standards compliance it would be extremely difficult to correlate. A standards-based software environment, by contrast, may require a resource commitment, but it is not technically challenging to build and scale in ways that expose any necessary information in consistent, AI-friendly formats.
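A toy version of that gateway decision shows the kind of optimization the AI would ultimately learn or drive. The gateway names, capability levels, and throughput/power figures below are all invented; a real system would fold in weather, cost, and configuration data.

```python
# Toy sketch of the three-gateway example: choose a capability level per
# gateway so combined throughput meets demand at minimum total power.
# All names and numbers are hypothetical.
from itertools import product

# Selectable levels per gateway: (throughput in Gbps, power in kW)
GATEWAYS = {
    "gw_desert":   [(0, 0.0), (5, 2.0), (10, 4.5)],
    "gw_coastal":  [(0, 0.0), (5, 2.5), (10, 5.5)],
    "gw_mountain": [(0, 0.0), (5, 1.8), (10, 4.0)],
}

def cheapest_config(demand_gbps):
    """Brute-force the lowest-power level combination meeting the demand."""
    best = None
    for combo in product(*GATEWAYS.values()):
        throughput = sum(t for t, _ in combo)
        power = sum(p for _, p in combo)
        if throughput >= demand_gbps and (best is None or power < best[0]):
            best = (power, dict(zip(GATEWAYS, combo)))
    return best

power, config = cheapest_config(15)
print(power, config)
```

The point is not the brute-force search, which would not scale to a real network, but the inputs: without consistent, correlated throughput and power data from every gateway, no optimizer, learned or otherwise, can make this choice.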
What’s more, the technology challenge of growing and future-proofing systems is commonplace and perpetual. And AI isn’t the only fundamental tech change we are facing today. While we in the satellite industry are just now getting to understand, plan for and implement 5G NTN, the terrestrial network world is already beginning to define 6G.
According to a recent white paper written by research firm Analysys Mason, “6G will be an AI-native technology… As optical inter-satellite links become commonplace in low-earth orbit, orchestration platforms must be ready to compute optimal network configurations and traffic routes in a considerably more complicated system. As the size of the interconnected network grows, the number of possible configurations increases exponentially, and machine learning (ML) tools will be required to compute viable configurations in real time.”