The Evolution of Design in Low-Earth Orbit Satellite Communications Systems

Mar 22, 2024

Commercial space satellite systems are experiencing a surge in interest and investment. With more than $23.5 billion injected by private investors into space-related companies since 2021, tech giants like SpaceX and Amazon (Kuiper) have driven space initiatives aimed at increasing global broadband access. Historically, satellite communications served purposes such as voice communication, defense, and space exploration. One of the first commercial communications satellites, Intelsat 1, was launched in the mid-1960s into a geostationary orbit (GEO). GEO satellites orbit the Earth along the equatorial plane at an altitude of 35,786 km (22,236 mi) and appear stationary in the sky to an observer on the Earth's surface. However, the introduction and proliferation of fast-moving Low-Earth Orbit (LEO) satellites have significantly reduced the financial barrier to launching satellites and opened avenues for novel applications.

This economic benefit is due to two factors: 1) the satellites' size (the latest Starlink LEOs from SpaceX are as small as a kitchen table) and 2) the ability to launch multiple LEOs simultaneously. However, while LEOs make satellite communications systems more economically viable, they introduce complexity and require engineers to manage higher Doppler shifts, interference, and network complexities.

Trends Driving LEO Satellite Communications Systems Adoption

Ubiquitous connectivity – an environment where devices can create, share, and process data from virtually anywhere – is one of the key trends driving the adoption of LEOs. Despite the progress in building the terrestrial wireless communications infrastructure, significant portions of the world, such as rural communities and oceans, remain devoid of cellular connectivity due to cost or geography. It is often not economically viable for a cellular service provider to build a base station infrastructure in remote areas since too few people in those areas can purchase service for the provider to recover its costs. Satellites, however, are a critical enabling technology in the wireless industry’s work toward closing the connectivity gaps between urban and rural areas. Closing this gap will enable people all over the world to enjoy a host of digital activities that many urbanites already accept as standard – distance learning, seamless videoconferencing, online gaming, movie streaming, e-commerce, collaboration with distant colleagues, and remote work.

In addition to cellular accessibility, LEOs can also improve cellular capacity. Consider the following market data from Statista: there are currently 4.6 billion smartphone users worldwide, and the number of Internet-connected devices is expected to reach more than 29 billion worldwide by 2030. More and more people are using the Internet, increasing the global cellular system demand. Much of that traffic requires high bandwidth (e.g., movie streaming and high-quality videoconferencing). Wireless companies continue to invest in terrestrial infrastructure as commercial satellites have not always been cost-effective; however, the cost of LEOs is decreasing, making them a viable option to address the increasingly limited bandwidth, especially in remote areas. Even in urban areas, LEO satellites can be used to offload some of the bandwidth being used by the terrestrial cellular system. And as all of us have had the experience of a choppy Zoom or Teams meeting, that additional bandwidth can help to improve our long-distance digital interactions.

Finally, disaster recovery communications are a key trend driving satellite communications adoption as extreme weather events become more powerful and frequent. Cellular infrastructure is frequently knocked out during these events, prompting satellite activation to ensure first responders, government officials, and residents can broadcast and receive critical safety information. This use case was validated after Hurricane Ian destroyed terrestrial cellular infrastructure in Southwest Florida, when SpaceX deployed 120 Starlink terminals to the affected areas.

Figure 1. The introduction and proliferation of Low-Earth Orbit (LEO) satellites have lowered the financial barrier required to launch satellites and opened the door for new use cases 

LEO Advantages: Less Latency and Loss, and More Redundancy

Before LEOs, satellite communications systems primarily used Geostationary Earth Orbit (GEO) satellites. Three GEO satellites, properly spaced in longitude and revolving at the same rate as Earth's rotation, can provide virtually full Earth coverage with only a few crosslinks, without introducing any Doppler shift in the carrier frequencies of the transmitted waveforms. Unfortunately, GEO satellites are far more expensive to build and launch than LEOs. Also, if one GEO satellite fails without a redundant satellite already in orbit to replace it, network connectivity is substantially compromised. LEO constellations with thousands of satellites enjoy a built-in redundancy that can absorb some in-orbit satellite failures.

Further, GEO satellites' distance from the ground introduces considerable latency in their signals: roughly a quarter second for a signal to travel from the ground up to the satellite and back down, and about half a second for a full round trip (request and response). While GEO satellites are acceptable for email and other non-real-time communications, phone and video calls experience significant delays that impede natural conversation.

Signal delays are much shorter with LEOs because they are closer to the Earth's surface. However, transmitters need more power to communicate with LEOs than with terrestrial wireless networks. This is because terrestrial network signals typically travel 5 to 10 kilometers, while LEO signals travel up to 2,000 kilometers and suffer more signal loss. Even so, a GEO signal to the Earth's surface suffers at least 25 dB more loss (20 log10(35,786/2,000) ≈ 25 dB) than a LEO signal to the surface.
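The 25 dB figure above can be reproduced in a few lines. The sketch below uses the standard free-space path loss formula; the 12 GHz carrier is an illustrative assumption, and it cancels out of the GEO-LEO difference anyway:

```python
import math

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB: 20*log10(4*pi*d*f/c)."""
    c = 3e8  # speed of light, m/s
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

f = 12e9            # illustrative Ku-band carrier, Hz (assumption)
geo_alt = 35_786e3  # GEO altitude, m
leo_alt = 2_000e3   # upper end of the LEO signal distance cited above, m

extra_loss = fspl_db(geo_alt, f) - fspl_db(leo_alt, f)
print(f"GEO suffers {extra_loss:.1f} dB more path loss than LEO")
```

Because the difference depends only on the distance ratio, the same ~25 dB gap holds at any carrier frequency.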

LEO Challenge: Nonlinear Power Amplification

LEOs' diminutive size is both a boon and a design challenge. LEOs' power amplifiers (PAs) must be physically small yet powerful enough to transmit a signal to their intended target. In an ideal world, satellite engineers want PAs to have a linear characteristic even when driven with high-power inputs. In practice, however, PAs driven too hard can significantly distort signals: Figure 2 shows both compression effects from overdriving and memory (i.e., filtering) effects from the finite PA bandwidth.

Figure 2. Power amplifier characteristic showing the effects of both nonlinearity (compression) and memory. The digital pre-distortion (DPD) characteristic shown compensates for the nonlinearity.

This is exacerbated by the fact that 5G orthogonal frequency division multiplexed (OFDM) signals, which will be used with LEO satellites, can have peak-to-average power ratios (PAPRs) in excess of 10 dB. These signals must therefore be kept at relatively low power at the PA input to avoid the spectral regrowth and adjacent channel power ratio (ACPR) degradation that come with overdriving a PA. However, such low-power inputs cause the PAs to operate at suboptimal efficiency.
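A quick sketch of why OFDM stresses a PA: the PAPR of a randomly generated OFDM symbol routinely lands near the 10 dB mark cited above. The subcarrier count and QPSK mapping below are illustrative assumptions, not parameters from any particular 5G configuration:

```python
import numpy as np

rng = np.random.default_rng(0)

# One OFDM symbol: 1024 subcarriers carrying random QPSK data (illustrative sizes).
n_fft = 1024
qpsk = (rng.choice([-1, 1], n_fft) + 1j * rng.choice([-1, 1], n_fft)) / np.sqrt(2)

# IFFT to the time domain, scaled so the symbol has unit average power.
x = np.fft.ifft(qpsk) * np.sqrt(n_fft)

# PAPR: peak instantaneous power over average power, in dB.
papr_db = 10 * np.log10(np.max(np.abs(x) ** 2) / np.mean(np.abs(x) ** 2))
print(f"PAPR of this symbol: {papr_db:.1f} dB")
```

Because the time-domain samples are a sum of many independent subcarriers, occasional constructive alignment produces peaks far above the average power.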

Various techniques have been suggested to reduce the PAPR of OFDM. They include:

  • Clipping and filtering,
  • Iterative clipping and filtering,
  • Error control coding, with codewords chosen to minimize PAPR, and
  • Constellation extension.
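As a rough illustration of the first technique, the sketch below clips an oversampled OFDM symbol's envelope and then filters the clipping noise out of the unused subcarriers. The signal dimensions and the 4 dB clipping threshold are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

def papr_db(x):
    """Peak-to-average power ratio in dB."""
    return 10 * np.log10(np.max(np.abs(x) ** 2) / np.mean(np.abs(x) ** 2))

# Oversampled OFDM symbol: 256 used subcarriers, 4x oversampling (assumptions).
n_used, os = 256, 4
data = (rng.choice([-1, 1], n_used) + 1j * rng.choice([-1, 1], n_used)) / np.sqrt(2)
spec = np.zeros(n_used * os, complex)
spec[: n_used // 2] = data[: n_used // 2]    # positive-frequency half
spec[-(n_used // 2):] = data[n_used // 2:]   # negative-frequency half
x = np.fft.ifft(spec) * np.sqrt(n_used * os)

# Clip the envelope at 4 dB above the RMS level (phase preserved).
thresh = np.sqrt(np.mean(np.abs(x) ** 2)) * 10 ** (4 / 20)
clipped = np.where(np.abs(x) > thresh, thresh * x / np.abs(x), x)

# Filter: zero the out-of-band bins where the clipping noise landed.
spec_c = np.fft.fft(clipped)
spec_c[n_used // 2 : -(n_used // 2)] = 0
filtered = np.fft.ifft(spec_c)

print(f"PAPR before: {papr_db(x):.1f} dB, after clip-and-filter: {papr_db(filtered):.1f} dB")
```

Filtering restores the spectrum but regrows the peaks somewhat, which is why the iterative variant listed above repeats the clip-and-filter cycle.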

Notwithstanding the above techniques for PAPR reduction, digital pre-distortion (DPD) subsystems in the transmitter can counteract these distortions. 

DPD subsystems do not aim to reduce the PAPR of an OFDM signal. Rather, they apply an “inverse PA” characteristic to the signal that causes the output signal of the PA to be more linear. DPD tools, such as those in the Communications Toolbox, are increasingly using AI to improve results. These tools illustrate all the steps of an AI workflow:

  • Data preprocessing,
  • Offline training to build a static AI-based DPD network,
  • Online training to build a dynamic network that adapts to temperature-dependent PA characteristics,
  • Network compression to save on computational complexity and memory, and 
  • HDL code generation to deploy a network to an edge device.
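As a minimal sketch of the underlying DPD idea (a polynomial fit, not the AI workflow above), the example below learns a memoryless polynomial pre-distorter for a toy cubic PA model via indirect learning: fit a post-inverse from PA output back to PA input, then apply the same coefficients before the PA. The PA model and polynomial orders are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)

def pa(x):
    """Toy memoryless PA model (assumption): mild cubic compression."""
    return x - 0.1 * x * np.abs(x) ** 2

# Training signal: complex baseband samples with OFDM-like Gaussian statistics.
x = 0.5 * (rng.standard_normal(4000) + 1j * rng.standard_normal(4000)) / np.sqrt(2)
y = pa(x)

# Odd-order polynomial basis: u, u|u|^2, u|u|^4.
basis = lambda u: np.column_stack([u, u * np.abs(u) ** 2, u * np.abs(u) ** 4])

# Indirect learning: least-squares fit of the post-inverse (y -> x).
coefs, *_ = np.linalg.lstsq(basis(y), x, rcond=None)

# Compare linearization error with and without the pre-distorter.
err_no_dpd = np.mean(np.abs(pa(x) - x) ** 2)
err_dpd = np.mean(np.abs(pa(basis(x) @ coefs) - x) ** 2)
print(f"MSE without DPD: {err_no_dpd:.2e}, with DPD: {err_dpd:.2e}")
```

Real designs extend this with memory terms (past samples) and, as the article notes, adaptive or AI-based networks that track temperature-dependent PA drift.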

LEO Challenge: Interference

Interference also presents a challenge when using LEOs for satellite communications systems, chiefly because of sheer numbers: nearly 6,000 LEO satellites are currently in orbit.

Traditional RF links have long been used in satellite communications systems, but engineers increasingly choose optical links when possible. Optical beams are much narrower than traditional RF beams, which can spill over into unintended receivers and cause interference; the limited spreading of an optical beam greatly reduces that risk.

Finally, satellite engineers can also use phased arrays, which are groups of computer-controlled antennas that create a beam that can be electronically steered to point in different directions. Phased arrays can spatially null out interference and direct energy at a particular spot on the ground. Phased array systems maximize beam energy in the direction of the signal of interest and insert beam nulls in the direction of interference, thus maximizing signal-to-interference plus noise ratio (SINR).

LEO Challenge: Doppler Shifts

Unlike GEOs, LEOs do not revolve around the Earth at the same rate as the planet’s rotation. This means that they’re constantly moving either toward or away from receivers. This movement creates a Doppler effect that satellite engineers must manage. 

In engineering terms, the Doppler effect refers to the difference in frequency between the transmitted wave and the received wave due to transmitter or receiver motion. Doppler challenges require satellite engineers to acquire and track LEOs’ continually changing center frequencies. 
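A back-of-the-envelope bound on the shift: with a typical LEO orbital speed of roughly 7.6 km/s, the worst-case Doppler shift at an illustrative 12 GHz carrier is a few hundred kilohertz. (The instantaneous shift depends on the radial component of the satellite's velocity toward the receiver, so this is an upper bound; the carrier frequency is an assumption.)

```python
# Worst-case LEO Doppler shift, f_d = (v/c) * f_c.
v_orb = 7.6e3  # typical LEO orbital speed, m/s (assumption, ~550 km altitude)
c = 3e8        # speed of light, m/s
f_c = 12e9     # illustrative Ku-band carrier, Hz (assumption)

f_doppler = (v_orb / c) * f_c
print(f"Worst-case Doppler shift: +/-{f_doppler / 1e3:.0f} kHz")
```

The shift also sweeps rapidly through this range during a pass, which is what forces receivers to track, not just acquire, the offset.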

The transmitter and receiver must be locked in frequency and phase to ensure the waveforms are successfully demodulated. However, large Doppler shifts push the frequency, phase, and timing out of sync. As a result, these receivers must implement multiple closed loops to eliminate Doppler-induced offsets, synchronizing at the frame, symbol timing, carrier frequency, and carrier phase levels.
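As a toy illustration of one such loop's coarse acquisition step, the sketch below estimates a carrier-frequency offset from a known pilot tone by averaging the phase advance between successive received samples. The sample rate, offset, and noise level are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)

fs = 1e6          # sample rate, Hz (assumption)
f_off = 37_500.0  # Doppler-induced carrier offset to be estimated, Hz (assumption)
n = np.arange(4096)

# Received pilot: a complex tone at the unknown offset plus a little noise.
rx = np.exp(2j * np.pi * f_off / fs * n)
rx += 0.05 * (rng.standard_normal(n.size) + 1j * rng.standard_normal(n.size))

# Average phase advance per sample, from the sum of conjugate products.
phase_step = np.angle(np.sum(rx[1:] * np.conj(rx[:-1])))
f_est = phase_step * fs / (2 * np.pi)
print(f"Estimated offset: {f_est:.0f} Hz")
```

This estimator is unambiguous only for offsets below half the sample rate; larger Doppler shifts are why acquisition is typically staged from coarse to fine.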

The challenges outlined above create a strong motivation to model and simulate these processes before companies commit to building any hardware. Satellite orbital modeling can quantify the magnitude of the Doppler shift incurred by LEO satellites. Link budget analysis provides a ballpark estimate of the required PA output power, antenna size, and/or number of elements required for an antenna array. Link simulation can provide a performance baseline for a receiver design, particularly when subject to degraded channel conditions and common RF impairments like phase noise, intermodulation products, and PA nonlinearities. MATLAB enables all these modeling and simulation efforts so that LEO communications payloads can be manufactured with confidence, as long as they conform to the simulated performance.
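At its core, a link budget of the kind described above reduces to decibel bookkeeping: add the gains, subtract the losses. Every input value in the sketch below is an illustrative assumption, not a figure from the article:

```python
import math

# Rough LEO downlink budget (all values are illustrative assumptions).
eirp_dbw = 36.0   # satellite EIRP, dBW
freq_hz = 12e9    # carrier frequency, Hz
slant_m = 1_000e3 # slant range to the ground terminal, m
gt_dbk = 10.0     # ground terminal figure of merit G/T, dB/K
k_dbw = -228.6    # Boltzmann's constant, dBW/K/Hz

# Free-space path loss, then carrier-to-noise-density ratio in dB-Hz.
fspl = 20 * math.log10(4 * math.pi * slant_m * freq_hz / 3e8)
cn0 = eirp_dbw - fspl + gt_dbk - k_dbw
print(f"FSPL: {fspl:.1f} dB, C/N0: {cn0:.1f} dB-Hz")
```

Comparing the resulting C/N0 against the demodulator's required Eb/N0 plus the data rate (in dB-Hz) gives the link margin, which drives the PA power and antenna sizing trade-offs mentioned above.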

LEOs have received their fair share of attention because of their compelling short- and long-term use cases. Companies like Apple are already tapping into satellite communications networks, and that’s only the start. As satellite communications continue influencing the wireless industry, engineers should familiarize themselves with LEOs’ use cases, challenges, and enabling technologies.
