In reality, there were vast numbers of radio waves in the universe long before Hertz performed his famous experiments. Electromagnetic waves are generated naturally whenever electric charges are accelerated or decelerated. All hot objects, in which charged particles are in rapid random motion, radiate electromagnetic energy at various frequencies. The stars are potent sources of electromagnetic energy, which is the basis of radio astronomy. On our own planet, atmospheric events such as lightning strikes produce showers of radio energy, noticeable as the background crashes and crackles heard on broadcast receivers during thunderstorms. In most of these cases of natural generation, the radio energy is incoherent, characterized by a jumble of photons of disparate energies. The same is true of many human-made sources of electromagnetic disturbances, such as electrical machinery1.
Italian inventor and electrical engineer Guglielmo Marconi had successfully transmitted radio energy for a bit more than a kilometer, generating his own “invisible waves” in Pontecchio, Italy. After failing to impress the Italian government, Marconi traveled to England at the age of 22, where he found financial backers for his work. By 1897, Marconi was broadcasting radio waves up to 20 kilometers away and was commissioned to set up a wireless station on the Isle of Wight so that Queen Victoria could send Morse code messages to her son while he was aboard his yacht. In 1899, Marconi began work on a transatlantic broadcast, believing (unlike the leading physicists of the day) that the signals would follow the curvature of the earth. In 1901, he achieved a 3,430 km transmission of three Ss (or three sets of three dots in Morse code) from Poldhu, Cornwall, to Signal Hill in St. John’s, Newfoundland.
Needless to say, his work led to a communications revolution. But, why exactly?
Marconi will not be remembered for generating precise radio waves—he was, technically speaking, spouting noise in a more or less controlled manner—but he will be remembered for leading the work to convey information wirelessly by manipulating electrical signals with the help of electric circuitry. In short, he fathered radio access technology.
In the late 1890s, Canadian-American inventor Reginald Fessenden started to suspect that spark transmitters were not the most elegant solution, and that he could develop a far more efficient transmitter and coherent receiver combination. To this end, he worked on developing a high-speed alternator that generated pure sine waves and produced a continuous train of propagating waves of substantially uniform nature, or, in modern terminology, a continuous-wave (CW) transmitter. It still consisted of switching a sine wave signal of fixed characteristics on and off in a rather sloppy way.
In time, the idea of doing such a thing—switching a signal on and off—would start to give way to a more subtle approach: keeping the carrier on and altering the attributes of said carrier at the rhythm of another signal containing the information to be transmitted. In other words, modulation. Fessenden would at some point introduce the concept of signal heterodyning, where a receiver would have a local oscillator producing a radio signal adjusted to be close in frequency to the signal being received. When the two signals were mixed, a "beat" frequency equal to the difference between the two frequencies would be created. Correctly adjusting the local oscillator frequency would put the beat frequency in the audio range, where it would be heard as a tone in a receiver's speaker whenever the transmitter signal was present. Now we could hear tones, and thus the Morse code "dots" and "dashes" would become audible as beeping sounds. Nice.
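The heterodyne principle above can be sketched numerically. The snippet below (a minimal illustration with hypothetical frequency values, not any historical receiver design) mixes an incoming carrier with a local oscillator tuned 1 kHz away, and shows that the product contains an audible beat at the difference frequency:

```python
import numpy as np

fs = 1_000_000                      # sample rate, Hz
t = np.arange(0, 0.05, 1 / fs)      # 50 ms of signal

f_rf = 100_000                      # incoming CW carrier (hypothetical)
f_lo = 99_000                       # local oscillator, tuned 1 kHz away

# Mixing = multiplying; trig identity: sin(a)*sin(b) contains
# components at (f_rf - f_lo) and (f_rf + f_lo)
mixed = np.sin(2 * np.pi * f_rf * t) * np.sin(2 * np.pi * f_lo * t)

# Find the dominant component inside the audio band
spectrum = np.abs(np.fft.rfft(mixed))
freqs = np.fft.rfftfreq(len(t), 1 / fs)
audio_band = freqs < 20_000
beat = freqs[audio_band][np.argmax(spectrum[audio_band])]
print(beat)  # ~1000 Hz: the audible "beat" tone
```

Keying the carrier on and off would thus gate this 1 kHz tone on and off, which is exactly how Morse dots and dashes became beeps in the operator's headphones.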
Soon, we would realize that nothing prevented us from opportunistically altering a carrier’s amplitude not just with mere tones (a tone is a signal composed of a single frequency) but with richer signals, such as audio2. Consequently, voice communication would become a very popular thing.
With all this, it became clear that:
- The bandwidth of the modulating signal (its richness of spectral content, so to speak) would turn out to be of great importance, since transmitters and receivers would need to preserve that bandwidth across their signal chains if we intended to ensure audio quality.
- The proliferation of radio transmitters would make evident the need for coordination. After all, the experimental early phase of radio communications was drawing to a close, and transmitters’ activity had to be regulated to reduce unwanted interference and encourage a more responsible use of what had become a key natural resource: the radio spectrum.
Additionally, as other technologies like transistors, integrated circuits, and computers matured fast, we would realize that the modulating signals did not need to be analog: digital signals could also be used to alter a sine wave’s attributes. In the meantime, we would also figure out ways of altering the other attributes of sinusoidal signals, namely frequency and phase, which offered more interesting bandwidth efficiencies; for example, a phase change of a certain amount (a symbol) could be coded to represent several data bits.
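The idea of a symbol carrying several bits can be made concrete with quadrature phase-shift keying (QPSK), where each pair of bits selects one of four carrier phases. The mapping table below is one common Gray-coded convention, shown purely for illustration:

```python
import numpy as np

# Gray-coded QPSK: each bit pair selects one of four carrier phases,
# so every transmitted symbol carries two bits.
bit_pairs_to_phase = {
    (0, 0): np.pi / 4,
    (0, 1): 3 * np.pi / 4,
    (1, 1): 5 * np.pi / 4,
    (1, 0): 7 * np.pi / 4,
}

def modulate(bits):
    """Map a bit stream to complex unit-amplitude symbols."""
    pairs = zip(bits[0::2], bits[1::2])
    return [np.exp(1j * bit_pairs_to_phase[p]) for p in pairs]

symbols = modulate([0, 0, 1, 1, 1, 0])
print(len(symbols))  # 3 symbols for 6 bits: two bits per phase change
```

Doubling the bits per symbol (8-PSK, 16-QAM, and so on) squeezes more data into the same symbol rate, at the cost of symbols sitting closer together and thus being more vulnerable to noise.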
Digital modulation would make a glorious appearance, giving us the possibility of sending digital data between a transmitter and a receiver. A small practical problem showed up: Fourier analysis taught us that a perfect square wave is equivalent to a sine wave at the fundamental frequency summed with an infinite series of odd-multiple harmonics at decreasing amplitudes (tracing a sinc-shaped envelope in the frequency domain). This meant a quasi-infinite bandwidth to handle! So, inevitably, a breadth of techniques for limiting the bandwidth of digital signals prior to transmission materialized, ensuring that practical bandwidths could be achieved while guaranteeing the ones and zeros present in the data stream could still be discerned at the receiving end.
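The bandwidth problem above is easy to see numerically. A minimal sketch (hypothetical frequencies, just to show the trend): summing the first n odd harmonics of the Fourier series approximates an ideal square wave better and better, which is another way of saying that faithful square edges demand ever more bandwidth:

```python
import numpy as np

fs = 10_000
t = np.arange(0, 1, 1 / fs)
f0 = 5  # fundamental frequency, Hz (arbitrary choice)

def square_approx(n_harmonics):
    """Fourier partial sum: (4/pi) * sum of sin(2*pi*k*f0*t)/k over odd k."""
    s = np.zeros_like(t)
    for k in range(1, 2 * n_harmonics, 2):
        s += (4 / np.pi) * np.sin(2 * np.pi * k * f0 * t) / k
    return s

target = np.sign(np.sin(2 * np.pi * f0 * t))  # ideal square wave
errors = []
for n in (1, 10, 100):
    err = np.mean(np.abs(square_approx(n) - target))
    errors.append(err)
    print(n, round(err, 3))  # error shrinks as harmonics (bandwidth) grow
```

Truncating the series (i.e., filtering before transmission) rounds off the edges but keeps the ones and zeros distinguishable, which is precisely the bargain pulse-shaping techniques strike.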
A metric started to surface as important: data rate; that is, the number of bits of information that could be transmitted or received per unit of time (usually per second). We wanted it to be higher and higher (and we still do). Shannon and Hartley, with their eponymous theorem, would come to tell us to hold our horses: the theoretical upper bound on the rate at which data could be communicated at an arbitrarily low error rate, for a given received signal power, through an analog communication channel subject to noise is constrained by the bandwidth of the channel, the average received signal power over that bandwidth, and the average power of the noise and interference over it. What is more, the theorem made it very visible that such rates are governed by the signal-to-noise ratio (SNR).
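The Shannon–Hartley bound itself fits in one line, C = B · log2(1 + S/N). A quick sketch with hypothetical numbers (a 1 MHz channel at 20 dB SNR) shows the kind of ceiling it imposes:

```python
import math

def shannon_capacity(bandwidth_hz, snr_linear):
    """Shannon-Hartley theorem: C = B * log2(1 + S/N), in bits per second."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Hypothetical example: 1 MHz of bandwidth at 20 dB SNR (S/N = 100)
print(shannon_capacity(1e6, 100))  # ~6.66e6 bit/s
```

Note how the capacity grows only logarithmically with SNR: doubling signal power buys far less than doubling bandwidth, which is why engineers chase spectrum and coding gains rather than simply pumping out power.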
In time, our engineering predecessors would burn their brains devising techniques to squeeze more and more from a now contested and badly behaved—noisy—radio spectrum. This included techniques to improve said SNR—and no, pumping out power is not always a practical solution—with the aim of achieving lower and lower bit error rates (BER). Improvements in antenna technology, combined with the appearance of a multitude of coding techniques, would allow radio engineers to reach the same BERs at lower SNRs, making their designs simpler.
Spectrum use sophistication wildly kicked in. It became understood that we could also partition the spectrum of a communications channel into a series of sub-bands carrying individual subcarriers (a technique called frequency-division multiplexing, or FDM). And now the concept of orthogonality would find a great application: remember we said before that a digital signal’s spectrum follows a sinc function; well, if we choose the frequencies of these contiguous sub-bands wisely, the individual peaks of the subcarriers’ sincs line up with the nulls of the other subcarriers. A clever, constructive use of interference: even though the spectral energy overlaps, we retain the ability to recover the original signal. This is the foundation of Orthogonal Frequency-Division Multiplexing (OFDM), a method of conveying data over a set of orthogonal subcarriers at lower data rates, as opposed to modulating a single carrier at a very high data rate. OFDM would become the foundation of Wi-Fi (IEEE 802.11a/g/n/ac), 4G LTE, and 5G mobile communication technologies.
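The orthogonality trick can be verified in a few lines. In the sketch below (a toy 64-sample OFDM symbol, sizes chosen arbitrarily), subcarriers spaced by exactly one cycle per symbol period have zero inner product with each other, so their overlapping spectra can still be separated at the receiver:

```python
import numpy as np

N = 64  # samples per OFDM symbol (hypothetical size)
n = np.arange(N)

def subcarrier(k):
    """Complex subcarrier k: one of N tones spaced 1/T apart (an orthogonal set)."""
    return np.exp(2j * np.pi * k * n / N)

# Inner product over one symbol period: N when k == m, essentially 0 otherwise
same = np.vdot(subcarrier(3), subcarrier(3)).real
different = abs(np.vdot(subcarrier(3), subcarrier(4)))
print(same, different)  # 64.0 and ~0: overlapping spectra, yet separable
```

This is also why practical OFDM transmitters and receivers are built around the inverse FFT and FFT: the orthogonal subcarrier set is exactly the DFT basis.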
Fast-forwarding to these days, wireless networks are ubiquitous. The mobile phones we carry in our pockets are continuously exchanging data with cell phone operators' towers at high data rates, with antenna towers operating arrays of antennas equipped with techniques to track users spatially, serving a big bunch of moving targets (phones or, in 5G jargon, machine-type-communication devices) as they nervously wander around cities across the globe.
Note that, thus far, we have not talked at all about what’s in the data being transmitted. And this is intentional. Wireless links can be thought of as onions of sorts, in the sense that they are designed in layers. So far, we have been strictly talking about the lowermost layer: the physical layer. This is about electric signals traversing a transmitter and an antenna that emits them as a shower of radio quanta propagating towards a receiver, without caring about what the transmitted bits mean.
Engineers would spend a long time defining different formats and protocols for the data to follow. Such protocols would address important needs such as addressing, error handling, automatic retransmission requests, forward error correction, fragmentation, virtual channel multiplexing, and many other features. Still, at the end of the day, the fact is that only bits are communicated.
But how does space fit into all this? Space hasn’t remained unaware of the technological progress of radio engineering in the last century. Radio links have been, for obvious reasons, the weapons of choice for transferring data from satellites to the ground. But the space industry seems to be showing, in terms of the state of the art, two different faces: SATCOM and EO. On the SATCOM face, the level of sophistication in spectrum exploitation is very high, using advanced frequency-division multiplexing or time-division multiplexing techniques to squeeze as many customers as possible into a single communications satellite. This capability, called multiple access, allows several carriers from several earth stations to access a SATCOM’s antenna. To serve its customers, the payload of a SATCOM satellite incorporates a repeater that consists of one or more channels, called transponders, operating at all times in parallel on different sub-bands of the total bandwidth used. SATCOM networks can be based on a single-beam antenna payload or a multibeam antenna payload. In the context of a single-beam payload, the carriers transmitted by earth stations access the same satellite receiving antenna beam, and these same earth stations can receive all the carriers retransmitted by the same antenna3. The number of users that can be served with a single SATCOM satellite and with only one parabolic antenna is mesmerizing.
On the Earth Observation face, the level of sophistication in terms of spectrum use decreases a bit. Earth observation satellites still predominantly use single-carrier channels and techniques, somewhat “classic” modulation schemes and protocols, and limited or no beamforming or beam-scanning capabilities. In Earth Observation missions in LEO, the wireless links are typically mechanically steered: the satellite moves its body-mounted antenna to track a large ground-based antenna, and the ground station antenna, in turn, tracks the satellite.

Satellites, regardless of SATCOM, EO, pure science, or any other application, are about data. No satellite exists for the sake of being up there, isolated, floating alone in space. Satellites are always part of a mission, and missions are about acquiring, forwarding, or downlinking various kinds of digital data, be it health-status telemetry, internet traffic, IoT transactions, astronomy observations, or a hyperspectral image of some location on the ground. Such bits and bytes need to move fast because someone, somewhere is eagerly waiting to make sense of them. Augmented with other technologies like lasers, satellites can become nodes in a global-area wireless network. Nothing is really stopping us from integrating satellites into our ground-based wireless networks, including our modern mobile networks.

At ReOrbit, we give our customers the possibility to choose the number of data links in their missions, the bandwidths and data rates, the physical layers (RF, optical, or both), and the upper layers, protocols, and security stacks, combined with software-configurable capabilities such as store-and-forward and delay-tolerant networking, which turn the spacecraft into data routers placed beyond the Kármán line.
Gluon satellites can carry Earth Observation or any other kind of payloads and make sure the data acquired from these sensors reaches the customers fast and reliably. ReOrbit satellites are designed with availability and uptime as strong design drivers, together with autonomy: no one expects to monitor network routing equipment continuously or to allocate large teams of people to check on it 24/7. We design our satellites with the mindset that you could ignore a Gluon bus for extended periods of time and the spacecraft should stay nice and healthy, tirelessly working to get your data through while working with your ground automations to ingest the tasks and schedules for acquisition collection. A Gluon bus would only inform you in case attention is needed, reversing the paradigm of space operations, which has historically been about continuously, inefficiently polling space assets.

And finally, real-timeness: data needs to be available as soon as possible; meeting data-delivery deadlines is a built-in feature of our flight software design and architecture.
ReOrbit is standing on the shoulders of giants like Hertz and Marconi, working hard on making in-orbit networking a reality. Data bottlenecks, jams and delays are problems we will soon leave behind. Data needs to keep moving to create value, and the only way ahead is to network.
1 Adapted from Radio Antennas and Propagation: Radio Engineering Fundamentals (1st Edition), William Gosling
2 Audio is any electrical signal which represents in voltage an originally acoustic phenomenon (music, speech, etc.)
3 Satellite Communications Systems: Systems, Techniques and Technology, 6th Edition. Gerard Maral, Michel Bousquet, Zhili Sun