In the future, rather than relying on large, costly individual satellites, groups of smaller satellites—referred to by scientists as a “swarm”—will work together to provide enhanced accuracy, agility, and autonomy. Researchers at Stanford University’s Space Rendezvous Lab are at the forefront of this innovation. They recently conducted the first-ever in-orbit test of a prototype system capable of navigating a satellite swarm using only visual data shared over a wireless network.
“It’s a milestone achievement and the result of 11 years of work by my lab, which was established with the aim of advancing the field of distributed autonomy in space,” said Simone D’Amico, associate professor of aeronautics and astronautics and senior author of the study published on the arXiv preprint server. “Starling represents the first demonstration of an autonomous satellite swarm.”
The experiment, called the Starling Formation-Flying Optical Experiment or StarFOX, successfully navigated four small satellites in coordination, using only visual data from onboard cameras to determine their trajectories. The researchers presented the results of this initial StarFOX test to experts at the Small Satellite Conference in Logan, Utah.
All The Angles
D’Amico described the challenge as a driving force for his team over the past decade. “Since the lab’s founding, we’ve been advocating for distributed space systems. Now, it has become a mainstream approach. Agencies like NASA, the Department of Defense, and the U.S. Space Force have recognized the value of coordinating multiple assets to achieve objectives that would be difficult or impossible for a single spacecraft,” he explained. “The benefits include enhanced accuracy, coverage, flexibility, robustness, and the potential for new objectives that have yet to be imagined.”
Navigating a satellite swarm presents a significant technological challenge. Current systems depend on the Global Navigation Satellite System (GNSS), which requires frequent communication with ground-based systems. For missions beyond Earth’s orbit, the Deep Space Network is available but is relatively slow and not easily scalable for future needs. Additionally, neither system can help satellites avoid what D’Amico refers to as “non-cooperative objects,” such as space debris, which could damage or destroy a satellite.
A swarm requires an autonomous navigation system that provides a high degree of independence and robustness, according to D’Amico. Advances in miniaturized cameras and other hardware have made such systems more feasible; the StarFOX test used proven, cost-effective 2D cameras known as star trackers, which are already common on satellites.
“Fundamentally, angles-only navigation doesn’t require extra hardware even for small and inexpensive spacecraft,” D’Amico said. “Furthermore, sharing visual information among swarm members introduces a new capability for distributed optical navigation.”
Written In The Stars
StarFOX integrates visual data from single cameras mounted on each satellite in a swarm. Similar to how ancient navigators used a sextant to find their way at sea, the system uses the background field of known stars to determine the bearing angles from each camera to the other satellites. These angles are then processed onboard with precise physics-based models to estimate the satellites’ positions and velocities relative to the orbited body—Earth, the Moon, Mars, or other celestial objects.
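The core measurement here is a bearing angle pair: the star field fixes the camera's attitude, and a target's pixel location then yields azimuth and elevation angles in the camera frame. The sketch below is a minimal illustration of that geometry, not code from the StarFOX system; the function names, the identity attitude matrix, and the example angles are all hypothetical.

```python
import numpy as np

def bearing_angles_to_unit_vector(azimuth_rad, elevation_rad):
    """Convert a camera-frame bearing measurement (azimuth, elevation)
    into a unit line-of-sight vector pointing toward the target."""
    return np.array([
        np.cos(elevation_rad) * np.cos(azimuth_rad),
        np.cos(elevation_rad) * np.sin(azimuth_rad),
        np.sin(elevation_rad),
    ])

# The star-tracker attitude solution (a rotation from the camera frame to
# the inertial frame) lets the line of sight be expressed in inertial
# coordinates, which is what an orbit-determination filter consumes.
R_cam_to_inertial = np.eye(3)  # placeholder attitude, for illustration only
los_inertial = R_cam_to_inertial @ bearing_angles_to_unit_vector(0.10, 0.05)
```

Note that a single bearing gives direction but not range; it is the evolution of these angles over time, constrained by orbital dynamics, that makes the positions and velocities observable.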
StarFOX utilizes the Space Rendezvous Lab’s Angles-Only Absolute and Relative Trajectory Measurement System (ARTMS), which incorporates three advanced space robotics algorithms. The Image Processing algorithm identifies and tracks multiple targets in images, calculating target-bearing angles to determine the movement of objects, including space debris. The Batch Orbit Determination algorithm uses these angles to estimate each satellite’s approximate orbit. Finally, the Sequential Orbit Determination algorithm fine-tunes the swarm’s trajectories by processing new images over time, potentially supporting autonomous guidance, control, and collision avoidance systems onboard.
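The batch-then-sequential pattern described above is a standard estimation architecture: a coarse batch solution initializes a recursive filter that refines the state as each new measurement arrives. The toy sketch below is not the ARTMS algorithm—it is a generic linear Kalman filter on a one-dimensional constant-velocity state, standing in for the Sequential Orbit Determination step. All dynamics, noise values, and the simulated measurements are illustrative assumptions.

```python
import numpy as np

dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity dynamics model
H = np.array([[1.0, 0.0]])              # observe position only (angle-like)
Q = 1e-4 * np.eye(2)                    # small process noise
R = np.array([[0.25]])                  # measurement noise variance

# "Batch" stage output: a coarse initial state with large uncertainty.
x = np.array([0.0, 0.0])                # [position, velocity] guess
P = 10.0 * np.eye(2)

true_pos = lambda k: 2.0 + 0.5 * k * dt  # simulated true trajectory
rng = np.random.default_rng(0)

for k in range(50):
    # Predict: propagate state and covariance through the dynamics.
    x = F @ x
    P = F @ P @ F.T + Q
    # "Sequential" stage: fold in the newest noisy measurement.
    z = true_pos(k) + rng.normal(0.0, 0.5)
    y = z - H @ x                        # innovation
    S = H @ P @ H.T + R                  # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
    x = x + (K @ y).ravel()
    P = (np.eye(2) - K @ H) @ P
```

After a few dozen updates the filter recovers both position and velocity despite the poor initial guess, which is the role the sequential stage plays for the swarm: continuously sharpening the batch orbit estimates as new images are processed.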