Google Research has unveiled Project Suncatcher, an ambitious moonshot initiative exploring the possibility of scaling artificial intelligence (AI) compute infrastructure in space. The project envisions solar-powered satellite constellations equipped with Tensor Processing Units (TPUs) and interconnected through free-space optical links, potentially redefining the future of AI data centers.
Artificial intelligence is reshaping industries and accelerating scientific discovery, but its growing demand for power and cooling has become a pressing challenge. Google’s Project Suncatcher aims to overcome these terrestrial limits by harnessing the Sun’s abundant energy in orbit.
In the right orbital environment, solar panels can be up to eight times more productive than on Earth, with near-continuous sunlight that reduces battery dependence. By deploying compute infrastructure in space, Google envisions a system that minimizes land, energy, and water usage while scaling AI capabilities beyond Earth’s constraints.
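To see roughly where such a multiple comes from, the back-of-envelope sketch below compares the time-averaged power a panel receives in near-continuous orbital sunlight with that of a terrestrial site. The orbital duty cycle and ground capacity factor are assumed illustrative values, not Google’s figures.

```python
# Back-of-envelope only: duty cycle and capacity factor are assumptions, not Google's numbers.
SOLAR_CONSTANT = 1361.0          # W/m^2 above the atmosphere
orbit_duty_cycle = 0.99          # assumed: near-continuous sunlight in a dawn-dusk orbit
ground_peak_irradiance = 1000.0  # W/m^2, standard clear-sky test value
ground_capacity_factor = 0.20    # assumed: averages in night, weather, and sun angle

space_yield = SOLAR_CONSTANT * orbit_duty_cycle                 # time-averaged W/m^2 in orbit
ground_yield = ground_peak_irradiance * ground_capacity_factor  # time-averaged W/m^2 on the ground
print(round(space_yield / ground_yield, 1))                     # ~6.7x under these assumptions
```

Under these assumptions the ratio lands near seven, broadly consistent with the up-to-eight-times figure.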
The proposed system consists of a constellation of networked satellites operating in a sun-synchronous low Earth orbit. This orbit provides near-continuous solar exposure, enabling almost uninterrupted power generation and reducing onboard energy-storage requirements.
Each satellite would host TPUs and optical transceivers, connected via dense wavelength-division multiplexing (DWDM) and spatial multiplexing to achieve tens of terabits per second of data transfer, performance comparable to links inside terrestrial data centers. In laboratory tests, Google’s prototype system achieved 1.6 Tbps of total throughput using a single transceiver pair.
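As a rough illustration of how DWDM and spatial multiplexing compound into an aggregate link budget, the sketch below multiplies an assumed per-wavelength rate by assumed channel and transceiver-pair counts. Only the 1.6 Tbps bench result comes from the article; the 8 × 100 Gbps breakdown and the 16-pair scaling are assumptions.

```python
# Toy aggregation model; channel counts are assumptions, only the 1.6 Tbps figure is quoted above.
def aggregate_tbps(transceiver_pairs, wavelengths_per_direction,
                   gbps_per_wavelength, directions=2):
    """Total bidirectional throughput of a free-space optical link, in Tbps."""
    return (transceiver_pairs * wavelengths_per_direction
            * gbps_per_wavelength * directions) / 1000.0

print(aggregate_tbps(1, 8, 100))    # 1.6 Tbps for a single transceiver pair
print(aggregate_tbps(16, 8, 100))   # 25.6 Tbps, i.e. "tens of terabits" with 16 pairs
```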
High-bandwidth inter-satellite links require the satellites to fly in close formation, only hundreds of meters apart, far tighter than existing constellations. Using numerical models based on the Hill-Clohessy-Wiltshire equations and a JAX-based differentiable physics model, Google simulated the constellation’s orbital dynamics at a mean altitude of 650 km. The results indicate that modest station-keeping maneuvers will be sufficient to maintain stable formations.
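For a sense of what such a simulation involves, the sketch below propagates relative satellite motion with the Hill-Clohessy-Wiltshire equations in JAX at the 650 km altitude quoted above, so the trajectory is differentiable with respect to the initial state. It is a minimal illustration rather than Google’s model; the integrator, time step, and 200 m along-track offset are assumptions chosen for the example.

```python
# Minimal, illustrative sketch (not Google's model): relative satellite motion
# under the Hill-Clohessy-Wiltshire (HCW) equations, written in JAX so the
# trajectory is differentiable with respect to the initial state.
import jax
import jax.numpy as jnp

MU_EARTH = 3.986004418e14   # m^3/s^2, Earth's gravitational parameter
R_EARTH = 6378e3            # m, equatorial radius
ALTITUDE = 650e3            # m, mean altitude quoted in the article
N = jnp.sqrt(MU_EARTH / (R_EARTH + ALTITUDE) ** 3)  # mean motion, rad/s

def hcw_derivative(state):
    """Linearized relative dynamics about a circular reference orbit.
    state = [x, y, z, vx, vy, vz]; x radial, y along-track, z cross-track."""
    x, y, z, vx, vy, vz = state
    ax = 3 * N**2 * x + 2 * N * vy
    ay = -2 * N * vx
    az = -(N**2) * z
    return jnp.array([vx, vy, vz, ax, ay, az])

def propagate(state0, dt=1.0, steps=6000):
    """Integrate roughly one orbit (~98 min at 650 km) with fixed-step RK4."""
    def rk4_step(state, _):
        k1 = hcw_derivative(state)
        k2 = hcw_derivative(state + 0.5 * dt * k1)
        k3 = hcw_derivative(state + 0.5 * dt * k2)
        k4 = hcw_derivative(state + dt * k3)
        new_state = state + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
        return new_state, new_state
    final, trajectory = jax.lax.scan(rk4_step, state0, None, length=steps)
    return final, trajectory

# Example: a satellite offset 200 m along-track from the formation center.
state0 = jnp.array([0.0, 200.0, 0.0, 0.0, 0.0, 0.0])
final_state, trajectory = propagate(state0)

# Because the propagator is pure JAX, we can differentiate the final
# along-track separation with respect to the initial conditions -- the kind
# of sensitivity a station-keeping analysis needs.
separation_sensitivity = jax.jacobian(lambda s0: propagate(s0)[0][1])(state0)
print(final_state, separation_sensitivity)
```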
For AI accelerators to function in low Earth orbit, they must withstand intense radiation. Google tested its Trillium v6e Cloud TPU under a 67 MeV proton beam. The results were promising: no hard failures were observed up to 15 krad(Si), well beyond the expected five-year mission exposure, suggesting that TPUs can operate reliably in space with proper shielding.
Historically, high launch costs limited large-scale space infrastructure. With prices declining, however, Google projects that launch costs may drop below $200 per kilogram by the mid-2030s, making space-based AI compute cost-competitive with terrestrial data centers on a per-kilowatt-year basis.
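One simplified way to read "per-kilowatt-year" is to amortize the launch cost of the mass needed per kilowatt of power over the mission lifetime. In the sketch below only the $200 per kilogram projection comes from the article; the specific mass and lifetime are illustrative assumptions.

```python
# Illustrative amortization; only the $200/kg projection is from the article.
launch_cost_per_kg = 200.0        # $/kg, Google's mid-2030s projection
specific_mass_kg_per_kw = 10.0    # kg of satellite per kW of power (assumed)
mission_lifetime_years = 5.0      # operational lifetime (assumed)

launch_cost_per_kw_year = (launch_cost_per_kg * specific_mass_kg_per_kw
                           / mission_lifetime_years)
print(f"${launch_cost_per_kw_year:.0f} per kW-year in launch cost alone")  # $400
```

That figure would then be weighed against the annual energy and infrastructure cost of an equivalent terrestrial kilowatt.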
Google’s next milestone is a learning mission in partnership with Planet Labs, scheduled to launch in early 2027. Two prototype satellites will test TPU performance in space and validate optical inter-satellite communication for distributed machine learning tasks.
Long-term, Google envisions gigawatt-scale constellations integrating compute, solar collection, and thermal management in a unified design, paving the way for a new era of orbital cloud computing.
Project Suncatcher reflects Google’s legacy of tackling bold scientific challenges, similar to its early bets on quantum computing and autonomous vehicles. If successful, it could usher in a new paradigm where AI infrastructure operates beyond Earth, powered by sustainable energy and unconstrained by terrestrial limitations.
This initiative could also redefine the future of cloud architecture, energy sustainability, and global data access, marking a pivotal step toward the era of space-based AI systems.