New Delhi
Google has announced a new moonshot project, ‘Suncatcher’, under which it will build AI data centers in space. Google CEO Sundar Pichai shared the news in a post on the social media platform X.
Under this project, the company will send solar-powered satellites into space equipped with Google’s Trillium TPUs. Connected by free-space optical links, these satellites will scale AI compute beyond Earth.
In other words, AI workloads will run on electricity generated directly from sunlight. Google claims that solar panels in space can produce up to 8 times more power than on Earth, and that battery requirements will also be lower.
What did Sundar Pichai say in his post on X?
In his post on X, Sundar Pichai said the company’s TPUs are heading to space, and that Google is pursuing Suncatcher as a moonshot, alongside projects like quantum computing and autonomous driving.
The sun delivers more than 100 trillion times the power of humanity’s total electricity production. TPUs tolerated radiation in initial tests, and two prototype satellites will be launched with Planet Labs in early 2027.

What is Project Suncatcher, and how will it work?
Project Suncatcher is a Google research project in which small satellites will be launched into a sun-synchronous low Earth orbit (SSO), where sunlight is available almost continuously. Each satellite will carry solar panels and Google’s Trillium TPU chips, which are designed for AI training. The satellites will be interconnected by optical links capable of terabits-per-second data rates.
Google said a cluster of 81 satellites will fly within a radius of just 1 kilometer, making inter-satellite data transfer easier. Solar power is available almost continuously in this orbit, reducing the need for batteries. In initial tests, the company achieved 1.6 Tbps of bidirectional link speed. The cluster, flying about 400 miles up, would handle large ML workloads, easing the pressure on electricity, water and land on Earth.
Why space? What is the problem on Earth?
Training AI models takes enormous amounts of energy, and the strain on electricity, water and land for data centers on Earth is growing. Google’s senior director Travis Beals said the sun is the ultimate energy source of our solar system, providing 100 trillion times more power than humanity’s total electricity production.
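The “100 trillion times” figure can be sanity-checked with rough public numbers: the Sun’s total luminosity of about 3.8×10²⁶ W and global electricity generation of roughly 30,000 TWh per year. Both inputs below are approximations for illustration, not figures from Google’s paper:

```python
# Back-of-envelope check of the "100 trillion times" claim.
# Both input numbers are rough public figures, not from Google's paper.

SUN_LUMINOSITY_W = 3.8e26                # total power output of the Sun, watts
WORLD_ELECTRICITY_TWH_PER_YEAR = 30_000  # approx. global electricity generation

HOURS_PER_YEAR = 365.25 * 24
# Convert yearly generation (TWh) to average continuous power (W)
world_avg_power_w = WORLD_ELECTRICITY_TWH_PER_YEAR * 1e12 / HOURS_PER_YEAR

ratio = SUN_LUMINOSITY_W / world_avg_power_w
print(f"Sun / world electricity ≈ {ratio:.1e}")  # on the order of 1e14, i.e. ~100 trillion
```

The ratio comes out near 10¹⁴, which matches the claim to within rounding.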
Solar panels in space would be up to 8 times more productive and would provide nearly continuous power, which would also reduce the carbon footprint. Google believes launch costs will fall to around $200 per kg by the mid-2030s, at which point the cost of a space-based data center could be roughly comparable to one on Earth.
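To see why the launch-cost number matters, here is an illustrative comparison of launch cost against a terrestrial electricity bill for the same delivered power. Every input except the $200/kg figure from the article (specific power, lifetime, electricity price) is an assumption chosen only for illustration:

```python
# Illustrative back-of-envelope comparison. Only the $200/kg launch cost
# comes from the article; the other inputs are assumptions for illustration.

LAUNCH_COST_PER_KG = 200.0          # projected launch cost (from the article)
SPECIFIC_POWER_W_PER_KG = 300.0     # assumed solar-array specific power
LIFETIME_YEARS = 5.0                # assumed satellite service life
GROUND_PRICE_PER_KWH = 0.10         # assumed terrestrial electricity price, $/kWh

HOURS_PER_YEAR = 365.25 * 24
launch_cost_per_w = LAUNCH_COST_PER_KG / SPECIFIC_POWER_W_PER_KG
ground_cost_per_w = (GROUND_PRICE_PER_KWH / 1000) * HOURS_PER_YEAR * LIFETIME_YEARS

print(f"launch cost per watt delivered:   ${launch_cost_per_w:.2f}")
print(f"ground electricity per watt, 5yr: ${ground_cost_per_w:.2f}")
```

Under these assumed numbers, launching a watt of solar capacity (under $1) costs far less than buying that watt continuously from the grid for five years (over $4), which is the intuition behind the cost-parity claim.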
Technical challenges: protecting TPUs from radiation
Space has high radiation levels that can damage chips. Google tested a Trillium TPU in a particle accelerator with a 67 MeV proton beam. The results were promising: the chip tolerated a total dose of up to 15 krad(Si), though the High Bandwidth Memory (HBM) proved to be the most radiation-sensitive component.
For the optical links to work, the satellites must fly in a tight formation. Google models this relative orbital motion using the Hill-Clohessy-Wiltshire equations in a JAX-based simulator. Thermal management and high-bandwidth ground communication are also major challenges. Travis Beals said there are no fundamental physics or economic barriers to the core concepts; only engineering challenges remain.
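The Hill-Clohessy-Wiltshire (HCW) equations linearize the motion of one satellite relative to another in a near-circular orbit, which is what makes tight formations tractable to model. The sketch below is a minimal plain-Python illustration, not Google’s actual JAX simulator; the 650 km altitude is an assumption roughly matching the article’s “400 miles”:

```python
# Minimal sketch of the Hill-Clohessy-Wiltshire (HCW) equations for relative
# satellite motion. Illustration only; Google's actual formation model is a
# JAX-based simulator described in its preprint.
import math

MU_EARTH = 3.986004418e14         # Earth's gravitational parameter, m^3/s^2
ALTITUDE_M = 650e3                # assumed ~650 km orbit (roughly "400 miles")
R_ORBIT = 6371e3 + ALTITUDE_M     # orbital radius from Earth's center, m
n = math.sqrt(MU_EARTH / R_ORBIT**3)   # mean motion, rad/s

def hcw_accel(x, y, z, vx, vy, vz):
    """Relative acceleration in the local radial/along-track/cross-track frame."""
    ax = 3 * n**2 * x + 2 * n * vy
    ay = -2 * n * vx
    az = -(n**2) * z
    return ax, ay, az

def propagate(state, dt=1.0, steps=600):
    """Crude forward-Euler propagation of a relative state for steps*dt seconds."""
    x, y, z, vx, vy, vz = state
    for _ in range(steps):
        ax, ay, az = hcw_accel(x, y, z, vx, vy, vz)
        vx += ax * dt; vy += ay * dt; vz += az * dt
        x += vx * dt;  y += vy * dt;  z += vz * dt
    return x, y, z, vx, vy, vz

# A satellite offset 100 m purely along-track with zero relative velocity is an
# equilibrium of the HCW model: it stays 100 m away, which is why tight
# along-track formations are feasible with little station-keeping.
final = propagate((0.0, 100.0, 0.0, 0.0, 0.0, 0.0))
print(final)
```

A pure along-track offset produces zero relative acceleration in the HCW model, so the satellite holds its position in the formation; radial or cross-track offsets, by contrast, induce oscillation and drift.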
First test in 2027, partnership with Planet Labs
Google will launch two prototype satellites with Planet Labs in early 2027 to test the TPU hardware, optical links and models in space. Gigawatt-scale constellations could follow. Full details are given in Google’s preprint paper.
If successful, AI training will move to space
If the project succeeds, AI training could be done from space, with large ML workloads handled easily. It would conserve resources on Earth and reduce environmental impact, and space-based compute should get cheaper as launch costs fall and solar efficiency improves. Experts believe space data centers could become a reality by 2035.