In an initiative energized by Google Vice-President and Chief Internet Evangelist Vint Cerf, the International Space Station could be testing a brand-new way of communicating with Earth. In 2009, it is hoped that the ISS will play host to an Interplanetary Internet prototype that could standardize communications between Earth and space, possibly replacing the point-to-point, single-use radio systems that have been customized for each individual space mission since the beginning of the Space Age.
This partnership opens up some exciting new possibilities for the future of communicating across the vast distances of the Solar System. Manned and robotic spacecraft will be interconnected via a robust interplanetary network, without the problems associated with incompatible communication systems…
“The project started 10 years ago as an attempt to figure out what kind of technical networking standards would be useful to support interplanetary communication,” Cerf said in a recent interview. “Bear in mind, we have been flying robotic equipment to the inner and outer planets, asteroids, comets, and such since the 1960s. We have been able to communicate with those robotic devices and with manned missions using point-to-point radio communications. In fact, for many of these missions, we used a dedicated communications system called the Deep Space Network (DSN), built by JPL in 1964.”
Indeed, the DSN has been the backbone of interplanetary communications for decades, but an upgrade is now required as we have a growing armada of robotic missions exploring everything from the surface of Mars to the outermost regions of the Solar System. Wouldn’t it be nice if a communication network could be standardized before manned missions begin moving beyond terrestrial orbit?
On the surface, at least to this observer, the concept makes good, logical sense.
I cannot make any additional, knowledgeable comments because my expertise in InnerTube Networking is limited at best, even though I am an experienced ‘user’. I simply find the ‘architecture’ aspect overwhelming.
Okay, I’ll make a guess (so I lied about not commenting). From what I gather, each planet, moon, artificial satellite and probe will have its own individual ‘Internet.’ Each local network will then send time-delayed TCP/IP ‘packets’ to each other, thus linking up to the major Earth Google-Plex.
The deal-breaker is the light-speed delay, but this should be negated somewhat by a hardy, ‘time-delayed’ TCP/IP-like protocol.
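The “hold the data until a link is available” idea being guessed at here is essentially store-and-forward, which is how delay-tolerant networking handles light-speed lag: each hop takes custody of the data and keeps it until a contact window to the next hop opens, so no end-to-end connection is ever needed. A minimal toy sketch of that idea (not the real protocol; the node names, classes, and message are invented for illustration):

```python
# Toy store-and-forward relay: each node holds bundles until a
# contact window to the next hop opens. Purely illustrative; this
# is not an implementation of any real space-networking protocol.

from dataclasses import dataclass
from collections import deque

@dataclass
class Bundle:
    source: str
    destination: str
    payload: str

class Node:
    """A node that stores bundles until a link to the next hop opens."""
    def __init__(self, name):
        self.name = name
        self.queue = deque()   # bundles held in local storage
        self.delivered = []    # bundles that terminated here

    def receive(self, bundle):
        if bundle.destination == self.name:
            self.delivered.append(bundle)  # we are the endpoint
        else:
            self.queue.append(bundle)      # take custody until a contact opens

    def contact(self, next_hop):
        """A link window opens: flush everything we hold to next_hop."""
        while self.queue:
            next_hop.receive(self.queue.popleft())

# Hypothetical hops: Earth -> relay orbiter -> Mars lander.
earth, orbiter, lander = Node("earth"), Node("orbiter"), Node("lander")
earth.receive(Bundle("earth", "lander", "command: take panorama"))

# No end-to-end path ever exists, yet the data still arrives hop by hop.
earth.contact(orbiter)     # first contact window
orbiter.contact(lander)    # later contact window
print(lander.delivered[0].payload)   # -> command: take panorama
```

The key contrast with ordinary TCP/IP is that nothing here waits for an acknowledgement across the whole path; each hop only needs a working link to its neighbour, however briefly.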
It would seem to me that this would require more memory packed into ever-smaller physical entities.
Quantum computing to the rescue?
Or perhaps the GooglePlex AI needs to happen first?