“Passengers”: Artificial intelligence in space flight
How machines could detect errors in the future
In the science fiction blockbuster “Passengers”, the Avalon spaceship is equipped with a technology that lets the crew detect faults by means of a hologram. In reality, Bosch expert Samarjit Das and his team are working on an early-warning system for every type of machine. The first tests are taking place where “Passengers” is set – in space.
AI in the science fiction adventure “Passengers”
In “Passengers”, the dream of life on the distant planet “Homestead II” is a 120-year space flight away from Earth. The 5,000 passengers aboard the Avalon spaceship spend the journey in a so-called “hypersleep”. On the way, the spaceship gets caught in an asteroid field and is severely damaged in a collision. As a consequence, the mechanical engineer Jim wakes from hypersleep 90 years too early. Jim in turn wakes another passenger, Aurora.
In one of the film’s key scenes, the two passengers manage to gain an overview of the state of the spaceship with the help of an onboard computer that localizes system failures and displays them by means of a hologram. The information from these intelligent devices gives them the chance to save themselves and the rest of their fellow travelers.
With SoundSee, we are researching the potential to detect machine faults on the International Space Station at an early stage by means of sound analysis.
What’s unique about the tests in space?
A test in space: the “SoundSee” acoustic sensor system developed by Bosch will soon be sent to the crewed ISS. Using sound recognition, the technology is designed to determine whether the spacecraft’s critical machines are running smoothly. SoundSee thus functions as intelligent machine listening technology: it records sounds and sends them down to Earth to a Bosch research team, which then performs AI-based sound analysis to evaluate the performance of SoundSee’s machine health monitoring capabilities. The result is, quite literally, still written in the stars, since SoundSee will be tested for the first time onboard the ISS. “The cooperation with NASA is a big chance for the further development of SoundSee,” says Samarjit Das.
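To give a rough idea of what such a ground-side analysis pipeline can look like, here is a minimal Python sketch: it loads a downlinked audio clip, converts it to a log-mel spectrogram, and summarizes the clip as a feature vector that a downstream model could score. The file name, sampling rate, and feature choices are illustrative assumptions, not details of the actual SoundSee system.

# Minimal sketch: turn a downlinked audio clip into features for analysis.
# Assumptions: "pump_recording.wav" is a hypothetical file name; 16 kHz and
# 64 mel bands are illustrative choices, not SoundSee's actual settings.
import librosa
import numpy as np

y, sr = librosa.load("pump_recording.wav", sr=16000)         # load mono audio
mel = librosa.feature.melspectrogram(y=y, sr=sr, n_mels=64)  # mel spectrogram
log_mel = librosa.power_to_db(mel, ref=np.max)               # convert to dB

# Summarize the clip as one feature vector (mean and spread per mel band),
# which a downstream model could score for signs of machine trouble.
features = np.concatenate([log_mel.mean(axis=1), log_mel.std(axis=1)])
print(features.shape)  # -> (128,)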
On the space station, the SoundSee device will ride on a small NASA-built robot called Astrobee, which, unlike a robotic drone here on Earth, floats almost silently around the ISS. The zero gravity onboard makes Astrobee’s “free-flying” possible and gives it a unique vantage point and mobility for autonomous acoustic monitoring: “On earth, flying robots like drones are just too noisy as a monitoring platform and current technology hasn’t been able to address that yet,” explains Dr. Das.
Artificial intelligence for audio analysis
In the best-case scenario, the audio data from the ISS will be good enough to train an artificial intelligence (AI). The algorithm learns the specific sound patterns of faulty machines; better still, it could learn what a machine sounds like when a failure is impending and what exactly the root cause is. “Through SoundSee, we would like to localize the faults as soon as possible,” says Dr. Das. If SoundSee comes through the test phase successfully, unmanned space stations could also profit from the technology in the future, since machine faults could be detected directly on the station without any human intervention. Such a technology would have been ideal for our heroes in “Passengers”: the impending system failure would have been detected earlier, and fewer machines would have sustained damage.
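One common way to build such an early-warning model, sketched below under the assumption that mostly “healthy” recordings are available, is to train an anomaly detector on feature vectors from normal operation and flag clips that deviate from that baseline. The IsolationForest detector and the synthetic feature vectors are illustrative choices for the example, not the method Bosch actually uses.

# Minimal sketch: learn what "normal" machine sound looks like, then flag
# deviations. The random vectors stand in for per-clip audio features
# (e.g., the log-mel summaries above); they are not real SoundSee data.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
normal_clips = rng.normal(0.0, 1.0, size=(500, 128))  # features from healthy runs
new_clip = rng.normal(3.0, 1.0, size=(1, 128))        # a clip that sounds "off"

detector = IsolationForest(contamination=0.01, random_state=0)
detector.fit(normal_clips)

# decision_function: higher means more normal, lower means more anomalous
score = detector.decision_function(new_clip)[0]
label = detector.predict(new_clip)[0]                 # +1 normal, -1 anomaly
print(f"anomaly score {score:.3f}, label {label}")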
From space to earth
Although SoundSee’s machine listening technology is not yet being deployed for terrestrial business applications, researchers already have some specific applications in mind. They range from predictive maintenance in Industry 4.0 to smart automotive and building technologies. For example, the technology can recognize characteristic noises such as a pane of glass shattering and then alert users to potentially critical situations. In automotive applications, the SoundSee team is working with Bosch engineers to make self-driving cars capable of automatically reacting to the sirens of rescue or police vehicles and making way for them.
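As a rough illustration of how such sound-event recognition can work, the sketch below trains a small classifier to distinguish labeled sound classes (for example “background”, “glass break”, “siren”) from per-clip audio features. The classes, features, and classifier are assumptions made for the example rather than Bosch’s actual implementation.

# Minimal sketch: supervised sound-event classification on per-clip features.
# Random vectors stand in for real audio features (e.g., MFCC summaries).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
classes = ["background", "glass_break", "siren"]

# Toy dataset: 100 clips per class, each summarized by 40 features, with a
# different mean per class so the example problem is learnable.
X = np.vstack([rng.normal(i, 1.0, size=(100, 40)) for i in range(len(classes))])
y = np.repeat(np.arange(len(classes)), 100)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)

pred = clf.predict(X_test[:1])[0]
print("predicted event:", classes[pred], "accuracy:", clf.score(X_test, y_test))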
Dr. Samarjit Das,
Principal Researcher and Senior Manager at the Bosch Research and Technology Center in Pittsburgh, PA, USA.
Samarjit Das received his undergraduate degree in Electronics and Communications Engineering in 2006 from the Indian Institute of Technology Guwahati (IITG), India. He then completed his PhD in Electrical Engineering at Iowa State University, Ames, IA, USA, and a postdoc at the Robotics Institute, Carnegie Mellon University in Pittsburgh. He has worked for Bosch since 2013 and heads a research group that concentrates on the interface between artificial intelligence (AI) and the Internet of Things (IoT). Dr. Das is the project lead for the SoundSee mission to the International Space Station (ISS) and also heads the collaboration with NASA.
Summary
Science Fiction or Science Fact? Although SoundSee does not visualize spacecraft defects by means of a hologram as in the “Passengers” movie, it could nevertheless play a part in detecting and localizing them early by using audio analysis. If the Bosch developers successfully implement their plans, the result could be an artificial intelligence that warns of mechanical problems at an early stage. SoundSee could then be employed not only in space but also here on Earth.