Clarksville, TN Online: News, Opinion, Arts & Entertainment.


NASA Researchers Develop Artificial Intelligence for Submersibles

 

Written by Andrew Good
NASA’s Jet Propulsion Laboratory

Pasadena, CA – If you think operating a robot in space is hard, try doing it in the ocean.

Saltwater can corrode your robot and block its radio signals.

Kelp forests can tangle it up, and you might not get it back.

Sharks will even try to take bites out of its wings.

The ocean is basically a big obstacle course of robot death. Despite this, robotic submersibles have become critical tools for ocean research. While satellites can study the ocean surface, their signals can’t penetrate the water. A better way to study what’s below is to go there yourself — or send a robot in your place.

JPL’s Steve Chien with several of the underwater drones used in a research project earlier this year. Chien, along with his research collaborators, is developing artificial intelligence for these drones. (NASA/JPL-Caltech)

That’s why a team of researchers from NASA and other institutions recently visited choppy waters in Monterey Bay, California.

Their ongoing research is developing artificial intelligence for submersibles, helping them track signs of life below the waves.

Doing so won’t just benefit our understanding of Earth’s marine environments; the team hopes this artificial intelligence will someday be used to explore the icy oceans believed to exist on moons like Europa.

If confirmed, these oceans are thought to be some of the most likely places to host life in the outer solar system.

A fleet of six coordinated drones was used to study Monterey Bay. The fleet roved for miles seeking out changes in temperature and salinity. To plot their routes, forecasts of these ocean features were sent to the drones from shore.

The drones also sensed how the ocean actively changed around them. A major goal for the research team is to develop artificial intelligence that seamlessly integrates both kinds of data.
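As a rough illustration of that integration — not the team’s actual software — the sketch below shows one simple way a drone might blend a shore-based forecast with its own onboard readings when choosing a next waypoint. The function names, the weighting scheme, and the toy temperature values are all assumptions made for the example.

```python
# Illustrative sketch only: blending a shore forecast with in-situ
# sensor data to pick the most "interesting" nearby grid cell.
# All names and numbers here are invented for illustration.

def blend_estimate(forecast_temp, measured_temp, trust_in_sensor=0.7):
    """Weighted blend of a forecast value and an onboard measurement."""
    return trust_in_sensor * measured_temp + (1 - trust_in_sensor) * forecast_temp

def pick_next_waypoint(candidates, forecast, measurements):
    """Choose the candidate cell whose blended temperature deviates
    most from the local mean, i.e. the strongest apparent feature."""
    blended = {
        cell: blend_estimate(forecast[cell], measurements.get(cell, forecast[cell]))
        for cell in candidates
    }
    mean = sum(blended.values()) / len(blended)
    return max(blended, key=lambda cell: abs(blended[cell] - mean))

# Toy example: a forecast over four grid cells plus one onboard reading.
forecast = {0: 14.0, 1: 14.1, 2: 15.8, 3: 14.2}
measurements = {2: 16.5}  # the drone's own sensor reading at cell 2
print(pick_next_waypoint([0, 1, 2, 3], forecast, measurements))  # -> 2
```

The point of the sketch is only the shape of the problem: a forecast from shore gives a coarse prior, and the drone’s own sensors correct it on the fly.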

“Autonomous drones are important for ocean research, but today’s drones don’t make decisions on the fly,” said Steve Chien, one of the research team’s members. Chien leads the Artificial Intelligence Group at NASA’s Jet Propulsion Laboratory, Pasadena, California. “In order to study unpredictable ocean phenomena, we need to develop submersibles that can navigate and make decisions on their own, and in real-time. Doing so would help us understand our own oceans — and maybe those on other planets.”

Other research members hail from Caltech in Pasadena; the Monterey Bay Aquarium Research Institute, Moss Landing, California; Woods Hole Oceanographic Institution, Woods Hole, Massachusetts; and Remote Sensing Solutions, Barnstable, Massachusetts.

If successful, this project could lead to submersibles that can plot their own course as they go, based on what they detect in the water around them. That could change how we collect data, while also developing the kind of autonomy needed for planetary exploration, said Andrew Thompson, assistant professor of environmental science and engineering at Caltech.

“Our goal is to remove the human effort from the day-to-day piloting of these robots and focus that time on analyzing the data collected,” Thompson said. “We want to give these submersibles the freedom and ability to collect useful information without putting a hand in to correct them.”

At the smallest levels, marine life exists as “biocommunities.” Nutrients in the water are needed to support plankton; small fish follow the plankton; big fish follow them. Find the nutrients, and you can follow the breadcrumb trail to other marine life.

But that’s easier said than done. Those nutrients are swept around by ocean currents, and can change direction suddenly. Life under the sea is constantly shifting in every direction, and at varying scales of size.
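That “breadcrumb trail” idea can be sketched as a simple greedy climb on a nutrient field: sample nearby, move toward higher concentration, stop at the densest patch. This is a hedged toy example, not the researchers’ method — the field, the grid, and the stopping rule are all invented for illustration.

```python
# Illustrative sketch only: a greedy "breadcrumb" follower that climbs
# a nutrient field by moving to the neighboring grid cell with the
# highest sampled concentration. The field is a toy stand-in.

def nutrient(x, y):
    # Toy nutrient field peaking at grid cell (5, 5).
    return -((x - 5) ** 2 + (y - 5) ** 2)

def follow_gradient(start, steps=20):
    x, y = start
    path = [(x, y)]
    for _ in range(steps):
        # Sample the four neighboring cells, like short survey legs.
        neighbors = [(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)]
        best = max(neighbors, key=lambda p: nutrient(*p))
        if nutrient(*best) <= nutrient(x, y):
            break  # local maximum: the densest patch found
        x, y = best
        path.append((x, y))
    return path

print(follow_gradient((0, 0))[-1])  # -> (5, 5)
```

The catch the article describes is exactly what this toy ignores: real nutrient fields are swept around by currents, so the peak moves while the robot is still climbing.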

“It’s all three dimensions plus time,” Chien said about the challenges of tracking ocean features. “Phenomena like algal blooms are hundreds of kilometers across. But small things like dinoflagellate clouds are just dozens of meters across.”

It might be easy for a fish to track these features, but it’s nearly impossible for an unintelligent robot.

“Truly autonomous fleets of robots have been a holy grail in oceanography for decades,” Thompson said. “Bringing JPL’s exploration and AI experience to this problem should allow us to lay the groundwork for carrying out similar activities in more challenging regions, like Earth’s polar regions and even oceans on other planets.”

The recent field work at Monterey Bay was funded by JPL and Caltech’s Keck Institute for Space Studies (KISS). Additional research is planned in the spring of 2017.

Caltech in Pasadena, California, manages JPL for NASA.

For more information about this research, visit:

http://kiss.caltech.edu/new_website/techdev/seafloor/seafloor.html

