
How Autonomous Technology Helps Tackle the Monumental Task of Mapping the Seabed

Consider the hours of labor required to map an as-yet uncharted area of the ocean floor. You’d have to sail for weeks just to reach some of the waters to be mapped, and a crewed vessel heading out into the big wide open needs to carry enough fuel for the boat and supplies for the crew.

Since oceans cover 71% of the earth’s surface, the process would take ages. Literally. Larry Mayer, the Director of the Center for Coastal and Ocean Mapping at the University of New Hampshire, has run the calculations and estimates that it would take hundreds of millions of hours to create high-resolution maps of the seabed. In short, without a significant strategy shift, it won’t happen any time soon.

That’s why oceanographers are turning to autonomous technology and drawing inspiration from the development of autonomous cars. By using robotic, wind- and solar-powered vessels, Mayer notes that “you can be out for six, seven months at a time working 24 hours a day. There’s a phenomenal efficiency gain, in terms of the amount of mapping you can do.”

Because of the difficulty, only 20.6% of the ocean floor has been mapped so far, and that represents a huge advancement since 2018, when just 6.9% had been charted. Yet that figure pales in comparison to the almost 90% coverage we have of the surface of Mars.

There are urgent reasons to play catch-up, the overriding one being that we live here, not on the Red Planet, and have a real need to understand the dynamics of the planet we inhabit. “Mapping the oceans underpins all of the science, discovery, and research related to our Blue Economy,” explains Rachel Medley, chief of the National Oceanic and Atmospheric Administration’s (NOAA) Expeditions and Exploration Division. That includes everything from shipping to seafood production, energy production to tourism, and recreation to environmental protection.

The situation is akin to the early days of driving before there were street maps, let alone highly accurate GPS navigation. It’s hard to understand where to go—or what to protect—if we don’t know what’s there. That’s why the UN has set a target date of 2030 to create high-resolution 3D bathymetric charts showing the topographic features and elevations of the entirety of the ocean floor.

“We have a tremendous amount to learn from the automotive industry,” says Mayer, “in terms of the overall technologies of machine learning and situational awareness.” Mayer works with NOAA and Seabed 2030, a collaborative project to produce the definitive map of the ocean floor, and says that land-based autonomous vehicles (AVs) are seeking to solve the same complex problems. “We [both] need to have precise situational awareness in terms of other vessels, in terms of hazards, and in terms of obstacles,” he says.

Oceanographers use two types of autonomous vessels: autonomous underwater vehicles, or AUVs, and uncrewed surface vessels, or USVs. Each serves a specific purpose, explains Captain William Mowitt, Deputy Director of the NOAA’s Uncrewed System Operations Center. “A USV can cover a wide area, but it won’t always get the detail that we need. For more detailed information, we use AUVs. They can get really close to the ocean floor, and bring back pictures and other information about what’s down there in addition to just the depths.”

Ocean mapping vessels may operate in much less congested areas than autonomous cars, but the surface vessels face the challenge of working in a highly variable environment. “In a city, autonomous vehicles operate in a rigid framework,” explains Mayer. “But we have a fluctuating background where waves move constantly, vertically, horizontally, rolling, and shifting. A USV moves around much more than an autonomous vehicle. And the entire background does the same in a random range of conditions from flat calm to ten-foot waves, making it difficult to know if that’s a small vessel approaching, or a white cap.”

As for AUVs, detection and perception are difficult below the surface. “Light-based mapping tools don’t work underwater,” explains Mayer, “and echo-based mapping is expensive, time-consuming, and unclear, because it diverges in the water and becomes quickly distorted.”

Autonomous cars use a range of sensors, including lidar for close-range detection, as well as dual-range radar and 360-degree camera systems, to perfect high-resolution 3D maps that capture in intricate detail the environments where they plan to safely deploy their fleets, a goal oceanographers share. “We try to incorporate as much situational awareness information as we can,” says Mayer, “but it’s this dynamic environment that’s a challenge.” Since AUVs operate in much lower visibility conditions than their land-roving counterparts, oceanographers are also exploring other sensors, such as forward-looking infrared (FLIR) cameras, which detect infrared radiation.

The big blue unknown

It’s the best part of a century since echo sounding—the use of sonar to measure water depth with sound waves—began to replace the ancient bathymetric technique of lowering lead on a line over the ship’s side and waiting for it to connect with the bottom. So why is it so difficult to map the ocean?

Firstly, it’s deep. Really deep. If Mount Everest (elevation: 29,032 feet) stood in the ocean’s deepest point, its peak would still be more than a mile underwater. In general, the ocean floor is way down there: the average depth is 13,123 feet.
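The arithmetic behind that comparison is easy to check. As a quick sketch (the figure of roughly 36,000 feet for the Challenger Deep, the ocean's deepest point, is a commonly cited approximation, not a number from this article):

```python
# Sanity-check the Everest comparison. The Challenger Deep figure is an
# assumption: roughly 36,000 ft is a commonly cited approximate depth.
EVEREST_FT = 29_032
CHALLENGER_DEEP_FT = 36_000  # approximate
MILE_FT = 5_280

# How far below the surface Everest's summit would sit.
summit_below_surface_ft = CHALLENGER_DEEP_FT - EVEREST_FT
print(summit_below_surface_ft)            # 6968
print(summit_below_surface_ft > MILE_FT)  # True: more than a mile underwater
```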

But it’s not just deep, it’s also dark, and far from crystal clear. Light can only penetrate clear water to a certain depth; Medley explains that shorelines can be mapped using bathymetric lidar, which can penetrate water and reflect off objects up to 65 feet below the surface, but any deeper, and light becomes attenuated—that is, the intensity of light decreases with depth. “We therefore can’t use the primarily visual technology we use on land or to map the surface of Mars,” explains Mowitt. “We have to use sonar. So in some ways, mapping the ocean has challenges that mapping the surface of Mars doesn’t have—although don’t get me wrong, mapping Mars has its own challenges!”
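The attenuation Medley describes is an exponential fall-off with depth. A minimal sketch using the Beer–Lambert relation (the attenuation coefficient here is a hypothetical value for illustration, not a figure from the article):

```python
import math

def light_intensity(depth_m: float, surface_intensity: float = 1.0,
                    attenuation_per_m: float = 0.1) -> float:
    """Beer-Lambert decay: light intensity falls off exponentially with depth.

    attenuation_per_m is a hypothetical coefficient chosen for illustration;
    real values vary widely with water clarity and wavelength.
    """
    return surface_intensity * math.exp(-attenuation_per_m * depth_m)

# At about 20 m (near the 65-foot bathymetric-lidar limit), only ~13.5% of
# surface light remains with this coefficient.
print(light_intensity(20.0))
```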

And lastly, the vast area covered by the Earth’s oceans makes them difficult to map. “Google Maps is made up of satellite imagery, and for the earth part of Google Maps, we have spectacular resolution,” notes Mayer, who also helps run Seabed 2030’s Arctic and North Pacific Ocean Regional Center. “Unless we can fundamentally change the laws of physics and make electromagnetic waves propagate through the ocean for tens of kilometers, we have to depend on sonar.”
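The depth calculation behind the sonar that mapping depends on is simple in principle: a ping travels to the seafloor and back, so depth is half the round-trip time multiplied by the speed of sound in water. A rough sketch (the sound-speed value is a typical approximation; in practice it varies with temperature, salinity, and pressure):

```python
# Rough sketch of the arithmetic behind echo sounding (illustrative only).
SPEED_OF_SOUND_SEAWATER_M_S = 1500.0  # typical value; varies in real water

def depth_from_echo(round_trip_seconds: float,
                    sound_speed_m_s: float = SPEED_OF_SOUND_SEAWATER_M_S) -> float:
    """Estimate water depth in meters from a sonar ping's round-trip time."""
    return sound_speed_m_s * round_trip_seconds / 2.0

# A ping returning after 5.33 seconds implies a depth of about 4,000 m,
# close to the ocean's average depth of 13,123 feet.
print(depth_from_echo(5.33))  # 3997.5
```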

Autonomy on the surface, and underwater

The importance of ocean knowledge, the magnitude of the challenge, and the complexities of reaching such depths to understand the creatures and features down there are why millions of dollars are now being poured into ocean-mapping projects using autonomous technology.

A 3D visualization of deep-ocean data from Terradepth, a company that develops autonomous submersible vehicles. Photo Credit: Terradepth

Two AUV makers, Bedrock and Terradepth, each raised $8 million in seed funding recently. Meanwhile, up on the surface, NOAA is working closely with USV maker Saildrone, the company behind a robust autonomous sailboat known as the Saildrone Surveyor. Saildrone recently raised $100 million in venture funding, not long after being awarded roughly $1 million by Google for a mission to explore the impact of the Gulf Stream on weather forecasting and global carbon models.

The Saildrone Surveyor makes its way to Honolulu (left) while capturing data to help map the ocean floor (visualized right). Photo Credit: Courtesy Saildrone

Saildrone made headlines twice in 2021. In July, the Surveyor completed its maiden autonomous voyage from San Francisco to Hawaii, a 28-day, 2,250-nautical-mile journey during which it mapped 6,400 square nautical miles of seafloor. And then in September, one of its drones sailed into the eye of Hurricane Sam, capturing stunning footage.

The maps being so painstakingly produced by the world’s oceanographers are ground-truthed by rare crewed visits to the deepest ocean floors; the Mariana Trench has only ever been visited by Jacques Piccard and Don Walsh in 1960, James Cameron in 2012, and Victor Vescovo in 2019, as part of his record-breaking Five Deeps Expedition to the deepest point of each of the five oceans.

But the rarity of such visits underlines the difficulty of using crewed vessels to get the job done. That’s why it’s essential that scientists and engineers collaborate to develop autonomous solutions—and the fact that the clock is ticking increases the pressure.

On dry land, we take high-resolution digital maps for granted, but less than two decades ago we still relied on paper maps. Google Maps was launched in 2005, and 14 years later the company announced that Google Earth had high-definition coverage of 98% of the inhabited Earth and over 10 million miles of Street View imagery. Considering that visibility and accessibility played a key role in that achievement, and that land covers less than 30% of the Earth’s surface, it’s easy to see that oceanographers have their work cut out for them.

“Yeah, it’s difficult,” says Mayer, “but we can do it.” It will, of course, come at a price: Mayer estimates it would cost between $3 billion and $5 billion using current technology to map the entire seafloor. “That’s a lot of money. But that’s one Mars mission. How many missions have we had to map Mars? Don’t we owe it to ourselves to just turn one of those missions back down to planet Earth and map our own planet?”
