The promise is awesome: thousands of sailors voluntarily collecting valuable ocean data as they cross the seas — work that would cost hundreds of millions of dollars if research vessels were used. But there’s peril: Can you trust the data?
"There is always an element of ‘data unreliability’ when many different people are collecting data," admits Federico Lauro, the leader behind the project to crowdsource sea science described this week in the journal PLOS Biology.
But the team has an answer: Keep it simple. "Our approach is to use automated instrumentation that will self-collect samples and eliminate the ‘human error’ aspect," says Lauro, a marine microbiology professor at Australia's University of New South Wales.
The concept was tested last year during a four-month, 5,800-nautical-mile sail across the Indian Ocean organized by the nonprofit Indigo V Expeditions.
Crowdsourcing with technology has been around since the 1990s, when astronomers turned to public PCs to crunch data in the search for extraterrestrial life. But using people power to monitor the oceans has been limited, especially when it comes to the Indigo team's focus: sampling the health and distribution of marine microbes, the tiny bacteria and plankton that form the foundation of the ocean food web.
"Citizen science is tremendously promising in the sense that in order to better understand complex ecosystems, scientists need hundreds of thousands, if not millions of data points," Lauro says. "Collecting this much data with a small team is simply impossible. So critical, large-scale questions about the environment remain unanswered and largely unknown."
The cost savings are compelling since a traditional research ship can cost $20,000-$30,000 a day to operate. A recent European ocean survey project cost $23 million to collect samples at 180 stations, the team noted, while the Indigo expedition got samples at 50 stations for less than $75,000.
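The per-station economics work out roughly as follows, in a back-of-the-envelope sketch that uses only the figures quoted above:

```python
# Back-of-the-envelope comparison of per-station sampling costs,
# using only the figures cited in the article.

traditional_cost = 23_000_000   # European ocean survey, USD
traditional_stations = 180

indigo_cost = 75_000            # Indigo V expedition, USD (upper bound)
indigo_stations = 50

per_station_traditional = traditional_cost / traditional_stations
per_station_indigo = indigo_cost / indigo_stations

print(f"Traditional: ~${per_station_traditional:,.0f} per station")
print(f"Indigo V:    ~${per_station_indigo:,.0f} per station")
print(f"Ratio:       ~{per_station_traditional / per_station_indigo:.0f}x cheaper")
```

That works out to roughly $128,000 per station for the traditional survey against $1,500 per station for the sailing expedition, a difference of about 85 to 1.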
"An entire four-month expedition … costs the equivalent of a day or two of ship time aboard an oceanographic research vessel," the project leaders wrote. "Imagine what the thousands of yachts that are already out on the water could do."
Next steps include designing, by next year, "an ocean sampling microbial observatory" that's rugged, small and easy to deploy. Composed of researchers at 12 universities around the world, the team aims to bring the costs down to $1,500 for collecting, processing and sequencing each sample.
The team also hopes to get more sponsors on board, including the U.S. National Oceanic and Atmospheric Administration.
It could help that some NOAA ocean researchers are already fans of crowdsourcing.
"This kind of citizen science has certainly worked for me," says Jim Manning, a researcher at NOAA's Northeast Fisheries Science Center who works with fishermen to record seabed temperatures by attaching a wireless device to lobster traps.
Developed with NOAA funding, the tool is significantly cheaper than having a research vessel do the same thing.
Manning is now sharing that crowdsourcing experience with a broader NOAA program researching the impact of climate change on fish stocks. "They are suggesting we do more cooperative research with fishermen," he says, and crowdsourcing figures into that.
NOAA's Marine Debris Program also uses crowdsourcing, via a debris-tracking phone app developed with the University of Georgia.
"I use those citizen-collected data to get an understanding of where people are using the tracker, to understand the types of debris that show up frequently at a given location, and also as an education tool," says Jason Rolfe, the Southeast and Caribbean coordinator for NOAA's Marine Debris Program.
The University of Georgia developers also leveraged technology to keep the process simple.
"For example, our app automatically logs the GPS coordinates so we get debris location without the user having to do anything to give those to us," says Jenna Jambeck, a UGA assistant professor of environmental engineering.
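The pattern Jambeck describes, auto-capturing metadata so volunteers can't get it wrong, can be sketched roughly like this. This is a hypothetical illustration, not the tracker app's actual code; the `DebrisReport` record and the `get_gps_fix` helper standing in for a phone's location service are assumptions:

```python
# Hypothetical sketch of auto-captured metadata in a citizen-science
# debris report. get_gps_fix() stands in for a phone's location API;
# it is an assumed helper, not part of the real tracker app.
from dataclasses import dataclass, field
from datetime import datetime, timezone

def get_gps_fix():
    """Placeholder for the device's location service."""
    return (31.9686, -81.0998)  # example coordinates only

@dataclass
class DebrisReport:
    debris_type: str                 # the one field the volunteer supplies
    count: int = 1
    # Location and time are filled in automatically at creation,
    # eliminating transcription errors by the user.
    latitude: float = field(default_factory=lambda: get_gps_fix()[0])
    longitude: float = field(default_factory=lambda: get_gps_fix()[1])
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

report = DebrisReport(debris_type="plastic bottle")
print(report.latitude, report.longitude)  # logged with no user input
```

The design choice is the same one Lauro's team makes with its automated samplers: the fewer fields a volunteer has to fill in by hand, the fewer opportunities for human error to creep into the dataset.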
Training and well-described protocols are also a must for projects in which users are asked to do more, Jambeck notes.
Moreover, a project needs to be clear on its limits, she says. In the case of the tracker app, the goal is not to provide a global picture of debris since data entry relies on volunteers. But it can provide detailed looks over time at specific areas adopted by monitoring teams.
Knowing the limitations is critical, agrees John Hersey, who is part of a team at the engineering firm Survice that worked with NOAA to develop a crowdsourcing system to measure ocean water depth.
"You may not be comfortable using the data to change a depth number on a nautical chart, but you can certainly be confident that there is an obstruction or that the number on the chart is wrong, and that it would be prudent to dispatch a research vessel to investigate," he says.
"The benefit that can be realized by sampling is both continuous and ubiquitous," Hersey adds, "allowing for real-time insight into conditions that otherwise might be observed once per blue moon, if ever. The information age in which we live assures no difficulty in streaming and processing the observations provided by millions."
For the Indigo V Expeditions team, crowdsourcing sea science on a global scale is doable and critical.
"We are seeing the rise of ocean dead zones, acidification … and the demise of 90 percent of the big fish and a 50 percent reduction in coral reefs," says co-leader Rachelle Jensen, citing estimates from other studies. "So it’s important for people to understand why we should all care about the oceans and understand what a critical role the oceans play in supporting life on the planet."