NOAA's Response and Restoration Blog

An inside look at the science of cleaning up and fixing the mess of marine pollution



Using a NOAA Tool to Evaluate Toxic Doses of Pollution at the Hanford Nuclear Reservation

This is a post by Troy Baker, an environmental scientist in NOAA’s Office of Response and Restoration.

Salmon swimming in a river.

NOAA and partners are examining whether chromium released at Washington’s Hanford Nuclear Reservation has affected Chinook salmon eggs and young fishes in the Columbia River. (Department of Energy)

Chromium, manganese, zinc.

Elements like these may show up in a daily multivitamin, but when found in a certain form and concentration in water and soil, these elements can cause serious problems for fish, birds, and wildlife. As assessors of environmental harm from pollution, we see this scenario being played out at hazardous waste sites around the country.

Take chromium, for example, which is an element found in some multivitamins and also naturally in rocks, plants, soil, and animals (and thus at very low concentrations in meat, eggs, and cheese). At the Hanford Nuclear Reservation in eastern Washington, we are evaluating how historical discharges of chromium resulting from nuclear fuel production may have affected soils, river sediments, groundwater, and surface waters along the Columbia River bordering this property.

Of particular concern is whether discharged chromium affected Chinook salmon eggs and young fishes. Hanford’s nuclear reactors, first constructed as part of the top-secret Manhattan Project during World War II, required huge amounts of river water to keep their cores cool, and chromium compounds were added to that water to keep this essential equipment from corroding.

A little bit of chromium in the environment is considered part of a baseline condition, but if animals and plants are exposed to elevated amounts during sensitive periods, such as when very young, they may receive harmful doses.

How Much Is Too Much?

Have you heard the saying, “the dose makes the poison?” I wanted to find out how my evaluation of what chemicals may cause harm to aquatic species at Hanford matches up to toxicity data from one of NOAA’s software tools, the Chemical Aquatic Fate and Effects (CAFE) database.

I already knew that chromium in surface waters at the level of parts per billion (ppb) has the potential to cause harm at Hanford, including to migratory Chinook salmon and steelhead. But what does that concentration look like?

A helpful analogy from the Washington State Department of Ecology shows just how small that concentration is: One part per billion would be one kernel of corn sitting in a 45-foot high, 16-foot diameter silo.

Digging Through Data

Government scientists set standards called “injury thresholds” to indicate the pollution concentrations when harm reliably occurs to a certain species of animal or type of habitat. It’s my job to see if we can trace a particular contaminant such as chromium back to a source at the Hanford Nuclear Reservation and then document whether aquatic species were exposed to that contaminant for a certain area and time period and harmed as a result.

I’m currently working with my colleagues to set injury thresholds for the amount of chromium and other harmful materials in soils, sediments, and surface waters at the Hanford Nuclear Reservation.

What’s different in this case is that we are evaluating what short-term harm might have occurred to fishes and other animals from either historical pollution mixtures or existing contamination in the Columbia River. To do that, we need large amounts of toxicity data for aquatic species presented in an easy-to-digest format. That’s where NOAA’s CAFE database comes in.

Graph from the CAFE database showing the level of toxic effects for chromium exposure to a range of fish and aquatic invertebrates.

Example data output from NOAA’s CAFE database showing aquatic invertebrates as the most sensitive freshwater aquatic organisms after exposure to chromium for 48 hours in laboratory tests. One microgram per liter (µg/L) is equivalent to one part per billion. (NOAA)

Using this toxicity database for aquatic species, I was able to generate multiple scenarios for chromium exposure to a range of freshwater fish and invertebrates found in the database. I could compare at what concentration chromium becomes toxic to these species and easily see which life stage, from egg to adult, is most affected after 24, 48, and 96 hours of exposure.

The results from CAFE confirmed that setting an injury threshold for chromium somewhere within the “very highly toxic” range of exposure (less than 100 parts per billion of chromium) would be appropriate to protect a wide range of aquatic invertebrates and fish. With the help of CAFE, I was able to quickly double-check whether there is any scientific reason to lower or raise the injury thresholds I’m discussing with my Hanford colleagues.
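To make that double-check concrete, here is a minimal sketch of the kind of calculation involved, assuming the chromium records have been exported from CAFE to a spreadsheet-style file. The file name and column names below are hypothetical placeholders, not CAFE’s actual export format.

```python
import pandas as pd

# Hypothetical export of chromium toxicity records from the CAFE database.
# Assumed columns: species, group ("fish" or "invertebrate"),
# duration_hours, and lc50_ug_per_l (1 microgram per liter = 1 part per billion).
records = pd.read_csv("cafe_chromium_export.csv")

# Keep the standard 48-hour acute tests.
acute_48h = records[records["duration_hours"] == 48]

# Candidate injury threshold at the top of the "very highly toxic" range.
threshold_ppb = 100

# For each species, use its most sensitive (lowest) reported effect concentration.
most_sensitive = acute_48h.groupby("species")["lc50_ug_per_l"].min()

# The share of tested species with effect concentrations below the candidate
# value suggests whether a threshold in that range captures the sensitive end
# of the community.
share_below = (most_sensitive < threshold_ppb).mean()
print(f"{share_below:.0%} of tested species show acute effects below {threshold_ppb} ppb")
```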

More Contamination, More Work Ahead


View of the cocooned H Reactor at the Hanford Nuclear Reservation, seen from Locke Island on the Columbia River, Washington. The reactor operated for 15 years and was one of nine along the river. (NOAA)

My colleagues and I have a lot more environmental assessment work to do at the Hanford Nuclear Reservation. Home to nine former nuclear reactors plus processing facilities, that site is one of the nation’s most complex pollution cases.

Part of my work at NOAA is to collaborate with my agency and tribal colleagues through the Natural Resource Damage Assessment process to understand whether harm occurred and ultimately restore the environment in a way that’s equivalent to the scale of the injuries.

We are concerned about more than 40 contaminants at Hanford, but that shouldn’t be a problem for CAFE. This database holds information on environmental fate and effects for about 40,000 chemicals.

The next version of CAFE, due out in 2016, will be able to display information on longer-term effects of chemicals beyond 96 hours, increasing to 28 days if laboratory test data are available. Having toxicity data available for longer durations will be a huge help to my work as it gets translated into decisions about environmental restoration in the future.

Learn more about our environmental assessment and restoration work at the Hanford Nuclear Reservation.



NOAA Scientist Helps Make Mapping Vital Seagrass Habitat Easier and More Accurate

Shoal grass seagrass on a sandy ocean floor.

Seagrass beds serve as important habitat for a variety of marine life, and understanding their growth patterns better can help fisheries management and restoration efforts. (NOAA)

Amy Uhrin was sensing a challenge ahead of her. As a NOAA scientist working on her PhD, she was studying the way seagrasses grow in different patterns along the coast, and she knew that these underwater plants don’t always create lush, unbroken lawns beneath the water’s surface.

Where she was working, off the North Carolina coast near the Outer Banks, things like the churning motion of waves and the speed of tides can cause seagrass beds to grow in patchy formations. Clusters of bigger patches of seagrass here, some clusters of smaller patches over there. Round patches here, elongated patches over there.

Uhrin wanted to be able to look at aerial images showing large swaths of seagrass habitat and measure how much was actually seagrass, rather than bare sand on the bottom of the estuary. Unfortunately, traditional methods for doing this were tedious and tended to produce rather rough estimates. These involved viewing high-resolution aerial photographs, taken from fixed-wing planes, on a computer monitor and having a person digitally draw lines around the approximate edges of seagrass beds.

While that can be fairly accurate for continuous seagrass beds, it becomes more problematic for areas with lots of small patches of seagrass included inside a single boundary. For the patchy seagrass beds Uhrin was interested in, these visual methods tended to overestimate the actual area of seagrass by 70% to more than 1,500%. There had to be a better way.

Seeing the Light

Patches of seagrass beds of different sizes visible from the air.

Due to local environmental conditions, some coastal areas are more likely to produce patchy patterns in seagrass, rather than large beds with continuous cover. (NOAA)

At the time, Uhrin was taking a class on remote sensing technology, which uses airborne—or, in the case of satellites, space-borne—sensors to gather information about the Earth’s surface (including information about oil spills). She knew that the imagery gathered from satellites such as Landsat is usually not at a fine enough resolution to show the details of the seagrass beds she was studying. Each pixel in a Landsat image covers 30 meters by 30 meters, while the aerial photography gathered from low-flying planes often delivers resolution finer than one meter (a little over three feet).

Uhrin wondered if she could apply some of the semi-automated classification tools from imagery visualization and analysis programs, typically used with satellite imagery, to her aerial photographs. She decided to give it a try.

First, she obtained aerial photographs taken of six sites in the shallow coastal waters of North Carolina’s Albemarle-Pamlico Estuary System. Using a GIS program, she drew boundaries (called “polygons”) around groups of seagrass patches in the usual fashion, which inevitably encloses a lot of unvegetated seabed interspersed among the seagrass patches.

Six aerial photographs of seagrass habitat off the North Carolina coast, with yellow boundary lines drawn around general areas of seagrass habitat.

Aerial photographs show varying patterns of seagrass growth at six study sites off the North Carolina coast. The yellow lines show the digitally drawn boundaries around seagrass habitat; for patchy seagrass, much of the enclosed area is actually unvegetated. (North Carolina Department of Transportation)

Next, Uhrin isolated those seagrass-bed polygons and deleted everything else in each image. This created a smaller, easier-to-scan area for the imagery visualization program to analyze. Then, she “trained” the program to recognize what was seagrass vs. sand, based on spectral information available in the aerial photographs.

Though limited compared to what is available from satellite sensors, aerial photographs contain red, blue, and green wavelengths of light in the visible spectrum. Because plants absorb red and blue light and reflect green light (giving them their characteristic green appearance), Uhrin could train the computer program to classify as seagrass the patches where green light was reflected.
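Uhrin did this work in specialized image-analysis software, but the underlying idea (train on a handful of labeled pixels, then classify every other pixel by its red, green, and blue values) can be sketched in a few lines of Python. Everything below (file names, labels, and the choice of classifier) is illustrative, not her actual workflow.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Hypothetical inputs: rgb_image is a (rows, cols, 3) array from one aerial
# photograph clipped to a seagrass polygon; training_mask marks a few
# hand-labeled pixels as 1 = seagrass, 0 = sand, and -1 = unlabeled.
rgb_image = np.load("site1_rgb.npy")
training_mask = np.load("site1_training.npy")

pixels = rgb_image.reshape(-1, 3).astype(float)
labels = training_mask.ravel()

# "Train" the classifier on the labeled pixels only.
known = labels >= 0
classifier = RandomForestClassifier(n_estimators=100, random_state=0)
classifier.fit(pixels[known], labels[known])

# Classify every pixel in the polygon, then compute the seagrass fraction.
predicted = classifier.predict(pixels).reshape(rgb_image.shape[:2])
print(f"Seagrass covers about {predicted.mean():.0%} of the polygon")
```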

Classify in the Sky

Amy Uhrin stands in shallow water documenting data about seagrass inside a square frame of PVC pipe.

NOAA scientist Amy Uhrin found a more accurate and efficient approach to measuring how much area was actually seagrass, rather than bare sand, in aerial images of coastal North Carolina. (NOAA)

To Uhrin’s excitement, the technique worked well, allowing her to accurately identify and map smaller patches of seagrass and export those maps to another computer program where she could precisely measure the distance between patches and determine the size, number, and orientation of seagrass patches in a given area.

“This now allows you to calculate how much of the polygon is actually seagrass vegetation,” said Uhrin, “which is good for fisheries management.” The young of many commercially important species, such as blue crabs, clams, and flounder, live in seagrass beds and actively use the plants. Young scallops, for example, cling to the blades of seagrass before sliding off and burrowing into the sediment as adults.
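As a rough illustration of those patch measurements (count, size, orientation, and spacing), here is a short sketch that works from a classified seagrass-versus-sand map. The input file and pixel size are hypothetical.

```python
import numpy as np
from scipy.spatial import cKDTree
from skimage.measure import label, regionprops

# Hypothetical input: a binary map (True = seagrass, False = sand) produced by
# the classification step, with a known pixel size in meters.
seagrass = np.load("site1_classified.npy").astype(bool)
pixel_size_m = 0.5

# Group touching seagrass pixels into discrete patches.
patches = regionprops(label(seagrass))
areas_m2 = [p.area * pixel_size_m ** 2 for p in patches]
orientations_deg = [np.degrees(p.orientation) for p in patches]

# Nearest-neighbor distance between patch centroids, in meters.
centroids = np.array([p.centroid for p in patches]) * pixel_size_m
distances, _ = cKDTree(centroids).query(centroids, k=2)
nearest_m = distances[:, 1]  # column 0 is each patch's zero distance to itself

print(f"{len(patches)} patches, median size {np.median(areas_m2):.1f} square meters, "
      f"median spacing {np.median(nearest_m):.1f} meters")
```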

In addition, being able to better characterize the patterns of seagrass habitat could come in handy during coastal restoration planning and assessment. Due to local environmental conditions, some areas are more likely to produce patchy patterns in seagrass. As a result, efforts to restore seagrass habitat should aim for restoring not just cover but also the original spatial arrangement of the beds.

And, as Uhrin noted, having this information can “help address seagrass resilience in future climate change scenarios and altered hurricane regimes, as patchy seagrass areas are known to be more susceptible to storms than continuous meadows.”

The results of this study, which was done in concert with a colleague at the University of Wisconsin-Madison, have been published in the journal Estuarine, Coastal and Shelf Science.



How Do We Use Satellite Data During Oil Spills?

This is a post by NOAA’s George Graettinger with Amy MacFadyen.

A view of the Deepwater Horizon oil spill from NASA's Terra Satellites.

A view of the Deepwater Horizon oil spill from NASA’s Terra Satellites on May 24, 2010. When oil slicks are visible in satellite images, it is because they have changed how the water reflects light, either by making the sun’s reflection brighter or by dampening the scattering of sunlight, which makes the oily area darker. (NASA)

Did you know satellites measure many properties of the Earth’s oceans from space? Remote sensing technology uses various types of sensors and cameras on satellites and aircraft to gather data about the natural world from a distance. These sensors provide information about winds, ocean currents and tides, sea surface height, and a lot more.

NOAA’s Office of Response and Restoration is taking advantage of all that data collection by collaborating with NOAA’s Satellite and Information Service to put this environmental intelligence to work during disasters such as oil spills and hurricanes. Remote sensing technology adds another tool to our toolbox as we assess and respond to the environmental impacts of these types of disasters.

In these cases, which tend to be larger or longer-term oil spills, NOAA Satellites analyzes earth and ocean data from a variety of sensors and provides us with data products such as images and maps. We’re then able to take that information from NOAA Satellites and apply it to purposes ranging from detecting oil slicks to determining how an oil spill might be impacting a species or shoreline.

Slick Technology

During an oil spill, observers trained to identify oil from the air go out in helicopters and planes to report an oil slick’s exact location, shape, size, color, and orientation at a given time. Analogous to this “remote sensing” done by the human eye, satellite sensors can help us define the extent of an oil slick on the ocean surface and create a target area where our aerial observers should start looking for oil.

In the case of a large oil spill over a sizable area such as the Gulf of Mexico, this is very important because we can’t afford the time to go out in helicopters and look everywhere, and weather conditions sometimes make it unsafe to do so.


The three blue shapes represent the NOAA oil spill trajectory for May 17, 2010, showing potential levels of oiling during the Deepwater Horizon oil spill. The green outline represents the aerial footprint or oil extent for the same day, which comes from the NOAA satellite program. All of these shapes appear on a NASA MODIS Terra Satellite background image, as shown in our online response mapping program ERMA. (NOAA)

Satellite remote sensing typically provides the aerial footprint or outline of the surface oil (the surface oiling extent). However, oil slicks are patchy and vary in the thickness of the oil, which means having the outline of the slick is useful, but we still need our observers to give us more detailed information. That said, we’re starting to be able to use remote sensing to delineate not just the extent but also the thickest parts of the slicks.

Knowing where spilled oil may be thickest allows us to prioritize those areas for cleanup action. This “actionable oil” is in a condition that can be collected (via skimmers), dispersed, or burned as part of the cleanup process.
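For a sense of what that looks like in practice, here is a short sketch that compares the area of a full satellite-derived slick footprint with the area classified as thicker, potentially actionable oil, assuming both outlines have been exported as GeoJSON files. The file names are made up for illustration.

```python
import geopandas as gpd

# Hypothetical GeoJSON exports: the full satellite-derived slick footprint and
# the subset classified as thicker, potentially "actionable" oil.
footprint = gpd.read_file("slick_footprint.geojson")
thick_oil = gpd.read_file("slick_thick.geojson")

# Project from latitude/longitude to an equal-area projection so areas are in
# square meters, then convert to square kilometers.
equal_area = "EPSG:6933"
footprint_km2 = footprint.to_crs(equal_area).area.sum() / 1e6
thick_km2 = thick_oil.to_crs(equal_area).area.sum() / 1e6

print(f"Slick footprint: {footprint_km2:.0f} square kilometers")
print(f"Thicker oil: {thick_km2:.0f} square kilometers "
      f"({thick_km2 / footprint_km2:.0%} of the footprint)")
```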

You can see how we brought the surface oiling extent during the Deepwater Horizon spill, based on data analyses from NOAA Satellites, into our online response mapping program ERMA.

A Model for the Future

A common use of remotely sensed data in our work is with our oil spill models. Reports of a slick’s extent from satellite sensors, together with the additional detail aerial observers provide about constantly changing oil slicks, help our oceanographers improve the forecasts of where the oil will be tomorrow.

Just as weather forecasters continually incorporate real-time observations into their models to improve accuracy, our oceanographers update oil spill trajectory models with the latest overflights and observations of the surface oiling extent (the area where oil is at a given moment). These forecasts offer critical information that the Coast Guard uses to prioritize spill response and cleanup activities.
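Our actual trajectory model, GNOME, is far more sophisticated, but the core idea can be sketched simply: move virtual oil particles with the currents plus a small fraction of the wind, and rerun the forecast as new observations arrive. The numbers below (a 3 percent windage factor, constant current and wind) are illustrative placeholders, not real forecast inputs.

```python
import numpy as np

def advect(positions, current, wind, hours, windage=0.03, steps_per_hour=6):
    """Move surface oil particles with the current plus a small fraction
    ('windage') of the wind speed. positions is an (n, 2) array of x, y
    offsets in meters; current and wind are (u, v) velocities in m/s.
    A real model uses spatially varying fields, spreading, and weathering."""
    dt = 3600.0 / steps_per_hour
    drift = np.asarray(current) + windage * np.asarray(wind)
    for _ in range(int(hours * steps_per_hour)):
        # A small random walk stands in for turbulent mixing.
        positions = positions + drift * dt + np.random.normal(0.0, 20.0, positions.shape)
    return positions

# 1,000 particles released at the spill site, forecast 24 hours ahead.
particles = np.zeros((1000, 2))
forecast = advect(particles, current=(0.2, 0.05), wind=(5.0, -2.0), hours=24)
print("Mean drift after 24 hours (km):", forecast.mean(axis=0) / 1000)
```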

A Sense of Impact

Oil at the water's surface in a boat wake.

The 2010 Deepwater Horizon oil spill provided us with a number of new opportunities to work with remotely sensed data. One use was detecting the outline of oil slicks on the ocean surface. (NOAA)

Over the course of an oil spill, knowing the surface oiling extent and where that oil is going is important for identifying what natural resources are potentially in harm’s way and should be protected during the spill response.

In addition, the data analyses from remote sensing technology directly support our ability to determine how natural resources, whether salt marshes or dolphins, are exposed to spilled oil. Both where an oil slick is and how often it is there will affect the degree of potential harm suffered by sensitive species and habitats over time.

In recent years, we’ve been learning how to better use the remote sensing data collected by satellite and aircraft to look at how, where, and for how long coastal and marine life and habitats are impacted by oil spills and then relate this oil exposure to actual harm to these resources.

Large amounts of oil that stay in the same place for a long time have the potential to cause a lot more harm. For example, dolphins in a certain impacted area might breathe fumes from oil and ingest oil from food and water for weeks or months at a time. Without remotely sensed data, it would be nearly impossible to accomplish this task of tying the exact location and timing of oil exposure to environmental harm.
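Here is a simplified sketch of that kind of exposure calculation: count the days on which daily surface-oil footprints overlapped a dolphin population’s home range. The file names, the home-range polygon, and the date attribute are all hypothetical stand-ins for the real assessment data.

```python
import geopandas as gpd

# Hypothetical inputs: daily slick footprints with a "date" column, and a
# single polygon approximating a dolphin population's home range.
daily_slicks = gpd.read_file("daily_slick_footprints.geojson")
home_range = gpd.read_file("dolphin_home_range.geojson").geometry.iloc[0]

# Count the days on which the surface slick overlapped the home range at all.
overlapping = daily_slicks[daily_slicks.geometry.intersects(home_range)]
exposure_days = overlapping["date"].nunique()

print(f"Surface oil overlapped the home range on {exposure_days} days")
```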

Remote Opportunities

The 2010 Deepwater Horizon oil spill provided us with a number of new opportunities to work with remotely sensed data. For example, we used this technology to examine the large scale features of the circulation patterns in the Gulf of Mexico, such as the fast-moving Loop Current and associated eddies. The Loop Current is a warm ocean current that flows northward between Cuba and Mexico’s Yucatán Peninsula, moves north into the Gulf of Mexico, then loops east and south before exiting through the Florida Straits and ultimately joining the Gulf Stream.

During this oil spill, there were concerns that if the oil slick entered the Loop Current, it could be transported far beyond the Gulf to the Caribbean or up the U.S. East Coast (it did not). NOAA used information from satellite data to closely monitor the position of the slick with respect to the Loop Current throughout the Deepwater Horizon oil spill.

Our partnership with NOAA’s Satellite and Information Service has been a fruitful one, which we expect to grow even more in the future as technology develops further. In January, NOAA Satellites launched the Jason-3 satellite, which will continue to collect critical sea surface height data, adding to a satellite data record going back to 1992. One way these data will be used is in helping track the development of hurricanes, which in turn can cause oil spills.

We hope ongoing collaboration across NOAA will further prepare us for the future and whatever it holds.



Helping a 7-year-old Oceanographer Study Oil Spills in Washington’s Waters

A young boy drops wooden yellow cards off the side of a boat into water.

Dropping the first round of drift cards off a boat in Washington’s San Juan Islands, a kindergartner kicked off his experiment to study oil spills. (Used with permission of Alek)

One spring day in 2014, a shy young boy sidled up to the booth I was standing at during an open house hosted at NOAA’s Seattle campus. His blond head just peeking over the table, this then-six-year-old, Alek, accompanied by his mom and younger sister, proceeded to ask how NOAA’s oil spill trajectory model, GNOME, works.

This was definitely not the question I was expecting from a child his age.

After he set an overflowing binder onto the table, Alek showed me the printed-out web pages describing our oil spill model and said he wanted to learn how to run the model himself. He was apparently planning a science project that would involve releasing “drift cards,” small biodegradable pieces of wood marked with identifying information, into Washington’s Salish Sea to simulate where spilled oil might travel along this heavily trafficked route for oil tankers.

Luckily, Chris Barker, one of our oceanographers who run this scientific model, was nearby and I introduced them.

But that wasn’t my last interaction with this precocious, young oceanographer-in-training. Alek later asked me to serve on his science advisory committee (something I wish my own middle school science fair projects had had). I was in the company of representatives from the University of Washington, Washington State Department of Ecology, and local environmental and marine organizations.

Over the next year or so, I would direct his occasional questions about oil spills, oceanography, and modeling to the scientists in NOAA’s Office of Response and Restoration.

Demystifying the Science of Oil Spills

A hand-drawn map of oil tankers traveling from Alaska to Washington, a thank-you note on a post-it, and a hand-written card asking for donations.

Alek did a lot of work learning about how oil tankers travel from Alaska to Washington waters and about the threat of oil spills. He even fund-raised to cover the cost of materials for his drift cards. (NOAA)

According to the Washington Department of Ecology, the waters of the Salish Sea saw more than 7,000 journeys by oil tankers traveling to and from six oil refineries along its coast in 2013. Alek’s project was focused on Rosario Strait, a narrow eastern route around Washington’s San Juan Islands in the Salish Sea. There, he would release 400 biodegradable drift cards into the marine waters, at both incoming and outgoing tides, and then track their movements over the next four months.

The scientific questions he was asking in the course of his project—such as where spilled oil would travel and how it might affect the environment—mirror the types of questions our scientists and oil spill experts ask and try to answer when we advise the U.S. Coast Guard during oil spills along the coast.

As Alek learned, multiple factors influence the path spilled oil might take on the ocean, such as the oil type, weather (especially winds), tides, currents, and the temperature and salinity of the water. He attempted to take some of these factors into account as he made his predictions about where his drift cards would end up after he released them and how they would get there.

As with other drift card studies, Alek relied on people finding and reporting his drift cards when they turned up along the coast. Each drift card was stamped with information about the study and instructions for how to report it.

NOAA has performed several drift card studies in areas such as Hawaii, California, and Florida. One such study took place after the December 1976 grounding of the M/V Argo Merchant near Nantucket Island, Massachusetts, and some of those drift cards were later found as far away as Ireland and France.

A Learning Experience

A young boy in a life jacket holding a yellow wooden card and sitting on the edge of a boat.

Alek released 400 biodegradable drift cards near Washington’s San Juan Islands in the Salish Sea, at both incoming and outgoing tides, and tracked their movements to simulate an oil spill. (Used with permission of Alek)

Of course, any scientist, young or old, comes across a number of challenges and questions in the pursuit of knowledge. For Alek, those ranged from fundraising for supplies and partnering with an organization that had a boat, to examining tide tables to decide when and where to release the drift cards, to learning how to use Google Earth to map and measure the drift cards’ paths.

Only a couple weeks after releasing them, Alek began to see reports of his drift cards turning up in the San Juan Islands and even Vancouver Island, Canada, with kayakers finding quite a few of them.
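Each recovery report can be turned into a minimum drift distance and speed with a little geometry. The sketch below uses the standard great-circle (haversine) formula with made-up release and recovery points and times; it is not Alek’s actual data.

```python
import math
from datetime import datetime

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two latitude/longitude points, in km."""
    earth_radius_km = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * earth_radius_km * math.asin(math.sqrt(a))

# Illustrative, made-up release point and recovery report for one drift card.
release = {"time": datetime(2015, 5, 16, 10, 0), "lat": 48.58, "lon": -122.75}
report = {"time": datetime(2015, 5, 30, 15, 0), "lat": 48.85, "lon": -123.20}

distance_km = haversine_km(release["lat"], release["lon"], report["lat"], report["lon"])
days_adrift = (report["time"] - release["time"]).total_seconds() / 86400
print(f"Card drifted at least {distance_km:.0f} km in {days_adrift:.1f} days "
      f"(about {distance_km / days_adrift:.1f} km per day, straight-line minimum)")
```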

As Alek started to analyze his data, we tried to help him avoid overestimating the area of water and length of coastline potentially affected by the simulated oil spill. Once released, oil tends to spread out on the water surface and ends up in patches on the shoreline as well.

Another issue our oceanographer Amy MacFadyen pointed out to Alek was that “over time the oil is removed from the surface of the ocean (some evaporates, some is mixed into the water column, etc.). So, the sites that it took a long time for the drift cards to reach would likely see less impacts as the oil would be much more spread out and there would be less of it.”

During his project, Alek was particularly interested in examining the potential impacts of an oil spill on his favorite marine organism, the Southern Resident killer whales (orcas) that live year-round in the Salish Sea but which are endangered. He used publicly available information about their movements to estimate where the killer whales might have intersected the simulated oil (the drift cards) across the Salish Sea.

Originally, Alek had hoped to estimate how many killer whales might have died as a result of a hypothetical oil spill in this area, but determining the impacts—both deadly and otherwise—of oil on marine mammals is a complicated matter. As a result, we advised him that there is too much uncertainty and not enough data for him to venture a guess. Instead, he settled on showing the number of killer whales that might be at risk of swimming through areas of simulated oil—and hence the killer whales that could be at risk of being affected by oil.

Ocean Scientist in Training

Google Earth view of the differing paths Alek's two drift card releases traveled around Washington's San Juan Islands and Canada's Vancouver Island.

A Google Earth view of the differing paths Alek’s two drift card releases traveled around Washington’s San Juan Islands and Canada’s Vancouver Island. Red represents the paths of drift cards released on an outgoing tide and yellow, the paths of cards released on an incoming tide. (Used with permission of Alek)

“I’d like to congratulate him on a successful drift card experiment,” said MacFadyen. “His results clearly show some of the features of the ocean circulation in this region.”

In a touching note in his final report, Alek dedicated his study to several great ocean scientists and explorers who came before him, namely, Sylvia Earle, Jacques Cousteau, William Beebe, and Rachel Carson. He was also enthusiastic in his appreciation of our help: “Thank you very very much for all of your help! I love what you do at NOAA. Maybe someday I will be a NOAA scientist!”

If you’re interested in learning more about Alek’s study and his results, you can visit his website www.oilspillscience.org, where you also can view a video summary of his project.



Our Top 10 New Year’s Resolutions for 2016

2015 written on a sandy beach with an approaching wave.

So long, 2015. Hello, 2016!

Another year has gone by, and we’ve stayed plenty busy: responding to a leaking California pipeline, examining the issue of wrecked and abandoned ships, preparing a natural resource damage assessment and restoration plan for the Gulf of Mexico, and removing 32,201 pounds of marine debris from Hawaii’s Midway Atoll.

You can read more about what we accomplished in the last year, but keep in mind we have big goals for 2016 too. We’re aiming to:

  1. Be better models. This spring, we are planning to release an overhaul of our signature oil spill trajectory forecasting (GNOME) and oil weathering (ADIOS) models, which will be combined into one tool and available via an online interface for the first time.
  2. Tidy up. Our coasts, that is. In the next year, we will oversee marine debris removal projects in 17 states and territories, empowering groups to clean up coastal areas of everything from plastics to abandoned fishing gear.
  3. Use or lose. Nature and wildlife offer a lot of benefits to people, and we make use of them in a number of ways, ranging from recreational fishing to birdwatching to deep-seated cultural beliefs. In 2016 we’ll examine what we lose when nature and wildlife get harmed from pollution and how we calculate and make up for those losses.
  4. Get real. About plastic in the ocean, that is. We’ll be turning our eye toward the issue of plastic in the ocean, how it gets there, what its effects are, and what we can do to keep it out of the ocean.
  5. Explore more. We’ll be releasing an expanded, national version of our DIVER data management tool, which currently holds only Deepwater Horizon data for the Gulf of Mexico, allowing us and our partners to better explore and analyze ocean and coastal data from around the country.
  6. Get artistic. Through our NOAA Marine Debris Program, we are funding projects to create art from ocean trash to raise awareness of the issue and keep marine debris off our coasts and out of our ocean.
  7. Break ground on restoration. Finalizing the draft comprehensive restoration plan for the Gulf of Mexico, following the 2010 Deepwater Horizon oil spill, will bring us one step closer to breaking ground on many restoration projects over the next several years.
  8. App to it. We are working on turning CAMEO Chemicals, our popular database of hazardous chemicals, into an application (app) for mobile devices, making access to critical information about thousands of potentially dangerous chemicals easier than ever.
  9. Train up. We pride ourselves on providing top-notch training opportunities, and in 2016, we already have Science of Oil Spill classes planned in Mobile, Alabama, and Ann Arbor, Michigan (with more to come). Plus, we’ve introduced a brand-new Science of Chemical Releases class, designed to provide information and tools to better manage and plan for responses to chemical incidents.
  10. Get strategic. We are updating our five-year strategic plan, aligning it with NOAA’s Ocean Service strategic priorities [PDF], which are coastal resilience (preparedness, response, and recovery), coastal intelligence, and place-based conservation.



On the Hunt for Shipping Containers Lost off California Coast

Large waves break on a pier that people are walking along.

The M/V Manoa lost 12 containers in stormy seas off the coast of California in the area of the Greater Farallones National Marine Sanctuary. (Credit: Beach Watch/mojoscoast)

On December 11, 2015, the Matson container ship M/V Manoa was en route to Seattle from Oakland, California, when it lost 12 large containers in heavy seas. At the time of the spill, the ship was maneuvering in order to allow the San Francisco Bay harbor pilot to disembark.

The containers, which are 40 feet long and 9 feet wide, are reported as empty except for miscellaneous packing materials, such as plastic crates and Styrofoam. Luckily, there were no hazardous materials in the spilled cargo.

The accident occurred about eight miles outside of the Golden Gate Bridge in the Greater Farallones National Marine Sanctuary. Three containers have come ashore, two at or near Baker Beach, just south of the Golden Gate Bridge, and one at Mori Point near Pacifica, California. The search continues for the others.

The Coast Guard is responding to this incident with assistance from NOAA, the National Park Service, the State of California, and the City of San Francisco. The responsible party is working with an environmental contractor to recover the debris and containers. The Coast Guard asks anyone who finds a container floating or approaching shore to exercise caution and notify the Coast Guard Sector San Francisco Command Center at 415-399-7300.

On December 14, NOAA’s Office of Response and Restoration became involved when the Coast Guard Sector San Francisco contacted the NOAA Scientific Support Coordinator for the region, Jordan Stout. The Coast Guard requested help from the Office of Response and Restoration in tracking the missing containers. Oceanographer Chris Barker is providing trajectory modeling, using wind and current information to predict the potential direction of the spilled containers.

NOAA chart of waters off San Francisco showing where the shipping containers were lost and where three have been found.

A NOAA oceanographer is using wind and current information to predict the potential direction of the spilled shipping containers off the California coast. This information is helping direct search efforts for the remaining containers. (NOAA)

Because this accident occurred in NOAA’s Greater Farallones National Marine Sanctuary, the Greater Farallones Marine Sanctuary Association’s Beach Watch program provided some of the initial sightings to the Coast Guard, and volunteers are doing additional beach surveys to look for debris and more containers. There is concern that the containers, their contents, or pieces of the containers could pose a hazard to wildlife through entanglement or ingestion. There is also concern about the containers potentially damaging ocean and coastal bottom habitats within the marine sanctuary. (Read a statement from the sanctuary superintendent. [PDF])

This incident illustrates another way that marine debris can enter the environment. According to Sherry Lippiatt of the NOAA Marine Debris Program, “This incident is a reminder that while marine debris is an everyday problem, winter storms and higher ocean swells may increase the amount of debris entering the environment.”

To learn more about how storms can lead to increased marine debris, take a look at the recent article, California’s “First Flush”. For information on how citizen science can help in situations like this, see this article about searching for Japan tsunami debris on the California coast.



Explore Oil Spill Data for Gulf of Mexico Marine Life With NOAA GIS Tools

In the wake of the Deepwater Horizon oil spill, the sheer amount of data scientists were gathering from the Gulf of Mexico was nearly overwhelming. Everything from water quality samples to the locations of oiled sea turtles to photos of dolphins swimming through oil—the list goes on for more than 13 million scientific records.

So, how would anyone even start to dig through all this scientific information? Fortunately, you don’t have to be a NOAA scientist to access, download, or even map it. We have been building tools to allow anyone to access this wealth of information on the Gulf of Mexico environment following the Deepwater Horizon oil spill.

We’re taking a look at two of our geographic information systems tools and how they help scientists, emergency responders, and the public navigate the oceans of environmental data collected since the 2010 Deepwater Horizon oil spill.

When it comes to mapping and understanding huge amounts of these data, we turn to our GIS-based tool, the Environmental Response Management Application, known as ERMA®. This online mapping tool is like a Swiss army knife for organizing data and information for planning and environmental emergencies, such as oil spills and hurricanes.

ERMA not only allows pollution responders to see real-time information, including weather information and ship locations, but also enables users to display years of data, revealing to us broader trends.

View of Environmental Response Management Application showing map of Gulf of Mexico with varying probabilities of oil presence and sea turtle oiling during the Deepwater Horizon oil spill with data source information.

In the “Layer” tab on the right side of the screen, you can choose which groups of data, or “layers,” to display in ERMA. Right click on a data layer, such as “Turtle Captures Probability of Oiling (NOAA) (PDARP),” and select “View metadata” to view more information about the data being shown. (NOAA)

For instance, say you want to know the likelihood of sea turtles being exposed to heavy oil during the Deepwater Horizon oil spill. ERMA enables you to see where sea turtles were spotted during aerial surveys or captured by researchers across the Gulf of Mexico between May and September 2010. At the same time, you can view data showing the probability that certain areas of the ocean surface were oiled (and for how long), all displayed on a single, interactive map.

View of Environmental Management Application map of Gulf of Mexico showing varying probabilities of oil presence and sea turtle exposure to oil during the Deepwater Horizon oil spill with map legend.

Clicking on the “Legend” tab on the right side of the screen shows you basic information about the data displayed in ERMA. Here, the red area represents portions of the Gulf of Mexico which had the highest likelihood of exposing marine life to oil. Triangles show sea turtle sightings and squares show sea turtle captures between May and September 2010. The color of the symbol indicates the likelihood of that sea turtle receiving heavy exposure to oil. (NOAA)
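If you export layers like these from ERMA for your own analysis, a spatial join can tell you which oiling-probability zone each turtle observation falls in. The sketch below assumes hypothetical GeoJSON exports and attribute names, not ERMA’s actual schema.

```python
import geopandas as gpd

# Hypothetical exports of two layers: sea turtle sighting/capture points and
# polygons carrying an "oiling_probability" attribute between 0 and 1.
turtles = gpd.read_file("turtle_observations.geojson")
oiling = gpd.read_file("surface_oiling_probability.geojson")

# Attach the oiling probability of the polygon each turtle point falls inside.
joined = gpd.sjoin(turtles, oiling[["oiling_probability", "geometry"]],
                   how="left", predicate="within")

# Summarize how many observations fall in the highest-probability areas.
high = joined[joined["oiling_probability"] >= 0.9]
print(f"{len(high)} of {len(joined)} turtle observations were in areas with "
      f"at least a 90% chance of surface oiling")
```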

Perhaps you want to focus on where Atlantic bluefin tuna were traveling around the Gulf and where that overlaps with the oil spill’s footprint. Or compare coastal habitat restoration projects with the degree of oil different sections of shoreline experienced. ERMA gives you that access.

You can use ERMA Deepwater Gulf Response to find these data in a number of ways (including search) and choose which GIS “layers” of data to turn on and off in the map. To see the most recently added data, click on the “Recent Data” tab in the upper left of the map interface, or find data by browsing through the “Layers” tab on the right. Or look for data in special “bookmark views” on the lower right of the “Layers” tab to find data for a specific topic of interest.

Now, what if you want not only to see a map of the data but also to explore trends in the data at a deeper level? Or download photos, videos, or scientific analyses of the data?

That’s where our data management tool DIVER comes in. This tool serves as a central repository for environmental impact data from the oil spill and was designed to help researchers share and find scientific information ranging from photos and field notes to sample data and analyses.

As Ocean Conservancy’s Elizabeth Fetherston put it:

Until recently, there was no real way to combine all of these disparate pixels of information into a coherent picture of, for instance, a day in the life of a sea turtle. DIVER, NOAA’s new website for Deepwater Horizon assessment data, gives us the tools to do just that.

Data information and integration systems like DIVER put all of that information in one place at one time, allowing you to look for causes and effects that you might not have ever known were there and then use that information to better manage species recovery. These data give us a new kind of power for protecting marine species.

One of the most important features of DIVER, called DIVER Explorer, is the powerful search function that allows you to narrow down the millions of data pieces to the precise set you’re seeking. You do it one step, or “filter,” at a time.

DIVER software dialog box showing how to build a query by workplan topic area for marine mammals studied during the Deepwater Horizon oil spill.

A view of the step-by-step process of building a “query,” or specialized search, in our DIVER tool for Deepwater Horizon oil spill environmental impact data. (NOAA)

For example, when you go to DIVER Explorer, click on “Guided Query” at the top and then “Start to Explore Data,” choose “By Workplan Topic Area,” hit “Next,” and finally select “Marine Mammals” before clicking “Run Query” to access information about scientific samples taken from marine mammals and turtles. You can view it on a map, in a table, or download the data to analyze yourself.
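Once you have downloaded a query’s results, you can also keep filtering them yourself. Here is a minimal sketch using a hypothetical CSV download; the column names are illustrative, not DIVER’s actual export format.

```python
import pandas as pd

# Hypothetical CSV downloaded from a DIVER Explorer query.
samples = pd.read_csv("diver_marine_mammal_query.csv",
                      parse_dates=["collection_date"])

# Narrow the results further: one topic area, one species group, one year.
dolphins = samples[
    (samples["workplan_topic"] == "Marine Mammals")
    & (samples["species"].str.contains("bottlenose", case=False, na=False))
    & (samples["collection_date"].dt.year == 2010)
]

# Count samples by study area to see where most of the data were collected.
print(dolphins.groupby("study_area").size())
```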

An even easier way to explore these data in DIVER, however, is by visiting https://www.doi.gov/deepwaterhorizon/adminrecord and scrolling down to and clicking on #5 Preassessment/Assessment (§§ 990.40 – 990.45; 990.51). This will reveal a list of various types of environmental impacts—to birds, sea floor habitat, marine mammals, etc.—which the federal government studied as part of the Deepwater Horizon oil spill’s Natural Resource Damage Assessment.

Say you’re interested in marine mammals, so you click on 5.6 Marine Mammal Injury and then 5.6.3 Data sets. You can then download and open the document “NOAA Marine Mammal data related to the Deepwater Horizon incident, available through systems such as DIVER and ERMA, or as direct downloads. (September 23, 2015).”

Under the section “Data Links,” you can choose from a variety of stored searches (or “queries”) in DIVER that will show you where and when, for example, bottlenose dolphins with satellite tags traveled after the spill (tip: zoom in to view this data on the map)—along with photographs to go with it (tip: click on the “Photos” tab under the map to browse).

Map view of DIVER software map showing where tagged dolphins swam in the Gulf of Mexico after the Deepwater Horizon oil spill.

A map view of DIVER shows where tagged dolphins traveled along the Gulf Coast, including two populations that stayed in their home areas of Barataria Bay and Mississippi Sound. (NOAA)

This can tell us key information, such as the fact that certain populations of dolphins stay in the same areas along the coast, meaning they don’t travel far from home. We can also look at data about whether those dolphin homes were exposed to a lot of oil, which would suggest that the dolphins that lived there likely were exposed to oil again and again.

Both of these tools allow us to work with incredible amounts of data and see their stories brought to life through the power of geographic information systems. So, go ahead and start exploring!