NOAA's Response and Restoration Blog

An inside look at the science of cleaning up and fixing the mess of marine pollution



Redrawing the Coast After Sandy: First Round of Updated Environmental Sensitivity Data Released for Atlantic States

Construction equipment moves sand to rebuild a New Jersey beach in front of houses damaged during Hurricane Sandy.

In Brick, New Jersey, construction crews rebuild the beaches in front of homes damaged by Hurricane Sandy. This huge storm actually changed the shape of shorelines up and down the East Coast. (Federal Emergency Management Agency/FEMA)

This is a post by the Office of Response and Restoration’s Jill Petersen.

In 2012, Hurricane Sandy brought devastating winds and flooding to the Atlantic coast. In some parts of New Jersey, floodwaters reached nearly 9 feet. Up and down the East Coast, this massive storm reshaped the shoreline.

As a result, we’ve been working to update our Environmental Sensitivity Index (ESI) maps to reflect the new state of Atlantic shorelines. These maps and data give oil spill planners and responders a quick snapshot of a shoreline’s vulnerability to spilled oil.

This week, we released the digital data, for use within a Geographic Information System (GIS), for the first regions updated after Hurricane Sandy. Passed the following January, the Disaster Relief Appropriations Act of 2013 provided funds to update ESI maps for eleven Atlantic coast states, ranging from Maine to South Carolina. For this project, we grouped the states into seven regions.

The GIS data for the regions released this week cover South Carolina and portions of New York and New Jersey, including the Hudson River, south Long Island, and the New York–New Jersey metropolitan area. For these two regions, we mapped more than 300 oil-sensitive species and classified over 17,000 miles of shoreline according to its sensitivity to spilled oil.

Updated GIS data and PDF maps for the remaining regions affected by Sandy will be available in the coming months.

Time for a Change

The magnitude of the overall effort has been unprecedented and has given us the opportunity to revisit what was mapped and how, and to update the technology used, particularly for map production.

Our first Environmental Sensitivity Index maps were produced in the early 1980s and, since that time, the entire U.S. coast has been mapped at least once. To be most useful, these data should be updated every 5–7 years to reflect changes in shoreline and species distributions that may occur due to a variety of things, including human intervention, climate change, or, as in this case, major coastal storms.

In addition to ranking the sensitivity of different shorelines (including wetlands and tidal flats), these data and maps also show the locations of oil-sensitive animals, plants, and habitats, along with various human features that could either be impacted by oil, such as a marina, or be useful in a spill response scenario, such as access points along a beach.
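To make the ranking idea concrete, here is a minimal sketch in Python. The rank descriptions follow the general ESI convention (1 = least sensitive, 10 = most sensitive), but the specific segment data are hypothetical, not actual NOAA map data:

```python
# Illustrative sketch of the ESI shoreline ranking idea (1 = least
# sensitive, 10 = most sensitive). Segment data below are hypothetical.

ESI_RANKS = {
    1: "Exposed rocky shores",
    3: "Fine- to medium-grained sand beaches",
    7: "Exposed tidal flats",
    10: "Salt- and brackish-water marshes",
}

# Hypothetical shoreline segments tagged with an ESI rank
segments = [
    {"name": "Segment A", "esi": 1},
    {"name": "Segment B", "esi": 10},
    {"name": "Segment C", "esi": 7},
]

def most_sensitive(segments):
    """Return segments sorted from most to least oil-sensitive."""
    return sorted(segments, key=lambda s: s["esi"], reverse=True)

for seg in most_sensitive(segments):
    print(seg["name"], "-", ESI_RANKS[seg["esi"]])
```

In an actual response, a sort like this is what lets planners see at a glance which stretches of coast (marshes and tidal flats, in this toy example) should be protected first.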

New Shores, New Features

A street sign is buried under huge piles of sand in front of a beach community.

In the wake of Sandy, we’ve been updating our Environmental Sensitivity Index maps and data and adding new features, such as storm surge inundation data. Hurricane Sandy’s flooding left significant impacts on coastal communities in eleven Atlantic states. (Federal Emergency Management Agency/FEMA)

To gather suggestions for improving our ESI maps and data, we sent out user surveys, conducted interviews, and pored over historical documentation. We evaluated all suggestions while keeping the primary users—spill planners and responders—at the forefront. In the end, several major changes were adopted, and these improvements will be included in all future ESI maps and data.

Extended coverage was one of the most requested enhancements. Previous data coverage was focused primarily on the shoreline and nearshore—perhaps 2–3 miles offshore and generally less than 1 mile inland. The post-Sandy maps and data extend 12 nautical miles offshore and 5 miles inland.

This extension enables us to include data such as deep water species and migratory routes, as well as species occurring in wetlands and human-focused features found further inland. With these extra features, we were able to incorporate additional hazards to the coastal environment. One example was the addition of storm surge inundation data, provided by NOAA’s National Hurricane Center, which provide flood levels for storms classified from Category 1 to Category 5.

We also added more jurisdictional boundaries, EPA Risk Management Facilities (the EPA-regulated facilities that pose the most significant risk to life or human health), repeated measurement sites (water quality, tide gauges, Mussel Watch sites, etc.), historic wrecks, and locations of coastal invasive species. These supplement the already comprehensive human-use features that were traditionally mapped, such as access points, fishing areas, historical sites, and managed areas.

The biological data in our maps continue to represent where species occur, along with supporting information such as concentration, seasonal variability, life stage and breeding information, and the data source. During an oil spill, knowing the data source (e.g., the U.S. Fish and Wildlife Service) is especially important so that responders can reach out for any new information that could impact their approach to the spill response.

A new feature added to the biological data tables alerts users to why a particular species' occurrence may warrant more attention than another's, providing context such as whether the animals are roosting or migrating. As always, we make note of state and federal threatened, endangered, or listed species.

Next up

Stay tuned for the digital data and PDF maps for additional Sandy-affected regions. While the updated PDF maps will have a slightly different look and feel than prior ones, the symbology and map links will be very familiar to long-time users.

In the meantime, we had already been working on updating ESI maps for two regions outside those funded by the Disaster Relief Appropriations Act. These regions, the outer coast of Washington and Oregon and the state of Georgia, have benefited from the general improvements brought about by this process. As of this week, you can now access the latest GIS data for these regions as well.

Jill Petersen began working with the NOAA spill response group in 1988. Originally a programmer and on-scene responder, in 1991 her focus switched to mapping support, a major component of which is the ESI program. Throughout the years, Jill has worked to broaden the ESI audience by providing ESIs in a variety of formats and developing appropriate mapping tools. Jill has been the ESI program manager since 2001.



How Do We Use Satellite Data During Oil Spills?

This is a post by NOAA’s George Graettinger with Amy MacFadyen.

A view of the Deepwater Horizon oil spill from NASA's Terra Satellites.

A view of the Deepwater Horizon oil spill from NASA’s Terra Satellites on May 24, 2010. When oil slicks are visible in satellite images, it is because they have changed how the water reflects light, either by making the sun’s reflection brighter or by dampening the scattering of sunlight, which makes the oily area darker. (NASA)

Did you know satellites measure many properties of the Earth’s oceans from space? Remote sensing technology uses various types of sensors and cameras on satellites and aircraft to gather data about the natural world from a distance. These sensors provide information about winds, ocean currents and tides, sea surface height, and a lot more.

NOAA’s Office of Response and Restoration is taking advantage of all that data collection by collaborating with NOAA’s Satellite and Information Service to put this environmental intelligence to work during disasters such as oil spills and hurricanes. Remote sensing technology adds another tool to our toolbox as we assess and respond to the environmental impacts of these types of disasters.

In these cases, which tend to be larger or longer-term oil spills, NOAA Satellites analyzes earth and ocean data from a variety of sensors and provides us with data products such as images and maps. We’re then able to take that information from NOAA Satellites and apply it to purposes ranging from detecting oil slicks to determining how an oil spill might be impacting a species or shoreline.

Slick Technology

During an oil spill, observers trained to identify oil from the air go out in helicopters and planes to report an oil slick’s exact location, shape, size, color, and orientation at a given time. Analogous to this “remote sensing” done by the human eye, satellite sensors can help us define the extent of an oil slick on the ocean surface and create a target area where our aerial observers should start looking for oil.

In the case of a large oil spill over a sizable area such as the Gulf of Mexico, this is very important: we can't afford the time to go out in helicopters and look everywhere, and weather conditions may sometimes make it unsafe to do so.

The three blue shapes represent the NOAA oil spill trajectory for May 17, 2010, showing potential levels of oiling during the Deepwater Horizon oil spill. The green outline represents the aerial footprint or oil extent for the same day, which comes from the NOAA satellite program. All of these shapes appear on a NASA MODIS Terra Satellite background image, as shown in our online response mapping program ERMA. (NOAA)

Satellite remote sensing typically provides the aerial footprint or outline of the surface oil (the surface oiling extent). However, oil slicks are patchy and vary in the thickness of the oil, which means having the outline of the slick is useful, but we still need our observers to give us more detailed information. That said, we’re starting to be able to use remote sensing to delineate not just the extent but also the thickest parts of the slicks.

Knowing where spilled oil may be thickest allows us to prioritize those areas for cleanup action. This "actionable oil" is in a condition that allows it to be collected (via skimmers), dispersed, or burned as part of the cleanup process.

You can see the surface oiling extent we mapped during the Deepwater Horizon spill, based on data analyses from NOAA Satellites, in our online response mapping program ERMA.

A Model for the Future

A common use of remotely sensed data in our work is in our oil spill models. Reports of a slick's extent from satellite sensors and from aerial observers, who add detail about constantly changing oil slicks, help our oceanographers improve forecasts of where the oil will be tomorrow.

Just as weather forecasters continually incorporate real-time observations into their models to improve accuracy, our oceanographers update oil spill trajectory models with the latest overflights and observations of the surface oiling extent (the area where oil is at a given moment). These forecasts offer critical information that the Coast Guard uses to prioritize spill response and cleanup activities.
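The core of a surface-oil trajectory forecast can be sketched as a Lagrangian particle model: move each parcel of oil with the ocean current plus a small fraction of the wind. NOAA's operational model (GNOME) is far more sophisticated; the function, velocities, and 3% windage figure below are a simplified, illustrative assumption, not the real model:

```python
# Toy sketch of the particle-advection idea behind oil trajectory
# forecasts. All numbers here are illustrative assumptions.

def advect(particles, current, wind, dt_hours, windage=0.03):
    """Move each particle by the current plus a small fraction of the wind.

    particles: list of (x_km, y_km) positions
    current, wind: (u, v) velocities in km/h
    windage: fraction of the wind speed felt by surface oil (~3% here)
    """
    u = current[0] + windage * wind[0]
    v = current[1] + windage * wind[1]
    return [(x + u * dt_hours, y + v * dt_hours) for x, y in particles]

# One 6-hour step with a 2 km/h eastward current and a 20 km/h northward wind:
positions = advect([(0.0, 0.0)], current=(2.0, 0.0), wind=(0.0, 20.0), dt_hours=6)
print(positions)  # the particle drifts mostly east with the current, slightly north with the wind
```

Each new overflight or satellite observation effectively resets the particle positions, which is why fresh observations of the surface oiling extent keep the forecast honest.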

A Sense of Impact

Oil at the water's surface in a boat wake.

The 2010 Deepwater Horizon oil spill provided us with a number of new opportunities to work with remotely sensed data. One use was detecting the outline of oil slicks on the ocean surface. (NOAA)

Over the course of an oil spill, knowing the surface oiling extent and where that oil is going is important for identifying what natural resources are potentially in harm’s way and should be protected during the spill response.

In addition, the data analyses from remote sensing technology directly support our ability to determine how natural resources, whether salt marshes or dolphins, are exposed to spilled oil. Both where an oil slick is and how often it is there will affect the degree of potential harm suffered by sensitive species and habitats over time.

In recent years, we’ve been learning how to better use the remote sensing data collected by satellite and aircraft to look at how, where, and for how long coastal and marine life and habitats are impacted by oil spills and then relate this oil exposure to actual harm to these resources.

Large amounts of oil that stay in the same place for a long time have the potential to cause a lot more harm. For example, dolphins in a certain impacted area might breathe fumes from oil and ingest oil from food and water for weeks or months at a time. Without remotely sensed data, it would be nearly impossible to accomplish this task of tying the exact location and timing of oil exposure to environmental harm.

Remote Opportunities

The 2010 Deepwater Horizon oil spill provided us with a number of new opportunities to work with remotely sensed data. For example, we used this technology to examine the large scale features of the circulation patterns in the Gulf of Mexico, such as the fast-moving Loop Current and associated eddies. The Loop Current is a warm ocean current that flows northward between Cuba and Mexico’s Yucatán Peninsula, moves north into the Gulf of Mexico, then loops east and south before exiting through the Florida Straits and ultimately joining the Gulf Stream.

During this oil spill, there were concerns that if the oil slick entered the Loop Current, it could be transported far beyond the Gulf to the Caribbean or up the U.S. East Coast (it did not). NOAA used satellite data to closely monitor the position of the slick relative to the Loop Current throughout the Deepwater Horizon oil spill.

Our partnership with NOAA’s Satellite and Information Service has been a fruitful one, which we expect to grow even more in the future as technology develops further. In January, NOAA Satellites launched the Jason-3 satellite, which will continue to collect critical sea surface height data, adding to a satellite data record going back to 1992. One way these data will be used is in helping track the development of hurricanes, which in turn can cause oil spills.

We hope ongoing collaboration across NOAA will further prepare us for the future and whatever it holds.



Explore Oil Spill Data for Gulf of Mexico Marine Life With NOAA GIS Tools

In the wake of the Deepwater Horizon oil spill, the sheer amount of data scientists were gathering from the Gulf of Mexico was nearly overwhelming. Everything from water quality samples to the locations of oiled sea turtles to photos of dolphins swimming through oil—the list goes on for more than 13 million scientific records.

So, how would anyone even start to dig through all this scientific information? Fortunately, you don’t have to be a NOAA scientist to access, download, or even map it. We have been building tools to allow anyone to access this wealth of information on the Gulf of Mexico environment following the Deepwater Horizon oil spill.

We’re taking a look at two of our geographic information systems tools and how they help scientists, emergency responders, and the public navigate the oceans of environmental data collected since the 2010 Deepwater Horizon oil spill.

When it comes to mapping and understanding huge amounts of these data, we turn to our GIS-based tool, the Environmental Response Management Application, known as ERMA®. This online mapping tool is like a Swiss army knife for organizing data and information for planning and environmental emergencies, such as oil spills and hurricanes.

ERMA not only allows pollution responders to see real-time information, including weather information and ship locations, but also enables users to display years of data, revealing to us broader trends.

View of Environmental Response Management Application showing map of Gulf of Mexico with varying probabilities of oil presence and sea turtle oiling during the Deepwater Horizon oil spill with data source information.

In the “Layer” tab on the right side of the screen, you can choose which groups of data, or “layers,” to display in ERMA. Right click on a data layer, such as “Turtle Captures Probability of Oiling (NOAA) (PDARP),” and select “View metadata” to view more information about the data being shown. (NOAA)

For instance, say you want to know the likelihood of sea turtles being exposed to heavy oil during the Deepwater Horizon oil spill. ERMA enables you to see where sea turtles were spotted during aerial surveys or captured by researchers across the Gulf of Mexico between May and September 2010. At the same time, you can view data showing the probability that certain areas of the ocean surface were oiled (and for how long), all displayed on a single, interactive map.

View of Environmental Management Application map of Gulf of Mexico showing varying probabilities of oil presence and sea turtle exposure to oil during the Deepwater Horizon oil spill with map legend.

Clicking on the “Legend” tab on the right side of the screen shows you basic information about the data displayed in ERMA. Here, the red area represents portions of the Gulf of Mexico which had the highest likelihood of exposing marine life to oil. Triangles show sea turtle sightings and squares show sea turtle captures between May and September 2010. The color of the symbol indicates the likelihood of that sea turtle receiving heavy exposure to oil. (NOAA)

Perhaps you want to focus on where Atlantic bluefin tuna were traveling around the Gulf and where that overlaps with the oil spill’s footprint. Or compare coastal habitat restoration projects with the degree of oil different sections of shoreline experienced. ERMA gives you that access.

You can use ERMA Deepwater Gulf Response to find these data in a number of ways (including search) and choose which GIS “layers” of data to turn on and off in the map. To see the most recently added data, click on the “Recent Data” tab in the upper left of the map interface, or find data by browsing through the “Layers” tab on the right. Or look for data in special “bookmark views” on the lower right of the “Layers” tab to find data for a specific topic of interest.

Now, what if you want not only to see a map of the data but also to explore trends in the data at a deeper level? Or download photos, videos, or scientific analyses of the data?

That’s where our data management tool DIVER comes in. This tool serves as a central repository for environmental impact data from the oil spill and was designed to help researchers share and find scientific information ranging from photos and field notes to sample data and analyses.

As Ocean Conservancy’s Elizabeth Fetherston put it:

Until recently, there was no real way to combine all of these disparate pixels of information into a coherent picture of, for instance, a day in the life of a sea turtle. DIVER, NOAA’s new website for Deepwater Horizon assessment data, gives us the tools to do just that.

Data information and integration systems like DIVER put all of that information in one place at one time, allowing you to look for causes and effects that you might not have ever known were there and then use that information to better manage species recovery. These data give us a new kind of power for protecting marine species.

One of the most important features of DIVER, called DIVER Explorer, is the powerful search function that allows you to narrow down the millions of data pieces to the precise set you’re seeking. You do it one step, or “filter,” at a time.

DIVER software dialog box showing how to build a query by workplan topic area for marine mammals studied during the Deepwater Horizon oil spill.

A view of the step-by-step process of building a “query,” or specialized search, in our DIVER tool for Deepwater Horizon oil spill environmental impact data. (NOAA)

For example, when you go to DIVER Explorer, click on “Guided Query” at the top and then “Start to Explore Data,” choose “By Workplan Topic Area,” hit “Next,” and finally select “Marine Mammals” before clicking “Run Query” to access information about scientific samples taken from marine mammals and turtles. You can view it on a map, in a table, or download the data to analyze yourself.
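The guided-query pattern (apply one filter at a time until only the records you want remain) can be sketched in plain Python. The records and field names below are hypothetical stand-ins; DIVER's actual schema is not reproduced here:

```python
# Sketch of the filter-by-filter "guided query" idea. Records and field
# names are hypothetical examples, not DIVER's real schema.

records = [
    {"topic": "Marine Mammals", "species": "bottlenose dolphin", "type": "tissue sample"},
    {"topic": "Marine Mammals", "species": "sperm whale", "type": "photo"},
    {"topic": "Birds", "species": "brown pelican", "type": "tissue sample"},
]

def query(records, **filters):
    """Apply each filter in turn, narrowing the result set step by step."""
    results = records
    for field, value in filters.items():
        results = [r for r in results if r.get(field) == value]
    return results

# Narrow the records down one filter at a time
mammal_samples = query(records, topic="Marine Mammals", type="tissue sample")
print(len(mammal_samples))  # 1
```

Against millions of real records, each added filter works the same way: the result set only ever shrinks, so you can stop as soon as it is small enough to inspect.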

An even easier way to explore these data in DIVER, however, is by visiting https://www.doi.gov/deepwaterhorizon/adminrecord and scrolling down to and clicking on #5 Preassessment/Assessment (§§ 990.40 – 990.45; 990.51). This will reveal a list of various types of environmental impacts—to birds, sea floor habitat, marine mammals, etc.—which the federal government studied as part of the Deepwater Horizon oil spill’s Natural Resource Damage Assessment.

Say you’re interested in marine mammals, so you click on 5.6 Marine Mammal Injury and then 5.6.3 Data sets. You can then download and open the document “NOAA Marine Mammal data related to the Deepwater Horizon incident, available through systems such as DIVER and ERMA, or as direct downloads. (September 23, 2015).”

Under the section “Data Links,” you can choose from a variety of stored searches (or “queries”) in DIVER that will show you where and when, for example, bottlenose dolphins with satellite tags traveled after the spill (tip: zoom in to view this data on the map)—along with photographs to go with it (tip: click on the “Photos” tab under the map to browse).

Map view of DIVER software map showing where tagged dolphins swam in the Gulf of Mexico after the Deepwater Horizon oil spill.

A map view of DIVER shows where tagged dolphins traveled along the Gulf Coast, showing two populations that stayed in their home bases of Barataria Bay and Mississippi Sound. (NOAA)

This can tell us key information, such as the fact that certain populations of dolphins stay in the same areas along the coast, meaning they don’t travel far from home. We can also look at data about whether those dolphin homes were exposed to a lot of oil, which would suggest that the dolphins that lived there likely were exposed to oil again and again.

Both of these tools allow us to work with incredible amounts of data and see their stories brought to life through the power of geographic information systems. So, go ahead and start exploring!



Using NOAA Tools to Help Deal with the Sinking Problem of Wrecked and Abandoned Ships

Workers direct the lifting of a rusted boat from a waterway onto a barge.

Clearing a derelict vessel from the Hylebos Waterway in Tacoma, Washington. NOAA has created several tools and resources for mapping, tracking, and dealing with shipwrecks and abandoned vessels. (Washington Department of Natural Resources/ Tammy Robbins) Used under Creative Commons Attribution-NonCommercial-NoDerivs 2.0 Generic license.

Walk along a waterfront in the United States and wherever you find boats moored, you won’t be hard pressed to find one that has been neglected or abandoned to the point of rusting, leaking, or even sinking. It’s a sprawling and messy issue, one that is hard to fix. When you consider the thousands of shipwrecks strewn about U.S. waters, the problem grows even larger.

How do these vessels end up like this in the first place? Old ships, barges, and recreational vessels end up along coastal waters for a number of reasons: they were destroyed in wartime, grounded or sunk by accident or storm, or just worn out and left to decay. By many estimates shipping vessels have a (very approximate) thirty-year lifetime with normal wear and tear. Vessels, both large and small, may be too expensive for the owner to repair, salvage, or even scrap.

So, wrecked, abandoned, and derelict ships can be found, both invisible and in plain sight, in most of our marine environments, from sandy beaches and busy harbors to the deep ocean floor.

As we’ve discussed before, these vessels can be a serious problem for both the marine environment and economy. While no single comprehensive database exists for all wrecked, abandoned, and derelict vessels (and if it did, it would be very difficult to keep up-to-date), efforts are underway to consolidate existing information in various databases to get a larger view of the problem.

NOAA has created several of these databases and resources, each created for specific needs, which are used to map and track shipwrecks and abandoned vessels. These efforts won’t solve the whole issue, but they are an important step along that path.

Solution to Pollution

Black and white photo of a steam ship half sinking in the Great Lakes.

The S/S America sank after hitting rocks in Lake Superior in 1928, but the wreck was found close to the water surface in 1970. This ship has become the most visited wreck in the Great Lakes, where divers can still see a Model-T Ford on board. (Public domain)

NOAA's Remediation of Underwater Legacy Environmental Threats (RULET) project identifies the location and nature of potential sources of oil pollution from sunken vessels. These include vessels sunk during past wars, many of which are also grave sites and are now designated as national historic sites. RULET focuses on wrecks with continued potential to leak pollutants.

Many of these wrecks begin to leak years, even decades, after they have sunk. An example of such a wreck is Barge Argo, recently rediscovered and found to be leaking as it lay 40 feet under the surface of Lake Erie. The barge was carrying over 4,500 barrels of crude oil and the chemical benzol when it sank in 1937. It had been listed in the NOAA RULET database since 2013. U.S. Coast Guard crews, with support from NOAA’s Office of Response and Restoration, are currently working on a way to safely remove the leaking fuel and cargo.

As in the Barge Argo case, the RULET database is especially useful for identifying the sources of “mystery sheens” —slicks of oil or chemicals that are spotted on the surface of the water and don’t have a clear origin. NOAA’s Office of National Marine Sanctuaries and Office of Response and Restoration jointly manage the RULET database.

Information in RULET is culled from a larger, internal NOAA Sanctuaries database called Resources and Undersea Threats (RUST). RUST lists about 30,000 sites of sunken objects, of which about 20,000 are shipwrecks. Other sites represent munitions dumpsites, navigational obstructions, underwater archaeological sites, and other underwater resources.

Avoiding Future Wrecks

The NOAA Office of Coast Survey’s Wrecks and Obstructions Database contains information on submerged wrecks and obstructions identified within U.S. maritime boundaries, with a focus on hazards to navigation. Information for the database is sourced from the NOAA Electronic Navigational Charts (ENC®) and Automated Wrecks and Obstructions Information System (AWOIS).

The database contains information on identified submerged wrecks and obstructions within the U.S. maritime boundaries, including position (latitude and longitude), and, where available, a brief description and attribution.
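One common use of position records like these is computing how far a vessel or survey site is from charted wrecks. Here is a small sketch using the standard haversine great-circle formula; the wreck entries are hypothetical examples, not records from the actual database:

```python
# Sketch: great-circle distance from a vessel to charted wrecks, using
# lat/lon positions like those in a wrecks database. Wreck entries below
# are hypothetical.
import math

def haversine_nm(lat1, lon1, lat2, lon2):
    """Great-circle distance in nautical miles between two lat/lon points."""
    r = 3440.065  # mean Earth radius in nautical miles
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

wrecks = [
    {"name": "Wreck A", "lat": 40.50, "lon": -73.90},
    {"name": "Wreck B", "lat": 40.70, "lon": -74.05},
]

ship = (40.60, -74.00)
for w in wrecks:
    d = haversine_nm(ship[0], ship[1], w["lat"], w["lon"])
    print(f"{w['name']}: {d:.1f} nm")
```

A useful sanity check: one degree of longitude at the equator comes out to roughly 60 nautical miles, which is why nautical miles pair so naturally with latitude/longitude data.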

Head to the Hub

Recently, the NOAA Marine Debris Program developed and launched the Abandoned and Derelict Vessels (ADV) InfoHub to provide a centralized source of information on cast-off vessels that contribute to the national problem of marine debris. Hosted on the NOAA Marine Debris Program website, the ADV InfoHub will allow users to find abandoned and derelict vessel publications, information on funding to remove them, case studies, current projects, related stories, and FAQs.

Each coastal state (including states bordering the Great Lakes) will have a dedicated page where users can find information on state-specific abandoned and derelict vessel programs, legislation, and funding as well as links to case studies from that particular state and relevant publications and legal reviews. Each state page will also provide the name of the department within that state government that handles abandoned and derelict vessel issues along with contact information.

Power Display

In select parts of the country, the Office of Response and Restoration is now using its Environmental Response Management Application (ERMA®) to map the locations of and key information for abandoned and derelict vessels. ERMA is our online mapping tool that integrates data, such as ship locations, shoreline types, and environmental sensitivity, in a centralized format. Here, we use it to show abandoned and derelict vessels within the context of related environmental information displayed on a Geographic Information System (GIS) map. In Washington’s Puget Sound, for example, the U.S. Coast Guard and Washington Department of Natural Resources can use this information in ERMA to help prioritize removing the worst offenders and raise awareness about the issue.

A view of Pacific Northwest ERMA, a NOAA online mapping tool which can bring together a variety of environmental and response data. Here, you can see the black dots where ports are located around Washington’s Puget Sound as well as the colors indicating the shoreline’s characteristics and vulnerability to oil. (NOAA)

Abandoned vessel data are now part of both Pacific Northwest ERMA and Southwest ERMA (coastal California), and our office highlighted ERMA at a May 2015 NOAA Marine Debris Program workshop for data managers. This meeting of representatives from 15 states, four federal agencies, and Canada showcased ERMA as an efficient digital platform for displaying abandoned vessel information in a more comprehensive picture at a regional level.

Once again, removing abandoned vessels or reducing their impacts can be very difficult and costly. But we have been seeing more and more signs of progress in recent years, progress that requires increasing collaboration among local, state, and federal agencies as well as public education. By providing more detailed and comprehensive information, NOAA hopes to help resource managers prioritize and make more informed decisions about how to address the various threats these vessels pose to our coasts.

The Office of Response and Restoration’s Doug Helton also contributed to this post.




In Mapping the Fallout from the Deepwater Horizon Oil Spill, Developing One Tool to Bring Unity to the Response

This is a post by Katie Wagner, Amy Merten, and Michele Jacobi of NOAA’s Office of Response and Restoration.

The Deepwater Horizon Oil Spill: Five Years Later

This is the fifth in a series of stories over the coming weeks looking at various topics related to the response, the Natural Resource Damage Assessment science, restoration efforts, and the future of the Gulf of Mexico.

After an explosion took place on the Deepwater Horizon drilling platform in the Gulf of Mexico on April 20, 2010, responders sprang into action.

Vessels surveyed the area around the platform, oil booms were deployed, aerial surveying operations were launched, risk assessment and shoreline cleanup teams set out, and many other response activities were underway. Field teams and technical experts from around the country were immediately called to help with the response.

Mapping Organized Chaos

People at a crowded table with computers and maps.

During the Deepwater Horizon oil spill, NOAA debuted the online mapping tool ERMA, which organized crucial response data into one common picture for everyone involved in this monumental spill.

Among our many other responsibilities during this spill, NOAA’s Office of Response and Restoration reported to the scene to help manage the data and information being collected to inform spill response decisions occurring across multiple states and agencies.

The process of responding to an oil spill or natural disaster can often be described as “organized chaos.” Effectively managing the many activities and influxes of information during a response is crucial. Responders need to be aware of the local environment, equipment, and associated risks at the scene of the spill, and government leaders from the closest town to Washington, DC, need to make informed decisions about how to deal with the event. Data-rich maps are one way to organize these crucial data into one common operational picture that provides consistent “situational awareness” for everyone involved.

The Environmental Response Management Application (ERMA®) was developed by NOAA’s Office of Response and Restoration, the U.S. Environmental Protection Agency, and the University of New Hampshire in 2007 as a pilot project, initially focused on the New England coast. ERMA is an online mapping tool that integrates both static and real-time data, such as ship locations, weather, and ocean currents, in a centralized, interactive map for environmental disaster response.

In late March of 2010, ERMA was tested in a special oil spill training drill known as the Spills of National Significance Exercise. Industry representatives, the U.S. Coast Guard, and state partners participating in this mock oil spill response recognized ERMA’s potential for visualizing large amounts of complex data and for sharing data with the public during an oil spill.

From Test to Trial by Fire

Twenty-five days later, the Deepwater Horizon disaster began. In the first couple of days after the accident, the ERMA team recognized that the scale of the still-developing oil spill would call for exactly the type of tools and skills for which their team had prepared.

A few days into the disaster, the ERMA team created a new, regional version of their web-based mapping application, incorporating data specific to the Gulf of Mexico and the rapidly escalating Deepwater Horizon oil spill. This included geographic response plans (which guide responses to oil spills in specific areas), oil spill trajectories, locations of designated response vessels, aerial surveys of oil, oiled shoreline assessments, critical habitats, and fishery closure areas.

Screen shot of mapping program for Gulf of Mexico with oil spill data.

A few days into the disaster, the ERMA team created a new, regional version of their web-based mapping application, incorporating data specific to the Gulf of Mexico and the rapidly escalating Deepwater Horizon oil spill. Here, ERMA shows the location of the wellhead, the days of cumulative oiling on the ocean surface, and the level of oiling observed on shorelines. (NOAA)

Due to the size of the spill, NOAA’s Office of Response and Restoration was able to expand the team working on ERMA to include members skilled in data management and scientists familiar with the type of data being collected during a spill response. The ERMA team trained dozens of new Geographic Information Systems (GIS) staff to help upload and maintain the new Deepwater Horizon ERMA site as hundreds of data layers were created weekly.

Within a week of the start of the oil spill, NOAA sent the first of many ERMA team members to work in the command posts in Louisiana, where they could translate the needs of the Federal On-Scene Commanders (those in charge of the spill cleanup and response) into updates and changes for ERMA software developers to make to the mapping application.

ERMA played a critical role in the Deepwater Horizon oil spill response effort. Around a month into the spill, the U.S. Coast Guard selected ERMA as the official common operational picture for all federal, state, and local spill responders to use during the incident. With this special designation, the ERMA tool provided a quick visualization of the sprawling, complicated oil spill situation and improved communication and coordination among responders, environmental stakeholders, and decision makers. On June 15, 2010, the White House presented a publicly accessible version of the Deepwater Horizon ERMA website, which drew more than 3 million hits the first day it was live. This was an unprecedented effort to open up data usually shared only within the command post of an oil spill.

The value of the new tool to the response won it praise from retired Coast Guard Admiral Thad Allen, the national incident commander for the spill, who described its impact, saying, “It allowed us to have a complete picture of what we were doing and what was occurring in the Gulf. The technology has been there, but it’s never been applied in a disaster that was this large scale. It is something that is going to have to incorporate this system into our disaster response doctrine.” Additionally, the NOAA development team was a finalist for the 2011 Samuel J. Heyman Service to America Medal recognizing Homeland Security contributions by members of the federal civil service.

From Response to Restoration

In addition to mapping the Deepwater Horizon response and cleanup efforts, ERMA continues to be an active resource throughout the ongoing Natural Resource Damage Assessment and related restoration planning. The Gulf of Mexico coastal resources and habitat data available in ERMA are helping researchers assess the environmental injuries caused by the oil spill.

Five years after this mapping tool’s debut on the national stage during the Deepwater Horizon oil spill, developers continue to improve the platform. NOAA now has nine other ERMA sites customized for various U.S. regions, each kept up to date, with basic information publicly available around the clock. All regional ERMA websites now reside in the federally approved Amazon Cloud environment for online scalability and durability, and the platform has a flexible framework for incorporating data sources from a variety of organizations.

The Deepwater Horizon oil spill shifted our perspective of who needs data and when they need it. With the help of ERMA, the public, academic communities, and those outside of the typical environmental response community can access data collected during a disaster and be engaged in future incidents like never before.

Visit ERMA Deepwater Gulf Response for a first-hand look at up-to-date and historical data collected during the response, assessment, and restoration planning phases of the Deepwater Horizon oil spill.



Attempting to Answer One Question Over and Over Again: Where Will the Oil Go?

The Deepwater Horizon Oil Spill: Five Years Later

This is the first in a series of stories over the coming weeks looking at various topics related to the response, the Natural Resource Damage Assessment science, restoration efforts, and the future of the Gulf of Mexico.

Oil spills raise all sorts of scientific questions, and NOAA’s job is to help answer them.

We have a saying that each oil spill is unique, but there is one question we get after almost every spill: Where will the oil go? One of our primary scientific products during a spill is a trajectory forecast, which often takes the form of a map showing where the oil is likely to travel and which shorelines and other environmentally or culturally sensitive areas might be at risk.

Oil spill responders need this information to decide which shorelines to protect with containment boom, where to stage cleanup equipment, and which areas to close to fishing or boating during a spill.

To help predict the movement of oil, we developed the computer model GNOME (the General NOAA Operational Modeling Environment) to forecast the complex interactions among currents, winds, and other physical processes affecting oil’s movement in the ocean. We update this model daily with information gathered from field observations, such as those from trained observers tasked with flying over a spill to verify its often-changing location, and with new forecasts for ocean currents and winds.
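
The core idea behind this kind of trajectory model can be sketched in a few lines. The following is a minimal illustration, not NOAA's GNOME code: each parcel of oil is advected by the ocean current plus a small fraction of the wind speed (a "windage" factor, commonly taken around 3 percent), with a random-walk term standing in for turbulent spreading. The constants and the crude degrees-per-mile conversion are illustrative assumptions.

```python
import random

WINDAGE = 0.03      # assumed: oil drifts at roughly 3% of the wind speed
DIFFUSION = 0.002   # illustrative random-walk amplitude (degrees per step)

def step(lon, lat, current_uv, wind_uv, dt_hours):
    """Advance one oil parcel by one time step.

    Positions are in degrees; current and wind velocities are
    (east, north) components in knots.
    """
    # deterministic advection: ocean current plus windage-scaled wind
    u = current_uv[0] + WINDAGE * wind_uv[0]
    v = current_uv[1] + WINDAGE * wind_uv[1]
    # crude conversion: ~60 nautical miles per degree
    lon += u * dt_hours / 60.0
    lat += v * dt_hours / 60.0
    # stochastic term approximating turbulent spreading
    lon += random.uniform(-DIFFUSION, DIFFUSION)
    lat += random.uniform(-DIFFUSION, DIFFUSION)
    return lon, lat
```

Repeating this step for thousands of parcels, with winds and currents updated from each day's observations and forecasts, yields the kind of spreading, drifting slick shown on a trajectory map.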

Modeling a Moving Target

One of the biggest challenges we’ve faced in trying to answer this question was, not surprisingly, the 2010 Deepwater Horizon oil spill. Because of the continual release of oil—tens of thousands of barrels of oil each day—over nearly three months, we had to prepare hundreds of forecasts as more oil entered the Gulf of Mexico each day, was moved by ocean currents and winds, and was weathered, or physically, biologically, or chemically changed, by the environment and response efforts. A typical forecast includes modeling the outlook of the oil’s spread over the next 24, 48, and 72 hours. This task began with the first trajectory our oceanographers issued early in the morning of April 21, 2010, after being notified of the accident, and continued for the next 107 days in a row. (You can access all of the forecasts from this spill online.)

Once spilled into the marine environment, oil begins to move and spread surprisingly quickly but not necessarily in a straight line. In the open ocean, winds and currents can easily move oil 20 miles or more per day, and in the presence of strong ocean currents such as the Gulf Stream, oil and other drifting materials can travel more than 100 miles per day. Closer to the coast, tidal currents also can move and spread oil across coastal waters.

While the Deepwater Horizon drilling rig and wellhead were located only 50 miles offshore of Louisiana, it took several weeks for the slick to reach shore as shifting winds and meandering currents slowly moved the oil.

A Spill Playing on Loop

Over the duration of a typical spill, we’ll revise and reissue our forecast maps on a daily basis. These maps include our best prediction of where the oil might go and the regions of highest oil coverage, as well as what is known as a “confidence boundary.” This is a line encircling not just our best predictions for oil coverage but also a broader area on the map reflecting the full possible range in our forecasts [PDF].

Our oceanographers include this confidence boundary on the forecast maps to indicate that there is a chance that oil could be located anywhere inside its borders, depending on actual conditions for wind, weather, and currents. Why is there a range of possible locations in the oil forecasts? Well, the movement of oil is very sensitive to ocean currents and wind, and predictions of oil movement rely on accurate predictions of the currents and wind at the spill site.
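
One simple way to see where such a boundary comes from is to run the same drift calculation many times with slightly different winds and look at the spread of the possible endpoints. The sketch below is a hedged illustration of that idea, not the actual NOAA method; the windage factor, wind-error magnitude, and unit conversions are assumptions chosen for readability.

```python
import random

def drift_endpoint(wind_u, wind_v, hours):
    """Straight-line drift at 3% windage; winds in knots, result in degrees."""
    return (0.03 * wind_u * hours / 60.0, 0.03 * wind_v * hours / 60.0)

def confidence_box(wind_u, wind_v, hours, n=500, wind_error=5.0):
    """Bounding box of endpoints over n runs with winds perturbed by ±wind_error."""
    pts = []
    for _ in range(n):
        pu = wind_u + random.uniform(-wind_error, wind_error)
        pv = wind_v + random.uniform(-wind_error, wind_error)
        pts.append(drift_endpoint(pu, pv, hours))
    xs = [p[0] for p in pts]
    ys = [p[1] for p in pts]
    return min(xs), max(xs), min(ys), max(ys)
```

The best-estimate endpoint always lies somewhere inside the box, but so do many other plausible outcomes—which is exactly what the confidence boundary on a forecast map communicates.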

In addition, sometimes the information we put into the model is based on an incomplete picture of a spill. Much of the time, the immense size of the Deepwater Horizon spill on the ocean surface meant that observations from specialists flying over the spill and even satellites couldn’t capture the full picture of where all the oil was each day.

Our inevitably inexact knowledge of the many factors informing the trajectory model introduces a certain level of expected variation in its predictions, as is true of many models. Forecasters attempt to assess all the possible outcomes for a given scenario, estimate the likelihood of the different possibilities, and ultimately communicate risks to the decision makers.

In the case of the Deepwater Horizon oil spill, we had the added complexity of a spill that spanned many different regions—from the deep Gulf of Mexico, where ocean circulation is dominated by the swift Loop Current, to the continental shelf and nearshore area where ocean circulation is influenced by freshwater flowing from the Mississippi River. And let’s not forget that several tropical storms and hurricanes crossed the Gulf that summer [PDF].

A big concern was that if oil got into the Loop Current, it could be transported to the Florida Keys, Cuba, the Bahamas, or up the eastern coast of the United States. Fortunately for the Florida Keys, a giant eddy formed in the Gulf of Mexico in June 2010, nicknamed Eddy Franklin after Benjamin Franklin, who did some of the early research on the Gulf Stream. This giant circular current kept the oil largely contained in the Gulf of Mexico.

Some of the NOAA forecast team likened our efforts that spring and summer to the movie Groundhog Day, in which the main character is forced to relive the same day over and over again. For our team, every day involved modeling the same oil spill again and again, but with constantly changing results. Thinking back on that intense forecasting effort brings back memories packed with emotion—and exhaustion. But mostly, we recall with pride the important role our forecast team in Seattle played in answering the question “where will the oil go?”



Latest NOAA Mapping Software Opens up New Possibilities for Emergency Responders

This is a guest post by emergency planner Tom Bergman.

Aerial view of destroyed houses in Vilonia, Arkansas, after EF4 tornado in April 2014.

NOAA and EPA’s MARPLOT mapping software was designed for emergency responders and planners dealing with chemical spills. However, its features lend it to a host of other uses, from search and rescue after a tornado to dealing with wildfires. (NOAA National Weather Service)

For 20 years, thousands of emergency planners and responders have used the MARPLOT mapping software to respond to hazardous chemical spills. But creative MARPLOT users have also employed the program for a wide range of other uses, including dispatching air ambulances and helping identify a serial arsonist.

MARPLOT is the mapping component of a suite of software programs called CAMEO, jointly developed by NOAA’s Office of Response and Restoration and the U.S. Environmental Protection Agency to help emergency planners and responders deal with chemical spills.

These agencies have just released a new version of MARPLOT (version 5.0). MARPLOT 5 offers a host of new and improved capabilities, which translate to more mapping options, greater flexibility, and even more powerful data searching capabilities.

On the Grid

To illustrate a few of the new capabilities of MARPLOT 5, let’s imagine that an EF2 or EF3 tornado is blowing through McClain County, Oklahoma. McClain County is a mostly rural area, with only three small towns. For this scenario, we will assume the tornado passes through the small town of Blanchard, Oklahoma.

Immediately following the tornado, first responders will conduct initial damage surveys of the affected area. Generally, the Incident Command, which is the multi-agency team responsible for managing the emergency response, will want to divide the area the tornado impacted into a “grid” and assign teams to survey specific areas of it. MARPLOT 5 has a new “gridding” tool, which allows those in an Incident Command to determine and display the various survey zones.
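
The idea behind such a survey grid is straightforward: divide a bounding box around the damage path into labeled, roughly one-mile-square zones that can each be assigned to a team. The sketch below is an illustration of that concept, not MARPLOT's implementation; the flat-earth mileage conversion is an assumption that is adequate at county scale.

```python
MILES_PER_DEG_LAT = 69.0  # rough conversion; fine for a county-scale grid

def make_grid(min_lon, min_lat, max_lon, max_lat, cell_miles=1.0):
    """Return a dict mapping (col, row) labels to (w, s, e, n) bounding boxes."""
    step = cell_miles / MILES_PER_DEG_LAT  # approximate; ignores longitude shrink
    grid = {}
    row, lat = 1, min_lat
    while lat < max_lat:
        col, lon = 1, min_lon
        while lon < max_lon:
            grid[(col, row)] = (lon, lat, lon + step, lat + step)
            col, lon = col + 1, lon + step
        row, lat = row + 1, lat + step
    return grid
```

Each key, such as (2, 4), names one survey zone, so the Incident Command can hand a team "Grid Box 2, 4" and know exactly which square mile it covers.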

In the Ready Files

Fortunately, McClain County is well-prepared to deal with this emergency. The county already has a complete list of addresses for the affected area in the proper file format for working in maps (E911 address point shape files) and has imported them into MARPLOT 5 before the tornado hit. In addition, McClain Emergency Management has compiled information such as locations with chemicals stored on site, homes or businesses with fortified safe rooms, and any special populations such as those with impaired mobility and made that data available in MARPLOT 5. Having this information at their fingertips helps the Incident Command prioritize resources and search areas in the affected zones, as well as keep survey and search-and-rescue teams safe.

The latest version of the software allows users to upload any .png image file to serve as a map symbol. This feature provides critical information to responders in a customizable and easily interpreted way. Notice in the screen shot of the MARPLOT map below that the locations of safe rooms, E911 address points, and residences of oxygen-dependent and mobility-impaired persons are clearly identified by specific symbols. The user can select any map symbol and see an associated information box displayed for that symbol.

Screenshot showing close-up of grid zones for a hypothetical tornado. The map shows safe rooms, 911 address points, and special populations displayed in MARPLOT 5.

Close-up of grid zones for a hypothetical tornado. The map shows safe rooms, 911 address points, and special populations displayed in MARPLOT 5. (NOAA)

In MARPLOT, any square of the grid can be selected and “searched” for information associated with that area of the map, which the latest version of MARPLOT displays as a “spreadsheet.” This spreadsheet can be printed and given to the teams surveying impacted areas. Below is an example of an information spreadsheet for E911 address points in a selected one-square-mile grid zone (Grid Box 2, 4).

Screenshot of MARPLOT 5 showing addresses in a spreadsheet.

Address points in the selected Grid Box 2, 4, displayed as a spreadsheet in MARPLOT 5 which responders can print out and take on surveys of damaged areas. (NOAA)

With this feature, emergency responders have the information they need contained in both a map and a spreadsheet as they conduct their initial damage survey. In this example, responders assigned to survey Grid Box 2, 4 already know they must clear 142 address points in the area, six of which have safe rooms, two of which have mobility-impaired residents, and one with an oxygen-dependent person.
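
Under the hood, this kind of grid-box search is a simple spatial filter: collect every address point whose coordinates fall inside the selected zone's bounding box. The sketch below illustrates the idea with made-up sample records, not real E911 data or MARPLOT code; the field names and coordinates are hypothetical.

```python
def points_in_box(points, box):
    """Return the points whose (lon, lat) fall inside box = (w, s, e, n)."""
    w, s, e, n = box
    return [p for p in points if w <= p["lon"] < e and s <= p["lat"] < n]

# hypothetical address points: an ID, coordinates, and a survey-relevant flag
addresses = [
    {"id": "E911-001", "lon": -97.655, "lat": 35.135, "safe_room": True},
    {"id": "E911-002", "lon": -97.652, "lat": 35.138, "safe_room": False},
    {"id": "E911-003", "lon": -97.610, "lat": 35.190, "safe_room": False},
]

# a one-square-mile zone, standing in for "Grid Box 2, 4" in this sketch
selected_zone = (-97.66, 35.13, -97.65, 35.14)

for pt in points_in_box(addresses, selected_zone):
    print(pt["id"], "(safe room)" if pt["safe_room"] else "")
```

The filtered list is exactly what gets rendered as the printable spreadsheet: one row per address point in the zone, with its attributes alongside.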

Furthermore, the emergency responders in this scenario were able to accomplish all of these operations in MARPLOT without any access to the Internet or cloud servers. And the software is 100 percent free.

This is a very simple example of new ways MARPLOT 5 may be implemented by emergency planners and responders across the country. There are a host of other new operations in version 5—including real-time weather via web mapping service (WMS) access—that could be used for dealing with wildfires, search and rescue operations, floods, hazardous material releases, resource management, manhunts, and more. In fact, MARPLOT could be used in just about any type of situation where customizable and user-operated mapping might be helpful.

Learn more about and download the latest version of MARPLOT.

Tom Bergman is the author of the CAMEO Companion and host of the www.cameotraining.org website. Tom is the EPCRA (Emergency Planning and Community Right-to-Know Act) Tier 2 Program Manager for the State of Oklahoma and has been a CAMEO trainer for many years.  He has conducted CAMEO training courses in Lithuania, Poland, England, Morocco, and 45 U.S. states.