NOAA's Response and Restoration Blog

An inside look at the science of cleaning up and fixing the mess of marine pollution



One Step Toward Reducing Chemical Disasters: Sharing with Communities Where Those Chemicals Are Located

This is a guest post by emergency planner Tom Bergman.

Dirty label on leaking chemical drum

Attempting to access, collect, and share information on where chemicals are produced, stored, and transported is a challenge for state and local emergency responders trying to prevent the type of chemical disasters that devastated West, Texas, and Geismar, Louisiana, in 2013. (killbox/Creative Commons Attribution 2.0 Generic License)

The year 2013 saw two major chemical disasters in the United States, which together killed 17 people and injured hundreds more. In response, President Obama signed Executive Order 13650 (EO 13650) on August 1, 2013, and a report followed the next year, both aimed at improving the safety and security of chemical facilities and reducing the risks of hazardous chemicals to workers and communities.

As part of this directive, six federal agencies and departments, including the U.S. Environmental Protection Agency (EPA), formed a work group to investigate how to better help local communities plan for and respond to emergencies involving hazardous substances.

Out of these work group discussions came one area needing improvement which might sound surprising to the average person: data sharing. Specifically, the work group highlighted the need to improve data sharing among the various federal programs that regulate hazardous substances and the state and local communities where those chemicals are produced, stored, and transported.

EPA works with NOAA on the chemical spill planning and response software suite known as CAMEO. These software programs offer communities critical tools for organizing and sharing precisely this type of chemical data.

Lots of Chemicals, Lots of Data

Many parts of the federal government, including several of the agencies involved in the work group, regulate hazardous chemicals in a number of ways to keep our communities safe. That means collecting information from industry on the presence or usage of hazardous substances in communities across the nation. It also results in a lot of data reported on the hazardous materials manufactured, used, stored, and transported in the United States. Making sure these data are shared with the right people is a key goal for chemical safety.

However, federal agencies do not require industry to report all of this information in consistent formats across agencies. Furthermore, this reported information on hazardous chemicals is generally not available to local emergency planners and responders—the very people who would need quick access to that information during a disaster in their community.

Trying to access, collect, and share all of this information is a challenge for state and local emergency responders trying to prevent the type of chemical disasters that devastated West, Texas, and Geismar, Louisiana, in 2013. Fortunately, however, NOAA and EPA have a suite of software tools—known as CAMEO—that helps make this task a little easier.

One State’s Approach to Better Data Sharing

As required by the Emergency Planning and Community Right-to-Know Act (EPCRA), which was passed to help communities plan for emergencies involving hazardous substances, each state, Local Emergency Planning Committee, and local fire department receives hazardous material information via hazardous chemical inventories, or “Tier 2” reports. This information represents one part of the picture for local communities, but as the federal work group pointed out, it is not enough.

Already familiar with the CAMEO software suite, Oklahoma’s state emergency planners decided to use this complementary set of programs to tackle the goal of better sharing chemical safety data, as outlined in Executive Order 13650.

Under EPCRA, each state is required to have a State Emergency Response Commission to oversee the law’s hazardous chemical emergency planning programs. In Oklahoma, the group is known as the Oklahoma Hazardous Materials Emergency Response Commission (OHMERC).

As their first step toward improving chemical data sharing with local planners, OHMERC set out to obtain hazardous material information from the EPA, Department of Homeland Security, and Bureau of Alcohol, Tobacco, Firearms, and Explosives. Then, they sought to make that information available to all Oklahoma Local Emergency Planning Committees (LEPC). Subsequently, these federal agencies began to contact other state representatives to explore avenues to share these data.

Each of the three federal agencies OHMERC contacted provided non-sensitive hazardous material program data (the state already had access to some of the information), but these data came in different file formats: some in spreadsheets, some as PDF files, and still others as text documents. As a result, there was no consistent format for delivering the information to local emergency planners.

Going Local

Oklahoma Local Emergency Planning Committees already use the CAMEO suite of software to manage their Tier 2 (EPA hazardous chemical inventory) reports. As a result, OHMERC decided to use the database program CAMEOfm to deliver additional information from other federal hazardous material programs to these local committees.

For each Tier 2 report, CAMEOfm has an “ID and Regs” section, which typically contains standard identifying codes for each local facility dealing with chemicals. For the appropriate facilities, OHMERC added new designations to the ID fields for the additional regulatory data from the Department of Homeland Security, EPA, and Bureau of Alcohol, Tobacco, Firearms, and Explosives. Now, local planners can search CAMEOfm to see which facilities in their jurisdiction are subject to several other hazardous material regulatory programs. If interested, local planners then can contact a facility, inquire why it is regulated by a particular program, gather more information, and plan directly with that facility.
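In concept, this is a simple tagging-and-filtering workflow: each facility record carries identifiers for the regulatory programs that apply to it, and a planner filters on those tags. Here is a minimal sketch of that idea in Python; the field names and program codes are purely illustrative and are not CAMEOfm’s actual data model.

```python
# Illustrative sketch only -- the field names and program codes below are
# hypothetical, not CAMEOfm's actual schema.

facilities = [
    {"name": "Facility A", "county": "Tulsa", "programs": {"Tier2", "CFATS", "RMP"}},
    {"name": "Facility B", "county": "Tulsa", "programs": {"Tier2"}},
    {"name": "Facility C", "county": "Cleveland", "programs": {"Tier2", "TRI"}},
]

def facilities_in_program(records, program, county=None):
    """Return facility records flagged for a regulatory program, optionally
    limited to a single county (a planner's jurisdiction)."""
    return [
        f for f in records
        if program in f["programs"] and (county is None or f["county"] == county)
    ]

# A Tulsa County planner looking for CFATS-regulated facilities:
for facility in facilities_in_program(facilities, "CFATS", county="Tulsa"):
    print(facility["name"])
```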

Since all the CAMEOfm records are linked to the MARPLOT mapping program (also part of the CAMEO software suite), Local Emergency Planning Committees now have the information mapped as well. For example, a planner from Tulsa County can search CAMEOfm for locations with chemicals regulated under the Department of Homeland Security’s Chemical Facility Anti-Terrorism Standards program (CFATS) and the EPA’s Risk Management Plan and Toxics Release Inventory programs. Next, the planner can display the results on a map using MARPLOT.

In addition, Oklahoma facilities regulated under EPA’s Risk Management Plan program have been encouraged to include the non-sensitive parts of their plans in the “Site Plans” section of CAMEOfm. Many, though not all, of these sites did so, realizing this was an effective method to ensure the local first responders had access to that important information.

Getting Data in Ship Shape

Oklahoma’s Local Emergency Planning Committees now have all of this chemical safety information in a consistent format, located in a familiar program where they easily can access it for planning and response efforts.

Screen shot of a CAMEOfm record with chemical information for a shipment of Bakken crude oil.

Railroads provide data that Oklahoma’s state emergency planners want to share with the local planning committees. The data include the appropriate Material Safety Data Sheets (MSDS) for Bakken crude oil, contact information for that railroad’s emergency response personnel, and a report of the number of trains shipping more than 1 million pounds of Bakken crude. This information can be added as a CAMEOfm record quickly and easily, and it is completely accessible to responders and planners alongside their other CAMEOfm records.

Another timely example of how Oklahoma is using this CAMEOfm and MARPLOT combination is managing information on rail shipments of Bakken crude oil through the state. Bakken oil is a highly flammable crude typically shipped by train from the Bakken region of North Dakota and Montana, and it has been involved in a number of high-profile explosions and fires after train cars carrying it derailed. OHMERC entered this shipment information, provided by the railroads, into CAMEOfm, where it is linked to the appropriate railroad map objects in MARPLOT. OHMERC then sends this material in the CAMEOfm and MARPLOT format to the relevant Local Emergency Planning Committees.

Using these programs to better share data is a step that any emergency planner or responder can take. You can find more information about the CAMEO software suite at response.restoration.noaa.gov/cameo.

This is a guest post by Oklahoma emergency planner Tom Bergman. He is the author of the CAMEO Companion and host of the www.cameotraining.org website. Tom is the EPCRA (Emergency Planning and Community Right-to-Know Act) Tier 2 Program Manager for the State of Oklahoma and has been a CAMEO trainer for many years. He has conducted CAMEO training courses in Lithuania, Poland, England, Morocco, and 45 U.S. states.



NOAA Builds Tool to Hold Unprecedented Amounts of Data from Studying an Unprecedented Oil Spill

This is a post by Benjamin Shorr of NOAA’s Office of Response and Restoration.

The Deepwater Horizon Oil Spill: Five Years Later

This is the seventh in a series of stories over the coming weeks looking at various topics related to the response, the Natural Resource Damage Assessment science, restoration efforts, and the future of the Gulf of Mexico.

The Deepwater Horizon oil spill was the largest marine oil spill in U.S. history. In the wake of this massive pollution release, NOAA and other federal and state government scientists need to determine how much this spill and ensuing response efforts harmed the Gulf of Mexico’s natural resources, and define the necessary type and amount of restoration.

That means planning a lot of scientific studies and collecting a lot of data on the spill’s impacts, an effort beginning within hours of the spill and continuing to this day.

Scientists collected oil samples from across the Gulf Coast. Oil spill observers snapped photographs of oil on the ocean surface from airplanes. Oceanographic sensors detected oil in the water column near the Macondo wellhead. Biologists followed the tracks of tagged dolphins as they swam through the Gulf’s bays and estuaries. Scientists are using this type of information—and much more—to better understand and assess the impacts to the Gulf ecosystem and people’s uses of it.

But what is the best way to gather together and organize what would become an unprecedented amount of data for this ongoing Natural Resource Damage Assessment process? Scientists from across disciplines, agencies, and the country needed to be able to upload their own data and download others’ data, in addition to searching and sorting through what would eventually amount to tens of thousands of samples and millions of results and observations.

First, a Quick Fix

Early on, it became clear that the people assessing the spill’s environmental impacts needed a single online location to organize the quickly accumulating data. To address this need, a team of data management experts within NOAA began creating a secure, web-based data repository.

This new tool would allow scientific teams from different organizations to easily upload their field data and other key information related to their studies, such as scanned field notes, electronic data sheets, sampling protocols, scanned images, photographs, and navigation information.

Graphic with gloved hands pouring liquid from a sample jar into a beaker, with the numbers of samples, results, and studies resulting from NOAA efforts.

While this data repository was being set up, NOAA needed an interim solution and turned to its existing database tool known as Query Manager. Query Manager allowed users to sort and filter some of the data types being collected for the damage assessment—including sediment, tissue, water, and oil chemistry results, as well as sediment and water toxicity data—but the scope and scale of the Deepwater Horizon oil spill called for more flexibility and features in a data management tool. When NOAA’s new data repository was ready, it took over from Query Manager.

Next, a New Data Management Solution

As efforts to both curtail and measure the spill’s impacts continued, scientific data began pouring in at unprecedented rates and in unprecedented variety. The NOAA team working on the new repository took stock of the types of data being entered into it and realized a database alone would not be enough. They searched for a better way not only to manage information in the repository but to organize the data and make them accessible to the myriad scientists on the Gulf Coast and in laboratories and offices across the country.

Building on industry standard, open source tools for managing “big data,” NOAA developed a flexible data management tool—known as a “data warehouse”—which gives users two key features. First, it allows them to integrate data sets and documents as different as oceanographic sensor data and field observations, and second, it allows users to filter and download data for further analysis and research.

Now, this data warehouse is a little different than the type of physical warehouse where you stack boxes of stuff on row after row of shelves in a giant building. Instead, this web-based warehouse contains a flexible set of tables which can hold various types of data, each in a specific format, such as text documents in .pdf format or images in .jpg format.

Screenshot of data management tool showing map with locations of various data.

NOAA’s data management tool allows users to integrate very different data sets and documents, such as water and oil samples and field observations, as well as filter and download data for further analysis and research. (NOAA)

To fill this digital warehouse with data, the development team worked with the scientific and technical experts, who in many cases were out collecting data in places impacted by the oil spill, to establish a flow of information into the appropriate tables in the warehouse. In addition, they standardized formats for entering certain data, such as date, types of analysis, and names of species.

Manual and automated checks ensure the integrity of the data being entered, a process which gets easier as new data arrive in the warehouse and are incorporated into the proper table. The process of standardizing and integrating data in one accessible location also helps connect cross-discipline teams of scientists who may be working on different parts of the ecosystem, say marsh versus nearshore waters.
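As a rough illustration of what an automated check like that might look like, consider validating each incoming record against a standardized date format and a couple of controlled vocabularies before loading it into its table. This is a sketch only; the field names, vocabularies, and rules below are hypothetical, not the warehouse’s actual logic.

```python
from datetime import datetime

# Hypothetical controlled vocabularies -- illustrative only.
ANALYSIS_TYPES = {"PAH", "TPH", "toxicity"}
SPECIES = {"Crassostrea virginica", "Tursiops truncatus"}

def validate_record(record):
    """Return a list of problems with a sample record; an empty list means
    the record passes the automated checks and can be loaded into its table."""
    problems = []
    try:
        # Standardized date format: YYYY-MM-DD
        datetime.strptime(record["date_sampled"], "%Y-%m-%d")
    except (KeyError, ValueError):
        problems.append("date_sampled missing or not in YYYY-MM-DD format")
    if record.get("analysis_type") not in ANALYSIS_TYPES:
        problems.append("unrecognized analysis_type")
    if record.get("species") and record["species"] not in SPECIES:
        problems.append("species name not in the standard list")
    return problems

print(validate_record({"date_sampled": "2010-06-15", "analysis_type": "PAH"}))  # []
```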

The NOAA team has also created a custom-built “query tool” for the data warehouse that can search and filter all of those diverse data in a variety of ways. A user can filter data by one or more values (such as what type of analysis was done), draw a box around a specific geographic area to search and filter data by location, select a month and year to sort by date sampled, or even type in a single keyword or sample ID. This feature is critical for the scientists and technical teams tasked with synthesizing data across time and space to uncover patterns of environmental impact.
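Conceptually, each of those filters narrows the same result set, and combining them (a value filter, a geographic bounding box, a sampling month, a keyword) is just taking the intersection. A simplified sketch of that kind of filtering follows; the sample fields are illustrative and are not the query tool’s actual interface.

```python
def query(samples, analysis_type=None, bbox=None, year_month=None, keyword=None):
    """Filter sample records the way the warehouse query tool does conceptually:
    by analysis type, geographic bounding box, month sampled, or keyword.
    All field names here are illustrative."""
    results = samples
    if analysis_type:
        results = [s for s in results if s["analysis_type"] == analysis_type]
    if bbox:  # (min_lon, min_lat, max_lon, max_lat)
        min_lon, min_lat, max_lon, max_lat = bbox
        results = [s for s in results
                   if min_lon <= s["lon"] <= max_lon and min_lat <= s["lat"] <= max_lat]
    if year_month:  # e.g., "2010-06"
        results = [s for s in results if s["date_sampled"].startswith(year_month)]
    if keyword:
        results = [s for s in results if keyword.lower() in s["sample_id"].lower()]
    return results

# e.g., PAH chemistry samples from June 2010 inside a box around the wellhead:
# query(samples, analysis_type="PAH", bbox=(-89.5, 28.0, -87.5, 29.5), year_month="2010-06")
```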

Download the Data Yourself

NOAA’s data warehouse currently holds validated damage assessment data from more than 53,000 water, tissue, oil, and sediment samples. Analysis of those samples has produced over 3.8 million analytical results, which are also stored within the new tool. Together, NOAA’s samples and analytical results have informed more than 16 scientific studies published in peer-reviewed scientific journals, as well as many other academic and scientific publications.

While not all of the data from the damage assessment are publicly available yet, you can access validated data collected through cooperative studies or otherwise made available through the Natural Resource Damage Assessment legal process.

You can find validated data exported from NOAA’s digital data warehouse available for download on both the Natural Resource Damage Assessment website and NOAA’s interactive online mapping tool for this spill, the ERMA Deepwater Gulf Response website. Stay tuned for more about this new tool, including additional details on how it works and where you can find it.



Who Is Funding Research and Restoration in the Gulf of Mexico After the Deepwater Horizon Oil Spill?

This is a post by Kate Clark, Acting Chief of Staff with NOAA’s Office of Response and Restoration, and Frank Parker, Associate Director for the NOAA RESTORE Act Science Program, with NOAA’s National Centers for Coastal Ocean Science.

The Deepwater Horizon Oil Spill: Five Years Later

This is the fourth in a series of stories over the coming weeks looking at various topics related to the response, the Natural Resource Damage Assessment science, restoration efforts, and the future of the Gulf of Mexico.

When an oil spill takes place, people want to see the coasts, fish, wildlife, and recreational opportunities affected by that spill restored—so they can be as they were before, as quickly as possible. Fortunately, the Oil Pollution Act of 1990 supports this. After most major oil spills, the government routinely undertakes a Natural Resource Damage Assessment, a rigorous, scientific process of assessing environmental injuries and, with public input, identifying and implementing the appropriate amount of restoration to compensate for the injuries resulting from the spill (all paid for by those responsible for the pollution).

What is not routine in the wake of an oil spill is the groundswell of support for even more research and restoration, beyond the scope of the usual damage assessment process, to bolster the resilience of the impacted ecosystem and coastal communities. Yet that is exactly what happened after the Deepwater Horizon well blowout in 2010, which renewed a national interest in the unique environment that is the Gulf of Mexico.

In the wake of this disaster, there have been various additional investments, outside of the Natural Resource Damage Assessment process, in more broadly learning about and restoring the Gulf of Mexico. These distinct efforts to fund research and restoration in the Gulf have been sizable, but keeping track of them can be, frankly, a bit confusing.

The many organizations involved are working to ensure the Gulf’s new infusions of funding for restoration and research are well coordinated. However, keep in mind that each effort is independent of the others in funding mechanism, primary mandate, and process.

Tracking Dollars for Gulf Restoration

In one effort, announced while the Macondo well was still gushing oil, BP dedicated up to $500 million to be spent over 10 years “to fund an independent research program designed to study the impact of the oil spill and its associated response on the environment and public health in the Gulf of Mexico.” This investment spawned the Gulf of Mexico Research Initiative, or GOMRI, which is governed by an academic research board of 20 science, public health, and research administration experts and operates independently of BP’s influence.

Meanwhile, BP faced both potential criminal and civil penalties under the Clean Water Act, which regulates the discharge of pollutants into U.S. waters. When the government pursues such penalties for pollution events, such as an oil spill, a portion of the criminal monetary penalties is usually paid to a local environmental foundation or conservation organization to administer the funds.

Ultimately, BP agreed to a $4 billion criminal settlement in 2013, with the bulk of that money going to the North American Wetlands Conservation Fund, the National Fish and Wildlife Foundation, and the National Academy of Sciences.

Chart showing various investments and their recipients for science and restoration efforts in the Gulf of Mexico after the Deepwater Horizon oil spill.

Science and restoration initiatives in the Gulf of Mexico following the Deepwater Horizon oil spill. (NOAA)

That still leaves civil penalties to be determined. Normally, civil penalties under the Clean Water Act are directed to the General Treasury.

However, Congress passed legislation calling for 80 percent of the administrative and civil penalties related to the Deepwater Horizon oil spill to be diverted directly to the Gulf of Mexico for ecological and economic restoration. This legislation, known as the RESTORE Act (Resources and Ecosystems Sustainability, Tourist Opportunities, and Revived Economies of the Gulf Coast States Act of 2012), passed on July 6, 2012.

While the full extent of BP’s civil penalties has yet to be determined, in 2013 the Department of Justice finalized a civil settlement with Transocean in the amount of $1 billion. This settlement results in more than $800 million going to the Gulf of Mexico under the RESTORE Act. As for penalties for BP, the court has so far ruled on two of the three trial phases. Based on those rulings, currently under appeal, the penalty cap for BP is $13.7 billion. A third trial phase, addressing the factors taken into account in setting the penalty at or under that cap, concluded in February 2015. The court has yet to rule on the third phase of the trial, and the pending appeals have not yet been heard by the appeals court.

NOAA and Restoration in the Gulf

So where does NOAA fit into all of this? NOAA is carrying out its usual duties of working with its partners to assess injury to and restore impacted natural resources through the Natural Resource Damage Assessment process. However, NOAA also is involved in supporting broader Gulf research and resilience, which will complement the damage assessment process, in two new ways through the RESTORE Act.

First, NOAA is participating in the RESTORE Act’s Gulf Coast Ecosystem Restoration Council, which is chaired by Commerce Secretary Penny Pritzker (NOAA sits within the Department of Commerce). Second, NOAA is leading the Gulf Coast Ecosystem Restoration Science, Observation, Monitoring, and Technology Program, or more simply, the NOAA RESTORE Act Science Program.

A NOAA ship at dock.

NOAA is leading a science program aimed at improving our understanding of the Gulf of Mexico and the plants and animals that live there, in order to better protect and preserve them. (NOAA)

This program exists because we simply don’t know as much as we need to know about the Gulf of Mexico and the plants and animals that live there in order to reverse the general decline of coastal ecosystems and ensure resilience in the future.

To make sure this new science program addresses the needs of the region, NOAA, in partnership with the U.S. Fish and Wildlife Service, met with resource managers, scientists, and other Gulf of Mexico stakeholders to discuss what the focus of the program should be. We heard three key messages loud and clear:

  • Make sure the research we support is closely linked to regional resource management needs.
  • Coordinate with other science initiatives working in the region.
  • Make the results of research available quickly to those who could use them.
Woman checks for bubbles in a sample of water on board the NOAA Ship Pisces.

The NOAA RESTORE Act Science Program is already in the process of making available $2.5 million for research in the Gulf of Mexico, with more opportunities to come. (NOAA)

NOAA and the U.S. Fish and Wildlife Service have designed a science plan [PDF] for the NOAA RESTORE Act Science Program that outlines how we will make this happen.

The science plan describes the research priorities highlighted during our engagement with stakeholders and from reviewing earlier assessments of the science needed to better understand the Gulf of Mexico. These priorities will guide how the program directs its funding over the coming years.

The research priorities include improving our understanding of how much and when freshwater, sediment, and nutrients enter the coastal waters of the Gulf of Mexico and what this means for the growth of wetlands and the number of shellfish and fish in the Gulf of Mexico. Another priority is developing new techniques and technologies for measuring conditions in the Gulf to help inform resource management decisions.

Apply for Research Funding

Currently, the NOAA RESTORE Act Science Program is holding its first competition for funding, with over 100 research teams already responding. It will make $2.5 million available for researchers to review and integrate what we already know about the Gulf of Mexico and work with resource managers to develop strategies directing the program toward our ultimate goal of supporting the sustainability of the Gulf and its fisheries.

The results of this work also will help inform the direction of other science initiatives and restoration activities in the Gulf region. NOAA and the U.S. Fish and Wildlife Service will announce the winners of this funding competition in the fall of 2015.

To learn more about the NOAA RESTORE Act Science Program and future funding opportunities, visit http://restoreactscienceprogram.noaa.gov/.



University of Washington Helps NOAA Examine Potential for Citizen Science During Oil Spills

Group of people with clipboards on a beach.

One area where volunteers could contribute to NOAA’s scientific efforts related to oil spills is in collecting baseline data before an oil spill happens. (Credit: Heal the Bay/Ana Luisa Ahern, CC BY-NC-SA 2.0)

This is a guest post by University of Washington graduate students Sam Haapaniemi, Myong Hwan Kim, and Roberto Treviño.

During an oil spill, how can NOAA maximize the benefits of citizen science while maintaining a high level of scientific integrity?

This was the central question that our team of University of Washington graduate students has been trying to answer for the past six months. Citizen science is characterized by volunteers participating in scientific research, usually by gathering or analyzing amounts of data that scientists would be unable to handle on their own.

Dramatic improvements in technology—particularly the spread of smartphones—have made answering this question more real and more urgent. This, in turn, has led to huge growth in public interest in oil spill response, along with increased desire and potential ability to help, as demonstrated during the 2007 M/V Cosco Busan and 2010 Deepwater Horizon oil spill responses.

As the scientific experts in oil spills, NOAA’s Office of Response and Restoration has a unique opportunity to engage citizens during spills and enable them to contribute to the scientific process.

What’s in it for me?

Our research team found that the potential benefits of citizen science during oil spills extend to three groups of people outside of responders.

  • First, professional researchers can benefit from having so many more people involved in research. Having more citizen scientists available to help gather data can strengthen the accuracy of observations by drawing from a potentially greater geographic area and by bringing in more fine-grained data. In some cases, citizen scientists also are able to provide local knowledge of a related topic that professional researchers may not possess.
  • The second group that benefits is composed of the citizen scientists themselves. Citizen science programs provide a constructive way for the average person to help solve problems they care about, and, as part of a collective effort, their contributions become more likely to make a real impact. Through this process, the public also gets to learn about their world and connect with others who share this interest.
  • The final group that derives value from citizen science programs is society at large. When thoughtfully designed and managed, citizen science can be an important stakeholder engagement tool for advancing scientific literacy and addressing misperceptions of risk. Citizen science programs can provide opportunities to correct risk misconceptions, address stakeholder concerns, share technical information, and establish constructive relationships and dialogue about the science that informs oil spill response options.

How Should This Work?

Volunteer scrapes mussels off rocks at Hat Island.

A volunteer samples mussels off of Everett, Washington, as part of the citizen science-fueled NOAA Mussel Watch Program. (Credit: Lincoln Loehr, Snohomish County Marine Resources Committee)

Recognizing these benefits, we identified three core requirements that NOAA’s Office of Response and Restoration should consider when designing a citizen science program for oil spills.

  1. Develop a program that provides meaningful work for the public and beneficial scientific information for NOAA.
  2. Create a strong communication loop or network that can be maintained between participating citizens and NOAA.
  3. Develop the program in a collaborative way.

Building on these core requirements, we identified a list of activities NOAA could consider for citizen science efforts both before and during oil spill responses.

Before a response, NOAA could establish data collection protocols for citizen scientists, partner with volunteer organizations that could help coordinate them, and manage baseline studies with the affiliated volunteers. For example, NOAA would benefit from knowing the actual numbers of shorebirds found at different times of year in areas at high risk of oil spills. This information would help NOAA better distinguish impacts to those populations in the event of an oil spill in those areas.

During a response, NOAA could benefit from citizen science volunteers’ observations and field surveys (whether open-ended type or structured-questionnaire type), and volunteers could help process data collected during the response. In addition, NOAA could manage volunteer registration and coordination during a spill response.

How Could This Work?

Evaluating different options for implementing these activities, we found clear trade-offs depending on NOAA’s priorities, such as resource intensity, data value, liability, and participation value. As a result, we created a decision framework, or “decision tool,” for NOAA’s Office of Response and Restoration to use when thinking about how to create a citizen science program. From there, we came up with the following recommendations:

  1. Acknowledge the potential benefits of citizen science. The first step is to recognize that citizen science has benefits for both NOAA and the public.
  2. Define goals clearly and recognize trade-offs. Having clear goals and intended uses for citizen scientist contributions will help NOAA prioritize and frame the program.
  3. Use the decision tool to move from concept to operation. The decision tool we designed will help identify potential paths best suited to various situations.
  4. Build a program that meets the baseline requirements. For any type of citizen science program, NOAA should ensure it is mutually beneficial, maintains two-way communication, and takes a collaborative approach.
  5. Start now: Early action pays off. Before the next big spill happens, NOAA can prepare for potentially working with citizen scientists by building relationships with volunteer organizations, designing and refining data collection methods, and integrating citizen science into response plans.

While there is not one path to incorporating citizen science into oil spill responses, we found that there is great potential via many different avenues. Citizen science is a growing trend and, if done well, could greatly benefit NOAA during future oil spills.

You can read our final report in full at https://citizensciencemanagement.wordpress.com.

Sam Haapaniemi, Myong Hwan Kim, and Roberto Treviño are graduate students at the University of Washington in Seattle, Washington. The Citizen Science Management Project is being facilitated through the University of Washington’s Program on the Environment. It is the most recent project in an ongoing relationship between NOAA’s Office of Response and Restoration and the University of Washington’s Program on the Environment.



After an Oil Spill, How—and Why—Do We Survey Affected Shorelines?

Four people walking along a beach.

A team of responders surveying the shoreline of Raccoon Island, Louisiana, on May 12, 2010. They use a systematic method for surveying and describing shorelines affected by oil spills, which was developed during the Exxon Valdez spill in 1989. (U.S. Navy)

This is part of the National Ocean Service’s efforts to celebrate our role in the surveys that inform our lives and protect our coasts.

In March of 1989, oil spill responders in Valdez, Alaska, had a problem. They had a very large oil spill on their hands after the tanker Exxon Valdez had run aground on Bligh Reef in Prince William Sound.

At the time, many aspects of the situation were unprecedented—including the amount of oil spilled and the level of response and cleanup required. Further complicating their efforts were the miles and miles of remote shoreline along Prince William Sound. How could responders know which shorelines were hardest hit by the oil and where they should focus their cleanup efforts? Plus, with so many people involved in the response, what one person might consider “light oiling” on a particular beach, another might consider “heavy oiling.” They needed a systematic way to document the oil spill’s impacts on the extensive shorelines of the sound.

Out of these needs ultimately came the Shoreline Cleanup and Assessment Technique, or SCAT. NOAA was a key player in developing this formal process for surveying coastal shorelines affected by oil spills. Today, we maintain the only SCAT program in the federal government, although we have been working with the U.S. Environmental Protection Agency (EPA) to help develop similar methods for oil spills on inland lakes and rivers.

Survey Says …

SCAT aims to describe both the oil and the environment along discrete stretches of shoreline potentially affected by an oil spill. Based on that information, responders then can determine the appropriate cleanup methods that will do the most good and the least harm for each section of shoreline.

The teams of trained responders performing SCAT surveys normally are composed of representatives from the state and federal government and the organization responsible for the spill. They head out into the field, armed with SCAT’s clear methodology for categorizing the level and kind of oiling on the shoreline. This includes standardized definitions for describing how thick the oil is, its level of weathering (physical or chemical change), and the type of shoreline impacted, which may be as different as a rocky shoreline, a saltwater marsh, or flooded low-lying tundra.

After carefully documenting these data along all possibly affected portions of shoreline, the teams make their recommendations for cleanup methods. In the process, they have to take a number of other factors into account, such as whether threatened or endangered species are present or if the shoreline is in a high public access area.

It is actually very easy to do more damage than good when cleaning up oiled shorelines. The cleanup itself—with lots of people, heavy equipment, and activity—can be just as harmful to the environment as the spilled oil, or even more so. For sensitive areas, such as a marsh, taking no cleanup action is often the best option for protecting the stability of the fragile shoreline, even if some oil remains.

Data, Data Everywhere

Having a common language for describing shoreline oiling is a critical piece of the conversation during a spill response. Without this standard protocol, spill responders would be reinventing the wheel for each spill. In that same vein, responders at NOAA are working with the U.S. EPA and the State of California to establish a common data standard for the mounds of data collected during these shoreline surveys.

Managing all of that data and turning it into useful products for the response is a lot of work. During bigger spills, multiple data specialists work around the clock to process the data collected during SCAT surveys, perform quality assurance and control, and create informational products, such as maps showing where oil is located and its level of coverage on various types of shorelines.

Data management tools such as GPS trackers and georeferenced photographs help speed up that process, but the next step is moving from paper forms used by SCAT field teams to electronic tools that enable these teams to directly enter their data into the central database for that spill.

Our goal is to create a data framework that can be translated into any tool for any handheld electronic device. These guidelines would provide consistency across digital platforms, specifying exactly what data are being collected and in which structure and format. Furthermore, they would standardize which data are being shared into a spill’s central database, whether they come from a state government agency or the company that caused the spill. This effort feeds into the larger picture for managing data during oil spills and allows everyone working on that spill to understand, access, and work with the data collected, for a long time after the spill.
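In practice, a data standard like that boils down to a shared record structure plus controlled vocabularies that any field tool, paper or electronic, can produce. Here is a hypothetical sketch of what a single standardized SCAT segment observation might look like; the field names and category values are illustrative and are not the draft NOAA guidelines.

```python
# Hypothetical example of a standardized SCAT segment observation --
# field names and category values are illustrative, not NOAA's draft standard.
scat_observation = {
    "segment_id": "LA-RAC-014",
    "survey_date": "2010-05-12",
    "team": "SCAT-3",
    "shoreline_type": "sand beach",    # drawn from a controlled list of shoreline types
    "oil_distribution": "patchy",      # standardized descriptors for oil coverage
    "oil_thickness": "cover",          # standardized thickness categories
    "weathering": "emulsified",        # standardized weathering states
    "lat": 29.062,
    "lon": -90.961,
    "photos": ["IMG_0412.jpg"],
    "cleanup_recommendation": "manual removal",
}
```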

Currently, we are drafting these data standards for SCAT surveys and incorporating feedback from NOAA, EPA, and California. In the next year or two, we hope to offer these standards as official NOAA guidelines for gathering digital data during oiled shoreline surveys.

To learn more about how teams perform SCAT surveys, check out NOAA’s Shoreline Assessment Manual and Job Aid.



NOAA’s Online Mapping Tool ERMA Opens up Environmental Disaster Data to the Public

Six men looking at a map with a monitor in the background.

Members of the U.S. Coast Guard using ERMA during the response to Hurricane Isaac in 2012. (NOAA)

This is a post by the NOAA Office of Response and Restoration’s Jay Coady, Geographic Information Systems Specialist.


March 15-21, 2015 is Sunshine Week, an “annual nationwide celebration of access to public information and what it means for you and your community.” Sunshine Week is focused on the idea that open government is good government. We’re highlighting NOAA’s Environmental Response Management Application (ERMA) as part of our efforts to provide public access to government data during oil spills and other environmental disasters.    

Providing access to data is a challenging task during natural disasters and oil spill responses—which are hectic enough situations on their own. Following one of these incidents, a vast amount of data is collected and can accumulate quickly. Without proper data management standards in place, it can take a lot of time and effort to ensure that data are correct, complete, and in a useful form that has some kind of meaning to people. Furthermore, as technology advances, responders, decision makers, and the public expect quick and easy access to data.

NOAA’s Environmental Response Management Application (ERMA®) is a web-based mapping application that pulls in and displays both static and real-time data, such as ship locations, weather, and ocean currents. Following incidents including the 2010 Deepwater Horizon oil spill and Hurricane Sandy in 2012, this online tool has aided in the quick display of and access to data not only for responders working to protect coastal communities but also the public.

From oil spill response to restoration activities, ERMA plays an integral part in environmental data dissemination. ERMA reaches a diverse group of users and maintains a wide range of data through a number of partnerships across federal agencies, states, universities, and nations.

Because it is accessible through a web browser, ERMA can quickly communicate data among people across the country working on the same incident. At the same time, ERMA maintains a public-facing side which allows anyone to access publicly available data for that incident.

ERMA in the Spotlight

During the Deepwater Horizon oil spill in the Gulf of Mexico, ERMA was designated as the “common operational picture” for the federal spill response. That meant ERMA displayed response-related activities and provided a consistent visualization for everyone involved—which added up to thousands of people.

Screen grab of ERMA map.

ERMA map showing areas of dispersant application during the response to the Deepwater Horizon oil spill in 2010. (NOAA)

To date, the ERMA site dedicated solely to the Deepwater Horizon spill contains over 1,500 data layers that are available to the public. Data in ERMA are displayed in layers, each of which is a single set of data. One example of a data layer is the cumulative oil footprint of the spill. This single data layer adds together the various parts of the ocean surface the oil spill affected at different times over the entire course of the spill, as measured by satellite data. Another example is the aerial dispersant application data sets, which are grouped by day into a single data layer and show the locations where chemical dispersants were applied to oil slicks in 2010.
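Under the hood, a layer in a web mapping application is typically just a named collection of geographic features, each with a geometry and a set of attributes, much like GeoJSON. A minimal, hypothetical illustration of the idea (not ERMA’s actual data format):

```python
# Hypothetical, simplified GeoJSON-style data layer -- for illustration only,
# not ERMA's actual data format.
dispersant_layer = {
    "type": "FeatureCollection",
    "name": "Aerial dispersant application, 2010-06-15",
    "features": [
        {
            "type": "Feature",
            "geometry": {"type": "Point", "coordinates": [-88.37, 28.74]},
            "properties": {"date": "2010-06-15", "gallons_applied": 1200},
        },
    ],
}
```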

Even today, ERMA remains an active resource during the Natural Resource Damage Assessment process, which evaluates environmental harm from the oil spill and response, and NOAA releases data related to these efforts to the public as they become available. ERMA continues to be one of the primary ways that NOAA shares data for this spill with the public.

ERMA Across America

While the Deepwater Horizon oil spill may be one of ERMA’s biggest success stories, NOAA has created 10 other ERMA sites customized for various U.S. regions. They continue to provide data related to environmental response, cleanup, and restoration activities across the nation’s coasts and Great Lakes. These 10 regional ERMA sites together contain over 5,000 publicly available data layers, ranging from data on contaminants and environmentally sensitive resources to real-time weather conditions.

For example, in 2012, NOAA used Atlantic ERMA to assist the U.S. Coast Guard, Environmental Protection Agency, and state agencies in responding to pollution in the wake of Hurricane Sandy. Weather data were displayed in near real time as the storm approached the East Coast, and response activities were tracked in ERMA. The ERMA interface provided publicly available data, including satellite and aerial imagery, storm inundation patterns, and documented storm-related damages. You can also take a look at a gallery of before-and-after photos from the Sandy response, as viewed through Atlantic ERMA.

Screen grab of an ERMA map.

An ERMA map showing estimated storm surge heights in the Connecticut, New York and New Jersey areas during Hurricane Sandy. (NOAA)

In addition, the ERMA team partnered with NOAA’s Marine Debris Program to track Sandy-related debris, in coordination with state and local partners. All of those data are available in Atlantic ERMA.

Looking to the north, ERMA continues to be an active tool in Arctic oil spill response planning. For the past two years, members of the ERMA team have provided mapping support using Arctic ERMA during the U.S. Coast Guard’s Arctic Technology Evaluation exercises, which took place at the edge of the sea ice north of Barrow, Alaska. During these exercises, the crew and researchers aboard a Coast Guard icebreaker tested potential technologies for use in Arctic oil spill response, such as unmanned aircraft systems. You can find the distributions of sensitive Alaskan bird populations, sea ice conditions, shipping routes, and pictures related to these Arctic exercises, as well as many more data sets, in Arctic ERMA.

Screen grab of an Arctic ERMA map.

ERMA is an active tool in Arctic oil spill response planning. (NOAA)

To learn more about the online mapping tool ERMA, visit http://response.restoration.noaa.gov/erma.

Jay Coady is a GIS Specialist with the Office of Response and Restoration’s Spatial Data Branch and is based in Charleston, South Carolina. He has been working on the Deepwater Horizon incident since July 2010 and has been involved in a number of other responses, including Post Tropical Cyclone Sandy.



For Alaska’s Remote Pribilof Islands, a Tale of Survival and Restoration for People and Seals

Set in the middle of Alaska’s Bering Sea, a string of five misty islands known as the Pribilof Islands possesses a long, rich, and at times dark history. A history of near extinction, survival, and restoration for both people and nature. A history involving Alaska Natives, Russians, the U.S. government and military, and seals.

It begins with the native people, known as the Unangan, who live there. They tell a story that, as they say, belongs to a place, not any one person. The story is of the hunter Iggadaagix, who first found these islands many years ago after being swept away in a storm and who wanted to bring the Unangan back there from the Aleutian Islands. When the Unangan finally did return for good, it was in the 18th century, and their lives would become intimately intertwined with those of the northern fur seals (Callorhinus ursinus). Each summer roughly half of all northern fur seals breed and give birth in the Pribilof Islands.

Map of fur seal distributions in Bering Sea and Pacific Ocean, with location of Pribilof Islands.

An 1899 map of the distribution (in red) and migrations of the American and Asiatic Fur Seal Herds in the Bering Sea and North Pacific Ocean. Based on data collected 1893-1897. The Pribilof Islands (St. Paul and St. George) are visible north of the main Aleutian Islands, surrounded by the center collections of red dots. Click to enlarge. (U.S. Government)

But these seals and their luxurious fur, along with the tale of Iggadaagix, would eventually bring about dark times for the seals, the Unangan, and the islands themselves. After hearing of Iggadaagix and searching for a new source of furs, Russian navigator Gavriil Loginovich Pribylov would land in 1786 on the islands which would eventually bear his name. He and others would bring the Unangan from the Aleutian Islands to the Pribilofs’ St. George and St. Paul Islands, where they would be put to work harvesting and processing the many fur seals.

In these early years on the islands, Russian hunters so quickly decimated the fur seal population that the Russian-American Company, which held the charter for settling there, suspended hunting from 1805 to 1810. The annual limit for taking fur seals was then set at 8,000 to 10,000 pelts, allowing the population to rebound significantly.

The United States Arrives at the Islands

Fast forward to 1867, when the United States purchased Alaska, including the Pribilof Islands, from Russia for $7.2 million.

Some people considered the lucrative Pribilof Islands fur seal industry to have played a role in this purchase. In fact, this industry more than repaid the U.S. government for Alaska’s purchase price, hauling in $9,473,996 between 1870 and 1909.

The late 19th and early 20th centuries saw various U.S. military branches establish stations on the Pribilof Islands, as well as several (at times unsuccessful) attempts to control the reckless slaughter of fur seals. From 1867 until 1983, the U.S. government managed the fur seal industry on the Pribilof Islands.

In 1984, the Unangan finally were granted control of these islands, but the government had left behind a toxic legacy from commercial fur sealing and former defense sites: hazardous waste sites, dumps, contaminants, and debris.

Making Amends with the Land

This is where NOAA comes into the picture. In 1996, the Pribilof Islands Environmental Restoration Act called on NOAA to address the environmental degradation of the Pribilof Islands. In particular, a general lack of historical accountability on the islands had led to numerous diesel fuel spills and leaks and to improperly stored and disposed-of waste oils and antifreeze. By 1997 NOAA had removed thousands of tons of old cars, trucks, tractors, barrels, storage tanks, batteries, scrap metal, and tires from St. Paul and St. George Islands. Beginning in 2002, NOAA’s efforts transitioned to cleaning up soil contamination and assessing potential pollution in groundwater.

However, the Department of Defense has also been responsible for environmental cleanup at the Pribilof Islands. The U.S. Army occupied the islands during World War II and left behind debris and thousands of 55-gallon drums, which were empty by 1985 but had previously contained petroleum, oils, and lubricants, which could have leaked into the soil.

By 2008, NOAA’s Office of Response and Restoration had fulfilled its responsibilities for cleaning up the contamination on the Pribilof Islands, closing a dark chapter for this remote and diverse area of the world and hopefully continuing the healing process for the Unangan and fur seals who still call these islands their home.

Learn More about the Pribilof Islands

Man posing with schoolchildren.

Dr. G. Dallas Hanna with a class of Aleut schoolchildren on St. George Island, Alaska, circa 1914. (National Archives)

You can dig even deeper into the wealth of historical information about the Pribilof Islands at pribilof.noaa.gov.

There you can find histories, photos, videos, and documents detailing the islands’ various occupations, the fur seal industry, the relocation of the Unangan during World War II, the environmental contamination and restoration, and more.

