SUMMARY OF THE REMEDIATION TECHNOLOGIES DEVELOPMENT FORUM
SEDIMENTS REMEDIATION ACTION TEAM
ASSESSMENT SUBGROUP MEETING
Crowne Plaza Cincinnati
May 9, 2000
WELCOME AND OPENING REMARKS
Richard Jensen opened the meeting by welcoming participants to the Assessment Subgroup meeting. (A
list of meeting attendees is included as Attachment A.) He provided an overview of the agenda items for
this meeting: (1) review the Subgroup's draft white papers, (2) evaluate the sediments assessment game,
and (3) discuss the Subgroup's plans for the November 2000 Society of Environmental Toxicology and Chemistry (SETAC) meeting.
REVIEW OF DRAFT WHITE PAPERS
During the Subgroup's previous meeting, participants agreed that 12 white papers should be written to address a variety of issues that pertain to sediments. Nine of these papers were submitted for review during this Subgroup meeting.

Dennis Timberlake suggested that, before reviewing each white paper, the Subgroup should agree upon the purpose and use of the papers. He thought the papers should identify data gaps and research needs for the Subgroup to pursue. Timberlake cautioned that the Subgroup must be careful to avoid writing white papers that could be interpreted as guidance documents. (One Subgroup member noted that the Remediation Technologies Development Forum's [RTDF's] Bioremediation Consortium had the same concern about a training course it prepared. To prevent people from thinking that the course was presenting guidance, it was presented as an industry-developed program that lacked regulatory endorsement.)

Some Subgroup members said the white papers should serve as primers for people conducting sediments remediation. Each white paper discusses an assessment or remediation technology, or concerns that must be considered when assessing or remediating sites. The white papers that describe technologies discuss their advantages and disadvantages, as well as issues to consider when using them. Jensen stated that the white papers should strike a balance between being highly technical documents for internal use and being simplified descriptions for wide distribution. Subgroup members agreed that they should reach a consensus about the papers' content and use before the November 2000 SETAC meeting.

Jensen asked authors to summarize their papers and Subgroup members to provide comments. They obliged; information on each paper is presented below.
Evaluating Reference Area Conditions in Sediment Assessments
Written by Ralph Stahl, DuPont Corporate Remediation
Presented by Richard Jensen, DuPont Corporate Remediation
Ralph Stahl, the paper's author, could not attend the Subgroup meeting, so Jensen presented the paper for discussion. This white paper discusses reference area selection. For sediment assessments, the reference area is assumed to be similar to the study area, but without the substances of concern. The reference area is used to determine if remediation is needed, evaluate changes in the study area after remediation, and determine if remediation has been successful.
Selecting an inappropriate reference area may lead investigators and risk managers to draw inappropriate conclusions and to make inappropriate risk management decisions. For example, if shellfish population is an evaluation criterion and the shellfish population in the reference area is different from the population in the study area, it may not be appropriate to compare data from the two sites.
Currently, there is no standard approach to selecting a reference area. This white paper presents several criteria to consider during the selection process. Some of these are: (1) absence of the substance suspected of causing impacts to the study area, (2) physical nature of the sediment (e.g., grain size), (3) chemical composition of the sediment (e.g., organic carbon and pH), (4) physical proximity to the study area, (5) flow dynamics, and (6) composition of the benthic community. Some of the problems encountered when selecting a reference area include: (1) there are usually only a few locations proximate to the study area where substances are absent, (2) grain size can be hard to match, (3) chemical composition can be hard to match, (4) nearby areas can be hard to locate, and (5) benthic studies are often time- and resource-intensive.
On pages 3 and 4, the paper discusses how difficult it is to find reference areas in highly industrialized areas. There may not be any areas free of the substance under study. David Moore suggested rewording the text on page 3 to state that a reference area may contain some of the substance under study, but that it should be free of impact from this substance.
Another Subgroup member stated that multiple reference areas were selected at a site in San Diego to identify an overall background level. The white paper should mention this approach.
One Subgroup member asked if investigators ever decide that there is no suitable reference area and, if so, what is done in those instances. Can the involved parties, for example, come to an agreement about what would be an acceptable remediation goal without having a reference area? Sabine Apitz commented that, in the absence of a reference area, investigators can use regional ambient and background data. In these cases, data from several regional sites should be analyzed to evaluate the regional ambient and background conditions against site-specific conditions. Apitz has seen different approaches used when regional data are presented in a risk assessment. The data may be used either at the beginning to eliminate evaluation of substances or at the end to evaluate risk. Apitz cautioned that investigators who select only one or two reference areas may not get enough data to conduct statistical evaluations and draw conclusions.
A Subgroup member brought up the Passaic River in New Jersey as an example of a river that will never again be like it was before industrialization and urbanization. In trying to restore such a highly polluted river, the Subgroup member asked, is there a natural process for restoring the river? What can be done to encourage this process? For example, after a forest fire, there is a known sequence of regrowth before the original forest composition returns. Jensen replied that perhaps the practice of restoring study areas to reference area conditions is best applied to less polluted systems. Attempting to restore a portion of a highly polluted river may not be appropriate. This question led to the broader comment that the white paper, although it captures many of the issues surrounding reference area selection, could discuss urbanized areas in more detail. One of the problems associated with finding reference areas in such locations is separating the many contamination sources. For the Passaic River, investigators selected another river system with different characteristics as the reference area.
Ken Finkelstein suggested including the following approach in the white paper: selecting near-field and far-field reference areas for study areas in industrialized or urbanized rivers. (The near-field reference area would be within the industrialized area and the far-field reference area would be in the upper reaches of the impacted river or another river system.) This would provide information about the potential contamination ranges for the study area.
The white paper should state, one Subgroup member commented, that investigators should consider the questions they are trying to answer when selecting reference areas. For example, if investigators are trying to determine if a single substance is killing fish in a river, they could pick a reference area that is just as polluted, but does not have that substance.
Finkelstein mentioned that the last lines on page 4 imply that using the benthic community as an assessment tool is not a preferred method. Finkelstein suggested that this text be re-worded.
Moore mentioned that in marinas and shipyards with physical disturbances, physical impacts to the benthic community should be considered.
One Subgroup member commented that it may be difficult to maintain reference areas over the time periods that are needed to assess remediation projects. This white paper may lead people to believe that they must select a reference area for every study area. Instead of selecting reference areas, he said, the National Pollutant Discharge Elimination System (NPDES) program selects criteria such as "fishable" and "swimmable" as goals for discharged water. Perhaps those concepts could be expanded for use in sediments remediation.
Moore said that the U.S. Army Corps of Engineers (USACE) may have guidance for selecting reference areas. Stahl could consult this guidance while preparing the white paper.
The Conceptual Site Model
Written and Presented by Robert Hoke, DuPont Haskell Lab
Robert Hoke said that he wrote this white paper as a primer to describe conceptual site models. Hoke noted that there are a number of existing references and documents that describe these models. For example, the American Society for Testing and Materials has a short document that discusses conceptual site models.
A conceptual site model represents a site and its characteristics, including contaminants of concern, contaminant fate and transport, and receptor populations. It can also include reference areas and background conditions. A conceptual site model is developed to focus investigation and remediation of a site in the most cost-effective manner. It can also be used as a baseline to track changing site conditions (e.g., changes resulting from remedial actions or weather events). Hoke's white paper briefly discusses the physical processes and chemical processes to consider when developing conceptual site models. (Hoke noted that biological processes are not discussed in detail and need to be added.) The white paper also lists the steps involved and the points that should be considered when developing conceptual site models. Hoke noted that developing a conceptual site model is an iterative process. The model should be changed and refined as more data are collected.
Timberlake asked if conceptual site models include computer modeling. Hoke stated that a conceptual site model is not one specific thing, but rather a person's idea or picture of a site. Components may include geological or chemical site data or information about the biological community at a site. Mathematical and computer models may be used to evaluate these data and refine the conceptual site model. Jensen felt that computer and mathematical modeling is an important part of the conceptual site model, and of understanding chemical movement in an aquatic system. Another Subgroup member stated that a conceptual site model is a tool used in risk assessment to define current site conditions and to predict future site conditions to evaluate potential risks. Usually, a conceptual site model is built with very little data and poses hypotheses about what is happening at a site. The model is continually refined as more site data become available.
A Subgroup member stated that many environmental permits require conceptual site models to anticipate future site uses. Moore said that such anticipation is especially important when considering capping as a remediation strategy in active ports--capping in a navigation channel limits future activities to widen and deepen the channel. Apitz said that risk assessments are often conducted considering management options, and the conceptual site model can help anticipate appropriate management options.
Jensen said that developing conceptual site models is part of developing a strategy to address contamination. He noted that models should consider the history of a site and identify any ongoing contamination sources there. Natural processes, and how they can be used, should also be considered when the site remedy is selected. Apitz felt that the conceptual site model should also capture the differences between site-specific and regional contamination and risk.
A Subgroup member asked if a conceptual site model is used to delineate the geographic area of a site. Hoke replied that the conceptual site model is based on the investigation's goals, such as determining if contamination comes from the site or an upstream source. The conceptual site model, therefore, can be used to define areas. For example, a conceptual site model may hypothesize about source location. Area could be defined in many ways in the conceptual site model (e.g., contaminant concentration, geographic location, or population size).
A number of different conceptual site models may be produced for a single site. Often, the regulators and investigators at a site each develop their own models. Similarly, risk assessors and risk managers may produce different conceptual site models. In addition, investigators can develop a conceptual site model based on public perceptions of the issues at a site; issues that are not of concern for the investigators are often of great concern for the community. A conceptual site model can help investigators address these perceived issues in the early stages of a site investigation.
A Subgroup member suggested that Hoke add some examples of specific concerns related to sediments.
Timberlake stated that this paper is a good contribution to the Subgroup's white papers, because it introduces the topic. The white paper, he said, should reference other sources of information.
Application of Sediment Toxicity Testing in Site Remediation Activities
Written and Presented by David Moore, MEC Analytical Systems, Inc.
Moore said that this white paper discusses different types of available sediment toxicity tests. The paper is organized by test type (whole sediment, sediment porewater, sediment extract, and sediment elutriate), he said, and was written to serve as an educational tool. Moore said that he thinks this tool is needed because investigators often select tests that cannot answer their questions. He said that the paper provides a general discussion of different testing options, discusses the advantages and disadvantages of each test, and lists issues to consider when conducting sediment toxicity testing.
Finkelstein suggested that Moore add test durations to the description of the whole sediment test. Moore said he had considered including the durations of short-term and long-term tests, but thought doing so might provide more detail than necessary. Moore agreed to reword the white paper to state more clearly that longer-term tests are available, and to provide references for more information. Another Subgroup member cautioned that people with limited technical backgrounds might assume that a longer test is a better test, which is not always true. Longer tests can have more confounding factors, and their results sometimes require more intensive interpretation than those of shorter tests. This concern should be stated in the white paper.
Moore noted that there are a number of concerns and issues that can affect the application of sediment toxicity tests, but that these are seldom documented. For example, benthic invertebrates do not possess the receptor that is affected by dioxin exposure. Therefore, there is not much concern about dioxins adversely impacting the health of benthic communities; there is concern, however, about these compounds bioaccumulating in benthic communities and impacting higher trophic levels. Moore suggested that the white paper mention some of these concerns and provide references for more information. Apitz agreed that concerns and issues should be mentioned to tell investigators what they should consider when selecting a test.
One Subgroup member stated that the white paper does not mention surrogate tests using semi-permeable membranes. Moore replied that these tests study bioaccumulation and are, therefore, not true toxicity tests.
Finkelstein said that he was not sure whether the sediment extract, sediment-water interface, or sediment elutriate tests are used anymore. He asked if they should be used more; if not, perhaps the white paper should not discuss them. Moore said that he included a brief discussion of all available tests since the white paper is intended to serve as an educational tool. Moore agreed that some of these tests are no longer commonly used, but said that people still do consider using them. For those people, Moore wants to highlight the tests' limitations. For some of the other tests (e.g., sediment-water interface tests), Moore included discussions because use is increasing, especially in the western United States.
Finkelstein suggested dividing the long paragraph on page 2 into several paragraphs. He also commented on the summary on page 7, asking Moore to add benthic community studies to the list of other information to consider.
Characterizing the Spatial Extent of Sediment Contamination at Impacted Sites
Written by Joseph Jersak, Hull & Associates, Inc.
Presented by John Hull, Hull & Associates, Inc.
Joseph Jersak could not attend the Subgroup meeting, so John Hull presented this white paper. He began by saying that spatial characterization of contamination is done at a rudimentary level or as an afterthought at many sites. Jersak's white paper discusses how people can better characterize contamination.
Hull commented that Jersak might be able to present some of the information in the paper more concisely. He noted the importance of using statistical models to identify sample locations and support conclusions. It is also important, he said, to consider sediment type, flow velocities, and other aquatic properties--rather than using a simple grid sampling pattern--when developing a sampling plan.
Apitz said that on page 1, paragraph 3, the third objective should either be removed from the white paper or be described in more detail. If removed, she said, it could serve as the topic for a separate white paper.
Moore mentioned that USACE conducted a risk assessment of the Bush and Gunpowder Rivers; these are adjacent to the Aberdeen Proving Ground on the Chesapeake Bay in Maryland. USACE was able to use toxicity tests to help define spatial characterization of the site. USACE argued that if they could reduce the number of laboratory replicates, they could redistribute funds to collect more field samples, improving their understanding of contamination extent and spatial variability. Moore suggested mentioning this fund redistribution approach in the white paper. Hull agreed that it is a good example.
Jensen felt it was important that the white paper discuss the smart-site approach to evaluating site contamination. Instead of using a grid sampling pattern, this approach involves collecting initial samples, analyzing them at an onsite mobile laboratory, and then collecting additional samples after obtaining the initial analytical results. Apitz stated that initial sampling plans are often not comprehensive; additional sampling and equipment mobilization are often required to characterize a site. This supports using the smart-site approach. Finkelstein noted that samples collected for toxicity tests should be collected at the same time as those collected for chemical tests; this may cause problems if the smart-site approach is used.
Jensen asked whether correlations can be made between sediment toxicity and contaminant concentrations. If so, he suggested, it might be useful to predict toxicity based on contaminant concentrations because the latter are cheaper to analyze. Apitz said that correlations can be made, but that it is often difficult to establish them, because of confounding factors that exist at sites. Another Subgroup member agreed that correlations can be drawn if an extensive data set exists, but he questioned whether chemical analyses will truly be cheaper, especially when multiple chemicals are analyzed or unique analyses are required.
A Subgroup member mentioned that there are inexpensive methods available to characterize sites and develop conceptual site models. For example, computer or mathematical models can be applied to identify areas for toxicity and chemical tests.
Determining Contaminant and Sediment Fate
Written and Presented by Danny Reible, Louisiana State University
When preparing this white paper, Danny Reible said, he identified two topics that could be discussed: (1) the processes that influence the fate of contaminants and sediments, and (2) the methods that are used to assess the fate of sediments and contaminants. Reible decided to focus the white paper on the former topic, because he felt that people must understand the issues that affect fate before they try to assess it. The white paper discusses contaminant and sediment fate as it affects exposure and the risk assessment process. It summarizes the key processes that affect contaminant and sediment fate in an erosional environment and a net depositional environment. In an erosional environment, the contaminant dynamics may be controlled largely by the sediment dynamics. Efforts, therefore, should be focused on understanding sediment dynamics. In a depositional environment, contaminant dynamics are largely controlled by bioturbation and physicochemical processes. These processes should be defined during site investigations.
Reible asked Subgroup members if he should compress the paper and add a discussion about how to determine fate. As an alternative, he said, he could just address this topic in a separate white paper. Apitz said that the white paper should remain as written and that a second white paper should be prepared. This second paper, she said, could discuss what parameters should be measured and how they should be analyzed. Reible agreed to write the second paper.
One Subgroup member suggested reorganizing the paper so that the processes that affect contaminant and sediment fate are presented in the context of environment descriptions. (As currently written, information on erosional and depositional environments and processes that influence fate are presented separately.)
A Subgroup member made an editorial comment: the white paper's title does not accurately describe the paper's content. Reible agreed to revise the title.
Monitoring Remedial Effectiveness
Written and Presented by Danny Reible, Louisiana State University
Danny Reible said that this white paper discusses how to monitor the impact and efficacy of remediation technologies. He said that efforts to evaluate efficacy are often minimized. He said that understanding a technology's impact and effectiveness at one site may lead to better decisions about remediation at others.
Reible said that monitoring remedial effectiveness should be composed of three parts: (1) monitoring to meet a primary goal, (2) interim monitoring, and (3) monitoring of the remedy implementation. The primary endpoint of the remediation process may include protecting or recovering a resource--for example, revitalizing a fishery. The first part of the monitoring program should evaluate if this primary endpoint has been reached. Because recovery of a resource may require many years of monitoring to evaluate, interim monitoring goals may be established to assess effectiveness sooner. Interim monitoring goals may include chemical or biological measurements. Monitoring the remedy during implementation can determine if short-term goals are being achieved. Engineering and construction monitoring (e.g., monitoring cap thickness during application) determines if the remedy has been installed properly. Monitoring for protection of human and ecological health during implementation includes assessing contaminant loss. To meet monitoring goals, Reible said, sufficient data are needed to define the mass flows in systems. He said that monitoring data could also be used to compare effectiveness of remediation technologies at different sites.
A Subgroup member noted that, when a generic technology (e.g., manufactured soil) is applied to a particular site, the goal is often to create an environment in which plants and animals can live and grow. However, in developing the manufactured soil technology, engineers focused on chemical mass balance and did not consider the toxicity of intermediate breakdown products. Biological tests can be conducted either to assess the toxicity of intermediate contaminants or to assess the effectiveness of the technology during development. These tests provide a feedback loop to help determine what mixtures work or how the matrix should be manipulated. In response, Reible said that he would modify the paper to place more emphasis on biological measures and to include the concept of feedback loops.
Another Subgroup member suggested that the white paper include a discussion of how end use, either of the site or the sediment, can affect a monitoring program.
One Subgroup member said that tracking injuries that can result from implementing remedial technologies is an important part of health and safety. Some components of the remediation process, such as dredging, are dangerous to workers. Reible agreed to add a statement about health and safety to the paper.
A Subgroup member noted that remediation can also have ecological impacts. For example, at a site in the Northeast, regulators decided that the short-term effects that would result from removing contaminated sediments would be very detrimental to local bird populations.
One Subgroup member asked how monitoring should be conducted in areas where the structure of the environment changes. (For example, dredging can convert a shallow environment into a deep one.) Finkelstein responded that regulators and trustees monitor the remediation process to account for changing environments and resource damage. For example, ports must create or protect shallow environments when dredging occurs. In some cases, Jensen said, remediated areas are backfilled to restore original environments.
In Situ Bioaccumulation Tests
Written and Presented by David Hohreiter, Blasland, Bouck & Lee, Inc.
Hohreiter said that this white paper introduces the topic of in situ bioaccumulation tests; discusses some of the tests that are available; and discusses the advantages, disadvantages, and issues associated with these tests. Hohreiter identified two types of in situ bioaccumulation tests: caged biota studies and semi-permeable membrane devices (SPMDs). The white paper provides a brief description of each type of test and includes references for more information. Both tests provide a direct measure of contaminant uptake in the water column or near the sediment and provide more control and reproducibility than studies that use a resident population. Overall, in situ bioaccumulation tests provide a good indication of the relative bioavailability of a contaminant. They can best be applied as part of a pre- and post-remediation monitoring program. However, they are not useful for risk evaluation, because the exposure scenarios they create are artificial.
Finkelstein suggested adding more examples of the caged mussel studies.
Apitz noted that many people use laboratory bioaccumulation tests. She asked if a discussion--perhaps only one paragraph long--could be included about these tests.
Jensen asked if there is existing literature that discusses how animals react to changing contaminant concentrations. For example, he asked whether there is literature that describes the time required for fish to reach equilibrium with contaminated water. Hohreiter said that there have been studies of steady-state conditions between fish and water contamination. Finkelstein responded that it is always worthwhile to know if a resident fish species is being exposed to the contaminant of interest. One way to find this out is to sample resident fish that are near the area of contamination and have small home ranges. Another Subgroup member felt that research in this field, which is ongoing, is important, especially because different species may act differently. Apitz said that there have been extensive studies on mussel uptake rates.
Finkelstein said that caged mussels can be measured before and after exposure to assess change in size. A Subgroup member asked if caging the study animals or placing them in nutrient-poor environments reduces lipid content and lowers tissue mass and, as a result, raises contaminant concentration. Hohreiter stated that his studies with juvenile minnows found lipid content and tissue mass reduction about one month after the minnows were caged, so this is an important issue to consider.
Bioavailability of Contaminants in Soil and Sediment Systems
Written and Presented by John Davis, The Dow Chemical Company
Davis said that this white paper serves as an educational tool; it defines bioavailability, summarizes issues surrounding it, and provides references for additional information. Often, regulatory decisions are made based on the total contaminant concentration at a site, even though only a fraction of this concentration is actually bioavailable. Processes that affect bioavailability include sorption, oxidation, and sequestration. Davis stated that the greatest uncertainty in understanding bioavailability is understanding the long-term stability of a contaminant's non-bioavailable fraction. The paper lists available methods to measure bioavailability, but Davis noted that the validity of these methods needs to be considered. Researchers are still working on a way to collect sediment samples and extract the bioavailable portion, rather than determining the total contaminant concentration. Davis noted that the paper does not discuss pharmacological availability, which affects the fate of a contaminant once it enters an organism. For example, dioxins are pharmacologically unavailable to benthic invertebrates because these organisms lack the receptor that dioxins affect.
One Subgroup member mentioned that the U.S. Environmental Protection Agency (EPA) is considering revising the ambient water quality criteria for metals to account for bioavailability. If this approach is used, it would be assumed that the free metal ion is the bioavailable portion. Copper, which has an external effect on organisms, will be the first metal considered. Davis stated that he recently read an article that suggested reviewing the idea that only the free metal ions are bioavailable. Another Subgroup member mentioned that metals have different toxicity to, and effects on, different species.
Jensen said that studies have found that, in terrestrial systems, only a small fraction of the total contaminant concentration is bioavailable. However, terrestrial environments are often regulated according to total contaminant concentrations. Similar research on bioavailability of polychlorinated biphenyls (PCBs) in sediment found that only a small portion of the PCBs at any site are bioavailable. Unlike terrestrial systems, however, sediments are already regulated based on toxic effects or body burdens.
There are many tests available to determine if the contaminants in sediment are causing adverse effects. However, another factor to consider in assessing bioavailability is whether exposure is likely. For example, a contaminant may be found at toxic levels in the sediment 10 feet below the surface, where it is not available for exposure. Biologists and engineers should work together to understand how biological processes, such as the work of deep-burrowing organisms, can transport contamination to make it available for exposure. For example, armoring to prevent bioturbation may be required when capping is the selected remedy. The conceptual site model should be used as a tool to understand the availability of a contaminant for exposure.
Ecological Assessment Tools
Written and Presented by Ken Finkelstein, National Oceanic and Atmospheric Administration (NOAA)
Finkelstein said that this white paper discusses measurement tools that can be used to investigate sediment contamination. Some of the tools mentioned in the paper are addressed in greater detail in other Subgroup white papers. Finkelstein said that this one is intended to serve as a primer for people who need to investigate sediment contamination. The white paper discusses measurement endpoints, describes tools, discusses chemical-specific considerations in applying these tools, and mentions food web risk assessment tools. The paper also has a lengthy reference list.
A Subgroup member asked whether it is easy to determine the cause of an observed toxic effect. Finkelstein said that he tried to discuss chemicals and their toxic effects, but he noted that observed toxic effects may be caused by something other than chemicals. Table 3 in the paper lists the types of contamination that can be present at sites and indicates which types of tests could be used to determine whether contaminants are present at levels that will cause adverse effects.
Moore suggested making the following points under the water chemistry discussion: (1) metal concentrations in the water column may not be bioavailable, and (2) many contaminants of concern are hydrophobic and, therefore, not found in high concentrations in the water column. Moore also suggested making the following point under the sediment chemistry discussion: the sediment quality guidelines address synergistic effects or potential interactions for only a few contaminants, such as polycyclic aromatic hydrocarbons (PAHs).
Moore wondered whether the narcosis model should be mentioned in the discussions of PAHs.
Finkelstein requested comments on the tables. Subgroup members suggested clarifying the titles and captions that describe the tables' contents. Commenting on Table 3, Moore said that biomarker approaches are available for PAHs, PCBs, and dioxins; he will send this information to Finkelstein.
Reviewers provided several editorial comments. On page 1, under sediment chemistry, sentence 2 should be reworded because the sediment guidelines do not measure exposure. On page 3, the discussion of PCBs and dioxins describes the use of sediment guidelines to assess dioxin, and the following sentence discusses toxicity tests; readers may incorrectly assume that toxicity tests are appropriate for assessing dioxin contamination, so the text should be reworded. On page 6, the document referred to as USEPA, 1999b, may actually have been published in 2000.
Jensen indicated that the Subgroup is planning to host an open house at the November 2000 SETAC meeting. He said that he would like the authors to revise their white papers for use at this meeting. Jensen asked whether this was a reasonable goal and, if so, what should be done to achieve it. Timberlake said that the papers would need to undergo an EPA review if they are distributed with the RTDF logo. He said that regulatory agencies would be concerned about releasing white papers that might be interpreted as guidance documents. Subgroup members provided several suggestions to avoid this interpretation: (1) including a preamble or disclaimer in each paper, (2) publishing the papers as industry-produced documents without agency endorsements, (3) producing the white papers as draft documents for comment, (4) conducting a fast-track EPA review to confirm that the white papers do not present guidance, (5) presenting the white papers as a poster session with no take-away documents, and (6) presenting the information in a lecture with no take-away documents.
Overall, the Subgroup would like to encourage more regulatory involvement. One Subgroup member said that he circulated his white paper at EPA for review; perhaps documenting information on sediments-related issues will encourage EPA involvement. Timberlake noted that the Superfund program is developing a sediment guidance document, scheduled for release in fall 2000. The guidance addresses many of the same topics as the white papers, so the Subgroup should be aware of the political sensitivity of publishing the white papers at the same time as the guidance.
Subgroup members agreed that authors should continue to revise their white papers. They also agreed to seek EPA comment and review before deciding whether the papers should be distributed. Authors said that they would revise their papers based on the comments they received during the meeting, and they asked Subgroup members to send any additional comments via e-mail, copying the other Subgroup members. They said they hoped that revised drafts would be completed before the Subgroup's conference call scheduled for June 2, 2000.
Before closing, Jensen posed two questions for future consideration. First, he asked whether additional white papers should be written to address natural attenuation. Second, he asked a logistical question: once the white papers are released, does the Subgroup want to establish a point of contact to field questions? He pointed out that the U.S. Army Corps of Engineers (USACE) uses this approach, providing the name of a contact person who can answer questions about USACE products.
THE SEDIMENTS ASSESSMENT GAME
David Hohreiter, Blasland, Bouck & Lee, Inc.
Hohreiter said that a game has been created to teach non-technical people about ecological risk assessment. The object of the game is to gather information and draw conclusions about the need for remediation at a site. The game is played by several teams of three to five players. Teams gather information by buying studies, which have different prices and time requirements. Each team is given different time and budget constraints, and players can assume different roles, such as regulator or site owner. The studies are assumed to occur sequentially, although, in the real world, some of them would be done concurrently. Conditions, such as the budget, may change as the game progresses, as they would in the real world. A facilitator is present, Hohreiter said, to answer players' questions and to explain what the studies include. The facilitator and a game board help lead players through the sequential processes involved in ecological risk assessment. At the end of the game, the teams present their conclusions and supporting information, which allows them to discuss their different approaches and the choices they made.
Hohreiter said that the game described above focuses on contamination in a floodplain and on terrestrial concerns. He said that he and Stahl have started modifying the game so that it focuses on sediment issues and educates people about concerns at sediments remediation sites. Hohreiter asked Subgroup members whether he and Stahl should continue to develop the game and, if so, how they might alter it.
Subgroup members asked a number of questions. Apitz asked whether players need to make decisions about study design, such as how many samples to collect or what analyses to conduct, when they purchase a study. Hohreiter said that each study is assumed to contain an appropriate level of detail; people would become bogged down in the details if the game required study design. The studies, he said, were created to introduce people to the techniques, not to teach study design. Jensen asked whether there are remediation choices. Hohreiter said that players decide whether remediation is needed; there is no component that addresses which remediation technology is most applicable. A Subgroup member asked whether the game leads people to conclude that spending more money is better. Hohreiter answered that the game does not reinforce the idea that expensive is better; rather, it stresses making wise choices. Another Subgroup member asked whether it is realistic to have role playing with all the stakeholders involved. Hohreiter said that, through role playing, people can take positions that they normally would not. For example, a regulator can act as a site owner.
A Subgroup member suggested that the game include a relative contribution component, such as one involving the effects of regional sources and background concentrations. Hohreiter agreed with this comment but said there had been a conscious decision to exclude that component: the game was designed to illustrate the ecological risk assessment process without drawing attention to who is to blame for the contamination. Other modules may focus on source identification, sampling design, or remediation technology selection. A Subgroup member suggested that, although these topics might not be described in depth, they should be mentioned; a naive audience may not understand that sites are complex, and this type of exercise is a good way to introduce the idea of site complexity. Hohreiter replied that players with some real-world experience tend to focus on determining the source of contamination rather than using the game to learn about ecological assessment. One way to add complexity to the game might be to add uncertainty to the data provided in the studies. Subgroup members also noted that it is important to be aware of the difference between remediation and restoration and to express this distinction in the game. Hohreiter said that the game does address some of these issues.
Subgroup members agreed that Hohreiter and Stahl should continue to develop and refine the scenario for application at a sediment site. Freshwater and marine scenarios could be developed to highlight the different considerations for these systems. Another scenario could address remediation options and assess the ecological impacts of the remedy. Community involvement scenarios, such as a dam continuously releasing contamination or a release from combined sewer outfalls, could also be included. Hohreiter suggested that he and Stahl continue to develop the baseline assessment scenario and then assess the potential for future scenarios later. He felt that this game could be used to introduce people to sediment assessment technologies. Jensen suggested that the Subgroup could use the game as the basis for a training program for the Interstate Technology and Regulatory Cooperation (ITRC) Work Group. Timberlake suggested that the game could be used to transfer technology information to different organizations.
Hohreiter said that Stahl distributed information about the game prior to the meeting, but some Subgroup members did not receive it. Hohreiter stated that he would resend Stahl's e-mail with all of its attachments. He asked Subgroup members to review the information in Stahl's e-mail and to provide additional comments during the Subgroup conference call scheduled for June 2, 2000.
SUBGROUP OPEN HOUSE AT THE NOVEMBER 2000 SETAC MEETING
Richard Jensen, DuPont Corporate Remediation
Jensen said that the last agenda item called for a discussion of the Subgroup's plans for the November 2000 SETAC meeting. He felt, however, that the Subgroup had covered this topic in its summary discussions of the white papers. Another Subgroup meeting will be held in September 2000; at that time, Subgroup members can make final decisions about what to present at the SETAC meeting.
Attachment A: Final Attendee List
RTDF Sediments Remediation Action Team
Assessment Subgroup Meeting
Crowne Plaza Cincinnati
May 9, 2000
Wendy Davis-Hoover
Charles (Dick) Lee
Merton (Mel) Skaggs
RTDF/Logistical and Technical Support Provided by:
Eastern Research Group, Inc.
110 Hartwell Avenue
Lexington, MA 02421-3136