Thursday, August 13, 2015

EPA’s Approach to Decision Support is in need of a Sea Change

Over the past few decades, the USEPA has increasingly recognized the importance of economic analysis to the EPA mission. As a consequence, EPA has hired environmental economists and supported research on benefits assessment, greatly enhancing EPA’s knowledge base for decision support. EPA should now make a further significant improvement to its decision support by establishing prescriptive decision analysis as the best way to present uncertain scientific knowledge for informed decision making.

Decision analysis, based on the normative model of decision theory, is a well-established discipline that is taught in many university public policy and business programs. There are two fundamental elements in a decision analysis:
  •   A utility function that characterizes the values, or perhaps net benefits, associated with the outcomes of interest that result from a management action;
  •   A probability model that quantifies the uncertainty in the outcomes of interest that result from a management action.

The economic analysis now embraced by EPA can provide the first element of a decision analysis, the quantification of value; an uncertainty analysis can provide the second essential element, the probability model.
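To make these two elements concrete, here is a minimal sketch in Python of a two-action, two-outcome decision analysis; the actions, outcomes, probabilities, and net-benefit values are entirely hypothetical placeholders, not drawn from any actual EPA assessment.

```python
# Minimal decision-analysis sketch: two management actions, two outcomes.
# All numbers are illustrative placeholders, not real assessment results.

outcomes = ["water quality improves", "water quality unchanged"]

# Element 2: probability model -- uncertainty in outcomes, given each action
probability = {
    "reduce nutrient loads": {"water quality improves": 0.7, "water quality unchanged": 0.3},
    "no action":             {"water quality improves": 0.1, "water quality unchanged": 0.9},
}

# Element 1: utility function -- value (here, hypothetical net benefits in $M)
# of each action/outcome pair: benefits of the outcome minus costs of the action
utility = {
    ("reduce nutrient loads", "water quality improves"): 40.0,
    ("reduce nutrient loads", "water quality unchanged"): -20.0,
    ("no action", "water quality improves"): 50.0,
    ("no action", "water quality unchanged"): 0.0,
}

def expected_utility(action):
    """Expected utility of an action: sum over outcomes of P(outcome) * utility."""
    return sum(probability[action][o] * utility[(action, o)] for o in outcomes)

# The prescriptive rule: choose the action with the highest expected utility.
for action in probability:
    print(f"{action}: expected utility = {expected_utility(action):.1f}")
print("recommended action:", max(probability, key=expected_utility))
```

Economic (benefits) analysis supplies the utility values; an uncertainty analysis of the environmental model supplies the probabilities. Neither element alone is sufficient for prescriptive decision support.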

Why has EPA recognized the importance of economic benefits assessment for informing decision making, yet remained seemingly oblivious to the need to follow the decision analysis model that is so well established as an academic discipline? I think a major reason is that the environmental engineering and ecology programs that have provided the academic training for many scientists in EPA and in state environmental agencies do not include a course in decision analysis, nor do they recommend a curriculum that includes decision analysis taught in another academic department.

To better appreciate the role of this decision analytic framework, consider the following example from everyday life. All of us have made decisions about outdoor activities in light of the forecast for rain. In deciding whether to hold or postpone an outdoor activity, we typically seek (scientific) information such as the probability (reflecting uncertainty) of rain. Further, it is not uncommon to hear the weather forecast on the evening news, but still defer a final decision on the activity until an updated weather prediction in the morning (in other words, to get more sample information).
Beyond considering the scientific assessment in the weather forecast, we also think about how important the activity is to us. Do we really want to participate in the activity, such that a little rain will not greatly reduce our enjoyment? Or is the activity of only limited value, such that even a small probability of rain may be enough for us to choose not to participate?
Every day, we make decisions based on this interplay of uncertainty in an event (e.g., rain) and the value (enjoyment) of an activity. We are used to weighing these considerations in our minds and deciding. These same considerations--getting new information on the weather (which is analogous to supporting new scientific research, as in adaptive management), and deciding how valuable the activity is to us (which is what we determine through cost/benefit analysis)--are key features of decision analysis.
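A short numerical version of the rain example, with made-up enjoyment scores and a made-up forecast probability, shows how the two elements combine:

```python
# Hypothetical "enjoyment" utilities for the everyday rain example above.
p_rain = 0.3  # forecast probability of rain (the uncertainty element)

u = {
    ("hold event", "rain"):    -10.0,  # everyone gets wet
    ("hold event", "no rain"): 100.0,  # event goes well
    ("postpone",   "rain"):     20.0,  # inconvenient, but dry
    ("postpone",   "no rain"):  20.0,  # same inconvenience either way
}

def eu(action, p):
    """Expected utility of an action given probability of rain p."""
    return p * u[(action, "rain")] + (1 - p) * u[(action, "no rain")]

print("hold:", eu("hold event", p_rain), " postpone:", eu("postpone", p_rain))
# With these values, holding the event remains the better choice until the
# chance of rain exceeds about 73% (solve -10p + 100(1 - p) = 20, p ≈ 0.73).
# An updated morning forecast is valuable precisely because it can move the
# probability across that threshold.
```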

Public sector decisions involving uncertain knowledge and uncertain forecasts should follow this same decision analytic paradigm. Given the consequences of most public sector decisions and the uncertainties in environmental modeling, it is essential that this happen. Failure on EPA’s part to use decision analysis as its prescriptive model for decision support means that many of EPA’s assessments and models will continue to ignore uncertainty in model predictions. The result will be many unexpected management outcomes, because stakeholders are unaware of the large uncertainties in predictions from the deterministic models that EPA provides in its decision support. In my view, this situation is inexcusable.

Wednesday, August 5, 2015

Unattainable Surface Water Quality Standards may Diminish Widespread Public Support for Water Quality Improvements

Many state water quality standards were established in the early years of the Clean Water Act (CWA), when a key goal of the 1972 CWA was “to eliminate pollutant discharge to navigable waters by 1985.” Unfortunately, this admirable goal sometimes has resulted in required pollutant load reductions (e.g., TMDLs) that are based on unattainable water quality standards reflecting the environmental euphoria of the 1970s and 1980s. In my view, it is wise to consider whether we should continue to develop water quality management plans focused on achieving those goals, or whether it is better to develop realistic goals and set attainable water quality standards.

From a pragmatic perspective, working toward unattainable water quality standards diminishes our ability to achieve widespread buy-in on pollutant load controls. I see this reaction now in North Carolina, where unattainable standards are leading to a backlash against pollutant reduction, due primarily to the extremely high costs of compliance with a TMDL.
Unfortunately, this backlash may be reinforced by the long lag times between implementation of nonpoint source controls and observable water quality improvements, which foster skepticism that the required pollutant load reductions will have any effect.

For example, Falls Reservoir in North Carolina has a TMDL mandating a 77% reduction in phosphorus loading to attain the 40 µg/L chlorophyll a water quality criterion. Given the preponderance of nonpoint sources of phosphorus in the Falls Reservoir watershed, a 77% phosphorus load reduction is not feasible; even if it were, the cost of attainment would almost certainly far exceed the benefits derived from the designated use. Given that situation, Falls Reservoir is in need of a Use Attainability Analysis (which determines whether a designated use is technologically and economically feasible) or new site-specific nutrient criteria.
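As a rough illustration of why the mandated reduction is out of reach, consider a back-of-the-envelope attainability check; the load shares and controllable fractions below are hypothetical placeholders, not values from the Falls Lake nutrient model, and only the 77% figure comes from the TMDL cited above.

```python
# Hypothetical attainability check for a phosphorus load reduction target.
required_reduction = 0.77  # reduction mandated by the TMDL (from the text)

# Illustrative breakdown of the watershed phosphorus load and the fraction
# of each source that could realistically be controlled (assumed values).
load_share       = {"nonpoint": 0.80, "point": 0.20}  # share of total P load
max_controllable = {"nonpoint": 0.30, "point": 0.90}  # achievable reduction per source

feasible_reduction = sum(load_share[s] * max_controllable[s] for s in load_share)

print(f"required reduction: {required_reduction:.0%}")  # 77%
print(f"feasible reduction: {feasible_reduction:.0%}")  # 42% under these assumptions
if feasible_reduction < required_reduction:
    print("standard not attainable with available controls -> Use Attainability "
          "Analysis or site-specific criteria warranted")
```

When the nonpoint share of the load dominates and only a modest fraction of it is controllable, even complete point source control cannot close the gap; that is the arithmetic behind the case for a Use Attainability Analysis.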

I believe that realistic and achievable water quality standards, with designated use improvements (e.g., recreational fishing) that are causally linked to attainment of water quality criteria (e.g., chlorophyll a), are needed to gain widespread support for pollutant controls for water quality improvements. In Falls Reservoir, the backlash against the high cost of phosphorus load reductions has resulted in a state-sponsored plan for in-lake artificial mixing (using SolarBees). This is a waste of money: whole-lake mixing is not feasible given the large size of Falls Reservoir, and in-lake mixing will have little effect on nutrient concentrations. While I do not believe that water column mixing in Falls Reservoir is scientifically defensible, I do understand that local and state elected officials may feel desperate enough to embrace even ineffective “solutions” in the hope of reducing pollutant control costs for their constituents.

It is unfortunate that the laudable goals of the Clean Water Act are not attainable everywhere. Given that fact, I believe that the most effective way to achieve additional protection of designated uses is to adopt technologically and economically feasible water quality standards. This is likely to result in the relaxation of a limited number of current water quality criteria. I wish that we could do better and eliminate pollutant discharges to navigable waters, but that is not going to happen. In my view, recognizing the need to set realistic water quality goals is the best pathway to achieve and maintain meaningful water quality improvements.