Review of Influencing Factors and Quantitative Evaluation Methods of Legacy Effect in Nitrogen Pollution
Nitrogen pollution control and management in aquatic environments has become a research hotspot in environmental science. However, there is often a time lag between the formulation of pollution control measures and the point at which they take effect. The prevailing view is that historically input nitrogen remains in the watershed in the form of biogeochemical or hydrological legacy nitrogen and continues to affect current surface water quality. Understanding the lagged response of nitrogen export to nitrogen input is therefore a key step toward precise nitrogen management, yet existing hydrological models often ignore or oversimplify this process. This review summarizes recent progress on the factors influencing the lag in nitrogen pollution control and on quantitative methods for evaluating lag times and legacy loads. It points out that existing models remain inadequate for quantifying the spatial distribution of legacy nitrogen, and recommends building on nitrogen source apportionment with a better understanding of nitrogen export pathways to establish a source-pathway coupled nitrogen export model, so that water quality can be improved rapidly and at minimal cost.
[Objective] This study examines the legacy effect in aquatic nitrogen pollution control, emphasizing the role of historically accumulated nitrogen. It reviews recent methods for quantifying lag times and legacy loads, aiming to provide a scientific basis for more precise nitrogen management. [Methods] Based on a literature review, this study analyzed nitrogen fate and transport processes, focusing on biogeochemical and hydrological legacy nitrogen, and evaluated current quantification approaches and the limitations of existing hydrological models. [Results] The analysis indicated that historically accumulated nitrogen can remain in watershed soils and groundwater in various forms, constituting a persistent pollution source that prevents water quality from responding immediately to management measures. Although recent research has made progress in quantifying lag times and legacy loads, current hydrological models still show significant shortcomings in characterizing the spatial distribution of legacy nitrogen, which limits their ability to predict the lagged nitrogen response. [Conclusion] To overcome these limitations and address the challenge posed by lag times in nitrogen pollution management, future research should focus on establishing a source-pathway coupled model of nitrogen export. Such a model would integrate precise source identification with simulation of export pathways, providing a critical tool for achieving precise nitrogen management and rapid water quality improvement with minimal investment.
nonpoint source pollution / total nitrogen treatment / legacy effect / groundwater pollution / legacy nitrogen
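To make the lag mechanism described in the abstract concrete, the sketch below (Python, all numbers hypothetical) convolves a made-up annual watershed nitrogen-surplus series with an assumed exponential travel time distribution. It is only meant to illustrate why riverine loads respond slowly to input reductions when legacy nitrogen dominates; it is not the source-pathway coupled model proposed in the review, nor any specific published model.

```python
import numpy as np

# Minimal sketch with hypothetical numbers: riverine N delivery is treated
# as the convolution of past annual N surpluses with a travel time
# distribution (TTD), so reductions in surplus show up only gradually.

years = np.arange(1950, 2021)

# Hypothetical watershed N surplus (kg N / ha / yr): rising until 1990,
# then cut sharply by management measures.
surplus = np.where(years < 1990,
                   5.0 + 0.8 * (years - 1950),   # gradual build-up
                   10.0)                          # post-reduction level

# Assumed exponential TTD with a 15-year mean subsurface travel time
# (a stand-in for the hydrological legacy; biogeochemical retention in
# soils would add a further delay).
mean_tt = 15.0
lags = np.arange(0, 60)
ttd = np.exp(-lags / mean_tt)
ttd /= ttd.sum()   # normalise the discrete distribution

# Convolve inputs with the TTD (inputs before 1950 are ignored).
delivery = np.convolve(surplus, ttd)[:len(years)]

# Fraction of any year's load that entered the watershed 5 or more years earlier.
legacy_frac = 1.0 - ttd[:5].sum()

print(f"Mean travel time (lag): {mean_tt:.0f} years")
print(f"Share of current load at least 5 years old: {legacy_frac:.0%}")
print(f"Delivered load in 1990: {delivery[years == 1990][0]:.1f} kg N/ha/yr")
print(f"Delivered load in 2020: {delivery[years == 2020][0]:.1f} kg N/ha/yr")
```

Under these assumed values, the simulated load three decades after the 1990 input cut still sits above the new equilibrium surplus, which is the qualitative lag behavior the legacy-nitrogen literature describes; real watersheds would require calibrated surpluses, spatially resolved TTDs, and explicit biogeochemical retention.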