The overpumping of groundwater in California has led to near environmental catastrophe in some areas – land is sinking, seawater is intruding, and groundwater storage capacity has shrunk. But researchers at the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) believe machine learning could be part of the solution to restoring groundwater to sustainable levels and quality.
A team of computer and environmental scientists is developing a computational tool that will allow groundwater managers to plan and manage their precious water resources more sustainably, leading to improved resilience to droughts. Such tools, which do not currently exist, could run on a laptop computer and would allow agencies to evaluate, in near real time, multiple future scenarios and the effects of different management actions.
“Water resource allocation decisions have become increasingly complex, given the need to adapt to future changes in land use that are expected to occur due to population and economic growth, and increased frequency of extreme events, such as floods and droughts, brought on by climate change,” said project leader Julianne Mueller, a scientist in Berkeley Lab’s Computational Research Division. “A more efficient computational approach and simulation tools are desperately needed – they would facilitate a holistic understanding of the distribution of water resources and an assessment of surface water and groundwater vulnerability at the watershed scale.”
Although California tightly controls how much water is drawn from surface sources, such as rivers, lakes, and streams, until recently there were few limitations on groundwater pumping, even though groundwater makes up 30 to 50% of the state’s water supply. In fact, in many areas groundwater has regularly been pumped to cover shortfalls whenever surface water allocations fell short of demand.
However, a series of droughts in California over the last half century led to such massive overpumping of groundwater that in 2014 the state passed the Sustainable Groundwater Management Act (SGMA). SGMA empowers local jurisdictions to form Groundwater Sustainability Agencies (GSAs) to halt overdraft and requires them to bring groundwater basins into balanced levels of pumping and recharge – mostly by the year 2040.
To achieve that goal, GSAs will need access to long-term data, such as rainfall, groundwater level and quality, stream flow volumes, and water use, as well as efficient computational tools for timely decision support.
Making hard decisions with insufficient tools
Unfortunately, most GSAs lack the computing power needed to make good decisions. Geochemist Bhavna Arora notes that the decisions they make can have long-term implications: “The decisions might be, for example, what is the environmental flow you want to maintain so you don’t kill all the salmon in your stream, versus a farmer trying to extract that water to irrigate their land. In a changing climate those decisions are hard. And timing matters a lot.”
Currently local agencies rely heavily on a patchwork of tools and consultants, with little consistency across the state. “Adjacent GSAs that draw from the same groundwater resource could potentially be using different tools and methodologies for evaluating what their impacts are to groundwater,” said Charuleka Varadharajan, a Berkeley Lab biogeochemist with expertise in analyzing environmental data. “Obviously, that’s a problem.”
The Berkeley Lab team aims to provide an easy-to-use computational tool that would predict groundwater levels using historical data, and allow water managers to run different future climate scenarios to find the optimal groundwater management solution. “We’ll calculate the risks and anticipated stressors,” Arora said. “We’ll be able to run multiple scenarios simultaneously and see where it’s most beneficial to withdraw the water and how much water to draw to sustain the resource.”
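As a rough illustration of what that kind of scenario screening could look like in practice (this is a hypothetical sketch, not the team's tool; the predictive function, threshold, and all numbers here are invented), a cheap predictive model makes it feasible to sweep many pumping rates under several climate scenarios and find the largest sustainable withdrawal in each:

```python
# Hypothetical sketch of scenario screening with a cheap predictive model.
# The function, threshold, and numbers are invented for illustration.

def predicted_level(pumping, rainfall):
    # Stand-in for a fast data-driven prediction of groundwater level (m).
    return 50.0 - 0.8 * pumping + 0.3 * rainfall

MIN_LEVEL = 35.0  # e.g., the level needed to sustain environmental flows
scenarios = [("wet year", 40.0), ("average year", 25.0), ("drought", 8.0)]

best = {}
for name, rainfall in scenarios:
    # Largest pumping rate (in 0.5-unit steps) that keeps the basin
    # above the sustainability threshold.
    feasible = [p * 0.5 for p in range(0, 81)
                if predicted_level(p * 0.5, rainfall) >= MIN_LEVEL]
    best[name] = max(feasible) if feasible else 0.0

for name, pump in best.items():
    print(f"{name}: max sustainable pumping ~ {pump} units")
```

Because each evaluation is nearly free, a manager could rerun this sweep whenever new rainfall forecasts arrive, which is exactly the kind of near-real-time "what if" analysis a full simulator is too slow to support.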
Machine learning approaches to develop data-driven surrogate models
Arora, Varadharajan, and hydrogeologist Boris Faybishenko, from the Lab’s Earth & Environmental Sciences Area, are partnering with Mueller, a mathematician with expertise in numerical parameter optimization.
The team, which also includes Deborah Agarwal and Reetik Sahu in the Computational Research Division, is looking to use an engineering method called surrogate modeling to make predictions that enable decision support for groundwater management. A surrogate model would not need to know the specifics of a watershed’s hydrology. Instead, using machine learning approaches, it would be trained with historical data to quickly supply close approximations of the ins and outs of each groundwater source.
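To make the surrogate idea concrete (a minimal sketch under invented assumptions, not the team's actual code), one can fit a simple data-driven model to input/output pairs produced by an expensive simulator, then query the fitted model almost instantly:

```python
# Minimal surrogate-modeling sketch. The "expensive" simulator and all
# coefficients are invented stand-ins for illustration only.

def expensive_simulation(pumping, rainfall):
    # Stand-in for a high-resolution model; in practice one run of the
    # real thing could take a day on a supercomputer.
    return 50.0 - 0.8 * pumping + 0.3 * rainfall

# "Historical" training data: (pumping, rainfall) -> groundwater level.
inputs = [(p, r) for p in range(0, 30, 5) for r in range(0, 40, 8)]
levels = [expensive_simulation(p, r) for p, r in inputs]

def fit_linear_surrogate(xs, ys):
    # Fit level ~ a + b*pumping + c*rainfall by least squares,
    # via the 3x3 normal equations (A^T A) w = A^T y.
    rows = [[1.0, p, r] for p, r in xs]
    n = len(rows)
    ata = [[sum(rows[k][i] * rows[k][j] for k in range(n))
            for j in range(3)] for i in range(3)]
    aty = [sum(rows[k][i] * ys[k] for k in range(n)) for i in range(3)]
    # Solve with Gaussian elimination and back substitution.
    for i in range(3):
        for j in range(i + 1, 3):
            f = ata[j][i] / ata[i][i]
            for c in range(3):
                ata[j][c] -= f * ata[i][c]
            aty[j] -= f * aty[i]
    w = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):
        w[i] = (aty[i] - sum(ata[i][j] * w[j]
                             for j in range(i + 1, 3))) / ata[i][i]
    return w

a, b, c = fit_linear_surrogate(inputs, levels)

def surrogate(pumping, rainfall):
    # Evaluating the fitted surrogate costs microseconds, not hours.
    return a + b * pumping + c * rainfall

print(round(surrogate(12.0, 20.0), 2))
```

Once trained, the surrogate never calls the simulator again; it answers directly from what the historical data taught it, which is why scenario evaluations drop from a day to minutes.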
“For a high-resolution model it can take a supercomputer at least a day to run one scenario,” Mueller said. “For our tool it will take just a few minutes. All you need to do is train the surrogate model. We trained a model on 12 years of data, and it took less than a minute to run a scenario.”
Varadharajan added: “Although a high-resolution model will likely be more accurate, it is computationally expensive. What’s more, most local water agencies don’t have access to supercomputing facilities.”
Surrogate models have been commonly used by scientists to predict streamflow – for example, to model the volume of water a river will receive based on precipitation and climate variables – but they have not been used for groundwater management.
“Groundwater is a harder problem than surface water,” said Arora. “For surface water flow we use linear equations. Numerically you can easily approximate that. But groundwater is a nonlinear problem with several unknowns. So it’s like a black box you’re trying to solve with this surrogate model.”
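Arora's point can be illustrated with a toy example (hypothetical numbers and functions, not the project's model): a nonlinear pumping-to-level relationship defeats a straight-line fit, but a radial-basis-function (RBF) interpolant, a common choice in surrogate modeling, can recover it from a handful of training points:

```python
# Hypothetical illustration of a nonlinear response captured by a
# radial-basis-function (RBF) surrogate. All values are invented.
import math

def true_response(pumping):
    # Stand-in nonlinear relation between pumping rate and water level.
    return 50.0 - 0.02 * pumping ** 2

train_x = [0.0, 4.0, 8.0, 12.0, 16.0, 20.0]
train_y = [true_response(x) for x in train_x]

def solve(a, b):
    # Gaussian elimination with back substitution (the RBF system here
    # is symmetric positive definite, so no pivoting is needed).
    n = len(b)
    for i in range(n):
        for j in range(i + 1, n):
            f = a[j][i] / a[i][i]
            for c in range(n):
                a[j][c] -= f * a[i][c]
            b[j] -= f * b[i]
    x = [0.0] * n
    for i in reversed(range(n)):
        x[i] = (b[i] - sum(a[i][j] * x[j]
                           for j in range(i + 1, n))) / a[i][i]
    return x

def rbf(r, eps=8.0):
    # Gaussian kernel: influence decays with distance between points.
    return math.exp(-(r / eps) ** 2)

# Interpolation weights: solve Phi w = y at the training points.
phi = [[rbf(abs(xi - xj)) for xj in train_x] for xi in train_x]
w = solve([row[:] for row in phi], train_y[:])

def surrogate(pumping):
    return sum(wi * rbf(abs(pumping - xi)) for wi, xi in zip(w, train_x))

# At an unseen pumping rate, the surrogate tracks the nonlinear curve.
print(round(surrogate(10.0), 2), round(true_response(10.0), 2))
```

The surrogate matches the training data exactly and approximates the curve between points, without ever being told the underlying physics; that "black box" quality is what makes the approach attractive for groundwater, where the governing equations are nonlinear and many parameters are unknown.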
In theory, the technique could be applied to manage any limited resource, Mueller noted.
“The project brings to bear one of Berkeley Lab’s strengths – computing – and applies it to an area that the Lab cares deeply about, and that is water resilience and sustainability,” said Mueller. Sustainable groundwater management is a major research thrust of Berkeley Lab’s Water-Energy Resilience Research Institute.
This research is funded by Berkeley Lab’s Laboratory Directed Research and Development (LDRD) program.
# # #
Founded in 1931 on the belief that the biggest scientific challenges are best addressed by teams, Lawrence Berkeley National Laboratory and its scientists have been recognized with 13 Nobel Prizes. Today, Berkeley Lab researchers develop sustainable energy and environmental solutions, create useful new materials, advance the frontiers of computing, and probe the mysteries of life, matter, and the universe. Scientists from around the world rely on the Lab’s facilities for their own discovery science. Berkeley Lab is a multiprogram national laboratory, managed by the University of California for the U.S. Department of Energy’s Office of Science.
DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit energy.gov/science.