On Wednesday, Oct. 9, the Nobel Prize in Chemistry was awarded to three scientists for pioneering methods in computational chemistry that have brought a deeper understanding of complex chemical structures and reactions in biochemical systems. These methods can calculate in detail how complex molecules work and even predict the outcomes of complex chemical reactions.

Martin Karplus


One of the laureates—Martin Karplus of Harvard University—has been using supercomputers at the Department of Energy’s National Energy Research Scientific Computing Center (NERSC) since 1998. The other laureates were Michael Levitt of Stanford University and Arieh Warshel of the University of Southern California.

According to the Royal Swedish Academy of Sciences, these accomplishments have opened up an important collaboration between theory and experiment that has made many otherwise unsolvable problems solvable.

“Today the computer is just as important a tool for chemists as the test tube. Simulations are so realistic that they predict the outcome of traditional experiments,” the academy wrote in its announcement of the winners.

Supercomputers and Modern Chemistry

Long gone are the days when chemists used plastic balls and sticks to create models of molecules. Today, modeling is carried out on computers, and Karplus’ work helped lay the foundation for the powerful programs that are used to understand and predict chemical processes. These models are crucial for most of the advances made in chemistry today.

Because chemical reactions happen at lightning speed, it is impossible to observe every step in a chemical process. To understand the mechanics of a reaction, chemists build computer models of these events and study them in detail. The models also allow researchers to examine reactions at different scales, from electrons and nuclei at the sub-atomic scale up to large molecules.

Karplus, Levitt and Warshel revolutionized the field of computational chemistry by making Newton’s classical physics work side by side with fundamentally different quantum physics. Previously, researchers could only model one or the other. Classical physics models were ideal for simulating large molecules, but they couldn’t capture chemical reactions. For that purpose, researchers instead had to use quantum physics, and those calculations required so much computing power that only small molecules could be simulated.

By combining the best from both physics worlds, researchers can now run simulations to understand complex processes such as how a drug couples to its target protein in the body. For example, quantum theoretical calculations show how atoms in the target protein interact with the drug, while less computationally demanding classical physics is used to simulate the rest of the large protein.
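To make that division of labor concrete, here is a minimal sketch in Python of how such a hybrid energy calculation can be organized: a detailed (stand-in “quantum”) term for a small reactive region, a cheap classical term for the rest, and a coupling term tying the two together. The atom coordinates and energy functions below are made-up placeholders for illustration, not the laureates’ actual methods or programs.

import math

# Hypothetical atoms: (name, x, y, z). The first two form the small
# "reactive site" that a real hybrid calculation would treat quantum
# mechanically; the rest stand in for the surrounding classical region.
atoms = [("C", 0.0, 0.0, 0.0), ("O", 1.2, 0.0, 0.0),
         ("H", 2.0, 1.0, 0.0), ("H", 2.0, -1.0, 0.0)]
qm_indices = {0, 1}

def distance(a, b):
    return math.dist(a[1:], b[1:])

def qm_energy(region):
    # Placeholder for an expensive electronic-structure calculation.
    return sum(-1.0 / distance(a, b)
               for i, a in enumerate(region) for b in region[i + 1:])

def mm_energy(region):
    # Placeholder classical force field: simple pairwise springs.
    return sum(0.5 * (distance(a, b) - 1.0) ** 2
               for i, a in enumerate(region) for b in region[i + 1:])

def coupling_energy(qm_region, mm_region):
    # Placeholder interaction between the quantum and classical regions.
    return sum(0.1 / distance(a, b) for a in qm_region for b in mm_region)

qm_region = [a for i, a in enumerate(atoms) if i in qm_indices]
mm_region = [a for i, a in enumerate(atoms) if i not in qm_indices]

total = (qm_energy(qm_region) + mm_energy(mm_region)
         + coupling_energy(qm_region, mm_region))
print(f"Total hybrid energy (arbitrary units): {total:.3f}")

In a real simulation the expensive quantum term is confined to the handful of atoms where bonds break and form, which is what makes whole-protein calculations tractable.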

Karplus and NERSC

Karplus began computing at NERSC in 1998, with an award from the Department of Energy’s Grand Challenges competition. The Grand Challenges applications addressed computation-intensive fundamental problems in science and engineering whose solutions could be advanced by applying high-performance computing and communications technologies and resources.

At the time, Karplus and his colleague Paul Bash, then at Northwestern University, were looking to understand chemical mechanisms in enzyme catalysis that they couldn’t investigate experimentally. So they ran computer simulations at NERSC to gain a complete understanding of the relationship between biomolecular dynamics, structure and function.

One group of enzymes they looked at was the beta-lactamases. Researchers knew that these enzymes were responsible for the increasing resistance of bacteria to antibiotics, but the precise chemical resistance mechanisms were unknown. So Karplus and Bash ran simulations on NERSC supercomputers to investigate the mechanism at an atomic level of detail.

In the 15 years since Karplus became a NERSC investigator, he and his research group have explored everything from how the enzyme ATP synthase acts as a motor that fuels cells to how myosin, the molecular engine behind muscles, operates. Today, Karplus’ group is tackling the science behind molecular machines, which may someday power man-made systems, for example by converting sunlight into biofuels; working as tiny “molecular motors” capable of performing chemical analyses or other tests for “lab-on-chip” devices; or even “manufacturing” nanodevices.

Here’s a sampling of his work at NERSC over the last 15 years:

1998: Protein Dynamics and Biocatalysis

2000: Theoretical Study on Catalysis by Protein Enzymes and Ribozyme

2001: Theoretical Study on Catalysis by Protein Enzymes, Ribosome and Molecular Motors

2002: QM/MM Studies of the Triosephosphate Isomerase-Catalyzed Reaction

2005: Protein Dynamics on the Supercomputer Big Screen

2010: Discovering How Muscles Really Work

About Berkeley Lab Computing Sciences

The Lawrence Berkeley National Laboratory (Berkeley Lab) Computing Sciences organization provides the computing and networking resources and expertise critical to advancing the Department of Energy’s research missions: developing new energy sources, improving energy efficiency, developing new materials and increasing our understanding of ourselves, our world and our universe. ESnet, the Energy Sciences Network, provides the high-bandwidth, reliable connections that link scientists at 40 DOE research sites to each other and to experimental facilities and supercomputing centers around the country. The National Energy Research Scientific Computing Center (NERSC) powers the discoveries of 5,500 scientists at national laboratories and universities, including those at Berkeley Lab’s Computational Research Division (CRD). CRD conducts research and development in mathematical modeling and simulation, algorithm design, data storage, management and analysis, computer system architecture and high-performance software implementation.