The team behind Project Jupyter, an effort pioneered by Fernando Pérez, an assistant professor of statistics at UC Berkeley and staff scientist in the Usable Software Systems Group at Berkeley Lab’s Computational Research Division, has been honored with an Association for Computing Machinery Software System Award for developing a tool that has had a lasting influence on computing.
Astrophysicists at Lawrence Berkeley National Laboratory (Berkeley Lab) and the Institute of Cosmology and Gravitation at the University of Portsmouth in the U.K. say strongly lensed Type Ia supernovae could help resolve a discrepancy in measurements of the universe’s accelerating expansion.
Berkeley Lab and UC Berkeley scientists were part of a team that helped decipher one of the most bizarre spectacles ever seen in the night sky: a supernova that refused to stop shining, remaining bright far longer than an ordinary stellar explosion. What caused the event remains puzzling.
The American Academy of Arts and Sciences announced today the election of 188 fellows, five of whom are scientists at Berkeley Lab. The new Berkeley Lab fellows are Jamie Cate, Christopher Chang, Roger Falcone, Michael Witherell and Katherine Yelick. All hold joint faculty appointments at UC Berkeley.
A new Berkeley Lab study shows that high-resolution models captured hurricanes and big waves that low-resolution ones missed. Better extreme wave forecasts are important for coastal cities, the military, the shipping industry, and surfers.
For the first time, a new tool developed at Berkeley Lab allows researchers to interactively explore the hierarchical processes that happen in the brain when it is resting or performing tasks. Scientists also hope that the tool can shed some light on how neurological diseases like Alzheimer’s spread throughout the brain.
Particle accelerators are on the verge of transformational breakthroughs—and advances in computing power and techniques are a big part of the reason. Long valued for their role in scientific discovery and in medical and industrial applications such as cancer treatment, food sterilization and drug development, particle accelerators, unfortunately, occupy a lot of space and carry
Scientists at Berkeley Lab will be sifting through loads of new data expected from the latest experimental run at CERN’s Large Hadron Collider.
After a massive upgrade, the Large Hadron Collider (LHC), the world’s most powerful particle collider, is now smashing particles at an unprecedented 13 tera-electron-volts (TeV)—nearly double the energy of its previous run from 2010-2012. In just one second, the LHC can now produce up to 1 billion collisions and generate up to 10 gigabytes of
The past century has seen a 0.8°C increase in average global temperature, and according to the IPCC, the overwhelming source of this increase has been emissions of greenhouse gases and other pollutants from human activities. What remains unclear is precisely what fraction of the observed changes in climate-sensitive systems can confidently be attributed to human influences rather than natural regional fluctuations in climate. To answer this challenging question, Gerrit Hansen of the Potsdam Institute for Climate Impact Research and Dáithí Stone of Berkeley Lab developed and applied a novel methodology.