Physicists have derived a quantum version of Bayes’ rule, revealing how beliefs and probabilities update in the quantum realm with potential applications in computing and beyond. Credit: Shutterstock
An international team of researchers has identified a quantum counterpart to Bayes’ rule.
The likelihood you assign to an event is influenced by what you already believe about the surrounding conditions. This is the basic principle of Bayes’ rule, a method for calculating probabilities first introduced in 1763. An international group of scientists has now demonstrated how this rule also applies within the realm of quantum physics.
“I would say it is a breakthrough in mathematical physics,” said Professor Valerio Scarani, Deputy Director and Principal Investigator at the Centre for Quantum Technologies, and member of the team. His co-authors on the work published in Physical Review Letters are Assistant Professor Ge Bai at the Hong Kong University of Science and Technology in China, and Professor Francesco Buscemi at Nagoya University in Japan.
“Bayes’ rule has been helping us make smarter guesses for 250 years. Now we have taught it some quantum tricks,” said Prof Buscemi.
While researchers before them had proposed quantum analogues for Bayes’ rule, they are the first to derive a quantum Bayes’ rule from a fundamental principle.
Conditional probability
Bayes’ rule takes its name from Thomas Bayes, who first described conditional probability in ‘An Essay Towards Solving a Problem in the Doctrine of Chances’.
Imagine a situation where someone receives a positive flu test. They might have already suspected illness, but the test result shifts how they assess their condition. Bayes’ rule offers a framework for calculating the likelihood of flu that accounts not only for the test outcome and the chance of error but also for the person’s prior assumptions.
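To make the update concrete, here is a minimal sketch of the classical calculation, using made-up numbers for the prior belief, the test's sensitivity, and its false-positive rate (none of these figures come from the study):

```python
# Hypothetical numbers chosen only to illustrate Bayes' rule (not from the paper)
prior = 0.10        # prior belief: 10% chance of having the flu before testing
sensitivity = 0.90  # P(positive test | flu)
false_pos = 0.05    # P(positive test | no flu)

# Bayes' rule: P(flu | positive) = P(positive | flu) * P(flu) / P(positive)
p_positive = sensitivity * prior + false_pos * (1 - prior)
posterior = sensitivity * prior / p_positive

print(f"Belief in flu after a positive test: {posterior:.2f}")  # roughly 0.67
```

With these assumed numbers, a 10% prior belief jumps to about 67% after one positive result; how far it jumps depends on both the prior and the reliability of the test.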
What would Thomas Bayes think? In 1763, he proposed a new approach to calculate probabilities. An international team has now updated his ideas to deliver a quantum Bayes’ rule. Credit: Centre for Quantum Technologies
The rule treats probability as a measure of belief in the likelihood of an event. This perspective has been controversial, since some statisticians argue that probability should represent something “objective” rather than belief-based. Still, when prior knowledge and assumptions matter, Bayes’ rule serves as a widely accepted reasoning tool. For this reason, it has become essential in fields ranging from medicine and meteorology to data science and machine learning.
Principle of minimum change
Updating probabilities with Bayes’ rule obeys the principle of minimum change. Mathematically, the principle minimizes the distance between the joint probability distributions representing the initial and the updated beliefs. Intuitively, it is the idea that for any new piece of information, beliefs are revised in the smallest possible way that is compatible with the new facts. In the case of the flu test, for example, a negative result would not imply that the person is healthy, but rather that they are less likely to have the flu.
In their work, Prof Scarani, who is also from NUS Department of Physics, Asst Prof Bai, and Prof Buscemi began with a quantum analogue to the minimum change principle. They quantified change in terms of quantum fidelity, which is a measure of the closeness between quantum states.
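For readers who want to see the quantity involved, below is a minimal numerical sketch of the Uhlmann fidelity between two density matrices, the standard measure of closeness the article refers to; the example states are arbitrary, and the paper's derivation itself is not reproduced here.

```python
import numpy as np
from scipy.linalg import sqrtm

def fidelity(rho, sigma):
    """Uhlmann fidelity F(rho, sigma) = (Tr sqrt(sqrt(rho) sigma sqrt(rho)))**2.
    Some authors define fidelity as the square root of this quantity."""
    sqrt_rho = sqrtm(rho)
    inner = sqrtm(sqrt_rho @ sigma @ sqrt_rho)
    return float(np.real(np.trace(inner)) ** 2)

# Two single-qubit density matrices, chosen arbitrarily for illustration
rho = np.array([[0.90, 0.00], [0.00, 0.10]], dtype=complex)
sigma = np.array([[0.85, 0.10], [0.10, 0.15]], dtype=complex)

print(fidelity(rho, sigma))  # about 0.99: the two states are very close
```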
Researchers have long expected a quantum Bayes’ rule to exist because quantum states define probabilities. For example, the quantum state of a particle provides the probability of finding it at different locations. The goal is to determine the whole quantum state, but when a measurement is performed, the particle is found at only one location. This new information then updates the belief, boosting the probability around that location.
Deriving a quantum Bayes’ rule
The team derived their quantum Bayes’ rule by maximizing the fidelity between two objects that represent the forward and the reverse process, in analogy with a classical joint probability distribution. Maximizing fidelity is equivalent to minimizing change. They found in some cases their equations matched the Petz recovery map, which was proposed by Dénes Petz in the 1980s and was later identified as one of the most likely candidates for the quantum Bayes’ rule based just on its properties.
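For reference, the Petz recovery map (also called the transpose map) has a standard general form. Written for a quantum channel \(\mathcal{E}\) and a prior (reference) state \(\sigma\), the textbook expression reads as below; how the team's minimum-change derivation reproduces it in special cases is detailed in the paper itself.

```latex
% Petz recovery map for a channel E and a prior (reference) state sigma
\mathcal{R}_{\sigma,\mathcal{E}}(X)
  \;=\; \sigma^{1/2}\,
        \mathcal{E}^{\dagger}\!\left( \mathcal{E}(\sigma)^{-1/2}\, X \,\mathcal{E}(\sigma)^{-1/2} \right)
        \sigma^{1/2}
```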
“This is the first time we have derived it from a higher principle, which could be a validation for using the Petz map,” said Prof Scarani. The Petz map has potential applications in quantum computing for tasks such as quantum error correction and machine learning. The team plans to explore whether applying the minimum change principle to other quantum measures might reveal other solutions.
Reference: “Quantum Bayes’ Rule and Petz Transpose Map from the Minimum Change Principle” by Ge Bai, Francesco Buscemi and Valerio Scarani, 28 August 2025, Physical Review Letters. DOI: 10.1103/5n4p-bxhm
Researchers have discovered that ice can generate electricity when bent or deformed, offering new insight into both nature and technology. Credit: Universitat Autonoma de Barcelona
Ice can generate electricity when bent, a process called flexoelectricity. The discovery connects to lightning formation and future device applications.
Ice is among the most common materials on Earth, covering glaciers, mountain ranges, and the polar regions. Despite its familiarity, ongoing research continues to uncover surprising aspects of its behavior.
A team from ICN2 at the UAB campus, Xi’an Jiaotong University (Xi’an), and Stony Brook University (New York) has demonstrated for the first time that regular ice displays flexoelectricity. This means it can produce an electrical charge when mechanically bent or unevenly deformed. The finding has potential applications in future technologies and may also help clarify natural processes such as the formation of lightning during storms.
The work, published in Nature Physics, represents a major advance in understanding the electromechanical behavior of ice.
“We discovered that ice generates electric charge in response to mechanical stress at all temperatures. In addition, we identified a thin ‘ferroelectric’ layer at the surface at temperatures below -113 °C (160 K). This means that the ice surface can develop a natural electric polarization, which can be reversed when an external electric field is applied—similar to how the poles of a magnet can be flipped. The surface ferroelectricity is a cool discovery in its own right, as it means that ice may have not just one way to generate electricity but two: ferroelectricity at very low temperatures, and flexoelectricity at higher temperatures all the way to 0 °C,” said Dr Xin Wen, a lead researcher from the ICN2 Oxide Nanophysics Group.
This dual ability places ice alongside electroceramic materials such as titanium dioxide, which are already employed in advanced technologies like sensors and capacitors.
Ice, flexoelectricity, and thunderstorms
One of the most striking outcomes of this research is its link to natural processes. The findings indicate that the flexoelectric behavior of ice may contribute to the buildup of electrical charge in storm clouds, potentially playing a part in how lightning is generated.
It is known that lightning forms when an electric potential builds up in clouds due to collisions between ice particles, which become electrically charged. This potential is then released as a lightning strike. However, the mechanism by which ice particles become electrically charged has remained unclear, since ice is not piezoelectric — it cannot generate charge simply by being compressed during a collision.
The new study shows, though, that ice can become electrically charged when it is subjected to inhomogeneous deformations, that is, when it bends or deforms unevenly.
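Schematically, the distinction can be written as follows: a piezoelectric material polarizes in proportion to strain itself, while a flexoelectric material polarizes in proportion to the strain gradient. The tensor symbols below are the standard textbook notation, not quantities reported in the paper.

```latex
% Piezoelectricity (absent in ordinary ice): polarization proportional to strain
P_i = e_{ijk}\,\varepsilon_{jk}
% Flexoelectricity (reported here for ice): polarization proportional to the strain gradient
P_i = \mu_{ijkl}\,\frac{\partial \varepsilon_{jk}}{\partial x_l}
```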
“During our research, the electric potential generated by bending a slab of ice was measured. Specifically, the block was placed between two metal plates and connected to a measuring device. The results match those previously observed in ice-particle collisions in thunderstorms,” explains ICREA Prof. Gustau Catalán, leader of the Oxide Nanophysics Group at ICN2.
Implications and future applications
Thus, the results suggest that flexoelectricity could be one possible explanation for the generation of the electric potential that leads to lightning during storms.
The researchers in the group are already exploring new lines of investigation aimed at exploiting these properties of ice for real-world applications. Although it is still a bit early to discuss potential solutions, this discovery could pave the way for the development of new electronic devices that use ice as an active material, which could be fabricated directly in cold environments.
Reference: “Flexoelectricity and surface ferroelectricity of water ice” by X. Wen, Q. Ma, A. Mannino, M. Fernandez-Serra, S. Shen and G. Catalan, 27 August 2025, Nature Physics. DOI: 10.1038/s41567-025-02995-6
Northwestern researchers created a nickel catalyst that simplifies recycling by breaking down mixed plastics, even those contaminated with hard-to-recycle PVC, into valuable new products. Their breakthrough could transform how plastic waste is upcycled. Credit: Stock
A new catalyst may enable mixed plastic recycling
The future of plastic recycling could soon become far simpler and more efficient.
Researchers at Northwestern University have developed a new plastic upcycling method that greatly reduces — and may even eliminate — the need to pre-sort mixed plastic waste.
At the heart of the process is a low-cost nickel-based catalyst that selectively targets polyolefin plastics, including polyethylenes and polypropylenes, which make up nearly two-thirds of global single-use plastic consumption. This means the catalyst could be applied to large volumes of unsorted polyolefin waste.
When activated, the catalyst converts these low-value solid plastics into liquid oils and waxes that can be repurposed into higher-value products such as fuels, lubricants, and candles. The catalyst can be reused multiple times and, notably, is also capable of breaking down plastics contaminated with polyvinyl chloride (PVC), a toxic material long considered to make plastics “unrecyclable.”
Key challenges and breakthrough potential
The study was recently published in the journal Nature Chemistry.
“One of the biggest hurdles in plastic recycling has always been the necessity of meticulously sorting plastic waste by type,” said Northwestern’s Tobin Marks, the study’s senior author. “Our new catalyst could bypass this costly and labor-intensive step for common polyolefin plastics, making recycling more efficient, practical, and economically viable than current strategies.”
“When people think of plastic, they likely are thinking about polyolefins,” said Northwestern’s Yosi Kratish, a co-corresponding author on the paper. “Basically, almost everything in your refrigerator is polyolefin-based — squeeze bottles for condiments and salad dressings, milk jugs, plastic wrap, trash bags, disposable utensils, juice cartons and much more. These plastics have a very short lifetime, so they are mostly single-use. If we don’t have an efficient way to recycle them, then they end up in landfills and in the environment, where they linger for decades before degrading into harmful microplastics.”
A world-renowned catalysis expert, Marks is the Vladimir N. Ipatieff Professor of Catalytic Chemistry at Northwestern’s Weinberg College of Arts and Sciences and a professor of chemical and biological engineering at Northwestern’s McCormick School of Engineering. He is also a faculty affiliate at the Paula M. Trienens Institute for Sustainability and Energy. Kratish is a research assistant professor in Marks’ group and an affiliated faculty member at the Trienens Institute. Qingheng Lai, a research associate in Marks’ group, is the study’s first author. Marks, Kratish and Lai co-led the study with Jeffrey Miller, a professor of chemical engineering at Purdue University; Michael Wasielewski, Clare Hamilton Hall Professor of Chemistry at Weinberg; and Takeshi Kobayashi, a research scientist at Ames National Laboratory.
The polyolefin predicament
From yogurt cups and snack wrappers to shampoo bottles and medical masks, polyolefin plastics are part of everyday life. They are the most widely used plastics in the world, produced in enormous quantities. By some estimates, more than 220 million tons of polyolefin products are manufactured globally each year. Yet, according to a 2023 report in the journal Nature, recycling rates for these plastics remain troublingly low, ranging from less than 1% to 10% worldwide.
This poor recycling record is largely due to the durability of polyolefins. Their structure is made up of small molecules connected by carbon-carbon bonds, which are notoriously strong and difficult to break apart.
“When we design catalysts, we target weak spots,” Kratish said. “But polyolefins don’t have any weak links. Every bond is incredibly strong and chemically unreactive.”
Problems with current processes
Currently, only a few less-than-ideal processes exist for recycling polyolefins. The plastics can be shredded into flakes, which are then melted and downcycled into low-quality pellets. But because different types of plastics have different properties and melting points, the process requires workers to scrupulously separate the various types. Even small amounts of other plastics, food residue, or non-plastic materials can compromise an entire batch. And those compromised batches go straight into the landfill.
Another option involves heating plastics to incredibly high temperatures, reaching 400 to 700 degrees Celsius. Although this process degrades polyolefin plastics into a useful mixture of gases and liquids, it’s extremely energy-intensive.
“Everything can be burned, of course,” Kratish said. “If you apply enough energy, you can convert anything to carbon dioxide and water. But we wanted to find an elegant way to add the minimum amount of energy to derive the maximum value product.”
Precision engineering
To uncover that elegant solution, Marks, Kratish, and their team looked to hydrogenolysis, a process that uses hydrogen gas and a catalyst to break down polyolefin plastics into smaller, useful hydrocarbons. While hydrogenolysis approaches already exist, they typically require extremely high temperatures and expensive catalysts made from noble metals like platinum and palladium.
“The polyolefin production scale is huge, but the global noble metal reserves are very limited,” Lai said. “We cannot use the entire metal supply for chemistry. And, even if we did, there still would not be enough to address the plastic problem. That’s why we’re interested in Earth-abundant metals.”
For its polyolefin recycling catalyst, the Northwestern team pinpointed cationic nickel, which is synthesized from an abundant, inexpensive, and commercially available nickel compound. While other nickel nanoparticle-based catalysts have multiple reaction sites, the team designed a single-site molecular catalyst.
The single-site design enables the catalyst to act like a highly specialized scalpel — preferentially cutting carbon-carbon bonds — rather than a less controlled blunt instrument that indiscriminately breaks down the plastic’s entire structure. As a result, the catalyst allows for the selective breakdown of branched polyolefins (such as isotactic polypropylene) when they are mixed with unbranched polyolefins — effectively separating them chemically.
“Compared to other nickel-based catalysts, our process uses a single-site catalyst that operates at a temperature 100 degrees lower and at half the hydrogen gas pressure,” Kratish said. “We also use 10 times less catalyst loading, and our activity is 10 times greater. So, we are winning across all categories.”
Accelerated by contamination
With its single, precisely defined, and isolated active site, the nickel-based catalyst possesses unprecedented activity and stability. The catalyst is so thermally and chemically stable, in fact, that it maintains control even when exposed to contaminants like PVC. Used in pipes, flooring, and medical devices, PVC is visually similar to other types of plastics but significantly less stable upon heating. Upon decomposition, PVC releases hydrogen chloride gas, a highly corrosive byproduct that typically deactivates catalysts and disrupts the recycling process.
Amazingly, not only did Northwestern’s catalyst withstand PVC contamination, but the PVC actually accelerated its activity. Even when PVC made up 25% of the waste mixture by weight, the scientists found their catalyst still worked, with improved performance. This unexpected result suggests the team’s method might overcome one of the biggest hurdles in mixed plastic recycling — breaking down waste currently deemed “unrecyclable” due to PVC contamination. The catalyst can also be regenerated over multiple cycles through a simple treatment with inexpensive alkylaluminium.
“Adding PVC to a recycling mixture has always been forbidden,” Kratish said. “But apparently, it makes our process even better. That is crazy. It’s definitely not something anybody expected.”
Reference: “Stable single-site organonickel catalyst preferentially hydrogenolyses branched polyolefin C–C bonds” by Qingheng Lai, Xinrui Zhang, Shan Jiang, Matthew D. Krzyaniak, Selim Alayoglu, Amol Agarwal, Yukun Liu, Wilson C. Edenfield, Takeshi Kobayashi, Yuyang Wang, Vinayak Dravid, Michael R. Wasielewski, Jeffery T. Miller, Yosi Kratish and Tobin J. Marks, 2 September 2025, Nature Chemistry. DOI: 10.1038/s41557-025-01892-y
Supported by the U.S. Department of Energy (award number DE-SC0024448) and The Dow Chemical Company.
Jan Bartek – AncientPages.com – Around 800 BCE, a significant landslide occurred in Gauldal, a river valley located in Central Norway. This event left the entire area covered in clay deposits.
Historical evidence suggests that this location held significant spiritual importance for the ancient inhabitants. In preparation for the expansion of the E6 highway, an archaeological survey was conducted to search for artifacts and other ancient remains.
Drone photo of the excavation. Credit: Kristin Eriksen / NTNU University Museum
Hanne Bryn, an experienced archaeologist from NTNU University Museum, soon came across signs of past human activity. The team faced the challenge of examining a large area with clay layers that reached a thickness of up to three meters. The excavation took longer than initially planned, spanning two summers instead of one. Still, their discoveries beneath the clay proved invaluable, offering significant insights into ancient life in the region.
“It’s a very special find. We’ve never found anything quite like it. In a Central Norwegian context, it’s entirely unique,” says Bryn.
While excavating, archaeologists made a fascinating discovery in Gauldal: a 3,000-year-old cult site where ancient people practiced their religion.
Archaeologists have conducted an excavation of the burial cairn, uncovering a stone that features both a cup mark and a pecked footprint. This discovery provides valuable insights into the cultural practices and symbolic expressions of the people who constructed these ancient sites. Credit: Mats Aspvik / NTNU University Museum
According to Bryn, the site comprises two main areas, each featuring a longhouse about 10 to 12 meters in length, along with associated burial structures. Although these longhouses are not particularly large, they hold significant historical value.
Adjacent to one of the longhouses is a larger burial cairn—a stone-made burial mound—accompanied by three stone slab chambers used as burial chambers. Scattered around both the house and the burial site are loose stones adorned with carvings.
The carved stone shortly after its discovery. Credit: Hanne Bryn / NTNU University Museum
Among these findings is a stone from the burial cairn that displays an intricately pecked footprint, complete with toes and a cup mark—a round depression measuring approximately 5 to 10 centimeters in diameter. Near the house lies a semicircle of stones; one features both an outlined footprint and a cup mark, while another bears several cup marks.
Additionally, at one end of the longhouse, an arrangement of slightly larger stones was found beneath which Bryn discovered a small engraved stone with markings on both sides.
The stone measures about 10 by 20 centimetres. On one side, a human figure and probably a dog are pecked in with dots. A bow and arrow are carved above the hand on the right side. On the other side, there is a human figure, an unknown figure, and a large boat. Credit: Hanne Bryn / NTNU University Museum
The stone, measuring approximately 20 by 10 centimeters, resembles a small photograph in size. On one side, it features a pecked depiction of a human figure alongside what seems to be a dog. Above the hand of this human figure, an engraving of a bow and arrow is depicted, crafted using a technique distinct from the rest of the image. The reverse side of the stone displays another human figure and an unidentified shape, both created with similar pecking techniques. Additionally, there is an engraving of a ship next to this second human figure.
“It’s a very special find,” says Bryn. “It’s so small. It’s portable, you could carry it in your pocket.”
Most rock art in Norway is carved or pecked directly into bedrock.
“Finding portable stones like this, lying in the landscape where they were once used, is especially rare. There aren’t many discoveries that compare,” says Bryn.
Were people there when the landslide hit?
Archaeologists investigating the site found no evidence of settlements beneath the clay layer. However, they did discover cooking pits and a fire pit likely used for bronze casting. The area between zones containing longhouses and graves was otherwise devoid of habitation signs, suggesting it wasn’t a residential area but rather served another purpose. Bryn suggests this indicates the site’s special significance, though the exact use of carved stones remains uncertain.
Hanne Bryn. Credit: Rut Helene Langbrekke Nilsen / Sør-Trøndelag County Municipality
Bryn notes that these stones held ritual importance alongside the burial structures. Within the stone slab chambers, archaeologists uncovered burnt bones confirmed to be human and dated to roughly 1000 to 800 BCE, which overlaps with the estimated date of the landslide.
The question arises whether the site remained active when it was buried by clay. Bryn clarifies that there are no signs of people present at that time; unlike Pompeii—famously preserved after Mount Vesuvius erupted in 79 CE—the site does not show evidence of being suddenly abandoned due to disaster. Determining if the cult site was still in use during the landslide remains challenging, according to Bryn’s analysis.
Drawing of the carved boat on the small stone. Image: Kristoffer R. Rantala / NTNU University Museum
Near Gaulfossen, archaeologists have discovered numerous rock carvings, according to Bryn. Additional carvings have been identified on a plateau located south of the landslide pit. This entire region is recognized as a Bronze Age cultural landscape, indicating significant historical activity in the area.
Bryn and her team are optimistic about uncovering more findings this summer as their fieldwork progresses. Despite working amidst the ongoing E6 highway expansion, they are currently excavating a slightly elevated plateau situated just behind the origin of the clay slide.
Although no extraordinary artifacts have been found yet, there are indications that people once dug into the ground here, which is why Bryn refers to it as a settlement area.
A carved stone featuring the outline of a foot and a shallow cup mark was discovered lying face down, concealed beneath clay. As a result, the peck marks on the stone appeared fresh and new, even though they are estimated to be up to 3,000 years old. Credit: Hanne Bryn / NTNU University Museum
Written by Jan Bartek – AncientPages.com Staff Writer
A new mathematical model of memory hints that seven senses, not five, may be the optimal number for maximizing mental capacity. Credit: Shutterstock
A mathematical model shows memory capacity is maximized when represented by seven features. The study links this to the potential for seven senses, with applications in AI and neuroscience.
Skoltech researchers have developed a mathematical model to study how memory works. Their analysis led to unexpected insights that may advance the design of robots, artificial intelligence, and our understanding of human memory. The study, published in Scientific Reports, suggests there could be an ideal number of senses. If that is true, then humans with five senses might actually benefit from having a few more.
“Our conclusion is, of course, highly speculative in application to human senses, although you never know: It could be that humans of the future would evolve a sense of radiation or magnetic field. But in any case, our findings may be of practical importance for robotics and the theory of artificial intelligence,” said study co-author Professor Nikolay Brilliantov of Skoltech AI. “It appears that when each concept retained in memory is characterized in terms of seven features — as opposed to, say, five or eight — the number of distinct objects held in memory is maximized.”
Modeling memory engrams
Building on a framework established in the early 20th century, the team focused on the basic units of memory known as “engrams.” An engram can be described as a sparse network of neurons distributed across different brain regions that activate together. Its conceptual content is an idealized object defined by multiple characteristics.
In human memory, these characteristics map to sensory inputs. For instance, the memory of a banana would include its image, smell, taste, and other sensory details. Altogether, this forms a five-dimensional representation that exists within a larger five-dimensional space containing all other stored concepts.
The five senses. Credit: Modified by Nicolas Posunko/Skoltech from image generated by Deep Style (Abstract) model on Deep Dream Generator
Over time, engrams can become more refined or more diffuse depending on how frequently they are triggered by external stimuli acting through the senses, which in turn recall the memory of the object. This process captures how interaction with the environment strengthens memories through learning, while disuse leads to forgetting.
“We have mathematically demonstrated that the engrams in the conceptual space tend to evolve toward a steady state, which means that after some transient period, a ‘mature’ distribution of engrams emerges, which then persists in time,” Brilliantov commented. “As we consider the ultimate capacity of a conceptual space of a given number of dimensions, we somewhat surprisingly find that the number of distinct engrams stored in memory in the steady state is the greatest for a concept space of seven dimensions. Hence, the seven senses claim.”
Maximizing conceptual space
In other words, let the objects that exist out there in the world be described by a finite number of features corresponding to the dimensions of some conceptual space. Suppose that we want to maximize the capacity of the conceptual space expressed as the number of distinct concepts associated with these objects. The greater the capacity of the conceptual space, the deeper the overall understanding of the world. It turns out that the maximum is attained when the dimension of the conceptual space is seven. From this, the researchers conclude that seven is the optimal number of senses.
According to the researchers, this number does not depend on the details of the model — the properties of the conceptual space and the stimuli providing the sense impressions. The number seven appears to be a robust and persistent feature of memory engrams as such. One caveat is that multiple engrams of differing sizes existing around a common center are deemed to represent similar concepts and are therefore treated as one when calculating memory capacity.
The memory of humans and other living beings is an enigmatic phenomenon tied to the property of consciousness, among other things. Advancing the theoretical models of memory will be instrumental to gaining new insights into the human mind and recreating humanlike memory in AI agents.
Reference: “The critical dimension of memory engrams and an optimal number of senses” by Wendy Otieno, Ivan Y. Tyukin and Nikolay Brilliantov, 15 August 2025, Scientific Reports. DOI: 10.1038/s41598-025-11244-y
Purdue researchers have uncovered how fat-laden immune cells in the brain fuel Alzheimer’s. Credit: Shutterstock
Excess fat in brain immune cells weakens defenses against Alzheimer’s. Blocking fat storage restored their ability to fight disease.
For many years, scientists believed that fat in the brain had little connection to neurodegenerative diseases. Purdue University researchers are now challenging that view.
Their study, published in Immunity, demonstrates that an accumulation of fat in microglia, the brain’s immune cells, weakens their disease-fighting capacity. The discovery points toward new therapeutic strategies in lipid biology that could support microglial activity and improve neuronal health in conditions such as Alzheimer’s. The work was led by Gaurav Chopra, the James Tarpo Jr. and Margaret Tarpo Professor of Chemistry and (by courtesy) of Computer Science at Purdue.
Looking beyond plaques and tangles
Most Alzheimer’s treatments in development aim at the disease’s main hallmarks: amyloid beta protein plaques and tau protein tangles. Chopra, however, is directing attention to the unusually fat-laden cells found around damaged areas of the brain.
In earlier research published in Nature, Chopra and colleagues showed that astrocytes—cells that provide support to neurons—release a fatty acid that becomes toxic to brain cells under disease conditions. Another collaborative study with the University of Pennsylvania, also published in Nature the previous year, connected age-related mitochondrial dysfunction in neurons to fat buildup in glial cells, highlighting a key risk factor for neurodegeneration.
“In our view, directly targeting plaques or tangles will not solve the problem; we need to restore function of immune cells in the brain,” Chopra said. “We’re finding that reducing accumulation of fat in the diseased brain is the key, as accumulated fat makes it harder for the immune system to do its job and maintain balance. By targeting these pathways, we can restore the ability of immune cells like microglia to fight disease and keep the brain in balance, which is what they’re meant to do.”
Gaurav Chopra and graduate students Palak Manchanda and Priya Prakash led research on how fat disables the brain’s immune shield in Alzheimer’s disease. Credit: Purdue University
Chopra’s team worked in collaboration with researchers at Cleveland Clinic led by Dimitrios Davalos, assistant professor of molecular medicine. Chopra is also the director of Merck-Purdue Center and a member of the Purdue Institute for Integrative Neuroscience; the Purdue Institute for Drug Discovery; the Purdue Institute of Inflammation, Immunology and Infectious Disease; and the Regenstrief Center for Healthcare Engineering.
Chopra’s work is part of Purdue’s presidential One Health initiative, which brings together research on human, animal, and plant health. His research supports the initiative’s focus on advanced chemistry, where Purdue faculty study complex chemical systems and develop new techniques and applications.
Fat droplets as drivers of disease
Over a century ago, Alois Alzheimer documented unusual features in the brain of a patient with the condition later named after him. These included protein plaques, tangles, and cells packed with lipid droplets. For many years, such lipid deposits were regarded as mere by-products of the disease.
Chopra and his colleagues, however, have uncovered strong evidence linking fats in microglia and astrocytes—two types of glial cells that support neurons—to neurodegeneration. Based on these findings, Chopra proposes a “new lipid model of neurodegeneration,” referring to these accumulations as “lipid plaques” since they differ in form from typical spherical droplets.
“It is not the lipid droplets that are pathogenic, but the accumulation of these droplets is bad. We think the composition of lipid molecules that accumulate within brain cells is one of the major drivers of neuroinflammation, leading to different pathologies, such as aging, Alzheimer’s disease, and other conditions related to inflammatory insults in the brain. The specific composition of these lipid plaques may define particular brain diseases,” Chopra said.
Microglia impaired by lipid accumulation
The Immunity paper focuses on microglia, the “bona fide immune cells of the brain,” which clear out debris, such as misfolded proteins like amyloid beta and tau, by absorbing and breaking them down through a process called phagocytosis. Chopra’s team examined microglia in the presence of amyloid beta and asked a simple question: What happens to microglia when they come into contact with amyloid beta?
Images of brain tissue from people with Alzheimer’s disease showed amyloid beta plaques surrounded by microglia. Microglia located within 10 micrometers of these plaques contained twice as many lipid droplets as those farther away. These lipid droplet-laden microglia closest to the plaques cleared 40% less amyloid beta than ordinary microglia from brains without disease.
How fatty acids become trapped
In their investigation into why microglia were impaired in Alzheimer’s brains, the team used specialized techniques and found that microglia in contact with plaques and disease-related inflammation produced an excess of free fatty acids. While microglia normally use free fatty acids as an energy source — and some production of these fatty acids is even beneficial — Chopra and his team discovered the microglia closest to amyloid beta plaques convert these free fatty acids to triacylglycerol, a stored form of fat, in such large quantities that they become overloaded and immobilized by their own accumulation. The formation of these lipid droplets depends on age and disease progression, becoming more prominent as Alzheimer’s disease advances.
By tracing the complex series of steps microglia use to convert free fatty acids to triacylglycerol, the research team zeroed in on the final step of this pathway. They found abnormally high levels of DGAT2, the enzyme that catalyzes the final conversion of free fatty acids to triacylglycerol. They expected to see equally high levels of the DGAT2 gene — since the gene must be copied to produce the protein — but that was not the case. The enzyme accumulates because it is not degraded as quickly as it normally would be, rather than because it is overproduced. This accumulation of DGAT2 causes microglia to divert fatty acids into long-term storage and fat accumulation instead of using them for energy or repair.
Restoring microglial function
“We showed that amyloid beta is directly responsible for the fat that forms inside microglia,” Chopra said. “Because of these fatty deposits, microglial cells become dysfunctional — they stop clearing amyloid beta and stop doing their job.”
Chopra said the researchers don’t yet know what causes the DGAT2 enzyme to persist. However, in their search for a remedy, the team tested two molecules: one that inhibits DGAT2’s function and another that promotes its degradation. Degrading the DGAT2 enzyme ultimately proved beneficial: it reduced fat in the brain, improved the function of microglia and their ability to clear amyloid beta plaques, and improved markers of neuronal health in animal models of Alzheimer’s disease.
“What we’ve seen is that when we target the fat-making enzyme and either remove or degrade it, we restore the microglia’s ability to fight disease and maintain balance in the brain — which is what they’re meant to do,” Chopra said.
“This is an exciting finding that reveals how a toxic protein plaque directly influences how lipids are formed and metabolized by microglial cells in Alzheimer’s brains,” said Priya Prakash, a first co-author of the study. “While most recent work in this area has focused on the genetic basis of the disease, our research paves the way for understanding how lipids and their pathways within the brain’s immune cells can be targeted to restore their function and combat the disease.”
“It’s incredibly exciting to connect fat metabolism to immune dysfunction in Alzheimer’s,” said Palak Manchanda, the other first co-author. “By pinpointing this lipid burden and the DGAT2 switch that drives it, we reveal a completely new therapeutic angle: Restore microglial metabolism and you may restore the brain’s own defense against disease.”
References:
“Neurotoxic reactive astrocytes induce cell death via saturated lipids” by Kevin A. Guttenplan, Maya K. Weigel, Priya Prakash, Prageeth R. Wijewardhane, Philip Hasel, Uriel Rufen-Blanchette, Alexandra E. Münch, Jacob A. Blum, Jonathan Fine, Mikaela C. Neal, Kimberley D. Bruce, Aaron D. Gitler, Gaurav Chopra, Shane A. Liddelow and Ben A. Barres, 6 October 2021, Nature. DOI: 10.1038/s41586-021-03960-y
“Amyloid-β induces lipid droplet-mediated microglial dysfunction via the enzyme DGAT2 in Alzheimer’s disease” by Priya Prakash, Palak Manchanda, Evi Paouri, Kanchan Bisht, Kaushik Sharma, Jitika Rajpoot, Victoria Wendt, Ahad Hossain, Prageeth R. Wijewardhane, Caitlin E. Randolph, Yihao Chen, Sarah Stanko, Nadia Gasmi, Anxhela Gjojdeshi, Sophie Card, Jonathan Fine, Krupal P. Jethava, Matthew G. Clark, Bin Dong, Seohee Ma, Alexis Crockett, Elizabeth A. Thayer, Marlo Nicolas, Ryann Davis, Dhruv Hardikar, Daniela Allende, Richard A. Prayson, Chi Zhang, Dimitrios Davalos and Gaurav Chopra, 19 May 2025, Immunity. DOI: 10.1016/j.immuni.2025.04.029
“Senescent glia link mitochondrial dysfunction and lipid accumulation” by China N. Byrns, Alexandra E. Perlegos, Karl N. Miller, Zhecheng Jin, Faith R. Carranza, Palak Manchandra, Connor H. Beveridge, Caitlin E. Randolph, V. Sai Chaluvadi, Shirley L. Zhang, Ananth R. Srinivasan, F. C. Bennett, Amita Sehgal, Peter D. Adams, Gaurav Chopra and Nancy M. Bonini, 5 June 2024, Nature. DOI: 10.1038/s41586-024-07516-8
Funding: U.S. Department of Defense, NIH/National Institute of Neurological Disorders and Stroke, NIH/National Institute of Mental Health, NIH/National Institutes of Health, NIH/National Institute on Aging
A. Sutherland – AncientPages.com – Shakespeare’s “Macbeth” portrays a seemingly honest and courageous man, loyal to the king but consumed by a hunger for power. As he struggles to reconcile this with his nobility, he loses his humanity and no longer feels fear. The witches’ prophecies convince him he is invincible: no assassination, no rebellion, no betrayal, no one at all can defeat Macbeth until Birnam Wood comes to Dunsinane Castle.
Can you believe in a forest that attacks? Who would expect such a thing from trees and bushes? Yet that is what happens in the witches’ prophecy, which becomes a curse: Birnam Wood is destined to defeat Macbeth when it approaches his castle. Shakespeare’s wandering forest in Macbeth is just one example of forests carrying prophetic messages, often omens of death.
A groundbreaking calculator built on data from nearly 500 clinical trials could revolutionize hypertension treatment by showing doctors how much different drugs lower blood pressure. Credit: Shutterstock
Researchers created a tool to estimate how much different drugs lower blood pressure. It may transform hypertension care by improving treatment selection and saving lives.
A newly developed Blood Pressure Treatment Efficacy Calculator, the first of its kind, was created using data from nearly 500 randomized clinical trials involving more than 100,000 participants. The tool enables physicians to estimate how much different medications are expected to reduce blood pressure.
The findings, published in The Lancet, suggest this resource could change the way hypertension is managed by helping doctors tailor treatment plans to the exact level of blood pressure reduction each patient requires.
“This is really important because every 1 mmHg reduction in systolic blood pressure lowers your risk of heart attack or stroke by two percent,” said Nelson Wang, cardiologist and Research Fellow at The George Institute for Global Health.
“But with dozens of drugs, multiple doses per drug, and most patients needing two or more drugs, there are literally thousands of possible options, and no easy way to work out how effective they are,” he said.
Addressing complexity in treatment choices
The calculator addresses this problem by averaging treatment effects across hundreds of clinical trials. It also classifies therapies into low, moderate, or high intensity groups, depending on how much they reduce blood pressure (BP), a method already widely applied in cholesterol-lowering treatments.
Typically, a single antihypertensive drug—the most common first step in treatment—reduces systolic BP by only 8–9 mmHg. However, many patients require drops of 15–30 mmHg to achieve recommended targets.
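As a rough illustration of the arithmetic involved, the logic of stacking standard doses might look like the sketch below. This is a hypothetical toy example, not the actual calculator at www.bpmodel.org, and it assumes that each drug's average placebo-adjusted reduction simply adds.

```python
# Hypothetical illustration only; the real calculator at www.bpmodel.org is
# built from trial data and is not reproduced here.
average_drop_mmhg = {
    "drug_A_standard_dose": 8.5,  # assumed average ~8-9 mmHg for one agent
    "drug_B_standard_dose": 8.5,
}
target_drop = 20  # mmHg the patient needs to reach goal (assumed)

combined = sum(average_drop_mmhg.values())
status = "meets" if combined >= target_drop else "falls short of"
print(f"Expected combined reduction: {combined:.1f} mmHg "
      f"({status} the {target_drop} mmHg target)")

# Using the ~2% relative risk reduction per mmHg quoted above, and assuming the
# effect compounds multiplicatively, a 17 mmHg drop would correspond very
# roughly to a 1 - 0.98**17, i.e. about 29%, lower risk. The compounding is an
# assumption for illustration, not a result reported in the study.
```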
According to Dr. Wang, the traditional strategy of adjusting treatment based on individual blood pressure readings is flawed, since BP values fluctuate widely and are often too “noisy” to provide reliable guidance.
“Blood pressure changes from moment to moment, day to day, and by season – these random fluctuations can easily be as big as, or larger than, the changes brought about by treatment,” he said.
“Also, measurement practices are often not perfect, bringing in an additional source of uncertainty – this means it’s very hard to reliably assess how well a medicine is working just by taking repeated measurements.”
Moving beyond the traditional approach
Anthony Rodgers, Senior Professorial Fellow at The George Institute for Global Health, said that while hypertension, or high blood pressure, is the most common reason people visit their doctor, there has been no single, up-to-date resource to show how effective different blood pressure medications are—especially when used in combination or at varying doses.
“Using the calculator challenges the traditional ‘start low, go slow, measure and judge’ approach to treatment, which comes with the high probability of being misled by BP readings, inertia setting in or the burden on patients being too much,” he said.
“With this new method, you specify how much you need to lower blood pressure, choose an ideal treatment plan to achieve that based on the evidence, and get the patient started on that ideally sooner rather than later.”
Global health implications
The next step is to test this new approach in a clinical trial, where patients will be prescribed treatments based on how much they need to lower their blood pressure, guided by the calculator.
High blood pressure is one of the world’s biggest health challenges, affecting as many as 1.3 billion people and leading to around ten million deaths each year.
Often called a silent killer as it does not cause any symptoms on its own, it can remain hidden until it leads to a heart attack, stroke or kidney disease. Fewer than one in five people with hypertension have it under control.
“Given the enormous scale of this challenge, even modest improvements will have a large public health impact – increasing the percentage of people whose hypertension is under control globally to just 50% could save many millions of lives,” Professor Rodgers added.
The Blood Pressure Treatment Efficacy Calculator can be accessed at www.bpmodel.org.
Reference: “Blood pressure-lowering efficacy of antihypertensive drugs and their combinations: a systematic review and meta-analysis of randomised, double-blind, placebo-controlled trials” by Nelson Wang, Abdul Salam, Rashmi Pant, Amit Kumar, Rupasvi Dhurjati, Faraidoon Haghdoost, Kota Vidyasagar, Prachi Kaistha, Hariprasad Esam, Sonali R Gnanenthiran, Raju Kanukula, Paul K Whelton, Brent Egan, Aletta E Schutte, Kazem Rahimi, Otavio Berwanger and Anthony Rodgers, August 30, 2025, The Lancet. DOI: 10.1016/S0140-6736(25)00991-2
NGC 7456 reveals stunning details: dust lanes, glowing star nurseries, and ultraluminous X-ray sources. Its brilliant, active core cements its status as a galaxy worth watching. Credit: ESA/Hubble & NASA, D. Thilker
NGC 7456 may look like just another spiral galaxy, but it’s full of surprises.
From vibrant star-forming regions glowing pink to mysterious ultraluminous X-ray sources, it’s a cosmic laboratory for astronomers.
A Hidden Galaxy With a Story to Tell
At first glance, this galaxy might look ordinary, one spiral among countless others scattered across the Universe. But the subject of the ESA/Hubble Picture of the Week, known as NGC 7456, holds far more than meets the eye. It lies more than 51 million light-years away in the constellation Grus (the Crane).
The image highlights the uneven spiral arms of NGC 7456, laced with pockets of dark dust that block starlight. Scattered across the galaxy are brilliant pink regions, glowing clouds of gas where new stars are taking shape. Their intense light excites the surrounding material, producing the distinctive red glow that signals stellar birth. The Hubble program gathered this data as part of its effort to study galactic evolution, tracking the formation of stars, clouds of hydrogen, and star clusters over time.
Extreme X-Ray Powerhouses
Hubble, with its ability to capture visible, ultraviolet and some infrared light, is not the only observatory focused on NGC 7456. ESA’s XMM-Newton satellite has imaged X-rays from the galaxy on multiple occasions, discovering a number of so-called ultraluminous X-ray sources. These small, compact objects emit terrifically powerful X-rays, much more than would be expected for their size. Astronomers are still trying to pin down what powers these extreme objects, and NGC 7456 contributes a few more examples.
On top of that, the region around the galaxy’s supermassive black hole is spectacularly bright and energetic, making NGC 7456 an active galaxy. Whether looking at its core or its outskirts, at visible light or X-rays, this galaxy has something interesting to show!
What’s in a word? In 1965 the great Cambridge historian Professor (later Sir) J.H. Plumb delivered the Ford Lectures at Oxford, on the topic of political stability in 18th-century England. One of those lectures was titled ‘The Rage of Party’. In it, Plumb expressed profound suspicion of Lewis Namier’s famous thesis, then tremendously fashionable, that principles matter not a jot and the sole driver of 18th-century politics was individual lust for glory. One needed only to feel the heat rising from public debate in this era to appreciate the importance of Whig and Tory, argued Plumb: ‘Party was real and it created instability.’ Two years later, in 1967, Plumb’s greatest student, Geoffrey Holmes, published his masterpiece of historiography, British Politics in the Age of Anne, in which Plumb’s scepticism of the Namierite position morphed into something more subtle. For Holmes, it was fundamentally the structures of partisanship, not its passions, that defined the era. Plumb’s ‘rage of party’ became Holmes’ ‘age of party’.
Now the ‘rage’ is back. George Owers’ tremendously entertaining new book is an unabashedly narrative account of the twists and turns of Whig and Tory through a period of immense turmoil, both foreign (protracted wars with France and Spain) and domestic (a royal coup, a crisis of succession, and ten general elections within 25 years). In some respects, this is a deeply unfashionable endeavour. To be sure, there has been some excellent scholarship on the origins of party in areas such as intellectual history (recent work by Max Skjönsberg comes to mind), but narrative historians have typically found it difficult to articulate what was at stake in early 18th-century politics, particularly when set against the grand set pieces of the Civil Wars and Protectorate a generation earlier. Owers negotiates this problem with considerable brio, combining the high politics of court and parliament with a broader-brush history of ideas, religion, and culture. Equal attention is devoted to sex lives, medical ailments, and boozing as to parliamentary division lists and electoral tallies. It makes for a heady cocktail of policy and personality.
What drove people into opposite political camps? Neither party was a monolith and there were outliers on both sides, but the chief differences lay in three areas: government, religion, and foreign policy. For most Whigs, the origins of government were secular; for most Tories, they were sacred. In religion, Whigs lobbied for the toleration of Protestant dissenters whereas Tories cleaved to a vision of the Church of England as an engine of religious persecution, securing uniformity by coercion and oppression. And in foreign policy, Tories sought a quick and cheap end to what they perceived as a needlessly destructive war with France, whereas Whigs sought total victory whatever the cost. Whigs favoured the moneyed interest, Tories the landed interest. Whigs accused Tories of being agents of French popery; Tories accused Whigs of being schismatic republicans. Whigs feared foreign absolutism, Tories domestic anarchy. By the start of the 18th century the world itself was being refracted through this lens. On the reading of a parliamentary bill in 1703, Jonathan Swift imagined all London’s street cats divided up into Whig and Tory, claws bared, hackles raised.
At the heart of Owers’ narrative is the tussle for political dominance during the reign of Queen Anne between the venal, self-enriching lords who comprised the Whig ‘Junto’, and the rabble-rousing high-church zealots of the Tory Commons. The great hero in all this is Robert Harley, who during a long career attempted to steer successive governments along an increasingly narrow middle-course between the extremes of either side. A decent case can be made for Harley, not Robert Walpole, as the nation’s first ‘prime minister’ (a term of abuse in the period rather than an official office). His political acumen was legendary. He was the first high-ranking politician to fully appreciate the power of the press, employing Swift and Defoe as sympathetic journalists and funding the production of pamphlets supporting the government position. As Owers puts it, Harley used this ‘mastery of the shady arts’ to carve a ‘calm, moderate path through a thicket of bone-headed partisanship’.
Antagonists appear on all fronts: while the rank-and-file Whigs never pose much of a threat (on one page they ‘fluff it’, on the next they ‘bungle it’, and on the third they ‘crumble’), the financial resources of the Junto make them a dangerous enemy to Harley’s centre ground; Sidney Godolphin, with all the dexterity of a rusty lawnmower, sometimes threatened to sink the ship by sheer incompetence; eventually, though, it is the ‘totally ruthless pragmatist’ Henry St John who lands the metaphorical killing blow, cutting his own throat in the process. For all its comic moments, the final act of the book, in which Harley finally runs out of road, is nothing short of tragic.
Owers is eager to pack everything in, which makes for an occasionally breathless narrative. He covers a lot of ground and his readers, like his protagonists, don’t get to enjoy much downtime. Things just keep happening. In one chapter alone, two rakehells kill one another in a duel, Jonathan Swift disarms a booby trap, a gang of aristocratic hooligans attack people on the street, and the great peace treaty of Utrecht, which has loomed for so many pages, is finally brokered. Nor is this a work of profound literary subtlety. Those in glass houses shouldn’t throw stones, I know, but the prose occasionally feels part Top Gear, part Tatler. Single-sentence paragraphs close chapters with a sledgehammer thump while figures such as Marlborough, the ‘dashing, dishy courtier’, appear to have stepped freshly coiffured from the society gossip pages. Yet I say this with a smile not a sneer, because the effect is deeply enjoyable. Owers’ enthusiasm for the period and its characters is palpable. Allow yourself to be swept along by it and you’ll have a tremendous time.
The Rage of Party: How Whig Versus Tory Made Modern Britain, by George Owers. Constable, 576pp, £30.
Joseph Hone is the author of a book about Robert Harley, The Paper Chase: The Printer, the Spymaster, and the Hunt for the Rebel Pamphleteers (Chatto & Windus, 2020).