<table style="border:1px solid #adadad; background-color: #F3F1EC; color: #666666; padding:8px; -webkit-border-radius:4px; border-radius:4px; -moz-border-radius:4px; line-height:16px; margin-bottom:6px;" width="100%">
<tbody>
<tr>
<td><span style="font-family:Helvetica, sans-serif; font-size:20px;font-weight:bold;">PsyPost – Psychology News</span></td>
</tr>
<tr>
<td> </td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/researchers-find-the-gas-pedal-and-brake-for-anxiety-and-they-arent-neurons/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Researchers find the “gas pedal” and “brake” for anxiety, and they aren’t neurons</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Nov 18th 2025, 08:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>A new study published in <em><a href="https://doi.org/10.1038/s41380-025-03190-y" target="_blank">Molecular Psychiatry</a></em> has found that specific immune cells within the mouse brain are a direct cause of chronic anxiety and compulsive grooming. The research demonstrates that two distinct lineages of these cells, known as microglia, function in opposition to one another, with one group promoting these behaviors and the other acting to suppress them. These findings shift attention from neurons to the brain’s immune system as a potential regulator of certain psychiatric conditions.</p>
<p>Microglia are the resident immune cells of the central nervous system, responsible for maintaining brain health by removing debris and responding to injury or infection. It has become clear that their role extends beyond simple housekeeping. In mice, the microglial population is composed of two separate lineages that arise at different times during embryonic development. The majority are a type known as canonical non-Hoxb8 microglia, while a smaller subset of about 25 percent are called Hoxb8 microglia, named for a developmental gene they express.</p>
<p>A team of researchers at the University of Utah School of Medicine, led by Distinguished Professor Mario R. Capecchi, previously established a link between the Hoxb8 gene and specific behaviors. Mice with a disrupted Hoxb8 gene exhibit chronic anxiety and pathological overgrooming, a behavior that resembles trichotillomania, an obsessive-compulsive spectrum disorder in humans. Since the only cells in the brain that express the Hoxb8 gene are this specific subset of microglia, the team formulated a direct question: are defective Hoxb8 microglia the cause of these behaviors?</p>
<p>To answer this, the scientists performed a series of cell transplantation experiments. The first step required creating recipient mice with brains that were essentially a blank slate, devoid of their own native microglia. This was achieved by genetically engineering mice so that a gene called Csf1r, which is essential for microglia survival, was disabled only in those cells. These recipient mice provided a unique environment to test the function of transplanted cells in isolation.</p>
<p>The researchers then isolated microglia progenitor cells, an early-stage cell that develops into mature microglia, from the fetal livers of mouse embryos. They collected these progenitors from two different sources: from normal, healthy mice and from mice with the defective Hoxb8 gene. These purified cells were then injected into the brains of the microglia-less newborn recipient mice. This procedure created two experimental groups: one populated with healthy Hoxb8 microglia and another populated solely with defective Hoxb8 microglia.</p>
<p>The behavioral outcomes of these two groups were distinctly different. Mice that received the defective Hoxb8 microglia grew up to display the same pathological behaviors seen in the original mutant mice. They groomed themselves compulsively, leading to significant hair loss, and exhibited signs of heightened anxiety in standardized tests, such as avoiding open spaces in a maze. </p>
<p>In contrast, mice that received healthy Hoxb8 microglia from normal donors behaved just like typical mice, showing no signs of overgrooming or elevated anxiety. This experiment established a direct causal link, demonstrating that the defective Hoxb8 microglia were themselves sufficient to produce both behaviors.</p>
<p>This finding raised a more complex question about how the two different microglial populations work together. In previous work, the team proposed what they call the “Accelerator/Brake” model. This model suggests the two microglia lineages have opposing functions. The non-Hoxb8 microglia are thought to act as an “accelerator,” promoting anxiety and grooming. The Hoxb8 microglia are thought to function as a “brake,” downregulating these same behaviors to maintain equilibrium.</p>
<p>The model made a powerful prediction: a mouse containing only the accelerator cells, the non-Hoxb8 microglia, should exhibit abnormally high levels of anxiety and grooming because the braking system would be absent. To test this, the team performed a second set of transplantations. This time, they used a different type of recipient mouse that is born completely without any microglia, providing an even cleaner experimental system.</p>
<p>From the brains of healthy newborn mice, the scientists isolated and purified both types of microglia, separating the Hoxb8 from the non-Hoxb8 populations. They then created several groups of recipient mice. Some received only non-Hoxb8 microglia (the accelerator). Others received only Hoxb8 microglia (the brake). A third group received a mixture of both cell types that mimicked the brain’s natural 75-to-25 percent ratio.</p>
<p>The results strongly supported the Accelerator/Brake model. Mice populated exclusively with non-Hoxb8 microglia developed pathological grooming habits and showed increased anxiety. The “accelerator” was effectively stuck on. Mice that received only Hoxb8 microglia, or those that received the correct mixture of both populations, showed normal, low levels of grooming and anxiety. This suggests that in a healthy brain, the two types of microglia work in concert to fine-tune behavioral responses to the environment.</p>
<p>A final, subtle observation from the experiments added another layer of complexity. The original mice with the Hoxb8 gene disruption showed even more severe grooming and anxiety than the mice transplanted with only non-Hoxb8 microglia. A mouse with only non-Hoxb8 microglia represents a pure “loss of function” for the Hoxb8 gene, as the braking cells are simply absent. </p>
<p>The more severe symptoms in the original mutant mice imply that their defective Hoxb8 microglia are not just failing to apply the brakes; they are doing something actively detrimental. This phenomenon, known as a “gain of function,” suggests the mutated cells may be sending their own incorrect signals that further contribute to the behaviors.</p>
<p>These experiments are based on mouse models, and any direct application to human anxiety disorders requires further investigation. “Humans also have two populations of microglia that function similarly,” Dr. Capecchi notes. He suggests this work could reframe how researchers approach psychiatric conditions, which have historically been viewed almost exclusively through the lens of neurons. “This knowledge will provide the means for patients who have lost their ability to control their levels of anxiety to regain it,” he adds.</p>
<p>Future research will likely explore the molecular mechanisms through which these microglia influence neuronal circuits to control such complex behaviors. Understanding this cellular dialogue could open new therapeutic avenues. “We’re far from the therapeutic side,” says Donn Van Deren, the study’s first author, now a postdoctoral fellow at the University of Pennsylvania. “But in the future, one could probably target very specific immune cell populations in the brain and correct them through pharmacological or immunotherapeutic approaches. This would be a major shift in how to treat neuropsychiatric disorders.”</p>
<p>The study, “<a href="https://doi.org/10.1038/s41380-025-03190-y" target="_blank">Defective Hoxb8 microglia are causative for both chronic anxiety and pathological overgrooming in mice</a>,” was authored by Donn A. Van Deren, Ben Xu, Naveen Nagarajan, Anne M. Boulet, Shuhua Zhang & Mario R. Capecchi.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/scientists-reveal-intriguing-new-insights-into-how-the-brain-processes-and-predicts-sounds/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Scientists reveal intriguing new insights into how the brain processes and predicts sounds</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Nov 18th 2025, 06:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>A new study published in <a href="https://doi.org/10.1002/advs.202507878"><em>Advanced Science</em></a> suggests that the brain uses two distinct, large-scale networks to recognize memorized musical sequences. One network appears to handle general sound processing, while the other is specifically engaged in comparing incoming information to memory and detecting prediction errors. These findings provide a more integrated view of how the brain supports complex cognitive functions through the coordinated activity of widespread neural systems.</p>
<p>Predictive coding is a theory suggesting that the brain continuously generates expectations about incoming sensory information. When reality deviates from these expectations, the resulting mismatch, known as a prediction error, signals the brain to update its predictions.</p>
<p>Much of the past research on this topic has focused on either small brain regions or narrow frequency bands. These studies have helped identify some of the building blocks of prediction, such as early sensory responses to unexpected sounds. But they often overlook how multiple brain regions cooperate as a network, especially during tasks involving memory for complex sequences like music.</p>
<p>In the new study, a team of researchers sought to address this gap in our understanding of how predictive coding works at the level of the whole brain. The research was led by <a href="https://leonardobonetti.org/">Leonardo Bonetti</a>, an associate professor at the Center for Music in the Brain at Aarhus University and the Centre for Eudaimonia and Human Flourishing at the University of Oxford, and <a href="https://scholar.google.com/citations?user=SbYDcxEAAAAJ&hl=it">Mattia Rosso</a>, a researcher affiliated with the Center for Music in the Brain at Aarhus University and the IPEM Institute for Systematic Musicology at Ghent University.</p>
<p>“For several years, we (Leonardo Bonetti and Mattia Rosso) have been interested in understanding how the brain organises its activity across different regions when we perceive, remember, or predict sounds. Most existing analytical tools focus on small sets of brain regions or predefined connections, which means that we often miss the broader, system-level picture. Moreover, several methods rely on fairly strong assumptions or rather complex analytical procedures which limit the interpretability of the findings,” the researchers told PsyPost.</p>
<p>“We wanted to overcome these limitations by creating a new method that could capture the brain’s whole dynamic activity, how multiple regions cooperate in real time. This motivation led us to develop BROAD-NESS, a framework that identifies broadband brain networks in a way that is simple, effective, fast, and highly interpretable. Our goal was to give researchers a tool that is both mathematically rigorous and accessible, allowing them to map large-scale brain interactions without imposing strong assumptions on the data.”</p>
<p>The study involved 83 volunteers, ranging in age from 19 to 63. Participants first listened to and memorized a short musical piece by Johann Sebastian Bach. Following this memorization phase, their brain activity was recorded using magnetoencephalography (MEG), a technique that measures the magnetic fields produced by the brain’s electrical currents with high temporal precision.</p>
<p>During the recording, participants listened to 135 different five-tone musical excerpts. Some of these excerpts were taken directly from the piece they had memorized, while others were novel variations. For each excerpt, participants had to indicate whether it was part of the original music (“memorized”) or a new variation (“novel”).</p>
<p>The core of the analysis was the novel method, BROAD-NESS, which stands for BROadband brain Network Estimation via Source Separation. The researchers first used the MEG data to estimate the location of neural activity across 3,559 points, or voxels, throughout the brain.</p>
<p>They then applied a statistical technique called Principal Component Analysis to this massive dataset. This method identifies the main patterns of synchronized activity across all brain voxels, with each major pattern representing a distinct, simultaneously operating brain network. The analysis also quantifies how much of the brain’s total activity each network explains.</p>
<p>The two primary networks together explained about 88% of the variance in the broadband, source-reconstructed MEG data recorded during the task. The first network, which explained the majority of the activity (about 72%), was centered on the auditory cortices and the medial cingulate gyrus.</p>
<p>The activity in this network showed a more consistent pattern across all conditions, with less pronounced differences between memorized and novel sequences. This pattern suggests its primary role is in the fundamental processing of sounds as they are being heard.</p>
<p>The second network explained a smaller but significant portion of the activity (about 16%). This network also included the auditory cortices but extended to involve regions associated with memory and higher-order processing, such as the hippocampus, anterior cingulate, insula, and inferior temporal regions.</p>
<p>Unlike the first network, the activity in this second network was highly dependent on the experimental condition. Its dynamics appeared to reflect the processes of matching incoming sounds to stored memories and flagging prediction errors when the sounds deviated from what was expected.</p>
<p>“The key takeaway is that the brain works as a dynamic network, not as a collection of isolated regions,” Bonetti and Rosso explained. “When we remember a sound or predict what will come next, many brain areas interact simultaneously, and the quality of these interactions matters for how well we perform.”</p>
<p>“Using BROAD-NESS, we discovered that the auditory cortices are not just doing one job at a time. Instead, they participate in two major networks: one focused on processing the sensory details of sounds, and another that supports memory and predictive processes, linking to deeper brain structures such as the hippocampus and anterior cingulate cortex.”</p>
<p>To better understand the timing and organization of these networks, the researchers used additional analytical techniques. One method, called recurrence quantification analysis, examined the stability and predictability of the networks’ activity over time.</p>
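<p>A minimal sketch of the recurrence idea (using simulated signals and a simplified one-dimensional measure, not the study’s exact pipeline): a recurrence plot marks pairs of time points where a signal revisits nearly the same state, and the fraction of such pairs indexes how stable and repetitive the dynamics are:</p>

```python
# Illustrative recurrence-rate computation on simulated signals
# (simplified stand-in for recurrence quantification analysis).
import numpy as np

def recurrence_rate(signal, radius):
    """Fraction of time-point pairs whose states lie within `radius`."""
    dist = np.abs(signal[:, None] - signal[None, :])  # pairwise distances
    return (dist < radius).mean()

t = np.linspace(0, 20, 500)
stable = np.sin(t)                               # regular dynamics
rng = np.random.default_rng(1)
noisy = np.sin(t) + rng.normal(0, 1.0, t.size)   # irregular dynamics

# Regular dynamics revisit the same states far more often
r_regular = recurrence_rate(stable, 0.2)
r_irregular = recurrence_rate(noisy, 0.2)
print(r_regular, r_irregular)
```

<p>Higher recurrence, as in the regular signal here, corresponds to the more structured, stable network activity the study observed for correctly memorized sequences.</p>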
<p>The results indicated that when participants were listening to the correctly memorized musical sequences, the combined activity of the two networks was more structured and stable. Across all participants, this increased stability was associated with better performance on the task, including higher accuracy and faster response times. This provides evidence that organized and recurrent network dynamics are linked to successful cognitive function.</p>
<p>“Interestingly, participants who showed more stable and recurrent interactions between these networks also performed better in memory recognition,” Bonetti and Rosso said. “In simpler terms, when the brain’s networks work together in a stable and coordinated way, cognition becomes more efficient.”</p>
<p>A separate analysis focused on the spatial organization of the networks. By clustering brain voxels based on their participation in the two networks, the researchers found a nuanced pattern of engagement. Some brain regions, such as parts of the auditory cortex, were highly active in both networks, suggesting they act as hubs that contribute to both sound perception and memory-based prediction.</p>
<p>Other regions were more specialized, contributing strongly to one network but not the other. For example, the medial cingulate was primarily involved in the first network, while the hippocampus was a key component of the second.</p>
<p>The study also provides a new perspective on the “dual-stream” hypothesis of brain organization. Originally described for vision, this model proposes separate pathways for processing “what” an object is versus “where” it is. The second network identified in this study aligns well with the “what” pathway, or ventral stream, as it involves regions critical for recognition and memory.</p>
<p>However, the first network does not map cleanly onto the traditional “where” pathway. Instead, it seems to represent a distinct system involved in sustained auditory attention and processing, suggesting a more complex organization for auditory memory than previously thought.</p>
<p>“While we expected to see a link between auditory and memory systems, what really stood out was how the auditory cortex was simultaneously engaged in two distinct large-scale networks: one for perception and one for prediction,” Bonetti and Rosso told PsyPost. “This shows that the same brain region can flexibly contribute to different computational roles depending on context. This pattern confirms that the brain is fundamentally organised to support parallel processing, where multiple cognitive operations run at once and influence each other in real time.”</p>
<p>The study has some limitations. The task, while effective for research, was relatively simple and did not involve the complexity of real-world music listening. Future research could use the BROAD-NESS method to investigate brain network dynamics during more naturalistic experiences.</p>
<p>The researchers also plan to apply this framework to study clinical populations. Examining how these large-scale network dynamics differ in individuals with conditions that affect memory or predictive processing, such as Alzheimer’s disease or schizophrenia, could offer new insights into the neural basis of these disorders.</p>
<p>“Our next steps are twofold,” Bonetti and Rosso said. “First, we want to continue refining the BROAD-NESS framework, improving its accessibility and scalability so other researchers can apply it to their own data. Second, we plan to apply it across a variety of datasets, both in healthy individuals and clinical populations, to explore how large-scale brain networks differ between health and disease.”</p>
<p>“Ultimately, we hope this approach can help us better understand not just how the brain works when everything goes well, but also what changes occur in pathological conditions. In the long run, this could contribute to developing new biomarkers or targets for interventions based on whole-brain network dynamics.”</p>
<p>“One of the things we value most about BROAD-NESS is that it’s fully data-driven and transparent,” Bonetti and Rosso added. “The pipeline is built to integrate spatial, temporal, and dynamical analyses in a way that is easy to interpret, making it suitable not just for specialists in neuroscience, but for researchers across psychology, medicine, and computational science.”</p>
<p>“More broadly, this work aligns with a growing effort to move from studying where things happen in the brain to understanding how they unfold and interact as part of a living, dynamic system. That’s the big picture we hope to contribute to.”</p>
<p><em>This research emerged from an international collaboration that brought together several leading institutions. The study was carried out by researchers from the Center for Music in the Brain, which is affiliated with Aarhus University and The Royal Academy of Music in Denmark, along with partners from the Department of Clinical Medicine at Aarhus University, the University of Oxford, and the Department of Physics at the University of Bologna. This collaborative work was made possible by financial support from several key organizations, including the Danish National Research Foundation, the Independent Research Fund Denmark, and the Lundbeck Foundation.</em></p>
<p>The study, “<a href="https://doi.org/10.1002/advs.202507878">BROAD-NESS Uncovers Dual-Stream Mechanisms Underlying Predictive Coding in Auditory Memory Networks</a>,” was authored by Leonardo Bonetti, Gemma Fernández-Rubio, Mathias H. Andersen, Chiara Malvaso, Francesco Carlomagno, Claudia Testa, Peter Vuust, Morten L. Kringelbach, and Mattia Rosso.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/psychological-safety-mediates-link-between-ai-adoption-and-worker-depression/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Psychological safety mediates link between AI adoption and worker depression</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Nov 17th 2025, 21:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>Artificial intelligence is changing how many companies operate, but its impact on workers’ mental health is not fully understood. A new study published in <em><a href="https://doi.org/10.1057/s41599-025-05040-2" target="_blank">Humanities and Social Sciences Communications</a></em> suggests that adopting AI in organizations may negatively affect employee well-being by reducing psychological safety, which can in turn contribute to depression. The research also provides evidence that ethical leadership may help protect employees from these effects by fostering a safer and more supportive work environment.</p>
<p>The researchers—Byung-Jik Kim, Min-Jik Kim, and Julak Lee—sought to examine how AI adoption in the workplace influences employee depression. While many studies have looked at AI’s benefits or its technical aspects, fewer have addressed how AI affects employee mental health. The authors argue that this is a critical gap, especially given the growing role of AI in business operations and the potential for these technologies to reshape job roles, increase uncertainty, and disrupt workplace relationships.</p>
<p>They focused on depression because it is one of the most common and costly mental health conditions in the workplace. The researchers also wanted to explore not just whether AI adoption affects depression, but how this happens. They proposed that psychological safety—a shared belief that it is safe to speak up or take interpersonal risks at work—might explain this link. Additionally, they investigated whether ethical leadership, which involves fairness, transparency, and care for others, might reduce the negative effects of AI on psychological safety.</p>
<p>The study used a three-stage survey design with 381 employees from various organizations in South Korea. Data were collected through an online panel maintained by a major survey firm. Surveys were administered at three time points over several months to reduce bias and strengthen the results.</p>
<p>At the first stage, employees reported on the extent to which their organization had adopted AI and how ethical they perceived their leaders to be. AI adoption was measured across five areas: human resources, operations, marketing, strategy, and finance. Ethical leadership was assessed using a standard 10-item scale.</p>
<p>At the second stage, employees responded to questions about psychological safety. Items included statements such as whether it felt safe to take risks or ask for help at work.</p>
<p>At the final stage, employees completed a widely used 10-item measure of depression. This scale included questions about feelings of sadness, hopelessness, loneliness, and fatigue.</p>
<p>To analyze the data, the researchers used structural equation modeling, a method that allows for testing complex relationships among variables. They also conducted a bootstrapping analysis to test whether psychological safety acted as a bridge linking AI adoption to depression.</p>
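<p>The logic of the bootstrapped mediation test can be sketched as follows. This is a simplified illustration using ordinary least squares on simulated data, not the study’s structural equation model: the indirect effect is the product of the path from AI adoption to psychological safety and the path from safety to depression, and bootstrapping resamples the data to build a confidence interval around that product:</p>

```python
# Hedged sketch of a bootstrapped mediation test (simulated data,
# simple OLS slopes instead of a full structural equation model).
import numpy as np

rng = np.random.default_rng(42)
n = 381  # matches the study's sample size

# Simulate: AI adoption lowers psychological safety (path a < 0),
# and lower safety predicts more depression (path b < 0).
ai = rng.normal(size=n)
safety = -0.4 * ai + rng.normal(size=n)
depression = -0.5 * safety + rng.normal(size=n)

def slope(x, y):
    """OLS slope of y on x (both centered)."""
    x, y = x - x.mean(), y - y.mean()
    return (x @ y) / (x @ x)

# Indirect effect = a (AI -> safety) times b (safety -> depression)
boot = []
for _ in range(2000):
    idx = rng.integers(0, n, n)  # resample participants with replacement
    a = slope(ai[idx], safety[idx])
    b = slope(safety[idx], depression[idx])
    boot.append(a * b)

lo, hi = np.percentile(boot, [2.5, 97.5])
# Mediation is supported when the confidence interval excludes zero
print(f"indirect effect 95% CI: [{lo:.3f}, {hi:.3f}]")
```

<p>Because both paths are negative in this simulation, their product is positive, and a confidence interval that excludes zero is the signature of an indirect effect like the one the researchers reported.</p>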
<p>The results showed that AI adoption was not directly linked to employee depression. Instead, the relationship was indirect: AI adoption was associated with lower levels of psychological safety, and lower psychological safety was linked to higher levels of depression. This means that employees working in environments with more AI were less likely to feel safe speaking up or asking for help, which in turn made them more likely to experience depressive symptoms.</p>
<p>The analysis also showed that ethical leadership played a protective role. In organizations where employees perceived their leaders as ethical, the negative effect of AI adoption on psychological safety was weaker. In other words, ethical leadership helped reduce the loss of psychological safety that often accompanies AI integration. When leaders acted fairly, involved employees in decision-making, and communicated openly about changes, workers felt more supported and less threatened by AI technologies.</p>
<p>These findings suggest that the way organizations implement AI—and the kind of leadership in place—can make a significant difference in how employees respond psychologically.</p>
<p>While the study provides important insights, it also has limitations. First, all data were collected from self-reports, which can introduce bias. Although the researchers used time-lagged surveys to reduce this concern, future studies could include supervisor ratings or objective mental health data.</p>
<p>Second, the study was conducted in South Korea. Cultural factors such as high power distance and a strong emphasis on hierarchy may influence how AI adoption and leadership affect employees. Research in other cultural contexts is needed to see whether the findings apply more broadly.</p>
<p>The authors also note that not all effects of AI are negative. In some cases, AI could reduce stress by automating tedious tasks or improving job efficiency. Future research could explore both the positive and negative psychological effects of AI adoption, and under what conditions each occurs.</p>
<p>“The fast adoption and integration of AI at work is having a profound impact on the physical and mental health of employees. Given that AI is radically altering work processes and the overall employee experience, it is critical to examine the psychological risks and challenges that these technical enhancements entail,” the researchers concluded. </p>
<p>The study, “<a href="https://doi.org/10.1057/s41599-025-05040-2" target="_blank">The dark side of artificial intelligence adoption: linking artificial intelligence adoption to employee depression via psychological safety and ethical leadership</a>,” was authored by Byung-Jik Kim, Min-Jik Kim, and Julak Lee.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/the-impulse-to-garden-in-hard-times-has-deep-roots/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">The impulse to garden in hard times has deep roots</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Nov 17th 2025, 19:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>The coronavirus pandemic set off a global <a href="https://www.reuters.com/article/us-health-coronavirus-gardens/home-gardening-blooms-around-the-world-during-coronavirus-lockdowns-idUSKBN2220D3">gardening boom</a>.</p>
<p>In the early days of lockdown, seed suppliers <a href="https://www.sfchronicle.com/culture/article/A-comeback-for-victory-gardens-amid-Bay-Area-15177272.php">were depleted</a> of inventory and <a href="https://www.theguardian.com/world/2020/apr/08/coronavirus-gardening-boom-overwhelms-seed-suppliers-in-new-zealand-and-australia">reported</a> “unprecedented” demand. Within the U.S., the trend <a href="https://www.nytimes.com/2020/03/25/dining/victory-gardens-coronavirus.html">has been</a> <a href="https://crosscut.com/2020/03/wwii-era-victory-gardens-make-comeback-amid-coronavirus?fbclid=IwAR3ZAISo4LLibQcwlnYwjX2sKEdAIwCeU4maVah0DR8164AW7u76kLlZIMw">compared</a> to World War II <a href="https://library.si.edu/exhibition/cultivating-americas-gardens/gardening-for-the-common-good">victory gardening</a>, when Americans grew food at home to support the war effort and feed their families.</p>
<p>The analogy is surely convenient. But it reveals only one piece in a much bigger story about why people garden in hard times. Americans have long turned to the soil in moments of upheaval to manage anxieties and imagine alternatives. <a href="https://ugapress.org/book/9780820353197/gardenland">My research</a> has even led me to see gardening as a hidden landscape of desire for belonging and connection; for contact with nature; and for creative expression and improved health.</p>
<p>These motives have varied across time as growers respond to different historical circumstances. Today, what drives people to garden may not be the fear of hunger so much as hunger for physical contact, hope for nature’s resilience and a longing to engage in work that is real.</p>
<h2>Why Americans garden</h2>
<p>Prior to industrialization, most Americans were <a href="https://www.nass.usda.gov/AgCensus/">farmers</a> and would have considered it odd to grow food as a leisure activity. But as they moved into cities and suburbs to take factory and office jobs, coming home to putter around in one’s potato beds took on a kind of novelty. Gardening also appealed to nostalgia for the passing of traditional farm life.</p>
<p>For black Americans denied the opportunity to abandon subsistence work, Jim Crow-era gardening reflected a different set of desires.</p>
<p>In her essay “<a href="https://docs.google.com/viewer?a=v&pid=sites&srcid=ZGVmYXVsdGRvbWFpbnxhbWVyaWNhbmxpdDE0MTV8Z3g6NWRlMGUyYzc5NDJjMTRmNA">In Search of Our Mothers’ Gardens</a>,” Alice Walker recalls her mother tending an extravagant flower garden late at night after finishing brutal days of field labor. As a child, she wondered why anyone would voluntarily add one more task to such a difficult life. Later, Walker understood that gardening wasn’t just another form of labor; it was an act of artistic expression.</p>
<p>Particularly for black women relegated to society’s least desirable jobs, gardening offered the chance to reshape a small piece of the world in, as Walker put it, one’s “personal image of Beauty.”</p>
<p>This isn’t to say that food is always a secondary factor in gardening passions. Convenience cuisine in the 1950s spawned its <a href="https://www.jstor.org/stable/10.7591/j.ctt5hh1rm">own generation</a> of home-growers and <a href="https://uwpress.wisc.edu/books/4372.htm">back-to-the-land</a> movements rebelling against a <a href="https://www.ucpress.edu/book/9780520250352/meals-to-come">mid-century diet</a> now infamous for Jell-O mold salads, canned-food casseroles, TV dinners and Tang.</p>
<p>For millennial-era growers, gardens have responded to longings for <a href="https://www.researchgate.net/publication/329483674_The_Earth_Knows_My_Name_Food_Culture_and_Sustainability_in_the_Gardens_of_Ethnic_Americans">community and inclusion</a>, especially among <a href="https://www.themarshallproject.org/2015/06/09/doing-whatever-it-takes-to-create-a-prison-garden">marginalized groups</a>. Immigrants and inner-city residents lacking access to green space and fresh produce have taken up “<a href="https://www.ucpress.edu/book/9780520277779/paradise-transplanted">guerrilla gardening</a>” in vacant lots to revitalize their communities.</p>
<p>In 2011, Ron Finley – a resident of South Central L.A. and self-identified “<a href="https://www.latimes.com/food/dailydish/la-fo-ron-finley-project-20170503-story.html">gangsta gardener</a>” – was even threatened with arrest for installing vegetable plots along sidewalks.</p>
<p>Such appropriations of public space for community use are often seen as threats to existing power structures. Moreover, many people can’t wrap their heads around the idea that someone would spend time cultivating a garden but not reap all of the rewards.</p>
<p>When reporters asked Finley if he were concerned that people would steal the food, <a href="https://www.ted.com/talks/ron_finley_a_guerrilla_gardener_in_south_central_la">he replied</a>, “Hell no I ain’t afraid they’re gonna steal it, that’s why it’s on the street!”</p>
<h2>Gardening in the age of screens</h2>
<p>Since the lockdown began, I’ve watched my sister Amanda Fritzsche transform her neglected backyard in Cayucos, California, into a blooming sanctuary. She has also gotten into Zoom workouts, binged on Netflix and joined online happy hours. But as the weeks stretch into months, she seems to have less energy for those virtual encounters.</p>
<p>Gardening, on the other hand, has overtaken her life. Plantings that started out back have expanded around the side of the house, and gardening sessions have stretched later into the evening, when she sometimes works by headlamp.</p>
<p>When I asked about her new obsession, Amanda kept returning to her unease with screen time. She told me that virtual sessions gave a momentary boost, but “there’s always something missing … an empty feeling when you log off.”</p>
<p>Many can probably sense what’s missing. It’s the physical presence of others, and the opportunity to use our bodies in ways that matter. It’s the same longing for community that fills coffee shops with fellow gig workers and yoga studios with the heat of other bodies. It’s the electricity of the crowd at a concert, the students whispering behind you in class.</p>
<p>And so if the novel coronavirus underscores an age of distancing, gardening arises as an antidote, extending the promise of contact with something real. My sister talked about this, too: how gardening appealed to the whole body, naming sensory pleasures like “hearing song birds and insects, tasting herbs, the smell of dirt and flowers, the warm sun and satisfying ache.” While the virtual world may have its own ability to absorb attention, it is not immersive in the way gardening can be.</p>
<p>But this season, gardening is about more than physical activity for the sake of activity. Robin Wallace, owner of a photo production business in Camarillo, California, noted how the lockdown made her professional identity “suddenly irrelevant” as a “non-essential” worker. She went on to point out a key benefit of her garden: “The gardener is never without a purpose, a schedule, a mission.”</p>
<p>As automation and better algorithms make more forms of work obsolete, that longing for purpose gains special urgency. Gardens are a reminder that there are limits to what can be done without physical presence. As with handshakes and hugs, one cannot garden through a screen.</p>
<p>You might pick up skills from YouTube, but, as gardening icon Russell Page <a href="https://www.nyrb.com/products/the-education-of-a-gardener?variant=1094931829">once wrote</a>, real expertise comes from directly handling plants, “getting to know their likes and dislikes by smell and touch. ‘Book learning’ gave me information,” he explained, “but only physical contact can give any real … understanding of a live organism.”</p>
<h2>Filling the void</h2>
<p>Page’s observation suggests a final reason why the coronavirus pandemic has ignited such a flurry of gardening. Our era is one of profound <a href="https://doi.org/10.1016/j.amepre.2017.01.010">loneliness</a>, and the proliferation of <a href="https://www.upmc.com/media/news/012219-primack-sidani-posneg">digital devices</a> is only one of the causes. That emptiness also proceeds from the staggering <a href="https://www.un.org/sustainabledevelopment/blog/2019/05/nature-decline-unprecedented-report/">retreat of nature</a>, a process underway well before screen addiction. The people coming of age during the COVID-19 pandemic have already witnessed oceans die and glaciers disappear, watched Australia and the Amazon burn and mourned the astonishing <a href="https://www.worldwildlife.org/press-releases/wwf-report-reveals-staggering-extent-of-human-impact-on-planet">loss of global wildlife</a>.</p>
<p>Perhaps this explains why <a href="https://www.nytimes.com/2020/04/15/magazine/quarantine-animal-videos-coronavirus.html">stories of nature’s “comeback”</a> are continually <a href="https://www.latimes.com/environment/story/2020-04-21/wildlife-thrives-amid-coronavirus-lockdown">popping up</a> alongside those gardening headlines. We cheer at images of animals <a href="https://www.nytimes.com/2020/04/01/science/coronavirus-animals-wildlife-goats.html">reclaiming</a> abandoned spaces and birds filling skies cleared of pollution. Some of these accounts are credible, others <a href="https://www.nationalgeographic.com/animals/2020/03/coronavirus-pandemic-fake-animal-viral-social-media-posts/">dubious</a>. What matters, I think, is that they offer a glimpse of the world as we wish it could be: In a time of immense suffering and climate breakdown, we are desperate for signs of life’s resilience.</p>
<p>My final conversation with Wallace offered a clue as to how this desire is also fueling today’s gardening craze. She marveled at how life in the garden continues to “spring forth in our absence, or even because of our absence.” Then she closed with an insight at once “liberating” and “humiliating” that touches on hopes reaching far beyond the nation’s backyards: “No matter what we do, or how the conference call goes, the garden will carry on, with or without us.”</p>
<p> </p>
<p><em>This article is republished from <a href="https://theconversation.com">The Conversation</a> under a Creative Commons license. Read the <a href="https://theconversation.com/the-impulse-to-garden-in-hard-times-has-deep-roots-137223">original article</a>.</em></p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/a-sparse-population-of-neurons-plays-a-key-role-in-coordinating-the-brains-blood-supply/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">A sparse population of neurons plays a key role in coordinating the brain’s blood supply</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Nov 17th 2025, 17:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>A new study published in <em><a href="https://doi.org/10.7554/eLife.105649.3.sa0" target="_blank">eLife</a></em> has found that a remarkably rare type of neuron acts as a master conductor for the brain’s electrical activity and blood flow. Researchers at The Pennsylvania State University observed that removing these specific cells, known as type-I nNOS neurons, from one side of a mouse’s brain led to significant disruptions in both local brain function and the synchronized communication between the two cerebral hemispheres. </p>
<p>The work suggests this sparse population of cells plays an outsized role in orchestrating brain-wide dynamics, and their loss could contribute to the cognitive decline associated with aging and neurodegenerative disease.</p>
<p>The brain is a profoundly complex and energy-demanding organ. To function properly, its billions of neurons must constantly communicate through electrical signals, a process that requires a steady and precisely regulated supply of oxygen and nutrients from the bloodstream. This tight relationship between neural activity and blood flow is known as neurovascular coupling. When a group of neurons becomes active, nearby blood vessels dilate to increase blood supply to that region, a fundamental process that forms the basis of modern brain imaging techniques like functional magnetic resonance imaging.</p>
<p>One of the key chemical messengers involved in dilating these blood vessels is nitric oxide. This small molecule is produced by an enzyme called neuronal nitric oxide synthase, or nNOS, which is found in certain inhibitory brain cells called interneurons. </p>
<p>Researchers have identified two main categories of these cells, but a team of engineers and neuroscientists led by Patrick J. Drew at Penn State focused on a particularly enigmatic subgroup called type-I nNOS neurons. These cells are exceedingly scarce, making up less than one percent of all neurons in the cortex, yet they possess long, branching connections that extend across wide swaths of the brain. The researchers wanted to understand what specific functions these rare but widely connected cells perform.</p>
<p>A defining feature of type-I nNOS neurons is that they are the only cells in the cortex that have a specific surface receptor, called TACR1. This receptor acts like a lock that can only be opened by a particular key, a molecule called substance P. The research team used this unique feature to design a highly specific method for studying the neurons’ function. </p>
<p>They engineered a “molecular weapon” by attaching a potent toxin, saporin, to a synthetic version of substance P. When this compound was injected into a small region of the mouse brain, only the type-I nNOS neurons with the TACR1 receptor would absorb it, leading to their selective elimination while leaving other neighboring cells unharmed.</p>
<p>The team injected this substance into the somatosensory cortex of mice, the brain region responsible for processing touch, particularly from the whiskers. A separate group of control mice received an injection with a harmless, scrambled version of the substance P-toxin conjugate. </p>
<p>After allowing several weeks for recovery, the scientists used an array of advanced techniques to observe brain function in the awake, behaving animals. Widefield optical imaging allowed them to measure changes in blood volume across the brain’s surface, while two-photon microscopy provided a magnified view of individual arteries dilating and constricting. At the same time, implanted electrodes recorded the collective electrical activity of neurons, known as the local field potential.</p>
<p>When the researchers puffed air on the mice’s whiskers to simulate a touch sensation, they observed notable changes in the brain’s response. In mice lacking the type-I nNOS neurons, the sustained increase in blood flow during a prolonged stimulus was significantly reduced. After a brief stimulus, the typical small dip in blood volume that follows the initial surge was completely absent. These results indicated that while these neurons are not solely responsible for initiating blood flow responses, they play a substantial part in shaping and sustaining them over time.</p>
<p>The elimination of these neurons also had a profound effect on the brain’s intrinsic electrical rhythms. The researchers found a marked reduction in the power of slow brain waves in the delta frequency band, between 1 and 4 hertz. These slow waves are prominent during deep sleep and are thought to be important for memory consolidation and the brain’s waste clearance system. The reduction in delta wave activity was apparent across all states, whether the animals were alert, quietly resting, or asleep.</p>
<p>Beyond these local effects, the study revealed a breakdown in brain-wide coordination. Normally, the corresponding regions of the brain’s left and right hemispheres show highly synchronized activity. However, after removing the type-I nNOS neurons from just one hemisphere, this synchrony diminished. </p>
<p>The moment-to-moment fluctuations in blood volume and the patterns of high-frequency neural activity became less correlated between the two sides of the brain. This suggests that these rare neurons act as key nodes in a network that helps bind the two hemispheres together, ensuring they operate in a coordinated fashion.</p>
<p>Another key finding related to the spontaneous, rhythmic pulsing of the brain’s arteries, a phenomenon known as vasomotion. These oscillations, which occur even in the absence of any specific task or stimulation, are believed to help circulate cerebrospinal fluid and clear metabolic waste products from the brain. In mice without type-I nNOS neurons, the amplitude of these resting-state blood volume oscillations was significantly dampened. The arteries still pulsed, but their rhythmic dilations and constrictions were much weaker.</p>
<p>The study carries some important caveats. The experiments were conducted weeks after the neurons were removed, which may have allowed the brain to partially reorganize and compensate for their absence. Additionally, the relationship between neural signals and vascular responses is not always linear. </p>
<p>It is possible that multiple redundant pathways for vasodilation exist, such that removing one component does not cause a complete failure of the system, especially during strong stimulation when the response might already be near its maximum. This could help explain why some previous studies that artificially activated these neurons saw very large vascular effects, while this study saw more nuanced changes upon their removal.</p>
<p>Future research will likely explore the long-term consequences of losing these specific neurons. Because type-I nNOS neurons are known to be particularly vulnerable to cellular stress, their gradual loss over a lifetime could be a contributing factor to age-related cognitive decline. By connecting this select population of cells to fundamental processes like delta waves, interhemispheric communication, and vasomotion, this work opens up new avenues for understanding how the brain maintains its health and how these systems can fail in disease.</p>
<p>The study, “<a href="https://doi.org/10.7554/eLife.105649.3.sa0" target="_blank">Type-I nNOS neurons orchestrate cortical neural activity and vasomotion</a>,” was authored by Kevin Turner, Dakota Brockway, Md Shakhawat Hossain, Keith Griffith, Denver Greenawalt, Qingguang Zhang, Kyle Gheres, Nicole Crowley, and Patrick J Drew.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/street-dancing-may-improve-cognitive-reserve-in-young-women-study-finds/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Street dancing may improve cognitive reserve in young women, study finds</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Nov 17th 2025, 14:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>A neuroimaging study in China found that young female college students who participated in an 18-week street dance program showed greater activation in areas of the brain linked to attention, inhibition, and task switching. They also demonstrated better accuracy and faster responses in cognitive tasks. The paper was published in <em><a href="https://doi.org/10.3389/fnins.2025.1640555" target="_blank">Frontiers in Neuroscience</a></em>.</p>
<p>As people live longer, medical conditions characteristic of advanced age are becoming an increasingly important public health concern. Chief among these is age-related cognitive decline. As people reach advanced age, many of their cognitive abilities start to deteriorate, with working memory and processing speed usually the most affected. The decline tends to be slow at first but accelerates as a person ages.</p>
<p>Studies have indicated that a key protective factor against age-related cognitive decline is cognitive reserve. Cognitive reserve is the brain’s ability to maintain normal cognitive functioning despite damage, aging, or disease. This reserve develops through lifelong factors like education, intellectually stimulating activities, social engagement, and complex occupations. It is thought to result from both structural aspects (like greater synaptic density) and the functional adaptability of neural networks.</p>
<p>Cognitive reserve helps explain why two people with similar brain damage can show very different levels of cognitive impairment. It is closely related to the concept of brain plasticity, emphasizing the brain’s capacity to reorganize itself. Lifestyle choices such as regular exercise, learning new skills, and maintaining social connections can enhance cognitive reserve.</p>
<p>Study author Yongbo Wang and his colleagues wanted to explore whether street dancing could be used to improve cognitive reserve. In this study, street dance is defined “as an umbrella term for styles such as Breaking, Locking, and Popping, characterized by improvisation, rhythmic variety, and cognitively demanding choreographic sequences.”</p>
<p>The study participants were 28 healthy female college students with an average age of 20 years. The authors randomly assigned them to either a street dance group or a control group.</p>
<p>At the start of the study, all participants underwent functional near-infrared spectroscopy (fNIRS) of their brains and completed a set of cognitive tasks. The street dance group then participated in an 18-week program consisting of three 80-minute sessions per week.</p>
<p>The control group did not undergo any intervention and was instructed to maintain their usual daily routines. After 18 weeks, both groups underwent fNIRS scans again and completed the same cognitive tasks.</p>
<p>Results indicated that the street dancing group had increased activation in multiple prefrontal cortex regions while performing the cognitive tasks. These areas included the right dorsolateral prefrontal cortex, the right frontopolar area, and the left inferior frontal gyrus, which are linked to attention, inhibition, and task switching. The street dance group participants also showed higher accuracy and faster responses in the cognitive tasks.</p>
<p>“The 18-week street dance intervention effectively improved working memory, inhibitory control, and cognitive flexibility, contributing to enhanced cognitive reserve,” the study authors concluded. “As a physical activity combining rhythm and coordination, street dance offers a promising early intervention strategy for delaying cognitive decline and reducing dementia risk.”</p>
<p>The study contributes to the scientific understanding of the links between behavior and brain activity. However, it should be noted that the study was conducted on a very small group of healthy, young, female participants. Results in other demographic groups and in larger samples might differ.</p>
<p>The paper, “<a href="https://doi.org/10.3389/fnins.2025.1640555" target="_blank">Street dancing enhances cognitive reserve in young females: an fNIRS study</a>,” was authored by Yongbo Wang, Quansheng Zheng, Yanbai Han, Yaqing Fan, Hongen Liu, and Hongli Wang.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/why-people-think-kindness-is-in-your-dna-but-selfishness-isnt/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Why people think kindness is in your DNA but selfishness isn’t</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Nov 17th 2025, 12:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>People tend to believe that good behavior reflects who someone really is and is more deeply rooted in their biology, but are less likely to see bad behavior the same way. A new study published in <em><a href="https://doi.org/10.1177/09636625251375283" target="_blank">Public Understanding of Science</a></em> provides evidence that this tendency may be shaped by how natural a behavior seems.</p>
<p>People tend to be more comfortable attributing positive behavior than negative behavior to genetic causes. Past work has consistently found this pattern, but the reasons for it have remained unclear. Some theories suggest that people are motivated to hold wrongdoers accountable and are reluctant to attribute antisocial behavior to biological factors. </p>
<p>Others have proposed that people see positive traits as part of a person’s “true self,” and that they associate genes with this essential core. A third possibility is that people see good behavior as more natural, and that this sense of naturalness increases the likelihood that they’ll see it as genetically caused.</p>
<p>The new study aimed to test these three possible explanations more thoroughly. The researchers also wanted to know whether the pattern of greater genetic attribution for good behavior was influenced by the identity of the person whose behavior was being judged. </p>
<p>“The origin story of this work lies in the fact that my colleagues and I had noticed some inconsistency in the research literature about the effects of genetic explanations for behavior on perceptions of blameworthiness,” said study author Matthew S. Lebowitz, an assistant professor of medical psychology at Columbia University.</p>
<p>“When people are given information suggesting that a stigmatized health outcome, such as obesity or a mental health problem, may have a genetic cause, this often seems to reduce the extent to which people with these conditions are blamed for having them. But when people are told that immoral behavior, such as criminality, is influenced by genetic causes, this does not consistently lead to reductions in the perceived blameworthiness of the perpetrator. We wondered whether this might be because people are skeptical of the idea that genetic factors can cause wrongdoing.”</p>
<p>To investigate these questions, the research team recruited a nationally representative sample of 1,500 adults in the United States. The participants were randomly assigned to read a short description of a fictional person named Pat. In some versions, Pat was described as being kind, generous, and caring. In other versions, Pat was described as mean, selfish, and uncaring. Across different versions, Pat was also randomly described as either a man or a woman, and as either Black or White.</p>
<p>After reading about Pat, participants were asked a series of questions. These included how much they thought Pat’s behavior reflected Pat’s true self, how responsible Pat was for their behavior, how natural the behavior seemed, and how much of a role they thought genetics played in causing it. Each response was rated on a scale from 1 to 7. These ratings allowed the researchers to examine whether perceptions of responsibility, true self, or naturalness could explain why people were more likely to see good behavior as genetically influenced.</p>
<p>In line with previous research, Lebowitz and his colleagues found that people tended to see prosocial behavior, such as being generous or kind, as more genetically influenced than antisocial behavior, such as being selfish or mean. On average, participants gave higher genetic attribution scores when Pat was described positively. This difference was not affected by Pat’s race or gender. That is, the tendency to link genes to good behavior more than bad behavior occurred regardless of whether Pat was a man or a woman, or Black or White.</p>
<p>However, when the researchers looked at the overall level of genetic attributions, they did find differences based on race and gender. Participants were less likely to say that behavior was caused by genetics when Pat was described as a Black man, compared to a White man. This pattern did not appear among the female versions of Pat. The study did not test directly why this pattern occurred, but the authors suggest it may reflect broader cultural beliefs or stereotypes.</p>
<p>To understand what might explain the main difference in genetic attributions between prosocial and antisocial behavior, the researchers conducted a series of analyses testing the three possible mediators. First, they looked at how strongly each factor—responsibility, true self, and naturalness—predicted genetic attributions. They found that both true self and naturalness ratings were significantly related to genetic explanations, but responsibility ratings were not.</p>
<p>Next, they used statistical mediation analyses to determine whether these factors explained the link between behavior type and genetic attributions. The clearest finding was that perceived naturalness played a large role. When people saw behavior as more natural, they were more likely to see it as genetically influenced. Since people tended to view kind and generous behavior as more natural than selfish or mean behavior, this helped explain why prosocial behavior was more often attributed to genes.</p>
<p>True self ratings also played a role, but a smaller one. People tended to see good behavior as more reflective of Pat’s true self, and this increased the likelihood of attributing the behavior to genes. However, this effect was not as strong as the one linked to naturalness. </p>
<p>Responsibility ratings, by contrast, did not significantly mediate the relationship. In fact, participants rated Pat as less responsible in the antisocial condition than in the prosocial one, a finding that did not align with some earlier research and may point to complexities in how people understand accountability.</p>
<p>“We found that people consistently perceive antisocial (that is, harmful or morally bad) behavior to be less genetically influenced than prosocial (that is, helpful or morally good) behavior,” Lebowitz told PsyPost. “It seems that in large part, this is because people perceive prosocial or virtuous behavior to be more ‘natural’ than antisocial or immoral behavior, and genetic causation is linked with naturalness in people’s minds.”</p>
<p>“The pattern of results that we observed mirrors one that we have seen across many different studies published in many different papers, so we think it’s pretty safe to conclude that it’s a real phenomenon. And it suggests that as we increasingly understand the influence of genes on behavior, it will be important to consider the ways in which people’s evaluative judgments might influence how this understanding is received by the public. Regardless of what the science says, people might be more inclined to <em>believe</em> that genes influence some kinds of behavior more than other kinds.”</p>
<p>The study, like all research, has some caveats. It relied on a single pair of behavioral descriptions—one prosocial and one antisocial—which may not capture the full range of morally relevant behavior. The vignettes also used general descriptions of personality traits rather than specific actions, which might affect how participants interpreted the scenarios. Additionally, the study focused only on race and gender, and future research could examine how other demographic characteristics, such as age or socioeconomic status, affect genetic attributions.</p>
<p>There are also some unanswered questions about why behavior described as coming from a Black man was seen as less genetically influenced. The study was not designed to fully explain this pattern, and the authors encourage further research to explore whether it reflects underlying biases, cultural beliefs, or other factors.</p>
<p>The study, “<a href="https://doi.org/10.1177/09636625251375283" target="_blank">Beliefs about genetic influences on prosocial and antisocial behavior in a U.S. sample</a>,” was authored by Matthew S. Lebowitz, Baoyi Shi, Kathryn Tabb, Paul S. Appelbaum, and Linda Valeri.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/new-study-links-soft-drink-consumption-to-depression-via-the-gut-microbiome/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">New study links soft drink consumption to depression via the gut microbiome</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Nov 17th 2025, 10:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>A new study suggests a connection between the consumption of soft drinks and major depressive disorder, potentially acting through alterations in the gut’s microbial community. The research, published in <em><a href="https://jamanetwork.com/journals/jamapsychiatry/fullarticle/2839019" target="_blank" rel="noopener">JAMA Psychiatry</a></em>, identifies a specific bacterium that may play a part in this relationship, offering a new perspective on how dietary choices can influence mental health.</p>
<p>The human gut is home to trillions of microorganisms, collectively known as the gut microbiome. This internal ecosystem communicates with the brain through a complex network of signals, often called the gut-brain axis. The composition of these microbial communities can influence mood, behavior, and brain function. This connection has led researchers to explore how external factors, like diet, might impact mental health by first changing the gut environment.</p>
<p>Previous research has already connected soft drink consumption with a range of negative physical health outcomes. Separately, other studies have shown that the gut microbiomes of individuals with depression can differ from those without the condition. For instance, experiments involving the transfer of gut microbes from depressed human patients to rodents have induced behaviors in the animals that resemble anxiety and depression.</p>
<p>A large, collaborative team of researchers from institutions primarily in Marburg, Münster, and Frankfurt, Germany, sought to bring these lines of inquiry together. They aimed to investigate if there was a direct link between soft drink intake and diagnosed depression, and if the gut microbiome could be the biological mechanism explaining such a link.</p>
<p>To conduct their investigation, the researchers drew upon data from the Marburg-Münster Affective Disorders Cohort Study. This provided them with a large sample of 932 participants, which included 405 patients who had a clinical diagnosis of major depressive disorder (MDD) and 527 healthy control individuals. The two groups were comparable in age and sex distribution.</p>
<p>Participants’ dietary habits were assessed using a detailed food frequency questionnaire. This tool asked them to report how often they consumed a standard portion of various foods and drinks, including soft drinks like lemonade and cola, over the past year.</p>
<p>To measure the severity of depressive symptoms, individuals completed the Beck Depression Inventory, a standardized self-report questionnaire. In addition, stool samples were collected from a subset of participants to analyze the composition of their gut microbiomes. The team used a genetic sequencing technique known as 16S ribosomal RNA analysis, which allows for the identification and quantification of different types of bacteria present in a sample.</p>
<p>The initial analysis examined the direct relationship between soft drink consumption and depression. The results showed that higher intake of soft drinks was associated with a diagnosis of MDD. This connection appeared to be driven primarily by female participants; in women, higher consumption was linked to an increased likelihood of having an MDD diagnosis, while no such effect was observed in men.</p>
<p>A similar pattern emerged when the researchers looked at symptom severity. In the entire sample, greater soft drink intake was associated with more severe depressive symptoms. When the data was separated by sex, this association remained strong for women but was not statistically significant for men. This sex-specific finding prompted the team to focus their subsequent microbiome analyses exclusively on the female participants.</p>
<p>“Our data suggests that the relation between soft drinks and depressive symptoms arises via the influence of the microbiome,” said Sharmili Edwin Thanarajah, the study’s lead author from the University Hospital Frankfurt and the Max Planck Institute for Metabolism Research in Cologne.</p>
<p>The next step was to see if soft drinks affected the gut microbiome in a way that could explain the link to depression. Building on previous work that identified two bacterial genera, <em>Eggerthella</em> and <em>Hungatella</em>, as being potentially related to MDD, the team tested whether soft drink consumption was associated with the abundance of these specific microbes.</p>
<p>They found that in women, higher soft drink intake was indeed linked to a greater abundance of <em>Eggerthella</em>. No such link was found with <em>Hungatella</em>. The analysis also revealed that higher soft drink consumption was associated with lower overall microbial diversity, which can be an indicator of a less resilient gut ecosystem.</p>
<p>With these pieces in place, the researchers performed a mediation analysis. This statistical method tests a hypothesized chain of events to see if the data is consistent with it. The proposed model was that soft drink consumption influences the abundance of <em>Eggerthella</em>, which in turn influences the risk and severity of depression.</p>
<p>The analysis supported this model for female participants. The abundance of <em>Eggerthella</em> was found to explain a small but statistically significant portion of the total effect of soft drink consumption on both MDD diagnosis (about 3.8%) and symptom severity (about 5.0%).</p>
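<p>The logic of a mediation analysis like this can be sketched with simulated data. In the minimal product-of-coefficients version shown below, the indirect effect is the X→M coefficient times the M→Y coefficient, and the "proportion mediated" is that indirect effect divided by the total effect. All variable names and effect sizes here are hypothetical illustrations, not the study's actual data or pipeline:</p>

```python
import random

random.seed(42)
n = 5000

# Hypothetical simulated variables: X = soft drink intake, M = mediator
# (e.g., abundance of a gut microbe), Y = depressive symptom score.
a_true, b_true, c_true = 0.5, 0.4, 0.2   # paths X->M, M->Y, and direct X->Y
X = [random.gauss(0, 1) for _ in range(n)]
M = [a_true * x + random.gauss(0, 1) for x in X]
Y = [c_true * x + b_true * m + random.gauss(0, 1) for x, m in zip(X, M)]

def cov(u, v):
    """Population covariance of two equal-length lists."""
    mu, mv = sum(u) / len(u), sum(v) / len(v)
    return sum((ui - mu) * (vi - mv) for ui, vi in zip(u, v)) / len(u)

Sxx, Smm, Sxm = cov(X, X), cov(M, M), cov(X, M)
Sxy, Smy = cov(X, Y), cov(M, Y)

a_hat = Sxm / Sxx                         # path a: regress M on X
det = Sxx * Smm - Sxm ** 2
b_hat = (Sxx * Smy - Sxm * Sxy) / det     # path b: M's coefficient in Y ~ X + M
total = Sxy / Sxx                         # total effect: Y regressed on X alone
indirect = a_hat * b_hat                  # effect carried through the mediator

print(f"indirect effect: {indirect:.3f}")
print(f"proportion mediated: {indirect / total:.2f}")
```

<p>With these simulated effect sizes the mediator carries about half of the total effect; in the actual study the mediated share was far smaller (roughly 4–5%), which is why the authors and outside commentators treat it as a hypothesis to validate rather than an established mechanism.</p>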
<p>The researchers note that sugary drinks can disrupt the balance of the gut microbiome, favoring the growth of certain bacteria. “Changes in the microbiome can be influenced by diet and are therefore a potential therapeutic target,” explained Edwin Thanarajah. “Even small adjustments in consumer behavior might have a big impact, especially when considering the widespread consumption of soft drinks.”</p>
<p>While the study presents a detailed picture, it has several important limitations. As an observational study, it can identify associations but cannot prove that soft drink consumption causes depression. The relationship could be bidirectional. Ciara McCabe, a professor of neuroscience at the University of Reading who was not involved in the study, noted this point. “Simply, those with depression could drink more soft drinks,” McCabe told <a href="https://www.sciencemediacentre.org/expert-reaction-to-study-on-association-between-soft-drink-consumption-and-depression-mediated-by-gut-microbiome/" target="_blank" rel="noopener">the Science Media Centre</a>. “As the authors say themselves depression is associated with increased emotional eating and preference for high sugar food, which may lead to greater soft drink consumption.”</p>
<p>The proportion of the effect explained by the microbiome was also modest. Guillaume Meric, an associate professor at the University of Bath, commented on this aspect. “The microbiome mediation is statistically significant but explains only about 4–5% of the association, which makes it an interesting hypothesis to validate with further studies,” he said. The authors contend that even mediators with small effects can be meaningful in complex systems, as they might point toward pathways that are easier to modify.</p>
<p>Other experts pointed to potential confounding factors. Andrew McQuillin, a professor of molecular psychiatry at University College London, questioned the strength of the evidence. “The effect sizes reported are very small with wide confidence intervals and the findings have not been replicated in an independent study,” he remarked. The reliance on self-reported dietary information could also introduce inaccuracies.</p>
<p>Future research will be needed to untangle the direction of these relationships and confirm the findings. Randomized controlled trials, while challenging in dietary research, would provide stronger evidence of a causal link. Further investigation is also required to understand the sex-specific nature of the findings, which could involve hormonal differences or other biological factors.</p>
<p>Despite these caveats, the study adds to a body of evidence suggesting that diet is an important, modifiable factor in mental health. Rachel Lippert, a researcher from the German Institute of Human Nutrition Potsdam-Rehbrücke and a co-author of the study, sees potential in these findings.</p>
<p>“Microbiome-based approaches such as targeted nutritional therapies or probiotic strategies might help to effectively alleviate depressive symptoms in the future,” she said. The work suggests that public health strategies aimed at reducing soft drink intake could have benefits that extend beyond physical health to include mental well-being.</p>
<p>The study, “<a href="https://jamanetwork.com/journals/jamapsychiatry/fullarticle/2839019" target="_blank" rel="noopener">Soft Drink Consumption and Depression Mediated by Gut Microbiome Alterations</a>,” was authored by Sharmili Edwin Thanarajah, Adèle H. Ribeiro, Jaehyun Lee, Nils R. Winter, Frederike Stein, Rachel N. Lippert, Ruth Hanssen, Carmen Schiweck, Leon Fehse, Mirjam Bloemendaal, Mareike Aichholzer, Aicha Bouzouina, Carmen Uckermark, Marius Welzel, Jonathan Repple, Silke Matura, Susanne Meinert, Corinna Bang, Andre Franke, Ramona Leenings, Maximilian Konowski, Jan Ernsting, Lukas Fisch, Carlotta Barkhau, Florian Thomas-Odenthal, Paula Usemann, Lea Teutenberg, Benjamin Straube, Nina Alexander, Hamidreza Jamalabadi, Igor Nenadić, Andreas Lügering, Robert Nitsch, Sarah Kittel-Schneider, John F. Cryan, Andreas Reif, Tilo Kircher, Dominik Heider, Udo Dannlowski, and Tim Hahn.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<p><strong>Forwarded by:<br />
Michael Reeder LCPC<br />
Baltimore, MD</strong></p>
<p><strong>This information is taken from free public RSS feeds published by each organization for the purpose of public distribution. Readers are linked back to the article content on each organization's website. This email is an unaffiliated unofficial redistribution of this freely provided content from the publishers. </strong></p>
<p> </p>
<p><small><a href="https://blogtrottr.com/unsubscribe/565/DY9DKf">unsubscribe from this feed</a></small></p>