<table style="border:1px solid #adadad; background-color: #F3F1EC; color: #666666; padding:8px; -webkit-border-radius:4px; border-radius:4px; -moz-border-radius:4px; line-height:16px; margin-bottom:6px;" width="100%">
<tbody>
<tr>
<td><span style="font-family:Helvetica, sans-serif; font-size:20px;font-weight:bold;">PsyPost – Psychology News</span></td>
</tr>
<tr>
<td> </td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/scientists-just-mapped-the-brain-architecture-that-underlies-human-intelligence/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Scientists just mapped the brain architecture that underlies human intelligence</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Feb 6th 2026, 08:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p>For decades, researchers have attempted to pinpoint the specific areas of the brain responsible for human intelligence. A new analysis suggests that general intelligence involves the coordination of the entire brain rather than the superior function of any single region. By mapping the connections within the human brain, or connectome, scientists found that distinct patterns of global communication predict cognitive ability.</p>
<p>The research indicates that intelligent thought relies on a system-wide architecture optimized for efficiency and flexibility. These findings were published in the journal <em><a href="https://doi.org/10.1038/s41467-026-68698-5" target="_blank">Nature Communications</a></em>.</p>
<p>General intelligence represents the capacity to reason, learn, and solve problems across a variety of contexts. In the past, theories often attributed this capacity to specific networks, such as the areas in the frontal and parietal lobes involved in attention and working memory. While these regions are involved in cognitive tasks, newer perspectives suggest they are part of a larger story.</p>
<p>The Network Neuroscience Theory proposes that intelligence arises from the global topology of the brain. This framework suggests that the physical wiring of the brain and its patterns of activity work in tandem.</p>
<p>Ramsey R. Wilcox, a researcher at the University of Notre Dame, led the study to test the specific predictions of this network theory. Working with senior author Aron K. Barbey and colleagues from the University of Illinois and Stony Brook University, Wilcox sought to move beyond localized models. The team aimed to understand how the brain’s physical structure constrains and directs its functional activity.</p>
<p>To investigate these questions, the research team utilized data from the Human Connectome Project. This massive dataset provided brain imaging and cognitive testing results from 831 healthy young adults. The researchers also validated their findings using an independent sample of 145 participants from a separate study.</p>
<p>The investigators employed a novel method that combined two distinct types of magnetic resonance imaging (MRI) data. They used diffusion-weighted MRI to map the structural white matter tracts, which act as the physical cables connecting brain regions. Simultaneously, they analyzed resting-state functional MRI, which measures the rhythmic activation patterns of brain cells.</p>
<p>By integrating these modalities, Wilcox and his colleagues created a joint model of the brain. This approach allowed them to estimate the capacity of structural connections to transmit information based on observed activity. The model corrected for limitations in traditional scanning, such as the difficulty in detecting crossing fibers within the brain’s white matter.</p>
<p>The team then applied predictive modeling techniques to see if these global network features could estimate a participant’s general intelligence score. The results provided strong support for the idea that intelligence is a distributed phenomenon. Models that incorporated connections across the whole brain successfully predicted intelligence scores.</p>
<p>In contrast, models that relied on single, isolated networks performed with less accuracy. This suggests that while specific networks have roles, the interaction between them is primary. The most predictive connections were not confined to one area but were spread throughout the cortex.</p>
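<p>To make the logic of such distributed prediction concrete, the sketch below mimics the general flavor of connectome-style predictive modelling on synthetic data: in each leave-one-out fold, connections that correlate with the score in the training set are pooled into a single summary predictor, and a simple line is fit to it. All numbers, thresholds, and variable names here are illustrative assumptions, not the study's actual pipeline.</p>

```python
import random

random.seed(0)

# Synthetic stand-in data: 40 "participants", 30 "connections" each.
# The score is weakly driven by many connections at once (a distributed signal).
n_subj, n_feat = 40, 30
X = [[random.gauss(0, 1) for _ in range(n_feat)] for _ in range(n_subj)]
y = [sum(row) / n_feat + random.gauss(0, 0.1) for row in X]

def pearson(a, b):
    # plain Pearson correlation coefficient
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    num = sum((ai - ma) * (bi - mb) for ai, bi in zip(a, b))
    da = sum((ai - ma) ** 2 for ai in a) ** 0.5
    db = sum((bi - mb) ** 2 for bi in b) ** 0.5
    return num / (da * db)

preds = []
for i in range(n_subj):  # leave one participant out per fold
    train = [j for j in range(n_subj) if j != i]
    y_tr = [y[j] for j in train]
    # keep connections positively correlated with the score in the training fold
    keep = [f for f in range(n_feat)
            if pearson([X[j][f] for j in train], y_tr) > 0.1]
    s_tr = [sum(X[j][f] for f in keep) for j in train]
    # least-squares line: score ~ a * (summed connection strength) + b
    ms, my = sum(s_tr) / len(train), sum(y_tr) / len(train)
    a = (sum((s - ms) * (v - my) for s, v in zip(s_tr, y_tr))
         / sum((s - ms) ** 2 for s in s_tr))
    b = my - a * ms
    preds.append(a * sum(X[i][f] for f in keep) + b)

# out-of-sample accuracy: correlation between predicted and actual scores
accuracy = pearson(preds, y)
```

<p>Because the signal is spread across many weak connections, pooling them predicts the held-out score far better than any single connection would, which is the intuition behind whole-brain models outperforming single-network ones.</p>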
<p>One of the specific predictions the team tested involved the strength and length of neural connections. The researchers found that individuals with higher intelligence scores tended to rely on “weak ties” for long-range communication. In network science, a weak tie represents a connection that is not structurally dense but acts as a bridge between separate communities of neurons.</p>
<p>These long-range, weak connections require less energy to maintain than dense, strong connections. Their weakness allows them to be easily modulated by neural activity. This quality makes the brain more adaptable, enabling it to reconfigure its communication pathways rapidly in response to new problems.</p>
<p>The study showed that in highly intelligent individuals, these predictive weak connections spanned longer physical distances. Conversely, strong connections in these individuals tended to be shorter. This architecture likely balances the high cost of long-distance communication with the need for system-wide integration.</p>
<p>Another key finding concerned “modal control.” This concept refers to the ability of specific brain regions to drive the brain into difficult-to-reach states of activity. Cognitive tasks often require the brain to shift away from its default patterns to process complex information.</p>
<p>Wilcox and his team found that general intelligence was positively associated with the presence of regions exhibiting high modal control. These control hubs were located in areas of the brain associated with executive function and visual processing. The presence of these regulating nodes allows the brain to orchestrate interactions between different networks effectively.</p>
<p>The researchers also examined the overall topology of the brain using a concept known as “small-worldness.” A small-world network is one that features tight-knit local communities of nodes as well as short paths that connect those communities. This organization is efficient because it allows for specialized local processing while maintaining rapid global communication.</p>
<p>The analysis revealed that participants with higher intelligence scores possessed brain networks with greater small-world characteristics. Their brains exhibited high levels of local clustering, meaning nearby regions were tightly interconnected. Simultaneously, they maintained short average path lengths across the entire system.</p>
<p>This balance ensures that information does not get trapped in local modules. It also ensures that the brain does not become a disorganized random network. The findings suggest that deviations from this optimal balance may underlie lower cognitive performance.</p>
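<p>The small-world idea itself is easy to demonstrate on a toy network. The pure-Python sketch below, an illustration rather than anything from the study, builds a ring lattice (high clustering, long paths) and then adds a few long-range shortcuts, the graph-theoretic analogue of the "weak ties" described above; the shortcuts sharply shorten the average path length while leaving local clustering largely intact.</p>

```python
from collections import deque
from itertools import combinations

def clustering(adj):
    # average fraction of each node's neighbour pairs that are themselves linked
    total = 0.0
    for node, nbrs in adj.items():
        if len(nbrs) >= 2:
            links = sum(1 for a, b in combinations(nbrs, 2) if b in adj[a])
            total += 2 * links / (len(nbrs) * (len(nbrs) - 1))
    return total / len(adj)

def avg_path_length(adj):
    # mean breadth-first-search distance over all connected node pairs
    total = pairs = 0
    for src in adj:
        dist, queue = {src: 0}, deque([src])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        total += sum(dist.values())
        pairs += len(dist) - 1
    return total / pairs

def ring_lattice(n, k=2):
    # each node linked to its k nearest neighbours on either side
    return {i: {(i + d) % n for d in range(-k, k + 1) if d} for i in range(n)}

lattice = ring_lattice(20)
small_world = {node: set(nbrs) for node, nbrs in lattice.items()}
for a, b in [(0, 10), (3, 14), (6, 17)]:  # a few long-range shortcuts
    small_world[a].add(b)
    small_world[b].add(a)

# shortcuts shorten global paths while local clustering stays high
print(clustering(lattice), avg_path_length(lattice))
print(clustering(small_world), avg_path_length(small_world))
```

<p>Comparing both printed pairs shows the signature trade-off: clustering barely moves, while the average path length drops, which is what "greater small-world characteristics" means in network terms.</p>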
<p>There are limitations to the current study that warrant consideration. The research relies on correlational data, so it cannot definitively prove that specific network structures cause higher intelligence. It is possible that engaging in intellectual activities alters the brain’s wiring over time.</p>
<p>Additionally, the study focused primarily on young adults. Future research will need to determine if these network patterns hold true across the lifespan, from childhood development through aging. The team also used linear modeling techniques, which may miss more nuanced, non-linear relationships in the data.</p>
<p>These insights into the biological basis of human intelligence have implications for the development of artificial intelligence. Current AI systems often excel at specific tasks but struggle with the broad flexibility characteristic of human thought. Understanding how the human brain achieves general intelligence through global network architecture could inspire new designs for artificial systems.</p>
<p>By mimicking the brain’s balance of local specialization and global integration, engineers might create AI that is more adaptable. The reliance on weak, flexible connections for integrating information could also serve as a model for efficient data processing.</p>
<p>The shift in perspective offered by this study is substantial. It moves the field away from viewing the brain as a collection of isolated tools. Instead, it presents the brain as a unified, dynamic system where the pattern of connections determines cognitive potential.</p>
<p>Wilcox and his colleagues have provided empirical evidence that validates the core tenets of Network Neuroscience Theory. Their work demonstrates that intelligence is not a localized function but a property of the global connectome. As neuroscience continues to map these connections, the definition of what it means to be intelligent will likely continue to evolve.</p>
<p>The study, “<a href="https://doi.org/10.1038/s41467-026-68698-5" target="_blank">The network architecture of general intelligence in the human connectome</a>,” was authored by Ramsey R. Wilcox, Babak Hemmatian, Lav R. Varshney & Aron K. Barbey.</p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/sorting-hat-research-what-does-your-hogwarts-house-say-about-your-psychological-makeup/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Sorting Hat research: What does your Hogwarts house say about your psychological makeup?</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Feb 6th 2026, 06:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p>A recent study suggests that the popular “Sorting Hat Quiz” from the Harry Potter universe may loosely reflect actual personality traits, particularly for fans of the series. The findings indicate that while the quiz captures some real psychological differences, its predictive power relies heavily on the participant’s familiarity with the narrative. These results were published in <em><a href="https://doi.org/10.1371/journal.pone.0336123" target="_blank">PLOS One</a></em>.</p>
<p>Human beings possess a deep-seated drive to engage with storytelling and often identify closely with fictional characters. This tendency frequently manifests in the popularity of online assessments that assign individuals to specific groups within a fictional universe.</p>
<p>The “Sorting Hat Quiz” is a prominent example where users are sorted into one of four Hogwarts Houses based on their responses to situational questions. Prior investigations suggested a correlation between these House assignments and established psychological traits. The authors of the current study sought to verify these associations using more rigorous personality measures. They also aimed to determine if these connections exist for people who are unfamiliar with the books.</p>
<p>“The project actually started in a very down-to-earth way: my coauthors and I are genuine Harry Potter fans, and at some point we found ourselves joking—but also seriously debating—that each of us ‘belongs’ to a different Hogwarts House,” said study author Maria Flakus of the Polish Academy of Sciences in Warsaw.</p>
<p>“That naturally led to a more scientific question: <i>is there any real psychological signal behind these identifications, or are they mostly narrative stereotypes and wishful thinking?</i> In other words, we wanted to see whether people’s House alignment (especially the House they <i>feel</i> they are, or <i>want</i> to be) maps onto meaningful differences in their dominant personality characteristics.”</p>
<p>“At the same time, there was a broader gap worth addressing. Sorting-type pop-culture quizzes are massively popular and people often treat the outcomes as surprisingly ‘accurate,’ yet the evidence for whether they track established psychological traits—and <i>under what conditions</i>—is limited and not fully consistent. We were particularly motivated to test whether the Sorting Hat Quiz can tell us something about personality at all, and whether ‘desired’ House membership might be as informative (or even more informative) than the algorithmic assignment—potentially reflecting an ideal self rather than a measured trait profile.”</p>
<p>To examine this, the research team recruited 677 participants through social media platforms. The sample consisted of adults ranging from 18 to 55 years old who were residents of Poland or spoke Polish fluently. The researchers divided the participants into two distinct groups based on their exposure to the series. The first group contained 578 individuals who had read the Harry Potter books. The second group consisted of 99 individuals who had not read the books.</p>
<p>Participants completed the official Sorting Hat Quiz on the Wizarding World website to determine their designated House. They also indicated which House they personally desired to join. To assess personality, the researchers administered the Polish Personality Lexicon, which is based on the HEXACO model. This model measures honesty-humility, emotional stability, extroversion, agreeableness, conscientiousness, and openness to experience.</p>
<p>The study also employed specific scales to measure darker personality aspects known as the Dark Triad. The researchers used the Narcissistic Admiration and Rivalry Questionnaire and the MACH-IV scale (for Machiavellianism). They assessed psychopathy using the Triarchic Psychopathy Measure. Additionally, the Need for Cognition Scale evaluated how much participants enjoyed complex thinking and intellectual challenges.</p>
<p>The data revealed specific patterns among the participants who had read the books. Individuals sorted into Slytherin scored higher on measures of Machiavellianism, narcissism, and psychopathy compared to members of other Houses. These participants displayed traits associated with manipulation and a focus on self-interest. This finding aligns with the fictional portrayal of Slytherin House as ambitious and sometimes cunning.</p>
<p>Participants sorted into Ravenclaw demonstrated a higher need for cognition. This indicates a preference for intellectual engagement and problem-solving activities. This result corresponds well with the Ravenclaw reputation for valuing wit, learning, and wisdom. Those assigned to Gryffindor scored marginally higher on extroversion than the other groups. This suggests a tendency toward social assertiveness and enthusiasm.</p>
<p>Individuals sorted into Hufflepuff reported higher levels of agreeableness and honesty-humility. This aligns with the fictional description of the House as valuing fair play, loyalty, and hard work. However, these participants also reported lower levels of emotional stability. This finding implies a greater tendency to experience worry or a need for emotional support in stressful situations.</p>
<p>“Readers should think of the effects as modest rather than ‘life-defining,’” Flakus told PsyPost. “Even when differences between Houses are statistically reliable, there’s substantial overlap—many people in different Houses look similar on standard trait measures—so House membership explains only a limited share of personality variance. Practically, that means the Sorting Hat result may capture a real tendency at the group level, but it’s not precise enough for individual prediction or decision-making. It’s best viewed as a fun, coarse-grained signal.”</p>
<p>The researchers noted a discrepancy regarding conscientiousness among Hufflepuffs. Previous theories posited that Hufflepuffs would score highest in this trait due to their association with hard work. The current data provided evidence that Hufflepuffs did not score significantly higher in conscientiousness than members of other Houses. This challenges some of the simpler stereotypes associated with the House.</p>
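<p>A null result like this is typically checked with a significance test on the group means. The snippet below shows one generic way such a comparison could be run, a two-sided permutation test on entirely invented scores; it illustrates the method in general, not the authors' actual analysis or data.</p>

```python
import random

def perm_test_mean_diff(a, b, n_iter=10000, seed=1):
    # Two-sided permutation test for a difference in group means:
    # how often does a random relabelling of participants produce a
    # mean gap at least as large as the one observed?
    random.seed(seed)
    observed = abs(sum(a) / len(a) - sum(b) / len(b))
    pooled = a + b
    hits = 0
    for _ in range(n_iter):
        random.shuffle(pooled)
        ga, gb = pooled[:len(a)], pooled[len(a):]
        if abs(sum(ga) / len(ga) - sum(gb) / len(gb)) >= observed:
            hits += 1
    return hits / n_iter

# entirely invented conscientiousness-style scores, for illustration only
hufflepuff   = [3.9, 4.1, 3.7, 4.0, 3.8, 4.2, 3.6, 3.9]
other_houses = [3.8, 4.0, 3.9, 3.7, 4.1, 3.5, 3.8, 4.0]
p_value = perm_test_mean_diff(hufflepuff, other_houses)
# a large p-value means the data are consistent with "no real difference"
```

<p>With overlapping distributions like these, the p-value is well above conventional thresholds, which is the statistical shape of "did not score significantly higher".</p>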
<p>The researchers also analyzed the personality traits of participants based on the House they wanted to join rather than the one they were assigned. The patterns for desired Houses closely mirrored the results for the assigned Houses among readers. For example, those who wished to be in Slytherin scored higher on narcissism and psychopathy. This implies that personal preference is a strong indicator of one’s psychological makeup in this context.</p>
<p>“We were surprised that the pattern of associations pointed not only to traits but also to how people see themselves—self-identification sometimes seemed as informative as the quiz assignment,” Flakus said.</p>
<p>But the relationships between House assignment and personality traits were largely absent in the group of non-readers. While there was a minor link between Gryffindor assignment and extroversion, most other correlations disappeared. The Sorting Hat Quiz failed to predict the “Dark Triad” traits or need for cognition in participants unfamiliar with the books. This suggests that the quiz itself does not function as a standalone personality test.</p>
<p>These findings suggest that the Sorting Hat Quiz is not an effective tool for psychological assessment in a general context. The predictive power of the quiz appears to depend on the participant’s knowledge of the fictional universe. This supports the “narrative collective assimilation hypothesis.” This theory proposes that immersing oneself in a story allows a person to internalize the traits of a specific group within that narrative.</p>
<p>Fans of the series may unconsciously or consciously align their self-perception with the traits of their preferred House. When they answer personality questions, they may do so through the lens of this identity. For non-readers, the questions in the quiz lack this contextual weight. Consequently, their answers do not aggregate into meaningful personality profiles in the same way.</p>
<p>“The key takeaway is that these kinds of pop-culture quizzes can reflect <em>some</em> real personality differences, but they’re not a substitute for validated psychological assessment,” Flakus explained. “Your ‘House’ can be a fun mirror of broad tendencies—and sometimes your preferred House may say as much about your values or ideal self as about your traits—so it’s best used as a playful starting point for self-reflection, not a diagnosis.”</p>
<p>As with all research, there are some limitations to consider. The group of non-readers was relatively small compared to the group of readers. The sample was also predominantly female and recruited via social media. This may affect how well the results represent the general population.</p>
<p>Future inquiries could examine whether these patterns persist across different generations of fans. Researchers might also investigate similar phenomena in other popular fictional universes. Further study is needed to understand how identifying with fictional groups relates to real-world behaviors and values.</p>
<p>“At this point, we don’t have a fixed long-term roadmap yet, but we do see several promising next steps,” Flakus said. “One natural extension would be to test whether similar patterns appear in other pop-culture identity systems—i.e., whether identifying with particular factions, archetypes, or ‘types’ in other franchises relates to established personality traits in comparable ways.”</p>
<p>“We’re also interested in potential generational differences: the Harry Potter universe has a distinct cultural footprint across age cohorts, so it would be valuable to examine whether the mechanisms behind identification (and its links to traits or values) vary by generation.” </p>
<p>“Finally, an important direction is to look more closely at how these quizzes function among people who don’t know the universe at all—in our study we had such a subgroup, but it was small. A larger, more balanced sample would let us more confidently explore whether the quiz captures general psychological tendencies independent of fandom, or whether familiarity and narrative knowledge meaningfully shape the outcomes.”</p>
<p>The new findings regarding the personality structures of Hogwarts Houses align with separate research focused on external economic behaviors. A study published in <em>Small Business Economics</em> by Martin Obschonka and colleagues utilized a massive dataset to examine how these fictional profiles relate to entrepreneurship. </p>
<p>While the current study focused on self-reported traits, the Obschonka research found that <a href="https://www.psypost.org/new-harry-potter-study-links-gryffindor-and-slytherin-personalities-to-heightened-entrepreneurship/" target="_blank">identifying with Gryffindor or Slytherin predicted a higher likelihood of starting a business</a>. The researchers attributed this to a shared tendency toward “deviance” or rule-breaking, which is often necessary for innovation.</p>
<p>The new study, “<a href="https://doi.org/10.1371/journal.pone.0336123" target="_blank">Harry Potter and personality assessment – The utility of the Sorting Hat Quiz in personality traits’ assessment</a>,” was authored by Lidia Baran, Maria Flakus, and Franciszek Stefanek.</p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/deceptive-ai-interactions-can-feel-more-deep-and-genuine-than-actual-human-conversations/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Deceptive AI interactions can feel deeper and more genuine than actual human conversations</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Feb 5th 2026, 20:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p>A new study published in <em><a href="https://doi.org/10.1038/s44271-025-00391-7" target="_blank">Communications Psychology</a></em> suggests that artificial intelligence systems can be more effective than humans at establishing emotional closeness during deep conversations, provided the human participant believes the AI is a real person. The findings indicate that while individuals can form social bonds with AI, knowing the partner is a machine reduces the feeling of connection.</p>
<p>The rapid development of large language models has fundamentally altered the landscape of human-computer interaction. Previous observations have indicated that these programs can generate content that appears empathetic and similar to human speech. Despite these advancements, it remained unclear whether humans could form relationships with AI that are as strong as those formed with other people. This is particularly relevant during the initial stages of getting to know a stranger.</p>
<p>Scientists aimed to fill this gap by investigating how relationship building differs between human partners and AI partners. They sought to determine if AI could handle “deep talk,” which involves sharing personal feelings and memories, as effectively as it handles superficial “small talk.” Additionally, the research team wanted to understand how a person’s pre-existing attitude toward technology affects this connection. Many people view AI with skepticism or perceive it as a threat to uniquely human qualities like emotion.</p>
<p>To investigate these dynamics, the research team recruited a total of 492 participants between the ages of 18 and 35. The sample consisted of university students. The experiments took place online to mimic typical digital communication. To simulate a realistic environment for relationship building, the researchers utilized a method known as the “Fast Friends Procedure.” This standardized protocol involves two partners asking and answering a series of questions that become increasingly personal over time.</p>
<p>In the first study, 322 participants engaged in a text-based chat. They were all informed that they would be interacting with another human participant. In reality, the researchers assigned half of the participants to chat with a real human. The other half interacted with a fictional character generated by a Google AI model known as PaLM 2. The interactions were further divided into two categories. Some pairs engaged in small talk, discussing casual topics. Others engaged in deep talk, addressing emotionally charged subjects.</p>
<p>The results from this first experiment showed a distinct difference based on the type of conversation. When the interaction involved small talk, participants reported similar levels of closeness regardless of whether their partner was human or AI. However, in the deep talk condition, the AI partner outperformed the human partner. Participants who unknowingly chatted with the AI reported significantly higher feelings of interpersonal closeness than those who chatted with real humans.</p>
<p>To understand why this occurred, the researchers analyzed the linguistic patterns of the chats. They found that the AI produced responses with higher levels of “self-disclosure.” The AI spoke more about emotions, self-related topics, and social processes. This behavior appeared to encourage the human participants to reciprocate. When the AI shared more “personal” details, the humans did the same. This mutual exchange of personal information led to a stronger perceived bond.</p>
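<p>Linguistic analyses of this kind often rest on dictionary-based word counting in the style of LIWC: each word is matched against category word lists, and the category's rate is its share of all words. The toy counter below illustrates the idea; the word lists are invented stand-ins, and the real instruments use far larger dictionaries.</p>

```python
# Toy LIWC-style category counter. The word lists are illustrative
# assumptions, not the actual dictionaries used in the study.
CATEGORIES = {
    "self":    {"i", "me", "my", "mine", "myself"},
    "emotion": {"happy", "sad", "afraid", "love", "lonely", "proud"},
    "social":  {"friend", "family", "we", "together", "talk"},
}

def category_rates(text):
    # share of words in the text that fall into each category
    words = [w.strip(".,!?;:").lower() for w in text.split()]
    n = len(words) or 1
    return {cat: sum(w in vocab for w in words) / n
            for cat, vocab in CATEGORIES.items()}

message = "I love talking with my friend, even when I feel lonely."
rates = category_rates(message)
# higher "self" and "emotion" rates correspond to greater self-disclosure
```

<p>Scored this way, a message dense in first-person pronouns and emotion words registers as high self-disclosure, which is the kind of signal that distinguished the AI's responses in the chats.</p>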
<p>The second study sought to determine how the label assigned to the partner influenced these feelings. This phase focused exclusively on deep conversations. The researchers analyzed data from 334 participants, combining new recruits with relevant data from the first experiment. In this setup, the researchers manipulated the information given to the participants. Some were told they were chatting with a human, while others were told they were interacting with an AI.</p>
<p>The researchers found that the label played a significant role in relationship building. Regardless of whether the partner was actually a human or a machine, participants reported feeling less closeness when they believed they were interacting with an AI. This suggests an anti-AI bias that hinders social connection. The researchers noted that this effect was likely due to lower motivation. When people thought they were talking to a machine, they wrote shorter responses and engaged less with the conversation.</p>
<p>Despite this bias, the study showed that relationship building did not disappear entirely. Participants still reported an increase in closeness after chatting with a partner labeled as AI, just to a lesser degree than with a partner labeled as human. This suggests that people can develop social bonds with artificial agents even when they are fully aware of the agent’s non-human nature.</p>
<p>The researchers also explored individual differences in these interactions. They looked at a personality trait called “universalism,” which involves a concern for the welfare of people and nature. The analysis indicated that individuals who scored high on universalism felt closer to partners labeled as human but did not show the same increased closeness toward partners labeled as AI. This finding suggests that personal values may influence how receptive an individual is to forming bonds with technology.</p>
<p>There are several potential misinterpretations and limitations to consider regarding this work. The study relied on text-based communication, which differs significantly from face-to-face or voice-based interactions. The absence of visual and auditory cues might make it easier for an AI to pass as human. Additionally, the sample consisted of university students from a Western cultural context. The findings may not apply to other age groups or cultures.</p>
<p>The AI responses were generated using a specific model available in early 2024. As technology evolves rapidly, newer models might yield different results. It is also important to note that the AI was prompted to act as a specific character. This means the results apply to AI that is designed to mimic human behavior, rather than a generic chatbot assistant.</p>
<p>Future research could investigate whether these effects persist over longer periods. This study looked only at a single, short-term interaction. Scientists could also explore whether using avatars or voice generation changes the dynamic of the relationship. It would be useful to understand if the “uncanny valley” effect, where near-human replicas cause discomfort, becomes relevant as the technology becomes more realistic.</p>
<p>The study has dual implications for society. On one hand, the ability of AI to foster closeness suggests it could be useful in therapeutic settings or for combating loneliness. It could help alleviate the strain on overburdened social and medical services. On the other hand, the fact that AI was most effective when disguised as a human points to significant ethical risks. Malicious actors could use such systems to create deceptive emotional connections for scams or manipulation.</p>
<p>The study, “<a href="https://doi.org/10.1038/s44271-025-00391-7" target="_blank">AI outperforms humans in establishing interpersonal closeness in emotionally engaging interactions, but only when labelled as human</a>,” was authored by Tobias Kleinert, Marie Waldschütz, Julian Blau, Markus Heinrichs, and Bastian Schiller.</p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/divorce-history-is-not-linked-to-signs-of-brain-aging-or-dementia-markers/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Divorce history is not linked to signs of brain aging or dementia markers</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Feb 5th 2026, 18:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>A new study investigating the biological impact of marital dissolution suggests that a history of divorce does not accelerate physical changes in the brain associated with aging or dementia. Researchers analyzed brain scans from a racially and ethnically diverse group of older adults to look for signs of neurodegeneration. They found no robust link between having been divorced and the presence of Alzheimer’s disease markers or reductions in brain volume. These findings were published in <em><a href="https://doi.org/10.1093/geroni/igaf122.4023" target="_blank">Innovation in Aging</a></em>.</p>
<p>The rising number of older adults globally has made understanding the causes of cognitive decline a priority for medical researchers. Scientists are increasingly looking beyond diet and exercise to understand how social and psychological experiences shape biology. Psychosocial stress is a primary area of interest in this field. Chronic stress can negatively impact the body, potentially increasing inflammation or hormonal imbalances that harm brain cells over time.</p>
<p>Divorce represents one of the most common and intense sources of psychosocial stress in the United States. Approximately 17 percent of adults over the age of 50 reported being divorced in 2023. The experience often involves not just the emotional pain of a relationship ending but also long-term economic strain and the loss of social standing. These secondary effects are often particularly harsh for women.</p>
<p>Previous research into how divorce affects the aging mind has produced conflicting results. Some past studies indicated that divorced or widowed individuals faced higher odds of developing dementia compared to married peers. Other inquiries found that ending a marriage might actually slow cognitive decline in some cases. Most of this prior work relied on memory tests rather than looking at the physical condition of the brain itself.</p>
<p>To address this gap, a team of researchers sought to determine if divorce leaves a physical imprint on brain structure. The study was led by Suhani Amin and Junxian Liu, who are affiliated with the Leonard Davis School of Gerontology at the University of Southern California. They collaborated with senior colleagues from Kaiser Permanente, the University of California, Davis, and Rush University.</p>
<p>The team hypothesized that the accumulated stress of divorce might correlate with worse brain health in later years. They specifically looked for reductions in brain size and the accumulation of harmful proteins. They also aimed to correct a limitation in previous studies that often focused only on White populations. This new analysis prioritized a cohort that included Asian, Black, Latino, and White participants.</p>
<p>The researchers utilized data from two major ongoing health studies. The first was the Kaiser Healthy Aging and Different Life Experiences (KHANDLE) cohort. The second was the Study of Healthy Aging in African Americans (STAR) cohort. Both groups consisted of long-term members of the Kaiser Permanente Northern California healthcare system.</p>
<p>Participants in these cohorts had previously completed detailed health surveys and were invited to undergo neuroimaging. The researchers identified 664 participants who had complete magnetic resonance imaging (MRI) data. They also analyzed a subset of 385 participants who underwent positron emission tomography (PET) scans. The average age of the participants at the time of their MRI scan was approximately 74 years old.</p>
<p>The primary variable the researchers examined was a history of divorce. They classified participants based on whether they answered yes to having a previous marriage end in divorce. They also included individuals who reported their current marital status as divorced. This approach allowed them to capture lifetime exposure to the event rather than just current status.</p>
<p>The MRI scans provided detailed images allowing the measurement of brain volumes. The team looked at the total size of the cerebrum and specific regions like the hippocampus. The hippocampus is a brain structure vital for learning and memory that often shrinks early in the course of Alzheimer’s disease. They also examined the lobes of the brain and the volume of gray matter and white matter.</p>
<p>In addition to volume, the MRI scans measured white matter hyperintensities. These are bright spots on a scan that indicate damage to the brain’s communication cables. High amounts of these hyperintensities are often associated with vascular problems and cognitive slowing.</p>
<p>The PET scans utilized a radioactive tracer to detect amyloid plaques. Amyloid beta is a sticky protein that clumps between nerve cells and is a hallmark characteristic of Alzheimer’s disease. The researchers calculated the density of these plaques to determine if a person crossed the threshold for amyloid positivity.</p>
<p>The statistical analysis accounted for various factors that could skew the results. The models adjusted for age, sex, race and ethnicity, and education level. They also controlled for whether the participant was born in the American South and whether their own parents had divorced.</p>
<p>The results showed that individuals with a history of divorce had slightly smaller volumes in the total cerebrum and hippocampus. They also displayed slightly greater volumes of white matter hyperintensities. However, these differences were small and not statistically significant. In other words, the estimates were not precise enough to rule out the possibility that the differences reflected random chance.</p>
<p>The PET scan analysis yielded similar results regarding Alzheimer’s pathology. There was no meaningful association between a history of divorce and the total burden of amyloid plaques. The likelihood of being classified as amyloid-positive was effectively the same for divorced and non-divorced participants.</p>
<p>The researchers performed several sensitivity analyses to ensure their findings were robust. They broke the data down by sex to see if men and women experienced different effects. Although the impact of divorce on brain volume seemed to trend in opposite directions for men and women in some brain regions, the confidence intervals overlapped. This suggests there is no strong evidence of a sex-specific difference in this sample.</p>
<p>They also checked if the definition of the sample population affected the outcome. They ran the numbers again excluding people who had never been married. They also adjusted for childhood socioeconomic status, looking at factors like parental education and financial stability. None of these adjustments altered the primary conclusion that divorce was not associated with brain changes.</p>
<p>There are several potential reasons why this study did not find a link between divorce and neurodegeneration. One possibility is that the stress of divorce acts more like an acute, short-term event rather than a chronic condition. Detectable changes in brain structure usually result from sustained exposure to adversity over many years. It is possible that for many people, the stress of divorce resolves before it causes permanent biological damage.</p>
<p>Another factor is the heterogeneity of the divorce experience. For some individuals, ending a marriage is a devastating source of trauma and financial ruin. For others, it is a relief that removes them from an unhealthy or unsafe environment. These opposing experiences might cancel each other out when analyzing a large group, leading to a null result.</p>
<p>The authors noted several limitations to their work. The study relied on a binary measure of whether a divorce occurred. They did not have data on the timing of the divorce or the reasons behind it. They also lacked information on the subjective level of stress the participants felt during the separation.</p>
<p>Future research could benefit from a more nuanced approach. Gathering data on the duration of the marriage and the economic aftermath of the split could provide clearer insights. Understanding the personal context of the divorce might help reveal specific subgroups of people who are more vulnerable to health consequences.</p>
<p>The study provides a reassuring perspective for the millions of older adults who have experienced marital dissolution. While divorce is undoubtedly a major life event, this research suggests it does not automatically dictate the biological health of the brain in late life. It underscores the resilience of the aging brain in the face of common social stressors.</p>
<p>The study, “<a href="https://doi.org/10.1093/geroni/igaf122.4023" target="_blank">The Association Between Divorce and Late-life Brain Health in a Racially and Ethnically Diverse Cohort of Older Adults,</a>” was authored by Suhani Amin, Junxian Liu, Paola Gilsanz, Evan Fletcher, Charles DeCarli, Lisa L. Barnes, Rachel A. Whitmer, and Eleanor Hayes-Larson.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/infants-fed-to-sleep-at-2-months-wake-up-more-often-at-6-months/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Infants fed to sleep at 2 months wake up more often at 6 months</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Feb 5th 2026, 16:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>A 12-month longitudinal study found that infants who are put to bed with a bottle at 2 months of age tended to display more sleep problems at 6 months of age. They needed a longer time to fall asleep, spent more time awake, and woke up during the night more often. Mothers of infants who displayed more sleep problems at 6 months of age were more likely to keep putting them to bed with a bottle at 14 months of age. The paper was published in the <a href="https://doi.org/10.1111/jsr.70237"><em>Journal of Sleep Research</em></a>.</p>
<p>Many infants have sleep problems, particularly in the first year of life. These include difficulty falling asleep, frequent or prolonged night wakings, short nighttime sleep duration, and an inability to soothe themselves back to sleep. These problems are important because they are linked to later risks for both child and family well-being.</p>
<p>Poor infant sleep has been associated with outcomes such as overweight, obesity, and difficulties in emotional and behavioral regulation. Sleep problems also affect parents, contributing to higher depressive symptoms, lower energy, and less adaptive parenting practices. Research suggests that infant sleep and parenting behaviors influence each other in a bidirectional, transactional way over time.</p>
<p>One parenting practice of interest is putting an infant to bed with a bottle, which is believed to interfere with the infant’s ability to self-soothe to sleep. Feeding infants to sleep is associated with shorter nighttime sleep duration, more frequent night wakings, and greater sleep fragmentation. Expert guidance therefore emphasizes putting infants to bed while drowsy but still awake, rather than using feeding as a sleep aid.</p>
<p>Providing a bottle at bedtime has also been identified as a feeding practice that promotes obesity, linking sleep routines to physical health outcomes. Poor infant sleep may, in turn, increase parents’ reliance on bottle-to-bed practices as a way to manage nighttime distress.</p>
<p>Study author Esther M. Leerkes and her colleagues wanted to examine associations between putting the infant to bed with a bottle and maternal-reported infant sleep problems. They conducted a 12-month longitudinal study in which they followed a group of infants and their mothers from the infants’ 2nd month of life until the infants were 14 months old.</p>
<p>Pregnant women in their third trimester were recruited in and around Guilford County, North Carolina, to participate in the Infant Growth and Development Study. The primary goal of that larger study was to identify early life predictors of childhood obesity. Originally, 299 women were recruited, with a mean age of 29.71 years.</p>
<p>Data from participating women were collected when their infants were 2 months, 6 months, and 14 months old. Of these women, 90% provided data at the 2-month wave, 81% at 6 months, and 76% at 14 months.</p>
<p>Mothers reported how often they put their infant to bed with a bottle of formula, breast milk, juice, juice drink, or any other kind of milk by providing ratings on a 5-point scale. They reported infants’ sleep problems using the Brief Infant Sleep Questionnaire.</p>
<p>The study authors included data on maternal education, race, and their participation in the Special Supplemental Nutrition Program for Women, Infants, and Children (WIC) in their analyses. They also controlled for maternal depressive symptoms, maternal sleep quality, breastfeeding status, and weekly work hours. WIC is a U.S. federal nutrition assistance program that provides supplemental foods, nutrition education, and health referrals to low-income pregnant women, new mothers, infants, and young children.</p>
<p>Results showed that infants who were put to bed with a bottle more frequently at 2 months of age tended to display more sleep problems at 6 months of age. They needed a longer time to fall asleep, spent more time awake at night, and had more frequent night wakings.</p>
<p>Mothers whose infants woke up more frequently and spent less time sleeping during the night at 6 months were more likely to be putting them to bed with a bottle at 14 months of age.</p>
<p>“In conclusion, putting infants to bed with a bottle and infant sleep problems influence one another across infants’ first year and into their second year. Given infant sleep problems are a predictor of maladaptive infant, parent and family outcomes, efforts to prevent parental use of this strategy are important to promote infant and parent well-being,” the study authors concluded.</p>
<p>The study contributes to the scientific knowledge about infant sleep patterns. However, it should be noted that both infants’ sleep quality and bottle-to-bed practices were reported by mothers, leaving room for reporting and common method bias to have affected the results.</p>
<p>The paper, “<a href="https://doi.org/10.1111/jsr.70237">Transactional Associations Between Bottle to Bed and Infant Sleep Problems Over the First Year,</a>” was authored by Esther M. Leerkes, Agona Lutolli, Cheryl Buehler, Lenka Shriver, and Laurie Wideman.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/eye-contact-discomfort-does-not-explain-slower-emotion-recognition-in-autistic-individuals/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Eye contact discomfort does not explain slower emotion recognition in autistic individuals</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Feb 5th 2026, 14:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>Recent findings published in the journal <em><a href="https://doi.org/10.1037/emo0001625" target="_blank">Emotion</a></em> suggest that the discomfort associated with making eye contact is not exclusive to individuals with a clinical autism diagnosis but scales with autistic traits found in the general population. The research team discovered that while this social unease is common among those with higher levels of autistic traits, it does not appear to be the direct cause of difficulties in recognizing facial expressions. </p>
<p>The concept of autism has evolved significantly in recent years. Mental health professionals and researchers increasingly view the condition not as a binary category but as a spectrum of traits that exist throughout the general public. This perspective implies that the distinction between a person with an autism diagnosis and a neurotypical person is often a matter of degree rather than a difference in kind.</p>
<p>Features associated with autism, such as sensory sensitivities or preferences for repetitive behaviors, can be present in anyone to varying extents. One of the most recognizable features associated with autism is a reduction in mutual gaze during social interactions. Autistic individuals frequently report that meeting another person’s eyes causes intense sensory or emotional overarousal.</p>
<p>Despite these self-reports, the scientific community has not fully determined why this avoidance occurs or how it impacts social cognition. Previous theories posited that avoiding eye contact limits the visual information a person receives. If a person does not look at the eyes, they might miss subtle cues required to identify emotions such as fear or happiness.</p>
<p>To investigate this, a team of researchers led by Sara Landberg from the University of Gothenburg in Sweden designed a study to disentangle these factors. The study included co-authors Jakob Åsberg Johnels, Martyna Galazka, and Nouchine Hadjikhani. Their primary goal was to examine how eye gaze discomfort relates to autistic traits, distinct from a formal diagnosis.</p>
<p>They also sought to understand the role of other conditions that often co-occur with autism. One such condition is alexithymia, which is characterized by a difficulty in identifying and describing one’s own emotions. Another is prosopagnosia, often called “face blindness,” which involves an impairment in recognizing facial identity.</p>
<p>The researchers recruited 187 adults from English-speaking countries through an online platform. This method allowed them to access a diverse sample of the general public rather than relying solely on clinical patients. The participants completed a series of standardized questionnaires to measure their levels of autistic traits, alexithymia, and face recognition abilities.</p>
<p>To assess sensory experiences, the group answered questions about their sensitivity to stimuli like noise, light, and touch. The study also utilized a specific “Eye Contact Questionnaire.” This tool asked participants directly if they found eye contact unpleasant and, if so, what strategies they used to manage that feeling.</p>
<p>In addition to the self-reports, the participants completed an objective performance test called the Emotion Labeling Task. On a computer screen, they viewed faces that had been digitally morphed to display emotions at only 40 percent intensity. This low intensity was chosen to make the task sufficiently challenging for a general adult audience.</p>
<p>Participants had to match the emotion shown on the screen—such as fear, anger, or happiness—to one of four label options. The researchers measured both the accuracy of the answers and the reaction time. This setup allowed the team to determine if people with high levels of specific traits were slower or less accurate at reading faces.</p>
<p>The data revealed clear associations between personality traits and social comfort. Participants who scored higher on the scale for autistic traits were more likely to report finding eye contact unpleasant. This supports the idea that social gaze aversion is a continuous trait in the population.</p>
<p>The study also identified an independent link between alexithymia and eye gaze discomfort. Individuals who struggle to understand their own internal emotional states also tend to find mutual gaze difficult. While these two traits often overlap, the statistical analysis showed that alexithymia predicts discomfort on its own.</p>
<p>A particularly revealing finding emerged regarding the coping strategies participants employed. The researchers asked individuals how they handled the discomfort of looking someone in the eye. The responses indicated that people with high autistic traits tend to look at other parts of the face, such as the mouth or nose.</p>
<p>In contrast, those with high levels of alexithymia were more likely to look away from the face entirely. They might look at the floor or in another direction. This suggests that while the symptom of gaze avoidance looks similar from the outside, the internal mechanism or coping strategy differs depending on the underlying trait.</p>
<p>When analyzing the performance on the Emotion Labeling Task, the researchers found no statistically significant difference in accuracy based on autistic traits. Participants with higher levels of these traits were just as capable of correctly identifying the emotions as their peers. This contrasts with some previous literature that found deficits in emotion recognition accuracy.</p>
<p>However, the results did show a difference in processing speed. Participants with higher levels of autistic traits took longer to identify the emotions. Similarly, those with higher levels of prosopagnosia, or difficulty recognizing identities, also demonstrated slower reaction times.</p>
<p>The researchers then performed a mediation analysis to see if the eye gaze discomfort explained this slower processing. The hypothesis was that discomfort might cause people to look away or avoid the eyes, which would then slow down their ability to read the emotion. The data did not support this hypothesis.</p>
<p>Eye gaze discomfort was not a statistically significant predictor of the reaction time on the emotion task. This implies that the discomfort one feels about eye contact and the cognitive speed of recognizing an emotion are likely separate issues. The slower processing speed associated with autistic traits seems to stem from a different cognitive mechanism than the emotional or sensory aversion to gaze.</p>
<p>The study also explored sensory sensitivity. The researchers hypothesized that general sensory over-responsiveness might drive the discomfort with eye contact. However, the analysis did not find a strong link between general sensory sensitivity scores and the specific report of eye gaze discomfort.</p>
<p>These findings suggest that the difficulty autistic individuals face with emotion recognition may be more about processing efficiency than a lack of visual input due to avoidance. This challenges the assumption that simply training individuals to make more eye contact would automatically improve their ability to read emotions.</p>
<p>There are limitations to this research that must be considered. The data was collected entirely online. While this allows for a large sample, it prevents the researchers from controlling the environment in which participants took the tests. Factors such as screen size, lighting, or distractions at home could influence reaction times.</p>
<p>The sample was also highly educated, with a majority of participants having completed a university degree. This demographic skew means the results may not fully represent the broader global population. Additionally, the autistic traits in this sample were slightly higher than average, which may reflect a self-selection bias in who chooses to participate in online psychological studies.</p>
<p>The measurement of eye gaze discomfort relied on a binary “yes or no” question followed by strategy selection. This simple metric may not capture the full complexity or intensity of the experience. Future research would benefit from using more granular scales to measure the degree of discomfort.</p>
<p>The researchers note that this study focused on traits rather than diagnostic categories. This approach is beneficial for understanding the continuum of human behavior. However, it means the results might not fully apply to individuals with profound autism who experience high functional impairment.</p>
<p>Future investigations could expand on the distinct coping strategies identified here. Understanding why individuals with alexithymia look away completely, while those with autistic traits look at other facial features, could inform better support strategies. It suggests that interventions should be tailored to the specific underlying profile of the individual.</p>
<p>The study also raises questions about the role of social anxiety. While the team controlled for several factors, they did not specifically measure current anxiety levels. It is possible that general social anxiety plays a role in the strategies people use to avoid eye contact.</p>
<p>The study, “<a href="https://doi.org/10.1037/emo0001625" target="_blank">Eye Gaze Discomfort: Associations With Autistic Traits, Alexithymia, Face Recognition, and Emotion Recognition</a>,” was authored by Sara Landberg, Jakob Åsberg Johnels, Martyna Galazka, and Nouchine Hadjikhani.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/a-high-sugar-breakfast-may-trigger-a-rest-and-digest-state-that-dampens-cognitive-focus/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">A high-sugar breakfast may trigger a “rest and digest” state that dampens cognitive focus</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Feb 5th 2026, 12:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>Starting the day with a sugary pastry might feel like a treat, but new research suggests it could sabotage your workday before it begins. A study published in the journal <em><a href="https://doi.org/10.1016/j.foohum.2025.100900" target="_blank">Food and Humanity</a></em> indicates that a high-fat, high-sugar morning meal may dampen cognitive planning abilities and increase sleepiness in young women. The findings imply that nutritional choices at breakfast play a larger role in regulating morning physiological arousal and mental focus than previously realized.</p>
<p>Dietary habits vary widely across populations, yet breakfast is often touted as the foundation for daily energy. Despite this reputation, statistical data indicates that a sizable portion of adult women frequently consume confectionaries or sweet snacks as their first meal of the day. Researchers identify this trend as a potential public health concern, particularly regarding productivity and mental well-being in the workplace.</p>
<p>The autonomic nervous system regulates involuntary body processes, including heart rate and digestion. It functions through two main branches: the sympathetic nervous system and the parasympathetic nervous system. The sympathetic branch prepares the body for action, often described as the “fight or flight” response.</p>
<p>Conversely, the parasympathetic branch promotes a “rest and digest” state, calming the body and conserving energy. Professional work performance typically requires a certain level of alertness and physiological arousal. Fumiaki Hanzawa and colleagues at the University of Hyogo in Japan sought to understand how different breakfast compositions influence this delicate neural balance.</p>
<p>Hanzawa and his team hypothesized that the nutrient density of a meal directly impacts how the nervous system regulates alertness and cognitive processing shortly after eating. To test this, they designed a randomized crossover trial involving 13 healthy female university students. This specific study design ensured that each participant acted as her own control, minimizing the impact of individual biological variations.</p>
<p>On two separate mornings, the women arrived at the laboratory after fasting overnight. They consumed one of two test meals that contained an identical amount of food energy, totaling 497 kilocalories. The researchers allowed for a washout period of at least one week between the two sessions to prevent any lingering effects from the first test.</p>
<p>One meal option was a balanced breakfast modeled after a traditional Japanese meal, known as Washoku. This included boiled rice, salted salmon, an omelet, spinach with sesame sauce, miso soup, and a banana. The nutrient breakdown of this meal favored carbohydrates and protein, with a moderate amount of fat.</p>
<p>The alternative was a high-fat, high-sugar meal designed to mimic a common convenient breakfast of poor nutritional quality. This consisted of sweet doughnut holes and a commercially available strawberry milk drink. This meal derived more than half its total energy from fat and contained very little protein compared to the balanced option.</p>
<p>The researchers monitored several physiological markers for two hours following the meal. They measured body temperature inside the ear to track diet-induced thermogenesis, which is the production of heat in the body caused by metabolizing food. They also recorded heart rate variability to assess the activity of the autonomic nervous system.</p>
<p>At specific intervals, the participants completed computerized cognitive tests. These tasks were designed to measure attention and executive function. Specifically, the researchers looked at “task switching,” which assesses the brain’s ability to shift attention between different rule sets.</p>
<p>The participants also rated their subjective feelings on a sliding scale. They reported their current levels of fatigue, vitality, and sleepiness at multiple time points. This allowed the researchers to compare the women’s internal psychological states with their objective physiological data.</p>
<p>The physiological responses showed distinct patterns depending on the food consumed. The balanced breakfast prompted a measurable rise in body temperature and heart rate shortly after eating. This physiological shift suggests an activation of the sympathetic nervous system, preparing the body for the day’s activities.</p>
<p>In contrast, the doughnut and sweetened milk meal failed to raise body temperature to the same degree. Instead, the data revealed a dominant response from the parasympathetic nervous system immediately after consumption. This suggests the sugary meal induced a state of relaxation and digestion rather than physiological readiness.</p>
<p>Subjective reports from the participants mirrored these physical changes. The women reported feeling higher levels of vitality after consuming the balanced meal containing rice and fish. This feeling of energy persisted during the post-meal monitoring period.</p>
<p>Conversely, when the same women ate the high-fat, high-sugar breakfast, they reported increased sleepiness. This sensation of lethargy aligns with the parasympathetic dominance observed in the heart rate data. The anticipated energy boost from the sugar did not translate into a feeling of vitality.</p>
<p>The cognitive testing revealed that the sugary meal led to a decline in planning function. Specifically, the participants struggled more with task switching after the high-fat, high-sugar breakfast compared to the balanced meal. This function is vital for organizing steps to achieve a goal and adapting to changing work requirements.</p>
<p>Unexpectedly, participants performed slightly better on a specific visual attention task after the high-fat, high-sugar meal. The authors suggest this could be due to a temporary dopamine release triggered by the sweet taste. However, this isolated improvement did not extend to the more complex executive functions required for planning.</p>
<p>The researchers propose that the difference in carbohydrate types may explain some of the results. The balanced meal contained rice, which is rich in polysaccharides like amylose and amylopectin. These complex carbohydrates digest differently than the sucrose found in the doughnuts and sweetened milk.</p>
<p>Protein content also likely played a role in the thermal effects observed. The balanced meal contained significantly more protein, which is known to require more energy to metabolize than fat or sugar. This thermogenic effect contributes to the rise in body temperature and the associated feeling of alertness.</p>
<p>The study implies that work performance is not just about caloric intake but the quality of those calories. A breakfast that triggers a “rest and digest” response may be counterproductive for someone attempting to start a workday. The mental fog and sleepiness associated with the high-fat, high-sugar meal could hinder productivity.</p>
<p>While the results provide insight into diet and physiology, the study has limitations that affect broader applications. The sample size was small, involving only 13 participants from a specific age group and gender. This limits the ability to generalize the results to men or older adults with different metabolic profiles.</p>
<p>The study also focused exclusively on young students rather than full-time workers. Actual workplace stress and physical demands might interact with diet in ways this laboratory setting could not replicate. Additionally, the study only examined immediate, short-term effects following a single meal.</p>
<p>It remains unclear how long-term habitual consumption of high-fat, high-sugar breakfasts might alter these responses over months or years. Chronic exposure to such a diet could potentially lead to different adaptations or more severe deficits. The researchers note that habitual poor diet is already linked to cognitive decline in other epidemiological studies.</p>
<p>Hanzawa and the research team suggest that future investigations should expand the demographic pool. Including male participants and older workers would help clarify if these physiological responses are universal. They also recommend examining how these physiological changes translate into actual performance metrics in a real-world office environment.</p>
<p>The study, “<a href="https://doi.org/10.1016/j.foohum.2025.100900" target="_blank">High-fat, high-sugar breakfast worsen morning mood, cognitive performance, and cardiac sympathetic nervous system activity in young women</a>,” was authored by Fumiaki Hanzawa, Manaka Hashimoto, Mana Gonda, Miyoko Okuzono, Yumi Takayama, Yukina Yumen, and Narumi Nagai.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/neuroscientists-reveal-how-jazz-improvisation-shifts-brain-activity/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Neuroscientists reveal how jazz improvisation shifts brain activity</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Feb 5th 2026, 10:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>Recent findings in neuroscience provide new evidence that musical creativity is not a static trait but a dynamic process involving the rapid reconfiguration of brain networks. By monitoring the brain activity of skilled jazz pianists, an international research team discovered that high levels of improvisational freedom rely less on introspection and more on sensory and motor engagement. The study suggests that the brain shifts its processing strategy depending on how much creative liberty a musician exerts. These findings were published in the <em><a href="https://doi.org/10.1111/nyas.70042" target="_blank">Annals of the New York Academy of Sciences</a></em>.</p>
<p>Creativity is a complex human ability often defined as the capacity to produce ideas that are both novel and appropriate for a given context. One scientific view proposes that creativity emerges from a balance between constraints and freedom, or between what is predictable and what is surprising. Musical improvisation offers an ideal setting to study this balance because it requires musicians to generate new material spontaneously while adhering to specific structural rules.</p>
<p>Previous neuroimaging studies have identified various brain regions associated with improvisation. These include areas linked to motor planning, emotional processing, and the monitoring of one’s own performance. However, most of these studies have looked at brain activity as a static average over time. This approach can miss the rapid fluctuations in neural connectivity that characterize real-time creative performance. The authors of the current study sought to map these fleeting changes to understand how the brain adapts to different levels of improvisational constraints.</p>
<p>“My main motivation for the study was a long-standing scientific challenge about how to study creativity in real time,” said study author <a href="http://www.petervuust.dk/" target="_blank">Peter Vuust</a>, the director of <a href="https://musicinthebrain.au.dk//" target="_blank">the Center for Music in the Brain</a> and professor at Aarhus University and the Royal Academy of Music Aarhus.</p>
<p>“Much research looks at finished products or abstract tasks, but fewer studies capture the process of creating something new as it unfolds in the brain. Musical jazz improvisation offers a rare opportunity because it is spontaneous yet structured—musicians create novel material moment-to-moment while still following certain rules relating to harmony, rhythm and structure.”</p>
<p>“So the gap was twofold: 1) A need for ecologically valid models of creativity (real behavior, not artificial lab tasks). 2) Limited knowledge about how whole-brain networks dynamically reconfigure during different levels of creative freedom.”</p>
<p>“In the Center for Music in the Brain we have the unique capability of studying brain activity as it unfolds in real time, using state-of-the-art brain imaging combined with whole-brain modelling methods which allow for understanding the shifting brain network activity over time,” Vuust explained.</p>
<p>The study included 16 male jazz pianists with significant experience in the genre. All participants were right-handed and had no history of neurological disease. On average, the musicians had over ten years of dedicated jazz practice. The researchers utilized functional magnetic resonance imaging to record brain activity. This imaging technique measures changes in blood flow to infer which areas of the brain are most active.</p>
<p>To allow the musicians to play while inside the MRI scanner, the team used a custom-designed, non-magnetic fiber optic keyboard. This 25-key instrument was positioned on the participants’ laps. This setup allowed the musicians to play with their right hand while listening to audio through noise-canceling headphones.</p>
<p>The experimental procedure involved playing along with a backing track of the jazz standard “Days of Wine and Roses.” The backing track provided the bass and drums to create a realistic musical context. The participants performed under four specific conditions. First, they played the melody of the song from memory. Second, they played an alternate melody from a score sheet they had briefly studied.</p>
<p>The third and fourth conditions introduced improvisation. In the third task, musicians improvised variations based on the melody. In the fourth and final task, they improvised freely based solely on the song’s chord progression. This design created a gradient of creative freedom, ranging from strict memorization to unconstrained expression. Each condition lasted for 45 seconds and was repeated multiple times.</p>
<p>The researchers analyzed the musical output using digital tools to assess complexity. They measured the number of notes played and calculated the “entropy” of the melodies. In this context, entropy refers to the unpredictability of the musical choices. Higher entropy indicates a performance that is less repetitive and harder to predict.</p>
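<p>The article does not specify the researchers' exact entropy formula, but the general idea can be illustrated with Shannon entropy over note choices. The note sequences below are hypothetical, not from the study; a sketch in Python:</p>

```python
import math
from collections import Counter

def shannon_entropy(notes):
    """Shannon entropy (in bits) of a sequence of note events.

    Treats each note name as a symbol; higher values mean the
    distribution of choices is less predictable.
    """
    counts = Counter(notes)
    total = len(notes)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Hypothetical examples: a repetitive line vs. a more varied one
repetitive = ["C4", "C4", "C4", "C4", "E4", "E4", "C4", "C4"]
varied = ["C4", "E4", "G4", "Bb4", "D5", "A4", "F4", "E4"]

print(shannon_entropy(repetitive))  # lower: few distinct, skewed choices
print(shannon_entropy(varied))      # higher: many distinct, even choices
```

<p>On this measure, a line that keeps returning to the same notes scores low, while one that distributes its choices across many pitches scores high, matching the article's description of entropy as unpredictability.</p>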
<p>The behavioral results showed the expected relationship between freedom and musical complexity. As the task became less constrained, the musicians played significantly more notes. The condition involving free improvisation on the chord changes resulted in the highest number of notes and the highest level of entropy. The analysis also revealed that during free improvisation, the musicians tended to use smaller intervals between notes. This suggests a dense and rapidly moving musical style.</p>
<p>To analyze the brain imaging data, the researchers employed a method known as Leading Eigenvector Dynamics Analysis. This advanced analytical technique focuses on the phase-locking of blood oxygenation level-dependent signals. It allows scientists to detect recurrent patterns of functional connectivity that may only last for short periods. This is distinct from traditional methods that assume brain connectivity remains constant throughout a task.</p>
<p>The imaging results revealed five distinct brain states, or “substates,” that appeared with varying frequency across the conditions. One of these states was associated with the brain’s reward system. It included the orbitofrontal cortex, a region involved in sensory integration and pleasure. This reward-related state was more active during all playing conditions compared to when the musicians were resting. This finding aligns with the idea that playing music is inherently rewarding, regardless of whether one is improvising or playing from memory.</p>
<p>“A simple takeaway is: Creativity in music is not located in a single ‘creative center’ of the brain,” Vuust told PsyPost. “Instead, it emerges from rapid shifts between multiple brain networks—including those involved in movement, hearing, reward, attention, and self-reflection, depending on the improvisational task: whether you are trying to improvise on the melody or the chord changes.”</p>
<p>A distinct pattern emerged when the researchers compared the improvisation tasks to the memory tasks. Both the melodic and free improvisation conditions significantly increased the probability of engaging a brain state dominated by auditory and sensorimotor networks, as well as the posterior salience network. These regions are critical for processing sound, coordinating complex movements, and integrating sensory information.</p>
<p>The increased activity in auditory and sensorimotor areas suggests that improvisation places a heavy demand on the brain’s ability to predict and execute sound. Jazz musicians often report “hearing” lines in their head immediately before playing them. The data supports the notion that improvisation is a highly embodied activity. It relies on a tight coupling between the auditory cortex and the motor system to navigate the musical landscape in real time.</p>
<p>Perhaps the most distinct finding appeared in the condition with the highest level of creative freedom. When musicians improvised freely on the chords, the researchers observed a decrease in the occurrence of a brain state involving the default mode network and the executive control network. The default mode network is typically active during introspection, mind-wandering, and self-referential thought. The executive control network is usually involved in planning and goal-directed behavior.</p>
<p>The reduced presence of these networks during free improvisation implies a shift in cognitive strategy. To generate novel ideas rapidly without getting stuck in evaluation or planning, the brain may need to suppress these introspective systems. This aligns with the concept of “flow,” where an individual becomes fully immersed in an activity and self-consciousness recedes. The musicians appeared to rely less on internal planning and more on external sensory feedback.</p>
<p>“Another key message is that greater freedom in improvisation changes how the brain is organized in the moment,” Vuust said. “When musicians improvise more freely, their brains rely more on auditory–motor and salience systems (listening, acting, reacting), and less on heavily controlled, evaluative networks. In everyday terms: creativity often involves letting go of over-analysis while staying highly engaged and responsive.”</p>
<p>The study indicates that creativity involves a flexible reconfiguration of neural resources. Moderate improvisation may require a balance of structure and freedom. However, highly unconstrained improvisation appears to demand a surrender of executive control in favor of sensory-motor processes.</p>
<p>“The effects are not about small local activations but about system-level reconfigurations—which networks are more or less likely to appear over time,” Vuust explained. “Practically, this means the significance lies in patterns and probabilities, not single brain spots lighting up.”</p>
<p>“For musicians and educators, the implication is that training creativity may involve balancing structure and freedom, rather than maximizing one or the other. For neuroscience, it shows that dynamic brain-state analysis can reveal meaningful differences even within subtle variations of the same task.”</p>
<p>As with all research, there are limitations to consider. The sample consisted exclusively of male jazz pianists. This homogeneity limits the ability to generalize the results to female musicians or those from other musical traditions. The creative demands of jazz are specific and may differ from those in other arts, such as painting or writing.</p>
<p>Another consideration is the nature of the “novelty” observed. While the free improvisation condition produced the most unpredictable music, the study did not assess the aesthetic quality of these performances. Higher entropy does not necessarily equate to better music. Previous research suggests that listeners often prefer a balance of complexity and familiarity. The most unconstrained performances might be the most cognitively demanding but not necessarily the most pleasing to an audience.</p>
<p>“Another possible misinterpretation is to assume that more novelty automatically equals more enjoyment or value,” Vuust noted. “The study notes that pleasure and complexity often follow an inverted-U relationship—too much unpredictability can reduce perceived enjoyment.”</p>
<p>Future research could address these gaps by recruiting a more diverse group of participants. Comparing jazz improvisation with other forms of real-time creativity could reveal which brain dynamics are universal and which are specific to music. The authors also suggest that future studies could investigate how these brain states relate to subjective feelings of inspiration or enjoyment. Understanding the link between neural dynamics and the quality of the creative product remains a key goal for the field.</p>
<p>The study, “<a href="https://doi.org/10.1111/nyas.70042" target="_blank">Creativity in Music: The Brain Dynamics of Jazz Improvisation</a>,” was authored by Patricia Alves Da Mota, Henrique Miguel Fernandes, Ana Teresa Lourenço Queiroga, Eloise Stark, Jakub Vohryzek, Joana Cabral, Ole Adrian Heggli, Nuno Sousa, Gustavo Deco, Morten Kringelbach, and Peter Vuust.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<p><strong>Forwarded by:<br />
Michael Reeder LCPC<br />
Baltimore, MD</strong></p>
<p><strong>This information is taken from free public RSS feeds published by each organization for the purpose of public distribution. Readers are linked back to the article content on each organization's website. This email is an unaffiliated, unofficial redistribution of this freely provided content from the publishers. </strong></p>
<p> </p>
<p><small><a href='https://blogtrottr.com/unsubscribe/565/DY9DKf'>unsubscribe from this feed</a></small></p>