<table style="border:1px solid #adadad; background-color: #F3F1EC; color: #666666; padding:8px; -webkit-border-radius:4px; border-radius:4px; -moz-border-radius:4px; line-height:16px; margin-bottom:6px;" width="100%">
<tbody>
<tr>
<td><span style="font-family:Helvetica, sans-serif; font-size:20px;font-weight:bold;">PsyPost – Psychology News</span></td>
</tr>
<tr>
<td> </td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/live-music-causes-brain-waves-to-synchronize-more-strongly-with-rhythm-than-recorded-music/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Live music causes brain waves to synchronize more strongly with rhythm than recorded music</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Apr 18th 2026, 06:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p>A recent study published in the journal <em><a href="https://doi.org/10.1093/scan/nsag021" target="_blank" rel="noopener">Social Cognitive and Affective Neuroscience</a></em> provides evidence that listening to live music causes brain waves to synchronize more strongly with musical rhythms compared to listening to a recording. This enhanced brain-music synchronization tends to predict how much pleasure and engagement a person experiences during a performance. The findings offer a biological explanation for why attending a concert can feel so much more moving than playing a track on a phone or computer.</p>
<p>Live music attendance remains widely popular worldwide, even as high-quality audio streaming makes pristine recordings available on demand. This persistence led researchers Arun Asthagiri and Psyche Loui to ask why a live experience feels noticeably different from a recorded one.</p>
<p>“If a recording can faithfully reproduce the acoustic signal, why does the live experience feel so different? A growing body of work shows that audiences physiologically synchronize with each other during live concerts, and that rhythmic entrainment — the tendency of neural oscillations to align with external rhythmic stimuli — underlies the pleasurable urge to move to music,” said Loui, an associate professor and associate dean of research at Northeastern University College of Arts, Media and Design and associate director at the Institute for Cognitive and Brain Health at Northeastern University.</p>
<p>“But we didn’t know whether the mere context of a live performer, independent of differences in the acoustic signal itself, could alter the strength of neural entrainment.” Neural entrainment is the brain’s tendency to align its internal electrical rhythms with external patterns like a musical beat. Asthagiri and Loui set out to determine if this syncing process changes during a live performance independent of differences in the actual sound quality. “We wanted to test that directly, in an ecologically valid setting — a real concert hall — rather than in a standard EEG laboratory.”</p>
<p>To capture this natural environment, the researchers turned to a local musical institution. “We were lucky to partner with New England Conservatory for this project,” Loui said. “The study’s first author, Arun Asthagiri, went there as a violin student and has strong ties with the conservatory. Arun is now a PhD student in the College of Arts, Media, and Design at Northeastern University.”</p>
<p>The scientists recruited 21 participants, all of whom had formal musical training. Participants listened to four different solo violin excerpts composed by Johann Sebastian Bach. Two of the pieces were fast and two were slow. Half of the excerpts were performed live on stage by professional violinist Joshua Brown. The other half were played from high-quality audio recordings of the same violinist using a speaker system placed in the exact same location on stage.</p>
<p>The researchers matched the volume of the live violin and the speaker system to ensure the sound levels were identical. They also asked participants to keep their eyes closed during the performances. This step isolated the perceptual experience of hearing live music from the visual aspects of watching a performer move or seeing an instrument being played.</p>
<p>While the participants listened, the scientists recorded their brain activity using an electroencephalogram. This device, commonly known as an EEG, involves placing a cap with sensors on the scalp to measure electrical signals in the brain. After each piece of music ended, participants filled out a survey rating their experience on factors like pleasure, engagement, spontaneity, and focus.</p>
<p>The data showed that participants consistently rated the live performances higher on a combined scale of pleasure and engagement than the recorded versions. Beyond these subjective ratings, the EEG data revealed differences in how the brain processed the sounds. The scientists focused on a metric called cerebro-acoustic phase-locking.</p>
<p>Phase-locking measures how consistently the cyclic patterns of brain waves line up with the rhythmic pulses in the music. For the fast-paced musical pieces, live performances resulted in significantly stronger phase-locking than the recorded tracks. Specifically, the brain waves synchronized more tightly with the rate at which individual musical notes were played.</p>
<p>In the fast pieces, this brain wave synchronization occurred in the theta frequency band. This specific frequency corresponds to about four to eight cycles per second, which perfectly matched the speed of the individual musical notes.</p>
<p>“The liveness effect on phase-locking was statistically robust and survived correction for multiple comparisons across frequencies,” Loui told PsyPost. “To put it concretely: within participants, the expected phase-locking value for live compared to recorded performance was about 31% higher (model estimate e^0.27 = 1.31).”</p>
<p>“That’s a meaningful difference given how carefully we controlled the sensory environment — loudness, source location, and even visual exposure were matched between conditions. The effect was also specific to rhythmically salient frequencies (the note rate of the fast excerpts), rather than appearing broadly across the spectrum, which strengthens confidence in the interpretation.”</p>
<p>The scientists also found a direct mathematical relationship between the brain data and the survey responses. “The most striking finding was the brain-behavior relationship,” Loui explained. “We tested whether the degree to which someone’s neural phase-locking increased for live over recorded music predicted how much their pleasure and engagement also increased — and it did, significantly (β = 2.85, P<.001). Stronger neural coupling with the music’s rhythm during live performance was directly associated with a more positive subjective experience. This points toward a bidirectional relationship between low-level auditory processing and affect that we find exciting.”</p>
<p>So what is the main takeaway? The findings indicate that “your brain responds measurably differently to live music than to a recording, even when the music itself is identical,” Loui said. “We found that neural oscillations locked more tightly onto the rhythmic structure of the music during live performances — a phenomenon called cerebro-acoustic phase-locking — and that this stronger neural coupling predicted how much pleasure and engagement listeners reported.”</p>
<p>“In other words, the brain and the subjective experience told the same story: something about the live context strengthens the connection between your neural rhythms and the music’s rhythms, and that difference registers in how you feel.”</p>
<p>While the study provides new insights into music processing, there are a few limitations to consider. Because all 21 participants were musically trained, the scientists note that these specific brain responses might not represent those of the general population. People with extensive musical experience might be unusually sensitive to the subtle differences between a live musician and a speaker.</p>
<p>Additionally, the experiment controlled for social factors by having people listen alone with their eyes closed. A typical live concert involves visual stimulation and a crowd of other people. This means the brain effects measured in this isolated setting are likely a baseline rather than a full picture of a normal concert experience.</p>
<p>Another caveat is that the enhanced brain synchronization was only statistically significant for the fast-paced musical excerpts. The slow pieces featured more rhythmic variation and expressive timing, a musical technique known as rubato. This shifting tempo might have made it harder for the brain to lock onto a steady pulse, regardless of whether the music was live or recorded.</p>
<p>Looking forward, the researchers plan to expand on this line of research.</p>
<p>“First, we’re interested in scaling up the social dimension: what happens when multiple listeners are present simultaneously, or when there is explicit performer-audience interaction?” Loui told PsyPost. “Second, we’re interested in the implications for music-based interventions in brain health.”</p>
<p>“Neural entrainment to rhythm is preserved across aging and has been implicated in attention and sensorimotor function. If live music engagement produces stronger neural coupling than recorded music, that has practical relevance for how we design music-based therapeutic environments — for older adults, for people with attentional difficulties, and for neurological populations more broadly.”</p>
<p>“The study was supported by National Science Foundation and National Institutes of Health,” Loui said. “We are thankful for the Sound Health Network which served as a public clearinghouse for this kind of work in the past few years. We hope to find ways to continue our work at the intersection of arts, sciences, and health and creativity.”</p>
<p>The study, “<a href="https://doi.org/10.1093/scan/nsag021" target="_blank" rel="noopener">From Lab to Concert Hall: Effects of Live Performance on Neural-Acoustic Phase-Locking and Engagement</a>,” was authored by Arun Asthagiri and Psyche Loui.</p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/scientists-find-evidence-some-alzheimers-symptoms-may-begin-outside-the-brain/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Scientists find evidence some Alzheimer’s symptoms may begin outside the brain</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Apr 17th 2026, 20:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p>A recent study has found that the physical movement difficulties often associated with Alzheimer’s disease can originate outside the brain. By creating a microscopic model of human nerves and muscles, researchers demonstrated that these motor problems occur independently of cognitive decline, which suggests new targets for medical treatments. The research was published in the journal <em><a href="https://doi.org/10.1002/alz.71281" target="_blank">Alzheimer’s &amp; Dementia</a></em>.</p>
<p>Alzheimer’s disease is usually associated with memory loss and severe cognitive decline. Doctors routinely observe that patients also experience movement issues like a slower walking pace, diminished grip strength, and poor balance well before any mental symptoms appear.</p>
<p>Historically, medical research has treated these physical symptoms as secondary effects stemming directly from brain degeneration. It remained unclear whether the illness was also independently attacking the body’s peripheral nerves, which make up the vast network connecting the spinal cord to the rest of the body.</p>
<p>To investigate the root of these movement problems, researchers looked closely at the neuromuscular junction. This is the exact biological point where a nerve cell sends a chemical signal to a muscle cell, commanding it to contract and create physical movement.</p>
<p>The study was led by University of Central Florida professors James Hickman and Xiufang Guo, with Akhmetzada Kargazhanov serving as the lead author. The academic team collaborated with scientists at Hesperos, a biotechnology company co-founded by Hickman, to build a specialized laboratory model.</p>
<p>The researchers focused their efforts on familial Alzheimer’s disease. This is a rare, hereditary version of the condition that typically appears when a patient is between 40 and 65 years old. It is distinct from the more common sporadic form of the illness, which generally affects older populations and lacks a single genetic cause.</p>
<p>To conduct the experiment without using live human subjects, the researchers utilized human induced pluripotent stem cells. These are adult cells, usually taken from skin or blood, that have been chemically reprogrammed to act like embryonic stem cells. This reprogramming allows scientists to coax the cells into developing into almost any tissue type in the human body.</p>
<p>The scientists used this cellular technology to grow human motor neurons, which are the specialized nerve cells responsible for controlling our voluntary movements. They genetically modified these nerves to carry one of two specific genetic mutations associated with familial Alzheimer’s disease.</p>
<p>The researchers then paired these mutated nerve cells with healthy human muscle cells in a microscopic laboratory device known as a human-on-a-chip. This miniature system was split into two separate chambers, effectively mimicking a functional neuromuscular junction while completely removing the brain and spinal cord from the equation.</p>
<p>By keeping the central nervous system out of the model, the researchers could isolate the exact source of any physical failures. If the movement system stopped working correctly in this isolated environment, it would prove that the disease attacks the peripheral nerves on its own.</p>
<p>Testing drugs on animal models like mice can be problematic because human and animal biology differ in ways that affect how a condition progresses. The microscopic human cell models bypass this biological gap, allowing researchers to gather data that more accurately reflects the human body.</p>
<p>During the main experiment, the team ran electrical currents through the nerve chamber to stimulate the cells, prompting them to send movement signals to the connected muscle chamber. They used high-speed cameras and computer software to track how well the muscles responded to these commands.</p>
<p>The researchers measured multiple specific parameters to gauge the health of the cells. They looked at fidelity, which measures how reliably the muscle actually contracts when the nerve sends a signal. They also tested the fatigue index, recording how long the muscle could maintain a tight contraction under rapid electrical stimulation.</p>
<p>The results indicated that the nerve cells carrying the Alzheimer’s mutations struggled to communicate with the healthy muscle cells. The neurons carrying a genetic mutation known as PSEN1 displayed severe deficiencies across all testing days.</p>
<p>These specific cells failed to reliably trigger muscle contractions and showed a higher rate of muscle fatigue. The biological connections between the nerves and muscles were also less stable over time compared to healthy control cells.</p>
<p>The neurons carrying a different flaw, called the APP mutation, showed moderate deficiencies. While they performed better than the PSEN1 cells, they still experienced a drop in their ability to trigger reliable muscle contractions during the middle of the testing period.</p>
<p>Because the muscle cells used in the experiment were completely healthy, the breakdown in communication could be traced to the diseased motor neurons. The team had demonstrated that peripheral nerve damage happens independently of brain degeneration.</p>
<p>Inside the cells, researchers also looked at microscopic structures called endosomes, which act like tiny transport pods or recycling centers. The scientists noticed that the transport pods in the mutated nerve cells were abnormally enlarged. Because these pods help recycle the chemicals needed to send movement signals, their malfunction offers a biological clue as to why the nerve communication was failing.</p>
<p>The researchers also wanted to see if common Alzheimer’s medications could fix this peripheral nerve dysfunction. They treated the diseased cells with memantine and galantamine, two drugs routinely prescribed to help manage the cognitive symptoms of the illness in its early stages.</p>
<p>Memantine works by blocking a specific chemical receptor to prevent nerve cell damage, while galantamine stops the breakdown of chemical messengers to prolong their effects. Adding these medications to the microscopic model yielded no statistically significant improvement in the function of the nerve and muscle connections.</p>
<p>The medications failed to restore reliable communication between the nerves and muscles. This lack of recovery suggests that treatments designed exclusively to heal the brain do not automatically repair damage in the rest of the body.</p>
<p>Hickman noted the importance of this specific realization. “This is the first time it’s been demonstrated that deficits in the peripheral nervous system can arise directly from these mutations,” he said. “It means drugs that target the brain may not fix problems in the rest of the body.”</p>
<p>While the model provided clear insights, the study does have a few distinct limitations. The microscopic system used in the experiment only contained motor neurons and skeletal muscle cells, making it a very basic representation of human biology.</p>
<p>In a living human body, other cell types, such as protective astrocytes and Schwann cells, interact with nerves and muscles to support their daily function. Introducing these supporting cells into the laboratory model could alter the results, either making the symptoms worse or compensating for the diseased nerves.</p>
<p>Future research will likely expand on this model by testing muscle cells that carry Alzheimer’s mutations, rather than just the nerve cells. Researchers could also use the miniature devices to model sensory neurons, which would help map out how the disease affects the body’s pathways for feeling touch and pain.</p>
<p>Because current medications did not heal the peripheral nerves in this study, the miniature human-on-a-chip could become a testing ground for new pharmaceutical compounds. Developing combination therapies that target both cognitive decline and physical deterioration could eventually improve the overall quality of life for patients.</p>
<p>The study, “<a href="https://doi.org/10.1002/alz.71281" target="_blank">Evaluating the peripheral nervous system pathology of Alzheimer’s disease utilizing a functional human NMJ microphysiological system</a>,” was authored by Akhmetzada Kargazhanov, Romy Aiken, Kenneth Hawkins, Rafael Lopez, Ahmad Nawaz, Gaurav Srivastava, Chase Miller, Will Bogen, Christopher Long, David Morgan, Xiufang Guo, James Hickman.</p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/the-narcissistic-mirror-how-extreme-personalities-view-their-friends-humor/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">The narcissistic mirror: how extreme personalities view their friends’ humor</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Apr 17th 2026, 18:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p>Social relationships form the foundational infrastructure of human well-being and psychological health. Strong connections help protect against daily stress and build lifelong emotional resilience. Conversely, social isolation is tied to a host of physical and mental health vulnerabilities. While familial interactions and romantic bonds see plenty of academic attention, platonic friendships are just as vital to a long and healthy life.</p>
<p>Friendships offer unique psychological benefits compared to other types of social ties. Relationships with relatives often carry rigid biological or cultural obligations, and romantic partnerships are typically weighted with intense emotional expectations. A platonic friendship is a lower-pressure environment where people can engage in voluntary self-disclosure. These relationships provide a safe arena to practice social skills and find comfortable, judgment-free companionship.</p>
<p>Because establishing a friendship is an entirely voluntary process, a major ingredient in building that relationship is perceived similarity. People naturally gravitate toward strangers who share their personal values, core beliefs, and behavioral quirks. Once a bond is established, friends tend to naturally evaluate each other as being much like themselves. However, specialized personality traits can act as a distorting lens, fundamentally altering how people actively perceive similarities in the individuals around them.</p>
<p>People who report high levels of narcissistic traits view the shared habits in their friendships differently than most people do, often perceiving wide gaps between their own behaviors and those of their peers. Depending on the specific type of narcissism involved, these individuals may artificially elevate their own traits while bringing their friends down, or they may idealize their companions at their own expense. The research describing these relationship dynamics was published in <i><a href="https://doi.org/10.1016/j.paid.2025.113211">Personality and Individual Differences</a></i>.</p>
<p>Narcissism introduces a unique tension into the normal rules of everyday social bonding. Typical narcissistic traits include an inflated sense of grandiosity, a constant preoccupation with validation, and a strong assumption of superiority over others. A person displaying these traits genuinely believes they are uniquely special compared to the general population. This deeply held belief creates an internal psychological conflict when it comes to forming close, trusting friendships.</p>
<p>On one side of the conflict, a narcissistic person might want to surround themselves strictly with exceptional, high-status friends. By projecting their own supposed greatness onto their peers, they attempt to validate their own superior social standing. On the other side of the issue, these individuals continually strive to maintain an isolated sense of personal uniqueness. Viewing their friends as total equals might threaten their internal sense of social dominance, leading them to ultimately devalue their chosen companions.</p>
<p>Tobias Altmann at the University of Duisburg-Essen and Destaney Sauls at Michigan Technological University wanted to test how these conflicting internal motivations play out in everyday life. They needed a metric that was highly relevant to human bonding. They chose to focus on how people use humor in social situations as a testing ground for their questions.</p>
<p>Humor is a powerful relational tool that people rely upon to bond with strangers, reduce tension in awkward situations, and establish their particular social identity. Psychologists generally categorize human humor into four distinct approaches based on their intent and target. Two of these are considered adaptive styles, meaning they generally promote mental well-being and social harmony. Two are considered maladaptive styles, meaning they are associated with neuroticism or interpersonal friction.</p>
<p>Affiliative humor is an adaptive style focused on enhancing relationships through inclusive, positive jokes. A person using an affiliative style might recount a funny, shared memory to make everyone at a party feel welcomed. Self-enhancing humor is another adaptive style, characterized by maintaining a humorous, optimistic outlook in the face of inevitable life challenges.</p>
<p>On the maladaptive side, aggressive humor is used to mock, belittle, or disparage other individuals in an attempt to assert social dominance. A person might use sarcastic teasing to bring another person down and artificially elevate their own social standing. Self-defeating humor involves making oneself the target of a joke to gain fleeting approval, often by aggressively highlighting personal flaws.</p>
<p>Altmann and Sauls assessed how individuals evaluate their own regular use of these four styles compared to the habits of their closest friends. The researchers also differentiated between two completely distinct expressions of narcissism that guide outward behavior.</p>
<p>Grandiose narcissism manifests as overt entitlement, outward assertiveness, and a general lack of empathy for others. People with high levels of grandiose narcissism actively project extreme confidence and try to convince the world of their special status. Vulnerable narcissism shares the exact same underlying sense of entitlement, but the outward presentation is completely different. This type of personality is accompanied by deep insecurity, hypersensitivity, and a strong tendency to withdraw socially from uncomfortable situations.</p>
<p>People with vulnerable narcissistic traits oscillate wildly between feelings of intense grandiosity and crippling shame. To capture the real-world effects of both traits in action, the researchers organized two independent cross-sectional studies. The first involved 129 participants living in Germany. The second evaluated the same patterns in a distinctly different cultural and age demographic, recruiting 131 participants residing in the United States.</p>
<p>In both geographic locations, participants completed detailed psychological questionnaires measuring their self-reported levels of both grandiose and vulnerable narcissism. They were then asked to think of one specific same-sex best friend. Using a modified humor metric, they rated their own comedic preferences and subsequently rated the perceived humor habits of their chosen friend.</p>
<p>By analyzing both sets of behavioral responses, the researchers could look for hidden mathematical discrepancies. They evaluated whether the participants rated themselves higher or lower than their friends across the four different comedic categories.</p>
<p>Most participants without pronounced narcissistic traits reported that their friends utilized social humor in a manner highly similar to their own. However, as self-reported levels of narcissism increased, this reported alignment completely broke down. High narcissism scores were consistently associated with lower rates of perceived similarity in the friendships.</p>
<p>The exact nature of the interpersonal disconnect depended entirely on the flavor of narcissism the participant exhibited. Grandiose narcissism was closely linked to an observable phenomenon the researchers described as self-enhancement. These individuals consistently placed themselves on a behavioral pedestal while looking down at their peers.</p>
<p>Specifically, individuals testing high for grandiose narcissism rated themselves as using more adaptive, positive styles of humor than their companions. At the exact same time, they reported that their friends relied more heavily on maladaptive, aggressive, or self-defeating jokes. They effectively elevated their own social standing while diminishing the positive qualities of their designated best friend.</p>
<p>Participants exhibiting high levels of vulnerable narcissism displayed the exact opposite psychological pattern. They routinely evaluated their friends far more favorably than they evaluated themselves. They assumed their close peers utilized highly adaptive bonding strategies, while they viewed their own humor as largely self-defeating and maladaptive.</p>
<p>These distinct reporting patterns track closely with the underlying, poorly met psychological needs of each narcissism type. Grandiose narcissists operate by minimizing any positive traits in others that might threaten their own perceived dominance. By describing their friends’ jokes as hostile or mean-spirited, they insulate their own egos.</p>
<p>Vulnerable narcissists struggle with deep-seated personal insecurity and may overcompensate by heavily idealizing the people around them. They desperately need constant reassurance from the outside world. This inner turmoil might drive them to artificially elevate the value of their companions, even if it requires distorting their own self-assessment in the process.</p>
<p>While these broad trends appeared in the data sets, the exact mathematical results showed some inconsistencies when comparing the two national samples. The older German sample demonstrated the self-enhancement and friend-enhancement effects across a wide variety of humor styles. The American sample showed similar directional trends but required different calculations to reveal some of the behavioral nuances.</p>
<p>The researchers noted several boundaries to the current analysis that shape how the data should be evaluated. All the information came from cross-sectional, self-reported questionnaires. Humor is an inherently subjective topic, meaning two different people can hear the exact same sarcastic comment and interpret it wildly differently based on their own personal background.</p>
<p>Evaluating another person’s general sense of humor requires guessing at their internal thought processes, a judgment that even lifelong friends might struggle to make accurately on a standardized test. Future research could investigate this question by having participants generate actual jokes or funny captions, and then asking their friends to rate that direct creative output in real time.</p>
<p>The two sample populations also differed in ways that could alter basic friendship dynamics. The American group was composed largely of young college students living on a campus. Young adults frequently form social bonds simply by geographic proximity, such as sitting next to each other in a crowded classroom.</p>
<p>In contrast, the German sample consisted of older adults with a wider range of educational backgrounds and life experiences. Older individuals might have more freedom to select friends based entirely on shared personality traits and mutual humor preferences rather than simple convenience.</p>
<p>The study authors also required participants to evaluate a same-sex friend to maintain baseline consistency in the survey. Cross-sex friendships often navigate very different social expectations, so the findings may not map perfectly onto mixed-gender relationships. The specific questionnaires used to measure grandiose narcissism also showed low internal consistency, an issue common with brief psychological surveys but one that limits statistical confidence in the measure.</p>
<p>Even with these limitations, the assessments point toward a distinct and fascinating behavioral trend. Extreme personality traits alter the way people experience basic companionship and shared laughter. For those navigating narcissistic tendencies, a friendship serves as a psychological mirror, one that is either polished to reflect their superiority or angled to amplify their deep insecurities.</p>
<p>The study, “<a href="https://doi.org/10.1016/j.paid.2025.113211">Friendship through a narcissistic lens: The role of narcissism in perceived humor similarity among friends in Germany and the US</a>,” was authored by Tobias Altmann and Destaney Sauls.</p>
</p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/adolescent-cognitive-scores-and-schooling-linked-to-adult-mental-health-outcomes/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Higher intelligence in adolescence linked to lower mental illness risk in adulthood</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Apr 17th 2026, 16:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>Higher cognitive abilities and greater educational achievement in adolescence strongly predict a lower risk of developing mental health conditions later in life. A nationwide study of Norwegian men demonstrated that individuals with lower test scores and less schooling experienced notably higher rates of psychological distress in adulthood. The research was published in the journal <a href="https://doi.org/10.1177/09567976251347221"><i>Psychological Science</i></a>.</p>
<p>Previous sociological research has established a strong connection between educational attainment and overall mental well-being. People with advanced degrees generally experience fewer mood and anxiety disorders compared to those who leave school early. At the same time, academic success is intimately tied to a person’s general cognitive abilities, which encompass skills like problem solving, numerical reasoning, and language comprehension.</p>
<p>This overlap leaves an unresolved question regarding the root cause of these health disparities. Researchers wanted to know whether the elevated risk of mental illness associated with leaving school early is actually driven by underlying cognitive traits. To answer this, investigators needed to examine both cognitive test scores and educational attainment simultaneously in a large representative sample.</p>
<p>Historically, most psychological surveys have relied on limited subsets of the population. People grappling with severe psychiatric issues or possessing lower cognitive scores frequently drop out of long-term observational studies. This healthy-volunteer bias creates a skewed understanding of health dynamics within the general public.</p>
<p>To overcome these selection problems, a team of researchers utilized comprehensive national administrative databases. Lead author Magnus Nordmo, a researcher at the University of South-Eastern Norway, collaborated with investigators from the Norwegian Institute of Public Health, Duke University, and the University of Oslo. They aimed to untangle the distinct impacts of teenage schooling and test performance on adult well-being.</p>
<p>The research team examined records from more than 270,000 Norwegian men born between 1970 and 1979. During that era, military service was mandatory in Norway, requiring young men to complete a standardized assessment of their mental abilities around the age of eighteen. The evaluations measured fluid intelligence alongside accumulated knowledge, utilizing tasks that involved word comprehension, pattern recognition, and mathematical reasoning.</p>
<p>After standardizing the military test scores, the researchers tracked these individuals into middle age using national health registries. They reviewed primary care records from when the men were between thirty-six and forty years old to identify formal diagnoses of mental health conditions. These diagnoses included depressive disorders, anxiety, post-traumatic stress disorder, sleep disturbances, substance abuse, and schizophrenia.</p>
<p>The team also accessed educational registry data to determine the highest level of schooling each participant had completed by age thirty-five. Educational categories ranged from compulsory primary schooling to advanced master-level and doctoral degrees. By merging these extensive datasets, the investigators mapped out how adolescent assessments and adult educational levels corresponded with later medical visits for psychological care.</p>
<p>The results demonstrated a steady, upward trend in mental well-being as adolescent cognitive scores increased. Men who scored in the lowest category on the military assessments were roughly three times more likely to receive a mental health diagnosis compared to those in the highest-scoring bracket. Almost thirty percent of the lowest-scoring group experienced a diagnosable condition in adulthood.</p>
<p>Only ten percent of the highest-scoring participants sought primary care for mental health issues. When educational achievement was factored into the analysis, a distinct vulnerability pattern emerged. Earning a university degree offered a protective effect regardless of early baseline test scores.</p>
<p>Men who recorded low cognitive scores and completed only compulsory education experienced the absolute highest rates of psychiatric conditions. Nearly forty percent of this specific demographic group received a diagnosis in middle age. The gap between this highly vulnerable group and the most protected group was vast, reaching almost thirty percentage points.</p>
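<p>As a quick arithmetic check on the figures above, the reported rates (rounded here for illustration) are consistent with the "roughly three times" relative risk and the "almost thirty percentage point" gap:</p>

```python
# Approximate diagnosis rates as reported in the study (rounded; illustrative only)
low_score_rate = 0.30          # lowest cognitive-score bracket
high_score_rate = 0.10         # highest-scoring bracket
low_score_low_edu_rate = 0.40  # low scores plus compulsory schooling only

relative_risk = low_score_rate / high_score_rate
gap_points = (low_score_low_edu_rate - high_score_rate) * 100

print(f"relative risk: {relative_risk:.1f}x")        # ~3x, as reported
print(f"gap: {gap_points:.0f} percentage points")    # ~30 points, as reported
```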
<p>The researchers also evaluated a popular concept known as hyperbrain theory, which suggests that extraordinarily high intelligence might predispose individuals to psychological turmoil. Earlier surveys of high-IQ society members hinted that extreme cognitive capability comes with a sensitive nervous system that raises the risk of anxiety or depression. The Norwegian population data did not support this hypothesis.</p>
<p>Instead of showing increased vulnerability, men with the absolute highest test scores exhibited the lowest rates of clinical diagnoses for almost every condition studied. The protective effect of high test scores held strong even at the extreme upper end of the score distribution.</p>
<p>There was one notable exception to this general pattern across the diagnostic categories. The rates of affective psychosis, a medical category that includes bipolar disorder, did not align tightly with cognitive test performance. The relationship between adolescent test scores and this specific psychiatric condition was much less pronounced than the dramatic gradients seen for depression or substance abuse.</p>
<p>The investigators wanted to ensure their findings were not merely a byproduct of poverty or challenging home environments, which can depress test scores and harm health. To isolate these variables, the team performed a comparative analysis using data from more than 80,000 brothers. Brothers typically share parents, an adolescent household, and general demographic backgrounds.</p>
<p>When looking exclusively at differences between brothers raised under the same roof, the connection between lower cognitive scores and elevated psychiatric risk persisted. However, the associations for certain conditions, such as post-traumatic stress disorder and personality disorders, were not statistically significant in the sibling model. For most other conditions, the within-family analysis suggested that mental health disparities genuinely relate to specific individual cognitive trajectories, not just parental income.</p>
<p>The investigators proposed several possible reasons for this persistent lifelong link. People with lower cognitive test scores frequently end up with lower-paying jobs and fewer educational credentials. This occupational path can lead to difficult working conditions, financial instability, and residence in under-resourced neighborhoods.</p>
<p>These environmental stressors can accumulate over decades, heavily taxing an individual’s psychological resilience. Navigating a modern society that heavily rewards academic and professional achievement might also cause persistent friction for those who struggle in traditional learning environments. Continual underachievement can generate chronic stress, which frequently precipitates severe depressive and anxiety states.</p>
<p>The authors noted that men possessing lower cognitive scores who also leave school early might represent an under-recognized demographic requiring preventative psychological care. Developing specialized support systems for adolescents struggling in academic settings could help mitigate their elevated risk for future psychiatric emergencies. Better occupational counseling and targeted therapy adaptations could improve outcomes for this group.</p>
<p>While the study utilized massive national registries, the research team acknowledged several limitations in their approach. The analysis relied exclusively on primary care diagnoses recorded by general practitioners. This means that men who experienced deep psychological distress but never sought formal medical treatment were completely excluded from the illness counts.</p>
<p>If individuals with higher cognitive abilities happen to handle emotional stress without visiting a doctor, that behavioral avoidance could influence the final health statistics. Additionally, the investigation only included Norwegian men born in a specific decade. Because Norway features a robust universal healthcare system and strong social safety nets, the observed patterns might not translate directly to nations with different economic realities.</p>
<p>Finally, the researchers cautioned against assuming a strictly one-way causal relationship between adolescent test scores and later mental wellness. Although the psychiatric evaluations occurred nearly two decades after the cognitive tests, some participants may have experienced undocumented psychological distress during early childhood. Early mental distress could disrupt fundamental academic learning, subsequently causing the lower test scores in late adolescence.</p>
<p>The study, “<a href="https://doi.org/10.1177/09567976251347221">Cognitive Abilities and Educational Attainment as Antecedents of Mental Disorders: A Total Population Study of Males</a>,” was authored by Magnus Nordmo, Hans Fredrik Sunde, Thomas H. Kleppestø, Morten Nordmo, Avshalom Caspi, Terrie E. Moffitt, and Fartein Ask Torvik.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/maturing-brain-pathways-explain-the-sudden-leap-in-childrens-language-skills/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Maturing brain pathways explain the sudden leap in children’s language skills</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Apr 17th 2026, 14:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>Between the ages of three and four, children undergo a major leap in their ability to use grammar, and recent research provides evidence that this behavioral milestone corresponds to the maturation of specific neural pathways. A new study published in <em><a href="https://doi.org/10.1016/j.dcn.2026.101715" target="_blank" rel="noopener">Developmental Cognitive Neuroscience</a></em> suggests that the white matter connections along the upper routes of the brain mature during this time to facilitate the learning of grammar rules. This structural shift helps explain why children’s language abilities expand so rapidly just before they start kindergarten.</p>
<p>Learning a native language requires young children to master a massive vocabulary and figure out the complex rules for combining words. While scientists know a great deal about how the adult brain processes these rules, it remains unclear exactly how the developing brain supports early language acquisition.</p>
<p>In adults, rule-based linguistic processes rely on specific white matter pathways. White matter consists of bundles of nerve fibers that act like communication cables, connecting different brain regions and allowing them to share information rapidly.</p>
<p>These particular language pathways mature relatively late in child development. Scientists designed this study to figure out if these late-maturing pathways already help young children learn grammar rules, or if toddlers rely on completely different, earlier-maturing brain connections to communicate.</p>
<p>“The preschool period (between 3 and 5 years of age) is marked by major leaps in language development, with grammar especially taking off during this phase. While we know quite a lot about the mature language network in the adult brain – specifically that the ‘dorsal route’ seems to be a crucial pathway for processing grammar – we know much less about how the developing brain supports these skills,” said study author Cheslie C. Klein of the Max Planck Institute for Human Cognitive and Brain Sciences.</p>
<p>“This is largely because acquiring MRI data with young children is so challenging. In this study, we were particularly interested in the brain’s ‘wiring’ – the white matter pathways that connect the frontal and temporal regions – which ensures these areas can efficiently work together to accomplish complex cognitive tasks like language. This question was particularly interesting because the dorsal white matter connection that supports grammar in adults matures relatively late compared to other connections within the language network.”</p>
<p>The researchers evaluated a sample of 120 typically developing, monolingual German-speaking children. The group included 47 three-year-olds and 73 four- to five-year-olds. The scientists used magnetic resonance imaging, commonly known as MRI, to safely scan the children’s brains and observe the structural development of their white matter pathways.</p>
<p>During the imaging process, the researchers measured how water molecules moved along the nerve fibers. As children grow, their brain’s communication cables become better insulated and organized, which changes how water diffuses along them. By tracking this movement, scientists can estimate the physical maturity of specific neural highways.</p>
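<p>Diffusion measurements of this kind are commonly summarized with a metric called fractional anisotropy (FA), which quantifies how directionally constrained water movement is along a fiber bundle. The article does not name the exact metric used, so treat the following as a generic sketch of the standard FA formula, not the study's method:</p>

```python
import math

def fractional_anisotropy(l1, l2, l3):
    """FA from the three eigenvalues of the diffusion tensor.

    FA = sqrt(3/2) * ||lambda - mean|| / ||lambda||, ranging from 0
    (water diffuses equally in all directions) to 1 (diffusion is
    constrained to a single axis, as along a mature fiber bundle).
    """
    m = (l1 + l2 + l3) / 3
    num = math.sqrt((l1 - m) ** 2 + (l2 - m) ** 2 + (l3 - m) ** 2)
    den = math.sqrt(l1 ** 2 + l2 ** 2 + l3 ** 2)
    return math.sqrt(1.5) * num / den if den else 0.0

# Diffusion strongly aligned with a fiber bundle -> FA near 1
print(fractional_anisotropy(1.7e-3, 0.3e-3, 0.3e-3))
# Nearly isotropic diffusion -> FA near 0
print(fractional_anisotropy(1.0e-3, 0.95e-3, 0.9e-3))
```

<p>Higher FA along a pathway is one common way "maturity" of white matter is operationalized in developmental imaging.</p>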
<p>Before the brain scans, the children completed a behavioral language test designed to assess their grammar skills at the word level. The researchers used a picture-based game where children were asked to name the plural form of different nouns. For example, a child would see a picture of one car and hear a spoken description, and then see a picture of three cars and be prompted to say the plural word.</p>
<p>The researchers scored the children’s answers based on whether they applied the correct grammatical rule to form the plural word. Because the German language has multiple different rules for making nouns plural, this task requires children to actively sort and apply grammatical patterns.</p>
<p>Young children often make mistakes when learning these rules, such as adding the wrong ending to a word. These errors actually provide evidence that a child is actively attempting to apply a newly learned grammatical rule rather than simply repeating memorized words.</p>
<p>For the four- to five-year-olds, the test also included made-up words. This allowed researchers to see if the older children could apply grammar rules to completely new words they had never heard before. The scientists then mathematically compared these behavioral test scores to the maturation of specific fiber pathways in the children’s brains.</p>
<p>Specifically, the researchers focused on the dorsal routes, which are neural pathways located in the upper part of the brain. They looked at one dorsal pathway extending to Broca’s area, a brain region that handles grammar rules in adults.</p>
<p>They also examined a second dorsal pathway that connects auditory regions to the premotor cortex. The premotor cortex is an area that helps translate sounds into physical mouth movements for speech.</p>
<p>Finally, they evaluated a lower brain route called the ventral pathway, which processes word meanings and memory retrieval. To ensure their findings were specific to language development, the scientists also tracked a completely unrelated neural pathway that controls general body movement. This served as a baseline control measure for the experiment.</p>
<p>The data revealed developmental differences between the three-year-olds and the four- to five-year-olds. In the older group, higher scores on the plural word test were directly associated with the structural maturity of both dorsal pathways. This indicates that four- and five-year-olds use the upper brain route to process sound-to-motor speech movements as well as complex grammar rules.</p>
<p>In contrast, the researchers found no relationship between grammar abilities and these specific white matter pathways in the three-year-old children. The findings suggest that a major qualitative developmental milestone occurs between ages three and four. At this stage, the brain’s upper communication cables become mature enough to actively support a child’s expanding grammar skills.</p>
<p>The researchers also noticed a slight relationship between the lower brain route, the ventral pathway, and grammar scores in the older children. This lower route tends to help with retrieving word meanings from memory. The scientists suspect this pathway helped the older children draw upon real words they already knew to figure out how to pluralize the made-up words in the test.</p>
<p>As expected, the control pathway governing basic body movement showed no connection to the children’s language scores. This confirms that the observed brain changes were specifically related to grammar acquisition, rather than general physical growth.</p>
<p>“The main takeaway is that a qualitative milestone seems to occur between the ages of 3 and 4, when white matter connections via the ‘dorsal route’ (the upper route through the brain) mature to facilitate the acquisition of grammar rules,” Klein told PsyPost. “Caregivers may have noticed how much language abilities improve between ages 3 and 4, and our findings align well with these behavioral milestones.”</p>
<p>The new research builds upon previous findings from the same team of scientists, who earlier mapped the brain’s processing centers, known as gray matter. In their 2023 study, <a href="https://doi.org/10.1093/cercor/bhac430" target="_blank" rel="noopener">the researchers found</a> that three-year-olds tend to rely on a lower brain region called the temporal lobe to process sentences, but by age four, this activity shifts to the frontal lobe, specifically to an area that handles complex grammar rules in adults. Together, these studies provide evidence that both the brain’s processing regions and the neural highways connecting them experience a synchronized developmental leap to support a child’s rapidly expanding language skills.</p>
<p>“Most striking was how nicely the maturation of these white matter pathways aligned with our previous gray matter findings and the behavioral timeline for grammar acquisition reported in the literature during this time frame,” Klein said.</p>
<p>While the study provides detailed insights into early brain development, it comes with certain limitations. Because three-year-old children did not show a connection between grammar skills and the evaluated pathways, it is still unclear which exact brain structures handle grammar before age four. Scanning the brains of very young children is notoriously difficult, which represents a major hurdle for future research.</p>
<p>Additionally, this study only evaluated word-level grammar, specifically how children pluralize nouns. Because children learn nouns and verbs at different rates, it is possible that the brain processes verb rules differently.</p>
<p>“It is important to note that this is basic research,” Klein noted. “The immediate significance lies in furthering our understanding of how the brain changes during typical language development. Since acquiring grammar is a fundamental language skill, understanding the specific structural network that supports it may also advance our future understanding of developmental delays or atypical language development.”</p>
<p>The researchers suggest that future studies should examine sentence-level grammar and explore whether similar brain pathways help children learn to use verbs. Scientists could also look beyond the core language centers to see how other brain networks interact to support early language learning in young children.</p>
<p>“I would like to acknowledge the tremendous effort required to successfully conduct neuroimaging research with such a young group of participants,” Klein added. “This research would not be possible without the support of families interested in advancing our understanding of the developing human brain.”</p>
<p>The study, “<a href="https://doi.org/10.1016/j.dcn.2026.101715" target="_blank" rel="noopener">Grammar acquisition in preschool children is related to white matter maturation of the dorsal language network</a>,” was authored by Cheslie C. Klein, Philipp Berger, Charlotte Grosse Wiesmann, and Angela D. Friederici.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/people-with-better-cardiorespiratory-fitness-tend-to-be-less-anxious-and-more-resilient-in-emotional-situations/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">People with better cardiorespiratory fitness tend to be less anxious and more resilient in emotional situations</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Apr 17th 2026, 12:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>A study in Brazil found that individuals with better cardiorespiratory fitness tended to have lower levels of trait anxiety. They also tended to be more resilient in situations of emotional stress. The paper was published in <a href="https://doi.org/10.1016/j.actpsy.2026.106371"><em>Acta Psychologica</em></a>.</p>
<p>Cardiorespiratory fitness is the ability of the heart, blood vessels, lungs, and muscles to supply and use oxygen efficiently during sustained physical activity. It reflects how well the body can perform activities such as walking, running, cycling, or swimming over time without becoming overly fatigued. A person with better cardiorespiratory fitness can usually exercise longer and recover faster after exertion. This form of fitness is important because it is closely linked to physical health, endurance, and a reduced risk of cardiovascular disease.</p>
<p>One key indicator of cardiorespiratory fitness is maximal oxygen uptake, called VO2max, which estimates the body’s capacity to use oxygen during intense exercise. Resting heart rate is another indicator, because lower resting rates are associated with better cardiovascular efficiency. Heart rate recovery after exercise is also useful, since faster recovery generally suggests better fitness. Endurance performance on activities such as timed walking, running, or cycling tests can also indicate cardiovascular fitness. Additional indicators include blood pressure responses to exercise and how easily a person can sustain moderate or vigorous activity.</p>
<p>Study author Thalles Guilarducci Costa and his colleagues investigated how cardiorespiratory fitness is associated with changes in anxiety and anger in response to emotionally charged visual stimuli. They note that modern lifestyles are associated with exposure to emotional stress from various sources and that stressful life events can decrease the practice of physical activity and, therefore, cardiorespiratory fitness.</p>
<p>These authors hypothesized that participants with higher cardiorespiratory fitness would be more resilient to anger and anxiety changes incited by emotional picture stimuli compared to individuals with lower cardiorespiratory fitness. They also expected higher cardiorespiratory fitness to be associated with lower trait anger and trait anxiety (i.e., proneness to anger and anxiety as a general tendency or permanent characteristic).</p>
<p>Study participants were 40 healthy individuals recruited by the study authors through personal invitations and advertisements on social media. Twenty-three of them were women, and their ages ranged between 18 and 40 years.</p>
<p>Participants visited the laboratory twice, with 24 to 72 hours between the two visits. During the first visit, participants completed weight and height measurements and baseline assessments of trait anger and trait anxiety. They also self-reported their physical activity levels, which the study authors used to mathematically estimate their cardiorespiratory fitness. Based on this, participants were classified into groups with either above-average or below-average cardiorespiratory fitness.</p>
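<p>The article does not specify which non-exercise equation was used to estimate fitness. As one example of the general approach, a widely cited non-exercise model by Jackson and colleagues (1990) estimates VO2max from sex, age, body mass index, and a self-rated activity score; the coefficients below illustrate that method and should not be read as the study's actual formula:</p>

```python
def vo2max_nonexercise(sex_male, age, bmi, pa_r):
    """Non-exercise VO2max estimate in ml/kg/min.

    Coefficients follow the Jackson et al. (1990) BMI-based model,
    shown only to illustrate how fitness can be estimated from
    self-report; the study's actual equation is not specified.
    pa_r: self-rated physical activity score (0-7 scale).
    """
    return (56.363
            + 1.921 * pa_r
            - 0.381 * age
            - 0.754 * bmi
            + 10.987 * (1 if sex_male else 0))

# Example: a moderately active 30-year-old man with BMI 24
print(round(vo2max_nonexercise(True, 30, 24, 5), 1))
```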
<p>Across the two visits, participants viewed sets of pictures from the International Affective Picture System, a validated database of emotionally graded images known to incite specific emotions. Participants viewed a 69-picture set of unpleasant images during one visit, and a 69-picture set of neutral images during the other. The order of the picture sets was randomly assigned, and each viewing session lasted 30 minutes.</p>
<p>During the slideshows, participants reported their emotional responses to each image using a non-verbal rating scale called the Self-Assessment Manikin. They also completed assessments of state anxiety and state anger (i.e., how angry and anxious they currently felt in that exact moment) immediately before and after viewing each picture set. Additionally, the study authors monitored the participants’ heart rates while they viewed the pictures.</p>
<p>Results showed that while the two fitness groups did not differ in their baseline state anger, state anxiety, or heart rates immediately before the viewing sessions, they did differ in their general traits. Specifically, higher estimated maximum oxygen uptake (VO2max) was significantly associated with lower trait anxiety. Contrary to their hypothesis, however, the researchers found that cardiorespiratory fitness was not associated with trait anger.</p>
<p>Furthermore, the stressful stimuli heavily impacted the less fit group. After viewing the unpleasant pictures, participants with below-average cardiorespiratory fitness reported significantly greater spikes in state anger and state anxiety compared to those with above-average fitness. Those with below-average fitness were 775% more likely to see their state anxiety levels jump from “intermediate” to “high” after viewing the stressful images.</p>
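<p>For readers parsing the "775% more likely" figure: an increase of 775% corresponds to a ratio of 8.75 between the two groups (the article does not restate whether this was an odds ratio or a risk ratio):</p>

```python
percent_increase = 775  # "775% more likely", as reported

# An X% increase corresponds to a ratio of 1 + X/100 between groups
ratio = 1 + percent_increase / 100
print(ratio)  # 8.75: the below-average-fitness group was ~8.75x as likely
```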
<p>“Our findings indicate that individuals with higher CRF [cardiorespiratory fitness] tend to exhibit lower trait anxiety and greater resilience when exposed to emotionally stressful stimuli, reinforcing the growing evidence that physical activity plays an important role in emotional health,” the study authors concluded.</p>
<p>The study contributes to the scientific understanding of the psychological correlates of cardiorespiratory fitness. However, it should be noted that the study was conducted on a very small group of participants and that all information regarding cardiorespiratory fitness was estimated based solely on self-reported physical activity rather than direct laboratory testing. Results of studies on larger groups using more objective measures of cardiorespiratory fitness may differ.</p>
<p>The paper, “<a href="https://doi.org/10.1016/j.actpsy.2026.106371">Cardiorespiratory fitness is associated with lower anger and anxiety and higher emotional resilience,</a>” was authored by Thalles Guilarducci Costa, Lucas Carrara do Amaral, Naiane Silva Morais, Wellington Fernando da Silva, Douglas Assis Teles Santos, Rodrigo Luiz Vancini, Carlos Alexandre Vieira, Mario Hebling Campos, Marilia Santos Andrade, Beat Knechtle, Katja Weiss, Ricardo Borges Viana, and Claudio Andre Barbosa de Lira.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<p><strong>Forwarded by:<br />
Michael Reeder LCPC<br />
Baltimore, MD</strong></p>
<p><strong>This information is taken from free public RSS feeds published by each organization for the purpose of public distribution. Readers are linked back to the article content on each organization's website. This email is an unaffiliated unofficial redistribution of this freely provided content from the publishers. </strong></p>
<p> </p>
<p><small><a href='https://blogtrottr.com/unsubscribe/DY9DKf?signature=018b8cbbc88d9e8cfaad18cbd390691eacade4e6f93e24e02ae01614222f75b8'>unsubscribe from this feed</a></small></p>