<table style="border:1px solid #adadad; background-color: #F3F1EC; color: #666666; padding:8px; -webkit-border-radius:4px; border-radius:4px; -moz-border-radius:4px; line-height:16px; margin-bottom:6px;" width="100%">
<tbody>
<tr>
<td><span style="font-family:Helvetica, sans-serif; font-size:20px;font-weight:bold;">PsyPost – Psychology News</span></td>
</tr>
<tr>
<td> </td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/the-psychological-reason-we-judge-groups-much-more-harshly-than-individuals/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">The psychological reason we judge groups much more harshly than individuals</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Mar 18th 2026, 10:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p>New research published in <a href="https://doi.org/10.1037/pspa0000479"><em>Journal of Personality and Social Psychology</em></a> finds that people see themselves as moral, individuals as decent, and groups as falling short.</p>
<p>For decades, psychologists have documented the “better-than-average effect,” the tendency for people to believe they possess more positive qualities than others. This effect is especially strong <a href="https://www.psypost.org/narcissists-psychopaths-and-sadists-often-believe-they-are-morally-superior/">in the moral domain</a>: individuals often believe they are kinder, more fair, and more principled than the typical person. However, most research on moral self-enhancement relies on comparisons between the self and others, leaving an important question unanswered: do people actually <a href="https://www.psypost.org/moral-character-is-shaped-by-self-view-reputation-and-shared-perceptions/">see themselves and others</a> as morally good or bad in an absolute sense?</p>
<p>André Vaz and colleagues conducted a series of five studies using different participant samples and experimental designs. Across the studies, participants were asked to estimate how frequently certain everyday behaviors occur, including both moral actions (for example, helping someone in need) and immoral actions (such as littering or keeping extra change by mistake).</p>
<p>Importantly, participants were not only asked about the behaviors of specific targets, such as themselves or other people, but were also asked to indicate the “moral threshold.” This threshold represented the point at which the frequency of a behavior would be considered morally acceptable rather than morally inadequate.</p>
<p>For instance, participants might indicate what percentage of the time someone would need to recycle or help others in order to be considered a morally good person. By comparing people’s estimates of behavior with these thresholds, the researchers could determine whether a person or group was perceived as morally above or below the line of moral adequacy.</p>
<p>The first study introduced this moral-threshold measure. Undergraduate participants evaluated several everyday moral and immoral behaviors and estimated how often those behaviors were performed either by themselves or by other participants in the study. A separate group identified the moral threshold for each behavior. Later studies expanded this design. In one large online study with U.S. participants, individuals again evaluated their own behavior and moral thresholds, but they also judged the behavior of several types of social targets.</p>
<p>These included a specific individual from the study, a non-individuated individual identified only by an ID number, the other participants in the study as a group, and people in society in general. Participants also reported two additional standards: how often people ideally should perform behaviors, and how often they ought to perform them, allowing the researchers to examine how moral thresholds differed from other moral expectations.</p>
<p>Subsequent studies further explored why people judge individuals more positively than collectives. In one experiment, participants evaluated a randomly selected individual from the study or the collective of all participants. The design emphasized the difference between these targets visually, showing either a group of figures representing the collective or a highlighted single individual randomly selected from that group. Participants again estimated how frequently the targets would engage in moral and immoral behaviors and later indicated how confident they were in these judgments.</p>
<p>In the final studies, the researchers experimentally tested a psychological explanation for the difference between individuals and collectives. Participants were asked to consider how uncomfortable or negative it would feel to make cynical judgments about either a specific person or a group of people. These studies examined whether anticipating such negative feelings might encourage people to judge individuals more generously.</p>
<p>Across the studies, a clear pattern emerged. Participants consistently believed that their own behavior exceeded the moral threshold. In other words, they reported performing moral behaviors more frequently than was required to be considered morally good, and immoral behaviors less frequently than would be tolerated. This pattern appeared reliably across different sets of behaviors and participant samples, indicating that people see themselves as clearly morally adequate, morally <em>better than necessary</em> to meet the standard they themselves set.</p>
<p>Perceptions of others depended on whether those others were described as individuals or as groups. When participants judged collectives, such as the other participants in the study or people in society more broadly, their estimates tended to fall below the moral threshold. This suggests a form of moral pessimism about groups of people, implying that the average person does not meet the standard required to be morally good.</p>
<p>In contrast, when participants judged specific individuals, even individuals they knew almost nothing about, their estimates generally exceeded the moral threshold. Participants therefore believed that a randomly selected individual from a group was likely to behave more morally than the group itself.</p>
<p>These findings produced a consistent ranking in moral perceptions: the self was judged most moral, individual others were judged moderately moral, and collectives were judged least moral.</p>
<p>Further studies investigated why individuals receive more favorable moral judgments than groups. The researchers found that differences in confidence about these estimates did not explain the effect; people were not simply more certain about their judgments of individuals. Instead, participants expected that it would feel more uncomfortable or unpleasant to be cynical about a specific person than about a group.</p>
<p>Because judging an identifiable individual harshly might evoke stronger negative feelings, people appear to avoid this emotional discomfort by giving individuals the benefit of the doubt. This tendency leads people to view individuals as morally adequate even while believing that groups of people fall short of moral standards.</p>
<p>One limitation is that the studies were conducted primarily in Western, industrialized countries, meaning the findings may not generalize to other cultural contexts. They also relied on a limited set of everyday behaviors, which may not capture the full range of moral actions people consider in real life.</p>
<p>Overall, the findings suggest that people see themselves as especially moral, give individual strangers the benefit of the doubt, yet view groups and society with moral skepticism.</p>
<p>The research “<a href="https://doi.org/10.1037/pspa0000479">Absolute Moral Perceptions of the Self and Others: People Are Bad, a Person Is Good, I Am Great</a>” was authored by André Vaz, André Mata, and Clayton R. Critcher.</p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/scientists-discover-how-gut-inflammation-can-drive-age-associated-memory-loss/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Scientists discover how gut inflammation can drive age-associated memory loss</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Mar 18th 2026, 08:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p>As organisms grow older, changes in the bacteria living inside the digestive system can directly cause the memory loss commonly associated with aging. By reversing these microbial shifts or stimulating the nerves that connect the digestive tract to the brain, researchers found that memory function could be entirely restored in aging mice. These results were recently published in the journal <em><a href="https://doi.org/10.1038/s41586-026-10191-6" target="_blank">Nature</a></em>.</p>
<p>Timothy O. Cox, a graduate researcher at the University of Pennsylvania, led the research team. Christoph A. Thaiss and Maayan Levy, both pathology researchers at Stanford Medicine and the Arc Institute, served as senior authors on the paper. The team wanted to understand the biological mechanisms that dictate how memory changes over a lifespan. They focused on a concept called interoception, which is the way the brain senses the internal state of the body.</p>
<p>Unlike external senses such as sight or hearing, interoception relies on internal pathways such as the vagus nerve. This long bundle of nerve fibers acts as a high-speed communication line between the internal organs and the brain. It transmits continuous updates from the stomach and intestines to a brain region called the hippocampus. The hippocampus is the primary center for forming and storing new memories.</p>
<p>The researchers suspected that the gut microbiome, which consists of hundreds of species of bacteria living in the digestive tract, might influence this internal communication. As animals age, the specific types of bacteria residing in their intestines naturally shift. The team sought to determine if these bacterial changes could alter the signals sent along the vagus nerve.</p>
<p>If the gut microbiome affects nerve signaling, it could explain why cognitive abilities falter over time. “Our study emphasizes that processes in the brain can be modulated through peripheral intervention,” Levy said in a press release. She noted that because the digestive system is easy to reach with oral treatments, altering the chemicals produced by gut bacteria offers an appealing way to control brain function.</p>
<p>To test the relationship between gut bacteria and memory, the researchers housed young mice in the same cages as older mice. Because mice naturally consume feces found in their environment, the young animals quickly acquired the intestinal bacteria of the older animals. After a month of living together, the microbial populations in the young mice closely resembled those of the aged mice. The researchers then tested the cognitive abilities of the young animals.</p>
<p>The team used a novel object recognition test, which evaluates a mouse’s natural curiosity and ability to remember familiar items. They also placed the mice in a specialized maze that requires spatial memory to find an exit. Young mice that possessed an older microbiome performed poorly on both tasks. They showed little curiosity about unfamiliar objects and struggled to navigate the maze, behaving much like the older mice.</p>
<p>To isolate the effect of the bacteria from the social stress of living with older animals, the team performed a transplant experiment. They collected fecal matter from older mice and transferred it into the stomachs of young mice that had been raised in a completely sterile environment. These young, previously germ-free mice also lost their ability to form memories after receiving the older bacteria. Older mice raised in sterile environments without any gut bacteria maintained sharp memories well into old age.</p>
<p>The team then administered broad-spectrum antibiotics to the young mice that had acquired older microbiomes. The antibiotics wiped out the newly introduced bacteria. Following this treatment, the young mice regained their memory and easily completed the maze and object recognition tests. Surprisingly, older mice treated with the same antibiotics also experienced a restoration of their memory functions.</p>
<p>Next, the researchers worked to identify the specific bacteria responsible for the cognitive decline. By cataloging the microbial changes that occur over a mouse’s lifespan, they noticed a steady increase in a bacterial species called <em>Parabacteroides goldsteinii</em>. When the researchers introduced only this specific bacterium into the digestive tracts of young mice, the animals developed memory deficits. Other types of bacteria did not produce this effect.</p>
<p>The team analyzed the chemical byproducts created by <em>Parabacteroides goldsteinii</em> to understand how it affects the body. They found that these bacteria produce large amounts of medium-chain fatty acids, which are specific types of fat molecules. When the researchers fed these isolated fat molecules to young mice, the animals immediately showed signs of memory loss. The molecules were acting as a signal that altered the local environment of the intestines.</p>
<p>In the digestive tract, these fat molecules interact with myeloid cells, a type of white blood cell that patrols the gut for threats. The fatty acids attach to a specific receptor on the outside of the white blood cells. Once attached, they trigger the white blood cells to release inflammatory chemicals. The researchers noted that this inflammatory response was localized to the gut and nearby fat deposits, rather than spreading throughout the entire bloodstream.</p>
<p>This local inflammation directly impacted the nearby vagus nerve. Using advanced imaging techniques, the team monitored the electrical activity of the vagus nerve in real time. They observed that the inflammatory chemicals blunted the nerve’s ability to fire electrical signals to the brain. Because the vagus nerve was sending fewer signals, the hippocampus became less active and failed to properly encode new memories.</p>
<p>To prove that this blocked nerve pathway was the root of the problem, the researchers attempted to bypass the inflammation. They gave the older mice capsaicin, the chemical that makes chili peppers spicy, which naturally stimulates the sensory fibers of the vagus nerve. They also tested gut hormones that are known to activate the same nerve pathways. When the vagus nerve was artificially stimulated, the older mice performed just as well on memory tests as the younger animals.</p>
<p>The team also used genetic techniques to remove the fatty acid receptors from the white blood cells of certain mice. Without these receptors, the white blood cells could not detect the bacterial fat molecules and did not trigger an inflammatory response. These genetically modified mice maintained their sharp memories even when their intestines were colonized by the older bacteria. Blocking the inflammation successfully protected the vagus nerve from damage.</p>
<p>While these results offer a new perspective on aging, the experiments were conducted entirely in animal models. The researchers note that it remains unclear if the exact same bacterial species and fatty acids drive memory loss in humans. The exact biological chain of events connecting chronic gut inflammation to decreased nerve excitability also requires further investigation. The anatomical pathways linking the brainstem to the hippocampus are not yet fully mapped.</p>
<p>Future research will explore how these mechanisms translate to the human body and whether targeted therapies can help people experiencing cognitive decline. Scientists are particularly interested in seeing if altering diet or administering specific bacterial treatments could safely reduce gut inflammation in older adults. </p>
<p>“Our hope is that ultimately these findings can be translated into the clinic to combat age-related cognitive decline in people,” Thaiss said in the press release. Additionally, devices that electrically stimulate the vagus nerve are already approved for conditions like epilepsy and might hold promise for protecting memory in the future.</p>
<p>The study, “<a href="https://doi.org/10.1038/s41586-026-10191-6" target="_blank">Intestinal interoceptive dysfunction drives age-associated cognitive decline</a>,” was authored by Timothy O. Cox, Ashwarya S. Devason, Alan de Araujo, Sydney Mason, Madhav Subramanian, Andrea F. M. Salvador, Hélène C. Descamps, Junwon Kim, Yixuan Zhu, Lev Litichevskiy, Sunhee Jung, Won-Suk Song, Adrián Cortés-Martín, Nathan T. Henderson, Kuei-Pin Huang, Thao Nguyen, Wisath Sae-Lee, Iboro C. Umana, Maria Sacta, Ryan J. Rahman, Stephen Wisser, J. Andrew D. Nelson, Ilona Golynker, Alana M. McSween, Eric F. Hohmann, Shaan Patel, Anna L. Bub, Clara Soekler, Niklas Blank, Kevt’her Hoxha, Lavinia Boccia, Andrea C. Wong, Klaas Bahnsen, Jihee Kim, Natalie Biderman, Dina Abbasian, Clarissa Shoffler, Christopher Petucci, Fiona E. McAllister, Amber L. Alhadeff, Marc V. Fuccillo, Colin Hill, Cholsoon Jang, J. Nicholas Betley, Guillaume de Lartigue, Virginia Y.-M. Lee, Maayan Levy & Christoph A. Thaiss.</p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/new-psychology-research-reveals-the-cognitive-cost-of-smartphone-notifications/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">New psychology research reveals the cognitive cost of smartphone notifications</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Mar 18th 2026, 06:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p>A new study published in <em><a href="https://doi.org/10.1016/j.chb.2026.108926" target="_blank">Computers in Human Behavior</a></em> suggests that receiving a smartphone notification disrupts a person’s concentration for about seven seconds. The research provides evidence that the frequency of checking a phone and the volume of notifications received are better predictors of this distraction than total daily screen time. These findings indicate that fragmented digital habits play a significant role in how technology affects human attention.</p>
<p>While previous experiments have shown that notifications impair task performance, these studies often used artificial alerts that did not reflect real-world conditions. Many past studies also failed to track the exact duration of the distraction or to isolate the underlying psychological mechanisms.</p>
<p>The researchers wanted to separate the different reasons a notification might capture attention. They aimed to determine if distraction is caused simply by the visual suddenness of a pop-up, a concept known as perceptual salience. They also wanted to see if distraction stems from learned habits, known as conditioning, or from the personal relevance of the message.</p>
<p>“People receive a very large number of smartphone notifications every day (more than 100 per day on average in our sample). While it is well established that notifications can automatically capture attention, much less is known about the cognitive mechanisms underlying this capture and why some individuals may be more vulnerable than others. Our goal was to better understand both the mechanisms involved and the individual differences that may explain this sensitivity,” said study author Hippolyte Fournier, a postdoctoral fellow at the Institute of Psychology at the University of Lausanne.</p>
<p>For their study, the scientists recruited 180 university students with an average age of about 21. The participants were randomly assigned to one of three experimental groups. All participants completed a Stroop task, which is a classic psychological test that measures mental processing speed and focused attention.</p>
<p>During a Stroop task, words representing colors are displayed on a screen, but the font color does not match the word itself. For example, the word “blue” might be printed in red ink. Participants must identify the font color while ignoring the written word, which requires intense mental effort. While the participants completed this task, the researchers presented smartphone-style notifications on the computer screen.</p>
<p>The scientists used a deceptive setup to make the experience feel authentic. For the first group, known as the personal-notification group, researchers used a cover story to convince the 60 participants that their own smartphones were synced to the computer. This setup made the participants believe the pop-ups appearing during the task were their actual, personal messages.</p>
<p>The second group of 60 participants saw clear, realistic social media notifications but knew the messages belonged to someone else. This dummy-notification group allowed the researchers to test the effect of learned habits without the element of personal relevance. The final group of 60 participants saw blurred notifications that popped up and moved like normal alerts but contained no readable information.</p>
<p>This blurred-notification group helped the scientists isolate the distraction caused purely by the visual movement of an unexpected object. Before the experiment, the participants completed questionnaires measuring different types of anxiety, including social anxiety and the fear of missing out. After the task, the researchers collected three weeks of objective screen time data from the participants’ smartphones to track their daily usage patterns.</p>
<p>The researchers found that a single notification slowed down a participant’s cognitive processing for approximately seven seconds. The delay happened across all groups but was most pronounced in the personal-notification group. This pattern suggests that distraction is driven by a combination of the visual pop-up, learned associations with the phone, and the personal meaning of the alert.</p>
<p>Within the personal-notification group, the magnitude of the distraction depended heavily on how relevant the participant felt the notification was. Alerts that triggered a strong emotional response or a high desire to check the content caused a longer delay in reaction time. The researchers also tracked the participants’ pupil dilation using an eye-tracking device during the task.</p>
<p>Pupil dilation is a physiological reaction that typically indicates a state of heightened arousal or deep mental effort. The scientists observed changes in pupil size that mirrored the behavioral delays. This provides evidence that emotionally relevant notifications trigger a measurable physical response in the body.</p>
<p>When analyzing daily smartphone habits, the scientists found that total screen time did not strongly predict how distracted a participant would become. Instead, the number of notifications a person typically received each day and how often they checked their phone were much stronger predictors. Participants who tended to have highly fragmented phone habits experienced the most severe attentional disruptions.</p>
<p>“Our findings suggest that notifications can disrupt cognitive processing for about seven seconds and that this disruption reflects multiple mechanisms, including perceptual salience, learned conditioning through repeated exposure, and their potential social relevance,” Fournier told PsyPost. </p>
<p>“Importantly, beyond total screen time, we found that the number of notifications received and the frequency of smartphone checks were associated with stronger disruption effects. This suggests that fragmented smartphone use, not just overall usage time, may play an important role in how digital technology affects our attention.”</p>
<p>“Although the delays we observed may seem small in isolation, their importance comes from how frequently notifications occur in everyday life. Even short disruptions, when repeated dozens or hundreds of times per day, may meaningfully affect concentration and productivity. The practical significance therefore lies more in their cumulative impact.” </p>
<p>To their surprise, anxiety levels did not show a clear link to the severity of the distraction in the main personal-notification group. The data suggests that when notifications are viewed as positive or personally engaging, general anxiety does not drastically alter the level of distraction.</p>
<p>The researchers note a few limitations to keep in mind when interpreting these findings. Pupil dilation can be affected by physical movements and changes in screen brightness, meaning the physiological data contains some natural variations. Because the notifications in the study tended to be viewed as pleasant, the experiment might not have fully captured how anxiety interacts with negative or threatening digital messages.</p>
<p>The scientists also warn against misinterpreting these findings as a sign that all social media use should be strictly banned. The goal of this research is to encourage more mindful and adaptive technology habits rather than complete avoidance. </p>
<p>“Given the strong public debate about social media use, our findings should not be interpreted as suggesting that social media or notifications should be avoided entirely,” Fournier explained. “Rather, they highlight the importance of better understanding the cognitive mechanisms involved in order to promote more balanced and mindful use, particularly for individuals who may be more vulnerable to attentional disruption. The goal is not prohibition, but informed and adaptive use.”</p>
<p>Future research will explore how notifications become so intensely attention-grabbing over time. The scientists also plan to investigate whether frequent social media use alters a person’s ability to sustain focus on long-term goals. Another planned area of study involves understanding why people engage in repetitive scrolling behaviors and how this relates to daily emotion regulation.</p>
<p>“One challenge in this field is that it is difficult to study the real cognitive impact of notifications (or social media use) in laboratory settings while maintaining experimental control,” Fournier said. “In this study, we used a cover story that allowed us to measure the effects of participants’ real notifications in a controlled environment. This approach opens promising avenues for future research aiming to study digital behavior in more ecologically valid ways.”</p>
<p>The study, “<a href="https://doi.org/10.1016/j.chb.2026.108926" target="_blank">Attention hijacked: How social media notifications disrupt cognitive processing</a>,” was authored by Hippolyte Fournier, Arnaud Fournel, François Osiurak, Olivier Koenig, Flora Pâris, Vivien Gaujoux, and Fabien Ringeval.</p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/using-ai-to-verify-human-advice-could-damage-your-professional-relationships/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Using AI to verify human advice could damage your professional relationships</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Mar 17th 2026, 22:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p>New research published in <em><a href="https://doi.org/10.1016/j.chb.2026.108934" target="_blank">Computers in Human Behavior</a></em> suggests that consulting artificial intelligence (AI) for advice may unintentionally strain relationships with human professionals.</p>
<p>AI tools are rapidly becoming part of everyday decision-making, promising quick answers, personalized guidance, and lower costs. Many people use these tools alongside human professionals to double-check information or get a second opinion.</p>
<p>Previous studies have demonstrated that human advisors sometimes react negatively when clients consult multiple experts. In those situations, advisors may interpret the search for a second opinion as a lack of trust. Yet until recently, little attention had been given to how advisors respond when the second opinion originates from a computer algorithm rather than another person.</p>
<p>Hence, researchers Gerri Spassova (Monash University, Australia) and Mauricio Palmeira (University of South Florida, USA) set out to explore how human advisors react when clients consult AI in addition to seeking professional advice.</p>
<p>To investigate, the pair conducted four experiments, each involving roughly 180 to 300 adult participants. In the first experiment, the participants had real-world advisory experience. In the three subsequent studies, participants were general adults asked to imagine working in advisory roles such as travel planning, finance, and nutrition. All participants read scenarios in which they had already provided professional advice to a client.</p>
<p>For example, in one experiment, financial advisors were told that after receiving their investment recommendation, the client also sought advice from either another human financial advisor or an artificial intelligence system. The advisors then rated how motivated they felt about the situation and whether it affected their willingness to continue working with the client.</p>
<p>Across all four studies, a clear pattern emerged: advisors were noticeably less motivated to work with clients who had also consulted AI. In fact, the negative reaction was stronger than when clients consulted another human advisor.</p>
<p>The researchers suggested that the negative response is rooted in professional identity. Advisors often view AI systems as far less capable than trained professionals. As a result, when clients place an AI tool alongside a human expert as a comparable source of advice, the comparison can feel insulting.</p>
<p>The study also uncovered another surprising effect: advisors tended to judge clients who used AI more negatively. Participants rated those clients as less competent and less warm compared to clients who sought advice from another human expert.</p>
<p>Importantly, the negative reaction persisted even when the AI system was used only for initial background information (rather than a final decision), or as a complementary service (rather than a replacement for the human expert’s advice). In other words, simply checking an AI tool could be enough to change how advisors view their clients.</p>
<p>“Our findings suggest that learning that the client consults AI may, consciously or not, change how the advisor perceives the client and how much effort they are willing to invest in the relationship. Such negative effects, even if subtle, could, in the long run, undermine the advisor’s relationship with the client and potentially result in missed opportunities,” Spassova and Palmeira concluded.</p>
<p>However, the study has several limitations. The research relied heavily on experimental role-playing scenarios rather than real-world advisory relationships, meaning actual reactions may vary in practice. Additionally, it remains unclear whether these negative responses persist, diminish, or disappear entirely within longer‑term advisor-client relationships, particularly when the advisor knows the client well.</p>
<p>The study, “<a href="https://doi.org/10.1016/j.chb.2026.108934" target="_blank">Offended by the Algorithm: The Hidden Interpersonal Costs of Clients Seeking AI Second Opinion</a>,” was authored by Gerri Spassova and Mauricio Palmeira.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/brain-scans-reveal-a-bipolar-like-link-to-childhood-trauma-in-some-depressed-patients/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Brain scans reveal a bipolar-like link to childhood trauma in some depressed patients</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Mar 17th 2026, 20:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>A neuroimaging study in Italy found that patients with bipolar disorder reporting more adverse childhood experiences tended to have worse integrity of brain white matter. This association was present in depressed patients as well, but the effects were less pronounced and structurally different. The research was published in <a href="https://doi.org/10.1016/j.euroneuro.2025.11.011"><em>European Neuropsychopharmacology</em></a>.</p>
<p>Adverse childhood experiences (ACEs) are potentially traumatic events that occur during childhood and can affect a child’s physical, emotional, and psychological development. The concept was popularized by the Adverse Childhood Experiences Study, which examined how early life stress relates to later health outcomes.</p>
<p>ACEs commonly include experiences such as physical abuse, emotional abuse, sexual abuse, neglect, and exposure to domestic violence. They can also involve household dysfunction, such as living with a family member who has substance abuse problems, mental illness, or who has been incarcerated.</p>
<p>These experiences can disrupt a child’s sense of safety and stability and may lead to chronic stress during critical developmental periods. Prolonged exposure to stress in childhood can influence the developing brain and stress-regulation systems in the body. Research has shown that individuals with higher numbers of ACEs have a greater risk of mental health problems such as depression, anxiety, and substance use disorders.</p>
<p>ACEs have also been associated with increased risk of chronic physical health conditions, including cardiovascular disease and diabetes. However, the presence of supportive relationships and protective environments can buffer the negative effects of adverse experiences.</p>
<p>Study author Marco Paolini and his colleagues note that previous studies indicate that adverse childhood experiences might have a detrimental effect on the integrity of brain white matter. However, these effects do not seem to be general, but rather dependent on the mental health diagnosis a person has. They believed that these experiences would have a clear effect in patients with bipolar disorder and a less clear effect in individuals with major depressive disorder.</p>
<p>These researchers conducted a study based on the hypothesis that the effect of adverse childhood experiences on microstructure integrity of brain white matter would be different in individuals suffering from bipolar disorder and in those suffering from major depressive disorder.</p>
<p>Brain white matter integrity refers to the structural quality and organization of the brain’s white matter tracts. The brain white matter consists of bundles of myelinated nerve fibers that connect different brain regions and enable communication between them.</p>
<p>Myelin is the material that insulates nerve fibers, speeding nerve impulse transmission and giving white matter its characteristic pale appearance. Higher white matter integrity generally indicates more efficient neural connectivity and information transmission in the brain, whereas reduced integrity may reflect developmental abnormalities, aging, injury, or neurological and psychiatric disorders.</p>
<p>Study participants were 260 inpatients admitted to the San Raffaele hospital psychiatric ward during an ongoing depressive episode. Of these, 140 were diagnosed with major depressive disorder and 120 with bipolar disorder. Patients’ ages ranged from 21 to 69 years.</p>
<p>Patients underwent magnetic resonance imaging scans of their brain structure. A subsample of 162 patients gave blood samples, allowing researchers to conduct genotyping and calculate Polygenic Risk Scores (PRS)—an individualized estimate of a person’s genetic liability for developing major depressive disorder and bipolar disorder. Participants also completed an assessment of adverse childhood experiences (the 28-item Childhood Trauma Questionnaire), including the adversity of their family environment (the Risky Family Questionnaire).</p>
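Conceptually, a polygenic risk score is a weighted sum across genetic variants: the number of risk alleles a person carries at each variant, multiplied by that variant's effect size from a reference genome-wide association study. A minimal sketch of that arithmetic (all numbers are illustrative, not taken from the paper):

```python
import numpy as np

# Effect sizes (e.g., log-odds weights) for four hypothetical variants,
# as would be estimated by a reference genome-wide association study.
effect_sizes = np.array([0.12, -0.05, 0.30, 0.08])

# This person's genotype: number of risk alleles carried per variant (0, 1, or 2).
allele_counts = np.array([2, 1, 0, 1])

# The polygenic risk score is simply the weighted sum.
prs = float(np.dot(allele_counts, effect_sizes))
print(round(prs, 2))  # 0.27
```

In practice, scores like those used in the study aggregate effects over many thousands of variants and are standardized against a reference population, but the underlying computation is this weighted sum.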
<p>Results showed that patients with bipolar disorder who reported more adverse childhood experiences—specifically physical abuse, emotional abuse, and physical neglect—tended to have widespread, worse white matter integrity. The situation was different in patients with major depressive disorder, where the association was less pronounced and affected different structural metrics of the white matter.</p>
<p>Further analyses revealed that the strength of the association between adverse childhood experiences and the integrity of white matter in the brain depends on the genetic risk for bipolar disorder a person has. Crucially, this genetic moderation was present specifically in patients with major depressive disorder, not those with bipolar disorder. In bipolar patients, childhood trauma negatively impacted white matter regardless of their genetic risk score. </p>
<p>However, in depressed patients, those with a high genetic risk for bipolar disorder showed white matter changes that closely mimicked those of the bipolar patients. Conversely, depressed patients with a low genetic risk for bipolar disorder showed an opposite biological response to trauma. These findings suggest that major depression is a highly heterogeneous diagnosis, with a subset of depressed patients actually possessing a bipolar-like biological response to trauma.</p>
<p>“In the present study we identified a differential effect of childhood maltreatment on WM [white matter] microstructure between patients suffering from major depression or bipolar disorder, with more pronounced detrimental effects in BD [bipolar disorder] compared to MDD [major depressive disorder]: this may point to distinct pathophysiological routes through which childhood maltreatment, a shared environmental risk factor, affects the development of the two disorders,” the study authors concluded. They added that their findings “give credence to the notion of a shared disease biology with bipolar disorder in a portion of MDD patients, and possibly provide future tools to disentangle MDD heterogeneity.”</p>
<p>The study contributes to the scientific understanding of the neural underpinnings of mental health disorders. However, it should be noted that the design of the study does not allow definitive causal inferences to be derived from the results. Additionally, information on adverse childhood experiences was based on recall of childhood, leaving room for recall bias to have affected the results.</p>
<p>The paper, “<a href="https://doi.org/10.1016/j.euroneuro.2025.11.011">Different effect of adverse childhood experiences on white matter microstructure in major depression and bipolar disorder: moderating role of genetic liability,</a>” was authored by Marco Paolini, Laura Raffaelli, Valentina Bettonagli, Cristina Lorenzi, Sara Spadini, Beatrice Bravi, Lidia Fortaner-Uya, Giulia Gulino, Chiara Fabbri, Alessandro Serretti, Raffaella Zanardi, Cristina Colombo, Francesco Benedetti, and Sara Poletti.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/outdoor-athletes-show-superior-color-detection-in-their-peripheral-vision/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Outdoor athletes show superior color detection in their peripheral vision</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Mar 17th 2026, 18:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>A recent study published in the journal <em><a href="https://doi.org/10.1177/03010066251414084" target="_blank">Perception</a></em> provides evidence that people who play outdoor sports have superior color detection in their peripheral vision compared to indoor athletes and non-athletes. This suggests that intense athletic training in large, open environments can physically shape and improve basic visual skills. The findings indicate that the adult brain retains the ability to adapt its low-level sensory functions based on real-world experiences, long after childhood development has ended.</p>
<p>The human eye has a specific biological structure that dictates how well we see certain things. The very center of the visual field is packed with specific light-detecting cells that process bright light and rich colors. As objects fall farther from the center of gaze, a distance scientists call retinal eccentricity, the eyes naturally become less sensitive to color and fine details.</p>
<p>Because of this natural biological limit, people usually move their eyes to bring important objects into the direct center of their focus. In fast-paced sports, players cannot always look directly at every moving teammate, opponent, or ball. They rely heavily on their side vision to monitor their surroundings and anticipate game movements.</p>
<p>Scientists wanted to understand if this constant reliance on side vision actually changes how the eyes and brain process visual information over time. This concept is known as perceptual learning, which refers to a lasting improvement in how the brain perceives sensory information after repeated exposure and practice. Past studies on athletic vision usually measured overall performance, like reaction times or the physical ability to hit moving targets in a limited time window.</p>
<p>The researchers designed this new study to isolate pure visual perception. They specifically wanted to test whether athletes literally see peripheral colors better than non-athletes, rather than simply reacting to them faster. They reasoned that athletes who play on large outdoor fields might develop stronger peripheral color vision to compensate for the natural drop in color sensitivity at the edges of the eye.</p>
<p>“The debate is the ancient one of nature versus nurture. Basic functions such as color perception are typically thought to be hard-coded in our brains, as if we are born with them and that is it. However, together with other studies, we provide evidence that this is not the case: they can be improved through intensive training because our brains are, to some extent, plastic,” said <a href="https://staffprofiles.bournemouth.ac.uk/display/mtoscani" target="_blank">Matteo Toscani</a>, a senior lecturer at Bournemouth University and co-director of <a href="https://bournemouthperceptionlab.co.uk/index.html" target="_blank">the Bournemouth Perception Lab</a>.</p>
<p>“While I am generally interested in color vision and visual plasticity, the specific idea of testing athletes — while also comparing indoor and outdoor sports — was proposed by the lead author, Sidney Uden-Taylor, a student in my lab who also carried out the study.”</p>
<p>The researchers recruited 26 college-aged participants to complete the visual tests. The sample included eight outdoor athletes, nine indoor athletes, and nine non-athletes. The athletes had all participated in their respective sports for at least three years and trained an average of four and a half hours per week.</p>
<p>During the experiment, participants sat in a soundproof, dark room and stared at a cross in the center of a computer monitor. The scientists presented visual stimuli on the screen for just 250 milliseconds. These quick flashes appeared either on the left or right side of the screen at specific angles away from the center, ensuring the participants could not look directly at them.</p>
<p>The flashing images were either a small photograph of a human figure or a simple circle. The researchers adjusted the simple circle so it had the exact same number of pixels and the same average color as the human figure. The backgrounds behind these images displayed either an indoor blue sports venue or a grassy outdoor pitch.</p>
<p>Participants pressed a specific keyboard key to indicate whether the hidden object appeared on the left or right side of their visual field. To accurately measure their sensory limits, the researchers used a specialized computer program to constantly adjust the visibility of the objects. They achieved this by blending the target object with the background picture, making the image look slightly transparent.</p>
<p>If a participant answered correctly, the program made the object fade slightly more into the background for the next round. This adaptive method allowed the scientists to find the exact threshold of contrast at which a person could barely distinguish the object from the background. Each participant completed 1,920 individual trials during the two-hour session.</p>
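The adaptive procedure described above follows the logic of a psychophysical staircase: make the target harder after a correct answer, easier after an error, and estimate the threshold from the points where the direction reverses. A minimal one-up/one-down sketch (the simulated observer and all parameter values are illustrative, not the study's actual software):

```python
import random

def staircase_threshold(true_threshold, start_alpha=0.8, step=0.05,
                        n_trials=60, seed=0):
    """Simple one-up/one-down adaptive staircase (illustrative sketch).

    `alpha` is the blending weight of the target against the background:
    higher alpha means a more visible target. A correct response lowers
    alpha (harder next trial); an error raises it (easier next trial).
    """
    rng = random.Random(seed)
    alpha = start_alpha
    reversals = []          # alpha values where the response direction flipped
    last_correct = None
    for _ in range(n_trials):
        # Simulated observer: correct whenever the target is visible
        # enough, plus a small chance of a lucky guess below threshold.
        correct = alpha >= true_threshold or rng.random() < 0.05
        if last_correct is not None and correct != last_correct:
            reversals.append(alpha)
        alpha += -step if correct else step
        alpha = min(max(alpha, 0.0), 1.0)
        last_correct = correct
    # Estimate the threshold as the mean alpha over the last reversals.
    tail = reversals[-6:]
    return sum(tail) / len(tail) if tail else alpha

est = staircase_threshold(true_threshold=0.35)
```

Because the staircase homes in on the contrast at which responses flip between correct and incorrect, the estimate converges near the observer's true threshold without having to test every contrast level exhaustively.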
<p>The data revealed that outdoor athletes were significantly better at detecting peripheral colors than both indoor athletes and non-athletes. The outdoor athletes required almost one-third less color contrast to successfully spot the objects on the edges of the screen. This indicates their visual systems were highly sensitive to faint visual cues in their periphery.</p>
<p>The scientists also found an interaction between the shape of the object and the type of background. Participants could spot the human figure and the simple circle equally well when they appeared on the indoor sports background. However, the human figure became significantly harder to see when it was presented against the visually complex outdoor grass background.</p>
<p>Despite this difficulty, the general visual advantage for outdoor athletes persisted across all conditions. They outperformed the other groups even when viewing simple circles on the indoor sports background. This suggests that the visual improvements gained from outdoor sports transfer to general situations and are not restricted to familiar sporting environments.</p>
<p>The scientists noted that outdoor sports like soccer and rugby take place on large fields, forcing players to constantly monitor wide, unpredictable spaces. Indoor sports, while fast-paced, happen in smaller, more enclosed areas. The results suggest that the expansive nature of outdoor sports provides the specific visual training needed to enhance peripheral color perception.</p>
<p>“If color vision is plastic, people born with color perception deficiencies may still outperform their congenital limits through interaction with the environment,” Toscani told PsyPost. “Evidence for this comes from studies of color-blind individuals (e.g. Boehm et al., 2014). Our results suggest that outdoor athletes need almost one third less contrast—that is, stimulus visibility—for a peripheral stimulus to be effectively detected compared with non-athletes and indoor athletes, likely reflecting adaptation to training in large open fields. </p>
<p>“We found this effect in peripheral vision, where reduced performance is attributed to anatomical constraints, from the structure of the eye to the way visual information is projected to and processed by the brain. The fact that peripheral vision is relatively poor is indeed why we move our eyes to bring objects into central vision, where resolution is higher. Outdoor athletes don’t need to do it as much as we do, as they have a better peripheral color, their brain probably adapted to accurate monitoring events occurring in the periphery.”</p>
<p>While the findings point toward the benefits of outdoor training, readers should avoid assuming a direct cause-and-effect relationship. Because the scientists observed naturally occurring groups of athletes and non-athletes, the study is correlational. </p>
<p>“While the study is consistent with the idea that outdoor training improves peripheral vision, which was the main focus of our investigation, this type of study remains correlational,” Toscani noted. “We could not experimentally manipulate training experience—for example, by randomly assigning children to 10 years of indoor sport, outdoor sport, or no sport, and then assessing their vision afterwards.”</p>
<p>It remains possible that people with naturally superior peripheral vision are simply drawn to outdoor sports from a young age. Another potential factor is the overlap in athletic activities outside of formal competition, which could muddy the data. For example, outdoor athletes might occasionally play indoor sports during their off-season, an activity that the researchers did not officially track.</p>
<p>The scientists are already looking ahead and plan to use immersive technology to present objects even further into the periphery of a person’s vision. “I have applied for a BBSRC grant to support further research into how vision in adults can be modified through gamified training in virtual reality environments,” Toscani said.</p>
<p>The study, “<a href="https://doi.org/10.1177/03010066251414084" target="_blank">Athletes are better at peripheral colour detection</a>,” was authored by Sidney Uden-Taylor, Anna Metzger, and Matteo Toscani.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/narcissistic-traits-and-celebrity-worship-are-linked-to-excessive-instagram-scrolling-via-emotional-struggles-and-fear-of-missing-out/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Narcissistic traits and celebrity worship are linked to excessive Instagram scrolling via emotional struggles and fear of missing out</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Mar 17th 2026, 16:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>People who obsess over celebrities or exhibit high levels of narcissism are more likely to develop unhealthy habits on Instagram. A new study published in <em><a href="https://doi.org/10.1080/00223980.2025.2603473" target="_blank">The Journal of Psychology</a></em> reveals that this behavior is driven by an underlying fear of missing out and a struggle to manage difficult emotions. The research offers a clearer picture of how specific personality traits make some users vulnerable to addictive social media routines.</p>
<p>Psychological researchers frequently examine how the internet shapes human behavior. As social platforms grow, experts want to understand why some individuals use these apps in ways that disrupt their daily lives. Instagram is particularly popular, boasting roughly two billion active monthly users globally. It features highly visual tools like photo editing and short, disappearing video updates.</p>
<p>These features make the platform highly engaging. For some users, this engagement shifts into a pattern resembling a behavioral addiction. This happens when using an app takes over a person’s life and alters their mood. It can also cause individuals to build a tolerance, meaning they need more screen time to feel the same effects.</p>
<p>Addiction also involves withdrawal symptoms when the app is removed, as well as recurring conflicts in offline relationships. Because true addiction requires all these strict criteria to be met, researchers prefer a broader term for most users. They call this broader pattern problematic Instagram use.</p>
<p>Hadi Fazelirad, a doctoral student in clinical psychology at Kharazmi University in Iran, led a team to investigate the psychological roots of this behavior. The group wanted to test a specific psychological framework used to study behavioral addictions. This framework suggests that an individual’s underlying personality traits combine with their emotional and cognitive responses to create addictive habits.</p>
<p>To test this idea, the researchers focused on two distinct personality traits as starting points. The first was narcissism, a trait marked by an inflated sense of one’s own importance and a deep need for excessive attention and admiration. The platform’s visual focus provides an ideal stage for individuals with high narcissism to showcase their lives and receive praise.</p>
<p>The second personality trait was celebrity worship. This term describes an intense, sometimes obsessive preoccupation with a famous individual. Some users simply follow a celebrity for entertainment or to connect with other fans. Others develop extreme emotional attachments, which can occasionally border on pathological behaviors.</p>
<p>The researchers suspected that two internal mechanisms might link these personality traits to problematic Instagram use. One mechanism is the fear of missing out. This is a persistent anxiety that other people are having rewarding experiences without you. The other mechanism is a general difficulty with emotion regulation.</p>
<p>Emotion regulation refers to a person’s ability to manage and respond to their feelings in a healthy way. People with poor emotion regulation might struggle to control impulses when they are upset. They might also lack the necessary strategies to calm themselves down. The researchers theorized that these emotional and cognitive hurdles push vulnerable personalities toward excessive app usage.</p>
<p>The team surveyed 450 students from six different universities across Iran. The participants ranged in age from 18 to 35, and nearly 80 percent of the group identified as female. To gather the data, the researchers distributed online questionnaires assessing the various psychological traits in question.</p>
<p>Participants completed several standardized psychological assessments. They answered questions measuring their attitudes toward celebrities, checking for varying levels of obsession. The survey also included a specific scale to measure narcissistic tendencies. Other sections evaluated participants’ struggles with managing their emotions, their fear of missing out on social events, and the severity of their Instagram habits.</p>
<p>After collecting the responses, the researchers analyzed the data using structural equation modeling. This statistical method allows scientists to examine complex relationships between multiple variables at the same time. It helps determine whether one factor is directly associated with another or whether an intermediate variable acts as a bridge between the two.</p>
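The "bridge" logic behind such models can be illustrated with the product-of-coefficients approach used in simple mediation analysis: estimate the path from the trait to the mediator, the path from the mediator to the outcome, and multiply them to get the indirect effect. The sketch below uses simulated data and plain least-squares regressions; the variable names, sample size, and effect sizes are illustrative, not the study's actual model:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 450  # same sample size as the survey; the data here are simulated

# Simulated variables: trait -> FoMO (mediator) -> problematic use.
trait = rng.normal(size=n)                            # e.g., narcissism score
fomo = 0.5 * trait + rng.normal(size=n)               # mediator
use = 0.6 * fomo + 0.2 * trait + rng.normal(size=n)   # outcome

def ols_coefs(y, X):
    """Least-squares coefficients for y ~ X, with an intercept column."""
    X = np.column_stack([np.ones(len(y)), *np.atleast_2d(X)])
    return np.linalg.lstsq(X, y, rcond=None)[0]

a = ols_coefs(fomo, trait)[1]          # trait -> mediator path
coefs = ols_coefs(use, [fomo, trait])  # outcome regressed on both
b, direct = coefs[1], coefs[2]         # mediator path and direct path
indirect = a * b                       # the "bridge" (mediated) effect

print(f"indirect={indirect:.2f}, direct={direct:.2f}")
```

Full structural equation modeling additionally handles measurement error and fits several such paths simultaneously, but the indirect-versus-direct decomposition shown here is the core idea being tested.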
<p>The researchers found a positive link between narcissistic traits, celebrity worship, and problematic Instagram use. In other words, individuals who scored higher in narcissism or celebrity obsession also reported unhealthier relationships with the app. However, the connection was not just a direct line from personality to addiction.</p>
<p>Instead, the study revealed that the fear of missing out acted as a bridge between these personality traits and social media habits. For highly narcissistic individuals, Instagram offers a vast audience to impress. These individuals develop an intense worry that they will miss opportunities to gather attention or control their social image. This anxiety drives them to keep checking the app.</p>
<p>A similar pattern emerged for those obsessed with famous figures. Celebrities share constant updates about their personal and professional lives on the platform. Fans who obsess over these figures develop a deep anxiety about missing a post or a story. This fear of missing out pushes them to monitor their feeds constantly.</p>
<p>The study also revealed that emotional regulation difficulties played a central linking role. Narcissistic individuals often struggle to process negative emotions healthily. Rather than dealing with these feelings internally, they might turn to Instagram for a quick mood boost. A few likes or comments can serve as a temporary distraction from their emotional distress.</p>
<p>Fans who intensely worship celebrities face similar emotional hurdles. Constantly comparing their own lives to the highly curated, idealized lives of famous people can damage their self-esteem. Lacking the tools to cope with these negative feelings, they return to the app to distract themselves. This creates a cycle of problematic use that is hard to break.</p>
<p>By understanding the emotional mechanisms behind problematic app usage, mental health professionals might develop better treatments. Therapies that teach people how to accept negative emotions and build healthier offline habits could be quite effective. Recognizing the underlying anxieties that drive endless scrolling is a necessary step in helping users regain control of their digital lives.</p>
<p>The researchers highlighted potential interventions that could help vulnerable users. One option is a treatment model designed to address emotional disorders by teaching cognitive reappraisal. This technique encourages individuals to view negative emotions as temporary states that will eventually pass. By accepting these emotions rather than avoiding them, users might feel less urge to escape into social media.</p>
<p>Other practical interventions could directly target the fear of missing out. Educational programs that emphasize good sleep hygiene and limit technology use before bed have proven helpful in similar situations. Therapy centers and parents could work together to foster offline emotional awareness, reducing a young adult’s reliance on digital validation.</p>
<p>While the study offers new insights, the researchers noted a few limitations to their work. The participant pool consisted entirely of university students in Iran. Because of this specific demographic, the results might not automatically apply to people in other age groups. Individuals living in different cultural contexts might also interact with social media differently.</p>
<p>The data relied on self-reported surveys, which can sometimes skew results. People are not always objective when answering questions about their own flaws, introducing a potential bias into the findings. The research was also cross-sectional, meaning it only captured a single moment in time. This type of research cannot definitively prove that one behavior causes another.</p>
<p>Future research could address these gaps by tracking participants over longer periods. The researchers suggest that future studies should include an equal number of male and female participants to see if gender changes these dynamics. It would also be helpful to differentiate between specific types of narcissism.</p>
<p>The researchers also recommend looking at how overall internet habits influence Instagram-specific behavior. Tracking the total time spent online could provide a much broader picture of a person’s digital life. Understanding how different platforms interact might help experts design more comprehensive strategies for digital wellbeing.</p>
<p>Despite these limitations, the research clarifies the psychological pathways that lead to problematic digital habits. Social media platforms will likely continue to grow and introduce new features designed to capture user attention. Identifying the personality traits and emotional struggles that make users vulnerable is an important step in promoting healthier technology use.</p>
<p>The study, “<a href="https://doi.org/10.1080/00223980.2025.2603473" target="_blank">Celebrity Worship, Narcissism, and Problematic Instagram Use: The Mediating Role of Difficulties in Emotion Regulation and Fear of Missing Out</a>,” was authored by Hadi Fazelirad, Mehrane Pirzade, Jafar Hasani, Bahman Bouruki Milan, Robabeh Noury Ghasem Abadi, and Mark D. Griffiths.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/neuroticism-is-linked-to-altered-communication-between-the-brains-emotional-networks/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Neuroticism is linked to altered communication between the brain’s emotional networks</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Mar 17th 2026, 14:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>A recent study published in <em><a href="https://doi.org/10.1016/j.neuroimage.2025.121655" target="_blank">NeuroImage</a></em> reveals that neuroticism is linked to altered communication between different brain networks rather than isolated brain activity. Researchers discovered that people with higher levels of this personality trait show increased connectivity between brain regions responsible for processing emotions, regulating memory, and detecting threats. These findings suggest that emotional instability arises from how the brain’s emotional hubs synchronize with other areas.</p>
<p>Marvin S. Meiering, a researcher at the Medical School Berlin, led the study alongside a team of scientists. The group wanted to understand the biological foundations of neuroticism, a personality trait involving a tendency to experience intense negative emotions on a regular basis. People with high levels of neuroticism often struggle to bounce back from stressful events and face a higher risk of developing mental health conditions like depression.</p>
<p>For a long time, researchers thought neuroticism was simply caused by an overly active amygdala. The amygdala is an almond-shaped structure deep inside the brain that acts as an emotional alarm system. It detects potential threats in the environment and triggers fear or anxiety responses, but recent scientific reviews have questioned this straightforward idea.</p>
<p>Newer theories propose a different mechanism for emotional instability. The focus has shifted toward how the amygdala communicates with other parts of the brain. One important partner is the hippocampus, a brain region primarily known for helping us form memories and navigate physical spaces.</p>
<p>Recent scientific models suggest the hippocampus also plays a role in creating time stamps for emotional experiences. It helps anchor our feelings to specific events and contexts. If the amygdala becomes too active, it might interrupt this time-stamping process.</p>
<p>Without clear boundaries, negative emotional memories can bleed into other situations. This causes bad feelings to persist long after a stressful event has ended. The researchers wanted to map how the amygdala interacts with the hippocampus in people with varying levels of neuroticism.</p>
<p>Another important brain area they examined is the dorsolateral prefrontal cortex. This region sits near the front of the brain and acts like a control center. It is heavily involved in cognitive control, which includes the ability to regulate and calm emotional responses.</p>
<p>To investigate this, Meiering and his colleagues recruited 115 healthy adults between the ages of 18 and 45. The researchers used functional magnetic resonance imaging to monitor the participants’ brains. This scanning technology tracks blood flow in the brain, allowing scientists to see which areas are active in real time.</p>
<p>While inside the scanner, the participants viewed a series of pictures. Some images showed human faces displaying negative emotions like fear, disgust, and sadness. Other images were simply neutral, scrambled patterns surrounded by colored borders.</p>
<p>The emotional faces used in the experiment came from a specialized database designed for psychological research. The pictures featured actors displaying specific, intense emotional states. This visual setup has a proven track record of reliably activating the brain regions responsible for processing negative feelings.</p>
<p>The researchers did not ask the participants to actively manage their feelings during the experiment. Instead, they gave the volunteers a simple task. The participants just had to identify the gender of the faces or the color of the borders around the scrambled images.</p>
<p>This simple activity ensured that any emotional regulation happening in the brain was automatic rather than deliberate. Before the brain scans, the participants also completed five different personality questionnaires. The researchers combined the results from these surveys into a single composite score for each person.</p>
<p>This approach helps reduce the measurement error that often arises when relying on just one survey. When analyzing the brain scans, the team first checked the isolated activity levels of specific brain regions. These analyses of isolated activity yielded no statistically significant results.</p>
<p>The amygdala did not simply work harder or light up more in highly neurotic individuals. Instead, the differences appeared in the way different brain regions synchronized their activity. The researchers measured functional connectivity, which tracks how closely the activity patterns of two brain areas match up over time.</p>
<p>They found that people with higher neuroticism scores showed increased communication between the left amygdala and the left hippocampus. This hyperactive connection aligns with the newer theories about emotional memory. If the amygdala and hippocampus are constantly interacting during negative experiences, it might disrupt the brain’s ability to box up those emotions.</p>
<p>This biological quirk could explain why highly neurotic people struggle with lingering negative moods that generalize to safe situations. The researchers also discovered increased communication between the amygdala and the dorsolateral prefrontal cortex. Because this prefrontal area usually calms emotions down, increased connectivity here might initially seem like a positive trait.</p>
<p>However, the researchers suspect it actually reflects an inefficient emotional control system. The brain of a highly neurotic person might have to work extra hard to manage negative feelings automatically. Their control centers must constantly communicate with the amygdala to keep baseline emotional reactions in check, which could leave them feeling emotionally drained.</p>
<p>Expanding their view to look at the entire brain, the research team found even more unique wiring patterns. The right amygdala showed strong connections to the anterior insula and the midcingulate cortex. Both of these brain areas are core components of the salience network.</p>
<p>The salience network is a large brain system that constantly scans the environment for important information. It helps the brain decide which external stimuli or internal feelings require immediate attention. Increased synchronization between the amygdala and this network suggests a persistent state of hypervigilance.</p>
<p>The brains of highly neurotic individuals seem specially wired to constantly hunt for potential threats. The anterior insula, in particular, helps bridge the gap between physical sensations and conscious emotions. Its strong connection to the amygdala during this experiment provides new biological evidence for the physical discomfort often reported by highly neurotic people.</p>
<p>Like all scientific investigations, this study comes with limitations. The imaging methods used can only show a relationship between brain connectivity and personality traits. They cannot prove that these specific brain patterns directly cause a person to become neurotic.</p>
<p>The researchers also noted that they only examined the participants at a single point in time. This makes it difficult to separate permanent personality traits from temporary bad moods. Future research will need to track participants over several years to see if these neural signatures remain stable over a lifetime.</p>
<p>Additionally, brain and behavior studies often contend with small effect sizes. The researchers acknowledged that many of their findings would not survive traditional, strict statistical thresholds. To adapt, they relied on advanced statistical models that focus on the general size and direction of the brain patterns. Moving forward, scientists will need to replicate this experiment with thousands of participants to verify these subtle differences in brain connectivity.</p>
<p>The study, “<a href="https://doi.org/10.1016/j.neuroimage.2025.121655" target="_blank">Neuroticism is associated with increased amygdala connectivity to hippocampal and prefrontal regions during emotional face processing</a>,” was authored by Marvin S. Meiering, David Weigner, Simone Grimm, and Sören Enge.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/a-massive-review-reveals-cannabis-falls-short-in-treating-psychiatric-disorders/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">A massive review reveals cannabis falls short in treating psychiatric disorders</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Mar 17th 2026, 11:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>Despite the rising popularity of medical cannabis for mental health and addiction, a sweeping new review finds little evidence that these products actually help treat most of these conditions. The comprehensive analysis reveals that while cannabis-based medicines might offer mild relief for a handful of specific issues, they do not improve conditions like depression or anxiety and carry a greater risk of side effects. These findings were recently published in the journal <em><a href="https://doi.org/10.1016/S2215-0366(26)00015-5" target="_blank">The Lancet Psychiatry</a></em>.</p>
<p>Cannabinoids are the active chemical compounds found in the cannabis plant. The most well-known of these are tetrahydrocannabinol, which produces the high associated with marijuana, and cannabidiol, a non-intoxicating compound often sold as a wellness product. In recent years, an increasing number of people have turned to these substances to manage their mental health.</p>
<p>In the United States and Canada, roughly 27 percent of people between the ages of 16 and 65 report using cannabis for medical reasons. About half of those individuals use it specifically to treat mental health struggles. In Australia, prescription approvals for cannabinoid medications have soared, often for psychological conditions and addiction.</p>
<p>This surge in use has largely outpaced the medical evidence. Researchers wanted to know if prescribing these plant-based and pharmaceutical products was truly justified based on science. Jack Wilson, a researcher at the University of Sydney in Australia, led a team to investigate whether these treatments actually work and whether they are safe.</p>
<p>The research team sought to provide clarity during a time of expanding clinical use. They noticed a substantial gap between how often these products are prescribed and the scientific proof backing them up. To bridge this gap, Wilson and his colleagues set out to gather and evaluate all the best available data from the past four decades.</p>
<p>To figure out if cannabinoids are effective, the researchers conducted a systematic review and meta-analysis. A systematic review involves searching through major scientific databases to collect every study that meets a strict set of criteria. A meta-analysis then combines the numerical data from all those separate studies into one large statistical model.</p>
<p>Combining data this way allows researchers to see the big picture. It gives them a much clearer idea of a treatment’s true effect than looking at any single study alone. For this project, the team searched for randomized controlled trials published between January 1980 and May 2025.</p>
<p>A randomized controlled trial is considered the highest standard of scientific research for testing medical treatments. In these studies, participants are randomly assigned to either receive the treatment being tested or a placebo. A placebo is an inactive substance, like a sugar pill, that looks just like the real medication but has no physical effect.</p>
<p>The researchers ultimately gathered 54 of these trials, which included a total of 2,477 participants. They looked specifically for trials where a mental health condition or a substance use disorder was the primary reason for treatment. They evaluated how well the cannabinoids reduced symptoms and tracked any adverse events, which are unwanted side effects like dizziness or nausea.</p>
<p>The results for most mental health conditions were not statistically significant. The data showed no true benefit for people struggling with anxiety, psychotic disorders, or post-traumatic stress disorder. There were also no randomized trials available that tested cannabis as a primary treatment for depression.</p>
<p>This absence of evidence is particularly notable because anxiety, depression, and post-traumatic stress disorder are among the most common reasons people seek out medical cannabis. The researchers found the same lack of benefit for obsessive-compulsive disorder and bipolar disorder. In trials testing treatments for attention-deficit hyperactivity disorder, the improvements were not statistically significant.</p>
<p>For eating disorders, the findings were similarly unsupportive. Two studies looked at people with anorexia nervosa, an eating disorder characterized by an intense fear of gaining weight. The researchers found no real difference in weight gain or physical activity levels between those who took cannabinoids and those who received a placebo.</p>
<p>The team also looked at substance use disorders, which occur when a person cannot stop using a drug or medication despite it causing health and social problems. They found no benefit for treating opioid addiction or tobacco dependence. In fact, for people with a cocaine use disorder, the results showed that taking cannabinoids actually increased their cravings for cocaine.</p>
<p>Wilson noted this specific danger in a recent statement about the research. He cautioned against applying one drug to all addictions. “However, when medicinal cannabis was used to treat people with cocaine-use disorder, it increased their cravings. This means it should not be considered for this purpose and may, in fact, worsen cocaine dependence,” he said.</p>
<p>The analysis did find a few areas where cannabinoids offered some potential benefits, though the quality of the evidence was generally considered low. One such area was the treatment of cannabis use disorder itself. People with this condition struggle to control their use of marijuana.</p>
<p>The data showed that using pharmaceutical-grade combinations of tetrahydrocannabinol and cannabidiol helped reduce withdrawal symptoms and the total amount of cannabis a person consumed. Wilson likened this to other addiction treatments. “Similar to how methadone is used to treat opioid-use disorder, cannabis medicines may form part of an effective treatment for those with a cannabis-use disorder. When administered alongside psychological therapy, an oral formulation of cannabis was shown to reduce cannabis smoking,” he said.</p>
<p>Another area showing slight improvement was the treatment of tic disorders and Tourette’s syndrome. These conditions cause sudden, uncontrollable movements or vocal sounds. Participants who received a combination of tetrahydrocannabinol and cannabidiol experienced a reduction in the severity of their tics compared to those who took a placebo.</p>
<p>The researchers also looked at autism spectrum disorder, a developmental condition that affects how people communicate and interact with the world. Across two studies, cannabinoids were linked to a reduction in certain traits associated with autism. However, the researchers cautioned that these specific studies had a high risk of bias, meaning the way the trials were designed or reported might have skewed the results.</p>
<p>Sleep issues were another condition where cannabinoids showed some promise. Among people with insomnia, taking any type of cannabinoid led to an increase in total sleep time. This was measured both by electronic sleep-tracking devices and by the participants writing in sleep diaries.</p>
<p>Despite these few positive notes, the safety data raised some concerns. Participants who took cannabinoids were more likely to experience general adverse events than those in the placebo groups. For every seven people treated with these medicines, one additional person experienced a side effect like dry mouth, nausea, or diarrhea.</p>
<p>The researchers noted that the medications did not increase the odds of serious adverse events. Serious adverse events are severe medical issues that might require hospitalization or pose a major health threat. People taking cannabinoids were also no more likely to drop out of the studies than those taking a placebo.</p>
<p>Even with a relatively mild side effect profile, Wilson warned about the broader implications of these findings. He noted that unproven treatments carry hidden risks. “Though our paper didn’t specifically look at this, the routine use of medicinal cannabis could be doing more harm than good by worsening mental health outcomes, for example a greater risk of psychotic symptoms and developing cannabis use disorder, and delaying the use of more effective treatments,” he said.</p>
<p>There are a few caveats to consider when interpreting these results. The studies included in the review were often quite small, which makes it harder to draw firm conclusions. The researchers also pointed out that many of the trials had a high risk of bias, often because the companies manufacturing the cannabis products were involved in funding or running the studies.</p>
<p>Additionally, the review only looked at outcomes from the longest follow-up period in each study. This means they might have missed some short-term benefits or harms that occurred earlier in the treatment process. The analysis was also limited by a lack of data on whether these treatments affect men and women differently.</p>
<p>Moving forward, the researchers emphasized the need for better-designed studies. Future trials must include larger groups of participants to provide clearer, more accurate results. Scientists also need to conduct studies that are free from industry influence to ensure the findings are completely independent.</p>
<p>Until that better evidence arrives, the researchers urge medical professionals to be highly cautious. They hope their work guides safer prescribing practices. “Our study provides a comprehensive and independent assessment of the benefits and risks of cannabis medicines, which may support clinicians to make evidence-based decisions, helping to ensure patients receive effective treatments while minimising harm from ineffective or unsafe cannabis products,” Wilson said.</p>
<p>The study, “<a href="https://doi.org/10.1016/S2215-0366(26)00015-5" target="_blank">The efficacy and safety of cannabinoids for the treatment of mental disorders and substance use disorders: a systematic review and meta-analysis</a>,” was authored by Jack Wilson, Olivia Dobson, Andrew Langcake, Palkesh Mishra, Zachary Bryant, Janni Leung, Danielle Dawson, Myfanwy Graham, Maree Teesson, Tom P Freeman, Wayne Hall, Gary C K Chan, and Emily Stockings.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<p><strong>Forwarded by:<br />
Michael Reeder LCPC<br />
Baltimore, MD</strong></p>
<p><strong>This information is taken from free public RSS feeds published by each organization for the purpose of public distribution. Readers are linked back to the article content on each organization's website. This email is an unaffiliated, unofficial redistribution of this freely provided content from the publishers. </strong></p>
<p> </p>
<p><small><a href='https://blogtrottr.com/unsubscribe/565/DY9DKf'>unsubscribe from this feed</a></small></p>