<table style="border:1px solid #adadad; background-color: #F3F1EC; color: #666666; padding:8px; -webkit-border-radius:4px; border-radius:4px; -moz-border-radius:4px; line-height:16px; margin-bottom:6px;" width="100%">
<tbody>
<tr>
<td><span style="font-family:Helvetica, sans-serif; font-size:20px;font-weight:bold;">PsyPost – Psychology News</span></td>
</tr>
<tr>
<td> </td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/epigenetic-age-acceleration-moderates-the-link-between-loneliness-and-chronic-health-conditions/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Epigenetic age acceleration moderates the link between loneliness and chronic health conditions</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Jul 22nd 2024, 16:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p>A new study published in <a href="https://doi.org/10.1037/pag0000822"><em>Psychology and Aging</em></a> reports that loneliness may accelerate biological aging and exacerbate chronic health conditions in older adults.</p>
<p>Loneliness is a <a href="https://www.psypost.org/massive-meta-analysis-finds-loneliness-has-increased-in-emerging-adults-in-the-last-43-years/">growing public health concern</a>. Previous research has linked loneliness to a range of <a href="https://www.psypost.org/loneliness-linked-to-cognitive-decline-in-older-adults-study-finds/">health issues</a>, including cardiovascular, inflammatory, and metabolic conditions, as well as overall increased mortality. However, these associations do not account for all the health risks related to loneliness.</p>
<p>The concept of epigenetic aging, where biological age differs from chronological age due to molecular changes in DNA, is a promising area for understanding these risks. In this study, researchers <a href="https://psycnet.apa.org/search/results?latSearchType=a&term=Freilich%2C%20Colin%20D.">Colin D. Freilich</a> and colleagues investigated whether loneliness is associated with accelerated epigenetic aging and whether this, in turn, impacts chronic health conditions.</p>
<p>The researchers utilized data from the Midlife Development in the United States (MIDUS) study, a comprehensive, longitudinal study that examines the role of psychological, social, and biological factors in aging. A total of 445 participants (aged 26 to 86) who had completed longitudinal follow-ups were included in the analyses.</p>
<p>Loneliness was measured at the initial time point using three self-report items, with participants indicating how much of the time in the past 30 days they felt lonely, close to others, and like they belonged, on a 5-point scale. At two subsequent time points, participants reported any of 30 chronic health conditions they had experienced or been treated for in the past 12 months.</p>
<p>Epigenetic age acceleration (EAA) was measured using several epigenetic clocks derived from DNA methylation profiles obtained from blood samples. The clocks included the Horvath, DunedinPACE, and GrimAge measures, which estimate biological age based on DNA methylation patterns. The DNA samples were collected, frozen, and subjected to genome-wide methylation profiling using Illumina Methylation EPIC microarrays.</p>
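<p>For readers unfamiliar with the metric: for clocks that output a biological-age estimate (such as Horvath’s), age acceleration is commonly computed as the residual from regressing clock-estimated age on chronological age, so positive values indicate faster-than-expected aging. The paper’s exact pipeline is not reproduced here; the Python sketch below, with made-up numbers, is purely illustrative:</p>
<pre style="font:12px monospace; background-color:#f6f6f6; border:1px solid #adadad; border-radius:4px; padding:8px; overflow:auto; text-align:left;">
import numpy as np

def epigenetic_age_acceleration(clock_age, chrono_age):
    """Residual-based EAA: positive values mean faster-than-expected aging."""
    clock_age = np.asarray(clock_age, dtype=float)
    chrono_age = np.asarray(chrono_age, dtype=float)
    # Fit clock_age = a + b * chrono_age by ordinary least squares.
    b, a = np.polyfit(chrono_age, clock_age, deg=1)
    return clock_age - (a + b * chrono_age)  # residual = the EAA score

# Example: a 60-year-old whose clock reads 66 gets a positive score.
print(epigenetic_age_acceleration([52.0, 66.0, 71.5], [50.0, 60.0, 75.0]))
</pre>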
<p>The researchers found that greater loneliness was weakly associated with greater EAA across different measures after accounting for demographic and behavioral covariates. This indicates that individuals who reported higher levels of loneliness also exhibited greater biological aging as measured by the Horvath, DunedinPACE, and GrimAge epigenetic clocks.</p>
<p>Loneliness also predicted increases in the number of chronic health conditions over time. The effect of loneliness on chronic health conditions was more pronounced in individuals with higher DunedinPACE EAA values, suggesting a possible synergistic effect. While EAA was associated with both loneliness and health outcomes, it did not fully account for the relationship between them, suggesting that loneliness also affects health through pathways beyond epigenetic aging.</p>
<p>A limitation outlined by the authors is the reliance on self-reported measures for loneliness and chronic health conditions, which can be subject to biases and inaccuracies.</p>
<p>The study, “<a href="https://doi.org/10.1037/pag0000822">Loneliness, Epigenetic Age Acceleration, and Chronic Health Conditions</a>,” was authored by Colin D. Freilich, Kristian E. Markon, Steve W. Cole, and Robert F. Krueger.</p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/mental-health-linked-to-better-aging-cheese-and-lifestyle-matter/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Mental health linked to better aging: Cheese and lifestyle matter</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Jul 22nd 2024, 14:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p>A new study highlights the significant role mental wellbeing plays in determining healthy aging, regardless of socioeconomic status. By analyzing genetic data from over 2.3 million Europeans, researchers found that better mental wellbeing leads to healthier aging, characterized by greater resilience, improved self-rated health, and longevity. Interestingly, they also identified certain lifestyle choices, including being active, not smoking, and eating cheese and fruit, as beneficial to mental wellbeing and healthy aging.</p>
<p>The findings have been published in <em><a href="https://doi.org/10.1038/s41562-024-01905-9">Nature Human Behaviour</a>.</em></p>
<p>Human life expectancy has increased significantly over recent decades, posing challenges for individuals and society, such as healthcare demands and financial burdens. While physical health and longevity have often been the focus of aging research, the role of mental wellbeing has received less attention. This study aimed to explore the causal relationship between mental wellbeing and healthy aging, and whether this relationship is independent of socioeconomic status.</p>
<p>The study employed a technique known as Mendelian randomization to investigate the causal relationship between mental wellbeing and healthy aging. This method uses genetic data to determine whether an observed association between two traits is causal or merely correlational. By using genetic variants as proxies for exposures, Mendelian randomization helps mitigate biases commonly found in observational studies, such as confounding factors and reverse causality.</p>
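<p>The core arithmetic is simple: each variant’s effect on the outcome is divided by its effect on the exposure (a Wald ratio), and ratios from many variants are pooled by inverse-variance weighting. The authors used established MR tooling rather than anything like the toy code below; this Python sketch, with invented numbers, only illustrates the basic fixed-effect estimator:</p>
<pre style="font:12px monospace; background-color:#f6f6f6; border:1px solid #adadad; border-radius:4px; padding:8px; overflow:auto; text-align:left;">
import numpy as np

def ivw_estimate(beta_exposure, beta_outcome, se_outcome):
    """Fixed-effect inverse-variance-weighted Mendelian randomization estimate."""
    bx = np.asarray(beta_exposure, dtype=float)   # variant effects on the exposure
    by = np.asarray(beta_outcome, dtype=float)    # variant effects on the outcome
    w = 1.0 / np.asarray(se_outcome, dtype=float) ** 2   # precision weights
    estimate = np.sum(w * bx * by) / np.sum(w * bx ** 2) # slope through the origin
    se = np.sqrt(1.0 / np.sum(w * bx ** 2))
    return estimate, se

# Toy numbers: three variants that shift exposure and outcome in the same direction.
est, se = ivw_estimate([0.10, 0.08, 0.12], [0.05, 0.03, 0.07], [0.01, 0.01, 0.02])
print(f"causal effect estimate: {est:.2f} (SE {se:.2f})")
</pre>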
<p>The researchers analyzed data from eight genetic datasets encompassing over 2.3 million individuals of European descent. These datasets included information on five key mental wellbeing traits: overall wellbeing, life satisfaction, positive affect, neuroticism, and depressive symptoms. In addition to mental wellbeing, the study considered three socioeconomic indicators: income, education, and occupation.</p>
<p>The study was conducted in two phases. In the first phase, the researchers assessed the causal associations between mental wellbeing traits and various aging phenotypes, which included resilience, self-rated health, healthspan, parental lifespan, and longevity. They also examined whether these associations were independent of socioeconomic status.</p>
<p>In the second phase, they investigated potential mediating factors that could influence the relationship between mental wellbeing and healthy aging. These factors included lifestyle choices (e.g., diet, physical activity, smoking), behaviors (e.g., medication use, cognitive performance), physical functions (e.g., body mass index, cholesterol levels), and diseases (e.g., cardiovascular diseases, diabetes).</p>
<p>The study found a strong causal relationship between better mental wellbeing and healthier aging outcomes. Specifically, individuals with higher levels of mental wellbeing exhibited significantly higher scores on the aging-related genetic influence phenotypes (aging-GIP), as well as greater resilience, improved self-rated health, longer healthspan, and extended parental lifespan.</p>
<p>For example, the study revealed that a genetically determined increase in overall wellbeing was associated with a substantial rise in aging-GIP (1.21 standard deviations), resilience (1.11 standard deviations), self-rated health (0.84 points), healthspan (odds ratio of 1.35), and parental lifespan (3.35 years). However, no significant association was found between overall wellbeing and longevity (odds ratio of 1.56).</p>
<p>Importantly, the study demonstrated that the relationship between mental wellbeing and healthy aging persisted regardless of socioeconomic status. While higher income, education, and occupational attainment were each associated with better mental wellbeing, the positive impact of mental wellbeing on aging outcomes remained significant even after adjusting for these socioeconomic factors. This suggests that mental wellbeing exerts a robust and independent influence on healthy aging.</p>
<p>The researchers also identified several lifestyle factors that contribute to mental wellbeing and, consequently, to healthy aging. Among these, being physically active and avoiding smoking were linked to improved mental wellbeing and healthier aging outcomes. Other influential factors included cognitive performance, age at smoking initiation, and the use of certain medications, which also mediated the relationship between mental wellbeing and aging. Additionally, dietary habits such as consuming more cheese and fruit were found to be beneficial.</p>
<p>Interestingly, this is not the first study to find a link between cheese consumption and mental well-being. A study published in the journal <em>Nutrients</em> found a correlation between <a href="https://www.psypost.org/cheese-consumption-might-be-linked-to-better-cognitive-health-study-finds/" target="_blank" rel="noopener">regular cheese consumption and cognitive health</a> in the elderly population. Analyzing data from 1,516 participants aged 65 and above, those researchers found that individuals who regularly ate cheese tended to have better cognitive function scores.</p>
<p>While the new study provides compelling evidence of the causal relationship between mental wellbeing and healthy aging, it has some limitations. For instance, the study focused on individuals of European descent, so the findings may not be generalizable to other populations. Future research should investigate whether these relationships hold true across different ethnic groups.</p>
<p>Nevertheless, the results suggest that strategies to enhance mental health could significantly improve aging outcomes.</p>
<p>“Our results underscore the imperative to prioritize mental well-being in health policies geared towards fostering healthy aging, and propose that interventions to remediate healthy aging disparities related to suboptimal mental well-being could target promoting healthy lifestyles such as restricting TV watching time and avoiding smoking; monitoring performances and physical functions such as enhancing cognitive function and regulating adiposity; and preventing common chronic diseases,” the researchers concluded.</p>
<p>The study, “<a href="https://www.nature.com/articles/s41562-024-01905-9" target="_blank" rel="noopener">Mendelian randomization evidence for the causal effect of mental well-being on healthy aging</a>,” was authored by Chao-Jie Ye, Dong Liu, Ming-Ling Chen, Li-Jie Kong, Chun Dou, Yi-Ying Wang, Min Xu, Yu Xu, Mian Li, Zhi-Yun Zhao, Rui-Zhi Zheng, Jie Zheng, Jie-Li Lu, Yu-Hong Chen, Guang Ning, Wei-Qing Wang, and Yu-Fang Bi.</p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/two-week-social-media-detox-yields-positive-psychological-outcomes-in-young-adults/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Two-week social media detox yields positive psychological outcomes in young adults</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Jul 22nd 2024, 12:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p>A recent study published in <a href="https://doi.org/10.3390/bs13121004"><em>Behavioral Sciences</em></a> has found that a two-week social media digital detox can significantly reduce smartphone and social media addiction while improving physical, mental, and social health among young adults.</p>
<p>Smartphones have become an essential part of modern life, offering a range of functions from communication to entertainment. However, excessive use of these devices has been linked to several negative health impacts, including mental health issues, poor sleep, and reduced physical activity. This has led to growing interest in digital detoxes (more colloquially known as ‘unplugging’ or ‘disconnecting’), where individuals take a break from their electronic devices or social media to improve their health and well-being.</p>
<p>Researchers Paige Coyne from Henry Ford Health and Sarah J. Woodruff from the University of Windsor therefore set out to explore the effects of a two-week social media digital detox on young adults. The study aimed to address the limitations of previous research by using device-based/objective measures, by incorporating follow-up measurements into the study design, and by providing a more realistic restriction of technology instead of going “cold turkey”.</p>
<p>The study involved 31 young adults aged 18 to 30 who were recruited from a mid-sized university in Ontario, Canada. The participants were regular social media users, spending at least one hour per day on social media applications, and used iPhones with Screen Time tracking enabled.</p>
<p>Participants were asked to limit their social media use to 30 minutes per day for two weeks. Coyne and Woodruff noted, “This specific time limit was implemented in hopes that it would significantly reduce participants’ social media use on their smartphones but not be so restrictive that participants would be unable to complete the intervention successfully.”</p>
<p>Their smartphone and social media usage was tracked using the iPhone’s iOS 12 Screen Time feature. Participants completed surveys at three different timepoints: before the detox with unrestricted social media use, during the detox with restricted social media use, and after the detox when the restriction was removed.</p>
<p>These surveys assessed various health-related outcomes, including smartphone and social media addiction, physical activity, sedentary behavior, sleep, eating behaviors, life satisfaction, stress, and perceived wellness.</p>
<p>The results were promising. On average, time spent on social media was reduced by 77.7%. Participants showed a significant reduction in both smartphone and social media addiction during the detox period.</p>
<p>The researchers highlighted, “comparisons of quantitative [data before and after the detox] indicate that both addiction and all the health-related outcomes studied showed positive or neutral improvement,” suggesting that the effects of the detox lasted for some time and there were no negative outcomes.</p>
<p>Additionally, there were notable improvements in several health-related outcomes. Sleep quality improved, with participants also reporting longer sleep duration during the detox. Life satisfaction also increased, and stress levels decreased. Interestingly however, there was no effect on levels of physical activity, sedentary behavior, or mindful eating.</p>
<p>The data from the interviews with the participants provided further insights. Many participants expressed feelings of relief and decreased pressure to maintain their social media presence. However, some participants did experience feelings of disconnection from friends and family.</p>
<p>The detox posed initial challenges, but most participants eventually adapted: “many … suggested that half an hour was a sort of manageable sweet spot, where they could still engage with social media but not get caught scrolling for hours.”</p>
<p>Nevertheless, despite the reduction in social media time, participants reported that their overall screen time remained high as many turned to other digital activities like gaming or entertainment apps, or increased their use of other devices such as laptops.</p>
<p>Coyne and Woodruff also reported that after the detox, “a great number of participants disclosed that they overindulged in social media for a short period of time” but that “many suggested that they were aware of the binging behavior they were engaging in, that it only lasted a few days, and that their overall awareness of their social media usage increased as a result of participating in a detox.”</p>
<p>The authors concluded on a positive note, “the participants shared many valuable suggestions for future detoxes, with particular emphasis being placed on making detoxes realistic, sustainable, and personalized to each user, where possible.”</p>
<p>It is worth noting that the study had some limitations. For example, the study design lacked a control group of participants who did not undergo the detox. Furthermore, the researchers could not control participants’ use of social media on devices other than their phones.</p>
<p>The study, “<a href="https://www.mdpi.com/2076-328X/13/12/1004">Taking a Break: The Effects of Partaking in a Two-Week Social Media Digital Detox on Problematic Smartphone and Social Media Use, and Other Health-Related Outcomes among Young Adults</a>,” was authored by Paige Coyne and Sarah J. Woodruff.</p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/dopamine-disruption-impairs-mentalizing-abilities/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Dopamine disruption impairs mentalizing abilities</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Jul 22nd 2024, 10:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p>A recent study has uncovered a significant link between the brain chemical dopamine and our ability to understand and attribute mental states to ourselves and others, a process known as mentalizing. Conducted by researchers at the University of Birmingham, the study demonstrates that altering dopamine levels in the brain affects these mentalizing abilities. These findings are detailed in the journal <em><a href="https://doi.org/10.1371/journal.pbio.3002652" target="_blank" rel="noopener">PLOS Biology</a></em>.</p>
<p>The study was motivated by the observation that people with disorders characterized by dopamine dysfunction, such as Parkinson’s disease, Huntington’s disease, Tourette’s syndrome, and schizophrenia, often struggle with mentalizing. This impairment can lead to severe social challenges, including social isolation and a decreased quality of life.</p>
<p>Despite these connections, the role of dopamine in mentalizing had not been directly tested in healthy individuals. The researchers aimed to fill this gap by investigating whether manipulating dopamine levels could causally influence mentalizing abilities.</p>
<p>“While the mentalizing abilities of people who are struggling with Parkinson’s may not be the main focus of treatment, it nonetheless has a huge impact on people with the disease,” said lead author Bianca Schuster. “Gaining a better understanding of how dopamine imbalances may affect mentalizing processes in the brain could therefore be really significant for individuals, as well as gaining a better understanding of the secondary effects of the drugs prescribed for Parkinson’s and other disorders.”</p>
<p>The study involved 43 healthy volunteers, with an average age of 26 years, who participated in two testing sessions. The participants were given either a dopamine-blocking drug called haloperidol or a placebo in a double-blind setup, meaning neither the participants nor the researchers knew which substance was administered on which day. Haloperidol works by blocking dopamine receptors, thus reducing dopamine activity in the brain.</p>
<p>Each participant underwent a series of tasks designed to measure mentalizing, emotion recognition, working memory, and motor function. The primary mentalizing task involved interpreting short animations where geometric shapes interacted in ways that implied various mental states or simple goal-directed actions.</p>
<p>The results were clear: haloperidol reduced participants’ ability to accurately label the mental states depicted in the animations. This suggests a direct role for dopamine in mentalizing. Specifically, when participants took haloperidol, their accuracy in identifying mental states was significantly lower compared to when they took the placebo.</p>
<p>Interestingly, the impairment was not limited to mental state animations but extended to goal-directed actions as well. This implies that dopamine might influence general cognitive functions like attention and working memory, which are essential for making inferences about others’ actions.</p>
<p>Additionally, the study found that the similarity between participants’ movements and the movements they observed in the animations affected their accuracy in mentalizing. Under placebo, participants who moved in a way similar to the animations were better at identifying the depicted mental states. However, this effect disappeared under haloperidol, suggesting that dopamine disruption impacts the use of motor codes in social cognition.</p>
<p>While the study provides strong evidence for a causal role of dopamine in mentalizing, there are several limitations to consider. First, the tasks used in the study, though well-established, may not fully capture the complexities of real-world social interactions. Future research could explore how dopamine influences mentalizing in more naturalistic settings, such as face-to-face interactions.</p>
<p>Second, the study did not investigate the potential interactions between dopamine and other neuromodulators like serotonin, which are also known to affect social cognition. Understanding how these systems work together could provide a more comprehensive picture of the neurochemical basis of mentalizing.</p>
<p>Furthermore, the study’s findings are based on a healthy population. It remains to be seen how these results translate to individuals with dopamine-related disorders, who may have additional complexities influencing their mentalizing abilities.</p>
<p>“The main implication of our work is that in disorders with dopamine dysfunctions, in addition to producing the primary symptoms associated with these disorders (such as motor symptoms in Parkinson’s disease), the dopamine imbalance also affects individuals’ socio-cognitive abilities,” added Schuster. “This work could have implications for the way in which we treat Parkinson’s in the future, but also the way in which we use any drugs which affect the action of dopamine in the brain.”</p>
<p>The study, “<a href="https://journals.plos.org/plosbiology/article?id=10.1371/journal.pbio.3002652" target="_blank" rel="noopener">Disruption of dopamine D2/D3 system function impairs the human ability to understand the mental states of other people</a>,” was authored by Bianca A. Schuster, Sophie Sowden, Alicia J. Rybicki, Dagmar S. Fraser, Clare Press, Lydia Hickman, Peter Holland, and Jennifer L. Cook.</p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/study-uses-mixed-reality-to-demonstrate-link-between-psychopathic-traits-and-reduced-anxiety/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Study uses mixed reality to demonstrate link between psychopathic traits and reduced anxiety</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Jul 22nd 2024, 08:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p>A new study found that individuals with more pronounced psychopathic traits tended to show fewer anxiety-related behaviors in an anxiety-inducing mixed-reality environment. Fearlessness and lack of empathy best predicted the (lack of) anxious behavior. The paper was published in <a href="https://doi.org/10.1038/s41598-024-62438-9"><em>Scientific Reports</em></a>.</p>
<p>Psychopathic traits are a set of personality characteristics that make individuals prone to antisocial behavior, lack of empathy, manipulativeness, and superficial charm. Individuals with these traits often exhibit impulsivity, irresponsibility, and a disregard for societal norms. They may also display shallow emotions and be unable to form genuine relationships.</p>
<p>Early research in psychopathy indicated that psychopathic individuals also tend to show a lack of fear or anxiety. These individuals tend to show very weak physiological reactions to aversive stimuli. Researchers hypothesized that psychopathic individuals tend to have lower anxiety than non-psychopathic individuals. This would make psychopathic individuals less inhibited by anxiety, allowing them to act in fearless and risk-taking ways. This hypothesis is known as the low-anxiety hypothesis of psychopathy.</p>
<p>However, testing the low-anxiety hypothesis in an experimental setting was long nearly impossible. Researchers would need to subject participants to a treatment that induces high levels of anxiety or fear in non-psychopathic individuals, and there was no credible way to do this without exposing participants to real danger, which is unacceptable from the standpoint of scientific ethics. The development of virtual and augmented reality technology promises to change this.</p>
<p>Study author Alexander Voulgaris and his colleagues wanted to test the low-anxiety hypothesis by exposing participants to an anxiety-inducing environment using mixed reality. Mixed reality blends the physical and digital worlds by superimposing virtual objects onto the real environment, allowing for interactive experiences that integrate both real and virtual elements seamlessly.</p>
<p>The researchers combined virtual reality technology with a slightly elevated wooden platform to create a mixed-reality elevated plus-maze apparatus for humans. This type of apparatus is widely used in research on rodents and consists of a cross-like structure, elevated high above the ground, with four paths extending from a central platform. Two paths are enclosed with walls, while two are open with no walls.</p>
<p>In classic experiments on rodents, a rodent is placed on the central part of the platform, and researchers observe its movement, whether it ventures onto the open paths, how long it takes to start exploring, and how long it spends on each path. This helps assess anxiety-like behaviors in rodents.</p>
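<p>The behavioral readouts from such a maze reduce to a few summary statistics over a timestamped record of which zone the subject occupies. The authors’ analysis code is not described in this summary; the Python sketch below, with invented field names, shows the kind of computation involved (more anxious behavior means less open-path time and a longer latency):</p>
<pre style="font:12px monospace; background-color:#f6f6f6; border:1px solid #adadad; border-radius:4px; padding:8px; overflow:auto; text-align:left;">
# Each sample is (time_in_seconds, zone), with zone one of
# "center", "open", or "closed", logged at a fixed rate.
def epm_metrics(samples, sample_interval=0.1):
    """Summarize anxiety-related behavior on an elevated plus-maze."""
    open_time = sum(sample_interval for _, z in samples if z == "open")
    closed_time = sum(sample_interval for _, z in samples if z == "closed")
    # Latency: time of the first sample recorded on an open path (None if never).
    latency = next((t for t, z in samples if z == "open"), None)
    # Entries: transitions from another zone onto an open path.
    entries = sum(
        1 for (_, prev), (_, cur) in zip(samples, samples[1:])
        if prev != "open" and cur == "open"
    )
    return {"open_time_s": open_time, "closed_time_s": closed_time,
            "open_entries": entries, "open_latency_s": latency}

log = [(0.0, "center"), (0.1, "center"), (0.2, "open"), (0.3, "open"), (0.4, "closed")]
print(epm_metrics(log))
</pre>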
<p>In this study, researchers created a similar environment for humans. They expected individuals with more pronounced psychopathic traits to spend more time on the open paths of the maze (the ones without walls) and take less time to start exploring these paths.</p>
<p>The participants were 170 volunteers recruited through electronic and physical bulletin boards on the researchers’ university campus and other public spaces. Sixty-eight percent of participants were female, and their average age was 26 years. Eighty-seven percent were born in Germany.</p>
<p>The elevated plus-maze apparatus consisted of a wooden platform elevated 20 cm from the floor, placed in a 5.5-meter by 5.5-meter experimental room. Participants started the study standing in the center of the platform, using virtual reality gear to simulate standing in the center of a maze on a rocky cliff over the sea, facing one of the open paths. The study authors simulated wind using two fans. The two enclosed paths appeared to be on the cliff, while the open paths were suspended over a deep abyss. Participants had 300 seconds to explore the maze freely.</p>
<p>Participants also completed assessments of psychopathic traits (the Brief Questionnaire of Psychopathic Personality Traits), acrophobia (the Acrophobia Questionnaire, acrophobia is an intense fear of heights), and sensation seeking (the Zuckerman Sensation Seeking Scale).</p>
<p>Results showed that individuals with higher levels of psychopathic traits tended to spend more time exploring the open paths of the maze (the ones suspended over the abyss with no walls) and less time in the enclosed paths. They also began exploring these open paths sooner. In other words, they showed more approach and less avoidance behaviors in this anxiety-inducing situation. Overall, individuals with more pronounced psychopathic traits showed lower levels of all anxiety-related behaviors in this setting.</p>
<p>When researchers examined which aspects of psychopathy were most strongly associated with these behaviors, they found that lack of empathy and fearlessness were the key factors. Psychopathic traits were also associated with lower subjective levels of anxiety. Individuals with more pronounced psychopathic traits tended to report feeling less anxiety while in the mixed reality elevated maze.</p>
<p>“Our results demonstrate the influence of specific psychopathic personality traits on human behavior in a mixed reality environment. To the best of our knowledge, our study is the first to examine the interplay between psychopathic traits and anxiety, not only on a subjective level but also on a behavioral level,” the study authors concluded.</p>
<p>“As hypothesized, in our non-clinical sample, a higher sum score of psychopathy correlated with less anxiety-related behavior and lower subjective levels of anxiety. More specifically, our results show an association between the specific subscales of Fearlessness, Lack of Empathy, and Impulsivity as measured by the FPP [the assessment of psychopathy], and anxious behavior in the EPM [elevated plus-maze].”</p>
<p>The study sheds light on the links between fear and anxiety. However, although mixed reality environments can be quite immersive, human participants remain aware that what they are observing is a computer simulation and that there is no real danger. It is possible that results would differ if real danger were involved.</p>
<p>The paper “<a href="https://doi.org/10.1038/s41598-024-62438-9">The impact of psychopathic traits on anxiety‑related behaviors in a mixed reality environment</a>” was authored by Alexander Voulgaris, Sarah V. Biedermann, Daniel Biedermann, Susanne Bründl, Lateefah Roth, Christian Wiessner, Peer Briken, and Johannes Fuss.</p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/atheists-are-perceived-as-more-prone-to-infidelity-according-to-new-research/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Atheists are perceived as more prone to infidelity, according to new research</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Jul 22nd 2024, 06:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p>A recent study published in the journal <em><a href="https://psycnet.apa.org/record/2024-71239-001" target="_blank" rel="noopener">Psychology of Religion and Spirituality</a></em> has shed light on the stereotypes people hold about atheists and theists when it comes to relationships. The findings indicate that atheists are perceived as more likely to engage in infidelity and adopt cost-inducing strategies in relationships, while theists are seen as more inclined towards benefit-provisioning behaviors.</p>
<p>Researchers Mitch Brown and Patrick R. Neiswender of the University of Arkansas aimed to understand how stereotypes about atheists and theists influence perceptions of their behavior in relationships. According to sexual strategies theory, individuals seek mates who demonstrate commitment and benevolence, which are crucial for successful long-term relationships.</p>
<p>Previous research has shown that religiosity often signals monogamous intent and trustworthiness, making religious individuals appear desirable for long-term mating. On the other hand, atheism is often associated with untrustworthiness and a lack of commitment, fostering stereotypes that atheists are more prone to exploitative behaviors.</p>
<p>To explore how stereotypes about atheists influence perceptions of the likelihood of infidelity and the use of mate retention strategies, the researchers conducted three studies with a total of 432 participants from a large public university in the Southeastern United States.</p>
<p>The first study aimed to identify the expectations people have regarding the reproductive strategies of atheists versus theists. The researchers recruited 156 undergraduates from a large public university in the Southeastern United States. The participants comprised 112 women, 43 men, and one individual identifying as non-binary, with an average age of 18.96 years. The majority (84.6%) identified as White, and the sample included 111 theists, 26 agnostics, and 18 atheists.</p>
<p>Participants were randomly assigned to one of two conditions: they read a vignette about a fictional college student named Henry, who was either described as an atheist or a theist. The vignette provided information about Henry’s relationship status and activities but differed only in terms of his religious beliefs.</p>
<p>After reading the vignette, participants rated Henry’s likelihood of engaging in various mate retention behaviors using a scale derived from the Mate Retention Inventory–Short Form. This scale included 19 subscales representing different tactics, such as benefit-provisioning and cost-inducing behaviors.</p>
<p>Benefit-provisioning behaviors in relationships include actions that enhance the partner’s well-being and strengthen the bond, such as showing love and care, making personal sacrifices, enhancing personal appearance, providing gifts and financial support, publicly affirming the relationship, and using sexual intimacy to reinforce commitment.</p>
<p>In contrast, cost-inducing behaviors impose negative consequences to deter infidelity and maintain control, including reacting strongly to infidelity threats, restricting the partner’s social interactions, closely monitoring their activities, inducing jealousy, criticizing the partner to lower their self-esteem, speaking negatively about rivals, and even engaging in physical aggression against potential competitors.</p>
<p>The participants also assessed Henry’s interest in long-term versus short-term mating and his propensity for infidelity.</p>
<p>In the first study, participants consistently viewed Henry as more likely to use sexual inducement tactics and be prone to infidelity when described as an atheist compared to a theist. Specifically, 45.5% of participants rated the atheist Henry as likely to use sexual inducement, compared to 26.8% for the theist Henry. Similarly, 28.8% perceived the atheist Henry as prone to infidelity, while only 19.6% thought the same of the theist Henry.</p>
<p>Participants also perceived Henry as more interested in short-term mating when described as an atheist (32.3%) compared to a theist (16.7%), while the theist Henry was seen as more interested in long-term mating (62.6% vs. 49.7% for the atheist Henry).</p>
<p>The second study sought to replicate and extend the findings of the first study by introducing the variable of physical attractiveness. The researchers recruited 210 undergraduates, again from a large public university in the Southeastern United States. The sample included 150 women and 60 men, with an average age of 18.79 years. Similar to the first study, the majority (84.3%) identified as White, and the sample included 164 theists, 40 agnostics, and six atheists.</p>
<p>Participants were assigned to one of four conditions in a 2×2 factorial design: they read the same vignettes from the first study, describing Henry as either an atheist or a theist, but this time paired with a photograph of either an attractive or unattractive man. The photographs were selected from the Chicago Faces Database, which had previously been normed on attractiveness. Participants rated Henry’s likelihood of engaging in mate retention behaviors, interest in long-term versus short-term mating, and propensity for infidelity using the same scales as in the first study.</p>
<p>The second study replicated these findings, with participants again perceiving the atheist Henry as more likely to use cost-inducing strategies and be interested in short-term mating. Interestingly, physical attractiveness did not significantly alter these perceptions. Both attractive and unattractive atheists were viewed similarly in terms of their propensity for cost-inducing behaviors and infidelity.</p>
<p>The third study explored the mental representations people have of atheists and theists and how these representations influence expectations of mate retention behaviors. The researchers recruited 66 undergraduates from the same university, with 51 women and 15 men participating. The average age was 18.85 years, and the sample was predominantly White (83.3%), with 50 participants identifying as theists, 14 as unsure, and two as atheists.</p>
<p>Participants were shown two images generated through a reverse-correlation procedure, which created composite faces representing the typical appearance of an atheist and a theist based on aggregated data from previous studies. These images had been identified in earlier research as connoting morality for the theist and immorality for the atheist.</p>
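<p>Reverse correlation is conceptually simple even though the stimuli are images: on each trial a base face is shown with a random noise pattern added and, alongside it, the same face with that pattern subtracted, and the participant picks the version that better matches a category (e.g., “atheist”). Averaging the noise behind the chosen images approximates the viewers’ mental template. The Python sketch below is a generic illustration of the procedure under those assumptions, not the authors’ code:</p>
<pre style="font:12px monospace; background-color:#f6f6f6; border:1px solid #adadad; border-radius:4px; padding:8px; overflow:auto; text-align:left;">
import numpy as np

rng = np.random.default_rng(0)
SIZE = (64, 64)                    # toy image resolution
base_face = rng.normal(size=SIZE)  # stand-in for a real grayscale base face

def classification_image(n_trials, choose):
    """Average the noise behind each chosen stimulus.

    choose -- callback returning 0 or 1 given two images; it stands in
              for a participant's two-alternative forced choice.
    """
    chosen = []
    for _ in range(n_trials):
        noise = rng.normal(size=SIZE)
        picked = choose(base_face + noise, base_face - noise)
        chosen.append(noise if picked == 0 else -noise)
    return np.mean(chosen, axis=0)  # the classification image

# Simulated participant who always picks whichever image is brighter on top.
ci = classification_image(500, lambda a, b: 0 if a[:16].mean() > b[:16].mean() else 1)
print(ci.shape)  # (64, 64); added to the base face, it renders the composite
</pre>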
<p>Participants rated these faces using the same mate retention scales from the previous studies, evaluating the perceived likelihood of the individuals engaging in benefit-provisioning and cost-inducing behaviors, as well as their interest in long-term versus short-term mating and propensity for infidelity.</p>
<p>The results were consistent with the previous studies: the atheist face was perceived as more prone to infidelity and cost-inducing strategies, while the theist face was seen as more likely to provide benefits and engage in long-term mating behaviors.</p>
<p>“Atheism exhibits a persistent suite of stereotypes that implicate them as disinterested in adhering to broader codified interpersonal rules,” the researchers wrote. “Such perceptions would position them to appear prone to promiscuous reproductive strategies and aggressive relationship behavior. This program of research has provided evidence for how these mental representations of irreligiosity track expectations of relationship conflict.”</p>
<p>While the findings provide valuable insights into stereotypes about atheists and theists, the study has some limitations. The sample was predominantly from the Southeastern United States, a region known for higher religiosity, which may influence the generalizability of the results. Future research should explore these stereotypes in more diverse and less religiously homogeneous populations.</p>
<p>Additionally, the study focused primarily on male targets. Future research should examine perceptions of female targets to see if similar stereotypes apply and how they might differ across genders.</p>
<p>Another direction for future research could involve examining the actual mate retention behaviors of atheists and theists to see if they align with these stereotypes. Understanding whether these perceptions have a basis in reality or are purely driven by prejudice could help in addressing and mitigating the negative stereotypes associated with atheism.</p>
<p>The study, “<a href="https://doi.org/10.1037/rel0000523" target="_blank" rel="noopener">Lay Theories of Mating Interest and Mate Retention Strategies for Atheists and Theists in the Southern United States</a>,” was published April 4, 2024.</p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/fluoride-exposure-during-pregnancy-linked-to-child-neurobehavioral-issues/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Fluoride exposure during pregnancy linked to child neurobehavioral issues</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Jul 21st 2024, 14:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p>Fluoride has been added to community drinking water systems in the United States since 1945 to prevent dental cavities. Currently, 73% of the U.S. population receives fluoridated water at a targeted concentration of 0.7 milligrams per liter. However, a new study suggests that prenatal fluoride exposure at these levels may increase the risk of neurobehavioral problems in children.</p>
<p>The research, published in <em><a href="https://doi.org/10.1001/jamanetworkopen.2024.11987" target="_blank" rel="noopener">JAMA Network Open</a></em>, found that a 0.68 milligram per liter increase in fluoride exposure during pregnancy was associated with nearly double the chance of a child exhibiting neurobehavioral issues at age three.</p>
<p>Fluoride is a naturally occurring mineral found in water, soil, and various foods. In teeth, fluoride helps to rebuild (remineralize) weakened tooth enamel, making it more resistant to acid attacks from bacteria in the mouth. This process helps prevent the formation of cavities.</p>
<p>But fluoride might impact neurodevelopment due to its ability to cross the placental barrier and reach the developing fetus. High levels of fluoride exposure have been shown in animal studies to cause neurobiochemical changes, such as oxidative stress, disruption of neurotransmitter function, and alterations in cellular signaling pathways.</p>
<p>Recent studies in Mexico and Canada have indicated that even lower levels of fluoride exposure, similar to those found in the United States, might be linked to poorer neurodevelopmental outcomes. These studies have shown associations between higher prenatal fluoride exposure and lower IQ, increased symptoms of attention-deficit/hyperactivity disorder, and poorer cognitive functioning.</p>
<p>However, U.S.-based research on this topic has been lacking. The researchers aimed to address this gap by examining whether prenatal fluoride exposure is associated with neurobehavioral outcomes in children in the United States.</p>
<p>“There is no known benefit of fluoride consumption to the developing fetus, but we do know that there is possibly a risk to their developing brain,” said the study’s lead investigator Ashley Malin, an assistant professor at the University of Florida.</p>
<p>The study involved 229 mother-child pairs from the Maternal and Developmental Risks from Environmental and Social Stressors (MADRES) cohort. This cohort consists predominantly of Hispanic women of low socioeconomic status living in urban Los Angeles, California. The participants were recruited during prenatal care visits between 2015 and 2020, with eligibility criteria including being 18 years or older, less than 30 weeks pregnant at the time of recruitment, and fluent in English or Spanish.</p>
<p>To assess fluoride exposure, the researchers collected single spot urine samples from the mothers during their third trimester of pregnancy. These samples were analyzed for urinary fluoride levels, which provide a reliable measure of total fluoride intake. The measurements were adjusted for specific gravity to account for variations in urine concentration. The mean gestational age at the time of urine collection was approximately 31.6 weeks.</p>
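<p>Specific-gravity adjustment corrects a spot sample for how dilute the urine happened to be. A widely used correction scales the measured concentration by the ratio of (reference SG − 1) to (sample SG − 1); the exact variant and reference value the study used are not given in this summary, so the Python sketch below is illustrative only:</p>
<pre style="font:12px monospace; background-color:#f6f6f6; border:1px solid #adadad; border-radius:4px; padding:8px; overflow:auto; text-align:left;">
def sg_adjusted(concentration_mg_l, sample_sg, reference_sg=1.020):
    """Adjust a urinary analyte concentration for urine dilution.

    concentration_mg_l -- measured fluoride concentration (mg/L)
    sample_sg          -- specific gravity of the spot urine sample
    reference_sg       -- reference value (1.020 is a common choice)
    """
    return concentration_mg_l * (reference_sg - 1.0) / (sample_sg - 1.0)

# A dilute sample (SG 1.010) is scaled up; a concentrated one (SG 1.030) down.
print(sg_adjusted(0.50, 1.010))  # 1.0 mg/L
print(sg_adjusted(0.50, 1.030))  # ~0.33 mg/L
</pre>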
<p>When the children reached the age of 36 months, their mothers completed the Preschool Child Behavior Checklist (CBCL), a widely used parent-reported measure of child neurobehavior. The CBCL includes 99 items that assess a range of behavioral and emotional problems, such as emotionally reactive, anxious-depressed, somatic complaints, withdrawn, sleep problems, attention problems, and aggressive behavior.</p>
<p>The checklist also includes scales consistent with the Diagnostic and Statistical Manual of Mental Disorders (DSM-5) categories, including depressive problems, anxiety problems, oppositional defiant problems, autism spectrum problems, and attention-deficit/hyperactivity disorder problems.</p>
<p>The study found that the median urinary fluoride concentration among the mothers was 0.76 milligrams per liter. A key finding was that a 0.68 milligram per liter increase in maternal urinary fluoride levels during pregnancy was associated with nearly double the odds of the child scoring in the borderline clinical or clinical range for total neurobehavioral problems.</p>
<p>Specifically, this increase in fluoride exposure was linked to a 2.29-point increase in internalizing problems, such as emotional reactivity, anxiety, and somatic complaints, and a 2.14-point increase in total neurobehavioral problems.</p>
<p>In addition to the overall increase in neurobehavioral problems, higher fluoride exposure was also associated with specific behavioral issues. For instance, a 0.68 milligram per liter increase in fluoride was linked to a 13.54% increase in scores for emotionally reactive behaviors and a 19.60% increase in somatic complaints. Furthermore, there were significant associations with DSM-5-oriented scales, including an 11.29% increase in anxiety problems and an 18.53% increase in autism spectrum problems.</p>
<p>“Women with higher fluoride exposure levels in their bodies during pregnancy tended to rate their 3-year-old children higher on overall neurobehavioral problems and internalizing symptoms, including emotional reactivity, anxiety and somatic complaints,” said Tracy Bastain, an associate professor at the University of Southern California and senior author of the study.</p>
<p>The study did not find significant associations between fluoride exposure and externalizing problems, such as aggressive behavior and attention problems. Additionally, the researchers did not observe any interaction between fluoride exposure and the child’s sex, indicating that the associations were consistent across both boys and girls.</p>
<p>The findings from this study suggest that prenatal fluoride exposure, even at levels considered optimal for preventing dental cavities, may be associated with an increased risk of neurobehavioral problems in children. The researchers emphasized that the fluoride levels found in the study participants’ samples are typical for people living in communities with fluoridated water.</p>
<p>Variations in a person’s fluoride exposure can be attributed to differences in dietary habits, such as using tap water for drinking and cooking instead of filtered water, and consuming foods and beverages naturally high in fluoride. These include green and black tea, certain seafoods, and foods treated with fluoride-containing pesticides.</p>
<p>There are currently no formal guidelines for limiting fluoride intake during pregnancy. Given the widespread use of fluoridated water, these results highlight the need for further research to confirm these findings and to better understand the potential risks of fluoride exposure.</p>
<p>“I think this is important evidence, given that it’s the first U.S.-based study and findings are quite consistent with the other studies published in North America with comparable fluoride exposure levels,” Malin said. “Conducting a nationwide U.S. study on this topic would be important, but I think the findings of the current study and recent studies from Canada and Mexico suggest that there is a real concern here.”</p>
<p>The study, “<a href="https://jamanetwork.com/journals/jamanetworkopen/fullarticle/2818858" target="_blank" rel="noopener">Maternal Urinary Fluoride and Child Neurobehavior at Age 36 Months</a>,” was authored by Ashley J. Malin, Sandrah P. Eckel, Howard Hu, E. Angeles Martinez-Mier, Ixel Hernandez-Castro, Tingyu Yang, Shohreh F. Farzan, Rima Habre, Carrie V. Breton, and Theresa M. Bastain.</p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/scientists-reveal-startling-impact-of-junk-food-on-the-brains-reward-center/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Scientists reveal startling impact of junk food on the brain’s reward center</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Jul 21st 2024, 12:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p>A recent study published in the journal <em><a href="https://doi.org/10.1016/j.neuropharm.2023.109772" target="_blank" rel="noopener">Neuropharmacology</a></em> sheds light on how consuming high-calorie, sugary, and fatty “junk-food” diets affects brain function and behavior. Researchers found that these diets not only alter neural pathways but also influence food-seeking behaviors, particularly in rats prone to obesity. This discovery could have significant implications for understanding obesity and developing strategies to combat it.</p>
<p>With obesity rates climbing worldwide, it is crucial to understand how calorie-dense diets impact brain function and behavior. Previous research has shown that such diets can alter the function of brain reward centers, especially the nucleus accumbens. The nucleus accumbens is a key brain region involved in the reward circuitry, playing a pivotal role in processing pleasurable stimuli and reinforcing behaviors. It is particularly important in the release of dopamine, which influences motivation, pleasure, and reward-seeking behaviors.</p>
<p>However, little is known about how diet-induced changes in the nucleus accumbens differ between individuals who are prone to obesity and those who are resistant. This study aimed to explore these differences and understand how junk-food consumption and subsequent deprivation impact food-seeking behavior and neural plasticity.</p>
<p>The study was conducted using male rats selectively bred to be either obesity-prone or obesity-resistant. These rats were divided into three groups: those fed standard lab chow, those given free access to a specially prepared junk-food diet, and those fed junk food followed by a period of deprivation where they only had access to standard lab chow. The junk-food diet consisted of a mash made from Ruffles potato chips, Chips Ahoy cookies, Nesquik, Jif peanut butter, and standard lab chow, designed to mimic a high-calorie, high-fat human diet.</p>
<p>Behavioral experiments included Pavlovian conditioning, instrumental training, and testing to evaluate food-seeking and motivation. In Pavlovian conditioning, rats learned to associate a specific cue with the delivery of food pellets. Instrumental training involved pressing a lever to obtain food pellets, with the researchers measuring how many times the rats pressed the lever to assess their motivation to seek food. Additionally, free consumption tests were conducted to measure how much food the rats consumed when given free access to pellets, both under normal conditions and after a period of food restriction.</p>
<p>To examine changes in brain function, the researchers conducted <em>ex vivo</em> electrophysiological studies focusing on CP-AMPAR transmission in the nucleus accumbens. CP-AMPAR transmission involves the activity of calcium-permeable AMPA receptors, which enhance synaptic responses to the neurotransmitter glutamate. These receptors play a key role in synaptic plasticity, influencing learning, memory, and reward-related behaviors.</p>
<p>The study revealed distinct behavioral and neural changes induced by the junk-food diet, particularly in obesity-prone rats. In the behavioral experiments, all rats demonstrated a similar motivation to work for the presentation of a food cue during conditioned reinforcement tests.</p>
<p>However, differences emerged during instrumental responding tests. Obesity-prone rats fed junk food exhibited reduced lever pressing compared to those fed standard chow, indicating a lower motivation to seek food when it was freely available.</p>
<p>But when junk food was followed by a period of deprivation, these obesity-prone rats showed increased lever pressing and food-seeking behaviors, suggesting that the deprivation period heightened their motivation to seek food.</p>
<p>In contrast, obesity-resistant rats did not show significant changes in food-seeking behaviors following junk-food deprivation, highlighting a key difference between the two groups. The free consumption tests further supported these findings, as obesity-prone rats that experienced junk-food deprivation consumed more food pellets after a period of food restriction compared to those consistently fed junk food or standard chow.</p>
<p>The electrophysiological studies provided insights into the neural mechanisms underlying these behavioral changes. The researchers found increased CP-AMPAR transmission in the nucleus accumbens of obesity-prone rats following junk-food deprivation, but not in obesity-resistant rats.</p>
<p>This effect was specific to inputs from the medial prefrontal cortex (mPFC), not the basolateral amygdala (BLA). Additionally, reducing activity in inputs from the mPFC to the nucleus accumbens (NAc) through pharmacological inhibition or optogenetic techniques was sufficient to recruit CP-AMPARs in the NAc of obesity-prone rats.</p>
<p>These findings suggest that a history of junk-food consumption and subsequent deprivation can lead to significant neural and behavioral changes, particularly in individuals prone to obesity. The study highlights the importance of understanding how diet-induced plasticity in brain reward pathways contributes to obesity and suggests potential targets for interventions aimed at mitigating the effects of obesogenic diets.</p>
<p>“These data provide further evidence that interactions between predisposition and diet-induced neurobehavioral plasticity likely contribute to weight gain and the maintenance of obesity,” the researchers concluded. “In light of modern diet culture, these data also emphasize the importance of understanding lasting changes that occur after stopping a sugary, fatty diet and set the stage for future studies linking these synaptic changes to behavioral outcomes.”</p>
<p>“Finally, data here demonstrate for the first time that reducing excitatory transmission can recruit synaptic CP-AMPARs in adult brain slices and the NAc. Thus, these data reveal novel insights into the mechanisms underlying CP-AMPAR recruitment in the NAc that likely involve synaptic scaling mechanisms. This has important implications for both cue-triggered food- and potentially drug-seeking behaviors.”</p>
<p>The study, “<a href="https://www.sciencedirect.com/science/article/pii/S0028390823003623" target="_blank" rel="noopener">Effects of junk-food on food-motivated behavior and nucleus accumbens glutamate plasticity; insights into the mechanism of calcium-permeable AMPA receptor recruitment</a>,” was authored by Tracy L. Fetterly, Amanda M. Catalfio, and Carrie R. Ferrario.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/caffeine-exacerbates-brain-changes-caused-by-sleep-loss-study-suggests/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Caffeine exacerbates brain changes caused by sleep loss, study suggests</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Jul 21st 2024, 10:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>A recent study has found that caffeine might exacerbate the negative effects of chronic sleep restriction on the brain’s grey matter. The researchers discovered that people who consumed caffeine during a period of sleep restriction showed more significant reductions in grey matter volume compared to those who did not consume caffeine. The findings were recently published in <em><a href="https://www.nature.com/articles/s41598-024-61421-8" target="_blank" rel="noopener">Scientific Reports</a></em>.</p>
<p>Caffeine is the most widely used psychoactive substance globally, renowned for its ability to improve alertness and alleviate cognitive impairments caused by lack of sleep. However, both acute sleep loss and daily caffeine intake have been associated with reductions in gray matter volume, a key component of the brain involved in processing information and regulating various cognitive functions.</p>
<p>By examining the interaction between chronic sleep restriction and daily caffeine intake, the researchers aimed to uncover whether caffeine consumption during periods of sleep deprivation would lead to further reductions in gray matter volume. Additionally, the study sought to explore the role of the adenosine system, particularly the availability of adenosine A1 receptors, in mediating the brain’s response to caffeine and sleep loss.</p>
<p>Adenosine A1 receptors are a type of receptor in the brain that plays a crucial role in regulating neural activity and promoting sleep. They are part of the adenosine system, which helps to balance energy consumption and maintain homeostasis. Caffeine acts as an antagonist to these receptors, blocking their action and thereby enhancing alertness and counteracting sleepiness.</p>
<p>The research was conducted, in part, by first author Yu-Shiuan Lin of the University Psychiatric Clinics Basel in Switzerland and senior author David Elmenhorst of the Institute of Neuroscience and Medicine at Forschungszentrum Jülich in Germany.</p>
<p>“The conception of this work was inspired by the findings in Dr. Yu-Shiuan Lin’s <a href="https://academic.oup.com/cercor/article/31/6/3096/6135013" target="_blank" rel="noopener">previous study</a> at the <a href="https://www.chronobiology.ch/" target="_blank" rel="noopener">Centre for Chronobiology</a>. There we observed a caffeine concentration-dependent decrease in gray matter after a controlled 10-day caffeine intake, which was independent of the caffeine-induced vasoconstriction and was only partially mitigated after 36 hours,” the two researchers told PsyPost in a joint statement.</p>
<p>“This effect of caffeine on gray matter plasticity added up to the abundant animal evidence that demonstrates the involvement of adenosine receptors, the primary property caffeine binds to and exerts effects through, in modulating the synaptic plasticity in central nervous systems. To understand a potential role of adenosine in gray matter plasticity, Yu-Shiuan reached out to Professor David Elmenhorst at the Forschungszentrum Jülich for this collaborative project.”</p>
<p>“At the time, David was conducting a PET-MRI study together with the German Aerospace Center to investigate the effects of chronic sleep restriction and daily caffeine intake on adenosine A1 receptor (A1R) availability. Besides the A1R measurement <em>in vivo</em>, this study was exhilarating as it could potentially provide insights to the adenosinergic modulation in brain plasticity through the interference with sleep.”</p>
<p>“The metabolism and signaling of adenosine play a crucial role in the homeostatic mechanism of sleep,” Lin and Elmenhorst explained. “Disrupted sleep can in turn alter adenosine signaling and has been frequently found to impair brain structures in both animal models and humans. Insufficient sleep also consolidates the caffeine intake behaviors, and both are, often concurrently, prevalent in the modern society.</p>
<p>“Hence, we took the opportunity and repurposed the PET and T1-weighted MR images in addition to the data of Arterial Spin Labeling to account for the caffeine effects on brain perfusion, and examined the gray matter plasticity after chronic sleep restriction with and without combining use of caffeine, as well as the potential mediation of A1R in these gray matter responses.”</p>
<p>The study was conducted at the German Aerospace Center’s research facility in Cologne with 36 healthy adult participants (15 females and 21 males) with an average age of around 29 years. These participants were selected based on their low habitual caffeine intake (less than 450 mg per day) and non-smoking status. The participants were divided into two groups: one group received caffeine-containing coffee (the CAFF group), while the other received decaffeinated coffee (the DECAF group).</p>
<p>The experiment spanned nine days in a controlled laboratory setting. It started with an adaptation day, followed by two baseline days where participants had 8 hours of sleep per night. This was followed by five days of chronic sleep restriction, with participants limited to 5 hours of sleep per night, and concluded with a recovery day of 8 hours of sleep. During the chronic sleep restriction phase, the CAFF group received 200 mg of caffeine in the morning and 100 mg in the afternoon, while the DECAF group received equivalent volumes of decaffeinated coffee.</p>
<p>To measure the effects on gray matter volume, participants underwent magnetic resonance imaging (MRI) scans and positron emission tomography (PET) scans at three points: after the baseline days, after the chronic sleep restriction phase, and after the recovery day. Saliva samples were collected regularly to monitor caffeine levels, ensuring accurate tracking of caffeine intake and its physiological effects.</p>
<p>The results of the study indicated that chronic sleep restriction led to changes in gray matter volume, which were significantly influenced by caffeine intake. Participants who did not consume caffeine (DECAF group) during the sleep restriction phase showed an increase in gray matter volume in several brain regions, including the prefrontal cortex, temporal-occipital cortex, and thalamus. These regions are associated with various cognitive and sensory functions, indicating a potential compensatory response to sleep loss.</p>
<p>“It was somewhat surprising to us to observe an increase, instead of a decrease in gray matter in the participants without caffeine (DECAF group) after chronic sleep restriction,” Lin and Elmenhorst told PsyPost. “However, an earlier study (<a href="https://www.frontiersin.org/journals/psychiatry/articles/10.3389/fpsyt.2018.00266/full" target="_blank" rel="noopener">Dai et al. (2018)</a>) shed light on a potential explanation to this finding. This study examined the changes in gray matter along the course between 20 and 36 hours of wakefulness, and they found that multiple brain regions in fact started from showing an increase in the early stage (20h) and turned to be a reduction later (36h).”</p>
<p>“Although the impact of a total sleep deprivation could not be generalized to chronic sleep restriction, we speculated that the gray matter responses to an increasing duration and/or intensity of sleep loss may not follow a linear trajectory. More studies are certainly warranted to systematically examine the gray matter changes in different patterns of sleep restrictions.”</p>
<p>In contrast, participants who consumed caffeine (CAFF group) during the sleep restriction phase exhibited a decrease in gray matter volume in these same regions. This suggests that caffeine might inhibit the brain’s compensatory mechanisms during periods of insufficient sleep, potentially exacerbating the negative impact of sleep loss on brain structure.</p>
<p>The researchers also found that individual differences in adenosine receptor availability played a significant role in the extent of gray matter changes. The participants with lower baseline availability of subcortical adenosine receptors experienced greater reductions in gray matter volume when they consumed caffeine during sleep restriction. This finding highlights the importance of adenosine receptor activity in mediating the effects of sleep deprivation and caffeine on brain structure.</p>
<p>“People who have a higher A1R availability seem to have more resistance to the effect of caffeine on gray matter,” the researchers explained. “After a recovery sleep and around 30-hour caffeine withdrawal, most of the changes in gray matter have recovered, except the increased dorsolateral prefrontal cortex associated with chronic sleep restriction and the decreased thalamus associated with caffeine intake.”</p>
<p>“It is commonly known that caffeine intake combats sleepiness. Our data further indicate that caffeine intake also interferes with the brain plasticity induced by sleep loss. However, caffeine does not simply suppress or normalize the gray matter change but also impacts gray matter in an opposite direction. It is unclear how the effect of this brain plasticity manifests on the cognitive behavioral levels; what we know is that it is likely demonstrating the adenosine modulation in neural homeostasis.”</p>
<p>Despite its rigorous methodology, the study has some limitations to consider. The sample size was relatively small, and participants were selected based on specific genetic profiles related to caffeine metabolism, which might limit how well the findings apply to the broader population.</p>
<p>Additionally, while MRI scans showed changes in grey matter, they can’t tell us exactly what caused these changes. It could be a gain or loss of neurons, changes in synapse density, or variations in the number of support cells like microglia. To pinpoint these specific changes, future studies could use PET scans with special markers to measure synapses, mitochondria, or microglia, the researchers explained.</p>
<p>“The synaptic homeostasis hypothesis (SHY) <a href="https://doi.org/10.1016/j.neuron.2013.12.025" target="_blank" rel="noopener">states that</a> ‘sleep is the price we pay for brain plasticity,'” the researchers said. “A decade has passed since SHY was published, and this fascinating hypothesis is still yet to be tested in humans <em>in vivo</em>. David therefore has been dedicated to studying the molecular mechanism of sleep-wake regulations using biomedical imaging. On the other hand, Yu-Shiuan continues focusing on pharmacological PET-MR imaging to investigate the adenosinergic modulation and its role in the effects of caffeine. We are hoping that future outcomes along these lines will provide insights to the findings in this current work.”</p>
<p>The study, “<a href="https://doi.org/10.1038/s41598-024-61421-8" target="_blank" rel="noopener">Repeated caffeine intake suppresses cerebral grey matter responses to chronic sleep restriction in an A1 adenosine receptor-dependent manner: a double-blind randomized controlled study with PET-MRI</a>,” was authored by Yu-Shiuan Lin, Denise Lange, Diego Manuel Baur, Anna Foerges, Congying Chu, Changhong Li, Eva-Maria Elmenhorst, Bernd Neumaier, Andreas Bauer, Daniel Aeschbach, Hans-Peter Landolt, and David Elmenhorst.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/more-neurotic-and-conscientious-individuals-tend-to-feel-stronger-attachment-to-pets/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">More neurotic and conscientious individuals tend to feel stronger attachment to pets</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Jul 21st 2024, 08:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>A recent study conducted in the United Kingdom has found that individuals with pronounced traits of neuroticism and conscientiousness tend to report stronger attachments to their pets. Additionally, females, dog owners, people over 50 years of age, and those caring for children under 18 also tend to report stronger bonds with their pets. The study was published in <a href="https://doi.org/10.3389/fpsyt.2024.1406590"><em>Frontiers in Psychiatry</em></a>.</p>
<p>Pet ownership is a widespread phenomenon in modern society, with estimates suggesting that over 500 million pets reside in homes worldwide. People keep pets for various reasons, including companionship, emotional support, and the joy they bring to daily life.</p>
<p>Pets can provide a sense of purpose and reduce feelings of loneliness, contributing positively to mental health. Additionally, caring for a pet encourages physical activity and fosters social interactions with other pet owners. Pets also offer unconditional love and loyalty, creating strong emotional bonds with their owners.</p>
<p>The emotional bond between pets and their owners can significantly impact the owner’s well-being and overall health. While some studies have found that having a pet is associated with better health outcomes, other studies have not supported this conclusion. Some researchers have proposed that the health effects of pet ownership might depend on the strength of the emotional bond between the owner and the pet, which could be influenced by the owner’s personality.</p>
<p>Study authors Deborah L. Wells and Kathryn R. Treacy wanted to explore whether the strength of the emotional bond with the pet depends on the owner’s personality. They decided to explore the link with both the classic, Big Five personality traits, and with the Dark Triad traits, a set of three personality characteristics associated with manipulative, callous, and socially malevolent behaviors.</p>
<p>The researchers recruited adult dog and cat owners from across the globe via advertisements placed on social media platforms, collecting valid responses from 938 participants. Of these, 85% were women and 76% were married; 30% came from the U.K. and Ireland, and another 47-48% were from the Americas and the rest of Europe.</p>
<p>The survey contained assessments of the Big Five personality traits (the Big Five Personality Scale-Short), Dark Triad personality traits (the Short Dark Triad), and of the strength of attachment to the pet (the Lexington Attachment to Pets Scale). It also asked some demographic questions.</p>
<p>The researchers found that individuals with higher levels of neuroticism and conscientiousness reported stronger bonds with their pets. Neuroticism is characterized by emotional instability, anxiety, and moodiness. Conscientiousness involves being diligent, careful, and organized, with a strong sense of responsibility and reliability.</p>
<p>Interestingly, the study also found a weak link between Machiavellianism and stronger attachment to pets. Machiavellianism refers to a manipulative and deceitful personality style, marked by a cynical view of human nature and a focus on personal gain.</p>
<p>In addition to personality traits, demographic factors also influenced attachment levels. Women, dog owners, people over 50, and those caring for children under 18 reported stronger bonds with their pets. This aligns with previous research suggesting that women and dog owners tend to form stronger attachments to their pets, possibly due to higher levels of empathy and the social nature of dogs.</p>
<p>“Overall, this study points to a relationship between strength of attachment to one’s pet and owner personality, at least as assessed using the Big Five approach to personality measurement. There was little to support the idea that the Dark Triad traits were associated with strength of attachment to one’s pet, although the link between these characteristics and attachment styles is still unknown. There are clearly important links between human-animal attachment and mental health outcomes, both for people and their pets,” the study authors concluded.</p>
<p>The study sheds light on the links between emotional bonds with pets and personality. However, it also has limitations that need to be considered. The online recruitment method likely attracted individuals who are strongly attached to their pets, which may not represent the general population. Future studies should aim to include a more diverse and representative sample to validate these findings.</p>
<p>Additionally, the study’s focus on self-reported data may introduce bias, as participants might respond in a socially desirable manner. Future research could incorporate more objective measures of attachment, such as observing interactions between owners and pets or measuring physiological responses.</p>
<p>The paper, “<a href="https://doi.org/10.3389/fpsyt.2024.1406590">Pet attachment and owner personality,</a>” was authored by Deborah L. Wells and Kathryn R. Treacy.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/new-research-uncovers-brain-hierarchies-in-music-perception/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">New research uncovers brain hierarchies in music perception</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Jul 21st 2024, 06:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>Ever heard a snippet of a song and instantly known what comes next? Or picked up the rhythm of a chorus after just a few notes? New research from the Center for Music in the Brain at Aarhus University and the Centre for Eudaimonia and Human Flourishing at the University of Oxford has found that our brains process music through a specific hierarchical activation of several regions. The findings, published in <em><a href="https://doi.org/10.1038/s41467-024-48302-4" target="_blank" rel="noopener">Nature Communications</a></em>, provide new insights into the neural mechanisms underlying our ability to anticipate and identify familiar melodies.</p>
<p>While previous research has established the hierarchical organization of auditory perception, it has mostly focused on elementary auditory stimuli and automatic predictive processes. However, much less is known about how this information integrates with complex cognitive functions, such as consciously recognizing and predicting sequences over time. By investigating these mechanisms, the researchers aimed to uncover new insights into how our brains handle complex auditory tasks.</p>
<p>“My interest in this topic began during my multidisciplinary education. As a child, I was passionate about both science and football, but I eventually dedicated myself to studying classical guitar in depth. Between the ages of 18 and 22, I performed in several concerts and taught guitar. However, I realized that my childhood passion for science was calling me back,” said study author Leonardo Bonetti (<a href="https://x.com/leonardobo92?lang=en">@LeonardoBo92</a>), an associate professor at Aarhus University and the University of Oxford.</p>
<p>“I transitioned first to studying psychology and then moved into neuroscience, with a particular interest for analytical methods. During my studies, I discovered that music could serve as a powerful tool to explore certain features of the brain that are challenging to understand with non-musical stimuli. This is because music consists of a series of hierarchical sounds arranged over time, making it an excellent means to investigate how the brain processes information consciously over periods.”</p>
<p>The study involved 83 participants between the ages of 19 and 63, all of whom had normal hearing and were predominantly university-educated. Participants were first introduced to a short musical piece, specifically the first four bars of Johann Sebastian Bach’s Prelude No. 2 in C Minor, BWV 847. They listened to this piece twice and were asked to memorize it.</p>
<p>Following this memorization phase, the participants were subjected to an auditory recognition task while their brain activity was recorded using magnetoencephalography (MEG). MEG is a non-invasive imaging technique that captures the magnetic fields produced by neural activity, providing precise temporal and spatial resolution.</p>
<p>The recognition task consisted of 135 five-tone musical sequences, some of which were identical to the original piece while others were systematically varied. These variations were introduced at different points in the sequence to observe how the brain responds to changes in familiar patterns.</p>
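<p>To picture the stimulus design, imagine keeping the opening tones of the memorized sequence and altering what follows. The sketch below is a hypothetical reconstruction of such "systematic variation," not the authors' actual generation rule; the tones and scale are invented.</p>
<pre><code># Hypothetical sketch of how five-tone variants could be generated by
# keeping the opening of a memorized sequence and altering the rest.
# Not the authors' exact rule; the tones are invented.
import random

ORIGINAL = ["C4", "Eb4", "G4", "C5", "G4"]        # made-up five-tone sequence
SCALE = ["C4", "D4", "Eb4", "F4", "G4", "Ab4", "Bb4", "C5"]

def vary_from(sequence, position, rng=random):
    """Keep tones before `position`; replace each later tone with a different scale tone."""
    variant = list(sequence[:position])
    for tone in sequence[position:]:
        variant.append(rng.choice([t for t in SCALE if t != tone]))
    return variant

print(vary_from(ORIGINAL, position=2))  # deviates from the third tone onward
</code></pre>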
<p>Bonetti and his colleagues found that when participants recognized the original memorized sequences, their brain activity followed a specific hierarchical pattern. This pattern began in the auditory cortex, the region responsible for processing basic sound information, and progressed to the hippocampus and cingulate gyrus, areas associated with memory and cognitive evaluation.</p>
<p>When variations were introduced into the sequences, the brain generated prediction errors. These errors started in the auditory cortex and then spread to the hippocampus, anterior cingulate gyrus, and ventromedial prefrontal cortex. Notably, the anterior cingulate gyrus and ventromedial prefrontal cortex exhibited their strongest responses when the variations were introduced.</p>
<p>The study also uncovered a consistent brain hierarchy characterized by feedforward and feedback connections. Feedforward connections from the auditory cortices to the hippocampus and cingulate gyrus, along with simultaneous feedback connections in the opposite direction, were observed.</p>
<p>This hierarchical organization was consistent for both previously memorized and varied sequences, although the strength and timing of the brain responses varied. This suggests that while the overall structure of brain processing remains stable, the dynamics change depending on whether the sequence is familiar or novel.</p>
<p>“Our study shows that the brain processes music (and information over time) by activating several brain regions in a specific, hierarchical order,” Bonetti told PsyPost. “Initially, sensory regions like the auditory cortex handle basic sound features. Then, this information is passed to a larger network of regions that arguably analyze the sounds more deeply, including the relationships between them (such as musical intervals). This process helps the brain determine if the sequence of sounds is familiar or new.”</p>
<p>“This study not only explains how we perceive music but also provides insights into how the brain processes and recognizes information over time. On a practical level, future research could focus on studying this phenomenon in aging, both healthy and pathological (like dementia). By using music, advanced neuroscientific tools, and analytical methods, we might gain further understanding of dementia and memory disorders.”</p>
<p>Bonetti said the long-term goals of this research are to develop dementia screening tools based on brain responses to music and to enhance data collection methods by integrating MEG with intracranial recordings for a more comprehensive understanding of music memory mechanisms.</p>
<p>“By studying aging and dementia over time, I aim to develop screening tools based on brain responses during music recognition,” he explained. “These tools could predict the risk of older adults developing dementia.”</p>
<p>“Second, I want to expand our data collection methods. Currently, we use magnetoencephalography (MEG), which is a great non-invasive tool but lacks the ability to focus deeply within the brain. In the future, I plan to integrate MEG with intracranial recordings from electrodes implanted in epileptic patients. This combination will help us understand the brain mechanisms involved in music memory across a wider range of time and spatial scales.”</p>
<p>“I wish to thank very much the several foundations which are supporting our work, in particular Lundbeck Foundation, Carlsberg Foundation, the Danish National Research Foundation and the Linacre College of the University of Oxford,” Bonetti added.</p>
<p>The study, “<a href="https://www.nature.com/articles/s41467-024-48302-4" target="_blank" rel="noopener">Spatiotemporal brain hierarchies of auditory memory recognition and predictive coding</a>,” was authored by L. Bonetti, G. Fernández-Rubio, F. Carlomagno, M. Dietz, D. Pantazis, P. Vuust, and M. L. Kringelbach.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/adhd-and-autism-new-insights-into-their-unique-neural-profiles/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">ADHD and autism: New insights into their unique neural profiles</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Jul 20th 2024, 14:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>Attention deficit hyperactivity disorder (ADHD) and autism spectrum disorder (ASD) are two of the most prevalent neurodevelopmental disorders. Despite their distinct diagnostic criteria, there is a notable clinical and genetic overlap between the two disorders.</p>
<p>A new meta-analysis, published in <a href="https://doi.org/10.1176/appi.ajp.20230270"><em>The American Journal of Psychiatry</em></a>, sought to investigate the neural correlates underlying these overlaps and distinctions by examining 243 task-based functional MRI (fMRI) studies. The findings reveal that while ADHD and ASD share some brain activity patterns, the unique differences in brain function for each disorder are much more significant. This suggests that ADHD and ASD should be considered distinct conditions, as their brain activity patterns are more different than similar.</p>
<p>The motivation for this study stems from the observed clinical and genetic overlap between ADHD and ASD. ADHD is characterized by persistent patterns of inattention, hyperactivity, and impulsivity that interfere with daily functioning. ASD, on the other hand, is marked by difficulties in social communication and interaction, along with restricted interests and repetitive behaviors.</p>
<p>Despite their distinct diagnostic criteria, individuals with ADHD often exhibit symptoms typically associated with ASD and vice versa. Additionally, genetic studies have revealed shared genetic factors between the two disorders, further blurring the lines between them.</p>
<p>Previous research has attempted to understand these overlaps by using task-based functional MRI (fMRI) studies to identify the neural correlates of ADHD and ASD symptoms. However, these studies often used specific tasks designed for each disorder, which could introduce bias and limit the generalizability of the findings. The researchers wanted to overcome these limitations and gain a clearer picture of the neural mechanisms underlying ADHD and ASD by conducting a meta-analysis.</p>
<p>A meta-analysis is a statistical technique that combines the results of multiple scientific studies to derive a more comprehensive understanding of a particular research question. This method allows researchers to pool data from various individual studies, enhancing the overall statistical power and reliability of the findings. By aggregating data, a meta-analysis can identify patterns, trends, and effects that might not be apparent in individual studies due to limited sample sizes or varying methodologies.</p>
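<p>The pooling step can be made concrete with the textbook fixed-effect approach: an inverse-variance weighted average, in which more precise studies count for more. Note that this is only a minimal sketch of the general principle; the present study used a coordinate-based fMRI meta-analysis, not this simple model, and the numbers below are invented.</p>
<pre><code># Minimal sketch of the idea behind meta-analytic pooling: a fixed-effect,
# inverse-variance weighted average of per-study effect sizes. Precise
# studies (small variance) get more weight. Illustration only -- the study
# itself used a coordinate-based fMRI meta-analysis; numbers are invented.

def pooled_effect(effects, variances):
    weights = [1.0 / v for v in variances]
    estimate = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    return estimate, 1.0 / sum(weights)  # pooled estimate and its variance

effects = [0.30, 0.45, 0.20]      # three hypothetical study effect sizes
variances = [0.04, 0.01, 0.09]    # their sampling variances
print(pooled_effect(effects, variances))
</code></pre>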
<p>The meta-analysis included data from 243 original task-based fMRI studies that involved either individuals with ADHD, individuals with ASD, or both, alongside typically developing controls. The studies were selected through a rigorous search of multiple databases, including PubMed and Web of Knowledge, and were screened based on strict inclusion and exclusion criteria.</p>
<p>The final sample consisted of 3,084 participants with ADHD, 2,654 participants with ASD, and 6,795 control subjects. The studies used a variety of neuropsychological tasks, such as go/no-go and n-back tasks for cognitive control, as well as tasks focusing on social processes, reward responsiveness, and attention.</p>
<p>The results highlighted the existence of both shared and disorder-specific neural activations in ADHD and ASD. Shared activations included greater activation in the right-lateralized lingual gyrus and rectal gyrus, as well as lower activation in the left middle frontal gyrus and superior temporal gyrus. These shared activations suggest some common neural pathways involved in the cognitive and behavioral symptoms of both disorders.</p>
<p>However, disorder-specific activations were more prominent. For ASD, greater-than-typical activations were observed in the left middle temporal gyrus, inferior parietal lobule, right hippocampus, and left putamen. Lower activations were noted in the left middle frontal gyrus, right middle temporal gyrus, left amygdala, and right hippocampus. These findings indicate that ASD is associated with specific neural dysfunctions in regions related to social processes, cognitive flexibility, and emotional processing.</p>
<p>For ADHD, greater-than-typical activations were found in the right insula, posterior cingulate cortex, right amygdala, and putamen. Lower activations were seen in the right middle temporal gyrus, left inferior frontal gyrus, right globus pallidus, and left thalamus. These results suggest that ADHD involves distinct neural abnormalities in areas related to attention, inhibition, and reward processing.</p>
<p>In <a href="https://psychiatryonline.org/doi/10.1176/appi.ajp.20240264" target="_blank" rel="noopener">an editorial</a> about the study, Philip Shaw, an Earl Stadtman Senior Investigator at the Neurobehavioral Clinical Research Section of the National Human Genome Research Institute, wrote that the findings highlight the need for more fMRI studies where individuals with ADHD and ASD perform the same tasks. By conducting such studies, researchers can obtain clearer and more consistent data on the unique and shared neural features of these conditions.</p>
<p>“As the authors stress, there are only a handful of fMRI studies that include individuals with ADHD and individuals with ASD performing the same task. Although these head-to-head studies have also found that diagnostic differences exceed similarities, the brain regions identified did not overlap with those emerging from the meta-analysis. To resolve this discrepancy, we need more fMRI studies where individuals with ADHD, ASD, or both diagnoses perform the same task,” Shaw explained.</p>
<p>“Tamon et al. found largely distinct neural landscapes in ADHD and ASD, suggesting we should split apart rather than lump together these conditions. A third option is to collect more data. Specifically, by collecting data from a common core set of tasks transdiagnostically, we could obtain the large data sets needed to capture fully the brain’s functional architecture in these complex neurodevelopmental conditions.”</p>
<p>The study, “<a href="https://psychiatryonline.org/doi/10.1176/appi.ajp.20230270" target="_blank" rel="noopener">Shared and Specific Neural Correlates of Attention Deficit Hyperactivity Disorder and Autism Spectrum Disorder: A Meta-Analysis of 243 Task-Based Functional MRI Studies</a>,” was authored by Hiroki Tamon, Junya Fujino, Takashi Itahashi, Lennart Frahm, Valeria Parlatini, Yuta Y. Aoki, Francisco Xavier Castellanos, Simon B. Eickhoff, and Samuele Cortese.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/brain-health-in-aging-intermittent-fasting-and-healthy-diets-show-promising-results/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Brain health in aging: Intermittent fasting and healthy diets show promising results</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Jul 20th 2024, 12:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>Researchers at Johns Hopkins Medicine and the National Institutes of Health’s National Institute on Aging have uncovered promising findings regarding the effects of intermittent fasting and a standard healthy diet on brain health in older adults with obesity and insulin resistance. Their study, published in <a href="https://doi.org/10.1016/j.cmet.2024.05.017"><em>Cell Metabolism</em></a>, found that both diets led to improvements in cognition, with intermittent fasting showing slightly stronger benefits.</p>
<p>As people live longer, the prevalence of conditions like Alzheimer’s and related dementias is expected to rise, posing significant challenges for individuals, families, and healthcare systems. Cognitive decline not only impacts quality of life but also leads to increased disability and loss of independence, creating a pressing demand for effective preventive strategies.</p>
<p>One key factor implicated in brain aging and the development of Alzheimer’s disease is insulin resistance. Insulin resistance, which is more common with advancing age and obesity, affects the body’s ability to regulate glucose and has been linked to cognitive impairment and neurodegenerative diseases. Given this connection, interventions that improve insulin sensitivity could potentially mitigate cognitive decline and promote brain health in older adults.</p>
<p>“There is a widespread impression both among scientists and in the general public that diets in general and intermittent fasting in particular are good for cognitive function and brain health and may mitigate the risk for Alzheimer’s disease; however, there has been very little data from clinical studies to support this notion. We sought to close this evidence gap by comprehensively assessing cognition and multiple brain health biomarkers in response to a 5:2 intermittent fasting and a healthy living diet,” said study author <a href="https://irp.nih.gov/pi/dimitrios-kapogiannis" target="_blank" rel="noopener">Dimitrios Kapogiannis</a>, a senior investigator and chief of the Human Neuroscience Section at the National Institute on Aging.</p>
<p>The researchers recruited 40 participants who were older adults with obesity and insulin resistance, a group at higher risk for accelerated brain aging and cognitive decline. These participants were randomly assigned to one of two dietary plans: the 5:2 intermittent fasting diet or the USDA-approved healthy living diet.</p>
<p>The intermittent fasting group followed a regimen in which they restricted their calorie intake to one-quarter of the recommended daily intake on two consecutive days each week, consuming only two shakes that together provided 480 calories on each of those days. On the remaining five days, they followed the healthy living diet. The healthy living group, on the other hand, adhered to the healthy living diet every day, which emphasized balanced meals including fruits, vegetables, whole grains, lean proteins, and low-fat dairy, while limiting added sugars, saturated fats, and sodium.</p>
<p>To monitor adherence and reinforce the dietary plans, participants attended in-person visits at weeks 2, 4, and 6 for anthropometric measurements and blood draws, and were contacted by phone or email on weeks 1, 3, 5, and 7. The final visit took place at week 8, with assessments conducted at the start and end of the study period.</p>
<p>The assessments included brain health measures, cognition tests, and systemic and peripheral metabolism measures. Neuron-derived extracellular vesicles were collected from the participants’ blood to analyze biomarkers related to brain cell activity and insulin signaling. Additionally, brain imaging and cognitive performance tests were conducted to gauge the impact of the diets on brain aging and function.</p>
<p>The findings of the study revealed that both the intermittent fasting and healthy living diets led to improvements in insulin resistance and cognitive function. Participants in both groups exhibited decreased insulin resistance, but the improvements were more pronounced in the intermittent fasting group. This was evidenced by significant reductions in specific biomarkers of insulin resistance found in the neuron-derived extracellular vesicles.</p>
<p>In terms of brain health, the study found that both diets contributed to slowing the pace of brain aging, particularly in brain regions critical for executive function, such as the anterior cingulate and prefrontal cortex. This was measured using brain-age-gap estimates derived from MRI scans, which indicate how much older or younger an individual’s brain appears relative to their chronological age. Both diets resulted in similar reductions in the brain-age-gap, suggesting beneficial effects on brain aging.</p>
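<p>The brain-age-gap arithmetic itself is simple: subtract chronological age from the model-predicted brain age. Below is a minimal sketch with invented values; the study's predicted ages came from a trained neuroimaging model, which is not reproduced here.</p>
<pre><code># Minimal sketch of the brain-age-gap (BAG) estimate: model-predicted brain
# age minus chronological age. Positive values mean the brain "looks" older
# than it is; a shrinking gap suggests slowed brain aging. Values invented.

def brain_age_gap(predicted_age, chronological_age):
    return predicted_age - chronological_age

baseline = brain_age_gap(predicted_age=68.2, chronological_age=65.0)  # +3.2 years
followup = brain_age_gap(predicted_age=67.3, chronological_age=65.2)  # +2.1 years
print(f"brain-age-gap changed by {followup - baseline:+.1f} years")   # -1.1
</code></pre>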
<p>“Both diets were good for overall health and brain health, but 5:2 intermittent fasting showed stronger effects for reversing insulin resistance, improving executive function, and optimizing brain metabolism than the healthy living diet,” Kapogiannis told PsyPost. “However, we did not find any evidence that these two diets change any Alzheimer’s-related biomarkers in the short term. Finally, sex and genetic factors, such as APOE, may modify responses to the diets. Therefore, which diet is best should be an individualized choice.”</p>
<p>Specifically, the intermittent fasting group showed significant improvements in tasks related to strategic planning and cognitive flexibility. They also exhibited greater enhancements in memory, particularly in long delay cued recall, compared to the healthy living group. Physical activity levels increased in the intermittent fasting group, with a decrease in sedentary behavior, whereas the healthy living group did not show significant changes in physical activity.</p>
<p>Interestingly, despite the overall positive outcomes, the study did not find significant changes in cerebrospinal fluid biomarkers associated with Alzheimer’s disease, such as amyloid-beta and tau proteins. This suggests that while the dietary interventions had clear benefits for insulin resistance and cognitive function, their impact on Alzheimer’s disease-specific biomarkers was limited.</p>
<p>“A couple things surprised us: The fact that a low intensity conventional intervention, such as the healthy living diet, was effective in improving brain health; almost as effective as a higher-intensity intervention such as 5:2 intermittent fasting for many outcomes,” Kapogiannis explained. “Also, the fact that cerebrospinal fluid biomarkers of Alzheimer’s disease did not show any improvements – however, the intervention lasted only 8 weeks, so the biomarkers might have improved with a longer intervention.”</p>
<p>While the study’s findings are promising, some limitations should be considered. The study duration was relatively short. Therefore, the long-term effects of the diets remain unknown. Additionally, the sample size was small, with only 20 participants in each diet group, which limits the ability to draw definitive conclusions about sub-groups based on sex or genetic factors.</p>
<p>“We can reasonably speculate about but not really know what the long-term effects of the diets are,” Kapogiannis noted. “Studying intermittent fasting for longer periods of time is essential. Also, combining diet with ketogenic supplements to see whether there are added benefits from driving brain ketones higher. Long-term, I think that the choice of diet for an individual should be decided along the principles of <a href="https://www.nih.gov/about-nih/what-we-do/nih-turning-discovery-into-health/promise-precision-medicine" target="_blank" rel="noopener">Precision Medicine</a>, based on sex, genetic factors and biomarkers.”</p>
<p>By employing a comprehensive and multimodal approach to assess the effects of dietary interventions on brain health, the study sets a methodological standard that future research can build upon. It highlights the potential of neuron-derived extracellular vesicles, magnetic resonance imaging, and magnetic resonance spectroscopy to offer detailed insights into how diets impact cognitive function and insulin resistance in older adults.</p>
<p>“I hope that this study offers a blueprint for future research to vigorously assess long-term effects of diet on brain health,” Kapogiannis said.</p>
<p>The study, “<a href="https://www.cell.com/cell-metabolism/abstract/S1550-4131(24)00225-0" target="_blank" rel="noopener">Brain responses to intermittent fasting and the healthy living diet in older adults</a>,” was authored by Dimitrios Kapogiannis, Apostolos Manolopoulos, Roger Mullins, Konstantinos Avgerinos, Francheska Delgado-Peraza, Maja Mustapic, Carlos Nogueras-Ortiz, Pamela J. Yao, Krishna A. Pucha, Janet Brooks, Qinghua Chen, Shalaila S. Haas, Ruiyang Ge, Lisa M. Hartnell, Mark R. Cookson, Josephine M. Egan, Sophia Frangou, and Mark P. Mattson.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/seeing-yourself-as-a-main-character-boosts-psychological-well-being-study-finds/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Seeing yourself as a main character boosts psychological well-being, study finds</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Jul 20th 2024, 10:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>A recent study published in the <a href="https://doi.org/10.1016/j.jrp.2024.104510"><em>Journal of Research in Personality</em></a> explores how people’s perceptions of their roles as major or minor characters in their life stories influence their psychological well-being. The researchers found that individuals who view themselves as major characters tend to have higher well-being and greater satisfaction of their basic psychological needs compared to those who see themselves as minor characters.</p>
<p>The study aimed to shed light on how autobiographical memories and narrative identities influence well-being. Previous research has shown that how people tell their life stories, including the emotions and themes they emphasize, can affect their mental health. This study, however, took a novel approach by asking participants to evaluate their role in their life stories, considering whether they see themselves as major characters driving their narrative or as minor characters observing from the background.</p>
<p>To examine this, the researchers conducted three studies with undergraduate students from a large Midwestern university.</p>
<p>Study 1 involved 358 students, who participated in exchange for course research credits. The average age of participants was 18.7 years, with the majority being female and Caucasian. Participants completed an online survey at two time points, four weeks apart.</p>
<p>Participants were asked to rate themselves on three items designed to measure the degree to which they felt like a major or minor character in their life stories. These items used a 1 to 5 scale, with different terminologies such as “minor character” versus “major character,” “side character” versus “primary character,” and “background character” versus “lead character.” The three ratings were averaged to create a single major character score for each participant at each time point. Reliability estimates for this measure were high.</p>
<p>Additionally, the survey measured well-being, combining scores of positive affect, negative affect, and life satisfaction into a single well-being score. Need satisfaction was assessed using a six-item scale covering autonomy, competence, and relatedness. Self-esteem and narcissism were also measured using validated scales.</p>
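<p>For readers who want the scoring rules concrete, here is a minimal sketch of how such composites can be computed, with invented responses. The averaging of the three 1-5 items follows the description above; the well-being composite shown (affect balance plus life satisfaction) is one plausible formula, and the paper's exact weighting may differ.</p>
<pre><code># Minimal sketch of the composite scores described above (invented responses).
# Major-character score: the mean of three 1-5 bipolar items, as described.
# Well-being: one plausible composite; the paper's exact scoring may differ.

def major_character_score(items):
    assert len(items) == 3 and all(1 <= x <= 5 for x in items)
    return sum(items) / len(items)

def well_being(positive_affect, negative_affect, life_satisfaction):
    # affect balance (positive minus negative) plus life satisfaction
    return (positive_affect - negative_affect) + life_satisfaction

print(major_character_score([4, 5, 4]))   # 4.33 -> leans "major character"
print(well_being(3.8, 2.1, 4.0))          # 5.7 on this illustrative composite
</code></pre>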
<p>The researchers found that participants who perceived themselves as major characters in their life stories reported higher levels of well-being and greater satisfaction of basic psychological needs (autonomy, competence, and relatedness). The longitudinal data revealed that feeling like a major character at the initial time point predicted higher well-being four weeks later, even when controlling for initial well-being levels.</p>
<p>Further analyses indicated that these effects were robust even when controlling for self-esteem and narcissism, suggesting that the major character construct contributes uniquely to well-being outcomes.</p>
<p>Study 2 involved 326 students, with a similar demographic profile as Study 1. Participants were randomly assigned to one of two conditions: recalling a time when they felt like a major character in their life story or a time when they felt like a minor character. Participants completed an initial survey, wrote about their assigned memory, and then completed the survey again.</p>
<p>The pre- and post-manipulation surveys included measures of need satisfaction and well-being. Need satisfaction was assessed using the Basic Psychological Need Satisfaction and Frustration Scale, which includes items for autonomy, competence, and relatedness. Well-being was measured as affect balance, calculated by subtracting negative affect scores from positive affect scores.</p>
<p>The results showed significant interaction effects between the condition (major vs. minor character) and the time of assessment (pre vs. post) on both need satisfaction and well-being. Participants who recalled times when they felt like major characters experienced significant increases in need satisfaction and well-being following the manipulation. In contrast, those who recalled times when they felt like minor characters showed significant decreases in these measures.</p>
<p>Study 3 included 298 undergraduate students. Participants first listed three current goals they were pursuing and rated their motivations for these goals. They then completed measures of need satisfaction, well-being, and major character perceptions. Finally, participants wrote a narrative describing themselves as characters in their life stories.</p>
<p>Goal motivations were assessed using an eight-item Perceived Locus of Causality (PLOC) measure, which included items for different types of motivational regulation, from external to intrinsic. Self-reported major character perceptions were measured using the same items as in Study 1. Narratives were coded for agency, defined as the degree to which individuals felt they could influence their lives and outcomes.</p>
<p>The researchers found that participants who viewed themselves as major characters were more likely to pursue goals that were personally meaningful and aligned with their values. These individuals showed higher levels of autonomous motivation (identified and intrinsic regulation) and lower levels of controlled motivation (external and introjected regulation).</p>
<p>Major character perceptions were positively associated with higher coded agency, and both major character perceptions and agency were significant predictors of need satisfaction and well-being. The final regression analysis showed that while both major character perceptions and agency initially predicted well-being, their effects were mediated by need satisfaction. This finding suggests that seeing oneself as a major character enhances well-being through the satisfaction of basic psychological needs.</p>
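<p>Mediation here means the effect of major character perceptions on well-being runs through need satisfaction. A minimal sketch of the product-of-coefficients logic on simulated data follows; it illustrates the general technique, not the authors' actual regression models.</p>
<pre><code># Minimal sketch of product-of-coefficients mediation on simulated data:
# the indirect effect is a*b, where a is predictor -> mediator and b is
# mediator -> outcome controlling for the predictor. Illustration only.
import numpy as np

rng = np.random.default_rng(0)
n = 300
major_character = rng.normal(size=n)                    # predictor
need_sat = 0.6 * major_character + rng.normal(size=n)   # mediator
well_being = 0.5 * need_sat + 0.1 * major_character + rng.normal(size=n)

a = np.polyfit(major_character, need_sat, 1)[0]          # a path (slope)
X = np.column_stack([np.ones(n), need_sat, major_character])
coefs, *_ = np.linalg.lstsq(X, well_being, rcond=None)   # outcome model
b, direct = coefs[1], coefs[2]                           # b path and direct effect

print(f"indirect effect a*b = {a * b:.2f}, direct effect = {direct:.2f}")
</code></pre>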
<p>“These results support our notion that the way in which an individual perceives themselves as a character in their life story is likely to impact their well-being. When people see themselves as being the agentic force in their lives and make decisions for themselves, as major characters do, rather than being swept about by external forces (and other people), they are more integrated and fully functioning selves,” the researchers explained.</p>
<p>“Such individuals feel more autonomous, more competent and effective, and also experience better relational satisfaction with others, as evidenced by their increased basic psychological need satisfaction. Conversely, those who see themselves as minor characters are more likely to feel thwarted in getting these needs satisfied, a condition associated with diminished self-integration and wellbeing.”</p>
<p>But it is important to note that the samples consisted of undergraduate students, which may limit the generalizability of the findings. Cultural context also plays a role; individualistic societies might emphasize the importance of being a major character more than collectivist cultures. Future research should explore these dynamics in more diverse and older populations.</p>
<p>“In conclusion, this research has identified a new meta-narrative construct that varies between individuals and has important implications for experiences of well-being,” the researchers wrote. “We hope this work represents a significant contribution to expanding approaches to narrative and autobiographical assessment, and suggest that this new perspective could be considered in future narrative identity research as a short supplemental measure, allowing narrative researchers to take into consideration the subjective viewpoint participants take on as they respond to narrative assessments.”</p>
<p>The study, “<a href="https://www.sciencedirect.com/science/article/abs/pii/S0092656624000588">The autobiographical critic within: Perceiving oneself as a major character in one’s life story predicts well-being</a>,” was authored by Ryan Goffredi and Kennon M. Sheldon.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/low-testosterone-and-high-neurofilament-protein-predict-cognitive-decline-in-older-men/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Low testosterone and high neurofilament protein predict cognitive decline in older men</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Jul 20th 2024, 08:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>A recent study by researchers at Fudan University in Shanghai found that lower levels of testosterone, combined with higher levels of a protein called neurofilament light chain, significantly increase the risk of cognitive decline in older men. The paper was published in the journal <i><a href="https://alz-journals.onlinelibrary.wiley.com/doi/10.1002/alz.13889">Alzheimer’s & Dementia</a></i>.</p>
<p>As people age, their cognitive abilities generally deteriorate. This deterioration can be subtle at first but may become quite pronounced in advanced age. However, cognitive decline does not affect everyone equally. Some individuals maintain good cognitive functioning well into their 70s, 80s, or even later years, while others experience a much faster decline.</p>
<p>When an individual experiences a severe and progressive deterioration of cognitive function that significantly interferes with daily life, it is referred to as dementia. Dementia encompasses various conditions caused by different neurological issues that impair the nervous system. The most common type is Alzheimer’s disease, characterized by a buildup of specific proteins in the brain that create plaques and tangles, progressively killing neurons in the affected areas.</p>
<p>Some forms of dementia can be prevented, or their progression slowed. This has led scientists to intensely research ways to predict who will develop dementia. Among the factors being investigated are sex hormones, which are believed to modulate the risk of cognitive decline. For example, a premature reduction in estrogen in women can indicate the potential development of dementia.</p>
<p>Another important marker of impending dementia is neurofilament light chain. This protein helps maintain the shape and structural integrity of nerve cells, acting as a cellular skeleton. Normally, neurofilament light chain molecules remain within neurons. However, if these proteins are found in elevated quantities in the blood, it indicates that neurons have been damaged, releasing these proteins into the bloodstream.</p>
<p>Study author Shuning Tang and his colleagues wanted to explore how precisely future cognitive decline in older men can be predicted from information about testosterone and neurofilament light chain levels. They hypothesized that lower testosterone levels would be associated with cognitive decline, and that predictions would be even more accurate when based on both testosterone and neurofilament light chain levels.</p>
<p>The researchers analyzed data from 581 older men participating in the Shanghai Aging Study. This longitudinal, community-based cohort study began in 2010 to investigate the prevalence, incidence, and risk factors of cognitive impairment among older Chinese adults. All participants were residents of the Jingansi community in downtown Shanghai, aged 60 or above, and dementia-free at the start of the study. On average, participants were followed for 6.7 years.</p>
<p>At the study’s outset, participants provided blood samples, which allowed researchers to measure testosterone and neurofilament light chain levels. Participants also completed a series of neuropsychological tests to assess their cognitive functioning. Between 2014 and 2023, participants were re-tested at least once, enabling researchers to compare their cognitive functioning over time and determine if dementia was developing.</p>
<p>The results showed that 45 participants developed cognitive decline during the study period. Compared to those who did not develop cognitive decline, these men were older, had fewer years of education, and were more likely to have a history of coronary heart disease, stroke, and hypertension. They also tended to have lower testosterone levels and higher neurofilament light chain levels in their blood.</p>
<p>By combining data on testosterone and neurofilament light chain levels, the researchers categorized participants into three risk groups: high, medium, and low. Participants in the high-risk group experienced cognitive decline 5-6 times more often than those in the low-risk group.</p>
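<p>The summary does not report the exact cutoff values used to form these groups, but the logic of a two-marker stratification is easy to illustrate. The following sketch is purely hypothetical: the column names, units, and median-split thresholds are placeholders, not the authors' criteria.</p>
<pre><code>
# Hypothetical two-marker risk stratification (illustration only; the
# study's actual cutoffs are not given in this summary).
import pandas as pd

df = pd.DataFrame({
    "testosterone_nmol_l": [9.2, 14.5, 18.1, 7.8, 21.0],
    "nfl_pg_ml": [55.0, 22.3, 18.7, 61.2, 15.4],  # neurofilament light chain
})

# Treat low testosterone and high NfL as one risk point each,
# using median splits as stand-in thresholds.
low_t = df["testosterone_nmol_l"] < df["testosterone_nmol_l"].median()
high_nfl = df["nfl_pg_ml"] > df["nfl_pg_ml"].median()

risk_points = low_t.astype(int) + high_nfl.astype(int)
df["risk_group"] = risk_points.map({0: "low", 1: "medium", 2: "high"})
print(df)
</code></pre>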
<p>“Our findings suggest that the combination of testosterone and neurodegenerative markers may provide reliable predictive insights into future cognitive decline,” the study authors concluded.</p>
<p>The study offers a new method to predict future cognitive decline in older men. However, there are several limitations. The dementia diagnosis in this study was based solely on cognitive performance measures without examining the specific types and causes of dementia. This approach does not differentiate between different forms of dementia, such as Alzheimer’s disease or vascular dementia.</p>
<p>Additionally, the study participants were all from urban areas with relatively high levels of education, which may limit the generalizability of the findings to other populations. Higher education is associated with a slower rate of cognitive decline, potentially influencing the results.</p>
<p>Future research should aim to replicate these findings in larger, more diverse populations and explore the mechanisms underlying the observed associations. Longitudinal studies with repeated measurements of testosterone and neurofilament light chain could provide more nuanced insights into how these factors interact over time to influence cognitive health. Understanding the specific biological pathways through which testosterone and neurofilament light chain affect cognition could lead to new preventive and therapeutic strategies for dementia.</p>
<p>The paper, “<a href="https://doi.org/10.1002/alz.13889">Joint effect of testosterone and neurofilament light chain on cognitive decline in men: The Shanghai Aging Study,</a>” was authored by Shuning Tang, Zhenxu Xiao, Fangting Lin, Xiaoniu Liang, Xiaoxi Ma, Jie Wu, Xiaowen Zhou, Qianhua Zhao, Junling Gao, Qianyi Xiao, Ding Ding.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/researchers-observe-a-surprising-moral-tendency-among-impulsive-psychopaths/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Researchers observe a surprising moral tendency among impulsive psychopaths</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Jul 20th 2024, 06:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>New research published in <em><a href="https://www.sciencedirect.com/science/article/pii/S2666497624000213" target="_blank" rel="noopener">Comprehensive Psychoneuroendocrinology</a></em> provides insight into moral decision-making processes in psychopathic individuals. Researchers found that clinical psychopaths with high impulsivity tend to make deontological choices in high-emotion scenarios, avoiding direct harm even at the expense of optimal outcomes.</p>
<p>Psychopathy is a complex personality disorder characterized by a range of emotional, interpersonal, and behavioral deficits. Individuals with this condition often display a profound lack of empathy, disregard for the rights and feelings of others, and a tendency toward manipulative and antisocial behaviors.</p>
<p>These traits make psychopathic individuals more prone to engaging in criminal activities and other forms of antisocial conduct. The prevalence of psychopathy in the general population is relatively low, but its impact is disproportionately high in forensic and clinical settings, particularly among violent offenders and those with repeated criminal behaviors.</p>
<p>The motivation behind the study stemmed from the need to better understand the moral decision-making processes of psychopathic individuals. Given their high rates of criminal recidivism and the significant societal costs associated with their behaviors, researchers aimed to explore how these individuals make moral choices in situations that require weighing the greater good against causing harm to others.</p>
<p>“Since I work in a Dutch forensic observation clinic for mental assessment of alleged criminal offenders, one of my main interests is psychopathy. This personality disorder is said to be related to disturbed processing of moral issues,” said study author Ronald J.P. Rijnders, a forensic psychiatrist at the <a href="https://www.nifp.nl/" target="_blank" rel="noopener">Netherlands Institute for Forensic Psychiatry and Psychology</a>.</p>
<p>The study involved two groups of male participants: 24 psychopathic patients recruited from maximum-security forensic psychiatric hospitals in the Netherlands and 28 non-psychopathic controls, consisting of security guards and nursing staff from the same hospitals. Psychopathy in the patients was confirmed using the Psychopathy Checklist-Revised (PCL-R), with a cutoff score of 26 or higher, while the controls were screened using the Psychopathic Personality Inventory-Revised (PPI-R).</p>
<p>“A unique point of this study is that we investigated a clinically identified and PCL-R confirmed group of forensic psychopathic patients who were not treated with medication like selective serotonin reuptake inhibitors, selective noradrenaline reuptake inhibitors, antipsychotics, or hormonal libido inhibitors,” Rijnders noted.</p>
<p>Participants were presented with a series of moral dilemmas designed to elicit either utilitarian (outcome-based) or deontological (harm-averse) responses. Each dilemma was displayed on a computer screen and read aloud through headphones, with participants required to indicate the moral permissibility of the proposed action via a forced-choice question (“Would you…?”) that could be answered with “yes” or “no.”</p>
<p>“We used moral choices that were either utilitarian or deontological in nature,” Rijnders told PsyPost. “Utilitarians have rational responses to maximize total welfare, that is, to promote the greater good of all, even if it means breaking common social rules. Deontologists, on the other hand, have an automatic emotional aversion to inflicting harm on other people because the nature of the ultimate action itself determines whether that action is considered right or wrong.”</p>
<p>“Moral dilemmas are classified as either personal or impersonal. In personal acts, harm is caused by direct physical contact, while in impersonal acts, harm is inflicted in an indirect, non-physical way. Utilitarian actions in personal dilemmas are associated with a stronger emotional value. Personal dilemmas are further divided into inevitable and evitable dilemmas. With inevitable harm, the person involved will eventually suffer harm regardless of whether and what action is taken, whereas with evitable harm, the harm does not occur if the action is waived.”</p>
<p>Among psychopathic patients, the researchers found that those with higher levels of impulsivity were more likely to make deontological choices in high-emotion scenarios. Specifically, these patients were more inclined to avoid causing harm when it involved direct physical action, even if this meant achieving less optimal outcomes overall. This finding was particularly evident in personal-evitable dilemmas, where the harm could be avoided by choosing not to act.</p>
<p>In the control group, psychopathic traits such as lack of empathy and failure to consider consequences were associated with a higher likelihood of making utilitarian decisions, but only in scenarios with low emotional stakes. This suggests that while psychopathic traits can predict utilitarian choices in the general population, this tendency is influenced by the emotional context of the decision.</p>
<p>Contrary to some previous studies, the researchers found no evidence that psychopathic patients, in general, made more utilitarian choices compared to non-psychopathic controls. Instead, the severity of psychopathy in patients was associated with more deontological choices, particularly in scenarios involving high emotional investment where the harm was evitable.</p>
<p>“We hypothesized that psychopathic patients admitted to a forensic psychiatric hospital would have a propensity for utilitarian choices compared to a control group,” Rijnders said. “Our study showed that this was not the case, as there were no significant differences between the two groups.”</p>
<p>“However, there was an interesting finding regarding the psychopathy severity as measured by the Psychopathy Checklist Revised (for the patients) and the Psychopathic Personality Inventory – Revised (for the normal controls). Our hypothesis was that in both groups the percentage of utilitarian choices was positively related to the severity of psychopathy. This was indeed the case for the normal controls, but not for the psychopathic patients.”</p>
<p>“Highly impulsive psychopathic patients were more likely to make harm-averse choices in the personal evitable dilemma. We think that the combination of absent self-interest, high impulsivity, an emotionally charged decision that is harmful to the other person and must be carried out by direct physical force may tilt the response toward a deontological choice. Choosing the emotionally charged use of avoidable harm may be immediately considered ‘too hot’ and then impulsively rejected.”</p>
<p>The psychopathic patients underwent two test sessions, one in which they self-administered a nasal spray containing 24 IU of synthetic oxytocin and another in which they received a placebo nasal spray. (The normal controls did not receive any nasal spray and were tested in one session only.) The time interval between the two test sessions for the psychopathic patients was approximately 12 days, and the start times were kept consistent to control for circadian effects.</p>
<p>“In the group of psychopathic patients, we examined the effect of a single nasal application of the neuropeptide oxytocin,” Rijnders told PsyPost. “Contrary to our expectations we found no effects of oxytocin on moral decision-making.”</p>
<p>The study highlights the nuanced ways in which psychopathic traits and impulsivity interact with the emotional context of moral decisions. But as with any study, there are some caveats. The study’s sample size was relatively small, and the findings may not generalize to all psychopathic individuals or to different cultural contexts.</p>
<p>Another limitation is the reliance on a single administration of oxytocin, which may not be sufficient to induce significant behavioral changes. Future research could explore the effects of repeated oxytocin administration over longer periods to assess more substantial and lasting impacts on moral decision-making.</p>
<p>“We will continue our research on moral choice in other forensic populations,” Rijnders said. “Perhaps a design with multiple applications of nasal oxytocin over weeks is possible.”</p>
<p>The study, “<a href="https://doi.org/10.1016/j.cpnec.2024.100245" target="_blank" rel="noopener">Would you? Effects of oxytocin on moral choices in forensic psychopathic patients</a>“, was authored by Ronald J.P. Rijnders, Sophie van den Hoogen, Jack van Honk, David Terburg, and Maaike M. Kempes.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/adolescent-alcohol-use-linked-to-altered-hippocampal-structure-in-young-adulthood/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Adolescent alcohol use linked to altered hippocampal structure in young adulthood</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Jul 19th 2024, 18:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>A recent study published in <em><a href="https://psycnet.apa.org/record/2024-83892-001?doi=1" target="_blank" rel="noopener">Experimental and Clinical Psychopharmacology</a></em> has uncovered a surprising link between adolescent alcohol use and brain structure. Researchers found that larger hippocampal volumes are associated with alcohol use during adolescence, while no such relationship was found for tobacco or cannabis use. This study adds new dimensions to our understanding of how different patterns of substance use affect the adolescent brain.</p>
<p>Substance use among adolescents is a critical public health issue due to its potential long-term impact on both physical and mental health. Adolescence is a period of significant brain development, making it a particularly vulnerable time for the potential detrimental effects of substance use.</p>
<p>Previous research has linked adolescent substance use with cognitive deficits, such as memory disruption and impulsivity, that can persist into adulthood. Despite this, much of the existing neuroimaging research has focused on heavy substance use, leaving a gap in our understanding of how more typical, recreational levels of use affect the brain.</p>
<p>The present study aimed to address this gap by examining the relationship between trajectories of alcohol, tobacco, and cannabis use during adolescence and brain gray matter volume in young adulthood. This focus on the pattern of use over time, rather than a binary heavy-use vs. non-use approach, is novel and provides insights into how varying levels of substance use impact brain development.</p>
<p>Gray matter is a key component of the central nervous system, consisting mainly of neuronal cell bodies, dendrites, and unmyelinated axons. It is crucial for processing information in the brain and spinal cord, enabling functions such as muscle control, sensory perception, memory, emotions, and decision-making. Gray matter forms the outer layer of the brain, known as the cerebral cortex, and is also found in various subcortical structures, contributing to the brain’s ability to interpret and respond to a wide range of stimuli.</p>
<p>The researchers recruited 1,594 participants from the Birmingham, Alabama area as part of the Healthy Passages Study, a longitudinal investigation of adolescent health. Participants were initially recruited from fifth-grade classrooms and followed up at ages 11, 13, 16, and 19. At each time point, participants reported their use of alcohol, tobacco, and cannabis, and a subset of 350 participants underwent magnetic resonance imaging (MRI) to measure brain structure at approximately age 20.</p>
<p>The study used latent growth curve models (LGCMs) to analyze the trajectories of substance use over time, estimating the initial level of use at age 14, the linear progression of use, and the acceleration or deceleration of use. These trajectories were then used to predict brain gray matter volume in various regions, including the hippocampus, amygdala, and nucleus accumbens.</p>
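<p>The authors fit these growth curves in a structural equation modeling framework. As a rough stand-in that conveys what the three trajectory terms capture, one can fit a quadratic to a single participant's reports with age centered at 14, so that the intercept equals the estimated level of use at that age (the data below are invented):</p>
<pre><code>
# Simplified stand-in for a latent growth curve model: a quadratic fit to
# one participant's substance-use reports, centered at age 14 so the
# intercept is the estimated level of use at that age. The study fit
# proper LGCMs across all participants; this only illustrates the terms.
import numpy as np

ages = np.array([11, 13, 16, 19])      # the study's assessment waves
use = np.array([0.0, 1.0, 2.5, 3.0])   # hypothetical use scores

centered = ages - 14                   # center so the intercept = use at 14
quad, slope, intercept = np.polyfit(centered, use, deg=2)

print(f"level at age 14: {intercept:.2f}")
print(f"linear progression: {slope:.2f}")
print(f"acceleration: {quad:.2f}")
</code></pre>
<p>In the full model, these terms are latent factors estimated jointly across all participants rather than separate per-person fits, and they are related to gray matter volume within the same model.</p>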
<p>The researchers found that cortical gray matter volume was not associated with trajectories of alcohol, tobacco, or cannabis use. However, a significant relationship was found between subcortical gray matter volume and alcohol use trajectories.</p>
<p>Greater alcohol use at age 14 was associated with larger volumes of the hippocampus on both sides of the brain. The intercept of alcohol use, which represents the level of use at age 14, had a positive correlation with hippocampal volume, indicating that early initiation of alcohol use might be linked to larger hippocampal size in young adulthood.</p>
<p>There was no observed relationship between the use of tobacco or cannabis and the volume of either cortical or subcortical gray matter regions.</p>
<p>These findings challenge some of the existing notions about adolescent substance use and brain development. While many studies have reported that heavy alcohol use is associated with reduced gray matter volume in various brain regions, this study found that even typical, recreational use of alcohol during adolescence is linked to larger hippocampal volumes.</p>
<p>The hippocampus plays a key role in memory formation and emotional regulation, and changes in its structure could underlie some of the cognitive and emotional effects associated with alcohol use. The finding that early alcohol use is linked to larger hippocampal volumes suggests that different patterns of alcohol use may impact brain development in diverse ways. This could imply that light or recreational use interferes with the natural pruning process of synapses, leading to a retention of connections that would otherwise be pruned.</p>
<p>Additionally, the lack of significant findings for tobacco and cannabis use suggests that these substances may have less impact on brain structure than alcohol, or that the patterns of use in the study population were not sufficient to detect changes.</p>
<p>The results highlight the importance of considering different patterns of substance use and their specific impacts on brain development. Future research should continue to explore these relationships, particularly with larger samples and more diverse populations. Longitudinal neuroimaging studies that track brain changes over time in relation to substance use are essential for understanding the causal pathways involved.</p>
<p>“These results suggest that certain alcohol use trajectories (i.e., early initiation) may be the most important patterns to address through prevention and intervention programs at the population level, given their relationship with brain structure,” the researchers concluded.</p>
<p>“These findings provide novel insight into the neural impact of recreational levels of adolescent alcohol use, given that prior neuroimaging research has primarily focused on heavy alcohol use. Thus, the results of the present study may inform prevention efforts by highlighting alcohol use trajectories that are most likely to be associated with changes in brain structure. This new knowledge may help to promote the efficient use of resources and target patterns of substance use that are most harmful at the population level.”</p>
<p>The study, “<a href="https://psycnet.apa.org/doi/10.1037/pha0000722" target="_blank" rel="noopener">Hippocampal Gray Matter Volume in Young Adulthood Varies With Adolescent Alcohol Use</a>,” was authored by Juliann B. Purcell, Nathaniel G. Harnett, Sylvie Mrug, Marc N. Elliott, Susan Tortolero Emery, Mark A. Schuster, and David C. Knight.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/facebook-and-instagrams-algorithmic-favoritism-towards-extremist-parties-revealed-in-new-study/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Facebook and Instagram’s algorithmic favoritism towards extremist parties revealed in new study</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Jul 19th 2024, 16:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>A new study published in the journal <em><a href="https://doi.org/10.1093/pnasnexus/pgae247" target="_blank" rel="noopener">PNAS Nexus</a></em> has shed light on how social media algorithms favor politically sponsored content from certain parties, even when the same budget is applied. This research, conducted by the Politecnico di Milano, LMU – Ludwig Maximilians Universität of Munich, and the CENTAI institute of Turin, analyzed over 80,000 political ads on Facebook and Instagram leading up to the 2021 German federal elections.</p>
<p>These ads, representing parties across the political spectrum, generated over 1.1 billion impressions among more than 60 million eligible voters. The findings reveal significant discrepancies in the effectiveness of these ads, with a bias towards more extremist groups.</p>
<p>Social media platforms like Facebook and Instagram have become essential tools for political campaigns due to their vast user base and sophisticated targeting capabilities. However, there are growing concerns about the fairness, accountability, and transparency of these platforms’ proprietary algorithms. These algorithms determine which users see specific ads, potentially introducing biases that favor certain political messages and groups.</p>
<p>To conduct their study, the researchers gathered data from the Meta Ad Library API, focusing on political ads from the 2021 German federal elections. The data included detailed information about ad content, spending, start and stop dates, and the number of impressions distributed across different demographics (gender and age). The team analyzed ads from six major political parties in Germany: Linke, Grüne, SPD, FDP, Union, and AfD.</p>
<ul>
<li>Die Linke, or The Left, is a democratic socialist party advocating for social justice, anti-capitalism, and policies that support the working class and disadvantaged groups.</li>
<li>Bündnis 90/Die Grünen, or The Greens, focuses on environmental protection, sustainability, and social justice, promoting policies to combat climate change and support human rights.</li>
<li>The Social Democratic Party of Germany (SPD) is a center-left party advocating for social democracy, workers’ rights, and a strong welfare state to ensure social equity.</li>
<li>The Free Democratic Party (FDP) is a libertarian party that promotes individual freedom, free-market economic policies, and reduced government intervention in the economy.</li>
<li>The Union, comprising the Christian Democratic Union (CDU) and its Bavarian sister party, the Christian Social Union (CSU), is a center-right political alliance supporting conservative values, a social market economy, and European integration.</li>
<li>Alternative für Deutschland, or Alternative for Germany (AfD), is a right-wing populist party known for its anti-immigration stance, Euroscepticism, and conservative social policies.</li>
</ul>
<p>The researchers used statistical and machine learning methods to analyze the data. They computed the discrepancies between the intended (targeted) and actual audiences reached by the ads. The team also evaluated the efficiency of ads in terms of impressions-per-EUR (a measure of how many impressions an ad generates for each euro spent). They employed regression analysis to identify key factors influencing ad reach and used a random forest model to predict ad performance based on the available data.</p>
<p>The study revealed that 72.3% of all political ads used targeting strategies, accounting for 72.6% of the total ad spending. Parties used a wide range of targeting categories, with a preference for exclusion criteria to narrow down audiences. This approach allowed the social media platform’s algorithms to optimize ad delivery among broad audiences.</p>
<p>The analysis showed significant disparities in the efficiency of ads across different political parties. On average, political ads generated 126.71 impressions per euro spent. However, ads from the Grüne party achieved only 36.18 impressions per euro, while those from the FDP and AfD were much more efficient, with 181.53 and 203.49 impressions per euro, respectively. This suggests that certain parties, particularly more extremist groups, benefited more from the platform’s algorithmic delivery.</p>
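<p>The efficiency metric itself is straightforward to reproduce from ad library records: total impressions divided by total spend, aggregated per party. Here is a minimal sketch in which the rows are invented and only the metric mirrors the study:</p>
<pre><code>
# Impressions-per-EUR from (made-up) ad records, aggregated per party.
import pandas as pd

ads = pd.DataFrame({
    "party":       ["Grüne", "Grüne", "FDP", "AfD", "SPD"],
    "spend_eur":   [500.0, 300.0, 450.0, 400.0, 600.0],
    "impressions": [18000, 11000, 82000, 81000, 76000],
})

totals = ads.groupby("party")[["impressions", "spend_eur"]].sum()
totals["impressions_per_eur"] = totals["impressions"] / totals["spend_eur"]
print(totals["impressions_per_eur"].round(2))
</code></pre>
<p>Aggregating before dividing matters, since averaging per-ad ratios would weight a cheap ad the same as an expensive one; whether the paper aggregates this way or averages per ad is not stated in this summary, so the exact formula here is an assumption.</p>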
<p>“The greater success of their advertising could be explained by the fact that the incendiary political issues promoted by populist parties tend to attract a lot of attention on social media. Consequently, algorithms would favor campaign ads with such content,” explained Francesco Pierri, a researcher from the Data Science research group of the Department of Electronics, Information, and Bioengineering at the Politecnico di Milano, who co-led the study.</p>
<p>The researchers also found discrepancies between the targeted and actual audiences. Ads generally reached a younger audience than intended, except for the far-right AfD, which reached an older audience. Gender-wise, most ads were shown to fewer female users than targeted, except for those from the Grüne party, which reached more female users than intended. This indicates a potential algorithmic bias in ad delivery that could reinforce existing political and social biases.</p>
<p>The regression analysis revealed that more granular targeting criteria often resulted in lower impressions-per-EUR, especially when using exclusion criteria. Targeting single-gender audiences was associated with higher ad efficiency. Ads with positive sentiment and those published earlier in the week or for longer durations also tended to perform better.</p>
<p>The machine learning model’s low prediction performance suggested that the available data was insufficient to fully explain the variance in ad performance. This highlights the need for greater transparency from social media platforms regarding their ad delivery algorithms and pricing mechanisms.</p>
<p>“We see a systematic bias in how the political ads of different parties are distributed. If they aim at a specific audience or send contradictory messages on political issues to different groups, this can limit the political participation of disadvantaged groups,” Pierri said. “Even worse, the algorithms used by the platforms do not allow verification if they involve biases in ad distribution. If, for example, some parties systematically pay higher prices than others for similar ads, this damages political competition. We need greater transparency from the platforms regarding political advertising to ensure fair and uncompromised elections.”</p>
<p>The study, “<a href="https://academic.oup.com/pnasnexus/article/3/7/pgae247/7695718" target="_blank" rel="noopener">Systematic discrepancies in the delivery of political ads on Facebook and Instagram</a>,” was authored by Dominik Bär, Francesco Pierri, Gianmarco De Francisci Morales, and Stefan Feuerriegel.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/nature-contact-increases-prosocial-behaviors-through-self-transcendence/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Nature contact increases prosocial behaviors through self-transcendence</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Jul 19th 2024, 14:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>Spending time in nature is often associated with relaxation and well-being. A recent study published in the <em><a href="https://doi.org/10.1016/j.jenvp.2024.102324" target="_blank" rel="noopener">Journal of Environmental Psychology</a></em> reveals that nature contact not only benefits physical and mental health but also fosters prosocial behaviors — actions intended to benefit others. Through five methodologically diverse studies, researchers consistently found that exposure to nature increases prosociality, primarily mediated by a sense of self-transcendence.</p>
<p>Previous studies have consistently shown that exposure to nature can reduce stress, improve mood, and enhance overall mental well-being. Additionally, nature contact has been linked to increased cooperation and environmentally sustainable behaviors.</p>
<p>In their new study, the researchers sought to investigate whether the positive effects of nature contact on prosocial behaviors—actions intended to benefit others or the collective—could be observed beyond environmental contexts. They were particularly interested in identifying the underlying mechanisms through which nature contact might promote prosociality.</p>
<p>The first two studies (Study 1a and Study 1b) employed correlational methodologies to examine the relationship between nature contact and prosocial behaviors. Study 1a involved 339 community members who were recruited online. Participants completed surveys assessing their daily nature contact, nature connectedness, and prosocial tendencies. Nature contact was measured using the Nature Contact Questionnaire, which included items like “Last week, you bought flowers to decorate the room, either dried or fake flowers.”</p>
<p>Nature connectedness was measured with the Connectedness to Nature Scale, which included items such as “I feel embedded in the broader natural world, like a tree in a forest.” Prosocial tendencies were assessed with the Prosocial Tendencies Measure, which asked participants to rate statements like “I tend to help people who are really in trouble or desperate need of help.”</p>
<p>The results of Study 1a revealed significant positive associations between nature contact, nature connectedness, and prosocial tendencies. Mediation analysis showed that nature connectedness partially explained the relationship between nature contact and prosocial behaviors.</p>
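<p>For readers unfamiliar with mediation analysis: the indirect effect is the product of the path from predictor to mediator (a) and the path from mediator to outcome controlling for the predictor (b), and it is usually tested with a bootstrap confidence interval. The sketch below runs on simulated data and is a generic illustration of that logic, not the authors' analysis pipeline:</p>
<pre><code>
# Bootstrap test of an indirect effect a*b on simulated data
# (variable names echo Study 1a; the data and effect sizes are invented).
import numpy as np

rng = np.random.default_rng(0)
n = 339
contact = rng.normal(size=n)
connectedness = 0.4 * contact + rng.normal(size=n)            # path a
prosocial = 0.3 * connectedness + 0.2 * contact + rng.normal(size=n)

def indirect_effect(x, m, y):
    a = np.polyfit(x, m, 1)[0]                   # slope of mediator on predictor
    X = np.column_stack([np.ones(len(x)), x, m])
    b = np.linalg.lstsq(X, y, rcond=None)[0][2]  # slope of y on m, given x
    return a * b

boot = []
for _ in range(2000):
    idx = rng.integers(0, n, size=n)             # resample participants
    boot.append(indirect_effect(contact[idx], connectedness[idx], prosocial[idx]))

lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"indirect effect 95% CI: [{lo:.3f}, {hi:.3f}]")  # CI excluding 0 supports mediation
</code></pre>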
<p>Study 1b focused on 360 organizational employees who also completed surveys similar to those in Study 1a. In addition to measuring nature contact and prosocial tendencies, this study included assessments of self-transcendence and materialism. Self-transcendence was measured with items like “I feel that on a higher level, all of us share a common bond,” while materialism was assessed with items such as “I envy people who have expensive houses, cars, and clothes.”</p>
<p>The findings from Study 1b indicated that both self-transcendence and reduced materialism mediated the positive effects of nature contact on prosocial behaviors. The relationships between nature contact, self-transcendence, and reduced materialism were significant, suggesting that these factors play a role in enhancing prosocial tendencies.</p>
<p>Study 2 and Study 3 utilized experimental designs to causally test the impact of nature contact on prosocial behaviors. In Study 2, 194 college students were randomly assigned to one of three conditions: viewing a nature video, an urban video, or a blank screen (control). After watching the six-minute videos, participants completed tasks to measure prosocial behaviors, such as willingness to donate to a charity and participation in a prisoner’s dilemma game. They also rated their feelings of self-transcendence and nature connectedness.</p>
<p>The results of Study 2 demonstrated that participants who watched nature videos reported higher self-transcendence and were more willing to donate to charity compared to those who watched urban or control videos. The nature contact condition also led to higher cooperation in the prisoner’s dilemma game. Mediation analyses revealed that self-transcendence, but not nature connectedness, significantly mediated the relationship between nature contact and prosocial behaviors.</p>
<p>Study 3 followed a similar experimental design with 188 college students. Participants were again randomly assigned to watch nature, urban, or control videos. Afterward, they engaged in a trust game and real helping situations to measure actual prosocial behaviors. Additionally, they completed surveys to assess self-transcendence, nature connectedness, and materialism.</p>
<p>The findings from Study 3 indicated that participants in the nature contact condition demonstrated greater trust and more helping behavior compared to those in the urban or control conditions. Mediation analyses showed that self-transcendence and reduced materialism significantly mediated the effects of nature contact on prosocial behaviors, whereas nature connectedness did not.</p>
<p>Study 4 extended the investigation into a real-world setting by having participants engage in a five-day photo-taking task. A total of 201 organizational employees were recruited and randomly assigned to take photos of nature scenes, to take photos of urban scenes, or to take photos without specific instructions (free condition).</p>
<p>Before and after the five-day task, participants completed the Nature Contact Questionnaire to measure their level of nature contact. Following the task, they participated in a public goods game, which measured their contributions to a shared resource. They also completed surveys assessing nature connectedness, self-transcendence, and materialism.</p>
<p>The results of Study 4 showed that participants in the nature contact condition perceived a higher level of nature contact after the task compared to before. They also demonstrated stronger nature connectedness, greater self-transcendence, lower materialism, and greater prosocial behavior in the public goods game compared to those in the urban contact condition.</p>
<p>Interestingly, there was no significant difference between the nature contact and free contact conditions, suggesting that any form of increased engagement with the environment might enhance prosocial behaviors. Mediation analyses indicated that self-transcendence, nature connectedness, and reduced materialism mediated the relationship between nature contact and prosocial behaviors, though the effects were more consistent for self-transcendence.</p>
<p>“Through five studies with diverse designs and measures for manipulation and prosociality, the current research consistently found a facilitative effect of nature contact on prosociality,” the researchers concluded. “It was found that self-transcendence was the key and reliable mediator of this effect, while the mediating roles of nature connectedness and materialism were partially supported. The findings of this research are valuable for deepening the conceptual understanding of the nature-human behavior relationship.”</p>
<p>The study, “<a href="https://www.sciencedirect.com/science/article/abs/pii/S0272494424000975" target="_blank" rel="noopener">Nature contact promotes prosociality: The mediating roles of self-transcendence, nature connectedness, and materialism</a>,” was authored by Dongmei Mei, Ding Yang, Tong Li, Xin Zhang, Kang Rao, and Liman Man Wai Li.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/emophilia-is-a-distinct-psychological-trait-and-linked-to-infidelity/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Emophilia is a distinct psychological trait and linked to infidelity</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Jul 19th 2024, 12:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>A group of Norwegian researchers recently examined the validity of the Emotional Promiscuity Scale, a popular measure of emophilia, in a Scandinavian population. Emophilia is a psychological trait describing how easily and how often a person falls in love. The study, published in <a href="https://www.frontiersin.org/journals/psychology/articles/10.3389/fpsyg.2024.1265247/full"><em>Frontiers in Psychology</em></a>, found that individuals high in emophilia tend to have more romantic relationships and higher rates of infidelity.</p>
<p>Romantic love is an intense emotional and physical attraction toward another person, characterized by feelings of passion, intimacy, and commitment. It typically begins with an initial attraction and infatuation, where individuals experience heightened emotions and a strong desire to be close to each other. If these feelings are mutual, the individuals involved might start a romantic relationship. As the relationship progresses, deeper emotional bonds and attachment develop through shared experiences, communication, and mutual support.</p>
<p>The initial part of this process is “falling in love,” the transition from not experiencing romantic love to experiencing it. This experience includes intense emotions, many of which are pleasant, but others can be painful, such as emotional pain in the absence of the loved one or longing for them.</p>
<p>People differ in how easily and how often they fall in love. While some individuals fall in love often and very easily, others have this experience much less frequently or never, and it can take much more for it to start. Researchers propose that the ease with which one falls in love and how often it happens is a relatively stable psychological trait, named emophilia.</p>
<p>Study author Sol E. Røed and his colleagues aimed to assess whether the Emotional Promiscuity Scale (EPS) accurately measures emophilia in the Scandinavian population. Their assumption was that emophilia could predict the number of romantic relationships a person will have and how often that person will be unfaithful.</p>
<p>The researchers collected data using an online survey distributed through the Norwegian newspaper VG+ and the Swedish newspaper Aftonbladet+. They gathered responses from 2,607 individuals, 75% of whom were women.</p>
<p>Participants reported the number of romantic relationships they had in their lives (“How many romantic relationships have you had in your life?”) and the number of times they were unfaithful (“How many times have you been unfaithful?”). They also completed the EPS to measure emophilia and two assessments of personality traits: the Dirty Dozen (for Dark Triad traits) and the Mini International Personality Item Pool (for Big Five Personality traits).</p>
<p>The results confirmed that the EPS produces valid measures of emophilia in a Scandinavian context. The researchers compared whether the assessment functions equally for men and women and across different age groups. They found that the scores are comparable for men and women but not directly comparable across different age groups (up to 35 years, 36-55 years, and 56 or older).</p>
<p>Further analysis revealed that emophilia had some associations with personality traits, particularly with neuroticism, Machiavellianism, and narcissism. However, these associations were weak, confirming that emophilia can be considered a distinct psychological characteristic.</p>
<p>There were no significant gender differences in emophilia. Individuals with more pronounced emophilia tended to have had more romantic relationships and also reported being unfaithful more often.</p>
<p>“The present study indicates that the EPS [the Emotional Promiscuity Scale] holds good psychometric properties [functions as intended]. Emophilia showed satisfactory discriminant validity against the personality traits included [showed that it is distinct from the examined personality traits]. Lastly, the study indicates that emophilia may be associated with entering more romantic relationships and unfaithfulness, but the cross-sectional design of the current study precludes conclusions concerning directionality [whether the number of relationship and unfaithfulness episodes affect emophilia levels or vice versa],” the study authors concluded.</p>
<p>The study makes a valuable contribution to the development of psychological assessment tools and to the scientific understanding of emophilia. However, it should be noted that all data were collected using self-reports, while the survey included questions about sensitive topics. This leaves some room for reporting bias to affect the results. Additionally, the questions about the number of romantic relationships and instances of unfaithfulness were not accompanied by definitions of what constitutes a romantic relationship or unfaithfulness, leaving room for participants’ interpretations.</p>
<p>The paper, “<a href="https://doi.org/10.3389/fpsyg.2024.1265247">Emophilia: psychometric properties of the emotional promiscuity scale and its association with personality traits, unfaithfulness, and romantic relationships in a Scandinavian sample,</a>” was authored by Sol E. Røed, Randi K. Nærland, Marie Strat, Ståle Pallesen, and Eilin K. Erevik.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<p><strong>Forwarded by:<br />
Michael Reeder LCPC<br />
Baltimore, MD</strong></p>
<p><strong>This information is taken from free public RSS feeds published by each organization for the purpose of public distribution. Readers are linked back to the article content on each organization's website. This email is an unaffiliated unofficial redistribution of this freely provided content from the publishers. </strong></p>