<table style="border:1px solid #adadad; background-color: #F3F1EC; color: #666666; padding:8px; -webkit-border-radius:4px; border-radius:4px; -moz-border-radius:4px; line-height:16px; margin-bottom:6px;" width="100%">
<tbody>
<tr>
<td><span style="font-family:Helvetica, sans-serif; font-size:20px;font-weight:bold;">PsyPost – Psychology News</span></td>
</tr>
<tr>
<td> </td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/competitive-athletes-exhibit-lower-off-field-aggression-and-enhanced-brain-connectivity/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Competitive athletes exhibit lower off-field aggression and enhanced brain connectivity</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Dec 23rd 2025, 06:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>A recent study published in <em><a href="https://doi.org/10.1016/j.psychsport.2025.103051" target="_blank">Psychology of Sport & Exercise</a></em> has found that long-term engagement in competitive athletics is linked to reduced aggression in daily life and specific patterns of brain organization. The findings challenge the common stereotype that contact sports foster violent behavior outside of the game. By combining behavioral assessments with advanced brain imaging, the researchers identified a biological basis for the observed differences in aggression between athletes and non-athletes.</p>
<p>Aggression is a complex trait influenced by both biological and environmental factors. A persistent debate in psychology concerns the impact of competitive sports on an individual’s tendency toward aggressive behavior. One perspective, known as social learning theory, suggests that the aggression often required and rewarded in sports like football or rugby can spill over into non-sport contexts. This theory posits that athletes learn to solve problems with physical dominance, which might make them more prone to aggression in social situations.</p>
<p>An opposing perspective argues that the structured environment of competitive sports promotes discipline and emotional regulation. This view suggests that the intense physical and mental demands of high-level competition require athletes to develop superior self-control to succeed. </p>
<p>According to this framework, the ability to inhibit impulsive reactions during a game translates into better behavioral regulation in everyday life. Previous research attempting to settle this debate has yielded mixed results, largely relying on self-reported questionnaires without examining the underlying biological mechanisms.</p>
<p>“This study was motivated by inconsistent findings in previous research regarding the relationship between long-term engagement in competitive sports and aggression,” explained study author Mengkai Luan, associate professor of psychology at the Shanghai University of Sport.</p>
<p>“While some studies suggest that competitive sports, particularly those involving intense physical and emotional demands, may increase off-field aggression through a ‘spillover’ effect, other research indicates that athletes, due to the emotional regulation and discipline developed through long-term training, often exhibit lower levels of aggression in everyday situations compared to non-athletes. This study aims to examine how long-term engagement in competitive athletics is associated with off-field aggression, while also exploring the neural mechanisms underlying these behavioral differences using resting-state functional connectivity analysis.”</p>
<p>The research team recruited a total of 190 participants from a university community in China. The sample consisted of 84 competitive athletes drawn from university football and rugby teams. These athletes had an average of nearly seven years of competitive experience and engaged in rigorous weekly training. The comparison group included 106 non-athlete controls who did not participate in regular organized sports.</p>
<p>All participants completed the Chinese version of the Buss–Perry Aggression Questionnaire. This widely used psychological tool measures an individual’s general aggression levels as well as four specific subtypes. These subtypes include physical aggression, verbal aggression, anger, and hostility. Participants also rated their tendency toward self-directed aggression. The researchers compared the scores of the athlete group against those of the non-athlete control group to identify behavioral differences.</p>
<p>Following the behavioral assessment, participants underwent functional magnetic resonance imaging (fMRI) scans. The researchers utilized a resting-state fMRI protocol. This method involves scanning the brain while the participant is awake but not performing any specific cognitive task. It allows scientists to map the brain’s intrinsic functional architecture by observing spontaneous fluctuations in brain activity. This approach is particularly useful for identifying stable, trait-like characteristics of brain organization.</p>
<p>The behavioral data revealed clear differences between the two groups. Athletes reported significantly lower scores on total aggression than the non-athlete controls. When the researchers analyzed the specific subscales, they found that athletes scored lower on physical aggression, anger, hostility, and self-directed aggression. </p>
<p>The only dimension where no significant difference appeared was verbal aggression. These results provide behavioral evidence supporting the idea that competitive sport participation functions as a protective factor against maladaptive aggression.</p>
<p>The brain imaging analysis offered insights into the potential neural mechanisms behind these behavioral findings. The researchers used a method called Network-Based Statistics to compare the whole-brain connectivity matrices of athletes and non-athletes. They identified a large subnetwork where athletes exhibited significantly stronger connectivity than controls. This enhanced network comprised 105 connections linking 70 distinct brain regions.</p>
<p>The strengthened connections in athletes were not random but were concentrated within specific systems. The analysis showed increased integration between the salience network and sensorimotor networks. The salience network is responsible for detecting important stimuli and coordinating the brain’s response, while sensorimotor networks manage movement and sensory processing. This pattern suggests that the athletic brain is more efficiently wired to integrate sensory information with motor control and attentional resources.</p>
<p>To further understand the link between brain function and behavior, the authors employed a machine-learning technique called Connectome-Based Predictive Modeling. This analysis aimed to determine if patterns of brain connectivity could accurately predict an individual’s aggression scores, regardless of their group membership. The model successfully predicted levels of total aggression and physical aggression based on the fMRI data.</p>
<p>The predictive modeling revealed that lower levels of aggression were associated with specific connectivity patterns involving the prefrontal cortex. The prefrontal cortex is the brain region primarily responsible for executive functions, such as decision-making, impulse control, and planning. </p>
<p>The analysis showed that stronger negative connections between the prefrontal cortex and subcortical regions were predictive of reduced aggression. This implies that a well-regulated brain utilizes top-down control mechanisms to inhibit impulsive drives originating in deeper brain structures.</p>
<p>The researchers also found a significant overlap between the group-level differences and the individual prediction models. Four specific neural connections were identified both as distinguishing features of the athlete group and as strong predictors of lower aggression. These connections involved the orbitofrontal cortex and the cerebellum. The orbitofrontal cortex is key for emotion regulation, while the cerebellum is traditionally associated with balance and motor coordination but is increasingly recognized for its role in emotional processing.</p>
<p>The convergence of these findings suggests that the demands of competitive sports may induce neuroplastic changes that support better behavioral regulation. The need to execute complex motor skills while managing high levels of physiological arousal and adhering to game rules likely strengthens the neural pathways that integrate motor and emotional control. This enhanced neural efficiency appears to extend beyond the field, helping athletes manage frustration and suppress aggressive impulses in their daily lives.</p>
<p>“The study challenges the common stereotype that individuals who participate in competitive, contact sports are more aggressive or dangerous in everyday life,” Luan told PsyPost. “In fact, the research suggests that long-term participation in these sports may help individuals manage aggression better. Through their training, they develop emotional regulation and self-discipline, which may be linked to brain changes that help them control aggression and behavior off the field.”</p>
<p>There are some limitations. The research utilized a cross-sectional design, which captures data at a single point in time. This means the study cannot definitively prove that sports training caused the brain changes or the reduced aggression. It is possible that individuals with better emotional regulation and specific brain connectivity patterns are naturally drawn to and successful in competitive sports.</p>
<p>The sample was also limited to university-level athletes in team-based contact sports within a specific cultural setting. Cultural values regarding emotion and social harmony may influence how aggression is expressed and regulated. </p>
<p>“One of our long-term goals is to expand the sample to include athletes from a wider range of sports, including individual and non-contact sports, as well as participants from different cultural backgrounds,” Luan said. “This would help increase the generalizability of our findings.” </p>
<p>“Additionally, since our current study is cross-sectional, it cannot establish causal relationships. In future research, we plan to adopt longitudinal and intervention-based designs to better understand the causal mechanisms behind the observed effects, and to separate pre-existing individual traits from the neural adaptations resulting from sustained athletic training.”</p>
<p>The study, “<a href="https://doi.org/10.1016/j.psychsport.2025.103051" target="_blank">Competitive sport experience is associated with reduced off-field aggression and distinct functional brain connectivity</a>,” was authored by Yujing Huang, Zhuofei Lin, Chenglin Zhou, Yingying Wang, and Mengkai Luan.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/wrinkles-around-the-eyes-are-the-primary-driver-of-age-perception-across-five-ethnic-groups/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Wrinkles around the eyes are the primary driver of age perception across five ethnic groups</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Dec 22nd 2025, 20:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>Recent research published in the <em><a href="https://doi.org/10.1111/ics.13045" target="_blank">International Journal of Cosmetic Science</a></em> provides evidence that wrinkles around the eyes are the primary physical feature driving perceptions of age and attractiveness across diverse ethnic groups. While factors such as skin color and gloss contribute to how healthy a woman appears, the depth and density of lines in the periorbital region consistently predict age assessments in women from Asia, Europe, and Africa.</p>
<p>The rationale behind this study stems from the fact that the skin around the eyes is structurally unique. It is significantly thinner than facial skin in other areas and contains fewer oil glands. This biological reality makes the eye area particularly susceptible to the effects of aging and environmental damage.</p>
<p>In addition to its delicate structure, the skin around the eyes is subjected to constant mechanical stress. Humans blink approximately 15,000 times per day, and these repeated muscle contractions eventually lead to permanent lines. Previous surveys have indicated that women worldwide consider under-eye bags, dark circles, and “crow’s feet” to be among their top aesthetic concerns.</p>
<p>However, most prior research on this topic has focused on specific populations or general facial aging. It has remained unclear whether specific changes in the eye region influence social perceptions in the same way across different cultures. The authors of the current study aimed to determine if the visual impact of periorbital skin features is consistent globally or if it varies significantly by ethnicity.</p>
<p>To investigate this, the researchers utilized a multi-center approach involving participants and assessors from five distinct locations. Data collection took place in Guangzhou, China; Tokyo, Japan; Lyon, France; New Delhi, India; and Cape Town, South Africa. The team initially recruited 526 women across these five locations to serve as the pool for the study.</p>
<p>From this larger group, the researchers selected a standardized subset of 180 women to serve as the subjects of the analysis. This final sample included exactly 36 women from each of the five ethnic groups. The participants ranged in age from 20 to 65 years, allowing for a comprehensive view of the aging process.</p>
<p>The researchers recorded high-resolution digital portraits of these women using a specialized system known as ColorFace. This equipment allowed for the standardization of lighting and angles, which is essential for accurate computer analysis. The team then defined two specific regions of interest on each face for detailed measurement.</p>
<p>The first region analyzed was the area directly under the eyes, which included the lower eyelid and the infraorbital hollow. The second region was the area at the outer corners of the eyes where lateral canthal lines, commonly known as crow’s feet, typically develop. The researchers used digital image analysis software to objectively quantify skin characteristics in these zones.</p>
<p>For the region under the eyes, the software measured skin color, gloss, skin tone evenness, and wrinkles. Skin color was broken down into specific components, including lightness, redness, and yellowness. Gloss was measured in terms of its intensity and contrast, while tone evenness was calculated based on the similarity of adjacent pixels.</p>
<p>For the crow’s feet region, the analysis focused exclusively on the measurement of wrinkles. The software identified wrinkles by detecting lines in the image that met specific criteria. The researchers quantified these features by calculating the total length of the wrinkles, their density within the region, and their volume.</p>
<p>To determine how these objective features translated into social perceptions, the study employed a large panel of human assessors. The researchers recruited 120 assessors in each of the five study locations, resulting in a total of 600 raters. These assessors were “naïve,” meaning they were not experts in dermatology or cosmetics.</p>
<p>The assessors were matched to the participants by ethnicity. For example, Chinese assessors rated the images of Chinese women, and French assessors rated the images of French women. Each assessor viewed the digital portraits on color-calibrated monitors.</p>
<p>They were asked to rate each face for perceived age, health, and attractiveness. These ratings were given on a continuous scale ranging from 0 to 100, where 0 represented a low attribute score and 100 represented a high attribute score. The researchers then used statistical methods to identify relationships between the objective skin measurements and the subjective ratings.</p>
<p>The results revealed distinct biological differences in how skin ages across the different groups. For instance, Indian and South African women tended to have lower skin lightness scores under the eyes compared to Chinese, Japanese, and French women. South African women also exhibited the highest density of wrinkles in the under-eye region among all groups.</p>
<p>Regarding the crow’s feet region, the analysis showed that South African, Chinese, and French women had similar levels of wrinkling. These levels were notably higher than those observed in Indian and Japanese women. This finding aligns with some previous research suggesting that wrinkle onset and progression can vary significantly based on ethnic background.</p>
<p>Despite these physical differences, the study found strong consistencies in how these features influenced perception. When looking at the full sample, wrinkles in both the under-eye and crow’s feet regions showed a strong positive correlation with perceived age. This means that as wrinkle density and volume increased, assessors consistently rated the faces as looking older.</p>
<p>On the other hand, wrinkles were negatively correlated with ratings of health and attractiveness. Faces with more pronounced lines around the eyes were perceived as less healthy and less attractive. This pattern held true regardless of the ethnic group of the woman or the assessor.</p>
<p>The study also highlighted the role of skin gloss, or radiance. Higher levels of specular gloss, which corresponds to the shine or glow of the skin, were associated with perceptions of better health and higher attractiveness. This suggests that skin radiance is a universal cue for vitality.</p>
<p>In contrast, skin tone evenness showed a more complex relationship. While generally associated with youth and health, it appeared to be a stronger cue for health judgments than for age. Uneven pigmentation and lower skin lightness were linked to lower health ratings, particularly in populations with darker skin tones.</p>
<p>Regression analyses allowed the researchers to determine which features were the strongest predictors of the ratings. For perceived age, wrinkles in the crow’s feet region emerged as a significant predictor for all five ethnic groups. This confirms that lines at the corners of the eyes are a primary marker used by people to estimate a woman’s age.</p>
<p>For Japanese and French women, wrinkles specifically under the eyes provided additional information for age judgments. This suggests that in these groups, the under-eye area may contribute more distinct visual information regarding aging than in other groups.</p>
<p>When predicting perceived health, the results were more varied. While wrinkles remained a negative predictor, skin color variables played a more prominent role. For Indian women, lighter skin in the under-eye region was a significant positive predictor of rated health.</p>
<p>Similarly, for South African women, skin yellowness was a positive predictor of both health and attractiveness ratings. This indicates that while wrinkles drive age perception, color cues are vital for judgments of well-being in these populations. The researchers posit that pigmentary issues, such as dark circles, may weigh more heavily on health perception in darker skin types.</p>
<p>An exception to these specific predictive patterns was observed in the French group regarding health ratings. While the overall statistical models were effective, no single skin feature stood out as a solitary predictor for health judgments in French women. This implies that French assessors might use a more holistic approach, combining multiple features rather than relying on a single cue like wrinkles or color.</p>
<p>The study has certain limitations that warrant mention. The sample size for the specific sub-group analyses was relatively small, with only 36 women per ethnicity. This reduces the statistical power to detect very subtle differences within each group.</p>
<p>Additionally, the study relied on static digital images. In real-world interactions, facial dynamics and expressions play a major role in the visibility of crow’s feet and other lines. Future research could investigate how movement influences the perception of these features.</p>
<p>The study, “<a href="https://doi.org/10.1111/ics.13045" target="_blank">Effects of under-eye skin and crow’s feet on perceived facial appearance in women of five ethnic groups</a>,” was authored by Bernhard Fink, Remo Campiche, Todd K. Shackelford, and Rainer Voegeli.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/microdosing-cannabis-a-new-hope-for-alzheimers-patients/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Microdosing cannabis: a new hope for Alzheimer’s patients?</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Dec 22nd 2025, 18:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>As the world’s population ages, the number of people living with dementias such as Alzheimer’s disease increases. Given the lack of curative treatments and the limited effectiveness of available medications, interest in new therapeutic approaches is growing. Among them are cannabinoids from the cannabis plant.</p>
<p>A small new Brazilian study published in the <a href="https://doi.org/10.1177/13872877251389608">Journal of Alzheimer’s Disease</a> investigated the effects of microdoses of cannabis extract on patients with mild Alzheimer’s disease. The study found positive effects, without the “high” associated with cannabis.</p>
<h2>The logic of microdoses</h2>
<p>The study, led by professor Francisney Nascimento and colleagues at the <a href="https://portal.unila.edu.br/">Federal University of Latin American Integration (UNILA)</a>, recruited 24 elderly patients (60–80 years) diagnosed with mild Alzheimer’s. It evaluated the effects of daily use of an oil prepared from cannabis extract containing THC and CBD in similar proportions and at extremely low concentrations (0.3 mg of each cannabinoid). These sub-psychoactive doses do not cause the “high” associated with recreational use of the plant.</p>
<p>The extract used was donated by <a href="https://abraceesperanca.org.br/">ABRACE</a>, Brazil’s biggest patient association; the study received no contribution from cannabis companies or other funding sources.</p>
<p>“Microdosing” is a term usually associated with recreational use of psychedelics. Given the size of the dose, it would be easy to question whether it could have any effect at all.</p>
<p>Doses below 1 mg of a cannabinoid compound are rarely reported in the clinical literature. However, the researchers’ decision to use microdosing did not come out of nowhere. In 2017, the group led by <a href="https://en.wikipedia.org/wiki/Andreas_Zimmer">Andreas Zimmer</a> and <a href="https://www.researchgate.net/profile/Andras-Bilkei-Gorzo">Andras Bilkei-Gorzo</a> had already demonstrated that very low doses of THC restored cognition in elderly mice, reversing gene expression patterns and brain synapse density in the hippocampus to levels similar to those of young animals.</p>
<p>Subsequently, <a href="https://pubmed.ncbi.nlm.nih.gov/35820856/">other studies in mice</a> reinforced that the endocannabinoid system, which is important for neuroprotection and regulates normal brain activity (ranging from body temperature to memory), undergoes a natural decline during ageing.</p>
<p>Inspired by these findings, the group initially tested microdosing of cannabis extract in <a href="https://pubmed.ncbi.nlm.nih.gov/35820856/">a single patient with Alzheimer’s disease over 22 months</a>. They found cognitive improvement, assessed using the ADAS-Cog scale, a set of tasks (such as word recall) that tests cognitive function. This prompted the decision to run a more robust clinical trial to verify the cognitive effects observed in that patient. The second study was a properly <a href="https://pubmed.ncbi.nlm.nih.gov/41160460/">controlled, randomised, double-blind clinical trial</a>.</p>
<h2>What we found</h2>
<p>Several clinical scales were used to objectively measure the impact of cannabis treatment. This time, improvement was observed on the <a href="https://muhc.ca/sites/default/files/micro/m-PT-OT/OT/Mini-Mental-State-Exam-%28MMSE%29.pdf">mini-mental state exam (MMSE)</a>, a widely used scale for assessing cognitive function in patients with dementia. It is a validated set of questions asked of the patient, with the aid of an accompanying person (typically a family member or helper). After 24 weeks of treatment, the group receiving the cannabis extract showed stabilisation in their scores, while the placebo group showed cognitive deterioration (a worsening of Alzheimer’s symptoms).</p>
<p>The impact was modest but relevant: patients using cannabis microdosing scored two to three points higher than their placebo counterparts (the maximum MMSE score is 30). In patients with preserved or moderately impaired cognitive function, it may be unrealistic to expect major changes in a few weeks.</p>
<p>The cannabis extract did not improve other, non-cognitive symptoms, such as depression, general health or overall quality of life. On the other hand, there was no difference in adverse side effects between the groups, likely because of the extremely low dose used.</p>
<p>This result echoes findings from <a href="https://doi.org/10.1038/s41398-022-02208-1">my 2022 study</a>, which found a reduction in endocannabinoid signalling during ageing, meaning ageing brains are more prone to cognitive decline without the protection of cannabinoids. Among other mechanisms, cannabinoids seem to protect cognition by reducing drivers of inflammation in the brain.</p>
<h2>A new paradigm: cannabis without the ‘high’</h2>
<p>The biggest obstacle to the acceptance of cannabis as a therapeutic tool in brain ageing is perhaps not scientific, but cultural. In many countries, the fear of “getting high” deters many patients and even healthcare professionals. But studies such as this show there are ways to get around this problem by using doses so low they do not cause noticeable changes in consciousness, but which can still modulate important biological systems, such as inflammation and neuroplasticity.</p>
<p>Microdoses of cannabis can escape the psychoactive zone and still deliver benefits. This could open the door to new formulations focused on prevention, especially in more vulnerable populations, such as elderly people with mild cognitive impairment or a family history of dementia.</p>
<h2>What now?</h2>
<p>Despite its potential, the study also has important limitations: the sample size is small, and the effects were restricted to one dimension of the cognition scale. Still, the work represents an unprecedented step: it is the first clinical trial to successfully test the microdose approach in patients with Alzheimer’s disease. It is a new way of looking at this plant in the treatment of important diseases.</p>
<p>To move forward, new studies with a larger number of participants, longer follow-up times, and in combination with biological markers (such as neuroimaging and inflammatory biomarkers) will be necessary. Only then will it be possible to answer the fundamental question: can cannabis slow down the progression of Alzheimer’s disease? We have taken an important step towards understanding this, but for now, the question remains unanswered.<!-- Below is The Conversation's page counter tag. Please DO NOT REMOVE. --><img decoding="async" src="https://counter.theconversation.com/content/271170/count.gif?distributor=republish-lightbox-basic" alt="The Conversation" width="1" height="1"><!-- End of code. If you don't see any code above, please get new code from the Advanced tab after you click the republish button. The page counter does not collect any personal data. More info: https://theconversation.com/republishing-guidelines --></p>
<p>This article is republished from <a href="https://theconversation.com">The Conversation</a> under a Creative Commons license. Read the <a href="https://theconversation.com/small-study-finds-microdoses-of-cannabis-stalled-cognitive-decline-in-alzheimers-patients-271170">original article</a>.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/inflammation-linked-to-brain-reward-dysfunction-in-american-indians-with-depression/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Inflammation linked to brain reward dysfunction in American Indians with depression</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Dec 22nd 2025, 16:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>Recent research indicates that bodily inflammation may disrupt the brain’s ability to process rewards and risks in American Indian adults who have experienced depression. The study found that higher levels of specific inflammatory markers in the blood corresponded with reduced activity in brain regions essential for motivation. These findings were published in <em><a href="https://doi.org/10.1016/j.bpsc.2025.08.015" target="_blank">Biological Psychiatry: Cognitive Neuroscience and Neuroimaging</a></em>.</p>
<p>Major Depressive Disorder is a complex mental health condition that goes beyond feelings of sadness. One of its hallmark symptoms is anhedonia, which is a reduced ability to experience pleasure or interest in daily activities. This symptom is often linked to dysfunctions in the brain’s reward circuitry. This system governs how the brain anticipates positive outcomes, such as winning a prize, or negative outcomes, like a financial loss.</p>
<p>Scientists are increasingly looking at the immune system to understand these brain changes. Physical inflammation is the body’s natural response to injury or stress. However, chronic stress can lead to persistent, low-grade inflammation that affects the entire body. Over time, the immune system releases signaling proteins called cytokines that can cross into the brain. Once there, these proteins may alter how neural circuits function.</p>
<p>This biological connection is particularly relevant for American Indian populations. Many Indigenous communities face unique and chronic stressors rooted in historical trauma. These stressors include the long-term psychological impacts of colonization and systemic health disparities. Previous research links symptoms of historical loss to higher risks for both depression and physical health issues.</p>
<p>The researchers hypothesized that this unique stress environment might elevate inflammation levels. They proposed that this inflammation could, in turn, impair the brain’s reward system. This pathway might explain why depression prevalence and severity can be higher in these communities. To test this, the study focused on American Indian individuals who had been diagnosed with Major Depressive Disorder at some point in their lives.</p>
<p>Leading the investigation was Lizbeth Rojas from the Department of Psychology at Oklahoma State University. She collaborated with a team of experts from the Laureate Institute for Brain Research and other academic institutions. The team aimed to move beyond simple surveys by looking at direct biological and neurological evidence. They sought to connect blood markers of inflammation with real-time brain activity.</p>
<p>The study included 73 adult participants who identified as American Indian. All participants had a history of clinical depression. To assess their biological state, the researchers collected blood samples from each individual. They analyzed these samples for specific biomarkers related to the immune system.</p>
<p>The team measured levels of proinflammatory cytokines, which promote inflammation. These included tumor necrosis factor (TNF) and interleukin-6 (IL-6). They also measured C-reactive protein (CRP), a general marker of inflammation produced by the liver. Additionally, they looked at interleukin-10 (IL-10), a cytokine that helps reduce inflammation.</p>
<p>To observe brain function, the researchers utilized two advanced imaging technologies simultaneously. Participants entered a functional magnetic resonance imaging (fMRI) scanner. This machine measures brain activity by tracking changes in blood oxygen levels. At the same time, participants wore caps to record electroencephalography (EEG) data. EEG measures the electrical activity of the brain with high time precision.</p>
<p>While inside the scanner, the participants performed a specific psychological test called the Monetary Incentive Delay task. This task is designed to activate the brain’s reward centers. Participants viewed a screen that displayed different visual cues. Some cues indicated a chance to win money, while others indicated a risk of losing money.</p>
<p>After seeing a cue, the participant had to press a button rapidly. If they were fast enough on a “win” trial, they gained a small amount of cash. If they were fast enough on a “loss” trial, they avoided a financial penalty. The researchers focused on the “anticipation phase” of this task. This is the brief moment after seeing the cue but before pressing the button.</p>
<p>During this anticipation phase, a healthy brain typically shows high activity in the basal ganglia. This is a group of structures deep in the brain that includes the striatum. The striatum is essential for processing incentives and generating the motivation to act. In people with depression, this area often shows “blunted” or reduced activity.</p>
<p>The study’s results revealed a clear link between the immune system and this brain activity. The researchers used statistical models to predict brain response based on inflammation levels. They found that higher concentrations of TNF were associated with reduced activation in the basal ganglia during the anticipation of a potential win.</p>
<p>This relationship was notably influenced by the sex of the participant. The negative association between TNF and brain activity was observed specifically in male participants. This suggests that for men in this sample, high inflammation dampened the brain’s excitement about a potential reward.</p>
<p>The researchers also examined how the brain reacted to the threat of losing money. In this context, they looked at the interaction between TNF and CRP. They found that elevated levels of both markers predicted reduced brain activation. The basal ganglia were less responsive even when the participant was trying to avoid a negative outcome.</p>
<p>Another finding involved the nucleus accumbens, a key part of the brain’s reward circuit. The study showed that medication status played a role here. Among participants taking psychotropic medication, higher TNF levels were linked to lower activity in this region during loss anticipation. This highlights the complexity of how treatments and biology interact.</p>
<p>The study also attempted to use EEG to measure a specific brain wave called the P300. The P300 is a spike in electrical activity that relates to attention and updating working memory. Previous studies have suggested that people with depression have a smaller P300 response. The researchers expected inflammation to predict the size of this brain wave.</p>
<p>However, the analysis did not find a statistical link between the inflammatory markers and the P300 amplitude. The electrical signals did not show the same clear pattern as the blood flow changes measured by the fMRI. This suggests that inflammation might affect the metabolic demand of brain regions more than the specific electrical timing measured by this task.</p>
<p>These findings support the idea that the immune system plays a role in the biology of depression. The presence of high inflammation appears to “turn down” the brain’s sensitivity to incentives. When the brain is less responsive to rewards, a person may feel less motivation. This aligns with the clinical experience of patients who feel a lack of drive or pleasure.</p>
<p>The authors described several limitations that provide context for these results. The study relied on a relatively small sample size of 73 people. A larger group would provide more statistical certainty. Additionally, the data came from parent studies that were not designed exclusively for this specific investigation.</p>
<p>Another limitation was the lack of a healthy control group. The study only looked at people with a history of depression. Without a non-depressed comparison group, it is difficult to determine if these patterns are unique to depression. They might also appear in people with high inflammation who are not depressed.</p>
<p>The study also could not fully account for cultural factors. While the background emphasizes the role of historical trauma, the analysis did not measure cultural connectedness. Previous research suggests that connection to one’s culture can protect against stress. It acts as a buffer that might improve mental health outcomes.</p>
<p>Despite these caveats, the research offers a specific biological target for understanding depression in American Indian populations. It moves away from purely psychological explanations. Instead, it frames mental health within a “biopsychosocial” model. This model considers how biological stress and social history combine to affect the brain.</p>
<p>The authors suggest that future research should focus on resilience. Understanding how some individuals maintain low inflammation despite stress could be key. This could lead to better prevention strategies. Interventions might focus on reducing inflammation as a way to help restore normal brain function.</p>
<p>Treating depression in these communities may require addressing physical health alongside mental health. If inflammation drives brain dysfunction, then reducing stress on the body is vital. This reinforces the need for holistic healthcare approaches. Such approaches would respect the unique history and challenges faced by American Indian communities.</p>
<p>The study, “<a href="https://doi.org/10.1016/j.bpsc.2025.08.015" target="_blank">Major Depressive Disorder and Serum Inflammatory Biomarkers as Predictors of Reward-Processing Dysfunction in an American Indian Sample</a>,” was authored by Lizbeth Rojas, Eric Mann, Xi Ren, Danielle Bethel, Nicole Baughman, Kaiping Burrows, Rayus Kuplicki, Leandra K. Figueroa-Hall, Robin L. Aupperle, Jennifer L. Stewart, Salvador M. Guinjoan, Sahib S. Khalsa, Jonathan Savitz, Martin P. Paulus, Ricardo A. Wilhelm, Neha A. John-Henderson, Hung-Wen Yeh, and Evan J. White.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/study-finds-no-independent-link-between-visceral-fat-index-and-cognitive-decline/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Study finds no independent link between visceral fat index and cognitive decline</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Dec 22nd 2025, 14:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>An analysis of National Health and Nutrition Examination Survey data on older adults found no independent association between visceral adiposity and cognitive performance. While some correlations emerged initially, they disappeared after the study authors controlled for sociodemographic factors and clinical conditions. The paper was published in <a href="http://dx.doi.org/10.1097/MD.0000000000045814"><em>Medicine</em></a>.</p>
<p>Adiposity refers to the accumulation of body fat. It reflects the amount and distribution of fat tissue in the body. While some adiposity is normal and necessary for energy storage, insulation, and hormone regulation, excessive adiposity increases the risk of metabolic and cardiovascular diseases. Body fat is not uniform, and its health impact depends greatly on where it is located.</p>
<p>One specific type of adiposity is visceral adiposity. Visceral adiposity refers specifically to fat stored deep inside the abdominal cavity, surrounding organs such as the liver, pancreas, and intestines. This visceral fat is metabolically active and releases inflammatory molecules and hormones that disrupt glucose and lipid metabolism.</p>
<p>High visceral adiposity is strongly linked to insulin resistance, type 2 diabetes, hypertension, and heart disease, often more so than general obesity. In contrast, subcutaneous fat stored under the skin is less harmful and sometimes even protective when overall weight is stable. People may have a normal body weight yet still exhibit high visceral adiposity, a condition sometimes called “normal-weight obesity.”</p>
<p>Study author Long He and his colleagues note that previous studies indicated an association between excess adiposity and age-related cognitive decline in older individuals. They also note that visceral adiposity has been associated with a heightened risk of metabolic disorders. With this in mind, the authors investigated whether visceral adiposity is associated with cognitive performance in older adults.</p>
<p>The study authors analyzed data from the National Health and Nutrition Examination Survey (NHANES) 2011 to 2014. This is an epidemiological survey that uses a complex sampling system to obtain nationally representative data on the health and nutritional status of U.S. civilians.</p>
<p>To estimate visceral fat, the researchers used the Visceral Adiposity Index (VAI), a calculated score based on waist circumference, Body Mass Index (BMI), triglycerides, and HDL cholesterol. They analyzed data from 1,323 participants who were 60 years of age or older and for whom data on all cognitive assessments was available. These individuals completed the NHANES cognitive battery consisting of three tests: the CERAD Word List Learning Test, which measures immediate and delayed verbal memory; the Animal Fluency Test (AFT), which assesses semantic retrieval and executive functioning; and the Digit Symbol Substitution Test (DSST), which evaluates processing speed, attention, and working memory.</p>
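<p>The VAI combines those four measures into a single score. As a minimal sketch of how such a score is computed (the article does not give the formula itself; the sex-specific coefficients below follow the commonly cited Amato et al. 2010 formulation and are an assumption here, as are the function name and units):</p>

```python
# Hypothetical sketch of the Visceral Adiposity Index (VAI) calculation.
# The article names the inputs (waist circumference, BMI, triglycerides, HDL)
# but not the exact formula; the sex-specific coefficients below follow the
# widely used Amato et al. (2010) formulation, assumed here for illustration.

def visceral_adiposity_index(waist_cm: float, bmi: float,
                             triglycerides_mmol: float, hdl_mmol: float,
                             sex: str) -> float:
    """Return the VAI score; lipids in mmol/L, waist circumference in cm."""
    if sex == "male":
        return (waist_cm / (39.68 + 1.88 * bmi)) \
            * (triglycerides_mmol / 1.03) * (1.31 / hdl_mmol)
    elif sex == "female":
        return (waist_cm / (36.58 + 1.89 * bmi)) \
            * (triglycerides_mmol / 0.81) * (1.52 / hdl_mmol)
    raise ValueError("sex must be 'male' or 'female'")

# Example: lipid values at the reference denominators make those terms
# equal to 1, so the score reduces to the waist/BMI ratio term.
vai = visceral_adiposity_index(waist_cm=94, bmi=25,
                               triglycerides_mmol=1.03, hdl_mmol=1.31,
                               sex="male")
```

<p>By construction, higher waist circumference or triglycerides push the score up, while higher HDL pulls it down, which is why the index is read as a proxy for visceral fat dysfunction rather than overall body size.</p>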
<p>Results showed that after controlling for demographic factors and participants’ health conditions, there was no statistically significant association between participants’ VAI scores and their performance on the cognitive tests. While some of the cognitive tests showed associations with VAI scores before all demographic and clinical factors were taken into account, these associations disappeared after full adjustment.</p>
<p>“Age- and lifestyle-adjusted analyses showed inverse, domain‑specific links between higher VAI and cognition (most notably processing speed), but these weakened after full sociodemographic and clinical adjustment, suggesting measured sociodemographic and cardiometabolic factors largely explain the crude associations,” the study authors concluded.</p>
<p>The study contributes to the scientific understanding of the links between visceral adiposity and cognitive performance. However, it should be noted that visceral adiposity contributes to several of the clinical conditions the study authors controlled for (such as dyslipidemia). In doing so, the statistical models may have removed the part of the relationship between visceral adiposity and cognitive performance that acts through those factors.</p>
<p>The paper, “<a href="http://dx.doi.org/10.1097/MD.0000000000045814">Association between visceral adiposity index and cognitive dysfunction in US participants derived from NHANES data: A cross-sectional analysis,</a>” was authored by Long He, Cheng Xing, Xueying Yang, Shilin Wang, Boyan Tian, Jianhao Cheng, Yushan Yao, and Bowen Sui.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/longer-paternity-leave-is-linked-to-reduced-maternal-gateclosing/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Longer paternity leave is linked to reduced maternal gateclosing</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Dec 22nd 2025, 12:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>New research published in <a href="https://doi.org/10.1007/s11199-025-01565-7"><em>Sex Roles</em></a> finds that when new fathers take longer paternity leave, mothers tend to show fewer gateclosing behaviors and hold more flexible attitudes about parental roles.</p>
<p>Becoming a parent brings major changes, especially for dual-earner couples who have to balance the demands of infant care with work responsibilities. Even though fathers in the United States have become increasingly involved in childcare over the past several decades, mothers still take on the greater share during infancy, in part due to longstanding norms and constraints around parental roles.</p>
<p>Expanding parental leave, especially leave available to fathers, has been considered one way to support more equal caregiving. Research has shown that when fathers take leave, they become more engaged in childcare and may even carry those habits forward for years after their child is born. Reed Donithen and colleagues were interested in whether this increased involvement might also change how mothers encourage or restrict fathers’ participation in caregiving.</p>
<p>One important family dynamic in this context is maternal gatekeeping, which includes behaviors and attitudes that either facilitate (“gateopening”) or restrict (“gateclosing”) fathers’ engagement in parenting. Past work has linked higher maternal gateclosing to less father involvement, lower-quality father-child relationships, and greater strain in the romantic relationship.</p>
<p>Despite increasing interest in paternal leave, no prior studies have examined whether new fathers’ leave length might shift maternal gatekeeping. Because parental identities are developing in the early postpartum period, the authors proposed that fathers’ longer leave could lead both parents to adopt more egalitarian views of childcare, reducing mothers’ gateclosing tendencies.</p>
<p>The study drew on data from a longitudinal project that followed 182 dual-earner, different-sex couples in the Midwestern United States through their transition to parenthood. Couples were originally recruited during the pregnant mother’s third trimester through childbirth classes, advertisements, flyers, and referrals.</p>
<p>After applying eligibility criteria (both parents had to work before and after the birth and provide leave-length information) and excluding one extreme outlier, the final sample included 130 couples. Participants completed surveys during pregnancy and again at 3, 6, and 9 months postpartum.</p>
<p>Mothers’ and fathers’ leave lengths were measured in days across the postpartum follow-ups, distinguishing paid from unpaid leave. Maternal gatekeeping was assessed at nine months postpartum. Both mothers and fathers completed the Parental Regulation Inventory, which captures gateopening (e.g., asking for the father’s input) and gateclosing (e.g., criticizing or redoing fathers’ childcare efforts). Mothers also completed attitude subscales from the Allen & Hawkins (1999) measure, capturing standards/responsibilities and maternal role confirmation.</p>
<p>Multiple psychological and demographic factors measured during pregnancy, including parental self-efficacy, maternal psychological distress, maternal parenting perfectionism, maternal essentialism, fathers’ essentialism, relationship confidence, and socioeconomic status, were included as controls. Path analyses were then used to test whether fathers’ leave length predicted maternal gatekeeping behaviors and attitudes at nine months postpartum.</p>
<p>On average, mothers took about 67 days of leave, while fathers took about 14. Across analyses, longer paternity leave predicted significantly lower maternal gateclosing behaviors, according to both mothers’ and fathers’ reports. Fathers’ longer leave was also linked to more flexible maternal attitudes, including less stringent standards/responsibilities and weaker maternal role confirmation.</p>
<p>These associations remained significant even after adjusting for mothers’ leave time and the wide range of psychological and demographic covariates. In contrast, fathers’ leave length was not associated with maternal gateopening behaviors, meaning that mothers did not necessarily increase their active encouragement of father involvement despite becoming less restrictive.</p>
<p>Maternal leave length, by comparison, did not predict any form of maternal gatekeeping. Several covariates also showed meaningful associations. For example, maternal parenting perfectionism predicted stronger gateclosing and stricter household standards, and maternal confidence in the couple’s future predicted greater gateopening.</p>
<p>However, these factors did not alter the central finding: paternity leave length uniquely and consistently predicted reductions in maternal gateclosing. Exploratory analyses examining whether the effects of paternity leave depended on maternity leave length found no significant interactions.</p>
<p>The authors note that the study relied on a U.S. sample of largely White, highly educated, dual-earner couples from one geographic region, which may limit generalizability to more diverse families or contexts with different parental leave policies.</p>
<p>These findings highlight that when fathers take longer leave after the birth of a child, mothers appear less likely to restrict fathers’ involvement and hold more flexible views of parental roles, offering insight into how paternity leave may support more egalitarian coparenting.</p>
<p>The study, “<a href="https://doi.org/10.1007/s11199-025-01565-7">When New Fathers Take More Leave, Does Maternal Gatekeeping Decline</a>?” was authored by Reed Donithen, Sarah Schoppe-Sullivan, Miranda Berrigan, and Claire Kamp Dush.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/adolescents-with-high-emotional-intelligence-are-less-likely-to-trust-ai/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Adolescents with high emotional intelligence are less likely to trust AI</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Dec 22nd 2025, 10:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>A new study published in the journal <em><a href="https://doi.org/10.3390/bs15081142" target="_blank">Behavioral Sciences</a></em> highlights generational differences in how adolescents and their parents interact with artificial intelligence. The research suggests that teens with higher emotional intelligence and supportive, authoritative parents tend to use AI less frequently and with greater skepticism. Conversely, adolescents raised in authoritarian environments appear more likely to rely on AI for advice and trust it implicitly regarding data security and accuracy.</p>
<p>Artificial intelligence has rapidly integrated into daily life, reshaping how information is accessed and processed. This technological shift is particularly impactful for adolescents. This demographic is at a developmental stage where they are refining their social identities and learning to navigate complex information ecosystems. </p>
<p>While AI offers educational support, it also presents risks related to privacy and the potential for emotional over-reliance. Previous investigations have examined digital literacy or parenting styles in isolation. However, few have examined how these factors interact with emotional traits to shape trust in AI systems.</p>
<p>The authors of this study sought to bridge this gap by exploring the concept of a “digital secure base.” This theoretical framework proposes that strong, supportive family relationships provide a safety net that helps young people explore the digital world responsibly. </p>
<p>The researchers aimed to understand if emotional skills and specific family dynamics might predict whether a teen uses AI as a helpful tool or as a substitute for human connection. They hypothesized that the quality of the parent-child relationship could influence whether an adolescent develops a critical or dependent attitude toward these emerging technologies.</p>
<p>To investigate these dynamics, the research team recruited 345 participants from southern Italy. The sample consisted of 170 adolescents between the ages of 13 and 17. It also included 175 parents, with an average age of roughly 49. Within this group, the researchers were able to match 47 specific parent-adolescent pairs for a more detailed analysis. The data was collected using online structured questionnaires.</p>
<p>Participants completed several standardized assessments. They answered questions regarding parenting styles, specifically looking for authoritative or authoritarian behaviors. They also rated their own trait emotional intelligence, which measures how people perceive and manage their own emotions. Additional surveys evaluated perceived social support from family and friends. </p>
<p>To measure AI engagement, the researchers developed specific questions about the frequency of use and trust. These items asked about sharing personal data, seeking behavioral advice, and using AI for schoolwork. Trust was measured by how much participants believed AI data was secure and whether AI gave better advice than humans.</p>
<p>The data revealed a clear generational divide regarding usage habits. Adolescents reported using AI more often than their parents for school or work-related tasks. Approximately 32 percent of teens used AI for these purposes frequently, compared to only 17 percent of parents. Adolescents were also more likely to ask AI for advice on how to behave in certain situations.</p>
<p>In terms of trust, the younger generation appeared much more optimistic than the adult respondents. Teens expressed higher confidence in the security of the data they provided to AI systems. They were also more likely to believe that AI could provide better advice than their family members or friends. This suggests that adolescents may perceive these systems as more competent or benevolent than their parents do.</p>
<p>The researchers then analyzed how personality and family environment related to these behaviors. They found that adolescents with higher levels of trait emotional intelligence tended to use AI less frequently. These teens also expressed lower levels of trust in the technology. This negative association suggests that emotionally intelligent youth may be more cautious and critical. They may rely on their own internal resources or human networks rather than turning to algorithms for guidance.</p>
<p>A similar pattern emerged regarding parenting styles. Adolescents who described their parents as authoritative—characterized by warmth, open dialogue, and clear boundaries—were less likely to rely heavily on AI. This parenting style was associated with what the researchers called “balanced” use. These teens engaged with the technology but maintained a level of skepticism.</p>
<p>A different trend appeared for those with authoritarian parents. This parenting style involves rigid control and limited communication. Adolescents in these households were more likely to share personal data with AI systems. They also tended to seek behavioral advice from AI more often. This suggests a potential link between a lack of emotional support at home and a reliance on digital alternatives.</p>
<p>Using the matched parent-child pairs, the study identified two distinct profiles among the adolescents. The researchers labeled the first group “Balanced Users.” This group made up about 62 percent of the matched sample. These teens had higher emotional intelligence and reported strong family support. They used AI cautiously and did not view it as superior to human advice.</p>
<p>The second group was labeled “At-Risk Users.” These adolescents comprised roughly 38 percent of the matched pairs. They reported lower emotional intelligence and described their parents as more authoritarian. This group engaged with AI more intensively. They were more likely to share personal data and trust the advice given by AI over that of their parents or peers. They also reported feeling less support from their families.</p>
<p>These findings imply that emotional intelligence acts as a buffer against uncritical technology adoption. Adolescents who can regulate their own emotions may feel less need to turn to technology for comfort or guidance. They appear to approach AI as a tool rather than a companion. This aligns with the idea that emotionally competent individuals are better at critical evaluation.</p>
<p>The connection between parenting style and AI use highlights the importance of the family environment. Authoritative parenting seems to foster independent thinking and digital caution. When parents provide a secure emotional foundation, teens may not feel the need to seek validation from artificial agents. In contrast, authoritarian environments might leave teens seeking support elsewhere. If they cannot get emotional regulation from their parents, they may turn to AI systems that appear competent and non-judgmental.</p>
<p>The study provides evidence that AI systems cannot replace the emotional containment provided by human relationships. The results suggest that rather than simply restricting access to technology, interventions should focus on strengthening family bonds. </p>
<p>Enhancing emotional intelligence and encouraging open communication between parents and children could serve as protective factors. This approach creates a foundation that allows teens to navigate the digital world without becoming overly dependent on it.</p>
<p>The study has several limitations that affect how the results should be interpreted. The design was cross-sectional, meaning it captured data at a single point in time. This prevents researchers from proving that parenting styles cause specific AI behaviors. It is possible that the relationship works in the other direction or involves other factors. The sample size for the matched parent-child pairs was relatively small. This limits the ability to generalize the specific user profiles to broader populations.</p>
<p>Additionally, the study relied on self-reported data. Participants may have answered in ways they felt were socially acceptable rather than entirely accurate. There is also the potential for common-method bias since the same individuals provided data on both their personality and their technology use. The research focused primarily on psychological and relational factors. It did not account for socioeconomic status or cultural differences that might also influence access to and trust in AI.</p>
<p>Future research should look at these dynamics over time. Longitudinal studies could track how changes in emotional intelligence influence AI trust as teens grow older. Researchers could also include objective measures of AI use, such as usage logs, rather than relying solely on surveys. </p>
<p>Exploring these patterns in different cultural contexts would also be beneficial to see if the findings hold true globally. Further investigation is needed to understand how specific features of AI, such as human-like conversation styles, specifically impact adolescents with lower emotional support.</p>
<p>The study, “<a href="https://doi.org/10.3390/bs15081142" target="_blank">Emotional Intelligence and Adolescents’ Use of Artificial Intelligence: A Parent–Adolescent Study</a>,” was authored by Marco Andrea Piombo, Sabina La Grutta, Maria Stella Epifanio, Gaetano Di Napoli, and Cinzia Novara.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<p><strong>Forwarded by:<br />
Michael Reeder LCPC<br />
Baltimore, MD</strong></p>
<p><strong>This information is taken from free public RSS feeds published by each organization for the purpose of public distribution. Readers are linked back to the article content on each organization's website. This email is an unaffiliated unofficial redistribution of this freely provided content from the publishers. </strong></p>
<p> </p>
<p><small><a href="https://blogtrottr.com/unsubscribe/565/DY9DKf">unsubscribe from this feed</a></small></p>