Your Daily digest for PsyPost – Psychology News Daily Digest (Unofficial)

Article Digests for Psychology & Social Work article-digests at lists.clinicians-exchange.org
Sat Aug 31 07:32:43 PDT 2024


PsyPost – Psychology News Daily Digest (Unofficial)

 

(https://www.psypost.org/people-who-believe-they-are-physically-attractive-also-believe-they-are-important/) People who believe they are physically attractive also believe they are important
Aug 31st 2024, 10:00

A series of three studies found that individuals who believe they are physically attractive also tend to believe their social status is higher. The findings were published in (https://www.sciencedirect.com/science/article/pii/S2666622724000261) Current Research in Ecological and Social Psychology.
In modern society, physically attractive people enjoy numerous advantages in life outcomes compared to their less attractive peers. For example, studies show that physically attractive people are preferred as leaders, have higher income levels, and achieve better job performance scores. In research studies, they receive more generous offers in economic games played with strangers. Others perceive them as having a higher social status, and as being more trustworthy, intelligent, competent, and healthy.
This widespread positivity that physically attractive people experience is called “beautism.” Beautism refers to the cultural and social emphasis on physical beauty, elevating it as an ideal or standard of worth. However, previous research has most often focused on the effects of physical beauty as assessed by others. It remained relatively unknown whether attractive people themselves believe their standing in social hierarchies is higher simply because of their beauty.
Study author Lynn K.L. Tan and her colleagues sought to explore this question. They hypothesized that individuals who perceive themselves as more physically attractive would also consider themselves to have higher social status. To test this, they conducted a series of three studies.
The first study was a pilot study conducted on 303 U.S. Amazon Mechanical Turk (MTurk) workers. Participants rated their own physical attractiveness and where they believed they stood compared to others in terms of jobs, wealth, and prestige. Participants in the second study were 349 adults recruited via Prolific. Similarly, they rated their own physical attractiveness, social status, and social likability on a set of short assessments.
The third study was an experiment in which the researchers manipulated participants’ perceptions of their own physical attractiveness and examined whether this influenced their self-perceived social status. The experiment was conducted online with 441 U.S. MTurk workers as participants.
Participants in this study were divided into three groups. The first group wrote a short essay recalling a situation when they felt more physically attractive than others. The second group wrote an essay about a time when they felt less physically attractive than others, while the third group, serving as a control, wrote an essay recalling their previous day. After writing the essays, participants rated their own physical attractiveness and socioeconomic status.
The results of the first study showed that individuals who saw themselves as more physically attractive also tended to view their social status as higher. The second study confirmed this finding and also suggested that part of the association between self-rated physical attractiveness and social status might be mediated by self-perceived social likability.
In other words, it is possible that individuals who see themselves as more physically attractive also tend to believe they are more socially likable, and as a result, perceive their social status as higher. However, the association between self-rated physical attractiveness and self-perceived social status is not fully explained by social likability.
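As a minimal illustration of how such a mediation pattern can be estimated, here is a toy sketch using simple regressions on simulated ratings. The variable names, scales, and data are hypothetical, and this is not the authors' actual analysis or code.

# Toy mediation sketch (attractiveness -> likability -> status), for illustration only.
# All variables and data below are simulated stand-ins, not the study's dataset.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 349  # matches the Study 2 sample size, used here only to scale the toy data

attractiveness = rng.normal(4, 1, n)                                  # hypothetical 1-7 self-rating
likability = 0.5 * attractiveness + rng.normal(0, 1, n)               # proposed mediator
status = 0.3 * attractiveness + 0.4 * likability + rng.normal(0, 1, n)

df = pd.DataFrame({"attract": attractiveness, "likable": likability, "status": status})

total = smf.ols("status ~ attract", df).fit()               # total effect of attractiveness on status
a_path = smf.ols("likable ~ attract", df).fit()             # attractiveness -> likability
direct = smf.ols("status ~ attract + likable", df).fit()    # direct effect, controlling for the mediator

indirect = a_path.params["attract"] * direct.params["likable"]
print("total effect:   ", round(total.params["attract"], 3))
print("direct effect:  ", round(direct.params["attract"], 3))
print("indirect effect:", round(indirect, 3))  # the portion carried through likability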
The results of the experiment showed that participants who wrote an essay about how they were more attractive than others later rated their own attractiveness as higher compared to participants in the other two groups. They also tended to rate their socioeconomic status as higher compared to the other two groups. This finding confirmed the researchers’ hypothesis that individuals who see themselves as more physically attractive also tend to perceive their social status as higher.
“Our work provides complementary evidence for our main hypothesis that self-rated physical attractiveness causally increases first-person perceptions of status inferences. This finding has important implications for status navigation behaviors because of the lability of self-rated physical attractiveness in the modern world,” the study authors concluded.
The study sheds light on the importance of physical attractiveness in social relations. However, because it relied solely on self-reports, reporting bias may have affected the results. Additionally, it remains unclear how stable or consistent self-rated physical attractiveness is, given that the experiment showed it could be shifted by a brief writing task.
The paper, “(https://doi.org/10.1016/j.cresp.2024.100205) Hot at the Top: The Influence of Self-Rated Attractiveness on Self-Perceived Status,” was authored by Lynn K.L. Tan, Michał Folwarczny, Tobias Otterbring, and Norman P. Li.

(https://www.psypost.org/your-name-influences-your-appearance-as-you-age-according-to-new-research/) Your name influences your appearance as you age, according to new research
Aug 31st 2024, 08:00

Is it possible that your face gradually comes to reflect your name as you age? New research suggests that the name you’re given at birth might subtly shape your appearance as you grow older. Researchers found that adults often look like their names, meaning people can match a face to a name more accurately than random guessing. But this isn’t true for children, which suggests that our faces grow into our names over time. The findings have been published in the (https://doi.org/10.1073/pnas.2405334121) Proceedings of the National Academy of Sciences.
The idea that names might influence facial appearance draws from the broader concept of self-fulfilling prophecies—where expectations about a person can influence how they behave and, in this case, perhaps even how they look. If society has certain expectations of how someone named “John” or “Emma” should appear, it’s possible that over time, individuals might unconsciously shape their physical appearance to align with those expectations.
However, it’s also possible that babies might be born with certain facial features that subconsciously influence parents to give them a name that “fits” their appearance. This study sought to determine which of these two scenarios is more plausible: Are people’s faces influenced by their names as they age, or do people already look like their names from birth?
“We noticed that even though we are good with names, there are few people that we can’t remember their name or even call them by mistake by a different name,” said study author (https://www.runi.ac.il/en/faculty/zwebner-yonat/) Yonat Zwebner, an assistant professor of marketing at Reichman University. “It is also common to hear that someone’s name ‘really suits’ them while sometimes you hear that someone’s name really doesn’t suit them. So it got us thinking – could it really be that most people look like their name?”
To answer this question, the researchers conducted a series of five studies that combined human perception tests with machine learning techniques to analyze whether faces could be matched to names more accurately than by chance alone.
In the first two studies, the researchers aimed to test whether people could accurately match names to faces more often than would be expected by random chance. They did this by using a straightforward experimental setup where both adult and child participants were shown a series of headshot photographs of unfamiliar faces. Each photograph was accompanied by a list of four possible names, one of which was the correct name for the person in the photograph. The task for the participants was to choose which name they believed matched the face.
Study 1 included two groups of participants: 117 adults (aged 18 to 30 years) and 76 children (aged 8 to 13 years). These participants were asked to match names to both adult and child faces. The researchers wanted to see if participants could more accurately match names to adult faces compared to child faces, suggesting that adults might “grow into” their names over time.
To ensure the robustness of their findings, Study 2 replicated the experiment with a different sample of participants and faces. This study included 195 adult participants (aged 20 to 40 years) and 168 child participants (aged 8 to 12 years). All participants completed the experiment online, and the faces used were sourced from a professional database to ensure consistency in image quality and background characteristics.
In both studies, the researchers found that participants were able to match names to adult faces more accurately than by random chance, but they were not able to do the same for children’s faces. These findings suggest that the face-name congruence—the idea that people might look like their names—seems to develop over time, as it was present in adults but not in children.
In the third study, the researchers used machine learning to examine facial similarities among people with the same name. They employed a Siamese Neural Network, which was trained on a large dataset of facial images from both adults and children. The dataset included 607 adult faces and 557 child faces, with each group featuring the same 20 names (8 male and 12 female names).
The neural network was trained using “triplet loss,” where the model was presented with an anchor image, a positive image (same name as the anchor), and a negative image (different name). The model learned to identify the positive image as more similar to the anchor than the negative image. This training process helped determine whether there was a detectable pattern of facial similarity among individuals with the same name.
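For readers curious about the mechanics, the sketch below shows what Siamese-style training with a triplet loss typically looks like in code. The architecture, embedding size, image dimensions, and random tensors standing in for face photos are all illustrative assumptions; this is not the authors' actual model or training pipeline.

# Minimal triplet-loss training sketch (PyTorch). Illustrative only; the study's
# Siamese Neural Network, data, and hyperparameters are not reproduced here.
import torch
import torch.nn as nn

class EmbeddingNet(nn.Module):
    """Maps a face image to a fixed-length embedding vector."""
    def __init__(self, dim=128):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(64, dim),
        )

    def forward(self, x):
        return nn.functional.normalize(self.features(x), dim=1)

net = EmbeddingNet()
loss_fn = nn.TripletMarginLoss(margin=0.2)   # pulls anchor toward positive, pushes it from negative
optimizer = torch.optim.Adam(net.parameters(), lr=1e-4)

# One toy training step on random tensors standing in for anchor/positive/negative face photos
anchor, positive, negative = (torch.randn(8, 3, 64, 64) for _ in range(3))
loss = loss_fn(net(anchor), net(positive), net(negative))
optimizer.zero_grad()
loss.backward()
optimizer.step()
print("triplet loss:", loss.item())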
The findings from Study 3 were consistent with those of the earlier studies. The Siamese Neural Network found that adults who shared the same name had more similar facial representations compared to those with different names. The model showed a “similarity lift” of 60.05% for adult faces with the same name, significantly higher than the random-chance level of 50%. In contrast, for children’s faces, the similarity lift was only 51.88%, which did not differ significantly from chance.
In the final two studies, the researchers tested whether the name-face matching effect could be observed in faces that were artificially aged to resemble adults. They used Generative Adversarial Networks (GANs) to digitally age photographs of children, creating “artificial adults.”
Study 4A involved 100 adult participants (aged 19 to 39 years) who were asked to match names to a mix of real adult faces and digitally aged faces of children. Participants saw these images and selected the correct name from four options, just as in the earlier studies. The goal was to determine if the artificially aged faces would exhibit the same name-face congruence as real adult faces.
The participants were able to match names to real adult faces with an accuracy of 27.98%, significantly above chance. However, when it came to the artificially aged faces, their accuracy dropped to 24.25%, which was not significantly different from chance. This result suggested that merely aging a face digitally does not produce the same name-face congruence seen in real adults.
In Study 4B, the researchers used the same machine learning approach from Study 3 to analyze the artificially aged faces. The dataset for this study included 310 artificially aged faces (108 males and 202 females). The Siamese Neural Network evaluated the similarity of these faces to real adult faces with the same names, comparing the similarity scores to determine if the name-face matching effect observed in real adults could be replicated in these digitally aged faces. The model found that the similarity lift for the digitally aged faces was only 51.41%, almost identical to the lift observed for children’s faces and not significantly different from chance.
But how could someone shape their own facial features? The researchers suggest that this might happen both directly, through choices like hairstyle, glasses, and makeup, and indirectly, through life experiences that leave their mark—such as the way repeated smiling can create wrinkles over time.
“We know how belonging to a specific gender can have a strong social structuring impact, but now we know that even our name, which is chosen for us by others, and is not biological, can influence the way we look, through our interactions with society,” Zwebner told PsyPost.
“I’ll elaborate more on how we suggest the effect is developed: An example we all know too well is gender stereotypes. If for example, society expects girls to be gentle and polite while boys to be more assertive and aggressive, through self-fulfilling prophecy processes most boys and girls become exactly like that. We think the same process underlies our face-name matching effect.
“We already know from previous research that names have stereotypes,” Zwebner said. “For example, prior published studies show that in the U.S., you will evaluate a person named ‘Katherine’ as more successful than a person named ‘Bonnie.’ You will also evaluate a person named ‘Scott’ as more popular than ‘Herman.’ Moreover, we know from prior research that people imagine a ‘Bob’ to have a rounder face compared to a ‘Tim.’ All these are name stereotypes that also entail how we think someone with a specific name should look like.”
“Therefore, like other stereotypes, one may indeed become more and more like his/her name expectations, including appearance. This is strongly supported by the fact that our participants chose names according to hairstyle alone. It suggests that people embrace a certain hairstyle, and probably more facial features that fit the expectations of how they should look according to their names. Assuming that within a society all share a similar stereotype for Katherine, then we interact with her in a way that matches our shared stereotype. We treat her with specific expectations. As a result, Katherine becomes more and more like a Katherine is expected to be, resulting with a specific matching look. It could also be a more direct association, if the name stereotype is related to a specific look (e.g., wears a ponytail)—then the person could embrace that look.”
But it is also important to note that while the research suggests that, on average, adults tend to develop a face-name congruence over time, this effect is not uniform across all individuals.
“In our studies, there is of course a diversity – there are people who have a very high face-name match while others have a low match and others are in the middle,” Zwebner explained. “So clearly, some people look very much like their names, and some don’t. We think we demonstrated that this face-name match is something that is part of our social world so these different levels of face-name match should have life implications. For example, would you trust a salesperson who completely does not look like his/her name? Would you hire someone that looks very different from what you imagined according to his/her name? And, of course, are there other factors that are correlated with looking like your name or not?”
While these studies provide evidence for the idea that social expectations tied to names can influence facial appearance, they also have some limitations. For example, the study primarily focused on participants and facial images from specific cultural backgrounds, which might limit the generalizability of the findings. It’s possible that the effects observed in this study could vary in different cultural contexts where names and social expectations differ.
Additionally, the researchers suggested that it would be interesting to investigate the point at which people start to “grow into” their names. At what age do people begin to exhibit the face-name congruence observed in adults? Understanding this could provide further insights into the developmental processes at play. Future research could also investigate the impact of a person’s name on their life outcomes.
“If a name can influence appearance it can affect many other things, and this research opens an important direction that may suggest how parents should consider better the names they give their children,” Zwebner said.
“It is interesting to see a pattern in people’s reaction to our findings: their first reaction is typically ‘no way!’ but then, they comment that it ‘actually seems totally reasonable,’ and proceed to tell us a story about their or a friend’s name, and how it matches their face,” she added. “The fact that the findings relate to any person, anywhere. Anyone can relate to it, whether they are surprised at first or they feel that it is intuitive.”
The study, “(https://www.pnas.org/doi/10.1073/pnas.2405334121) Can names shape facial appearance?“, was authored by Yonat Zwebner, Moses Miller, Noa Grobgeld, Jacob Goldenberg, and Ruth Mayo.

(https://www.psypost.org/different-childhood-adversities-linked-to-accelerated-or-delayed-brain-aging/) Different childhood adversities linked to accelerated or delayed brain aging
Aug 31st 2024, 06:00

New research published in (https://www.sciencedirect.com/science/article/pii/S0006322324014860) Biological Psychiatry reveals that different types of early-life adversity can lead to distinct patterns of brain development. The study found that children who experience emotional neglect tend to have younger-looking brains. On the other hand, children exposed to other forms of adversity, such as caregiver mental illness and socioeconomic hardship, often have older-looking brains. These findings provide new insights into how varied early-life experiences can shape the developing brain in different ways.
While previous studies have shown that exposure to adversity can alter brain structure and function, most of this research has focused on singular types of adversity, like violence or poverty, or has combined multiple forms of adversity into a single measure. This approach, however, might miss the nuances of how different types of adversity affect the brain differently. The authors behind the new study wanted to address these limitations by investigating how distinct dimensions of early-life adversity affect brain development.
“I was mainly interested in this topic due to the growing body of research that suggests that different types of adversity (such as those related to emotional or physical neglect or threatening environments where a child experiences physical or sexual abuse) can have differential effects on the developing brain,” said study author (https://danibeck.net/) Dani Beck, a postdoctoral researcher of neurodevelopment at the University of Oslo and Diakonhjemmet Hospital in Oslo.
“I wanted to test this using brain age prediction, a machine learning algorithm that provides an estimation of an individual’s ‘biological’ age based on characteristics of their magnetic resonance imaging (MRI) scan. Here, the predicted age can be compared to their chronological age (termed the brain age gap or BAG) and this information can be used to see if an individual’s brain is older or younger-looking in the context of a phenotype, for example, in this case, early life adversity.”
For their study, the research team used data from the Adolescent Brain Cognitive Development (ABCD) Study, which includes a large sample of children and adolescents from across the United States. The ABCD Study is an ongoing project that tracks the brain development and health of children over time, making it a valuable resource for studying the effects of early-life adversity.
The researchers focused on a sample of approximately 11,800 children aged 9 to 14 years old. The key measure they focused on was the brain age gap. A positive brain age gap indicates that the brain appears older than expected for the child’s age, while a negative brain age gap suggests a younger-looking brain.
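As a minimal illustration of the brain age gap itself, assuming a prediction model has already produced an estimated brain age for each child, the computation is simply predicted brain age minus chronological age. The numbers below are toy values, not study data.

# Toy brain age gap (BAG) sketch: predicted brain age minus chronological age.
# The ages here are made-up stand-ins; the study trained its own MRI-based model.
import numpy as np

chronological_age = np.array([9.5, 10.2, 11.0, 12.7, 13.4])    # years (toy values)
predicted_brain_age = np.array([9.1, 10.9, 10.6, 13.5, 13.0])  # hypothetical model output

brain_age_gap = predicted_brain_age - chronological_age
# Positive BAG -> older-looking brain; negative BAG -> younger-looking brain
for age, gap in zip(chronological_age, brain_age_gap):
    print(f"age {age:4.1f}: BAG = {gap:+.1f} years")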
In addition to brain imaging, the researchers analyzed detailed information on the children’s early-life experiences. Previous work had identified ten different dimensions of adversity, such as emotional neglect, caregiver mental illness, socioeconomic disadvantage, trauma exposure, and family conflict. These dimensions were determined through a combination of child and parent reports, as well as assessments by researchers.
The researchers found that children who experienced emotional neglect—such as a lack of support from their primary and secondary caregivers and insufficient supervision—tended to have younger-looking brains. This finding suggests that emotional neglect may delay brain maturation, possibly because the absence of emotional and social support slows the development of certain brain structures.
On the other hand, children who were exposed to more severe forms of adversity—such as caregiver mental illness, socioeconomic disadvantage, family aggression, trauma, and separation from a biological parent—were more likely to have older-looking brains. This suggests that these more intense forms of adversity might accelerate brain development, potentially as an adaptive response to stressful or dangerous environments. For instance, children living in unsafe neighborhoods or with caregivers who have mental health issues might develop faster to cope with these challenging circumstances.
“Although there are theories such as Threat versus Deprivation and The Stress Acceleration Hypothesis that suggest violent and threatening environments are conceptually different and have different impacts on the developing brain than emotional neglect and deprivation, I was still surprised that the dimensions we explored (which are derived from a data-driven approach) loaded so intuitively with accelerated and delayed maturational patterns that seem to support previous work,” Beck told PsyPost.
Interestingly, the study also found that the impact of certain adversities, like caregiver mental illness and family aggression, became more pronounced over time. This means that as children grew older, the brain age gap for those who had experienced these types of adversity increased, indicating a greater divergence from typical brain development patterns. This finding suggests that the effects of these adversities on brain development may accumulate or intensify as children grow older.
“The main takeaway from the research is that our results suggest that dimensions of early-life adversity are differentially associated with distinct neurodevelopmental patterns, indicative of dimension-specific delayed and accelerated brain maturation,” Beck said. “Our findings are generally in line with theories positing that adverse experiences related to threat versus deprivation are two different dimensions, though further research is needed.”
While the study provides insights into how different types of early-life adversity affect brain development, it also has limitations to consider. The study’s findings are based on data from the ABCD Study, which includes a large but relatively general sample of children. This means that the results may not fully capture the effects of more extreme forms of adversity, such as severe abuse or chronic neglect, which might be underrepresented in the sample.
“The sample we used is not enriched for adversity exposure,” Beck explained. “And while this is good for facilitating broader generalization, more research is needed on children exposed to more severe forms of adversity. There remain also additional challenges such as accounting for differences in chronicity of adversity events, interindividual differences in resilience, and overlap in adversity types.”
The study, “(https://www.sciencedirect.com/science/article/pii/S0006322324014860) Dimensions of Early Life Adversity Are Differentially Associated with Patterns of Delayed and Accelerated Brain Maturation,” was authored by Dani Beck, Lucy Whitmore, Niamh MacSweeney, Alexis Brieant, Valerie Karl, Ann-Marie G. de Lange, Lars T. Westlye, Kathryn L. Mills, and Christian K. Tamnes.

(https://www.psypost.org/when-doubt-creeps-in-study-sheds-light-on-the-toll-of-suspected-infidelity/) When doubt creeps in: Study sheds light on the toll of suspected infidelity
Aug 30th 2024, 14:00

Trust is the cornerstone of any strong marriage, but what happens when that trust is broken — or even just doubted? A new study in (https://doi.org/10.1111/famp.12974) Family Process highlights that merely suspecting a partner of infidelity is strongly linked to lower levels of happiness in marriage.
While infidelity is a well-known cause of relationship breakdowns, little research had been conducted on how a partner’s belief or suspicion of infidelity might influence the dynamics of a marriage. The researchers aimed to fill this gap by examining not only the prevalence of different types of infidelity but also how these variations impact the happiness and satisfaction of both partners in a relationship.
The study reanalyzed data from the National Couples Survey, which was originally designed to explore contraceptive decision-making among couples. This survey, conducted between 2005 and 2006, involved 236 married couples from four U.S. cities: Seattle, Durham, St. Louis, and Baltimore. On average, the couples had been married for about five years, and the study included only those who were not pregnant, postpartum, or seeking to become pregnant.
Participants were asked two key questions: whether they had engaged in extramarital sex since getting married and whether they believed their spouse had done so. Relationship satisfaction was measured using a straightforward scale where participants rated their overall happiness in the marriage.
About 12% of men and 9% of women reported having engaged in extramarital sex at some point during their marriage. These figures are somewhat lower than previous estimates, likely because the sample consisted only of currently married couples. Among husbands who admitted to extramarital sex, 62.1% of their wives were aware of the infidelity, while 57.1% of husbands knew when their wives had been unfaithful. Conversely, 9.1% of wives and 7.4% of husbands suspected their spouses of infidelity despite their partners denying it.
The study categorized couples into four distinct groups based on their responses: (1) couples where neither partner reported or suspected infidelity, (2) couples where one partner suspected infidelity but the other partner reported no infidelity, (3) couples where one partner had engaged in extramarital sex but kept it a secret, and (4) couples where the infidelity was known to both partners. Most participants fell into the first category, with 80% of men and 84% of women reporting neither engaging in nor suspecting infidelity.
When the researchers examined relationship satisfaction across these groups, they found significant differences. Both men and women in the groups where infidelity was suspected, secret, or known reported lower levels of marital satisfaction compared to those in the no-infidelity group. Notably, the lowest levels of satisfaction were reported by those whose infidelity was known to their partner. This suggests that the awareness of infidelity might be particularly damaging to relationship happiness, possibly due to the breach of trust and the emotional turmoil that follows such revelations.
The study also revealed that it was not just the act of infidelity itself that mattered but also the suspicion of it. Both men and women who suspected their partner of cheating, even if those suspicions were unfounded, experienced lower marital satisfaction. Additionally, the belief that one’s partner suspected them of cheating was similarly associated with lower relationship satisfaction. This highlights the significant role that trust and perception play in maintaining marital happiness.
“Believing that one’s partner has engaged in extramarital sex can be stressful, and one study of undergraduates in a romantic relationship, most of whom were in a dating relationship and all of whom either suspected their partner was cheating on them or had cheated on them in the past 3 months, found that the degree of suspicion about a partner’s infidelity was not only negatively associated with relationship satisfaction but also positively associated with depression, physical health symptoms, and risky health behavior (Weigel & Shrout, 2021),” the researchers explained.
“It may be that these types of psychological, physical, and behavioral factors contribute to lower relationship satisfaction for people who suspect that their partner is engaging in or has engaged in extramarital sex. The current study builds on this research, and to the best of our knowledge is the first study to include both members of married couples to examine the associations between both members’ history of extramarital sex and their beliefs about their partner’s history of extramarital sex and their own level of relationship satisfaction.”
The study, “(https://onlinelibrary.wiley.com/doi/abs/10.1111/famp.12974) ‘I know what you did’: Associations between relationship satisfaction and reported and suspected extramarital sex,” was authored by Mark A. Whisman and Lizette Sanchez.

(https://www.psypost.org/eating-more-fruits-may-help-prevent-depression-in-older-adults-new-study-suggests/) Eating more fruits may help prevent depression in later life, new study suggests
Aug 30th 2024, 12:00

Could a banana a day keep the blues away? A new study published in (https://www.sciencedirect.com/science/article/pii/S1279770724003622) The Journal of Nutrition, Health and Aging suggests that regular consumption of fruits may help lower the risk of developing depression in older adults. The research, conducted as part of the Singapore Chinese Health Study, found that people who ate more fruits were less likely to experience depressive symptoms later in life.
Depression is a significant issue among older adults, often compounding other health problems and drastically reducing quality of life. With the number of older adults on the rise, the economic and social costs of depression are expected to increase as well. This study was part of a broader effort to explore how diet, a modifiable lifestyle factor, could contribute to healthier aging.
“We were interested in this topic because many countries, including Singapore, face the challenges of a rapidly ageing population. Yet, in addition to living longer, it is even more important to help older adults live in good health in order to maintain functional ability to enable well-being in old age,” said study author Woon-Puay Koh, a professor at the Yong Loo Lin School of Medicine in the National University of Singapore and the principal investigator of the (https://sph.nus.edu.sg/research/cohort-schs/) Singapore Chinese Health Study.
“Healthy ageing is a multidimensional concept that includes mental health. Late-life depressive symptoms that occur in older adults are common mental problems characterized by depressed feelings, lack of pleasure, delayed thinking, and reduced volitional activity, and are often accompanied by loss of appetite, insomnia, poor concentration, and increased fatigue.
“Although such symptoms are often mild and only detected through careful screening, the prevalence ranges from 17.1% to 34.4% in older populations worldwide, and it has been estimated that 8%-10% of those with depressive symptoms transition into major depression every year. Late-life depression is also associated with increased risk of morbidity and mortality, and older adults with depressive symptoms often don’t respond well to medical treatment. Hence, it is important to identify modifiable factors associated with ageing-related depression so that we can implement risk-reduction strategies at early stage.”
“We were particularly interested in fruits because they abound with antioxidant vitamins and antioxidants, and could conceivably reduce oxidative stress and curb inflammatory processes that have been linked to possible neuroinflammation underlying the development of depression in ageing. Hence, we were keen to discover epidemiologic evidence, if present, of an association between fruit intake and the risk of ageing-related depression.”
The study involved more than 13,700 participants from the Singapore Chinese Health Study, which has been following Chinese adults living in Singapore for over two decades. The participants were between 45 and 74 years old at the start of the study, which began in the mid-1990s. They provided detailed information about their diet, including how often they ate various types of fruits and vegetables. This information was collected using a comprehensive food-frequency questionnaire designed to capture the typical dietary habits of the participants.
In addition to dietary information, the study also collected data on a wide range of other factors that could influence depression, such as physical activity, smoking, alcohol consumption, and pre-existing health conditions like diabetes and cardiovascular disease. These factors were carefully considered in the analysis to ensure that the relationship between fruit consumption and depression was not being influenced by other variables.
To assess depressive symptoms, the researchers used the Geriatric Depression Scale, a widely accepted screening tool designed for older adults. Participants were considered to have depressive symptoms if they scored five or higher on this scale. The study focused on depressive symptoms identified during the third follow-up period, which occurred between 2014 and 2016, nearly 20 years after the initial dietary data was collected.
The researchers found that those who consumed more fruits during midlife were less likely to experience depressive symptoms in their later years. Specifically, people in the highest quartile of fruit consumption had a 29% lower likelihood of depressive symptoms compared to those in the lowest quartile. This inverse relationship between fruit intake and depression risk was consistent across different types of fruits, including oranges, bananas, papayas, and watermelons.
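As a rough illustration of the quartile comparison described here, the sketch below splits simulated fruit intake into quartiles and compares the odds of screening positive between the top and bottom quartiles. The data are simulated and the published analysis used multivariable regression with many covariates, which this toy version omits.

# Toy quartile comparison on simulated data; not the study's dataset or analysis.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
n = 13700                                                     # roughly the cohort size
fruit_servings = rng.gamma(2.0, 1.0, n)                       # servings/day (simulated)
p_depressive = 0.18 - 0.02 * np.clip(fruit_servings, 0, 4)    # built-in inverse association
depressive = rng.random(n) < p_depressive                     # simulated GDS >= 5 screen

df = pd.DataFrame({"fruit": fruit_servings, "gds5plus": depressive})
df["quartile"] = pd.qcut(df["fruit"], 4, labels=["Q1", "Q2", "Q3", "Q4"])

rates = df.groupby("quartile", observed=True)["gds5plus"].mean()
odds = rates / (1 - rates)
print(rates.round(3))
print("Q4 vs Q1 odds ratio:", round(odds["Q4"] / odds["Q1"], 2))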
Interestingly, the glycemic index of the fruits—an indicator of how quickly a food can raise blood sugar levels—did not seem to affect this association. Whether the fruits were low, moderate, or high on the glycemic index, higher consumption was linked to a lower risk of depression.
“The message is that eating two to three servings of fruits a day could reduce the risk of developing ageing-related depressive symptoms, and this can be achieved by eating fruits as snack between meals or as dessert after meals,” Koh told PsyPost. “Furthermore, since further studies are still needed to identify the micronutrients in fruits that may mediate the protective effects against development of depression, it is definitely better to eat the wholesome fruits rather than to consume supplements.”
In contrast, the study found no significant association between vegetable consumption and depression risk. Even after adjusting for fruit intake, the consumption of vegetables did not appear to have any protective effect against developing depressive symptoms. This finding was somewhat unexpected, as vegetables, like fruits, are rich in vitamins and other nutrients that have been thought to contribute to mental health.
There are several possible explanations for why fruits might be more protective against depression than vegetables. One theory is that fruits are often consumed raw, preserving their nutrient content, whereas vegetables are typically cooked, which can alter the availability and effectiveness of their nutrients.
“We were surprised that vegetables, also a rich source of antioxidants, did not have any effect on the risk of developing ageing-related depressive symptoms,” Koh said. “Fruits and vegetables are often prepared and consumed in different ways as fruits are typically eaten raw as snacks throughout the day, whereas vegetables are usually cooked for meals. Cooking is known to be a process which may change the bioavailability and activity of nutrients in vegetables, and this could perhaps limit the protective effects of these nutrients on depression. However, these are just our postulations, and further studies are needed to identify the specific micronutrients present abundantly in fruits that may mediate the protective effects of fruits on depression.”
While these findings are promising, there are some caveats to consider. For instance, while the study found a strong association between fruit consumption and reduced depression risk, it cannot definitively prove that eating more fruits directly causes this reduction. There could be other factors at play that were not fully accounted for in the analysis.
Future research could help to clarify these findings by exploring the specific nutrients in fruits that might protect against depression and by examining whether similar associations are found in other populations. The researchers suggest that further studies could also investigate the potential effects of different ways of preparing and consuming vegetables, to see if certain methods might enhance their protective effects.
“The long-term goal is to use an integrated and holistic approach to examine the effects of biological, lifestyle and socioeconomic factors that affect multi-dimensional outcomes as Singaporeans transit from health to morbidity in ageing,” Koh explained. “Thus far, we have also investigated and published how lifestyle and diet affect the risk of other adverse ageing outcomes such as physical frailty and cognitive impairment.”
“While ageing is a global challenge, Asia sees one of the fastest increases in the percentage of older adults aged 65 years and above. Yet, most of the studies on ageing outcomes have been done in Western populations, and evidence from Asian populations is limited. Since there are marked differences in food habits and lifestyle preferences between Western and Asian populations due to socioeconomic and regional factors, studies among Asian populations with long-term follow-up periods are necessary to fill the research gaps. We hope that our data from the Singapore Chinese Health Study, such as this one on fruits and depression, can contribute to evidence for interventions that may prove to be feasible and effective in promoting healthy ageing in older adults, especially in Asian populations.”
The study, “(https://doi.org/10.1016/j.jnha.2024.100275) Association between consumption of fruits and vegetables in midlife and depressive symptoms in late life: the Singapore Chinese Health Study,” was authored by Huiqi Li, Li-Ting Sheng, Aizhen Jin, An Pan, and Woon-Puay Koh.

Forwarded by:
Michael Reeder LCPC
Baltimore, MD

This information is taken from free public RSS feeds published by each organization for the purpose of public distribution. Readers are linked back to the article content on each organization's website. This email is an unaffiliated unofficial redistribution of this freely provided content from the publishers. 

 
