Adolescents with high emotional intelligence are less likely to trust AI

A new study published in the journal Behavioral Sciences highlights generational differences in how adolescents and their parents interact with artificial intelligence. The research suggests that teens with higher emotional intelligence and supportive, authoritative parents tend to use AI less frequently and with greater skepticism. Conversely, adolescents raised in authoritarian environments appear more likely to rely on AI for advice and trust it implicitly regarding data security and accuracy.

Artificial intelligence has rapidly integrated into daily life, reshaping how information is accessed and processed. This technological shift is particularly consequential for adolescents, who are at a developmental stage where they are refining their social identities and learning to navigate complex information ecosystems.

While AI offers educational support, it also presents risks related to privacy and the potential for emotional over-reliance. Previous investigations have examined digital literacy or parenting styles in isolation. However, few have examined how these factors interact with emotional traits to shape trust in AI systems.

The authors of this study sought to bridge this gap by exploring the concept of a “digital secure base.” This theoretical framework proposes that strong, supportive family relationships provide a safety net that helps young people explore the digital world responsibly.

The researchers aimed to understand if emotional skills and specific family dynamics might predict whether a teen uses AI as a helpful tool or as a substitute for human connection. They hypothesized that the quality of the parent-child relationship could influence whether an adolescent develops a critical or dependent attitude toward these emerging technologies.

To investigate these dynamics, the research team recruited 345 participants from southern Italy: 170 adolescents between the ages of 13 and 17, and 175 parents with an average age of roughly 49. Within this group, the researchers matched 47 parent-adolescent pairs for a more detailed analysis. Data were collected using structured online questionnaires.

Participants completed several standardized assessments. They answered questions regarding parenting styles, specifically looking for authoritative or authoritarian behaviors. They also rated their own trait emotional intelligence, which measures how people perceive and manage their own emotions. Additional surveys evaluated perceived social support from family and friends.

To measure AI engagement, the researchers developed specific questions about the frequency of use and trust. These items asked about sharing personal data, seeking behavioral advice, and using AI for schoolwork. Trust was measured by how much participants believed AI data was secure and whether AI gave better advice than humans.

The data revealed a clear generational divide regarding usage habits. Adolescents reported using AI more often than their parents for school or work-related tasks. Approximately 32 percent of teens used AI for these purposes frequently, compared to only 17 percent of parents. Adolescents were also more likely to ask AI for advice on how to behave in certain situations.

In terms of trust, the younger generation appeared much more optimistic than the adult respondents. Teens expressed higher confidence in the security of the data they provided to AI systems. They were also more likely to believe that AI could provide better advice than their family members or friends. This suggests that adolescents may perceive these systems as more competent or benevolent than their parents do.

The researchers then analyzed how personality and family environment related to these behaviors. They found that adolescents with higher levels of trait emotional intelligence tended to use AI less frequently. These teens also expressed lower levels of trust in the technology. This negative association suggests that emotionally intelligent youth may be more cautious and critical. They may rely on their own internal resources or human networks rather than turning to algorithms for guidance.

A similar pattern emerged regarding parenting styles. Adolescents who described their parents as authoritative—characterized by warmth, open dialogue, and clear boundaries—were less likely to rely heavily on AI. This parenting style was associated with what the researchers called “balanced” use. These teens engaged with the technology but maintained a level of skepticism.

A different trend appeared for those with authoritarian parents. This parenting style involves rigid control and limited communication. Adolescents in these households were more likely to share personal data with AI systems. They also tended to seek behavioral advice from AI more often. This suggests a potential link between a lack of emotional support at home and a reliance on digital alternatives.

Using the matched parent-child pairs, the study identified two distinct profiles among the adolescents. The researchers labeled the first group “Balanced Users.” This group made up about 62 percent of the matched sample. These teens had higher emotional intelligence and reported strong family support. They used AI cautiously and did not view it as superior to human advice.

The second group was labeled “At-Risk Users.” These adolescents comprised roughly 38 percent of the matched pairs. They reported lower emotional intelligence and described their parents as more authoritarian. This group engaged with AI more intensively. They were more likely to share personal data and trust the advice given by AI over that of their parents or peers. They also reported feeling less support from their families.

These findings imply that emotional intelligence acts as a buffer against uncritical technology adoption. Adolescents who can regulate their own emotions may feel less need to turn to technology for comfort or guidance. They appear to approach AI as a tool rather than a companion. This aligns with the idea that emotionally competent individuals are better at critical evaluation.

The connection between parenting style and AI use highlights the importance of the family environment. Authoritative parenting seems to foster independent thinking and digital caution. When parents provide a secure emotional foundation, teens may not feel the need to seek validation from artificial agents. In contrast, authoritarian environments might leave teens seeking support elsewhere. If they cannot get emotional regulation from their parents, they may turn to AI systems that appear competent and non-judgmental.

The study provides evidence that AI systems cannot replace the emotional containment provided by human relationships. The results suggest that rather than simply restricting access to technology, interventions should focus on strengthening family bonds.

Enhancing emotional intelligence and encouraging open communication between parents and children could serve as protective factors. This approach creates a foundation that allows teens to navigate the digital world without becoming overly dependent on it.

The study has several limitations that affect how the results should be interpreted. The design was cross-sectional, meaning it captured data at a single point in time. This prevents researchers from establishing that parenting styles cause specific AI behaviors; the relationship could run in the other direction or involve other factors. The sample size for the matched parent-child pairs was also relatively small, which limits the ability to generalize the specific user profiles to broader populations.

Additionally, the study relied on self-reported data. Participants may have answered in ways they felt were socially acceptable rather than entirely accurate. There is also the potential for common-method bias since the same individuals provided data on both their personality and their technology use. The research focused primarily on psychological and relational factors. It did not account for socioeconomic status or cultural differences that might also influence access to and trust in AI.

Future research should look at these dynamics over time. Longitudinal studies could track how changes in emotional intelligence influence AI trust as teens grow older. Researchers could also include objective measures of AI use, such as usage logs, rather than relying solely on surveys.

Exploring these patterns in different cultural contexts would also be beneficial to see if the findings hold true globally. Further investigation is needed to understand how specific features of AI, such as human-like conversation styles, affect adolescents with lower emotional support.

The study, “Emotional Intelligence and Adolescents’ Use of Artificial Intelligence: A Parent–Adolescent Study,” was authored by Marco Andrea Piombo, Sabina La Grutta, Maria Stella Epifanio, Gaetano Di Napoli, and Cinzia Novara.
