In a world where digital technologies are gradually permeating every area of daily life, including education, artificial intelligence is a tool with many potential applications. Google recently announced the launch of Gemini for Kids, a variant of its general-purpose model designed specifically for children. This initiative raises many questions: What exactly is this tool? What objectives does it pursue? And what guarantees does it offer in terms of data protection and ethics?

Motivations behind the development of an AI for children

Children today are growing up in an environment saturated with digital content. According to a 2023 study by Common Sense Media, children aged 8 to 12 spend an average of 5 hours a day in front of a screen, a figure that is constantly rising. This observation is pushing technology players to offer appropriate solutions, capable of supervising these uses while contributing to the cognitive and creative development of the youngest users.

It is in this context that Google designed Gemini for Kids, an educational AI intended to:

  • Support interactive and personalized learning.
  • Promote creativity and intellectual curiosity through content adapted to the user's age.
  • Strengthen children's online safety through filtering and parental control mechanisms.

By creating a safe and educational digital environment, this initiative aims to meet a growing need for technological tools specifically designed for minor audiences.

Features and operating principles of Gemini for Kids

Derived from the Gemini model, this child-oriented version incorporates several features adapted to the cognitive needs and abilities of young users:

  • Contextualized and simplified responses, adapted to the user's age, using accessible vocabulary and illustrated explanations.
  • Fun and interactive modules, including educational quizzes and games designed to reinforce learning through play.
  • An advanced parental control system, which allows parents to define authorized topics, set usage time limits, and access activity reports.
  • An automated content filter, responsible for blocking sensitive or inappropriate content.

In addition, Google has stated that children's personal data will not be used for advertising purposes and that no conversation history will be retained by default. This approach is in line with recommendations issued by bodies such as the Council of Europe and the CNIL, which call for a specific framework for data collection involving minors.
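Google has not published the technical details of these mechanisms. Purely as an illustration, the features listed above (topic restrictions, automated content filtering, age-adapted responses) could be sketched as follows; every name, keyword, and threshold here is a hypothetical placeholder, not Google's actual implementation:

```python
from dataclasses import dataclass, field

# Illustrative keyword block-list; a real filter would rely on far more
# sophisticated classifiers than simple keyword matching.
BLOCKED_KEYWORDS = {"violence", "gambling"}

@dataclass
class ParentalControls:
    """Hypothetical parental-control settings mirroring the features above."""
    allowed_topics: set = field(
        default_factory=lambda: {"science", "languages", "art"}
    )
    daily_limit_minutes: int = 60  # maximum daily usage time

def filter_response(text: str, topic: str, child_age: int,
                    controls: ParentalControls) -> str:
    """Apply topic restrictions, content filtering, and age adaptation."""
    # 1. Parental control: only topics the parent has authorized.
    if topic not in controls.allowed_topics:
        return "This topic has not been enabled by your parent."
    # 2. Automated content filter: block sensitive material.
    if any(kw in text.lower() for kw in BLOCKED_KEYWORDS):
        return "This content is not available."
    # 3. Age adaptation: younger users get a simplified rewrite
    #    (a real system would rephrase with a language model; here we
    #    merely tag the answer).
    if child_age < 10:
        return "[simplified] " + text
    return text
```

In this sketch, the same answer is tagged as simplified for an 8-year-old but returned unchanged for a 12-year-old, while an unauthorized topic or a blocked keyword produces a refusal message instead of the answer.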

Educational benefits and limits of this type of tool

The potential contributions of Gemini for Kids can be considered from several angles. The AI could in particular:

  • Facilitate the learning of languages and scientific concepts.
  • Help children with their homework and documentary research.
  • Stimulate their creativity through suggestions for age-appropriate activities.
  • Raise awareness of responsible uses of digital technology.

However, several limits deserve to be noted. One of the main ones concerns screen time. Research published in JAMA Pediatrics in 2022 indicates that excessive screen use in children correlates with attention disorders and difficulties with emotional regulation. It is therefore essential to strictly supervise exposure time to these technologies.

The question of replacing human interaction also remains open. An AI, however advanced, cannot substitute for exchanges between children and adults in learning and socialization processes.

Finally, despite Google's stated commitments, the challenges linked to the protection of minors' personal data call for heightened vigilance. The European Parliament's recent adoption of the AI Act (March 2025), which imposes reinforced obligations on systems intended for children, is a reminder of the importance of strict legal supervision of these technologies⁴.

Towards necessary supervision of artificial intelligence for minors

The deployment of Gemini for Kids comes in a context of increased regulation of AI in Europe and internationally. In its work on the AI Act, the European Commission plans to classify certain uses aimed at children as high-risk, subject to specific requirements for transparency, security, and respect for fundamental rights.

These measures are part of a global dynamic aimed at guaranteeing the ethical use of artificial intelligence, particularly for the most vulnerable audiences. The development of AI-based educational solutions raises crucial ethical, educational, and legal issues that designers, institutions, and families will need to anticipate and supervise.

What future balance for educational AI dedicated to children?

The arrival of Gemini for Kids attests to the rapid development of the artificial intelligence sector and its penetration into educational spheres. While this initiative offers interesting prospects for supporting children's learning in a safe and personalized environment, it also raises fundamental questions: What limits should be set on these uses? How can the central role of human interaction be preserved? What guarantees should be offered regarding the rights of minors in the digital environment?

These questions call for collective and interdisciplinary reflection, bringing together researchers, educators, parents, and legislators. Ongoing vigilance will be essential to ensure that artificial intelligence aimed at children remains a tool for empowerment and protection, not a factor of dependency or control.

References

1. Common Sense Media, The Common Sense Census: Media Use by Tweens and Teens, 2023, www.commonsensemedia.org.

2. Council of Europe, Guide to the Rights of the Child in the digital environment, 2022.

3. Madigan S. et al., Association Between Screen Time and Children's Performance on a Developmental Screening Test, JAMA Pediatrics, 2022.

4. European Parliament, Artificial Intelligence Act (AI Act), March 2025.