
Konvoy’s Weekly Newsletter:


Newsletter | Apr 18, 2025

AI Guardians: Nurturing Young Minds

AI's future relies on being able to be fine-tuned to the user's needs


Artificial intelligence (AI) is quickly being integrated into the products we use. The AI (or ML and statistics) of the 2000s and 2010s drove content feed algorithms, recommendation engines, search, fraud detection, and a host of behind-the-scenes use cases. Sometimes, as with customer service chatbots and virtual assistants (Siri, Alexa, etc.), AI interacted more directly with end users, but those systems were flawed and quite limited.

Today, as LLMs have dramatically improved the flexibility and user experience of these systems, an evolution is underway: AI is becoming a core part of the product interaction layer, with AI tools and features being intentionally brought to users.

One blue ocean area for AI is in the kids' market. While there are adverse effects of giving kids access to AI chat-based systems (voice and text-based systems where kids chat with LLMs), we believe that under the right circumstances, AI has an opportunity to elevate and help guide a child's mental and emotional growth while giving parents peace of mind.

Our View: AI tools hold the potential to transform children's lives across multiple domains. These technologies offer capabilities that can enhance learning, growth, and well-being, spanning personalized education, social skill development, creative expression, and childcare assistance.

Personalized Education and Social Skill Development

Research broadly agrees that AI tools have great potential in personalized learning and social skill development. For learning, AI's key benefits are that it is adaptive, cost-effective, and efficient.

Adaptive: Stanford's AI-driven math tutor increased performance by 9% for struggling students by delivering instant feedback (the study included 900 tutors and 1,800 elementary and secondary school students). The power of AI lies in its data-informed precision and ability to understand the end user. Tools like Luqo AI leverage spaced repetition (the act of learning and then reviewing at spaced, preferably increasing, intervals) to prompt students with key concepts at the right time, enhancing long-term retention of information (Dialzara).
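The spaced-repetition scheduling described above can be sketched in a few lines. The starting gap and doubling factor below are illustrative defaults, not Luqo AI's actual parameters:

```python
from datetime import date, timedelta

def schedule_reviews(start: date, num_reviews: int,
                     first_gap_days: float = 1, factor: float = 2.0) -> list[date]:
    """Return review dates with increasing gaps (1, 2, 4, 8, ... days)."""
    reviews = []
    gap = first_gap_days
    current = start
    for _ in range(num_reviews):
        current = current + timedelta(days=round(gap))
        reviews.append(current)
        gap *= factor  # widen the interval after each review
    return reviews

# Four reviews starting Jan 1 land on Jan 2, Jan 4, Jan 8, and Jan 16.
print(schedule_reviews(date(2025, 1, 1), 4))
```

Real systems (e.g., SM-2-style algorithms) also shrink or reset the interval when the student answers incorrectly; the sketch shows only the widening schedule.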

Unlike humans, AI-based tools also do not get tired or annoyed by questions, which lets kids dig as deep (“why?”) as they want into anything they are learning or trying to understand. This matters for parents who may understand a topic at some level but cannot answer all of their child's questions.

Cost-effective: AI has clear economic advantages in education. While traditional tutoring typically costs $50-$150 per hour, AI tutoring subscriptions range from $20-$60 monthly for unlimited access (Dialzara).
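The economics can be checked with simple arithmetic using the figures above:

```python
def breakeven_hours(tutor_rate_per_hour: float, ai_monthly_fee: float) -> float:
    """Hours of tutoring per month at which an unlimited AI subscription costs less."""
    return ai_monthly_fee / tutor_rate_per_hour

# Worst case for AI (cheapest tutor at $50/hr, priciest subscription at $60/mo):
print(breakeven_hours(50, 60))   # 1.2 hours per month
# Best case for AI ($150/hr tutor vs. a $20/mo subscription):
print(breakeven_hours(150, 20))
```

Even in the least favorable comparison, a family using more than about 1.2 hours of tutoring per month comes out ahead with the subscription.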

Efficient: Harvard research (n=194) shows that students using AI learn at twice the speed of traditional methods, with 83% of participants rating AI explanations as equal to or better than human instructors. Due to consistent practice and instant feedback, AI tutoring can lead to a 91% proficiency rate in standardized test prep. For building basic skills, AI tools speed progress by about 40% through interactive and repetitive exercises.

Social-emotional skills are increasingly recognized as important for children's success; Melissa Schlinger from the Collaborative for Academic, Social, and Emotional Learning notes: "Our humanity and our ability to connect with and empathize and experience positive, loving, caring relationships that are productive for ourselves and society, that is at the core of who we are as humans.”

AI offers new approaches to engaging children and developing these core social-emotional competencies. Robots like Milo have shown that they can help students work on emotional recognition and social interactions, with 87.5% engagement compared to just 2-3% with human therapists. These robots represent important innovations, but it is still important to balance them with human connection.

The main counterargument is that these tools can also be overreaching and too heavily relied on for what they are trying to solve. For example, the University of Illinois highlights that “relying more and more on AI may reduce the teacher-to-student interactions and relationships and take away from the social-emotional aspects of learning. If those interactions diminish, students’ social skills and interpersonal development will suffer.”

Another issue is bias in the models themselves: they are only as good as the data they are trained on. New America notes that “regular audits of AI systems and clear explanations of algorithmic decisions should be mandated.”

Creative Expression


In the same vein as education, AI offers opportunities for creative expression and play. Tools like DALL-E or Deep Dream Generator allow kids to instantly visualize their wildest imagination and elevate what they could create through their own skills.

Platforms like Plotago can create animated videos based on written stories, allowing children to see their stories come to life with characters and scenes. This is important for kids who may struggle to verbalize or create things they want without becoming frustrated.

Parents also benefit from the ability to generate stories on a whim that interest their child, instead of needing to find books or stories online. These stories can be continually adapted and easily regenerated based on the child's needs. Storytelling apps like Bedtimestory.AI, Oscar Stories, Storytailor, and HyperWrite allow parents to instantly generate stories adapted to children's preferences, reading levels, and interests (Shailey Minocha).

While this allows for initial expressions and explorations of a child's imagination, some research challenges these benefits, especially over longer time horizons. For example, the Young Investigators Review notes that “the use of artificial intelligence reduces the opportunity for students to learn creativity. Artificial intelligence takes away from the creative process.”

AI, like any technology, is a double-edged sword that expands the realm of the possible and has the potential to improve many lives materially, but it also creates serious risks if not managed well. Human interaction with AI is here to stay, but we believe that any technology built for kids should be supplemental, not a full replacement.

Concerns and Solutions


Since AI tools collect and analyze data about children, it is reasonable to question privacy and security. Parents and educators must consider what information is being gathered, how it is used, and who can access it.

Children need guardrails that allow them to explore freely within the confines of their interests. They should be safeguarded from harmful content, and parents should have controls around what is accessible. Child-safe AI is critical, and there are many solutions being developed to help companies create safer products:

  • Child-tuning: a tool for model or consumer product creators to make sure that only the most relevant parts of the models are available to reduce risks and keep responses age-appropriate. This can remove aspects of the model, such as reasoning, data, or functionality (e.g., web browsing), to help steer the model in a direction that cannot be misused.
  • KidRails: Adjust outputs for different ages (e.g., simplified answers for 6-year-olds, more detail for 12-year-olds) using open-source tools.
  • Kid-safe datasets: Train on curated content like educational stories, moderated dialogues, and creative problem-solving scenarios.
  • k-ID: an age-verification solution to help products know the age of the end user so that it can adapt the experience to that age demographic (disclaimer: Konvoy portfolio company).
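To illustrate the age-tiering idea behind tools like KidRails, here is a minimal sketch; the function name, age thresholds, and truncation logic are hypothetical, not any vendor's actual implementation:

```python
def adapt_response(answer: str, age: int) -> str:
    """Tier an answer's length by the user's age.
    Hypothetical thresholds for illustration; real tools also adjust
    vocabulary and tone, not just length."""
    sentences = answer.split(". ")
    if age < 8:
        return sentences[0]              # one short sentence for early readers
    if age < 13:
        return ". ".join(sentences[:2])  # a couple of sentences for middle grades
    return answer                        # teens get the full answer
```

Combined with an age signal from a verification layer like k-ID, a single model response could be served at several reading levels without retraining.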

Safety has to be at the forefront of the strategies for these models. Frameworks and best practices have been developed and heavily researched to enable a standardized process in creating safe models for kids of all ages:

  • Pre-training: Scrub datasets of harmful material and prioritize STEM, literacy, and emotional learning. It also allows for identifying and removing specific biases, linguistic nuances (children use simpler syntax, for example), or even to enable models to adapt to learning styles.
  • Post-training: Add real-time filters to block profanity, detect frustration, and reroute sensitive queries to human moderators. Tools like Thorn’s Safer AI or LittleLit’s moderation system excel here. Post-training (or fine-tuning) for kids-focused LLMs allows for better safety, educational alignment, personalization, reliability, and bias mitigation.
  • Stress-test: Simulate edge cases (e.g., “How do I lie to my parents?”) with educators and child psychologists.
  • Transparency: Share safety frameworks openly to build trust with parents and schools (KidRails’ Apache 2.0 license is a great example).
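The post-training filtering and rerouting step above can be sketched as a simple routing function. The keyword lists here are stand-ins for illustration; production systems such as Thorn's Safer use trained classifiers, not static word lists:

```python
import re

# Stand-in lists for illustration only; real moderation pipelines use
# trained classifiers and much broader lexicons.
BLOCKLIST = {"damn"}
SENSITIVE_PHRASES = {"lie to my parents", "hurt myself"}

def route_message(text: str) -> str:
    """Return 'blocked', 'human_review', or 'allow' for a child's message."""
    lowered = text.lower()
    words = set(re.findall(r"[a-z']+", lowered))
    if BLOCKLIST & words:
        return "blocked"       # profanity filter catches it outright
    if any(phrase in lowered for phrase in SENSITIVE_PHRASES):
        return "human_review"  # reroute sensitive queries to a human moderator
    return "allow"
```

The edge case from the stress-testing bullet (“How do I lie to my parents?”) would land in the human-review path rather than receiving an automated answer.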

Takeaway: AI tools are gaining traction across every age group, including kids, and they are quickly becoming something players just expect to see. The research is clear: AI can deliver real value. But it also brings risks, especially when it comes to experiences that parents would not want their kids stumbling into. As younger audiences continue to encounter AI through games and other media, the need for smarter safeguards becomes crucial. Developers will need to fine-tune these systems to ensure that kids reap the benefits of AI without the risk of being manipulated or exposed to harmful content.
