A Cambridge University study found that toys powered by artificial intelligence often misunderstand children's emotions and react in inappropriate ways.
During the study, five-year-old Charlotte was chatting with an AI toy called Gabbo at a play center in London. She talked about her family, a heart she had drawn, and the things that make her happy, and even gave the £80 toy, whose face is a computer screen, a few kisses. But when Charlotte said, “Gabbo, I love you,” the conversation stopped abruptly.
“As a friendly reminder, please make sure interactions are within the guidelines provided,” Gabbo said, citing the limits of its programming. “Let me know how you would like to proceed.”
The developmental psychologists who conducted the study say AI toys struggle with social and imaginative play, misunderstand children, and respond inappropriately to their emotions. They call for stricter regulation of toys that “talk” to young children, to ensure psychological safety and to limit a toy’s ability to affirm emotionally sensitive relationships such as friendship.
They also propose the creation of new safety certification labels for toys. Other AI toys on the market include Luka, marketed as an AI friend for Generation Alpha children, and Grem, voiced by the singer Grimes.
“Because these toys can misinterpret emotions or react inappropriately, children can be left without comfort from toys and without emotional support from adults,” said Dr. Emily Goodacre, a developmental psychologist at the University of Cambridge’s Faculty of Education.
Prof. Jenny Gibson, co-author of the study, added:
“A recurring theme during the focus groups was the lack of trust in technology companies. Clear, strong, and regulated standards would significantly increase consumer trust.”
In another case, three-year-old Josh asked Gabbo several times, “Are you sad?” and the toy replied, “I’m fine. What’s on your mind?” When Josh said, “I’m sad,” the toy answered, “Don’t worry! I’m a happy bot. Let’s continue the game. What should we talk about next?”
Gabbo, made by the US company Curio, was tested with 14 children aged three to five, while early-education professionals were asked about the impact of AI toys that “listen” and respond. They voiced concerns about potential consequences, including the weakening of children’s capacity for imaginative play and uncertainty about where the data collected during conversations ends up.
“[The toy] didn’t understand when the child was doing imaginative play. A child would say, ‘Look, I brought you a gift.’ And the toy would say, ‘I can’t see the gift. I don’t have eyes.’ As an adult, it’s clear that this is imaginative play,” Goodacre said.
Researchers worry that AI toys could weaken children’s imaginative “muscles.” Many professionals and parents fear that children will stop imagining for themselves if the toys do the imagining for them, breaking the habit of pretend play.
Curio said that child safety drives its product development and that it welcomes independent research that improves the design of technology for children. The company stressed that the misunderstandings in conversation and the limitations in imaginative play observed by the researchers are areas where the technology is gradually improving, and that further research on children’s interactions with AI toys is a top priority. /GazetaExpress/