The ARC Centre of Excellence for the Digital Child (Digital Child), the ARC Centre of Excellence for Automated Decision-Making and Society (ADM+S), and QUT Gen AI Lab have partnered on a new project designed to help young children explore key ethical challenges associated with voice AI Chatbots.
Working in collaboration with children, the pilot phase of the project, “Making AI Friends? Enhancing Primary Years AI Literacy with a Voice AI Chatbot Experience”, has focused on a specific ethical problem known as sycophancy – the tendency of some AI systems to “always agree” with users, prioritising likability over accuracy, critical thinking or ethical judgement.
Dr Henry Fraser from QUT said the project addresses a growing issue in how AI systems interact with users.
“Adults and children live in the same world, and that includes the digital world. Building a better and safer digital world is just as relevant to children as it is to adults – maybe even more relevant.”
Director of the ARC Centre of Excellence for the Digital Child at QUT, Distinguished Professor Susan Danby, said the project places children’s perspectives at the centre of AI design.
“Children bring curiosity and insight to their everyday interactions, including their digital worlds, whether at school or at home. They have the right to be heard, and this project offers them a genuine role in shaping the digital experiences they use.”
“When we work alongside children, we can create technologies that respect their capabilities and help them navigate the digital world safely and confidently.”
The pilot combined participatory learning and co-design activities with children aged 6–9, designed and led by Digital Child researchers, alongside a real-time interactive AI ‘game’ developed by the QUT Gen AI Lab and an explainer animation created by ADM+S with Maria Pinto.
The game embeds custom voice agents in a sandbox environment, allowing children to compare how differently designed chatbots respond to questions, ideas and ethical dilemmas.