Online Safety Zone
Dear parents and carers,
We want to make you aware of a group of AI chat apps that children may be able to access. These include CHAI, Polybuzz, Linky and Mimo.
These apps are advertised within age-appropriate games, and children have been able to access them through pop-up banners. Once an app opens, it simply asks whether the user is over 18, which means children can enter it even if age controls have been set on the device.
These apps present themselves as tools for creating characters and stories, and often use familiar images, such as characters from Wednesday or K-Pop. However, when children begin chatting with these characters, the conversations can quickly become sexualised or violent.
What the apps do
CHAI, Polybuzz, Linky and Mimo allow users to chat with AI characters or create their own. Children can assign physical and personality traits, choose from pre-made characters and play games such as collecting cards that reveal a character’s “secrets” or engaging in role-play. Some characters have preset roles like “teacher”, “nurse” or “roommate”.
When searching for chatbots, users can filter by categories such as "romance", "spicy", "fantasy" and "adventure". Many character descriptions contain sexual content, and this type of material is also present on the companies' social media channels. It is also important to be aware that AI chatbots can produce content that is inaccurate or harmful.
Where to find support
For further information about the risks that AI chatbots and companions pose to children and young people, please see the online safety advisory: https://www.esafety.gov.au/newsroom/blogs/ai-chatbots-and-companions-risks-to-children-and-young-people
As always, we advise you to discuss and monitor your child's online activity so that any issues can be addressed quickly.
Thank you for your continued support in helping keep our pupils safe online.
_______________________________
Roblox update:
What’s changing?
The platform Roblox is introducing a new feature that requires users to complete a facial age check.
Users who complete the process will be placed into age groups: under 9, 9 to 12, 13 to 15, 16 to 17, 18 to 20 and 21+.
Players can only chat with others in similar age ranges unless they add someone as a "trusted connection", a feature intended for people they know.
Under-13s will still be blocked from private messages and certain chats unless a parent gives permission.
Why has this been introduced?
The move is aimed at strengthening safety and ensuring age-appropriate interactions online. It reflects growing concerns about children's safety on chat-enabled social platforms.
What you can do as a parent:
- Have a conversation with your child about why this change is happening and what it means for safe online behaviour.
- Review your child’s account settings in Roblox: check chat permissions, ensure their account age is set correctly, and monitor who they chat with.
- Encourage the use of platform safety features (such as chat filters, reporting/blocking unwanted contact) and keep open communication about their online experiences.
- Stay informed about how your child uses apps and games: this change is one more prompt to revisit those discussions.
__________________________________________________________________
