Tech companies are making more changes to keep kids safe online.
Apple's latest iOS update includes a nudity-detection feature that alerts children if they're about to send or receive a picture message containing nudity.
However, the feature isn't on by default. It must be turned on through a Family Sharing account.
If nudity is detected, the phone will blur the image in the message and warn about the dangers of sharing explicit photos. It will then ask the viewer if they want to proceed.
"I'm not going to say it's going to change or detract all behaviors, but it is going to encourage kids to stop, think and reflect for a moment," said Ana Homayoun, the author of "Social Media Wellness."
The feature does not notify a parent when a nude photo appears on a child's device. Instead, it gives the child the option to message a trusted grown-up and alert a parent themselves.
Homayoun says the feature drives home the importance of children having trusted adults with whom they can talk.
"We need every child to know that no matter what happens, they have trusted adults that they can reach out to if something doesn't go as planned online," she said.
Thorn, a nonprofit that builds technology to protect kids from sexual abuse, found that more than twice as many adolescents ages 9 to 12 shared their own nude photos in 2020 as in 2019.
Adolescents are also more likely to see explicit pictures of other kids shared without consent and to view that behavior as normal. Thorn experts say that attitude is common among adolescents going through puberty.
Experts say parents should talk with children of all ages and grade levels who have access to screens and photo messaging about the types of content they can encounter online. Thorn's website offers a parent hub with resources for having age-appropriate conversations about digital safety.