“Is Apple going to see the nudes on my phone?”
“Is Apple going to mistake my newborn in the bathtub for child pornography?”
“Is Apple scanning all my messages and photos?”
It certainly wasn’t an average week in tech questions from my friends and family. And I don’t blame them for the freakout. Apple’s announcement last week of two distinct child-protection measures for iOS confused—and creeped out—even the most tech-savvy.
For those catching up: One initiative is software intended to identify child pornography—also known as “child sexual abuse material,” or CSAM—when it is stored using Apple’s iCloud Photos. The other is a parental control that enables iPhones, iPads and Macs to blur sexually explicit photos in the Messages app and warn children about sending or receiving them. It can also alert parents of children 12 and under that their child is sending or receiving such images.