Apple says researchers can vet its child safety features. But it’s suing a startup that does just that.


    In 2019, Apple filed a lawsuit against Corellium, which lets security researchers cheaply and easily test mobile devices by emulating their software rather than requiring them to access the physical devices. The software, which also emulates Android devices, helps researchers find and fix the security flaws they are hunting for.

    In the lawsuit, Apple argued that Corellium violated its copyrights, enabled the sale of software exploits used for hacking, and shouldn’t exist. The startup countered that its use of Apple’s code was a classic protected case of fair use. The judge has largely sided with Corellium so far. Part of the two-year case was settled just last week, days after news of Apple’s CSAM-detection technology became public.

    On Monday, Corellium announced a $15,000 grant for a program it is specifically promoting as a way to look at iPhones under a microscope and hold Apple accountable. On Tuesday, Apple filed an appeal continuing the lawsuit.

    In an interview with MIT Technology Review, Corellium’s chief operating officer, Matt Tait, said that the comments from Craig Federighi, Apple’s senior vice president of software engineering, do not match reality.

    “That’s a very cheap thing for Apple to say,” he says. “There is a lot of heavy lifting happening in that statement.”

    “iOS is designed in a way that’s actually very difficult for people to do inspection of system services,” he adds.

    He is not the only one disputing Apple’s position.

    “Apple is exaggerating a researcher’s ability to examine the system as a whole,” says David Thiel, chief technology officer at Stanford’s Internet Observatory. Thiel, the author of a book called iOS Application Security, tweeted that Apple spends heavily to prevent the very kind of inspection it now claims is possible.

    “It requires a convoluted system of high-value exploits, dubiously sourced binaries, and outdated devices,” he wrote. “Apple has spent vast sums specifically to prevent this and make such research difficult.”

    Surveillance accountability

    If you want to see exactly how Apple’s complex new tech works, you can’t simply look inside the operating system on the iPhone that you just bought at the store. The company’s “walled garden” approach to security has helped solve some fundamental problems, but it also means that the phone is designed to keep visitors out—whether they’re wanted or not. 

    (Android phones, meanwhile, are fundamentally different. While iPhones are famously locked down, all you need to do to open up an Android phone is plug in a USB cable, install the developer tools, and gain top-level root access.) 

    Apple’s approach leaves researchers locked in a never-ending battle with the company as they try to gain the level of insight they require.

    There are a few possible ways Apple and security researchers could verify that no government is weaponizing the company’s new child safety features, however.




