Apple has ordered Parler to tighten up its moderation tactics and wipe “objectionable content” from the platform within the next 24 hours or it will be booted from the App Store.
The decision comes a day after Parler’s CEO flaunted the platform’s hands-off approach to content moderation in the wake of premeditated mayhem at the U.S. Capitol, much of which has been linked to Parler. In an email sent by Apple to Parler this morning (Pacific Time) and obtained by Input, the company provided numerous examples of Parler users explicitly calling for violence and referenced CEO John Matze’s comment that he doesn’t “feel responsible for any of this and neither should the platform.”
Apple disagrees. “We want to be clear that Parler is in fact responsible for all the user generated content present on your service and for ensuring that this content meets App Store requirements for the safety and protection of our users,” the company said. “We won’t distribute apps that present dangerous and harmful content.”
“We have now rejected your app for the App Store Review Guidelines detailed below,” it added.
UPDATE (7:45 p.m. ET): Google has weighed in, too, and Parler faces similar punitive measures on the Play Store. Google’s statement follows at the end of this story.
The email from Apple is reproduced in its entirety below (in italics), along with the images the company included as examples:
We require your immediate attention regarding serious App Store guideline violations that we have found with your app, Parler.
We have received numerous complaints regarding objectionable content in your Parler service, accusations that the Parler app was used to plan, coordinate, and facilitate the illegal activities in Washington D.C. on January 6, 2021 that led (among other things) to loss of life, numerous injuries, and the destruction of property. The app also appears to continue to be used to plan and facilitate yet further illegal and dangerous activities.
Our investigation has found that Parler is not effectively moderating and removing content that encourages illegal activity and poses a serious risk to the health and safety of users in direct violation of your own terms of service, found here: https://legal.parler.com/documents/Elaboration-on-Guidelines.pdf
Examples of these complaints can be viewed on these links:
https://twitter.com/slpng_giants/status/1347190280492089344?s=20
https://twitter.com/EmmanueLoree/status/1347260055410896897/photo/1
https://twitter.com/Lovedrea/status/1347263797614972928?s=20
https://twitter.com/Wilmographer/status/1346714000554303489?s=20
https://twitter.com/pjg0014/status/1347265499210592256?s=20
Content of this dangerous and harmful nature is not appropriate for the App Store. As you know from prior conversations with App Review, Apple requires apps with user generated content to effectively moderate to ensure objectionable, potentially harmful content is filtered out. Content that threatens the well being of others or is intended to incite violence or other lawless acts has never been acceptable on the App Store.
Your CEO was quoted recently saying “But I don’t feel responsible for any of this and neither should the platform, considering we’re a neutral town square that just adheres to the law.” We want to be clear that Parler is in fact responsible for all the user generated content present on your service and for ensuring that this content meets App Store requirements for the safety and protection of our users. We won’t distribute apps that present dangerous and harmful content.
We have now rejected your app for the App Store Review Guidelines detailed below.
Guideline 1.1 – Safety – Objectionable Content
We found that your app includes content that some users may find upsetting, offensive, or otherwise objectionable. Specifically, we found direct threats of violence and calls to incite lawless action.
Guideline 1.2 – Safety – User Generated Content
Your app enables the display of user-generated content but does not have sufficient precautions in place to effectively manage objectionable content present in your app.
See the attached screenshots for more details.
Next Steps
Nothing is more important to the App Store than the safety of our users. You must resolve these issue [sic] immediately for your app to remain on the App Store.
Please remove all objectionable content from your app and submit your revised binary for review. Such content includes any content similar to the examples attached to this message, as well as any content referring to harm to people or attacks on government facilities now or at any future date. In addition, you must respond to this message with detailed information about how you intend to moderate and filter this content from your app, and what you will do to improve moderation and content filtering your service for this kind of objectionable content going forward.
To ensure there is no interruption of the availability of your app on the App Store, please submit an update and the requested moderation improvement plan within 24 hours of the date of this message. If we do not receive an update compliant with the App Store Review Guidelines and the requested moderation improvement plan in writing within 24 hours, your app will be removed from the App Store.
If you have any questions about this message, please reply and let us know.
Regards,
App Review Board
Images Apple included with the email:
How we got here — Ever since MAGA extremists attempted a putsch on Wednesday to keep President Trump in office, all eyes have been on the platforms they used to plan the insurrection. Namely, Parler. In recent months, Parler has become a haven for Trump acolytes thanks to its lax guidelines, which state that “in no case” will it make the decision to pull content or accounts “on the basis of the opinion expressed.” Those guidelines have allowed dangerous disinformation and all-out calls for violence to thrive, serving as a petri dish for hate. As Input reported yesterday, many Parler users have set their focus on forming militias across the country and have openly stated plans for a second attack at the end of the month.
Apple and Google have consequently been facing heat from a public left to wonder why they have continued to enable the platform by keeping its app in their stores.
Congratulations, you played yourself — Instead of attempting to save face, Parler CEO John Matze went for the opposite in an interview with veteran tech journalist Kara Swisher immediately after the events at the Capitol, making it explicitly clear that planning a civil war falls within Parler’s accepted behaviors. Parler users are free to keep doing that.
During the damning conversation on Wednesday evening, which went public the following day on Swisher’s Sway podcast, Matze repeatedly met the journalist’s questions with arrogant responses that danced around the issues, only addressing them head-on when pressed. His fallback defense was a tired precept: if not here, “they’ll just go somewhere else.”
Whether or not it’s Parler, it’s Twitter, it’s Facebook, it’s Google, it’s Telegram, WhatsApp, whatever it might be, you can’t stop people and change their opinions by force by censoring them. They’ll just go somewhere else and do it. So as long as it is legal, it’s allowed.
At one point, he even remarks that these “disenfranchised” groups “need to come together and have a discussion on a place like Parler.”
There are some exceptions to Parler’s “anything goes” principle, though. Doxxing? Instant removal. Pornography or nudity? There’s an algorithm for that. For “illegal activity,” on the other hand, there’s a caveat. It could be taken down, as long as it’s really illegal; Matze reminds us, “if someone says something mean or violent, it’s not necessarily illegal.” The Parler CEO also downplayed the threats of violence brewing in the app and their real-life consequences, despite the fact that they have already begun to materialize.
Well, for violence, and advocation of violence, or violence specifically, it needs to be a clear and imminent threat. And I don’t know — I’ve been witnessing what happened today a little bit, but I’m not really too much in the weeds on this stuff. I haven’t seen a whole lot of illegal activity. Maybe there has been some, but it’s a minority of the cases.
While he insists that specific instructions to carry out violence are not allowed, mounting examples of Parler posts show users doing just that. One example that has drawn the most attention in the last few days goads followers to return to the Capitol armed on January 19.
Regardless of what really is and isn’t circulating among Parler’s users, Matze says he thinks neither he nor the app should be held responsible.
Update: CEO John Matze has responded, on Parler, of course.
Anyone who buys an Apple phone is apparently a user. Apperently [sic] they know what is best for you by telling you which apps you may and may not use.
Apparently they believe Parler is responsible for ALL user generated content on Parler. Therefor [sic] by the same logic, Apple must be responsible for ALL actions taken by their phones. Every car bomb, every illegal cell phone conversation, every illegal crime committed on an iPhone, Apple must also be responsible for.
Standards not applied to Twitter, Facebook or even Apple themselves, apply to Parler.
UPDATE (7:45 p.m. ET): Google has issued a statement of its own:
“In order to protect user safety on Google Play, our longstanding policies require that apps displaying user-generated content have moderation policies and enforcement that removes egregious content like posts that incite violence. All developers agree to these terms and we have reminded Parler of this clear policy in recent months.
“We’re aware of continued posting in the Parler app that seeks to incite ongoing violence in the U.S. We recognize that there can be reasonable debate about content policies and that it can be difficult for apps to immediately remove all violative content, but for us to distribute an app through Google Play, we do require that apps implement robust moderation for egregious content. In light of this ongoing and urgent public safety threat, we are suspending the app’s listings from the Play Store until it addresses these issues.”