Online Platforms Are Afforded Significant Discretion Over What They Choose to Delete From Their Systems
James Domen and Church United alleged that Vimeo discriminated against them on the basis of their religion and sexual orientation when it deleted Church United’s account from Vimeo’s platform. The district court concluded that Vimeo deleted Church United’s account because Church United violated one of Vimeo’s content policies, which bars the promotion of sexual orientation change efforts (“SOCE”) on its platform. That enforcement decision, in turn, fell within the scope of the good-faith content-policing immunity that the CDA provides to providers of interactive computer services.
The 2nd Circuit ruled that pursuant to Section 230(c)(2), Vimeo is free to restrict access to material that, in good faith, it finds objectionable.
On appeal, Domen and Church United argued that Vimeo demonstrated bad faith by:
- discriminating against them on the basis of their religion and sexual orientation, which they describe as “former” homosexuality;
- deleting Church United’s entire account, as opposed to only the videos at issue; and
- permitting other videos with titles referring to homosexuality to remain on the website.
Agreeing with other Circuit Courts, the 2nd Circuit noted that Section 230 immunity is broad. Specifically, subsection (c)(2) immunizes interactive computer service providers from liability for “any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected.” 47 U.S.C. § 230(c)(2). Notably, the provision explicitly protects the restriction of access to content that a provider considers objectionable, even if the material would otherwise be constitutionally protected, granting providers significant subjective discretion. The court therefore ruled that Vimeo is statutorily entitled to consider SOCE content objectionable and may restrict access to that content as it sees fit, including by deleting accounts.
In 2018, Vimeo notified Domen that five of the videos he had uploaded violated the platform’s SOCE policy and gave him 24 hours to remove them or face action, including possible deletion of the account. When Domen did not remove the videos, Vimeo deleted the account. The court noted that Vimeo acted in good faith in its content moderation, particularly because Vimeo warned Domen that it would delete the account and told him to download and save the videos at risk of deletion.
The 2nd Circuit’s decision shores up subsection (c)(2) of Section 230 as a powerful defense against legal actions over online content or account deletions. Historically, subsection (c)(1) has been the primary statutory tool for defending against such actions because, unlike (c)(2), it does not condition immunity on a showing of good faith. The court specifically ruled that Section 230(c)(2) immunity can be raised on a preliminary motion to dismiss, welcome news to online platforms that wish to avoid costly, protracted litigation to obtain Section 230 immunity for account deletions based on content their users produce.
Case citation: Domen v. Vimeo, Inc., 2021 WL 922749 (2d Cir. March 11, 2021)