Privacy, Technology and Perspective
Free Speech, Twitter and the Law of Unintended Consequences. Increasing talk about the role(s) and responsibilities of big platforms has us thinking this week about the difference between what is “private” and what is “public.”
This week, the U.S. Supreme Court decided that, because Donald Trump is no longer president, his challenge to the lower courts’ order preventing him from “blocking” critics on his Twitter account is now moot.
Justice Clarence Thomas did not disagree, but concurred to warn that:
· “the concentrated control of so much speech in the hands of a few private parties” is now “unprecedented”; and
· “If part of the problem is private, concentrated control over online content and platforms available to the public, then part of the solution may be found in doctrines that limit the right of a private company to exclude” (emphasis added).
Justice Thomas added that courts have “misconstrued [Section 230] to give digital platforms immunity for bad-faith removal of third-party content” (emphasis added).
You can read the Court’s order vacating the judgment in Knight Institute v. Trump, 593 U.S. ___ (2021), and Justice Thomas’ concurrence, by clicking on the link below:
Several cases have now held that elected officials who use social media platforms to communicate with their constituents and the public cannot block their critics from responding on those platforms. The former president’s case was the most prominent, but similar rulings by the Eighth and Fourth Circuits followed that reasoning in cases arising from Missouri and Virginia, respectively. The focus in these cases was on the public officials themselves, acting as public officials, who used those platforms as venues for speech.
Justice Thomas’ concurrence foresees a time when the focus would be flipped — away from the public or private nature of the speaker(s), and onto the private or public nature of the platforms. Like newspapers, the platforms are privately owned but serve an increasingly public function. There have been cases in which private companies, performing essentially governmental functions, have been restricted in what they can forbid or prevent (see Marsh v. Alabama, 326 U.S. 501 (1946) (holding that, in a company-owned town, First Amendment rights must be respected whether the company wishes it or not)). This may have been in Justice Thomas’ mind.
His “bad faith removal” comment, though, adds a different dimension. Would it be bad faith for a platform to forbid Holocaust deniers? Would it be bad faith for a platform to flag Russian bots, or to track and target fraudulent disinformation? Would it be bad faith for a platform to seek to differentiate itself with a particular perspective or viewpoint, as cable TV channels do? At what size, or by what other metric, would a platform’s responsibility shift primarily to public service, so that it must allow its facilities to be used for speech that, in the strongly held views of some number of its “people” (who: employees? directors? shareholders? users?), is dead wrong, immoral, or hateful?
We understand the point and the problems, and we believe that, like any public-facing institution, the platforms have an affirmative responsibility not to misuse their position or abuse the public, whether through their own greed or through inaction that gives malefactors free rein.
This week’s events, though, are making us think harder about the “law of unintended consequences.” We should step cautiously here.
Hosch & Morris, PLLC is a boutique law firm dedicated to data privacy and protection, cybersecurity, the Internet and technology. Open the Future℠.