Social media platforms are having a tough time. First, Missouri Attorney General Eric Schmitt released a trove of documents strongly suggesting the companies coordinated with the government to suppress information about COVID-19. Then, Google announced it would not list Truth Social in its Play Store due to "insufficient content moderation" practices, particularly with respect to policies prohibiting violent content.
Google has the right to decline to host any app it chooses, including Truth Social. But by claiming that Truth Social doesn't meet its content moderation standards, the tech giant is creating a political problem for itself—regardless of whether the decision is justified.
Making matters worse for Google is that the Truth Social app is already available in Apple's App Store. Admittedly, Google and Apple have different content moderation standards for app developers. But the contrast makes Google's decision to deny Truth Social access to the Play Store look all the more like one influenced by political considerations—and in the arena of politics, perception is reality. Nor is this the only high-profile case in which Google has been accused of letting partisan politics invade its decision-making process.
Truth Social launched in late February and quickly rocketed to the top of Apple's free app download rankings. Both then and now, the app's owner, Trump Media and Technology Group, sought to distinguish Truth Social from other social media apps through its content moderation practices. The app's website advertises that it is "America's 'Big Tent' social media platform that encourages an open, free, and honest global conversation without discriminating on the basis of political ideology."
Natural tensions arise from appealing to a "Big Tent" audience while adhering to some standard of content moderation. Apple provided the freedom Truth Social needed to experiment with those standards.

The minimum content moderation standard Apple requires developers to meet includes methods "for filtering objectionable material from being posted to the app [and] a mechanism to report offensive content and timely responses to concerns." Because the term "objectionable content" is both ambiguous and subjective, Apple provides some examples, such as "defamatory, discriminatory, or mean-spirited content" and "realistic portrayals of people or animals being killed, maimed, tortured, or abused." When it comes to "false information," Apple's developer terms focus on the functionality of the app rather than on the content posted.
Google similarly has terms to which developers must adhere. Like Apple, it requires developers to have content moderation systems in place and terms of service that "define[ ] objectionable content and behaviors (in a way that complies with Google Play Developer Program Policies)." But Google has a much more extensive definition of objectionable content that includes, for example, "apps that depict or facilitate gratuitous violence or other dangerous activities" and "apps that capitalize on or are insensitive toward a sensitive event with significant social, cultural, or political impact, such as...public health emergencies." Google also prohibits "hate speech," which it defines as "apps that promote violence, or incite hatred against individuals or groups based on race or ethnic origin, religion, disability, age, nationality, veteran status, sexual orientation, gender, gender identity," among other categories.
Google is focused on the content that could appear on an app and not on the framework for removing objectionable content, while Apple lays out general principles and allows developers to experiment with content moderation systems. This focus on content rather than on a moderation framework leads to a political problem: Republicans argue that Google discriminates against conservatives.
Republicans will not listen to arguments that Google is simply trying to enforce its contractual terms. Instead, they will add the Truth Social incident to the list of evidence of pervasive political bias, alongside Google's decision to delist Parler, discrimination against Republican political emails, and attempts to manipulate the 2020 election in favor of Democrats.
Google and Apple have the right to insist upon certain standards, such as maintaining reasonable content moderation practices. Google's decision not to list Truth Social is likely based on a good-faith interpretation of its Developer Program Policies. But technology companies need to understand how even legitimate reasons can be twisted for political purposes and result in unnecessary controversy.
Jonathon Hauenschild is Policy Counsel at the Lincoln Network.
The views expressed in this article are the writer's own.