Title
Content Moderation

Synonyms
Content screening; community management; community moderation

Author and Affiliation
Sarah T. Roberts
Department of Information Studies
University of California, Los Angeles
sarah.roberts@ucla.edu

Definition
Content moderation is the organized practice of screening user-generated content (UGC) posted to Internet sites, social media, and other online outlets, in order to determine the appropriateness of the content for a given site, locality, or jurisdiction. The process can result in UGC being removed by a moderator, acting as an agent of the platform or site in question. Increasingly, social media platforms rely on massive quantities of UGC data to populate them and to drive user engagement; with that increase has come the need for platforms and sites to enforce their rules and relevant or applicable laws, as the posting of inappropriate content is considered a major source of liability.

The style of moderation can vary from site to site and from platform to platform, as rules around what UGC is allowed are often set at a site or platform level, and reflect that platform’s brand and reputation, its tolerance for risk, and the type of user engagement it wishes to attract. In some cases, content moderation may take place in haphazard, disorganized, or inconsistent ways; in others, content moderation is a highly organized, routinized, and specific process. Content moderation may be undertaken by volunteers or, increasingly, in a commercial context by individuals or firms who receive remuneration for their services. The latter practice is known as commercial content moderation, or CCM.
The firms that own social media sites and platforms that solicit UGC employ content moderation as a means to protect the firm from liability and negative publicity, and to curate and control the user experience.

History
The internet and its many underlying technologies are highly codified and protocol-reliant spaces with regard to how data are transmitted (Galloway, 2009), yet the subject matter and nature of content itself has historically enjoyed a much greater freedom. Indeed, a central claim to the early promise of the internet as espoused by many of its proponents was that it was highly resistant, as a foundational part of its ethos, to censorship of any kind. Nevertheless, various forms of content moderation occurred in early online communities. Such content moderation was frequently undertaken by volunteers, and was typically based on the enforcement of local rules of engagement around community norms and user behavior. Moderation practices and style therefore developed locally among communities and their