TheQuartering [3/23/2021]
YouTube is pushing hard for censorship.
Over the last few months, YouTube has come under fire for its content moderation decisions. Under US law, the company, like other hosts of user-generated content, is not obliged to take down most of what it hosts, nor can it be held liable for much of that content, which gives it significant power over its users’ expression.
YouTube’s emergence amid the blogging boom of the mid-2000s was revolutionary. Suddenly, anyone could easily share their own videos with the entire world. The platform, created by Chad Hurley, Steve Chen, and Jawed Karim—all under 30 at the time—was quickly snatched up by Google for a whopping $1.65 billion just a year after its launch. In 2006, Time magazine named “you” its person of the year, dubbing YouTube “the people’s platform” and crediting it with the “opportunity to build a new kind of international understanding.” YouTube was on its way up.
But almost immediately, YouTube faced tough decisions about what types of content it should—or could, legally—allow. The Digital Millennium Copyright Act (DMCA) and Section 230 of the Communications Decency Act (“CDA 230”)—two of the most important US laws governing user-generated content—had been on the books for only about a decade and had seen little application beyond US borders. Both would soon be tested as YouTube and other platforms rapidly transformed how people communicate and share content online.
YouTube’s first content struggles came not long after Google’s acquisition of the company, when, in 2006, the Japanese Society for Rights of Authors, Composers and Publishers (JASRAC) issued DMCA takedown notices covering more than 30,000 pieces of content, which YouTube removed. Shortly thereafter, Comedy Central filed its own complaint for copyright infringement, and its content was pulled from the platform.