The New Filter Mandate Bill Is an Absolute Disaster

After the SOPA/PIPA defeat, Big Content mostly turned to quiet, backroom deals to change copyright law, like the unconstitutional CASE Act, which was so unpopular it had to be slipped into a must-pass bill in the dead of winter. But now, almost exactly a decade later, they’ve come out swinging with a proposal nearly as bad as SOPA/PIPA. Hopefully it won’t take an Internet blackout to kill it this time.

The new proposal, cynically titled the SMART Copyright Act, gives the Library of Congress, in “consultation” with other government agencies, the authority to designate “technical measures” that Internet services must use to combat copyright infringement. In other words, it gives the Copyright Office the power to set the rules for Internet technology and services, with very little avenue for appeal.

First, some background: one of the conditions for the copyright safe harbors in the Digital Millennium Copyright Act (safe harbors essential to the survival of all kinds of intermediaries and platforms, from a knitting website to your ISP) is that the provider must accommodate “standard technical measures” to combat online infringement. Congress reasonably required that such measures be “developed pursuant to a broad consensus of copyright owners and service providers in an open, fair, voluntary, multi-industry standards process.” In practice, no such broad consensus has ever emerged, nor has a multi-industry standards process to develop one. There are several reasons for this; one of the most important is that the number and variety of service providers and copyright owners have exploded since 1998, and those services and owners have widely varying structures, technologies, and interests. What has emerged instead are privately developed automated filters, typically deployed at the platform level. And some influential copyright owners want to see these technologies become a legal requirement across the board.

This bill seeks to accomplish that by establishing a new process that abandons the whole notion of consensus and fair process. Instead, it gives the Librarian of Congress responsibility for designating technical measures and requires virtually all service providers to comply with them.

This bill cannot be fixed. Let us count the ways:

Tech mandates will inevitably stifle legal expression

For decades, Big Tech has tried to appease Big Content by implementing a technical measure it loves: filters. The best-known example, YouTube’s Content ID system, works by having copyright holders upload their content into a database maintained by YouTube. New uploads are compared against what’s in the database, and when the algorithm finds a match, the system applies the default rule the copyright owner has chosen, such as blocking the upload or monetizing it (with the ad revenue going to the copyright holder). The owner may also, upon being notified of a match, send a DMCA takedown notice, putting the uploader at risk of losing their account.
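To make the problem concrete, here is a deliberately oversimplified, hypothetical sketch of the match-then-apply-default-rule flow described above. The names, the fingerprint stand-in, and the threshold are invented for illustration; this is not YouTube’s actual Content ID implementation.

```python
# Hypothetical sketch of a Content-ID-style filter: not any real platform's code.
from dataclasses import dataclass

@dataclass
class ReferenceEntry:
    owner: str         # copyright claimant who supplied the reference file
    fingerprint: set   # simplified stand-in for an audio/video fingerprint
    default_rule: str  # "block", "monetize", or "track"

def similarity(a: set, b: set) -> float:
    """Toy Jaccard similarity between two fingerprints."""
    return len(a & b) / len(a | b) if (a | b) else 0.0

def screen_upload(upload_fp: set, database: list, threshold: float = 0.8):
    """Apply the first matching claimant's default rule to a new upload."""
    for entry in database:
        if similarity(upload_fp, entry.fingerprint) >= threshold:
            # The uploader's purpose (review, parody, coincidence) never enters
            # this decision; only the degree of similarity does.
            return entry.default_rule, entry.owner
    return "allow", None

# Example: a claimant registers a reference work, then a new upload is screened.
db = [ReferenceEntry("Label A", {"f1", "f2", "f3", "f4"}, "monetize")]
print(screen_upload({"f1", "f2", "f3", "f4", "f5"}, db))  # ('monetize', 'Label A')
```

All a filter like this can see is the match itself; whether the matched material is licensed, a parody, or sound the uploader generated themselves is invisible to it.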

Despite over a decade of tinkering and more than $100 million in sunk costs, the system routinely fails. In 2015, for example, Sebastian Tomczak uploaded a ten-hour video of white noise. A few years later, YouTube’s Content ID system generated a series of copyright claims against Tomczak’s video: five separate claims were lodged against sound Tomczak himself had created. Although the claimants did not force the video’s removal, they all opted to monetize it. In other words, ads were placed on the video without Tomczak’s consent, and the ten-hour video would then generate revenue for those claiming copyright over the static. In 2020, CBS discovered that its own Comic-Con panel had been blocked. YouTube creators report that they avoid using music in their videos at all, even when the use is clearly lawful, for fear of copyright flags.

Things are no better on Facebook. Because filters cannot tell the difference between two different performances of the same public domain work, for example, a copyright owner’s claim to one particular recording can block many other performances. As a result, as one headline put it, “copyright bots and classical musicians are battling it out online. The bots are winning.”

Third-party tools can be even worse. For example, a “content protection service” called Topple Track sent out a slew of abusive takedown notices that wrongfully removed sites from Google search results. Topple Track boasted of being “one of the core members of the Google Trusted Copyright Program.” In practice, Topple Track’s algorithms were so out of control that it sent improper notices targeting an EFF case page, the authorized music stores of Beyoncé and Bruno Mars, and a New Yorker article about patriotic songs. Topple Track even sent an improper notice targeting an article by a member of the European Parliament about improper automated copyright notices.

The central problem is this: distinguishing lawful uses from unlawful ones generally requires context. For example, the “amount and substantiality” factor in the fair use analysis depends on the purpose of the use. A lawful use might copy only a few seconds, as in a music review, or it might use the entire work, as in a musical parody. Humans can spot these differences; automated systems cannot.
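As a rough sketch of why that context matters, consider a hypothetical filter that decides based only on how much of a work matches. The function, threshold, and numbers below are invented for illustration:

```python
# Hypothetical match-fraction filter; purely illustrative, not any real system.
def naive_filter(matched_fraction: float, threshold: float = 0.10) -> str:
    """Flag an upload if more than `threshold` of the reference work matches."""
    return "flag" if matched_fraction > threshold else "allow"

# A musical parody may reuse essentially the whole song and still be fair use,
# but a match-fraction rule flags it anyway:
print(naive_filter(matched_fraction=1.0))   # 'flag'

# A short clip in a music review sits under the threshold and is allowed, yet the
# decision had nothing to do with what matters legally: the purpose of the use,
# which is never available to the filter as an input.
print(naive_filter(matched_fraction=0.03))  # 'allow'
```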

Technical mandates will stifle competition

Any obligation to implement filters or other technical measures would distort the market for Internet services: it would favor providers with the resources to build or license costly filtering systems, reduce investment in new services, and dampen incentives to innovate.

In fact, the biggest tech companies will likely have already implemented any mandated technical measures, or something close to them, so the burden of this mandate will fall primarily on small and medium-sized services. If the price of hosting or transmitting content includes building and maintaining a copyright filter, investors will find better ways to spend their money, and today’s tech giants will stay comfortably entrenched.

Tech mandates put your security and privacy at risk

Virtually any technology mandate will raise security and privacy concerns. For example, when DNS filtering was proposed a decade ago as part of SOPA/PIPA, security researchers sounded the alarm, explaining that the costs would far outweigh the benefits. And as 83 prominent Internet inventors and engineers explained in the context of the site blocking and other measures in those ill-fated bills, any measure that interferes with Internet infrastructure will inevitably cause network errors and security problems. This is true in China, Iran, and other countries that censor the Internet today; it will be equally true of American censorship. It is also true regardless of whether the censorship is implemented via DNS, proxies, firewalls, or any other method. The network errors and insecurity we face today will become more widespread and will affect sites beyond those blacklisted by the U.S. government.

The desire of some copyright holders to offload the responsibility for stopping online infringement onto service providers large and small must give way to the deep public interest in a robust, reliable, and open Internet.

Technical mandates give the Library of Congress a veto over innovation, a power it is clearly ill-equipped to exercise

Proponents of the bill apparently hope to mitigate at least some of these harms through the designation process itself, which is supposed to take into account the various public interests at stake, as well as any effects on competition, privacy, and security. Recognizing that the Librarian of Congress is unlikely to have the expertise to assess those effects, the bill requires the Librarian to consult with other government agencies that do.

There are at least two fundamental problems here. First, it means at best that a group of well-meaning DC bureaucrats can dictate how we build and use technology, informed mostly by those who can afford to submit evidence and expertise. Startups, small businesses, independent creators and ordinary users, who will all be affected, are unlikely to know about the process, let alone have a say in it.

Second, and this is perhaps the most cynical aspect of the whole proposal, it is modeled on the Section 1201 exemption process that the Library already conducts every three years. Anyone who has actually been involved in that process can tell you it has been broken from the start.

Section 1201 of the DMCA makes it illegal to “circumvent” digital locks that control access to copyrighted works, and to manufacture or sell devices that break those locks. Recognizing that this could impede lawful fair uses, Congress authorized the Library of Congress to run a rulemaking every three years to identify and grant exemptions for such uses. This supposed “safety valve” is anything but. Instead, it creates a cumbersome and costly speech-licensing regime with no binding standards, one that does not move at the speed of innovation and functions at all only thanks to the work of law clinic students and public-interest organizations, all of whom could put that energy into better causes than going hat in hand to the Copyright Office every three years.

Worse still, while Section 1201 exemptions for lawful expression expire if they are not renewed, technical mandates, once designated, will remain in force until successfully challenged. In other words, the obstacles to protecting fair use are greater than the obstacles to impeding it.

Worse yet, the Library of Congress will now be responsible both for designating technical mandates and for deciding when and how it is acceptable to circumvent them for fair use purposes. That is a terrifying amount of power, and far too much to put in the hands of a group of DC lawyers, however well-meaning. Recall that the Copyright Office granted no significant Section 1201 exemptions during the law’s first six years. What innovative new services and potential competitors to today’s tech giants might we lose in the next six years?

Remaking the Internet to serve the entertainment industry was a bad idea ten years ago, and it’s a bad idea today. This dangerous bill is a non-starter.

Take action

Tell your senators to oppose the Filter Mandate
