Export Compliance Daily is a service of Warren Communications News.

Copyright Regime Change Needs Care, Study Group Told

A body studying copyright exceptions and curbs on archive and library operation should take care, warned publishing and high-tech interests. The group, which held public meetings this month in D.C. and L.A., will prepare findings and make suggestions this year to the Librarian of Congress on changing the law. But co-chair Laura Gasaway, head of the U. of N.C.’s law library, told us the 19-member group is unlikely to meet a midyear deadline.


The working group’s deliberations are confidential and Gasaway wouldn’t speak in detail about how she thought the 2 public hearings turned out. Members convened once after the March 8 L.A. meeting but haven’t assembled since the D.C. event a week later. The panel doesn’t meet again until May, Gasaway said. “Everybody was very happy with what we heard, as far as people being forthcoming and answering the questions we asked,” she said. Register of Copyrights Marybeth Peters told a State Bar of Cal. event this month she hopes Congress will update Sec. 108 to reflect the digital world after the study group has finished its work. Sec. 108 largely is limited to photocopying and Peters wants “a section that’s more reflective of how libraries operate today” (WID March 1 p6), she said. Sec. 108, enacted in the 1976 Copyright Act and amended in later laws, specifies preservation, replacement and patron access exceptions for libraries and archives to make copies.

Significant concerns exist about broadening Sec. 108’s reach, as some have proposed, Software & Information Industry Assn. (SIIA) Vp-IP Policy & Enforcement Keith Kupferschmid told us. SIIA represents Apple, Sun Microsystems, Bloomberg, Oracle and others. “Whenever you take steps to broaden an exception to copyright law, there are people who are going to abuse that,” he said: “There are lots of examples of people who'll look at copyright law and use it as a guise to hide what they're really doing.”

Kupferschmid cited as an example Nathan Peterson, a Cal. man who ran iBackups.net. The site, advertised on the Web as an archival service, actually was selling pirate copies of software, claiming they were “backup copies” buyers could use if systems crashed. Peterson, who was arrested and pleaded guilty to copyright infringement, awaits sentencing April 14 in U.S. Dist. Court, Alexandria, Va. He faces up to 10 years in prison and restitution of $5.4 million -- the highest penalty ever imposed for software copying. SIIA says iBackups caused nearly $20 million in damage. Peterson tried to defend himself under Sec. 108 as well as Sec. 117, which limits copyright owners’ exclusive rights so that owners of computer programs can make archival copies.

Any virtual library exemption created must be narrower than proponents urge, Kupferschmid said. Instituting a trusted 3rd party system appeals to SIIA and other groups, including the Authors Guild, he said. Under it, the Copyright Office would set up a certification process and guidelines “so we know who the trusted libraries and archives are and so someone can’t just say ‘Hey, I’m a library,’” he said.

The main concern at the Authors Guild is that libraries not take on the publishing function, exec. dir. Paul Aiken told us. “Once you talk about libraries allowing offsite access to their collections, it can easily slip into a licensed right -- something the authors license the publishers to do,” he said: “That could have a substantial effect on the market for literary and other works.”

The meetings generated interesting points for the study group to ponder, Kupferschmid and Aiken agreed, but both said they weren’t sure of the group’s course. Usually, when bodies like this meet with the public, they work from a preliminary report or guidance document, Kupferschmid said; in this case, officials started with an old law and a blank slate. “There are legitimate needs here to archive materials but it’s a tricky and dangerous area and we have to take great care,” Aiken said: “If you're looking to preserve the materials in the wrong way, you could destroy the market and undermine the very thing you're trying to preserve.”

An Argument for Change

The Web already has lost a huge amount of content in its brief life. This is due in part to today’s legal framework, which inhibits archiving and preservation, Cornell U. computer science prof. William Arms told the working group. Citing his work with Web preservation efforts like the Cornell Web Library, the Library of Congress’s Minerva Web archiving program and the National Science Foundation’s National Science Digital Library (NSDL), Arms had several recommendations:

(1) Web preservation rules must be technically flexible, and not make assumptions about current or future technology. (2) A way is needed for site owners to indicate their wishes about collection of materials they post for preservation and access after preservation. (3) A Web preservation exception should apply to materials on the Internet accessible by browser or PC using standard Web protocols with no special authorization. (4) The exception should say libraries and archives may delegate operational aspects of preservation to other organizations. (5) Provision should be made to collect and preserve all Web materials, subject to exclusions such as robots exclusion and copyright owner requests. (6) Policies for access to preserved Web materials should be couched in terms of the category of use and respect for copyright owners’ interests.
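Recommendations (2) and (5) build on the existing robots exclusion convention, under which a site owner posts a robots.txt file stating which crawlers may collect which directories. A minimal sketch of how an owner might opt out of archival collection follows; the crawler name "ExampleArchiveBot" is hypothetical, not a real preservation crawler:

```
# robots.txt -- site owner opts a hypothetical preservation crawler
# out of one directory while leaving the rest of the site collectible
User-agent: ExampleArchiveBot
Disallow: /drafts/

# All other crawlers: no restrictions
User-agent: *
Disallow:
```

In Arms’s framework, a preservation exception could treat directives like these as the owner’s expressed wishes about collection.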

There are no technical barriers to carrying out these ideas, Arms said. Modern Web crawlers have “politeness algorithms” ensuring a single crawler doesn’t make excessive demands, he said. But it may be necessary to curb the number of crawlers accessing a site for preservation, Arms said. A naming convention should be devised so all crawlers that claim the exception clearly identify themselves, he said. The question of how to identify and collect software needed to preserve the full user experience of digital information is “probably not solvable,” he told the group. But that quandary is less severe with Internet information than with other forms of digital content, since most Web pages can be rendered with a standard browser and a limited set of plug-in modules, he said.
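The mechanics Arms describes -- a crawler that honors robots exclusion, applies a politeness delay, and clearly identifies itself under a naming convention -- can be sketched in a few lines of Python using the standard library’s robots.txt parser. The crawler name and delay value here are illustrative assumptions, not part of any actual preservation program:

```python
import time
from urllib import robotparser

# Hypothetical name illustrating Arms's suggested convention: crawlers
# claiming a preservation exception identify themselves clearly.
USER_AGENT = "ExampleArchiveBot/1.0 (library-preservation-exception)"

class PoliteCrawler:
    """Sketch of a crawler honoring robots exclusion and a politeness delay."""

    def __init__(self, robots_txt: str, min_delay: float = 1.0):
        # Parse the site's robots.txt so owner wishes are respected.
        self.parser = robotparser.RobotFileParser()
        self.parser.parse(robots_txt.splitlines())
        self.min_delay = min_delay
        self._last_fetch = 0.0

    def allowed(self, url: str) -> bool:
        # Check the owner's directives for our agent name before fetching.
        return self.parser.can_fetch(USER_AGENT, url)

    def wait_turn(self) -> None:
        # Politeness algorithm: never hit the site faster than min_delay.
        elapsed = time.monotonic() - self._last_fetch
        if elapsed < self.min_delay:
            time.sleep(self.min_delay - elapsed)
        self._last_fetch = time.monotonic()

robots = """User-agent: *
Disallow: /private/
"""
crawler = PoliteCrawler(robots)
print(crawler.allowed("https://example.org/public/page.html"))   # True
print(crawler.allowed("https://example.org/private/data.html"))  # False
```

The politeness delay addresses Arms’s point that a single crawler shouldn’t make excessive demands; limiting how many such crawlers visit a site at once would have to be handled by coordination among the preserving institutions, not by any one crawler.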