Views of the Bertelsmann Foundation's Self-regulation of Internet Content Proposal

by Ernest Miller


In 1998, the Bertelsmann Foundation began a project whose overall mission was to facilitate the development of an integrated system of approaches to dealing with harmful and illegal content on the Internet through self-regulation.

As the foundation saw it, there were five developments that characterized the situation with regard to Internet content:

The Bertelsmann Foundation believed that industry-led, industry-financed self-regulation combined with "enlightened" state prosecution seemed to be the appropriate response to the problems of harmful and illegal content on the Internet. In this context, the foundation attempted to facilitate an essentially self-regulatory system for content on the Internet with four key areas of responsibility:

For each of these four areas the Bertelsmann Foundation commissioned area experts to put forth a study and proposal. Professor Jack M. Balkin, Knight Professor of Constitutional Law and the First Amendment and Director of The Information Society Project at Yale Law School, was chosen to study self-rating and filtering mechanisms. His report, co-authored by Beth Noveck and Kermit Roosevelt, proposed a three-tiered system: self-rating by content providers, filtering templates by third parties, and implementation by end-users.

Balkin argued that some form of self-rating (whether RSACi, XML, or some other form of metadata) was inevitable; the question, therefore, was what the best form of self-rating and filtering would be. Other forms of filtering were rejected as ineffective or inefficient. Balkin was aware that governments and other censors could exploit a self-rating system for intermediate filtering, but argued for political responses to that danger as well as proposing a unique technological solution.

Critics of the proposal argued that self-rating was not inevitable. Moreover, they contended, self-rating systems would harm smaller content producers, further stifling the voices of non-commercial organizations. Finally, critics made the point that any system of self-rating would inevitably make the job of censoring easier.


In September 1999, the Bertelsmann Foundation issued a multi-part report on "Self-regulation of Internet Content" addressing several issues regarding Internet content, including the protection of vulnerable parties, finding and evaluating information, and detection of electronic crimes. Some of the recommendations of the report were controversial when it was presented in Munich, and the controversy continued at the Computers, Freedom and Privacy conference in Toronto [editor's note: the author contributed to the Bertelsmann report].

Moderator Jean Camp began the panel by providing a comprehensive overview of the foundation's entire report. She noted that the report was framed as a series of questions, but that the panel would concentrate on only one, the answer to which, developed by Professor Jack Balkin of the Yale Law School, recommended a multi-layered approach to content labeling and filtering.

Panelist Dianne Martin from George Washington University provided context for the proposal by tracing the recent development of labeling and filtering systems, starting with RSAC and console game ratings, through the development of W3C's Platform for Internet Content Selection (PICS), and ending with ICRA. Martin explained that the Bertelsmann proposal was a significant improvement upon previous systems because it separated the labeling function from the filtering function. More importantly, she noted that the proposal was both more technically and socially complex, permitting greater context and multiple cultural value systems.

The history lesson took a different turn as Christopher Hunter, Ph.D. candidate at the University of Pennsylvania, analogized the system to the Catholic Church's list of banned books during the Middle Ages. Hunter feared that the system would not remain voluntary as the report recommended, but that governments would make compliance a legal requirement. Even so, he claimed, many sites would remain unrated and be banished to a "no man's land where browsers fear to tread."

Jordan Kessler from the Anti-Defamation League supported the proposal and was pleased by many aspects of the system, including the choice it gives consumers among different red/green lists and templates, the use of encryption to prevent upstream filtering, and, most importantly, the default setting that unrated sites not be filtered. The role of the user was of key importance, argued Kessler: "Users are not sheep. If [users] are smart enough to turn the default settings off, they are smart enough to find and use white lists."

The final speaker was Barry Steinhardt of the American Civil Liberties Union, who listed a number of problems he found with the proposal. The biggest, claimed Steinhardt, was that websites would face a dilemma: self-label and be blocked, or fail to rate and be blocked anyway. He also saw the scheme as too burdensome for website creators, citing one artist's website with over 25,000 pages of content. Steinhardt also reiterated Hunter's point that the voluntary nature of the proposed system was illusory.

A number of questions from the audience followed, revealing that the audience was as divided on the issue as the panelists.