Andreas von Bonin, LL.M. '98
© Copyright Andreas von Bonin

Content on Demand

Paper for

"Managing Information, Communication and Media Resources"

B (V) 8210, Summer 1998

Prof. Eli Noam

Columbia Business School



Table of Contents

I. INTRODUCTION

II. BARRIERS FOR REGULATION

1. Regulation on the Supply Side

a. The Constitutional Barrier

b. The Technology Barrier

c. The International Barrier

2. Regulation on the Demand Side

III. PRIVATIZATION OF CONTENT REGULATION

1. The Empowered Consumer

2. The Competitive Control Environment

a. The Goals

aa. The Supply Side

(1) Definitions

(2) Interplay

(3) Conclusion

bb. The Consumer Side

(1) Definitions

(2) The "Quality" Factor

(3) The "Price" and "Privacy" Factors

b. The Means

aa. Definitions

bb. The Impact of PICS-Rating on the Market

(1) The Impact of "PICS-Rating" on Supplier Distribution Factors

(2) The Impact of PICS-Rating on Consumer Selection Factors

cc. The Impact of Encryption on the Market

(1) The Impact of Encryption on Supplier Distribution Factors

(2) The Impact of Encryption on Consumer Selection Factors

c. New Duties For Government

aa. The Duty to Empower the Citizens

bb. The Duty to Keep Cyberspace Open

cc. The Duty to Stimulate Alternative Rulemaking
 


Content on Demand

I. Introduction

The Internet is the first medium that offers any kind of content on demand to anybody. The user needs only a computer, a connection to the Internet, and the appropriate software (newsreader, browser, mail client).

In the "traditional" media like telephone, print media, and broadcasting technical, economic and regulatory obstacles prevented the availability of a comparable variety of content on demand. The telephone lacks the ability to display visible content which makes it less attractive. The Internet can distribute sounds as well as text and (moving) pictures. The print media including the high number of specialty magazines are offering a broad variety of content on demand. But the physical character of the product makes it hard to get timely international content as well as to distribute to large audiences. Moreover, content production is not cheap and regulation exists that restricts the availability of certain content to minors. Broadcast and cable content is only offered by a limited number of transmitters. These are either seen as public trustees administering a piece of the valuable spectrum they got for free which makes them subject to various content regulations, or they are in possession of a no less valuable municipal franchise. Content production is expensive and as broadcasting and cable rely on advertising revenues, niche content is scarce. Moreover, content regulation in classical mass media is more feasible because of the limited number of transmitters. On the Internet, everybody can be a content producer and publisher. Content production is cheap and the audiences potentially reached are larger than those of most television programs are.
 
 

II. Barriers for Regulation

These characteristics of the Internet confront government with several regulatory problems as to the availability of content on demand. Different problems occur depending on where regulation is targeted. Traditionally, governments regulate the supply of unwanted content. This is what happened in the U.S. in 1996 and continues to happen in many European countries to date. The well-known problems with supply-side national content regulation are briefly summarized below, with a comparative view, where appropriate, on the situation in Germany.

Another possibility that emerges from the specific characteristics of the Net is regulation of the demand side, i.e. the user. No government has thus far attempted to enact such legislation. Nevertheless, the potential danger of this possibility as well as the legal objections to such an attempt should be pointed out briefly (see infra 2.).

1. Regulation on the Supply Side

Traditional government regulation of Internet content faces three barriers.

a. The Constitutional Barrier

The publication of content that is not deemed plainly illegal by the respective constitutional jurisdiction (see infra b.) is basically protected by free speech provisions. As in other mass media, government nevertheless might have an interest in restricting part of this content's availability on the Net, at least to certain groups of recipients, such as minors. Besides the question of their practical efficiency, those restrictions face additional legal thresholds: content-based regulations in the US require the pursuit of a compelling state interest. Typically, the protection of minors or national security functions as an accepted state interest to justify the prohibition of otherwise constitutionally protected content. European constitutions contain a similar construction. For example, Art. 5 II of the German Grundgesetz requires a restrictive law to be "general", i.e. not targeted against a particular opinion, unless it restricts content to protect youth or the honor of the individual.

But to regulate constitutionally protected content, government also has to show that the means chosen are narrowly tailored to that compelling state interest. In the US, this is part of the strict scrutiny test under the First Amendment. A regulation that, for example, reduces the content available to adults to what is fit for children cannot be considered narrowly tailored. This rationale (the "Butler problem"(1)) was one basic ground for the invalidation of the CDA(2) by the U.S. Supreme Court in 1997. In other mass media the "Butler problem" is avoided with zoning approaches: indecent press products cannot be sold to minors, who, due to the physical character of the transaction, are identifiable(3). Adult movie theaters can constitutionally be removed to "the darkness on the edge of town"(4). Indecent broadcasting can only take place at a time when minors presumably do not listen or watch(5). No similar possibility was found to exist in most parts of the Internet. Age-verification systems cannot be implemented in the delivery of single e-mail messages or newsgroup articles. For the WWW, age-verification systems, which mostly use credit card information, were found unreliable and easy to circumvent(6).

In the European tradition, the "narrow tailoring" test is part of the regular proportionality requirement, demanding that there be no less intrusive means equally effective in reaching a legitimate goal. In Germany, no regulation has so far been struck down because it overbroadly restricts adults' access to material for the sake of youth protection(7), but German youth protection laws use the same methods of "zoning" content(8). In conclusion, the "narrow tailoring" requirement is a constitutional barrier for regulation on the Internet.

b. The Technology Barrier

Even if certain content is deemed plainly illegal, that does not mean that government has the ability to effectively prevent its publication. Most countries recognize categories of content that are plainly illegal. They either do not enjoy constitutional protection or they clearly fall within the reach of the constitutional provisions' reservation clauses. In the US, obscene content is not protected by the First Amendment and can therefore constitutionally be forbidden. In Europe, content is basically protected by the free speech clauses of the national constitutions as well as by the European Convention on Human Rights (Art. 10), but all these guarantees contain reservation clauses which permit the prohibition of certain forms of content such as child pornography, incitement to racial hatred and violence, and in some countries also political speech, like national-socialist speech in Germany. In many other countries, even where no written constitution exists, Art. 19 of the International Covenant on Civil and Political Rights (December 16, 1966, entry into force March 23, 1976) applies, albeit also containing a similar reservation clause. Besides that, most legal systems permit punishment for defamatory speech, fraudulent speech, or fraudulent advertisement. Even greater danger to general security can be expected to arise from bomb-making instructions or manuals on how to compromise data security systems. While such information might also have been available in the physical world, its distribution is greatly facilitated on the Net. The values expressed in these widely accepted prohibitions are considered to remain valid in Cyberspace as well. But technology has deprived national governments of the possibility to enforce them by means of classical criminal or administrative law enforcement. Although banning the publication of this kind of content is constitutionally permitted, it is practically not feasible. This is best proved by the fact that such content exists openly and in significant amounts on the Net. Prosecutions and other attempts by national law enforcement authorities to shut down sites that offer illegal content are the exception. Obscene e-mail messages or pornographic postings in newsgroups in particular, even if they can be spotted in the several terabytes of daily data traffic on the Net, are not necessarily traceable to an individual who can be subject to criminal prosecution. With public key encryption available to anyone who wants to communicate confidentially on-line, enforcement of speech prohibitions becomes completely impossible, unless government can impose a key escrow system. But even this solution is on the verge of being swept away by technological innovation(9).

c. The International Barrier

Third, and on top of these "national" problems, there is the international barrier. Even if a country manages to enforce its internal content restrictions perfectly, it will not be able to shield itself from international traffic that, given today's open technology, is as accessible within its borders as it is anywhere on the globe. Two scenarios that occur in parallel can be distinguished: (1) Content that is illegal or unwanted is offered from a server located in a jurisdiction where such material is legal or accepted. Neo-Nazi Ernst Zuendel is still publishing material on a site that is physically located in California(10). Published in Germany, this material would immediately trigger law enforcement, and the author would face jail time if identifiable. Material that would be illegal as obscenity is legally offered from Scandinavian servers. (2) Content that is illegal or unwanted is offered from a server located in a jurisdiction where such material is also illegal or unwanted, but law enforcement does not happen, and possibly the hosting country benefits financially from the presence of the server. Tonga is known as a place that offers top-level domain names to anyone willing to pay for them. Now-independent countries of the former Soviet Union or in Eastern Europe sometimes function as "safe havens" for illegal material.

2. Regulation on the Demand Side

If the supply side of a market for illegal or unwanted products cannot be controlled efficiently, criminalizing demand serves the same purpose of shielding the customer from these products. Nevertheless, government regulation never went this way when regulating media content(11). The CDA did not threaten to put in jail the kid who surfs a porn site (nor the parent who permits it). The new German laws on Internet regulation do not hold responsible the skinhead who downloads radical material from a site in Canada. This is not primarily for First Amendment / Art. 5 GG reasons: of course, in most constitutional systems there is not only a free speech right, but also a corresponding right to receive information. But this right is abridged identically by the criminalization of supply as it is by the criminalization of consumption, because the amount of material actually available to the consumer is reduced to the same extent under both alternatives. Denying this means assuming that the former is poorly enforced. The reason rather is that regulators think in the categories of the old media and are so eager to extend existing regulation to the Net that they do not see the changes that would make it even easier to target user behavior.

(1) In a world of few suppliers and many consumers, the old mass media world, policing restrictions on the former is much easier. If both figures converge and half of the suppliers are outside the national jurisdiction, this premise no longer holds.

(2) For classical broadcast media, only regulation of the supply side prevented accidental or unwanted exposure to undesirable content. If only the consumption of illegal content were criminalized, watching TV without having extensively studied the (hopefully reliable) program guide would be risky; the zapper would almost inevitably be a criminal. Automated content filtering in broadcasting has never been possible.

On the Internet this changes for two reasons: (a) Exposure to unwanted content seldom happens on the Net. The few exceptions include misuse, such as the pervert who posts his sexual fantasies in the unmoderated rec.pets.cats, and mislabeling, such as the spammer who sends electronic advertisements using a misleading sender name or subject line. (b) Technology exists that enables the user to reduce the risk of unsolicited content even further. Content is generally not pushed once to all consumers at the same time, but sits on a server waiting to be individually accessed. The continuing availability of content permits its rating by other (previous) consumers, see infra III.

In a medium where the consumer knows what she accesses, she always acts intentionally and is as "evil" as the author when downloading child pornography(12). In this world, the law-abiding user does not need governmental supply-side regulation anymore, because she can shield herself against unwanted content, and criminalizing intentional consumption of such material becomes an option.(13)

(3) In the classical mass media, a concept of criminalizing only the consumption of undesirable content that at the same time is legally offered sounds ridiculous because of the anonymity of consumption. No prosecutor will ever know who watches a defamatory broadcast, unless TV viewers have to individually register every time they switch on the set. The reader of obscene magazines will remain unknown unless he has to give his name to his vendor. The creation of such an Orwellian situation is technically possible and in some respects already a reality on the Net. Every off-the-shelf browser today tells the world the logical address of the accessing computer. Personal registration and password log-in at web terminals in schools and public libraries are already frequently required. Many commercial sites require credit card information that easily identifies the user. Cookies, little data files that are stored on the user's machine, contain personal data that is already used by web advertisers to target banner ads to attractive groups of consumers. Unless the user takes affirmative steps to prevent the gathering of his personal information by third parties, he and his movements on the Net can be closely monitored.

So why doesn't government give up on the hopeless task of supply control and instead restrict consumer access by either (1) mandating the use of rating software or (2) directly criminalizing access to sites containing unwanted content?

(1) Regulators all over the world would rather stick to inefficient regulations(14) than yield to the reality that illegal content is and will be offered in this world, even if not in their country, by evil but untraceable individuals. Rating software bears the danger of private censorship(15) and only works under certain circumstances foreign to the classical media world, see infra III. Its mandatory use allegedly would raise free speech concerns on the supplier side. This is plausible because, at first glance, a speaker's right to speak is equally abridged if the listener is forced to close his ears (through the mandatory use of filtering software that e.g. blocks out unrated sites). On the other hand, Supreme Court precedent is interpreted to draw a sharp distinction between the right to speak and the right to be heard, and to refuse protection to the latter(16). The international problem cannot be completely solved by the shift to user-targeted regulation. As the mandatory use of filtering software would require worldwide content labeling, international coordination is needed as well.

(2) In order to enforce a consumer-targeted criminal sanction for accessing illegal material on the Net, government would have to (a) maintain an always up-to-date database of the sites that contain illegal material and (b) be able to access all users' communication data, i.e. who in the country logged on to which WWW site or news server, and when. Both prerequisites seem infeasible for technical as well as privacy reasons. Moreover, an unacceptable load of supporting legislation would have to be enacted, preventing anonymous use of the Internet and prohibiting encryption of content without escrowed keys. As to e-mail communications, no sanction for the reception of illegal content can be enforced without massive wire-tapping and, again, a prohibition on non-escrowed key encryption.

Encryption for Internet content is easily available. If the government does not know what is offered on the Net, it has no chance to sanction violations of either consumer- or supply-side content regulation, even if identifying the violator is no problem.

Certainly the strongest reason against this kind of consumer-targeted content regulation is privacy.

III. Privatization of Content Regulation

With government facing severe obstacles to criminalizing both the supply of and the access to illegal and unwanted content, the question becomes how society can deal with the availability of content on demand by itself.

1. The Empowered Consumer

From the consumer's perspective, this means the issue has changed from taking safety for granted to managing choice. While in all other media the consumer was only offered content preselected by private (editors) or government (law enforcement) authorities, on the Internet he has to make this selection himself for the first time. It is important to understand that the technology that created the Internet has not only destroyed government powers but also offers all the necessary tools for competition on the content supply side as well as for orientation and selection on the consumer side. Cooperation between suppliers and consumers can create a competitive content control environment in which content they find undesirable is, although not eliminated, effectively kept out of the sight of consumers and possibly their dependents.
 
 

2. The Competitive Control Environment

It is important to determine how, in the absence of regulation as one of the dominating factors, content will be allocated to the consumers willing to access it. Therefore, the distribution goals of content suppliers and the selection goals of consumers have to be determined (a). Then, the technological means by which these goals can be achieved and brought into relation have to be addressed (b). Finally, looking at the outcome of that scheme, necessary corrections and the means to achieve them will be identified (c).

a. The Goals

aa. The Supply Side

As the dominant goals for content suppliers, "reach", "revenue" and "security" can be named.

(1) Definitions

"Reach" should be understood as the number of addressees that are able to access the content offered. "Revenue" stands for the amount of money that can be charged through the consumers' access to or interaction with the content offered. "Security" means the degree of certainty with which the content offered reaches the designated audiences unaltered and unmonitored by third parties. It also includes the degree of certainty with which the other part of an on-line transaction is carried out, e.g. payment.

(2) Interplay

There are constellations in which one goal dominates the others and constellations in which preferences vary according to the service and content offered.
 
 
 
 

(2.1) One-goal suppliers

Refugee X wants to inform the world about human rights violations in her country. She will only be interested in the reach of her message. Revenue and security will be less important for her, because she wants to win the attention of an audience as large as possible.

The sender of confidential e-mail will favor security over revenue and reach.

(2.2) Conflicting-goals suppliers

B, who commercially offers hard-core child pornography video clips for download, thinks that although reach might be a way to enlarge his business, he'd rather rely on the solid revenue from his clientele and avoid wide publicity that could make him subject to social pressure. Security is very important to him. He would even prefer anonymity for both himself and his clients.

Movie producer D is interested in transmitting its films on-line to a mass audience, but even if only the recipient gets the film and payment is secured, D would be afraid that the film will be stored, recorded, copied, or resold. Therefore D will only deliver to people whose hardware technically prevents copyright violations. This is the situation of content providers who want to sell digitized, copyrighted material online.

(2.3) Multi-goal suppliers

Most suppliers who seek revenue on the Net will also emphasize security as a goal. But for some suppliers who want to sell content over the Net, reach does not play a significant role:

Hacker A, who offers the code of a highly specialized decryption program on-line, is only interested in reaching the handful of people worldwide who might need his product and who know his web address anyway.

B would also fit into this category if he did not actually have to avoid reach for external reasons.

Most commercial content providers equally want reach, revenue and security. Three sub-models can be distinguished here.

In the first sub-model, revenue is directly created by content: E offers a consumer-rights oriented legal newsletter. But instead of selling advertising on his pages, he charges a small access fee to every visitor. E also organizes several discussion groups about issues covered by the newsletter. Because E has contracts with several well-known consumer-rights experts who give legal advice in these discussion groups, he can charge a fairly high membership fee for the groups. For E, security is valuable to enhance goodwill and reputation.

In the second sub-model, the on-line content only enables the consumer to purchase an off-line product. F runs an on-line bookstore, offering, besides scientific political literature, also fascist and other totalitarian books. The more people buy books online from her site, the more revenue is generated for F. Like every commercial provider, she is interested in security.

Only where revenue is not directly generated in on-line transactions is security less important. The third sub-model involves commercial providers whose content is only a vehicle for advertisement: C provides a free newsletter that appeals to mass audiences. Because his product generates so many hits, he has considerable advertising revenue. He also offers a spamming service: paid by advertisers, he produces mass mailings with commercial information and sends them unsolicited to members of free e-mail services he has contracts with. C does not care much about security, though, because his audience is not specified and he is not involved in actual commercial transactions.

(3) Conclusion

The emphasis on the supplier goals varies according to whether commercial or non-commercial content is offered and which service on the Net is used (e-mail, WWW). Commercial suppliers will welcome security if they are engaged in direct on-line transactions. But the fact that some content or service offered might be illegal in some country of the world does not necessarily alter the supplier's goals. F could operate her bookstore outside Germany (international failure of regulation); X's content is likely to be protected by free speech provisions, if - for political reasons - not within her country, then in any case in the U.S. (constitutional failure of regulation); and B might use a pseudonym or act anonymously (technological failure of regulation), see infra b.

bb. The Consumer Side

The main factors for the consumers' content selection on the Internet will be, as in any other market, "quality" and "price". In addition, characteristics of the Net make "privacy" a third factor for consumers. All these factors are elements of the "utility" of the product for the consumer, but they should be analyzed separately here.

(1) Definitions

"Quality" shall mean the entirety of characteristics of a certain content that determine its value for the respective consumer except price. "Price" is the amount of money a consumer has to pay for a certain content or service. Although price and quality might be influencing each other as factors of content selection, price is generated independently from the content's inherent characteristics. Price on the Net is in its infancy. Therefore the factor should be looked at separately. "Privacy" is meant here to stand for the combination of the following factors: a. security of communication, meaning the ability to know that the conversation does only take place with intended partners, thus the ability of knowing the way the switch works; b. anonymity, c. control over the use of information and d. the ability to prevent the accumulation or linking of information.

(2) The "Quality" Factor

Absent any governmental pre-selection of content, the consumer will realize that he will either have to screen the available content and services on the Net all by himself or buy selection services from third parties. This includes services that shield him from content that has the quality of being illegal (if offered in traditional media). But the consumer will have to understand the category "illegal" or "obscene" as just another quality category like "educational" or "extremist" or "Dutch" and decide accordingly whether he wants to retrieve content of that category or not. In doing so, he has a broad range of technology and services to help him, as set out infra b.

Other "quality" elements that are not strictly content-based, but might influence consumer selection on the Net include e.g. server speed (the faster access or download is managed, the more likely consumers will choose this site while hitting the "back" button when access is slow).

(3) The "Price" and "Privacy" Factors

So far, most of the content on the Net is free. Financial transactions that currently take place predominantly fall within the scheme of the third sub-model set out supra aa. (2.3). Restricting access to actual content unless a fee is paid still occurs relatively seldom.(17) But that does not mean that the consumer cannot make content selections according to price, because in today's Net the price factor is often substituted by privacy. Instead of money, consumers are supposed to pay with a decrease in privacy. Often, they will be asked for a "registration" to enter a site and are supposed to give their e-mail addresses or further data. Although today, in the absence of digital signatures being required for this kind of transaction, any consumer can set up a dummy e-mail account under a pseudonym in five minutes and use this information for such registration, many people won't be aware of this and will be reluctant to enter a site where they have to "pay in personal data". Of course, if consumer G wants to enter an adult material site, she might be willing to give real data to make shipments to her address possible. She might even see it as a "quality" sign that the site requests proper identification in order to keep non-members (data pirates interested in her user profile, her employer, etc.) out.

Establishing an easy way to price access to content might help consumers in their selection of content. Pricing might allow suppliers to offer services of better quality that they cannot afford to give away for free and might generally enlarge the scope of available services. Moreover, gaining "price" as an additional factor for content selection would make it easier for consumers to bargain for higher privacy levels. Today, consumer H cannot avoid giving personal registration data in order to enter "Hardcore Sex Online". Probably, he'd rather pay a monthly fee in e-cash than give his e-mail address.

b. The Means

There are two key factors that determine the market relations between suppliers and consumers that pursue the goals set out above: "PICS-Rating" and "Encryption".

Both terms stand for key technologies that will have a significant influence in shaping the Net in the near future. Moreover, they are the focus of policy decisions that are being made presently in many countries of the world.

aa. Definitions

"PICS-Rating" is used here to describe the technology, which enables content providers or third parties to attach a label with some related judgment or statement to a particular piece of content. This label is recognized by the accessing computer's software and thus allows blocking sites that come with certain labels. "Encryption" should encompass every technology that enables actors on the Net to encrypt their communications and content, e.g. by private/public-key encryption the way it is offered by Pretty Good Privacy (PGP) and similar programs.

bb. The Impact of PICS-Rating on the Market

(1) The Impact of "PICS-Rating" on Supplier Distribution Factors

For suppliers whose main goal is reach, filtering is likely to have a negative effect. Depending on its subject, content that attempts to carry a political or social message may use drastic means of expression and display that are rated high enough to be filtered by the consumers' browsers. If X posts explicit photographic documentation of human rights violations on her site, she might get a high rating for violence (torture) from some popular rating agency and will be blocked by browsers set to block every site that received a high violence rating from this agency. If she is not rated by a rating agency and hesitates to rate her site herself, she might be blocked as well by browsers that leave out unrated sites by default. Her content might not be listed by the dominant search engines, which further cuts the reach of her message. Some ISPs might make it part of their marketing strategy to offer pre-selected content, employing filtering software on the server level. A customer of such an ISP won't get X's site even if he wants to, because the ISP does not inform its customers which sites are blocked. With a rating system fully in place, only sites rated "appropriate for all audiences" will enjoy relatively wider reach; the reach of niche or controversial content will be significantly reduced (18).

The use of PICS-Rating on whatever level of the Net is likely to have a chilling effect on suppliers that are interested in reach. They might observe that customers flatly set their browsers to block content that has a high ranking in a certain category, e.g. "sexually explicit language", and that they are using only one rating agency's ratings. P, who offers teenage-language safer-sex information, will consider modifying his site in order not to be hidden from parents who might want their children to access this particular site. If he left his site unchanged, and consequently fell into the blocked category, those parents would not even have a chance to notice that they are withholding important information(19).

The impact of PICS-based filtering on revenue is ambivalent. On the one hand, prices for advertising on sites that are unlikely to be blocked by any rating agency/browser combination might rise, because, with filtering software widely installed, they will get relatively more hits. On the other hand, filtering does not in itself provide for a segmentation of audiences that would be particularly interesting for advertisers. This might rather be achieved by "positive lists" offered by some institutions.(20) Furthermore, there might be rating agencies that rate content according to the percentage of advertising it contains. Those ratings are likely to be used by more intellectual and wealthy consumers who want to block highly commercialized content. F's bookstore might lose attention, particularly in Europe, because it will be rated "fascist" or "totalitarian" although she might offer a considerable amount of unobjectionable literature. B will lose all the traffic from by-surfers on his child porn site, because he will inevitably be blocked by all systems.

Rating, at least self-rating, increases content production costs. If the supplier wants to make sure not to be blocked by software that is set by default to skip unrated sites, she must self-rate or buy rating services accordingly. In a competitive rating environment, there will be numerous agencies with different rating categories. Some might charge fees to register ratings; some might charge extra for timely rating service when sites are changed or expanded. Suppliers with many pages on-line will consider this a negative effect.(21)

The effect of "PICS-Rating" on security is not that clear, either. On the one hand, the effect is neutral, because security concerns on the supplier side only set in, once the customer has accessed a particular content. In this respect, filtering might only cause a "reach" problem, if e.g. the consumer has set her browser to not display all sites that attempt to send cookies, a device that a supplier who is particularly interested in security might want to use. On the other hand, PICS-Rating might have negative consequences on the number-one security technology, encryption, and thus may be indirectly detrimental for security.

Thus, the use of filtering software

- will have a chilling effect on suppliers who use explicit language;

- will negatively affect the revenue of suppliers who offer a variety of content on the same site, especially if it includes some content belonging to objectionable categories, because filtering is not context-sensitive;

- will increase content production costs and thus cut revenue;

- will bring relatively more attention and possibly revenue to mainstream content, because accidental access to niche content is blocked by the consumer's pre-selection to filter out whole categories of content;

- enables users to explicitly block commercial content, which might affect reach and revenue for commercial suppliers.

(2) The Impact of PICS-Rating on Consumer Selection Factors

Understood as consumer-defined quality, this factor can certainly be promoted by the use of PICS-Rating. In order to truly enhance his ability to make self-defined quality selections, however, the consumer has to do more than just check a button in his browser's dialog box, and certain market conditions have to be present(22):

The consumer has to decide to what extent he wants to allow his ISP to carry out content selection. Many ISPs will install PICS-Rating software at the level of their proxy servers and select what to deliver to the end user. If the consumer wants that, he should demand information on the ISP's filtering policy. If he does not, he should choose an ISP that markets a non-filtering approach.

If the user chooses to use PICS-Rating on his own machine, he must decide whether he wants to rely on only one rating agency and take the risk of not seeing material that for some reason falls within level 3 of the nudity category he chose to block, but for another reason is just what he or his family is looking for. Tests have shown that some rating software filters out up to 95% of the content that would otherwise be available, thus going far beyond what the consumer would call inappropriate.(23)

His alternative is to combine the ratings of several agencies that he trusts. When his 8-year-old daughter uses the computer, he will set his browser to block all content rated "over 12 only" by the Catholic Children Advisor, but let through all pages that are rated "educationally valuable" by the Liberal Teachers Association, even if they fall within the Catholic's "over 12 only" category. Technically, there are no limits to these combinations, as the sketch below illustrates.
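A minimal sketch of such a combination rule, using the two hypothetical agencies from the example above (the rule logic is an illustrative assumption, not a description of any shipped product):

```python
# labels maps agency name -> set of rating strings assigned to one site.
def allow(labels: dict) -> bool:
    catholic = labels.get("Catholic Children Advisor", set())
    liberal = labels.get("Liberal Teachers Association", set())
    if "educationally valuable" in liberal:
        return True                        # override: always let through
    return "over 12 only" not in catholic  # otherwise apply the age rating

site_a = {"Catholic Children Advisor": {"over 12 only"}}
site_b = {"Catholic Children Advisor": {"over 12 only"},
          "Liberal Teachers Association": {"educationally valuable"}}

print(allow(site_a))  # False: blocked by the age rating
print(allow(site_b))  # True: the educational rating overrides the block
```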

In choosing a rating agency, the consumer will appreciate it if the agency discloses its rating policy, its standards, and the sites it blocks under certain categories. He will favor agencies that clearly have a certain political or religious tendency, because they will rate predictably, over those that claim a "neutral", "objective" approach, which is impossible with more sophisticated content.

In choosing which rating agency to trust, the consumer must know whether it relies on self-rating by the respective authors, performs third-party rating, i.e. rates other people's content, or combines both. This is important for him, as self-rating might reflect the author's intention more accurately but might tend to be more lenient, because the author naturally wants to keep his audience as big as possible. Mis-rating might occur more often with this approach. Third-party rating might often fail to reflect all aspects of the rated content, but at least the consumer can be sure that the agency has rated according to its policy. There will also be agencies that accept ratings from anyone as long as their categories are used (open ratings). These agencies might have a high percentage of sites rated, but the ratings will reflect different understandings of the content and the categories. How frequently do these agencies crosscheck submitted ratings?

The consumer might want to know how fast the agency reacts to reported mis-ratings, how fast it rates new sites, and how frequently it updates ratings, because content on particular sites might have changed.

Once a user has installed his customized combination of ratings, he will have to decide how to act on unrated sites. Rather than a browser that only leaves him a choice between blocking all of them or letting all of them through, he will choose software that allows him to accept all unrated sites but run conventional word-filtering software that blocks sites containing certain words in certain quantities, as sketched below.
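A naive version of such a word-quantity fallback filter (word list and thresholds are purely hypothetical; real products shipped much longer, usually undisclosed, lists):

```python
from collections import Counter

# word -> number of occurrences at which an unrated page is blocked
BLOCKLIST = {"sex": 5, "bomb": 1}

def block_unrated(page_text: str) -> bool:
    counts = Counter(page_text.lower().split())
    return any(counts[word] >= limit for word, limit in BLOCKLIST.items())

# P's safer-sex page survives this particular threshold ...
print(block_unrated("safer sex information for teenagers"))  # False
# ... but a threshold of 1 for "sex" would block it, illustrating how
# crude word counting both over- and under-blocks.
```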

There might be sites the consumer explicitly wants to let through or exclude. Thus, his browser should allow him to enter this kind of information.

If he uses his browser also as a newsreader and a mail client, he wants to make sure that the software also deals with this content. Particularly for spontaneous conversation, as in chat rooms, he will find that no solution can be offered. There is no possibility to prevent the reception of unwanted content in emails or chat rooms by the use of rating software.(24) Email spamming in particular is not only annoying but can be dangerous.(25) Where self-help is not feasible(26), the user depends on the action taken by his ISP(27) and on the development of custom.

When using the Net under his customized configuration, the consumer has to watch that his selection is not spoiled by on-line services he consults. Search engines will eventually run rating schemes that do not conform to his preferences. News servers might pre-screen posted messages under some moderation policy. The consumer has to know that and have the chance to choose another service.

Absent the competitive factors, software options, and actors mentioned above, PICS-Rating might actually cut into the consumer's ability to select content according to his preferred quality. Today, a broad variety of rating agencies has yet to emerge. Chances are that existing dominant companies will also shape the rating market and prevent the development of a competitive rating environment(28). Today, the major companies have rated between 40,000 and 140,000 sites; the exception is NetShepherd with more than 350,000 rated sites. Given that the total number of sites is several million and growing every day, it is fair to say that the majority of sites, and most likely non-commercial or seldom-visited sites, will remain unrated unless alternative rating agencies take on this task(29). Furthermore, the acceptance of any sophisticated rating scheme will depend on its user-friendliness. With the Internet as a whole in transition from a medium for technically skilled scientists to a mass medium, users might be happy once they manage to access anything at all and not worry about the content they filter out. Companies interested in channeling consumers to certain websites will refrain from promoting a user-friendly, competitive rating environment(30).

Keeping in mind that only a competitive rating environment as described above actually allows the consumer to make a better selection of content according to his preferences, the question becomes what its effect on price will be. This will most likely depend on the consumer's content preferences. If he is satisfied with what free rating agencies offer, there will be no direct effect on price. Those free rating agencies are, and will increasingly be, backed by a coalition between software manufacturers and mainstream content providers. Once they are established as the industry standard, they become useful tools to channel traffic to an array of "most popular" content, preferably commercial sites, see supra (1). Alternative rating agencies might have to cover part of their costs by charging usage fees. If a consumer prefers using three alternative rating agencies instead of RSACi, he will have to face a price increase.

In terms of privacy, even a sophisticated content rating environment can have negative impacts. By making his content preferences readable for everyone, the consumer gives away data whose further use he cannot monitor. Technically, it is no problem for a server computer to find out the user's content preferences: it only has to send cookies to the accessing user's machine, or lead the user over a couple of "test pages" labeled in certain ways and monitor to which pages the browser allows access. The capabilities of Microsoft's ActiveX technology in investigating or even compromising the settings of anyone's computer are even more alarming(31). It should be made clear, though, that the PICS technology itself is not detrimental. In fact, it can positively be used for the implementation of a "privacy-rating" scheme, or "privacy audit". Technologically, two new software systems, TRUSTe(32) and the Platform for Privacy Preferences (P3)(33), are currently underway in response to these challenges. In their final stage, they will allow customers to define simple or complex privacy preferences; using the PICS content-labeling protocol, a person's browser and a website's declared privacy/data practices then automatically negotiate the privacy level that best meets both parties' preferences.(34) The vision is that (1) users get a sense of security on the Net, learn to distinguish between explicit transactions which should be handled privately or publicly, and finally see their concern about privacy on the Net diminish; and (2) companies will try to generate trust by revealing information voluntarily.(35)
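A toy sketch of such a preference negotiation, in the spirit of P3: the site declares its data practices, the user declares what he will accept, and the browser compares the two before connecting. The practice names here are illustrative assumptions, not the actual P3 vocabulary:

```python
# What the site declares it does with visitor data (hypothetical labels).
SITE_PRACTICES = {"collects_email": True,
                  "shares_with_third_parties": False,
                  "uses_cookies": True}

# What this user is willing to accept.
USER_PREFERENCES = {"collects_email": True,
                    "shares_with_third_parties": False,  # never acceptable
                    "uses_cookies": True}

def acceptable(practices: dict, preferences: dict) -> bool:
    """Every practice the site engages in must be one the user permits."""
    return all(preferences.get(p, False) for p, active in practices.items() if active)

print(acceptable(SITE_PRACTICES, USER_PREFERENCES))  # True: no conflict
```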

cc. The Impact of Encryption on the Market

Encryption enables confidentiality of any on-line transaction where needed. Content that is encrypted with the recipient's public key can be read only by him.
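A present-day sketch of this public-key principle, using the Python "cryptography" package (RSA with OAEP padding); PGP wraps the same idea in key management and hybrid encryption:

```python
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

# The recipient generates a key pair and publishes the public half.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# Anyone can encrypt with the public key ...
ciphertext = public_key.encrypt(b"for the recipient's eyes only", oaep)

# ... but only the holder of the private key can decrypt.
assert private_key.decrypt(ciphertext, oaep) == b"for the recipient's eyes only"
```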

(1) The Impact of Encryption on Supplier Distribution Factors

Encryption has a positive influence on reach. Suppliers - like X - that do not want to restrict the availability of their content to any particular group of persons do not have to do so. Encrypted material that was accidentally misrouted or misaddressed will not be readable for the (wrong) recipient, thus making reach more accurate. B, who is interested in the confidentiality of all his on-line transacti