


Legislative Proposal for New Indecency Language in Telecom Bill


I. Summary


Although the October 16, 1995 legislative proposal purports to regulate "computer pornography", the proposal contains fatal flaws which render it at best counterproductive and at worst devastating to on-line communications. First, it prohibits, but fails to define, "indecent" speech to minors — a dangerously vague, medium-specific, and, after decades of litigation, still undefined concept, which may include mere profanity. This may tie up successful prosecution of the law in courts for years to come, while courts wrestle to divine a constitutional definition of "indecent" — and while companies are left with uncertain liability.


Second, the October 16 proposal may actually hold systems liable for communications over which they have no specific knowledge or control. The proposal purports to target those who "knowingly" send prohibited communications – itself a relatively low standard of liability that may not even require actual intent or willfulness. Nevertheless, because the proposal i) defines the elements of criminal liability in vague and contradictory terms, and ii) eliminates safe harbors in the Senate bill that would define a clear standard of care, it might hold systems liable for actions that don't reach even a "knowingly" standard of liability. As a result, access providers, system managers and operators, and employers may potentially be liable for actions of users over which they have no specific knowledge, intent, or control.


For any company that communicates by computer, the proposal:


1) Creates liability for, but never defines, "indecent" speech, a dangerously vague standard that could leave companies criminally liable for use of mere profanity;

2) Establishes vague and contradictory standards of liability that could leave innocent companies vicariously liable for communications over which they have no control;

3) Strips workable affirmative defenses from the Senate bill, eliminating a clear standard of care for companies.


Not only does the proposal endanger companies, it fails to protect children. The indecency standard guarantees that enforcement will be tied up in the courts for years to come. Companies will be particularly reluctant to identify and eradicate prohibited communications when they are incapable of discerning which communications are "indecent" and when the company's consequent knowledge of the communications may actually make them liable. At worst, the proposal will either shut down systems entirely or will shut down any attempts to constructively monitor and screen systems, as providers take a know-nothing stance to avoid prosecution for purported knowledge.


II. The "Indecency" Standard and Uncertain and Conflicting Standards of Culpability Implicate Innocent Companies But Fail To Protect Children.


A. The undefined "indecency" standard is possibly unenforceable and certainly counterproductive.


Although the October 16 proposal purports to regulate "computer pornography", it actually prohibits all "indecent" communications by computer or "telecommunications device" (an undefined term that presumably includes telephones and facsimiles) to persons under 18. Because the term "indecent" is a medium-specific term that, after decades of litigation, remains undefined, it is uncertain precisely what would be prohibited by this section. In the context of broadcasting, the Supreme Court has defined mere expletives as indecent. See FCC v. Pacifica Foundation, 438 U.S. 726 (1978). Would the use of an expletive in a communication that is made available to a minor trigger a criminal felony?


An illustration. After this law passes, a 17-year-old college freshman is writing a paper on "indecency". He decides to look at Supreme Court cases to determine what he is prohibited from seeing. The university librarian, who believes the student looks young for a freshman, directs the student to the Supreme Court Pacifica case, which defined "indecency" for the purpose of broadcast media. If the librarian directs the student to the bound version of the Supreme Court Reporter, she has done her job well. If she sends an electronic version on-line, she goes to federal prison for 5 years. The Pacifica case contains as an appendix a transcript of the George Carlin monologue on "Seven Dirty Words", which the Court found indecent for purposes of broadcasting.


The Supreme Court had no qualms about printing the case, because it was in a different medium than broadcasting — one requiring someone to access it and requiring literacy. The October 16 proposal recognizes no such distinction between media, however. Nor does it define "indecency". Indeed, it treats all "indecency" as "pornography". Would the Pacifica case be banned from on-line access by our schools and libraries by the October 16 proposal? It would be by any normally prudent access provider who wanted to avoid the possibility of spending 5 years in federal prison.


Other examples: (i) a sender posts a message to a Bulletin Board that contains an expletive or a medical or literary passage that is "indecent" and is then read by a minor; (ii) a university provides on-line access to all students, including some freshmen under the age of 18, to its library, including works containing "indecent" passages; (iii) a company that employs a high school senior as an intern knowingly posts a message from an employee that contains some of the "Seven Dirty Words" on an employee bulletin board. Under a plain language reading of the proposal, any of these actions might subject the sender to a criminal felony conviction. Given such potential liability, companies may be faced with avoiding liability by either shutting down screening of communications, or shutting down systems entirely.


At best, the indecency provisions are simply unenforceable. In regulating indecent speech, the courts have held that the government must take into account the medium being regulated, must use the least restrictive means to further its articulated interest, and may not curtail all adult discourse to only what is fit for children. Sable Communications of California, Inc. v. FCC, 492 U.S. 115, 126, 128 (1989). The Department of Justice noted that the language upon which Sec. (d) of the proposal is based raises constitutional questions due to the lack of criminal intent required for the age element. Letter from Kent Markus, Acting Assistant Attorney General, to Sen. Leahy (June 13, 1995), 141 Cong. Rec. S 8344. The Justice Department stated its concern that "this subsection would consequently have the effect of regulating indecent speech between consenting adults". Such a holding by a court could render the indecency standard constitutionally unenforceable.


The indecency standard is counterproductive. First, it ensures that rather than effectively protecting children on the Internet, the law will be caught up in fruitless litigation for years to come. The much less expansive statutory limitations and subsequent FCC regulations on dial-a-porn engendered ten years of litigation before a constitutional standard was established. Second, companies are apt, in the face of uncertain liability and an undefined standard of "indecency", to abdicate any positive role in screening rather than risk liability for discovered or imputed knowledge. Companies would be particularly vulnerable during the years of litigation it would take to establish a constitutional standard of "indecency" for computer communications. At worst, the indecency provisions would shut down entire networks.


At the very least, the indecency standard establishes a separate standard of liability for the Net, relegating it to second-class citizenship among all media. Information which is freely available in bookstores, libraries, and record shops could be banned on the Internet. The electronic editions of newspapers could at times be prevented from publishing stories appearing in the printed version.


In place of a nebulous indecency standard, children would be far better protected by a "harmful to minors" standard that spells out explicitly what type of material is prohibited. Such a standard is currently in place in all 50 states and in the District of Columbia and has been upheld consistently by the courts.


B. Vague and contradictory standards of liability threaten innocent companies.


The dangerously vague "indecency" standard is compounded by vague and contradictory criminal elements in the Title 18 and Title 47 offenses. According to a former federal prosecutor in our firm, depending upon how courts read such ambiguous elements, innocent companies might be left vicariously liable for communications over which they have no specific knowledge or control.


This danger is particularly acute given the incredibly large amount of information that flows over systems and the utter impossibility for companies of screening, reviewing, and removing all "indecent" communications — even if they could define such communications. Imagery and graphics are particularly troublesome, as they can be screened only the old-fashioned way — by human inspection, conceivably necessitating an indecency inspector at every company using on-line systems.

1. Vague and Contradictory Standards of Intent and Control.

Subsection (d)(1) holds a person or company liable for "knowingly making available" any prohibited communication, "regardless of whether the maker of such communication placed the call or initiated the communication(s)". Disturbingly, "knowingly" and "makes available" are undefined. According to a former federal prosecutor at our firm, "knowingly" is a relatively low standard of liability that does not require willfulness or intent.


The standard of duty to prevent communications once a company is on notice that they exist is unclear. If notified that a potentially offending communication exists on a bulletin board on the system, is the system manager now culpable of "knowingly . . . making available" the communication? If notified that an offending communication exists somewhere on a company's system, is there then a duty to hunt for the material and delete it? Once given notice, is there a duty to prevent retransmission? These problems are compounded because even if a company is informed of the existence of an offending communication, it may not know whether the communication is "indecent". Indeed, the company may be precluded by state, local, or federal privacy statutes or other laws from interfering with or even reviewing the communication.


The Title 18 offense and the Sec. (d) offense lack crucial elements provided in the Sec. (a) offense that are necessary to ensure that companies are held liable only for communications that they exert control over and intend to send. Specifically, Sec. (a) provides that a sender must knowingly both (i) "make[], create[], solicit[]" and (ii) "purposefully make[] available" or "initiate[] the transmission of" a communication in order to be held liable for it. Courts would presumably attempt to reconcile the differences in identical crimes in the same bill in a way that gives meaning to each word of the legislation. Consequently, courts may read the lack of such elements in the Title 18 and Sec. (d) offenses to implicate company-operated systems by vicarious liability for the actions of users.


2. Vague and Contradictory Standards of Knowledge.


Furthermore, the Title 18 and Title 47 indecency-to-minors provisions create vague and inexplicably conflicting standards of culpability as to the age of a communication recipient. Both sections begin with a "knowingly" requirement. The Title 18 provision, however, requires in addition that the communicator or transmitter "believes" that the recipient has not attained the age of 18, and "know(s)" that the communication "will be obtained by a person believed to be under 18 years of age". The Title 47 provision contains no such additional requirements.


The Title 18 offense itself is dangerously vague on whether specific or general knowledge of the recipient is required. If a communication is posted to a bulletin board to which the sender "believes" or "knows" that children have access, is the sender in violation? Is the bulletin board operator? Is the system upon which the bulletin board is located?


Even more disturbing is the discrepancy between the elements of liability in Titles 18 and 47. Again, courts would presumably attempt to reconcile discrepancies in identical crimes in the same bill in a way that gives meaning to each word of the legislation. Consequently, courts may read the statute to establish that the level of knowledge or belief required to establish liability under the Title 18 provision is greater than the level required for liability under the Title 47 provision. Thus, someone might be prosecuted under Title 47 despite the fact that he does not believe the recipient of a communication is a minor, and despite the fact that he does not know whether the communication will actually be received by a minor. Such a reading would be supported by the fact that the Title 18 offense is punishable by a longer term (5 years) than the Title 47 offense (2 years).


This standard is particularly troublesome for companies that operate systems or bulletin boards that have the capacity of being accessed by minors, as do nearly all systems or bulletin boards interconnected by the Internet. If one need not know whether the recipient of a communication is a minor, or whether a communication will actually be received by a minor, then posting a communication to a system potentially accessible by a minor, and in fact accessed by one, may under such a reading render one liable under the Title 47 offense.


C. Sec. (d)(2) Protections for Companies Gutted.


As drafted, Sec. (d)(1) effectively guts the protections that Sec. (d)(2) is intended to provide to businesses and other systems. Sec. (d)(2) establishes protection against vicarious liability for system operators and managers under Sec. (d)(1), by limiting liability for "telecommunications facilities" under one's control to cases where one has "knowingly permit(ted)" the facility to be used for a prohibited Sec. (d)(1) purpose, "with the intent" that it be so used. Sec. 223(d)(2). This protection is particularly important given the recent court holding in Stratton Oakmont that systems may be liable for every single communication sent over their network, regardless of their knowledge of the nature of the communication. Stratton Oakmont Inc. v. Prodigy Services Co., No. E31063/94 (N.Y. Sup. Ct. May 24, 1995).


The offense in Sec. (d)(1) is so broadly drawn, however, that it guts this defense. Sec. (d)(1) holds liable anyone who "makes or makes available" a prohibited communication, "regardless of whether the maker of such communication placed the call or initiated the communication". Sec. 223(d)(1). Any Sec. (d)(2) offense would presumably entail a violation of this provision. Thus, rather than being protected by a higher standard of liability, facilities could be doubly liable, under Secs. (d)(1) and (d)(2), for a prohibited message sent by a user.


D. Affirmative Defenses Gutted.


Although the October 16 proposal's authors purport to hold liable only systems or access providers that knowingly transmit prohibited communications – itself a low threshold — the proposal guts safeguards in the Senate-passed telecommunications bill that would have ensured even that:


1. Mere Provision of Access.


First, the proposal strips a Senate defense that would protect access providers against liability "solely for providing access" to a network or system not under their control. (Subsec. 402(f)(1).) Given the uncertainties of application of the "knowingly" standard, this defense is necessary to ensure that access providers are not held liable for material of which they have no knowledge or over which they have no control.


2. Employer Defense.


Second, the proposal strips a Senate defense that would protect employers from being held liable for the unauthorized actions of a rogue employee. The Senate-passed bill established that employers shall not be held liable for the actions of an employee or an agent such as a subcontractor unless the employee or agent's conduct is "within the scope of his employment or agency and the employer has knowledge of, authorizes, or ratifies the employee's or agent's conduct". (Subsec. 402(f)(2).) A former federal prosecutor in our firm indicates that absent this defense, a company might be held liable under a theory of agency or vicarious liability for the actions of an employee whether or not the company intended those actions.


3. Screening and Compliance With FCC Regulations.


The sole remaining affirmative defense, which provides protection from prosecution under Sec. (d) for compliance with access restrictions and subsequent FCC regulations, is worthless to companies. First, this defense is meaningless without a comparable defense to prosecution under Title 18, for which companies are liable for even higher penalties (5 years in prison vs. 2 years in prison) for the same behavior (an "indecent" communication to a minor). The October 16 proposal provides no comparable Title 18 safe harbor, rendering the Title 47 safe harbor worthless.


Second, the proposal prescribes restrictions with which companies must comply until FCC regulations take effect, but the restrictions, lifted wholesale from FCC dial-a-porn regulations, are inapplicable to most companies and would be impossible to comply with. The interim restrictions require companies to block or restrict access to any person under 18 through the use of a verified credit card, adult access code, or adult personal identification number (PIN). Such restrictions are workable for a dial-a-porn provider who provides restricted access to a telephone number for a commercial charge. Such restrictions are antithetical, however, to unrestricted, intentionally open connections, such as within a company's computer network between systems.


Companies are required to comply with the interim restrictions until FCC regulations become effective, which, because the proposal restricts constitutionally protected indecent speech, could take a decade or more. It took ten years for constitutionally sustainable dial-a-porn regulations, on which the interim restrictions are based, to finally take effect. Thus, companies could be left without a defense for a decade or more, while the FCC attempts to fashion constitutional regulations — which may nevertheless prove useless to companies. Indeed, if the FCC regulations resemble the interim restrictions in the proposal, they will in fact be useless to most companies.
