A small boy climbs onto his chair and starts his computer as if it were his gateway to a new world. Once the computer is up and running, the boy carefully moves his mouse around as if it were attached to his hand. He clicks on a highlighted link and a naked woman appears on his screen. The boy quickly turns around, but he is alone. Engrossed in this picture with no one to scold him, he has found a new area of the Internet. As he continues to navigate through this page of nude women, he hears a noise. His mother opens the door to his room, but it's too late: the navigation window has already been closed, and Mom has no idea what her little child has been looking at. With a world of information and no way to monitor it, the Internet makes life informative as well as interesting, limited only by each click of the mouse. The Internet is difficult to limit because the devices used today are not capable of monitoring such a vast new medium.
Any new medium must be investigated and explored to reveal its full potential and its effects on society. In the 1940s radio flourished as a mass medium, allowing thousands to hear news, entertainment, and even propaganda. Because radio was new, it was forced to restrict its use of certain ideas and vocabulary. Years later a new form of media emerged: television, which added moving pictures to sound. Television, too, was forced to restrict profanity, ideas that were not to be expressed, and information that was meant only for certain individuals. Both radio and television underwent heavy scrutiny over how they were to broadcast. With the rise of the Internet, a new censorship era has emerged.
There have been many attempts to restrict content on the Internet, but nothing can completely guarantee that questionable material will not be seen. The methods tried so far include software that filters profanity, human monitors who watch content, standard online filters, and firewalls that protect against amateur hacking. None has worked, because the Internet is too large to shackle down. There is always a way to smash through any form of restriction, because there is no central point from which to monitor content. In addition, Internet users who want to see pornography or participate in illegal activity will keep trying to do so out of curiosity, determination, and motivation. The more people try to restrict the Internet, the more people want to find ways around it.
The Internet is home to many different companies and personalities, all of which are registered on the World Wide Web. Perhaps the most popular place to register is InterNIC.net, a non-profit organization. This organization's sole purpose is to ensure that no two individuals can claim the same address, and it is also there to make sure that an address using profanity is rejected. Justin Hall, a Net publisher, was denied the right to register the domain fuck.com (Rowe 48). A reason why? "Undoubtedly [there would have been] a large number of people who would be offended" (Rowe 48). Although Hall was denied the domain name fuck.com, another subscriber managed to register f-u-c-k.com, simply because three dashes between the letters made it a different string. Apparently InterNIC.net felt that this name would not be offensive and let the registration proceed. Presumably the reasoning behind the acceptance is that most people will not try to type dashes between the letters of a word. Although this might seem like a one-time occurrence, it is not.
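The loophole comes from comparing domain names as literal strings. A minimal sketch in Python (hypothetical, not InterNIC's actual system; the blocked list and function names are invented for illustration) shows why an exact match lets a trivially altered name through, and how a simple normalization step would catch it:

```python
BLOCKED = {"fuck"}  # hypothetical profanity list

def plain_check(domain):
    """Exact-match check, as a registrar comparing literal strings might do."""
    name = domain.rsplit(".", 1)[0]  # drop the ".com"
    return name in BLOCKED           # "f-u-c-k" != "fuck", so it slips past

def normalized_check(domain):
    """Strip separators first, so obvious obfuscations are caught."""
    name = domain.rsplit(".", 1)[0]
    name = name.replace("-", "").replace("_", "").lower()
    return name in BLOCKED

print(plain_check("f-u-c-k.com"))       # False: passes the literal check
print(normalized_check("f-u-c-k.com"))  # True: caught once dashes are removed
```

Even this small improvement only catches one trick; any number of misspellings and substitutions remain, which is the essay's larger point.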
Perhaps one of the most offensive names on the Internet is penis.com. This web site gives any user access to free gay pornography and even the ability to open an email address through it, so that any web user can create an address such as [email protected]. Is there any reason for routing email through the name penis.com? Not really, but it's amusing to some and offensive to many others to receive mail from such a domain. Another interesting site is squirrel.com. As the page loads, the viewer sees a few pictures of nice furry squirrels; scrolling down, however, one sees many links to hacking and cracking information. A friendly page of animals can lead one to learn how to bring down computer systems and cause mass destruction in the electronic world. How can anyone stop this information when it is registered under the domain name of a friendly rodent?
It is obvious that the Internet is difficult to monitor, but many ideas have been developed to try. The cheapest and most common way to restrict the Internet is software that limits access to web sites by filtering profanity. Cyber Patrol, by Microsystems, Inc., and Net Nanny, by Net Nanny Software International, are the two most popular software packages for filtering out questionable material. Net Nanny's web page at Netnanny.com allows anyone to submit a site dealing with questionable material to its filtering database. This database helps restrict inappropriate content, but to better ensure that sites are being watched, groups of people are assigned to search the Internet for questionable material. They gather large numbers of inappropriate sites and create a master list to send to Net Nanny and Cyber Patrol. These programs use the list so that when a link is clicked, it is automatically matched against the master list to ensure protection (Rogers). We put our faith in these searchers, but we can never rely on them completely, because thousands of sites go down and come up within the course of a day. This makes it difficult to maintain a standard database of questionable material.
The protection programs work by denying web access to sites that contain words such as pornography, gambling, hate, fuck, and penis. These protection programs are classified as three specific types: "1. Software that consults a list of known sites and blocks them according to criteria chosen by the PC owner; 2. software that looks for suspect words to decide when to block sites; and 3. the fencing off of portions of the Internet by firms that provide Internet access" (Sheldon).
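The first two types can be sketched in a few lines of Python. This is a hypothetical illustration, not the actual Cyber Patrol or Net Nanny code; the lists and function names are invented. Type 1 matches the requested address against a master list, while type 2 scans the page's text for suspect words:

```python
MASTER_LIST = {"eurosluts.com", "penis.com"}  # hypothetical master list of known sites
SUSPECT_WORDS = {"pornography", "gambling"}   # hypothetical suspect-word list

def blocked_by_list(url):
    """Type 1: deny access if the site appears on the master list."""
    host = url.split("//")[-1].split("/")[0]
    return host in MASTER_LIST

def blocked_by_keywords(page_text):
    """Type 2: deny access if the page contains any suspect word."""
    words = page_text.lower().split()
    return any(w in SUSPECT_WORDS for w in words)

print(blocked_by_list("http://penis.com/mail"))      # True: on the master list
print(blocked_by_list("http://penis2.com/"))         # False: a renamed site slips through
print(blocked_by_keywords("Sports scores and news"))  # False: no suspect words
```

The sketch makes the weakness concrete: the moment a site operator renames the domain, the type-1 check no longer matches, exactly as described below.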
The protection software is completely unreliable because it judges material on what has been listed as profane in the past. Software that consults a list of known sites will not work, because it takes a matter of seconds for the person running a site to change its address and re-post it. The master list is also incomplete, because new URLs arise all the time; it would be impossible to keep a catalogue of a world of questionable sites up to date. The software that looks for key words is an even worse idea, because many sites might mention a profane word as an example of what not to do, or might discuss a sexual topic purely as information. A health services page will speak of sex and offer advice on protecting oneself against STDs. That page might be blocked, and so might the valuable information it contained.
The majority of concerned adults want to block pornography, but stopping every word that deals with sex will prevent all the helpful information from being seen as well. The last method mentioned, fencing off portions of the Internet, may prove unreliable because many pages do not belong to large firms. There are thousands of local freelance Internet service providers (ISPs) who control only the web space that they sell. Notorious sites like eurosluts.com might be locked away from the public, but what about all the sites located under domains still undiscovered by those wanting to restrict them?
The programs used to restrict the Internet are too basic, because they cannot restrict inappropriate links located on legitimate pages. Reviews of Cyber Patrol have stated that the software is far from foolproof: the testers were able to bypass the program and find back doors to adult sites, as well as access to Usenet groups that house and display pornography (Rogers). The testers are important because they are the ones who find the flaws in the program and explain how to bypass the restrictions. If the testers themselves can find these back doors, can't others too? Underestimating anyone's ability, especially children's, to find a back door in these protection programs is a poor decision.
This software also tends to malfunction: it has blocked information on completely unrelated subjects ranging from environmental issues to eating disorders and feminism, and has even blocked innocent phrases such as "Middlesex County" (Rogers).
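The kind of over-blocking Rogers describes follows directly from naive substring matching. A hypothetical sketch (the word list and function name are invented, not taken from any real filter) shows how an innocent phrase gets caught:

```python
SUSPECT = ["sex", "breast"]  # hypothetical suspect-string list

def naive_block(text):
    """Blocks if any suspect string appears anywhere, even inside other words."""
    t = text.lower()
    return any(word in t for word in SUSPECT)

print(naive_block("Middlesex County town meeting"))  # True: "sex" hides inside "Middlesex"
print(naive_block("Breast cancer support forum"))    # True: a medical page is blocked too
print(naive_block("Local weather report"))           # False
```

The filter has no way to tell a place name or a health forum from pornography; it sees only matching characters.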
The problem with handing the job of restricting content to machines is that machines have no gray area. They either restrict or they do not. They have no sense of morality or judgment; they only do what they are told, and no artificial intelligence is perfect.
These devices have been compared to similar censorship devices used to prevent pornography on television: one version of this Net-blocking software would block twenty thousand web sites, while another would allow access to only three thousand (Buckley). Since the programs rely on a static set of instructions, they cannot make judgments about a site's content.
While these programs have proved ineffective, online services have turned to a simpler, faster word-screening method. These programs, however, show similar results. America Online faced major unrest when its screening shut down a breast cancer forum and banned access to the official White House web site because it mentioned the presidential couple (Kirchner). Clearly, these instances show how unreasonable it is to assign machines and automated programs the task of filtering out what is inappropriate; they are unable to determine accurately which materials are suitable for viewing and which are not.
Since machines and programs prove too crude to restrict inappropriate material, we are forced to turn to our own human skills. Online services that provide chat rooms have turned to human monitors to help control inappropriate material. On popular services like America Online, "volunteers are recruited to discourage members from using language deemed to be vulgar, abusive or hateful" (Rowe 48). These monitors are not paid; they volunteer their free time watching rooms for inappropriate language, pornography, and other illegal activities. The monitors have the power to boot a person offline and report the violation to senior members. Punishments usually consist of a cancelled account.
However, the key point is that these monitors are just volunteers who are not paid for their services; because they are not paid, they can also choose not to do their jobs correctly. The monitors are usually people who are just as addicted to chatting as the average AOL user. This causes a problem, as Rowe states: "The judgment as to what constitutes offensive language rests largely with volunteer monitors, whose whims are the subject of much derision." These people have no training and are not employees. Most online services supply their monitors with a list of prohibited words and subject matter, and the monitors must then uphold order by giving warnings and removing chatters who violate these guidelines and regulations. As we have seen before, there are always ways around this.
Rowe notes, "Last fall on AOL, a user typed the word prick during a discussion, then quickly added your finger on the next line to cover her tracks." This is a perfect example of a simple way around a complex situation. What can a monitor do in a situation like this? The damage, if any, is already done.
AOL recently went too far in restricting its service by removing a gay discussion group. "AOL removed a discussion area called YngM4YngM," says Rowe, "prompting protests from gay teens who used it as a support group." AOL had also closed several feminist forums with the word girl in their titles, fearing that youngsters might go there by mistake and be corrupted. It is a difficult situation, and we can never be sure of the purpose or intent of each chat room. It appears the only way to keep some order is to judge every case individually, and no one has the time or patience for that.
Even if we were to put a monitor in every single chat room on the entire network, we still could not scan the content of the entire Internet. Anyone can start his or her own homepage: sites such as Geocities.com, Xoom.com, and Tripod.com give their customers free web space, so anyone with a computer, a modem, and a connection to the Internet can set up a page with any theme or content. These sites claim to screen content, but they cannot monitor it constantly.
Human nature seeks to scale any barriers and control attempts it comes across. "Even if, as the Germans, Chinese and Singaporeans are doing, you shut down those connections to services the local prudes and/or commissars find offensive, the citizenry can use international phone lines to hook themselves to restrict-free foreign computers. It may cost a bit more, but lust knows no bounds nor boundaries" (Carr 24).
Carr notes, "There are ways of circumventing the barriers, but you have to know first that you are being 'deprived'; second, how to get around the obstacles; and third, you have to find an unrestricted source that will let you in." Any intelligent, resourceful, creative person can find ways around things, and that intelligence, resourcefulness, and creativity are not bound by age, background, or ethnicity. Anyone can find a way around things, given the determination and motivation.
Just as we cannot restrict the Internet, we cannot prevent the constant change of its material, inappropriate or not: "There's no stopping the stuff. It appeals to the most fundamental human appetites" (Garber 82). Lust is a fundamental human behavior.
Garber states that "no technology can regulate the proliferation of data over the Internet. It's too big, it changes too fast, and there are too many back alleys and hide-holes in it." Just as humanity and society grow and evolve, the Internet grows and evolves. The Internet is just as sensitive as other media, such as television and film, and just as those media have rating systems and schedules that restrict viewing times, the Internet should have the same. However, because the Internet is far more expansive, the only way to restrict access is to view the sites reaching our browsers personally, case by case. To protect our children and loved ones from questionable content, we should not leave our children with our computers the way we leave them with our televisions. The Internet is not a babysitter; it is a tool, a device capable of helping us in many ways. But like any tool, it can be misused, or used in ways that were never intended. The Internet remains incredibly difficult to restrict because of its continually changing state, but as technology improves, more and more people will attempt to censor specific areas of it.
Works Cited
Buckley, William F., Jr. "Internet: The Lost Fight." National Review, 11 Aug. 1997: p. 63.
Burr, Ty. "Down and Dirty: Now That the Supreme Court Has Protected Pornography on the Net, the Task of Smut Patrol Falls to You." Entertainment Weekly, 18 Sept. 1997: p. 92.
Carr, John. "It's Time to Tackle Cyberporn." New Statesman, 20 Feb. 1998: p. 24.
Garber, Joseph R. "Dirty Bits, Naughty Bytes." Forbes, 26 Aug. 1996: p. 82.
Kirchner, Jake. "Internet Censorship." PC Magazine, 7 Oct. 1997: v. 16, pp. 30-31.
McManus, Terry. "Home Web Sites Thrust Students into Censorship Debates." The New York Times, 13 Aug. 1998: pp. D9, G9.
Rogers, Michael. "Internet Blocking Software: Online Savior or Scourge?" Library Journal, 1 Apr. 1997: p. 16.
Rowe, Chip. "Censorship Glossary." Playboy, 5 Jul. 1995: p. 48.
Sheldon, T. "The Top Shelf: Internet Censorship." The Economist, 18 May 1996: p. 84.