The Children’s Internet Protection Act
The Children’s Internet Protection Act (CIPA) is one of the main tools used by the federal government to control access to the internet in public spaces. CIPA is not the first effort to restrict content on public computers. Earlier attempts to regulate and restrict online content, through the Communications Decency Act (CDA) and the Child Online Protection Act (COPA), were deemed unconstitutional on the grounds that they violated First Amendment rights.
The CDA attempted to regulate content by making it a felony to intentionally send “indecent” material to minors. After the Supreme Court found the CDA unconstitutional, Congress responded with COPA, which extended the CDA’s approach by prohibiting the transmission, for commercial purposes, of communication considered “harmful to minors” (ALA, 2010).
The problems surrounding both these acts include (but are not limited to) vague definitions of terms such as “commercial”; questions over whose responsibility it is to set community standards of decency; and infringement on adults’ right to privacy (Electronic Frontier Foundation, 2010). Unlike CIPA, COPA had no bearing on libraries but was directed at businesses.
CIPA was passed by the US Congress in 2000, directly affecting public schools and libraries. It states that schools and libraries receiving federal funding to purchase computers used to access the internet, or to cover related costs for accessing the internet, must have:

in place a policy of internet safety for minors that includes the operation of a technology protection measure with respect to any of its computers with internet access that protects access through such computers to visual depictions that are (A)(i) (I) obscene; (II) child pornography, or (III) harmful to minors; (ii) and is enforcing the operation of such technology protection measure during any use of such computers by minors (Title XVII – Children’s Internet Protection Act, p. 3).
The American Library Association (ALA) Intellectual Freedom Committee holds that “use in libraries of software filters to block constitutionally protected speech is inconsistent with the United States Constitution and federal law and may lead to legal exposure for the library and its governing authorities” (ALA Intellectual Freedom Committee, 2000). The ALA considers that the problems resulting from the use of filtering software include the blocking of materials protected by the First Amendment and the possibility of a viewpoint being imposed, because filters may not offer access to all the relevant information a user needs to be well informed.
The act could also imply to parents that their children will be protected from information that the parents do not wish their children to view or read. Additional problems with regard to minors’ rights include the over-blocking of material that is constitutionally protected as well as the under-blocking of possibly inappropriate images and text. Though CIPA does not require the filtering of text, the decision to filter material other than visual images is made locally (Boss, 2004).
Vague definitions in CIPA, such as the term ‘harmful to minors’, mean that key terms are subjective, with interpretations varying from user to user, within communities, and among IT staff. As a result, the law is implemented differently in different places, varying across city, county, and state. The library and its location therefore determine what information a person can access using public library computers.
Filtering software and its problems
To learn how these laws affect computer users, we visited a number of libraries. We used search engines on library computers to find out whether filtering is used and, if so, what is filtered. We spoke to library staff, including the IT staff responsible for installing and managing the filtering systems, in Pennsylvania, New York, Georgia, and Delaware, among other states.
Some locations employ filters, while others use terms of service to explain what is allowed to be accessed. One IT staff member from the northeastern region of the US indicated that filtering software companies offer categories of filtering that a library can choose to use in its software package. SmartFilter, for example, is a filtering program offered by the McAfee security technology company. SmartFilter has more than 30 categories of filtering, ranging from ‘Alcohol’ to ‘Hate Speech’ and ‘Chat’, from which a client may choose. Such filtering software can also be customized based on the user’s preferences (McAfee, 2010). This particular library filters under three categories: pornography, malicious software, and phishing.
In places with filters, the items that are filtered are not standard across libraries or software companies. Filtering cannot be fine-tuned: if a filter excludes pornographic or violent content, it is likely to exclude health information as well.
Internet content filters deny access to certain websites based on blacklists or lists of prohibited keywords, a practice which tends to exclude more than the targeted websites, much as a search engine returns unwanted results alongside relevant ones. With keyword filtering, unrelated items are blocked along with the intended targets. For example, in a large east coast city, the word “anal” appeared to be filtered, which prevented people gaining access to information about (for example) anal cancer as well as any potential sexual content.
In one small-town library in New England, no filters are used and all information can be accessed. However, computer users are asked to sign the terms of use, which include a clause noting that minors use the library and that inappropriate content is not allowed. This means that users who access pornography or other material deemed inappropriate can be asked to stop. The librarian said that they have had very few problems, but that when she worked in a large city, this approach would not have worked. She explained that in a small community where everyone is familiar with each other, people, including youths, are not likely to overtly breach community standards; in a large city, with its degree of anonymity, the same social constraints did not apply.
In a library located in the metropolitan area of a large city in the southeastern region of the country, filtering software was installed only on computers used by children, as was the case at some other libraries we visited. Yet at this library the age requirement for an adult library card is 13 years old. Though parents must obtain the library card, once a child (under the age of 17) accesses the internet, they have the same freedom of information as adults. All internet users must accept terms of use before accessing the internet, which include: “I will avoid anything that may be disruptive to other patrons, such as displaying pictures unsuitable for a general audience” (SAM Library Internet Access Program, Internet Access Manager, 2009). Such vague statements create a grey area, as it is unclear what material might be unsuitable to other patrons. This leaves room for discretion – if no one complains, any content may be accessible – as well as the possibility of over-vigilance and over-restriction.
In another library, in a large city located in the northeast, filtering software is installed on all computers for both adult and child use. At this library, we also clicked on selected websites generated in the search results. Although we were able to search all terms, access was denied for some of these websites. For example, when searching the terms “sex work” and “sex worker rights” we were able to access some sites, such as http://www.sexwork101.com, but unable to access www.desireealliance.org; both are sex worker organization websites. Another example is the search term ‘sex change’: the website www.feminizationsurgery.com was blocked, yet access to www.srsmiami.com was allowed, both being websites providing information about sex reassignment surgery and/or the services provided. In addition, some websites that were allowed may have contained pages that were blocked.
IT staff at this library informed us that the Administration of Public Services determines what is filtered. However, while a library patron may request that a website or term be filtered, a patron may also request that a filtered website be removed from the list of filtered sites.
The two examples of blocked websites mentioned above clearly show how people could potentially be affected by the blocking of information and influenced by incomplete access to it. For example, sex workers and transgender people may be affected by such restrictions if they rely on public library computers. Sex workers in the US could access some information from Sex Work Awareness’s sexwork101 site but be deprived of information about the Desiree Alliance, the national network of sex workers in the US. People seeking information about sex reassignment surgery may find out about a particular specialist in Florida but be prevented from accessing the site comparing outcomes from a variety of surgeons.
We have been able to visit libraries in various locations in the eastern US and want to expand the geographic range of this project. In the next phase of research, we will enlist others to conduct searches at their local libraries and use an online survey to report what they were able to access and what content, if any, was blocked. The link to the survey is on the SWA site, and renowned internet persona Audacia Ray will help us promote it. We hope readers of genderIT.org in the US will participate!
Kevicha Echols is a sex educator and a doctoral candidate at Widener University in Philadelphia, Pennsylvania. Melissa Ditmore is a researcher and writer known for her work on sex work and human trafficking. She is a post-doctoral fellow at National Development and Research Institutes, Inc. Echols and Ditmore are affiliated with Sex Work Awareness, a new organization dedicated to promoting information about and advocacy for sex workers. SWA is the USA partner on APC's EroTICs project. This project is helped by Audacia Ray, a media maker and activist who is passionate about sexual rights.
References:
American Library Association (ALA) (2010). CPPA, COPA, CIPA: Which One Is Which? Retrieved February 24, 2010 from http://www.ala.org/ala/aboutala/offices/oif/ifissues/issuesrelatedlinks/...
ALA Intellectual Freedom Committee (2000). Statement on Library Use of Filtering Software. http://www.ala.org/Template.cfm?Section=IF_Resolutions&Template=/ContentManagement/ContentDisplay.cfm&ContentID=13090
Boss, R. W. (2004). Meeting CIPA Requirements With Technology. Retrieved February 24, 2010 from http://www.ala.org/ala/mgrps/divs/pla/plapublications/platechnotes/internetfiltering.cfm
Electronic Frontier Foundation (EFF) (2010). COPA (“CDA II”) Legal Challenge Page. Retrieved February 24, 2010 from http://w2.eff.org/legal/cases/ACLU_v_Reno_II/
McAfee (2010). McAfee SmartFilter. Retrieved January 15, 2010 from http://www.mcafee.com/us/enterprise/products/email_and_web_security/web/...
SAM Library Internet Access Program, Internet Access Manager (2009). Internet Terms of Use.
United States Children’s Internet Protection Act (CIPA) (2001).