Jodi Dean makes the claim that publicity is the ideology of technoculture. Analyzing this idea requires considering the meaning of publicity and the nature of technoculture, as well as how the two fit together. Technoculture identifies the cultural system we live in today, a system marked by the widespread use of technology. More than this, that technology has become pervasive, linking business, entertainment, and social life ever more seamlessly as people embrace its potential while often failing to see how it is changing their lives and interactions. Some of the concerns raised by critics of these changes sound much like whistling in the wind: the growth and dissemination of technology is now unstoppable, and the direction the culture takes evolves automatically from the technological choices consumers make. Technoculture embraces both old and new technology and rests on a view of technology as a supreme force and a conduit for communication, and publicity as the ideology of technoculture shows that more and more communication involves a form of self-advertising for companies, organizations, movements, and individuals alike.
Technoculture and Publicity
Dean (2002) views technoculture as being trapped in what she calls “a weird matrix,” noting that “at just that moment when everything seems fully public, the media pulses with invocations of the secret” (p. 1). Dean notes how the media uses secrets as demonstrations of vulnerability and emphasizes security as a way of gathering and protecting secrets. A second and seemingly opposing aspect of technoculture is a seeming openness that is also characterized as more democratic than other actions in the public forum and that involves the exposure of the intimate:
That is to say, technoculture materializes aspirations long associated with the public sphere. Indeed, it sometimes seems a machinery produced by the very ideals inspiring democracy. Advances in computer-mediated interaction provide ever-greater numbers of people with access to information. No longer a privilege of the elite, information is available to anyone with a computer. Similarly, more people have opportunities to register their thoughts and opinions on political topics than ever before. Chat rooms, cybersalons, and e-zines are just some of the new electronic spaces in which people can participate as equals in processes of collective will formation (Dean 2002, p. 2).
Some theorists see the new technology as reflecting the ideal of the public sphere in that it offers universal access, uncoerced communication, freedom of expression, participation outside of traditional political institutions, and contributes to the creation of public opinion by means of public discussion. The fuel in the system is publicity, which links together the ideals of openness, inclusivity, visibility, equality, accessibility, and rationality (Dean 2002, p. 2). The new technology, like much of the old, carries the culture to the masses, and as Dean (2000) writes, “Cultural politics is about altering the boundaries that order American democracy” (p. 78).
Sclove (1995) takes the view that technologies “promote unintended social consequences” (p. 10), and recognition of this fact has produced a number of concerns. Many have worried that widespread use of the Internet will harm human interaction and reduce the sense of community in society, though others argue that this is not the case. Esther Dyson refers to the way Americans in particular revere frontiers, and for her, cyberspace is a new frontier, a “place where you can go and be yourself without worrying about the neighbors” (in Kennedy, Kennedy, & Aaron 1997, p. 640). What attracts people to cyberspace, she says, is that it is so different from the community they are accustomed to in their usual lives, and one difference is that cyberspace involves a degree of freedom not possible elsewhere. It brings together many different communities under one heading:
Formerly a playground for computer nerds and techies, cyberspace now embraces every conceivable constituency: schoolchildren, flirtatious singles, Hungarian-Americans, accountants — along with pederasts and porn fans. Can they all get along? (in Kennedy, Kennedy, & Aaron 1997, p. 640).
In 1998, an early study by Robert Kraut, a professor of social psychology at Carnegie Mellon University, found that the Internet was dangerous to one’s social and psychological well-being. Tranvik (2001) poses the key question about the new technologies: “will they cause unimaginable levels of social isolation, or will they bring a new society with a friendlier face” (para. 1).
The fear has been that they will bring more isolation, while many users believe they interact more with others online than they do in life. Cyberspace most certainly does constitute a new community to which virtually anyone may belong, and how helpful or harmful this may be will be clear in time.
However, seeing the Internet as a separate community does not mean that it is completely isolated from the physical surroundings of the user, nor that it cannot be reached by the political and social forces of real-world socio-political entities. At one time it was thought that the Internet could not be controlled, but to a degree it can. Countries like China and Saudi Arabia have found ways to block large portions of the Internet and to criminalize access to the blocked sites. This is precisely why countries like Canada are considering doing something to correct the situation, but it must be recognized that governments will not give up their power willingly. Just as technology can overcome many obstacles governments place in the user’s way, so governments can track users more thoroughly and may be able to identify who is bypassing their controls, leading to retaliation; such governments would certainly see a need to retaliate against a country like Canada. The new technology is liberating and democratizing in that it gives everyone access to information as never before, but the process can be slowed to a great extent by a hostile government and even by attacks such as viruses and worms.
Privacy and Access
While much of the emphasis has been on the growth in new media technology, in fact all media are becoming more integrated and even coordinated. This creates a huge body of data for anyone to access but at the same time raises more and more concerns about privacy, as Dean (2002) notes when she writes,
On the one hand, interconnected media exacerbate the drive for content, for the scoop, for information, in their competition for audience. There must be some secret out there that has not yet been revealed. On the other hand, with the abundance of information available on the Internet, cable television, and radio come more personal concerns about the disintegration of privacy (Dean 2002, pp. 71-72).
Privacy in the new media environment is diminishing by the day as we are all monitored by cameras and other means throughout the day, meaning, as Dean (2002) states, “We are all potential information” (p. 72).
In terms of computer technology, different solutions to the problem of privacy have been implemented, with encryption being one solution applied to a wide variety of types of communication. Encryption means the coding of messages and data files sent across the Internet, preventing third parties from reading a message while allowing the intended receiver to decode and read it. This means that the recipient must have access to a special key, or block of encoded data, that unlocks the message, allowing the receiver to display it as an image, hear it as a sound file, or otherwise recreate it in its original form. Encryption of this sort may be achieved in a variety of ways, such that the approach can be used not only for Internet file transfers but also for telephone transmissions, the sending of cable and television signals, and other forms of electronic information transfer. Particular methods of encryption are used because they have been shown to be effective, and they will continue to prevail so long as they remain effective.
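The shared-key idea described above can be illustrated with a deliberately simple sketch. This toy XOR cipher is not a secure algorithm and is not any particular system discussed here; it only shows the principle that the same secret key scrambles the message for transit and unscrambles it for the intended receiver:

```python
from itertools import cycle

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # XOR each byte of the message with the repeating key bytes.
    # Applying the same operation twice with the same key
    # restores the original, so encryption and decryption are symmetric.
    return bytes(b ^ k for b, k in zip(data, cycle(key)))

key = b"shared-secret-key"          # held by sender and receiver only
message = b"Meet me at noon."

ciphertext = xor_cipher(message, key)   # unreadable to third parties in transit
plaintext = xor_cipher(ciphertext, key) # the receiver recovers the original

assert plaintext == message
assert ciphertext != message
```

Real systems of the kind the passage describes rely on far stronger mathematics, but the workflow is the same: without the key, the intercepted bytes are noise; with it, the original file or sound can be reconstructed.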
Encryption emerged as a political issue in 1993, when the Clinton Administration proposed a law requiring the inclusion of what became known as the Clipper Chip in every device that could encode messages, thus allowing the government to decode messages under certain circumstances. This proposal was seen as a challenge to privacy. Concerns about protection with encryption are only one of the concerns raised about privacy, but clearly privacy is a major concern of computer users today. Dockrill (1987) points to the usual concerns about invasion of privacy by computer and finds that many of these scenarios focus on deliberate efforts of governmental agencies, cloaked within their administrative powers, to amass and use more and more information concerning individuals or the activities of individuals or organizations. Certain specific proposals have caused particular concern, such as that for the establishment of a National Data Center in the early 1970s. Dockrill then notes:
Throughout the literature, the shared concern appears to be about the growth in governmental powers in this area and the widespread development of private organizations which collect and disseminate, generally for a fee, both personal and financial information relating to individuals (Dockrill 1987, pp. 547-548).
The problem is stated clearly by Graham: “The legal community has paid little attention to the consequences for individual privacy of the development of computers” (Graham 1987, p. 1396). Graham does say that the common law has the capacity to protect privacy rights from invasion of privacy just as it expanded to combat threats in the past, but he also says that privacy law has lagged behind technology: “Privacy law has failed to respond, as it has in the past, to technological changes that influence the degree of privacy to which we are accustomed” (Graham 1987, p. 1396).
Technology has changed the nature of “privacy,” according to some, because technology has altered the meaning of “public.” In an earlier age, people possessed greater anonymity than they do in the computer age, when vast stores of data about everyone are accessible by computer. The old concept of privacy is thus disappearing, though computer users are realizing this fact more and more and so seek ways to prevent any further erosion of privacy. While it remains true that massive amounts of information may be gathered in one place, analyzed, and disseminated, users still try to remain as anonymous as possible (“Virtual Privacy” 1996, pp. 16-17).
The Center for Democracy and Technology concluded in 1997,
The deployment of key recovery systems designed to facilitate surreptitious government access to encrypted data and communications introduces substantial risks and costs. These risks and costs may not be appropriate for many applications of encryption, and they must be more fully addressed as governments consider policies that would encourage ubiquitous key recovery (“The Risks of Key Recovery, Key Escrow, and Trusted Third Party Encryption” 1998).
Most of the encryption systems used today, the organization points out, support rather than hinder the prevention and detection of crime. Encryption is used to protect burglar alarms, cash machines, postal meters, and a variety of vending and ticketing systems from manipulation and fraud, and encryption is also being deployed to facilitate electronic commerce by protecting credit card transactions on the Internet and by hindering the unauthorized duplication of digital audio and video. The use of encryption remains patchy, however:
Most automatic teller machine transactions are protected by encryption, but transactions made by bank staff (which can involve much larger amounts of money) are often not protected. Most Internet electronic mail is still sent “in the clear” and is vulnerable to interception. Most cellular telephone calls in the U.S. are still sent over the air without the benefit of strong encryption. The situation is similar in other areas. Members of the law enforcement and intelligence communities continue to express concern about widespread use of unescrowed cryptography. At the same time, these communities have expressed increasing alarm over the vulnerability of “critical infrastructure.” But there is a significant risk that widespread insertion of government-access key recovery systems into the information infrastructure will exacerbate, not alleviate, the potential for crime and information terrorism (“The Risks of Key Recovery, Key Escrow, and Trusted Third Party Encryption” 1998).
This is because increasing the number of people with authorized access to the critical infrastructure and to business data will increase the likelihood of attack, whether this be by technical means, by exploitation of mistakes, or through corruption. In addition, key recovery requirements, to the extent that they make encryption cumbersome or expensive, can discourage or delay the deployment of cryptography in increasingly vulnerable computing and communications networks (“The Risks of Key Recovery, Key Escrow, and Trusted Third Party Encryption” 1998).
Encryption is also imperfect and may fail in some situations. Even among e-mail programs for instance, there are different and incompatible methods for accomplishing the task. This fact can mean that the sender will use an application that is not compatible with the system of the receiver, in which case the receiver will not be able to decode the message (Kenworthy & Lang 1998, p. 144).
Thus, it is evident that concerns over privacy fuel much of the desire for encryption and the opposition to any key escrow plan. Users want the government to intervene to make encryption more uniform so that one program can understand another, but they do not want the government to possess the means to contravene the system and spy on Internet users. It is also believed that such spying would cause more harm than benefit to society. While encryption can protect a given message and may contribute a sense of privacy for computer users, the ways of invading privacy increase exponentially. When a user buys something online, for instance, he or she uses encryption to protect his or her credit card, and this may be effective. The fact of the purchase itself can become part of a database about products and preferences, however, so that companies can decide whom to contact about other products. Companies trade this sort of data and so break that aspect of privacy all the time, another demonstration that we are all information.
The Developing Technological World
Winner (1997) is divided on the direction and value of technological change and on the specific business and sociological changes accompanying the shift to a computer-driven society. Part of the impetus for his investigation is a visit he makes to the workplace of the Remote Encoding Center, where he finds workers caught up in work as repetitive, uninspired, and deadening as any assembly line ever was. He also finds that many predictions about the way technology would transform society and make it more utopian have not come to pass and that the ideas of analysts are changing in the face of the reality of this sort of work and the more widespread use of this technology.
The author notes the nature and prevalence of the prevailing view, describing it as technological determinism, which many embrace and others fear. He finds that the new ideology accompanying this change is based in part on the philosophy of Ayn Rand, with heroic individuals struggling against the forces of small-minded bureaucrats and the ignorant masses, and in part on supply-side economics and a free-market economy. The author refers to those holding these views as cyberlibertarians and says that they believe this will lead to the ideals of classical communitarian anarchism. He cites some of these authorities and considers their views, noting the belief that this new social order would also mean increased democracy, with cyberspace standing in as the new public square where everyone could indulge in free expression. This development is seen as inevitable and far-reaching, with various documents published online to support this position and to guarantee such freedom.
However, the author also finds that there are dystopian elements within this utopian ideal and that many of these elements are gaining power and becoming reality. One major concern is the consolidation of the lines of communication in the hands of a few, as various media become parts of larger conglomerates. Also of concern is what sort of communities will form in cyberspace, with some seeing increased social separation online alongside some unification in a more impersonal way. Ultimately, the author calls for readers to become more aware of the possibilities and to see what sorts of changes are actually taking place as a way of taking action, though he also says this will not be enough and that new opportunities for shaping technology must be created and used. What is happening more and more, though, is that the technology is shaping the social order rather than being shaped by it, and again he points to those working in the Remote Encoding Center as examples.
As Dean (2002) points out, some of the concerns about the dystopian elements of technology derive not from its failure but from its success. She notes,
One might think that the possibility of limitless information would help realize the claims of a democratic public sphere. If those who participate in the “conversation” have an abundance of data at their disposal, shouldn’t they be able to make more informed decisions? Some versions of public deliberation stipulate that nothing be omitted from consideration, that participants have access to all relevant information. Yet the conspiracy rhetoric pervading current assessments of the Internet links precisely this vision of an end to ignorance, secrecy, and the rule of expert knowledge that animates the ideal of a public sphere with gullibility, seduction, and widespread irrationality. The very prevalence of information and inclusion of multiple voices claimed on behalf of democratic discourse morphs into the undecideability of truth claims and the fear that “all kinds of people” will enter the conversation (Dean 2002, p. 72).
Moor (1985) writes about the meaning of computer ethics, the first step in determining what any operating code should include, and he states that computer ethics “is the analysis of the nature and social impact of computer technology and the corresponding formulation and justification of policies for the ethical use of such technology” (Moor, 1985, p. 266). Moor also notes the difficulty of formulating a policy for the use of computer technology, though the fact that the task is difficult is not a reason for avoiding it. He says that what is required is a conceptual framework. Such a framework would include clear definitions of the different elements and issues involved, providing answers to the many questions that would arise in attempting to formulate a comprehensive policy. The formulation of such a policy does not take place in a vacuum, nor is it without precedent:
[I] also wish to stress the underlying importance of general ethics and science to computer ethics. Ethical theory provides categories and procedures for determining what is ethically relevant (Moor, 1985, p. 267).
Gotterbarn (1991) points out that developing any policy regarding computer ethics is itself based on the determination that ethical problems in this area can be resolved, for if no resolution is possible, it is a waste of time to worry about them (Gotterbarn 1991, p. 26). However, it could also be stated that the creation of a code of ethics is a statement that resolutions to these problems should be sought whether they can always be found or not. Indeed, a code can reduce the problems to a bare minimum and leave only the most difficult issues unresolved. Gotterbarn also provides a good rationale for developing a code of ethics for computer use when he makes an analogy between computer ethics and other sorts of ethics. He says that people think computer ethics are different and then gives some of the reasons for this, such as the fact that other forms of ethics seem to have fixed domains while computer ethics has no such clear domain:
The scope of computer ethics has been made so broad that it includes numerous and conflicting values and methodologies (Gotterbarn 1991, p. 27).
Gotterbarn calls for a narrower definition of computer ethics in order to rein in the subject and make it manageable, but in general he sees no difference between computer ethics and other ethical issues:
[I] maintain that computer ethics is not unique; the ethical issues of it as broadly defined above are either subsumable under the issues of general ethics or they are a type of professional ethics (Gotterbarn 1991, p. 27).
In terms of the specific issue of creating and using databases of consumer information, the issue has two major facets. Some object to the gathering of such information in the first place and to the means of doing it, such as planting “cookies” on the computers of those visiting websites to track computer use. A second question concerns the tendency of companies to sell access to their information to other companies, which may then market to the same consumers and use the information to target those consumers in ways the consumers may not like. Adhering to privacy agreements would preclude some of this behavior, and legal decisions might curtail even more of it.
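The second facet, companies pooling records about the same consumer, can be sketched in miniature. Everything here is hypothetical (the visitor identifier, the site names, and the `merge_profiles` helper are illustrations, not any real company's system); the point is simply that records keyed to one tracking identifier, each harmless alone, combine into a fuller profile than any single company held:

```python
# Two hypothetical sites hold purchase records keyed by the same
# tracking identifier (e.g., one planted via a cookie).
purchases_site_a = {"visitor-123": ["hiking boots"]}
purchases_site_b = {"visitor-123": ["trail map", "tent"]}

def merge_profiles(*sources):
    # Combine records that share a tracking identifier,
    # yielding a composite consumer profile.
    profile = {}
    for source in sources:
        for visitor, items in source.items():
            profile.setdefault(visitor, []).extend(items)
    return profile

combined = merge_profiles(purchases_site_a, purchases_site_b)
# combined["visitor-123"] now reveals an interest in camping gear
# that neither site could infer on its own.
```

The targeting the passage describes follows directly: whichever company buys access to the combined profile can market to "visitor-123" on the basis of behavior observed elsewhere.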
The issues raised have been of sufficient importance to various individuals that groups have formed to promote ethical behavior and proper use of such databases. One such organization is the Electronic Frontier Foundation (EFF). The organization was founded to ensure that the principles embodied in the U.S. Constitution and Bill of Rights (and the United Nations Universal Declaration of Human Rights) are protected as new communications technologies emerge. Certain specific activities are therefore undertaken by the organization to accomplish this task: it supports educational activities, works to develop a better understanding of the issues among policy-makers, seeks to raise public awareness about civil liberties issues, supports litigation in the public interest, and encourages and supports the development of new tools which will endow non-technical users with full and easy access to computer-based telecommunications. The EFF has sponsored a number of legal cases in which users’ online civil liberties are protected and submits amicus briefs and finds pro bono counsel when possible for important legal cases in this field. The EFF has been effective in many of its court cases, has raised awareness of online free speech issues, and continues to fight attempts to censor online communications (Electronic Frontier Foundation, 2005).
As noted by Denise, Peterfreund, and White (1996), the definition of ethics itself differs from one theorist to the next:
In regard to the definition of ethics… The best appreciation of the meaning and importance of a problem comes from an examination of the various solutions which have been attempted. Each ethical theorist conceives ethics in a personal way… (Denise, Peterfreund, & White, 1996, pp. 6-7).
Yet, these definitions all in some way address how humans should behave and what is considered right behavior and what is considered wrong behavior. Ethics also applies to corporate actions and how companies treat the human beings with whom they interact.
Companies make use of data they themselves collect in order to target customers, reshape promotional messages, and decide on new products. More and more, they make use of private data they acquire from third parties for the same purpose:
Information about people is becoming increasingly useful to marketing agents, and vast digital collections of personal information are being collected. In some cases, the data is sold, but in others, it is freely available. Now that the Web has become the ultimate data dissemination tool, databases of personal information can be queried very quickly, easily, and above all, anonymously. Never has the task of an investigator (or a casual interrogator) been easier (Halstead & Ashman 2000, para. 1).
Yet people express more fear of government data collection than of private collection, though in fact data collection has been privatized to a great degree:
It also is clear that data collection is increasingly a core activity of private companies, rather than government organizations… However the public is less inclined to view commercial data collections as a threat to personal privacy or liberties. This could in part be because such data collections are hidden, and their extent and use are not publicized (Halstead & Ashman 2000, para. 2).
Another issue that affects the power of the Internet to be a democratizing force is accessibility. For one thing, widespread as the PC is, there are segments of society that do not have access to the computer and so to the Internet. This means that the distance between the haves and have-nots is likely to increase, which does not bode well for full participation. The cost of joining the newly-wired world may be high, but the cost of staying on the sidelines would be even greater. The new technologies can make life easier and expand our horizons if we use them properly and learn their capabilities. However, before the benefits of the Internet and associated technologies can be realized, the individual must have access to the necessary computer technology and to a connection to the Internet.
Web accessibility refers to more than the ability to connect to the Internet and includes the idea that websites should be, though often are not, designed so as to “facilitate access to the information on the site by people with disabilities” (Darsie 2004, p. 1) and others who need to access data and may find it difficult unless the site is designed to help rather than hinder the search. From the point of view of the web designer, web accessibility is “the ability to produce Web sites that are easily accessible by as broad an audience as possible” (Focazio 2004, p. 1). Only in this way can the democratic ideal become more a reality.
The democratization of society begins in the schools as the virtues and values of democracy are inculcated in the young. However, this process can falter unless full access to the new technology is also part of the process of education. According to Terrett (1994), “the use of technology, even though viewed by some as expensive and unnecessary, creates a cost-efficient mechanism that gives students access to materials and resources that were previously unavailable” (p. 30). Access to computer resources provides a significant resource for both school and work-based learning. The World Wide Web has become one of the most significant sources of information and offers a plethora of sources for many subjects. Access to the web provides the greatest learning opportunities.
Paradoxically, it is also evident that those students who have not had equal opportunities to learn, and who conceivably could gain the most from high-technology applications, are also those students who, in most instances, do not have equal opportunities to use these applications. This fact has been recognized by major organizations such as the U.S. Department of Education, which is aware of the special need for equitable access in urban, rural, and disadvantaged schools. Furthermore, if these students do not have equal access to this technology, the gap between the technology “haves” and the “have nots” will widen. At the same time, those students and schools already experiencing discrepancies in the quantity and quality of educational resources will only find the gap wider over time, because the availability of these resources varies across populations by location and socioeconomic conditions.
To consider this situation in more detail, it is important to consider the work of Means and Olson (1995). They report that “access to educational technology at school can give students from low-income homes, where there is little or no access to technology, a needed edge to compete with children coming from more affluent homes, where technology is commonplace” (p. 103). By ensuring that all classrooms have access to affordable educational technology, curricular goals can be met. This would also create an opportunity to begin addressing the inequities existing among schools and districts in terms of the availability of instructional resources.
However, it has been shown that simply having educational technology present in the classroom does not guarantee academic benefits. Students will benefit only if the technology is implemented in a positive and useful way by trained teachers and educators. As noted, teachers are not always properly trained to provide the best educational benefits the technology has to offer, and there are in fact many schools considered “at-risk” whose students attend schools where the educational technology, although available, is not being used. If the technology is not being used by the schools, then the students are not benefiting, and the gap between the “haves” and the “have nots” widens.
Dean (2002) points out that “Publicity is also the governing concept of the information age. Contemporary technoculture relies on the conviction that the solution to any problem is publicity. More information, greater (faster, better, cheaper!) access seems the only answer. It doesn’t even matter what the question is. People are supposed to find out for themselves, to search for the truth, to form their own opinions — and the way to do that is through new communication technologies” (Dean 2002, p. 15).
Jodi Dean’s statement that publicity represents the ideology of technoculture shows that publicity in this view involves a complex clash between the public and the private in a world where privacy is diminished even as people discover new ways of keeping some data secret while publishing, purposely or inadvertently, more and more data that once would have remained private. The nature of democracy is being altered by these changes in ways that the public may not yet perceive.
References
Darsie, R., 2005, Building Accessible Web Sites, Office of the Vice Provost, Information and Educational Technology, University of California, Davis, http://tif.ucdavis.edu/meetings/2002/accessibility_recsol3.pdf.
Dean, J., 2000, Cultural Studies and Political Theory, Ithaca, New York, Cornell University Press.
Dean, J., 2002, Publicity’s Secret: How Technoculture Capitalizes on Democracy, Ithaca, New York, Cornell University Press.
Denise, T.C., Peterfreund, S.P., & White, N.P., 1996, Great Traditions in Ethics, New York, Wadsworth.
Dockrill, C., 1987, Computer data banks and personal information: Protection against negligent disclosure, Dalhousie Law Journal, 546-550.
Electronic Frontier Foundation, 2005, retrieved July 11, 2005, from http://eff.org.
Focazio, P.C., 2004, Accessible Web Design: An Introduction, New York Sea Grant, Stony Brook University.
Gotterbarn, D., 1991, Summer, Computer ethics: Responsibility regained, Phi Kappa Phi Journal, 26-30.
Graham, J.P., 1987, June, Privacy, computers, and the commercial dissemination of personal information, Texas Law Review, 1395-1439.
Kennedy, X.J., Kennedy, D.M., & Aaron, J.E., 1997, The Bedford Reader, Boston, Bedford Books.
Kenworthy, K. & Lang, N.A., 1998, December 1, How Safe Is the Net?, Windows, 144.
Means, B. & Olson, K., 1998, August 31, Background Paper for the Expert Panel on Educational Technology, Criteria for Selection of Promising and Exemplary Practices, Issue IV, http://www.ed.gov/offices/OERI/ORAD/Background/lessons.html.
Moor, J.H., 1985, What is computer ethics?, Metaphilosophy, 266-275.
The Risks of Key Recovery, Key Escrow, and Trusted Third Party Encryption, 1998, Center for Democracy and Technology, retrieved April 10, 2007, from http://www.cdt.org/crypto/risks98/.
Sclove, R.E., 1995, Democracy and Technology, New York, The Guilford Press.
Terrett, D.L., 1994, Information Technology for Law Schools: The Second BILETA Report into Information Technology and Legal Education, Legal Education and Technology Association.
Tranvik, T., 2001, January 1, Power to the Internauts, Inroads: A Journal of Opinion, retrieved March 18, 2007, from http://www.highbeam.com/doc/1G1-76702605.html.
Virtual Privacy, 1996, February 10, The Economist, 16-17.
Winner, L., 1997, Fall, Technology Today: Utopia or Dystopia?, Social Research, 64(3), 989.