Journal articles on the topic 'Citations (interferences)'

Consult the top 31 journal articles for your research on the topic 'Citations (interferences).'

1

Scheall, Scott, William N. Butos, and Thomas McQuade. "Social and scientific disorder as epistemic phenomena, or the consequences of government dietary guidelines." Journal of Institutional Economics 15, no. 3 (October 23, 2018): 431–47. http://dx.doi.org/10.1017/s1744137418000358.

Abstract:
We begin with a process-oriented model of science according to which signals concerning scientific reputation serve both to coordinate the plans of individuals in the scientific domain and to ensure that the knowledge that emerges from interactions between scientists and the environment is reliable. Under normal circumstances, scientific order emerges from the publication–citation–reputation (PCR) process of science. We adopt and extend F. A. Hayek's epistemology according to which knowledge affords successful plan-based action and we employ this in the development of an epistemic theory of social order. We propose that external interferences with the PCR process have distorting effects on scientific knowledge and, thus, on scientific and social order more broadly. We support this claim by describing the history of the US federal government's development of standardized dietary guidelines for American consumers and its concomitant interference in the PCR process of nutritional science. We conclude that this interference contributed to social disorder in dietary science and beyond.
2

Chai, Sen, Alexander D’Amour, and Lee Fleming. "Explaining and predicting the impact of authors within a community: an assessment of the bibliometric literature and application of machine learning." Industrial and Corporate Change 29, no. 1 (July 16, 2019): 61–80. http://dx.doi.org/10.1093/icc/dtz042.

Abstract:
Following widespread availability of computerized databases, much research has correlated bibliometric measures from papers or patents to subsequent success, typically measured as the number of publications or citations. Building on this large body of work, we ask the following questions: given available bibliometric information in one year, along with the combined theories on sources of creative breakthroughs from the literatures on creativity and innovation, how accurately can we explain the impact of authors in a given research community in the following year? In particular, who is most likely to publish, publish highly cited work, and even publish a highly cited outlier? And, how accurately can these existing theories predict breakthroughs using only contemporaneous data? After reviewing and synthesizing (often competing) theories from the literatures, we simultaneously model the collective hypotheses based on available data in the year before RNA interference was discovered. We operationalize author impact using publication count, forward citations, and the more stringent definition of being in the top decile of the citation distribution. Explanatory power of current theories altogether ranges from less than 9% for being top cited to 24% for productivity. Machine learning (ML) methods yield findings similar to those of the explanatory linear models, with tangible improvement only for non-linear support vector machine models. We also perform predictions using only data available up to 1997, and find lower predictability than with the explanatory models. We conclude with an agenda for future progress in the bibliometric study of creativity and look forward to ML research that can explain its models.
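To make the exercise described above concrete, here is a minimal sketch that fits an explanatory linear model and a non-linear support vector machine to predict top-decile citation impact from prior-year bibliometric features. It is an illustration only, assuming hypothetical features and synthetic data rather than the authors' dataset or pipeline.

```python
# Minimal sketch (not the authors' pipeline): predict whether an author lands in the
# top decile of next-year citations from prior-year bibliometric features.
# The feature names and the synthetic data are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n = 2000
X = np.column_stack([
    rng.poisson(3, n),       # prior-year publication count (assumed feature)
    rng.poisson(20, n),      # prior forward citations (assumed feature)
    rng.integers(1, 30, n),  # career age in years (assumed feature)
    rng.poisson(5, n),       # number of distinct coauthors (assumed feature)
])
# Synthetic outcome: in the top decile of citations the following year.
score = 0.05 * X[:, 0] + 0.02 * X[:, 1] + rng.normal(0, 1, n)
y = (score > np.quantile(score, 0.9)).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
models = {
    "logistic (explanatory linear)": LogisticRegression(max_iter=1000),
    "non-linear SVM (rbf)": SVC(kernel="rbf", probability=True),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
    print(f"{name}: AUC = {auc:.2f}")
```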
3

McIsaac, Tara L., Eric M. Lamberg, and Lisa M. Muratori. "Building a Framework for a Dual Task Taxonomy." BioMed Research International 2015 (2015): 1–10. http://dx.doi.org/10.1155/2015/591475.

Abstract:
The study of dual task interference has gained increasing attention in the literature for the past 35 years, with six MEDLINE citations in 1979 growing to 351 citations indexed in 2014 and a peak of 454 cited papers in 2013. Increasingly, researchers are examining dual task cost in individuals with pathology, including those with neurodegenerative diseases. While the influence of these papers has extended from the laboratory to the clinic, the field has evolved without clear definitions of commonly used terms and with extreme variations in experimental procedures. As a result, it is difficult to examine the interference literature as a single body of work. In this paper we present a new taxonomy for classifying cognitive-motor and motor-motor interference within the study of dual task behaviors that connects traditional concepts of learning and principles of motor control with current issues of multitasking analysis. As a first step in the process we provide an operational definition of dual task, distinguishing it from a complex single task. We present this new taxonomy, inclusive of both cognitive and motor modalities, as a working model; one that we hope will generate discussion and create a framework from which one can view previous studies and develop questions of interest.
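As background for the taxonomy discussed above, the quantity most often reported in this literature is the dual-task cost, the relative drop in performance from single- to dual-task conditions. The formula and example values below are a generic illustration, not taken from this paper.

```python
# Illustrative only: dual-task cost (DTC) as the percentage change in performance
# between single-task and dual-task conditions. Example values are invented.
def dual_task_cost(single, dual, higher_is_better=True):
    """Percentage cost of performing a task under dual-task conditions."""
    change = (single - dual) if higher_is_better else (dual - single)
    return 100.0 * change / single

# Example: gait speed (higher is better) drops from 1.20 m/s to 0.95 m/s
# when a concurrent cognitive task is added.
print(f"DTC = {dual_task_cost(1.20, 0.95):.1f}%")  # about 20.8% cost
```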
4

Ganguli, Ina, Jeffrey Lin, and Nicholas Reynolds. "The Paper Trail of Knowledge Spillovers: Evidence from Patent Interferences." American Economic Journal: Applied Economics 12, no. 2 (April 1, 2020): 278–302. http://dx.doi.org/10.1257/app.20180017.

Abstract:
We show evidence of localized knowledge spillovers using a new database of US patent interferences terminated between 1998 and 2014. Interferences resulted when two or more independent parties submitted identical claims of invention nearly simultaneously. Following the idea that inventors of identical inventions share common knowledge inputs, interferences provide a new method for measuring knowledge spillovers. Interfering inventors are 1.4 to 4.0 times more likely to live in the same local area than matched control pairs of inventors. They are also more geographically concentrated than citation-linked inventors. Our results emphasize geographic distance as a barrier to tacit knowledge flows. (JEL D83, O31, O33, O34)
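The headline comparison above, interfering inventor pairs being 1.4 to 4.0 times more likely to co-locate than matched control pairs, reduces to a ratio of co-location shares. The sketch below illustrates that calculation with made-up counts; it is not the authors' estimator or their data.

```python
# Rough illustration of the comparison described in the abstract: the share of
# interfering inventor pairs in the same local area divided by the same share
# among matched control pairs. All counts here are invented.
def colocation_ratio(treated_same, treated_total, control_same, control_total):
    p_treated = treated_same / treated_total   # co-location rate, interfering pairs
    p_control = control_same / control_total   # co-location rate, matched controls
    return p_treated / p_control

# e.g. 60 of 300 interfering pairs co-located vs. 25 of 300 control pairs
print(round(colocation_ratio(60, 300, 25, 300), 2))  # 2.4, inside the 1.4-4.0 range
```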
5

Visser, Daniel, and Niall R. Whitty. "The Role of Interest in Unjustified Enrichment Claims." Edinburgh Law Review 25, no. 1 (January 2021): 48–88. http://dx.doi.org/10.3366/elr.2021.0673.

Abstract:
This essay addresses the question: when should pre-citation interest be awarded in actions for unjustified enrichment in Scots law? The answer depends mainly on the definition of the elements of enrichment liability, the manner of acquiring the enrichment, the type of enrichment-debtor, and his or her state of mind. The essay argues that (a) generally the actual interest earned (or saved) should be awarded, aided by a rebuttable presumption that interest was earned at a specified rate; (b) interest should normally be awarded at market rates where the defender knows that s/he holds the money or asset unjustifiably; and (c) in enrichment by interference with the pursuer's rights to money or other assets, an interest award might represent the time-value of exercising those rights during the period of interference.
6

Al-Emran, Sulaiman, and Rakan Barakati. "A Method for Stabilizing a Lingual Fixed Retainer in Place Prior to Bonding." Journal of Contemporary Dental Practice 8, no. 7 (2007): 108–13. http://dx.doi.org/10.5005/jcdp-8-7-108.

Abstract:
Aim The objective of this article is to present a simple technique for stabilizing a lingual fixed retainer wire in place with good adaptation to the teeth surfaces and checking for occlusal interferences prior to the bonding procedure. Background Bonding of an upper or lower fixed lingual retainer using stainless steel wires of different sizes and shapes is a common orthodontic procedure. The retainer can be constructed in a dental laboratory, made at chair side, or it can be purchased in prefabricated form. All three ways of creating a fixed retainer are acceptable. However, the method of holding the retainer wire in place adjacent to the lingual surfaces of the teeth before proceeding with the bonding process remains a problem for some practitioners. Report The lingual fixed retainer was fabricated using three pieces of 0.010" steel ligature wire which were twisted into a single-strand wire. Another four to five 0.010" pieces of steel ligature wire were twisted in the same way to serve as an anchor wire from the labial side of the teeth. The retainer wire was bonded using flowable composite. Summary The technique presented here for stabilizing the retainer wire prior to bonding provides good stabilization, adaptation, and proper positioning of the retainer wire while eliminating contamination of etched surfaces which might arise during wire positioning before bonding. This technique also allows the clinician the opportunity to check the occlusion and adjust the retainer wire to avoid occlusal interference prior to bonding maxillary retainers. This same clinical strategy can be used to stabilize wires for splinting periodontally affected teeth and traumatized teeth. Citation Al-Emran S, Barakati R. A Method for Stabilizing a Lingual Fixed Retainer in Place Prior to Bonding. J Contemp Dent Pract 2007 November;(8)7:108-113.
7

Russ, John, and Chris Russ. "Removing Halftone Patterns From Scanned Images." Microscopy Today 8, no. 7 (September 2000): 22–24. http://dx.doi.org/10.1017/s1551929500054638.

Abstract:
Using illustrations from published technical articles is part of many lectures and presentations, and given the availability of flatbed scanners, should be easy to accomplish. The problem that remains is removing halftone patterns and other periodic noise that result from printing and scanning technology. Practically all magazines and newspapers are printed using a regular halftone pattern that uses an array of dots varied in size to produce the visual illusion of continuous gray scale. Color images are usually printed with three, four or even more such patterns using different colored inks and different pattern orientations. Scanning such images into a computer can introduce a further pattern due to the moiré interference between the printed pattern and the spacing of the sensors in the scanner. Such patterns are also characteristic of images obtained from single-chip video cameras because of the color filters present in front of the light sensors on the chip.
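One standard way to attack the periodic noise described above, though not necessarily the authors' exact procedure, is to move the scan into the frequency domain and notch out the isolated spectral peaks produced by the halftone and moiré carriers. The sketch below assumes a 2-D grayscale array; the function name, threshold, and `scan` variable are placeholders.

```python
# Hedged sketch of frequency-domain removal of halftone/moire patterns.
import numpy as np

def suppress_periodic_noise(image, keep_radius=20, peak_factor=5.0):
    """Zero out isolated spectral peaks outside a protected low-frequency core."""
    F = np.fft.fftshift(np.fft.fft2(image.astype(float)))
    mag = np.abs(F)
    h, w = image.shape
    yy, xx = np.ogrid[:h, :w]
    dist = np.hypot(yy - h // 2, xx - w // 2)
    core = dist <= keep_radius                       # keep the continuous-tone content
    peaks = (mag > peak_factor * np.median(mag)) & ~core
    F[peaks] = 0.0                                   # notch out the periodic carriers
    return np.real(np.fft.ifft2(np.fft.ifftshift(F)))

# Usage (assuming `scan` is a 2-D grayscale array from a scanned halftone print):
# cleaned = suppress_periodic_noise(scan)
```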
8

Schwarzwald, Ora (Rodrigue). "Linguistic Variations in Early Ladino Translations." Journal of Jewish Languages 2, no. 1 (June 9, 2014): 1–48. http://dx.doi.org/10.1163/22134638-12340023.

Abstract:
The differences between early Ladino liturgical translations and halakhic translations, both of which were based on Hebrew sources, are analyzed in this study. The liturgical translations include the Bible, Pirke Avot, the Passover Haggadah, and the Siddur as well as biblical citations in these sources. The halakhic translations include Mesa de el alma (Shulḥan Hapanim in Hebrew), which is a translation of Shulḥan Arukh, the translations of Ḥovat Halevavot, and the halakhic instructions in the prayer books. While there are no significant variations in orthography between the two kinds of translations and morphology demonstrates few differences, syntax, discourse analysis, and lexicon reveal great variability. The halakhic translations demonstrate simplification, explicitation, normalization, and a small amount of interference, whereas the liturgical translations adhere to very strict norms of word-for-word translation. It was also found in both kinds of texts that the western translations from Italy and the Netherlands done by former converted Jews (anusim) follow Spanish norms more than the eastern Ladino conventions of the Jews in the Balkans and Asia Minor.
9

Simmons, Edlyn S., and Bettina D. Spahl. "Of submarines and interference: legal status changes following citation of an earlier US patent or patent application under 35 USC §102 (e)." World Patent Information 22, no. 3 (September 2000): 191–203. http://dx.doi.org/10.1016/s0172-2190(00)00046-6.

10

Chen, Liang, Liisa Heikkinen, Changliang Wang, Yang Yang, Huiyan Sun, and Garry Wong. "Trends in the development of miRNA bioinformatics tools." Briefings in Bioinformatics 20, no. 5 (June 17, 2019): 1836–52. http://dx.doi.org/10.1093/bib/bby054.

Abstract:
MicroRNAs (miRNAs) are small noncoding RNAs that regulate gene expression via recognition of cognate sequences and interference of transcriptional, translational or epigenetic processes. Bioinformatics tools developed for miRNA study include those for miRNA prediction and discovery, structure, analysis and target prediction. We manually curated 95 review papers and ∼1000 miRNA bioinformatics tools published since 2003. We classified and ranked them based on citation number or PageRank score, and then performed network analysis and text mining (TM) to study the miRNA tools development trends. Five key trends were observed: (1) miRNA identification and target prediction have been hot spots in the past decade; (2) manual curation and TM are the main methods for collecting miRNA knowledge from literature; (3) most early tools are well maintained and widely used; (4) classic machine learning methods retain their utility; however, novel ones have begun to emerge; (5) disease-associated miRNA tools are emerging. Our analysis yields significant insight into the past development and future directions of miRNA tools.
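As a toy illustration of the ranking step mentioned in this abstract, the sketch below scores a small citation graph with PageRank using networkx. The tool names and edges are invented placeholders, not the authors' curated data.

```python
# Illustrative only: rank tools by PageRank over a citation graph.
import networkx as nx

# A directed edge u -> v means "tool/paper u cites tool/paper v".
citations = [
    ("tool_A", "tool_B"), ("tool_C", "tool_B"), ("tool_D", "tool_B"),
    ("tool_D", "tool_A"), ("tool_B", "tool_E"), ("tool_C", "tool_E"),
]
G = nx.DiGraph(citations)
scores = nx.pagerank(G, alpha=0.85)  # standard damping factor
for tool, score in sorted(scores.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{tool}\t{score:.3f}")
```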
11

Kringel, Dario, Sebastian Malkusch, and Jörn Lötsch. "Drugs and Epigenetic Molecular Functions. A Pharmacological Data Scientometric Analysis." International Journal of Molecular Sciences 22, no. 14 (July 6, 2021): 7250. http://dx.doi.org/10.3390/ijms22147250.

Abstract:
Interactions of drugs with the classical epigenetic mechanism of DNA methylation or histone modification are increasingly being elucidated mechanistically and used to develop novel classes of epigenetic therapeutics. A data science approach is used to synthesize current knowledge on the pharmacological implications of epigenetic regulation of gene expression. Computer-aided knowledge discovery for epigenetic implications of current approved or investigational drugs was performed by querying information from multiple publicly available gold-standard sources to (i) identify enzymes involved in classical epigenetic processes, (ii) screen original biomedical scientific publications including bibliometric analyses, (iii) identify drugs that interact with epigenetic enzymes, including their additional non-epigenetic targets, and (iv) analyze computational functional genomics of drugs with epigenetic interactions. PubMed database search yielded 3051 hits on epigenetics and drugs, starting in 1992 and peaking in 2016. Annual citations increased to a plateau in 2000 and show a downward trend since 2008. Approved and investigational drugs in the DrugBank database included 122 compounds that interacted with 68 unique epigenetic enzymes. Additional molecular functions modulated by these drugs included other enzyme interactions, whereas modulation of ion channels or G-protein-coupled receptors were underrepresented. Epigenetic interactions included (i) drug-induced modulation of DNA methylation, (ii) drug-induced modulation of histone conformations, and (iii) epigenetic modulation of drug effects by interference with pharmacokinetics or pharmacodynamics. Interactions of epigenetic molecular functions and drugs are mutual. Recent research activities on the discovery and development of novel epigenetic therapeutics have passed successfully, whereas epigenetic effects of non-epigenetic drugs or epigenetically induced changes in the targets of common drugs have not yet received the necessary systematic attention in the context of pharmacological plasticity.
12

McNeil, C. M., and E. A. Musgrove. "Review of: c-Myc suppresses p21WAF1/CIP1 expression during oestrogen signalling and antioestrogen resistance in human breast cancer cells." Breast Cancer Online 9, no. 5 (March 29, 2006): 1–4. http://dx.doi.org/10.1017/s1470903106004925.

Abstract:
Citation of original article: S. Mukherjee, S. E. Conrad. Journal of Biological Chemistry 2005; 280: 17616–17625. Abstract of the original article: Oestrogen rapidly induces expression of the proto-oncogene c-Myc. c-Myc is required for oestrogen-stimulated proliferation of breast cancer cells, and deregulated c-Myc expression has been implicated in antioestrogen resistance. In this report, we investigate the mechanism(s) by which c-Myc mediates oestrogen-stimulated proliferation and contributes to cell cycle progression in the presence of antioestrogen. The MCF-7 cell line is a model of oestrogen-dependent, antioestrogen-sensitive human breast cancer. Using stable MCF-7 derivatives with inducible c-Myc expression, we demonstrated that in antioestrogen-treated cells, the elevated mRNA and protein levels of p21WAF1/CIP1, a cell cycle inhibitor, decreased upon either c-Myc induction or oestrogen treatment. Expression of p21 blocked c-Myc-mediated cell cycle progression in the presence of antioestrogen, suggesting that the decrease in p21WAF1/CIP1 is necessary for this process. Using RNA interference to suppress c-Myc expression, we further established that c-Myc is required for oestrogen-mediated decreases in p21WAF1/CIP1. Finally, we observed that neither c-Myc nor p21WAF1/CIP1 is regulated by oestrogen or antioestrogen in an antioestrogen-resistant MCF-7 derivative. The p21 levels in the antioestrogen-resistant cells increased when c-Myc expression was suppressed, suggesting that loss of p21 regulation was a consequence of constitutive c-Myc expression. Together, these studies implicate p21WAF1/CIP1 as an important target of c-Myc in breast cancer cells and provide a link between oestrogen, c-Myc, and the cell cycle machinery. They further suggest that aberrant c-Myc expression, which is frequently observed in human breast cancers, can contribute to antioestrogen resistance by altering p21WAF1/CIP1 regulation.
13

Almas, Khalid, Abdullah Al-Hawish, and Waheed Al-Khamis. "Oral Hygiene Practices, Smoking Habits, and Self-Perceived Oral Malodor Among Dental Students." Journal of Contemporary Dental Practice 4, no. 4 (2003): 77–90. http://dx.doi.org/10.5005/jcdp-4-4-77.

Abstract:
The aims of this study were to determine the prevalence of oral hygiene practices and halitosis among undergraduate students from King Saud University, College of Dentistry. A self-administered questionnaire was distributed among all 481 students; 263 male and 218 female students. A questionnaire was developed to assess the self-reported perception of oral breath, awareness of bad breath, timing of bad breath, treatment received for bad breath, oral hygiene practices, caries and bleeding gums, dryness of the mouth, smoking and tea drinking habits, and tongue coating. The response rate was 77%. Forty-four percent of male and 32% of female students reported the self-perception of breath odor. Self-treatment was sought by 12% of males and 22% of females. Six percent of males and 4% of females experienced bad breath interference at their work. Seventy-eight percent of male and 62% of female students experienced bad breath after waking up. Brushing was prevalent among 81% of male and 99% of female students. Both miswak (chewing sticks) and tooth brushing were used by 53% of male and 83% of female students. Fifty-seven percent of male students and 44% of female students reported caries. Bleeding gingiva was experienced by 26% of males and 14% of females. Dry mouth was common among 14% of males and 17% of females, while smoking was prevalent among 13% of males and 2% of females. Tea drinking was common among 44% of males and 37% of females, while tongue coating was equally common among both males and females (21% and 20%, respectively). The results indicate female students had better oral hygiene practices, significantly less self-reported oral bad breath, and smoked less compared to male students. There was no difference in tongue coating among male and female students. Further research is needed to examine oral malodor clinically and objectively by the standard procedures available. Students should be motivated to be role models for oral health and to keep their mouths free from oral malodor. Citation Almas K, Al-Hawish A, Al-Khamis W. Oral Hygiene Practices, Smoking Habits, and Self-Perceived Oral Malodor Among Dental Students. J Contemp Dent Pract 2003 November;(4)4:077-090.
14

Lovink, Geert, Greg Hearn, and David Marshall. "M/C Event: Directions for Cyberculture in the New Economy." M/C Journal 3, no. 3 (June 1, 2000). http://dx.doi.org/10.5204/mcj.1854.

Abstract:
An M/C Event held in the Conference Room of the University of Queensland Library on 12 May 2000. The early, mythological phase of digital culture is now rapidly running out of its utopian energies. Law and order are taking command over the last pockets of digital wilderness. The taming of the cyberculture by "click 'n mortar" businesses and their willing government executors took only a few years. The time of institutionalization, mega mergers and security paranoia has arrived. These new conditions, driven by the current hyper-growth, have an as yet invisible effect on the cultural new media sector (arts, design, education), which had perceived itself for so long as "ahead of the wave". To prevent the Internet from turning into a nightmare (from which it then has to awake), neither the utopian vision has to be eliminated, nor do we need to withdraw to the apocalyptic pole, which states that the world and its network will collapse anyhow -- with or without our interference. The conflict between utopia and negativism needs to be played out. The deeper we are drawn into the Virtual, the more there is a need to stage its inherent paradoxes and contradictions. But how? The recordings from this event are available in RealAudio and Windows Media streaming audio formats: Introduction and Acknowledgements (Axel Bruns); "Directions for Cyberculture in the New Economy" (Geert Lovink, media scholar and activist); First Response (Greg Hearn); Second Response (David Marshall); Reply to Respondents (Geert Lovink); and Audience Questions and Answers, Parts 1 and 2 (Geert Lovink, Greg Hearn, and David Marshall). Acknowledgements: Geert Lovink visited Brisbane as a participant in Alchemy, an International Masterclass for New Media Artists and Curators, which was organised by the Australian Network for Art and Technology in association with the Brisbane Powerhouse - Centre for the Live Arts from 8 May to 9 June 2000. M/C and the Media and Cultural Studies Centre are highly grateful to ANAT and Geert Lovink as well as the Australian Key Centre for Cultural and Media Policy for making this event possible. Citation reference for this article MLA style: Geert Lovink, with Greg Hearn and David Marshall. "Directions for Cyberculture in the New Economy." M/C: A Journal of Media and Culture 3.3 (2000). [your date of access] <http://www.api-network.com/mc/0006/cyberculture.php>. Chicago style: Geert Lovink, with Greg Hearn and David Marshall, "Directions for Cyberculture in the New Economy," M/C: A Journal of Media and Culture 3, no. 3 (2000), <http://www.api-network.com/mc/0006/cyberculture.php> ([your date of access]). APA style: Geert Lovink, with Greg Hearn and David Marshall. (2000) Directions for cyberculture in the new economy. M/C: A Journal of Media and Culture 3(3). <http://www.api-network.com/mc/0006/cyberculture.php> ([your date of access]).
15

Henley, Nadine. "Free to Be Obese in a ‘Super Nanny State’?" M/C Journal 9, no. 4 (September 1, 2006). http://dx.doi.org/10.5204/mcj.2651.

Abstract:
“Live free or die!” (New Hampshire State motto) Should individuals be free to make lifestyle decisions (such as what, when and how much to eat and how much physical activity to take), without undue interference from the state, even when their decisions may lead to negative consequences (obesity, heart disease, diabetes)? The UN Declaration of Human Rights enshrines the belief that “All human beings are born free and equal in dignity and rights”. The philosophy of Libertarianism (Locke) proposes that rights can be negative (e.g. the freedom to be free from outside interference) as well as positive (e.g. the right to certain benefits supplied by others). Robert Nozick, a proponent of Libertarianism, has argued that we have the right to make informed decisions about our lives without unnecessary interference. This entitlement requires that we exercise our rights only as far as they do not infringe the rights of others. The popular notion of the “Nanny State” (often used derogatively) is discussed, and the metaphor is extended to draw on the Super Nanny phenomenon, a reality television series that has been shown in numerous countries including the UK, the US, and Australia. It is argued in this paper that social marketing, when done well, can help create a “Super Nanny State” (implying positive connotations). In the “Nanny State” people are told what to do; in the “Super Nanny State” people are empowered to make healthier decisions. Social marketing applies commercial marketing principles to “sell” ideas (rather than goods or services) with the aim of improving the welfare of individuals and/or society. Where the common good may not be easily discerned, Donovan and Henley recommended using the UN Declaration of Human Rights as the baseline reference point. Social marketing is frequently used to persuade individuals to make healthier lifestyle decisions such as “eat less [saturated] fat”, “eat two fruits and five veg a day”, “find thirty minutes of physical activity a day”. Recent medical gains in immunisation, sanitation and treating infectious diseases mean that the health of a population can now be more improved by influencing lifestyle decisions than by treating illness (Rothschild). Social marketing activities worldwide are directed at influencing lifestyle decisions to prevent or minimise lifestyle diseases. “Globesity” is the new epidemic (Kline). Approximately one billion people globally are overweight or obese (compared to 850 million who are underweight); most worryingly, about 10% of children worldwide are now overweight or obese with rising incidence of type 2 diabetes in this population (Yach, Stuckler, and Brownwell). “Nanny state” is a term people often use derogatively to refer to government intervention (see Henley and Jackson). Knag (405) made a distinction between old-style, authoritarian “paternalism”, which chastised the individual using laws and sanctions, and a newer “maternalism” or “nanny state” which smothers the individual with “education and therapy (or rather, propaganda and regulation)”. Knag’s use of the term “Nanny State” still has pejorative connotations. In the “Nanny State”, governments are seen as using the tool of social marketing to tell people what they should and shouldn’t do, as if they were children being supervised by a nanny. 
At the extreme, people may be afraid that social marketing could be used by the State as a way to control the thoughts of the vulnerable, a view expressed some years ago by participants in a survey of attitudes towards social marketing (Laczniak, Lusch, and Murphy). More recently, the debate is more likely to focus on why social marketing often appears to be ineffective, rather than frighteningly effective (Hastings, Stead, and Macintosh). Another concern is the high level of fear being generated by much of the social marketing effort (Hastings and MacFadyen; Henley). It is as if nanny thinks she must scream at her children all the time to warn them that they will die if they don’t listen to her. However, by extension, I am suggesting that the “Super Nanny State” metaphor could have positive associations, with an authoritative (rather than authoritarian) parenting figure, one who explains appropriate sanctions (laws and regulations) but who is also capable of informing, inspiring and empowering. Still, the Libertarian ethical viewpoint would question whether governments, through social marketers, have the right to try to influence people’s lifestyle decisions such as what and how much to eat, how much to exercise, etc. In the rise of the “Nanny State”, Holt argued that governments are extending the range of their regulatory powers, restricting free markets and intruding into areas of personal responsibility, all under the guise of acting for the public’s good. A number of arguments, discussed below, can be proposed to justify interference by the State in the lifestyle decisions of individuals. The Economic Argument One argument that is often quoted to justify interference by the State is that the economic costs of allowing unsafe/unhealthy behaviours have to be borne by the community. It has been estimated in the US that medical costs relating to diabetes (which is associated directly with obesity) increased from $44 billion to $92 billion in five years (Yach, et al). The economic argument can be useful for persuading governments to invest in prevention but is not sufficient as a fundamental justification for interference. If we say that we want people to eat more healthily because their health costs will be burdensome to the community, we imply that we would not ask them to do so if their health costs were not burdensome, even if they were dying prematurely as a result. The studies relating to the economic costs of obesity have not been as extensive as those relating to the economic costs of tobacco (Yach, et al), where some have argued that prematurely dying of smoking-related diseases is less costly to the State than the costs incurred in living to old age (Barendregt, et al). This conclusion has been disputed (Rasmussen et al), but even if true, would not provide sufficient justification to cease tobacco control efforts. Similarly, I think people would expect social marketing efforts relating to nutrition and physical activity to continue even if an economic analysis showed that people dying prematurely from obesity-related illnesses were costing the State less overall in health care costs than people living an additional twenty years. The Consumer Protection Argument Some degree of interference by the State is desirable and often necessary because people are not entirely self-reliant in every circumstance (Mead). 
The social determinants of health (Marmot and Wilkinson) are sufficiently well-understood to justify government regulation to reduce inequalities in housing, education, access to health services, etc. Implicit in the criticism that the “Nanny State” treats people like children is the assumption that children are treated without dignity and respect. The positive parent or “Super Nanny” treats children with respect but recognises their vulnerability in unfamiliar or dangerous contexts. A survey of opinion in the UK in 2004 by the King’s Fund, an independent think tank, found that the public generally supported government initiatives to encourage healthier school meals; ensure cheaper fruit and vegetables; pass laws to limit salt, fat and sugar in foods; stop advertising junk foods for children and regulate for nutrition labels on food (UK public wants a “Nanny State”). The UK’s recently established National Social Marketing Centre has made recommendations for social marketing strategies to improve public health and Prime Minister Tony Blair has responded by making public health, especially the growing obesity problem, a central issue for government initiatives, offering a “helping hand” approach (Triggle). The Better Alternative Argument Wikler considered the case for more punitive government intervention in the obesity debate by weighing the pros and cons of an interesting strategy: the introduction of a “fat tax” that would require citizens to be weighed and, if found to be overweight, require them to pay a surcharge. He concluded that this level of state interference would not be justified because there are other ways to appeal to the risk-taker’s autonomy, through education and therapeutic efforts. Governments can use social marketing as one of these better alternatives to punitive sanctions. The Level Playing Field Argument Social marketers argue that many lifestyle behaviours are not entirely voluntary (O’Connell and Price). For example, it is argued that an individual’s choices about eating fast food, consuming sweetened soft drinks, and living sedentary lives have already been partially determined by commercial efforts. Thus, they argue that social marketing efforts are intended to level the playing field – educate, inform, and restore true personal autonomy to people, enabling them to make rational choices (Smith). For example, Kline’s media education program in Canada, with a component of “media risk reduction”, successfully educated young consumers (elementary school children) with strategies for “tuning out” by asking them to come up with a plan for what they would do if they “turned off TV, video games and PCs for a whole week?” (p. 249). The “tune out challenge” resulted in a reduction of media exposure (80%) displaced into active leisure pursuits. A critical aspect of this intervention was the contract drawn up in advance, with the children setting their own goals and strategies (Kline). In this view, the state is justified in trying to level the playing field, by using social marketing to offer information as well as alternative, healthier choices that can be freely accepted or rejected (Rothschild). Conclusion A real concern is that when people are treated like children, they become like children, retaining their desires and appetites but abdicating responsibility for their individual choices to the state (Knag). Some smokers, for example, declare that they will continue to smoke until the government bans smoking (Brown). 
Governments and social marketers have a responsibility to fund/design campaigns so that the audience views the message as informative rather than proscriptive. Joffe and Mindell (967) advocated the notion of a “canny state” with “less reliance on telling people what to do and more emphasis on making healthy choices easier”. Finally, one of the central tenets of marketing is the concept of “exchange” – the marketer must identify the benefits to be gained from buying a product. In social marketing terms, interference in an individual’s right to act freely can be effective and justified when the benefits are clearly identifiable and credible. Rothschild described marketing’s role as providing a middle point between libertarianism and paternalism, offering free choice and incentives to behave in ways that benefit the common good. Rather than shaking a finger at the individual (along the lines of earlier “Don’t Do Drugs” campaigns), the “Super Nanny” state, via social marketing, can inform and engage individuals in ways that make healthier choices more appealing and the individual feel more empowered to choose them. References Barendregt, J.J., L. Bonneux, O.J. van der Maas. “The Health Care Costs of Smoking.” New England Journal of Medicine 337.15 (1997): 1052-7. Brown, D. Depressed Men: Angry Women: Non-Stereotypical Gender Responses to Anti-Smoking Messages in Older Smokers. Unpublished Masters dissertation, Edith Cowan University, Perth, Western Australia, 2001. Donovan, R., and N. Henley. Social Marketing: Principles and Practice. Melbourne: IP Communications, 2003. Joffe, M., and J. Mindell. “A Tentative Step towards Healthy Public Policy.” Journal of Epidemiology and Community Health 58 (2004): 966-8. Hastings, G.B., and L. MacFadyen. “The Limitations of Fear Messages.” Tobacco Control 11 (2002): 73-5. Hastings, G.B., M. Stead, and A.M. Macintosh. “Rethinking Drugs Prevention: Radical Thoughts from Social Marketing.” Health Education Journal 61.4 (2002): 347-64. Henley, N. “You Will Die! Mass Media Invocations of Existential Dread.” M/C Journal 5.1 (2002). 1 May 2006 http://journal.media-culture.org.au/0203/youwilldie.php>. Henley, N., and J. Jackson. “Is It ‘Too Bloody Late’? Older People’s Response to the National Physical Activity Guidelines.” Journal of Research for Consumers 10 (2006). 7 Aug. 2006 <http://www.jrconsumers.com/_data/page/3180/ NPAGs_paper_consumer_version_may_06.pdf>. Holt, T. The Rise of the Nanny State: How Consumer Advocates Try to Run Our Lives. US: Capital Research Centre, 1995. Kline, S. “Countering Children’s Sedentary Lifestyles: An Evaluative Study of a Media-Risk Education Approach.” Childhood 12.2 (2005): 239-58. Knag, S. “The Almighty, Impotent State: Or, the Crisis of Authority.” Independent Review 1.3 (1997): 397-413. Laczniak, G.R., R.F. Lusch, and P. Murphy. “Social Marketing: Its Ethical Dimensions.” Journal of Marketing 43 (Spring 1979): 29-36. Locke, J. An Essay Concerning Human Understanding. Ed. J.W. Yolton. London: J.M. Dent & Sons, 1690/1961. Marmot, M.G., and R.G. Wilkinson, R.G., eds. Social Determinants of Health. Oxford: Oxford University Press, 1999. Mead, L. “Telling the Poor What to Do.” Public Interest 6 Jan. 1998. 1 May 2006 <http://www.polisci.wisc.edu/~soss/Courses/PA974/Readings/week%208/Mead_1998.pdf>. National Social Marketing Centre. It’s Our Health! Realising the Potential of Effective Social Marketing. Summary Report. 7 Aug. 2006 http://www.nsms.org.uk/images/CoreFiles/NCCSUMMARYItsOurHealthJune2006.pdf>. Nozick, R. 
Anarchy, State and Utopia. New York: Basic Books, 1974. O’Connell, J.K., and J.H. Price. “Ethical Theories for Promoting Health through Behavioral Change.” Journal of School Health 53.8 (1983): 476-9. Rasmussen, S.R., E. Prescott, T.I.A. Sorensen, and J. Sogaard. “The Total Lifetime Costs of Smoking.” European Journal of Public Health 14 (2004): 95-100. Rothschild, M. “Carrots, Sticks, and Promises: A Conceptual Framework for the Management of Public Health and Social Issue Behaviors.” Journal of Marketing 63.4 (1999): 24-37. Smith, A. “Setting a Strategy for Health.” British Medical Journal 304.6823 (8 Feb. 1992): 376-9. Triggle, N. “From Nanny State to a Helping Hand.” BBC News 25 July 2006. 9 Aug. 2006 <http://news.bbc.co.uk/1/hi/health/5214276.stm>. “UK Public Wants a ‘Nanny State’.” BBC News 28 June 2004. 9 Aug. 2006 <http://news.bbc.co.uk/1/hi/health/3839447.stm>. United Nations, Office of the High Commissioner of Human Rights. Universal Declaration of Human Rights. 18 Sep. 2001 <http://www.unhchr.ch/udhr/lang/eng.htm>. Wikler, D. “Persuasion and Coercion for Health: Ethical Issues in Government Efforts to Change Life-Styles.” Milbank Memorial Fund Quarterly, Health and Society 56.3 (1978): 303-38. Yach, D., D. Stuckler, and K.D. Brownell. “Epidemiological and Economic Consequences of the Global Epidemics of Obesity and Diabetes.” Nature Medicine 12.1 (2006): 62-6. Citation reference for this article MLA Style Henley, Nadine. "Free to Be Obese in a ‘Super Nanny State’?" M/C Journal 9.4 (2006). [your date of access] <http://journal.media-culture.org.au/0609/6-henley.php>. APA Style Henley, N. (Sep. 2006) "Free to Be Obese in a ‘Super Nanny State’?," M/C Journal, 9(4). Retrieved [your date of access] from <http://journal.media-culture.org.au/0609/6-henley.php>.
16

Kabir, Nahid, and Mark Balnaves. "Students “at Risk”: Dilemmas of Collaboration." M/C Journal 9, no. 2 (May 1, 2006). http://dx.doi.org/10.5204/mcj.2601.

Abstract:
Introduction I think the Privacy Act is a huge edifice to protect the minority of things that could go wrong. I’ve got a good example for you, I’m just trying to think … yeah the worst one I’ve ever seen was the Balga Youth Program where we took these students on a reward excursion all the way to Fremantle and suddenly this very alienated kid started to jump under a bus, a moving bus so the kid had to be restrained. The cops from Fremantle arrived because all the very good people in Fremantle were alarmed at these grown-ups manhandling a kid and what had happened is that DCD [Department of Community Development] had dropped him into the program but hadn’t told us that this kid had suicide tendencies. No, it’s just chronically bad. And there were caseworkers involved and … there is some information that we have to have that doesn’t get handed down. Rather than a blanket rule that everything’s confidential coming from them to us, and that was a real live situation, and you imagine how we’re trying to handle it, we had taxis going from Balga to Fremantle to get staff involved and we only had to know what to watch out for and we probably could have … well what you would have done is not gone on the excursion I suppose (School Principal, quoted in Balnaves and Luca 49). These comments are from a school principal in Perth, Western Australia in a school that is concerned with “at-risk” students, and in a context where the Commonwealth Privacy Act 1988 has imposed limitations on their work. Under this Act it is illegal to pass health, personal or sensitive information concerning an individual on to other people. In the story cited above the Department of Community Development personnel were apparently protecting the student’s “negative right”, that is, “freedom from” interference by others. On the other hand, the principal’s assertion that such information should be shared is potentially a “positive right” because it could cause something to be done in that person’s or society’s interests. Balnaves and Luca noted that positive and negative rights have complex philosophical underpinnings, and they inform much of how we operate in everyday life and of the dilemmas that arise (49). For example, a ban on euthanasia or the “assisted suicide” of a terminally ill person can be a “positive right” because it is considered to be in the best interests of society in general. However, physicians who tacitly approve a patient’s right to end their lives with a lethal dose by legally prescribed dose of medication could be perceived as protecting the patient’s “negative right” as a “freedom from” interference by others. While acknowledging the merits of collaboration between people who are working to improve the wellbeing of students “at-risk”, this paper examines some of the barriers to collaboration. Based on both primary and secondary sources, and particularly on oral testimonies, the paper highlights the tension between privacy as a negative right and collaborative helping as a positive right. It also points to other difficulties and dilemmas within and between the institutions engaged in this joint undertaking. The authors acknowledge Michel Foucault’s contention that discourse is power. The discourse on privacy and the sharing of information in modern societies suggests that privacy is a negative right that gives freedom from bureaucratic interference and protects the individual. 
However, arguably, collaboration between agencies that are working to support individuals “at-risk” requires a measured relaxation of the requirements of this negative right. Children and young people “at-risk” are a case in point. Towards Collaboration From a series of interviews conducted in 2004, the school authorities at Balga Senior High School and Midvale Primary School, people working for the Western Australian departments of Community Development, Justice, and Education and Training in Western Australia, and academics at the Edith Cowan and Curtin universities, who are working to improve the wellbeing of students “at-risk” as part of an Australian Research Council (ARC) project called Smart Communities, have identified students “at-risk” as individuals who have behavioural problems and little motivation, who are alienated and possibly violent or angry, who under-perform in the classroom and have begun to truant. They noted also that students “at-risk” often suffer from poor health, lack of food and medication, are victims of unwanted pregnancies, and are engaged in antisocial and illegal behaviour such as stealing cars and substance abuse. These students are also often subject to domestic violence (parents on drugs or alcohol), family separation, and homelessness. Some are depressed or suicidal. Sometimes cultural factors contribute to students being regarded as “at-risk”. For example, a social worker in the Smart Communities project stated: Cultural factors sometimes come into that as well … like with some Muslim families … they can flog their daughter or their son, usually the daughter … so cultural factors can create a risk. Research elsewhere has revealed that those children between the ages of 11-17 who have been subjected to bullying at school or physical or sexual abuse at home and who have threatened and/or harmed another person or suicidal are “high-risk” youths (Farmer 4). In an attempt to bring about a positive change in these alienated or “at-risk” adolescents, Balga Senior High School has developed several programs such as the Youth Parents Program, Swan Nyunger Sports Education program, Intensive English Centre, and lower secondary mainstream program. The Midvale Primary School has provided services such as counsellors, Aboriginal child protection workers, and Aboriginal police liaison officers for these “at-risk” students. On the other hand, the Department of Community Development (DCD) has provided services to parents and caregivers for children up to 18 years. Academics from Edith Cowan and Curtin universities are engaged in gathering the life stories of these “at-risk” students. One aspect of this research entails the students writing their life stories in a secured web portal that the universities have developed. The researchers believe that by engaging the students in these self-exploration activities, they (the students) would develop a more hopeful outlook on life. Though all agencies and educational institutions involved in this collaborative project are working for the well-being of the children “at-risk”, the Privacy Act forbids the authorities from sharing information about them. A school psychologist expressed concern over the Privacy Act: When the Juvenile Justice Department want to reintroduce a student into a school, we can’t find out anything about this student so we can’t do any preplanning. 
They want to give the student a fresh start, so there’s always that tension … eventually everyone overcomes [this] because you realise that the student has to come to the school and has to be engaged. Of course, the manner and consequences of a student’s engagement in school cannot be predicted. In the scenario described above students may have been given a fair chance to reform themselves, which is their positive right but if they turn out to be at “high risk” it would appear that the Juvenile Department protected the negative right of the students by supporting “freedom from” interference by others. Likewise, a school health nurse in the project considered confidentiality or the Privacy Act an important factor in the security of the student “at-risk”: I was trying to think about this kid who’s one of the children who has been sexually abused, who’s a client of DCD, and I guess if police got involved there and wanted to know details and DCD didn’t want to give that information out then I’d guess I’d say to the police “Well no, you’ll have to talk to the parents about getting further information.” I guess that way, recognising these students are minor and that they are very vulnerable, their information … where it’s going, where is it leading? Who wants to know? Where will it be stored? What will be the outcomes in the future for this kid? As a 14 year old, if they’re reckless and get into things, you know, do they get a black record against them by the time they’re 19? What will that information be used for if it’s disclosed? So I guess I become an advocate for the student in that way? Thus the nurse considers a sexually abused child should not be identified. It is a positive right in the interest of the person. Once again, though, if the student turns out to be at “high risk” or suicidal, then it would appear that the nurse was protecting the youth’s negative right—“freedom from” interference by others. Since collaboration is a positive right and aims at the students’ welfare, the workable solution to prevent the students from suicide would be to develop inter-agency trust and to share vital information about “high-risk” students. Dilemmas of Collaboration Some recent cases of the deaths of young non-Caucasian girls in Western countries, either because of the implications of the Privacy Act or due to a lack of efficient and effective communication and coordination amongst agencies, have raised debates on effective child protection. For example, the British Laming report (2003) found that Victoria Climbié, a young African girl, was sent by her parents to her aunt in Britain in order to obtain a good education and was murdered by her aunt and aunt’s boyfriend. However, the risk that she could be harmed was widely known. The girl’s problems were known to 6 local authorities, 3 housing authorities, 4 social services, 2 child protection teams, and the police, the local church, and the hospital, but not to the education authorities. According to the Laming Report, her death could have been prevented if there had been inter-agency sharing of information and appropriate evaluation (Balnaves and Luca 49). The agencies had supported the negative rights of the young girl’s “freedom from” interference by others, but at the cost of her life. Perhaps Victoria’s racial background may have contributed to the concealment of information and added to her disadvantaged position. 
Similarly, in Western Australia, the Gordon Inquiry into the death of Susan Taylor, a 15 year old girl Aboriginal girl at the Swan Nyungah Community, found that in her short life this girl had encountered sexual violation, violence, and the ravages of alcohol and substance abuse. The Gordon Inquiry reported: Although up to thirteen different agencies were involved in providing services to Susan Taylor and her family, the D[epartment] of C[ommunity] D[evelopment] stated they were unaware of “all the services being provided by each agency” and there was a lack of clarity as to a “lead coordinating agency” (Gordon et al. quoted in Scott 45). In this case too, multiple factors—domestic, racial, and the Privacy Act—may have led to Susan Taylor’s tragic end. In the United Kingdom, Harry Ferguson noted that when a child is reported to be “at-risk” from domestic incidents, they can suffer further harm because of their family’s concealment (204). Ferguson’s study showed that in 11 per cent of the 319 case sample, children were known to be re-harmed within a year of initial referral. Sometimes, the parents apply a veil of secrecy around themselves and their children by resisting or avoiding services. In such cases the collaborative efforts of the agencies and education may be thwarted. Lack of cultural education among teachers, youth workers, and agencies could also put the “at-risk” cultural minorities into a high risk category. For example, an “at-risk” Muslim student may not be willing to share personal experiences with the school or agencies because of religious sensitivities. This happened in the UK when Khadji Rouf was abused by her father, a Bangladeshi. Rouf’s mother, a white woman, and her female cousin from Bangladesh, both supported Rouf when she finally disclosed that she had been sexually abused for over eight years. After group therapy, Rouf stated that she was able to accept her identity and to call herself proudly “mixed race”, whereas she rejected the Asian part of herself because it represented her father. Other Asian girls and young women in this study reported that they could not disclose their abuse to white teachers or social workers because of the feeling that they would be “letting down their race or their Muslim culture” (Rouf 113). The marginalisation of many Muslim Australians both in the job market and in society is long standing. For example, in 1996 and again in 2001 the Muslim unemployment rate was three times higher than the national total (Australian Bureau of Statistics). But since the 9/11 tragedy and Bali bombings visible Muslims, such as women wearing hijabs (headscarves), have sometimes been verbally and physically abused and called ‘terrorists’ by some members of the wider community (Dreher 13). The Howard government’s new anti-terrorism legislation and the surveillance hotline ‘Be alert not alarmed’ has further marginalised some Muslims. Some politicians have also linked Muslim asylum seekers with terrorists (Kabir 303), which inevitably has led Muslim “at-risk” refugee students to withdraw from school support such as counselling. Under these circumstances, Muslim “at-risk” students and their parents may prefer to maintain a low profile rather than engage with agencies. In this case, arguably, federal government politics have exacerbated the barriers to collaboration. It appears that unfamiliarity with Muslim culture is not confined to mainstream Australians. 
For example, an Aboriginal liaison police officer engaged in the Smart Communities project in Western Australia had this to say about Muslim youths “at-risk”: Different laws and stuff from different countries and they’re coming in and sort of thinking that they can bring their own laws and religions and stuff … and when I say religions there’s laws within their religions as well that they don’t seem to understand that with Australia and our laws. Such generalised misperceptions of Muslim youths “at-risk” would further alienate them, thus causing a major hindrance to collaboration. The “at-risk” factors associated with Aboriginal youths have historical connections. Research findings have revealed that indigenous youths aged between 10-16 years constitute a vast majority in all Australian States’ juvenile detention centres. This over-representation is widely recognised as associated with the nature of European colonisation, and is inter-related with poverty, marginalisation and racial discrimination (Watson et al. 404). Like the Muslims, their unemployment rate was three times higher than the national total in 2001 (ABS). However, in 1998 it was estimated that suicide rates among Indigenous peoples were at least 40 per cent higher than national average (National Advisory Council for Youth Suicide Prevention, quoted in Elliot-Farrelly 2). Although the wider community’s unemployment rate is much lower than the Aboriginals and the Muslims, the “at-risk” factors of mainstream Australian youths are often associated with dysfunctional families, high conflict, low-cohesive families, high levels of harsh parental discipline, high levels of victimisation by peers, and high behavioural inhibition (Watson et al. 404). The Macquarie Fields riots in 2005 revealed the existence of “White” underclass and “at-risk” people in Sydney. Macquarie Fields’ unemployment rate was more than twice the national average. Children growing up in this suburb are at greater risk of being involved in crime (The Age). Thus small pockets of mainstream underclass youngsters also require collaborative attention. In Western Australia people working on the Smart Communities project identified that lack of resources can be a hindrance to collaboration for all sectors. As one social worker commented: “government agencies are hierarchical systems and lack resources”. They went on to say that in their department they can not give “at-risk” youngsters financial assistance in times of crisis: We had a petty cash box which has got about 40 bucks in it and sometimes in an emergency we might give a customer a couple of dollars but that’s all we can do, we can’t give them any larger amount. We have bus/metro rail passes, that’s the only thing that we’ve actually got. A youth worker in Smart Communities commented that a lot of uncertainty is involved with young people “at-risk”. They said that there are only a few paid workers in their field who are supported and assisted by “a pool of volunteers”. Because the latter give their time voluntarily they are under no obligation to be constant in their attendance, so the number of available helpers can easily fluctuate. Another youth worker identified a particularly important barrier to collaboration: because of workers’ relatively low remuneration and high levels of work stress, the turnover rates are high. 
The consequence of this is as follows: The other barrier from my point is that you’re talking to somebody about a student “at-risk”, and within 14 months or 18 months a new person comes in [to that position] then you’ve got to start again. This way you miss a lot of information [which could be beneficial for the youth]. Conclusion The Privacy Act creates a dilemma in that it can be either beneficial or counter-productive for a student’s security. To be blunt, a youth who has suicided might have had their privacy protected, but not their life. Lack of funding can also be a constraint on collaboration by undermining stability and autonomy in the workforce, and blocking inter-agency initiatives. Lack of awareness about cultural differences can also affect unity of action. The deepening inequality between the “haves” and “have-nots” in Australian society, and the Howard government’s harshness on national security issues, can also pose barriers to collaboration on youth issues. Despite these exigencies and dilemmas, it would seem that collaboration is “the only game” when it comes to helping students “at-risk”. To enhance this collaboration, there needs to be a sensible modification of legal restrictions to information sharing, an increase in government funding and support for inter-agency cooperation and informal information sharing, and an increased awareness about the cultural needs of minority groups and knowledge of the mainstream underclass. Acknowledgments The research is part of a major Australian Research Council (ARC) funded project, Smart Communities. The authors very gratefully acknowledge the contribution of the interviewees, and thank Donald E. Scott for conducting the interviews. References Australian Bureau of Statistics. 1996 and 2001. Balnaves, Mark, and Joe Luca. “The Impact of Digital Persona on the Future of Learning: A Case Study on Digital Repositories and the Sharing of Information about Children At-Risk in Western Australia.” Paper presented at Ascilite, Brisbane (2005): 49-56. 10 April 2006 <http://www.ascilite.org.au/conferences/brisbane05/blogs/proceedings/06_Balnaves.pdf>. Dreher, Tanya. ‘Targeted’: Experiences of Racism in NSW after September 11, 2001. Sydney: University of Technology, 2005. Elliot-Farrelly, Terri. “Australian Aboriginal Suicide: The Need for an Aboriginal Suicidology?” Australian e-Journal for the Advancement of Mental Health 3.3 (2004): 1-8. 15 April 2006 <http://www.auseinet.com/journal/vol3iss3/elliottfarrelly.pdf>. Farmer, James A. High-Risk Teenagers: Real Cases and Interception Strategies with Resistant Adolescents. Springfield, Ill.: C.C. Thomas, 1990. Ferguson, Harry. Protecting Children in Time: Child Abuse, Child Protection and the Consequences of Modernity. London: Palgrave Macmillan, 2004. Foucault, Michel. Power/Knowledge: Selected Interviews and Other Writings, 1972-1977. Ed. Colin Gordon, trans. Colin Gordon et al. New York: Pantheon, 1980. Kabir, Nahid. Muslims in Australia: Immigration, Race Relations and Cultural History. London: Kegan Paul, 2005. Rouf, Khadji. “Myself in Echoes. My Voice in Song.” Ed. A. Bannister et al. Listening to Children. London: Longman, 1990. Scott, Donald E. “Exploring Communication Patterns within and across a School and Associated Agencies to Increase the Effectiveness of Service to At-Risk Individuals.” MS Thesis, Curtin University of Technology, August 2005. The Age. “Investing in People Means Investing in the Future.” The Age 5 March 2005. 15 April 2006 <http://www.theage.com.au>. 
Watson, Malcolm, et al. “Pathways to Aggression in Children and Adolescents.” Harvard Educational Review 74.4 (Winter 2004): 404-428. Citation reference for this article MLA Style Kabir, Nahid, and Mark Balnaves. "Students “at Risk”: Dilemmas of Collaboration." M/C Journal 9.2 (2006). [your date of access] <http://journal.media-culture.org.au/0605/04-kabirbalnaves.php>. APA Style Kabir, N., and M. Balnaves. (May 2006) "Students “at Risk”: Dilemmas of Collaboration," M/C Journal, 9(2). Retrieved [your date of access] from <http://journal.media-culture.org.au/0605/04-kabirbalnaves.php>.
APA, Harvard, Vancouver, ISO, and other styles
17

Gemeinboeck, Petra. "Something Third, Other." M/C Journal 6, no. 4 (August 1, 2003). http://dx.doi.org/10.5204/mcj.2241.

Full text
Abstract:
In a networked virtual world, interconnected participants are able to enter a dialogue and to interact with one another; they cannot actually do so, however, with the remote participants, but rather with the interpretation and representation of the data transferred from the remote site(s). The process of the evolving dialogue in such tele-immersive scenarios is complexly interwoven with another liquid, hybrid and oscillating process, that of ‘becoming a subject’. The actual opacity of the individual sites – in that all sources, such as the participants’ appearance, their input and in fact anything connected to the actual and physical site are only represented on the ‘other site’ – holds something ambiguous, almost uncanny, opening the scope of something third, other, in between. This article addresses the issues of disguise inherent to tele-immersive virtual environments and communication; it examines the issue of presence emerging from the interrelationships between the networked participants, their virtual representation and the underlying computer-controlled system, as well as between the virtual place and the physical location. The issue of presence and embodiment in tele-immersive virtual environments differs from other virtual social spaces, such as chat rooms and current forms of online-games, in the human-scale, three-dimensional representation of the space and the user's manifestation – the so-called avatar. One of the main purposes of networking such virtual environments is the visual and acoustic representation of remote participants as they share the same virtual space with one’s Self, and thus create a virtual meeting place somewhere in between the participants’ remote locations. In such an environment, the tele-dialogue evolves based on an almost absurd scenario of disguise: while, locally, one is limited to communicating with an electronically masked opposite, this form of dialogue also implies that one’s Self only appears to our ‘human’ opposite as its computer graphic incarnation, an avatar – with which one is nevertheless identified. In a typical example of a networked scenario, multiple copies of the environment, as well as the representations of the users (avatars) are displayed at all client sites. Yet the fact that all modes of representations appear as an exact duplicate on the ‘other side’ does not represent a system-inherent condition, but is exclusively based on the intention of the programmer/designer and/or a convention shared by users of the tele-immersive environment. One possible reason for this common convention might be found in the primary impetus behind the development of virtual reality technologies, which is the most indistinguishable and controllable electronic replication of our physical reality and its inhabitants. Assuming, however, that the shared data is based on mathematical descriptions and instructions, their form of interpretation and representation is entirely subject to the modality of the program/system – and thus, in most cases, also to the individual system of each remote recipient (client). How an environment is represented remotely and how users (avatars) appear and behave on each client’s site is thus the expression of a (possibly selected) option, whereby the convention of reproduction is only one possible choice. 
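The architectural point made here – that the shared data is only a mathematical description, and that each client decides how to interpret it – can be illustrated with a small sketch. The following Python fragment is purely hypothetical and is not drawn from any system described in this article; it simply shows the same transmitted avatar packet being rendered under two different local conventions, one replicating the remote participant and one reinterpreting the data.

# Hypothetical sketch: one shared avatar packet, two local interpretations.
# All names and mappings are invented for illustration only.
avatar_packet = {"id": "remote_participant", "position": (1.2, 0.0, -3.5), "heading": 90.0}

def render_as_replica(packet):
    # Client A follows the convention of reproduction: the remote user
    # appears exactly as described by the shared data.
    x, y, z = packet["position"]
    return f"{packet['id']} drawn as a human figure at ({x}, {y}, {z}), facing {packet['heading']} degrees"

def render_as_distortion(packet):
    # Client B reinterprets the same data: the remote user materialises
    # only as a bulge pressed into a virtual veil (an arbitrary mapping).
    x, _, z = packet["position"]
    depth = abs(z) * 0.1
    return f"veil bulges {depth:.2f} units at horizontal offset {x}"

print(render_as_replica(avatar_packet))
print(render_as_distortion(avatar_packet))

Nothing in the transmitted data privileges the first rendering over the second; the choice between replication and reinterpretation belongs to the author of each client, which is precisely the convention the article questions.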
As the temporary inhabitants are not actually able to enter the ‘other’ remote site, the virtual environment likewise cannot extend to another, remote place, but rather is generated at each local site (a server’s location, respectively). Decisions about the extent to which the data content can be reinterpreted, and in which form it is represented, establish political and hierarchical structures. Centralized network architectures, like the ‘master-slave’ model or the ‘server-client’ model, also shape our virtual architectures of communication. The politicization of the virtual terrain is thereby partly inscribed by the environment’s author and partly emerges from the opaque, disguising nature of each remote system’s signal- and data-processing. The author defines whether the participants are able to choose their own form of representation, how ‘permeable’, in general, the environment is designed to be, to what degree the user can modify its evolution and its outgoing and incoming signals, and the importance attributed to the imagination and identity of the participating co-author. However much the users are accommodated, the implementation of such a representing mediator and translator in between will never result in a ‘neutral’ system. According to the aforementioned mathematical encoding of represented realities, any structure and instruction can be modulated, re-associated or replaced, every single frame. Whether implemented as a time-based, narrative, or independently generative structure, such a potentially nonlinear sequence of dynamic, transformative events is very likely to entangle with the participants’ subjective Self and its formation of identity. For N. Katherine Hayles, Cyberspace opens common construction of body borders for transformative configurations, which always carry the trace of ‘the Other’. The simultaneous estrangement of the self from itself and its cybernetic reconstitution as ‘the Other’ produces a “diffusion of subjectivity” that “constitutes a second mirror stage: the Mirror of the Cyborg." (Hayles 1993, 186) In my tele-immersive installation Maya--Veil of Illusion, the interferences and distortions caused by the system as a third, unknown participant – the allegedly ‘other’ in the system’s own reflection of the participants’ dialogue –gain a strange, ambiguous component. The project translates the Hindu-Buddhist notion of ‘Maya’ (Sanskrit: illusion) into an elastic veil, spanned between two remote, networked sites. The relationship between the Self and the virtual representation of one’s Self and of the other remote participant becomes the cast of something third, other in the virtual layer in between. Although the veil’s distorted mirror image remains the untouchable ‘other’, it spatially materialises the other participant’s presence as it penetrates the electronic filter, occupying one’s private local space. (see figures 1,2 and 3) The combination of immersive, embodying representation, networking technology and a performative, systemic translator in between opens yet another chapter in the concept of ‘suspension of disbelief’ in contemporary media. In a tele-immersive virtual environment, the participants not only deal with a simple feedback loop (between themselves and the environment), but rather with a nesting of loops, in which the other (remote) sides are likewise involved. In other words, the participants pursue their dialogue via the dialogue with the virtual environment and the projection of their Selves. 
As addressed in Maya--Veil of Illusion, the dialogue between the remote participants becomes entirely mediated – and consequently controlled by the underlying computer system, which might be more or less transparent to the users. Click on image to see figures 1, 2 and 3: Maya--Veil of Illusion; local site in gold: EVL, UIC Chicago; remote site in blue: IAO Fraunhofer Stuttgart In their realisation, tele-immersive virtual spaces appear much more introverted than extroverted, in the sense of actually stretching across the multiple remote sites. So it seems that we don’t actually travel to distant places but rather bring them into our local environment – together with the (representations of the) remote users. Both the sensuous experience and the process of interacting take place locally, ‘at home’. Thereby, the virtual place doesn’t seem to be able to break away from its physical anchorage and the cybernetic transfer of our Self (still) seems to be somewhat uncanny. The most exciting aspect for remote users is commonly the fact that they are connected to other participants, located in Chicago and Tokyo, physical places they can relate to in their mental world map, rather than the fact that all of them actually – virtually – share the same space. Does this imply that the virtual place and the virtual body, materialised in virtually accessible environments, are coupled with one another in a similar way as we experience the physical and cultural boundaries in our daily life? It seems that space, however, will always rub against the body – in whatever form of reality. Once, during a networking event at the Ars Electronica Center (Linz, Austria) (EVL: Alive on the Grid, Ars Electronica Festival, 2001), connected to Amsterdam, Chicago, and some other remote sites worldwide, I was approached by a participant. “I don’t believe that I am really interacting with all these remote people in Europe and America. How can you prove to me that it is not just a technical fake?” Well, I can’t. Works Cited Hayles, N. Katherine. “The Seductions of Cyberspace.” In Verena Conley (ed.), Rethinking Technologies. Minneapolis: University of Minnesota Press, 1993. Alive on the Grid: <http://www.evl.uic.edu/art/art_project.php3?indi=209> Maya--Veil of Illusion: <http://www.evl.uic.edu/art/art_project.php3?indi=240> Citation reference for this article Substitute your date of access for Day Month Year etc... MLA Style Gemeinboeck, Petra. "Something Third, Other." M/C: A Journal of Media and Culture <http://www.media-culture.org.au/0308/09-something3rd.php>. APA Style Gemeinboeck, P. (2003, Aug 26). Something Third, Other. M/C: A Journal of Media and Culture, 6, <http://www.media-culture.org.au/0308/09-something3rd.php>
APA, Harvard, Vancouver, ISO, and other styles
18

Reagle Jr., Joseph. "Open Content Communities." M/C Journal 7, no. 3 (July 1, 2004). http://dx.doi.org/10.5204/mcj.2364.

Full text
Abstract:
In this brief essay I sketch the characteristics of an open content community by considering a number of prominent examples, reviewing sociological literature, and teasing apart the concepts of open and voluntary implicit in most usages of the term; I then offer a definition in which the much maligned possibility of 'forking' is actually an integral aspect of openness. Introduction What is often meant by the term 'open' is a generalization from the Free Software, Open Source and open standards movements. Communities marshaling themselves under these banners cooperatively produce, in public view, software, technical standards, or other content that is intended to be widely shared. Free Software and Open Source The Free Software movement was begun by Richard Stallman at MIT in the 1980s. Previously, computer science operated within the scientific norm of collaboration and information sharing. When Stallman found it difficult to obtain the source code of a troublesome Xerox printer, he feared that the norms of freedom and openness were being challenged by a different, proprietary, conceptualization of information. To challenge this shift he created the GNU Project in 1984 (Stallman 1998), the Free Software Foundation (FSF) in 1985 (Stallman 1996), and authored the GNU General Public License (GPL) in 1989. The goal of the GNU Project was to create a free version of the UNIX computing environment with which many computer practitioners were familiar, and to which many had even contributed, but which was increasingly being encumbered with proprietary claims. GNU is a playful form of a recursive acronym: GNU is Not Unix. The computing environment was supposed to be similar to but independent of UNIX and include everything a user needed, including an operating system kernel (e.g., Hurd) and common applications such as small utilities, text editors (e.g., EMACS) and software compilers (e.g., GCC). The FSF is now the principal sponsor of the GNU Project and focuses on administrative issues such as copyright licenses, policy, and funding; software development and maintenance is still an activity of GNU. The GPL is the FSF's famous copyright license for 'free software'; it ensures that the 'freedom' associated with being able to access and modify software is maintained with the original software and its derivations. It has important safeguards, including its famous 'viral' provision: if you modify and distribute software obtained under the GPL license, your derivation must also be publicly accessible and licensed under the GPL. In 1991, Linus Torvalds started development of Linux: a UNIX-like operating system kernel, the core computer program that mediates between applications and the underlying hardware. While it was not part of the GNU Project, and differed in design philosophy and aspiration from GNU's kernel (Hurd), it was released under the GPL. While Stallman's stance on 'freedom' is more ideological, Torvalds's approach is more pragmatic. Furthermore, other projects, such as the Apache web server, and eventually Netscape's Mozilla web browser, were being developed in open communities and under similar licenses, except that, unlike the GPL, they often permit proprietary derivations. With such a license, a company may take open source software, change it, and include it in their product without releasing their changes back to the community. The tension between the ideology of free software and its other, additional, benefits led to the concept of Open Source in 1998. 
The Open Source Initiative (OSI) was founded when, "We realized it was time to dump the confrontational attitude that has been associated with 'free software' in the past and sell the idea strictly on the same pragmatic, business-case grounds that motivated Netscape" (OSI 2003). Since the open source label is intended to cover open communities and licenses beyond the GPL, they have developed a meta (more abstract) Open Source Definition (OSI 1997) which defines openness as: free redistribution; accessible source code; permission for derived works; integrity of the author's source code; no discrimination against persons or groups; no discrimination against fields of endeavor; no NDA (Non-Disclosure Agreement) entanglements; a license that is not specific to a product; a license that does not restrict other software; and a license that is technology-neutral. A copyright license which is found by OSI to satisfy these requirements will be listed as an OSI certified/approved license, including the GPL of course. Substantively, Free Software and Open Source are not that different: the differences are of motivation, personality, and strategy. The FLOSS (Free/Libre and Open Source Software) survey of 2,784 Free/Open Source (F/OS) developers found that 18% of those who identified with the Free Software community and 9% of those who identified with the Open Source community considered the distinction to be 'fundamental' (Ghosh et al. 2002:55). Given the freedom of these communities, forking (a split of the community where work is taken in a different direction) is common to the development of the software and its communities. One can conceive of the Open Source movement as having forked from the Free Software movement. The benefits of openness are not limited to the development of software. The Internet Engineering Task Force (IETF) and World Wide Web Consortium (W3C) host the authoring of technical specifications that are publicly available and implemented by applications that must interoperably communicate over the Internet. For example, different Web servers and browsers should be able to work together using the technical specifications of HTML, which structures a Web page, and HTTP, which is used to request and send Web pages. The approach of these organizations is markedly different from the 'big S' (e.g., ISO) standards organizations, which typically predicate membership on nationality and often only provide specifications for a fee. This model of openness has extended even to forms of cultural production beyond technical content. For example, the Wikipedia is a collaborative encyclopedia and the Creative Commons provides licenses and community for supporting the sharing of texts, photos, and music. Openness and Voluntariness Organizations can be characterized along numerous criteria including size; public versus private ownership; criterion for membership; beneficiaries (cui bono); Hughes's voluntary, military, philanthropic, corporate, and family types; Parsons's social pattern variables; and Thompson and Tuden's decision making strategies, among others (Blau and Scott 1962:40). I posit that within the contemporary usage of the term 'open,' one can identify a number of defining characteristics as well as an implicit connotation of voluntariness. Openness The definition of an 'open' community in the previous section is extensional: describing the characteristics of Free/Open Software (F/OS), and open standards and content. 
While useful, this approach is incomplete because such a description is of products, not of the social organization of producers. For example, private firms do release F/OS software but this tells us little about how work is done 'in the open.' The approach of Tzouris was to borrow from the literature of 'epistemic' communities so as to provide four characteristics of 'free/open' communities: Shared normative and principled beliefs: refers to the shared understanding of the value-based rationale for contributing to the software. Shared causal beliefs: refers to the shared causal understanding or the reward structures. Therefore, shared causal beliefs have a coordinating effect on the development process. Shared notions of validity: refers to contributors' consensus that the adopted solution is a valid solution for the problem at hand. Common policy enterprise: refers to a common goal that can be achieved through contributing code to the software. In simple words, there is a mutual understanding, a common frame of reference of what to develop and how to do it. (Tzouris 2002:21) However, these criteria seem over-determined: it is difficult to imagine a coherent community ('open' or otherwise) that does not satisfy these requirements. Consequently, I provide an alternative set of criteria that also resists myopic notions of perfect 'openness' or 'democracy.' Very few organizations have completely homogeneous social structures. As argued in Why the Internet is Good: Community Governance That Works Well (Reagle 1999), even an organization like the IETF with the credo of, "We reject kings, presidents and voting. We believe in rough consensus and running code," has explicit authority roles and informal elders. Consequently, in the following definition of open communities there is some room for contention. An open community delivers or demonstrates: Open products: provides products which are available under licenses like those that satisfy the Open Source Definition. Transparency: makes its processes, rules, determinations, and their rationales available. Integrity: ensures the integrity of the processes and the participants' contributions. Non-discrimination: prohibits arbitrary discrimination against persons, groups, or characteristics not relevant to the community's scope of activity. Persons and proposals should be judged on their merits. Leadership should be based on meritocratic or representative processes. Non-interference: the linchpin of openness; if a constituency disagrees with the implementation of the previous three criteria, the first criterion permits them to take the products and commence work on them under their own conceptualization without interference. While 'forking' is often complained about in open communities -- it can create some redundancy/inefficiency -- it is an essential characteristic and major benefit of open communities as well. Voluntariness In addition to the models of organization referenced by Blau and Scott (1962), Amitai Etzioni describes three types of organizations: 'coercive' organizations that use physical means (or threats thereof), 'utilitarian' organizations that use material incentives, and 'normative' organizations that use symbolic awards and status. 
He also describes three types of membership: 'alienative members' feel negatively towards the organization and wish to leave, 'calculative members' weigh benefits and limitations of belonging, and 'moral members' feel positively towards the organization and may even sublimate their own needs in order to participate (Etzioni 1961). As noted by Jennifer Lois (1999:118), normative organizations are the most underrepresented type of organization discussed in the sociological literature. Even so, Etzioni's model is sufficient such that I define a -- voluntary -- community as a 'normative' organization of 'moral' members. I adopt this synonymous definition not only because it allows me to integrate the character of the members into the character of the organization, but also to echo the importance of the sense of the collaborative 'gift' in discussions among members of the community. Yet, obviously, not all voluntary organizations are necessarily open according to the definition above. A voluntary community can produce proprietary products and have opaque processes -- collegiate secret societies are a silly but demonstrative example. However, as with openness, it is difficult to draw a clear line: one cannot exclusively locate open communities and their members strictly within the 'normative' and 'moral' categories, though they are dominant in the open communities I introduced. Many members of those open communities are volunteers, either because of a 'moral' inclination and/or an informal 'calculative' concern with a sense of satisfaction and reputation. While the FLOSS survey concluded "that this activity still resembles rather a hobby than salaried work" (Ghosh et al. 2002:67), 15.7% of their sample declared they do receive some remuneration for developing F/OS. Even at the IETF and W3C, where many engineers are paid to participate, it is not uncommon for some to endeavor to maintain their membership even when they are not employed or their employers change. The openness of these communities is perhaps dominant in describing the character of the organization, though the voluntariness is critical to understanding the moral/ideological light in which many of the members view their participation. Conclusion I've attempted to provide a definition for openness that reflects an understanding of contemporary usage. The popular connotation, and consequently the definition put forth in this essay, arises from well-known examples that include -- at least in part -- a notion of voluntary effort. On further consideration, I believe we can identify a loose conceptualization of shared products, and a process of transparency, integrity, and non-discrimination. Brevity prevents me from considering variations of these characteristics and consequent claims of 'openness' in different communities. And such an exercise isn't necessary for my argument. A common behavior of an open community is the self-reflexive discourse of what it means to be open on difficult boundary cases; the test of an open community is whether a constituency that is dissatisfied with the results of such a discussion can fork (relocate) the work elsewhere. Works Cited Blau, Peter and W. Richard Scott. Formal organizations: a comparative approach. New York, NY: John Wiley, 1962. Etzioni, Amitai. Modern organizations. New York, NY: Free Press of Glencoe, 1961. Ghosh, Rishab, Ruediger Glott, Bernhard Krieger, and Gregorio Robles. Free/Libre and open source software: survey and study. 2002. http://www.infonomics.nl/FLOSS/report/ Lois, Jennifer. 
"Socialization to heroism: individualism and collectivism in a voluntary search and rescue group." Social Psychology Quarterly 62 (1999): 117-135. Nardi, Bonnie and Steve Whittaker. "The place of face-to-face communication in distributed work." Distributed Work. Ed. Pamela Hinds and Sara Kiesler. Boston, Ma: MIT Press., 2002. chapter 4. Reagle, Joseph. Why the Internet is good community governance that works well. 1999.http://cyber.law.harvard.edu/people/reagle/regulation-19990326.html Stallman, Richard. Free Software Foundation. 1996. http://www.gnu.org/fsf/fsf.html Stallman, Richard. Linux and the GNU project. 1997. http://www.gnu.org/gnu/linux-and-gnu.html Stallman, Richard. The GNU project. 1998. http://www.gnu.org/gnu/thegnuproject.html Tzouris, Menelaos. Software freedom, open software and the participant's motivation -- a multidisciplinary study. London, UK: London School of Economics and Political Science, 2002. Citation reference for this article MLA Style Reagle Jr., Joseph. "Open Content Communities." M/C Journal 7.3 (2004). <http://www.media-culture.org.au/0406/06_Reagle.rft.php>. APA Style Reagle Jr., J. (2004, Jul.) Open Content Communities, M/C Journal, 7(3), <http://www.media-culture.org.au/0406/06_Reagle.rft.php>.
APA, Harvard, Vancouver, ISO, and other styles
19

Collins, Steve. "‘Property Talk’ and the Revival of Blackstonian Copyright." M/C Journal 9, no. 4 (September 1, 2006). http://dx.doi.org/10.5204/mcj.2649.

Full text
Abstract:
Proponents of the free culture movement argue that contemporary, “over-zealous” copyright laws have an adverse effect on the freedoms of consumers and creators to make use of copyrighted materials. Lessig, McLeod, Vaidhyanathan, Demers, and Coombe, to name but a few, detail instances where creativity and consumer use have been hindered by copyright laws. The “intellectual land-grab” (Boyle, “Politics” 94), instigated by the increasing value of intangibles in the information age, has forced copyright owners to seek maximal protection for copyrighted materials. A propertarian approach seeks to imbue copyrighted materials with the same inalienable rights as real property, yet copyright is not a property right, because “the copyright owner … holds no ordinary chattel” (Dowling v. United States 473 US 207, 216 [1985]). A fundamental difference resides in the exclusivity of use: “If you eat my apple, then I cannot” but “if you ‘take’ my idea, I still have it. If I tell you an idea, you have not deprived me of it. An unavoidable feature of intellectual property is that its consumption is non-rivalrous” (Lessig, Code 131). It is, as James Boyle notes, “different” to real property (Shamans 174). Vaidhyanathan observes, “copyright in the American tradition was not meant to be a ‘property right’ as the public generally understands property. It was originally a narrow federal policy that granted a limited trade monopoly in exchange for universal use and access” (11). This paper explores the ways in which “property talk” has infiltrated copyright discourse and endangered the utility of the law in fostering free and diverse forms of creative expression. The possessiveness and exclusion that accompany “property talk” are difficult to reconcile with the utilitarian foundations of copyright. Transformative uses of copyrighted materials such as mashing, sampling and appropriative art are incompatible with a propertarian approach, subjecting freedom of creativity to arbitrary licensing fees that often extend beyond the budget of creators (Collins). “Property talk” risks making transformative works an elitist form of creativity, available only to those with the financial resources necessary to meet the demands for licences. There is a wealth of decisions throughout American and English case law that sustain Vaidhyanathan’s argument (see, for example, Donaldson v. Becket 17 Cobbett Parliamentary History, col. 953; Wheaton v. Peters 33 US 591 [1834]; Fox Film Corporation v. Doyal 286 US 123 [1932]; US v. Paramount Pictures 334 US 131 [1948]; Mazer v. Stein 347 US 201, 219 [1954]; Twentieth Century Music Corp. v. Aitken 422 U.S. 151 [1975]; Aronson v. Quick Point Pencil Co. 440 US 257 [1979]; Dowling v. United States 473 US 207 [1985]; Harper & Row, Publishers, Inc. v. Nation Enterprises 471 U.S. 539 [1985]; Luther R. Campbell a.k.a. Luke Skyywalker, et al. v. Acuff-Rose Music, Inc. 510 U.S 569 [1994]). As Lemley states, however, “Congress, the courts and commentators increasingly treat intellectual property as simply a species of real property rather than as a unique form of legal protection designed to deal with public goods problems” (1-2). Although section 106 of the Copyright Act 1976 grants exclusive rights, sections 107 to 112 provide freedoms beyond the control of the copyright owner, undermining the exclusivity of s.106. Australian law similarly grants exceptions to the exclusive rights granted in section 31. 
Exclusivity was a principal objective of the eighteenth century Stationers’ argument for a literary property right. Sir William Blackstone, largely responsible for many Anglo-American concepts concerning the construction of property law, defined property in absolutist terms as “that sole and despotic dominion which one man claims and exercises over the external things of the world, in total exclusion of the right of any other individual in the whole universe” (2). On the topic of reprints he staunchly argued an author “has clearly a right to dispose of that identical work as he pleases, and any attempt to take it from him, or vary the disposition he has made of it, is an invasion of his right of property” (405-6). Blackstonian copyright advanced an exclusive and perpetual property right. Blackstone’s interpretation of Lockean property theory argued for a copyright that extended beyond the author’s expression and encompassed the very “style” and “sentiments” held therein. (Tonson v. Collins [1760] 96 ER 189.) According to Locke, every Man has a Property in his own Person . . . The Labour of his Body and the Work of his hands, we may say, are properly his. Whatsoever then he removes out of the State that Nature hath provided and left it in, he hath mixed his Labour with, and joyned to it something that is his own, and thereby makes it his Property. (287-8) Blackstone’s inventive interpretation of Locke “analogised ideas, thoughts, and opinions with tangible objects to which title may be taken by occupancy under English common law” (Travis 783). Locke’s labour theory, however, is not easily applied to intangibles because occupancy or use is non-rivalrous. The appropriate extent of an author’s proprietary right in a work led Locke himself to a philosophical impasse (Bowrey 324). Although Blackstonian copyright was suppressed by the House of Lords in the eighteenth century (Donaldson v. Becket [1774] 17 Cobbett Parliamentary History, col. 953) and by the Supreme Court sixty years later (Wheaton v. Peters 33 US 591 [1834]), it has never wholly vacated copyright discourse. “Property talk” is undesirable in copyright discourse because it implicates totalitarian notions such as exclusion and inalienable private rights of ownership with no room for freedom of creativity or to use copyrighted materials for non-piracy related purposes. The notion that intellectual property is a species of property akin with real property is circulated by media companies seeking greater control over copyrighted materials, but the extent to which “property talk” has been adopted by the courts and scholars is troubling. Lemley (3-5) and Bell speculate whether the term “intellectual property” carries any responsibility for the propertisation of intangibles. A survey of federal court decisions between 1943 and 2003 reveals an exponential increase in the usage of the term. As noted by Samuelson (398) and Cohen (379), within the spheres of industry, culture, law, and politics the word “property” implies a broader scope of rights than those associated with a grant of limited monopoly. Music United claims “unauthorized reproduction and distribution of copyrighted music is JUST AS ILLEGAL AS SHOPLIFTING A CD”. James Brown argues sampling from his records is tantamount to theft: “Anything they take off my record is mine . . . Can I take a button off your shirt and put it on mine? Can I take a toenail off your foot – is that all right with you?” (Miller 1). 
Equating unauthorised copying with theft seeks to socially demonise activities occurring outside of the permission culture currently being fostered by inventive interpretations of the law. Increasing propagation of copyright as the personal property of the creator and/or copyright owner is instrumental in efforts to secure further legislative or judicial protection: Since 1909, courts and corporations have exploited public concern for rewarding established authors by steadily limiting the rights of readers, consumers, and emerging artists. All along, the author was deployed as a straw man in the debate. The unrewarded authorial genius was used as a rhetorical distraction that appealed to the American romantic individualism. (Vaidhyanathan 11) The “unrewarded authorial genius” was certainly tactically deployed in the eighteenth century in order to generate sympathy in pleas for further protection (Feather 71). Supporting the RIAA, artists including Britney Spears ask “Would you go into a CD store and steal a CD? It’s the same thing – people going into the computers and logging on and stealing our music”. The presence of a notable celebrity claiming file-sharing is equivalent to stealing their personal property is a more publicly acceptable spin on the major labels’ attempts to maintain a monopoly over music distribution. In 1997, Congress enacted the No Electronic Theft Act, which extended copyright protection into the digital realm and introduced stricter penalties for electronic reproduction. The use of “theft” in the title clearly aligns the statute with a propertarian portrayal of intangibles. Most movie fans will have witnessed anti-piracy propaganda in the cinema and on DVDs. Analogies between stealing a bag and downloading movies blur fundamental distinctions in the rivalrous/non-rivalrous nature of tangibles and intangibles (Lessig, Code 131). Of critical significance is the infiltration of “property talk” into the courtrooms. In 1990 Judge Frank Easterbrook wrote: Patents give a right to exclude, just as the law of trespass does with real property … Old rhetoric about intellectual property equating to monopoly seemed to have vanished, replaced by a recognition that a right to exclude in intellectual property is no different in principle from the right to exclude in physical property … Except in the rarest case, we should treat intellectual and physical property identically in the law – which is where the broader currents are taking us. (109, 112, 118) Although Easterbrook refers to patents, his endorsement of “property talk” is cause for concern given the similarity with which patents and copyrights have been historically treated (Ou 41). In Grand Upright v. Warner Bros. Judge Kevin Duffy commenced his judgment with the admonishment “Thou shalt not steal”. Similarly, in Jarvis v. A&M Records the court stated “there can be no more brazen stealing of music than digital sampling”. This move towards a propertarian approach is misguided. It runs contrary to the utilitarian principles underpinning copyright ideology and marginalises freedoms protected by the fair use doctrine, hence Justice Blackmun’s warning that “interference with copyright does not easily equate with” interference with real property (Dowling v. United States 473 US 207, 216 [1985]). The framing of copyright in terms of real property privileges private monopoly over, and to the detriment of, the public interest in free and diverse creativity as well as freedoms of personal use. 
It is paramount that when dealing with copyright cases, the courts remain aware that their decisions involve not pure economic regulation, but regulation of expression, and what may count as rational where economic regulation is at issue is not necessarily rational where we focus on expression – in a Nation constitutionally dedicated to the free dissemination of speech, information, learning and culture. (Eldred v. Ashcroft 537 US 186 [2003] [J. Breyer dissenting]). Copyright is the prize in a contest of property vs. policy. As Justice Blackmun observed, an infringer invades a statutorily defined province guaranteed to the copyright holder alone. But he does not assume physical control over the copyright; nor does he wholly deprive its owner of its use. While one may colloquially link infringement with some general notion of wrongful appropriation, infringement plainly implicates a more complex set of property interests than does run-of-the-mill theft, conversion, or fraud. (Dowling v. United States 473 US 207, 217-218 [1985]). Copyright policy places a great deal of control and cultural determinism in the hands of the creative industries. Without balance, oppressive monopolies form on the back of rights granted for the welfare of society in general. If a society wants to be independent and rich in diverse forms of cultural production and free expression, then the courts cannot continue to apply the law from within a propertarian paradigm. The question of whether culture should be determined by control or freedom in the interests of a free society is one that rapidly requires close attention – “it’s no longer a philosophical question but a practical one”. References Bayat, Asef. “Un-Civil Society: The Politics of the ‘Informal People.’” Third World Quarterly 18.1 (1997): 53-72. Bell, T. W. “Author’s Welfare: Copyright as a Statutory Mechanism for Redistributing Rights.” Brooklyn Law Review 69 (2003): 229. Blackstone, W. Commentaries on the Laws of England: Volume II. New York: Garland Publishing, 1978. (Reprint of 1783 edition.) Boyle, J. Shamans, Software, and Spleens: Law and the Construction of the Information Society. Cambridge: Harvard UP, 1996. Boyle, J. “A Politics of Intellectual Property: Environmentalism for the Net?” Duke Law Journal 47 (1997): 87. Bowrey, K. “Who’s Writing Copyright’s History?” European Intellectual Property Review 18.6 (1996): 322. Cohen, J. “Overcoming Property: Does Copyright Trump Privacy?” University of Illinois Journal of Law, Technology & Policy (2002): 375. Collins, S. “Good Copy, Bad Copy.” M/C Journal 8.3 (2005). <http://journal.media-culture.org.au/0507/02-collins.php>. Coombe, R. The Cultural Life of Intellectual Properties. Durham: Duke University Press, 1998. Demers, J. Steal This Music. Athens, Georgia: U of Georgia P, 2006. Easterbrook, F. H. “Intellectual Property Is Still Property.” Harvard Journal of Law & Public Policy 13 (1990): 108. Feather, J. Publishing, Piracy and Politics: An Historical Study of Copyright in Britain. London: Mansell, 1994. Lemley, M. “Property, Intellectual Property, and Free Riding.” Texas Law Review 83 (2005): 1031. Lessig, L. Code and Other Laws of Cyberspace. New York: Basic Books, 1999. Lessig, L. The Future of Ideas. New York: Random House, 2001. Lessig, L. Free Culture. New York: The Penguin Press, 2004. Locke, J. Two Treatises of Government. Ed. Peter Laslett. Cambridge, New York, Melbourne: Cambridge University Press, 1988. McLeod, K. 
“How Copyright Law Changed Hip Hop: An Interview with Public Enemy’s Chuck D and Hank Shocklee.” Stay Free (2002). 14 June 2006 <http://www.stayfreemagazine.org/archives/20/public_enemy.html>. McLeod, K. “Confessions of an Intellectual (Property): Danger Mouse, Mickey Mouse, Sonny Bono, and My Long and Winding Path as a Copyright Activist-Academic.” Popular Music & Society 28 (2005): 79. McLeod, K. Freedom of Expression: Overzealous Copyright Bozos and Other Enemies of Creativity. United States: Doubleday Books, 2005. Miller, M.W. “Creativity Furor: High-Tech Alteration of Sights and Sounds Divides the Art World.” Wall Street Journal (1987): 1. Ou, T. “From Wheaton v. Peters to Eldred v. Reno: An Originalist Interpretation of the Copyright Clause.” Berkman Center for Internet & Society (2000). 14 June 2006 <http://cyber.law.harvard.edu/openlaw/eldredvashcroft/cyber/OuEldred.pdf>. Samuelson, P. “Information as Property: Do Ruckelshaus and Carpenter Signal a Changing Direction in Intellectual Property Law?” Catholic University Law Review 38 (1989): 365. Travis, H. “Pirates of the Information Infrastructure: Blackstonian Copyright and the First Amendment.” Berkeley Technology Law Journal 15 (2000): 777. Vaidhyanathan, S. Copyrights and Copywrongs: The Rise of Intellectual Property and How It Threatens Creativity. New York: New York UP, 2003. Citation reference for this article MLA Style Collins, Steve. "‘Property Talk’ and the Revival of Blackstonian Copyright." M/C Journal 9.4 (2006). [your date of access] <http://journal.media-culture.org.au/0609/5-collins.php>. APA Style Collins, S. (Sep. 2006) "‘Property Talk’ and the Revival of Blackstonian Copyright," M/C Journal, 9(4). Retrieved [your date of access] from <http://journal.media-culture.org.au/0609/5-collins.php>.
APA, Harvard, Vancouver, ISO, and other styles
20

Mayo, Sherry. "NXT Space for Visual Thinking." M/C Journal 1, no. 4 (November 1, 1998). http://dx.doi.org/10.5204/mcj.1722.

Full text
Abstract:
"Space, the limitless area in which all things exist and move." -- Merriam-Webster Dictionary(658) Can we determine our point in time and space at this moment of pre-millennium anticipation? The evolution of our visualisation of space as a culture is shifting and entering the critical consciousness of our global village. The infinite expansion of space's parameters, definitions and visualisation remains the next frontier -- not only for NASA, but for visual culture. Benjamin's vision of loss of the aura of originality through reproduction has come to pass, so has the concept of McLuhan's global village, Baudrillard's simulacra, and Gibson's cyberpunk. Recent technologies such as digital imaging, video, 3-D modelling, virtual reality, and the Internet have brought us to the cusp of the millennium as pioneers of what I call this 'NXT space' for visual thinking, for artistic expression. The vision being constructed in pre-millennium culture takes place in an objectless fictionalised space. This virtual reality is a space that is expanding infinitely, as we speak. The vehicle through which access is gained into this layer takes the form of a machine that requires a mind/body split. The viewer probes through the intangible pixels and collects visual data. The data received on this or that layer have the potential to transport the viewer virtually and yield a visceral experience. The new tools for visualisation allow an expanded perception to an altered state of consciousness. The new works cross the boundaries between media, and are the result of virtual trips via the usage of digital imaging. Their aesthetic reflects our digital society in which people maintain extremely intimate relationships with their computers. This new era is populated by a new generation that is inside more than outside, emailing while faxing, speaking on the phone and surfing the Web with MTV on in the background. We have surpassed postmodernist ideas of pluralism and simultaneity and have produced people for whom the digital age is no revolution. Selected colours, forms and spaces refer to the pixelisation of our daily experience. We are really discussing pop for ahistorical youth, who consider virtual reality to be the norm of visualisation via digitally produced ads, movies, TV shows, music videos, video games and the computer. The term "new media" is already antiquated. We are participating in a realm that is fluent with technology, where the visualisation of space is more natural than an idea of objecthood. (At least as long as we're operating in the technology-rich Western world, that is.) The relationship of these virtual spaces with the mass audience is the cause of pre-millennium anxiety. The cool distance of remote control and the ability to remain in an altered state of consciousness are the residual effects of virtual reality. It is this alienated otherness that allows for the atomisation of the universe. We construct artifice for interface, and simulacra have become more familiar than the "real". NXT space, cyberspace, is the most vital space for visual thinking in the 21st century. The malleability and immateriality of the pixel sub-universe has exponential potential. The artists of this future, who will dedicate themselves successfully to dealing with the new parameters of this installation space, will not consider themselves "computer artists". They will be simply artists working with integrated electronic arts. 
Digital imaging has permeated our lives to such an extent that like Las Vegas "it's the sunsets that look fake as all hell" (Hickey). Venturi depicts the interior of Las Vegas's casinos as infinite dark spaces with lots of lights transmitting information. Cyberspace is a public/private space occupied by a global village, in that it is a public space through its accessibility to anyone with Internet access, and a social space due to the ability to exchange ideas and meet others through dialogue; however, it is also an intimate private space due to its intangibility and the distance between each loner at their terminal. NXT needs a common sign system that is seductive enough to persuade the visitor into entering the site and can act as a navigational tool. People like to return to places that feel familiar and stimulate reverie of past experiences. This requires the visitor to fantasise while navigating through a cybersite and believe that it is an actual place that exists and where they can dwell. Venturi's model of the sign system as paramount to the identification of the actual architecture is perfect for cyberspace, because you are selling the idea or the fiction of the site, not the desert that it really is. Although NXT can not utilise object cathexion to stimulate fantasy and attachment to site, it can breed familiarity through a consistent sign system and a dynamic and interactive social space which would entice frequent revisiting. NXT Space, a home for the other? "Suddenly it becomes possible that there are just others, that we ourselves are an 'other' among others", as Paul Ricoeur said in 1962. If one were to impose Heidegger's thinking in regards to building and dwelling, they would have to reconstruct NXT as a site that would promote dwelling. It would have to be built in a way in which people were not anonymous or random. A chat room or BBS would have to be attached, where people could actively participate with one another within NXT. Once these visitors had other people that they could identify with and repeatedly interact with, they would form a community within the NXT site. Mortals would roam not on earth, nor under the sky, possibly before divinities (who knows), but rather through pixel light and fiber optics without a physical interface between beings. If the goal of mortals is a Heideggerian notion of attachment to a site through building and building's goal is dwelling and dwelling's goal is identification and identification is accomplished through the cultivation of culture, then NXT could be a successful location. NXT could accommodate an interchange between beings that would be free of physiological constraints and identity separations. This is what could be exchanged and exposed in the NXT site without the interference and taint of socio-physio parameters that separate people from one another. A place where everyone without the convenience or burden of identity becomes simply another other. NXT could implement theory in an integral contextual way that could effect critical consciousness and a transformation of society. This site could serve as a theoretical laboratory where people could exchange and experiment within a dialogue. NXT as a test site could push the parameters of cyberspace and otherness in a real and tangible way. This "cyber-factory" would be interactive and analytical. The fictional simulated world is becoming our reality and cyberspace is becoming a more reasonable parallel to life. 
Travelling through time and space seems more attainable than ever before through the Internet. Net surfing is zipping through the Louvre, trifling through the Grand Canyon and then checking your horoscope. People are becoming used to this ability and the abstract is becoming more tangible to the masses. As techno-literacy and access increase, so should practical application of abstract theory. NXT would escape reification of theory through dynamic accessibility. The virtual factory could be a Voltaire's cafe of cyber-thinkers charting the critical consciousness and evolution of our Web-linked world. Although ultimately in the West we do exist within a capitalist system where every good thought leaks out to the masses and becomes popular, popularity creates fashion, fashion is fetishistic, thereby desirable, and accumulates monetary value. Market power depoliticises original content and enables an idea to become dogma; another trophy in the cultural hall of fame. Ideas do die, but in another time and place can be resurrected and utilised as a template for counter-reaction. This is analogous to genetic evolution -- DNA makes RNA which makes retro-DNA, etc. --, and the helix spirals on, making reification an organic process. However, will cyberspace ever be instrumental in transforming society in the next century? Access is the largest inhibitor. Privileged technophiles often forget that they are in the minority. How do we become more inclusive and expand the dialogue to encompass the infinite number of different voices on our planet? NXT space is limited to a relatively small number of individuals with the ability to afford and gain access to high-tech equipment. This will continue the existing socio-economic imbalance that restricts our critical consciousness. Without developing the Internet into the NXT space, we will be tremendously bothered by ISPs, with data transfer control and content police. My fear for the global village, surfing through our virtual landscape, is that we will all skid off this swiftly tilting planet. The addiction to the Net and to simulated experiences will subject us to remote control. The inundation of commercialism bombarding the spectator was inevitable, and subsequently there are fewer innovative sites pushing the boundaries of experimentation with this medium. Pre-millennium anxiety is abundant in technophobes, but as a technophile I too am afflicted. My fantasy of a NXT space is dwindling as the clock ticks towards the Y2K problem and a new niche for community and social construction has already been out-competed. If only we could imagine all the people living in the NXT space with its potential for tolerance, dialogue, and community. References Bachelard, Gaston. The Poetics of Space: The Classic Look at How We Experience Intimate Places. Boston, MA: Beacon, 1994. Benjamin, Walter. Illuminations. New York: Schocken, 1978. Gibson, William. Neuromancer. San Francisco: Ace Books, 1984. Heidegger, Martin. The Question Concerning Technology, and Other Essays. Trans. William Lovitt. New York: Garland, 1977. Hickey, David. Air Guitar: Four Essays on Art and Democracy. Los Angeles: Art Issues, 1997. Koch, Stephen. Stargazer: Andy Warhol's World and His Films. London: Calder and Boyars, 1973. McLuhan, Marshall. Understanding Media: The Extensions of Man. Cambridge, MA: MIT Press, 1994. The Merriam-Webster Dictionary. Springfield, MA: G.&.C. Merriam, 1974. Venturi, Robert. Learning from Las Vegas: The Forgotten Symbolism of Architectual Form. 
Cambridge, MA: MIT Press, 1977. Citation reference for this article MLA style: Sherry Mayo. "NXT Space for Visual Thinking: An Experimental Cyberlab." M/C: A Journal of Media and Culture 1.4 (1998). [your date of access] <http://www.uq.edu.au/mc/9811/nxt.php>. Chicago style: Sherry Mayo, "NXT Space for Visual Thinking: An Experimental Cyberlab," M/C: A Journal of Media and Culture 1, no. 4 (1998), <http://www.uq.edu.au/mc/9811/nxt.php> ([your date of access]). APA style: Sherry Mayo. (1998) NXT space for visual thinking: an experimental cyberlab. M/C: A Journal of Media and Culture 1(4). <http://www.uq.edu.au/mc/9811/nxt.php> ([your date of access]).
APA, Harvard, Vancouver, ISO, and other styles
21

Sampson, Tony. "Senders, Receivers and Deceivers: How Liar Codes Put Noise Back on the Diagram of Transmission." M/C Journal 9, no. 1 (March 1, 2006). http://dx.doi.org/10.5204/mcj.2583.

Full text
Abstract:
In the half-century since Shannon invented information theory… engineers have come up with brilliant ways of boiling redundancy out of information… This lets it be transmitted twice as fast (Bill Gates: 33). Shannon’s Code Puts an End to Noise The digital machine is often presented as the perfect medium for the efficient transmission of coded messages: an ever-improving machine, in which coded information travels near to the speed of light. Integrated into a global network of communication, transmission is assumed to be friction-free – everything and everybody are just a click away. Indeed, the old problem of signal interference is subdued by the magnum opus of communication engineering – Shannon’s noiseless channel – a cure for the undesirable uncertainties of message sending (Shannon and Weaver 19). For that reason alone, the digitally enhanced fidelity of Shannon’s digital code not only heralds a new age of communication, but also marks the end of the problem of noise. In effect, his mathematical theory of communication establishes a highly effective coding mechanism that travels from sender to receiver, overcoming geographic constraint and the deafening roar of the analogue milieu. This makes the theory itself the substratum of the digital communication utopia, since Shannon’s conquest of noise has solved the reliability problem of code, allowing us to focus on the rapidity and fecundity of our messages. However, despite the ingenuity of the noiseless channel, its subsequent rapid expansion into a vast network of machines means that both senders and receivers pay a price for Shannon’s brilliance. The speed and boundless reproducibility of digital code outperform our physical capacity to observe it. In this way, transmission works behind the scenes, becoming increasingly independent of the human gaze. Even so, we are assured that we will not be overwhelmed by code; a new digital order has purportedly emerged. Accordingly, network innovators provide us with robotic codes that work benevolently on our behalf, exploring a seemingly random universe of connection. These intelligent codes search the tangled webs that constitute digital communication networks, autonomously in step with our fleeting transactions and data desires. We can sleep safely at night… this is The Road Ahead. But of course, as we now know, the ideal system for perpetual communication has also turned out to be the perfect medium for the codes designed to destroy it (Gordon). Instead of efficiently taking care of our transmission needs, the flow of code has been interrupted by the relational interactions of a machinic assemblage (Deleuze and Guattari). This is a vast assemblage populated by both human and non-human actors. Its evolution has not followed a predictable path determined by the innovations of the science of code, but instead responds to the complex interactions and interconnectedness of the network environment. In this way, the binary switches of the robotic code have occasionally flipped over from true to false – from the munificent to the malevolent function. The interruption seems to be relatively new, but the human-computer assemblage has a long history of the production of what I term liar codes. They follow on from Gödel and Turing’s realisation of the incompleteness and undecidability of self-referential systems of logic in the 1930s. In the following decades, von Neumann’s ideas on self-reproducing code provided early programmers with the means to play coded games of life. 
Thirty years later, researchers discovered how unstable a network would become when a similarly coded evolutive got out of control (Shoch and Hupp, Cohen). By 1990, the digital worm had turned. Morris’s code famously ‘crashed’ the Internet. The liar code had escaped the research lab and entered the wild world of the network. Nevertheless, despite what appears to be the uncontrollable evolution of code, it is the assemblage itself that makes a difference. Many liar codes have since followed on from the games, experiments and accidents of the early human-computer assemblage. Some are simply mischievous pranks designed to take up space by making copies of themselves, while others conceal a deeper, sinister pre-programmed function of data piracy (Bey 401-434) and viral hijack. The former spread out across a network, spewing out fairly innocuous alerts, whereas the latter steal passwords, gain access to safe places, capture navigation tools, and redirect our attention to the dark side of the global village. In addition to the deluge of spam, viruses and worms, liar code arrives hidden in Trojan programs. Like Russian dolls, code slips into email inboxes. Simple viral sentences repeatedly trick us into opening these programs and spreading the infection. By saying “I love you” code becomes a recursive deceiver, concealing the true intentions of the virus writer, while ensuring that the victim plays a crucial role in the propagation of the liar. Noise Is Dead – Long Live the New Noise! More recently, liar codes have been cunningly understood as contemporary instances of cultural noise – the other of order (Parikka). However, this does not mean that a solution can be found in the universality of Shannon’s linear diagram – this is an altogether different engineering problem. In principle, Shannon’s noise was more readily apprehended. It existed primarily at a technical level (signal noise), a problem solved by the incorporation of noise into a technical code (redundancy). Contrariwise, liar codes go beyond the signal/noise ratio of the content of a message. They are environmental absurdities and anomalies that resonate outside the technical layer into the cultural milieu of the human-computer assemblage. The new noise is produced by the hissing background distortion of the network, which relentlessly drives communication to a far-from-equilibrial state. Along these lines, the production of what appears to be a surplus of code is subject to the behaviour and functioning of a vast and vulnerable topology of human and non-human machinic interaction. Is the Solution to Be Found in a Clash of Codes? In an attempt to banish the network pirates and their growing phylum of liar codes there has been a mobilisation of antivirus technologies. Netizens have been drafted in to operate the malware blockers, set up firewalls or dig the electronic trenches. But these desperate tactics appeal only to those who believe that they can reverse the drift towards instability, and return a sense of order to the network. In reality, evidence of the effectiveness of these countermeasures is negligible. Despite efforts to lower the frequency of attacks, the liar code keeps seeping in. Indeed, the disorder from which the new noise emerges is quite unlike the high entropic problem encountered by Shannon. Like this, digital anomalies are not simply undesirable, random distortions, repaired by coded negentropy. 
On the contrary, the liar is a calculated act of violence, but this is an action that emerges from a collective, war-like assemblage. Therefore, significantly, it is not the code, but the relational interactions that evolve. For this reason, it is not simply the liar codes that threaten the stability of transmission, but the opening-up of a networked medium that captures messages, turning them into an expression of the unknown of order. Code no longer conveys a message through a channel. Not at all, it is the assemblage itself that anarchically converts the message into an altogether different form of expression. The liar is a rhizome, not a root!! (See figure 1.) A Network Diagram of Senders, Receivers and Deceivers Rhizomatic liar code introduces an anarchic scrambling of the communication model. Ad nauseam, antivirus researchers bemoan the problem of the liar code, but their code-determined defence system has seemingly failed to tell apart the senders, receivers and deceivers. Their tactics cannot sidestep the Gödelian paradox. Interestingly, current research into complex network topologies, particularly the Internet and the Web (Barabási), appears to not only support this gloomy conclusion, but confirms that the problem extends beyond code to the dynamic formation of the network itself. In this way, complex network theory may help us to understand how the human-computer assemblage comes together in the production of viral anomalies. Indeed, the digital network is not, as we may think, a random universe of free arbitrary association. It does not conform to the principles leading to inevitable equilibrium (an averaging out of connectivity). It is instead, an increasingly auto-organised and undemocratic tangle of nodes and links in which a few highly connected aristocratic clusters form alongside many isolated regions. In this far-from-random milieu, the flow of code is not determined by the linear transmission of messages between senders and receivers, or for that matter is it guided by an algorithmic evolutive. On the contrary, while aristocratic networks provide a robust means of holding an assemblage together, their topological behaviour also makes them highly susceptible to viral epidemics. Liar codes easily spread through clusters formed out of preferential linkage, and a desire for exclusive, network alliances between humans and non-humans. From a remote location, a single viral code can promiscuously infect a highly connected population of nodes (Pastor-Satorras & Vespignani). This is the perfect environment for the actions of deceivers and their liar codes. On reflection, a revised diagram of transmission, which tackles head on the viral anomalies of the human-computer assemblage, would perhaps be unworkable. This is consistent with the problem of liar codes, and the resulting collapse of trustworthy transmission. The diagram would ideally need to factor in the anarchic, scrambled lines of communication (see figure 1), as well as the complex topological relations between node and link. Such a diagram would also need to trace significant topological behaviours and functions alongside the malfunctions of codes, coders and the sharing of codes over a network. It is this significant topological intensity of the human-computer assemblage that shifts the contemporary debate on noise away from Shannon’s model towards a complex, non-linear and relational interaction. In this sense, the diagram moves closer to the rhizomatic notion of a network (Deleuze and Guattari 9-10). 
Not so much a model of transmission, rather a model of viral transduction. References Barabási, Albert-László. Linked: The New Science of Networks. Cambridge, Mass: Perseus, 2002. Bey, Hakim. In Crypto Anarchy, Cyberstates and Pirate Utopias. Ed. Peter Ludlow. Cambridge, Mass: MIT Press, 2001. Cohen, F. “Computer Viruses: Theory and Experiments.” Computers & Security 6 (1987): 22-35. Deleuze, Gilles, and Felix Guattari. A Thousand Plateaus: Capitalism and Schizophrenia. Trans. Brian Massumi. London: The Athlone Press, 1987. Deleuze, Gilles, and Felix Guattari. Anti-Oedipus. London: The Athlone Press, 1984. Gates, Bill. The Road Ahead. London: Penguin, 1995/1996. Gordon, Sarah. “Technologically Enabled Crime: Shifting Paradigms for the Year 2000.” Computers and Security 1995. 5 Dec. 2005 <http://www.research.ibm.com/antivirus/SciPapers/Gordon/Crime.html>. Latour, Bruno. Science in Action: How to Follow Scientists and Engineers through Society. Harvard University Press, 1988. Parikka, Jussi. “Viral Noise and the (Dis)Order of the Digital Culture.” M/C Journal 7.6 (2005). 5 Dec. 2005 <http://journal.media-culture.org.au/0501/05-parikka.php>. Pastor-Satorras, Romualdo, and Alessandro Vespignani. “Epidemic Spreading in Scale-Free Networks.” Physical Review Letters 86 (2001). Shannon, Claude, and Warren Weaver. The Mathematical Theory of Communication. University of Illinois Press, 1949/1998. Shoch, John F., and Jon A. Hupp. “The ‘Worm’ Programs – Early Experience with a Distributed Computation.” Communications of the ACM 25.3 (March 1982): 172–180. Von Neumann, John, and Arthur Burks. Theory of Self-Reproducing Automata. University of Illinois Press, 1966. Citation reference for this article MLA Style Sampson, Tony. "Senders, Receivers and Deceivers: How Liar Codes Put Noise Back on the Diagram of Transmission." M/C Journal 9.1 (2006). [your date of access] <http://journal.media-culture.org.au/0603/03-sampson.php>. APA Style Sampson, T. (Mar. 2006) "Senders, Receivers and Deceivers: How Liar Codes Put Noise Back on the Diagram of Transmission," M/C Journal, 9(1). Retrieved [your date of access] from <http://journal.media-culture.org.au/0603/03-sampson.php>.
APA, Harvard, Vancouver, ISO, and other styles
22

Richardson, Catherine. "The Politics of a Country Culture." M/C Journal 3, no. 2 (May 1, 2000). http://dx.doi.org/10.5204/mcj.1841.

Full text
Abstract:
Traditionally, the country way of life, the country worldview -- the country culture -- has been understood differently to the city way of life. Notions of rural have been represented in terms such as 'Eden', 'Arcadia', 'Golden Age', and associated with beauty, fertility, moral uprightness and authenticity. In contrast, notions of urban have been characterised by pollution, sterility, degeneration and artificiality. In Australia, the culture of the first white settlers developed out of this tradition, but with its own distinctive characteristics. The harshness and indomitability of the landscape became the means by which unique character, unifying myths of belonging and societal significance were constructed and asserted. In contrast to the communities of the country's original inhabitants, which were perceived as passive, unproductive and disconnected, the new culture was characterised by notions of 'land', 'masculinity', 'white', 'productive', 'homogenous' and 'nationalistic' (Moore 54; Turner 6; Ward; White 16ff.). Defining the country worldview in contemporary Australia, however, is problematic. Question marks hang over the continued significance, even existence, of a specifically country culture. Post-war Australia has witnessed enormous economic and social changes, wrought by improved transport and communication networks, a shrinking rural population, and the decreasing importance of the agricultural industries. The steady decline in grass roots support for the National Party of Australia, traditional defender of the country way of life, suggests that the voting population no longer views the upholding of specifically pro-country policies as necessary to the well-being of the nation. Australia is now recognised as the most urbanised, sub-urbanised and multi-cultural of the western industrialised nations. Globalisation of the mass communications media has blurred the boundaries between rural, urban, state and national. Consequently, many argue that the differences between the country and city are now insignificant (for example, Aitkin 34-41). Yet notions of country that are distinct, even definitive, continue to be represented in various urban-based communications industries, cultural policies, and the discourses of environmental politics and nationalism. Examples include John Laws's very popular Across Australia radio talk-back programme which celebrates the outback, the farmer and 'battler', and the 'True Blue' music of country artist John Williamson; the push by the Green movement to separate and protect wilderness areas of 'natural' bushland from the corrupting influences of human cultivation; and the continued significance of the 'bush' and 'bushman' in divers constructions of national cultural identity. Share and Lawrence argue that such representations are a state of mind rather than a state of being, "in the imagination of the cosmopolis" only (Share & Lawrence 101). Imagined or otherwise, however, the evidence suggests that they are representations which are nevertheless there -- albeit constructed in varying ways, with varying emphases, and in a variety of settings. Tamworth: Country at Heart Jacka argues that it is the 'local', constructed by a specific set of forces and circumstances and operating within a particular time frame and place, that provides the best or most 'authentic' means of analysing notions of the 'country' (qtd. in Share & Lawrence 102). 
Tamworth, situated in North Western New South Wales, approximately four hundred kilometres from Sydney, is one such 'authentic' locality. The city of Tamworth and its surrounding hinterland is populated by some 55,000 people. Timber and farmland constitute 95% of its land use. Agricultural production generates the bulk of its net income. The Tamworth electoral district has been designated 'country' by the State Electoral Office. Promotional billboards erected by the Tamworth City Council and situated on all major highways into the city describe Tamworth as 'the heart of country'. Tamworth is renowned as 'the Country Music Capital of Australasia' and celebrates 'country' values annually through a highly successful Country Music Festival. Clearly, notions of country are significant in the shaping of how Tamworth is perceived as a community locally and nationally. These notions are an important component of the process of meaning generation, circulation and exchange in Tamworth -- indeed, they are an important component of the essential fabric that constitutes the Tamworth culture. Analysis The Tamworth worldview was studied through an analysis of the coverage of the local NSW state election campaigns of 1995 and 1999 by Tamworth's only regional daily newspaper, the Northern Daily Leader. Regional daily newspapers are a useful means of analysing the major preoccupations of a culture. They contribute significantly to the construction and representation of the communities they serve: they are moulded by the specific needs of their communities; they are prominent influences of the norms, values and processes of these communities; they are the product of a community that is connected by common and local interests and knowledge, written with and by the people of this community (Mules et al. 242). The coverage of the 1995 and 1999 election campaigns represented a discrete sample of texts with a common focus. An important aspect of this focus was Mr Tony Windsor, Independent State Member for Tamworth. Windsor's Independent status was significant to the study. Firstly, it suggests that he was elected to office on his own merits or on the merits of his policies, as against any particular party affiliation. Papadakis and Bean argue that a vote for an Independent most often represents a protest vote against the dominant players in the political system rather than any systemic approval of the policy positions or other qualities of the recipient (109). This may well have been the case for Windsor's initial victory in 1991. However, in the 1995 election he won an unprecedented 83% of the primary vote, representing voters from right across the political spectrum. He further increased his majority in the 1999 election. Windsor's extraordinary popularity suggests that his appeal cut across the political boundaries into the social and cultural realms. As such, Windsor embodies a singular means of analysing the socio-politico-cultural preoccupations of those he represents. The study tracked story frequency and space, and analysed pictorial, headline and lead texts in terms of story focus, personal and thematic associations, and candidate agency. It was found that the Leader markedly privileged Windsor over his opponents in regards to story frequency and space. The pictorial and key word analyses identified Windsor's public persona as more active and more person-oriented than those of his opponents, and as associated more often with exterior settings, particularly those in or connected with 'bush' locations. 
This stood in contrast to the representations of his major ALP opponents. In both elections they were female, associated more with interior settings, and represented as speaking more than doing, passive more than active, and concerned more with their emotions and states of being than was Windsor. Overall, the Leader's representation of Windsor was found to comprise the six notions noted above as being characteristic of the traditional country worldview. Windsor's connections with and concerns for the land and country issues were significant. The construction of male and female gender roles was masculinist in nature. The absence of any signifiers associated with notions of 'Aboriginality', 'ethnicity', even 'diversity', indicated the existence of naturalised discourses of 'white' and 'homogeneity'. Notions of productivity were evident through Windsor's preoccupation with business and industry. Nationalism was implied through Windsor's association with characteristics that epitomise traditional understandings of what it is to be an Australian. Two additional characteristics were also identified. The first of these was named 'Independent', as indicated through the significance placed upon Windsor's politically Independent status. It was defined by the traditional understandings of the country worldview and ideas of integrity, 'a fair go' for the country, and of giving power back to the people. In contrast, the major political parties, ALP and National Party, were associated with the city, corruption, interference, lack of democracy, the undermining of country values by city values, and a subordination of the country to the city. The second characteristic was named 'community'. It was indicated through ideas of belonging and like-mindedness, and Windsor's representation as friendly, person-oriented and concerned with the active provision of services for the people. Implications The Tamworth culture is characterised by the notions of 'land', 'masculinity', 'white', 'productive', 'homogenous', 'nationalistic', 'Independent' and 'community'. This very characterisation, however, is one that gives rise to a number of questions. What drove the Leader to construct and represent the Tamworth culture in this way? How did and does this particular characterisation serve the needs of the Tamworth people? How and why are these needs different to the needs of city people -- or even people in other rural communities? Perhaps the best answer lies with the demonstrated longevity of the essential nature of the Tamworth worldview. Traditional notions of country have remained distinctive, even definitive, despite Australia's urbanisation, suburbanisation, multiculturalism; despite the enormous economic and social changes that have been wrought by globalisation; despite the consequent blurring of boundaries between rural, urban, state and national. This traditional nature, it seems, is resistant to change. Yet there is also evidence that a blurring of boundaries, even change, has occurred in Tamworth. 
Examples include the fact that the combined income generated by secondary and tertiary industries in the Tamworth district is now greater than that generated by agriculture; Windsor, with whom the Leader so closely associates the land and other notions traditionally associated with the country, also holds a university degree in economics; the annual Country Music Festival is celebrated largely from within the confines of the city of Tamworth itself; Tamworth City Council and Country Music Festival both have sites on the World Wide Web, thereby connecting them with the very globalisation that the Leader would have them resisting. Although this may suggest that the country has actually appropriated, even assimilated many of the notions that are most often associated with change in today's society, it also seems that this assimilation is one that is on the country's terms only. Notions of the city are subordinated to notions of the country. Change is appropriated, but in a way that maintains the status quo -- that perpetuates the essential country worldview, both locally and nationally. Such evidence of change may also suggest that the Leader's representation of Windsor, of Tamworth, is perhaps a state of mind rather than a state of being. It is a representation that taps into the imagination of the people rather than their everyday existence. In so doing, it worked to position over 85% of the population into voting a particular way in the 1995 and 1999 NSW State elections. It may also work to draw the many people from around Australia who bring their tourist dollars into Tamworth each year to celebrate country values through the Country Music Festival. The Tamworth culture may well uphold a construction of Australian identity that is outside the direct experience of those who live on the coastal fringes, yet it provides an attractive, even desirable holiday destination for many. Perhaps this is because people, country and city alike, continue to see the country as a place that offers them a simple solution to tensions and conflicts that are otherwise unresolvable. Change produces anxiety -- especially a postmodern change in which all semblances of certainty have been removed. On the other hand, the study suggests that the country worldview represents that which does not change. Its definitive nature stands in contrast and provides an alternative to the relativism of the city. Notions of country represent a surety in a world that is otherwise uncertain. References Aitkin, D. "Countrymindedness: The Spread of an Idea." Australian Cultural History 4 (1985): 34-41. Moore, A. "The Old Guard and 'Countrymindedness' during the Great Depression." Journal of Australian Studies 27 (1990): 54. Mules, W., T. Shirato, and B. Wigman. "Rural Identity within the Symbolic Order: Media Representations of the Drought." Communication and Culture in Rural Areas. Ed. P. Share. Wagga Wagga: Charles Sturt UP, 1995. 242. Papadakis, E., and C. Bean. "Independents and the Minor Parties: The Electoral System." Australian Journal of Political Science 30 (1995): 109. Share, P., and G. Lawrence. "Fear and Loathing in Wagga Wagga: Cultural Representations of the Rural and Possible Policy Implications." Communication and Culture in Rural Areas. Ed. P. Share. Wagga Wagga: Charles Sturt UP, 1995. Turner, G. Making It National. Sydney: Allen & Unwin, 1994. Ward, R. The Australian Legend. Melbourne: Oxford UP, 1958. White, R. Inventing Australia. Sydney: Allen & Unwin, 1981. 16ff. 
Citation reference for this article MLA style: Catherine Richardson. "The Politics of a Country Culture: State of Mind or State of Being?" M/C: A Journal of Media and Culture 3.2 (2000). [your date of access] <http://www.api-network.com/mc/0005/country.php>. Chicago style: Catherine Richardson, "The Politics of a Country Culture: State of Mind or State of Being?," M/C: A Journal of Media and Culture 3, no. 2 (2000), <http://www.api-network.com/mc/0005/country.php> ([your date of access]). APA style: Catherine Richardson. (2000) The politics of a country culture: state of mind or state of being?. M/C: A Journal of Media and Culture 3(2). <http://www.api-network.com/mc/0005/country.php> ([your date of access]).
APA, Harvard, Vancouver, ISO, and other styles
23

Ballard, Su. "Information, Noise and et al." M/C Journal 10, no. 5 (October 1, 2007). http://dx.doi.org/10.5204/mcj.2704.

Full text
Abstract:
The two companions scurry off when they hear a noise at the door. It was only a noise, but it was also a message, a bit of information producing panic: an interruption, a corruption, a rupture of communication. Was the noise really a message? Wasn’t it, rather, static, a parasite? Michel Serres, 1982. Since, ordinarily, channels have a certain amount of noise, and therefore a finite capacity, exact transmission is impossible. Claude Shannon, 1948. Reading Information At their most simplistic, there are two means for shifting information around – analogue and digital. Analogue movement depends on analogy to perform computations; it is continuous and the relationships between numbers are keyed as a continuous ordinal set. The digital set is discrete; moving one finger at a time results in a one-to-one correspondence. Nevertheless, analogue and digital are like the two companions in Serres’ tale. Each suffers the relationship of noise to information as internal rupture and external interference. In their examination of historical constructions of information, Hobart and Schiffman locate the noise of the analogue within its physical materials; they write, “All analogue machines harbour a certain amount of vagueness, known technically as ‘noise’, which describes the disturbing influences of the machine’s physical materials on its calculations” (208). These “certain amounts of vagueness” are essential to Claude Shannon’s articulation of a theory for information transfer that forms the basis for this paper. In transforming the structures and materials through which it travels, information has left its traces in digital art installation. These traces are located in installation’s systems, structures and materials. The usefulness of information theory as a tool to understand these relationships has until recently been overlooked by a tradition of media art history that has grouped artworks according to the properties of the artwork and/or tied them into the histories of representation and perception in art theory. Throughout this essay I use the productive dual positioning of noise and information to address the errors and impurity inherent within the viewing experiences of digital installation. Information and Noise It is not hard to see why the fractured spaces of digital installation are haunted by histories of information science. In his 1948 essay “A Mathematical Theory of Communication” Claude Shannon developed a new model for communications technologies that articulated informational feedback processes. Discussions of information transmission through phone lines were occurring alongside the development of technology capable of computing multiple discrete and variable packets of information: that is, the digital computer. And, like art, information science remains concerned with the material spaces of transmission – whether conceptual, social or critical. In the context of art something is made to be seen, understood, viewed, or presented as a series of relationships that might be established between individuals, groups, environments, and sensations. Understood this way art is an aesthetic relationship between differing material bodies, images, representations, and spaces. It is an event. Shannon was adamant that information must not be confused with meaning. To increase efficiency he insisted that the message be separated from its components; in particular, those aspects that were predictable were not to be considered information (Hansen 79). 
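In standard information-theoretic notation (a gloss on these claims rather than anything in the article itself), the surprisal of an outcome and the entropy of a source are

\[ I(x) = -\log_2 p(x), \qquad H(X) = -\sum_{x} p(x)\,\log_2 p(x), \]

so a perfectly predictable symbol, with p(x) = 1, carries zero bits: this is the precise sense in which the predictable parts of a message do not count as information. Once the channel is noisy, what still gets through on average is the mutual information I(X;Y) = H(X) - H(X|Y), where the conditional entropy H(X|Y) is the “equivocation” taken up below.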
The problem that Shannon had to contend with was noise. Unwanted and disruptive, noise became symbolic of the struggle to control the growth of systems. The more complex the system, the more noise needed to be addressed. Noise is both the material from which information is constructed and the matter which information resists. Weaver (Shannon’s first commentator) writes: In the process of being transmitted, it is unfortunately characteristic that certain things are added to the signal which were not intended by the information source. These unwanted additions may be distortions of sound (in telephony, for example) or static (in radio), or distortions in shape or shading of picture (television), or errors in transmission (telegraphy or facsimile), etc. All of these changes in the transmitted signal are called noise. (4). To enable more efficient message transmission, Shannon designed systems that repressed as much noise as possible, while also acknowledging that without some noise information could not be transmitted. Shannon’s conception of information meant that information would not change if the context changed. This was crucial if a general theory of information transmission was to be plausible and meant that a methodology for noise management could be foregrounded (Pask 123). Without meaning, information became a quantity, a yes-or-no decision that Shannon called a “bit” (1). Shannon’s emphasis on separating signal or message from both predictability and external noise appeared to give information an identity where it could float free of a material substance and be treated independently of context. However, for this to occur information would have to become fixed and understood as an entity. Shannon went to pains to demonstrate that the separation of meaning and information was actually to enable the reverse. A fluidity of information and the possibilities for encoding it would mean that information, although measurable, did not have a finite form. Tied into the paradox of this equation is the crucial role of noise or error. In Shannon’s communication model information is not only complicit with noise; it is totally dependent upon it for understanding. Without noise, either encoded within the original message or present from sources outside the channel, information cannot get through. The model of sender-encoder-channel-signal (message)-decoder-receiver that Shannon constructed has an arrow inserting noise. Visually and schematically this noise is a disruption pointing up and inserting itself in the nice clean lines of the message. This does not mean that noise was a last-minute consideration; rather noise was the very thing Shannon was working with (and against). It is present in every image we have of information. A source, message, transmitter, receiver and their attendant noises are all material infrastructures that serve to contextualise the information they transmit, receive, and disrupt. Figure 1. Claude Shannon “A Mathematical Theory of Communication” 1948. In his analytical discussion of the diagram, Shannon actually locates noise in two crucial places. The first position accorded noise is external, marked by the arrow that demonstrates how noise is introduced to the message channel whilst in transit. External noise confuses the purity of the message whilst equivocally adding new information. External noise has a particular materiality and enters the equation as unexplained variation and random error. 
This is disruptive presence rather than entropic coded pattern. Shannon offers this equivocal definition of noise as everything that is outside the linear model of sender-channel-receiver; hence, anything can be noise if it enters a channel where it is unwelcome. Secondly, noise was defined as unpredictability or entropy found and encoded within the message itself. This, for Shannon, was an essential and, in some ways, positive role. Entropic forces invited continual reorganisation and (when engaging the laws of redundancy) assisted with the removal of repetition, enabling faster message transmission (Shannon 48). Weaver calls this shifting relationship between entropy and message “equivocation” (11). Weaver identified equivocation as central to the manner in which noise and information operated. A process of equivocation identified the receiver’s knowledge. For Shannon, a process of equivocation mediated between useful information and noise, as both were “measured in the same units” (Hayles, Chaos 55). To eliminate noise completely is to sacrifice information. Information understood in this way is also about relationships between differing material bodies, representations, and spaces, connected together for the purposes of transmission. It, like the artwork, is an event. This would appear to suggest a correlation between information transmission and viewing in galleries. Far from it. Although the contemporary information channel is essentially a tube with fixed walls (it is still constrained by physical properties, bandwidth and so on), and despite the implicit spatialisation of information models, I am not proposing a direct correlation between information channels and installation spaces. This is because I am not interested in ‘reading’ the information of either environment. What I am suggesting is that both environments share this material of noise. Noise is present in four places. Firstly, noise is within the media errors of transmission, and secondly, it is within the media of the installation (neither of which are one-way flows). Thirdly, the viewer or listener introduces noise as interference, and lastly, it is present in the very materials through which it travels. Noise layered on noise. Redundancy and Modulation So far in this paper I have discussed the relationship of information to noise. For the remainder, I want to address some particular processes or manifestations of noise in the New Zealand artists’ collective et al.’s maintenance of social solidarity–instance 5 (2006, exhibited as part of the SCAPE Biennial of Art in Public Space, Christchurch Art Gallery). The installation occupies a small alcove that is partially blocked by a military-style portable table stacked with newspapers. Inside the space are three grey wooden chairs, some headphones, and a modified data projection of Google Earth. It is not immediately clear if the viewer is allowed within the spaces of the alcove to listen to the headphones as monotonous voices fill the whole space intoning political, social, and religious platitudes. The headphones might be a tool to block out the noise. In the installation it is as if multiple messages have been sent but their source, channel, and transmitter are unintelligible to the receiver. All that is left is information divorced from meaning. As other works by et al. have demonstrated, social solidarity is not a fundamentalism with directed positions and singular leaders. 
For example, in rapture (2004) noise disrupts all presence as a portable shed quivers in response to underground nuclear explosions 40,000km away. In the fundamental practice (2005) the viewer is left attempting to decode the un-encoded, as again sound and large steel barriers control and determine only certain movements (see http://www.etal.name/ for some documentation of these projects) . maintenance of social solidarity–instance 5 is a development of the fundamental practice. To enter its spaces viewers slip around the table and find themselves extremely close to the projection screen. Despite the provision of copious media the viewer cannot control any aspect of the environment. On screen, and apparently integral to the Google Earth imagery, are five animated and imposing dark grey monolith forms. Because of their connection to the monotonous voices in the headphones, the monoliths seem to map the imposition of narrative, power, and force in various disputed territories. Like their sudden arrival in Kubrick’s 2001: A Space Odyssey (1968) it is the contradiction of the visibility and improbability of the monoliths that renders them believable. On the video landscape the five monoliths apparently house the dispassionate voices of many different media and political authorities. Their presence is both redundant and essential as they modulate the layering of media forces – and in between, error slips in. In a broad discussion of information Gilles Deleuze and Felix Guattari highlight the necessary role of redundancy commenting that: redundancy has two forms, frequency and resonance; the first concerns the significance of information, the second (I=I) concerns the subjectivity of communication. It becomes apparent that information and communication, and even significance and subjectification, are subordinate to redundancy (79). In maintenance of social solidarity–instance 5 patterns of frequency highlight the necessary role of entropy where it is coded into gaps in the vocal transmission. Frequency is a structuring of information tied to meaningful communication. Resonance, like the stack of un-decodable newspapers on the portable table, is the carrier of redundancy. It is in the gaps between the recorded voices that connections between the monoliths and the texts are made, and these two forms of redundancy emerge. As Shannon says, redundancy is a problem of language. This is because redundancy and modulation do not equate with relationship of signal to noise. Signal to noise is a representational relationship; frequency and resonance are not representational but relational. This means that an image that might be “real-time” interrupts our understanding that the real comes first with representation always trailing second (Virilio 65). In maintenance of social solidarity–instance 5 the monoliths occupy a fixed spatial ground, imposed over the shifting navigation of Google Earth (this is not to mistake Google Earth with the ‘real’ earth). Together they form a visual counterpoint to the texts reciting in the viewer’s ears, which themselves might present as real but again, they aren’t. As Shannon contended, information cannot be tied to meaning. Instead, in the race for authority and thus authenticity we find interlopers, noisy digital images that suggest the presence of real-time perception. The spaces of maintenance of social solidarity–instance 5 meld representation and information together through the materiality of noise. 
And across all the different modalities employed, the appearance of noise is not through formation, but through error, accident, or surprise. This is the last step in a movement away from the mimetic obedience of information and its adherence to meaning-making or representational systems. In maintenance of social solidarity–instance 5 we are forced to align real time with virtual spaces and suspend our disbelief in the temporal truths that we see on the screen before us. This brief introduction to the work has returned us to the relationship between analogue and digital materials. Signal to noise is an analogue relationship of presence and absence. No signal equals a break in transmission. On the other hand, a digital system, due to its basis in discrete bits, transmits through probability (that is, the transmission occurs through pattern and randomness, rather than presence and absence (Hayles, How We Became 25). In his use of Shannon’s theory for the study of information transmission, Schwartz comments that the shift in information theory from analogue to digital is a shift from an analogue relationship of signal to noise to one of the probability of error (318). As I have argued in this paper, if it is measured as a quantity, noise is productive; it adds information. In both digital and analogue systems it is predictability and repetition that do not contribute information. Von Neumann makes the distinction clear saying that to some extent the “precision” of the digital machine “is absolute.” Even though, error as a matter of normal operation and not solely … as an accident attributable to some definite breakdown, nevertheless creeps in (294). Error creeps in. In maintenance of social solidarity–instance 5, et al. disrupts signal transmission by layering ambiguities into the installation. Gaps are left for viewers to introduce misreadings of scale, space, and apprehension. Rather than selecting meaning out of information within nontechnical contexts, a viewer finds herself in the same sphere as information. Noise imbricates both information and viewer within a larger open system. When asked about the relationship with the viewer in her work, et al. collaborator p.mule writes: To answer the 1st question, communication is important, clarity of concept. To answer the 2nd question, we are all receivers of information, how we process is individual. To answer the 3rd question, the work is accessible if you receive the information. But the question remains: how do we receive the information? In maintenance of social solidarity–instance 5 the system dominates. Despite the use of sound engineering and sophisticated Google Earth mapping technologies, the work appears to be constructed from discarded technologies both analogue and digital. The ominous hovering monoliths suggest answers: that somewhere within this work are methodologies to confront the materialising forces of digital error. To don the headphones is to invite a position that operates as a filtering of power. The parameters for this power are in a constant state of flux. This means that whilst mapping these forces the work does not locate them. Sound is encountered and constructed. Furthermore, the work does not oppose digital and analogue, for as von Neumann comments “the real importance of the digital procedure lies in its ability to reduce the computational noise level to an extent which is completely unobtainable by any other (analogy) procedure” (295). 
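Schwartz’s shift from a signal-to-noise relationship to a probability of error can be restated in textbook terms (standard formulas, not the article’s): an analogue channel is characterised by a continuous ratio, a digital one by the chance that a discrete bit arrives flipped.

\[ \mathrm{SNR} = \frac{P_{\text{signal}}}{P_{\text{noise}}}, \qquad C_{\text{analogue}} = B \log_2\!\left(1 + \frac{S}{N}\right), \qquad C_{\text{digital}} = 1 - H_2(p_e), \]

where B is the available bandwidth, p_e the bit-error probability of a binary symmetric channel, and H_2(p_e) = -p_e log_2 p_e - (1 - p_e) log_2(1 - p_e). Von Neumann’s claim reads naturally against this: redundancy and coding can push p_e arbitrarily low for any rate below capacity, an option a purely analogue signal path does not offer.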
maintenance of social solidarity–instance 5 shows how digital and analogue come together through the productive errors of modulation and redundancy. et al.’s research constantly turns to representational and meaning making systems. As one instance, maintenance of social solidarity–instance 5 demonstrates how the digital has challenged the logics of the binary in the traditions of information theory. Digital logics are modulated by redundancies and accidents. In maintenance of social solidarity–instance 5 it is not possible to have information without noise. If, as I have argued here, digital installation operates between noise and information, then, in a constant disruption of the legacies of representation, immersion, and interaction, it is possible to open up material languages for the digital. Furthermore, an engagement with noise and error results in a blurring of the structures of information, generating a position from which we can discuss the viewer as immersed within the system – not as receiver or meaning making actant, but as an essential material within the open system of the artwork. References Barr, Jim, and Mary Barr. “L. Budd et al.” Toi Toi Toi: Three Generations of Artists from New Zealand. Ed. Rene Block. Kassel: Museum Fridericianum, 1999. 123. Burke, Gregory, and Natasha Conland, eds. et al. the fundamental practice. Wellington: Creative New Zealand, 2005. Burke, Gregory, and Natasha Conland, eds. Venice Document. et al. the fundamental practice. Wellington: Creative New Zealand, 2006. Daly-Peoples, John. Urban Myths and the et al. Legend. 21 Aug. 2004. The Big Idea (reprint) http://www.thebigidea.co.nz/print.php?sid=2234>. Deleuze, Gilles, and Felix Guattari. A Thousand Plateaus: Capitalism and Schizophrenia. Trans. Brian Massumi. London: The Athlone Press, 1996. Hansen, Mark. New Philosophy for New Media. Cambridge, MA and London: MIT Press, 2004. Hayles, N. Katherine. How We Became Posthuman: Virtual Bodies in Cybernetics, Literature and Informatics. Chicago and London: U of Chicago P, 1999. Hayles, N. Katherine. Chaos Bound: Orderly Disorder in Contemporary Literature and Science. Ithaca and London: Cornell University, 1990. Hobart, Michael, and Zachary Schiffman. Information Ages: Literacy, Numeracy, and the Computer Revolution. Baltimore: Johns Hopkins UP, 1998. p.mule, et al. 2007. 2 Jul. 2007 http://www.etal.name/index.htm>. Pask, Gordon. An Approach to Cybernetics. London: Hutchinson, 1961. Paulson, William. The Noise of Culture: Literary Texts in a World of Information. Ithaca and London: Cornell University, 1988. Schwartz, Mischa. Information Transmission, Modulation, and Noise: A Unified Approach to Communication Systems. 3rd ed. New York: McGraw-Hill, 1980. Serres, Michel. The Parasite. Trans. Lawrence R. Schehr. Baltimore: John Hopkins UP, 1982. Shannon, Claude. A Mathematical Theory of Communication. July, October 1948. Online PDF. 27: 379-423, 623-656 (reprinted with corrections). 13 Jul. 2004 http://cm.bell-labs.com/cm/ms/what/shannonday/paper.html>. Virilio, Paul. The Vision Machine. Trans. Julie Rose. Bloomington and Indianapolis: Indiana UP, British Film Institute, 1994. Von Neumann, John. “The General and Logical Theory of Automata.” Collected Works. Ed. A. H. Taub. Vol. 5. Oxford: Pergamon Press, 1963. Weaver, Warren. “Recent Contributions to the Mathematical Theory of Communication.” The Mathematical Theory of Commnunication. Eds. Claude Shannon and Warren Weaver. paperback, 1963 ed. Urbana and Chicago: U of Illinois P, 1949. 1-16. 
Work Discussed et al. maintenance of social solidarity–instance 5 2006. Installation, Google Earth feed, newspapers, sound. Exhibited in SCAPE 2006 Biennial of Art in Public Space, Christchurch Art Gallery, Christchurch, September 30-November 12. Images reproduced with the permission of et al. Photographs by Lee Cunliffe. Acknowledgments Research for this paper was conducted with the support of an Otago Polytechnic Research Grant. Photographs of et al. maintenance of social solidarity–instance 5 by Lee Cunliffe. Citation reference for this article MLA Style Ballard, Su. "Information, Noise and et al." M/C Journal 10.5 (2007). [your date of access] <http://journal.media-culture.org.au/0710/02-ballard.php>. APA Style Ballard, S. (Oct. 2007) "Information, Noise and et al.," M/C Journal, 10(5). Retrieved [your date of access] from <http://journal.media-culture.org.au/0710/02-ballard.php>.
APA, Harvard, Vancouver, ISO, and other styles
24

Barnet, Belinda. "Machinic Heterogenesis and Evolution." M/C Journal 2, no. 6 (September 1, 1999). http://dx.doi.org/10.5204/mcj.1789.

Full text
Abstract:
"I write for a species that does not yet exist." -- Nietzsche (958) III. Note on Self-Organisation and Selectionism According to your mainstream brand neo-Darwinian biologist, natural selection is the stuff of which evolution is made, the First Principle of life. There is nothing in the natural world which cannot be explained by random mutations within the genome and subsequent selection of the fittest form by the natural environment. Beyond the constraints set by the period of waiting for mutations to occur and external conditions, there are no limits to this system, and an organism forms from scratch to a furry crawling thing in a gradual process reliant on external factors. Neo-Darwinism is an attempt to reconcile two theories which are quite simply at odds with one another: Mendelian genetics, which claims that organisms do not change with time, and Darwinism, which claims that they do. This is usually done in a mathematical way, with natural selection as the linchpin of some tight equations. There can be no internal feedback from the body (phenotype) to the genes (genotype). There is no self-organising adaptive order: all emerges from the process of selection as a carefully articulated tree diagram, and adapts over eons. As the Darwinian critic Arthur Koestler pointed out, natural selection is hence the only process found in nature which is devoid of feedback. Neo-Darwinian theory is both unfalsifiable and all-pervasive; it is easy to forget that it is a theory which has not yet been proven beyond doubt by paleontological fact, and that Darwin himself suggested there may be processes other than natural selection at work in the unfolding of life. There are a couple of rogue biologists and a-life crazies, however, that don't believe the Selectionist hype. They are not suggesting that natural selection is a dud theory, but simply that there might be other factors involved, and that the really interesting questions don't just concern life as a Darwinian competition between furry, crawling things, but the interplay between structure and chaos at the basic levels of the system which might give rise to it. Biologists such as Brian Goodwin and Stuart Kauffman take issue with this, claiming that an understanding of life should begin at a more fundamental level than tree diagrams and zoology -- molecular biology, biochemistry, complexity theory. This is the 'language' of life: the way that structure spontaneously emerges from chaos. Niles Eldredge and Stephen Jay Gould looked at the fossil record a few years back and decided that there is no proof that one species turns into another slowly: the mathematics of the Neo-Darwinists relied upon the idea that species took hundreds of millions of years to evolve eyes and ears and legs and wings, branching off into other species in the manner of a tree diagram over billions of years. What Eldredge and Gould found was that species seem to spontaneously emerge fully formed: there is minimal variation going on. A species emerges rapidly, it lasts for a time (often a short time), and then it dies off. The in-between period, the period of mutation and selectionism, is largely unaccounted for by the fossil record, especially considering the importance of such transitory phases to the neo-Darwinists. There are many 'missing links' in the record. For over twenty years, Stuart Kauffman has been going on about what we might call a Second Principle in evolutionary biology: self-organisation. 
He argues that because natural selection alone is not enough to explain the relatively short timescale on which life arose, some other ordering principle is necessary. He locates this, as Katherine Hayles observes, in the ability of complex systems to self-organise (241). A self-organising system involves the heresy of internal feedback and internally-produced constraints. Living creatures would converge upon certain forms as much as diverge from them due to the influence of mutations caused by cosmic rays, wild chance and external factors. Creatures will not just evolve over billions of years due to selection, but will appear in a more concerted and spontaneous manner. Systems will seek their own order. The heresy in this (as far as neo-Darwinians are concerned, but not all evolutionary biologists) is located in the fact that such enabling constraints emerge from within the system itself. Consequently, natural selection is not the only force at work in evolution. The system is its own material of expression, and can generate its own tendencies and limits. Kauffman calls this process antichaos, or "order for free" (335). One can sense that such a theory would be objectionable to biologists: there is nothing distinctively biological about this explanation, which in fact borrows from physics and complexity theory, and it explores living organisms, chemical compositions and non-biological aggregates alike as systems, privileging no particular machine. A 'complex system', in particular, can be anything from the stockmarket to a flock of birds. XI. Sonicform We might note here a similarity with virtual artist Keith Nettos's Java-based sound system, Sonicform, whose evolving sound structures can be obtained from the artist on request <lucidweave@usa.net>: the divide between living and non-living is not the issue. As Keith puts it, "it's an echo of that Descartian dichotomy between mind and matter. Do such distinctions help us to know ourselves better? I'm not sure that they do". Sonicform is more a world of Newtonian discovery than biblical creation. Self-organisation works on a generative systemic level, and is a prerequisite more so than a defining quality of life or evolution; it is necessary but not sufficient to characterise an organic system. The computer is the perfect environment in which to explore the confusion and commonalities between animate and inanimate systems, and in that confusion, reveal something of the processes underlying the actual generation of self and order in the universe. Information-processing, and life, require a certain type of complexity. The system must be dynamic, yet allow for novel patterns. The computer emphasises the logic as well as the mechanics of life, which are then honed and honoured by the more familiar conception of natural selection. IV. Self-organisation is the natural consequence of simple components (cells, units of sound, air molecules, genes) interacting via equally simple rules. Patterns and forms emerge from the collective raucous, and these forms give rise to other forms. The components in such a system are bimbos: they have no idea what is going on in the greater body, and don't care. In other words, a complex system emerges from lots of small but well-chosen components interacting in a rule-governed way, developing a larger behaviour or pattern which cannot be predicted or divined from these constituent parts. 
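As a minimal, generic illustration (an elementary cellular automaton, chosen here for brevity; it is not Kauffman's Boolean-network model and has nothing to do with Sonicform's code), each cell consults only its two neighbours and one fixed rule, yet the global pattern that unfolds cannot be read off any single component:

import numpy as np

# Elementary cellular automaton, rule 110: each cell updates from a fixed
# lookup table applied to its own state and its two neighbours.
RULE = 110
rule_bits = [(RULE >> i) & 1 for i in range(8)]   # new state for each 3-cell pattern

width, steps = 64, 32
row = np.zeros(width, dtype=int)
row[width // 2] = 1                               # start from a single 'on' component

for _ in range(steps):
    print(''.join('#' if cell else '.' for cell in row))
    left, right = np.roll(row, 1), np.roll(row, -1)
    pattern = (left << 2) | (row << 1) | right    # encode each neighbourhood as 0-7
    row = np.array([rule_bits[p] for p in pattern])

Rule 110 is a convenient choice because its behaviour sits between frozen order and noise; most other rules die out or settle into repetition, and that difference of regime is what the complexity literature cited here is pointing at.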
Random mutation and selection will act upon such a system -- this is how Selectionism fits in: forms will not just evolve from scratch via selection, but will spontaneously emerge from within the system, working in conjunction with the First Principle. IV. Sonicform In the Sonicform system, the components are 'sound fragments', the samples attached to the images in the top left-hand corner of the screen at startup, and also the people seated at terminals who interact with these fragments. Although it might seem to be stretching the concept of systemic components to include the user population, the fact that the emerging pattern is dependent on these users to evolve renders them part of the system. The organisation of a machine has less to do with its materiality than with the inter-relations of its components. The rules in Sonicform are the 'sound controllers' located on the right-hand side of the screen, containing basic instructions such as "play sound", "loop sound" and "stop sound" that control the sound fragments and consequently limit the structure of the emerging acoustic pattern. Because Sonicform is linked via the Net to 'sonicserver' and consequently the multiple versions of itself which are being executed at any point in time, any changes that a user makes (e.g. attraction towards a particular kind of sound) are detected by sonicserver and fed back into a primary chain structure. This is the formative basis of Sonicform's 'evolution': a selection of internal behavioural constraints generated by its constituent parts. The heresy in this is the implication that both biological and technical systems are capable of self-organisation and evolution, that both are constellations of universes which are capable of autonomy and complexity (and 'life' as a certain form of complexity). This is not anti-humanist. It's not even post-humanist. Ideology is a human concept which is brought to bear on technology. We're talking a different register altogether. Technical machines, organic machines, conceptual machines: each will beget the other, each will inscribe its own pattern on the process, each will redefine the limits of such connections. VII. The Death of Metaphor: All That Consists Is Real 'Machinic heterogenesis', a term used by Felix Guattari in his book Chaosmosis, is a mode of being and production that draws on complexity theory and the work of Francisco Varela (a biologist interested in self-organisation in immune networks) and Kauffman. Guattari extends the concept of self-organisation to create a pragmatic philosophy. Machinic heterogenesis is a term to describe the way that the machines which populate the universe connect with each other, mutually affect each other, exchange segments and then bifurcate into new machines. Collective existential mutation. When we sit at a computer screen, we are connected with the computer's universes of reference through the circuits of sight, the play of fingers across the keyboard, the conceptual and logical limits of the exchange laid down by both parties. There is a certain synchrony going on across the zone of intersection and compromise to the limits of this exchange. In other words, the limits of the medium define the exchange and what we are becoming as we connect with it. What is the 'ness' of the computer medium, and what are the possible universes of exchange which extend from this? Sonicform explores this exchange through sound, and through a system which explicitly invites us to be a part of an evolving structure. 
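A hypothetical sketch of that loop follows; the fragment names, weights and selection logic are invented for illustration and are not Sonicform's Java implementation. The point is only that user attention is fed back as a weighting, and the weighting becomes an internal constraint on what the system does next:

import random

# Invented stand-ins for sound fragments, each with a weight that records
# how often users have been drawn to it.
fragments = {"drone": 1.0, "click": 1.0, "voice": 1.0, "hiss": 1.0}
controllers = ["play sound", "loop sound", "stop sound"]   # the simple rules

def step(user_choice=None):
    if user_choice in fragments:
        fragments[user_choice] += 0.5          # interaction fed back as a constraint
    names, weights = zip(*fragments.items())
    sound = random.choices(names, weights=weights)[0]
    return random.choice(controllers), sound

for i in range(10):
    print(step(user_choice="voice" if i % 3 == 0 else None))

The constraint emerges inside the system from its components, human listeners included, rather than being imposed from outside, which is all the sketch is meant to show.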
The use of complexity theory and evolution in Sonicform makes explicit the rethinking of machines which we have been doing here in general: machines speak to machines before they speak to Man, and the ontological domains that they reveal and secrete tend towards pattern in an innate way, determined by the mode of aggregation of their constituent parts. Sonicform rethinks technology in terms of evolutionary, collective entities. And this rethinking allows for the particular qualities of the medium itself, its own characteristics, its own unique interpretations of our model of evolution, to express themselves. Here I might note something: evolution cannot be naturalised and reified as an entity independent of the conceptual, technical and scientific machinery of its production. In the eagerness to import biological models to the computer in a-life, we sometimes forget that from its very origins, the human species has been constituted by technical evolution, and that it is the mediation afforded by technics which makes "it impossible simply to describe evolution in terms of a self-contained, or monadic, subject that passively 'adapts' to an object-like environment" (Pearson 4). Similarly, we have produced our various models of evolution by analysing the 'natural environment' through the mediation of technology. Technology has always enjoyed more than just the position of a neutral tool to locate and test Nature, and has its own unique limits and qualities to contribute to anything we produce with it. So this will be the beginning of our rethinking. Constellations of universes colliding, machines exchanging particularities, components that retain their autonomy and yet can collect and self-organise into complex systems, even life. "The ideas that we have been devoting space to here -- instability, fluctuation, complex systems -- diffuse into the social sciences", in the words of Ilya Prigogine (312). If we can create an evolving complex system on the screen which we ourselves are components of, we tend to rethink the interface between nature and technology. What does it say about the "reference point" of the natural world when creatures whose entire function consists of weird acoustic dances across computer circuitry begin to self-replicate and exhibit the signs of open-ended evolution, resulting in formations which no longer have analogues in the 'natural' world? I'd like to hesitate a start here. Biology is its own material of semiotic expression. Techné is its own material of semiotic expression. To address the interface between nature and technology, we need a philosophy of cells, flocks, patterns, components, motors, and elements. We need a philosophy that will create an interference pattern across the zone of intersection. References Hayles, N. Katherine. How We Became Posthuman: Virtual Bodies in Cybernetics, Literature and Informatics. Chicago: U of Chicago P, 1999. Kauffman, Stuart. "Order For Free." The Third Culture. Ed. Brockman. New York: Touchstone Press. Nietzsche, Friedrich. The Will to Power. Trans. W. Kaufmann and R. J. Hollingdale. New York: Random House, 1968. Prigogine, Ilya, and Isabelle Stengers. Order Out of Chaos: Man's New Dialogue with Nature. London: Bantam Books, 1984. Pearson, Keith Ansell. Viroid Life. London: Routledge, 1997. This work is part of the Australian Network for Art and Technology's Deep Immersion: Creative Collaborations project, funded by the AFC. Citation reference for this article MLA style: Belinda Barnet. 
"Machinic Heterogenesis and Evolution: Collected Notes on Sound, Machines and Sonicform." M/C: A Journal of Media and Culture 2.6 (1999). [your date of access] <http://www.uq.edu.au/mc/9909/sonic.php>. Chicago style: Belinda Barnet, "Machinic Heterogenesis and Evolution: Collected Notes on Sound, Machines and Sonicform," M/C: A Journal of Media and Culture 2, no. 6 (1999), <http://www.uq.edu.au/mc/9906/sonic.php> ([your date of access]). APA style: Belinda Barnet. (1999) Machinic heterogenesis and evolution: collected notes on sound, machines and Sonicform. M/C: A Journal of Media and Culture 2(6). <http://www.uq.edu.au/mc/9909/sonic.php> ([your date of access]).
APA, Harvard, Vancouver, ISO, and other styles
25

Collins, Steve. "Good Copy, Bad Copy." M/C Journal 8, no. 3 (July 1, 2005). http://dx.doi.org/10.5204/mcj.2354.

Full text
Abstract:
Nine Inch Nails have just released a new single; In addition to the usual formats, “The Hand That Feeds” was available for free download in Garageband format. Trent Reznor explained, “For quite some time I’ve been interested in the idea of allowing you the ability to tinker around with my tracks – to create remixes, experiment, embellish or destroy what’s there” (MacMinute 15 April 2005). Reznor invites creativity facilitated by copying and transformation. “Copy” carries connotations of unsavoury notions such as piracy, stealing, fake, and plagiarism. Conversely, in some circumstances copying is acceptable, some situations demand copying. This article examines the treatment of “copy” at the intersection of musical creativity and copyright law with regard to cover versions and sampling. Waldron reminds us that copyright was devised first and foremost with a public benefit in mind (851). This fundamental has been persistently reiterated (H. R Rep. (1909); Sen. Rep. (1909); H. R. Rep. (1988); Patterson & Lindberg 70). The law grants creators a bundle of rights in copyrighted works. Two rights implicated in recorded music are located in the composition and the recording. Many potential uses of copyrighted songs require a license. The Copyright Act 1976, s. 115 provides a compulsory licence for cover versions. In other words, any song can be covered for a statutory royalty fee. The law curtails the extent of the copyright monopoly. Compulsory licensing serves both creative and business sides of the recording industry. First, it ensures creative diversity. Musicians are free to reinterpret cultural soundtracks. Second, it safeguards the composer’s right to generate an income from his work by securing royalties for subsequent usage. Although s. 115 permits a certain degree of artistic licence, it requires “the arrangement shall not change the basic melody or fundamental character of the work”. Notwithstanding this proviso, songs can still be transformed and their meaning reshaped. Johnny Cash was able to provide an insight into the mind of a dying man through covering such songs as Nine Inch Nails’ “Hurt”, Depeche Mode’s “Personal Jesus” and Parker & Charles’ “We’ll Meet Again”. Compulsory licensing was introduced in response to a Supreme Court decision that deprived composers of royalties. Congress recognised: The main object to be desired in expanding copyright protection accorded to music has been to give to the composer an adequate return for the value of his composition, and it has been a serious and difficult task to combine the protection of the composer with the protection of the public, and to so frame an act that it would accomplish the double purpose of securing to the composer and at the same time prevent the formation of oppressive monopolies, which might be founded upon the very rights granted to the composer for the purpose of protecting his interests (H. R. Rep. (1909)). Composers exercise rights over the initial exploitation of a song. Once a recording is released, the right is curtailed to serve the public dimension of copyright. A sampler is a device that allows recorded (sampled) sounds to be triggered from a MIDI keyboard or sequencer. Samplers provide potent tools for transforming sounds – filters, pitch-shifting, time-stretching and effects can warp samples beyond recognition. 
Sampling is a practice that formed the backbone of rap and hip-hop, features heavily in many forms of electronic music, and has proved invaluable in many studio productions (Rose 73-80; Prendergast 383-84, 415-16, 433-34). Samples implicate both of the musical copyrights mentioned earlier. To legally use a sample, the rights in the recording and the underlying composition must be licensed. Ostensibly, acquiring permission to use the composition poses few obstacles due to the compulsory licence. The sound recording, however, is a different matter entirely. There is no compulsory licence for sound recordings. Copyright owners (usually record labels) are free to demand whatever fees they see fit. For example, SST charged Fatboy Slim $1000 for sampling a Negativland record (Negativland). (Ironically, the sample was itself an unlicensed sample appropriated from a 1966 religious recording.) The price paid by The Verve for sampling an obscure orchestral version of a Rolling Stones song was more substantial. Allan Klein owns the copyright in “The Last Time” released by The Andrew Oldham Orchestra in 1965 (American Hit Network, undated). Licence negotiations for the sample left Klein with 100% of the royalties from the song and The Verve with a bitter taste. To add insult to injury, “Bittersweet Symphony” was attributed to Mick Jagger and Keith Richards when the song was nominated for a Grammy (Superswell, undated). Licence fees can prove prohibitive to many musicians and may outweigh the artistic merit in using the sample: “Sony wanted five thousand dollars for the Clash sample, which … is one thousand dollars a word. In retrospect, this was a bargain, given the skyrocketing costs of sampling throughout the 1990s” (McLeod 86). Adam Dorn, alias Mocean Worker, tried for nine months to licence a sample of gospel singer Mahalia Jackson. Eventually his persistent requests were met with a demand for $10,000 in advance with royalties of six cents per record. Dorn was working with an album budget of a mere $40 and was expecting to sell 2500 copies (Beaujon 25). Unregulated licensing fees stifle creativity and create a de facto monopoly over recorded music. Although copyright was designed to be an engine of free expression,1 it still carries characteristics of its monopolistic, totalitarian heritage. The decision in Bridgeport Music v. Dimension Films supported this monopoly. Judge Guy ruled, “Get a license or do not sample. We do not see this stifling creativity in any significant way” (397). The lack of compulsory licensing and the Bridgeport decision creates an untenable situation for sampling musicians and adversely impacts upon the public benefit derived from creative diversity and transformative works (Netanel 288, 331). The sobering potential for lawsuits, ruinous legal costs, injunctions, damages (to copyright owners as well as master recordings), suppresses the creativity of musicians unwilling or unable to pay licence fees (Negativland 251). I’m a big fan of David Bowie. If I wanted to release a cover version of “Survive”, Bowie and Gabrels (composers) and BMI (publishers) could not prevent it. According to the Harry Fox Agency’s online licensing system, it would cost $222.50 (US) for a licence to produce 2500 copies. The compulsory licence demands fidelity to the character of the original. Although my own individual style would be embedded in the cover version, the potential for transformation is limited. 
Whilst trawling through results from a search for “acapella” on the Soulseek network I found an MP3 of the vocal acapella for “Survive”. Thirty minutes later Bowie was loaded into Sonar 4 and accompanied by a drum loop and bass line whilst I jammed along on guitar and tinkered with synths. Free access to music encourages creative diversity and active cultural participation. Licensing fees, however, may prohibit such creative explorations. Sampling technology offers some truly innovative possibilities for transforming recorded sound. The Roland VariOS can pitch-eliminate; a vocal sample can be reproduced to a melody played by the sampling musician. Although the original singer’s voice is preserved the melody and characteristic nuances can be significantly altered: V-Producer’s Phrase Scope [a system software component] separates the melody from the rest of the phrase, allowing users to re-construct a new melody or add harmonies graphically, or by playing in notes from a MIDI keyboard. Using Phrase Scope, you can take an existing vocal phrase or melodic instrument phrase and change the actual notes, phrasing and vocal gender without unwanted artefacts. Bowie’s original vocal could be aligned with an original melody and set to an original composition. The original would be completely transformed into a new creative work. Unfortunately, EMI is the parent company for Virgin Records, the copyright owner of “Survive”. It is doubtful licence fees could be accommodated by many inspired bedroom producers. EMI’s reaction to DJ Dangermouse’s “Grey Album“ suggests that it would not look upon unlicensed sampling with any favour. Threatening letters from lawyers representing one of the “Big Four” are enough to subjugate most small time producers. Fair use? If a musician is unable to afford a licence, it is unlikely he can afford a fair use defence. Musicians planning only a limited run, underground release may be forgiven for assuming that the “Big Four” have better things to do than trawl through bins of White Labels for unlicensed samples. Professional bootlegger Richard X found otherwise when his history of unlicensed sampling caught up to him: “A certain major label won’t let me use any samples I ask them to. We just got a report back from them saying, ‘Due to Richard’s earlier work of which we are well aware, we will not be assisting him with any future projects’” (Petridis). For record labels “copy” equals “money”. Allan Klein did very well out of licensing his newly acquired “Bittersweet Symphony” to Nike (Superswell). Inability to afford either licences or legal costs means that some innovative and novel creations will never leave the bedroom. Sampling masterpieces such as “It Takes a Nation of Millions to Hold Us Back” are no longer cost effective (McLeod). The absence of a compulsory licence for sampling permits a de facto monopoly over recorded music. Tricia Rose notes the recording industry knows the value of “copy” (90). “Copy” is permissible as long as musicians pay for the privilege – if the resultant market for the sampling song is not highly profitable labels may decline to negotiate a licence. Some parties have recognised the value of the desire to creatively engage with music. UK (dis)band(ed) Curve posted component samples of their song “Unreadable Communication” on their website and invited fans to create their own versions of the song. All submissions were listed on the website. 
Although the band reserved copyright, they permitted me to upload my version to my online distribution website for free download. It has been downloaded 113 times and streamed a further 112 times over the last couple of months. The remix project has a reciprocal dimension: Creative engagement strengthens the fan base. Guitarist/programmer, Dean Garcia, states “the main reason for posting the samples is for others to experiment with something they love . . . an opportunity as you say to mess around with something you otherwise would never have access to2”. Umixit is testing the market for remixable songs. Although the company has only five bands on its roster (the most notable being Aerosmith), it will be interesting to observe the development of a market for “neutered sampling” and how long it will be before the majors claim a stake. The would-be descendants of Grand Master Flash and Afrika Bambaataa may find themselves bound by end-user licences and contracts. The notion of “copy” at the nexus of creativity and copyright law is simultaneously a vehicle for free expression and a vulgar infringement on a valuable economic interest. The compulsory licence for cover versions encourages musicians to rework existing music, uncover hidden meaning, challenge the boundaries of genre, and actively participate in culture creation. Lack of affirmative congressional or judicial interference in the current sampling regime places the beneficial aspects of “copy” under an oppressive monopoly founded on copyright, an engine of free expression. References American Hit Network. “Bittersweet Symphony – The Verve.” Undated. 17 April 2005 http://www.americanhitnetwork.com/1990/fsongs.cfm?id=8&view=detail&rank=1>. Beaujon, A. “It’s Not The Beat, It’s the Mocean.’ CMJ New Music Monthly, April 1999. EMI. “EMI and Orange Announce New Music Deal.” Immediate Future: PR & Communications, 6 January 2005. 17 April 2005 http://www.immediatefuture.co.uk/359>. H. R. Rep. No. 2222. 60th Cong., 2nd Sess. 7. 1909. H. R. Rep. No. 609. 100th Cong., 2nd Sess. 23. 1988. MacMinute. “NIN Offers New Single in GarageBand Format.” 15 April 2005. 16 April 2005 http://www.macminute.com/2005/04/15/nin/>. McLeod, K. “How Copyright Law Changed Hip Hop: An Interview with Public Enemy’s Chuck D and Hank Shocklee.” Stay Free 2002, 23 June 2004 http://www.stayfreemagazine.org/archives/20/public_enemy.html>. McLeod, K. Freedom of Expression: Overzealous Copyright Bozos and Other Enemies of Creativity. United States: Doubleday Books, 2005. Negativland. “Discography.” Undated. 18 April 2005 http://www.negativland.com/negdisco.html>. Negativland (ed.). Fair Use: The Story of the Letter U and the Numeral 2. Concord: Seeland, 2005. Netanel, N. W. “Copyright and a Democratic Civil Society.” 106 Yale L. J. 283. 1996. Patterson, L.R., and S. Lindberg. The Nature of Copyright: A Law of Users’ Rights. Georgia: U of Georgia P, 1991. Petridis, A. “Pop Will Eat Itself.” The Guardian (UK) 2003. 22 June 2004 http://www.guardian.co.uk/arts/critic/feature/0,1169,922797,00.html>. Prendergast, M. The Ambient Century: From Mahler to Moby – The Evolution of Sound in the Electronic Age. London: Bloomsbury, 2003. Rose, T. Black Noise: Rap Music and Black Culture in Contemporary America. Middletown: Wesleyan UP, 2004. Sen. Rep. No. 1108, 60th Cong., 2nd Sess. 7. 1909. Superswell. “Horror Stories.” 17 April 2005 http://www.superswell.com/samplelaw/horror.html>. Waldron, J. 
“From Authors to Copiers: Individual Rights and Social Values in Intellectual Property.” 68 Chicago-Kent Law Review 842, 1998. Endnotes 1 Harper & Row, Publishers, Inc. v. Nation Enterprises 471 U.S. 539, 558 (1985). 2 From personal correspondence with Curve dated 16 September 2004. Citation reference for this article MLA Style Collins, Steve. "Good Copy, Bad Copy: Covers, Sampling and Copyright." M/C Journal 8.3 (2005). [your date of access] <http://journal.media-culture.org.au/0507/02-collins.php>. APA Style Collins, S. (Jul. 2005) "Good Copy, Bad Copy: Covers, Sampling and Copyright," M/C Journal, 8(3). Retrieved [your date of access] from <http://journal.media-culture.org.au/0507/02-collins.php>.
APA, Harvard, Vancouver, ISO, and other styles
26

Ward, Christopher Grant. "Stock Images, Filler Content and the Ambiguous Corporate Message." M/C Journal 10, no. 5 (October 1, 2007). http://dx.doi.org/10.5204/mcj.2706.

Full text
Abstract:
A central concern of media studies is to understand the transactions of meaning that are established between the encoders and decoders of media messages: senders and receivers, authors and audiences, producers and consumers. More precisely, this discipline has aimed to describe the semantic disconnects that occur when organisations, governments, businesses, and people communicate and interact across media, and, further, to understand the causes of these miscommunications and to theorise their social and cultural implications. As the media environment becomes complicated by increasingly multimodal messages broadcast to diverse languages and cultures, it is no surprise that misunderstanding seems to occur more (and not less) frequently, forcing difficult questions of society’s ability to refine mass communication into a more streamlined, more effective, and less error-prone system. The communication of meaning to mass audiences has long been theorised (e.g.: Shannon and Weaver; Schramm; Berlo) using the metaphor of a “transmitted message.” While these early researchers varied in their approaches to the study of mass communication, common to their theoretical models is to characterise miscommunication as a dysfunction of the pure transmission process: interference that prevents the otherwise successful relay of meaning from a “sender” to a “receiver.” For example, Schramm’s communication model is based upon two individuals sharing “fields of experience”; error and misunderstanding occur to the extent that these fields do not overlap. For Shannon and Weaver, these disconnects were described explicitly as semantic noise: distortions of meaning that resulted in the message received being different from what was being transmitted. While much of this early research in mass communication continues to be relevant to students of communication and media studies, the transmission metaphor has been called into question for the way it frames miscommunication as a distortion of otherwise clear and stable “meaning,” and not as an inevitable result of the gray area that lies between every sender’s intention and a receiver’s interpretation. It is precisely this problem with the transmission metaphor that Derrida calls into question. For Derrida (as well as for many post-structuralists, linguists, and cultural theorists) what we communicate cannot necessarily be intended or interpreted in any stable fashion. Rather, Derrida describes communication as inherently “iterable … able to break with every given context, and engender infinitely new contexts in an absolutely unsaturable fashion” (“Signature” 320). Derrida is concerned that the transmission metaphor doesn’t account for the fact that all signs (words, images, and so on) can signify a multitude of things to different individuals in different contexts, at different points in time. Further, he reminds us that any perceived signification (and thus, meaning) is produced finally, not by the sender, but by the receiver. Within Derrida’s conception of communication as a perpetually open-ended system, the concept of noise takes on a new shape. Perhaps ambiguous meaning is not the “noise” of an otherwise pure system, but rather, perhaps it is only noise that constitutes all acts of communication. Indeed, while Derrida agrees that the consistency and repetition of language help to limit the effects of iterability, he believes that all meaning is ambiguous and never final. 
Therefore, to communicate is to perpetually negotiate this semantic ambiguity, not to overcome it, constrain it, or push it aside. With these thoughts in mind, when I return to a focus on mass media and media communication, it becomes readily apparent that there do exist sites of cultural production where noise is not only prolific, but where it is also functional—and indeed crucial to a communicator’s goals. Such sites are what Mark Nunes describes as “cultures of noise”: a term I specify in this paper to describe those organised media practices that seem to negotiate, function, and thrive by communicating ambiguously, or at the very least, by resisting the urge to signify explicitly. Cultures of noise are important to the study of media precisely for the ways they call into question our existing paradigms of what it means to communicate. By suggesting that aberrant interpretations of meaning are not dysfunctions of what would be an otherwise efficient system, cultures of noise reveal how certain “asignifying poetics” might be productive and generative for our communication goals. The purpose of this paper is to understand how cultures of noise function by exploring one such case study: the pervasive use of commercial stock images throughout mass media. I will describe how the semantic ambiguity embedded into the construction and sale of stock images is productive both to the stock photography industry and to certain practices of advertising, marketing, and communicating corporate identity. I will begin by discussing the stock image’s dependence upon semantic ambiguity and the productive function this ambiguity serves in supporting the success of the stock photography industry. I will then describe how this ambiguity comes to be employed by corporations and advertisers as “filler content,” enabling these producers to elide the accountability and risk that is involved with more explicit communication. Ambiguous Raw Material: The Stock Industry as a Culture of Noise The photographic image has been a staple of corporate identity for as long as identity has been a concern of corporations. It is estimated that more than 70% of the photographic images used in today’s corporate marketing and advertising have been acquired from a discrete group of stock image firms and photography stock houses (Frosh 5). In fact, since its inception in the 1970’s, increasing global dependence on stock imagery has grown the practice of commercial stock photography into a billion dollar a year industry. Commercial stock images are somewhat peculiar. Unlike other non-fiction genres of stock photography (e.g., editorial and journalistic) commercial stock images present explicitly fictive, constructed scenes. Indeed, many of the images of business workers, doctors, and soccer moms that one finds through a Google Image search are actually actors hired to stage a scene. In this way, commercial stock images share much more in common with the images produced for advertising campaigns, in that they are designed to support branding and corporate identity messages. However, unlike traditional advertising images, which are designed to deliver a certain message for a quite specific application (think ‘Tide stain test’ or ‘posh woman in the Lexus’), commercial stock images have been purposely constructed with no particular application in mind. 
On the contrary, stock images must be designed to anticipate the diverse needs of cultural intermediaries—design firms, advertising agencies, and corporate marketing teams—who will ultimately purchase the majority of these images. (Frosh 57) To achieve these goals, every commercial stock image is designed to be somewhat open-ended, in order to offer up a field of potential meanings, and yet these images also seem to anticipate the applications of use that will likely appeal to the discourses of corporate marketing and advertising. In this way, the commercial stock image might best be understood as undefined raw material, as a set of likely potentialities still lacking a final determination—what Derrida describes as “undecided” meaning: “I want to recall that undecidability is always a determinate oscillation between possibilities (for example, of meaning, but also of acts). These possibilities are highly determined in strictly concerned situations … they are pragmatically determined. The analyses that I have devoted to undecidability concern just these determinations and these definitions, not at all some vague “indeterminacy”. I say “undecidability” rather than “indeterminacy” because I am interested more in relations of force, in everything that allows, precisely, determinations in given situations to be stabilised through a decision … .” (Limited 148) A stock image’s ambiguity is the result of an intentional design process whereby the stock photography industry presents the maximum range of possible meanings, and yet, falls artfully short of “deciding” any of them. Rather, it is the advertisers, designers, and marketers who ultimately make these decisions by finding utility for the image in a certain context. The more customers that can find a use for a certain image, the more this image will be purchased, and the more valuable that particular stock image becomes. It is precisely in this way that the stock photography industry functions as a culture of noise and raises questions of the traditional sender-to-receiver model of communication. Cultures of noise not only embrace semantic ambiguity; they rely upon this ambiguity for their success. Indeed, the success of the stock photography industry quite literally depends upon the aberrant and unpredictable interpretations of buyers. It is now quite explicitly the “receiver,” and not the “sender,” who controls meaning by imbuing the image with meaning for a specific context and specific need. Once purchased, the “potentialities” of meaning within a stock image become somewhat determined by its placement within a certain context of circulation, such as its use for a banking advertisement or healthcare brochure. In many cases, the meaning of a given stock image is also specified by the text with which it is paired. (Fig. 1) Using text to control the meaning of an image is what Roland Barthes describes as anchorage, “the creator’s (and hence society’s) right of inspection over the image; anchorage is a control, bearing a responsibility in the face of the projective power of pictures-for the use of the message” (156). By using text to constrain how an image should be interpreted, the subjects, forms, and composition of a stock image work to complement the textual message in a clear and defined way. Fig. 1: Courtesy of Washington Mutual. Used by permission. Barthes’ textual anchorage: In advertising and marketing, the subject, form and composition of a stock image are made specific by the text with which the image is paired. 
Filler Content: Advertising and Marketing as a Culture of Noise In other marketing and advertising messages, I observe that stock images are used in quite a different way, as filler content: open-ended material that takes the place of more explicit, message-oriented elements. As a culture of noise, filler content opposes the goal of generating a clear and specific message. Rather, the goal of filler content is to present an ambiguous message to consumers. When stock images are used as filler content, they are placed into advertising and marketing messages with virtually the same degree of ambiguity as when the image was originally constructed. Such images receive only vague specificity from textual anchorage, and little effort is made by the message producer to explicitly “decide” a message’s meaning. Consider the image (Fig. 2) used in a certain marketing design. Compared with Figure 1, this design makes little attempt to specify the meaning of the image through text. On the contrary, the image is purposely left open to our individual interpretations. Without textual anchorage, the image is markedly “undecided.” As such, it stages the same ambiguous potential for final consumers as it did for the advertiser who originally purchased the image from the stock image house. Fig. 2: Courtesy of VISA.com. Used by permission. Filler content: What meaning(s) does the image have for you? Love? Happiness? Leisure? Freedom? The Outdoors? Perhaps you rode your bike today? While filler content relies upon audiences to fill in the blanks, it also inserts meaning by leveraging the cultural reinforcement of other, similar images. Consider the way that the image of “a woman with a headset” has come to signify customer service (Fig. 3). The image doesn’t represent this meaning on its own, but it works as part of a larger discourse, what Paul Frosh describes as an “image repertoire” (91). By bombarding us incessantly with a repetition of similar images, the media continues to bolster the iconic value that certain stock images possess. The woman with the headset has become an icon of a “Customer Service Representative” because we are exposed to a repetition of images that repeatedly stage the same or similar scene of this idea. As Frosh suggests, “this is the essence of the concept-based stock image: it constitutes a pre-formed, generically familiar visual symbol that calls forth relevant connotations from the social experience of viewers…” (79). Fig. 3: The image repertoire: All filler content depends upon the iconic status of certain stock photography clichés, categories and familiar scenes. Perhaps you have seen these images before? As a culture of noise, stock images in advertising and marketing function as filler content in two ways: 1) meaning is left undecided by the advertiser who intends for customers to create their own interpretations of an advertisement; 2) meaning is generated by the ideological constructs of an “image repertoire” that is itself promulgated by the stock photography industry. As such, filler content signals a shift in the goals of modern advertising and marketing, where corporate messages are designed to be increasingly ambiguous, and meaning seems to be decided more than ever by the final audience. As marketing psychologists Kim and Kahle suggest: Advertising strategy … may need to be changed. 
Instead of providing the “correct” consumption episodes, marketers could give … an open-ended status, thereby allowing consumers to create the image on their own and to decide the appropriateness of the product for a given need or situation. (63) The potentiality of meanings that was initially embedded into stock images in order to make them more attractive to cultural intermediaries, is also being “passed on” to the final audiences by these same advertisers and marketers. The same noisy signification that supported the sale of stock photos from the stock industry to advertisers now also seems to support the “sale” of messages that advertisers pass on to their audiences. In the same way as the stock photography industry, practices of filler content in advertising also create a culture of noise, by relying upon ambiguous messages that end customers are now forced to both produce and consume. Safe and Vague: The Corporate Imperative Ambiguous communication is not, by itself, egregious. On the contrary, many designers believe that creating a space for thoughtful, open-ended discovery is one of the best ways to provide a meaningful experience to end users. Interaction designer and professor Philip Van Allen describes one such approach to ambiguous design as “productive interaction”: “an open mode of communication where people can form their own outcomes and meanings … sharing insights, dilemmas and questions, and creating new opportunities for synthesis” (56). The critical difference between productive interaction and filler content is one of objective. While media designers embrace open-ended design as a way to create deeper, more meaningful connections with users, modern advertising employs ambiguous design elements, such as stock images, to elide a responsibility to message. Indeed, many marketing and organisational communication researchers (e.g.: Chreim; Elsbach and Kramer; Cheney) suggest that as corporations manage their identities to increasingly disparate and diverse media audiences, misunderstandings are more likely to bring about identity dissonance: that is, “disconnections” between the identity projected by the organisation and the identity attributed to an organisation by its customer-public. To grapple with identity dissonance, Samia Chreim suggests that top managers may choose to engage in the practice of dissonance avoidance: the use of ambiguous messages to provide flexibility in the interpretations of how customers can define a brand or organisation: Dissonance avoidance can be achieved through the use of ambiguous terms … organisations use ambiguity to unite stakeholders under one corporate banner and to stretch the interpretation of how the organisation, or a product or a message can be defined. (76) Corporations forgo the myriad disconnects and pitfalls of mass communication by choosing never to craft an explicit message, hold a position, or express a belief that customers could demur or discount. In such instances, it appears that cultures of noise, such as filler content, may service a shrewd corporate strategy that works to mitigate their responsibility to message. A Responsibility to Message This discussion of “cultures of noise” contributes to media studies by situating semantic noise as productive, and indeed, sometimes vital, to practices of media communication. 
Understanding the role of semantic noise in communication is an important corner for media scholars to turn, especially as today’s message producers rely and thrive upon certain productive aspects of ambiguous communication. However, this discussion also suggests that all media messages must be critically evaluated in quite a different way. While past analyses of media messages have sought to root out the subversive and manipulative factors that resided deep down in our culture, cultures of noise suggest that it is now also important for media studies to consider the deleterious effects of media’s noisy, diluted, and facile surface. Jean Baudrillard was deeply concerned that the images used in media are too often “used for delusion, for the elusion of communication … for absolving face-to-face relations and social responsibilities. They don’t really lead to action, they substitute for it most of the time” (203). Indeed, while the stock image as filler content may solve the problems faced by corporate message producers in a highly ramified media environment, there is an increasing need to question the depth of meaning in our visual culture. What is the purpose of a given image? What is the producer trying to say? Is it relying on end users to find meaning? Are the images relying on an iconic repertoire? Is the producer actually making a statement? And if not, why not? As advertising and marketing continues to shape the visual ground of our culture, Chreim also warns us of our responsibility to message: What is gained in avoiding [identity dissonance] can be lost in the ability to create meaning for stakeholders. Over-reliance on abstract terms may well leave the organisation with a hollow core, one that cannot be appropriated by [customers] in their quest for meaning and identification with the organisation. (88) While cultures of noise may be productive in mitigating the problems of dissonance and miscommunication, and while they may signal new opportunity spaces for design, media, and mass communication, we must also remember that a reliance on ambiguity can sometimes cripple our ability to say anything meaningful at all. References Barthes, Roland. “The Rhetoric of the Image”. The Rhetoric of the Image. Trans. Richard Howard. Berkeley: U of California P, 1977. Baudrillard, Jean. “Aesthetic Illusion and Virtual Reality.” Reading Images. Ed. Julia Thomas. Houndmills: Macmillan, 2000. Berlo, David K. The Process of Communication. New York: Holt, Rinehart and Winston, Inc., 1960. Chreim, Samia. “Reducing Dissonance: Closing the Gap between Projected and Attributed Identity”. Corporate and Organizational Identities: Integrating Strategy, Marketing, Communication and Organizational Perspectives. Eds. B. Moingeon and G. Soenen. Chicago: Routledge, 2002. Derrida, Jacques. “Signature Event Context.” Margins of Philosophy. Trans. Alan Bass. Chicago: U of Chicago P, 1982. Derrida, Jacques. Limited Inc. Evanston, Ill.: Northwestern UP, 1988. Frosh, Paul. The Image Factory. New York: Berg, 2003. Gettyimages.com. Getty Images. 20 Oct. 2007 http://gettyimages.mediaroom.com>. Kahle, Lynn R., and Kim Chung-Hyun, eds. Creating Images and the Psychology of Marketing Communication. New Jersey: Lawrence Erlbaum Associates, 2006. Shannon, Claude F., and Warren Weaver. The Mathematical Theory of Communication. Urbana, Ill.: The University of Illinois Press, 1964. Schramm, Wilbur. “How Communication Works”. The Process and Effects of Mass Communication, ed. Wilbur Schramm. Urbana, Ill.: U of Illinois P, 1961. 
Van Allen, Philip. “Models”. The New Ecology of Things. Pasadena: Media Design Program, Art Center College of Design, 2007. Citation reference for this article MLA Style Ward, Christopher Grant. "Stock Images, Filler Content and the Ambiguous Corporate Message." M/C Journal 10.5 (2007). [your date of access] <http://journal.media-culture.org.au/0711/04-ward.php>. APA Style Ward, C. (Oct. 2007) "Stock Images, Filler Content and the Ambiguous Corporate Message," M/C Journal, 10(5). Retrieved [your date of access] from <http://journal.media-culture.org.au/0711/04-ward.php>.
APA, Harvard, Vancouver, ISO, and other styles
27

Aly, Anne, and Lelia Green. "‘Moderate Islam’." M/C Journal 10, no. 6 (April 1, 2008). http://dx.doi.org/10.5204/mcj.2721.

Full text
Abstract:
On 23 August 2005, John Howard, then Prime Minister, called together Muslim ‘representatives’ from around the nation for a Muslim Summit in response to the London bombings in July of that year. One of the outcomes of the two hour summit was a Statement of Principles committing Muslim communities in Australia to resist radicalisation and pursue a ‘moderate’ Islam. Since then the ill-defined term ‘moderate Muslim’ has been used in both the political and media discourse to refer to a preferred form of Islamic practice that does not challenge the hegemony of the nation state and that is coherent with the principles of secularism. Akbarzadeh and Smith conclude that the terms ‘moderate’ and ‘mainstream’ are used to describe Muslims whom Australians should not fear in contrast to ‘extremists’. Ironically, the policy direction towards regulating the practice of Islam in Australia in favour of a state defined ‘moderate’ Islam signals an attempt by the state to mediate the practice of religion, undermining the ethos of secularism as it is expressed in the Australian Constitution. It also – arguably – impacts upon the citizenship rights of Australian Muslims in so far as citizenship presents not just as a formal set of rights accorded to an individual but also to democratic participation: the ability of citizens to enjoy those rights at a substantive level. Based on the findings of research into how Australian Muslims and members of the broader community are responding to the political and media discourses on terrorism, this article examines the impact of these discourses on how Muslims are practicing citizenship and re-defining an Australian Muslim identity. Free Speech Free speech has been a hallmark of liberal democracies ever since its defence became part of the First Amendment to the United States Constitution. The Australian Constitution does not expressly contain a provision for free speech. The right to free speech in Australia is implied in Australia’s ratification of the United Nations Universal Declaration of Human Rights (UDHR), article 19 of which affirms: Article 19. Everyone has the right to freedom of opinion and expression; this right includes freedom to hold opinions without interference and to seek, receive and impart information and ideas through any media and regardless of frontiers. The ultimate recent endorsement of free speech rights, arguably associated with the radical free speech ‘open platform’ movement of the 1960s at the University of California Berkeley, constructs free speech as essential to human and civil liberties. Its approach has been expressed in terms such as: “I reject and detest XYZ views but will defend to the utmost a person’s right to express them”. An active defence of free speech is based on the observation that, unless held to account, “[Authorities] would grant free speech to those with whom they agree, but not to minorities whom they consider unorthodox or threatening” (“Online Archives of California”). Such minorities, differing from the majority view, do so as a right accorded to citizens. In very challenging circumstances – such as opposing the Cold War operations of the US Senate Anti-American Activities Committee – the free speech movement has been celebrated as holding fast (or embodying a ‘return’) to the true meaning of the American First Amendment. It was in public statements of unpopular and minority views, which opposed those of the majority, that the right to free speech could most non-controvertibly be demonstrated. 
Some have argued that such rights should be balanced by anti-vilification legislation, by prohibitions upon incitement to violence, and by considerations as to whether the organisation defended by the speaker was banned. In the latter case, there can be problems with excluding the defence of banned organisations from legitimate debate. In the 1970s and 1980s, for example, Sinn Fein was denounced in the UK as the ‘political wing of the IRA’ (the IRA being a banned organisation) and denied a speaking position in many forums, yet has proved to be an important party in the eventual reconciliation of the Northern Ireland divide. In effect, the banning of an organisation is a political act and such acts should best be interrogated through free speech and democratic debate. Arguably, such disputation is a responsibility of an involved citizenry. In general, liberal democracies such as Australia do not hesitate to claim that citizens have a right to free speech and that this is a right worth defending. There is a legitimate expectation by Australians of their rights as citizens to freedom of expression. For some Australian Muslims, however, the appeal to free speech seems a hollow one. Muslim citizens run the risk of being constructed as ‘un-Australian’ when they articulate their concerns or opinions. Calls by some Muslim leaders not to reprint the Danish cartoons depicting images of the Prophet Mohammed for example, met with a broader community backlash and drew responses that, typically, constructed Muslims as a threat to Australian cultural values of freedom and liberty. These kinds of responses to expressions by Australian Muslims of their deeply held convictions are rarely, if ever, interpreted as attempts to curtail Australian Muslims’ rights to free speech. There is a poor fit between what many Australian Muslims believe and what they feel the current climate in Australia allows them to say in the public domain. Positioned as the potential ‘enemy within’ in the evolving media and political discourse post September 11, they have been allocated restricted speaking positions on many subjects from the role and training of their Imams to the right to request Sharia courts (which could operate in parallel with Australian courts in the same way that Catholic divorce/annulment courts do). These social and political restrictions lead them to question whether Muslims enjoy citizenship rights on an equal footing with Australians from the broader community. The following comment from an Australian woman, an Iraqi refugee, made in a research interview demonstrates this: The media say that if you are Australian it means that you enjoy freedom, you enjoy the rights of citizenship. That is the idea of what it means to be Australian, that you do those things. But if you are a Muslim, you are not Australian. You are a people who are dangerous, a people who are suspicious, a people who do not want democracy—all the characteristics that make up terrorists. So yes, there is a difference, a big difference. And it is a feeling all Muslims have, not just me, whether you are at school, at work, and especially if you wear the hijab. (Translated from Arabic by Anne Aly) At the same time, Australian Muslims observe some members of the broader community making strong assertions about Muslims (often based on misunderstanding or misinformation) with very little in the way of censure or rebuke. 
For example, again in 2005, Liberal backbenchers Sophie Panopoulos and Bronwyn Bishop made an emotive plea for the banning of headscarves in public schools, drawing explicitly on the historically inherited image of Islam as a violent, backward and oppressive ideology that has no place in Western liberal democracy: I fear a frightening Islamic class emerging, supported by a perverse interpretation of the Koran where disenchantment breeds disengagement, where powerful and subversive orthodoxies are inculcated into passionate and impressionable young Muslims, where the Islamic mosque becomes the breeding ground for violence and rejection of Australian law and ideals, where extremists hijack the Islamic faith with their own prescriptive and unbending version of the Koran and where extremist views are given currency and validity … . Why should one section of the community be stuck in the Dark Ages of compliance cloaked under a veil of some distorted form of religious freedom? (Panopoulos) Several studies attest to the fact that, since the terrorist attacks in the United States in September 2001, Islam, and by association Australian Muslims, have been positioned as other in the political and media discourse (see for example Aly). The construct of Muslims as ‘out of place’ (Saniotis) denies them entry and representation in the public sphere: a key requisite for democratic participation according to Habermas (cited in Haas). This notion of a lack of a context for Muslim citizenship in Australian public spheres arises out of the popular construction of ‘Muslim’ and ‘Australian’ as mutually exclusive modes of being. Denied access to public spaces to partake in democratic dialogue as political citizens, Australian Muslims must pursue alternative communicative spaces. Some respond by limiting their expressions to closed spheres of communication – a kind of enforced silence. Others respond by pursuing alternative media discourses that challenge the dominant stereotypes of Muslims in Western media and reinforce majority-world cultural views. Enforced Silence In closed spheres of discussion, Australian Muslims can openly share their perceptions about terrorism, the government and media. Speaking openly in public however, is not common practice and results in forced silence for fear of reprisal or being branded a terrorist: “if we jump up and go ‘oh how dare you say this, rah, rah’, he’ll be like ‘oh he’s going to go off, he’ll blow something up’”. One research participant recalled that when his work colleagues were discussing the September 11 attacks he decided not to partake in the conversation because it “might be taken against me”. The participant made this decision despite the fact that his colleagues were expressing the opinion that United States foreign policy was the likely cause for the attacks—an opinion with which he agreed. This suggests some support for the theory that the fear of social isolation may make Australian Muslims especially anxious or fearful of expressing opinions about terrorism in public discussions (Noelle-Neumann). However, it also suggests that the fear of social isolation for Muslims is not solely related to the expression of minority opinion, as theorised in Noelle-Neumann’s Spiral of Silence . Given that many members of the wider community shared the theory that the attacks on the Pentagon and the World Trade Centre in 2001 may have been a response to American foreign policy, this may well not be a minority view. Nonetheless, Australian Muslims hesitated to embrace it. 
Saniotis draws attention to the pressure on Australian Muslims to publicly distance themselves from the terrorist attacks of September 11 and to openly denounce the actions of terrorists. The extent to which Muslims were positioned as a threatening other was contingent on their ability to demonstrate that they too participated in the distal responses to the terrorist attacks—initial pity for the sufferer and eventual marginalisation and rejection of the perceived aggressor. Australian Muslims were obliged to declare their loyalty and commitment to Australia’s ally and, in this way, partake in the nationalistic responses to the threat of terrorism. At the same time however, Australian Muslims were positioned as an imagined enemy and a threat to national identity. Australian Muslims were therefore placed in a paradoxical bind- as Australians they were expected to respond as the victims of fear; as Muslims they were positioned as the objects of fear. Even in discussions where their opinions are congruent with the dominant opinion being expressed, Australian Muslims describe themselves as feeling apprehensive or anxious about expressing their opinions because of how these “might be taken”. Pursuing alternative discourses The overriding message from the research project’s Muslim participants was that the media, as a powerful purveyor of public opinion, had inculcated a perception of Muslims as a risk to Australia and Australians: an ‘enemy within’; the potential ‘home grown terrorist’. The daily experience of visibly-different Australian Muslims, however, is that they are more fearing than fear-inspiring. The Aly and Balnaves fear scale indicates that Australian Muslims have twice as many fear indicators as non-Muslims Australians. Disengagement from Western media and media that is seen to be influenced or controlled by the West is widespread among Australian Muslims who increasingly argue that the media institutions are motivated by an agenda that includes profit and the perpetuation of a negative stereotype of Muslims both in Australia and around the globe, particularly in relation to Middle Eastern affairs. The negative stereotypes of Muslims in the Australian media have inculcated a sense of victimhood which Muslims in Australia have used as the basis for a reconstruction of their identity and the creation of alternative narratives of belonging (Aly). Central to the notion of identity among Australian Muslims is a sense of having their citizenship rights curtailed by virtue of their faith: of being included in a general Western dismissal of Muslims’ rights and experiences. As one interviewee said: If you look at the Channel Al Jazeera for example, it’s a channel but they aren’t making up stories, they are taping videos in Iraqi, Palestine and other Muslim countries, and they just show it to people, that’s all they do. And then George Bush, you know, we hear on the news that George Bush was discussing with Tony Blair that he was thinking to bomb Al Jazeera so why would these people have their right to freedom and we don’t? So that’s why I think the people who are in power, they have the control over the media, and it’s a big political game. Because if it wasn’t then George Bush, he’s the symbol of politics, why would he want to bomb Al Jazeera for example? 
Amidst leaks and rumours (Timms) that the 2003 US bombing of Al Jazeera was a deliberate attack upon one of the few elements of the public sphere in which some Western-nationality Muslims have confidence, many elements of the mainstream Western media rose to Al Jazeera’s defence. For example, using an appeal to the right of citizens to engage in and consume free speech, the editors of influential US paper The Nation commented that: If the classified memo detailing President Bush’s alleged proposal to bomb the headquarters of Al Jazeera is provided to The Nation, we will publish the relevant sections. Why is it so vital that this information be made available to the American people? Because if a President who claims to be using the US military to liberate countries in order to spread freedom then conspires to destroy media that fail to echo his sentiments, he does not merely disgrace his office and soil the reputation of his country. He attacks a fundamental principle, freedom of the press—particularly a dissenting and disagreeable press—upon which that country was founded. (cited in Scahill) For other Australian Muslims, it is the fact that some media organisations have been listed as banned by the US that gives them their ultimate credibility. This is the case with Al Manar, for example. Feeling that they are denied access to public spaces to partake in democratic dialogue as equal political citizens, Australian Muslims are pursuing alternative communicative spaces that support and reinforce their own cultural worldviews. The act of engaging with marginalised and alternative communicative spaces constitutes what Clifford terms ‘collective practices of displaced dwelling’. It is through these practices of displaced dwelling that Australian Muslims essentialise their diasporic identity and negotiate new identities based on common perceptions of injustice against Muslims. But you look at Al Jazeera they talk in the same tongue as the Western media in our language. And then you look again at something like Al Manar who talks of their own tongue. They do not use the other media’s ideas. They have been attacked by the Australians, been attacked by the Israelis and they have their own opinion. This statement came from an Australian Muslim of Jordanian background in her late forties. It reflects a growing trend towards engaging with media messages that coincide with and reinforce a sense of injustice. The Al Manar television station to which this participant refers is a Lebanese based station run by the militant Hezbollah movement and accessible to Australians via satellite. Much like Al Jazeera, Al Manar broadcasts images of Iraqi and Palestinian suffering and, in the recent war between Israel and Hezbollah, graphic images of Lebanese casualties of Israeli air strikes. Unlike the Al Jazeera broadcasts, these images are formatted into video clips accompanied by music and lyrics such as “we do not fear America”. Despite political pressure including a decision by the US to list Al Manar as a terrorist organisation in December 2004, just one week after a French ban on the station because its programming had “a militant perspective with anti-Semitic connotations” (Jorisch), Al Manar continued to broadcast videos depicting the US as the “mother of terrorism”. In one particularly graphic sequence, the Statue of Liberty rises from the depths of the sea, wielding a knife in place of the torch and dripping in blood, her face altered to resemble a skull. 
As she rises out of the sea accompanied by music resembling a funeral march, the following words in Arabic are emblazoned across the screen: On the dead bodies of millions of native Americans / And through the enslavement of tens of millions Africans / The US rose / It pried into the affairs of most countries in the world. After an extensive list of countries impacted by US foreign policy including China, Japan, Congo, Vietnam, Peru, Laos, Libya and Guatemala, the video comes to a gruelling halt with the words ‘America owes blood to all of humanity’. Another video juxtaposes images of Bush with Hitler with the caption ‘History repeats itself’. One website run by the Coalition against Media Terrorism refers to Al Manar as ‘the beacon of hatred’ and applauds the decisions by the French and US governments to ban the station. Al Manar defended itself against the bans stating on its website that they are attempts “to terrorise and silence thoughts that are not in line with the US and Israeli policies.” The station claims that it continues on its mission “to carry the message of defending our peoples’ rights, holy places and just causes…within internationally agreed professional laws and standards”. The particular brand of propaganda employed by Al Manar is gaining popularity among some Muslims in Australia largely because it affirms their own views and opinions and offers them opportunities to engage in an alternative public space in which Muslims are positioned as the victims and not the aggressors. Renegotiating an ‘Othered’ Identity The negative portrayal of Muslims as ‘other’ in the Australian media and in political discourse has resulted in Australian Muslims constructing alternative identities based on a common perception of injustice. Particularly since the terrorist attacks on the World Trade Centre in September 2001 and the ensuing “war on terror”, the ethnic divisions within the Muslim diaspora are becoming less significant as Australian Muslims reconstruct their identity based on a notion of supporting each other in the face of a global alliance against Islam. Religious identity is increasingly becoming the identity of choice for Muslims in Australia. This causes problems, however, since religious identity has no place in the liberal democratic model, which espouses secularism. This is particularly the case where that religion is sometimes constructed as being at odds with the principles and values of liberal democracy; namely tolerance and adherence to the rule of law. This problematic creates a context in which Muslim Australians are not only denied their heterogeneity in the media and political discourse but are dealt with through an understanding of Islam that is constructed on the basis of a cultural and ideological clash between Islam and the West. Religion has become the sole and only characteristic by which Muslims are recognised, denying them political citizenship and access to the public spaces of citizenship. Such ‘essentialising practices’ as eliding considerable diversity into a single descriptor serves to reinforce and consolidate diasporic identity among Muslims in Australia, but does little to promote and assist participatory citizenship or to equip Muslims with the tools necessary to access the public sphere as political citizens of the secular state. In such circumstances, the moderate Muslim may be not so much a ‘preferred’ citizen as one whose rights have been constrained. 
Acknowledgment This paper is based on the findings of an Australian Research Council Discovery Project, 2005-7, involving 10 focus groups and 60 in-depth interviews. The authors wish to acknowledge the participation and contributions of WA community members. References Akbarzadeh, Shahram, and Bianca Smith. The Representation of Islam and Muslims in the Media (The Age and Herald Sun Newspapers). Melbourne: Monash University, 2005. Aly, Anne, and Mark Balnaves. “‘They Want Us to Be Afraid’: Developing Metrics of the Fear of Terrorism.” International Journal of Diversity in Organisations, Communities and Nations 6 (2007): 113-122. Aly, Anne. “Australian Muslim Responses to the Discourse on Terrorism in the Australian Popular Media.” Australian Journal of Social Issues 42.1 (2007): 27-40. Clifford, James. Routes: Travel and Translation in the Late Twentieth Century. London: Harvard UP, 1997. Haas, Tanni. “The Public Sphere as a Sphere of Publics: Rethinking Habermas’s Theory of the Public Sphere.” Journal of Communication 54.1 (2004): 178-84. Jorisch, Avi J. “Al-Manar and the War in Iraq.” Middle East Intelligence Bulletin 5.2 (2003). Noelle-Neumann, Elisabeth. “The Spiral of Silence: A Theory of Public Opinion.” Journal of Communication 24.2 (1974): 43-52. “Online Archives of California”. California Digital Library. n.d. Feb. 2008 <http://content.cdlib.org/ark:/13030/kt1199n498/?&query=%22open%20platform%22&brand=oac&hit.rank=1>. Panopoulos, Sophie. Parliamentary debate, 5 Sep. 2005. Feb. 2008 <http://www.aph.gov.au.hansard>. Saniotis, Arthur. “Embodying Ambivalence: Muslim Australians as ‘Other’.” Journal of Australian Studies 82 (2004): 49-58. Scahill, Jeremy. “The War on Al-Jazeera (Comment)”. 2005. The Nation. Feb. 2008 <http://www.thenation.com/doc/20051219/scahill>. Timms, Dominic. “Al-Jazeera Seeks Answers over Bombing Memo”. 2005. Media Guardian. Feb. 2008 <http://www.guardian.co.uk/media/2005/nov/23/iraq.iraqandthemedia>. Citation reference for this article MLA Style Aly, Anne, and Lelia Green. "‘Moderate Islam’: Defining the Good Citizen." M/C Journal 10.6/11.1 (2008). <http://journal.media-culture.org.au/0804/08-aly-green.php>. APA Style Aly, A., and L. Green. (Apr. 2008) "‘Moderate Islam’: Defining the Good Citizen," M/C Journal, 10(6)/11(1). Retrieved from <http://journal.media-culture.org.au/0804/08-aly-green.php>.
APA, Harvard, Vancouver, ISO, and other styles
28

Fuller, Glen. "Punch-Drunk Love." M/C Journal 10, no. 3 (June 1, 2007). http://dx.doi.org/10.5204/mcj.2660.

Full text
Abstract:
For once I want to be the car crash, Not always just the traffic jam. Hit me hard enough to wake me, And lead me wild to your dark roads. (Snow Patrol: “Headlights on Dark Roads”, Eyes Open, 2006) I didn’t know about the online dating site rsvp.com.au until a woman who I was dating at the time showed me her online profile. Apparently ‘everyone does rsvp’. Well, ‘everyone’ except me. (Before things ended I never did ask her why she listed herself as ‘single’ on her profile…) Forming relationships in our era of post-institutional modes of sociality is problematic. Some probably find such ‘romantically’ orientated ‘meet up’ sites to be a more efficient option for sampling what is available. Perhaps others want some loving on the side. In some ways these sites transform romance into the online equivalent of the logistics dock at your local shopping centre. ‘Just-in-time’ relationships rely less on social support structures of traditional institutions such as the family, workplace, and so on, including ‘love’ itself, and more on a hit and miss style of dating, organised like a series of car crashes and perhaps even commodified through an eBay-style online catalogue (see Crawford 83-88). Instead of image-commodities there are image-people and the spectacle of post-romance romance as a debauched demolition derby. Is romance still possible if it is no longer the naïve and fatalistic realisation of complementary souls? I watched Paul Thomas Anderson’s third film Punch-Drunk Love with the above rsvp.com.au woman. She interpreted it in a completely different manner to me. I shall argue (as I did with her) that the film captures some sense of romance in a post-romance world. The film was billed as a comedy/romance or comedy/drama, but I did not laugh either with or at the film. The story covers the trials of two people ‘falling in love’. Lena Leonard (Emily Watson) orchestrates an encounter with Barry Egan (Adam Sandler) after seeing a picture of him with his seven sisters. The trajectory of the romance is defined less by the meeting of two people than by the violence of contingency and of the world arrayed by the event of love. Contingency is central to complexity theory. Contingency is not pure chance; rather, it exists as part of the processual material time of the event that defines events or a series of events as problematic (Deleuze, The Logic of Sense 52-53). To problematise events and recognise the contingencies they inculcate is to refuse the tendency to colonise the future through actuarial practices, such as ‘risk management’ and insurance or the probabilistic ‘Perfect Match’ success of internet dating sites (mirroring ‘Dexter’ from the 1980s dating television game show). Therefore, through Punch-Drunk Love I shall problematise the event of love so as to resuscitate the contingencies of post-romance romance. It is not surprising that Punch-Drunk Love opens with a car crash, for the film takes romance on a veritable post-Crash detour. Crash – novel and film – serves as an exploration of surfaces and desire in a world at the intersection of the accident. Jean Baudrillard, in his infamous essay on Crash (novel), dwells on the repositioning of the accident: [It] is no longer at the margin, it is at the heart. It is no longer the exception to a triumphal rationality, it has become the Rule, it has devoured the Rule. … Everything is reversed. It is the Accident that gives form to life, it is the Accident, the insane, that is the sex of life. 
(113) After the SUV rolls over in Punch-Drunk Love’s opening scene, a taxi van pauses long enough for an occupant to drop off a harmonium. A harmonium is a cross between an organ and a piano, but much smaller than both. It is a harmony machine. It breathes and wheezes to gather potentially consonant sound waves of heterogeneous frequencies to produce a unique musicality of multiplicative resonance. No reason is given for the harmonium in the workings of the film’s plot. Another accident without any explanation, like the SUV crash, but this time it is an accidental harmony-machine. The SUV accident is a disorganising eruption of excess force, while the accidental harmony-machine is a synthesising organisation of force. One produces abolition, while the other produces a multiplicative affirmation. These are two tendencies that follow two different relations to the heterogeneous materialism of contingency. Punch-Drunk Love captures the contingency at the heart of post-romance romance. Instead of the layers of expectation habituated into institutional engagements of two subjects meeting, there is the accident of the event of love within which various parties are arrayed with various affects and desires. I shall follow Alain Badiou’s definition of the event of love, but only to the point where I shall shift the perspective from love to romance. Badiou defines love by initially offering a series of negative definitions. Firstly, love is not a fusional concept, the ‘two’ that is ‘one’. That is because, as Badiou writes, “an ecstatic One can only be supposed beyond the Two as a suppression of the multiple” (“What Is Love?” 38). Secondly, nor is love the “prostration of the Same on the altar of the Other.” Badiou argues that it is not an experience of the Other, but an “experience of the world [i.e. multiple], or of the situation, under the post-evental condition that there were Two” (“What Is Love?” 39). Lastly, there is the rejection of the ‘superstructural’ or illusory conception of love, that is, its reduction to the base of desire and sexual jealousy (Badiou, “What Is Love?” 39). For Badiou love is the production of truth. The truth is that the Two, and not only the One, are at work in the situation. However, from the perspective of romance, there is no post-evental truth procedure for love as such. In Deleuze’s terminology, from the perspective of post-romance the Two serves an important role as the ‘quasi-cause’ of love (The Logic of Sense 33), or for Badiou it is the “noumenal possibility [virtualité]” (“What Is Love?” 51). The event of the Two, and, therefore, of love, is immanent to itself. However, this does not capture the romantic functioning of love swept up in the quasi-cause of the Two. Romance is the differential repetition of the event of love to-come and thus the repetition of the intrinsic irreducible wonder at the heart of the event. The wonder at love’s heart is the excess of potentiality, the excitement, the multiplicity, the stultifying surprise. To resuscitate the functioning of love is to disagree with Badiou’s axiom that there is an absolute disjunction between the (nominalist) Two. The Two do actually share a common dimension and that is the radical contingency at the heart of love. Love is not a teleological destiny of the eternal quasi-cause, but the fantastic impossibility of its contingent evental site. 
From Badiou’s line of argument, romance is precisely the passage of this “aleatory enquiry” (“What is Love?” 45), of “the world from the point of view of the Two, and not an enquiry of each term of the Two about the other” (49). Romance is the insinuation of desire into this dynamic of enquiry. Therefore, the functioning of romance is to produce a virtual architecture of wonder hewn from seeming impossibility of contingency. It is not the contingency in itself that is impossible (the ‘chaosmos’ is a manifold of wonderless-contingency), but that contingency might be repeated as part of a material practice that produces love as an effect of differentiating wonder. Or, again, not that the encounter of love has happened, but that precisely it might happen again and again. Romance is the material and embodied practice of producing wonder. The materiality of romance needs to be properly outlined and to do this I turn to another of Badiou’s texts and the film itself. To explicate the materialism of romance is to begin outlining the problematic of romance where the material force of Lena and Barry’s harmony resonates in the virtuosic co-production of new potentialities. The practice of romance is evidenced in the scene where Lena and Barry are in Hawaii and Lena is speaking to Barry’s sister while Barry is watching her. A sense of wonder is produced not in the other person but of the world as multiplicity produced free from the burden of Barry’s sister, hence altering the material conditions of the differential repetition of contingency. The materialism in effect here is, to borrow from Michel Foucault, an ‘incorporeal materialism’ (169), and pertains to the virtual evental dimension of love. In his Handbook of Inaesthetics, Badiou sets up dance and theatre as metaphors for thought. “The essence of dance,” writes Badiou, “is virtual, rather than actual movement” (Handbook of Inaesthetics 61), while theatre is an “assemblage” (72) which in part is “the circulation of desire between the sexes” (71). If romance is the deliberate care for the event of love and its (im)possible contingency, then the dance of love requires the theatre of romance. To include music with dance is to malign Badiou’s conception of dance by polluting it with some elements of what he calls ‘theatre’. To return to the Hawaii scene, Barry is arrayed as an example of what Badiou calls the ‘public’ of theatre because he is watching Lena lie to his sister about his whereabouts, and therefore completes the ‘idea’ of theatre-romance as a constituent element (Badiou, Handbook of Inaesthetics 74). There is an incorporeal (virtual) movement here of pure love in the theatre of romance that repotentialises the conditions of the event of love by producing a repeated and yet different contingency of the world. Wonder triggered by a lie manifest of a truth to-come. According to Badiou, the history of dance is “governed by the perpetual renewal of the relation between vertigo and exactitude. What will remain virtual, what will be actualized, and precisely how is the restraint going to free the infinite?” (Handbook of Inaesthetics 70). Importantly, Badiou suggests that theatrical production “is often the reasoned trial of chances” (Handbook of Inaesthetics 74). Another way to think the materiality of romance is as the event of love, but without Badiou’s necessary declaration of love (“What Is Love?” 45). Even though the ‘truth’ of the Two acts as quasi-cause, love as such remains a pure (‘incorporeal’) Virtuality. 
As a process, there is no “absolute disappearance or eclipse” that belongs to the love-encounter (“What Is Love?” 45), thus instead producing a rhythmic or, better, melodic heterogeneous tension between the love-dance and romance-theatre. The rhythm-melody of the virtual-actual cascade is distributed around aleatory contingencies as the event of love is differentially repeated and is therefore continually repotentialised and exhausted at the same time. A careful or graceful balance needs to be found between potentiality and exhaustion. The film contains many examples of this (re)potentialising tension, including when Lena achieves the wonder of the ‘encounter’ by orchestrating a meeting. Similarly, Barry feigns a ‘business trip’ to Hawaii to meet up with Lena. This is followed by the increased urgency of Barry’s manipulation of the frequent flyer miles reward to meet up with Lena. The tension is affective – both anxious and exciting – and belongs to the lived duration of contingency. In the same way as an actual material dance floor (or ‘theatre’ here) is repeated across multiple incorporeal dimensions of music’s virtuality through the repotentialisation of the dancer’s body, the multiple dimensions of love are repeated across the virtuality of the lovers’ actions through the repotentialisation of the conditions of the event of love. Punch-Drunk Love frames this problematic of romance by way of a second movement that follows the trajectory of the main character Barry. Barry is a depressive with an affect regulation problem. He flies into a rage whenever a childhood incident is mentioned and becomes anxious or ‘scared’ (as one sister described him) when in proximity to Lena. He tries to escape from the oppressive intimacy of his family. He plays with ‘identity’ in a childlike manner by dressing up as a businessman and wearing the blue suit. His small business is organised around selling plungers used to unblock toilets to produce flow. Indeed, Barry is defined by the blockages and flows of desire. His seven-sister over-Oedipalised familial unit continually operates as an apparatus of capture, a phone-sex pervert scam seeks to overcode desire in a libidinal economy that becomes exploited in circuits of axiomatised shame (like an online dating site?), and a consumer rewards program offers the dream of a frequent-flyer million-miles (line of) flight out of it all. ‘Oedipal’ in the expanded sense Deleuze and Guattari give the term as a “displaced or internalised limit where desire lets itself be caught. The Oedipal triangle is the personal and private territoriality that corresponds to all of capitalism’s efforts at social reterritorialisation” (266). Barry says he wants to ‘diversify’ his business, which is not the same thing as ‘expanding’ or developing an already established commercial interest. He does not have a clear idea of what domain or type of business he wants to enter into when diversifying. When he speaks to business contacts or service personnel on the phone he attempts to connect with them on a level of intimacy that is uncomfortably inappropriate for impersonal phone conversations. The inappropriate intimacy comes back to haunt him, of course, when a low-level crook attempts to extort money from him after Barry calls a phone sex line. The romance between Lena and Barry develops through a series of accident-contingencies that to a certain extent ‘unblocks’ Barry and allows him to connect with Lena (who also changes). 
Apparent contingencies that are not actually contingencies need to be explained as such (‘dropping car off’, ‘beat up bathrooms’, ‘no actual business in Hawaii’, ‘phone sex line’, etc.). Upon their first proper conversation a forklift in Barry’s business crashes into boxes. Barry calls the phone sex line randomly and this leads to the severe car crash towards the end of the film. The interference of Barry’s sisters occurs in an apparently random unexpected manner – either directly or indirectly through the retelling of the ‘gayboy’ story. Lastly, there is the climactic meeting in Hawaii where the two soon-to-be-lovers are framed by silhouette; their bodies meet not in an embrace but a collision. They emerge as if emitted from the throngs of the passing crowd. Barry has his hand extended as if they were going to shake and there is an audible grunt when their bodies collide in an embrace. To love is to endure the violence of a creative temporality, such as the production of harmony from heterogeneity. As Badiou argues, love cannot be a fusional relation between the two to make the one, nor can it be the relation of the Same to the Other; this is because the differential repetition of the conditions of love through the material practice of romance already effaces such distinctions. This is the crux of the matter: The maximum violence in the plot of Punch-Drunk Love is not borne by Lena, even though she ends up in hospital, but by Barry. (Is this merely a masculinist reading of traditional male on male violence? Maybe, and perhaps why the rsvp.com.au woman read it differently to me.) What I am trying to get at is how the positive or creative violence of the two movements within the plot – of the romance and of Barry’s depressive social incompetence – intersects in such a way as to force Barry to renew himself as himself. Barry’s explosive fury belongs to the paradox of trying to ‘mind his own business’ while at the same time ‘diversifying’. The moments of violence directed against the world and the ‘glass enclosures’ of his subjectivity are transversal actualisations of the violence of love (on the function of ‘glass’ in the film see King). (This raises the question, perhaps irrelevant, regarding the scale of Badiou’s conception of truth-events. After Foucault and Deleuze, why isn’t ‘life’ itself a ‘truth’ event (for Badiou’s position see Briefings on Existence 66-68)? For example, are not the singularities of Barry’s life also the singularities of the event of love? Is the post-evental ‘decision’ supposed to always axiomatically subtract the singular truth-supplement from the stream of singularities of life? Why…?) The violence of love is given literal expression in the film in the ‘pillow talk’ dialogue between Barry and Lena: Barry: I’m sorry, I forgot to shave. Lena: Your face is so adorable. Your skin and your cheek… I want to bite it. I want to bite on your cheek and chew on it, you’re so fucking cute. Barry: I’m looking at your face and I just wanna smash it. I just wanna fucking smash it with a sledgehammer and squeeze you, you’re so pretty… Lena: I wanna chew your face off and scoop out your eyes. I wanna eat them and chew them and suck on them… Barry: [nodding] Ok…yes, that’s funny… Lena: Yeah… Barry: [still nodding] This’s nice. What dismayed or perhaps intrigued Baudrillard about Crash was its mixing of bodies and technologies in a kind of violent eroticism where “everything becomes a hole to offer itself to the discharge reflex” (112). 
On the surface this exchange between Barry and Lena is apparently an example of such violent eroticism. For Baudrillard the accident is a product of the violence of technology in the logistics of bodies and signs which intervene in relations in such a way as to render perversity impossible (as a threshold structuration of the Symbolic) because ‘everything’ becomes perverse. However, Punch-Drunk Love’s writer and director, Paul Thomas Anderson, produces a sense of the wondrous (‘Punch-Drunk’) violence that is at the heart of love. This is not because of the actual violence of individual characters; in the film this only serves as a canvas of action to illustrate the intrinsic violence of contingency. Lena and Barry’s ‘pillow talk’ is not so much a dance as a case of the necessary theatre capturing the violence and restraint of love’s virtual dance. ‘Violence’ (in the sense it is used above) also describes the harmonic marshalling of the heterogeneous materiality of sound affected by the harmonium. The ‘violence’ of the harmonium is decisively expressed through the coalescence of the diegetic and nondiegetic soundtracks at the end of the film when Barry plays the harmonium concurrently with Jon Brion’s score for the film. As King notes, the “diegetic and nondiegetic music playing together is a moment of cinematic harmony; Barry, Lena, and the harmonium are now in sync” (par. 19). The notes of music connect different diegetic and nondiegetic series which pivot around new possibilities. As Deleuze writes about the notes played at a concert, they are “pure Virtualities that are actualized in the origins [of playing], but also pure Possibilities that are attained in vibrations or flux [of sound]” (The Fold 91). Following Deleuze further (The Fold 146-157), the horizontal melodic movement of romance forms a diagonal or transversal line with the differentially repeated ‘harmonic’ higher unity of love. The unity is literally ‘higher’ to the extent it escapes the diegetic confines of the film itself. For Deleuze “harmonic unity is not that of infinity, but that which allows the existent to be thought of as deriving from infinity” (The Fold 147, ital. added). While Barry is playing the harmonium in this scene Lena announces, “So here we go.” These are the final words of the film. In Badiou’s philosophy this is a declaration of the truth of love. Like the ‘higher’ non/diegetic harmony of the harmonium, the truth of love “composes, compounds itself to infinity. It is thus never presented integrally. All knowledge [of romance] relative to this truth [of the Two, as quasi-cause] thus disposes itself as an anticipation” (“What is Love?” 49). Romance is therefore lived as a vertiginous state of anticipation of love’s harmony. The materiality of romance does not simply consist of two people coming together and falling in love. The ‘fall’ functions as a fatalistic myth used to inscribe bodies within the eschatological libidinal economies of ‘romantic comedies’. To anneal Baudrillard’s lament, perversity obviously still has a positive Symbolic function on the internet, especially on online dating sites where anticipation can be modulated through the probabilistic manipulation of signs. In post-romance, the ‘encounter’ of love necessarily remains, but it is the contingency of this encounter that matters. The main characters in Punch-Drunk Love are continually arrayed through the contingencies of love. 
I have linked this to Badiou’s notion of the event of love, but have focused on what I have called the materiality of romance. The materiality of romance requires more than a ‘fall’ induced by a probabilistic encounter, and yet it is not the declaration of a truth. The post-evental truth procedure of love is impossible in post-romance romance because there is no ‘after’ or ‘supplement’ to an event of love; there is only the continual rhythm of romance and anticipation of the impossible. It is not a coincidence that the Snow Patrol lyrics that serve above as an epigraph resonate with Deleuze’s comment that a change in the situation of Leibnizian monads has occurred “between the former model, the closed chapel with imperceptible openings… [to] the new model invoked by Tony Smith [of] the sealed car speeding down the dark highway” (The Fold 157). Post-Crash post-romance romance unfolds like the driving-monad in an aleatory pursuit of accidents. That is, to care for the event of love is not to announce the truth of the Two, but to pursue the differential repetition of the conditions of love’s (im)possible contingency. This exquisite and beautiful care is required for the contingency of love to be maintained. Hence, the post-romance problematic of romance is thus posited as the material practice of repeating the wonder at the heart of love. References Badiou, Alain. Briefings on Existence: A Short Treatise on Transitory Ontology. Trans. Norman Madarasz. Albany, New York: State U of New York P, 2006. ———. Handbook of Inaesthetics. Trans. Alberto Toscano. Stanford, Calif.: Stanford UP, 2005. ———. “What Is Love?” Umbr(a) 1 (1996): 37-53. Baudrillard, Jean. Simulacra and Simulation. Ann Arbor: U of Michigan P, 1994. Crawford, Kate. Adult Themes: Rewriting the Rules of Adulthood. Sydney: Macmillan, 2006. Deleuze, Gilles. The Fold: Leibniz and the Baroque. Minneapolis: U of Minnesota P, 1993. ———. The Logic of Sense. Trans. Mark Lester and Charles Stivale. European Perspectives. Ed. Constantin V. Boundas. New York: Columbia UP, 1990. Deleuze, Gilles, and Félix Guattari. Anti-Oedipus: Capitalism and Schizophrenia. Minneapolis: U of Minnesota P, 1983. Foucault, Michel. “Theatrum Philosophicum.” Language, Counter-Memory, Practice: Selected Essays and Interviews. Ed. D. F. Bouchard. New York: Cornell UP, 1977. 165-96. King, Cubie. “Punch Drunk Love: The Budding of an Auteur.” Senses of Cinema 35 (2005). Citation reference for this article MLA Style Fuller, Glen. "Punch-Drunk Love: A Post-Romance Romance." M/C Journal 10.3 (2007). <http://journal.media-culture.org.au/0706/03-fuller.php>. APA Style Fuller, G. (Jun. 2007) "Punch-Drunk Love: A Post-Romance Romance," M/C Journal, 10(3). Retrieved from <http://journal.media-culture.org.au/0706/03-fuller.php>.
APA, Harvard, Vancouver, ISO, and other styles
29

Admin, Admin, and Dr Mustafa Arslan. "Effect of dexmedetomidine on ischemia-reperfusion injury of liver and kidney tissues in experimental diabetes and hepatic ischemia-reperfusion injury induced rats." Anaesthesia, Pain & Intensive Care, May 9, 2019, 143–49. http://dx.doi.org/10.35975/apic.v0i0.641.

Full text
Abstract:
Background: Reperfusion following ischemia can lead to more injury than ischemia itself, especially in diabetic patients. The aim of this study was to evaluate the effect of dexmedetomidine on ischemia-reperfusion injury (IRI) in rats with hepatic IRI and diabetes mellitus. Methodology: Twenty-eight Wistar Albino rats were randomised into four groups as control (C), diabetic (DC), diabetic with hepatic ischemia-reperfusion injury (DIR), and diabetic but administered dexmedetomidine followed by hepatic IRI (DIRD) groups. Hepatic tissue samples were evaluated histopathologically by semiquantitative methods. Malondialdehyde (MDA), superoxide dismutase (SOD), glutathione S-transferase (GST), and catalase (CAT) enzyme levels were investigated in liver and kidney tissues as oxidative state parameters. Results: In Group DIR, hepatocyte degeneration, sinusoidal dilatation, pycnotic nuclei, and necrotic cells were found to be more frequent in rat hepatic tissue, while mononuclear cell infiltration was higher in the parenchyma. MDA levels were significantly lower, but SOD levels were significantly higher, in Group DIRD compared to Group DIR. In the hepatic and renal tissues of the IRI-induced diabetic rats, MDA levels, indicating oxidative injury, were found to be lower. SOD levels, indicating early antioxidant activity, were higher. Conclusion: The enzymatic findings of our study together with the hepatic histopathology indicate that dexmedetomidine has a potential role to decrease IRI. Key words: Hepatic ischemia reperfusion injury; Diabetes mellitus; Dexmedetomidine; Rat; MDA; SOD Citation: Sezen SC, Işık B, Bilge M, Arslan M, Çomu FM, Öztürk L, Kesimci E, Kavutçu M. Effect of dexmedetomidine on ischemia-reperfusion injury of liver and kidney tissues in experimental diabetes and hepatic ischemia-reperfusion injury induced rats. Anaesth Pain & Intensive Care 2016;20(2):143-149 Received: 21 November 2015; Reviewed: 10, 24 December 2015, 9, 10 June 2016; Corrected: 12 December 2015; Accepted: 10 June 2016 INTRODUCTION Perioperative acute tissue injury induced by ischemia-reperfusion is a common clinical event caused by the blood supply to the tissue being compromised during major surgery. Ischemia leads to cellular injury by depleting cellular energy deposits and resulting in accumulation of toxic metabolites. The reperfusion of tissues that have remained in ischemic conditions causes even more damage.1 Furthermore, hepatic ischemia-reperfusion injury (IRI) demonstrates a strong relationship with peri-operative acute kidney injury.2 The etiology of diabetic complications is strongly associated with increased oxidative stress (OS). 
Diabetic patients are known to have a high risk of developing OS or IRI, which results in tissue failure.3 The most important role in ischemia and reperfusion is played by free oxygen radicals.1 In diabetes, characterized by hyperglycemia, even more free oxygen radicals are produced due to oxidation of glucose and glycosylation of proteins.3 The structures which are most sensitive to free oxygen radicals in the cells are membrane lipids, proteins, nucleic acids and deoxyribonucleic acids.1 It has been reported that endogenous antioxidant enzymes [superoxide dismutase (SOD), glutathione S-transferase (GST), catalase (CAT)] play an important role in alleviating IRI.4-8 Also, some pharmacological agents have certain effects on IRI.1 The anesthetic agents influence endogenous antioxidant systems and free oxygen radical formation.9-12 Dexmedetomidine is a selective α-2 adrenoceptor agonist agent. It has been described as a useful and safe adjunct in many clinical applications. It has been found that it may increase urine output by considerably redistributing cardiac output, inhibiting vasopressin secretion and maintaining renal blood flow and glomerular filtration. Previous studies demonstrated that dexmedetomidine provides protection against renal, focal cerebral, cardiac, testicular, and tourniquet-induced IRI.13-18 Arslan et al observed that dexmedetomidine protected against lipid peroxidation and cellular membrane alterations in hepatic IRI, when given before induction of ischemia.17 Si et al18 demonstrated that dexmedetomidine treatment results in a partial but significant attenuation of renal damage induced by IRI through the inactivation of the JAK/STAT signaling pathway in an in vivo model. The efficacy of dexmedetomidine against IRI in diabetic patients has not been researched yet. The purpose of this experimental study was to evaluate the biochemical and histological effects of dexmedetomidine on hepatic IRI in diabetic rats’ hepatic and renal tissue. METHODOLOGY Animals and Experimental Protocol: This study was conducted in the Physiology Laboratory of Kirikkale University upon the consent of the Experimental Animals Ethics Committee of Kirikkale University. All of the procedures were performed according to the accepted standards of the Guide for the Care and Use of Laboratory Animals. In the study, 28 male Wistar Albino rats, weighing between 250 and 300 g, raised under the same environmental conditions, were used. The rats were kept at 20-21 °C at cycles of 12-hour daylight and 12-hour darkness and had free access to food until 2 hours before the anesthesia procedure. The animals were randomly separated into four groups, each containing 7 rats. Diabetes was induced by a single intraperitoneal injection of streptozotocin (Sigma Chemical, St. Louis, MO, USA) at a dose of 65 mg/kg body weight. The blood glucose levels were measured at 72 hrs following this injection. Rats were classified as diabetic if their fasting blood glucose (FBG) levels exceeded 250 mg/dl, and only animals with FBGs of > 250 mg/dl were included in the diabetic groups (diabetes only, diabetes plus ischemia-reperfusion and diabetes plus dexmedetomidine-ischemia-reperfusion). The rats were kept alive 4 weeks after streptozotocin injection to allow development of chronic diabetes before they were exposed to ischemia-reperfusion.(19) The rats were weighed before the study. Rats were anesthetized with intraperitoneal ketamine 100 mg/kg. 
The chest and abdomen were shaved and each animal was fixed in a supine position on the operating table. The abdomen was cleaned with 1% polyvinyl iodine and, when dry, the operating field was covered with a sterile drape and median laparotomy was performed. There were four experimental groups: Group C (sham-control; n = 7), Group DC (diabetes-sham-control; n = 7), Group DIR (diabetes-ischemia-reperfusion; n = 7), and Group DIRD (diabetes-ischemia-reperfusion-dexmedetomidine; n = 7). Sham operation was performed on the rats in Group C and Group DC. The sham operation consisted of mobilization of the hepatic pedicle only. The rats in this group were sacrificed 90 min after the procedure. Hepatic I/R injury was induced in Groups DIR and DIRD respectively with hepatic pedicle clamping using a vascular clamp as in the previous study of Arslan et al.(17) After an ischemic period of 45 min, the vascular clamp was removed. A reperfusion period was maintained for 45 min. In Group DIRD, dexmedetomidine hydrochloride 100 μg/kg (Precedex 100 μg/2 ml, Abbott®, Abbott Laboratory, North Chicago, Illinois, USA) was administered via the intraperitoneal route 30 minutes before surgery. All the rats were given ketamine 100 mg/kg intraperitoneally and intracardiac blood samples were obtained. Preserving the tissue integrity by avoiding trauma, liver and renal biopsy samples were obtained. Biochemical Analysis: The liver and renal tissues were first washed with cold deionized water to discard blood contamination and then homogenized in a homogenizer. Measurements on cell content require an initial preparation of the tissues. The preparation procedure may involve grinding of the tissue in a ground glass tissue blender using a rotor driven by a simple electric motor. The homogenizer, as a tissue blender similar to the typical kitchen blender, is used to emulsify and pulverize the tissue (Heidolph Instruments GmbH & Co. KG, Diax 900, Germany) at 1000 U for about 3 min. After centrifugation at 10,000 g for about 60 min, the upper clear layer was taken. MDA levels were determined using the method of Van Ye et al,(20) based on the reaction of MDA with thiobarbituric acid (TBA). In the TBA test reaction, MDA and TBA react in acid pH to form a pink pigment with an absorption maximum at 532 nm. Arbitrary values obtained were compared with a series of standard solutions (1,1,3,3-tetraethoxypropane). Results were expressed as nmol/mg protein. Part of the homogenate was extracted in an ethanol/chloroform mixture (5/3 v/v) to discard the lipid fraction, which caused interference in the measurement of T-SOD, CAT and GST activities. After centrifugation at 10,000 × g for 60 min, the upper clear layer was removed and used for the T-SOD, CAT and GST enzyme activity measurements by the methods described by Durak et al21, Aebi22 and Habig et al23 respectively. One unit of SOD activity was defined as the enzyme protein amount causing 50% inhibition in the NBTH2 reduction rate and results were expressed in U/mg protein. The CAT activity method is based on the measurement of the absorbance decrease due to H2O2 consumption at 240 nm. The GST activity method is based on the measurement of absorbance changes at 340 nm due to formation of the GSH-CDNB complex. Histological determinations: The semiquantitative evaluation technique used by Abdel-Wahhab et al(24) was applied for interpreting the structural changes investigated in hepatic tissues of the control and research groups. 
According to this, (-) (negative point) represents no structural change, while (+) (one positive point) represents mild, (++) (two positive points) medium, and (+++) (three positive points) severe structural changes. Statistical analysis: The Statistical Package for the Social Sciences (SPSS, Chicago, IL, USA) 20.0 software was used for the statistical analysis. Variations in oxidative state parameters and histopathological findings between study groups were assessed using the Kruskal-Wallis test. The Bonferroni-adjusted Mann-Whitney U-test was used after a significant Kruskal-Wallis result to determine which groups differed from the others. Results were expressed as mean ± standard deviation (Mean ± SD). Statistical significance was set at a p value < 0.05 for all analyses. (A minimal code sketch of this test sequence is given after this entry’s reference list.) RESULTS There was a statistically significant difference observed between the groups with respect to findings from the histological changes in the rat liver tissue (hepatocyte degeneration, sinusoidal dilatation, pycnotic nucleus, prenecrotic cell) determined by light microscopy according to semiquantitative evaluation techniques (p < 0.0001). In Group DIR, hepatocyte degeneration was significantly higher compared to Group C, Group DC and Group DIRD (p < 0.0001, p < 0.0001, p = 0.002, respectively), (Table 1, Figure 1-4). Similarly, sinusoidal dilatation was significantly higher in Group DIR (p < 0.0001, p = 0.004, p = 0.015, respectively). Although pycnotic nuclei were decreased in Group DIRD, the difference in comparison to Group DIR was not significant (p = 0.053), (Table 1, Figure 1-4). The prenecrotic cells were significantly increased in Group DIR, with respect to Group C, Group DC and Group DIRD (p < 0.0001, p = 0.004, p < 0.0001, respectively), (Table 1, Figure 1-4). Table 1. The comparison of histological changes in rat hepatic tissue [Mean ± SD] p**: Statistical significance was set at a p value < 0.05 for Kruskal-Wallis test *p < 0.05: When compared with Group DIR Figure 1: Light microscopic view of hepatic tissue of Group C (control). VC: vena centralis, *: sinusoids. ®: hepatocytes, k: Kupffer cells, G: glycogen granules, mc: minimal cellular changes, Hematoxylin & Eosin x 40 Figure 2: Light-microscopic view of hepatic tissue of Group DC (diabetes mellitus control) (G: glycogen granules increased in number, VC: vena centralis, *: sinusoids. ®: hepatocytes, k: Kupffer cells, mc: minimal cellular changes; Hematoxylin & Eosin x 40) Figure 3: Light-microscopic view of hepatic tissue of Group DIR (Diabetes Mellitus and ischemia-reperfusion) (VC: vena centralis, (H) degenerative and hydropic hepatocytes, (dej) vena centralis degeneration (centrolobular injury), (*): sinusoid dilatation. (←) pycnotic and hyperchromatic nuclei, MNL: mononuclear cell infiltration, (¯) congestion, K: Kupffer cell hyperplasia, (­) vacuolar degeneration (Hematoxylin & Eosin x 40) Figure 4: Light-microscopic view of hepatic tissue of Group DIRD (Diabetes Mellitus and ischemia-reperfusion together with dexmedetomidine applied group) (VC: vena centralis, (MNL) mononuclear cell infiltration, (dej) hydropic degeneration in hepatocytes around the vena centralis, (conj) congestion, G: glycogen granules, (←) pycnotic and hyperchromatic nuclei, sinusoid dilatation (*) (Hematoxylin & Eosin x 40) Besides, in liver tissue parenchyma, MN cellular infiltration was a light microscopic finding and showed significant changes among the groups (p < 0.0001). 
This was significantly higher in Group DIR, compared to Group C, DC, and DIRD (p < 0.0001, p = 0.007, p = 0.007, respectively), (Table 1, Figure 1-4). The MDA levels and the SOD and GST enzyme activities in hepatic tissues showed significant differences among the groups [(p = 0.019), (p = 0.034), (p = 0.008), respectively]. The MDA level was significantly increased in Group DIR compared to Group C and Group DIRD (p = 0.011, p = 0.016, respectively), (Table 2). In Group DIR, SOD enzyme activity was lower with respect to Group C and Group DIRD (p = 0.010, p = 0.038, respectively), (Table 2). The GST enzyme activity was significantly higher in Group DIR when compared to Group C, DC and DIRD (p = 0.007, p = 0.038, p = 0.039, respectively), (Table 2). Table 2. Oxidative state parameters in rat hepatic tissue [Mean ± SD] p**: Statistical significance was set at a p value < 0.05 for Kruskal-Wallis test *p < 0.05: When compared with Group DIR The MDA levels and SOD enzyme activity in renal tissues showed significant differences among the groups [(p < 0.0001), (p = 0.008), respectively]. The MDA level was significantly increased in Group DIR compared to Group C and Group DIRD (p < 0.0001, p < 0.0001, respectively). Also, the MDA level was significantly increased in Group DC in comparison to Group C and Group DIRD (p = 0.003, p = 0.001, respectively), (Table 3). In Group DIR, SOD enzyme activity was lower with respect to Group C and Group DIRD (p = 0.032, p = 0.013, respectively), (Table 3). The GST enzyme activity was significantly higher in Group DIR than in the other three groups; however, CAT levels were similar among the groups (Table 3). Table 3: Oxidative state parameters in rat renal tissue [Mean ± SD] p**: Statistical significance was set at a p value < 0.05 for Kruskal-Wallis test *p < 0.05: When compared with Group DIR DISCUSSION In this study, we have reported the protective effect of dexmedetomidine in an experimental hepatic and renal IRI model in the rat by investigating the MDA and SOD levels biochemically. Besides, hepatic histopathological findings also supported our report. Ischemic damage may occur with trauma, hemorrhagic shock, and some surgical interventions, mainly hepatic and renal resections. Reperfusion following ischemia results in even more injury than ischemia itself. IRI is an inflammatory response accompanied by free radical formation, leucocyte migration and activation, sinusoidal endothelial cellular damage, deteriorated microcirculation, and coagulation and complement system activation.1 We also detected injury in hepatic and renal tissue caused by reperfusion following ischemia in the liver. Experimental and clinical evidence indicates that OS is involved in both the pathogenesis and the complications of diabetes mellitus.25,26 Diabetes mellitus is a serious risk factor for the development of renal and cardiovascular disease. It is also related to fatty changes in the liver.27 Diabetes-related organ damage seems to be the result of multiple mechanisms. 
Diabetes has been associated with increased free radical reactions and oxidant tissue damage in STZ-induced diabetic rats and also in patients.26 Oxidative stress has been implicated in the destruction of pancreatic β-cells28 and could largely contribute to the oxidant tissue damage associated with chronic hyperglycemia.29 A number of reports have shown that antioxidants can attenuate the complications of diabetes in patients30 and in experimental models.28,31 This study demonstrated that diabetes causes a tendency to increase the IRI. There are a lot of investigations related to pharmacological agents or food supplements applied for decreasing OS and IRI. Antioxidant agents play an important role in IRI by affecting the antioxidant system or lessening the formation of ROS. It has been reported that anesthetic agents, too, have effects on oxidative stress.1 During surgical interventions, it seems rational to get benefit from anesthetic agents in the prevention of OS caused by IRI instead of using other agents. It has been declared that dexmedetomidine, as an α-2 agonist with sedative and hypnotic properties, is important in the prevention of renal, focal cerebral, cardiac, testicular and tourniquet-induced IRI.13-18 On the other hand, Bostankolu et al. concluded that dexmedetomidine did not have an additional protective role for tourniquet-induced IRI during routine general anesthesia.32 In this study, we have shown that dexmedetomidine has a reducing effect on IRI in diabetic rats. Some biochemical tests and histopathological evaluations are applied for revealing oxidative stress and IRI in the tissues. Reactive oxygen species (ROS) that appear with reperfusion injury damage cellular structures through the process of the lipid peroxidation of cellular membranes and yield toxic metabolites such as MDA.33 As an important intermediate product in lipid peroxidation, MDA is used as a sensitive marker of IRI.34 ROS-induced tissue injury triggers various defense mechanisms.35 The first defence mechanisms include the antioxidant enzymes SOD, CAT, and GPx. These endogenous antioxidants are the first lines of defence against oxidative stress and act by scavenging potentially damaging free radical moieties.36 There is a balance between ROS and the scavenging capacity of antioxidant enzymes.1-8 In this study, for evaluation of oxidative damage and antioxidant activity, MDA, SOD, GST and CAT levels were determined in liver and kidney tissues. MDA levels in hepatic and renal tissues were higher in Group DIR compared to Group C and Group DIRD. GST levels were higher in Group DIR compared to all the other three groups. When the groups were arranged from highest to lowest order with respect to CAT levels, the order was: Group DIR, Group DIRD, Group DC and Group C. However, the difference was not significant. The acute phase reactant MDA, as a marker of OS, was found to be high in Group DIR and low in Group DIRD. This could be interpreted as the presence of a protective effect of dexmedetomidine in IRI. IRI developing in the splanchnic area causes injury in other organs as well.35 Leithead et al showed that clinically significant hepatic IRI demonstrates a strong relationship with peri-operative acute kidney injury.2 Our experimental research showed correlation with that of Leithead et al: after hepatic IRI in diabetic rats, levels of the renal OS marker MDA were significantly higher in Group DIR than in Group DIRD. 
In our study, we observed histopathological changes in the ischemic liver tissue and alterations in MDA, SOD, GST and CAT levels, which are OS markers. Histopathological changes of the liver tissues are hepatocyte degeneration, sinusoidal dilatation, nuclear pycnosis, cellular necrosis, and mononuclear cell infiltration in parenchymal tissue. These histopathological injury scores were significantly lower in Group DIRD than those in Group DIR. LIMITATION The limitation of this study is that there was no negative control group, as this type of surgical intervention is not possible in rats without anesthesia. CONCLUSION The enzymatic findings of our study together with the hepatic histopathology indicate that dexmedetomidine has a potential role to decrease ischemia-reperfusion injury. Conflict of interest and funding: The authors have not received any funding or benefits from industry or elsewhere to conduct this study. Author contribution: ŞCS: Concept, conduction of the study work and manuscript editing; BI: the main author to write the article; MB & MK: biochemical analysis; MA: manuscript writing; FMÇ: helped with the experimental study; LÖ & EK: collection of data REFERENCES Collard CD, Gelman S. Pathophysiology, clinical manifestations, and prevention of ischemia-reperfusion injury. Anesthesiology. 2001;94(6):1133. [PubMed] [Free full text] Leithead JA, Armstrong MJ, Corbett C, Andrew M, Kothari C, Gunson BK, et al. Hepatic ischemia reperfusion injury is associated with acute kidney injury following donation after brain death liver transplantation. Transpl Int. 2013;26(11):1116. doi: 10.1111/tri.12175. [PubMed] [Free full text] Panés J, Kurose I, Rodriguez-Vaca D, Anderson DC, Miyasaka M, Tso P, et al. Diabetes exacerbates inflammatory responses to ischemia-reperfusion. Circulation. 1996;93(1):161. [PubMed] [Free full text] Touyz RM. Reactive oxygen species and angiotensin II signaling in vascular cells-implications in cardiovascular disease. Braz J Med Biol Res. 2004;37:1263. [PubMed] [Free full text] Olivares-Corichi IM, Ceballos G, Ortega-Camarillo C, Guzman-Grenfell AM, Hicks JJ. Reactive oxygen species (ROS) induce chemical and structural changes on human insulin in vitro, including alterations in its immunoreactivity. Front Biosci. 2005;10:834. [PubMed] Witko-Sarsat V, Friedlander M, Capeillere-Blandin C, Nguyen-Khoa T, Nguyen AT, Zingraff J, et al. Advanced oxidation protein products as a novel marker of oxidative stress in uremia. Kidney Int. 1996;49:1304. [PubMed] Harman D. Free radical theory of aging: An update: Increasing the functional life span. Ann N Y Acad Sci. 2006;1067:10. [PubMed] Nita DA, Nita V, Spulber S, Moldovan M, Popa DP, Zagrean AM, Zagrean L. Oxidative damage following cerebral ischemia depends on reperfusion – a biochemical study in rat. J Cell Mol Med. 2001;5:163–170. [PubMed] [Free full text] Annecke T, Kubitz JC, Kahr S, Hilberath JM, Langer K, Kemming GI, et al. Effects of sevoflurane and propofol on ischaemia-reperfusion injury after thoracic-aortic occlusion in pigs. Br J Anaesth. 2007;98(5):581. [PubMed] [Free full text] De Hert SG, Van der Linden PJ, Cromheecke S, Meeus R, Nelis A, Van Reeth V, ten Broecke PW, et al. Cardioprotective properties of sevoflurane in patients undergoing coronary surgery with cardiopulmonary bypass are related to the modalities of its administration. Anesthesiology. 2004;101(2):299. [PubMed] [Free full text] Yuzer H, Yuzbasioglu MF, Ciralik H, Kurutas EB, Ozkan OV, Bulbuloglu E, et al. 
Effects of intravenous anesthetics on renal ischemia/reperfusion injury. Ren Fail. 2009;31(4):290. [PubMed] [Free full text] Lee HT, Ota-Setlik A, Fu Y, Nasr SH, Emala CW. Differential protective effects of volatile anesthetics against renal ischemia-reperfusion injury in vivo. Anesthesiology. 2004;101(6):1313. [PubMed] [Free full text] Lai YC, Tsai PS, Huang CJ. Effects of dexmedetomidine on regulating endotoxin-induced up-regulation of inflammatory molecules in murine macrophages. J Surg Res. 2009;154(2):212. doi: 10.1016/j.jss.2008.07.010. [PubMed] Yoshitomi O, Cho S, Hara T, Shibata I, Maekawa T, Ureshino H, Sumikawa K. Direct protective effects of dexmedetomidine against myocardial ischemia-reperfusion injury in anesthetized pigs. Shock. 2012;38(1):92. doi: 10.1097/SHK.0b013e318254d3fb. [PubMed] Jolkkonen J, Puurunen K, Koistinaho J, Kauppinen R, Haapalinna A, Nieminen L, et al. Neuroprotection by the alpha2-adrenoceptor agonist, dexmedetomidine, in rat focal cerebral ischemia. Eur J Pharmacol. 1999;372(1):31. [PubMed] Kocoglu H, Ozturk H, Ozturk H, Yilmaz F, Gulcu N. Effect of dexmedetomidine on ischemia-reperfusion injury in rat kidney: a histopathologic study. Ren Fail. 2009;31(1):70. doi: 10.1080/08860220802546487. [PubMed] Arslan M, Çomu FM, Küçük A, Öztürk L, Yaylak F. Dexmedetomidine protects against lipid peroxidation and erythrocyte deformability alterations in experimental hepatic ischemia reperfusion injury. Libyan J Med. 2012;7. doi: 10.3402/ljm.v7i0.18185 [PubMed] [Free full text] Si Y, Bao H, Han L, Shi H, Zhang Y, Xu L, et al. Dexmedetomidine protects against renal ischemia and reperfusion injury by inhibiting the JAK/STAT signaling activation. J Transl Med. 2013;11(1):141. doi: 10.1186/1479-5876-11-141. [PubMed] [Free full text] Türeci E, İş M, Üzüm G, Akyüz F, Ulu MO, Döşoğlu M, et al. Alterations in blood-brain barrier after traumatic brain injury in streptozotocin-induced diabetic rats. J Nervous Sys Surgery 2009;2(2):79. [Free full text] Van Ye TM, Roza AM, Pieper GM, Henderson J Jr, Johnson JP, Adams MB. Inhibition of intestinal lipid peroxidation does not minimize morphological damage. J Surg Res 1993;55:553. [PubMed] Durak I, Canbolat O, Kavutcu M, Öztürk HS, Yurtarslanı Z. Activities of total, cytoplasmic and mitochondrial superoxide dismutase enzymes in sera and pleural fluids from patients with lung cancer. J Clin Lab Anal 1996;10:17. [PubMed] Aebi H. Catalase. In: H.U.Bergmeyer (Ed): Methods of Enzymatic Analysis, Academic Press, New York and London, 1974;pp.673-677. Habig WH, Pabst MJ, Jakoby WB. Glutathione S-transferases. The first enzymatic step in mercapturic acid formation. J Biol Chem 1974;249:7130. [PubMed] [Free full text] Abdel-Wahhab MA, Nada SA, Arbid MS. Ochratoxicosis: Prevention of developmental toxicity by L-methionine in rats. J Appl Toxicol 1999;19:7. [PubMed] Wolff SP. Diabetes mellitus and free radicals: free radicals, transition metals and oxidative stress in the aetiology of diabetes mellitus and complications. Br Med Bull. 1993;49:642. [PubMed] [Free full text] West IC. Radicals and oxidative stress in diabetes. Diabet Med. 2000;17:171–180. [PubMed] Wanless IR, Lentz JS. Fatty liver hepatitis (steatohepatitis) and obesity: an autopsy study with analysis of risk factors. Hepatology. 1990;12:1106. [PubMed] Hotta M, Tashiro F, Ikegami H, Niwa H, Ogihara T, Yodoi J, Miyazaki J. Pancreatic cell-specific expression of thioredoxin, an antioxidative and antiapoptotic protein, prevents autoimmune and streptozotocin-induced diabetes. J Exp Med. 
1998;188:1445. [PubMed] [Free full text] Baynes JW. Role of oxidative stress in the development of complications in diabetes. Diabetes. 1991;40:405. [PubMed] Borcea V, Nourooz-Zadeh J, Wolff SP, Klevesath M, Hofmann M, Urich H, et al. α-Lipoic acid decreases oxidative stress even in diabetic patients with poor glycemic control and albuminuria. Free Radic Biol Med. 1999;26:1495. [PubMed] Fitzl G, Martin R, Dettmer D, Hermsdorf V, Drews H, Welt K. Protective effect of ginkgo biloba extract EGb 761 on myocardium of experimentally diabetic rats, I: ultrastructural and biochemical investigation on cardiomyocytes. Exp Toxicol Pathol. 1999;51:189. [PubMed] Bostankolu E, Ayoglu H, Yurtlu S, Okyay RD, Erdogan G, Deniz Y, et al. Dexmedetomidine did not reduce the effects of tourniquet-induced ischemia-reperfusion injury during general anesthesia. Kaohsiung J Med Sci. 2013;29(2):75. doi: 10.1016/j.kjms.2012.08.013. [PubMed] [Free full text] Wakai A, Wang JH, Winter DC, Street JT, O’Sullivan RG, Redmond HP. Tourniquet-induced systemic inflammatory response in extremity surgery. J Trauma 2001;51:922. [PubMed] Concannon MJ, Kester CG, Welsh CF, Puckett CL. Patterns of free-radical production after tourniquet ischemia: implications for the hand surgeon. Plast Reconstr Surg 1992;89:846. [PubMed] Grisham MB, Granger DN. Free radicals: reactive metabolites of oxygen as mediators of postischemic reperfusion injury. In: Marston A, Bulkley GB, Fiddian-Green RG, Haglund U, editors. Splanchnic Ischemia and Multiple Organ Failure. St Louis, MO: Mosby; 1989. pp. 135–144. McCord JM. The evolution of free radicals and oxidative stress. Am J Med 2000;108:652. [PubMed]
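The statistical analysis described in the methodology of this entry (a Kruskal-Wallis omnibus test across the four groups, followed by Bonferroni-adjusted Mann-Whitney U tests to locate which groups differ) can be illustrated with a minimal Python sketch using SciPy. This is only a hedged illustration, not the authors' SPSS workflow; the group values below are placeholders rather than data from the study, and the group names (C, DC, DIR, DIRD) are taken from the abstract.

from itertools import combinations
from scipy.stats import kruskal, mannwhitneyu

# Hypothetical oxidative-state values (e.g. MDA, nmol/mg protein), n = 7 per group;
# placeholder numbers only, not data from the study.
groups = {
    "C":    [1.1, 1.3, 1.2, 1.0, 1.4, 1.2, 1.1],
    "DC":   [1.6, 1.8, 1.7, 1.9, 1.5, 1.7, 1.8],
    "DIR":  [2.6, 2.9, 2.7, 3.0, 2.8, 2.5, 2.7],
    "DIRD": [1.7, 1.9, 1.8, 2.0, 1.6, 1.8, 1.9],
}

# Omnibus comparison across the four groups (Kruskal-Wallis H test).
h_stat, p_omnibus = kruskal(*groups.values())
print(f"Kruskal-Wallis H = {h_stat:.2f}, p = {p_omnibus:.4f}")

# Post-hoc pairwise Mann-Whitney U tests, run only after a significant omnibus result,
# compared against a Bonferroni-adjusted threshold (0.05 divided by the number of pairs).
if p_omnibus < 0.05:
    pairs = list(combinations(groups, 2))
    alpha_adj = 0.05 / len(pairs)
    for a, b in pairs:
        u_stat, p = mannwhitneyu(groups[a], groups[b], alternative="two-sided")
        verdict = "significant" if p < alpha_adj else "not significant"
        print(f"{a} vs {b}: U = {u_stat:.1f}, p = {p:.4f} ({verdict} at adjusted alpha = {alpha_adj:.4f})")

With four groups there are six pairwise comparisons, so the Bonferroni-adjusted threshold works out to 0.05/6, roughly 0.0083; this is the sense in which the post-hoc Mann-Whitney U tests reported above are "Bonferroni-adjusted".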
APA, Harvard, Vancouver, ISO, and other styles
30

Dieter, Michael. "Amazon Noir." M/C Journal 10, no. 5 (October 1, 2007). http://dx.doi.org/10.5204/mcj.2709.

Full text
Abstract:
There is no diagram that does not also include, besides the points it connects up, certain relatively free or unbounded points, points of creativity, change and resistance, and it is perhaps with these that we ought to begin in order to understand the whole picture. (Deleuze, “Foucault” 37) Monty Cantsin: Why do we use a pervert software robot to exploit our collective consensual mind? Letitia: Because we want the thief to be a digital entity. Monty Cantsin: But isn’t this really blasphemic? Letitia: Yes, but god – in our case a meta-cocktail of authorship and copyright – can not be trusted anymore. (Amazon Noir, “Dialogue”) In 2006, some 3,000 digital copies of books were silently “stolen” from online retailer Amazon.com by targeting vulnerabilities in the “Search inside the Book” feature from the company’s website. Over several weeks, between July and October, a specially designed software program bombarded the Search Inside!™ interface with multiple requests, assembling full versions of texts and distributing them across peer-to-peer networks (P2P). Rather than a purely malicious and anonymous hack, however, the “heist” was publicised as a tactical media performance, Amazon Noir, produced by self-proclaimed super-villains Paolo Cirio, Alessandro Ludovico, and Ubermorgen.com. While controversially directed at highlighting the infrastructures that materially enforce property rights and access to knowledge online, the exploit additionally interrogated its own interventionist status as theoretically and politically ambiguous. That the “thief” was represented as a digital entity or machinic process (operating on the very terrain where exchange is differentiated) and the emergent act of “piracy” was fictionalised through the genre of noir conveys something of the indeterminacy or immensurability of the event. In this short article, I discuss some political aspects of intellectual property in relation to the complexities of Amazon Noir, particularly in the context of control, technological action, and discourses of freedom. Software, Piracy As a force of distribution, the Internet is continually subject to controversies concerning flows and permutations of agency. While often directed by discourses cast in terms of either radical autonomy or control, the technical constitution of these digital systems is more regularly a case of establishing structures of operation, codified rules, or conditions of possibility; that is, of guiding social processes and relations (McKenzie, “Cutting Code” 1-19). Software, as a medium through which such communication unfolds and becomes organised, is difficult to conceptualise as a result of being so event-orientated. There lies a complicated logic of contingency and calculation at its centre, a dimension exacerbated by the global scale of informational networks, where the inability to comprehend an environment that exceeds the limits of individual experience is frequently expressed through desires, anxieties, paranoia. Unsurprisingly, cautionary accounts and moral panics on identity theft, email fraud, pornography, surveillance, hackers, and computer viruses are as commonplace as those narratives advocating user interactivity. 
When analysing digital systems, cultural theory often struggles to describe forces that dictate movement and relations between disparate entities composed by code, an aspect heightened by the intensive movement of informational networks where differences are worked out through the constant exposure to unpredictability and chance (Terranova, “Communication beyond Meaning”). Such volatility partially explains the recent turn to distribution in media theory, as once durable networks for constructing economic difference – organising information in space and time (“at a distance”), accelerating or delaying its delivery – appear contingent, unstable, or consistently irregular (Cubitt 194). Attributing actions to users, programmers, or the software itself is a difficult task when faced with these states of co-emergence, especially in the context of sharing knowledge and distributing media content. Exchanges between corporate entities, mainstream media, popular cultural producers, and legal institutions over P2P networks represent an ongoing controversy in this respect, with numerous stakeholders competing between investments in property, innovation, piracy, and publics. Beginning to understand this problematic landscape is an urgent task, especially in relation to the technological dynamics that organised and propel such antagonisms. In the influential fragment, “Postscript on the Societies of Control,” Gilles Deleuze describes the historical passage from modern forms of organised enclosure (the prison, clinic, factory) to the contemporary arrangement of relational apparatuses and open systems as being materially provoked by – but not limited to – the mass deployment of networked digital technologies. In his analysis, the disciplinary mode most famously described by Foucault is spatially extended to informational systems based on code and flexibility. According to Deleuze, these cybernetic machines are connected into apparatuses that aim for intrusive monitoring: “in a control-based system nothing’s left alone for long” (“Control and Becoming” 175). Such a constant networking of behaviour is described as a shift from “molds” to “modulation,” where controls become “a self-transmuting molding changing from one moment to the next, or like a sieve whose mesh varies from one point to another” (“Postscript” 179). Accordingly, the crisis underpinning civil institutions is consistent with the generalisation of disciplinary logics across social space, forming an intensive modulation of everyday life, but one ambiguously associated with socio-technical ensembles. The precise dynamics of this epistemic shift are significant in terms of political agency: while control implies an arrangement capable of absorbing massive contingency, a series of complex instabilities actually mark its operation. Noise, viral contamination, and piracy are identified as key points of discontinuity; they appear as divisions or “errors” that force change by promoting indeterminacies in a system that would otherwise appear infinitely calculable, programmable, and predictable. The rendering of piracy as a tactic of resistance, a technique capable of levelling out the uneven economic field of global capitalism, has become a predictable catch-cry for political activists. 
In their analysis of multitude, for instance, Antonio Negri and Michael Hardt describe the contradictions of post-Fordist production as conjuring forth a tendency for labour to “become common.” That is, as productivity depends on flexibility, communication, and cognitive skills, directed by the cultivation of an ideal entrepreneurial or flexible subject, the greater the possibilities for self-organised forms of living that significantly challenge its operation. In this case, intellectual property exemplifies such a spiralling paradoxical logic, since “the infinite reproducibility central to these immaterial forms of property directly undermines any such construction of scarcity” (Hardt and Negri 180). The implications of the filesharing program Napster, accordingly, are read as not merely directed toward theft, but in relation to the private character of the property itself; a kind of social piracy is perpetuated that is viewed as radically recomposing social resources and relations. Ravi Sundaram, a co-founder of the Sarai new media initiative in Delhi, has meanwhile drawn attention to the existence of “pirate modernities” capable of being actualised when individuals or local groups gain illegitimate access to distributive media technologies; these are worlds of “innovation and non-legality,” of electronic survival strategies that partake in cultures of dispersal and escape simple classification (94). Meanwhile, pirate entrepreneurs Magnus Eriksson and Rasmus Fleische – associated with the notorious Piratbyrn – have promoted the bleeding away of Hollywood profits through fully deployed P2P networks, with the intention of pushing filesharing dynamics to an extreme in order to radicalise the potential for social change (“Copies and Context”). From an aesthetic perspective, such activist theories are complemented by the affective register of appropriation art, a movement broadly conceived in terms of antagonistically liberating knowledge from the confines of intellectual property: “those who pirate and hijack owned material, attempting to free information, art, film, and music – the rhetoric of our cultural life – from what they see as the prison of private ownership” (Harold 114). These “unruly” escape attempts are pursued through various modes of engagement, from experimental performances with legislative infrastructures (i.e. Kembrew McLeod’s patenting of the phrase “freedom of expression”) to musical remix projects, such as the work of Negativland, John Oswald, RTMark, Detritus, Illegal Art, and the Evolution Control Committee. Amazon Noir, while similarly engaging with questions of ownership, is distinguished by specifically targeting information communication systems and finding “niches” or gaps between overlapping networks of control and economic governance. Hans Bernhard and Lizvlx from Ubermorgen.com (meaning ‘Day after Tomorrow,’ or ‘Super-Tomorrow’) actually describe their work as “research-based”: “we not are opportunistic, money-driven or success-driven, our central motivation is to gain as much information as possible as fast as possible as chaotic as possible and to redistribute this information via digital channels” (“Interview with Ubermorgen”). This has led to experiments like Google Will Eat Itself (2005) and the construction of the automated software thief against Amazon.com, as process-based explorations of technological action. 
Agency, Distribution Deleuze’s “postscript” on control has proven massively influential for new media art by introducing a series of key questions on power (or desire) and digital networks. As a social diagram, however, control should be understood as a partial rather than totalising map of relations, referring to the augmentation of disciplinary power in specific technological settings. While control is a conceptual regime that refers to open-ended terrains beyond the architectural locales of enclosure, implying a move toward informational networks, data solicitation, and cybernetic feedback, there remains a peculiar contingent dimension to its limits. For example, software code is typically designed to remain cycling until user input is provided. There is a specifically immanent and localised quality to its actions that might be taken as exemplary of control as a continuously modulating affective materialism. The outcome is a heightened sense of bounded emergencies that are either flattened out or absorbed through reconstitution; however, these are never linear gestures of containment. As Tiziana Terranova observes, control operates through multilayered mechanisms of order and organisation: “messy local assemblages and compositions, subjective and machinic, characterised by different types of psychic investments, that cannot be the subject of normative, pre-made political judgments, but which need to be thought anew again and again, each time, in specific dynamic compositions” (“Of Sense and Sensibility” 34). This event-orientated vitality accounts for the political ambitions of tactical media as opening out communication channels through selective “transversal” targeting. Amazon Noir, for that reason, is pitched specifically against the material processes of communication. The system used to harvest the content from “Search inside the Book” is described as “robot-perversion-technology,” based on a network of four servers around the globe, each with a specific function: one located in the United States that retrieved (or “sucked”) the books from the site, one in Russia that injected the assembled documents onto P2P networks and two in Europe that coordinated the action via intelligent automated programs (see “The Diagram”). According to the “villains,” the main goal was to steal all 150,000 books from Search Inside!™ then use the same technology to steal books from the “Google Print Service” (the exploit was limited only by the amount of technological resources financially available, but there are apparent plans to improve the technique by reinvesting the money received through the settlement with Amazon.com not to publicise the hack). In terms of informational culture, this system resembles a machinic process directed at redistributing copyright content; “The Diagram” visualises key processes that define digital piracy as an emergent phenomenon within an open-ended and responsive milieu. That is, the static image foregrounds something of the activity of copying being a technological action that complicates any analysis focusing purely on copyright as content. In this respect, intellectual property rights are revealed as being entangled within information architectures as communication management and cultural recombination – dissipated and enforced by a measured interplay between openness and obstruction, resonance and emergence (Terranova, “Communication beyond Meaning” 52). 
To understand data distribution requires an acknowledgement of these underlying nonhuman relations that allow for such informational exchanges. It requires an understanding of the permutations of agency carried along by digital entities. According to Lawrence Lessig’s influential argument, code is not merely an object of governance, but has an overt legislative function itself. Within the informational environments of software, “a law is defined, not through a statute, but through the code that governs the space” (20). These points of symmetry are understood as concretised social values: they are material standards that regulate flow. Similarly, Alexander Galloway describes computer protocols as non-institutional “etiquette for autonomous agents,” or “conventional rules that govern the set of possible behavior patterns within a heterogeneous system” (7). In his analysis, these agreed-upon standardised actions operate as a style of management fostered by contradiction: progressive though reactionary, encouraging diversity by striving for the universal, synonymous with possibility but completely predetermined, and so on (243-244). Needless to say, political uncertainties arise from a paradigm that generates internal material obscurities through a constant twinning of freedom and control. For Wendy Hui Kyong Chun, these Cold War systems subvert the possibilities for any actual experience of autonomy by generalising paranoia through constant intrusion and reducing social problems to questions of technological optimisation (1-30). In confrontation with these seemingly ubiquitous regulatory structures, cultural theory requires a critical vocabulary differentiated from computer engineering to account for the sociality that permeates through and concatenates technological realities. In his recent work on “mundane” devices, software and code, Adrian McKenzie introduces a relevant analytic approach in the concept of technological action as something that both abstracts and concretises relations in a diffusion of collective-individual forces. Drawing on the thought of French philosopher Gilbert Simondon, he uses the term “transduction” to identify a key characteristic of technology in the relational process of becoming, or ontogenesis. This is described as bringing together disparate things into composites of relations that evolve and propagate a structure throughout a domain, or “overflow existing modalities of perception and movement on many scales” (“Impersonal and Personal Forces in Technological Action” 201). Most importantly, these innovative diffusions or contagions occur by bridging states of difference or incompatibilities. Technological action, therefore, arises from a particular type of disjunctive relation between an entity and something external to itself: “in making this relation, technical action changes not only the ensemble, but also the form of life of its agent. Abstraction comes into being and begins to subsume or reconfigure existing relations between the inside and outside” (203). Here, reciprocal interactions between two states or dimensions actualise disparate potentials through metastability: an equilibrium that proliferates, unfolds, and drives individuation. While drawing on cybernetics and dealing with specific technological platforms, McKenzie’s work can be extended to describe the significance of informational devices throughout control societies as a whole, particularly as a predictive and future-orientated force that thrives on staged conflicts. 
Moreover, being a non-deterministic technical theory, it additionally speaks to new tendencies in regimes of production that harness cognition and cooperation through specially designed infrastructures to enact persistent innovation without any end-point, final goal or natural target (Thrift 283-295). Here, the interface between intellectual property and reproduction can be seen as a site of variation that weaves together disparate objects and entities by imbrication in social life itself. These are specific acts of interference that propel relations toward unforeseen conclusions by drawing on memories, attention spans, material-technical traits, and so on. The focus lies on performance, context, and design “as a continual process of tuning arrived at by distributed aspiration” (Thrift 295). This latter point is demonstrated in recent scholarly treatments of filesharing networks as media ecologies. Kate Crawford, for instance, describes the movement of P2P as processual or adaptive, comparable to technological action, marked by key transitions from partially decentralised architectures such as Napster, to the fully distributed systems of Gnutella and seeded swarm-based networks like BitTorrent (30-39). Each of these technologies can be understood as a response to various legal incursions, producing radically dissimilar socio-technological dynamics and emergent trends for how agency is modulated by informational exchanges. Indeed, even these aberrant formations are characterised by modes of commodification that continually spill over and feed back on themselves, repositioning markets and commodities in doing so, from MP3s to iPods, P2P to broadband subscription rates. However, one key limitation of this ontological approach is apparent when dealing with the sheer scale of activity involved, where mass participation elicits certain degrees of obscurity and relative safety in numbers. This represents an obvious problem for analysis, as dynamics can easily be identified in the broadest conceptual sense, without any understanding of the specific contexts of usage, political impacts, and economic effects for participants in their everyday consumptive habits. Large-scale distributed ensembles are “problematic” in their technological constitution, as a result. They are sites of expansive overflow that provoke an equivalent individuation of thought, as the Recording Industry Association of America observes on their educational website: “because of the nature of the theft, the damage is not always easy to calculate but not hard to envision” (“Piracy”). The politics of the filesharing debate, in this sense, depends on the command of imaginaries; that is, being able to conceptualise an overarching structural consistency to a persistent and adaptive ecology. As a mode of tactical intervention, Amazon Noir dramatises these ambiguities by framing technological action through the fictional sensibilities of narrative genre. Ambiguity, Control The extensive use of imagery and iconography from “noir” can be understood as an explicit reference to the increasing criminalisation of copyright violation through digital technologies. However, the term also refers to the indistinct or uncertain effects produced by this tactical intervention: who are the “bad guys” or the “good guys”? Are positions like ‘good’ and ‘evil’ (something like freedom or tyranny) so easily identified and distinguished? 
As Paolo Cirio explains, this political disposition is deliberately kept obscure in the project: “it’s a representation of the actual ambiguity about copyright issues, where every case seems to lack a moral or ethical basis” (“Amazon Noir Interview”). While user communications made available on the site clearly identify culprits (describing the project as jeopardising arts funding, as both irresponsible and arrogant), the self-description of the artists as political “failures” highlights the uncertainty regarding the project’s qualities as a force of long-term social renewal: Lizvlx from Ubermorgen.com had daily shootouts with the global mass-media, Cirio continuously pushed the boundaries of copyright (books are just pixels on a screen or just ink on paper), Ludovico and Bernhard resisted kickback-bribes from powerful Amazon.com until they finally gave in and sold the technology for an undisclosed sum to Amazon. Betrayal, blasphemy and pessimism finally split the gang of bad guys. (“Press Release”) Here, the adaptive and flexible qualities of informatic commodities and computational systems of distribution are knowingly posited as critical limits; in a certain sense, the project fails technologically in order to succeed conceptually. From a cynical perspective, this might be interpreted as guaranteeing authenticity by insisting on the useless or non-instrumental quality of art. However, through this process, Amazon Noir illustrates how forces confined as exterior to control (virality, piracy, noncommunication) regularly operate as points of distinction to generate change and innovation. Just as hackers are legitimately employed to challenge the durability of network exchanges, malfunctions are relied upon as potential sources of future information. Indeed, the notion of demonstrating ‘autonomy’ by illustrating the shortcomings of software is entirely consistent with the logic of control as a modulating organisational diagram. These so-called “circuit breakers” are positioned as points of bifurcation that open up new systems and encompass a more general “abstract machine” or tendency governing contemporary capitalism (Parikka 300). As a consequence, the ambiguities of Amazon Noir emerge not just from the contrary articulation of intellectual property and digital technology, but additionally through the concept of thinking “resistance” simultaneously with regimes of control. This tension is apparent in Galloway’s analysis of the cybernetic machines that are synonymous with the operation of Deleuzian control societies – i.e. “computerised information management” – where tactical media are posited as potential modes of contestation against the tyranny of code, “able to exploit flaws in protocological and proprietary command and control, not to destroy technology, but to sculpt protocol and make it better suited to people’s real desires” (176). While pushing a system into a state of hypertrophy to reform digital architectures might represent a possible technique that produces a space through which to imagine something like “our” freedom, it still leaves unexamined the desire for reformation itself as nurtured by and produced through the coupling of cybernetics, information theory, and distributed networking. This draws into focus the significance of McKenzie’s Simondon-inspired cybernetic perspective on socio-technological ensembles as being always-already predetermined by and driven through asymmetries or difference. 
As Chun observes, consequently, there is no paradox between resistance and capture since “control and freedom are not opposites, but different sides of the same coin: just as discipline served as a grid on which liberty was established, control is the matrix that enables freedom as openness” (71). Why “openness” should be so readily equated with a state of being free represents a major unexamined presumption of digital culture, and leads to the associated predicament of attempting to think of how this freedom has become something one cannot not desire. If Amazon Noir has political currency in this context, however, it emerges from a capacity to recognise how informational networks channel desire, memories, and imaginative visions rather than just cultivated antagonisms and counterintuitive economics. As a final point, it is worth observing that the project was initiated without publicity until the settlement with Amazon.com. There is, as a consequence, nothing to suggest that this subversive “event” might have actually occurred, a feeling heightened by the abstractions of software entities. To the extent that we believe in “the big book heist,” that such an act is even possible, is a gauge through which the paranoia of control societies is illuminated as a longing or desire for autonomy. As Hakim Bey observes in his conceptualisation of “pirate utopias,” such fleeting encounters with the imaginaries of freedom flow back into the experience of the everyday as political instantiations of utopian hope. Amazon Noir, with all its underlying ethical ambiguities, presents us with a challenge to rethink these affective investments by considering our profound weaknesses to master the complexities and constant intrusions of control. It provides an opportunity to conceive of a future that begins with limits and limitations as immanently central, even foundational, to our deep interconnection with socio-technological ensembles. References “Amazon Noir – The Big Book Crime.” <http://www.amazon-noir.com/>. Bey, Hakim. T.A.Z.: The Temporary Autonomous Zone, Ontological Anarchy, Poetic Terrorism. New York: Autonomedia, 1991. Chun, Wendy Hui Kyong. Control and Freedom: Power and Paranoia in the Age of Fibre Optics. Cambridge, MA: MIT Press, 2006. Crawford, Kate. “Adaptation: Tracking the Ecologies of Music and Peer-to-Peer Networks.” Media International Australia 114 (2005): 30-39. Cubitt, Sean. “Distribution and Media Flows.” Cultural Politics 1.2 (2005): 193-214. Deleuze, Gilles. Foucault. Trans. Seán Hand. Minneapolis: U of Minnesota P, 1986. ———. “Control and Becoming.” Negotiations 1972-1990. Trans. Martin Joughin. New York: Columbia UP, 1995. 169-176. ———. “Postscript on the Societies of Control.” Negotiations 1972-1990. Trans. Martin Joughin. New York: Columbia UP, 1995. 177-182. Eriksson, Magnus, and Rasmus Fleische. “Copies and Context in the Age of Cultural Abundance.” Online posting. 5 June 2007. Nettime 25 Aug 2007. Galloway, Alexander. Protocol: How Control Exists after Decentralization. Cambridge, MA: MIT Press, 2004. Hardt, Michael, and Antonio Negri. Multitude: War and Democracy in the Age of Empire. New York: Penguin Press, 2004. Harold, Christine. OurSpace: Resisting the Corporate Control of Culture. Minneapolis: U of Minnesota P, 2007. Lessig, Lawrence. Code and Other Laws of Cyberspace. New York: Basic Books, 1999. McKenzie, Adrian. Cutting Code: Software and Sociality. New York: Peter Lang, 2006. ———.
“The Strange Meshing of Impersonal and Personal Forces in Technological Action.” Culture, Theory and Critique 47.2 (2006): 197-212. Parikka, Jussi. “Contagion and Repetition: On the Viral Logic of Network Culture.” Ephemera: Theory & Politics in Organization 7.2 (2007): 287-308. “Piracy Online.” Recording Industry Association of America. 28 Aug 2007. <http://www.riaa.com/physicalpiracy.php>. Sundaram, Ravi. “Recycling Modernity: Pirate Electronic Cultures in India.” Sarai Reader 2001: The Public Domain. Delhi, Sarai Media Lab, 2001. 93-99. <http://www.sarai.net>. Terranova, Tiziana. “Communication beyond Meaning: On the Cultural Politics of Information.” Social Text 22.3 (2004): 51-73. ———. “Of Sense and Sensibility: Immaterial Labour in Open Systems.” DATA Browser 03 – Curating Immateriality: The Work of the Curator in the Age of Network Systems. Ed. Joasia Krysa. New York: Autonomedia, 2006. 27-38. Thrift, Nigel. “Re-inventing Invention: New Tendencies in Capitalist Commodification.” Economy and Society 35.2 (2006): 279-306.
APA, Harvard, Vancouver, ISO, and other styles
31

Wark, McKenzie. "Toywars." M/C Journal 6, no. 3 (June 1, 2003). http://dx.doi.org/10.5204/mcj.2179.

Full text
Abstract:
I first came across etoy in Linz, Austria in 1995. They turned up at Ars Electronica with their shaved heads, in their matching orange bomber jackets. They were not invited. The next year they would not have to crash the party. In 1996 they were awarded Arts Electronica’s prestigious Golden Nica for web art, and were on their way to fame and bitterness – the just rewards for their art of self-regard. As founding member Agent.ZAI says: “All of us were extremely greedy – for excitement, for drugs, for success.” (Wishart & Boschler: 16) The etoy story starts on the fringes of the squatters’ movement in Zurich. Disenchanted with the hard left rhetorics that permeate the movement in the 1980s, a small group look for another way of existing within a commodified world, without the fantasy of an ‘outside’ from which to critique it. What Antonio Negri and friends call the ‘real subsumption’ of life under the rule of commodification is something etoy grasps intuitively. The group would draw on a number of sources: David Bowie, the Sex Pistols, the Manchester rave scene, European Amiga art, rumors of the historic avant gardes from Dada to Fluxus. They came together in 1994, at a meeting in the Swiss resort town of Weggis on Lake Lucerne. While the staging of the founding meeting looks like a rerun of the origins of the Situationist International, the wording of the invitation might suggest the founding of a pop music boy band: “fun, money and the new world?” One of the – many – stories about the origins of the name Dada has it being chosen at random from a bilingual dictionary. The name etoy, in an update on that procedure, was spat out by a computer program designed to make four letter words at random. Ironically, both Dada and etoy, so casually chosen, would inspire furious struggles over the ownership of these chancey 4-bit words. The group decided to make money by servicing the growing rave scene. Being based in Vienna and Zurich, the group needed a way to communicate, and chose to use the internet. This was a far from obvious thing to do in 1994. Connections were slow and unreliable. Sometimes it was easier to tape a hard drive full of clubland graphics to the underside of a seat on the express train from Zurich to Vienna and simply email instructions to meet the train and retrieve it. The web was a primitive instrument in 1995 when etoy built its first website. They launched it with a party called etoy.FASTLANE, an optimistic title when the web was anything but. Coco, a transsexual model and tabloid sensation, sang a Japanese song while suspended in the air. She brought media interest, and was anointed etoy’s lifestyle angel. As Wishart and Bochsler write, “it was as if the Seven Dwarfs had discovered their Snow White.” (Wishart & Boschler: 33) The launch didn’t lead to much in the way of a music deal or television exposure. The old media were not so keen to validate the etoy dream of lifting themselves into fame and fortune by their bootstraps. And so etoy decided to be stars of the new media. The slogan was suitably revised: “etoy: the pop star is the pilot is the coder is the designer is the architect is the manager is the system is etoy.” (Wishart & Boschler: 34) The etoy boys were more than net.artists, they were artists of the brand. The brand was achieving a new prominence in the mid-90s. (Klein: 35) This was a time when capitalism was hollowing itself out in the overdeveloped world, shedding parts of its manufacturing base. 
Control of the circuits of commodification would rest less on the ownership of the means of production and more on maintaining a monopoly on the flows of information. The leading edge of the ruling class was becoming self-consciously vectoral. It controlled the flow of information about what to produce – the details of design, the underlying patents. It controlled the flows of information about what is produced – the brands and logos, the slogans and images. The capitalist class is supplanted by a vectoral class, controlling the commodity circuit through the vectors of information. (Wark) The genius of etoy was to grasp the aesthetic dimension of this new stage of commodification. The etoy boys styled themselves not so much as a parody of corporate branding and management groupthink, but as a logical extension of it. They adopted matching uniforms and called themselves agents. In the dada-punk-hiphop tradition, they launched themselves on the world as brand new, self-created, self-named subjects: Agents Zai, Brainhard, Gramazio, Kubli, Esposto, Udatny and Goldstein. The etoy.com website was registered in 1995 with Network Solutions for a $100 fee. The homepage for this etoy.TANKSYSTEM was designed like a flow chart. As Gramazio says: “We wanted to create an environment with surreal content, to build a parallel world and put the content of this world into tanks.” (Wishart & Boschler: 51) One tank was a cybermotel, with Coco the first guest. Another tank showed you your IP number, with a big-brother eye looking on. A supermarket tank offered sunglasses and laughing gas for sale, which may or may not be delivered. The underground tank included hardcore photos of a sensationalist kind. A picture of the Federal Building in Oklahoma City after the bombing was captioned in deadpan post-situ style “such work needs a lot of training.” (Wishart & Boschler: 52) The etoy agents were by now thoroughly invested in the etoy brand and the constellation of images they had built around it, on their website. Their slogan became “etoy: leaving reality behind.” (Wishart & Boschler: 53) They were not the first artists fascinated by commodification. It was Warhol who said “good art is good business.” (Warhol) But etoy reversed the equation: good business is good art. And good business, in this vectoral age, is in its most desirable form an essentially conceptual matter of creating a brand at the center of a constellation of signifiers. Late in 1995, etoy held another group meeting, at the Zurich youth center Dynamo. The problem was that while they had built a hardcore website, nobody was visiting it. Agents Goldstein and Udatny thought that there might be a way of using the new search engines to steer visitors to the site. Zai and Brainhard helped secure a place at the Vienna Academy of Applied Arts where Udatny could use the computer lab to implement this idea. Udatny’s first step was to create a program that would go out and gather email addresses from the web. These addresses would form the lists for the early examples of art-spam that etoy would perpetrate. Udatny’s second idea was a bit more interesting. He worked out how to get the etoy.TANKSYSTEM page listed in search engines. Most search engines ranked pages by the frequency of the search term in the pages they had indexed, so etoy.TANKSYSTEM would contain pages of selected keywords. Porn sites were also discovering this method of creating free publicity. 
The difference was that etoy chose a very carefully curated list of 350 search terms, including: art, bondage, cyberspace, Doom, Elvis, Fidel, genx, heroin, internet, jungle and Kant. Users of search engines who searched for these terms would find dummy pages listed prominently in their search results that directed them, unsuspectingly, to etoy.com. They called this project Digital Hijack. To give the project a slightly political aura, the pages the user was directed to contained an appeal for the release of convicted hacker Kevin Mitnick. This was the project that won them a Golden Nica statuette at Ars Electronica in 1996, which Gramazio allegedly lost the same night playing roulette. It would also, briefly, require that they explain themselves to the police. Digital Hijack also led to the first splits in the group, under the intense pressure of organizing it on a notionally collective basis, but with the zealous Agent Zai acting as de facto leader. When Udatny was expelled, Zai and Brainhard even repossessed his Toshiba laptop, bought with etoy funds. As Udatny recalls, “It was the lowest point in my life ever. There was nothing left; I could not rely on etoy any more. I did not even have clothes, apart from the etoy uniform.” (Wishart & Boschler: 104) Here the etoy story repeats a common theme from the history of the avant gardes as forms of collective subjectivity. After Digital Hijack, etoy went into a bit of a slump. It’s something of a problem for a group so dependent on recognition from the other of the media, that without a buzz around them, etoy would tend to collapse in on itself like a fading supernova. Zai spend the early part of 1997 working up a series of management documents, in which he appeared as the group’s managing director. Zai employed the current management theory rhetoric of employee ‘empowerment’ while centralizing control. Like any other corporate-Trotskyite, his line was that “We have to get used to reworking the company structure constantly.” (Wishart & Boschler: 132) The plan was for each member of etoy to register the etoy trademark in a different territory, linking identity to information via ownership. As Zai wrote “If another company uses our name in a grand way, I’ll probably shoot myself. And that would not be cool.” (Wishart & Boschler:: 132) As it turned out, another company was interested – the company that would become eToys.com. Zai received an email offering “a reasonable sum” for the etoy.com domain name. Zai was not amused. “Damned Americans, they think they can take our hunting grounds for a handful of glass pearls….”. (Wishart & Boschler: 133) On an invitation from Suzy Meszoly of C3, the etoy boys traveled to Budapest to work on “protected by etoy”, a work exploring internet security. They spent most of their time – and C3’s grant money – producing a glossy corporate brochure. The folder sported a blurb from Bjork: “etoy: immature priests from another world” – which was of course completely fabricated. When Artothek, the official art collection of the Austrian Chancellor, approached etoy wanting to buy work, the group had to confront the problem of how to actually turn their brand into a product. The idea was always that the brand was the product, but this doesn’t quite resolve the question of how to produce the kind of unique artifacts that the art world requires. Certainly the old Conceptual Art strategy of selling ‘documentation’ would not do. The solution was as brilliant as it was simple – to sell etoy shares. 
The ‘works’ would be ‘share certificates’ – unique objects, whose only value, on the face of it, would be that they referred back to the value of the brand. The inspiration, according to Wishart & Boschler, was David Bowie, ‘the man who sold the world’, who had announced the first rock and roll bond on the London financial markets, backed by future earnings of his back catalogue and publishing rights. Gramazio would end up presenting Chancellor Viktor Klima with the first ‘shares’ at a press conference. “It was a great start for the project”, he said, “A real hack.” (Wishart & Boschler: 142) For this vectoral age, etoy would create the perfect vectoral art. Zai and Brainhard took off next for Pasadena, where they got the idea of reverse-engineering the online etoy.TANKSYSTEM by building an actual tank in an orange shipping container, which would become etoy.TANK 17. This premiered at the San Francisco gallery Blasthaus in June 1998. Instant stars in the small world of San Francisco art, the group began once again to disintegrate. Brainhard and Esposito resigned. Back in Europe in late 1998, Zai was preparing to graduate from the Vienna Academy of Applied Arts. His final project would recapitulate the life and death of etoy. It would exist from here on only as an online archive, a digital mausoleum. As Kubli says “there was no possibility to earn our living with etoy.” (Wishart & Boschler: 192) Zai emailed eToys.com and asked them if they would like to place a banner ad on etoy.com, to redirect any errant web traffic. Lawyers for eToys.com offered etoy $30,000 for the etoy.com domain name, which the remaining members of etoy – Zai, Gramazio, Kubli – refused. The offer went up to $100,000, which they also refused. Through their lawyer Peter Wild they demanded $750,000. In September 1999, while etoy were making a business presentation as their contribution to Ars Electronica, eToys.com lodged a complaint against etoy in the Los Angeles Superior Court. The company hired Bruce Wessel, of the heavyweight LA law firm Irell & Manella, who specialized in trademark, copyright and other intellectual property litigation. The complaint Wessel drafted alleged that etoy had infringed and diluted the eToys trademark, were practicing unfair competition and had committed “intentional interference with prospective economic damage.” (Wishart & Boschler: 199) Wessel demanded an injunction that would oblige etoy to cease using its trademark and take down its etoy.com website. The complaint also sought to prevent etoy from selling shares, and demanded punitive damages. Displaying the aggressive lawyering for which he was so handsomely paid, Wessel invoked the California Unfair Competition Act, which was meant to protect citizens from fraudulent business scams. Meant as a piece of consumer protection legislation, its sweeping scope made it available for inventive suits such as Wessel’s against etoy. Wessel was able to use pretty much everything from the archive etoy built against it. As Wishart and Bochsler write, “The court papers were like a delicately curated catalogue of its practices.” (Wishart & Boschler: 199) And indeed, legal documents in copyright and trademark cases may be the most perfect literature of the vectoral age. The Unfair Competition claim was probably aimed at getting the suit heard in a Californian rather than a Federal court in which intellectual property issues were less frequently litigated. 
The central aim of the eToys suit was the trademark infringement, but on that head their claims were not all that strong. According to the 1946 Lanham Act, similar trademarks do not infringe upon each other if they are for different kinds of business or in different geographical areas. The Act also says that the right to own a trademark depends on its use. So while etoy had not registered their trademark and eToys had, etoy were actually up and running before eToys, and could base their trademark claim on this fact. The eToys case rested on a somewhat selective reading of the facts. Wessel claimed that etoy was not using its trademark in the US when eToys was registered in 1997. Wessel did not dispute the fact that etoy existed in Europe prior to that time. He asserted that owning the etoy.com domain name was not sufficient to establish a right to the trademark. If the intention of the suit was to bully etoy into giving in, it had quite the opposite effect. It pissed them off. “They felt again like the teenage punks they had once been”, as Wishart & Bochsler put it. Their art imploded in on itself for lack of attention, but called upon by another, it flourished. Wessel and eToys.com unintentionally triggered a dialectic that worked in quite the opposite way to what they intended. The more pressure they put on etoy, the more valued – and valuable – they felt etoy to be. Conceptual business, like conceptual art, is about nothing but the management of signs within the constraints of given institutional forms of market. That this conflict was about nothing made it a conflict about everything. It was a perfectly vectoral struggle. Zai and Gramazio flew to the US to fire up enthusiasm for their cause. They asked Wolfgang Staehle of The Thing to register the domain toywar.com, as a space for anti-eToys activities at some remove from etoy.com, and as a safe haven should eToys prevail with their injunction in having etoy.com taken down. The etoy defense was handled by Marcia Ballard in New York and Robert Freimuth in Los Angeles. In their defense, they argued that etoy had existed since 1994, had registered its globally accessible domain in 1995, and won an international art prize in 1996. To counter a claim by eToys that they had a prior trademark claim because they had bought a trademark from another company that went back to 1990, Ballard and Freimuth argued that this particular trademark only applied to the importation of toys from the previous owner’s New York base and thus had no relevance. They capped their argument by charging that eToys had not shown that its customers were really confused by the existence of etoy. With Christmas looming, eToys wanted a quick settlement, so they offered Zurich-based etoy lawyer Peter Wild $160,000 in shares and cash for the etoy domain. Kubli was prepared to negotiate, but Zai and Gramazio wanted to gamble – and raise the stakes. As Zai recalls: “We did not want to be just the victims; that would have been cheap. We wanted to be giants too.” (Wishart & Boschler: 207) They refused the offer. The case was heard in November 1999 before Judge Rafeedie in the Federal Court. Freimuth, for etoy, argued that Federal Court was the right place for what was essentially a trademark matter. Robert Kleiger, for eToys, countered that it should stay where it was because of the claims under the California Unfair Competition Act. Judge Rafeedie took little time in agreeing with the eToys lawyer. Wessel’s strategy paid off and eToys won the first skirmish. 
The first round of a quite different kind of conflict opened when etoy sent out their first ‘toywar’ mass mailing, drawing the attention of the net.art, activism and theory crowd to these events. This drew a report from Felix Stalder in Telepolis: “Fences are going up everywhere, molding what once seemed infinite space into an overcrowded and tightly controlled strip mall.” (Stalder ) The positive feedback from the net only emboldened etoy. For the Los Angeles court, lawyers for etoy filed papers arguing that the sale of ‘shares’ in etoy was not really a stock offering. “The etoy.com website is not about commerce per se, it is about artist and social protest”, they argued. (Wishart & Boschler: 209) They were obliged, in other words, to assert a difference that the art itself had intended to blur in order to escape eToy’s claims under the Unfair Competition Act. Moreover, etoy argued that there was no evidence of a victim. Nobody was claiming to have been fooled by etoy into buying something under false pretences. Ironically enough, art would turn out in hindsight to be a more straightforward transaction here, involving less simulation or dissimulation, than investing in a dot.com. Perhaps we have reached the age when art makes more, not less, claim than business to the rhetorical figure of ‘reality’. Having defended what appeared to be the vulnerable point under the Unfair Competition law, etoy went on the attack. It was the failure of eToys to do a proper search for other trademarks that created the problem in the first place. Meanwhile, in Federal Court, lawyers for etoy launched a counter-suit that reversed the claims against them made by eToys on the trademark question. While the suits and counter suits flew, eToys.com upped their offer to settle to a package of cash and shares worth $400,000. This rather puzzled the etoy lawyers. Those choosing to sue don’t usually try at the same time to settle. Lawyer Peter Wild advised his clients to take the money, but the parallel tactics of eToys.com only encouraged them to dig in their heels. “We felt that this was a tremendous final project for etoy”, says Gramazio. As Zai says, “eToys was our ideal enemy – we were its worst enemy.” (Wishart & Boschler: 210) Zai reported the offer to the net in another mass mail. Most people advised them to take the money, including Doug Rushkoff and Heath Bunting. Paul Garrin counseled fighting on. The etoy agents offered to settle for $750,000. The case came to court in late November 1999 before Judge Shook. The Judge accepted the plausibility of the eToys version of the facts on the trademark issue, which included the purchase of a registered trademark from another company that went back to 1990. He issued an injunction on their behalf, and added in his statement that he was worried about “the great danger of children being exposed to profane and hardcore pornographic issues on the computer.” (Wishart & Boschler: 222) The injunction was all eToys needed to get Network Solutions to shut down the etoy.com domain. Zai sent out a press release in early December, which percolated through Slashdot, rhizome, nettime (Staehle) and many other networks, and catalyzed the net community into action. A debate of sorts started on investor websites such as fool.com. The eToys stock price started to slide, and etoy ‘warriors’ felt free to take the credit for it. The story made the New York Times on 9th December, Washington Post on the 10th, Wired News on the 11th. 
Network Solutions finally removed the etoy.com domain on the 10th December. Zai responded with a press release: “this is robbery of digital territory, American imperialism, corporate destruction and bulldozing in the way of the 19th century.” (Wishart & Boschler: 237) RTMark set up a campaign fund for toywar, managed by Survival Research Laboratories’ Mark Pauline. The RTMark press release promised a “new internet ‘game’ designed to destroy eToys.com.” (Wishart & Boschler: 239) The RTMark press release grabbed the attention of the Associated Press newswire. The eToys.com share price actually rose on December 13th. Goldman Sachs’ e-commerce analyst Anthony Noto argued that the previous declines in the eToys share price made it a good buy. Goldman Sachs was the lead underwriter of the eToys IPO. Noto’s writings may have been nothing more than the usual ‘IPOetry’ of the time, but the crash of the internet bubble was some months away yet. The RTMark campaign was called ‘The Twelve Days of Christmas’. It used the Floodnet technique that Ricardo Dominguez used in support of the Zapatistas. As Dominguez said, “this hysterical power-play perfectly demonstrates the intentions of the new net elite: to turn the World Wide Web into their own private home-shopping network.” (Wishart & Boschler: 242) The Floodnet attack may have slowed the eToys.com server down a bit, but it was robust and didn’t crash. Ironically, it ran on open source software. Dominguez claims that the ‘Twelve Days’ campaign, which relied on individuals manually launching Floodnet from their own computers, was not designed to destroy the eToys site, but to make a protest felt. “We had a single-bullet script that could have taken down eToys – a tactical nuke, if you will. But we felt this script did not represent the presence of a global group of people gathered to bear witness to a wrong.” (Wishart & Boschler: 245) While the eToys engineers did what they could to keep the site going, eToys also approached universities and businesses whose systems were being used to host Floodnet attacks. The Thing, which hosted Dominguez’s eToys Floodnet site, was taken offline by The Thing’s ISP, Verio. After taking down the Floodnet scripts, The Thing was back up, restoring service to the 200-odd websites that The Thing hosted besides the offending Floodnet site. About 200 people gathered on December 20th at a demonstration against eToys outside the Museum of Modern Art. Among the crowd were Santas bearing signs that said ‘Coal for eToys’. The rally, inside the Museum, was led by the Reverend Billy of the Church of Stop Shopping: “We are drowning in a sea of identical details”, he said. (Wishart & Boschler: 249-250) Meanwhile etoy worked on the Toywar Platform, an online agitpop theater spectacle, in which participants could act as soldiers in the toywar. This would take some time to complete – ironically the dispute threatened to end before this last etoy artwork was ready, giving etoy further incentives to keep the dispute alive. The etoy agents had a new lawyer, Chris Truax, who was attracted to the case by the publicity it was generating. Through Truax, etoy offered to sell the etoy domain and trademark for $3.7 million. This may sound like an insane sum, but to put it in perspective, the business.com site changed hands for $7.5 million around this time. On December 29th, Wessel signaled that eToys was prepared to compromise. The problem was, the Toywar Platform was not quite ready, so etoy did what it could to drag out the negotiations. 
The site went live just before the scheduled court hearings, January 10th 2000. “TOYWAR.com is a place where all servers and all involved people melt and build a living system. In our eyes it is the best way to express and document what’s going on at the moment: people start to [think] about new ways to fight for their ideas, their lifestyle, contemporary culture and power relations.” (Wishart & Boschler: 263) Meanwhile, in a California courtroom, Truax demanded that Network Solutions restore the etoy domain, that eToys pay the etoy legal expenses, and that the case be dropped without prejudice. No settlement was reached. Negotiations dragged on for another two weeks, with the etoy agents’ attention somewhat divided between two horizons – art and law. The dispute was settled on 25th January. Both parties dismissed their complaints without prejudice. The eToys company would pay the etoy artists $40,000 for legal costs, and contact Network Solutions to reinstate the etoy domain. “It was a pleasure doing business with one of the biggest e-commerce giants in the world” ran the etoy press release. (Wishart & Boschler: 265) That would make a charming end to the story. But what goes around comes around. Brainhard, still pissed off with Zai after leaving the group in San Francisco, filed for the etoy trademark in Austria. After that, the internal etoy wranglings just get boring. But it was fun while it lasted. What etoy grasped intuitively was the nexus between the internet as a cultural space and the transformation of the commodity economy in a yet-more abstract direction – its becoming-vectoral. They zeroed in on the heart of the new era of conceptual business – the brand. As Wittgenstein says of language, what gives words meaning is other words, so too for brands. What gives brands meaning is other brands. There is a syntax for brands as there is for words. What etoy discovered is how to insert a new brand into that syntax. The place of eToys as a brand depended on their business competition with other brands – with Toys ‘R’ Us, for example. For etoy, the syntax they discovered for relating their brand to another one was a legal opposition. What made etoy interesting was their lack of moral posturing. Their abandonment of leftist rhetorics opened them up to exploring the territory where media and business meet, but it also made them vulnerable to being consumed by the very dialectic that created the possibility of staging etoy in the first place. By abandoning obsolete political strategies, they discovered a media tactic, which collapsed for want of a new strategy, for the new vectoral terrain on which we find ourselves. Works Cited Negri, Antonio. Time for Revolution. Continuum, London, 2003. Warhol, Andy. From A to B and Back Again. Picador, New York, 1984. Stalder, Felix. ‘Fences in Cyberspace: Recent events in the battle over domain names’. 19 Jun 2003. <http://felix.openflows.org/html/fences.php>. Wark, McKenzie. ‘A Hacker Manifesto [version 4.0]’. 19 Jun 2003. <http://subsol.c3.hu/subsol_2/contributors0/warktext.html>. Klein, Naomi. No Logo. Harper Collins, London, 2000. Wishart, Adam & Regula Bochsler. Leaving Reality Behind: etoy vs eToys.com & Other Battles to Control Cyberspace. Ecco Books, 2003. Staehle, Wolfgang. ‘<nettime> etoy.com shut down by US court.’ 19 Jun 2003. 
<http://amsterdam.nettime.org/Lists-Archives/nettime-l-9912/msg00005.html>.
APA, Harvard, Vancouver, ISO, and other styles
