
Journal articles on the topic 'Online social networks – Zimbabwe – Economic aspects'



Consult the top 48 journal articles for your research on the topic 'Online social networks – Zimbabwe – Economic aspects.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse journal articles in a wide variety of disciplines and organise your bibliography correctly.

1

Zhukovskaya, O. Y. "Social Capital and Social Networks under the Conditions of Digitalization: Interconnections and Implementation Features." Digital Transformation, no. 4 (January 7, 2021): 21–33. http://dx.doi.org/10.38086/2522-9613-2020-4-21-33.

Full text
Abstract:
The goal of the article is to analyze the new aspects of the accumulation and use of social capital, an important determinant of well-being, as well as the opportunities for its growth, in connection with the active development of social networks under the conditions of digitalization and the current social and economic situation. The interconnections and the development of social capital, social media and social networks were investigated in the context of the digital divide concept. It was suggested that digital social capital be distinguished, taking into account the different sources and effects of this social and economic phenomenon. In addition, an econometric analysis of social capital, various social and economic indicators and online activity was conducted. Finally, information and communication technologies (hereafter ICT) were analysed on the basis of the quantitative and qualitative aspects of modern social capital, social media and social networks, considering the dual nature of the Internet and of digitalization.
APA, Harvard, Vancouver, ISO, and other styles
2

Pavari, Never. "Psychosocial Impacts of Covid 19 Pandemic in Zimbabwe." Journal of Public Administration and Governance 10, no. 3 (2020): 228. http://dx.doi.org/10.5296/jpag.v10i3.17687.

Full text
Abstract:
The Covid-19 pandemic has continued to cause socio-economic damage that will take a long time to recover from, while no vaccine is in sight. The impacts are affecting the social well-being of global citizens, which triggers the need to investigate the psychosocial effects. To do so, and to provide the missing African context, the study was conducted in Zimbabwe. Due to lockdown restrictions, samples were obtained using an online survey and social media platforms. Analysis was done, qualitatively and quantitatively, to determine the effects of the virus so far on the general economy, on psychological and social aspects, and on the religious values of citizens. The findings indicated that economic losses were recorded at household, corporate and national levels. Indicators included increases in prices and the exchange rate, which eroded the buying power of the local currency and increased the cost of basic commodities. Economic pressures, Covid-19 trends and the restrictions imposed caused psychological damage, including fear, a feeling of uncertainty and loss of hope. The study recommended increased awareness and psychological support to help citizens overcome mental distress. The study has the potential to assist policy makers, health practitioners and development partners in developing strategies to eliminate Covid-19-related psychosocial pressures in Zimbabwe and globally.
APA, Harvard, Vancouver, ISO, and other styles
3

Mutanana, Ngonidzashe. "Social Media and Political Mobilisation: An Analysis of the July 2016 Zimbabwe Shut Down." American Journal of Trade and Policy 4, no. 1 (2017): 19–24. http://dx.doi.org/10.18034/ajtp.v4i1.412.

Full text
Abstract:
This study sought to analyze the effects of social media on political mobilization. These were analyzed using the following indicators: (i) social media as a communication tool and (ii) the role of social media in political mobilization. The study used a one-day demonstration that occurred in Zimbabwe, code-named #ZimShutDown2016, as a case study, within a qualitative case study research design. Secondary data from online newspaper reports and social media networks were used to analyze the effects of the social media movement in bringing real socio-economic and political change in developing countries such as Zimbabwe. In-depth interviews with five key informants from local universities, identified using a purposive sampling technique, supported the analysis. Findings from the study revealed that social media is an effective tool of communication among citizens. Information is exchanged minute by minute among citizens, and this encourages ‘citizenry journalism.’ As such, social media has a positive impact in mobilizing the community to bring real social, political and economic change. The study therefore recommends a longer survey on the challenges of the social media movement in developing countries such as Zimbabwe.
APA, Harvard, Vancouver, ISO, and other styles
4

Bailey, Michael, Rachel Cao, Theresa Kuchler, Johannes Stroebel, and Arlene Wong. "Social Connectedness: Measurement, Determinants, and Effects." Journal of Economic Perspectives 32, no. 3 (2018): 259–80. http://dx.doi.org/10.1257/jep.32.3.259.

Full text
Abstract:
Social networks can shape many aspects of social and economic activity: migration and trade, job-seeking, innovation, consumer preferences and sentiment, public health, social mobility, and more. In turn, social networks themselves are associated with geographic proximity, historical ties, political boundaries, and other factors. Traditionally, the unavailability of large-scale and representative data on social connectedness between individuals or geographic regions has posed a challenge for empirical research on social networks. More recently, a body of such research has begun to emerge using data on social connectedness from online social networking services such as Facebook, LinkedIn, and Twitter. To date, most of these research projects have been built on anonymized administrative microdata from Facebook, typically by working with coauthor teams that include Facebook employees. However, there is an inherent limit to the number of researchers that will be able to work with social network data through such collaborations. In this paper, we therefore introduce a new measure of social connectedness at the US county level. Our Social Connectedness Index is based on friendship links on Facebook, the global online social networking service. Specifically, the Social Connectedness Index corresponds to the relative frequency of Facebook friendship links between every county-pair in the United States, and between every US county and every foreign country. Given Facebook’s scale as well as the relative representativeness of Facebook’s user body, these data provide the first comprehensive measure of friendship networks at a national level.
APA, Harvard, Vancouver, ISO, and other styles
5

Garcia, David, Yonas Mitike Kassa, Angel Cuevas, et al. "Analyzing gender inequality through large-scale Facebook advertising data." Proceedings of the National Academy of Sciences 115, no. 27 (2018): 6958–63. http://dx.doi.org/10.1073/pnas.1717781115.

Full text
Abstract:
Online social media are information resources that can have a transformative power in society. While the Web was envisioned as an equalizing force that allows everyone to access information, the digital divide prevents large numbers of people from being present online. Online social media, in particular, are prone to gender inequality, an important issue given the link between social media use and employment. Understanding gender inequality in social media is a challenging task due to the necessity of data sources that can provide large-scale measurements across multiple countries. Here, we show how the Facebook Gender Divide (FGD), a metric based on aggregated statistics of more than 1.4 billion users in 217 countries, explains various aspects of worldwide gender inequality. Our analysis shows that the FGD encodes gender equality indices in education, health, and economic opportunity. We find gender differences in network externalities that suggest that using social media has an added value for women. Furthermore, we find that low values of the FGD are associated with increases in economic gender equality. Our results suggest that online social networks, while suffering evident gender imbalance, may lower the barriers that women have to access informational resources and help to narrow the economic gender gap.
APA, Harvard, Vancouver, ISO, and other styles
6

Parinsi, Mario Tulenan, and Keith Francis Ratumbuisang. "Indonesian Mobile Learning Information System Using Social Media Platforms." International Journal of Mobile Computing and Multimedia Communications 8, no. 2 (2017): 44–67. http://dx.doi.org/10.4018/ijmcmc.2017040104.

Full text
Abstract:
As a developing country, Indonesia continues to improve its quality as a state, attempting to optimize all of its potential in economic, political, social, cultural, technological, educational, health and other terms. In this modern era, all aspects of life depend on technology, which makes technology a necessity in people's lives and leads to its use in every domain. Specifically, this paper offers an innovation that has not been designed before: an M-Learning platform in the form of social media, related to the development of technology for learning. Because internet use and smartphone ownership in Indonesia have increased dramatically, the writers took the initiative to design an innovation for this situation. Social media technologies provide the opportunity for teachers to engage students in online classes, thereby supporting the development of skills and helping learners achieve competency. In addition to students, the opportunity is also open to the wider community to obtain information that can add to their knowledge. This case study provides an M-Learning platform that facilitates student learning and also helps society obtain information more easily. The platform is designed using UML (Unified Modeling Language) to create a visual model of the system.
APA, Harvard, Vancouver, ISO, and other styles
7

Rezaeian, Ali, Sajjad Shokouhyar, and Shahabedin Yousefi. "Intention to purchase behavior on social e-commerce website across cultures (case study: Iranian online purchaser)." INTERNATIONAL JOURNAL OF COMPUTERS & TECHNOLOGY 15, no. 9 (2016): 7077–89. http://dx.doi.org/10.24297/ijct.v15i9.186.

Full text
Abstract:
With the increasing popularity of social media, millions of users use services such as Facebook, Twitter and MySpace. Many organizations see this phenomenon as an opportunity to create new business, which is known as social commerce. This phenomenon is due not only to the growth of social media but also to users' participation in the fate of the marketing and sale of products. E-commerce has thus undergone a revolution, driven by the adoption of Web 2.0 functionalities to increase customer participation and achieve greater economic value. Therefore, studying the behavior of buyers on social commerce platforms can create more value for e-commerce owners in the context of social commerce. For this reason, attempts were made to obtain more accurate findings regarding the behavior of e-commerce purchasers in social networks by taking into account the moderating influence of culture (Iranian online purchasers). This is an applied study; it is also considered a descriptive cross-sectional study with regard to the way data are collected. The analysis of data collected from 184 active professionals in the IT industry and users of social networks indicates the moderating effect of culture and the mediating role of trust in a social network community in terms of social identity, trust transference (familiarity), social influence (intimacy and friendship), cognitive style and the intention to purchase in the social business environment. Moreover, these findings also show that trust transference affects the intention to purchase in social networks with respect to familiarity, social identity and cognitive style. However, the direct effect of social influence (feeling close) on purchase intention was rejected.
APA, Harvard, Vancouver, ISO, and other styles
8

Gómez-Llanos, Eva, and Pablo Durán-Barroso. "Learning Design Decisions in Massive Open Online Courses (MOOC) Applied to Higher Education in Civil-Engineering Topics." Sustainability 12, no. 20 (2020): 8430. http://dx.doi.org/10.3390/su12208430.

Full text
Abstract:
Sustainable Development Goals (SDGs) reflect the relationship among social, economic, and environmental aspects of society. Massive open online courses (MOOCs) represent an opportunity to promote lifelong learning (SDG 4), complementing university education or providing knowledge to society free and openly. The objective of this work is to analyze experiences in one MOOC about wastewater treatment applied to higher education in civil engineering (SDG 6). The proposed educational methodology and the achieved participation results are studied. The MOOC had three editions and was hosted on the Miríadax platform. Data about sociodemographic characteristics, initial motivation, and satisfaction level were collected from questionnaires. The results highlighted the importance of design decisions to obtain a high completion rate: defining a target audience, without prejudice to the course’s open character, where the prior knowledge of students is crucial. The teaching methodology is based on autonomous and progressive learning, with short and direct master classes and social support: students are motivated to continue their training by opening complementary topics in the forums, following up on their doubts, and combining the course with social networks.
APA, Harvard, Vancouver, ISO, and other styles
9

Isaías, Pedro, and Fábio Coelho. "Web 2.0 Tools Adoption Model." International Journal of Information Communication Technologies and Human Development 5, no. 3 (2013): 64–79. http://dx.doi.org/10.4018/jicthd.2013070104.

Full text
Abstract:
The Internet has become a major sales platform, assuming an emergent importance in increasing the economic growth of businesses. Web 2.0 has been a very important change in the way people use the internet and it has created an impact in all sectors of society. This study emphasises the importance of including Web 2.0 tools in online retailing as a contribution for success. The focus of this research lies in Portuguese online retailers and the elaboration of an adoption model for Web 2.0 tools. Through an observation of the 36 most visited Portuguese e-Commerce websites, it was possible to gather information on their adoption patterns of these tools. Social networks, Rich Internet Applications, mashups and Really Simple Syndication were the most popular tools, while semantic search, wikis and blogs were the least implemented. These and other aspects were gathered and then used to build a Web 2.0 adoption model.
APA, Harvard, Vancouver, ISO, and other styles
10

Yurko, Olha. "The functioning of information system in the society of the Second modern in conditions of military conflict." Grani 23, no. 3 (2020): 108–17. http://dx.doi.org/10.15421/172031.

Full text
Abstract:
The article analyzes the features of the functioning of the information system in the society of the Second modernity under conditions of military conflict, and the connection of these features with characteristics of the political and economic systems of this type of society. Television continues to be the main source of information about the state of affairs in Ukraine and in the world, although its influence is decreasing. The concentration of media ownership in the hands of financial and industrial groups associated with political forces is an important issue. Online media and social networks are the second most important source of information about the state of affairs in Ukraine and in the world. Their increasing influence raises the question of the power of large internet companies, which have the ability to control information flows and can use the information they aggregate to apply specific political technologies of influence on the public sphere. In most countries these companies are beyond the control of the regulatory mechanisms of state institutions, which leaves the public sphere of nation-states vulnerable to influence from other countries and to unregulated aspects of online electoral campaigns. The crisis of confidence in traditional media increases the importance of offline and online networks of social interaction as a source of information. Data from Ukraine, Europe and the USA show that the loss of confidence in public institutions and the rise of populism are directly related to the decline in confidence in traditional media. The level of trust in vaccination in different regions of the world is also analyzed in the context of the functioning of media institutions and other public institutions. Modern media (both traditional and internet-based) tend to mix entertainment formats with political information. The conversion of politics into a show, the spread of fakes, the noticeable dependence of media on certain political and economic groups, media partiality and the weakening of expert filters undermine confidence in traditional and new media, even though social media remain important for the democratization of the public sphere. Decreasing confidence in the media in general is converted into confidence in particular media figures (bloggers, experts, etc.). The article also generalizes research on media consumption in Ukraine in the first part of 2019.
APA, Harvard, Vancouver, ISO, and other styles
11

Caran, Gustavo Miranda, Rose Marie Santini, and Jorge Calmon de Almeida Biolchini. "Use of social network to support visually impaired people: A Facebook case study." Transinformação 28, no. 2 (2016): 173–80. http://dx.doi.org/10.1590/2318-08892016000200004.

Full text
Abstract:
The use of Information and Communication Technologies can be seen as an important factor for social inclusion in its different aspects - economic, social, relational and informational, among others. Inclusion potentiality is even more relevant for groups of people who face limiting life conditions which determine social barriers. This study investigated the social support offered to people with disabilities based on the social network analysis method. The research objective was to make the online support dynamics for low vision people, friends and relatives evident, having as case study the Facebook Low Vision group. The social network modelling and quantitative analysis were performed from user data collection, posts, comments and likes. Contents were classified according to the type of support (Emotional or Instrumental) and according to its intention (Offered or Requested), represented in graphs as indicators for analysis. Results pointed towards a larger use rate of Instrumental and Offered support although a more intense and comprehensive exchange of Emotional and Requested support was found. Data collection limitations indicate the need for more empirical studies on the social use of socio-technical networks for different types of social support. This theme points to a research agenda about the role of information and communication technologies as a possible condition for inclusion, life quality and well-being of people with disabilities.
APA, Harvard, Vancouver, ISO, and other styles
12

Senne, Fabio Jose Novaes de. "Mapping the origin of digital inequalities: an empirical study about the city of São Paulo." Law, State and Telecommunications Review 11, no. 1 (2019): 303–30. http://dx.doi.org/10.26512/lstr.v11i1.24860.

Full text
Abstract:
Purpose – Approaches that attribute inequalities in access and use of the Internet to structural economic factors and contemplate the reproduction of individual off-line characteristics in the digital environment are predominant in the specialized literature. Recently, however, the focus has been shifting to the differences in patterns of digital inclusion according to characteristics of particular communities or territories.
 Methodology/approach/design – The empirical study investigates to what extent the territory matters to explain the variability of Internet use and the existence of ICT skills. Based on a sample survey conducted in 2016, the study analyses data collected on the city of São Paulo/Brazil.
 Findings – The results indicate that, in addition to socioeconomic conditions, territorial aspects are important for understanding digital inequalities. Nonetheless, it suggests that the level of territorial disaggregation must be taken into consideration when measuring the use of the Internet and ICT skills.
 Practical implications – The study highlights the need for deeper theoretical and methodological considerations of social, institutional and regulatory factors that affect the scenario of online inequalities, including place-based effects of urban policies.
APA, Harvard, Vancouver, ISO, and other styles
13

Sogari, Giovanni, Chiara Corbo, Martina Macconi, Davide Menozzi, and Cristina Mora. "Consumer attitude towards sustainable-labelled wine: an exploratory approach." International Journal of Wine Business Research 27, no. 4 (2015): 312–28. http://dx.doi.org/10.1108/ijwbr-12-2014-0053.

Full text
Abstract:
Purpose – This paper aims to investigate, using an exploratory approach, how environmental values and beliefs about sustainable labelling shape consumer attitude towards sustainable wine. Design/methodology/approach – Data were collected with an online survey from 495 Italian wine drinkers in 2013. The survey was advertised through websites, blogs, social networks and emails. Based on background research and literature review, ten hypotheses were tested. Then a structural equation model was constructed using latent variables to test the causal links specified in the model. Findings – The results show that attitude towards sustainable-labelled wine is shaped by both environmental and quality beliefs about sustainable wine, while it is not affected by the economic dimension of sustainability. In addition, age appears to have a slight effect on attitude because young consumers seem to be more interested in sustainability aspects of food products than older people are. Practical implications – The paper suggests that company communication strategies should focus on sustainable issues to meet the requirements of environmentally conscious consumers. At the same time, sustainable certification on wine labels may help wineries to become more competitive using verifiable sustainable claims to differentiate their products. Originality/value – The work adds to the literature on wine marketing by evaluating which variables influence consumer attitude towards sustainable-labelled wine and, at the same time, to what extent sustainable aspects are important during wine purchase.
APA, Harvard, Vancouver, ISO, and other styles
14

Marrow, Helen B., and Amanda Klekowski von Koppenfels. "Modeling American Migration Aspirations: How Capital, Race, and National Identity Shape Americans’ Ideas about Living Abroad." International Migration Review 54, no. 1 (2018): 83–119. http://dx.doi.org/10.1177/0197918318806852.

Full text
Abstract:
Recent scholarship proposes a “two-step” approach for better understanding mechanisms underlying the migration process, suggesting we study migration aspirations separately from migration behavior and that the one does not always translate directly into the other. Research on aspirations, however, concentrates on the Global South, despite growing migration flows originating in the Global North. Here, we fill this gap, drawing on a nationally representative online survey we commissioned in 2014 in the United States. Bivariate analysis shows that fully one-third of Americans surveyed reveal some aspiration to live abroad, a plurality of those primarily for the purpose of exploration. Multivariate analysis suggests that certain elements of cultural and social capital, including the networks Americans have with prior and current US citizen migrants, structure these aspirations, in tandem with strength of national attachment. Further, both cultural and economic aspects of class, alongside race and national attachment, shape where American aspirants envision going and why. While the existing literature addresses the motivations and profile of American migrants already living abroad, ours is the first study to examine Americans’ aspirations prospectively from the point of origin, thereby connecting the literature on Global North migration flows to that on migration aspirations.
APA, Harvard, Vancouver, ISO, and other styles
15

Frechette, Julie. "Keeping Media Literacy Critical during the Post-Truth Crisis over Fake News." International Journal of Critical Media Literacy 1, no. 1 (2019): 51–65. http://dx.doi.org/10.1163/25900110-00101004.

Full text
Abstract:
As citizens demand more media literacy education in schools, the criticality of media literacy must be advanced in meaningful and comprehensive ways that enable students to successfully access, analyze, evaluate and produce media ethically and effectively across diverse platforms and channels. Institutional analysis in the digital age means understanding who controls the architecture(s) of digital technology, and how they use it. Big data, high tech, and rich transnational global media all need to be carefully studied and held accountable. “Panopticonic” practices such as surveillance, geolocation, data mining, and niche microtargeting need to be studied as information brokers reap huge profits by amalgamating and selling off the data that internet and social media users unwittingly but willingly provide to companies. In light of the growing evidence that online-only networks create filter bubbles and polarization, people will need to interact and mobilize in offline real world spaces. Critical media literacy education must explore how human interactivity is undergoing tectonic shifts as powerful ideological and economic interests work to alter our digital media ecology. Such an approach will allow us to better leverage our public interest goals through a media landscape that preserves the multidirectional, participatory, global, networkable aspects of the digital world.
APA, Harvard, Vancouver, ISO, and other styles
16

Shulga, Olga. "Confidentiality and scam in the internet." University Economic Bulletin, no. 48 (March 30, 2021): 76–91. http://dx.doi.org/10.31470/2306-546x-2021-48-76-91.

Full text
Abstract:
The purpose of the work is to consider the theoretical and practical aspects of fraud in the Internet sphere and on this basis to identify ways to ensure the confidentiality and cybersecurity of private users and commercial organizations. The methodological basis of the work is the use of general and special methods of scientific knowledge. Methods of combining analysis and synthesis, induction and deduction have been used to identify different types of fraud in the Internet. Generalization methods, logical and empirical, were used in determining the directions of development of the national cyber defense system and ensuring confidentiality. The main results of the work: The most common methods of fraud with the use of bank payment cards are identified, among which: a fake poll on social networks with a prize draw; a phone call to obtain classified information; SIM card replacement for access to online banking; online payments on unsecured sites; phishing; copying card data when handed over; unsecured WI-Fi networks; computers in public places; skimming for card data theft; unauthorized micropayments; ATM fraud; use of malicious programs (viruses), fake sites in order to compromise the details of electronic payment instruments and/or logins/passwords for access to Internet/mobile banking systems; dissemination (sale, dissemination) of information on compromised data; terminal network fraud; fraud in remote service systems; social engineering. Basic security rules are defined to prevent fraud. The experience of European countries in the field of cybersecurity is analyzed. The directions of adaptation of the current legislation on cybersecurity to the EU standards are outlined and the directions of development of the national system of cybersecurity are defined. The practical significance of the results is to deepen the understanding of the nature and mechanism of various types of fraud in the Internet. The recommendations proposed in the paper can form a methodological and theoretical basis for the development of economic policy of the state to ensure the confidentiality and cybersecurity of private users and commercial organizations. Conclusions. The state should establish an effective oversight body in the field of personal data protection, but security measures and online restrictions should comply with international standards. The use of encryption should not be prohibited at the legislative level, as such restrictions reduce the ability of citizens to protect themselves from illegal intrusions into privacy. In addition, the state policy in the Internet should be aimed at promoting the development and operation of secure Internet technologies and the formation of mechanisms to protect against services and protocols that threaten the technical functioning of the Internet from viruses, phishing and more.
APA, Harvard, Vancouver, ISO, and other styles
17

Mendoza Navarrete, Martha Lorena, Yenny Alexandra Zambrano Villegas, Lilia del Rocio Bermudez Cevallos, and Yanina Alexandra Viteri Alcivar. "New technologies and new paradigms: the new technological societies approach." Universidad Ciencia y Tecnología 25, no. 110 (2021): 155–63. http://dx.doi.org/10.47460/uct.v25i110.487.

Full text
Abstract:
New technologies represent novelty depending on the era in which they are viewed, but in all cases they represent social evolution in some way. At present, new technologies are associated with the use of computer tools that strengthen processes, mechanisms, and undoubtedly, social communication. This paper evaluates new technologies focused on social transformations, their impact on human behavior and the social repercussions they may bring with their prevalence over time. Several academic documents of a scientific and technical nature are evaluated, with a view to defining the paradigms of technologies in their evolutionary process through societies. The most outstanding results show that the modern world is subject to a significant impact of information technology, that it encompasses not only educational aspects but also family, personal and economic aspects, and that the implications of traditional substitution by technology may be detrimental to mankind.
 Keywords: Technological implications, new societies, technological impact.
APA, Harvard, Vancouver, ISO, and other styles
18

Gramatiuk, Svetlana Mykolaivna, Irina Yuriivna Bagmut, Michael Ivanivich Sheremet, et al. "Pediatric biobanks and parents of disabled children associations opinions on establishing children repositories in developing countries." Journal of Medicine and Life 14, no. 1 (2021): 50–55. http://dx.doi.org/10.25122/jml-2020-0106.

Full text
Abstract:
Pediatric biobanks are an indispensable resource for the research needed to bring advances in personalized medicine into pediatric medical care. It is unclear how or when these advances in medical care may reach children, but it is unlikely that research in adults will be adequate. We conducted screening for a hypothetical problem in various European and American pediatric biobanks, using online surveys distributed by e-mail and based on the Biobank Economic Modeling Tool (BEMT) questionnaire model. Participants in the survey had at least 3 years of work experience in biobanking. Contact information for the survey participants was confirmed on social network profiles (LinkedIn), as well as on publicly available websites. First, we tried to create a model which can show the relationship between the pediatric preclinical and basic clinical phases and demonstrate how pediatric biobanking is linked to this process. Furthermore, we looked for new trends, with the final goal of putting the acquired knowledge into practice so that medical experts and patients could gain usable benefit from it. We concluded that those in leading positions must take into account ethical and legal aspects when considering the decision to include children in the biobank collection. However, communication with parents and children is essential. The biobank characteristics influence the biobank's motives to include children in the consent procedure. Moreover, the motives to include children influence how the children are involved in the consent procedure and the extent to which children are able to make voluntary decisions as part of the consent procedure.
APA, Harvard, Vancouver, ISO, and other styles
19

Лазаренко, Ірина Сергіївна, Станіслав Васильович Салоїд, Світлана Олександрівна Тульчинська, Сергій Олександрович Кириченко, and Ростислав Володимирович Тульчинський. "NECESSITY OF IMPLEMENTATING DATA SCIENCE COURSE IN ECONOMICS CURRICULA." Information Technologies and Learning Tools 78, no. 4 (2020): 132–44. http://dx.doi.org/10.33407/itlt.v78i4.3505.

Full text
Abstract:
The article describes the relevance and feasibility of implementing Data Science courses for leading economics majors: 051 Economics, 075 Marketing, 073 Management. The application of computer technology, mathematical methods and models, and statistical analysis in the study process for economics students became routine a long time ago, so why is Data Science linked mostly to faculties of information technology? The specificity of economic professions requires the acquisition of skills in working with large data sets, the qualitative evaluation of statistics, and the prediction of a large number of economic phenomena, so the economist of the future should be not only a specialist in the main subject area but also a specialist in Big Data and Data Mining. The study outlines the underlying background for essential changes. The article analyzes relevant educational and professional programs and blocks of disciplines that ensure the qualitative assimilation of new information by students and the acquisition of the abilities and skills needed by the modern specialist in the field of economics, making the student a serious competitor in the labor market. An analysis of modern international commercial online courses was conducted, specifying the topics and aspects necessary for future economics graduates. A logical scheme for the introduction of Data Science specialties, following the relevant cycle of existing disciplines of general and professional training, is proposed. Mastering the knowledge of qualitative data analysis and the tools for optimal work with data should be one of the main tasks of the methodological system of education and research at faculties of economics. Modern educational technologies and the scientific facilities of universities should help to expand the understanding and perception of the economist, marketer and manager professions, because digital advertising, SMM, social networks, online applications, project management, the State in a Smartphone, and other rapid transformations encourage the training not of classic specialists but of universal specialists who will be able to adapt quickly to the needs of the future.
APA, Harvard, Vancouver, ISO, and other styles
20

Shchekotin, Evgeniy, Viacheslav Goiko, Mikhail Myagkov, and Darya Dunaeva. "Assessment of quality of life in regions of Russia based on social media data." Journal of Eurasian Studies, July 28, 2021, 187936652110341. http://dx.doi.org/10.1177/18793665211034185.

Full text
Abstract:
The article offers a new method of quality of life assessment based on online activities of social networks users. The method has obvious advantages (quickness of research, low costs, large scale, and detailed character of the obtained information) and limitations (it covers only the “digital population,” whereas the rural population is not included). The article dwells on the potential of social networks as a data source to analyze the quality of life; it also presents the results of an empirical study of online activities of the users of VK, the most popular Russian social network. Using the obtained data, the authors have calculated the quality of life index for 83 regions of the Russian Federation based on 19 parameters of economic, social, and political aspects of life quality.
APA, Harvard, Vancouver, ISO, and other styles
21

Calvo, Dafne, Cristina Renedo Farpón, and María Díez- Garrido. "Podemos in the Regional Elections 2015: Online Campaign Strategies in Castile and León." RIPS: Revista de Investigaciones Políticas y Sociológicas 16, no. 2 (2017). http://dx.doi.org/10.15304/rips.16.2.3897.

Full text
Abstract:
The Internet has revolutionized many aspects of the way political parties communicate. The Network has induced a complete transformation of the political strategies used during election campaigns to spread parties' messages to the electorate. Politicians use social networks and digital platforms to promote their messages and to communicate with citizens during these periods. Facebook has proven to be one of the most effective networks in this regard. The party Podemos was born in 2014 in Spain, surrounded by a deep economic, institutional and political crisis, promising to be a real source of hope in the country's negative situation. In recent years, Podemos has aroused the interest of social scientists because of the innovative way it uses social networks. This paper explores the cybercampaign strategies that Podemos used during the 2015 Castile and León regional elections, a region whose population is very different from the party's average voter. To this end, a quantitative analysis of the party's activity on Facebook and its website was carried out during the regional election campaign. The results of this study reveal an ordinary use of digital tools by Podemos, which highlights the engagement achieved with users on Facebook.
APA, Harvard, Vancouver, ISO, and other styles
22

Kubheka, Brenda Zanele. "Bioethics and the use of social media for medical crowdfunding." BMC Medical Ethics 21, no. 1 (2020). http://dx.doi.org/10.1186/s12910-020-00521-2.

Full text
Abstract:
Background – Social media has globalised compassion, enabling requests for donations to spread beyond geographical boundaries. The use of social media for medical crowdfunding links people with unmet healthcare needs to charitable donors. There is no doubt that fundraising campaigns using such platforms facilitate access to financial resources to the benefit of patients and their caregivers.
 Main text – This paper reports on a critical review of the published literature and information from other online resources discussing medical crowdfunding and the related ethical questions. The review highlighted the benefits of crowdfunding as well as the under-exploration of the risk of having patients’ desires and human rights undermined during online fundraising campaigns. The majority of these campaigns are initiated on behalf of patients, especially the very sick and dependent. The ethical questions raised relate to the voluntariness of informed consent and the possibility of patients being used as a means to an end. The vulnerability of patients may expose them to coercion, undue influence, manipulation, and violation of their human rights. The success of these campaigns is influenced by digital skills, pre-existing social networks and emotional potency. Healthcare is a public good, and online market forces should not determine access to essential health services. The benefits of crowdfunding cannot be subverted, but it can perpetuate unintended injustices, especially those arising from socio-economic factors.
 Conclusions – Policymakers ought to monitor the utilisation of crowdfunding sites to identify policy failures and unmet essential health care needs responsible for driving individuals to use these platforms. The upholding of human rights and the fundamental respect of the individual’s wishes is a moral imperative. The need for an ethics framework to guide different stakeholders during medical crowdfunding needs further examination.
APA, Harvard, Vancouver, ISO, and other styles
23

Tseng, Emy, and Kyle Eischen. "The Geography of Cyberspace." M/C Journal 6, no. 4 (2003). http://dx.doi.org/10.5204/mcj.2224.

Full text
Abstract:
The Virtual and the Physical The structure of virtual space is a product of the Internet’s geography and technology. Debates around the nature of the virtual — culture, society, economy — often leave out this connection to “fibre”, to where and how we are physically linked to each other. Rather than signaling the “end of geography” (Mitchell 1999), the Internet reinforces its importance with “real world” physical and logical constraints shaping the geography of cyberspace. To contest the nature of the virtual world requires understanding and contesting the nature of the Internet’s architecture in the physical world. The Internet is built on a physical entity – the telecommunications networks that connect computers around the world. In order to participate on the Internet, one needs to connect to these networks. In an information society access to bandwidth determines the haves from the have-nots (Mitchell 1999), and bandwidth depends upon your location and economics. Increasingly, the new generation Internet distributes bandwidth unevenly amongst regions, cities, organizations, and individuals. The speed, type, size and quality of these networks determine the level and nature of participation available to communities. Yet these types of choices, the physical and technical aspects of the network, are the ones least understood, contested and linked to “real world” realities. The Technical is the Political Recently, the US government proposed a Total Information Awareness surveillance system for all digital communications nationally. While technically unworkable on multiple fronts, many believed that the architecture of the Internet simply prevented such data collection, because no physical access points exist through which all data flows. In reality, North America does have central access points – six to be exact – through which all data moves because it is physically impossible to create redundant systems. This simple factor of geography potentially shapes policies on speech, privacy, terrorism, and government-business relations to name just a few. These are not new issues or challenges, but merely new technologies. The geography of infrastructure – from electricity, train and telephone networks to the architectures of freeways, cities and buildings – has always been as much social and political as technical. The technology and the social norms embedded in the network geography (Eischen, 2002) are central to the nature of cyberspace. We may wish for a utopian vision, but the hidden social assumptions in mundane ‘engineering’ questions like the location of fibre or bandwidth quality will shape virtual world. The Changing Landscape of the Internet The original Internet infrastructure is being redesigned and rebuilt. The massive fibre-optic networks of the Internet backbones have been upgraded, and broadband access technologies – cable modem, Digital Subscriber Line (DSL) and now wireless Wi-Fi – are being installed closer to homes and businesses. New network technologies and protocols enable the network to serve up data even faster than before. However, the next generation Internet architecture is quite different from the popular utopian vision described above. The Internet is being restructured as an entertainment and commerce medium, driven by the convergence of telecommunications technologies and commercialization. 
It is moving towards a broadcast model where individual consumers have access to less upstream bandwidth than downstream, with the symmetry of vendor and customer redesigned and built to favor content depending on who provides, requests and receives content. This Internet infrastructure has both physical and logical components – the telecommunications networks that comprise the physical infrastructure and the protocols that comprise the logical infrastructure of the software that runs the Internet. We are in the process of partitioning this infrastructure, both physical and logical, into information conduits of different speeds and sizes. Access to these conduits depends on who and where you are. These emerging Internet infrastructure technologies – Broadband Access Networks, Caching and Content Delivery Networks, Quality of Service and Policy Protocols – are shaped by geographical, economic and social factors in their development, deployment and use. The Geography of Broadband These new broadband networks are being deployed initially in more privileged, densely populated communities in primary cities and their wealthy suburbs (Graham, 2000). Even though many have touted the potential of Wi-Fi networks to bring broadband to underserved areas, initial mappings of wireless deployment show correlation between income and location of hotspots (NYCWireless, 2003). Equally important, the most commonly deployed broadband technologies, cable modem and ADSL, follow a broadcast model by offering more downstream bandwidth than upstream bandwidth. Some cable companies limit upstream bandwidth even further to 256 Kbps in order to discourage subscribers from setting up home servers. The asymmetry of bandwidth leads to asymmetry of information flows where corporations produce information and users content. Internet Infrastructure: Toll Roads and the Priority of Packets The Internet originally was designed around ‘best effort’ service: data flows through the networks as packets, and all packets are treated equally. The TCP/IP protocols that comprise the Internet’s logical infrastructure (Lessig, 101) govern how data is transferred across the physical networks. In the Internet’s original design, each packet is routed to the best path known, with the transport quality level dependent on network conditions. However, network congestion and differing content locations lead to inconsistent levels of quality. In order to overcome Internet “bottlenecks”, technologies such as content caching and Quality of Service (QoS) protocols have been developed that allow large corporate customers to bypass the public infrastructure, partitioning the Internet into publicly and privately accessible data conduits or throughways. Since access is based on payment, these private throughways can be thought of as the new toll roads of the Internet. Companies such as Akamai are deploying private ‘content delivery’ networks. These networks replicate and store content in geographically dispersed servers close to the end users, reducing the distance content data needs to traverse. Large content providers pay these companies to store and serve their content on these networks. Internet Service Providers (ISPs) offer similar services for internal or hosted content. The Internet’s physical infrastructure consists of a system of interconnected networks. The major ISPs’ networks interconnect at Network Access Point (NAPs) the major intersections of the Internet backbone. 
Congestion at these public intersection points has resulted in InterNAP building and deploying private network access points (P-NAPs). Akamai content delivery network (Akamai, 2000) and InterNAP’s P-NAPs (InterNAP, 2000) deployment maps reveal a deployment of private infrastructure to a select group of highly-connected U.S. cities (Moss & Townsend, 2000), furthering the advantage these ‘global cities’ (Graham, 1999) have over other cities and regions. QoS protocols allow ISPs to define differing levels of service by providing preferential treatment to some amount of the network traffic. Smart routers, or policy routers, enable network providers to define policies for data packet treatment. The routers can discriminate between and prioritize the handling of packets based on destination, source, the ISP, data content type, etc. Such protocols and policies represent a departure from the original peer-to-peer architecture of data equality with ‘best-effort’. The ability to discriminate and prioritize data traffic is being built into the system, with economic and even political factors able to shape the way packets and content flow through the network. For example, during the war on Iraq, Akamai Technologies canceled its service contract with the Arabic news service Al Jazeera (CNET, 2003). Technology, Choices and Values To address the social choices underpinning seemingly benign technical choices of the next generation Internet, we need to understand the economic, geographic and social factors guiding choices about its design and deployment. Just as the current architecture of the Internet reflects the values of its original creators, this next generation Internet will reflect our choices and our values. The reality is that decisions with very long-term impacts will be made with or without debate. If any utopian vision of the Internet is to survive, it is crucial to embed the new architectures with specific values by asking difficult questions with no pre-defined or easy answers. These are questions that require social debate and consensus. Is the Internet fundamentally a public or private space? Who will have access? What information and whose information will be accessible? Which values and whose values should form the basis of the new infrastructure? Should the construction be subject to market forces alone or should ideas of social equity and fairness be embedded in the technology? Technologists, policy makers (at both national and local levels), researchers and the general public all have a part in determining the answers to these questions. Policymakers need to link future competition and innovation with equitable access for all citizens. Urban planners and local governments need to link infrastructure, economic sustainability and equity through public and public-private investments – especially in traditionally marginalized areas. Researchers need to continue mapping the complex interactions of investment in and deployment of infrastructure across the disciplines of economics, technology and urban planning. Technologists need to consider the societal implications and inform the policy debates of the technologies they build. Communities need to link technical issues with local ramifications, contesting and advocating with policymakers and corporations. The ultimate geography of cyberspace will reflect the geography of fibre. 
Understanding and contesting the present and future reality requires linking mundane technical questions with the questions of values in exactly these wider social and political debates.
 Works Cited
 Akamai. See <http://www.akamai.com/service/network.php>
 Eischen, Kyle. "The Social Impact of Informational Production: Software Development as an Informational Practice." Center for Global, International and Regional Studies Working Paper #2002-1. 2002. UC Santa Cruz. <http://cgirs.ucsc.edu/publications/workingpapers/>
 Graham, Stephen. "Global Grids of Glass: On Global Cities, Telecommunications and Planetary Urban Networks." Urban Studies. 1999. 36 (5-6).
 Graham, Stephen. "Constructing Premium Network Spaces: Reflections on Infrastructure Networks and Contemporary Urban Development." International Journal of Urban and Regional Research. 2000. 24(1) March.
 InterNAP. See <http://www.internap.com/html/news_05022000.htm>
 Junnarkar, Sandeep. "Akamai ends Al-Jazeera server support." CNET News.com, April 4, 2003. <http://news.com.com/1200-1035-995546.php>
 Lessig, Lawrence. Code and Other Laws of Cyberspace. New York: Basic Books, 1999.
 Mitchell, William. City of Bits. Cambridge, MA: MIT Press, 1999.
 Moss, Mitchell L. and Anthony M. Townsend. "The Internet Backbone and the American Metropolis." The Information Society Journal. 2000. 16(1): 35-47. Online at: <http://www.informationcity.org/research/internet-backbone-american-metropolis/index.htm>
 Public Internet Project. "802.11b Survey of NYC." <http://www.publicinternetproject.org/>
APA, Harvard, Vancouver, ISO, and other styles
24

Leaver, Tama. "The Social Media Contradiction: Data Mining and Digital Death." M/C Journal 16, no. 2 (2013). http://dx.doi.org/10.5204/mcj.625.

Full text
Abstract:
Introduction Many social media tools and services are free to use. This fact often leads users to the mistaken presumption that the associated data generated whilst utilising these tools and services is without value. Users often focus on the social and presumed ephemeral nature of communication – imagining something that happens but then has no further record or value, akin to a telephone call – while corporations behind these tools tend to focus on the media side, the lasting value of these traces which can be combined, mined and analysed for new insight and revenue generation. This paper seeks to explore this social media contradiction in two ways. Firstly, a cursory examination of Google and Facebook will demonstrate how data mining and analysis are core practices for these corporate giants, central to their functioning, development and expansion. Yet the public rhetoric of these companies is not about the exchange of personal information for services, but rather the more utopian notions of organising the world’s information, or bringing everyone together through sharing. The second section of this paper examines some of the core ramifications of death in terms of social media, asking what happens when a user suddenly exists only as recorded media fragments, at least in digital terms. Death, at first glance, renders users (or post-users) without agency or, implicitly, value to companies which data-mine ongoing social practices. Yet the emergence of digital legacy management highlights the value of the data generated using social media, a value which persists even after death. The question of a digital estate thus illustrates the cumulative value of social media as media, even on an individual level. The ways Facebook and Google approach digital death are examined, demonstrating policies which enshrine the agency and rights of living users, but become far less coherent posthumously. Finally, along with digital legacy management, I will examine the potential for posthumous digital legacies which may, in some macabre ways, actually reanimate some aspects of a deceased user’s presence, such as the Lives On service which touts the slogan “when your heart stops beating, you'll keep tweeting”. Cumulatively, mapping digital legacy management by large online corporations, and the affordances of more focussed services dealing with digital death, illustrates the value of data generated by social media users, and the continued importance of the data even beyond the grave. Google While Google is universally synonymous with search, and is the world’s dominant search engine, it is less widely understood that one of the core elements keeping Google’s search results relevant is a complex operation mining user data. Different tools in Google’s array of services mine data in different ways (Zimmer, “Gaze”). Gmail, for example, uses algorithms to analyse an individual’s email in order to display the most relevant related advertising. This form of data mining is comparatively well known, with most Gmail users knowingly and willingly accepting more personalised advertising in order to use Google’s email service. However, the majority of people using Google’s search engine are unaware that search, too, is increasingly driven by the tracking, analysis and refining of results on the basis of user activity (Zimmer, “Externalities”). 
As Alexander Halavais (160–180) quite rightly argues, recent focus on the idea of social search – the deeper integration of social network information in gauging search results – is oxymoronic; all search, at least for Google, is driven by deep analysis of personal and aggregated social data. Indeed, the success of Google’s mining of user data has led to concerns that often invisible processes of customisation and personalisation will mean that the supposedly independent or objective algorithms producing Google’s search results will actually yield a different result for every person. As Siva Vaidhyanathan laments: “as users in a diverse array of countries train Google’s algorithms to respond to specialized queries with localised results, each place in the world will have a different list of what is important, true, or ‘relevant’ in response to any query” (138). Personalisation and customisation are not inherently problematic, and frequently do enhance the relevance of search results, but the main objection raised by critics is not Google’s data mining, but the lack of transparency in the way data are recorded, stored and utilised. Eli Pariser, for example, laments the development of a ubiquitous “filter bubble” wherein all search results are personalised and subjective but are hidden behind the rhetoric of computer-driven algorithmic objectivity (Pariser). While data mining informs and drives many of Google’s tools and services, the cumulative value of these captured fragments of information is best demonstrated by the new service Google Now. Google Now is a mobile app which delivers an ongoing stream of search results but without the need for user input. Google Now extrapolates the rhythms of a person’s life, their interests and their routines in order to algorithmically determine what information will be needed next, and automatically displays it on a user’s mobile device. Clearly Google Now is an extremely valuable and clever tool, and the more information a user shares, the better the ongoing customised results will be, demonstrating the direct exchange value of personal data: total personalisation requires total transparency. Each individual user will need to judge whether they wish to share with Google the considerable amount of personal information needed to make Google Now work. The pressing ethical question that remains is whether Google will ensure that users are sufficiently aware of the amount of data and personal privacy they are exchanging in order to utilise such a service. Facebook Facebook began as a closed network, open only to students at American universities, but has transformed over time to a much wider and more open network, with over a billion registered users. Facebook has continually reinvented their interface, protocols and design, often altering both privacy policies and users’ experience of privacy, and often meeting significant and vocal resistance in the process (boyd). The data mining performed by social networking service Facebook is also extensive, although primarily aimed at refining the way that targeted advertising appears on the platform. In 2007 Facebook partnered with various retail loyalty services and combined these records with Facebook’s user data. This information was used to power Facebook’s Beacon service, which added details of users’ retail history to their Facebook news feed (for example, “Tama just purchased a HTC One”). 
The impact of all of these seemingly unrelated purchases turning up in many people’s feeds suddenly revealed the complex surveillance, data mining and sharing of these data that was taking place (Doyle and Fraser). However, as Beacon was turned on, without consultation, for all Facebook users, there was a sizable backlash that meant that Facebook had to initially switch the service to opt-in, and then discontinue it altogether. While Beacon has been long since erased, it is notable that in early 2013 Facebook announced that they have strengthened partnerships with data mining and profiling companies, including Datalogix, Epsilon, Acxiom, and BlueKai, which harness customer information from a range of loyalty cards, to further refine the targeting ability offered to advertisers using Facebook (Hof). Facebook’s data mining, surveillance and integration across companies is thus still going on, but no longer directly visible to Facebook users, except in terms of the targeted advertisements which appear on the service. Facebook is also a platform, providing a scaffolding and gateway to many other tools and services. In order to use social games such as Zynga’s Farmville, Facebook users agree to allow Zynga to access their profile information, and use Facebook to authenticate their identity. Zynga has been unashamedly at the forefront of user analytics and data mining, attempting to algorithmically determine the best way to make virtual goods within their games attractive enough for users to pay for them with real money. Indeed, during a conference presentation, Zynga Vice President Ken Rudin stated outright that Zynga is “an analytics company masquerading as a games company” (Rudin). I would contend that this masquerade succeeds, as few Farmville players are likely to consider how their every choice and activity is being algorithmically scrutinised in order to determine what virtual goods they might actually buy. As an instance of what is widely being called ‘big data’, the data mining operations of Facebook, Zynga and similar services lead to a range of ethical questions (boyd and Crawford). While users may have ostensibly agreed to this data mining after clicking on Facebook’s Terms of Use agreement, the fact that almost no one reads these agreements when signing up for a service is the Internet’s worst kept secret. Similarly, the extension of these terms when Facebook operates as a platform for other applications is a far from transparent process. While examining the recording of user data leads to questions of privacy and surveillance, it is important to note that many users are often aware of the exchange to which they have agreed. Anders Albrechtslund deploys the term ‘social surveillance’ to usefully emphasise the knowing, playful and at times subversive approach some users take to the surveillance and data mining practices of online service providers. Similarly, E.J. Westlake notes that performances of self online are often not only knowing but deliberately false or misleading with the aim of exploiting the ways online activities are tracked. However, even users well aware of Facebook’s data mining on the site itself may be less informed about the social networking company’s mining of offsite activity. The introduction of ‘like’ buttons on many other Websites extends Facebook’s reach considerably.
The various social plugins and ‘like’ buttons expand both active recording of user activity (where the like button is actually clicked) and passive data mining (since a cookie is installed or updated regardless of whether a button is actually pressed) (Gerlitz and Helmond). Indeed, because cookies – tiny packets of data exchanged and updated invisibly in browsers – assign each user a unique identifier, Facebook can either combine these data with an existing user’s profile or create profiles about non-users. If that person even joins Facebook, their account is connected with the existing, data-mined record of their Web activities (Roosendaal). As with Google, the significant issue here is not users knowingly sharing their data with Facebook, but the often complete lack of transparency in terms of the ways Facebook extracts and mines user data, both on Facebook itself and increasingly across applications using Facebook as a platform and across the Web through social plugins. Google after Death While data mining is clearly a core element in the operation of Facebook and Google, the ability to scrutinise the activities of users depends on those users being active; when someone dies, the question of the value and ownership of their digital assets becomes complicated, as does the way companies manage posthumous user information. For Google, the Gmail account of a deceased person becomes inactive; the stored email still takes up space on Google’s servers, but with no one using the account, no advertising is displayed and thus Google can earn no revenue from the account. However, the process of accessing the Gmail account of a deceased relative is an incredibly laborious one. In order to even begin the process, Google asks that someone physically mails a series of documents including a photocopy of a government-issued ID, the death certificate of the deceased person, evidence of an email the requester received from the deceased, along with other personal information. After Google have received and verified this information, they state that they might proceed to a second stage where further documents are required. Moreover, if at any stage Google decide that they cannot proceed in releasing a deceased relative’s Gmail account, they will not reveal their rationale. As their support documentation states: “because of our concerns for user privacy, if we determine that we cannot provide the Gmail content, we will not be able to share further details about the account or discuss our decision” (Google, “Accessing”). Thus, Google appears to enshrine the rights and privacy of individual users, even posthumously; the ownership or transfer of individual digital assets after death is neither a given, nor enshrined in Google’s policies. Yet, ironically, the economic value of that email to Google is likely zero, but the value of the email history of a loved one or business partner may be of substantial financial and emotional value, probably more so than when that person was alive. For those left behind, the value of email accounts as media, as a lasting record of social communication, is heightened. The question of how Google manages posthumous user data has been further complicated by the company’s March 2012 rationalisation of over seventy separate privacy policies for various tools and services they operate under the umbrella of a single privacy policy accessed using a single unified Google account. 
While this move was ostensibly to make privacy more understandable and transparent at Google, it had other impacts. For example, one of the side effects of a singular privacy policy and single Google identity is that deleting one of a recently deceased person’s services may inadvertently delete them all. Given that Google’s services include Gmail, YouTube and Picasa, this means that deleting an email account inadvertently erases all of the Google-hosted videos and photographs that individual posted during their lifetime. As Google warns, for example: “if you delete the Google Account to which your YouTube account is linked, you will delete both the Google Account AND your YouTube account, including all videos and account data” (Google, “What Happens”). A relative having gained access to a deceased person’s Gmail might sensibly delete the email account once the desired information is exported. However, it seems less likely that this executor would realise that in doing so all of the private and public videos that person had posted on YouTube would also permanently disappear. While material possessions can be carefully dispersed to specific individuals following the instructions in someone’s will, such affordances are not yet available for Google users. While it is entirely understandable that the ramifications of policy changes are aimed at living users, as more and more online users pass away, the question of their digital assets becomes increasingly important. Google, for example, might allow a deceased person’s executor to elect which of their Google services should be kept online (perhaps their YouTube videos), which traces can be exported (perhaps their email), and which services can be deleted. At present, the lack of fine-grained controls over a user’s digital estate at Google makes this almost impossible. While it violates Google’s policies to transfer ownership of an account to another person, if someone does leave their passwords behind, this provides their loved ones with the best options in managing their digital legacy with Google. When someone dies and their online legacy is a collection of media fragments, the value of those media is far more apparent to the loved ones left behind than to the companies housing those media. Facebook Memorialisation In response to users complaining that Facebook was suggesting they reconnect with deceased friends who had left Facebook profiles behind, in 2009 the company instituted an official policy of turning the Facebook profiles of departed users into memorial pages (Kelly). Technically, loved ones can choose between memorialisation and erasing an account altogether, but memorialisation is the default. This entails setting the account so that no one can log into it, and that no new friends (connections) can be made. Existing friends can access the page in line with the user’s final privacy settings, meaning that most friends will be able to post on the memorialised profile to remember that person in various ways (Facebook). Memorialised profiles (now Timelines, after Facebook’s redesign) thus become potential mourning spaces for existing connections. Since memorialised pages cannot make new connections, public memorial pages are increasingly popular on Facebook, frequently set up after a high-profile death, often involving young people, accidents or murder.
Recent studies suggest that both of these Facebook spaces are allowing new online forms of mourning to emerge (Marwick and Ellison; Carroll and Landry; Kern, Forman, and Gil-Egui), although public pages have the downside of potentially inappropriate commentary and outright trolling (Phillips). Given Facebook has over a billion registered users, estimates already suggest that the platform houses 30 million profiles of deceased people, and this number will, of course, continue to grow (Kaleem). For Facebook, while posthumous users do not generate data themselves, the fact that they were part of a network means that their connections may interact with a memorialised account, or memorial page, and this activity, like all Facebook activities, allows the platform to display advertising and further track user interactions. However, at present Facebook’s options – to memorialise or delete accounts of deceased people – are fairly blunt. Once Facebook is aware that a user has died, no one is allowed to edit that person’s Facebook account or Timeline, so Facebook literally offers an all (memorialisation) or nothing (deletion) option. Given that Facebook is essentially a platform for performing identities, it seems a little short-sighted that executors cannot clean up or otherwise edit the final, lasting profile of a deceased Facebook user. As social networking services and social media become more ingrained in contemporary mourning practices, it may be that Facebook will allow more fine-grained control, positioning a digital executor also as a posthumous curator, making the final decision about what does and does not get kept in the memorialisation process. Since Facebook is continually mining user activity, the popularity of mourning as an activity on Facebook will likely mean that more attention is paid to the question of digital legacies. While the user themselves can no longer be social, the social practices of mourning, and the recording of a user as a media entity highlights the fact that social media can be about interactions which in significant ways include deceased users. Digital Legacy Services While the largest online corporations have fairly blunt tools for addressing digital death, there are a number of new tools and niche services emerging in this area which are attempting to offer nuanced control over digital legacies. Legacy Locker, for example, offers to store the passwords to all of a user’s online services and accounts, from Facebook to Paypal, and to store important documents and other digital material. Users designate beneficiaries who will receive this information after the account holder passes away, and this is confirmed by preselected “verifiers” who can attest to the account holder’s death. Death Switch similarly provides the ability to store and send information to users after the account holder dies, but tests whether someone is alive by sending verification emails; fail to respond to several prompts and Death Switch will determine a user has died, or is incapacitated, and executes the user’s final instructions. Perpetu goes a step further and offers the same tools as Legacy Locker but also automates existing options from social media services, allowing users to specify, for example, that their Facebook, Twitter or Gmail data should be downloaded and this archive should be sent to a designated recipient when the Perpetu user dies. 
These tools attempt to provide a more complex array of choices in terms of managing a user’s digital legacy, providing similar choices to those currently available when addressing material possessions in a formal will. At a broader level, the growing demand for these services attests to the ongoing value of online accounts and social media traces after a user’s death. Bequeathing passwords may not strictly follow the Terms of Use of the online services in question, but it is extremely hard to track or intervene when a user has the legitimate password, even if used by someone else. More to the point, this finely-grained legacy management allows far more flexibility in the utilisation and curation of digital assets posthumously. In the process of signing up for one of these services, or digital legacy management more broadly, the ongoing value and longevity of social media traces becomes more obvious to both the user planning their estate and those who ultimately have to manage it. The Social Media Afterlife The value of social media beyond the grave is also evident in the range of services which allow users to communicate in some fashion after they have passed away. Dead Social, for example, allows users to schedule posthumous social media activity, including the posting of tweets, sending of email, Facebook messages, or the release of online photos and videos. The service relies on a trusted executor confirming someone’s death, and after that releases these final messages effectively from beyond the grave. If I Die is a similar service, which also has an integrated Facebook application which ensures a user’s final message is directly displayed on their Timeline. In a bizarre promotional campaign around a service called If I Die First, the company is promising that the first user of the service to pass away will have their posthumous message delivered to a huge online audience, via popular blogs and mainstream press coverage. While this is not likely to appeal to everyone, the notion of a popular posthumous performance of self further complicates that question of what social media can mean after death. Illustrating the value of social media legacies in a quite different but equally powerful way, the Lives On service purports to algorithmically learn how a person uses Twitter while they are alive, and then continue to tweet in their name after death. Internet critic Evgeny Morozov argues that Lives On is part of a Silicon Valley ideology of ‘solutionism’ which casts every facet of society as a problem in need of a digital solution (Morozov). In this instance, Lives On provides some semblance of a solution to the problem of death. While far from defeating death, the very fact that it might be possible to produce any meaningful approximation of a living person’s social media after they die is powerful testimony to the value of data mining and the importance of recognising that value. While Lives On is an experimental service in its infancy, it is worth wondering what sort of posthumous approximation might be built using the robust data profiles held by Facebook or Google. If Google Now can extrapolate what a user wants to see without any additional input, how hard would it be to retool this service to post what a user would have wanted after their death? Could there, in effect, be a Google After(life)?
Conclusion Users of social media services have differing levels of awareness regarding the exchange they are agreeing to when signing up for services provided by Google or Facebook, and often value the social affordances without necessarily considering the ongoing media they are creating. Online corporations, by contrast, recognise and harness the informatic traces users generate through complex data mining and analysis. However, the death of a social media user provides a moment of rupture which highlights the significant value of the media traces a user leaves behind. More to the point, the value of these media becomes most evident to those left behind precisely because that individual can no longer be social. While beginning to address the issue of posthumous user data, Google and Facebook both have very blunt tools; Google might offer executors access while Facebook provides the option of locking a deceased user’s account as a memorial or removing it altogether. Neither of these responses do justice to the value that these media traces hold for the living, but emerging digital legacy management tools are increasingly providing a richer set of options for digital executors. While the differences between material and digital assets provoke an array of legal, spiritual and moral issues, digital traces nevertheless clearly hold significant and demonstrable value. For social media users, the death of someone they know is often the moment where the media side of social media – their lasting, infinitely replicable nature – becomes more important, more visible, and casts the value of the social media accounts of the living in a new light. For the larger online corporations and service providers, the inevitable increase in deceased users will likely provoke more fine-grained controls and responses to the question of digital legacies and posthumous profiles. It is likely, too, that the increase in online social practices of mourning will open new spaces and arenas for those same corporate giants to analyse and data-mine. References Albrechtslund, Anders. “Online Social Networking as Participatory Surveillance.” First Monday 13.3 (2008). 21 Apr. 2013 ‹http://firstmonday.org/article/view/2142/1949›. boyd, danah. “Facebook’s Privacy Trainwreck: Exposure, Invasion, and Social Convergence.” Convergence 14.1 (2008): 13–20. ———, and Kate Crawford. “Critical Questions for Big Data.” Information, Communication & Society 15.5 (2012): 662–679. Carroll, Brian, and Katie Landry. “Logging On and Letting Out: Using Online Social Networks to Grieve and to Mourn.” Bulletin of Science, Technology & Society 30.5 (2010): 341–349. Doyle, Warwick, and Matthew Fraser. “Facebook, Surveillance and Power.” Facebook and Philosophy: What’s on Your Mind? Ed. D.E. Wittkower. Chicago, IL: Open Court, 2010. 215–230. Facebook. “Deactivating, Deleting & Memorializing Accounts.” Facebook Help Center. 2013. 7 Mar. 2013 ‹http://www.facebook.com/help/359046244166395/›. Gerlitz, Carolin, and Anne Helmond. “The Like Economy: Social Buttons and the Data-intensive Web.” New Media & Society (2013). Google. “Accessing a Deceased Person’s Mail.” 25 Jan. 2013. 21 Apr. 2013 ‹https://support.google.com/mail/answer/14300?hl=en›. ———. “What Happens to YouTube If I Delete My Google Account or Google+?” 8 Jan. 2013. 21 Apr. 2013 ‹http://support.google.com/youtube/bin/answer.py?hl=en&answer=69961&rd=1›. Halavais, Alexander. Search Engine Society. Polity, 2008. Hof, Robert. 
“Facebook Makes It Easier to Target Ads Based on Your Shopping History.” Forbes 27 Feb. 2013. 1 Mar. 2013 ‹http://www.forbes.com/sites/roberthof/2013/02/27/facebook-makes-it-easier-to-target-ads-based-on-your-shopping-history/›. Kaleem, Jaweed. “Death on Facebook Now Common as ‘Dead Profiles’ Create Vast Virtual Cemetery.” Huffington Post. 7 Dec. 2012. 7 Mar. 2013 ‹http://www.huffingtonpost.com/2012/12/07/death-facebook-dead-profiles_n_2245397.html›. Kelly, Max. “Memories of Friends Departed Endure on Facebook.” The Facebook Blog. 27 Oct. 2009. 7 Mar. 2013 ‹http://www.facebook.com/blog/blog.php?post=163091042130›. Kern, Rebecca, Abbe E. Forman, and Gisela Gil-Egui. “R.I.P.: Remain in Perpetuity. Facebook Memorial Pages.” Telematics and Informatics 30.1 (2012): 2–10. Marwick, Alice, and Nicole B. Ellison. “‘There Isn’t Wifi in Heaven!’ Negotiating Visibility on Facebook Memorial Pages.” Journal of Broadcasting & Electronic Media 56.3 (2012): 378–400. Morozov, Evgeny. “The Perils of Perfection.” The New York Times 2 Mar. 2013. 4 Mar. 2013 ‹http://www.nytimes.com/2013/03/03/opinion/sunday/the-perils-of-perfection.html?pagewanted=all&_r=0›. Pariser, Eli. The Filter Bubble: What the Internet Is Hiding from You. London: Viking, 2011. Phillips, Whitney. “LOLing at Tragedy: Facebook Trolls, Memorial Pages and Resistance to Grief Online.” First Monday 16.12 (2011). 21 Apr. 2013 ‹http://firstmonday.org/ojs/index.php/fm/article/view/3168›. Roosendaal, Arnold. “We Are All Connected to Facebook … by Facebook!” European Data Protection: In Good Health? Ed. Serge Gutwirth et al. Dordrecht: Springer, 2012. 3–19. Rudin, Ken. “Actionable Analytics at Zynga: Leveraging Big Data to Make Online Games More Fun and Social.” San Diego, CA, 2010. Vaidhyanathan, Siva. The Googlization of Everything. 1st ed. Berkeley: University of California Press, 2011. Westlake, E.J. “Friend Me If You Facebook: Generation Y and Performative Surveillance.” TDR: The Drama Review 52.4 (2008): 21–40. Zimmer, Michael. “The Externalities of Search 2.0: The Emerging Privacy Threats When the Drive for the Perfect Search Engine Meets Web 2.0.” First Monday 13.3 (2008). 21 Apr. 2013 ‹http://firstmonday.org/ojs/index.php/fm/article/view/2136/1944›. ———. “The Gaze of the Perfect Search Engine: Google as an Infrastructure of Dataveillance.” Web Search. Eds. Amanda Spink & Michael Zimmer. Berlin: Springer, 2008. 77–99.
APA, Harvard, Vancouver, ISO, and other styles
25

Heemsbergen, Luke J., Alexia Maddox, Toija Cinque, Amelia Johns, and Robert Gehl. "Dark." M/C Journal 24, no. 2 (2021). http://dx.doi.org/10.5204/mcj.2791.

Full text
Abstract:
This issue of M/C Journal rejects the association of darkness with immorality. In digital communication, the possibilities of darkness are greater than simple fears of what is hidden in online networks. Instead, new work in an emerging field of “dark social” studies considers “dark” as holding the potential for autonomy away from the digital visibilities that pervade economic, political, and surveillance logics of the present age. We shall not be afraid of the dark. We start from a technical rather than moral definition of darkness (Gehl), a definition that conceives of dark spaces as having legitimacies and anonymities against structural surveillance. At the same time, breaking away from techno-centric critiques of the dark allows a humanisation of how dark is embodied and performed at individual and structural levels. Other readings of digitally mediated dark (Fisher and Bolter) suggest tensions between exploitative potentials and deep societal reflection, and the ability for a new dark age (Bridle) to allow us to explore unknown potentials. Together these perspectives allow our authors a way to use dark to question and upend the unresting pressure and acceptance of—and hierarchy given to—the light in aesthetics of power and social transformation. While we reject the reduction of “dark” to “immoral”, however, we are not blind to “bad actors” lurking in hidden spaces (see Potter, forthcoming). Dark algorithms and their encoded biases shape our online lives. Not everyone has the ability to go off grid or create their own dark networks. Colonial settlerism often hides its brutal logics behind discourses of welfare. And some of us are forced to go dark against our will, as in the case of economies or nations being shut out of communication networks. But above all, the tensions produced in darkness, going dark, and acting dark show the normative powers beyond only focusing on the light. Taken as a whole, the articles in this issue explore the tensions between dark and connected, opting in and opting out, and exposure and retreat. They challenge binaries that reduce our vision to the monochromaticism of dark and light. They explain how the concept of “dark” expands opportunities for existence and persistence beyond datafication. They point to moral, ethical, and pragmatic responses of selves and communities seeking to be/belong in/of the dark. The issue starts with a high-stakes contest: what happens when an entire country is forced to go dark? While the articles in this issue were in review, Australian Facebook users were abruptly introduced to a unique form of darkness when, overnight, all news posts were removed from Facebook. Leaver’s feature article responds to tell the story of how Facebook and Google fought the Australian media law, and nobody won. Simply put, the platforms-cum-infrastructures did not want the government to mandate terms of their payments and business to traditional news organisations, so pulled the plug on Australia. As Leaver points out, Facebook’s cull not only made news media go dark, but in the midst of a pandemic and ongoing bushfires, prevented government agencies from posting and sharing government public health information, weather and wind patterns, and some State Emergency Services information. His article positions darkness on the spectrum from visibility to invisibility and focuses on the complex interplays of who is in control of, or has the power over, visibility.
Facebook’s power to darken vital voices in society was unprecedented in Australia, a form of “de-platforming at scale” (Crawford). It seemed that Facebook (and as Leaver explains, Google, to a lesser extent) were using Australia to test platform power and legislative response. The results of this experiment, Leaver argues, were not the dawn of a new dark age—without the misinforming glare of Facebook (see Cinque in this issue)—but confirmatory evidence of the political economy of national media: News Corp and other large traditional media companies received millions from Facebook and Google in exchange for the latter being exempt from the very law in question. Everyone won, except the Australians looking to experiment and explore alternatives in a new darkness. Scared of the dark, politicians accepted a mutually agreed transfer of ad-revenue from Google and Facebook to large and incumbent media organisations; and with that, hope of exploring a world mediated without the glare of digital incumbents was snuffed out. These agreements, of course, found user privacy, algorithmic biases, and other concerns of computational light out of scope. Playing off the themes of status quo of institutionalised social media companies, Cinque examines how social online spaces (SOS) which are governed by logics of surveillance and datafication embodied in the concept of the “gazing elite” (data aggregators including social media), can prompt anxieties for users regarding data privacy. Her work in the issue particularly observes that anxiety for many users is shaped by this manifestation of the “dark” as it relates to the hidden processes of data capture and processing by the mainstream platforms, surveillant digital objects that are incorporated into the Internet of Things, and “dark” or black boxed automated decisions which censor expression and self-representation. Against this way of conceptualising digital darkness, Cinque argues that dark SOS which use VPNs or the Tor browser to evade monitoring are valuable to users precisely because of their ability to evade the politics of visibility and resist the power of the gazing elite. Continuing away from the ubiquitous and all consuming blue glow of Facebook to more esoteric online communities, Maddox and Heemsbergen use their article to expand a critique on the normative computational logics which define the current information age (based on datafication, tracking, prediction, and surveillance of human socialities). They consider how “digging in the shadows” and “tinkering” with cryptocurrencies in the “dark” is shaping alternative futures based on social, equitable, and reciprocal relations. Their work traces cryptocurrencies—a “community generated technology” made by makers, miners and traders on darknets—from their emergence during a time of global economic upheaval, uncertainty and mistrust in centralised financial systems, through to new generations of cryptocurrencies like Dogecoin that, based on lessons from early cryptocurrencies, are mutating and becoming absorbed into larger economic structures. These themes are explored using an innovative analytical framework considering the “construction, disruption, contention, redirection, and finally absorption of emerging techno-potentials into larger structures”. The authors conclude by arguing that experiments in the dark don’t stay in the dark, but are radical potentials that impact and shape larger social forms.
Bradfield and Fredericks take a step back from a focus on potentially arcane online cultures to position dark in an explicit provocation to settler politics’ fears and anxieties. They show how being dark in Australia is embodied and everyday. In doing so, they draw back the veil on the uncontested normality of fear of the dark-as-object. Their article’s examples offer a stark demonstration of how for Indigenous peoples, associations of “dark” fear and danger are built into the structural mechanisms that shape and maintain colonial understandings of Indigenous peoples and their bodies as part of larger power structures. They note activist practices that provoke settlers to confront individuals, communities, and politics that proclaim “I’m not afraid of the Dark” (see Cotes in Bradfield and Fredericks). Drawing on a related embodied refusal of poorly situated connotations of the dark, Hardley considers the embodied ways mobile media have been deployed in the urban night and observes that in darkness, and the night, while vision is obscured and other senses are heightened we also encounter enmeshed cultural relationships of darkness and danger. Drawing on the postphenomenological concept of multistability, Hardley frames engagement with mobile media as a particular kind of body-technology relation in which the same technology can be used by different people in multiple ways, as people assign different meanings to the technology. Presenting empirical research on participants’ night-time mobile media practices, Hardley analyses how users co-opt mobile media functionalities to manage their embodied experiences of the dark. The article highlights how mobile media practices of privacy and isolation in urban spaces can be impacted by geographical location and urban darkness, and are also distinctly gendered. Smith explores how conversations flow across social media platforms and messaging technologies and in and out of sight across the public domain. Darkness is the backstage where backchannel conversations take place outside of public view, in private and parochial spaces, and in the shadow spaces where communication crosses between platforms. This narrative threading view of conversation, which Smith frames as a multiplatform accomplishment, responds to the question held by so many researchers and people trying to interpret what people say in public on social media. Is what we see the tip of an iceberg or just a small blip in the ocean? From Smith’s work we can see that so much happens in the dark, beyond the gaze of the onlooker, where conversational practices move by their own logic. Smith argues that drawing on pre-digital conversational analysis techniques associated with ethnomethodology will illuminate the social logics that structure online interaction and increase our understanding of online sociality forces. Set in the context of merging platforms and the “rise of data”, Lee presents issues that undergird contemporary, globally connected media systems. In translating descriptions of complex systems, the article critically discusses the changing relational quality of “the shadow of hierarchy” and “Platform Power”. The governmental use of private platforms, and the influence it has on power and opportunity for government and civil society is prefigured. The “dark” in this work is lucidly presented as a relationality; an expression of differing values, logics, and (techno)socialities. 
The author highlights how the line between traditional notions of “infrastructure” and the workings of contemporary digital platforms is becoming increasingly indistinct. Lee concludes by showing how the intersection of platforms with public institutions and infrastructures has moulded society’s light into an evolving and emergent shadow of hierarchy over many domains where there are, as always, those that will have the advantage—and those that do not. Finally, Jethani and Fordyce present an understanding of “data provenance” as both a metaphor and method for analysing data as a social and political artefact. The authors point to the term via an inter-disciplinary history as a way to explain a custodial history of objects. They adroitly argue that in our contemporary communication environment data is more than just a transact-able commodity. Data is vital—being acquired, shared, interpreted and re-used with significant influence and socio-technical affects. As we see in this article, the key methods that rely on the materiality and subjectivity of data extraction and interpretation are not to be ignored. Not least because they come with ethical challenges as the authors make clear. As an illuminating methodology, “data provenance” offers a narrative for data assets themselves (asking what, when, who, how, and why). In the process, the kinds of valences unearthed as being private, secret, or exclusive reveal aspects of the ‘dark’ (and ‘light’) that is the focus of this issue. References Bridle, James. New Dark Age: Technology and the End of the Future. London, UK: Verso Books, 2018. Crawford, Kate (katecrawford). “It happened: Facebook just went off the deep end in Australia. They are blocking *all* news content to Australians, and *no* Australian media can post news. This is what showdowns between states and platforms look like. It's deplatforming at scale.” 18 Feb. 2021. 22 Apr. 2021 <https://twitter.com/katecrawford/status/1362149306170368004>. Fisher, Joshua A., and Jay David Bolter. "Ethical Considerations for AR Experiences at Dark Tourism Sites." 2018 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct) (2018): 365-69. Gehl, Robert. Weaving the Dark Web: Legitimacy on Freenet, Tor, and I2p. The Information Society Series. Cambridge, MA: MIT Press, 2018. Potter, Martin. “Bad Actors Never Sleep: Content Manipulation on Reddit.” Eds. Toija Cinque, Robert W. Gehl, Luke Heemsbergen, and Alexia Maddox. Continuum Dark Social Special Issue (forthcoming).
APA, Harvard, Vancouver, ISO, and other styles
26

Dlouhá, Jana. "Call for papers – thematic issue Competences in Environmental Education (EE) and Education for Sustainable Development (ESD)." Envigogika 9, no. 1 (2014). http://dx.doi.org/10.14712/18023061.442.

Full text
Abstract:
Competences have been identified as legitimate educational goals wherever it is not only knowledge that counts in learning (and educators are concerned with not only the cognitive domain in their teaching). There is an ongoing discussion on “key competences for all” identified by the European Parliament as a necessary prerequisite for personal fulfilment, active citizenship, social cohesion and employability in a knowledge society (cf. EP, 2006). Also in the field of EE and ESD, there have been attempts to find appropriate operationalisation of action-oriented, learner-centred, and socially and environmentally responsible educational strategies which would help to realize a transition of the whole education system towards sustainability. Competences appear to be an appropriate concept providing an opportunity especially for a change of traditional teaching/learning practices and proper assessment of these innovations at the level of the student, educational module or programme, and also the policy level – they might be specifically designed for different disciplinary and cultural contexts and easily adjusted for all ISCED levels. As a proper tool for setting transformative educational goals and reflection of the prerequisites/outcomes of the relevant learning processes, they have been stressed in prominent ESD policy documents (UNECE, 2011, 2013), in theoretical discussions (Wiek et al., 2011) as well as in practice (growing knowledge base of case studies in relevant journals). We encourage concerned experts to enrich this debate and contribute to the pool of knowledge by providing results of their original research and share their experience with practical implementation of the concept – and submit their work for the thematic issue of Envigogika. We accept articles in the category of research papers and case studies; moreover, there is a possibility to provide multimedia presentations of existing learning programmes and other activities. The deadline for submission of the manuscripts is 15 September 2014; the thematic issue will appear after the review process by the end of the year. The theme of the Envigogika issue is closely related to the international COPERNICUS Alliance Conference to be held on 3 October 2014, the outcomes of which are expected to be one of the main contributions to the UNESCO Decade for ESD (2005‑2014), ending this year. Conference-related themes cover curricular aspects of university education; thus they provide an opportunity to reflect ESD-oriented higher education in all disciplinary fields from the perspective of the educator and his/her professional development. Articles related to the themes of the Conference can be presented in parallel sessions as part of the afternoon programme (see here) if they are submitted in the form of abstracts through the registration system on the conference website by 15th July 2014. Other competence-related themes may be submitted for the special issue of Envigogika as well. Authors are warmly welcome to attend the COPERNICUS Alliance Conference to meet top European experts in the field and discuss issues of common interest. The Conference is closely associated with the UE4SD project which links the competence theme with professional development of university educators in the field of ESD. In the project, 55 partners from 33 countries are represented, the majority of which are expected to attend the Conference as it is a constitutive part of their cooperation.
Four regions (East, West, North and South) are evenly covered by the partners’ consortium and thus the Conference will be a unique opportunity to also make links with these regional networks. Authors from the Czech Republic and its neighbours are therefore especially encouraged to consider their involvement in the Conference programme as the transition towards sustainability is an issue to be highlighted in this part of the world. References EP (2006). Recommendation of the European Parliament and of the Council of 18 December 2006 on key competences for lifelong learning. European Reference Framework in the EC. Official Journal L 394 of 30.12.2006. [online] [cit 2014-05-12] available from http://www.cmepius.si/files/cmepius/userfiles/grundtvig/gradivo/key_competencies_2006_en.pdf (see also http://europa.eu/legislation_summaries/education_training_youth/lifelong_learning/c11090_en.htm ). UN ECE (2011). Learning for the future: Competences in Education for Sustainable Development. Geneva: United Nations Economic Commission for Europe, Steering Committee on Education for Sustainable Development. Retrieved from http://www.unece.org/fileadmin/DAM/env/esd/ESD_Publications/Competences_Publication.pdf UN ECE (2013). Empowering educators for a sustainable future: Tools for policy and practice workshops on education for sustainable development Competences. Geneva: United Nations Economic Commission for Europe, Steering Committee on Education for Sustainable Development. Retrieved from http://www.unece.org/fileadmin/DAM/env/esd/8thMeetSC/ece.cep.ac.13.2013.4e.pdf Wiek, A., Withycombe, L., & Redman, C. L. (2011). Key competencies in sustainability: a reference framework for academic program development. Sustain Sci, 6(2), 203-218. Retrieved from http://link.springer.com/10.1007/s11625-011-0132-6 http://dx.doi.org/10.1007/s11625-011-0132-6
APA, Harvard, Vancouver, ISO, and other styles
27

Marcheva, Marta. "The Networked Diaspora: Bulgarian Migrants on Facebook." M/C Journal 14, no. 2 (2010). http://dx.doi.org/10.5204/mcj.323.

Full text
Abstract:
The need to sustain and/or create a collective identity is regularly seen as one of the cultural priorities of diasporic peoples and this, in turn, depends upon the existence of a uniquely diasporic form of communication and connection with the country of origin. Today, digital media technologies provide easy information recording and retrieval, and mobile IT networks allow global accessibility and participation in the redefinition of identities. Vis-à-vis our understanding of the proximity and connectivity associated with globalisation, the role of ICTs cannot be underestimated and is clearly more than a simple instrument for the expression of a pre-existing diasporic identity. Indeed, the concept of “e-diaspora” is gaining popularity. Consequently, research into the role of ICTs in the lives of diasporic peoples contributes to a definition of the concept of diaspora, understood here as the result of the dispersal of all members of a nation in several countries. In this context, I will demonstrate how members of the Bulgarian diaspora negotiate not only their identities but also their identifications through one of the most popular community websites, Facebook. My methodology consists of the active observation of Bulgarian users belonging to the diaspora, the participation in groups and forums on Facebook, and the analysis of discourses produced online. This research was conducted for the first time between 1 August 2008 and 31 May 2009 through the largest 20 (of 195) Bulgarian groups on the French version of Facebook and 40 (of over 500) on the English one. It is important to note that the public considered to be predominantly involved in Facebook is a young audience in the age group of 18-35 years. Therefore, this article is focused on two generations of Bulgarian immigrants: mostly recent young and second-generation migrants. The observed users are therefore members of the Bulgarian diaspora who have little or no experience of communism, who don’t feel the weight of the past, and who have grown up as free and often cosmopolitan citizens. Communist hegemony in Bulgaria began on 9 September 1944, when the army and the communist militiamen deposed the country’s government and handed power over to an anti-fascist coalition. During the following decades, Bulgaria became the perfect Soviet satellite and the imposed Stalinist model led to sharp curtailing of the economic and social contacts with the free world beyond the Iron Curtain. In 1989, the fall of the Berlin Wall marked the end of the communist era and the political and economic structures that supported it. Identity, Internet, and Diaspora Through the work of Mead, Todorov, and boyd it is possible to conceptualise the subject in terms of both internal and external social identity (Mead, Todorov, boyd). In this article, I will focus, in particular, on social and national identities as expressions of the process of sharing stories, experiences, and understanding between individuals. In this respect, the phenomenon of Facebook is especially well placed to mediate between identifications which, according to Freud, facilitate the plural subjectivities and the establishment of an emotional network of mutual bonds between the individual and the group (Freud). This research also draws on Goffman who, from a sociological point of view, demystifies the representation of the Self by developing a dramaturgical theory (Goffman), whereby identity is constructed through the "roles" that people play on the social scene.
Social life is a vast stage where the actors are required to adhere to certain socially acceptable rituals and guidelines. It means that we can consider the presentation of Self, or Others, as a facade or a construction of socially accepted features. Among all the ICTs, the Internet is, by far, the medium most likely to facilitate free expression of identity through a multitude of possible actions and community interactions. Personal and national memories circulate in the transnational space of the Internet and are reshaped when framed from specific circumstances such as those raised by the migration process. In an age of globalisation marked by the proliferation of population movements, instant communication, and cultural exchanges across geographic boundaries, the phenomenon of the diaspora has caught the attention of a growing number of scholars. I shall be working with Robin Cohen’s definition of diaspora which highlights the following common features: (1) dispersal from an original homeland; (2) the expansion from a homeland in search of work; (3) a collective memory and myth about the homeland; (4) an idealisation of the supposed ancestral homeland; (5) a return movement; (6) a strong ethnic group consciousness sustained over a long time; (7) a troubled relationship with host societies; (8) a sense of solidarity with co-ethnic members in other countries; and (9) the possibility of a distinctive creative, enriching life in tolerant host countries (Cohen). Following on this earlier work on the ways in which diasporas give rise to new forms of subjectivity, the concept of “e-diaspora” is now rapidly gaining in popularity. The complex association between diasporic groups and ICTs has led to a concept of e-diasporas that actively utilise ICTs to achieve community-specific goals, and that have become critical for the formation and sustenance of an exilic community for migrant groups around the globe (Srinivasan and Pyati). Diaspora and the Digital Age Anderson points out two key features of the Internet: first, it is a heterogeneous electronic medium, with hardly perceptible contours, and is in a state of constant development; second, it is a repository of “imagined communities” without geographical or legal legitimacy, whose members will probably never meet (Anderson). Unlike “real” communities, where people have physical interactions, in the imagined communities, individuals do not have face-to-face communication and daily contact, but they nonetheless feel a strong emotional attachment to the nation. The Internet not only opens new opportunities to gain greater visibility and strengthen the sense of belonging to community, but it also contributes to the emergence of a transnational public sphere where the communities scattered in various locations freely exchange their views and ideas without fear of restrictions or censorship from traditional media (Appadurai, Bernal). As a result, the Web becomes a virtual diasporic space which opens up, to those who have left their country, a new means of confrontation and social participation. Within this new diasporic space, migrants are bound in their disparate geographical locations by a common vision or myth about the homeland (Karim). Thanks to the Internet, the computer has become a primary technological intermediary between virtual networks, bringing its members closer in a “global village” where everyone is immediately connected to others. 
Thus, today’s diasporas are not the diaspora of previous generations in that the migration is experienced and negotiated very differently: people in one country are now able to continue to participate actively in another country. In this context, the arrival of community sites has increased the capacity of users to create a network on the Internet, to rediscover lost links, and strengthen new ones. Unlike offline communities, which may weaken once their members have left the physical space, online communities that are no longer limited by the requirement of physical presence in the common space have the capacity to endure. Identity Strategies of New Generations of Bulgarian Migrants It is very difficult to quantify migration to or from Bulgaria. Existing data is not only partial and limited but, in some cases, give an inaccurate view of migration from Bulgaria (Soultanova). Informal data confirm that one million Bulgarians, around 15 per cent of Bulgaria’s entire population (7,620,238 inhabitants in 2007), are now scattered around the world (National Statistical Institute of Bulgaria). The Bulgarian migrant is caught in a system of redefinition of identity through the duration of his or her relocation. Emigrating from a country like Bulgaria implies a high number of contingencies. Bulgarians’ self-identification is relative to the inferiority complex of a poor country which has a great deal to do to catch up with its neighbours. Before the accession of Bulgaria to the European Union, the country was often associated with what have been called “Third World countries” and seen as a source of crime and social problems. Members of the Bulgarian diaspora faced daily prejudice due to the bad reputation of their country of origin, though the extent of the hostility depended upon the “host” nation (Marcheva). Geographically, Bulgaria is one of the most eastern countries in Europe, the last to enter the European Union, and its image abroad has not facilitated the integration of the Bulgarian diaspora. The differences between Bulgarian migrants and the “host society” perpetuate a sentiment of marginality that is now countered with an online appeal for national identity markers and shared experiences. Facebook: The Ultimate Social Network The Growing Popularity of Facebook With more than 500 million active members, Facebook is the most visited website in the world. In June 2007, Facebook experienced a record annual increase of 270 per cent of connections in one year (source: comScore World Metrix). More than 70 translations of the site are available to date, including the Bulgarian version. What makes it unique is that Facebook positively encourages identity games. Moreover, Facebook provides the symbolic building blocks with which to build a collective identity through shared forms of discourse and ways of thinking. People are desperate to make a good impression on the Internet: that is why they spend so much time managing their online identity. One of the most important aspects of Facebook is that it enables users to control and manage their image, leaving the choice of how their profile appears on the pages of others a matter of personal preference at any given time. Despite some limitations, we will see that Facebook offers the Bulgarian community abroad the possibility of an intense and ongoing interaction with fellow nationals, including the opportunity to assert and develop a complex new national/transnational identity. 
Facebook Experiences of the Bulgarian Diaspora Created in the United States in 2004 and extended to use in Europe two or three years later, Facebook was quickly adopted by members of the Bulgarian diaspora. Here, it is very important to note that, although the Internet per se has enabled Bulgarians across the globe to introduce Cyrillic script into the public arena, it is definitely Facebook that has made digital Cyrillic visible. Early in computer history, keyboards with the Cyrillic alphabet simply did not exist. Thus, Bulgarians were forced to translate their language into Latin script. Today, almost all members of the Bulgarian population who own a computer use a keyboard that combines the two alphabets, Latin and Cyrillic, and this allows alternation between the two. This is not the case for the majority of Bulgarians living abroad who are forced to use a keyboard specific to their country of residence. Thus, Bulgarians online have adopted a hybrid code to speak and communicate. Since foreign keyboards are not equipped with the same consonants and vowels that exist in the Bulgarian language, they use the Latin letters that best suit the Bulgarian phonetic. Several possible interpretations of these “encoded” texts exist which become another way for the Bulgarian migrants to distinguish and assert themselves. One of these encoded scripts is supplemented by figures. For example, the number “6” written in Bulgarian “шест” is applied to represent the Bulgarian letter “ш.” Bulgarian immigrants therefore employ very specific codes of communication that enhance the feeling of belonging to a community that shares the same language, which is often incomprehensible to others. As the ultimate social networking website, Facebook brings together Bulgarians from all over the world and offers them a space to preserve online memorials and digital archives. As a result, the Bulgarian diaspora privileges this website in order to manage the strong links between its members. Indeed, within months of coming into online existence, Facebook established itself as a powerful social phenomenon for the Bulgarian diaspora and, very soon, a virtual map of the Bulgarian diaspora was formed. It should be noted, however, that this mapping was focused on the new generation of Bulgarian migrants more familiar with the Internet and most likely to travel. By identifying the presence of online groups by country or city, I was able to locate the most active Bulgarian communities: “Bulgarians in UK” (524 members), “Bulgarians in Chicago” (436 members), “Bulgarians studying in the UK” (346 members), “Bulgarians in America” (333 members), “Bulgarians in the USA” (314 members), “Bulgarians in Montreal” (249 members), “Bulgarians in Munich” (241 members), and so on. These figures are based on the “Groups” Application of Facebook as updated in February 2010. Through those groups, a symbolic diasporic geography is imagined and communicated: the digital “border crossing,” as well as the real one, becomes a major identity resource. Thus, Bulgarian users of Facebook are connecting from the four corners of the globe in order to rebuild family links and to participate virtually in the marriages, births, and lives of their families. It sometimes seems that the whole country has an appointment on Facebook, and that all the photos and stories of Bulgarians are more or less accessible to the community in general. 
Among its virtual initiatives, Facebook has made available to its users an effective mobilising tool, the Causes, which is used as a virtual noticeboard for activities and ideas circulating in “real life.” The members of the Bulgarian diaspora choose to adhere to different “causes” that may be local, national, or global, and that are complementary to the civic and socially responsible side of the identity they have chosen to construct online. Acting as a virtual realm in which distinct and overlapping trajectories coexist, Facebook thus enables users to articulate different stories and meanings and to foster a democratic imaginary about both the past and the future. Facebook encourages diasporas to produce new initiatives to revive or create collective memories and common values. Through photos and videos, scenes of everyday life are celebrated and manipulated as tools to reconstruct, reconcile, and display a part of the history and the identity of the migrant. By combating the feelings of disorientation, the consciousness of sharing the same national background and culture facilitates dialogue and neutralises the anxiety and loneliness of Bulgarian migrants. When cultural differences become more acute, the sense of isolation increases and this encourages migrants to look for company and solidarity online. As the number of immigrants connected and visible on Facebook gets larger, so the use of the Internet heightens their sense of a substantial collective identity. This is especially important for migrants during the early years of relocation when their sense of identity is most fragile. It can therefore be argued that, through the Internet, some Bulgarian migrants are replacing alienating face-to-face contact with virtual friends and enjoying the feeling of reassurance and belonging to a transnational community of compatriots. In this sense, Facebook is a propitious ground for the establishment of the three identity strategies defined by Herzfeld: cultural intimacy (or self-stereotypes); structural nostalgia (the evocation of a time when everything was going better); and the social poetic (the strategies aiming to retrieve a particular advantage and turn it into a permanent condition). In this way, the willingness to remain continuously in virtual contact with other Bulgarians often reveals a desire to return to the place of birth. Nostalgia and outsourcing of such sentiments help migrants to cope with feelings of frustration and disappointment. I observed that it is just after their return from summer holidays spent in Bulgaria that members of the Bulgarian diaspora are most active on the Bulgarian forums and pages on Facebook. The “return tourism” (Fourcade) during the summer or for the winter holidays seems to be a central theme in the forums on Facebook and an important source of emotional refuelling. Tensions between identities can also lead to creative formulations through Facebook’s pages. Thus, the group “You know you’re a Bulgarian when...”, which enjoys very active participation from the Bulgarian diaspora, is a space where everyone is invited to share, through a single sentence, some fact of everyday life with which all Bulgarians can identify. With humour and self-irony, this Facebook page demonstrates what is distinctive about being Bulgarian but also highlights frustration with certain prejudices and stereotypes. Frequently these profiles are characterised by seemingly “glocal” features. 
The same Bulgarian user could define himself as a Parisian, adhering to the group “You know you’re from Paris when...”, but also a native of a Bulgarian town (“You know you’re from Varna when...”). At the same time, he is an architect (“All architects on Facebook”), supporting the candidacy of Barack Obama, a fan of Japanese manga (“maNga”), of a French actor, an American cinema director, or Indian food. He joins a cause to save a wild beach on the Black Sea coast (“We love camping: Gradina Smokinia and Arapia”) and protests virtually against the slaughter of dolphins in the Faroe Islands (“World shame”). One month, the individual could identify as Bulgarian, but next month he might choose to locate himself in the country in which he is now resident. Thus, Facebook creates a virtual territory without borders for the cosmopolitan subject (Negroponte) and this confirms the premise that the Internet does not lead to the convergence of cultures, but rather confirms the opportunities for diversification and pluralism through multiple social and national affiliations. Facebook must therefore be seen as an advantageous space for the representation and interpretation of identity and for performance and digital existence. Bulgarian migrants bring together elements of their offline lives in order to construct, online, entirely new composite identities. The Bulgarians we have studied as part of this research almost never use pseudonyms and do not seem to feel the need to hide their material identities. This suggests that they are mature people who value their status as migrants of Bulgarian origin and who feel confident in presenting their natal identities rather than hiding behind a false name. Starting from this material social/national identity, which is revealed through the display of surname with a Slavic consonance, members of the Bulgarian diaspora choose to manage their complex virtual identities online. Conclusion Far from their homeland, beset with feelings of insecurity and alienation as well as daily experiences of social and cultural exclusion (much of it stemming from an ongoing prejudice towards citizens from ex-communist countries), it is no wonder that migrants from Bulgaria find relief in meeting up with compatriots in front of their screens. Although some migrants assume their Bulgarian identity as a mixture of different cultures and are trying to rethink and continuously negotiate their cultural practices (often through the display of contradictory feelings and identifications), others identify with an imagined community and enjoy drawing boundaries between what is “Bulgarian” and what is not. The indispensable daily visit to Facebook is clearly a means of forging an ongoing sense of belonging to the Bulgarian community scattered across the globe. Facebook makes possible the double presence of Bulgarian immigrants both here and there and facilitates the ongoing processes of identity construction that depend, more and more, upon new media. In this respect, the role that Facebook plays in the life of the Bulgarian diaspora may be seen as a facet of an increasingly dynamic transnational world in which interactive media may be seen to contribute creatively to the formation of collective identities and the deformation of monolithic cultures. References Anderson, Benedict. L’Imaginaire National: Réflexions sur l’Origine et l’Essor du Nationalisme. Paris: La Découverte, 1983. Appadurai, Ajun. Après le Colonialisme: Les Conséquences Culturelles de la Globalisation. Paris: Payot, 2001. 
Bernal, Victoria. “Diaspora, Cyberspace and Political Imagination: The Eritrean Diaspora Online.” Global Networks 6 (2006): 161-79. boyd, danah. “Social Network Sites: Public, Private, or What?” Knowledge Tree (May 2007). Cohen, Robin. Global Diasporas: An Introduction. London: University College London Press, 1997. Goffman, Erving. La Présentation de Soi. Paris: Editions de Minuit, Collection Le Sens Commun, 1973. Fourcade, Marie-Blanche. “De l’Arménie au Québec: Itinéraires de Souvenirs Touristiques.” Ethnologies 27.1 (2005): 245-76. Freud, Sigmund. “Psychologie des Foules et Analyses du Moi.” Essais de Psychanalyse. Paris: Petite Bibliothèque Payot, 2001 (1921). Herzfeld, Michael. Intimité Culturelle. Presses de l’Université Laval, 2008. Karim, Karim-Haiderali. The Media of Diaspora. Oxford: Routledge, 2003. Marcheva, Marta. “Bulgarian Diaspora and the Media Treatment of Bulgaria in the French, Italian and North American Press (1992–2007).” Unpublished PhD dissertation. Paris: University Panthéon – Assas Paris 2, 2010. Mead, George Herbert. L’Esprit, le Soi et la Société. Paris: PUF, 2006. Negroponte, Nicholas. Being Digital. Vintage, 2005. Soultanova, Ralitza. “Les Migrations Multiples de la Population Bulgare.” Actes du Colloque «La France et les Migrants des Balkans: Un État des Lieux». Paris: Courrier des Balkans, 2005. Srinivasan, Ramesh, and Ajit Pyati. “Diasporic Information Environments: Reframing Immigrant-Focused Information Research.” Journal of the American Society for Information Science and Technology 58.12 (2007): 1734-44. Todorov, Tzvetan. Nous et les Autres: La Réflexion Française sur la Diversité Humaine. Paris: Seuil, 1989.
APA, Harvard, Vancouver, ISO, and other styles
28

Kadivar, Jamileh. "Government Surveillance and Counter-Surveillance on Social and Mobile Media: The Case of Iran (2009)." M/C Journal 18, no. 2 (2015). http://dx.doi.org/10.5204/mcj.956.

Full text
Abstract:
Human history has witnessed varied surveillance and counter-surveillance activities from time immemorial. Human beings could not surveil others effectively and accurately without the technology of their era. Technology is a tool that can empower both people and governments. The outcomes are different based on the users’ intentions and aims. 2,500 years ago, Sun Tzu noted that ‘If you know both yourself and your enemy, you can win numerous (literally, "a hundred") battles without jeopardy’. His words still ring true. To be good at surveillance and counter-surveillance, it is essential to know both sides, and in order to be good at these activities access to technology is vital. There is no doubt that knowledge is power, and without technology to access the information, it is impossible to be powerful. As we become more expert at technology, we will learn what makes surveillance and counter-surveillance more effective, and will be more powerful. “Surveillance” is one of the most important aspects of living in the convergent media environment. This essay illustrates government surveillance and counter-surveillance during the Iranian Green Movement (2009) on social and mobile media. The Green Movement refers to a non-violent movement that arose after the disputed presidential election in June 2009. After that, Iran faced its most serious political crisis since the 1979 revolution. Claims of vote fraud triggered massive street protests. Many took to the streets with “Green” signs, chanting slogans such as ‘the government lied’ and ‘where is my vote?’ There is no doubt that social and mobile media has played an important role in Iran’s contemporary politics. According to Internet World Stats (IWS), Internet users in 2009 accounted for approximately 48.5 per cent of the population of Iran. In 2009, Iran had 30.2 million mobile phone users (Freedom House), and 72 cellular subscriptions for every 100 people (World Bank). Today, while Iran has the 19th-largest population in the world, its blogosphere holds the third spot in terms of number of users, just behind the United States and China (Beth Elson et al.). In this essay the use of social and mobile media (technology) is not debated, but the extent of this use, and who, why and how it is used, is clearly scrutinised. Visibility and Surveillance There have been different kinds of surveillance for a very long time. However, all types of surveillance are based on the notion of “visibility”. Previous studies show that visibility is not a new term (Foucault Discipline). The new things in the new era are its scale, scope and complicated ways to watch others without being watched, which are not limited to a specific time, space and group, and are completely different from previous instruments for watching (Andrejevic). As Meikle and Young (146) have mentioned, ‘networked digital media bring with them a new kind of visibility’, based on different kinds of technology. Internet surveillance has important implications in politics to control, protect, and influence (Marx Ethics; Castells; Fuchs Critique). Surveillance has improved over its long history, evolving from very simple spying and watching to complicated methods of “iSpy” (Andrejevic). To understand the importance of visibility and its relationship with surveillance, it is essential to study visibility in conjunction with the notion of “panopticon” and its contradictory functions. 
Foucault uses Bentham’s notion of the panopticon, which carries within itself visibility and transparency to control others. “Gaze” is a central term in Bentham’s view. ‘Bentham thinks of a visibility organised entirely around a dominating, overseeing gaze’ (Foucault Eye). Moreover, Thompson (Visibility 11) notes that we are living in the age of ‘normalizing the power of the gaze’ and it is clear that the influential gaze is based on powerful means to see others. Lyon (Surveillance 2) explains that ‘surveillance is any collection and processing of personal data, whether identifiable or not, for the purpose of influencing or managing those whose data have been granted…’. He mentions that today the most important means of surveillance reside in computer power, which allows collected data to be sorted, matched, retrieved, processed, marketed and circulated. Nowadays, the Internet has become ubiquitous in many parts of the world. So, the changes in people’s interactions have influenced their lives. Fuchs (Introduction 15) argues that ‘information technology enables surveillance at a distance…in real time over networks at high transmission speed’. Therefore, visibility touches different aspects of people’s lives, and living in a “glasshouse” has caused a lot of fear and anxiety about privacy. Iran’s Green Movement is one of many cases for studying surveillance and counter-surveillance technologies in social and mobile media. Government Surveillance on Social and Mobile Media in Iran, 2009 In 2009 the Iranian government controlled technology that allowed it to monitor, track, and limit access to the Internet, social media and mobile communication, which resulted in the surveillance of the Green Movement’s activists. The Iranian government had improved its technical capabilities to monitor the people’s behavior on the Internet long before the 2009 election. The election led to an increase in online surveillance. Using social media, the Iranian government became even more powerful than it was before the election. Social media was a significant factor in strengthening the government’s power. In the months after the election the virtual atmosphere became considerably more repressive. The intensified filtering of the Internet and implementation of more advanced surveillance systems strengthened the government’s position after the election. The Open Net Initiative revealed that the Internet censorship system in Iran is one of the most comprehensive and sophisticated in the world. It emphasized that ‘Advances in domestic technical capacity have contributed to the implementation of a centralized filtering strategy and a reduced reliance on Western technologies’. On the other hand, the authorities attempted to block all access to political blogs (Jaras), either through cyber-security methods or through threats (Tusa). The Centre for Investigating Organized Cyber Crimes, which was founded in 2007 partly ‘to investigate and confront social and economic offenses on the Internet’ (Cyber Police), became increasingly important over the course of 2009 as the government combated the opposition’s online activities (Beth Elson et al. 16). Training “senior Internet lieutenants” to confront Iran’s “virtual enemies online” was another measure that the Intelligence Minister announced following the protests (Iran Media Program). In 2009 the Iranian government enacted the Computer Crime Law (Jaras). 
According to this law, the Committee in Charge of Determining Unauthorized Websites is legally empowered to identify sites that carry forbidden content and report that information to TCI and other major ISPs for blocking (Freedom House). In the late fall of 2009, the government started sending threatening and warning text messages to protesters about their presence in the protests (BBC). Attacking, blocking, hacking and hijacking the domain names of some opposition websites such as Jaras and Kaleme, besides a number of non-Iranian sites such as Twitter, were among the other attempts of the Iranian Cyber Army (Jaras). It is also said that the police and security forces arrested dissidents identified through photos and videos posted on the social media that many imagined had empowered them. Furthermore, online photos of active protesters were posted on different websites that asked people to identify them (Valizadeh). In late June 2009 the Iranian government was intentionally permitting Internet traffic to and from social networking sites such as Facebook and Twitter so that it could use a sophisticated practice called Deep Packet Inspection (DPI) to collect information about users. It was reportedly also applying the same technology to monitor mobile phone communications (Beth Elson et al. 15). On the other hand, to cut communication between Iranians inside and outside the country, Iran slowed down the Internet dramatically (Jaras). Iran also blocked access to Facebook, YouTube, Wikipedia, Twitter and many blogs before, during and after the protests. Moreover, in 2009, text message services were shut down for over 40 days, and mobile phone subscribers could not send or receive text messages regardless of their mobile carriers. Subsequently, the service was disrupted on a temporary basis immediately before and during key protest days. It was later discovered that Nokia Siemens Networks provided the government with surveillance technologies (Wagner; Iran Media Program). The Iranian government built a complicated system that enabled it to monitor, track and intercept what was said on mobile phones. Nokia Siemens Networks confirmed it supplied Iran with the technology needed to monitor, control, and read local telephone calls [...] The product allowed authorities to monitor any communications across a network, including voice calls, text messaging, instant messages, and web traffic (Cellan-Jones). Media sources also reported that two Chinese companies, Huawei and ZTE, provided surveillance technologies to the government. The Nic Payamak and Saman Payamak websites, which provide mass text messaging services, also reported that operator Hamrah Aval commonly blocked texts with words such as meeting, location, rally, gathering, election and parliament (Iran Media Program). Visibility and Counter-Surveillance The panopticon is not limited to the watchers. Similarly, new kinds of panopticon and visibility are not confined to government surveillance. Foucault points out that ‘the seeing machine was once a sort of dark room into which individuals spied; it has become a transparent building in which the exercise of power may be supervised by society as a whole’ (Discipline 207). 
What is important is Foucault's recognition that transparency, not only of those who are being observed but also of those who are observing, is central to the notion of the panopticon (Allen) and ‘any member of society will have the right to come and see with his own eyes how schools, hospitals, factories, and prisons function’ (Foucault, Discipline 207). Counter-surveillance is the process of detecting and mitigating hostile surveillance (Burton). Therefore, while the Internet is a surveillance instrument that enables governments to watch people, it also improves the capacity to counter-surveil and draws public attention to governments’ injustice. As Castells (185) notes, the Internet could be used by citizens to watch their government as an instrument of control, information, participation, and even decision-making, from the bottom up. With regard to the role of citizens in counter-surveillance, we can draw on Jay Rosen’s view of Internet users as ‘the people formerly known as the audience’. In counter-surveillance it can be said that passive citizens (formerly the audience) have turned into active citizens. And this change would have been impossible without mobile and social media platforms. These new techniques and technologies have empowered people and given them the opportunity to have new identities. When Thompson wrote ‘the exercise of power in modern societies remains in many ways shrouded in secrecy and hidden from the public gaze’ (Media 125), perhaps he could not imagine that one day people would be able to gaze at politicians, security forces and the police through the use of the Internet and mobile devices. Furthermore, while access to mobile media allows people to hold authorities accountable for their uses and abuses of power (Breen 183), social media can be used as a means of representation, organization of collective action, mobilization, and drawing attention to police brutality and reasons for political action (Gerbaudo). There is no doubt that having creativity and using alternative platforms are important aspects in counter-surveillance. For example, images of Lt. Pike, the “Pepper Spray Cop” from the University of California, became the symbol of the senselessness of police brutality during the Occupy Movement (Shaw). Iranians’ Counter-Surveillance on Social and Mobile Media, 2009 Iran’s Green Movement (2009) triggered a lot of discussions about the role of technology in social movements. In this regard, two notable attitudes about the role of technology should be taken into account: techno-optimistic views (Shirky and Castells) and techno-pessimistic views (Morozov and Gladwell). While techno-optimists overrated the role of social media, techno-pessimists underestimated its role. However, there is no doubt that technology has played a great role as a counter-surveillance tool amongst Iranian people in Iran’s contemporary politics. Apart from the academic discussions between techno-optimists and techno-pessimists, there have been numerous debates about the role of new technologies in Iran during the Green Movement. This subject has received interest from different corners of the world, including Western countries, Iranian authorities, opposition groups, and also some NGOs. 
However, its role as a means of counter-surveillance has not received adequate attention. As the tools of counter-surveillance are more or less the tools of surveillance, protesters learned from the government to use the same techniques to challenge authority on social media. Establishing new websites (such as JARAS, RASA, Kalemeh, and Iran green voice) or strengthening some previous ones (such as Saham, Emrooz, Norooz), as well as activating different platforms such as Facebook, Twitter, and YouTube accounts to broadcast the voice of the Iranian Green Movement and neutralize the government’s propaganda, were the most important ways to empower supporters of Iran’s Green Movement in counter-surveillance. Reporters Without Borders issued a statement, saying that ‘the new media, and particularly social networks, have given populations collaborative tools with which they can change the social order’. It is also mentioned that despite efforts by the Iranian government to prevent any reporting of the protests and due to considerable pressure placed on foreign journalists inside Iran, social media played a significant role in sending the messages and images of the movement to the outside world (Axworthy). However, at that moment, many thought that Twitter performed a liberating role for Iranian dissenters. For example, Western media heralded the Green Movement in Iran as a “Twitter revolution” fuelled by information and communication technologies (ICTs) and social media tools (Carrieri et al. 4). “The Revolution Will Be Twittered” was the first in a series of blog posts published by Andrew Sullivan a few hours after the news of the protests was released. According to the researcher’s observation, the number of Twitter users inside Iran who tweeted was very limited in 2009, and social media was most useful in the dissemination of information, especially from those inside Iran to outsiders. Mobile phones were mostly influential as an instrument firstly used for producing content (images and videos) and secondly for the organisation of protests. There were many photos and videos that were filmed by very simple cell phones, uploaded by ordinary people onto YouTube and other platforms. The links were shared many times on Twitter and Facebook and released by mainstream media. The most frequently circulated story from the Iranian protests was a video of Neda Agha-Sultan. Her final moments were captured by some bystanders with mobile phone cameras and rapidly spread across the global media and the Internet. It showed that the camera-phone had provided citizens with a powerful means, allowing for the creation and instant sharing of persuasive personalised eyewitness records with mobile and globalised target populations (Anden-Papadopoulos). Protesters used another technique, DDOS (distributed denial of service) attacks, for political protest in cyber space. Anonymous people used DDOS to overload a website with fake requests, making it unavailable to users and disrupting the sites set as targets (McMillan), in effect shutting down the site. DDOS is an important counter-surveillance activity by grassroots activists or hackers. It was a cyber protest that knocked the main Iranian governmental websites off-line through crowdsourcing and false traffic. 
Amongst them were the websites of Mahmoud Ahmadinejad and Iran’s supreme leader, as well as those which belong to or are close to the government or security forces, including news agencies (Fars, IRNA, Press TV…), the Ministry of Foreign Affairs, the Ministry of Justice, the Police, and the Ministry of the Interior. Moreover, as the authorities uploaded pictures of protesters onto different platforms in order to find and arrest them, in some cities people started to post the pictures, phone numbers and addresses of members of the security forces and plain-clothes police officers who had attacked them during the protests, and asked people to identify and report the others. They also wanted people to send information about suspects who infringed human rights. Conclusion To sum up, visibility, surveillance and counter-surveillance are not new phenomena. What is new is the technology, which has increased their complexity. As Foucault (Discipline 200) mentioned, ‘visibility is a trap’, so being visible would be the weakness of those who are being surveilled in the power struggle. In the convergent era, in order to be more powerful, both surveillance and counter-surveillance activities aim for more visibility. Although both attempt to use the same means (technology) to trap the other side, the differences are in their subjects, objects, goals and results. While in surveillance, visibility of the many by the few is mostly for the purpose of control and influence in undemocratic ways, in counter-surveillance, the visibility of the few by the many is mostly through democratic ways to secure more accountability and transparency from governments. As mentioned in the case of Iran’s Green Movement, the scale and scope of visibility are different in surveillance and counter-surveillance. What Shaw wrote about counter-surveillance at Occupy Sydney applies to other places, such as Iran. She has stressed that ‘protesters and police engaged in a dance of technology and surveillance with one another. Both had access to technology, but there were uncertainties about the extent of technology and its proficient use…’ In Iran (2009), both sides (government and activists) used technology and benefited from digital networked platforms, but their levels of access and domains of influence were different, which was because the sources of power, information and wealth were divided asymmetrically between them. Creativity was important for both sides to make others more visible, and make themselves invisible. Also, sharing information to make the other side visible played an important role in these two areas. References Allen, David. “The Trouble with Transparency: The Challenge of Doing Journalism Ethics in a Surveillance Society.” Journalism Studies 9.3 (2008): 323-40. 8 Dec. 2013 ‹http://www.tandfonline.com/doi/full/10.1080/14616700801997224#.UqRFSuIZsqN›. Anden-Papadopoulos, Kari. “Citizen Camera-Witnessing: Embodied Political Dissent in the Age of ‘Mediated Mass Self-Communication.’” New Media & Society 16.5 (2014): 753-69. 9 Aug. 2014 ‹http://nms.sagepub.com/content/16/5/753.full.pdf+html›. Andrejevic, Mark. iSpy: Surveillance and Power in the Interactive Era. Lawrence, Kan: UP of Kansas, 2007. Axworthy, Michael. Revolutionary Iran: A History of the Islamic Republic. London: Penguin Books, 2014. Bentham, Jeremy. Panopticon Postscript. London: T. Payne, 1791. Beth Elson, Sara, Douglas Yeung, Parisa Roshan, S.R. Bohandy, and Alireza Nader. Using Social Media to Gauge Iranian Public Opinion and Mood after the 2009 Election. 
Santa Monica: RAND Corporation, 2012. 1 Aug. 2014 ‹http://www.rand.org/content/dam/rand/pubs/technical_reports/2012/RAND_TR1161.pdf›. Breen, Marcus. Uprising: The Internet’s Unintended Consequences. Champaign, Ill: Common Ground Pub, 2011. Burton, Fred. “The Secrets of Counter-Surveillance.” Stratfor Global Intelligence. 2007. 19 April 2015 ‹https://www.stratfor.com/secrets_countersurveillance›. Carrieri, Matthew, Ali Karimzadeh Bangi, Saad Omar Khan, and Saffron Suud. After the Green Movement Internet Controls in Iran, 2009-2012. OpenNet Initiative, 2013. 17 Dec. 2013 ‹https://opennet.net/sites/opennet.net/files/iranreport.pdf›. Castells, Manuel. The Internet Galaxy: Reflections on the Internet, Business, and Society. Oxford: Oxford UP: 2001. Cellan-Jones, Rory. “Hi-Tech Helps Iranian Monitoring.” BBC, 2009. 26 July 2014 ‹http://news.bbc.co.uk/1/hi/technology/8112550.stm›. “Cyber Crimes’ List.” Iran: Cyber Police, 2009. 17 July 2014 ‹http://www.cyberpolice.ir/page/2551›. Foucault, Michel. Discipline and Punish: The Birth of the Prison. Trans. Alan Sheridan. Harmondsworth: Penguin, 1977. Foucault, Michel. “The Eye of Power.” 1980. 12 Dec. 2013 ‹https://nbrokaw.files.wordpress.com/2010/12/the-eye-of-power.doc›. Freedom House. “Special Report: Iran.” 2009. 14 June 2014 ‹http://www.sssup.it/UploadDocs/4661_8_A_Special_Report_Iran_Feedom_House_01.pdf›. Fuchs, Christian. “Introduction.” Internet and Surveillance: The Challenges of Web 2.0 and Social Media. Ed. Christian Fuchs. London: Routledge, 2012. 1-28. Fuchs, Christian. “Critique of the Political Economy of Web 2.0 Surveillance.” Internet and Surveillance: The Challenges of Web 2.0 and Social Media. Ed. Christian Fuchs. London: Routledge, 2012. 30-70. Gerbaudo, Paolo. Tweets and the Streets: Social Media and Contemporary Activism. London: Pluto, 2012. “Internet: Iran’s New Imaginary Enemy.” Jaras Mar. 2009. 28 June 2014 ‹http://www.rahesabz.net/print/12143›.Iran Media Program. “Text Messaging as Iran's New Filtering Frontier.” 2013. 25 July 2014 ‹http://www.iranmediaresearch.org/en/blog/227/13/04/25/136›. Internet World Stats News. The Internet Hits 1.5 Billion. 2009. 3 July 2014 ‹ http://www.internetworldstats.com/pr/edi038.htm›. Lyon, David. Surveillance Society: Monitoring Everyday Life. Buckingham: Open UP, 2001. Lyon, David. “9/11, Synopticon, and Scopophilia: Watching and Being Watched.” The New Politics of Surveillance and Visibility. Eds. Richard V. Ericson and Kevin D. Haggerty. Toronto: UP of Toronto, 2006. 35-54. Marx, Gary T. “What’s New about the ‘New Surveillance’? Classify for Change and Continuity.” Surveillance & Society 1.1 (2002): 9-29. McMillan, Robert. “With Unrest in Iran, Cyber-Attacks Begin.” PC World 2009. 17 Apr. 2015 ‹http://www.pcworld.com/article/166714/article.html›. Meikle, Graham, and Sherman Young. Media Convergence: Networked Digital Media in Everyday Life. London: Palgrave Macmillan, 2012. Morozov, Evgeny. “How Dictators Watch Us on the Web.” Prospect 2009. 15 June 2014 ‹http://www.prospectmagazine.co.uk/magazine/how-dictators-watch-us-on-the-web/#.U5wU6ZRdU00›.Open Net. “Iran.” 2009. 26 June 2014 ‹https://opennet.net/research/profiles/iran›. Reporters without Borders. “Web 2.0 versus Control 2.0.” 2010. 27 May 2014 ‹http://en.rsf.org/web-2-0-versus-control-2-0-18-03-2010,36697›.Rosen, Jay. The People Formerly Known as the Audience. 2006. 7 Dec. 2013 ‹http://www.huffingtonpost.com/jay-rosen/the-people-formerly-known_1_b_24113.html›. Shaw, Frances. 
“'Walls of Seeing': Protest Surveillance, Embodied Boundaries, and Counter-Surveillance at Occupy Sydney.” Transformation 23 (2013). 9 Dec. 2013 ‹http://www.transformationsjournal.org/journal/issue_23/article_04.shtml›. “The Warning of the Iranian Revolutionary Guard Corps (IRGC) to the Weblogs and Websites.” BBC, 2009. 27 July 2014 ‹http://www.bbc.co.uk/persian/iran/2009/06/090617_ka_ir88_sepah_internet.shtml›. Thompson, John B. The Media And Modernity: A Social Theory of the Media. Cambridge: Polity Press, 1995. Thompson, John B. “The New Visibility.” Theory, Culture & Society 22.6 (2005): 31-51. 10 Dec. 2013 ‹http://tcs.sagepub.com/content/22/6/31.full.pdf+html›. Tusa, Felix. “How Social Media Can Shape a Protest Movement: The Cases of Egypt in 2011 and Iran in 2009.” Arab Media and Society 17 (Winter 2013). 15 July 2014 ‹http://www.arabmediasociety.com/index.php?article=816&p=0›. Tzu, Sun. Sun Tzu: The Art of War. S.l.: Pax Librorum Pub. H, 2009. Valizadeh, Reza. “Invitation to the Public Shooting with the Camera.” RFI, 2011. 19 June 2014 ‹http://www.persian.rfi.fr/%D8%AF%D8%B9%D9%88%D8%AA-%D8%A8%D9%87-%D8%B4%D9%84%DB%8C%DA%A9-%D8%B9%D9%85%D9%88%D9%85%DB%8C-%D8%A8%D8%A7-%D8%AF%D9%88%D8%B1%D8%A8%DB%8C%D9%86-%D8%B9%DA%A9%D8%A7%D8%B3%DB%8C-20110307/%D8%A7%DB%8C%D8%B1%D8%A7%D9%86›. Wagner, Ben. Exporting Censorship and Surveillance Technology. Netherlands: Humanist Institute for Co-operation with Developing Countries (Hivos), 2012. 7 July 2014 ‹https://hivos.org/sites/default/files/exporting_censorship_and_surveillance_technology_by_ben_wagner.pdf›. World Bank. Mobile Cellular Subscriptions (per 100 People). The World Bank. N.d. 27 June 2014 ‹http://data.worldbank.org/indicator/IT.CEL.SETS.P2›.
APA, Harvard, Vancouver, ISO, and other styles
29

Dieter, Michael. "Amazon Noir." M/C Journal 10, no. 5 (2007). http://dx.doi.org/10.5204/mcj.2709.

Full text
Abstract:

 
 
 There is no diagram that does not also include, besides the points it connects up, certain relatively free or unbounded points, points of creativity, change and resistance, and it is perhaps with these that we ought to begin in order to understand the whole picture. (Deleuze, “Foucault” 37) Monty Cantsin: Why do we use a pervert software robot to exploit our collective consensual mind? Letitia: Because we want the thief to be a digital entity. Monty Cantsin: But isn’t this really blasphemic? Letitia: Yes, but god – in our case a meta-cocktail of authorship and copyright – can not be trusted anymore. (Amazon Noir, “Dialogue”) In 2006, some 3,000 digital copies of books were silently “stolen” from online retailer Amazon.com by targeting vulnerabilities in the “Search inside the Book” feature from the company’s website. Over several weeks, between July and October, a specially designed software program bombarded the Search Inside!™ interface with multiple requests, assembling full versions of texts and distributing them across peer-to-peer networks (P2P). Rather than a purely malicious and anonymous hack, however, the “heist” was publicised as a tactical media performance, Amazon Noir, produced by self-proclaimed super-villains Paolo Cirio, Alessandro Ludovico, and Ubermorgen.com. While controversially directed at highlighting the infrastructures that materially enforce property rights and access to knowledge online, the exploit additionally interrogated its own interventionist status as theoretically and politically ambiguous. That the “thief” was represented as a digital entity or machinic process (operating on the very terrain where exchange is differentiated) and the emergent act of “piracy” was fictionalised through the genre of noir conveys something of the indeterminacy or immensurability of the event. In this short article, I discuss some political aspects of intellectual property in relation to the complexities of Amazon Noir, particularly in the context of control, technological action, and discourses of freedom. Software, Piracy As a force of distribution, the Internet is continually subject to controversies concerning flows and permutations of agency. While often directed by discourses cast in terms of either radical autonomy or control, the technical constitution of these digital systems is more regularly a case of establishing structures of operation, codified rules, or conditions of possibility; that is, of guiding social processes and relations (McKenzie, “Cutting Code” 1-19). Software, as a medium through which such communication unfolds and becomes organised, is difficult to conceptualise as a result of being so event-orientated. There lies a complicated logic of contingency and calculation at its centre, a dimension exacerbated by the global scale of informational networks, where the inability to comprehend an environment that exceeds the limits of individual experience is frequently expressed through desires, anxieties, paranoia. Unsurprisingly, cautionary accounts and moral panics on identity theft, email fraud, pornography, surveillance, hackers, and computer viruses are as commonplace as those narratives advocating user interactivity. 
When analysing digital systems, cultural theory often struggles to describe forces that dictate movement and relations between disparate entities composed by code, an aspect heightened by the intensive movement of informational networks where differences are worked out through the constant exposure to unpredictability and chance (Terranova, “Communication beyond Meaning”). Such volatility partially explains the recent turn to distribution in media theory, as once durable networks for constructing economic difference – organising information in space and time (“at a distance”), accelerating or delaying its delivery – appear contingent, unstable, or consistently irregular (Cubitt 194). Attributing actions to users, programmers, or the software itself is a difficult task when faced with these states of co-emergence, especially in the context of sharing knowledge and distributing media content. Exchanges between corporate entities, mainstream media, popular cultural producers, and legal institutions over P2P networks represent an ongoing controversy in this respect, with numerous stakeholders competing between investments in property, innovation, piracy, and publics. Beginning to understand this problematic landscape is an urgent task, especially in relation to the technological dynamics that organised and propel such antagonisms. In the influential fragment, “Postscript on the Societies of Control,” Gilles Deleuze describes the historical passage from modern forms of organised enclosure (the prison, clinic, factory) to the contemporary arrangement of relational apparatuses and open systems as being materially provoked by – but not limited to – the mass deployment of networked digital technologies. In his analysis, the disciplinary mode most famously described by Foucault is spatially extended to informational systems based on code and flexibility. According to Deleuze, these cybernetic machines are connected into apparatuses that aim for intrusive monitoring: “in a control-based system nothing’s left alone for long” (“Control and Becoming” 175). Such a constant networking of behaviour is described as a shift from “molds” to “modulation,” where controls become “a self-transmuting molding changing from one moment to the next, or like a sieve whose mesh varies from one point to another” (“Postscript” 179). Accordingly, the crisis underpinning civil institutions is consistent with the generalisation of disciplinary logics across social space, forming an intensive modulation of everyday life, but one ambiguously associated with socio-technical ensembles. The precise dynamics of this epistemic shift are significant in terms of political agency: while control implies an arrangement capable of absorbing massive contingency, a series of complex instabilities actually mark its operation. Noise, viral contamination, and piracy are identified as key points of discontinuity; they appear as divisions or “errors” that force change by promoting indeterminacies in a system that would otherwise appear infinitely calculable, programmable, and predictable. The rendering of piracy as a tactic of resistance, a technique capable of levelling out the uneven economic field of global capitalism, has become a predictable catch-cry for political activists. 
In their analysis of multitude, for instance, Antonio Negri and Michael Hardt describe the contradictions of post-Fordist production as conjuring forth a tendency for labour to “become common.” That is, the more productivity depends on flexibility, communication, and cognitive skills, directed by the cultivation of an ideal entrepreneurial or flexible subject, the greater the possibilities for self-organised forms of living that significantly challenge its operation. In this case, intellectual property exemplifies such a spiralling paradoxical logic, since “the infinite reproducibility central to these immaterial forms of property directly undermines any such construction of scarcity” (Hardt and Negri 180). The implications of the filesharing program Napster, accordingly, are read as not merely directed toward theft, but in relation to the private character of the property itself; a kind of social piracy is perpetuated that is viewed as radically recomposing social resources and relations. Ravi Sundaram, a co-founder of the Sarai new media initiative in Delhi, has meanwhile drawn attention to the existence of “pirate modernities” capable of being actualised when individuals or local groups gain illegitimate access to distributive media technologies; these are worlds of “innovation and non-legality,” of electronic survival strategies that partake in cultures of dispersal and escape simple classification (94). Meanwhile, pirate entrepreneurs Magnus Eriksson and Rasmus Fleische – associated with the notorious Piratbyrån – have promoted the bleeding away of Hollywood profits through fully deployed P2P networks, with the intention of pushing filesharing dynamics to an extreme in order to radicalise the potential for social change (“Copies and Context”). From an aesthetic perspective, such activist theories are complemented by the affective register of appropriation art, a movement broadly conceived in terms of antagonistically liberating knowledge from the confines of intellectual property: “those who pirate and hijack owned material, attempting to free information, art, film, and music – the rhetoric of our cultural life – from what they see as the prison of private ownership” (Harold 114). These “unruly” escape attempts are pursued through various modes of engagement, from experimental performances with legislative infrastructures (e.g. Kembrew McLeod’s trademarking of the phrase “freedom of expression”) to musical remix projects, such as the work of Negativland, John Oswald, RTMark, Detritus, Illegal Art, and the Evolution Control Committee. Amazon Noir, while similarly engaging with questions of ownership, is distinguished by specifically targeting information communication systems and finding “niches” or gaps between overlapping networks of control and economic governance. Hans Bernhard and Lizvlx from Ubermorgen.com (meaning ‘Day after Tomorrow,’ or ‘Super-Tomorrow’) actually describe their work as “research-based”: “we not are opportunistic, money-driven or success-driven, our central motivation is to gain as much information as possible as fast as possible as chaotic as possible and to redistribute this information via digital channels” (“Interview with Ubermorgen”). This has led to experiments like Google Will Eat Itself (2005) and the construction of the automated software thief against Amazon.com, as process-based explorations of technological action. 
Agency, Distribution Deleuze’s “postscript” on control has proven massively influential for new media art by introducing a series of key questions on power (or desire) and digital networks. As a social diagram, however, control should be understood as a partial rather than totalising map of relations, referring to the augmentation of disciplinary power in specific technological settings. While control is a conceptual regime that refers to open-ended terrains beyond the architectural locales of enclosure, implying a move toward informational networks, data solicitation, and cybernetic feedback, there remains a peculiar contingent dimension to its limits. For example, software code is typically designed to remain cycling until user input is provided. There is a specifically immanent and localised quality to its actions that might be taken as exemplary of control as a continuously modulating affective materialism. The outcome is a heightened sense of bounded emergencies that are either flattened out or absorbed through reconstitution; however, these are never linear gestures of containment. As Tiziana Terranova observes, control operates through multilayered mechanisms of order and organisation: “messy local assemblages and compositions, subjective and machinic, characterised by different types of psychic investments, that cannot be the subject of normative, pre-made political judgments, but which need to be thought anew again and again, each time, in specific dynamic compositions” (“Of Sense and Sensibility” 34). This event-orientated vitality accounts for the political ambitions of tactical media as opening out communication channels through selective “transversal” targeting. Amazon Noir, for that reason, is pitched specifically against the material processes of communication. The system used to harvest the content from “Search inside the Book” is described as “robot-perversion-technology,” based on a network of four servers around the globe, each with a specific function: one located in the United States that retrieved (or “sucked”) the books from the site, one in Russia that injected the assembled documents onto P2P networks and two in Europe that coordinated the action via intelligent automated programs (see “The Diagram”). According to the “villains,” the main goal was to steal all 150,000 books from Search Inside!™ then use the same technology to steal books from the “Google Print Service” (the exploit was limited only by the amount of technological resources financially available, but there are apparent plans to improve the technique by reinvesting the money received through the settlement with Amazon.com not to publicise the hack). In terms of informational culture, this system resembles a machinic process directed at redistributing copyright content; “The Diagram” visualises key processes that define digital piracy as an emergent phenomenon within an open-ended and responsive milieu. That is, the static image foregrounds something of the activity of copying being a technological action that complicates any analysis focusing purely on copyright as content. In this respect, intellectual property rights are revealed as being entangled within information architectures as communication management and cultural recombination – dissipated and enforced by a measured interplay between openness and obstruction, resonance and emergence (Terranova, “Communication beyond Meaning” 52). 
Understanding data distribution requires an acknowledgement of these underlying nonhuman relations that allow for such informational exchanges. It requires an understanding of the permutations of agency carried along by digital entities. According to Lawrence Lessig’s influential argument, code is not merely an object of governance, but has an overt legislative function itself. Within the informational environments of software, “a law is defined, not through a statute, but through the code that governs the space” (20). These points of symmetry are understood as concretised social values: they are material standards that regulate flow. Similarly, Alexander Galloway describes computer protocols as non-institutional “etiquette for autonomous agents,” or “conventional rules that govern the set of possible behavior patterns within a heterogeneous system” (7). In his analysis, these agreed-upon standardised actions operate as a style of management fostered by contradiction: progressive though reactionary, encouraging diversity by striving for the universal, synonymous with possibility but completely predetermined, and so on (243-244). Needless to say, political uncertainties arise from a paradigm that generates internal material obscurities through a constant twinning of freedom and control. For Wendy Hui Kyong Chun, these Cold War systems subvert the possibilities for any actual experience of autonomy by generalising paranoia through constant intrusion and reducing social problems to questions of technological optimisation (1-30). In confrontation with these seemingly ubiquitous regulatory structures, cultural theory requires a critical vocabulary differentiated from computer engineering to account for the sociality that permeates through and concatenates technological realities. In his recent work on “mundane” devices, software and code, Adrian McKenzie introduces a relevant analytic approach in the concept of technological action as something that both abstracts and concretises relations in a diffusion of collective-individual forces. Drawing on the thought of French philosopher Gilbert Simondon, he uses the term “transduction” to identify a key characteristic of technology in the relational process of becoming, or ontogenesis. This is described as bringing together disparate things into composites of relations that evolve and propagate a structure throughout a domain, or “overflow existing modalities of perception and movement on many scales” (“Impersonal and Personal Forces in Technological Action” 201). Most importantly, these innovative diffusions or contagions occur by bridging states of difference or incompatibilities. Technological action, therefore, arises from a particular type of disjunctive relation between an entity and something external to itself: “in making this relation, technical action changes not only the ensemble, but also the form of life of its agent. Abstraction comes into being and begins to subsume or reconfigure existing relations between the inside and outside” (203). Here, reciprocal interactions between two states or dimensions actualise disparate potentials through metastability: an equilibrium that proliferates, unfolds, and drives individuation. While drawing on cybernetics and dealing with specific technological platforms, McKenzie’s work can be extended to describe the significance of informational devices throughout control societies as a whole, particularly as a predictive and future-orientated force that thrives on staged conflicts. 
Moreover, being a non-deterministic technical theory, it additionally speaks to new tendencies in regimes of production that harness cognition and cooperation through specially designed infrastructures to enact persistent innovation without any end-point, final goal or natural target (Thrift 283-295). Here, the interface between intellectual property and reproduction can be seen as a site of variation that weaves together disparate objects and entities by imbrication in social life itself. These are specific acts of interference that propel relations toward unforeseen conclusions by drawing on memories, attention spans, material-technical traits, and so on. The focus lies on performance, context, and design “as a continual process of tuning arrived at by distributed aspiration” (Thrift 295). This latter point is demonstrated in recent scholarly treatments of filesharing networks as media ecologies. Kate Crawford, for instance, describes the movement of P2P as processual or adaptive, comparable to technological action, marked by key transitions from partially decentralised architectures such as Napster, to the fully distributed systems of Gnutella and seeded swarm-based networks like BitTorrent (30-39). Each of these technologies can be understood as a response to various legal incursions, producing radically dissimilar socio-technological dynamics and emergent trends for how agency is modulated by informational exchanges. Indeed, even these aberrant formations are characterised by modes of commodification that continually spill over and feed back on themselves, repositioning markets and commodities in doing so, from MP3s to iPods, P2P to broadband subscription rates. However, one key limitation of this ontological approach is apparent when dealing with the sheer scale of activity involved, where mass participation elicits certain degrees of obscurity and relative safety in numbers. This represents an obvious problem for analysis, as dynamics can easily be identified in the broadest conceptual sense, without any understanding of the specific contexts of usage, political impacts, and economic effects for participants in their everyday consumptive habits. Large-scale distributed ensembles are “problematic” in their technological constitution, as a result. They are sites of expansive overflow that provoke an equivalent individuation of thought, as the Recording Industry Association of America observes on its educational website: “because of the nature of the theft, the damage is not always easy to calculate but not hard to envision” (“Piracy”). The politics of the filesharing debate, in this sense, depends on the command of imaginaries; that is, being able to conceptualise an overarching structural consistency to a persistent and adaptive ecology. As a mode of tactical intervention, Amazon Noir dramatises these ambiguities by framing technological action through the fictional sensibilities of narrative genre. Ambiguity, Control The extensive use of imagery and iconography from “noir” can be understood as an explicit reference to the increasing criminalisation of copyright violation through digital technologies. However, the term also refers to the indistinct or uncertain effects produced by this tactical intervention: who are the “bad guys” or the “good guys”? Are positions like ‘good’ and ‘evil’ (something like freedom or tyranny) so easily identified and distinguished? 
As Paolo Cirio explains, this political disposition is deliberately kept obscure in the project: “it’s a representation of the actual ambiguity about copyright issues, where every case seems to lack a moral or ethical basis” (“Amazon Noir Interview”). While user communications made available on the site clearly identify culprits (describing the project as jeopardising arts funding, as both irresponsible and arrogant), the self-description of the artists as political “failures” highlights the uncertainty regarding the project’s qualities as a force of long-term social renewal: Lizvlx from Ubermorgen.com had daily shootouts with the global mass-media, Cirio continuously pushed the boundaries of copyright (books are just pixels on a screen or just ink on paper), Ludovico and Bernhard resisted kickback-bribes from powerful Amazon.com until they finally gave in and sold the technology for an undisclosed sum to Amazon. Betrayal, blasphemy and pessimism finally split the gang of bad guys. (“Press Release”) Here, the adaptive and flexible qualities of informatic commodities and computational systems of distribution are knowingly posited as critical limits; in a certain sense, the project fails technologically in order to succeed conceptually. From a cynical perspective, this might be interpreted as guaranteeing authenticity by insisting on the useless or non-instrumental quality of art. However, through this process, Amazon Noir illustrates how forces confined as exterior to control (virality, piracy, noncommunication) regularly operate as points of distinction to generate change and innovation. Just as hackers are legitimately employed to challenge the durability of network exchanges, malfunctions are relied upon as potential sources of future information. Indeed, the notion of demonstrating ‘autonomy’ by illustrating the shortcomings of software is entirely consistent with the logic of control as a modulating organisational diagram. These so-called “circuit breakers” are positioned as points of bifurcation that open up new systems and encompass a more general “abstract machine” or tendency governing contemporary capitalism (Parikka 300). As a consequence, the ambiguities of Amazon Noir emerge not just from the contrary articulation of intellectual property and digital technology, but additionally through the concept of thinking “resistance” simultaneously with regimes of control. This tension is apparent in Galloway’s analysis of the cybernetic machines that are synonymous with the operation of Deleuzian control societies – i.e. “computerised information management” – where tactical media are posited as potential modes of contestation against the tyranny of code, “able to exploit flaws in protocological and proprietary command and control, not to destroy technology, but to sculpt protocol and make it better suited to people’s real desires” (176). While pushing a system into a state of hypertrophy to reform digital architectures might represent a possible technique that produces a space through which to imagine something like “our” freedom, it still leaves unexamined the desire for reformation itself as nurtured by and produced through the coupling of cybernetics, information theory, and distributed networking. This draws into focus the significance of McKenzie’s Simondon-inspired cybernetic perspective on socio-technological ensembles as being always-already predetermined by and driven through asymmetries or difference. 
As Chun observes, consequently, there is no paradox between resistance and capture since “control and freedom are not opposites, but different sides of the same coin: just as discipline served as a grid on which liberty was established, control is the matrix that enables freedom as openness” (71). Why “openness” should be so readily equated with a state of being free represents a major unexamined presumption of digital culture, and leads to the associated predicament of attempting to think of how this freedom has become something one cannot not desire. If Amazon Noir has political currency in this context, however, it emerges from a capacity to recognise how informational networks channel desire, memories, and imaginative visions rather than just cultivated antagonisms and counterintuitive economics. As a final point, it is worth observing that the project was initiated without publicity until the settlement with Amazon.com. There is, as a consequence, nothing to suggest that this subversive “event” might have actually occurred, a feeling heightened by the abstractions of software entities. To the extent that we believe in “the big book heist,” that such an act is even possible, is a gauge through which the paranoia of control societies is illuminated as a longing or desire for autonomy. As Hakim Bey observes in his conceptualisation of “pirate utopias,” such fleeting encounters with the imaginaries of freedom flow back into the experience of the everyday as political instantiations of utopian hope. Amazon Noir, with all its underlying ethical ambiguities, presents us with a challenge to rethink these affective investments by considering our profound weaknesses to master the complexities and constant intrusions of control. It provides an opportunity to conceive of a future that begins with limits and limitations as immanently central, even foundational, to our deep interconnection with socio-technological ensembles. References “Amazon Noir – The Big Book Crime.” http://www.amazon-noir.com/>. Bey, Hakim. T.A.Z.: The Temporary Autonomous Zone, Ontological Anarchy, Poetic Terrorism. New York: Autonomedia, 1991. Chun, Wendy Hui Kyong. Control and Freedom: Power and Paranoia in the Age of Fibre Optics. Cambridge, MA: MIT Press, 2006. Crawford, Kate. “Adaptation: Tracking the Ecologies of Music and Peer-to-Peer Networks.” Media International Australia 114 (2005): 30-39. Cubitt, Sean. “Distribution and Media Flows.” Cultural Politics 1.2 (2005): 193-214. Deleuze, Gilles. Foucault. Trans. Seán Hand. Minneapolis: U of Minnesota P, 1986. ———. “Control and Becoming.” Negotiations 1972-1990. Trans. Martin Joughin. New York: Columbia UP, 1995. 169-176. ———. “Postscript on the Societies of Control.” Negotiations 1972-1990. Trans. Martin Joughin. New York: Columbia UP, 1995. 177-182. Eriksson, Magnus, and Rasmus Fleische. “Copies and Context in the Age of Cultural Abundance.” Online posting. 5 June 2007. Nettime 25 Aug 2007. Galloway, Alexander. Protocol: How Control Exists after Decentralization. Cambridge, MA: MIT Press, 2004. Hardt, Michael, and Antonio Negri. Multitude: War and Democracy in the Age of Empire. New York: Penguin Press, 2004. Harold, Christine. OurSpace: Resisting the Corporate Control of Culture. Minneapolis: U of Minnesota P, 2007. Lessig, Lawrence. Code and Other Laws of Cyberspace. New York: Basic Books, 1999. McKenzie, Adrian. Cutting Code: Software and Sociality. New York: Peter Lang, 2006. ———. 
“The Strange Meshing of Impersonal and Personal Forces in Technological Action.” Culture, Theory and Critique 47.2 (2006): 197-212. Parikka, Jussi. “Contagion and Repetition: On the Viral Logic of Network Culture.” Ephemera: Theory & Politics in Organization 7.2 (2007): 287-308. “Piracy Online.” Recording Industry Association of America. 28 Aug 2007. http://www.riaa.com/physicalpiracy.php>. Sundaram, Ravi. “Recycling Modernity: Pirate Electronic Cultures in India.” Sarai Reader 2001: The Public Domain. Delhi, Sarai Media Lab, 2001. 93-99. http://www.sarai.net>. Terranova, Tiziana. “Communication beyond Meaning: On the Cultural Politics of Information.” Social Text 22.3 (2004): 51-73. ———. “Of Sense and Sensibility: Immaterial Labour in Open Systems.” DATA Browser 03 – Curating Immateriality: The Work of the Curator in the Age of Network Systems. Ed. Joasia Krysa. New York: Autonomedia, 2006. 27-38. Thrift, Nigel. “Re-inventing Invention: New Tendencies in Capitalist Commodification.” Economy and Society 35.2 (2006): 279-306. 
 
Citation reference for this article

MLA Style
Dieter, Michael. "Amazon Noir: Piracy, Distribution, Control." M/C Journal 10.5 (2007). <http://journal.media-culture.org.au/0710/07-dieter.php>.

APA Style
Dieter, M. (Oct. 2007). "Amazon Noir: Piracy, Distribution, Control," M/C Journal, 10(5). Retrieved from <http://journal.media-culture.org.au/0710/07-dieter.php>.
APA, Harvard, Vancouver, ISO, and other styles
30

Teh, David. "Fibre." M/C Journal 6, no. 4 (2003). http://dx.doi.org/10.5204/mcj.2216.

Full text
Abstract:
At first, no doubt, only the reproduction and transmission of works of art will be affected. It will be possible to send anywhere or to re-create anywhere a system of sensations, or more precisely a system of stimuli, provoked by some object or event in any given place. Works of art will acquire a kind of ubiquity. We shall only have to summon them and there they will be…They will not merely exist in themselves but will exist wherever someone with a certain apparatus happens to be. (Paul Valéry, ‘The Conquest of Ubiquity’, 225-6) Paul Valéry made these remarks in 1934, as the first drive-in movie theater opened in New Jersey, as Muzak was born, as the Associated Press started its international wirephoto service, and as a company called Imperial & International Communications renamed itself Cable & Wireless. Regular TV broadcasting would begin in England two years later, and in the U.S. in 1939, the same year John Atanasoff and Clifford Berry completed the prototype of the first digital computer. (Caslon Analytics) Valéry’s prognostications may of course be read alongside the thinking of Walter Benjamin, who quotes this passage in his famous essay on ‘The Work of Art in the Age of Mechanical Reproduction’. Both stress that it is not simply the forms taken by art works that are changing, but their very conditions of possibility, or put another way (Benjamin’s), that they are henceforth designed with their reproducibility in mind. It is therefore neither uniqueness, nor specificity, but the potential for ‘ubiquity’, that yields the value of the work made for the new media. Just as water, gas and electricity are brought into our houses from far off to satisfy our needs in response to a minimal effort, so we shall be supplied with visual or auditory images, which will appear and disappear at a simple movement of the hand, hardly more than a sign.(226) Two things have always struck me about Valéry’s analysis. The first is his characterization – for want of a better word, metaphysical – of the new cultural produce. It is not simply a movement from the clunky physicality of the artisanal object to that of the commodity; rather, it is a commutation, a transmogrification, a liquidation of the cultural object, whose value and form henceforth arise according to its new fluidity. The cultural ‘fluid’ – what is given (data) to our ‘sense organs’ – behaves more like energy, or money, than the older art object. These properties suggest a whole new political economy of the culture industries. Just as we are accustomed, if not enslaved, to the various forms of energy that pour into our homes, we shall find it perfectly natural to receive the ultrarapid variations or oscillations that our sense organs gather in and integrate to form all we know. I do not know whether a philosopher has ever dreamed of a company engaged in the home delivery of Sensory Reality So began what we might call our Broadband Dreaming. Secondly, Valéry cannot but invoke the public utility company, a dominant corporate form in his day, but which to us is an endangered species, having almost liquidated itself over the course of the last few decades’ ecstatic neoliberalism. According to the Shorter OED, the “utility” provides something “able to satisfy human needs or wants”; it is a service (such as electricity or water) considered essential to the community; and it describes the provider of such a service or supply, usually ‘a nationalized or private monopoly subject to public regulation’. 
And this is precisely why I return to Valéry in opening a volume on ‘fibre’. For it is the privatization of communications infrastructure, hastening the closure of this zone of ‘public’ interest and community ‘needs’ – and this is as much about the downgrading of expectations as of actual services – that underlies the current political economy of networks and networked culture, and which prompts many of the articles collected here. What’s more, Valéry is especially alert to the peculiar purity of demand that the utility assumes, and our impatience for art’s sensory data “when not only our mind desires it, but our soul and whole being craves and as it were anticipates it”. Perhaps this well-nigh existential impatience is a necessary condition of networking – will we ever be satisfied with the bandwidth we have? As Gerard Goggin writes in the feature article: As the citizen is recast as consumer and customer, we rethink our cultural and political axioms as well as the axes that orient our understandings in this area. Information might travel close to the speed of light, and we might fantasise about optical fibre to the home (or pillow), but our terrain, our band where the struggle lies today, is narrower than we wish. That which we have ‘on tap’ has a way of engendering in us a reliance and an appetite somewhat out of keeping with actual need. Where conventional economic analysis might therefore struggle to explain our current obsession with fibre, histories of addiction, of affect and of symbolic exchange might succeed. The Fibreculture Flavour When we started the Fibreculture list in early 2001, national communications policy was a central concern, as was the question of how to make the best of it through critique and alternative networking practices, against the many challenges presented by the global and local zeitgeist of privatization, and by the post-dotcom deflation of the telecoms sector. Ravenous former monopolies, in rebound mode, were punished for their over-extensions into markets they knew little about, as the blue skies clouded over. Against this backdrop, it seemed most urgent to support, build upon, and learn from the experiences of a panoply of alternative media networks – of virtual communities getting real, and real communities going virtual – in order to learn the lessons of the dotcom debacle. Buzzwords were: D.I.Y. and tactical media, openness, sustainability, and collaborative and distributed models. But this collaboration between Fibreculture and M/C is not just content-sharing by two networks with overlapping interests, although this sort of temporary network chiasm demonstrates an untapped flexibility that ICTs retain in spite of the calcification of their institutions and their economic devaluation post-dotcom. Rather, at the heart of this experiment was an alternative peer-review process, a much-needed intervention into the orthodoxy (too long unrenovated) of blind peer-review. It took the form of a supplementary round of ‘collaborative text filtering’. Traditionally, peer-review is closed (‘blind’), centralized, and tends to be somewhat arbitrary; our alternative is distributed, open and more heuristic. From the list’s subscribers, small cells of four or five readers were formed; submissions were posted to the list, assigned to a cell, and readers were asked to post their critical responses within two weeks. Some of the ensuing dialogue was fascinating, all of it engaged and generous. 
The Fibreculture flavour thus consists of a wider discussion and debate inflecting the author’s final submission. ‘Review’ here was oriented towards an opening, rather than a closure, of the text, giving rise to a sharing of resources, references and informed opinions. These exchanges remain accessible via the list archives (look for subject lines ‘MCFIBRE’ and ‘Re: MCFIBRE’) at: <http://lists.myspinach.org/archives/fibreculture/2003-June/subject.html> <http://lists.myspinach.org/archives/fibreculture/2003-May/subject.html> What’s lost is anonymity and the discursive or disciplinary specialization of reviewers – both are key components of the older model, both with their downside. The question must be asked: If interdisciplinarity means anything beyond the proliferation of competing discourses, what are its implications for the practices and economies of academic publishing, and for the ‘knowledge economy’ generally? Of course, the spread of topics does mirror Fibreculture’s interests. Half of the authors assembled here are regular contributors to the list. They include its co-founder, Geert Lovink, who manages to report and speculate (at once!) on the much-paraded relationship between art and science; and Gerard Goggin, whose informative feature article takes up many of the concerns raised above, with respect to broadband infrastructure (and policy) in particular. Emy Tseng and Kyle Eischen take the notion of infrastructure more technically in considering how it might inform a progressive techno-geography. Fibreculture explores the politics of networks and ICTs, but also their cultures. The experiential (and ‘affective’) dimension of networked culture was also a prevalent theme of responses to the Call For Papers, including artist and architect Petra Gemeinboeck’s theoretical explanation of her installation Maya – Veil of Illusion. Fibre is where the economic meets the social, where the public meets the private, and intrudes upon it. Grayson Cooke responds in kind (and with humour) to the intrusive excesses of Spam. For Adrian Mackenzie, both social and technical practices “are integrated in our politics. When politics integrates human affairs and technical things, collective affects concerning infrastructure arise… Infrastructures are integral to how cultural forms of life render and inhabit their worlds.” But some aspects of sociality migrate to the networks more easily than others, as Jon Marshall discovers in his analysis of gendered and gendering behaviour online. For all their complexity, the interweavings of affect in the networks are anything but random. As we find in Andrew Murphie’s anthropological musing (after José Gil) on the place of ritual in the technosphere: Even at its apparently most disorganized … (in ritual ecstasy for example), ritual magic is in reality extremely organised (although an organisation of forces and translations rather than one of stable states). As Gil writes, even the 'gestures, words, or cries of the possessed are coded'. Indeed, the codes involved are precisely those of possession, but of a possession by networks rather than of them… Also of a theoretical bent is Andrew Goffey’s fascinating synopsis of the relationship – potentially very revealing – between immunology and theories of networked communication and organization. 
A welcome reminder of the necessity, and the speculative pleasures, of pressing on with cross-disciplinary investigation, even when it seems ‘interdisciplinarity’ has devolved from a type of work to a mere ‘framework’ for funding agendas and institutional window-dressing. As with all Fibreculture projects, no all-inclusive vision of anything is offered here. What we present instead is another installment of networked multiplicity, the unpredictable mixture of codes, idioms and critical thought on which list cultures seem to thrive. With thanks to the team at M/C, to the contributors and reviewers (especially Mel Gregg, Ned Rossiter and Esther Milne), and to all who contribute to the Fibreculture community. http://www.fibreculture.org Works Cited Paul Valéry, ‘The Conquest of Ubiquity’, in Aesthetics, trans. Ralph Manheim, London: Routledge and Kegan Paul, 1964. Caslon Analytics, ‘Media and Communications Timeline’ 1926-50 <http://www.caslon.com.au/timeline5.htm> accessed 18/08/03 Links http://lists.myspinach.org/archives/fibreculture/2003-June/subject.html http://lists.myspinach.org/archives/fibreculture/2003-May/subject.html http://www.caslon.com.au/timeline5.htm http://www.fibreculture.org/ http://www.fibreculture.org/index.html http://www.fibreculture.org/mcfibre.html Citation reference for this article MLA Style Teh, David. "Fibre." M/C: A Journal of Media and Culture <http://www.media-culture.org.au/0308/01-editorial.html>. APA Style Teh, D. (2003, Aug 26). Fibre. M/C: A Journal of Media and Culture, 6, <http://www.media-culture.org.au/0308/01-editorial.html>.
APA, Harvard, Vancouver, ISO, and other styles
31

Goggin, Gerard. "Broadband." M/C Journal 6, no. 4 (2003). http://dx.doi.org/10.5204/mcj.2219.

Full text
Abstract:
Connecting I’ve moved house on the weekend, closer to the centre of an Australian capital city. I had recently signed up for broadband, with a major Australian Internet company (my first contact, cf. Turner). Now I am the proud owner of a larger modem than I have ever owned: a white cable modem. I gaze out into our new street: two thick black cables cosseted in silver wire. I am relieved. My new home is located in one of those streets, double-cabled by Telstra and Optus in the data-rush of the mid-1990s. Otherwise, I’d be moth-balling the cable modem, and the thrill of my data percolating down coaxial cable. And it would be off to the computer supermarket to buy an ASDL modem, then to pick a provider, to squeeze some twenty-first century connectivity out of old copper (the phone network our grandparents and great-grandparents built). If I still lived in the country, or the outskirts of the city, or anywhere else more than four kilometres from the phone exchange, and somewhere that cable pay TV will never reach, it would be a dish for me — satellite. Our digital lives are premised upon infrastructure, the networks through which we shape what we do, fashion the meanings of our customs and practices, and exchange signs with others. Infrastructure is not simply the material or the technical (Lamberton), but it is the dense, fibrous knotting together of social visions, cultural resources, individual desires, and connections. No more can one easily discern between ‘society’ and ‘technology’, ‘carriage’ and ‘content’, ‘base’ and ‘superstructure’, or ‘infrastructure’ and ‘applications’ (or ‘services’ or ‘content’). To understand telecommunications in action, or the vectors of fibre, we need to consider the long and heterogeneous list of links among different human and non-human actors — the long networks, to take Bruno Latour’s evocative concept, that confect our broadband networks (Latour). The co-ordinates of our infrastructure still build on a century-long history of telecommunications networks, on the nineteenth-century centrality of telegraphy preceding this, and on the histories of the public and private so inscribed. Yet we are in the midst of a long, slow dismantling of the posts-telegraph-telephone (PTT) model of the monopoly carrier for each nation that dominated the twentieth century, with its deep colonial foundations. Instead our New World Information and Communication Order is not the decolonising UNESCO vision of the late 1970s and early 1980s (MacBride, Maitland). Rather it is the neoliberal, free trade, market access model, its symbol the 1984 US judicial decision to require the break-up of AT&T and the UK legislation in the same year that underpinned the Thatcherite twin move to privatize British Telecom and introduce telecommunications competition. Between 1984 and 1999, 110 telecommunications companies were privatized, and the ‘acquisition of privatized PTOs [public telecommunications operators] by European and American operators does follow colonial lines’ (Winseck 396; see also Mody, Bauer & Straubhaar). The competitive market has now been uneasily installed as the paradigm for convergent communications networks, not least with the World Trade Organisation’s 1994 General Agreement on Trade in Services and Annex on Telecommunications. As the citizen is recast as consumer and customer (Goggin, ‘Citizens and Beyond’), we rethink our cultural and political axioms as well as the axes that orient our understandings in this area. 
Information might travel close to the speed of light, and we might fantasise about optical fibre to the home (or pillow), but our terrain, our band where the struggle lies today, is narrower than we wish. Begging for broadband, it seems, is a long way from warchalking for WiFi. Policy Circuits The dreary everyday business of getting connected plugs the individual netizen into a tangled mess of policy circuits, as much as tricky network negotiations. Broadband in mid-2003 in Australia is a curious chimera, welded together from a patchwork of technologies, old and newer communications industries, emerging economies and patterns of use. Broadband conjures up grander visions, however, of communication and cultural cornucopia. Broadband is high-speed, high-bandwidth, ‘always-on’, networked communications. People can send and receive video, engage in multimedia exchanges of all sorts, make the most of online education, realise the vision of home-based work and trading, have access to telemedicine, and entertainment. Broadband really entered the lexicon with the mass takeup of the Internet in the early to mid-1990s, and with the debates about something called the ‘information superhighway’. The rise of the Internet, the deregulation of telecommunications, and the involuted convergence of communications and media technologies saw broadband positioned at the centre of policy debates nearly a decade ago. In 1993-1994, Australia had its Broadband Services Expert Group (BSEG), established by the then Labor government. The BSEG was charged with inquiring into ‘issues relating to the delivery of broadband services to homes, schools and businesses’. Stung by criticisms of elite composition (a narrow membership, with only one woman among its twelve members, and no consumer or citizen group representation), the BSEG was prompted into wider public discussion and consultation (Goggin & Newell). The then Bureau of Transport and Communications Economics (BTCE), since transmogrified into the Communications Research Unit of the Department of Communications, Information Technology and the Arts (DCITA), conducted its large-scale Communications Futures Project (BTCE and Luck). The BSEG Final report posed the question starkly: As a society we have choices to make. If we ignore the opportunities we run the risk of being left behind as other countries introduce new services and make themselves more competitive: we will become consumers of other countries’ content, culture and technologies rather than our own. Or we could adopt new technologies at any cost…This report puts forward a different approach, one based on developing a new, user-oriented strategy for communications. The emphasis will be on communication among people... (BSEG v) The BSEG proposed a ‘National Strategy for New Communications Networks’ based on three aspects: education and community access, industry development, and the role of government (BSEG x). Ironically, while the nation, or at least its policy elites, pondered the weighty question of broadband, Australia’s two largest telcos were doing it. The commercial decision of Telstra/Foxtel and Optus Vision, and their various television partners, was to nail their colours (black) to the mast, or rather telegraph pole, and to lay cable in the major capital cities. In fact, they duplicated the infrastructure in cities such as Sydney and Melbourne, then deciding it would not be profitable to cable up even regional centres, let alone small country towns or settlements. 
As Terry Flew and Christina Spurgeon observe: This wasteful duplication contrasted with many other parts of the country that would never have access to this infrastructure, or to the social and economic benefits that it was perceived to deliver. (Flew & Spurgeon 72) The implications of this decision for Australia’s telecommunications and television were profound, but there was little, if any, public input into this. Then Minister Michael Lee was very proud of his anti-siphoning list of programs, such as national sporting events, that would remain on free-to-air television rather than screen on pay, but was unwilling, or unable, to develop policy on broadband and pay TV cable infrastructure (on the ironies of Australia’s television history, see Given’s masterly account). During this period also, it may be remembered, Australia’s Internet was being passed into private hands, with the tendering out of AARNET (see Spurgeon for discussion). No such national strategy on broadband really emerged in the intervening years, nor has the market provided integrated, accessible broadband services. In 1997, landmark telecommunications legislation was enacted that provided a comprehensive framework for competition in telecommunications, as well as consolidating and extending consumer protection, universal service, customer service standards, and other reforms (CLC). Carrier and reseller competition had commenced in 1991, and the 1997 legislation gave it further impetus. Effective competition is now well established in long distance telephone markets, and in mobiles. Rivalrous competition exists in the market for local-call services, though viable alternatives to Telstra’s dominance are still few (Fels). Broadband too is an area where there is symbolic rivalry rather than effective competition. This is most visible in advertised ADSL offerings in large cities, yet most of the infrastructure for these services is comprised by Telstra’s copper, fixed-line network. Facilities-based duopoly competition exists principally where Telstra/Foxtel and Optus cable networks have been laid, though there are quite a number of ventures underway by regional telcos, power companies, and, most substantial perhaps, the ACT government’s TransACT broadband network. Policymakers and industry have been greatly concerned about what they see as slow takeup of broadband, compared to other countries, and by barriers to broadband competition and access to ‘bottleneck’ facilities (such as Telstra or Optus’s networks) by potential competitors. The government has alternated between trying to talk up broadband benefits and rates of take up and recognising the real difficulties Australia faces as a large country with a relative small and dispersed population. In March 2003, Minister Alston directed the ACCC to implement new monitoring and reporting arrangements on competition in the broadband industry. A key site for discussion of these matters has been the competition policy institution, the Australian Competition and Consumer Commission, and its various inquiries, reports, and considerations (consult ACCC’s telecommunications homepage at http://www.accc.gov.au/telco/fs-telecom.htm). Another key site has been the Productivity Commission (http://www.pc.gov.au), while a third is the National Office on the Information Economy (NOIE - http://www.noie.gov.au/projects/access/access/broadband1.htm). Others have questioned whether even the most perfectly competitive market in broadband will actually provide access to citizens and consumers. 
A great deal of work on this issue has been undertaken by DCITA, NOIE, the regulators, and industry bodies, not to mention consumer and public interest groups. Since 1997, there have been a number of governmental inquiries undertaken or in progress concerning the takeup of broadband and networked new media (for example, a House of Representatives Wireless Broadband Inquiry), as well as important inquiries into the still most strategically important of Australia’s companies in this area, Telstra. Much of this effort on an ersatz broadband policy has been piecemeal and fragmented. There are fundamental difficulties with the large size of the Australian continent and its harsh terrain, the small size of the Australian market, the number of providers, and the dominant position effectively still held by Telstra, as well as Singtel Optus (Optus’s previous overseas investors included Cable & Wireless and Bell South), and the larger telecommunications and Internet companies (such as Ozemail). Many consumers living in metropolitan Australia still face real difficulties in realising the slogan ‘bandwidth for all’, but the situation in parts of rural Australia is far worse. Satellite ‘broadband’ solutions are available, through Telstra Countrywide or other providers, but these offer limited two-way interactivity. Data can be received at reasonable speeds (though at far lower data rates than how ‘broadband’ used to be defined), but can only be sent at far slower rates (Goggin, Rural Communities Online). The cultural implications of these digital constraints may well be considerable. Computer gamers, for instance, are frustrated by slow return paths. In this light, the final report of the January 2003 Broadband Advisory Group (BAG) is very timely. The BAG report opens with a broadband rhapsody: Broadband communications technologies can deliver substantial economic and social benefits to Australia…As well as producing productivity gains in traditional and new industries, advanced connectivity can enrich community life, particularly in rural and regional areas. It provides the basis for integration of remote communities into national economic, cultural and social life. (BAG 1, 7) Its prescriptions include: Australia will be a world leader in the availability and effective use of broadband...and to capture the economic and social benefits of broadband connectivity...Broadband should be available to all Australians at fair and reasonable prices…Market arrangements should be pro-competitive and encourage investment...The Government should adopt a National Broadband Strategy (BAG 1) And, like its predecessor nine years earlier, the BAG report does make reference to a national broadband strategy aiming to maximise “choice in work and recreation activities available to all Australians independent of location, background, age or interests” (17). However, the idea of a national broadband strategy is not something the BAG really comes to grips with. The final report is keen on encouraging broadband adoption, but not explicit on how barriers to broadband can be addressed. Perhaps this is not surprising given that the membership of the BAG, dominated by representatives of large corporations and senior bureaucrats was even less representative than its BSEG predecessor. Some months after the BAG report, the Federal government did declare a broadband strategy. 
It did so, intriguingly enough, under the rubric of its response to the Regional Telecommunications Inquiry report (Estens), the second inquiry responsible for reassuring citizens nervous about the full-privatisation of Telstra (the first inquiry being Besley). The government’s grand $142.8 million National Broadband Strategy focusses on the ‘broadband needs of regional Australians, in partnership with all levels of government’ (Alston, ‘National Broadband Strategy’). Among other things, the government claims that the Strategy will result in “improved outcomes in terms of services and prices for regional broadband access; [and] the development of national broadband infrastructure assets.” (Alston, ‘National Broadband Strategy’) At the same time, the government announced an overall response to the Estens Inquiry, with specific safeguards for Telstra’s role in regional communications — a preliminary to the full Telstra sale (Alston, ‘Future Proofing’). Less publicised was the government’s further initiative in indigenous telecommunications, complementing its Telecommunications Action Plan for Remote Indigenous Communities (DCITA). Indigenous people, it can be argued, were never really contemplated as citizens with the ken of the universal service policy taken to underpin the twentieth-century government monopoly PTT project. In Australia during the deregulatory and re-regulatory 1990s, there was a great reluctance on the part of Labor and Coalition Federal governments, Telstra and other industry participants, even to research issues of access to and use of telecommunications by indigenous communicators. Telstra, and to a lesser extent Optus (who had purchased AUSSAT as part of their licence arrangements), shrouded the issue of indigenous communications in mystery that policymakers were very reluctant to uncover, let alone systematically address. Then regulator, the Australian Telecommunications Authority (AUSTEL), had raised grave concerns about indigenous telecommunications access in its 1991 Rural Communications inquiry. However, there was no government consideration of, nor research upon, these issues until Alston commissioned a study in 2001 — the basis for the TAPRIC strategy (DCITA). The elision of indigenous telecommunications from mainstream industry and government policy is all the more puzzling, if one considers the extraordinarily varied and significant experiments by indigenous Australians in telecommunications and Internet (not least in the early work of the Tanami community, made famous in media and cultural studies by the writings of anthropologist Eric Michaels). While the government’s mid-2003 moves on a ‘National Broadband Strategy’ attend to some details of the broadband predicament, they fall well short of an integrated framework that grasps the shortcomings of the neoliberal communications model. The funding offered is a token amount. The view from the seat of government is a glance from the rear-view mirror: taking a snapshot of rural communications in the years 2000-2002 and projecting this tableau into a safety-net ‘future proofing’ for the inevitable turning away of a fully-privately-owned Telstra from its previously universal, ‘carrier of last resort’ responsibilities. In this aetiolated, residualist policy gaze, citizens remain constructed as consumers in a very narrow sense in this incremental, quietist version of state securing of market arrangements. 
What is missing is any more expansive notion of citizens, their varied needs, expectations, uses, and cultural imaginings of ‘always on’ broadband networks. Hybrid Networks “Most people on earth will eventually have access to networks that are all switched, interactive, and broadband”, wrote Frances Cairncross in 1998. ‘Eventually’ is a very appropriate word to describe the parlous state of broadband technology implementation. Broadband is in a slow state of evolution and invention. The story of broadband so far underscores the predicament for Australian access to bandwidth, when we lack any comprehensive, integrated, effective, and fair policy in communications and information technology. We have only begun to experiment with broadband technologies and understand their evolving uses, cultural forms, and the sense in which they rework us as subjects. Our communications networks are not superhighways, to invoke an enduring artefact from an older technology. Nor any longer are they a single ‘public’ switched telecommunications network, like those presided over by the post-telegraph-telephone monopolies of old. Like roads themselves, or the nascent postal system of the sixteenth century, broadband is a patchwork quilt. The ‘fibre’ of our communications networks is hybrid. To be sure, powerful corporations dominate, like the Tassis or Taxis who served as postmasters to the Habsburg emperors (Briggs & Burke 25). Activating broadband today provides a perspective on the path dependency of technology history, and how we can open up new threads of a communications fabric. Our options for transforming our multitudinous networked lives emerge as much from everyday tactics and strategies as they do from grander schemes and unifying policies. We may care to reflect on the waning potential for nation-building technology, in the wake of globalisation. We no longer gather our imagined community around a Community Telephone Plan as it was called in 1960 (Barr, Moyal, and PMG). Yet we do require national and international strategies to get and stay connected (Barr), ideas and funding that concretely address the wider dimensions of access and use. We do need to debate the respective roles of Telstra, the state, community initiatives, and industry competition in fair telecommunications futures. Networks have global reach and require global and national integration. Here vision, co-ordination, and resources are urgently required for our commonweal and moral fibre. To feel the width of the band we desire, we need to plug into and activate the policy circuits. Thanks to Grayson Cooke, Patrick Lichty, Ned Rossiter, John Pace, and an anonymous reviewer for helpful comments. Works Cited Alston, Richard. ‘ “Future Proofing” Regional Communications.’ Department of Communications, Information Technology and the Arts, Canberra, 2003. 17 July 2003 <http://www.dcita.gov.au/Article/0,,0_1-2_3-4_115485,00.php> —. ‘A National Broadband Strategy.’ Department of Communications, Information Technology and the Arts, Canberra, 2003. 17 July 2003 <http://www.dcita.gov.au/Article/0,,0_1-2_3-4_115486,00.php>. Australian Competition and Consumer Commission (ACCC). Broadband Services Report March 2003. Canberra: ACCC, 2003. 17 July 2003 <http://www.accc.gov.au/telco/fs-telecom.htm>. —. Emerging Market Structures in the Communications Sector. Canberra: ACCC, 2003. 15 July 2003 <http://www.accc.gov.au/pubs/publications/utilities/telecommu... ...nications/Emerg_mar_struc.doc>. Barr, Trevor. 
new media.com: The Changing Face of Australia’s Media and Telecommunications. Sydney: Allen & Unwin, 2000. Besley, Tim (Telecommunications Service Inquiry). Connecting Australia: Telecommunications Service Inquiry. Canberra: Department of Information, Communications and the Arts, 2000. 17 July 2003 <http://www.telinquiry.gov.au/final_report.php>. Briggs, Asa, and Burke, Peter. A Social History of the Internet: From Gutenberg to the Internet. Cambridge: Polity, 2002. Broadband Advisory Group. Australia’s Broadband Connectivity: The Broadband Advisory Group’s Report to Government. Melbourne: National Office on the Information Economy, 2003. 15 July 2003 <http://www.noie.gov.au/publications/NOIE/BAG/report/index.htm>. Broadband Services Expert Group. Networking Australia’s Future: Final Report. Canberra: Australian Government Publishing Service (AGPS), 1994. Bureau of Transport and Communications Economics (BTCE). Communications Futures Final Project. Canberra: AGPS, 1994. Cairncross, Frances. The Death of Distance: How the Communications Revolution Will Change Our Lives. London: Orion Business Books, 1997. Communications Law Centre (CLC). Australian Telecommunications Regulation: The Communications Law Centre Guide. 2nd edition. Sydney: Communications Law Centre, University of NSW, 2001. Department of Communications, Information Technology and the Arts (DCITA). Telecommunications Action Plan for Remote Indigenous Communities: Report on the Strategic Study for Improving Telecommunications in Remote Indigenous Communities. Canberra: DCITA, 2002. Estens, D. Connecting Regional Australia: The Report of the Regional Telecommunications Inquiry. Canberra: DCITA, 2002. <http://www.telinquiry.gov.au/rti-report.php>, accessed 17 July 2003. Fels, Alan. ‘Competition in Telecommunications’, speech to Australian Telecommunications Users Group 19th Annual Conference. 6 March, 2003, Sydney. <http://www.accc.gov.au/speeches/2003/Fels_ATUG_6March03.doc>, accessed 15 July 2003. Flew, Terry, and Spurgeon, Christina. ‘Television After Broadcasting’. In The Australian TV Book. Ed. Graeme Turner and Stuart Cunningham. Allen & Unwin, Sydney. 69-85. 2000. Given, Jock. Turning Off the Television. Sydney: UNSW Press, 2003. Goggin, Gerard. ‘Citizens and Beyond: Universal service in the Twilight of the Nation-State.’ In All Connected?: Universal Service in Telecommunications, ed. Bruce Langtry. Melbourne: University of Melbourne Press, 1998. 49-77 —. Rural Communities Online: Networking to link Consumers to Providers. Melbourne: Telstra Consumer Consultative Council, 2003. Goggin, Gerard, and Newell, Christopher. Digital Disability: The Social Construction of Disability in New Media. Lanham, MD: Rowman & Littlefield, 2003. House of Representatives Standing Committee on Communications, Information Technology and the Arts (HoR). Connecting Australia!: Wireless Broadband. Report of Inquiry into Wireless Broadband Technologies. Canberra: Parliament House, 2002. <http://www.aph.gov.au/house/committee/cita/Wbt/report.htm>, accessed 17 July 2003. Lamberton, Don. ‘A Telecommunications Infrastructure is Not an Information Infrastructure’. Prometheus: Journal of Issues in Technological Change, Innovation, Information Economics, Communication and Science Policy 14 (1996): 31-38. Latour, Bruno. Science in Action: How to Follow Scientists and Engineers Through Society. Cambridge, MA: Harvard University Press, 1987. Luck, David. 
‘Revisiting the Future: Assessing the 1994 BTCE communications futures project.’ Media International Australia 96 (2000): 109-119. MacBride, Sean (Chair of International Commission for the Study of Communication Problems). Many Voices, One World: Towards a New More Just and More Efficient World Information and Communication Order. Paris: Kegan Page, London. UNESCO, 1980. Maitland Commission (Independent Commission on Worldwide Telecommunications Development). The Missing Link. Geneva: International Telecommunications Union, 1985. Michaels, Eric. Bad Aboriginal Art: Tradition, Media, and Technological Horizons. Sydney: Allen & Unwin, 1994. Mody, Bella, Bauer, Johannes M., and Straubhaar, Joseph D., eds. Telecommunications Politics: Ownership and Control of the Information Highway in Developing Countries. Mahwah, NJ: Erlbaum, 1995. Moyal, Ann. Clear Across Australia: A History of Telecommunications. Melbourne: Thomas Nelson, 1984. Post-Master General’s Department (PMG). Community Telephone Plan for Australia. Melbourne: PMG, 1960. Productivity Commission (PC). Telecommunications Competition Regulation: Inquiry Report. Report No. 16. Melbourne: Productivity Commission, 2001. <http://www.pc.gov.au/inquiry/telecommunications/finalreport/>, accessed 17 July 2003. Spurgeon, Christina. ‘National Culture, Communications and the Information Economy.’ Media International Australia 87 (1998): 23-34. Turner, Graeme. ‘First Contact: coming to terms with the cable guy.’ UTS Review 3 (1997): 109-21. Winseck, Dwayne. ‘Wired Cities and Transnational Communications: New Forms of Governance for Telecommunications and the New Media’. In The Handbook of New Media: Social Shaping and Consequences of ICTs, ed. Leah A. Lievrouw and Sonia Livingstone. London: Sage, 2002. 393-409. World Trade Organisation. General Agreement on Trade in Services: Annex on Telecommunications. Geneva: World Trade Organisation, 1994. 17 July 2003 <http://www.wto.org/english/tratop_e/serv_e/12-tel_e.htm>. —. Fourth protocol to the General Agreement on Trade in Services. Geneva: World Trade Organisation. 17 July 2003 <http://www.wto.org/english/tratop_e/serv_e/4prote_e.htm>. Links http://www.accc.gov.au/pubs/publications/utilities/telecommunications/Emerg_mar_struc.doc http://www.accc.gov.au/speeches/2003/Fels_ATUG_6March03.doc http://www.accc.gov.au/telco/fs-telecom.htm http://www.aph.gov.au/house/committee/cita/Wbt/report.htm http://www.dcita.gov.au/Article/0,,0_1-2_3-4_115485,00.html http://www.dcita.gov.au/Article/0,,0_1-2_3-4_115486,00.html http://www.noie.gov.au/projects/access/access/broadband1.htm http://www.noie.gov.au/publications/NOIE/BAG/report/index.htm http://www.pc.gov.au http://www.pc.gov.au/inquiry/telecommunications/finalreport/ http://www.telinquiry.gov.au/final_report.html http://www.telinquiry.gov.au/rti-report.html http://www.wto.org/english/tratop_e/serv_e/12-tel_e.htm http://www.wto.org/english/tratop_e/serv_e/4prote_e.htm Citation reference for this article Substitute your date of access for Dn Month Year etc... MLA Style Goggin, Gerard. "Broadband" M/C: A Journal of Media and Culture< http://www.media-culture.org.au/0308/02-featurebroadband.php>. APA Style Goggin, G. (2003, Aug 26). Broadband. M/C: A Journal of Media and Culture, 6,< http://www.media-culture.org.au/0308/02-featurebroadband.php>
APA, Harvard, Vancouver, ISO, and other styles
32

Brown, Adam, and Leonie Rutherford. "Postcolonial Play: Constructions of Multicultural Identities in ABC Children's Projects." M/C Journal 14, no. 2 (2011). http://dx.doi.org/10.5204/mcj.353.

Full text
Abstract:
In 1988, historian Nadia Wheatley and indigenous artist Donna Rawlins published their award-winning picture book, My Place, a reinterpretation of Australian national identity and sovereignty prompted by the bicentennial of white settlement. Twenty years later, the Australian Broadcasting Corporation (ABC) commissioned Penny Chapman’s multi-platform project based on this book. The 13 episodes of the television series begin in 2008, each telling the story of a child at a different point in history, and are accompanied by substantial interactive online content. Issues as diverse as religious difference and immigration, wartime conscription and trauma, and the experiences of Aboriginal Australians are canvassed. The program itself, which has a second series currently in production, introduces child audiences to—and implicates them in—a rich ideological fabric of deeply politicised issues that directly engage with vexed questions of Australian nationhood. The series offers a subversive view of Australian history and society, and it is the child—whether protagonist on the screen or the viewer/user of the content—who is left to discover, negotiate and move beyond often problematic societal norms. As one of the public broadcaster’s keystone projects, My Place signifies important developments in ABC’s construction of multicultural child citizenship. The digitisation of Australian television has facilitated a wave of multi-channel and new media innovation. Though the development of a multi-channel ecology has occurred significantly later in Australia than in the US or Europe, in part due to genre restrictions on broadcasters, all major Australian networks now have at least one additional free-to-air channel, make some of their content available online, and utilise various forms of social media to engage their audiences. The ABC has been in the vanguard of new media innovation, leveraging the industry dominance of ABC Online and its cross-platform radio networks for the repurposing of news, together with the additional funding for digital renewal, new Australian content, and a digital children’s channel in the 2006 and 2009 federal budgets. In line with “market failure” models of broadcasting (Born, Debrett), the ABC was once the most important producer-broadcaster for child viewers. With the recent allocation for the establishment of ABC3, it is now the catalyst for a significant revitalisation of the Australian children’s television industry. The ABC Charter requires it to broadcast programs that “contribute to a sense of national identity” and that “reflect the cultural diversity of the Australian community” (ABC Documents). Through its digital children’s channel (ABC3) and its multi-platform content, child viewers are not only exposed to a much more diverse range of local content, but also politicised by an intricate network of online texts connected to the TV programs. The representation of diasporic communities through and within multi-platformed spaces forms a crucial part of the way(s) in which collective identities are now being negotiated in children’s texts. An analysis of one of the ABC’s My Place “projects” and its associated multi-platformed content reveals an intricate relationship between postcolonial concerns and the construction of child citizenship. 
Multicultural Places, Multi-Platformed Spaces: New Media Innovation at the ABC The 2007 restructure at the ABC has transformed commissioning practices along the lines noted by James Bennett and Niki Strange of the BBC—a shift of focus from “programs” to multi-platform “projects,” with the latter consisting of a complex network of textual production. These “second shift media practices” (Caldwell) involve the tactical management of “user flows structured into and across the textual terrain that serve to promote a multifaceted and prolonged experience of the project” (Bennett and Strange 115). ABC Managing Director Mark Scott’s polemic deployment of the “digital commons” trope (Murdock, From) differs from that of his opposite number at the BBC, Mark Thompson, in its emphasis on the glocalised openness of the Australian “town square”—at once distinct from, and an integral part of, larger conversations. As announced at the beginning of the ABC’s 2009 annual report, the ABC is redefining the town square as a world of greater opportunities: a world where Australians can engage with one another and explore the ideas and events that are shaping our communities, our nation and beyond … where people can come to speak and be heard, to listen and learn from each other. (ABC ii)The broad emphasis on engagement characterises ABC3’s positioning of children in multi-platformed projects. As the Executive Producer of the ABC’s Children’s Television Multi-platform division comments, “participation is very much the mantra of the new channel” (Glen). The concept of “participation” is integral to what has been described elsewhere as “rehearsals in citizenship” (Northam). Writing of contemporary youth, David Buckingham notes that “‘political thinking’ is not merely an intellectual or developmental achievement, but an interpersonal process which is part of the construction of a collective, social identity” (179). Recent domestically produced children’s programs and their associated multimedia applications have significant potential to contribute to this interpersonal, “participatory” process. Through multi-platform experiences, children are (apparently) invited to construct narratives of their own. Dan Harries coined the term “viewser” to highlight the tension between watching and interacting, and the increased sense of agency on the part of audiences (171–82). Various online texts hosted by the ABC offer engagement with extra content relating to programs, with themed websites serving as “branches” of the overarching ABC3 metasite. The main site—strongly branded as the place for its targeted demographic—combines conventional television guide/program details with “Watch Now!,” a customised iView application within ABC3’s own themed interface; youth-oriented news; online gaming; and avenues for viewsers to create digital art and video, or interact with the community of “Club3” and associated message boards. The profiles created by members of Club3 are moderated and proscribe any personal information, resulting in an (understandably) restricted form of “networked publics” (boyd 124–5). Viewser profiles comprise only a username (which, the website stresses, should not be one’s real name) and an “avatar” (a customisable animated face). As in other social media sites, comments posted are accompanied by the viewser’s “name” and “face,” reinforcing the notion of individuality within the common group. The tool allows users to choose from various skin colours, emphasising the multicultural nature of the ABC3 community. 
Other customisable elements, including the ability to choose between dozens of pre-designed ABC3 assets and feeds, stress the audience’s “ownership” of the site. The Help instructions for the Club3 site stress the notion of “participation” directly: “Here at ABC3, we don’t want to tell you what your site should look like! We think that you should be able to choose for yourself.” Multi-platformed texts also provide viewsers with opportunities to interact with many of the characters (human actors and animated) from the television texts and share further aspects of their lives and fictional worlds. One example, linked to the representation of diasporic communities, is the Abatti Pizza Game, in which the player must “save the day” by battling obstacles to fulfil a pizza order. The game’s prefacing directions makes clear the ethnicity of the Abatti family, who are also visually distinctive. The dialogue also registers cultural markers: “Poor Nona, whatsa she gonna do? Now it’s up to you to help Johnny and his friends make four pizzas.” The game was acquired from the Canadian-animated franchise, Angela Anaconda; nonetheless, the Abatti family, the pizza store they operate and the dilemma they face translates easily to the Australian context. Dramatisations of diasporic contributions to national youth identities in postcolonial or settler societies—the UK (My Life as a Popat, CITV) and Canada (How to Be Indie)—also contribute to the diversity of ABC3’s television offerings and the positioning of its multi-platform community. The negotiation of diasporic and postcolonial politics is even clearer in the public broadcaster’s commitment to My Place. The project’s multifaceted construction of “places,” the ethical positioning of the child both as an individual and a member of (multicultural) communities, and the significant acknowledgement of ongoing conflict and discrimination, articulate a cultural commons that is more open-ended and challenging than the Eurocentric metaphor, the “town square,” suggests. Diversity, Discrimination and Diasporas: Positioning the Viewser of My Place Throughout the first series of My Place, the experiences of children within different diasporic communities are the focal point of five of the initial six episodes, the plots of which revolve around children with Lebanese, Vietnamese, Greek, and Irish backgrounds. This article focuses on an early episode of the series, “1988,” which explicitly confronts the cultural frictions between dominant Anglocentric Australian and diasporic communities. “1988” centres on the reaction of young Lily to the arrival of her cousin, Phuong, from Vietnam. Lily is a member of a diasporic community, but one who strongly identifies as “an Australian,” allowing a nuanced exploration of the ideological conflicts surrounding the issue of so-called “boat people.” The protagonist’s voice-over narration at the beginning of the episode foregrounds her desire to win Australia’s first Olympic gold medal in gymnastics, thus mobilising nationally identified hierarchies of value. Tensions between diasporic and settler cultures are frequently depicted. One potentially reactionary sequence portrays the recurring character of Michaelis complaining about having to use chopsticks in the Vietnamese restaurant; however, this comment is contextualised several episodes later, when a much younger Michaelis, as protagonist of the episode “1958,” is himself discriminated against, due to his Greek background. 
The political irony of “1988” pivots on Lily’s assumption that her cousin “won’t know Australian.” There is a patronising tone in her warning to Phuong not to speak Vietnamese for fear of schoolyard bullying: “The kids at school give you heaps if you talk funny. But it’s okay, I can talk for you!” This encourages child viewers to distance themselves from this fictional parallel to the frequent absence of representation of asylum seekers in contemporary debates. Lily’s assumptions and attitudes are treated with a degree of scepticism, particularly when she assures her friends that the silent Phuong will “get normal soon,” before objectifying her cousin for classroom “show and tell.” A close-up camera shot settles on Phuong’s unease while the children around her gossip about her status as a “boat person,” further encouraging the audience to empathise with the bullied character. However, Phuong turns the tables on those around her when she reveals she can competently speak English, is able to perform gymnastics and other feats beyond Lily’s ability, and even invents a story of being attacked by “pirates” in order to silence her gossiping peers. By the end of the narrative, Lily has redeemed herself and shares a close friendship with Phuong. My Place’s structured child “participation” plays a key role in developing the postcolonial perspective required by this episode and the project more broadly. Indeed, despite the record project budget, a second series was commissioned, at least partly on the basis of the overwhelmingly positive reception of viewsers on the ABC website forums (Buckland). The intricate My Place website, accessible through the ABC3 metasite, generates transmedia intertextuality interlocking with, and extending the diegesis of, the televised texts. A hyperlinked timeline leads to collections of personal artefacts “owned” by each protagonist, such as journals, toys, and clothing. Clicking on a gold medal marked “History” in Lily’s collection activates scrolling text describing the political acceptance of the phrase “multiculturalism” and the “Family Reunion” policy, which assisted the arrival of 100,000 Vietnamese immigrants. The viewser is reminded that some people were “not very welcoming” of diasporic groups via an explicit reference to Mrs Benson’s discriminatory attitudes in the series. Viewsers can “visit” virtual representations of the program’s sets. In the bedroom, kitchen, living room and/or backyard of each protagonist can be discovered familiar and additional details of the characters’ lives. The artefacts that can be “played” with in the multimedia applications often imply the enthusiastic (and apparently desirable) adoption of “Australianness” by immigrant children. Lily’s toys (her doll, hair accessories, roller skates, and glass marbles) invoke various aspects of western children’s culture, while her “journal entry” about Phuong states that she is “new to Australia but with her sense of humour she has fitted in really well.” At the same time, the interactive elements within Lily’s kitchen, including a bowl of rice and other Asian food ingredients, emphasise cultural continuity. The description of incense in another room of Lily’s house as a “common link” that is “used in many different cultures and religions for similar purposes” clearly normalises a glocalised world-view. 
Artefacts inside the restaurant operated by Lily’s mother link to information ranging from the ingredients and (flexible) instructions for how to make rice paper rolls (“Lily and Phuong used these fillings but you can use whatever you like!”) to a brief interactive puzzle game requiring the arrangement of several peppers in order from least hot to most hot. A selectable picture frame downloads a text box labelled “Images of Home.” Combined with a slideshow of static, hand-drawn images of traditional Vietnamese life, the text can be read as symbolic of the multiplicity of My Place’s target audience(s): “These images would have reminded the family of their homeland and also given restaurant customers a sense of Vietnamese culture.” The social-developmental, postcolonial agenda of My Place is registered in both “conventional” ancillary texts, such as the series’ “making of” publication (Wheatley), and the elaborate pedagogical website for teachers developed by the ACTF and Educational Services Australia (http://www.myplace.edu.au/). The politicising function of the latter is encoded in the various summaries of each decade’s historical, political, social, cultural, and technological highlights, often associated with the plot of the relevant episode. The page titled “Multiculturalism” reports on the positive amendments to the Commonwealth’s Migration Act 1958 and provides links to photographs of Vietnamese migrants in 1982, exemplifying the values of equality and cultural diversity through Lily and Phuong’s story. The detailed “Teaching Activities” documents available for each episode serve a similar purpose, providing, for example, the suggestion that teachers “ask students to discuss the importance to a new immigrant of retaining links to family, culture and tradition.” The empathetic positioning of Phuong’s situation is further mirrored in the interactive map available for teacher use that enables children to navigate a boat from Vietnam to the Australian coast, encouraging a perspective that is rarely put forward in Australia’s mass media. This is not to suggest that the My Place project is entirely unproblematic. In her postcolonial analysis of Aboriginal children’s literature, Clare Bradford argues that “it’s all too possible for ‘similarities’ to erase difference and the political significances of [a] text” (188). Lily’s schoolteacher’s lesson in the episode “reminds us that boat people have been coming to Australia for a very long time.” However, the implied connection between convicts and asylum seekers triggered by Phuong’s (mis)understanding awkwardly appropriates a mythologised Australian history. Similarly in the “1998” episode, the Muslim character Mohammad’s use of Ramadan for personal strength in order to emulate the iconic Australian cricketer Shane Warne threatens to subsume the “difference” of the diasporic community. Nonetheless, alongside the similarities between individuals and the various ethnic groups that make up the My Place community, important distinctions remain. 
Each episode begins and/or ends with the child protagonist(s) playing on or around the central motif of the series—a large fig tree—with the characters declaring that the tree is “my place.” While emphasising the importance of individuality in the project’s construction of child citizens, the cumulative effect of these “my place” sentiments, felt over time by characters from different socio-economic, ethnic, and cultural backgrounds, builds a multifaceted conception of Australian identity that consists of numerous (and complementary) “branches.” The project’s multi-platformed content further emphasises this, with the website containing an image of the prominent (literal and figurative) “Community Tree,” through which the viewser can interact with the generations of characters and families from the series (http://www.abc.net.au/abc3/myplace/). The significant role of the ABC’s My Place project showcases the ABC’s remit as a public broadcaster in the digital era. As Tim Brooke-Hunt, the Executive Head of Children’s Content, explains, if the ABC didn’t do it, no other broadcaster was going to come near it. ... I don’t expect My Place to be a humungous commercial or ratings success, but I firmly believe ... that it will be something that will exist for many years and will have a very special place. Conclusion The reversion to iconic aspects of mainstream Anglo-Australian culture is perhaps unsurprising—and certainly telling—when reflecting on the network of local, national, and global forces impacting on the development of a cultural commons. However, this does not detract from the value of the public broadcaster’s construction of child citizens within a clearly self-conscious discourse of “multiculturalism.” The transmedia intertextuality at work across ABC3 projects and platforms serves an important politicising function, offering positive representations of diasporic communities to counter the negative depictions children are exposed to elsewhere, and positioning child viewsers to “participate” in “working through” fraught issues of Australia’s past that still remain starkly relevant today.References ABC. Redefining the Town Square. ABC Annual Report. Sydney: ABC, 2009. Bennett, James, and Niki Strange. “The BBC’s Second-Shift Aesthetics: Interactive Television, Multi-Platform Projects and Public Service Content for a Digital Era.” Media International Australia: Incorporating Culture and Policy 126 (2008): 106-19. Born, Georgina. Uncertain Vision: Birt, Dyke and the Reinvention of the BBC. London: Vintage, 2004. boyd, danah. “Why Youth ♥ Social Network Sites: The Role of Networked Publics in Teenage Social Life.” Youth, Identity, and Digital Media. Ed. David Buckingham. Cambridge: MIT, 2008. 119-42. Bradford, Clare. Reading Race: Aboriginality in Australian Children’s Literature. Carlton: Melbourne UP, 2001. Brooke-Hunt, Tim. Executive Head of Children’s Content, ABC TV. Interviewed by Dr Leonie Rutherford, ABC Ultimo Center, 16 Mar. 2010. Buckingham, David. After the Death of Childhood: Growing Up in the Age of Electronic Media. Cambridge: Polity, 2000. Buckland, Jenny. Chief Executive Officer, Australian Children’s Television Foundation. Interviewed by Dr Leonie Rutherford and Dr Nina Weerakkody, ACTF, 2 June 2010. Caldwell, John T. “Second Shift Media Aesthetics: Programming, Interactivity and User Flows.” New Media: Theories and Practices of Digitextuality. Eds. John T. Caldwell and Anna Everett. London: Routledge, 2003. 127-44. Debrett, Mary. 
“Riding the Wave: Public Service Television in the Multiplatform Era.” Media, Culture & Society 31.5 (2009): 807-27. From, Unni. “Domestically Produced TV-Drama and Cultural Commons.” Cultural Dilemmas in Public Service Broadcasting. Eds. Gregory Ferrell Lowe and Per Jauert. Göteborg: Nordicom, 2005. 163-77. Glen, David. Executive Producer, ABC Multiplatform. Interviewed by Dr Leonie Rutherford, ABC Elsternwick, 6 July 2010. Harries, Dan. “Watching the Internet.” The New Media Book. Ed. Dan Harries. London: BFI, 2002. 171-82. Murdock, Graham. “Building the Digital Commons: Public Broadcasting in the Age of the Internet.” Cultural Dilemmas in Public Service Broadcasting. Eds. Gregory Ferrell Lowe and Per Jauert. Göteborg: Nordicom, 2005. 213-30. My Place, Volumes 1 & 2: 2008–1888. DVD. ABC, 2009. Northam, Jean A. “Rehearsals in Citizenship: BBC Stop-Motion Animation Programmes for Young Children.” Journal for Cultural Research 9.3 (2005): 245-63. Wheatley, Nadia. Making My Place. Sydney and Auckland: HarperCollins, 2010. ———, and Donna Rawlins. My Place. South Melbourne: Longman, 1988.
APA, Harvard, Vancouver, ISO, and other styles
33

Burgess, Jean, and Axel Bruns. "Twitter Archives and the Challenges of "Big Social Data" for Media and Communication Research." M/C Journal 15, no. 5 (2012). http://dx.doi.org/10.5204/mcj.561.

Full text
Abstract:
Lists and Social Media Lists have long been an ordering mechanism for computer-mediated social interaction. While far from being the first such mechanism, blogrolls offered an opportunity for bloggers to provide a list of their peers; the present generation of social media environments similarly provide lists of friends and followers. Where blogrolls and other earlier lists may have been user-generated, the social media lists of today are more likely to have been produced by the platforms themselves, and are of intrinsic value to the platform providers at least as much as to the users themselves; both Facebook and Twitter have highlighted the importance of their respective “social graphs” (their databases of user connections) as fundamental elements of their fledgling business models. This represents what Mejias describes as “nodocentrism,” which “renders all human interaction in terms of network dynamics (not just any network, but a digital network with a profit-driven infrastructure).” The communicative content of social media spaces is also frequently rendered in the form of lists. Famously, blogs are defined in the first place by their reverse-chronological listing of posts (Walker Rettberg), but the same is true for current social media platforms: Twitter, Facebook, and other social media platforms are inherently centred around an infinite, constantly updated and extended list of posts made by individual users and their connections. The concept of the list implies a certain degree of order, and the orderliness of content lists as provided through the latest generation of centralised social media platforms has also led to the development of more comprehensive and powerful, commercial as well as scholarly, research approaches to the study of social media. Using the example of Twitter, this article discusses the challenges of such “big data” research as it draws on the content lists provided by proprietary social media platforms. Twitter Archives for Research Twitter is a particularly useful source of social media data: using the Twitter API (the Application Programming Interface, which provides structured access to communication data in standardised formats) it is possible, with a little effort and sufficient technical resources, for researchers to gather very large archives of public tweets concerned with a particular topic, theme or event. Essentially, the API delivers very long lists of hundreds, thousands, or millions of tweets, and metadata about those tweets; such data can then be sliced, diced and visualised in a wide range of ways, in order to understand the dynamics of social media communication. Such research is frequently oriented around pre-existing research questions, but is typically conducted at unprecedented scale. The projects of media and communication researchers such as Papacharissi and de Fatima Oliveira, Wood and Baughman, or Lotan, et al.—to name just a handful of recent examples—rely fundamentally on Twitter datasets which now routinely comprise millions of tweets and associated metadata, collected according to a wide range of criteria. What is common to all such cases, however, is the need to make new methodological choices in the processing and analysis of such large datasets on mediated social interaction. Our own work is broadly concerned with understanding the role of social media in the contemporary media ecology, with a focus on the formation and dynamics of interest- and issues-based publics.
We have mined and analysed large archives of Twitter data to understand contemporary crisis communication (Bruns et al), the role of social media in elections (Burgess and Bruns), and the nature of contemporary audience engagement with television entertainment and news media (Harrington, Highfield, and Bruns). Using a custom installation of the open source Twitter archiving tool yourTwapperkeeper, we capture and archive all the available tweets (and their associated metadata) containing a specified keyword (like “Olympics” or “dubstep”), name (Gillard, Bieber, Obama) or hashtag (#ausvotes, #royalwedding, #qldfloods). In their simplest form, such Twitter archives are commonly stored as delimited (e.g. comma- or tab-separated) text files, with each of the following values in a separate column: text: contents of the tweet itself, in 140 characters or less; to_user_id: numerical ID of the tweet recipient (for @replies); from_user: screen name of the tweet sender; id: numerical ID of the tweet itself; from_user_id: numerical ID of the tweet sender; iso_language_code: code (e.g. en, de, fr, ...) of the sender’s default language; source: client software used to tweet (e.g. Web, Tweetdeck, ...); profile_image_url: URL of the tweet sender’s profile picture; geo_type: format of the sender’s geographical coordinates; geo_coordinates_0: first element of the geographical coordinates; geo_coordinates_1: second element of the geographical coordinates; created_at: tweet timestamp in human-readable format; time: tweet timestamp as a numerical Unix timestamp. In order to process the data, we typically run a number of our own scripts (written in the programming language Gawk) which manipulate or filter the records in various ways, and apply a series of temporal, qualitative and categorical metrics to the data, enabling us to discern patterns of activity over time, as well as to identify topics and themes, key actors, and the relations among them; in some circumstances we may also undertake further processes of filtering and close textual analysis of the content of the tweets. Network analysis (of the relationships among actors in a discussion; or among key themes) is undertaken using the open source application Gephi. While a detailed methodological discussion is beyond the scope of this article, further details and examples of our methods and tools for data analysis and visualisation, including copies of our Gawk scripts, are available on our comprehensive project website, Mapping Online Publics. In this article, we reflect on the technical, epistemological and political challenges of such uses of large-scale Twitter archives within media and communication studies research, positioning this work in the context of the phenomenon that Lev Manovich has called “big social data.” In doing so, we recognise that our empirical work on Twitter is concerned with a complex research site that is itself shaped by a complex range of human and non-human actors, within a dynamic, indeed volatile media ecology (Fuller), and using data collection and analysis methods that are in themselves deeply embedded in this ecology.
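To make the archive layout and the Gawk-based workflow described above more concrete, the following is a minimal illustrative sketch rather than one of the authors’ published scripts (those are available via Mapping Online Publics). It assumes a tab-separated archive laid out in the column order listed above, so that the third field holds from_user and the thirteenth field holds the numeric Unix timestamp; the file name archive.tsv is purely hypothetical.

# tweets_per_hour.awk -- illustrative sketch only, not the authors' published Gawk scripts.
# Assumes a tab-separated archive whose columns follow the order listed above,
# so $3 is from_user and $13 is the tweet's numeric Unix timestamp.
BEGIN { FS = "\t" }
NF >= 13 && $13 ~ /^[0-9]+$/ {
    hour = strftime("%Y-%m-%d %H:00", $13)   # gawk's strftime() buckets each tweet into an hourly interval
    per_hour[hour]++                         # temporal metric: tweets per hour
    per_user[$3]++                           # categorical metric: tweets per sender
}
END {
    for (h in per_hour) printf "hour\t%s\t%d\n", h, per_hour[h]
    for (u in per_user) printf "user\t%s\t%d\n", u, per_user[u]
}

Run with gawk -f tweets_per_hour.awk archive.tsv; sorting the “hour” lines gives a chronological timeline of activity, while the “user” lines can be ranked by their final column to surface the most active accounts.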
“Big Social Data” As Manovich’s term implies, the Big Data paradigm has recently arrived in media, communication and cultural studies—significantly later than it did in the hard sciences, in more traditionally computational branches of social science, and perhaps even in the first wave of digital humanities research (which largely applied computational methods to pre-existing, historical “big data” corpora)—and this shift has been provoked in large part by the dramatic quantitative growth and apparently increased cultural importance of social media—hence, “big social data.” As Manovich puts it: For the first time, we can follow [the] imaginations, opinions, ideas, and feelings of hundreds of millions of people. We can see the images and the videos they create and comment on, monitor the conversations they are engaged in, read their blog posts and tweets, navigate their maps, listen to their track lists, and follow their trajectories in physical space. (Manovich 461) This moment has arrived in media, communication and cultural studies because of the increased scale of social media participation and the textual traces that this participation leaves behind—allowing researchers, equipped with digital tools and methods, to “study social and cultural processes and dynamics in new ways” (Manovich 461). However, and crucially for our purposes in this article, many of these scholarly possibilities would remain latent if it were not for the widespread availability of Open APIs for social software (including social media) platforms. APIs are technical specifications of how one software application should access another, thereby allowing the embedding or cross-publishing of social content across Websites (so that your tweets can appear in your Facebook timeline, for example), or allowing third-party developers to build additional applications on social media platforms (like the Twitter user ranking service Klout), while also allowing platform owners to impose de facto regulation on such third-party uses via the same code. While platform providers do not necessarily have scholarship in mind, the data access affordances of APIs are also available for research purposes. As Manovich notes, until very recently almost all truly “big data” approaches to social media research had been undertaken by computer scientists (464). But as part of a broader “computational turn” in the digital humanities (Berry), and because of the increased availability to non-specialists of data access and analysis tools, media, communication and cultural studies scholars are beginning to catch up. Many of the new, large-scale research projects examining the societal uses and impacts of social media—including our own—which have been initiated by various media, communication, and cultural studies research leaders around the world have begun their work by taking stock of, and often substantially extending through new development, the range of available tools and methods for data analysis. The research infrastructure developed by such projects, therefore, now reflects their own disciplinary backgrounds at least as much as it does the fundamental principles of computer science. In turn, such new and often experimental tools and methods necessarily also provoke new epistemological and methodological challenges.
The Twitter API and Twitter Archives The Open API was a key aspect of mid-2000s ideas about the value of the open Web and “Web 2.0” business models (O’Reilly), emphasising the open, cross-platform sharing of content as well as promoting innovation at the margins via third-party application development—and it was in this ideological environment that the microblogging service Twitter launched and experienced rapid growth in popularity among users and developers alike. As José van Dijck cogently argues, however, a complex interplay of technical, economic and social dynamics has seen Twitter shift from a relatively open, ad hoc and user-centred platform toward a more formalised media business: For Twitter, the shift from being primarily a conversational communication tool to being a global, ad-supported followers tool took place in a relatively short time span. This shift did not simply result from the owner’s choice for a distinct business model or from the company’s decision to change hardware features. Instead, the proliferation of Twitter as a tool has been a complex process in which technological adjustments are intricately intertwined with changes in user base, transformations of content and choices for revenue models. (van Dijck 343) The specifications of Twitter’s API, as well as the written guidelines for its use by developers (Twitter, “Developer Rules”) are an excellent example of these “technological adjustments” and the ways they are deeply intertwined with Twitter’s search for a viable revenue model. These changes show how the apparent semantic openness or “interpretive flexibility” of the term “platform” allows its meaning to be reshaped over time as the business models of platform owners change (Gillespie). The release of the API was first announced on the Twitter blog in September 2006 (Stone), not long after the service’s launch but after some popular third-party applications (like a mashup of Twitter with Google Maps creating a dynamic display of recently posted tweets around the world) had already been developed. Since then Twitter has seen a flourishing of what the company itself referred to as the “Twitter ecosystem” (Twitter, “Developer Rules”), including third-party developed client software (like Twitterific and TweetDeck), institutional use cases (such as large-scale social media visualisations of the London Riots in The Guardian), and parasitic business models (including social media metrics services like HootSuite and Klout). While the history of Twitter’s API rules and related regulatory instruments (such as its Developer Rules of the Road and Terms of Use) has many twists and turns, there have been two particularly important recent controversies around data access and control. First, the company locked out developers and researchers from direct “firehose” (very high volume) access to the Twitter feed; this was accompanied by a crackdown on free and public Twitter archiving services like 140Kit and the Web version of Twapperkeeper (Sample), and coincided with the establishment of what was at the time a monopoly content licensing arrangement between Twitter and Gnip, a company which charges commercial rates for high-volume API access to tweets (and content from other social media platforms). A second wave of controversy among the developer community occurred in August 2012 in response to Twitter’s release of its latest API rules (Sippey), which introduce further, significant limits to API use and usability in certain circumstances.
In essence, the result of these changes to the Twitter API rules, announced without meaningful consultation with the developer community which created the Twitter ecosystem, is a forced rebalancing of development activities: on the one hand, Twitter is explicitly seeking to “limit” (Sippey) the further development of API-based third-party tools which support “consumer engagement activities” (such as end-user clients), in order to boost the use of its own end-user interfaces; on the other hand, it aims to “encourage” the further development of “consumer analytics” and “business analytics” as well as “business engagement” tools. Implicit in these changes is a repositioning of Twitter users (increasingly as content consumers rather than active communicators), but also of commercial and academic researchers investigating the uses of Twitter (as providing a narrow range of existing Twitter “analytics” rather than engaging in a more comprehensive investigation both of how Twitter is used, and of how such uses continue to evolve). The changes represent an attempt by the company to cement a certain, commercially viable and valuable, vision of how Twitter should be used (and analysed), and to prevent or at least delay further evolution beyond this desired stage. Although such attempts to “freeze” development may well be in vain, given the considerable, documented role which the Twitter user base has historically played in exploring new and unforeseen uses of Twitter (Bruns), it undermines scholarly research efforts to examine actual Twitter uses at least temporarily—meaning that researchers are increasingly forced to invest time and resources in finding workarounds for the new restrictions imposed by the Twitter API. Technical, Political, and Epistemological Issues In their recent article “Critical Questions for Big Data,” danah boyd and Kate Crawford have drawn our attention to the limitations, politics and ethics of big data approaches in the social sciences more broadly, but also touching on social media as a particularly prevalent site of social datamining. In response, we offer the following complementary points specifically related to data-driven Twitter research relying on archives of tweets gathered using the Twitter API. First, somewhat differently from most digital humanities (where researchers often begin with a large pre-existing textual corpus), in the case of Twitter research we have no access to an original set of texts—we can access only what Twitter’s proprietary and frequently changing API will provide. The tools Twitter researchers use rely on various combinations of parts of the Twitter API—or, more accurately, the various Twitter APIs (particularly the Search and Streaming APIs). As discussed above, of course, in providing an API, Twitter is driven not by scholarly concerns but by an attempt to serve a range of potentially value-generating end-users—particularly those with whom Twitter can create business-to-business relationships, as in their recent exclusive partnership with NBC in covering the 2012 London Olympics. The following section from Twitter’s own developer FAQ highlights the potential conflicts between the business-case usage scenarios under which the APIs are provided and the actual uses to which they are often put by academic researchers or other dataminers: Twitter’s search is optimized to serve relevant tweets to end-users in response to direct, non-recurring queries such as #hashtags, URLs, domains, and keywords.
The Search API (which also powers Twitter’s search widget) is an interface to this search engine. Our search service is not meant to be an exhaustive archive of public tweets and not all tweets are indexed or returned. Some results are refined to better combat spam and increase relevance. Due to capacity constraints, the index currently only covers about a week’s worth of tweets. (Twitter, “Frequently Asked Questions”) Because external researchers do not have access to the full, “raw” data, against which we could compare the retrieved archives which we use in our later analyses, and because our data access regimes rely so heavily on Twitter’s APIs—each with its technical quirks and limitations—it is impossible for us to say with any certainty that we are capturing a complete archive or even a “representative” sample (whatever “representative” might mean in a data-driven, textualist paradigm). In other words, the “lists” of tweets delivered to us on the basis of a keyword search are not necessarily complete; and there is no way of knowing how incomplete they are. The total yield of even the most robust capture system (using the Streaming API and not relying only on Search) depends on a number of variables: rate limiting, the filtering and spam-limiting functions of Twitter’s search algorithm, server outages and so on; further, because Twitter prohibits the sharing of data sets it is difficult to compare notes with other research teams. In terms of epistemology, too, the primary reliance on large datasets produces a new mode of scholarship in media, communication and cultural studies: what emerges is a form of data-driven research which tends towards abductive reasoning; in doing so, it highlights tensions between the traditional research questions in discourse or text-based disciplines like media and communication studies, and the assumptions and modes of pattern recognition that are required when working from the “inside out” of a corpus, rather than from the outside in (for an extended discussion of these epistemological issues in the digital humanities more generally, see Dixon). Finally, even the heuristics of our analyses of Twitter datasets are mediated by the API: the datapoints that are hardwired into the data naturally become the most salient, further shaping the type of analysis that can be done. For example, a common process in our research is to use the syntax of tweets to categorise each tweet as one of the following types of activity: original tweets: tweets which are neither @reply nor retweet; retweets: tweets which contain RT @user… (or similar); unedited retweets: retweets which start with RT @user…; edited retweets: retweets which do not start with RT @user…; genuine @replies: tweets which contain @user, but are not retweets; URL sharing: tweets which contain URLs. (Retweets which are made using the Twitter “retweet button,” resulting in verbatim passing-along without the RT @user syntax or an opportunity to add further comment during the retweet process, form yet another category, which cannot be tracked particularly effectively using the Twitter API.) These categories are driven by the textual and technical markers of specific kinds of interactions that are built into the syntax of Twitter itself (@replies or @mentions, RTs); and specific modes of referentiality (URLs).
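As a rough illustration of how these syntax-driven categories can be operationalised, the sketch below applies the same kind of pattern matching in Gawk. It is a simplified approximation written for this summary rather than the authors’ own categorisation scripts, and it again assumes that the tweet text sits in the first column of a tab-separated archive.

# classify_tweets.awk -- simplified approximation of the syntax-based categories above;
# not the authors' own code. Assumes the tweet text is held in column 1.
BEGIN { FS = "\t" }
{
    text = $1
    if      (text ~ /^RT @[A-Za-z0-9_]+/) counts["unedited retweet"]++
    else if (text ~ /RT @[A-Za-z0-9_]+/)  counts["edited retweet"]++
    else if (text ~ /@[A-Za-z0-9_]+/)     counts["genuine @reply or @mention"]++
    else                                  counts["original tweet"]++
    if (text ~ /https?:\/\//)             counts["URL sharing"]++   # overlaps with the other categories by design
}
END {
    for (c in counts) printf "%-30s %d\n", c, counts[c]
}

As the abstract notes, retweets made with the retweet button leave no such textual marker, so these syntax-based categories remain approximations shaped by what the API exposes.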
All of them focus on (and thereby tend to privilege) more informational modes of communication, rather than the ephemeral, affective, or ambiently intimate uses of Twitter that can be illuminated more easily using ethnographic approaches: approaches that can actually focus on the individual user, their social contexts, and the broader cultural context of the traces they leave on Twitter. Conclusions In this article we have described and reflected on some of the sociotechnical, political and economic aspects of the lists of tweets—the structured Twitter data upon which our research relies—which may be gathered using the Twitter API. As we have argued elsewhere (Bruns and Burgess)—and, hopefully, have begun to demonstrate in this paper—media and communication studies scholars who are actually engaged in using computational methods are well-positioned to contribute to both the methodological advances we highlight at the beginning of this paper and the political debates around computational methods in the “big social data” moment on which the discussion in the second part of the paper focusses. One pressing issue in the area of methodology is to build on current advances to bring together large-scale datamining approaches with ethnographic and other qualitative approaches, especially including close textual analysis. More broadly, in engaging with the “big social data” moment there is a pressing need for the development of code literacy in media, communication and cultural studies. In the first place, such literacy has important instrumental uses: as Manovich argues, much big data research in the humanities requires costly and time-consuming (and sometimes alienating) partnerships with technical experts (typically, computer scientists), because the free tools available to non-programmers are still limited in utility in comparison to what can be achieved using raw data and original code (Manovich 472). But code literacy is also a requirement of scholarly rigour in the context of what David Berry calls the “computational turn,” representing a “third wave” of Digital Humanities. Berry suggests code and software might increasingly become in themselves objects of, and not only tools for, research: I suggest that we introduce a humanistic approach to the subject of computer code, paying attention to the wider aspects of code and software, and connecting them to the materiality of this growing digital world. With this in mind, the question of code becomes increasingly important for understanding in the digital humanities, and serves as a condition of possibility for the many new computational forms that mediate our experience of contemporary culture and society. (Berry 17) A first step here lies in developing a more robust working knowledge of the conceptual models and methodological priorities assumed by the workings of both the tools and the sources we use for “big social data” research. Understanding how something like the Twitter API mediates the cultures of use of the platform, as well as reflexively engaging with its mediating role in data-driven Twitter research, promotes a much more materialist critical understanding of the politics of the social media platforms (Gillespie) that are now such powerful actors in the media ecology. References Berry, David M. “Introduction: Understanding Digital Humanities.” Understanding Digital Humanities. Ed. David M. Berry. London: Palgrave Macmillan, 2012. 1-20. boyd, danah, and Kate Crawford.
“Critical Questions for Big Data.” Information, Communication & Society 15.5 (2012): 662-79. Bruns, Axel. “Ad Hoc Innovation by Users of Social Networks: The Case of Twitter.” ZSI Discussion Paper 16 (2012). 18 Sep. 2012 ‹https://www.zsi.at/object/publication/2186›. Bruns, Axel, and Jean Burgess. “Notes towards the Scientific Study of Public Communication on Twitter.” Keynote presented at the Conference on Science and the Internet, Düsseldorf, 4 Aug. 2012. 18 Sep. 2012 ‹http://snurb.info/files/2012/Notes%20towards%20the%20Scientific%20Study%20of%20Public%20Communication%20on%20Twitter.pdf›. Bruns, Axel, Jean Burgess, Kate Crawford, and Frances Shaw. “#qldfloods and @QPSMedia: Crisis Communication on Twitter in the 2011 South East Queensland Floods.” Brisbane: ARC Centre of Excellence for Creative Industries and Innovation, 2012. 18 Sep. 2012 ‹http://cci.edu.au/floodsreport.pdf›. Burgess, Jean E., and Axel Bruns. “(Not) the Twitter Election: The Dynamics of the #ausvotes Conversation in Relation to the Australian Media Ecology.” Journalism Practice 6.3 (2012): 384-402. Dixon, Dan. “Analysis Tool Or Research Methodology: Is There an Epistemology for Patterns?” Understanding Digital Humanities. Ed. David M. Berry. London: Palgrave Macmillan, 2012. 191-209. Fuller, Matthew. Media Ecologies: Materialist Energies in Art and Technoculture. Cambridge, Mass.: MIT P, 2005. Gillespie, Tarleton. “The Politics of ‘Platforms’.” New Media & Society 12.3 (2010): 347-64. Harrington, Stephen, Timothy J. Highfield, and Axel Bruns. “More than a Backchannel: Twitter and Television.” Audience Interactivity and Participation. Ed. José Manuel Noguera. Brussels: COST Action ISO906 Transforming Audiences, Transforming Societies, 2012. 13-17. 18 Sep. 2012 ‹http://www.cost-transforming-audiences.eu/system/files/essays-and-interview-essays-18-06-12.pdf›. Lotan, Gilad, Erhardt Graeff, Mike Ananny, Devin Gaffney, Ian Pearce, and danah boyd. “The Arab Spring: The Revolutions Were Tweeted: Information Flows during the 2011 Tunisian and Egyptian Revolutions.” International Journal of Communication 5 (2011): 1375-1405. 18 Sep. 2012 ‹http://ijoc.org/ojs/index.php/ijoc/article/view/1246/613›. Manovich, Lev. “Trending: The Promises and the Challenges of Big Social Data.” Debates in the Digital Humanities. Ed. Matthew K. Gold. Minneapolis: U of Minnesota P, 2012. 460-75. Mejias, Ulises A. “Liberation Technology and the Arab Spring: From Utopia to Atopia and Beyond.” Fibreculture Journal 20 (2012). 18 Sep. 2012 ‹http://twenty.fibreculturejournal.org/2012/06/20/fcj-147-liberation-technology-and-the-arab-spring-from-utopia-to-atopia-and-beyond/›. O’Reilly, Tim. “What is Web 2.0? Design Patterns and Business Models for the Next Generation of Software.” O’Reilly Network 30 Sep. 2005. 18 Sep. 2012 ‹http://www.oreillynet.com/pub/a/oreilly/tim/news/2005/09/30/what-is-web-20.html›. Papacharissi, Zizi, and Maria de Fatima Oliveira. “Affective News and Networked Publics: The Rhythms of News Storytelling on #Egypt.” Journal of Communication 62.2 (2012): 266-82. Sample, Mark. “The End of Twapperkeeper (and What to Do about It).” ProfHacker. The Chronicle of Higher Education 8 Mar. 2011. 18 Sep. 2012 ‹http://chronicle.com/blogs/profhacker/the-end-of-twapperkeeper-and-what-to-do-about-it/31582›. Sippey, Michael. “Changes Coming in Version 1.1 of the Twitter API.” 16 Aug. 2012. Twitter Developers Blog. 18 Sep. 2012 ‹https://dev.Twitter.com/blog/changes-coming-to-Twitter-api›. Stone, Biz. “Introducing the Twitter API.” Twitter Blog 20 Sep. 2006.
18 Sep. 2012 ‹http://blog.Twitter.com/2006/09/introducing-Twitter-api.html›. Twitter. “Developer Rules of the Road.” Twitter Developers Website 17 May 2012. 18 Sep. 2012 ‹https://dev.Twitter.com/terms/api-terms›. Twitter. “Frequently Asked Questions.” 18 Sep. 2012 ‹https://dev.twitter.com/docs/faq›. Van Dijck, José. “Tracing Twitter: The Rise of a Microblogging Platform.” International Journal of Media and Cultural Politics 7.3 (2011): 333-48. Walker Rettberg, Jill. Blogging. Cambridge: Polity, 2008. Wood, Megan M., and Linda Baughman. “Glee Fandom and Twitter: Something New, or More of the Same Old Thing?” Communication Studies 63.3 (2012): 328-44.
APA, Harvard, Vancouver, ISO, and other styles
34

Maddox, Alexia, and Luke J. Heemsbergen. "Digging in Crypto-Communities’ Future-Making." M/C Journal 24, no. 2 (2021). http://dx.doi.org/10.5204/mcj.2755.

Full text
Abstract:
Introduction This article situates the dark as a liminal and creative space of experimentation where tensions are generative and people tinker with emerging technologies to create alternative futures. Darkness need not mean chaos and fear of violence – it can mean privacy and protection. We define dark as an experimental space based upon uncertainties rather than computational knowns (Bridle) and then demonstrate via a case study of cryptocurrencies the contribution of dark and liminal social spaces to future(s)-making. Cryptocurrencies are digital cash systems that use decentralised (peer-to-peer) networking to enable irreversible payments (Maurer, Nelms, and Swartz). Cryptocurrencies are often clones or variations on the ‘original’ Bitcoin payment systems protocol (Trump et al.) that was shared with the cryptographic community through a pseudonymous and still unknown author(s) (Nakamoto), creating a founder mystery. Due to the open creation process, a new cryptocurrency is relatively easy to make. However, many of them are based on speculative bubbles that mirror Bitcoin, Ethereum, and ICOs’ wealth creation. Examples of cryptocurrencies now largely used for speculation due to their volatility in holding value are rampant, with online clearing houses competing to trade hundreds of different assets from AAVE to ZIL. Many of these altcoins have little to no following or trading volume, leading to their obsolescence. Others enjoy immense popularity among dedicated communities of backers and investors. Consequently, while many cryptocurrency experiments fail or lack adoption and drop from the purview of history, their constant variation also contributes to the undertow of the future that pulls against more visible surface waves of computational progress. The article is structured to first define how we understand and leverage ‘dark’ against computational cultures. We then apply thematic and analytical tactics to articulate future-making socio-technical experiments in the dark. Based on past empirical work of the authors (Maddox "Netnography") we focus on crypto-cultures’ complex emancipatory and normative tensions via themes of construction, disruption, contention, redirection, obsolescence, and iteration. Through these themes we illustrate the mutation and absorption of dark experimental spaces into larger social structures. The themes we identify are not meant as a complete or necessarily serial set of occurrences, but nonetheless contribute a new vocabulary for students of technology and media to see into and grapple with the dark. Embracing the Dark: Prework & Analytical Tactics for Outside the Known To frame discussion of the dark here as creative space for alternative futures, we focus on scholars who have deeply engaged with notions of socio-technical darkness. This allows us to explore outside the blinders of computational light and, with a nod to Sassen, dig in the shadows of known categories to evolve the analytical tactics required for the study of emerging socio-technical conditions. We understand the Dark Web to usher shifting and multiple definitions of darkness, from a moral darkness to a technical one (Gehl). From this work, we draw the observation of how technologies that obfuscate digital tracking create novel capacities for digital cultures in spaces defined by anonymity for both publisher and user. Darknets accomplish this by overlaying open internet protocols (e.g. TCP/IP) with non-standard protocols that encrypt and anonymise information (Pace). 
Pace traces concepts of darknets to networks in the 1970s that were ‘insulated’ from the internet’s predecessor ARPANET by air gap, and then reemerged as software protocols similarly insulated from cultural norms around intellectual property. ‘Darknets’ can also be considered in ternary as opposed to binary terms (Gehl and McKelvey) that push to make private that which is supposed to be public infrastructure, and push private platforms (e.g. a Personal Computer) to make public networks via common bandwidth. In this way, darknets feed new possibilities of communication from both common infrastructures and individuals’ platforms. Enabling new potentials of community online and out of sight serves to signal what the dark accomplishes for the social when measured against an otherwise unending light of computational society. To this point, a new dark age can be welcomed insofar as it allows an undecided future outside of computational logics that continually define and refine the possible and probable (Bridle). This argument takes von Neumann’s 1945 declaration that “all stable processes we shall predict. All unstable processes we shall control” (in Bridle 21) as a founding statement for computational thought and indicative of current society. The hope expressed by Bridle is not an absence of knowledge, but an absence of knowing the future. Past the computational prison of total information awareness within an accelerating information age (Castells) is the promise of new formations of as yet unknowable life. Thus, from Bridle’s perspective, and ours, darkness can be a place of freedom and possibility, where the equality of being in the dark, together, is not as threatening as current privileged ways of thinking would suggest (Bridle 15). The consequences of living in a constant glaring light lead to data hierarchies “leaching” (Bridle) into everything, including social relationships, where our data are relationalised while our relations are datafied (Maddox and Heemsbergen) by enforcing computational thinking upon them. Darkness becomes a refuge that acknowledges the power of unknowing, and a return to potential for social, equitable, and reciprocal relations. This is not to say that we envision a utopian life without the shadow of hierarchy, but rather an encouragement to dig into those shadows made visible only by the brightest of lights. The idea of digging in the shadows is borrowed from Saskia Sassen, who asks us to consider the ‘master categories’ that blind us to alternatives. According to Sassen (402), while master categories have the power to illuminate, their blinding power keeps us from seeing other presences in the landscape: “they produce, then, a vast penumbra around that center of light. It is in that penumbra that we need to go digging”. We see darkness in the age of digital ubiquity as rejecting the blinding ‘master category’ of computational thought. Computational thought defines social/economic/political life via what is static enough to predict or unstable enough to render a need to control. Otherwise, the observable, computable, knowable, and possible all follow in line. Our dig in the shadows posits a penumbra of protocols – both of computational code and human practice – that circle the blinding light of known digital communications. We use the remainder of this short article to describe these themes found in the dark that offer new ways to understand the movements and moments of potential futures that remain largely unseen.
Thematic Resonances in the Dark This section considers cryptocultures of the dark. We build from a thematic vocabulary that has been previously introduced from empirical examples of the crypto-market communities which tinker with and through the darkness provided by encryption and privacy technologies (Maddox "Netnography"). Here we refine these future-making themes through their application to events surrounding community-generated technology aimed at disrupting centralised banking systems: cryptocurrencies (Maddox, Singh, et al.). Given the overlaps in collective values and technologies between crypto-communities, we find it useful to test the relevance of these themes to the experimental dynamics surrounding cryptocurrencies. We unpack these dynamics as construction, rupture and disruption, redirection, and the flip-sided relationship between obsolescence and iteration leading to mutation and absorption. This section provides a working example for how these themes adapt in application to a community dwelling at the edge of experimental technological possibilities. The theme of construction is both a beginning and a materialisation of a value field. It originates within the cyberlibertarians’ ideological stance towards using technological innovations to ‘create a new world in the shell of the old’ (van de Sande) which has been previously expressed through the concept of constructive activism (Maddox, Barratt, et al.). This libertarian ideology is also to be found in the early cultures that gave rise to cryptocurrencies. Through their interest in the potential of cryptography technologies related to social and political change, the Cypherpunks mailing list formed in 1992 (Swartz). The socio-cultural field surrounding cryptocurrencies, however, has always consisted of a diverse ecosystem of vested interests building collaborations from “goldbugs, hippies, anarchists, cyberpunks, cryptographers, payment systems experts, currency activists, commodity traders, and the curious” (Maurer, Nelms, and Swartz 262). Through the theme of construction we can consider architectures of collaboration, cooperation, and coordination developed by technically savvy populations. Cryptocurrencies are often developed as code by teams who build in mechanisms for issuance (e.g. ‘mining’) and other controls (Conway). Thus, construction and making of cryptocurrencies tend to be collective yet decentralised. Cryptocurrencies arose during a time of increasing levels of distrust in governments and global financial instability from the Global Financial Crisis (2008-2013), whilst gaining traction through their usefulness in engaging in illicit trade (Saiedi, Broström, and Ruiz). It was through this rupture in the certainties of ‘the old system’ that this technology, and the community developing it, sought to disrupt the financial system (Maddox, Singh, et al.; Nelms et al.). Here we see the utility of the second theme of rupture and disruption to illustrate creative experimentation in the liminal and emergent spaces cryptocurrencies afford. While current crypto crazes (e.g. NFTs, ICOs) have their detractors, Cohen suggests, somewhat ironically, that the momentum for change of the crypto current was “driven by the grassroots, and technologically empowered, movement to confront the ills perceived to be powered and exacerbated by market-based capitalism, such as climate change and income inequality” (Cohen 739). 
Here we can start to envision how subterranean currents that emerge from creative experimentations in the dark impact global social forces in multifaceted ways – even as they are dragged into the light. Within a disrupted environment characterised by rupture, contention and redirection are rife (Maddox "Disrupting"). Contention and redirection illustrate how competing agendas bump and grind to create a generative tension around a deep collective desire for social change. Contention often emerges within an environment of hacks and scams, of which there are many stories in the cryptocurrency world (see Bartlett for an example of OneCoin, for instance; Kavanagh, Miscione, and Ennis). Other aspects of contention emerge around how the technology works to produce (mint) cryptocurrencies, including concern over the environmental impact of producing cryptocurrencies (Goodkind, Jones, and Berrens) and the production of non-fungible tokens for the sale of digital assets (Howson). Contention also arises through the gendered social dynamics of brogramming culture skewing inclusive and diverse engagement (Bowles). Shifting from the ideal of inclusion to the actual practice of crypto-communities begs the question of whose futures are being made. Contention and redirections are also evidenced by ‘hard forks’ in cryptocurrency. The founder mystery resulted in the gifting of this technology to a decentralised and leaderless community, materialised through the distributed consensus processes to approve software updates to a cryptocurrency. This consensus system consequently holds within it the seeds for governance failures (Trump et al.), the first of which occurred with the ‘hard forking’ of Bitcoin into Bitcoin Cash in 2017 (Webb). Hard forks occur when developers and miners no longer agree on a proposed change to the software: one group upgraded to the new software while the others operated on the old rules. The resulting two separate blockchains and digital currencies concretised the tensions and disagreements within the community. This forking resulted initially in a shock to the market value of, and trust in, the Bitcoin network, and the dilution of adoption networks across the two cryptocurrencies. The ongoing hard forks of Bitcoin Cash illustrate the continued contention occurring within the community as crypto-personalities pit against each other (Hankin; Li). As these examples show, not all experiments in cryptocurrencies are successful; some become obsolete through iteration (Arnold). Iteration engenders mutations in the cultural framing of socio-technical experiments. These mutations of meaning and signification then facilitate their absorption into novel futures, showing the ternary nature of how what happens in the dark works with what is known by the light. As a rhetorical device, cryptocurrencies have been referred to as a currency (a payment system) or a commodity (an investment or speculation vehicle; Nelms et al. 21). However, new potential applications for the underlying technologies continue to emerge. For example, Ethereum, the second-most dominant cryptocurrency after Bitcoin, now offers smart contract technology (decentralised autonomous organisations, DAO; Kavanagh, Miscione, and Ennis) and is iterating technology to dramatically reduce the energy consumption required to mine and mint the non-fungible tokens (NFTs) associated with crypto art (Wintermeyer).
Here we can see how these rhetorical framings may represent iterative shifts and meaning-mutation that is as pragmatic as it is cultural. While we have considered here the themes of obsolescence and iteration threaded through the technological differentiations amongst cryptocurrencies, what should we make of these rhetorical or cultural mutations? This cultural mutation, we argue, can be seen most clearly in the resurgence of Dogecoin. Dogecoin is a cryptocurrency launched in 2013 that takes its name and logo from a Shiba Inu meme that was popular several years ago (Potts and Berg). We can consider Dogecoin as a playful infrastructure (Rennie) and cultural product that was initially designed to provide a low bar for entry into the market. Its affordability is kept in place by the ability for miners to mint an unlimited number of coins. Dogecoin had a large resurgence of value and interest just after the meme-centric Reddit community Wallstreetbets managed to drive the share price of video game retailer GameStop to gain 1,500% (Potts and Berg). In this instance we see the mutation of a cryptocurrency into memecoin, or cultural product, for which the value is a prism to the wild fluctuations of internet culture itself, linking cultural bubbles to financial ones. In this case, technologies iterated in the dark mutated and surfaced as cultural bubbles through playful infrastructures that intersected with financial systems. The story of Dogecoin illustrates how cultural mutation articulates the absorption of emerging techno-potentials into larger structures. Conclusion From creative experiments digging in the dark shadows of global socio-economic forces, we can see how the future is formed beneath the surface of computational light. Yet as we write, cryptocurrencies are being absorbed by centralising and powerful entities to integrate them into global economies. Examples of large institutions hoarding Bitcoin include the crypto-counterbalancing between the Chinese state through its digital currency DCEP (Vincent) and Facebook through the Libra project. Vincent observes that the state-backed DCEP project is the antithesis of the decentralised community agenda for cryptocurrencies to enact the separation of state and money. Meanwhile, Facebook’s centralised computational control of platforms used by 2.8 billion humans provides a similarly perverse addition to cryptocurrency cultures. The penumbra fades as computational logic shifts its gaze. Our thematic exploration of cryptocurrencies highlights that it is only in their emergent forms that such radical creative experiments can dwell in the dark. They do not stay in the dark forever, as their absorption into larger systems becomes part of the future-making process. The cold, inextricable, and always impending computational logic of the current age suffocates creative experimentations that flourish in the dark. Therefore, it is crucial to tend to the uncertainties within the warm, damp, and dark liminal spaces of socio-technical experimentation. References Arnold, Michael. "On the Phenomenology of Technology: The 'Janus-Faces' of Mobile Phones." Information and Organization 13.4 (2003): 231-56. Bartlett, Jamie. "Missing Cryptoqueen: Why Did the FCA Drop Its Warning about the Onecoin Scam?" BBC News 11 Aug. 2020. 19 Feb. 2021 <https://www.bbc.com/news/technology-53721017>. Bowles, Nellie. "Women in Cryptocurrencies Push Back against ‘Blockchain Bros’." New York Times 25 Feb. 2018. 21 Apr.
2021 <https://www.nytimes.com/2018/02/25/business/cryptocurrency-women-blockchain-bros.html>. Bridle, James. New Dark Age: Technology, Knowledge and the End of the Future. London: Verso, 2018. Castells, Manuel. The Information Age: Economy, Society and Culture. 2nd ed. Oxford: Blackwell, 2000. Cohen, Boyd. "The Rise of Alternative Currencies in Post-Capitalism." Journal of Management Studies 54.5 (2017): 739-46. Conway, Luke. "The 10 Most Important Cryptocurrencies Other than Bitcoin." Investopedia Jan. 2021. 19 Feb. 2021 <https://www.investopedia.com/tech/most-important-cryptocurrencies-other-than-bitcoin/>. Gehl, Robert, and Fenwick McKelvey. "Bugging Out: Darknets as Parasites of Large-Scale Media Objects." Media, Culture & Society 41.2 (2019): 219-35. Goodkind, Andrew L., Benjamin A. Jones, and Robert P. Berrens. "Cryptodamages: Monetary Value Estimates of the Air Pollution and Human Health Impacts of Cryptocurrency Mining." Energy Research & Social Science 59 (2020): 101281. Hankin, Aaron. "What You Need to Know about the Bitcoin Cash ‘Hard Fork’." MarketWatch 13 Nov. 2018. 21 Apr. 2021 <https://www.marketwatch.com/story/what-you-need-to-know-about-the-bitcoin-cash-hard-fork-2018-11-13>. Howson, Peter. "NFTs: Why Digital Art Has Such a Massive Carbon Footprint." The Conversation April 2021. 21 Apr. 2021 <https://theconversation.com/nfts-why-digital-art-has-such-a-massive-carbon-footprint-158077>. Kavanagh, Donncha, Gianluca Miscione, and Paul J. Ennis. "The Bitcoin Game: Ethno-Resonance as Method." Organization (2019): 1-20. Li, Shine. "Bitcoin Cash (Bch) Hard Forks into Two New Blockchains Following Disagreement on Miner Tax." Blockchain.News Nov. 2020. 19 Feb. 2021 <https://blockchain.news/news/bitcoin-cash-bch-hard-forks-two-new-blockchains-disagreement-on-miner-tax>. Maddox, Alexia. "Disrupting the Ethnographic Imaginarium: Challenges of Immersion in the Silk Road Cryptomarket Community." Journal of Digital Social Research 2.1 (2020): 31-51. ———. "Netnography to Uncover Cryptomarkets." Netnography Unlimited: Understanding Technoculture Using Qualitative Social Media Research. Eds. Rossella Gambetti and Robert V. Kozinets. London: Routledge, 2021: 3-23. Maddox, Alexia, Monica J. Barratt, Matthew Allen, and Simon Lenton. "Constructive Activism in the Dark Web: Cryptomarkets and Illicit Drugs in the Digital ‘Demimonde’." Information Communication and Society 19.1 (2016): 111-26. Maddox, Alexia, and Luke Heemsbergen. "The Electrified Social: A Policing and Politics of the Dark." Continuum (forthcoming). Maddox, Alexia, Supriya Singh, Heather Horst, and Greg Adamson. "An Ethnography of Bitcoin: Towards a Future Research Agenda." Australian Journal of Telecommunications and the Digital Economy 4.1 (2016): 65-78. Maurer, Bill, Taylor C. Nelms, and Lana Swartz. "'When Perhaps the Real Problem Is Money Itself!': The Practical Materiality of Bitcoin." Social Semiotics 23.2 (2013): 261-77. Nakamoto, Satoshi. "Bitcoin: A Peer-to-Peer Electronic Cash System." Bitcoin.org 2008. 21 Apr. 2021 <https://bitcoin.org/bitcoin.pdf>. Nelms, Taylor C., et al. "Social Payments: Innovation, Trust, Bitcoin, and the Sharing Economy." Theory, Culture & Society 35.3 (2018): 13-33. Pace, Jonathan. "Exchange Relations on the Dark Web." Critical Studies in Media Communication 34.1 (2017): 1-13. Potts, Jason, and Chris Berg. "After Gamestop, the Rise of Dogecoin Shows Us How Memes Can Move Market." The Conversation Feb. 2021. 21 Apr. 
2021 <https://theconversation.com/after-gamestop-the-rise-of-dogecoin-shows-us-how-memes-can-move-markets-154470>. Rennie, Ellie. "The Governance of Degenerates Part II: Into the Liquidityborg." Medium Nov. 2020. 21 Apr. 2021 <https://ellierennie.medium.com/the-governance-of-degenerates-part-ii-into-the-liquidityborg-463889fc4d82>. Saiedi, Ed, Anders Broström, and Felipe Ruiz. "Global Drivers of Cryptocurrency Infrastructure Adoption." Small Business Economics (Mar. 2020). Sassen, Saskia. "Digging in the Penumbra of Master Categories." British Journal of Sociology 56.3 (2005): 401-03. Swartz, Lana. "What Was Bitcoin, What Will It Be? The Techno-Economic Imaginaries of a New Money Technology." Cultural Studies 32.4 (2018): 623-50. Trump, Benjamin D., et al. "Cryptocurrency: Governance for What Was Meant to Be Ungovernable." Environment Systems and Decisions 38.3 (2018): 426-30. Van de Sande, Mathijs. "Fighting with Tools: Prefiguration and Radical Politics in the Twenty-First Century." Rethinking Marxism 27.2 (2015): 177-94. Vincent, Danny. "'One Day Everyone Will Use China's Digital Currency'." BBC News Sep. 2020. 19 Feb. 2021 <https://www.bbc.com/news/business-54261382>. Webb, Nick. "A Fork in the Blockchain: Income Tax and the Bitcoin/Bitcoin Cash Hard Fork." North Carolina Journal of Law & Technology 19.4 (2018): 283-311. Wintermeyer, Lawrence. "Climate-Positive Crypto Art: The Next Big Thing or NFT Overreach." Forbes 19 Mar. 2021. 21 Apr. 2021 <https://www.forbes.com/sites/lawrencewintermeyer/2021/03/19/climate-positive-crypto-art-the-next-big-thing-or-nft-overreach/>.
APA, Harvard, Vancouver, ISO, and other styles
35

Hagen, Sal. "“Trump Shit Goes into Overdrive”: Tracing Trump on 4chan/pol/." M/C Journal 23, no. 3 (2020). http://dx.doi.org/10.5204/mcj.1657.

Full text
Abstract:
Content warning: although it was kept to a minimum, this text displays instances of (anti-Semitic) hate speech. During the 2016 U.S. election and its aftermath, multiple journalistic accounts reported on “alt-right trolls” emanating from anonymous online spaces like the imageboard 4chan (e.g. Abramson; Ellis). Having gained infamy for its nihilist trolling subcultures (Phillips, This Is Why) and the loose hacktivist movement Anonymous (Coleman), 4chan now drew headlines because of the alt-right’s “genuinely new” concoction of white supremacy, ironic Internet humour, and a lack of clear leadership (Hawley 50). The alt-right “anons”, as imageboard users call themselves, were said to primarily manifest on the “Politically Incorrect” subforum of 4chan: /pol/. Gradually, a sentiment arose in the titles of several news articles that the pro-Trump “alt-right trolls” had successfully won the metapolitical battle intertwined with the elections (Phillips, Oxygen 5). For instance, articles titled that “trolls” were “The Only True Winners of this Election” (Dewey) or even “Plotting a GOP Takeover” (Stuart).The headlines were as enticing as questionable. As trolling-expert Whitney Phillips headlined herself, the alt-right did not attain political gravity solely through its own efforts but rather was “Conjured Out of Pearl Clutching and Media Attention” (“The Alt-Right”), with news outlets being provoked to criticise, debunk, or sensationalise its trolling activities (Faris et al. 131; Phillips, “Oxygen” 5-6). Even with the right intentions, attempts at denouncement through using vague, structuralist notions–from “alt-right” and “trolls” to “the basket of deplorables” (Robertson) – arguably only strengthened the coherence of those it was meant to disavow (Phillips, Oxygen; Phillips et al.; Marantz). Phillips et al. therefore lamented such generalisations, arguing attributing Trump’s win to vague notions of “4chan”, “alt-right”, or “trolls” actually bestowed an “atemporal, almost godlike power” to what was actually an “ever-reactive anonymous online collective”. Therefore, they called to refrain from making claims about opaque spaces like 4chan without first “plotting the landscape” and “safeguarding the actual record”. Indeed, “when it comes to 4chan and Anonymous”, Phillips et al. warned, “nobody steps in the same river twice”.This text answers the call to map anonymous online groups by engaging with the complexity of testing the muddy waters of the ever-changing and dissimulative 4chan-current. It first argues how anti-structuralist research outlooks can answer to many of the pitfalls arising from this complex task. Afterwards, it traces the word trump as it was used on 4chan/pol/ to problematise some of the above-mentioned media narratives. How did anons consider Trump, and how did the /pol/-current change during the build-up of the 2016 U.S. elections and afterwards?On Researching Masked and Dissimulative ExtremistsWhile potentially playing into the self-imagination of malicious actors (Phillips et al.), the frequent appearance of overblown narratives on 4chan is unsurprising considering the peculiar affordances of imageboards. Imageboards are anonymous – no user account is required to post – and ephemeral – posts are deleted after a certain amount of activity, sometimes after days, sometimes after minutes (Bernstein et al.; Hagen). 
These affordances complicate studying collectives on imageboards, with the primary reasons being that 1) they prevent insights into user demographics, 2) they afford particularly dissimulative, playful discourse that can rarely be taken at face value (Auerbach; de Zeeuw and Tuters), and 3) the sheer volume of auto-deleted activity means one has to stay up-to-date with a rapid waterfall of subcultural ephemera. Additionally, the person stepping into the muddy waters of the chan-river also changes their gaze over time. For instance, Phillips bravely narrates how she once saw parts of the 4chan-stream as “fun” to only later realise the blatantly racist elements present from the start (“It Wasn’t Just”).To help render legible the changing currents of imageboard activity without relying on vague understandings of the “alt-right”, “trolls”, or “Anonymous”, anti-structuralist research outlooks form a possible answer. Around 1900, sociologists like Gabriel Tarde already argued to refrain from departing from structuralist notions of society and instead let social compositions arise through iterative tracing of minute imitations (11). As described in Bruno Latour’s Reassembling the Social, actor-network theory (ANT) revitalises the Tardean outlook by similarly criticising the notion of the “social” and “society” as distinct, sui-generis entities. Instead, ANT advocates tracing “flat” networks of agency made up of both human and non-human actors (165-72). By tracing actors and describing the emerging network of heterogeneous mediators and intermediaries (105), one can slowly but surely get a sense of collective life. ANT thus takes a page from ethnomethodology, which advocates a similar mapping of how participants of a group produce themselves as such (Garfinkel).For multiple reasons, anti-structuralist approaches like ANT can be useful in tracing elusive anonymous online groups and their changing compositions. First, instead of grasping collectives on imageboards from the outset through structuralist notions, as networked individuals, or as “amorphous and formless entities” (see e.g. Coleman 113-5), it only derives its composition after following where its actors lead. This can result in an empirical and literally objective mapping of their collectivity while refraining from mystifications and non-existent connections–so often present in popular narratives about “trolls” and the “alt-right”. At the same time, it allows prominent self-imaginations and mythologizations – or, in ANT-parlance, “localisations of the global” (Latour 173-190) – rise to the surface whenever they form important actors, which, as we will see, tends to happen on 4chan.Second, ANT offers a useful lens with which to consider how non-human actors can uphold a sense of collectivity within anonymous imageboards. This can include digital objects as part of the infrastructure–e.g. the automatically assigned post numbers having mythical value on 4chan (Beran, It Came From 69)–but also cultural objects like words or memes. Considering 4chan’s anonymity, this focus on objects instead of individuals is partly a necessity: one cannot know the exact amount and flow of users. Still, as this text seeks to show, non-human actors like words or memes can form suitable actors to map the changing collectivity of anonymous imageboard users in the absence of demographic insights.There are a few pitfalls worth noting when conducting ANT-informed research into extremist spaces like 4chan/pol/. 
The aforementioned ironic and dissimulative rhetoric of anonymous forum culture (de Zeeuw and Tuters) means tracing is complicated by implicit (yet omnipresent) intertextual references undecipherable to the untrained eye. Even worse, when misread or exaggerated, such tracing efforts can play into trolling tactics. This can in turn risk what Phillips calls “giving oxygen” to bigoted narratives by amplifying their presence (“Oxygen”). Since ANT does not prescribe what sort of description is needed (Latour 149), this exposure can be limited and/or critically engaged with by the researcher. Still, it is inevitable that research on extremist collectives adds at least some garbage to already polluted information ecologies (Phillips and Milner 2020), even when “just” letting the actors speak (Venturini). Indeed, this text will unfortunately also show hate speech terms below. These complications of irony and amplification can be somewhat mitigated by mixing ethnographic involvement with computational methods. Together, they can render implicit references explicit while also mapping broad patterns in imitation and preventing singular (misleading) actors from over-dominating the description. When done well, such descriptions do not only have to amplify but can also marginalise and trivialise. An accurate mapping can thereby counter sensationalist media narratives, as long as that is where the actors lead. It is because of this potentiality that anti-structuralist tracing of extremist, dissimulative online groups should not be discarded outright. Stopping Momentarily to Test the Waters To put the above into practice, what follows is a brief case study on the term trump on 4chan/pol/. Instead of following users, here the actor trump is taken as an entry point for tracing various assemblages: not only referring to Donald J. Trump as an individual and his actions, but also to how /pol/-anons imagine themselves in relation to Trump. In this way, the actor trump is a fluid one: each of its iterations contains different boundaries and variants of its environment (de Laet and Mol 252). By following these environments, can we make sense of how the delirious 2016 U.S. election cycle played out on /pol/, a space described as the “skeleton key to the rise of Trump” (Beran, 4chan)? To trace trump, I use the 4plebs.com archive, containing almost all posts made on /pol/ between late-2013 and early 2018 (the time of research). I subsequently use two text mining methods to trace various connections between trump and other actors and use this to highlight specific posts. As Latour et al. note, computational methods allow “navigations” (593) of different data points to ensure diverse empirical perspectives, preventing both structuralist “zoomed-out” views and local contexts from over-dominating. Instead of moving between micro and macro views, such a navigation should therefore be understood as a “circulation” around the data, deploying various perspectives that each assemble the actors in a different way. In following this, the case study aims to demonstrate how, instead of a lengthy ethnographic account, a brief navigation using both qualitative and quantitative perspectives can quickly demystify some aspects of seemingly nebulous online groups. Tracing trump: From Meme-Wizard to Anti-Semitic Target To get a sense of the centrality of Trump on /pol/, I start with post frequencies of trump assembled in two ways. 
The first (Figure 1) shows how, soon after the announcement of Trump’s presidential bid on 16 June 2015, around 100,000 comments mention the word (2% of the total amount of posts). The frequencies spike to a staggering 8% of all comments during the build-up to Trump’s win of the Republican nomination in early 2016 and presidential election in November 2016. Figure 1: The absolute and relative amount of posts on 4chan/pol/ containing the word trump (prefixes and suffixes allowed).To follow the traces between trump and the more general discourse surrounding it, I compiled a more general “trump-dense threads” dataset. These are threads containing thirty or more posts, with at least 15% of posts mentioning trump. As Figure 2 shows, at the two peaks, 8% of any thread on /pol/ was trump-dense, accounting for approximately 15,000 monthly threads. While Trump’s presence is unsurprising, these two views show just how incredibly central the former businessman was to /pol/ at the time of the 2016 U.S. election. Figure 2: The absolute and relative amount of threads on 4chan/pol/ that are “trump-dense”, meaning they have thirty comments or more, out of which at least 15% contain the word trump (prefixes and suffixes allowed).Instead of picking a certain moment from these aggregate overviews and moving to the “micro” (Latour et al.), I “circulate” further with Figure 3, showing another perspective on the trump­-dense thread dataset. It shows a scatter plot of trump-dense threads grouped per week and plotted according to how similar their vocabulary is. First, all the words per week are weighted with tf-idf, a common information retrieval algorithm that scores units on the basis if they appear a lot in one of the datasets but not in others (Spärck-Jones). The document sets are then plotted according to the similarity of their weighted vocabulary (cosine similarity). The five highest-scoring terms for the five clusters (identified with K-means) are listed in the bottom-right corner. For legibility, the scatterplot is compressed by the MDS algorithm. To get a better sense of specific vocabulary per week, terms that appeared in all weeks are filtered out (like trump or hillary). Read counterclockwise, the nodes roughly increase in time, thus showing a clear temporal change of discourse, with the first clusters being more similar in vocabulary than the last, and the weeks before and after the primary election (orange cluster) showing a clear gap. Figure 3: A scatterplot showing cosine distances between tf-idf weighted vocabularies of trump-dense threads per week. Compressed with MDS and coloured by five K-means clusters on the underlying tf-idf matrix (excluding terms that appeared in all weeks). Legend shows the top five tf-idf terms within these clusters. ★ denotes the median week in the cluster.With this map, we can trace other words appearing around trump as significant actors in the weekly documents. For instance, Trump-supportive words like stump (referring to “Can’t Stump the Trump”) and maga (“Make America Great Again”) are highly ranked in the first two clusters. In later weeks, less clearly pro-Trump terms appear: drumpf reminds of the unattractive root of the Trump family name, while impeached and mueller show the Russia probe in 2017 and 2018 were significant in the trump-dense threads of that time. This change might thus hint at growing scepticism towards Trump after his win, but it is not shown how these terms are used. 
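A minimal sketch of the kind of pipeline described above — tf-idf weighting of weekly thread text, cosine distances, K-means clustering, and MDS compression — might look as follows in Python with scikit-learn. This is not the author's code; the `weekly_texts` dictionary, mapping week labels to the concatenated text of that week's trump-dense threads, is a hypothetical stand-in for the 4plebs data.

```python
# Sketch only: cluster and project weekly "trump-dense" thread text,
# assuming `weekly_texts` maps week labels to concatenated thread text.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.manifold import MDS
from sklearn.metrics.pairwise import cosine_similarity


def weekly_scatter_data(weekly_texts, n_clusters=5, top_n=5):
    weeks = list(weekly_texts)
    # max_df < 1.0 drops terms occurring in (almost) every week, echoing the
    # exclusion of ubiquitous words such as "trump" or "hillary".
    vectoriser = TfidfVectorizer(max_df=0.95)
    tfidf = vectoriser.fit_transform([weekly_texts[w] for w in weeks])

    # Cluster the weeks on their weighted vocabularies.
    labels = KMeans(n_clusters=n_clusters, random_state=0).fit_predict(tfidf)

    # Cosine distances between weekly vocabularies, compressed to 2-D for plotting.
    distances = 1 - cosine_similarity(tfidf)
    coords = MDS(n_components=2, dissimilarity="precomputed",
                 random_state=0).fit_transform(distances)

    # Highest-scoring tf-idf terms per cluster, for the scatterplot legend.
    terms = np.array(vectoriser.get_feature_names_out())
    top_terms = {}
    for cluster in range(n_clusters):
        centroid = np.asarray(tfidf[labels == cluster].mean(axis=0)).ravel()
        top_terms[cluster] = terms[centroid.argsort()[::-1][:top_n]].tolist()

    return weeks, coords, labels, top_terms
```

The exact weighting, filtering, and plotting choices in the article may well differ; the sketch is only meant to make the navigation between aggregate and local views easier to follow.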
Fortunately, the scatterplot offers a rudder with which to navigate to further perspectives. In keeping with Latour’s advice to keep “aggregate structures” and “local contexts” flat (165-72), I contrast the above scatterplot with a perspective on the data that keeps sentence structures intact instead of showing abstracted keyword sets. Figure 4 uses all posts mentioning trump in the median weeks of the first and last clusters in the scatterplot (indicated with ★) and visualises word trees (Wattenberg and Viégas) of the most frequent words following “trump is a”. As such, they render explicit ontological associations about Trump; what is Trump, according to /pol/-anons? The first word tree shows posts from 2-8 November 2015, when fifteen Republican competitors were still in the race. As we have seen in Figure 1, Trump was in this month still “only” mentioned in around 50,000 posts (2% of the total). This word tree suggests his eventual nomination was at this point seen as an unlikely and even undesirable scenario, showing derogatory associations like retard and failure, as well as more conspiratorial words like shill, fraud, hillary plant, and hillary clinton puppet. Notably, the most prominent association, meme, and others like joke and fucking comic relief, imply Trump was not taken too seriously (see also Figure 5). Figure 4: Word trees of words following “trump is a” in the median weeks of the first and last clusters of the scatterplot. Made with Jason Davies’s Word Tree application. Figure 5: Anons who did not take Trump seriously. Screencapture taken from archive.4plebs.org (see post 1 and post 2 in context). The first word tree contrasts dramatically with the one from the last median week, from 18 to 24 December 2017. Here, most associations are anti-Semitic or otherwise related to Judaism, with trump most prominently related to the hate speech term kike. This prompts several questions: did /pol/ become increasingly anti-Semitic? Did already active users radicalise, or were more anti-Semites drawn to /pol/? Or was this nefarious current always there, with Trump merely drawing anti-Semitic attention after he won the election? Although the navigation did not depart from a particular critical framework, by “just following the actors” (Venturini), it already stumbled upon important questions related to popular narratives on 4chan and the alt-right. While it is tempting to stop here and explain the change as “radicalisation”, the navigation should continue to add more empirical perspectives. When doing so, the more plausible explanation is that the unlikely success of Trump briefly attracted (relatively) more diverse and playful visitors to /pol/, obscuring the presence and steady growth of overt extremists in the process. To unpack this, I first focus on the claim that a (relatively) diverse set of users flocked to /pol/ because of the Trump campaign. /pol/’s overall posting activity rose sharply during the 2016 election, which can point to already active users becoming more active, but is likely mostly caused by new users flocking to /pol/. Indeed, this can be traced in actor language. For instance, many anons professed to be “reporting in” from other 4chan boards during crucial moments in the campaign. One of the longest threads in the trump-dense threads dataset (4,504 posts) simply announces “Cruz drops out”. In the comments below, multiple anons state they arrived from other boards to join the Trump-infused activity. 
For instance, Figure 6 shows an anon replying “/v/ REPORTING IN”, to which sixty other users reacted by similarly affirming themselves as representatives from other boards (e.g. “/mu/ here. Ready to MAGA”). While but another particular view, this implies Trump’s surprising nomination stimulated a crowd-like gathering of different anons jumping into the vortex of trump-related activity on /pol/. Figure 6: Replies by outside-anons “reporting in” the sticky thread announcing Ted Cruz's drop out, 4 May 2016. Screenshots taken from 4plebs.org (see post 1 and post 2 in context).Other actor-language further expresses Trump’s campaign “drew in” new and unadjusted (or: less extreme) users. Notably, many anons claimed the 2016 election led to an “invasion of Reddit users”. Figure 7 shows one such expression: an annotated timeline of /pol/’s posting activity graph (made by 4plebs), posted to /pol/ on 26 February 2016 and subsequently reposted 34 times. It interprets 2016 as a period where “Trump shit goes into overdrive, meme shit floods /pol/, /pol/ is now reddit”. Whether these claims hold any truth is difficult to establish, but the image forms an interesting case of how the entirety “/pol/” is imagined and locally articulated. Such simplistic narratives relate to what Latour calls “panoramas”: totalising notions of some imagined “whole” (188-90) that, while not to be “confused with the collective”, form crucial data since they express how actors understand their own composition (190). Especially in the volatile conditions of anonymous and ephemeral imageboards, repeated panoramic narratives can help in constructing a sense of cohesion–and thereby also form interesting actors to trace. Indeed, following the panoramic statement “/pol/ is now reddit”, other gatekeeping-efforts are not hard to find. For instance, phrases urging other anons to go “back to reddit” (occurring in 19,069 posts in the total dataset) or “back to The_Donald” (a popular pro-Trump subreddit, 1,940 posts) are also particularly popular in the dataset. Figure 7: An image circulated on /pol/ lamenting that "/pol/ is now reddit" by annotating 4plebs’s posting metrics. Screenshot taken from archive.4plebs.org (see posts).Did trump-related activity on /pol/ indeed become more “meme-y” or “Reddit-like” during the election cycle, as the above panorama articulates? The activity in the trump-dense threads seems to suggest so. Figure 8 again uses the tf-idf terms from these threads, but here with the columns denoting the weeks and the rows the top scoring tf-idf terms of their respective week. To highlight relevant actors, all terms are greyed out (see the unedited sheet here), except for several keywords that indicate particularly playful or memetic vernacular: the aforementioned stump, emperor, referring to Trump’s nickname as “God Emperor”; energy, referring to “high energy”, a common catchphrase amongst Trump supporters; magic, referring to “meme magic”, the faux-ironic belief that posting memes affects real-life events; and pepe, the infamous cartoon frog. In both the tf-idf ranking and the absolute frequencies, these keywords flourish in 2016, but disappear soon after the presidential election passes. The later weeks in 2017 and 2018 rarely contain similarly playful and memetic terms, and if they do, suggest mocking discourse regarding Trump (e.g. drumpf). This perspective thus pictures the environment around trump in the run-up to the election as a particularly memetic yet short-lived carnival. 
At least from this perspective, “meme shit” thus indeed seemed to have “flooded /pol/”, but only for a short while. Figure 8: tf-idf matrix of trump-dense threads, columns denoting weeks and rows denoting the top hundred most relevant terms per week. Download the full tf-idf matrix with all terms here. Despite this carnivalesque activity, further perspectives suggest it did not go at the expense of extremist activity on /pol/. Figure 9 shows the absolute and relative counts of the word "jew" and its derogatory synonym "kike". Each of these increases from 2015 onwards. As such, it seems to align with claims that Trump’s success and /pol/ becoming increasingly extremist were causally related (Thompson). However, apart from possibly confusing correlation with causation, the relative presence remains fairly stable, even slightly decreasing during the frenzy of the Trump campaign. Since we also saw Trump himself become a target for anti-Semitic activity, these trendlines rather imply that /pol/’s extremist current grew proportionally to the overall increase in activity, and increased alongside, but not necessarily as a partisan contingent resulting from, Trump’s campaign. Figure 9: The absolute and relative frequency of the terms "jew" and "kike" on 4chan/pol/. Conclusion Combined, the above navigation implies two main changes in 4chan/pol/’s trump-related current. First, the climaxes of the 2016 Republican primaries and presidential elections seem to have invoked crowd-like influxes of (relatively) heterogeneous users joining the Trump-delirium, marked by particularly memetic activity. Second, /pol/ additionally seemed to have formed a welcoming hotbed for anti-Semites and other extremists, as the absolute amount of (anti-Semitic) hate speech increased. However, while already-present and new users might have been energised by Trump, they were not necessarily loyal to him, as professed by the fact that Trump himself eventually became a target. Together with the fact that anti-Semitic hate speech stayed relatively consistent, instead of being “countercultural” (Nagle) or exclusively pro-Trump, /pol/ thus seems to have been composed of quite a stable anti-Semitic and Trump-critical contingent, increasing proportionally to /pol/’s general growth. Methodologically, this text sought to demonstrate how a brief navigation of trump on 4chan/pol/ can provide provisional yet valuable insights regarding the continuously changing currents of online anonymous collectives. As the cliché goes, however, this brief exploration has left many more questions, or rather, it did not “deploy the content with all its connections” (Latour 147). For instance, I have not touched on how many of the trump-dense threads are distinctly separated and pro-Trump “general threads” (Jokubauskaitė and Peeters). Considering the vastness of such tasks, the necessity remains to find appropriate ways to “accurately map” the wild currents of the dissimulative Web–despite how muddy they might get. Note This text is a compressed and edited version of a longer MA thesis available here. References Abramson, Seth. “Listen Up, Progressives: Here’s How to Deal with a 4Chan (“Alt-Right”) Troll.” Medium, 2 May 2017. <https://medium.com/@Seth_Abramson/listen-up-progressives-heres-how-to-deal-with-a-4chan-alt-right-troll-48594f59a303>. Auerbach, David. “Anonymity as Culture: Treatise.” Triple Canopy, n.d. 22 June 2020 <https://www.canopycanopycanopy.com/contents/anonymity_as_culture__treatise>. Beran, Dale. “4chan: The Skeleton Key to the Rise of Trump”. 
Medium, 14 Feb. 2017. <https://medium.com/@DaleBeran/4chan-the-skeleton-key-to-the-rise-of-trump-624e7cb798cb>.Beran, Dale. It Came from Something Awful: How a Toxic Troll Army Accidentally Memed Donald Trump into Office. New York: All Points Books, 2019.Bernstein, Michael S, Andrés Monroy-Hernández, Drew Harry, Paul André, Katrina Panovich, and Greg Vargas. “4chan and /b/: An Analysis of Anonymity and Ephemerality in a Large Online Community.” Proceedings of the Fifth International AAAI Conference on Weblogs and Social Media, 2011.Coleman, Gabriella. Hacker, Hoaxer, Whistleblower, Spy: The Many Faces of Anonymous. London: Verso Books, 2014.De Laet, Marianne, and Annemarie Mol. “The Zimbabwe Bush Pump: Mechanics of a Fluid Technology.” Social Studies of Science 30.2 (2000): 225–263. 1 May 2020 <https://journals.sagepub.com/doi/10.1177/030631200030002002>. De Zeeuw, Daniel, and Marc Tuters. “Teh Internet Is Serious Business: On the Deep Vernacular Web Imaginary.” Cultural Politics 16.2 (2020).Dewey, Caitlin. “The Only True Winners of this Election are Trolls.” The Washington Post, 3 Nov. 2016. <https://www.washingtonpost.com/news/the-intersect/wp/2016/11/03/the-only-true-winners-of-this-election-are-trolls/>.Faris, Robert, Hal Roberts, Bruce Etling, Nikki Bourassa, Ethan Zuckerman, and Yochai Benkler. “Partisanship, Propaganda, and Disinformation: Online Media and the 2016 U.S. Presidential Election.” Berkman Klein Center Research Publication, 2017. <http://nrs.harvard.edu/urn-3:HUL.InstRepos:33759251>.Garfinkel, Harold. Studies in Ethnomethodology. New Jersey: Prentice-Hall, 1967.Hagen, Sal. “Rendering Legible the Ephemerality of 4chan/pol/.” OILab.eu, 12 Apr. 2020. <https://oilab.eu/rendering-legible-the-ephemerality-of-4chanpol/>.Hawley, George. Making Sense of the Alt-Right. New York: Columbia UP, 2017.Jokubauskaitė, Emilija, and Stijn Peeters. “Generally Curious: Thematically Distinct Datasets of General Threads on 4chan/Pol/”. Proceedings of the International AAAI Conference on Web and Social Media 14.1 (2020): 863-7. <https://www.aaai.org/ojs/index.php/ICWSM/article/view/7351>.Latour, Bruno. Reassembling the Social: An Introduction to Actor-Network Theory. New York: Oxford UP, 2005.Latour, Bruno, Pablo Jensen, Tommaso Venturini, Sébastian Grauwin, and Dominique Boullier. “‘The Whole Is Always Smaller than Its Parts’. A Digital Test of Gabriel Tarde’s Monads.” British Journal of Sociology 63.4 (2012): 590-615.Marantz, Andrew. Antisocial: Online Extremists, Techno-Utopians, and the Hijacking of the American Conversation. New York: Penguin Random House, 2019.Nagle, Angela. Kill All Normies: Online Culture Wars from 4chan and Tumblr to Trump and the White House. Winchester: Zero Books, 2017.Phillips, Whitney. This Is Why We Can’t Have Nice Things: Mapping the Relationship between Online Trolling and Mainstream Culture. Cambridge: MIT Press, 2015.———. “The Alt-Right Was Conjured Out of Pearl Clutching and Media Attention.” Motherboard, 12 Oct. 2016 <https://www.vice.com/en_us/article/jpgaeb/conjuring-the-alt-right>.———. “The Oxygen of Amplification: Better Practices for Reporting on Extremists, Antagonists, and Manipulators Online.” Data & Society, 2018. <https://datasociety.net/wp-content/uploads/2018/05/1_PART_1_Oxygen_of_Amplification_DS.pdf>.———. “It Wasn’t Just the Trolls: Early Internet Culture, ‘Fun,’ and the Fires of Exclusionary Laughter.” Social Media + Society (2019). 
<https://journals.sagepub.com/doi/10.1177/2056305119849493>.Phillips, Whitney, Gabriella Coleman, and Jessica Beyer. “Trolling Scholars Debunk the Idea That the Alt-Right’s Shitposters Have Magic Powers.” Motherboard, 22 Mar. 2017. <https://motherboard.vice.com/en_us/article/z4k549/trolling-scholars-debunk-the-idea-that-the-alt-rights-trolls-have-magic-powers>.Robertson, Adi. “Hillary Clinton Exposing Pepe the Frog Is the Death of Explainers.” The Verge, 15 Sep. 2016. <https://www.theverge.com/2016/9/15/12926976/hillary-clinton-trump-pepe-the-frog-alt-right-explainer>.Spärck Jones, Karen. “A Statistical Interpretation of Term Specificity and its Application in Retrieval.” Journal of Documentation 28.1 (1972): 11-21.Stuart, Tessa. “Inside the DeploraBall: The Trump-Loving Trolls Plotting a GOP Takeover.” Rolling Stone, 20 Jan. 2017. <https://www.rollingstone.com/politics/politics-features/inside-the-deploraball-the-trump-loving-trolls-plotting-a-gop-takeover-128128/>.Tarde, Gabriel. The Laws of Imitation. Ed. and trans. Elsie Clews Parsons. New York: Henry Holt and Company, 1903.Thompson, Andrew. “The Measure of Hate on 4chan.” Rolling Stone, 10 May 2018. <https://www.rollingstone.com/politics/politics-news/the-measure-of-hate-on-4chan-627922/>.Venturini, Tommaso. “Diving in Magma: How to Explore Controversies with Actor-Network Theory.” Public Understanding of Science 19.3 (2010): 258-273.Wattenberg, Martin, and Fernanda Viégas. “The Word Tree, an Interactive Visual Concordance.” IEEE Transactions on Visualization and Computer Graphics 14.6 (2008): 1221-1228.
APA, Harvard, Vancouver, ISO, and other styles
36

Lotti, Laura. "DIY Cheese-making and Individuation: Towards a Reconfiguration of Taste in Contemporary Computer Culture." M/C Journal 17, no. 1 (2014). http://dx.doi.org/10.5204/mcj.757.

Full text
Abstract:
Introduction The trope of food is often used in the humanities to discuss aspects of a culture that are customarily overlooked by a textualist approach, for food embodies a kind of knowledge that comes from the direct engagement with materials and processes, and involves taste as an aesthetics that exceeds the visual concept of the “beautiful.” Moreover, cooking is one of the most ancient cultural practices, and is considered the habit that defines us as humans in comparison to other animals—not only culturally, but also physiologically (Wrangham). Today we have entered a post-human age in which technological augmentations, while promoting the erasure of embodiment in favour of intelligence (Hayles), create new assemblages between the organic and the digital, thus redefining what it means to be human. In this context, a reassessment of the practice of cooking as the manipulation of what constitutes food—both for thought and for the body—may promote a more nuanced approach to contemporary culture, in which the agency of the non-human (from synthetic materials to the digital) affects our modes of being and reflects on our aesthetic sensibility. In the 1980s, Guy Debord observed that the food industry's standardisation and automation of methods of production and consumption have anaesthetised the consumer palate with broader political and cultural implications. Today the Internet has extended the intertwinement of food and technology to the social and aesthetic spheres, thus further impacting on taste. For instance, cultural trends such as “foodism” and “slow food” thrive on blogs and social networks and, while promoting an artisanal style in food preparation and presentation, they paradoxically may also homogenise cooking techniques and the experience of sharing a meal. This leads to questions regarding the extent to which the digitalisation of culture might be hindering our capacity to taste. Or, given the new possibilities for connectivity, can this digitalisation also foster an aesthetic sensibility associated with different attitudes and approaches to food—one that transgresses both the grand narratives and the standardisation promoted by such gastronomic fashions? It also leads to the question of how such activities reflect on the collective sphere, considering the contagious character of networked communication. While foodism thrives online, the Internet has nevertheless prompted a renewed interest in DIY (do-it-yourself) cooking techniques. As a recent issue of M/C Journal testifies, today cookbooks are produced and consulted at an unprecedented rate—either in print or online (Brien and Wessell). Taking the example of the online diffusion of DIY cheese-making recipes, I will below trace the connections between cooking, computer culture, and taste with the support of Gilbert Simondon's metaphysics of technics. Although Simondon never extensively discussed food in relation to technology, the positioning of technicity at the heart of culture allows his work to be used to address the multifaceted nature of taste in the light of recent technological development, in particular of the Network. As a matter of fact, today cooking is not only a technical activity, in the sense that it requires a certain practical and theoretical skilfulness—it is also a technological matter, for the amount of networked machines that are increasingly used for food production and marketing. 
Specifically, this paper argues that by disentangling the human—albeit partially—from the capitalist cycle of production-marketing-consumption and by triggering an awareness of the increasingly dominant role technology plays in food processing and manufacturing, the online sharing of home-cooking advice may promote a reconfiguration of taste, which would translate into a more nuanced approach to contemporary techno-culture. In the first part of this discussion, I introduce Simondon’s philosophy and foreground the technical dimension of cooking by discussing cheese-making as a process of individuation. In the second, I focus on Simondon’s definition of technical objects and technical ensembles to position Internet culture in relation to cooking, and highlight how technicity folds back on taste as aesthetic impression. Ultimately, I conclude with some reflections on how such a culinary-aesthetic approach may find application in other techno-cultural fields by promoting an aesthetic sensibility that extends beyond the experience of the “social” to encompass an ethical component. Cooking as Individuation: The Networked Dimension of Taste Simondon is known as the thinker, and “tinkerer”, of technics. His project is concerned with ontogenesis—that is, the becoming of objects in relation to the terms that constitute them as individual. Simondon’s philosophy of individuation allows for a better understanding of how the Internet fosters certain attitudes to food, for it is grounded on a notion of “energetic materiality in movement” (Deleuze and Guattari 408) that explains how “immaterial” algorithms can affect individual experience and cultural production. For Simondon, individuation is the process that arises from objects being out-of-phase with themselves. Put differently, individuation allows for “the conservation of being through becoming” (Genesis 301). Likewise, individualisation is “the individuation of an individuated being, resulting from an individuation, [and creating] a new structuration within the individual” (L’Individuation 132). Individuation and individualisation are processes common to all kinds of being. Any individual operates an internal and an external resonance within the system in which it is enmeshed, and produces an “associated milieu” capable of entering into relation with other individuals within the system. Simondon maintains that nature consists of three regimes of individuation, that is, three possible phases of every being: the physical, the biological, and the psycho-social—that develop from a metastable pre-individual field. Technology traverses all three regimes and allows for further individualisation via transductive operations across such phases—that is, via operations of conversion of energy from one form to another. The recent online diffusion of DIY cheese-making recipes lends itself to be analysed with the support of Simondon’s philosophy. Today cheese dominates degustation menus beside the finest wines, and constitutes a common obsession among “foodies.” Although, as an object, cheese defies more traditional canons of beauty and pleasure—its usual pale yellow colour is not especially inviting and, generally speaking, the stinkier and mouldier it is, the more exclusive and expensive it usually is—it has played a sizeable role in the collective imagination since ancient times. 
Although the genesis of cheese predates archival memory, it is commonly assumed to be the fruit of the chemical reaction naturally occurring in the interaction of milk with the rennet inherently contained in the bladders made of ruminants’ stomachs in which milk was contained during the long transits undertaken by the nomadic cultures of Central Asia. Cheese is an invention that reportedly occurred without human intervention, and only the technical need to preserve milk at high temperatures impelled humans to learn to produce it. Since World War II its production has been almost exclusively factory-based, even in the case of artisanal cheese (McGee), which makes the renewed concern for homemade cheese more significant from a techno-cultural perspective. Following Simondon, the individualisation of cheese—and of people in relation to cheese—depends on the different objects involved in its production, whose associated milieu affects the outcome of the ontogenetic process via transductive operations. In the specific case of an industrial block of cheese, these may include: the more or less ethical breeding and milking of cows in a factory environment; the types of bacteria involved in the cheese-making process; the energy and costs inherent in the fabrication of the packaging material and the packaging process itself; the CO2 emissions caused by transportation; the physical and intellectual labour implied in marketing, retailing and selling; and, last but not least, the arguable nutritional value of the factory-produced cheese—all of which, in spite of their “invisibility” to the eyes of the consumer, affect physical conditions and moods when they enter into relation with the human body (Bennet). To these, we may add, with specific reference to the packaging: the RFID tags that electronically index food items into databases for a more efficient management of supplies, and the QR codes used for social media marketing purposes. In contrast, the direct engagement with the techno-material conditions at the basis of the home cookery process allows one to grasp how different operations may affect the outcome of the recipe. DIY cheese-making recipes are specifically addressed to laypeople and, because they hardly demand professional equipment, they entail a greater attunement with, and to, the objects and processes required by the recipe. For instance, one needs to “feel” when milk has reached the right temperature (specifically, 82 degrees centigrade, which means that the surface of the milk should be slightly bubbly but not fully boiling) and, with practice, one learns how the slightest movement of the hand can lead to different results, in terms of consistency and aspect. Ultimately, DIY cheese-making allows the cook to be creative with moulding, seasonings, and marinading. Indeed, by directly engaging with the undiscovered properties and potentials of ingredients, by understanding the role that energy (both in the sense of induction and “transduction”) plays on form and matter, and by developing—often via processes of trial and error—technics for stirring, draining, moulding, marinading, canning, and so forth, making cheese at home becomes an exercise in speculative pragmatics. An experimental approach to cooking, as the negotiation between the rigid axioms that make up a recipe and the creative and experimental components inherent in the operations of mixing and blending, allows one to feel the ultimate outcome of the cooking process as an event. 
The taste of a homemade cheese is linked to a new kind of knowledge—that is, an epistemology based on continuous breakages that allow for the cooking process to carry on until the ultimate result. It is a knowledge that comes from a commitment to objects being out-of-phase, and from the acknowledgement of the network of technical operations that bring cheese to our tables. The following section discusses how another kind of object may affect the outcome of a recipe, with important implications for aesthetics, that is, technical objects. The Internet as Ingredient: Technical Objects, Aesthetics, and Invention The notion of technical objects complements Simondon’s theory of individuation to define the becoming of technology in relation to culture. To Simondon: “the technical object is not this or that thing, given hic et nunc, but that of which there is a genesis” (Du Mode 20). Technical objects, therefore, are not simply technological artifacts but are constituted by a series of events that determine their evolution (De Vries). Analogously to other kinds of individuals, they are constituted by transductive operations across the three aforementioned phases of being. The evolution of technical objects extends from the element to the individual, and ultimately to the technical ensemble. Elements are less than individualised technical objects, while individuals that are in a relation of interconnection are called ensembles. According to Simondon, technical ensembles fully individualise with the realisation of the cybernetic project. Simondon observes that: “there is something eternal in a technical ensemble [...] and it is that which is always present, and can be conserved in a thing” (Les Cahiers 87). The Internet, as a thing-network, could be regarded as an instance of such technical ensembles, however, a clarification needs to be made. Simondon explains that “true technical ensembles are not those that use technical individuals, but those that are a network of technical individuals in a relation of interconnection” (Du mode 126). To Simondon, humankind has ceased to be a technical individual with the industrialisation and automation of methods of production, and has consigned this function to machines (128). Expanding this line of thought, examples such as the viral spreading of memes, and the hypnotic power of online marketing campaigns, demonstrate how digital technology seems to have intensified this process of alienation of people from the functioning of the machine. In short, no one seems to know how or why things happen on the Internet, but we cannot help but use it. In order to constitute “real” technical ensembles, we need to incorporate technics again into culture, in a relation of reciprocity and complementarity with machines, under the aegis of a technical culture. Simondon specifies that such a reconfiguration of the relation between man and machines can only be achieved by means of an invention. An invention entails the individualisation of the technical ensemble as a departure from the mind of the inventor or designer that conceived it, in order to acquire its own autonomous existence (“Technical Mentality”). It refers to the origin of an operative solidarity between individual agents in a network, which provides the support for a human relation based on the “model of transidividuality” (Du Mode 247). A “transindividual relation” is a relation of relations that puts the individual in direct contact with a real collective. 
The notion of real collective is opposed to that of an interindividual community or social sphere, which is poisoned by the anxieties that stem from a defected relation with the technical ensemble culture is embedded in. In the specific context of the online sharing of DIY cheese-making recipes, rather than a fully individualised technical ensemble per se, the Internet can be regarded as one of the ingredients that make up the final recipe—together with human and the food—for the invention of a true technical ensemble. In such a framework, praxis, as linked to the kind of non-verbal knowledge associated with “making,” defines individuation together with the types of objects that make up the Network. While in the case of foodism, the practice of online marketing and communication homogenises culture by creating “social phenomena,” in the case of DIY cooking advice, it fosters a diversification of tastes, experiences, and flavours linked to individual modes of doing and cooking, that put the cook in a new relation with the culinary process, with food, and with the guests who have the pleasure to taste her meal. This is a qualitative change in the network that constitutes culture, rather than a mere quantitative shift in energy induction. The term “conviviality” (from the Latin con-vivere) specifically means this: a “living together,” rather than a mere dinner party. For Simondon, a real technical ensemble is an assemblage of humans, machines, tools, resources and milieus, which can only be éprouve—i.e., experienced, also in the sense of “experimented with”—rather than represented. A technical ensemble is first and foremost an aesthetic affair—it can only be perceived by experimenting with the different agents involved in the networked operations that constitute it. For Simondon “aesthetics comes after technicity [and] it also returns to us in the heart of technicity” (Michaud in De Boever et al. 122). Therefore, any object bears an aesthetic potential—even something as trivial as a homemade block of cheese. Simondon rejects the idea of an aesthetic object, but affirms the power of technicity to foreground an aesthetic impression, which operates a convergence between the diverging forces that constitute the mediation between man and world, in terms of an ethical treatment of technics. For Simondon, the beautiful is a process: “it is never, properly speaking, the object that is beautiful: it is the encounter operating a propos of the object between a real aspect of the world and a human gesture” (Du Mode 191 emphasis added). If an analysis of cooking as individuation already foregrounds an aesthetics that is both networked and technical, the relational capabilities afforded by networked media have the power to amplify the aesthetic potential of the human gesture implied in a block of homemade cheese—which today extends from searching for (or writing) a recipe online, to pouring the milk and seasoning the cheese, and which entails less environmental waste due to the less intensive processing and the lack of, or certainly a reduction in, packaging materials (Rastogi). The praise of technical creativity resounds throughout Simondon’s thought. 
By using the Internet in order to create (or indeed cook) something new, the online sharing of DIY cooking techniques like cheese-making, which partially disengages the human (and food itself) from the cycle of production-marketing-consumption that characterises the food industry in capitalist society by fostering an awareness of the networked operations that constitute her as individual, is an invention in its own right. Although the impact of these DIY activities on the global food industry is still very limited, such a hands-on approach, imbued with a dose of technical creativity, partially overcomes the alienation of the individual from the production process, by providing the conditions to “feel” how the individualisation of cheese (and the human) is inscribed in a larger metabolism. This does not stop within the economy of the body but encompasses the techno-cultural ensemble that forms capitalist society as a whole, and in which humans play only a small part. This may be considered a first step towards the reconciliation between humans and technical culture—a true technical ensemble. Indeed, eating involves “experiments in art and technology”—as the name of the infamous 1960s art collective (E.A.T.) evokes. Home-cooking in this sense is a technical-aesthetic experiment in its own right, in which aesthetics acquires an ethical nuance. Simondon’s philosophy highlights how the aesthetics involved in the home cooking process entails a political component, aimed at the disentanglement of the human from the “false” technical ensemble constituted by capitalist society, which is founded on the alienation from the production process and is driven by economic interests. Surely, an ethical approach to food would entail considering the biopolitics of the guts from the perspective of sourcing materials, and perhaps even building one’s own tools. These days, however, keeping a cow or goat in the backyard is unconceivable and/or impossible for most of us. The point is that the Internet can foster inventiveness and creativity among the participants to the Network, in spite of the fixity of the frame in which culture is increasingly inscribed (for instance, the standardised format of a Wordpress blog), and in this way, can trigger an aesthetic impression that comprises an ethical component, which translates into a political stand against the syncopated, schizophrenic rhythms of the market. Conclusion In this discussion, I have demonstrated that cooking can be considered a process of individuation inscribed in a techno-cultural network in which different transductive operations have the power to affect the final taste of a recipe. Simondon’s theory of individuation allows us to account for the impact of ubiquitous networked media on traditionally considered “human” practices, thus suggesting a new kind of humanism—a sort of technological humanism—on the basis of a new model of perception, which acknowledges the non-human actants involved in the process of individuation. I have shown that, in the case of the online sharing of cheese-making recipes, Simondon’s philosophy allows us to uncover a concept of taste that extends beyond the mere gustatory experience provided by foodism, and in this sense it may indeed affirm a reconfiguration of human culture based on an ethical approach towards the technical ensemble that envelops individuals of any kind—be they physical, living, or technical. 
Analogously, a “culinary” approach to techno-culture in terms of a commitment to the ontogenetic character of objects’ behaviours could be transposed to the digital realm in order to enlighten new perspectives for the speculative design of occasions of interaction among different beings—including humans—in ethico-aesthetic terms, based on a creative, experimental engagement with techniques and technologies. As a result, this can foreground a taste for life and culture that exceeds human-centred egotistic pleasure to encompass both technology and nature. Considering that a worryingly high percentage of digital natives both in Australia and the UK today believe that cheese and yogurt grow on trees (Howden; Wylie), perhaps cooking should indeed be taught in school alongside (rather than separate to, or instead of) programming. References Bennet, Jane. Vibrant Matter: a Political Ecology of Things. Durham: Duke UP, 2010 Brien, Donna Lee, and Adele Wessell. “Cookbook: A New Scholarly View.” M/C Journal 16.3 (2013). 7 Jan. 2014. ‹http://journal.media-culture.org.au/index.php/mcjournal/article/viewArticle/688›. Crary, Jonathan, and Sanford Kwinter. Incorporations. New York: Zone, 1992. De Boever, Arne, Alex Murray, Jon Roffe, and Ashley Woodward, eds. Gilbert Simondon: Being and Technology. Edinburgh: Edinburgh UP, 2012. De Vries, Marc. “Gilbert Simondon and the Dual Nature of Technical Artifacts.” Techné: Research in Philosophy and Technology 12.1 (2008). Debord, Guy. “Abat-Faim.” Encyclopedie des Nuisances 5 (1985) 2 Jan. 2014. ‹http://www.notbored.org/abat-faim.html›. Deleuze, Gilles and Felix Guattari. A Thousand Plateaus. London: Continuum, 2004. Hayles, N. Katherine. How We Became Posthuman: Virtual Bodies in Cybernetics, Literature, and Informatics. Chicago: The University of Chicago Press, 1999. Howden, Saffron. “Cultural Cringe: Schoolchildren Can’t See the Yoghurt for the Trees.” The Sydney Morning Herald 5 Mar. 2012. 5 Jan. 2014. ‹http://www.smh.com.au/national/education/cultural-cringe-schoolchildren-cant-see-the-yoghurt-for-the-trees-20120304-1ub55.html›. McGee, Harold. On Food and Cooking: The Science and Lore of the Kitchen. New York: Scribner, 2004. Michaud, Yves. “The Aesthetics of Gilbert Simondon: Anticipation of the Contemporary Aesthetic Experience.” Gilbert Simondon: Being and Technology. Eds. Arne De Boever, Alex Murray, Jon Roffe, and Ashley Woodward. Edinburgh: Edinburgh UP, 2012. 121–32. Rastogi, Nina. “Soft Cheese for a Clean Planet”. Slate 15 Dec. 2009. 25 Jan. 2014. ‹http://www.slate.com/articles/health_and_science/the_green_lantern/2009/12/soft_cheese_for_a_clean_planet.html›. Simondon, Gilbert. Du Mode d’Existence des Objets Techniques. Paris: Aubier, 2001. ---. L’Individuation a La Lumière Des Notions de Forme et d’Information. Grenoble: Millon, 2005. ---. “Les Cahiers du Centre Culturel Canadien” 4, 2ème Colloque Sur La Mécanologie. Paris, 1976. ---. “Technical Mentality.” Parrhesia 7 (2009): 17–27.---. “The Genesis of the Individual.” Incorporations. Eds. Jonathan Crary, and Sanford Kwinter. New York: Zone, 1992. 296–319. Wrangham, Richard. “Reason in the Roasting of Eggs.” Collapse: Philosophical Research and Development Volume VII. Eds. Reza Negarestani, and Robin Mackay. London: Urbanomic, 2011. 331–44. Wylie, Catherine. “Significant Number of Children Believe Cheese Comes from Plants, Reveals New Survey.” The Independent 3 Jun. 2013. 5 Jan. 2014. 
‹http://www.independent.co.uk/news/uk/home-news/significant-number-of-children-believe-cheese-comes-from-plants-reveals-new-survey-8641771.html›.
APA, Harvard, Vancouver, ISO, and other styles
37

Potts, Jason. "The Alchian-Allen Theorem and the Economics of Internet Animals." M/C Journal 17, no. 2 (2014). http://dx.doi.org/10.5204/mcj.779.

Full text
Abstract:
Economics of Cute There are many ways to study cute: for example, neuro-biology (cute as adaptation); anthropology (cute in culture); political economy (cute industries, how cute exploits consumers); cultural studies (social construction of cute); media theory and politics (representation and identity of cute), and so on. What about economics? At first sight, this might point to a money-capitalism nexus (“the cute economy”), but I want to argue here that the economics of cute actually works through choice interacting with fixed costs and what economists call ”the substitution effect”. Cute, in conjunction with the Internet, affects the trade-offs involved in choices people make. Let me put that more starkly: cute shapes the economy. This can be illustrated with internet animals, which at the time of writing means Grumpy Cat. I want to explain how that mechanism works – but to do so I will need some abstraction. This is not difficult – a simple application of a well-known economics model, namely the Allen-Alchian theorem, or the “third law of demand”. But I am going to take some liberties in order to represent that model clearly in this short paper. Specifically, I will model just two extremes of quality (“opera” and “cat videos”) to represent end-points of a spectrum. I will also assume that the entire effect of the internet is to lower the cost of cat videos. Now obviously these are just simplifying assumptions “for the purpose of the model”. And the purpose of the model is to illuminate a further aspect of how we might understand cute, by using an economic model of choice and its consequences. This is a standard technique in economics, but not so in cultural studies, so I will endeavour to explain these moments as we go, so as to avoid any confusion about analytic intent. The purpose of this paper is to suggest a way that a simple economic model might be applied to augment the cultural study of cute by seeking to unpack its economic aspect. This can be elucidated by considering the rise of internet animals as a media-cultural force, as epitomized by “cat videos”. We can explain this through an application of price theory and the theory of demand that was first proposed by Armen Alchian and William Allen. They showed how an equal fixed cost that was imposed to both high-quality and low-quality goods alike caused a shift in consumption toward the higher-quality good, because it is now relatively cheaper. Alchian and Allen had in mind something like transport costs on agricultural goods (such as apples). But it is also true that the same effect works in reverse (Cowen), and the purpose of this paper is to develop that logic to contribute to explaining how certain structural shifts in production and consumption in digital media, particularly the rise of blog formats such as Tumblr, a primary supplier of kittens on the Internet, can be in part understood as a consequence of this economic mechanism. There are three key assumptions to build this argument. The first is that the cost of the internet is independent of what it carries. This is certainly true at the level of machine code, and largely true at higher levels. What might be judged aesthetically high quality or low quality content – say of a Bach cantata or a funny cat video – are treated the same way if they both have the same file size. This is a physical and computational aspect of net-neutrality. The internet – or digitization – functions as a fixed cost imposed regardless of what cultural quality is moving across it. 
Second, while there are costs to using the internet (for example, in hardware or concerning digital literacy) these costs are lower than previous analog forms of information and cultural production and dissemination. This is not an empirical claim, but a logical one (revealed preference): if it were not so, people would not have chosen it. The first two points – net neutrality and lowered cost – I want to take as working assumptions, although they can obviously be debated. But that is not the purpose of the paper, which is instead the third point – the “Alchian-Allen theorem”, or the third fundamental law of demand.
The Alchian-Allen Theorem
The Alchian-Allen theorem is an extension of the law of demand (Razzolini et al) to consider how the distribution of high quality and low quality substitutes of the same good (such as apples) is affected by the imposition of a fixed cost (such as transportation). It is also known as the “shipping the good apples out” theorem, after Borcherding and Silberberg explained why places that produce a lot of apples – such as Seattle in the US – often also have low supplies of high quality apples compared to places that do not produce apples, such as New York. The puzzle of “why can’t you get good apples in Seattle?” is a simple but clever application of price theory. When a place produces high quality and low quality items, it will be rational for those in faraway places to consume the high quality items, and it will be rational for the producers to ship them, leaving only the low quality items locally. Why? Assume preferences and incomes are the same everywhere and that transport cost is the same regardless of whether the item shipped is high or low quality. Both high quality and low quality apples are more expensive in New York compared to Seattle, but because the fixed transport cost applies to both, the high quality apples are relatively less expensive. Rational consumers in New York will consume more high quality apples. This makes fewer available in Seattle.
Figure 1: Change in consumption ratio after the imposition of a fixed cost on all apples
Another example: Australians drink higher quality Californian wine than Californians, and vice versa, because it is only worth shipping the high quality wine out. A counter-argument is that learning effects dominate: with high quality local product, local consumers learn to appreciate quality, and have different preferences (Cowen and Tabarrok). The Alchian-Allen theorem applies to any fixed cost that applies generally. For example, consider illegal drugs (such as alcohol during the US prohibition, or marijuana or cocaine presently) and the implication of a fixed penalty – such as a fine, or prison sentence, which is like a cost – applied to trafficking or consumption. Alchian-Allen predicts a shift toward higher quality (or stronger) drugs, because with a fixed penalty and probability of getting caught, the relatively stronger substance is now relatively cheaper. Empirical work finds that this effect did occur during alcohol prohibition, and is currently occurring in narcotics (Thornton, Economics of Prohibition, "Potency of Illegal Drugs"). Another application, proposed by Steven Cuellar, uses Alchian-Allen to explain the well-known statistical finding that women taking the contraceptive pill on average prefer “more masculine” men. This is once again a shift toward quality, predicated on a falling relative price given a common ‘fixed price’ (taking the pill) of sexual activity.
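All of these applications turn on the same arithmetic, which can be made concrete with a minimal numerical sketch (the prices below are purely illustrative assumptions, not figures from Alchian and Allen or the studies cited above):

```python
# Alchian-Allen ("third law of demand"): adding the same fixed cost to a
# high-quality and a low-quality substitute lowers the RELATIVE price of
# the high-quality one, so consumption shifts toward quality.
# All prices here are illustrative assumptions.

def relative_price(p_high: float, p_low: float, fixed_cost: float = 0.0) -> float:
    """Price of one high-quality unit measured in low-quality units."""
    return (p_high + fixed_cost) / (p_low + fixed_cost)

good_apple, ordinary_apple = 1.00, 0.50   # dollars per apple at the Seattle orchard (assumed)
shipping_to_ny = 1.00                     # per-apple transport cost, charged on either grade (assumed)

print(relative_price(good_apple, ordinary_apple))                  # 2.00 in Seattle
print(relative_price(good_apple, ordinary_apple, shipping_to_ny))  # ~1.33 in New York
```

In Seattle a good apple trades for two ordinary ones; after the common shipping charge it trades for only about 1.3 in New York, so New Yorkers rationally buy proportionally more of the good apples – and the good apples get shipped out.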
Jean Eid et al show that the result also applies to racehorses (the good horses get shipped out), and Staten and Umbeck show it applies to students – the good students go to faraway universities, and the good students in those places do the same. So that’s apples, drugs, sex and racehorses. What about the Internet and kittens?
Alchian-Allen Explains Why the Internet Is Made of Cats
In analog days, before digitization and the Internet, the transactions costs involved with various consumption items, whether commodities or media, meant that the Alchian-Allen effect pushed in the direction of higher quality, bundled product. Any additional fixed costs, such as higher transport costs, or taxes or duties, or transactions costs associated with search and coordination and payment, i.e. costs that affected all substitutes in the same way, would tend to make the higher quality item relatively less expensive, increasing its consumption. But digitisation and the Internet reverse the direction of these transactions costs. Rather than adding a fixed cost, such as transport costs, the various aspects of the digital revolution are equivalent to a fall in fixed costs, particularly access. These factors are not just one thing, but a suite of changes that add up to lowered transaction costs in the production, distribution and consumption of media, culture and games. These include: the internet and world-wide-web, and its unencumbered operation; the growth and increasing efficacy of search technology; the growth of universal broadband for fast, wide band-width access; the growth of mobile access (through smartphones and other appliances); the growth of social media networks (Facebook, Twitter; Metcalfe’s law); the growth of developer and distribution platforms (iPhone, Android, iTunes); globally falling hardware and network access costs (Moore’s law); the growth of e-commerce (eBay, Amazon, Etsy) and e-payments (PayPal, Bitcoin); expansions of digital literacy and competence; and Creative Commons. These effects do not simply shift us down a demand curve for each given consumption item. That effect alone would simply predict that we consume more. But the Alchian-Allen effect makes a different prediction, namely that we consume not just more, but also different. These effects function to reduce the overall fixed costs or transactions costs associated with any consumption, sharing, or production of media, culture or games over the internet (or in digital form). With this overall fixed cost component now reduced, it represents a relatively larger decline in cost at the lower-quality, more bite-sized or unbundled end of the media goods spectrum. As such, this predicts a change in the composition of the overall consumption basket to reflect the changed relative prices that these above effects give rise to. See Figure 2 below (based on a blog post by James Oswald). The key to the economics of cute, in consequence of digitisation, is to follow through the qualitative change that, because of the Alchian-Allen effect, moves away from the high-quality, highly-bundled, high-value end of the media goods spectrum. The “pattern prediction” here is toward more, different, and lower quality: toward five minutes of “Internet animals”, rather than a full day at the zoo.
Figure 2: Reducing transaction costs lowers the relative price of cat videos
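The reversal sketched in Figure 2 follows from the same arithmetic when the fixed cost is removed rather than added. The numbers below are again purely illustrative assumptions, with “opera” standing in for high-value bundled media and “cat videos” for bite-sized media:

```python
# Inverse Alchian-Allen: REMOVING a common fixed (access/transaction) cost
# raises the relative price of the high-quality bundled good, so the
# consumption mix shifts toward the low-quality, bite-sized end.
# All prices here are illustrative assumptions.

def relative_price(p_high: float, p_low: float, fixed_cost: float = 0.0) -> float:
    """Price of one high-quality unit measured in low-quality units."""
    return (p_high + fixed_cost) / (p_low + fixed_cost)

opera, cat_video = 100.0, 1.0   # notional cost of the content itself (assumed)
analog_access = 50.0            # pre-internet transaction cost per consumption episode (assumed)
digital_access = 0.5            # post-internet transaction cost (assumed)

print(relative_price(opera, cat_video, analog_access))   # ~2.9: opera only modestly dearer, relatively
print(relative_price(opera, cat_video, digital_access))  # ~67: opera now relatively far more expensive
```

Because the high-quality bundle becomes relatively much more expensive once the common access cost collapses, the model predicts that the consumption basket tilts toward the cheap, unbundled end – the “more, different, and lower quality” pattern considered next.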
Consider five dimensions in which this more and different tendency plays out.
Consumption
These effects make digital and Internet-based consumption cheaper, shifting us down a demand curve, so we consume more. That’s the first law of demand in action: demand curves slope downwards. But a further effect – brilliantly set out in Cowen – is that we also consume lower-quality media. This is not a value judgment. These lower-quality media may well have much higher aesthetic value. They may be funnier, or more tragic and sublime; or faster, or not. This is not about absolute value; only about relative value. Digitization operating through Alchian-Allen skews consumption toward the lower quality ends in some dimensions: whether this is time, as in shorter – or cost, as in cheaper – or size, as in smaller – or transmission quality, as in gifs. This can also be seen as a form of unbundling, a dropping of dimensions that are not valued in order to create a simplified product. So we consume different, with higher variance. We sample more than we used to. This means that we explore a larger information world. Consumption is bite-sized and assorted. This tendency is evident in the rise of apps and in the proliferation of media forms and devices and the value of interoperability.
Production
As consumption shifts (lower quality, greater variety), so must production. The production process has two phases: (1) figuring out what to do, or development; and (2) doing it, or making. The world of trade and globalization describes the latter part: namely efficient production. The main challenge is the world of innovation: the entrepreneurial and experimental world of figuring out what to do, and how. It is this second world that is radically transformed by the implications of lowered transaction costs. One implication is the growth of user-communities based around collaborative media projects (such as open source software) and community-based platforms or common pool resources for sharing knowledge, such as the “Maker movement” (Anderson 2012). This phenomenon of user co-creation, or produsers, has been widely recognized as an important new development in the innovation and production process, particularly in those processes associated with new digital technologies. There are numerous explanations for this, particularly around preferences for cooperation, community-building, social learning and reputational capital, and entrepreneurial expectations (Quiggin and Potts; Banks and Potts).
Business Models
The Alchian-Allen effect on consumption and production follows through to business models. A business model is a way of extracting value that represents some strategic equilibrium between market forms, organizational structures, technological possibilities, institutional frameworks and environmental conditions, and that manifests in entrepreneurial patterns of business strategy and particular patterns of investment and organization. The discovery of effective business models is a key process of market capitalist development and competition. The Alchian-Allen effect impacts the space of effective, viable business models. Business models that used to work will work less well, or not at all. And new business models will be required. It is a significant challenge to develop these “economic technologies”. Perhaps no less so than the development of physical technologies, new business models are produced through experimental trial and error. They cannot be known in advance or planned.
But business models will change, which will affect not only the constellation of existing companies and the value propositions that underlie them, but also the broader specializations based on these in terms of skill sets held and developed by people, locations of businesses and people, and so on. New business models will emerge from a process of Schumpeterian creative destruction as it unfolds (Beinhocker). The large-production, high-development-cost, proprietary intellectual property and systems-based business model is not likely to survive, other than in niche areas. More experimental, discovery-focused, fast-development-then-scale-up business models are more likely to fit the new ecology.
Social Network Markets & Novelty Bundling Markets
The growth of variety and diversity of choice that comes with this change in the way media is consumed – a reallocation of consumption toward smaller, more bite-sized, lower-valued chunks (the Alchian-Allen effect) – presents consumers with a problem, namely that they have to make more choices over novelty. Choice over novelty is difficult for consumers because it is experimental and potentially costly due to the risk of mistakes (Earl), but it also presents entrepreneurs with an opportunity to seek to help solve that problem. The problem is a simple consequence of bounded rationality and time scarcity. It is equivalent to saying that the cost of choice rises monotonically with the number of choices, and that because there is no way to make a complete rational choice, agents will use decision or choice heuristics. These heuristics can be developed independently by the agents themselves through experience, or they can be copied or adopted from others (Earl and Potts). What Potts et al call “social network markets” and what Potts calls “novelty bundling markets” are both instances of the latter process of copying and adoption of decision rules. Social network markets occur when agents use a “copy the most common” or “copy the highest rank” meta-level decision rule (Bentley et al) to deal with uncertainty. Social network markets can be efficient aggregators of distributed information, but they can also be path-dependent, and usually lead to winner-take-all situations and dynamics. These can result in huge pay-off differentials between first and second or fifth place, even when the initial quality differentials are slight or random. Diversity, rapid experimentation, and “fast-failure” are likely to be effective strategies. It also points to the role of trust and reputation in using adopted decision rules, and the information economics that underlies that: namely that specialization and trade apply to the production and consumption of information as well as commodities. Novelty bundling markets are an entrepreneurial response to this problem, and are observable in a range of new media and creative industries contexts. These include arts, music or food festivals or fairs where entertainment and sociality are combined with low opportunity cost situations in which to try bundles of novelty and connect with experts. These are organised by agents who have developed expert preferences through investment and experience in consumption of the particular segment or domain. They are expert consumers and are selling their “decision rules” and not just the product. The more the production and consumption of media and digital information goods and services experience the Alchian-Allen effect, the greater the importance of novelty bundling markets.
Intellectual Property & Regulation
A further implication is that rent-seeking solutions may also emerge. This can be seen in two dimensions: the pursuit of intellectual property (Boldrin and Levine), and the demand for regulations (Stigler). The Alchian-Allen-induced shift will affect markets and business models (and firms), and this will induce strategic defensive and aggressive responses from different organizations. Some organizations will seek to fight and adapt to this new world through innovative competition. Other firms will fight through political connections. Most incumbent firms will have substantial investments in IP or in the business model it supports. Yet the intellectual property model is optimized for high-quality, large-volume, centralized production and global sales of undifferentiated product. Much industrial and labour regulation is built on that model. How governments support such industries is predicated on the stability of this model. The Alchian-Allen effect threatens to upset that model. Political pushback will invariably take the form of opposing most new business models and the new entrants they carry.
Conclusion
I have presented here a lesser-known but important theorem in applied microeconomics – the Alchian-Allen effect – and explained why its inverse is central to understanding the evolution of new media industries, and also why cute animals proliferate on the Internet. The theorem states that when a fixed cost is added to substitute goods, consumers will shift to the higher quality item (now relatively less expensive). The theorem also holds in reverse: when a fixed cost is removed from substitute items, we expect a shift to lower quality consumption. The Internet has dramatically lowered the fixed costs of access to media consumption, and various development platforms have similarly lowered the costs of production. Alchian-Allen predicts a shift to lower-quality, “bittier”, cuter consumption (Cowen).
References
Alchian, Armen, and William Allen. Exchange and Production. 2nd ed. Belmont, CA: Wadsworth, 1967. Anderson, Chris. Makers. New York: Crown Business, 2012. Banks, John, and Jason Potts. "Consumer Co-Creation in Online Games." New Media and Society 12.2 (2010): 253-70. Beinhocker, Eric. Origin of Wealth. Cambridge, Mass.: Harvard University Press, 2005. Bentley, R., et al. "Regular Rates of Popular Culture Change Reflect Random Copying." Evolution and Human Behavior 28 (2007): 151-158. Borcherding, Thomas, and Eugene Silberberg. "Shipping the Good Apples Out: The Alchian and Allen Theorem Reconsidered." Journal of Political Economy 86.1 (1978): 131-6. Cowen, Tyler. Create Your Own Economy. New York: Dutton, 2009. (Also published as The Age of the Infovore: Succeeding in the Information Economy. Penguin, 2010.) Cowen, Tyler, and Alexander Tabarrok. "Good Grapes and Bad Lobsters: The Alchian and Allen Theorem Revisited." Economic Inquiry 33.2 (1995): 253-6. Cuellar, Steven. "Sex, Drugs and the Alchian-Allen Theorem." Unpublished paper, 2005. 29 Apr. 2014 ‹http://www.sonoma.edu/users/c/cuellar/research/Sex-Drugs.pdf›. Earl, Peter. The Economic Imagination. Cheltenham: Harvester Wheatsheaf, 1986. Earl, Peter, and Jason Potts. "The Market for Preferences." Cambridge Journal of Economics 28 (2004): 619–33. Eid, Jean, Travis Ng, and Terence Tai-Leung Chong. "Shipping the Good Horses Out." Working paper, 2012. ‹http://homes.chass.utoronto.ca/~ngkaho/Research/shippinghorses.pdf›. Potts, Jason, et al.
"Social Network Markets: A New Definition of Creative Industries." Journal of Cultural Economics 32.3 (2008): 166-185. Quiggin, John, and Jason Potts. "Economics of Non-Market Innovation & Digital Literacy." Media International Australia 128 (2008): 144-50. Razzolini, Laura, William Shughart, and Robert Tollison. "On the Third Law of Demand." Economic Inquiry 41.2 (2003): 292–298. Staten, Michael, and John Umbeck. “Shipping the Good Students Out: The Effect of a Fixed Charge on Student Enrollments.” Journal of Economic Education 20.2 (1989): 165-171. Stigler, George. "The Theory of Economic Regulation." Bell Journal of Economics 2.1 (1971): 3-22. Thornton, Mark. The Economics of Prohibition. Salt Lake City: University of Utah Press, 1991.Thornton, Mark. "The Potency of Illegal Drugs." Journal of Drug Issues 28.3 (1998): 525-40.
APA, Harvard, Vancouver, ISO, and other styles
38

Simpson, Catherine. "Cars, Climates and Subjectivity: Car Sharing and Resisting Hegemonic Automobile Culture?" M/C Journal 12, no. 4 (2009). http://dx.doi.org/10.5204/mcj.176.

Full text
Abstract:
Al Gore brought climate change into … our living rooms. … The 2008 oil price hikes [and the global financial crisis] awakened the world to potential economic hardship in a rapidly urbanising world where the petrol-driven automobile is still king. (Mouritz 47) Six hundred million cars (Urry, “Climate Change” 265) traverse the world’s roads, or sit idly in garages and clog city streets. The West’s economic progress has been built in part around the success of the automotive industry, where the private car rules the spaces and rhythms of daily life. The problem of “automobile dependence” (Newman and Kenworthy) is often cited as one of the biggest challenges facing countries attempting to combat anthropogenic climate change. Sociologist John Urry has claimed that automobility is an “entire culture” that has re-defined movement in the contemporary world (Urry Mobilities 133). As such, it is the single most significant environmental challenge “because of the intensity of resource use, the production of pollutants and the dominant culture which sustains the major discourses of what constitutes the good life” (Urry Sociology 57-8). Climate change has forced a re-thinking of not only how we produce and dispose of cars, but also how we use them. What might a society not dominated by the private, petrol-driven car look like? Some of the pre-eminent writers on climate change futures, such as Gwynne Dyer, James Lovelock and John Urry, discuss one possibility that might emerge when oil becomes scarce: societies will descend into civil chaos, “a Hobbesian war of all against all” where “regional warlordism” and the most brutish, barbaric aspects of human nature come to the fore (Urry, “Climate Change” 261). Discussing a post-car society, John Urry also proffers another scenario in his “sociologies of the future:” an Orwellian “digital panopticon” in which other modes of transport, far more suited to a networked society, might emerge on a large scale and, in the long run, “might tip the system” into a post-car one before it is too late (Urry, “Climate Change” 261). Amongst the many options he discusses is car sharing. Since its introduction in Germany more than 30 years ago, most of the critical literature has been devoted to the planning, environmental and business innovation aspects of car sharing; however, very little has been written on its cultural dimensions. This paper analyses this small but developing trend in many Western countries, and more specifically its emergence in Sydney. The convergence of climate change discourse with that of the global financial crisis has resulted in a focus in the mainstream media, over the last few months, on technologies and practices that might save us money and also help the environment. For instance, a Channel 10 News story in May 2009 focused on the boom in car sharing in Sydney (see: http://www.youtube.com/watch?v=EPTT8vYVXro). Car sharing is an adaptive technology that doesn’t do away with the car altogether, but rather transforms the ways in which cars are used, thought about and promoted. I argue that car sharing provides a challenge to the dominant consumerist model of the privately owned car that has sustained capitalist structures for at least the last 50 years.
In addition, through looking at some marketing and promotion tactics of car sharing in Australia, I examine some emerging car sharing subjectivities that both extend and subvert the long-established discourses of the automobile’s flexibility and autonomy to tempt monogamous car buyers into becoming philandering car sharers. Much literature has emerged over the last decade devoted to the ubiquitous phenomenon of automobility. “The car is the literal ‘iron cage’ of modernity, motorised, moving and domestic,” claims Urry (“Connections” 28). Over the course of twentieth century, automobility became “the dominant form of daily movement over much of the planet (dominating even those who do not move by cars)” (Paterson 132). Underpinning Urry’s prolific production of literature is his concept of automobility. This he defines as a complex system of “intersecting assemblages” that is not only about driving cars but the nexus between “production, consumption, machinic complexes, mobility, culture and environmental resource use” (Urry, “Connections” 28). In addition, Matthew Paterson, in his Automobile Politics, asserts that “automobility” should be viewed as everything that makes driving around in a car possible: highways, parking structures and traffic rules (87). While the private car seems an inevitable outcome of a capitalistic, individualistic modern society, much work has gone into the process of naturalising a dominant notion of automobility on drivers’ horizons. Through art, literature, popular music and brand advertising, the car has long been associated with seductive forms of identity, and societies have been built around a hegemonic culture of car ownership and driving as the pre-eminent, modern mode of self-expression. And more than 50 years of a popular Hollywood film genre—road movies—has been devoted to glorifying the car as total freedom, or in its more nihilistic version, “freedom on the road to nowhere” (Corrigan). As Paterson claims, “autonomous mobility of car driving is socially produced … by a range of interventions that have made it possible” (18). One of the main reasons automobility has been so successful, he claims, is through its ability to reproduce capitalist society. It provided a commodity around which a whole set of symbols, images and discourses could be constructed which served to effectively legitimise capitalist society. (30) Once the process is locked-in, it then becomes difficult to reverse as billions of agents have adapted to it and built their lives around “automobility’s strange mixture of co-ercion and flexibility” (Urry, “Climate Change” 266). The Decline of the Car Globally, the greatest recent rupture in the automobile’s meta-narrative of success came about in October 2008 when three CEOs from the major US car firms (General Motors, Ford and Chrysler) begged the United States Senate for emergency loan funds to avoid going bankrupt. To put the economic significance of this into context, Emma Rothschild notes “when the listing of the ‘Fortune 500’ began in 1955, General Motors was the largest American corporation, and it was one of the three largest, measured in revenues, every year until 2007” (Rothschilds, “Can we transform”). Curiously, instead of focusing on the death of the car (industry), as we know it, that this scenario might inevitably herald, much of the media attention focused on the hypocrisy and environmental hubris of the fact that all the CEOs had flown in private luxury jets to Washington. 
“Couldn’t they have at least jet-pooled?” complained one Democrat Senator (Wutkowski). In their next visit to Washington, most of them drove up in experimental vehicles still in pre-production, including plug-in hybrids. Up until that point no other manufacturing industry had been bailed out in the current financial crisis. Of course it’s not the first time the automobile industries have been given government assistance. The Australian automotive industry has received on-going government subsidies since the 1980s. Most recently, PM Kevin Rudd granted a 6.2 billion dollar ‘green car’ package to Australian automotive manufacturers. His justification to the growing chorus of doubts about the economic legitimacy of such a move was: “Some might say it's not worth trying to have a car industry, that is not my view, it is not the view of the Australian government and it never will be the view of any government which I lead” (The Australian). Amongst the many reasons for the government support of these industries must include the extraordinary interweaving of discourses of nationhood and progress with the success of the car industry. As the last few months reveal, evidently the mantra still prevails of “what’s good for the country is good for GM and vice versa”, as the former CEO of General Motors, Charles “Engine” Wilson, argued back in 1952 (Hirsch). In post-industrial societies like Australia it’s not only the economic aspects of the automotive industries that are criticised. Cars seem to be slowly losing their grip on identity-formation that they managed to maintain throughout “the century of the car” (Gilroy). They are no longer unproblematically associated with progress, freedom, youthfulness and absolute autonomy. The decline and eventual death of the automobile as we know it will be long, arduous and drawn-out. But there are some signs of a post-automobile society emerging, perhaps where cars will still be used but they will not dominate our society, urban space and culture in quite the same way that they have over the last 50 years. Urry discusses six transformations that might ‘tip’ the hegemonic system of automobility into a post-car one. He mentions new fuel systems, new materials for car construction, the de-privatisation of cars, development of communications technologies and integration of networked public transport through smart card technology and systems (Urry, Mobilities 281-284). As Paterson and others have argued, computers and mobile phones have somehow become “more genuine symbols of mobility and in turn progress” than the car (157). As a result, much automobile advertising now intertwines communications technologies with brand to valorise mobility. Car sharing goes some way in not only de-privatising cars but also using smart card technology and networked systems enabling an association with mobility futures. In Automobile Politics Paterson asks, “Is the car fundamentally unsustainable? Can it be greened? Has the car been so naturalised on our mobile horizons that we can’t imagine a society without it?” (27). From a sustainability perspective, one of the biggest problems with cars is still the amount of space devoted to them; highways, garages, car parks. About one-quarter of the land in London and nearly one-half of that in Los Angeles is devoted to car-only environments (Urry, “Connections” 29). In Sydney, it is more like a quarter. We have to reduce the numbers of cars on our roads to make our societies livable (Newman and Kenworthy). 
Car sharing provokes a re-thinking of urban space. If one quarter of Sydney’s population car shared and we converted this space into green use or local market gardens, then we’d have a radically transformed city. Car sharing, not to be confused with ‘ride sharing’ or ‘car pooling,’ involves a number of people using cars that are parked centrally in dedicated car bays around the inner city. After becoming a member (much like a 6 or 12 monthly gym membership), the cars can be booked (and extended) by the hour via the web or phone. They can then be accessed via a smart card. In Sydney there are 3 car sharing organisations operating: Flexicar (http://www.flexicar.com.au/), CharterDrive (http://www.charterdrive.com.au/) and GoGet (http://www.goget.com.au/).[1] The largest of these, GoGet, has been operating for 6 years and has over 5000 members and 200 cars located predominantly in the inner city suburbs. Anecdotally, GoGet claims its membership is primarily drawn from professionals living in the inner-urban ring. Their motivation for joining is, firstly, the convenience that car sharing provides in a congested, public transport-challenged city like Sydney; secondly, the financial savings derived; and thirdly, members consider the environmental and social benefits axiomatic. [2] The promotion tactics of car sharing seems to reflect this by barely mentioning the environment but focusing on those aspects which link car sharing to futuristic and flexible subjectivities which I outline in the next section. Unlike traditional car rental, the vehicles in car sharing are scattered through local streets in a network allowing local residents and businesses access to the vehicles mostly on foot. One car share vehicle is used by 22-24 members and gets about seven cars off the street (Mehlman 22). With lots of different makes and models of vehicles in each of their fleets, Flexicar’s website claims, “around the corner, around the clock” “Flexicar offers you the freedom of driving your own car without the costs and hassles of owning one,” while GoGet asserts, “like owning a car only better.” Due to the initial lack of interest from government, all the car sharing organisations in Australia are privately owned. This is very different to the situation in Europe where governments grant considerable financial assistance and have often integrated car sharing into pre-existing public transport networks. Urry discusses the spread of car sharing across the Western world: Six hundred plus cities across Europe have developed car-sharing schemes involving 50,000 people (Cervero, 2001). Prototype examples are found such as Liselec in La Rochelle, and in northern California, Berlin and Japan (Motavalli, 2000: 233). In Deptford there is an on-site car pooling service organized by Avis attached to a new housing development, while in Jersey electric hire cars have been introduced by Toyota. (Urry, “Connections” 34) ‘Collaborative Consumption’ and Flexible, Philandering Subjectivities Car sharing shifts the dominant conception of a car from being a ‘commodity’, which people purchase and subsequently identify with, to a ‘service’ or network of vehicles that are collectively used. It does this through breaking down the one car = one person (or one family) ratio with one car instead servicing 20 or more people. One of Paterson’s biggest criticisms concerns car driving as “a form of social exclusion” (44). 
Car sharing goes some way in subverting the model of hyper-individualism that supports both hegemonic automobility and capitalist structures, whereby the private motorcar produces a “separation of individuals from one another driving in their own private universes with no account for anyone else” (Paterson 90). As a car sharer, the driver has to acknowledge that this is not their private domain, and the car no longer becomes an extension of their living room or bedroom, as is noted in much literature around car cultures (Morris, Sheller, Simpson). There are a community of people using the car, so the driver needs to be attentive to things like keeping the car clean and bringing it back on time so another person can use it. So while car sharing may change the affective relationship and self-identification with the vehicle itself, it doesn’t necessarily change the phenomenological dimensions of car driving, such as the nostalgic pleasure of driving on the open road, or perhaps more realistically in Sydney, the frustration of being caught in a traffic jam. However, the fact the driver doesn’t own the vehicle does alter their relationship to the space and the commodity in a literal as well as a figurative way. Like car ownership, evidently car sharing also produces its own set of limitations on freedom and convenience. That mobility and car ownership equals freedom—the ‘freedom to drive’—is one imaginary which car firms were able to successfully manipulate and perpetuate throughout the twentieth century. However, car sharing also attaches itself to the same discourses of freedom and pervasive individualism and then thwarts them. For instance, GoGet in Sydney have run numerous marketing campaigns that attempt to contest several ‘self-evident truths’ about automobility. One is flexibility. Flexibility (and associated convenience) was one thing that ownership of a car in the late twentieth century was firmly able to affiliate itself with. However, car ownership is now more often associated with being expensive, a hassle and a long-term commitment, through things like buying, licensing, service and maintenance, cleaning, fuelling, parking permits, etc. Cars have also long been linked with sexuality. When in the 1970s financial challenges to the car were coming as a result of the oil shocks, Chair of General Motors, James Roche stated that, “America’s romance with the car is not over. Instead it has blossomed into a marriage” (Rothschilds, Paradise Lost). In one marketing campaign GoGet asked, ‘Why buy a car when all you need is a one night stand?’, implying that owning a car is much like a monogamous relationship that engenders particular commitments and responsibilities, whereas car sharing can just be a ‘flirtation’ or a ‘one night stand’ and you don’t have to come back if you find it a hassle. Car sharing produces a philandering subjectivity that gives individuals the freedom to have lots of different types of cars, and therefore relationships with each of them: I can be a Mini Cooper driver one day and a Falcon driver the next. This disrupts the whole kind of identification with one type of car that ownership encourages. It also breaks down a stalwart of capitalism—brand loyalty to a particular make of car with models changing throughout a person’s lifetime. Car sharing engenders far more fluid types of subjectivities as opposed to those rigid identities associated with ownership of one car. 
Car sharing can also be regarded as part of an emerging phenomenon of what Rachel Botsman and Roo Rogers have called “collaborative consumption”—when a community gets together “through organized sharing, swapping, bartering, trading, gifting and renting to get the same pleasures of ownership with reduced personal cost and burden, and lower environmental impact” (www.collaborativeconsumption.com). As Urry has stated, these developments indicate a gradual transformation in current economic structures from ownership to access, as shown more generally by many services offered and accessed via the web (Urry Mobilities 283). Rogers and Botsman maintain that this has come about through the “convergence of online social networks increasing cost consciousness and environmental necessity." In the future we could predict an increasing shift to payment to ‘access’ for mobility services, rather than the outright private ownerships of vehicles (Urry, “Connections”). Networked-Subjectivities or a ‘Digital Panopticon’? Cars, no longer able on their own to signify progress in either technical or social terms, attain their symbolic value through their connection to other, now more prevalently ‘progressive’ technologies. (Paterson 155) The term ‘digital panopticon’ has often been used to describe a dystopian world of virtual surveillance through such things as web-enabled social networking sites where much information is public, or alternatively, for example, the traffic surveillance system in London whereby the public can be constantly scrutinised through the centrally monitored cameras that track people’s/vehicle’s movements on city streets. In his “sociologies of the future,” Urry maintains that one thing which might save us from descending into post-car civil chaos is a system governed by a “digital panopticon” mobility system. This would be governed by a nexus system “that orders, regulates, tracks and relatively soon would ‘drive’ each vehicle and monitor each driver/passenger” (Urry, “Connections” 33). The transformation of mobile technologies over the last decade has made car sharing, as a viable business model, possible. Through car sharing’s exploitation of an online booking system, and cars that can be tracked, monitored and traced, the seeds of a mobile “networked-subjectivity” are emerging. But it’s not just the technology people are embracing; a cultural shift is occurring in the way that people understand mobility, their own subjectivity, and more importantly, the role of cars. NETT Magazine did a feature on car sharing, and advertised it on their front cover as “GoGet’s web and mobile challenge to car owners” (May 2009). Car sharing seems to be able to tap into more contemporary understandings of what mobility and flexibility might mean in the twenty-first century. In their marketing and promotion tactics, car sharing organisations often discursively exploit science fiction terminology and generate a subjectivity much more dependent on networks and accessibility (158). In the suburbs people park their cars in garages. In car sharing, the vehicles are parked not in car bays or car parks, but in publically accessible ‘pods’, which promotes a futuristic, sci-fi experience. Even the phenomenological dimensions of swiping a smart card over the front of the windscreen to open the car engender a transformation in access to the car, instead of through a key. This is service-technology of the future while those stuck in car ownership are from the old economy and the “century of the car” (Gilroy). 
The connections between car sharing and the mobile phone and other communications technologies are part of the notion of a networked, accessible vehicle. However, the more problematic side to this is the car under surveillance. Nic Lowe, of the car sharing organisation GoGet, says: “Because you’re tagged on and we know it’s you, you are able to drive the car… every event you do is logged, so we know what time you turned the key, what time you turned it off and we know how far you drove … if a car is lost we can sound the horn to disable it remotely to prevent theft. We can track how fast you were going and even how fast you accelerated … track the kilometres for billing purposes and even find out when people are using the car when they shouldn’t be” (Mehlman 27). The possibility with the GPS technology installed in the car is being able to monitor the speeds at which people drive, thereby fining them for every minute spent going over the speed limit. While this conjures up the notion of the car under surveillance, it is also a much less bleak scenario than “a Hobbesian war of all against all”.
Conclusion: “Hundreds of Cars, No Garage”
The prospect of climate change is provoking innovation at a whole range of levels, as well as prompting a re-thinking of how we use taken-for-granted technologies. Sometime this century the one tonne, privately owned, petrol-driven car will become an artefact, much like Sydney trams did last century. At this point in time, car sharing can be regarded as an emerging transitional technology to a post-car society that provides a challenge to hegemonic automobile culture. It is evidently not a radical departure from the car’s vast machinic complex and still remains a part of what Urry calls the “system of automobility”. From a pro-car perspective, its networked surveillance places constraints on the free agency of the car, while for those of the deep green variety it is, no doubt, a compromise. Nevertheless, it provides a starting point for re-thinking the foundations of the privately-owned car. While Urry makes an important point in relation to a society moving from ownership to access, he doesn’t take into account the cultural shifts occurring that are enabling car sharing to be attractive to prospective members: the notion of networked subjectivities, and the discursive constructs used to establish car sharing as a thing of the future with pods and smart cards instead of garages and keys. If car sharing became mainstream it could have radical environmental impacts on things like urban space and pollution, as well as on the dominant culture of “automobile dependence” (Newman and Kenworthy), as Australia attempts to move to a low carbon economy.
Notes
[1] My partner Bruce Jeffreys, together with Nic Lowe, founded Newtown Car Share in 2002, which is now called GoGet. [2] Several layers down in the ‘About Us’ link on GoGet’s website is the following information about the environmental benefits of car sharing: “GoGet's aim is to provide a reliable, convenient and affordable transport service that: allows people to live car-free, decreases car usage, improves local air quality, removes private cars from local streets, increases patronage for public transport, allows people to lead more active lives” (http://www.goget.com.au/about-us.html).
References
The Australian. “Kevin Rudd Throws $6.2bn Lifeline to Car Industry.” 10 Nov. 2008. < http://www.theaustralian.news.com.au/business/story/0,28124,24628026-5018011,00.html >. Corrigan, Tim.
“Genre, Gender, and Hysteria: The Road Movie in Outer Space.” A Cinema Without Walls: Movies, Culture after Vietnam. New Jersey: Rutgers University Press, 1991. Dyer, Gwynne. Climate Wars. North Carlton: Scribe, 2008. Featherstone, Mike. “Automobilities: An Introduction.” Theory, Culture and Society 21.4-5 (2004): 1-24. Gilroy, Paul. “Driving while Black.” Car Cultures. Ed. Daniel Miller. Oxford: Berg, 2000. Hirsch, Michael. “Barack the Saviour.” Newsweek 13 Nov. 2008. < http://www.newsweek.com/id/168867 >. Lovelock, James. The Revenge of Gaia: Earth’s Climate Crisis and the Fate of Humanity. Penguin, 2007. Lovelock, James. The Vanishing Face of Gaia. Penguin, 2009. Mehlman, Josh. “Community Driven Success.” NETT Magazine (May 2009): 22-28. Morris, Meaghan. “Fate and the Family Sedan.” East West Film Journal 4.1 (1989): 113-134. Mouritz, Mike. “City Views.” Fast Thinking Winter 2009: 47-50. Newman, P., and J. Kenworthy. Sustainability and Cities: Overcoming Automobile Dependence. Washington DC: Island Press, 1999. Paterson, Matthew. Automobile Politics: Ecology and Cultural Political Economy. Cambridge: Cambridge University Press, 2007. Rothschilds, Emma. Paradise Lost: The Decline of the Auto-Industrial Age. New York: Random House, 1973. Rothschilds, Emma. “Can We Transform the Auto-Industrial Society?” New York Review of Books 56.3 (2009). < http://www.nybooks.com/articles/22333 >. Sheller, Mimi. “Automotive Emotions: Feeling the Car.” Theory, Culture and Society 21 (2004): 221–42. Simpson, Catherine. “Volatile Vehicles: When Women Take the Wheel.” Womenvision. Ed. Lisa French. Melbourne: Damned Publishing, 2003. 197-210. Urry, John. Sociology Beyond Societies: Mobilities for the 21st Century. London: Routledge, 2000. Urry, John. “Connections.” Environment and Planning D: Society and Space 22 (2004): 27-37. Urry, John. Mobilities. Cambridge and Malden, MA: Polity Press, 2008. Urry, John. “Climate Change, Travel and Complex Futures.” British Journal of Sociology 59.2 (2008): 261-279. Watts, Laura, and John Urry. “Moving Methods, Travelling Times.” Environment and Planning D: Society and Space 26 (2008): 860-874. Wutkowski, Karey. “Auto Execs' Private Flights to Washington Draw Ire.” Reuters News Agency 19 Nov. 2008. < http://www.reuters.com/article/newsOne/idUSTRE4AI8C520081119 >.
APA, Harvard, Vancouver, ISO, and other styles
39

Dwyer, Tim. "Transformations." M/C Journal 7, no. 2 (2004). http://dx.doi.org/10.5204/mcj.2339.

Full text
Abstract:
The Australian Government has been actively evaluating how best to merge the functions of the Australian Communications Authority (ACA) and the Australian Broadcasting Authority (ABA) for around two years now. Broadly, the reason for this is an attempt to keep pace with the communications media transformations we reduce to the term “convergence.” Mounting pressure for restructuring is emerging as a site of turf contestation: the possibility of a regulatory “one-stop shop” for governments (and some industry players) is an end game of considerable force. But, from a public interest perspective, the case for a converged regulator needs to make sense to audiences using various media, as well as in terms of arguments about global, industrial, and technological change. This national debate about the institutional reshaping of media regulation is occurring within a wider global context of transformations in social, technological, and politico-economic frameworks of open capital and cultural markets, including the increasing prominence of international economic organisations, corporations, and Free Trade Agreements (FTAs). Although the recently concluded FTA with the US explicitly carves out a right for Australian Governments to make regulatory policy in relation to existing and new media, considerable uncertainty remains as to future regulatory arrangements. A key concern is how a right to intervene in cultural markets will be sustained in the face of cultural, politico-economic, and technological pressures that are reconfiguring creative industries on an international scale. While the right to intervene was retained for the audiovisual sector in the FTA, by contrast, it appears that comparable unilateral rights to intervene will not operate for telecommunications, e-commerce or intellectual property (DFAT).
Blurring Boundaries
A lack of certainty for audiences is a by-product of industry change, and further blurs regulatory boundaries: new digital media content and overlapping delivery technologies are already a reality for Australia’s media regulators. These hypothetical media usage scenarios indicate how confusion over the appropriate regulatory agency may arise: 1. playing electronic games that use racist language; 2. being subjected to deceptive or misleading pop-up advertising online; 3. receiving messaged imagery on your mobile phone that offends, disturbs, or annoys; 4. watching a program like World Idol with SMS voting that subsequently raises charging or billing issues; or 5. watching a new “reality” TV program where products are being promoted with no explicit acknowledgement of the underlying commercial arrangements either during or at the end of the program. These are all instances where, theoretically, regulatory mechanisms are in place that allow individuals to complain and to seek some kind of redress as consumers and citizens. In the last scenario, in commercial television under the sector code, no clear-cut rules exist as to the precise form of the disclosure—as there are (from 2000) in commercial radio. It’s one of a number of issues the peak TV industry lobby Commercial TV Australia (CTVA) is considering in its review of the industry’s code of practice. CTVA has proposed an amendment to the code that will simply formalise the already existing practice. That is, commercial arrangements that assist in the making of a program should be acknowledged either during programs, or in their credits.
In my view, this amendment doesn’t go far enough in post “cash for comment” mediascapes (Dwyer). Audiences have a right to expect that broadcasters, production companies and program celebrities are open and transparent with the Australian community about these kinds of arrangements. They need to be far more clearly signposted, and people better informed about their role. In the US, the “Commercial Alert” <http://www.commercialalert.org/> organisation has been lobbying the Federal Communications Commission and the Federal Trade Commission to achieve similar in-program “visual acknowledgements.” The ABA’s Commercial Radio Inquiry (“Cash-for-Comment”) found widespread systemic regulatory failure and introduced three new standards. On that basis, how could a “standstill” response by CTVA constitute best practice for such a pervasive and influential medium as contemporary commercial television? The World Idol example may lead to confusion for some audiences, who are unsure whether the issues involved relate to broadcasting or telecommunications. In fact, it could be dealt with as a complaint to the Telecommunication Industry Ombudsman (TIO) under an ACA-registered, but Australian Communications Industry Forum (ACIF) developed, code of practice. These kinds of cross-platform issues may become more vexed in future years from an audience’s perspective, especially if reality formats using on-screen premium rate service numbers invite audiences to participate by sending MMS (multimedia messaging service) images or short video grabs over wireless networks. The political and cultural implications of this kind of audience interaction, in terms of access, participation, and more generally the symbolic power of media, may perhaps even indicate a longer-term shift in relations with consumers and citizens. In the Internet example, the Australian Competition and Consumer Commission’s (ACCC) Internet advertising jurisdiction would apply—not the ABA’s “co-regulatory” Internet content regime as some may have thought. Although the ACCC deals with complaints relating to Internet advertising, there won’t be much traction for them in a more complex issue that also includes, say, racist or religious bigotry. The DVD example would probably fall between the remits of the Office of Film and Literature Classification’s (OFLC) new “convergent” Guidelines for the Classification of Film and Computer Games and race discrimination legislation administered by the Human Rights and Equal Opportunity Commission (HREOC). The OFLC’s National Classification Scheme is really geared to provide consumer advice on media products that contain sexual and violent imagery or coarse language, rather than issues of racist language. And it’s unlikely that a single person would have the locus standi to even apply for a reclassification. It may fall within the jurisdiction of the HREOC depending on whether it was played in public or not. Even then it would probably be considered exempt on free speech grounds as an “artistic work.” Unsolicited, potentially illegal, content transmitted via mobile wireless devices, in particular 3G phones, provides another example of content that falls between the media regulation cracks. It illustrates a potential content policy “turf grab” too. Image-enabled mobile phones create a variety of novel issues for content producers, network operators, regulators, parents and viewers. There is no one government media authority or agency with a remit to deal with this issue.
Although it has elements relating to the regulatory activities of the ACA, the ABA, the OFLC, the TIO, and TISSC, the combination of illegal or potentially prohibited content and its carriage over wireless networks positions it outside their current frameworks. The ACA may argue it should have responsibility for this kind of content since: it now enforces the recently enacted Commonwealth anti-Spam laws; has registered an industry code of practice for unsolicited content delivered over wireless networks; is seeking to include ‘adult’ content within premium rate service numbers, and, has been actively involved in consumer education for mobile telephony. It has also worked with TISSC and the ABA in relation to telephone sex information services over voice networks. On the other hand, the ABA would probably argue that it has the relevant expertise for regulating wirelessly transmitted image-content, arising from its experience of Internet and free and subscription TV industries, under co-regulatory codes of practice. The OFLC can also stake its claim for policy and compliance expertise, since the recently implemented Guidelines for Classification of Film and Computer Games were specifically developed to address issues of industry convergence. These Guidelines now underpin the regulation of content across the film, TV, video, subscription TV, computer games and Internet sectors. Reshaping Institutions Debates around the “merged regulator” concept have occurred on and off for at least a decade, with vested interests in agencies and the executive jockeying to stake claims over new turf. On several occasions the debate has been given renewed impetus in the context of ruling conservative parties’ mooted changes to the ownership and control regime. It’s tended to highlight demarcations of remit, informed as they are by historical and legal developments, and the gradual accretion of regulatory cultures. Now the key pressure points for regulatory change include the mere existence of already converged single regulatory structures in those countries with whom we tend to triangulate our policy comparisons—the US, the UK and Canada—increasingly in a context of debates concerning international trade agreements; and, overlaying this, new media formats and devices are complicating existing institutional arrangements and legal frameworks. The Department of Communications, Information Technology & the Arts’s (DCITA) review brief was initially framed as “options for reform in spectrum management,” but was then widened to include “new institutional arrangements” for a converged regulator, to deal with visual content in the latest generation of mobile telephony, and other image-enabled wireless devices (DCITA). No other regulatory agencies appear, at this point, to be actively on the Government’s radar screen (although they previously have been). Were the review to look more inclusively, the ACCC, the OFLC and the specialist telecommunications bodies, the TIO and the TISSC may also be drawn in. Current regulatory arrangements see the ACA delegate responsibility for broadcasting services bands of the radio frequency spectrum to the ABA. In fact, spectrum management is the turf least contested by the regulatory players themselves, although the “convergent regulator” issue provokes considerable angst among powerful incumbent media players. The consensus that exists at a regulatory level can be linked to the scientific convention that holds the radio frequency spectrum is a continuum of electromagnetic bands. 
In this view, it becomes artificial to sever broadcasting, as “broadcasting services bands”, from the other remaining highly diverse communications uses, as occurred from 1992 when the Broadcasting Services Act was introduced. The prospect of new forms of spectrum charging is highly alarming for commercial broadcasters. In a joint submission to the DCITA review, the peak TV and radio industry lobby groups have indicated they will fight tooth and nail to resist new regulatory arrangements that would see a move away from the existing licence fee arrangements. These are paid as a sliding scale percentage of gross earnings that, it has been argued by Julian Thomas and Marion McCutcheon, “do not reflect the amount of spectrum used by a broadcaster, do not reflect the opportunity cost of using the spectrum, and do not provide an incentive for broadcasters to pursue more efficient ways of delivering their services” (6). An economic rationalist logic underpins pressure to modify the spectrum management (and charging) regime, and undoubtedly contributes to the commercial broadcasting industry’s general paranoia about reform. Total revenues collected by the ABA and the ACA between 1997 and 2002 were, respectively, $1423 million and $3644.7 million. Of these sums, using auction mechanisms, the ABA collected $391 million, while the ACA collected some $3 billion. The sale of spectrum that will be returned to the Commonwealth by television broadcasters when analog spectrum is eventually switched off, around the end of the decade, is a salivating prospect for Treasury officials. The large sums that have been successfully raised by the ACA boost its position in planning discussions for the convergent media regulatory agency. The way in which media outlets and regulators respond to publics is an enduring question for a democratic polity, irrespective of how the product itself has been mediated and accessed. Media regulation and civic responsibility, including frameworks for negotiating consumer and citizen rights, are fundamental democratic rights (Keane; Tambini). The ABA’s Commercial Radio Inquiry (‘cash for comment’) has also reminded us that regulatory frameworks are important at the level of corporate conduct, as well as how they negotiate relations with specific media audiences (Johnson; Turner; Gordon-Smith). Building publicly meaningful regulatory frameworks will be demanding: relationships with audiences are often complex as people are constructed as both consumers and citizens, through marketised media regulation and institutions and, more recently, through hybridising program formats (Murdock and Golding; Lumby and Probyn). In TV, we’ve seen the growth of infotainment formats blending entertainment and informational aspects of media consumption. At a deeper level, changes in the regulatory landscape are symptomatic of broader tectonic shifts in the discourses of governance in advanced information economies from the late 1980s onwards, where deregulatory agendas created an increasing reliance on free market, business-oriented solutions to regulation. “Co-regulation” and “self-regulation” became the preferred mechanisms to more direct state control. Yet, curiously contradicting these market transformations, we continue to witness recurring instances of direct intervention on the basis of censorship rationales (Dwyer and Stockbridge). That digital media content is “converging” between different technologies and modes of delivery is the norm in “new media” regulatory rhetoric.
Others critique “visions of techno-glory,” arguing instead for a view that sees fundamental continuities in media technologies (Winston). But the socio-cultural impacts of new media developments surround us: the introduction of multichannel digital and interactive TV (in free-to-air and subscription variants); broadband access in the office and home; wirelessly delivered content and mobility, and, as Jock Given notes, around the corner, there’s the possibility of “an Amazon.Com of movies-on-demand, with the local video and DVD store replaced by online access to a distant server” (90). Taking a longer view of media history, these changes can be seen to be embedded in the global (and local) “innovation frontier” of converging digital media content industries and its transforming modes of delivery and access technologies (QUT/CIRAC/Cutler & Co). The activities of regulatory agencies will continue to be a source of policy rivalry and turf contestation until such time as a convergent regulator is established to the satisfaction of key players. However, there are risks that the benefits of institutional reshaping will not be readily available for either audiences or industry. In the past, the idea that media power and responsibility ought to coexist has been recognised in both the regulation of the media by the state, and the field of communications media analysis (Curran and Seaton; Couldry). But for now, as media industries transform, whatever the eventual institutional configuration, the evolution of media power in neo-liberal market mediascapes will challenge the ongoing capacity for interventions by national governments and their agencies. Works Cited Australian Broadcasting Authority. Commercial Radio Inquiry: Final Report of the Australian Broadcasting Authority. Sydney: ABA, 2000. Australian Communications Information Forum. Industry Code: Short Message Service (SMS) Issues. Dec. 2002. 8 Mar. 2004 <http://www.acif.org.au/__data/page/3235/C580_Dec_2002_ACA.pdf >. Commercial Television Australia. Draft Commercial Television Industry Code of Practice. Aug. 2003. 8 Mar. 2004 <http://www.ctva.com.au/control.cfm?page=codereview&pageID=171&menucat=1.2.110.171&Level=3>. Couldry, Nick. The Place of Media Power: Pilgrims and Witnesses of the Media Age. London: Routledge, 2000. Curran, James, and Jean Seaton. Power without Responsibility: The Press, Broadcasting and New Media in Britain. 6th ed. London: Routledge, 2003. Dept. of Communication, Information Technology and the Arts. Options for Structural Reform in Spectrum Management. Canberra: DCITA, Aug. 2002. ---. Proposal for New Institutional Arrangements for the ACA and the ABA. Aug. 2003. 8 Mar. 2004 <http://www.dcita.gov.au/Article/0,,0_1-2_1-4_116552,00.php>. Dept. of Foreign Affairs and Trade. Australia-United States Free Trade Agreement. Feb. 2004. 8 Mar. 2004 <http://www.dfat.gov.au/trade/negotiations/us_fta/outcomes/11_audio_visual.php>. Dwyer, Tim. Submission to Commercial Television Australia’s Review of the Commercial Television Industry’s Code of Practice. Sept. 2003. Dwyer, Tim, and Sally Stockbridge. “Putting Violence to Work in New Media Policies: Trends in Australian Internet, Computer Game and Video Regulation.” New Media and Society 1.2 (1999): 227-49. Given, Jock. America’s Pie: Trade and Culture After 9/11. Sydney: U of NSW P, 2003. Gordon-Smith, Michael. “Media Ethics After Cash-for-Comment.” The Media and Communications in Australia. Ed. Stuart Cunningham and Graeme Turner. Sydney: Allen and Unwin, 2002. Johnson, Rob. 
Cash-for-Comment: The Seduction of Journo Culture. Sydney: Pluto, 2000. Keane, John. The Media and Democracy. Cambridge: Polity, 1991. Lumby, Cathy, and Elspeth Probyn, eds. Remote Control: New Media, New Ethics. Melbourne: Cambridge UP, 2003. Murdock, Graham, and Peter Golding. “Information Poverty and Political Inequality: Citizenship in the Age of Privatized Communications.” Journal of Communication 39.3 (1991): 180-95. QUT, CIRAC, and Cutler & Co. Research and Innovation Systems in the Production of Digital Content and Applications: Report for the National Office for the Information Economy. Canberra: Commonwealth of Australia, Sept. 2003. Tambini, Damian. Universal Access: A Realistic View. IPPR/Citizens Online Research Publication 1. London: IPPR, 2000. Thomas, Julian, and Marion McCutcheon. “Is Broadcasting Special? Charging for Spectrum.” Conference paper. ABA conference, Canberra. May 2003. Turner, Graeme. “Talkback, Advertising and Journalism: A Cautionary Tale of Self-Regulated Radio.” International Journal of Cultural Studies 3.2 (2000): 247-255. ---. “Reshaping Australian Institutions: Popular Culture, the Market and the Public Sphere.” Culture in Australia: Policies, Publics and Programs. Ed. Tony Bennett and David Carter. Melbourne: Cambridge UP, 2001. Winston, Brian. Media, Technology and Society: A History from the Telegraph to the Internet. London: Routledge, 1998. Web Links http://www.aba.gov.au http://www.aca.gov.au http://www.accc.gov.au http://www.acif.org.au http://www.adma.com.au http://www.ctva.com.au http://www.crtc.gc.ca http://www.dcita.com.au http://www.dfat.gov.au http://www.fcc.gov http://www.ippr.org.uk http://www.ofcom.org.uk http://www.oflc.gov.au Links http://www.commercialalert.org/ Citation reference for this article MLA Style Dwyer, Tim. "Transformations." M/C: A Journal of Media and Culture 7 (2004). <http://www.media-culture.org.au/0403/06-transformations.php>. APA Style Dwyer, T. (2004, Mar. 17). Transformations. M/C: A Journal of Media and Culture, 7, <http://www.media-culture.org.au/0403/06-transformations.php>.
APA, Harvard, Vancouver, ISO, and other styles
40

Cham, Karen, and Jeffrey Johnson. "Complexity Theory." M/C Journal 10, no. 3 (2007). http://dx.doi.org/10.5204/mcj.2672.

Full text
Abstract:

 
 
Complex systems are an invention of the universe. It is not at all clear that science has an a priori primacy claim to the study of complex systems. (Galanter 5) Introduction In popular dialogues, describing a system as “complex” is often the point of resignation, implying that the system cannot be sufficiently described, predicted or managed. Transport networks, management infrastructure and supply chain logistics are all often described in this way. In socio-cultural terms, “complex” is used to describe those humanistic systems that are “intricate, involved, complicated, dynamic, multi-dimensional, interconnected systems [such as] transnational citizenship, communities, identities, multiple belongings, overlapping geographies and competing histories” (Cahir & James). Academic dialogues have begun to explore the collective behaviours of complex systems to define a complex system specifically as an adaptive one; i.e. a system that demonstrates ‘self-organising’ principles and ‘emergent’ properties. Based upon the key principles of interaction and emergence in relation to adaptive and self-organising systems in cultural artifacts and processes, this paper will argue that complex systems are cultural systems. By introducing generic principles of complex systems, and looking at the exploration of such principles in art, design and media research, this paper argues that a science of cultural systems as part of complex systems theory is the postmodern science for the digital age. Furthermore, it argues that such a science was predicated by post-structuralism and has been manifest in art, design and media practice since the late 1960s. Complex Systems Theory Complexity theory grew out of systems theory, an holistic approach to analysis that views whole systems based upon the links and interactions between the component parts and their relationship to each other and to the environment within which they exist. This stands in stark contrast to conventional science, which is based upon Descartes’s reductionism, where the aim is to analyse systems by reducing something to its component parts (Wilson 3). As systems thinking is concerned with relationships more than elements, it proposes that in complex systems, small catalysts can cause large changes and that a change in one area of a system can adversely affect another area of the system. As is apparent, systems theory is a way of thinking rather than a specific set of rules, and similarly there is no single unified Theory of Complexity, but several different theories have arisen from the natural sciences, mathematics and computing. As such, the study of complex systems is very interdisciplinary and encompasses more than one theoretical framework. Whilst key ideas of complexity theory developed through artificial intelligence and robotics research, other important contributions came from thermodynamics, biology, sociology, physics, economics and law. In her volume for the Elsevier Advanced Management Series, “Complex Systems and Evolutionary Perspectives on Organisations”, Eve Mitleton-Kelly provides a comprehensive overview of this evolution as five main areas of research: complex adaptive systems; dissipative structures; autopoiesis (non-equilibrium) social systems; chaos theory; and path dependence. Here, Mitleton-Kelly points out that relatively little work has been done on developing a specific theory of complex social systems, despite much interest in complexity and its application to management (Mitleton-Kelly 4).
To this end, she goes on to define the term “complex evolving system” as more appropriate to the field than ‘complex adaptive system’ and suggests that the term “complex behaviour” is thus more useful in social contexts (Mitleton-Kelly). For our purpose here, “complex systems” will be the general term used to describe those systems that are diverse and made up of multiple interdependent elements, and that are often ‘adaptive’, in that they have the capacity to change and learn from events. This is in itself both ‘evolutionary’ and ‘behavioural’ and can be understood as emerging from the interaction of autonomous agents – especially people. Some generic principles of complex systems defined by Mitleton-Kelly that are of concern here are: self-organisation; emergence; interdependence; feedback; the space of possibilities; co-evolution; and the creation of new order. Whilst the behaviours of complex systems clearly do not fall into our conventional top-down perception of management and production, anticipating such behaviours is becoming more and more essential for products, processes and policies. For example, compare the traditional top-down model of news generation, distribution and consumption to the “emerging media eco-system” (Bowman and Willis 14). [Figure 1 (Bowman & Willis 10) and Figure 2 (Bowman & Willis 12) not reproduced here.] To the traditional news organisations, such a “democratization of production” (McLuhan 230) has been a huge cause for concern. The agencies once solely responsible for the representation of reality are now lost in a global miasma of competing perspectives. Can we anticipate and account for complex behaviours? Eve Mitleton-Kelly states that “if organisations are understood as complex evolving systems co-evolving as part of a social ‘ecosystem’, then that changed perspective changes ways of acting and relating which lead to a different way of working. Thus, management strategy changes, and our organizational design paradigms evolve as new types of relationships and ways of working provide the conditions for the emergence of new organisational forms” (Mitleton-Kelly 6). Complexity in Design It is thus through design practice and processes that discovering methods for anticipating complex systems behaviours seems most possible. The Embracing Complexity in Design (ECiD) research programme is a contemporary interdisciplinary research cluster consisting of academics and designers from architectural engineering, robotics, geography, digital media, sustainable design, and computing, aiming to explore the possibility of transdisciplinary principles of complexity in design. Overarching this work is the conviction that design can be seen as a model for complex systems researchers motivated by applying complexity science in particular domains. Key areas in which design and complexity interact have been established by this research cluster. Most immediately, many designed products and systems are inherently complex to design in the ordinary sense. For example, when designing vehicles, architecture or microchips, designers need to understand the complex dynamic processes used to fabricate and manufacture products and systems. The social and economic context of design is also complex, from market economics and legal regulation to social trends and mass culture. The process of designing can also involve complex social dynamics, with many people processing and exchanging complex heterogeneous information over complex human and communication networks, in the context of many changing constraints.
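As a purely illustrative aside, not drawn from the article itself: the sort of designed-yet-emergent behaviour the abstract describes can be sketched in a few lines of code. An elementary cellular automaton is a deliberately specified rule set whose global behaviour nonetheless cannot be read off the rules in advance. The rule number, function names and output format below are arbitrary choices made for this sketch.

# A minimal sketch, assuming an elementary cellular automaton (Wolfram-style rule numbering)
# as a stand-in for the rule-based generative systems discussed above. Not from the article.

def step(cells, rule=30):
    """Apply one synchronous update; each cell looks only at itself and its two neighbours."""
    n = len(cells)
    out = []
    for i in range(n):
        left, centre, right = cells[(i - 1) % n], cells[i], cells[(i + 1) % n]
        neighbourhood = (left << 2) | (centre << 1) | right   # a value from 0 to 7
        out.append((rule >> neighbourhood) & 1)               # look up the designed rule bit
    return out

def run(width=79, steps=30, rule=30):
    cells = [0] * width
    cells[width // 2] = 1          # a single seed cell inside the designed space of possibilities
    for _ in range(steps):
        print("".join("#" if c else "." for c in cells))
        cells = step(cells, rule)

if __name__ == "__main__":
    run()   # the global pattern that appears is nowhere stated in the local rule table itself

Nothing in the eight-entry rule table describes the irregular, triangular structures that appear when the program runs; in that narrow sense the designed parameters delineate, without dictating, what emerges.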
Current key research questions are: how can the methods of complex systems science inform designers? How can design inform research into complex systems? Whilst ECiD acknowledges that to answer such questions effectively the theoretical and methodological relations between complexity science and design need further exploration and enquiry, there are no reliable precedents for such an activity across the sciences and the arts in general. Indeed, even in areas where a convergence of humanities methodology with scientific practice might seem to be most pertinent, examples are few and far between. In his paper “Post Structuralism, Hypertext & the World Wide Web”, Luke Tredinnick states that “despite the concentration of post-structuralism on text and texts, the study of information has largely failed to exploit post-structuralist theory” (Tredinnick 5). Yet it is surely in the convergence of art and design with computation and the media that a search for practical trans-metadisciplinary methodologies might be most fruitful. It is in design for interactive media, where algorithms meet graphics, where the user can interact, adapt and amend, that self-organisation, emergence, interdependence, feedback, the space of possibilities, co-evolution and the creation of new order are embraced on a day-to-day basis by designers. A digitally interactive environment such as the World Wide Web clearly demonstrates all the key aspects of a complex system. Indeed, it has already been described as a ‘complexity machine’ (Qvortup 9). It is important to remember that this ‘complexity machine’ has been designed. It is an intentional facility. It may display all the characteristics of complexity but, whilst some of its attributes are most demonstrative of self-organisation and emergence, the Internet itself has not emerged spontaneously. For example, Tredinnick details the evolution of the World Wide Web through the Memex machine of Vannevar Bush, through Ted Nelson’s hypertext system Xanadu, to Tim Berners-Lee’s Enquire (Tredinnick 3). The Internet was engineered. So, whilst we may not be able to entirely predict complex behaviour, we can, and do, quite clearly design for it. When designing digitally interactive artifacts we design parameters or coordinates to define the space within which a conceptual process will take place. We can never begin to predict precisely what those processes might become through interaction, emergence and self-organisation, but we can establish conceptual parameters that guide and delineate the space of possibilities. Indeed this fact is so transparently obvious that many commentators in the humanities have been pushed to remark that interaction is merely interpretation, and so-called new media is not new at all; that one interacts with a book in much the same way as with a digital artifact. After all, post-structuralist theory had established the “death of the author” in the 1970s – the a priori that all cultural artifacts are open to interpretation, where all meanings must be completed by the reader. The concept of the “open work” (Eco 6) has been an established postmodern concept for over 30 years and is commonly recognised as a feature of surrealist montage, poetry, the writings of James Joyce, even advertising design, where a purposive space for engagement and interpretation of a message is designated, without which the communication does not “work”.
However, this concept is also most successfully employed in relation to installation art and, more recently, interactive art as a reflection of the artist’s conscious decision to leave part of a work open to interpretation and/or interaction. Art & Complex Systems One of the key projects of Embracing Complexity in Design has been to look at the relationship between art and complex systems. There is a relatively well-established history of exploring art objects as complex systems in themselves that finds its origins in the systems art movement of the 1970s. In his paper “Observing ‘Systems Art’ from a Systems-Theoretical Perspective”, Francis Halsall defines systems art as “emerging in the 1960s and 1970s as a new paradigm in artistic practice … displaying an interest in the aesthetics of networks, the exploitation of new technology and New Media, unstable or de-materialised physicality, the prioritising of non-visual aspects, and an engagement (often politicised) with the institutional systems of support (such as the gallery, discourse, or the market) within which it occurs” (Halsall 7). More contemporarily, “Open Systems: Rethinking Art c.1970”, at Tate Modern, London, focuses upon systems artists’ “rejection of art’s traditional focus on the object, to wide-ranging experiments with media that included dance, performance and…film & video” (De Salvo 3). Artists include Andy Warhol, Richard Long, Gilbert & George, Sol LeWitt, Eva Hesse and Bruce Nauman. In 2002, the Samuel Dorsky Museum of Art, New York, held an international exhibition entitled “Complexity: Art & Complex Systems”, which was concerned with “art as a distinct discipline offer[ing] its own unique approache[s] and epistemic standards in the consideration of complexity” (Galanter and Levy 5), and the organisers go on to describe four ways in which artists engage the realm of complexity: presentations of natural complex phenomena that transcend conventional scientific visualisation; descriptive systems which describe complex systems in an innovative and often idiosyncratic way; commentary on complexity science itself; and technical applications of genetic algorithms, neural networks and a-life. ECiD artist Julian Burton makes work that visualises how companies operate in specific relation to their approach to change and innovation. He is a strategic artist and facilitator who makes “pictures of problems to help people talk about them” (Burton). Clients include public and private sector organisations such as Barclays, Shell, Prudential, KPMG and the NHS. He is quoted as saying “Pictures are a powerful way to engage and focus a group’s attention on crucial issues and challenges, and enable them to grasp complex situations quickly. I try and create visual catalysts that capture the major themes of a workshop, meeting or strategy and re-present them in an engaging way to provoke lively conversations” (Burton). This is a simple and direct method of using art as a knowledge elicitation tool that falls into the first and second categories above. The third category is demonstrated by the groundbreaking TechnoSphere, which was specifically inspired by complexity theory, landscape and artificial life. Launched in 1995 as an Arts Council-funded online digital environment, it was created by Jane Prophet and Gordon Selley. TechnoSphere is a virtual world, populated by artificial life forms created by users of the World Wide Web.
The digital ecology of the 3D world, housed on a server, depends on the participation of an on-line public who accesses the world via the Internet. At the time of writing it has attracted over a 100,000 users who have created over a million creatures. The artistic exploration of technical applications is by default a key field for researching the convergence of trans-metadisciplinary methodologies. Troy Innocent’s lifeSigns evolves multiple digital media languages “expressed as a virtual world – through form, structure, colour, sound, motion, surface and behaviour” (Innocent). The work explores the idea of “emergent language through play – the idea that new meanings may be generated through interaction between human and digital agents”. Thus this artwork combines three areas of converging research – artificial life; computational semiotics and digital games. In his paper “What Is Generative Art? Complexity Theory as a Context for Art Theory”, Philip Galanter describes all art as generative on the basis that it is created from the application of rules. Yet, as demonstrated above, what is significantly different and important about digital interactivity, as opposed to its predecessor, interpretation, is its provision of a graphical user interface (GUI) to component parts of a text such as symbol, metaphor, narrative, etc for the multiple “authors” and the multiple “readers” in a digitally interactive space of possibility. This offers us tangible, instantaneous reproduction and dissemination of interpretations of an artwork. Conclusion: Digital Interactivity – A Complex Medium Digital interaction of any sort is thus a graphic model of the complex process of communication. Here, complexity does not need deconstructing, representing nor modelling, as the aesthetics (as in apprehended by the senses) of the graphical user interface conveniently come first. Design for digital interactive media is thus design for complex adaptive systems. The theoretical and methodological relations between complexity science and design can clearly be expounded especially well through post-structuralism. The work of Barthes, Derrida & Foucault offers us the notion of all cultural artefacts as texts or systems of signs, whose meanings are not fixed but rather sustained by networks of relationships. Implemented in a digital environment post-structuralist theory is tangible complexity. Strangely, whilst Philip Galanter states that science has no necessary over reaching claim to the study of complexity, he then argues conversely that “contemporary art theory rooted in skeptical continental philosophy [reduces] art to social construction [as] postmodernism, deconstruction and critical theory [are] notoriously elusive, slippery, and overlapping terms and ideas…that in fact [are] in the business of destabilising apparently clear and universal propositions” (4). This seems to imply that for Galanter, post modern rejections of grand narratives necessarily will exclude the “new scientific paradigm” of complexity, a paradigm that he himself is looking to be universal. Whilst he cites Lyotard (6) describing both political and linguistic reasons why postmodern art celebrates plurality, denying any progress towards singular totalising views, he fails to appreciate what happens if that singular totalising view incorporates interactivity? Surely complexity is pluralistic by its very nature? 
In the same vein, if language for Derrida is “an unfixed system of traces and differences … regardless of the intent of the authored texts … with multiple equally legitimate meanings” (Galanter 7), then I have heard no better description of the signifiers, signifieds, connotations and denotations of digital culture. Complexity in its entirety can also be conversely understood as the impact of digital interactivity upon culture per se, which has a complex causal relation in itself; Qvortup’s notion of a “communications event” (9), such as the Danish publication of the Mohammed cartoons, falls into this category. Yet a complex causality could be traced further into cultural processes, enlightening media theory: from the relationship between advertising campaigns and brand development, to the exposure and trajectory of the celebrity, to describing the evolution of visual language in media cultures and informing the relationship between exposure to representation and behaviour. In digital interaction the terms art, design and media converge into a process-driven, performative event that demonstrates emergence through autopoietic processes within a designated space of possibility. By insisting that all artwork is generative, Galanter, like many other writers, negates the medium entirely, which allows him to insist that generative art is “ideologically neutral” (Galanter 10). Generative art, like all digitally interactive artifacts, is not neutral but rather ideologically plural. Thus, if one integrates Qvortup’s (8) delineation of medium theory and complexity theory, we may have what we need: a first theory of a complex medium. Through interactive media, complexity theory is the first postmodern science; the first science of culture. References Bowman, Shane, and Chris Willis. We Media. 21 Sep. 2003. 9 March 2007 <http://www.hypergene.net/wemedia/weblog.php>. Burton, Julian. “Hedron People.” 9 March 2007 <http://www.hedron.com/network/assoc.php4?associate_id=14>. Cahir, Jayde, and Sarah James. “Complex: Call for Papers.” M/C Journal 9 Sep. 2006. 7 March 2007 <http://journal.media-culture.org.au/journal/upcoming.php>. De Salvo, Donna, ed. Open Systems: Rethinking Art c. 1970. London: Tate Gallery Press, 2005. Eco, Umberto. The Open Work. Cambridge, Mass.: Harvard UP, 1989. Galanter, Philip, and Ellen K. Levy. Complexity: Art & Complex Systems. SDMA Gallery Guide, 2002. Galanter, Philip. “Against Reductionism: Science, Complexity, Art & Complexity Studies.” 2003. 9 March 2007 <http://isce.edu/ISCE_Group_Site/web-content/ISCE_Events/Norwood_2002/Norwood_2002_Papers/Galanter.pdf>. Halsall, Francis. “Observing ‘Systems-Art’ from a Systems-Theoretical Perspective”. CHArt 2005. 9 March 2007 <http://www.chart.ac.uk/chart2005/abstracts/halsall.htm>. Innocent, Troy. “Life Signs.” 9 March 2007 <http://www.iconica.org/main.htm>. Johnson, Jeffrey. “Embracing Complexity in Design (ECiD).” 2007. 9 March 2007 <http://www.complexityanddesign.net/>. Lyotard, Jean-Francois. The Postmodern Condition. Manchester: Manchester UP, 1984. McLuhan, Marshall. The Gutenberg Galaxy: The Making of Typographic Man. Toronto: U of Toronto P, 1962. Mitleton-Kelly, Eve, ed. Complex Systems and Evolutionary Perspectives on Organisations. Elsevier Advanced Management Series, 2003. Prophet, Jane. “Jane Prophet.” 9 March 2007 <http://www.janeprophet.co.uk/>. Qvortup, Lars. “Understanding New Digital Media.” European Journal of Communication 21.3 (2006): 345-356. Tredinnick, Luke.
“Post Structuralism, Hypertext & the World Wide Web.” Aslib Proceedings 59.2 (2007): 169-186. Wilson, Edward Osborne. Consilience: The Unity of Knowledge. New York: A.A. Knopf, 1998.
 
 
 
 Citation reference for this article
 
 MLA Style
 Cham, Karen, and Jeffrey Johnson. "Complexity Theory: A Science of Cultural Systems?" M/C Journal 10.3 (2007). <http://journal.media-culture.org.au/0706/08-cham-johnson.php>. APA Style
 Cham, K., and J. Johnson. (Jun. 2007) "Complexity Theory: A Science of Cultural Systems?" M/C Journal, 10(3). Retrieved from <http://journal.media-culture.org.au/0706/08-cham-johnson.php>. 
APA, Harvard, Vancouver, ISO, and other styles
41

Grossman, Michele. "Prognosis Critical: Resilience and Multiculturalism in Contemporary Australia." M/C Journal 16, no. 5 (2013). http://dx.doi.org/10.5204/mcj.699.

Full text
Abstract:
Introduction Most developed countries, including Australia, have a strong focus on national, state and local strategies for emergency management and response in the face of disasters and crises. This framework can include coping with catastrophic dislocation, service disruption, injury or loss of life in the face of natural disasters such as major fires, floods, earthquakes or other large-impact natural events, as well as dealing with similar catastrophes resulting from human actions such as bombs, biological agents, cyber-attacks targeting essential services such as communications networks, or other crises affecting large populations. Emergency management frameworks for crisis and disaster response are distinguished by their focus on the domestic context for such events; that is, how to manage and assist the ways in which civilian populations, who are for the most part inexperienced and untrained in dealing with crises and disasters, are able to respond and behave in such situations so as to minimise the impacts of a catastrophic event. Even in countries like Australia that demonstrate a strong public commitment to cultural pluralism and social cohesion, ethno-cultural diversity can be seen as a risk or threat to national security and values at times of political, natural, economic and/or social tensions and crises. Australian government policymakers have recently focused, with increasing intensity, on “community resilience” as a key element in countering extremism and enhancing emergency preparedness and response. In some sense, this is the result of a tacit acknowledgement by government agencies that there are limits to what they can do for domestic communities should such a catastrophic event occur, and accordingly, the focus in recent times has shifted to how governments can best help people to help themselves in such situations, a key element of the contemporary “resilience” approach. Yet despite the robustly multicultural nature of Australian society, explicit engagement with Australia’s cultural diversity flickers only fleetingly on this agenda, which continues to pursue approaches to community resilience in the absence of understandings about how these terms and formations may themselves need to be diversified to maximise engagement by all citizens in a multicultural polity. There have been some recent efforts in Australia to move in this direction, for example the Australian Emergency Management Institute (AEMI)’s recent suite of projects with culturally and linguistically diverse (CALD) communities (2006-2010) and the current Australia-New Zealand Counter-Terrorism Committee-supported project on “Harnessing Resilience Capital in Culturally Diverse Communities to Counter Violent Extremism” (Grossman and Tahiri), which I discuss in a longer forthcoming version of this essay (Grossman). Yet the understanding of ethno-cultural identity and difference that underlies much policy thinking on resilience remains problematic for the way in which it invests in a view of the cultural dimensions of community resilience as relic rather than resource – valorising the preservation of and respect for cultural norms and traditions, but silent on what different ethno-cultural communities might contribute toward expanded definitions of both “community” and “resilience” by virtue of the transformative potential and existing cultural capital they bring with them into new national and also translocal settings. 
For example, a primary conclusion of the joint program between AEMI and the Australian Multicultural Commission is that CALD communities are largely “vulnerable” in the context of disasters and emergency management and need to be better integrated into majority-culture models of theorising and embedding community resilience. This focus on stronger national integration and the “vulnerability” of culturally diverse ethno-cultural communities in the Australian context echoes the work of scholars beyond Australia such as McGhee, Mouritsen (Reflections, Citizenship) and Joppke. They argue that the “civic turn” in debates around resurgent contemporary nationalism and multicultural immigration policies privileges civic integration over genuine two-way multiculturalism. This approach sidesteps the transculturational (Ortiz; Welsch; Mignolo; Benessaieh; Robins; Stein) aspects of contemporary social identities and exchange by paying lip-service to cultural diversity while affirming a neo-liberal construct of civic values and principles as a universalising goal of Western democratic states within a global market economy. It also suggests a superficial tribute to cultural diversity that does not embed diversity comprehensively at the levels of either conceptualising or resourcing different elements of Australian transcultural communities within the generalised framework of “community resilience.” And by emphasising cultural difference as vulnerability rather than as resource or asset, it fails to acknowledge the varieties of resilience capital that many culturally diverse individuals and communities may bring with them when they resettle in new environments, by ignoring the question of what “resilience” actually means to those from culturally diverse communities. In so doing, it also avoids the critical task of incorporating intercultural definitional diversity around the concepts of both “community” and “resilience” used to promote social cohesion and the capacity to recover from disasters and crises. How might we think differently about the broader challenges for multiculturalism itself as a resilient transnational concept and practice? The Concept of Resilience The meanings of resilience vary by disciplinary perspective. While there is no universally accepted definition of the concept, it is widely acknowledged that resilience refers to the capacity of an individual to do well in spite of exposure to acute trauma or sustained adversity (Liebenberg 219). Originating in the Latin word resilio, meaning ‘to jump back’, there is general consensus that resilience pertains to an individual’s, community’s or system’s ability to adapt to and ‘bounce back’ from a disruptive event (Mohaupt 63, Longstaff et al. 3). Over the past decade there has been a dramatic rise in interest in the clinical, community and family sciences concerning resilience to a broad range of adversities (Weine 62). While debate continues over which discipline can be credited with first employing resilience as a concept, Mohaupt argues that most of the literature on resilience cites social psychology and psychiatry as the origin for the concept beginning in the mid-20th century. The pioneer researchers of what became known as resilience research studied the impact on children of living in dysfunctional families. For example, the findings of work by Garmezy, Werner and Smith, and Rutter showed that about one third of children in these studies were coping very well despite considerable adversities and traumas.
In asking what it was that prevented the children in their research from being negatively influenced by their home environments, such research provided the basis for future research on resilience. Such work was also ground-breaking for identifying the so-called ‘protective factors’ or resources that individuals can operationalise when dealing with adversity. In essence, protective factors are those conditions in the individual that protect them from the risk of dysfunction and enable recovery from trauma. They mitigate the effects of stressors or risk factors, that is, those conditions that predispose one to harm (Hajek 15). Protective factors include the inborn traits or qualities within an individual, those defining an individual’s environment, and also the interaction between the two. Together, these factors give people the strength, skills and motivation to cope in difficult situations and re-establish (a version of) ‘normal’ life (Gunnestad). Identifying protective factors is important in terms of understanding the particular resources a given sociocultural group has at its disposal, but it is also vital to consider the interconnections between various protective mechanisms, how they might influence each other, and to what degree. An individual, for instance, might display resilience or adaptive functioning in a particular domain (e.g. emotional functioning) but experience significant deficits in another (e.g. academic achievement) (Hunter 2). It is also essential to scrutinise how the interaction between protective factors and risk factors creates patterns of resilience. Finally, a comprehensive understanding of the interrelated nature of protective mechanisms and risk factors is imperative for designing effective interventions and tailored preventive strategies (Weine 65). In short, contemporary thinking about resilience suggests it is neither entirely personal nor strictly social, but an interactive and iterative combination of the two. It is a quality of the environment as much as the individual. For Ungar, resilience lies in the complex entanglements between “individuals and their social ecologies [that] will determine the degree of positive outcomes experienced” (3). Thinking about resilience as context-dependent is important because research that is too trait-based or actor-centred risks ignoring any structural or institutional forces. A more ecological interpretation of resilience, one that takes a person’s context and environment into account, is vital in order to avoid blaming the victim for any hardships they face, or relieving state and institutional structures from their responsibilities in addressing social adversity, which can “emphasise self-help in line with a neo-conservative agenda instead of stimulating state responsibility” (Mohaupt 67). Nevertheless, Ungar posits that a coherent definition of resilience has yet to be developed that adequately ‘captures the dual focus of the individual and the individual’s social ecology and how the two must both be accounted for when determining the criteria for judging outcomes and discerning processes associated with resilience’ (7). Recent resilience research has consequently prompted a shift away from vulnerability towards protective processes — a shift that highlights the sustained capabilities of individuals and communities under threat or at risk.
Locating ‘Culture’ in the Literature on Resilience However, an understanding of the role of culture has remained elusive or marginalised within this trend; there has been comparatively little sustained investigation into the applicability of resilience constructs to non-western cultures, or how the resources available for survival might differ from those accessible to western populations (Ungar 4). As such, a growing body of researchers is calling for more rigorous inquiry into culturally determined outcomes that might be associated with resilience in non-western or multicultural cultures and contexts, for example where Indigenous and minority immigrant communities live side by side with their ‘mainstream’ neighbours in western settings (Ungar 2). ‘Cultural resilience’ considers the role that cultural background plays in determining the ability of individuals and communities to be resilient in the face of adversity. For Clauss-Ehlers, the term describes the degree to which the strengths of one’s culture promote the development of coping (198). Culturally-focused resilience suggests that people can manage and overcome stress and trauma based not on individual characteristics alone, but also from the support of broader sociocultural factors (culture, cultural values, language, customs, norms) (Clauss-Ehlers 324). The innate cultural strengths of a culture may or may not differ from the strengths of other cultures; the emphasis here is not so much comparatively inter-cultural as intensively intra-cultural (VanBreda 215). A culturally focused resilience model thus involves “a dynamic, interactive process in which the individual negotiates stress through a combination of character traits, cultural background, cultural values, and facilitating factors in the sociocultural environment” (Clauss-Ehlers 199). In understanding ways of ‘coping and hoping, surviving and thriving’, it is thus crucial to consider how culturally and linguistically diverse minorities navigate the cultural understandings and assumptions of both their countries of origin and those of their current domicile (Ungar 12). Gunnestad claims that people who master the rules and norms of their new culture without abandoning their own language, values and social support are more resilient than those who tenaciously maintain their own culture at the expense of adjusting to their new environment. They are also more resilient than those who forego their own culture and assimilate with the host society (14). Accordingly, if the combination of both valuing one’s culture as well as learning about the culture of the new system produces greater resilience and adaptive capacities, serious problems can arise when a majority tries to acculturate a minority to the mainstream by taking away or not recognising important parts of the minority culture. In terms of resilience, if cultural factors are denied or diminished in accounting for and strengthening resilience – in other words, if people are stripped of what they possess by way of resilience built through cultural knowledge, disposition and networks – they do in fact become vulnerable, because ‘they do not automatically gain those cultural strengths that the majority has acquired over generations’ (Gunnestad 14). Mobilising ‘Culture’ in Australian Approaches to Community Resilience The realpolitik of how concepts of resilience and culture are mobilised is highly relevant here. 
As noted above, when ethnocultural difference is positioned as a risk or a threat to national identity, security and values, this is precisely the moment when vigorously, even aggressively, nationalised definitions of ‘community’ and ‘identity’ that minoritise or disavow cultural diversities come to the fore in public discourse. The Australian evocation of nationalism and national identity, particularly in the way it has framed policy discussion on managing national responses to disasters and threats, has arguably been more muted than some of the European hysteria witnessed recently around cultural diversity and national life. Yet we still struggle with the idea that newcomers to Australia might fall on the surplus rather than the deficit side of the ledger when it comes to identifying and harnessing resilience capital. A brief example of this trend is explored here. From 2006 to 2010, the Australian Emergency Management Institute embarked on an ambitious government-funded four-year program devoted to strengthening community resilience in relation to disasters with specific reference to engaging CALD communities across Australia. The program, Inclusive Emergency Management with CALD Communities, was part of a wider Australian National Action Plan to Build Social Cohesion, Harmony and Security in the wake of the London terrorist bombings in July 2005. Involving CALD community organisations as well as various emergency and disaster management agencies, the program ran various workshops and agency-community partnership pilots, developed national school education resources, and commissioned an evaluation of the program’s effectiveness (Farrow et al.). While my critique here is certainly not aimed at emergency management or disaster response agencies and personnel themselves – dedicated professionals who often achieve remarkable results in emergency and disaster response under extraordinarily difficult circumstances – it is nevertheless important to highlight how the assumptions underlying elements of AEMI’s experience and outcomes reflect the persistent ways in which ethnocultural diversity is rendered as a problem to be surmounted or a liability to be redressed, rather than as an asset to be built upon or a resource to be valued and mobilised. AEMI’s explicit effort to engage with CALD communities in building overall community resilience was important in its tacit acknowledgement that emergency and disaster services were (and often remain) under-resourced and under-prepared in dealing with the complexities of cultural diversity in emergency situations. Despite these good intentions, however, while the program produced some positive outcomes and contributed to crucial relationship building between CALD communities and emergency services within various jurisdictions, it also continued to frame the challenge of working with cultural diversity as a problem of increased vulnerability during disasters for recently arrived and refugee background CALD individuals and communities. This highlights a common feature in community resilience-building initiatives, which is to focus on those who are already ‘robust’ versus those who are ‘vulnerable’ in relation to resilience indicators, and whose needs may require different or additional resources in order to be met. At one level, this is a pragmatic resourcing issue: national agencies understandably want to put their people, energy and dollars where they are most needed in pursuit of a steady-state unified national response at times of crisis. 
Nor should it be argued that at least some CALD groups, particularly those from new arrival and refugee communities, are not vulnerable in at least some of the ways and for some of the reasons suggested in the program evaluation. However, the consistent focus on CALD communities as ‘vulnerable’ and ‘in need’ is problematic, as well as partial. It casts members of these communities as structurally and inherently less able and less resilient in the context of disasters and emergencies: in some sense, as those who, already ‘victims’ of chronic social deficits such as low English proficiency, social isolation and a mysterious unidentified set of ‘cultural factors’, can become doubly victimised in acute crisis and disaster scenarios. In what is by now a familiar trope, the description of CALD communities as ‘vulnerable’ precludes asking questions about what they do have, what they do know, and what they do or can contribute to how we respond to disaster and emergency events in our communities. A more profound problem in this sphere revolves around working out how best to engage CALD communities and individuals within existing approaches to disaster and emergency preparedness and response. This reflects a fundamental but unavoidable limitation of disaster preparedness models: they are innately spatially and geographically bounded, and consequently understand ‘communities’ in these terms, rather than expanding definitions of ‘community’ to include the dimensions of community-as-social-relations. While some good engagement outcomes were achieved locally around cross-cultural knowledge for emergency services workers, the AEMI program fell short of asking some of the harder questions about how emergency and disaster service scaffolding and resilience-building approaches might themselves need to change or transform, using a cross-cutting model of ‘communities’ as both geographic places and multicultural spaces (Bartowiak-Théron and Crehan) in order to be more effective in national scenarios in which cultural diversity should be taken for granted. Toward Acknowledgement of Resilience Capital Most significantly, the AEMI program did not produce any recognition of the ways in which CALD communities already possess resilience capital, or consider how this might be drawn on in formulating stronger community initiatives around disaster and threats preparedness for the future. Of course, not all individuals within such communities, nor all communities across varying circumstances, will demonstrate resilience, and we need to be careful of either overgeneralising or romanticising the kinds and degrees of ‘resilience capital’ that may exist within them. Nevertheless, at least some have developed ways of withstanding crises and adapting to new conditions of living. This is particularly so in connection with individual and group behaviours around resource sharing, care-giving and social responsibility under adverse circumstances (Grossman and Tahiri) – all of which are directly relevant to emergency and disaster response. While some of these resilient behaviours may have been nurtured or enhanced by particular experiences and environments, they can, as the discussion of recent literature above suggests, also be rooted more deeply in cultural norms, habits and beliefs. Whatever their origins, for culturally diverse societies to achieve genuine resilience in the face of both natural and human-made disasters, it is critical to call on the ‘social memory’ (Folke et al.) 
of communities faced with responding to emergencies and crises. Such wellsprings of social memory ‘come from the diversity of individuals and institutions that draw on reservoirs of practices, knowledge, values, and worldviews and is crucial for preparing the system for change, building resilience, and for coping with surprise’ (Adger et al.). Consequently, if we accept the challenge of mapping an approach to cultural diversity as resource rather than relic into our thinking around strengthening community resilience, there are significant gains to be made. For a whole range of reasons, no diversity-sensitive model or measure of resilience should invest in static understandings of ethnicities and cultures; all around the world, ethnocultural identities and communities are in a constant and sometimes accelerated state of dynamism, reconfiguration and flux. But to ignore the resilience capital and potential protective factors that ethnocultural diversity can offer to the strengthening of community resilience more broadly is to miss important opportunities that can help suture the existing disconnects between proactive approaches to intercultural connectedness and social inclusion on the one hand, and reactive approaches to threats, national security and disaster response on the other, undermining the effort to advance effectively on either front. This means that dominant social institutions and structures must be willing to contemplate their own transformation as the result of transcultural engagement, rather than merely insisting, as is often the case, that ‘other’ cultures and communities conform to existing hegemonic paradigms of being and of living. In many ways, this is the most critical step of all. A resilience model and strategy that questions its own culturally informed yet taken-for-granted assumptions and premises, goes out into communities to test and refine these, and returns to redesign its approach based on the new knowledge it acquires, would reflect genuine progress toward an effective transculturational approach to community resilience in culturally diverse contexts.References Adger, W. Neil, Terry P. Hughes, Carl Folke, Stephen R. Carpenter and Johan Rockström. “Social-Ecological Resilience to Coastal Disasters.” Science 309.5737 (2005): 1036-1039. ‹http://www.sciencemag.org/content/309/5737/1036.full> Bartowiak-Théron, Isabelle, and Anna Corbo Crehan. “The Changing Nature of Communities: Implications for Police and Community Policing.” Community Policing in Australia: Australian Institute of Criminology (AIC) Reports, Research and Policy Series 111 (2010): 8-15. Benessaieh, Afef. “Multiculturalism, Interculturality, Transculturality.” Ed. A. Benessaieh. Transcultural Americas/Ameriques Transculturelles. Ottawa: U of Ottawa Press/Les Presses de l’Unversite d’Ottawa, 2010. 11-38. Clauss-Ehlers, Caroline S. “Sociocultural Factors, Resilience and Coping: Support for a Culturally Sensitive Measure of Resilience.” Journal of Applied Developmental Psychology 29 (2008): 197-212. Clauss-Ehlers, Caroline S. “Cultural Resilience.” Encyclopedia of Cross-Cultural School Psychology. Ed. C. S. Clauss-Ehlers. New York: Springer, 2010. 324-326. Farrow, David, Anthea Rutter and Rosalind Hurworth. Evaluation of the Inclusive Emergency Management with Culturally and Linguistically Diverse (CALD) Communities Program. Parkville, Vic.: Centre for Program Evaluation, U of Melbourne, July 2009. 
‹http://www.ag.gov.au/www/emaweb/rwpattach.nsf/VAP/(9A5D88DBA63D32A661E6369859739356)~Final+Evaluation+Report+-+July+2009.pdf/$file/Final+Evaluation+Report+-+July+2009.pdf>.Folke, Carl, Thomas Hahn, Per Olsson, and Jon Norberg. “Adaptive Governance of Social-Ecological Systems.” Annual Review of Environment and Resources 30 (2005): 441-73. ‹http://arjournals.annualreviews.org/doi/pdf/10.1146/annurev.energy.30.050504.144511>. Garmezy, Norman. “The Study of Competence in Children at Risk for Severe Psychopathology.” The Child in His Family: Children at Psychiatric Risk. Vol. 3. Eds. E. J. Anthony and C. Koupernick. New York: Wiley, 1974. 77-97. Grossman, Michele. “Resilient Multiculturalism? Diversifying Australian Approaches to Community Resilience and Cultural Difference”. Global Perspectives on Multiculturalism in the 21st Century. Eds. B. E. de B’beri and F. Mansouri. London: Routledge, 2014. Grossman, Michele, and Hussein Tahiri. Harnessing Resilience Capital in Culturally Diverse Communities to Counter Violent Extremism. Canberra: Australia-New Zealand Counter-Terrorism Committee, forthcoming 2014. Grossman, Michele. “Cultural Resilience and Strengthening Communities”. Safeguarding Australia Summit, Canberra. 23 Sep. 2010. ‹http://www.safeguardingaustraliasummit.org.au/uploader/resources/Michele_Grossman.pdf>. Gunnestad, Arve. “Resilience in a Cross-Cultural Perspective: How Resilience Is Generated in Different Cultures.” Journal of Intercultural Communication 11 (2006). ‹http://www.immi.se/intercultural/nr11/gunnestad.htm>. Hajek, Lisa J. “Belonging and Resilience: A Phenomenological Study.” Unpublished Master of Science thesis, U of Wisconsin-Stout. Menomonie, Wisconsin, 2003. Hunter, Cathryn. “Is Resilience Still a Useful Concept When Working with Children and Young People?” Child Family Community Australia (CFA) Paper 2. Melbourne: Australian Institute of Family Studies, 2012.Joppke, Christian. "Beyond National Models: Civic Integration Policies for Immigrants in Western Europe". West European Politics 30.1 (2007): 1-22. Liebenberg, Linda, Michael Ungar, and Fons van de Vijver. “Validation of the Child and Youth Resilience Measure-28 (CYRM-28) among Canadian Youth.” Research on Social Work Practice 22.2 (2012): 219-226. Longstaff, Patricia H., Nicholas J. Armstrong, Keli Perrin, Whitney May Parker, and Matthew A. Hidek. “Building Resilient Communities: A Preliminary Framework for Assessment.” Homeland Security Affairs 6.3 (2010): 1-23. ‹http://www.hsaj.org/?fullarticle=6.3.6>. McGhee, Derek. The End of Multiculturalism? Terrorism, Integration and Human Rights. Maidenhead: Open U P, 2008.Mignolo, Walter. Local Histories/Global Designs: Coloniality, Subaltern Knowledges, and Border Thinking. Princeton: Princeton U P, 2000. Mohaupt, Sarah. “Review Article: Resilience and Social Exclusion.” Social Policy and Society 8 (2009): 63-71.Mouritsen, Per. "The Culture of Citizenship: A Reflection on Civic Integration in Europe." Ed. R. Zapata-Barrero. Citizenship Policies in the Age of Diversity: Europe at the Crossroad." Barcelona: CIDOB Foundation, 2009: 23-35. Mouritsen, Per. “Political Responses to Cultural Conflict: Reflections on the Ambiguities of the Civic Turn.” Ed. P. Mouritsen and K.E. Jørgensen. Constituting Communities. Political Solutions to Cultural Conflict, London: Palgrave, 2008. 1-30. Ortiz, Fernando. Cuban Counterpoint: Tobacco and Sugar. Trans. Harriet de Onís. Intr. Fernando Coronil and Bronislaw Malinowski. Durham, NC: Duke U P, 1995 [1940]. Robins, Kevin. 
The Challenge of Transcultural Diversities: Final Report on the Transversal Study on Cultural Policy and Cultural Diversity. Culture and Cultural Heritage Department. Strasbourg: Council of European Publishing, 2006. Rutter, Michael. “Protective Factors in Children’s Responses to Stress and Disadvantage.” Annals of the Academy of Medicine, Singapore 8 (1979): 324-38. Stein, Mark. “The Location of Transculture.” Transcultural English Studies: Fictions, Theories, Realities. Eds. F. Schulze-Engler and S. Helff. Cross/Cultures 102/ANSEL Papers 12. Amsterdam and New York: Rodopi, 2009. 251-266. Ungar, Michael. “Resilience across Cultures.” British Journal of Social Work 38.2 (2008): 218-235. First published online 2006: 1-18. In-text references refer to the online Advance Access edition ‹http://bjsw.oxfordjournals.org/content/early/2006/10/18/bjsw.bcl343.full.pdf>. VanBreda, Adrian DuPlessis. Resilience Theory: A Literature Review. Erasmuskloof: South African Military Health Service, Military Psychological Institute, Social Work Research & Development, 2001. Weine, Stevan. “Building Resilience to Violent Extremism in Muslim Diaspora Communities in the United States.” Dynamics of Asymmetric Conflict 5.1 (2012): 60-73. Welsch, Wolfgang. “Transculturality: The Puzzling Form of Cultures Today.” Spaces of Culture: City, Nation World. Eds. M. Featherstone and S. Lash. London: Sage, 1999. 194-213. Werner, Emmy E., and Ruth S. Smith. Vulnerable But Invincible: A Longitudinal Study of\ Resilience and Youth. New York: McGraw Hill, 1982. NotesThe concept of ‘resilience capital’ I offer here is in line with one strand of contemporary theorising around resilience – that of resilience as social or socio-ecological capital – but moves beyond the idea of enhancing general social connectedness and community cohesion by emphasising the ways in which culturally diverse communities may already be robustly networked and resourceful within micro-communal settings, with new resources and knowledge both to draw on and to offer other communities or the ‘national community’ at large. In effect, ‘resilience capital’ speaks to the importance of finding ‘the communities within the community’ (Bartowiak-Théron and Crehan 11) and recognising their capacity to contribute to broad-scale resilience and recovery.I am indebted for the discussion of the literature on resilience here to Dr Peta Stephenson, Centre for Cultural Diversity and Wellbeing, Victoria University, who is working on a related project (M. Grossman and H. Tahiri, Harnessing Resilience Capital in Culturally Diverse Communities to Counter Violent Extremism, forthcoming 2014).
APA, Harvard, Vancouver, ISO, and other styles
42

Busse, Kristina, and Shannon Farley. "Remixing the Remix: Fannish Appropriation and the Limits of Unauthorised Use." M/C Journal 16, no. 4 (2013). http://dx.doi.org/10.5204/mcj.659.

Full text
Abstract:
In August 2006 the LiveJournal (hereafter LJ) community sga_flashfic posted its bimonthly challenge: a “Mission Report” challenge. Challenge communities are fandom-specific sites where moderators pick a theme or prompt to which writers respond and then post their specific fan works. The terms of this challenge were to encourage participants to invent a new mission and create a piece of fan fiction in the form of a mission report from the point of view of the Stargate Atlantis team of explorers. As an alternative possibility, and this is where the trouble started, the challenge also allowed participants to “take another author’s story and write a report” of its mission. Moderator Cesperanza then explained, “if you choose to write a mission report of somebody else’s story, we’ll ask you to credit them, but we won’t require you to ask their permission” (sga_flashfic LJ, 21 Aug. 2006, emphasis added). Whereas most announcement posts would gather only a few comments, this one reached more than a hundred responses within hours, mostly complaints. Even though the community administrators quickly backtracked and posted a revision of the challenge not 12 hours later, the fannish LiveJournal sphere debated the challenge for days, reaching far beyond the specific fandom of Stargate Atlantis to discuss the ethical questions surrounding fannish appropriation and remix. At the center of the debate were the last eight words: “we won’t require you to ask their permission.” By encouraging fans to effectively write fan fiction of fan fiction and by not requiring permission, the moderators had violated an unwritten norm within this fannish community. Like all fan communities, western media fandom has developed internal rules covering everything from what to include in a story header to how long to include a spoiler warning following aired episodes (for a definition and overview of western media fandom, see Coppa). In this example, the mods violated the fannish prohibition against the borrowing of original characters, settings, plot points, or narrative structures from other fan writers without permission—even though, as fan fiction, the source of the inspiration engages in such borrowing itself. These kinds of normative rules can be altered, of course, but any change requires long and involved discussions. In this essay, we look at various debates that showcase how this fan community—media fandom on LiveJournal—creates and enforces but also discusses and changes its normative behavior. Fan fiction authors’ desire to prevent their work from being remixed may seem hypocritical, but we argue that underlying these conversations are complex negotiations of online privacy and control, affective aesthetics, and the value of fan labor. This is not to say that all fan communities address issues of remixing in the same way media fandom at this point in time did, nor to suggest that they should; rather, we want to highlight a specific community’s internal ethics, the fervor with which members defend their rules, and the complex arguments that evolve from all sides when rules are questioned. Moreover, we suggest that these conversations offer insight into the specific relation many fan writers have to their stories and how it may differ from a more universal authorial affect. In order to fully understand the underlying motivations and the community ethos that spawned the sga_flashfic debates, we first want to differentiate between forms of unauthorised (re)uses and the legal, moral, and artistic concerns they create. 
Only with a clear definition of copyright infringement and plagiarism, as well as a clear understanding of who is affected (and in what ways) in any of these cases, can we fully understand the social and moral intersection of fan remixing of fan fiction. Only when sidestepping the legal and economic concerns surrounding remix can we focus on the ethical intricacies between copyright holders and fan writers and, more importantly, within fan communities. Fan communities differ greatly over time, between fandoms, and even depending on their central social interfaces (such as con-based zines, email-based listservs, journal-based online communities, etc.), and as a result they also develop a diverse range of internal community rules (Busse and Hellekson, “Works”; Busker). Much strife is caused when different traditions and their associated mores intersect. We’d argue, however, that the issues in the case of the Stargate Atlantis Remix Challenge were less the confrontation of different communities and more the slowly changing attitudes within one. In fact, looking at media fandom today, we may already be seeing changed attitudes—even as the debates continue over remix permission and unauthorised use. Why Remixes Are Not Copyright Infringement In discussing the limits of unauthorised use, it is important to distinguish plagiarism and copyright violation from forms of remix. While we are more concerned with the ethical issues surrounding plagiarism, we want to briefly address copyright infringement, simply because it often gets mixed into the ethics of remixes. Copyright is strictly defined as a matter of law; in many of the online debates in media fandom, it is often further restricted to U.S. Law, because a large number of the source texts are owned by U.S. companies. According to the U.S. Constitution (Article I, Section 8), Congress has the power to secure an “exclusive Right” “for limited Times.” Given that intellectual property rights have to be granted and are limited, legal scholars read this statute as a delicate balance between offering authors exclusive rights and allowing the public to flourish by building on these works. Over the years, however, intellectual property rights have been expanded and increased at the expense of the public commons (Lessig, Boyle). The main exception to this exclusive right is the concept of “fair use,” defined as use “for purposes such as criticism, comment, news reporting, teaching..., scholarship, or research” (§107). Case law circumscribes the limits of fair use, distinguishing works that are merely “derivative” from those that are “transformative” and thus add value (Chander and Sunder, Fiesler, Katyal, McCardle, Tushnet). The legal status of fan fiction remains undefined without a specific case that would test the fair use doctrine in regards to fan fiction, yet fair use and fan fiction advocates argue that fan fiction should be understood as eminently transformative and thus protected under fair use. The nonprofit fan advocacy group, the Organization for Transformative Works, in fact makes clear its position by including the legal term in their name, reflecting a changing understanding of both fans and scholars. Why Remixes Are Not Plagiarism Whereas copyright infringement is a legal concept that punishes violations between fan writers and commercial copyright holders, plagiarism instead is defined by the norms of the audience for which a piece is written: definitions of plagiarism thus differ from academic to journalist to literary contexts. 
Within fandom, one of the most blatant (and most easily detectable) forms of plagiarism is when a fan copies another work wholesale and publishes it under their own name, either within the same fandom or by simply searching and replacing names to make it fit another fandom. Other times, fan writers may take selections of published pro or fan fiction and insert them into their works. Within fandom, accusations of plagiarism are taken seriously, and fandom as a whole polices itself with regard to plagiarism: the LiveJournal community stop_plagiarism, for example, was created in 2005 specifically to report and pursue accusations of plagiarism within fandom. The community keeps a list of known plagiarisers that includes the names of over 100 fan writers. Fan fiction plagiarism can only be determined on a case-by-case basis—and fans remain hypervigilant simply because they are all too often falsely accused of merely plagiarising when instead they are interpreting, translating, and transforming. There is another form of fannish offense that does not actually constitute plagiarism but is closely connected to it, namely the wholesale reposting of stories with attributions intact. This practice is frowned upon for two main reasons. Writers like to maintain at least some control over their works, a desire that often derives from anxieties about being able to delete one’s digital footprint if desired or necessary. Archiving stories without authorial permission strips authors of this ability. More importantly, media fandom is a gift economy, in which labor is not reimbursed economically but rather rewarded with feedback (such as comments and kudos) and the growth of a writer’s reputation (Hellekson, Scott). When a story is hosted in a place where readers cannot easily give thanks and feedback to the author, the rewards for the writer’s fan labor are effectively taken from her. Reposting thus removes the story from the fannish gift exchange—or, worse, inserts the archivist in lieu of the author as the recipient of thanks and comments. Unauthorised reposting is not plagiarism, as the author’s name remains attached, but it tends to go against fannish mores nonetheless as it deprives the writer of her “payment” of feedback and recognition. When Copyright Holders Object to Fan Fiction A small group of professional authors vocally proclaim fan fiction to be unethical, illegal, or both. In her “Fan Fiction Rant” Robin Hobbs declares that “Fan fiction is to writing what a cake mix is to gourmet cooking” and then calls it outright theft: “Fan fiction is like any other form of identity theft. It injures the name of the party whose identity is stolen.” Anne Rice shares her feelings about fan fiction on her web site with a permanent message: “I do not allow fan fiction. The characters are copyrighted. It upsets me terribly to even think about fan fiction with my characters. I advise my readers to write your own original stories with your own characters. It is absolutely essential that you respect my wishes.” Diana Gabaldon calls fan fiction immoral and writes that “it makes me want to barf whenever I’ve inadvertently encountered some of it involving my characters.” Moreover, in a move shared by other anti-fan fiction writers, she compares her characters to family members: “I wouldn’t like people writing sex fantasies for public consumption about me or members of my family—why would I be all right with them doing it to the intimate creations of my imagination and personality?” George R.R. 
Martin similarly evokes familial intimacy when he writes, “My characters are my children, I have been heard to say. I don’t want people making off with them.” What is interesting in these—and other authors’—articulations of why they disapprove of fan fiction of their works is that their strongest and ultimate argument is neither legal nor economic reasoning but an emotional plea: being a good fan means coloring within the lines laid out by the initial creator, putting one’s toys back exactly as one found them, and never ever getting creative or transformative with them. Many fan fiction writers respect these wishes and do not write in book fandoms where the authors have expressed their desires clearly. Sometimes entire archives respect an author’s desires: fanfiction.net, the largest repository of fic online, removed all stories based on Rice’s work and does not allow any new ones to be posted. However, fandom is a heterogeneous culture with no centralised authority, and it is not difficult to find fic based on Rice’s characters and settings if one knows where to look. Most of these debates are restricted to book fandoms, likely for two reasons: (1) film and TV fan fiction alters the medium, so that there is no possibility that the two works might be mistaken for one another; and (2) film and TV authorship tends to be collaborative and thus lowers the individual sense of ownership (Mann, Sellors). How Fannish Remixes Are like Fan Fiction Most fan fiction writers strongly dismiss accusations of plagiarism and theft, two accusations that all too easily are raised against fan fiction and yet, as we have shown, such accusations actually misdefine terms. Fans extensively debate the artistic values of fan fiction, often drawing from classical literary discussions and examples. Clearly echoing Wilde’s creed that “there is no such thing as a moral or immoral book,” Kalichan, for example, argues in one LJ conversation that “whenever I hear about writers asserting that other writing is immoral, I become violently ill. Aside from this, morality & legality are far from necessarily connected. Lots of things are immoral and legal, illegal and moral and so on, in every permutation imaginable, so let’s just not confuse the two, shall we” (Kalichan LJ, 3 May 2010). Aja Romano concludes an epic list of remixed works ranging from the Aeneid to The Wind Done Gone, from All’s Well That Ends Well to Wicked with a passionate appeal to authors objecting to fan fiction: the story is not defined by the barriers you place around it. The moment you gave it to us, those walls broke. You may hate the fact people are imagining more to your story than what you put there. But if I were you, I’d be grateful that I got the chance to create a story that has a culture around it, a story that people want to keep talking about, reworking, remixing, living in, fantasizing about, thinking about, writing about. (Bookshop LJ, 3 May 2010)Many fan writers view their own remixes as part of a larger cultural movement that appropriates found objects and culturally relevant materials to create new things, much like larger twentieth century movements that include Dada and Pop Art, as well as feminist and postcolonial challenges to the literary canon. Finally, fan fiction partakes in 21st century ideas of social anarchy to create a cultural creative commons of openly shared ideas. 
Fan Cupidsbow describes strong parallels and cross-connection between all sorts of different movements, from Warhol to opensource, DeviantArt to AMV, fanfiction to mashups, sampling to critique and review. All these things are about how people are interacting with technology every day, and not just digital technology, but pens and paper and clothes and food fusions and everything else. (Cupidsbow LJ, 20 May 2009) Legally, of course, these reuses of collectively shared materials are often treated quite differently, which is why fan fiction advocates often maintain that all remixes be treated equally—regardless of whether their source text is film, TV, literature, or fan fiction. The Archive of Our Own, a project of the Organization for Transformative Works, for example, does not distinguish in its Content and Abuse Policy section between commercial and fan works in regard to plagiarism and copyright. Returning to the initial case of the Stargate Atlantis Mission Report Challenge, we can thus see how the moderator clearly positions herself within a framework that considers all remixes equally remixable. Even after changing the guidelines to require permission for the remixing of existing fan stories, moderator Cesperanza notes that she “remain[s] philosophically committed to the idea that people have the right to make art based on other art provided that due credit is given the original artist” (sga_flashfic LJ, 21 Aug. 2006). Indeed, other fans agree with her position in the ensuing discussions, drawing attention to the hypocrisy of demanding different rules for what appears to be the exact same actions: “So explain to me how you can defend fanfiction as legitimate derivative work if it’s based on one type of source material (professional writing or TV shows), yet decry it as ‘stealing’ and plagiarism if it’s based on another type of source material (fanfiction)” (Marythefan LJ, 21 Aug. 2006). Many fans assert that all remixes should be tolerated by the creators of their respective source texts—be they pro or fan. Fans expect Rowling to be accepting of Harry Potter’s underage romance with a nice and insecure Severus Snape, and they expect Matthew Weiner to be accepting of stories that kill off Don Draper and have his (ex)wives join a commune together. So fans should equally accept fan fiction that presents the grand love of Rodney McKay and John Sheppard, the most popular non-canonical fan fiction pairing on Stargate Atlantis, to be transformed into an abusive and manipulative relationship or rewritten with one of them dying tragically. Lydiabell, for example, argues that “there’s [no]thing wrong with creating a piece of art that uses elements of another work to create something new, always assuming that proper credit is given to the original... even if your interpretation is at odds with everything the original artist wanted to convey” (Lydiabell LJ, 22 Aug. 2006). Transforming works can often move them into territory that is critical of the source text, mocks the source text, rearranges relationships, and alters characterisations. It is here that we reach the central issue of this article: many fans indeed do view intrafandom interactions as fundamentally different to their interactions with professional authors or commercial entertainment companies. 
While everyone agrees that there are no legal, economic, or even ultimately moral arguments to be made against remixing fan fiction (because any such argument would nullify the fan’s right to create their fan fiction in the first place), the discourses against open remixing tend to revolve around community norms, politeness, and respect. How Fannish Remixes Are Not like Fan Fiction At the heart of the debate lie issues of community norms: taking another fan’s stories as the basis for one’s own fiction is regarded as a violation of manners, at least the way certain sections of the community define them. This, in fact, is not unlike the way many fan academics engage with fandom research. While it may be perfectly legal to directly cite fans’ blog posts, and while it may even be in compliance with institutional ethical research requirements (such as Internal Review Boards at U.S. universities), the academic fan writing about her own community may indeed choose to take extra precautions to protect herself and that community. As Kristina Busse and Karen Hellekson have argued, fan studies often exists at the intersection of language and social studies, and thus written text may simultaneously be treated as artistic works and as utterances by human subjects (“Identity”). In this essay (and elsewhere), we thus limit direct linking into fannish spaces, instead giving site, date, and author, and we have consent from all fans we cite in this essay. The community of fans who write fic in a particular fandom is relatively small, and most of them are familiar with each other, or can trace a connection via one or two degrees of separation only. While writing fan fiction about Harry Potter may influence the way you and your particular circle of friends interpret the novels, it is unlikely to affect the overall reception of the work. During the remix debate, fan no_pseud articulates the differing power dynamic: When someone bases fanfic on another piece of fanfic, the balance of power in the relationship between the two things is completely different to the relationship between a piece of fanfic and the canon source. The two stories have exactly equal authority, exactly equal validity, exactly equal ‘reality’ in fandom. (nopseud LJ, 21 Aug. 2006) Within fandom, there are few stories that have the kind of reach that professional fiction does, and it is just as likely that a fan will come across an unauthorised remix of a piece of fan fiction as the original piece itself. In that way, the reception of fan fiction is more fragile, and fans are justifiably anxious about it. In a recent conversation about proper etiquette within Glee fandom, fan writer flaming_muse articulates her reasons for expecting different behavior from fandom writers who borrow ideas from each other: But there’s a huge difference between fanfic of media and fanfic of other fanfic authors. Part of it is a question of the relationship of the author to the source material … but part of it is just about not hurting or diminishing the other creative people around you. We aren’t hurting Glee by writing fic in their ‘verse; we are hurting other people if we write fanfic of fanfic. We’re taking away what’s special about their particular stories and all of the work they put into them. (Stoney321 LJ, 12 Feb. 2012)Flaming_muse brings together several concepts but underlying all is a sense of community. Thus she equates remixing within the community without permission as a violation of fannish etiquette. 
The sense of community also plays a role in another reason given by fans who prefer permission, which is the actual ease of getting it. Many fandoms are fairly small communities, which makes it more possible to ask for permission before doing a translation, adaptation, or other kind of rewrite of another person’s fic. Often a fan may have already given feedback to the story or shared some form of conversation with the writer, so that requesting permission seems fairly innocuous. Moreover, fandom is a community based on the economy of gifting and sharing (Hellekson), so that etiquette becomes that much more important. Unlike pro authors who are financially reimbursed for their works, feedback is effectively a fan writer’s only payment. Getting comments, kudos, or recommendations for their stories are ways in which readers reward and thank the writers for their work. Many fans feel that a gift economy functions only through the goodwill of all its participants, which remixing without permission violates. How Fan Writing May Differ From Pro Writing Fans have a different emotional investment in their creations, only partially connected to writing solely for love (as opposed to professional writers who may write for love but also write for their livelihood in the best-case scenarios). One fan, who writes both pro and fan fiction, describes her more distanced emotional involvement with her professional writing as follows, When I’m writing for money, I limit my emotional investment in the material I produce. Ultimately what I am producing does not belong to me. Someone else is buying it and I am serving their needs, not my own. (St_Crispins LJ, 27 Aug. 2006)The sense of writing for oneself as part of a community also comes through in a comment by pro and fan writer Matociquala, who describes the specificity and often quite limited audience of fan fiction as follows: Fanfiction is written in the expectation of being enjoyed in an open membership but tight-knit community, and the writer has an expectation of being included in the enjoyment and discussion. It is the difference, in other words, between throwing a fair on the high road, and a party in a back yard. Sure, you might be able to see what’s going on from the street, but you’re expected not to stare. (Matociquala LJ, 18 May 2006)What we find important here is the way both writers seem to suggest that fan fiction allows for a greater intimacy and immediacy on the whole. So while not all writers write to fulfill (their own or other’s) emotional and narrative desires, this seems to be more acceptable in fan fiction. Intimacy, i.e., the emotional and, often sexual, openness and vulnerability readers and writers exhibit in the stories and surrounding interaction, can thus constitute a central aspect for readers and writers alike. Again, none of these aspects are particular to fan fiction alone, but, unlike in much other writing, they are such a central component that the stories divorced from their context—textual, social, and emotional—may not be fully comprehensible. In a discussion several years ago, Ellen Fremedon coined the term Id Vortex, by which she refers to that very tailored and customised writing that caters to the writers’ and/or readers’ kinks, that creates stories that not only move us emotionally because we already care about the characters but also because it uses tropes, characterisations, and scenes that appeal very viscerally: In fandom, we’ve all got this agreement to just suspend shame. 
I mean, a lot of what we write is masturbation material, and we all know it, and so we can’t really pretend that we’re only trying to write for our readers’ most rarefied sensibilities, you know? We all know right where the Id Vortex is, and we have this agreement to approach it with caution, but without any shame at all. (Ellen Fremedon LJ, 2 Dec. 2004)Writing stories for a particular sexual kink may be the most obvious way fans tailor stories to their own (or others’) desires, but in general, fan stories often seem to be more immediate, more intimate, more revealing than most published writing. This attachment is only strengthened by fans’ immense emotional attachment to the characters, as they may spend years if not decades rewatching their show, discussing all its details, and reading and writing stories upon stories. From Community to Commons These norms and mores continue to evolve as fannish activity becomes more and more visible to the mainstream, and new generations of fans enter fandom within a culture where media is increasingly spreadable across social networks and all fannish activity is collectively described and recognised as “fandom” (Jenkins, Ford, and Green). The default mode of the mainstream often treats “found” material as disseminable, and interfaces encourage such engagement by inviting users to “share” on their collection of social networks. As a result, many new fans see remixing as not only part of their fannish right, but engage in their activity on platforms that make sharing with or without attribution both increasingly easy and normative. Tumblr is the most recent and obvious example of a platform in which reblogging other users’ posts, with or without commentary, is the normative mode. Instead of (or in addition to) uploading one’s story to an archive, a fan writer might post it on Tumblr and consider reblogs as another form of feedback. In fact, our case study and its associated differentiation of legal, moral, and artistic justifications for and against remixing fan works, may indeed be an historical artifact in its own right: media fandom as a small and well-defined community of fans with a common interest and a shared history is the exception rather than the norm in today’s fan culture. When access to stories and other fans required personal initiation, it was easy to teach and enforce a community ethos. Now, however, fan fiction tops Google searches for strings that include both Harry and Draco or Spock and Uhura, and fan art is readily reblogged by sites for shows ranging from MTV’s Teen Wolf to NBC’s Hannibal. Our essay thus must be understood as a brief glimpse into the internal debates of media fans at a particular historical juncture: showcasing not only the clear separation media fan writers make between professional and fan works, but also the strong ethos that online communities can hold and defend—if only for a little while. References Boyle, James. The Public Domain: Enclosing the Commons of the Mind. Ithaca: Yale University Press, 2008. Busker, Rebecca Lucy. “On Symposia: LiveJournal and the Shape of Fannish Discourse.” Transformative Works and Cultures 1 (2008). http://journal.transformativeworks.org/index.php/twc/article/view/49. Busse, Kristina, and Karen Hellekson. “Work in Progress.” In Karen Hellekson and Kristina Busse, eds., Fan Fiction and Fan Communities in the Age of the Internet: New Essays. Jefferson, N.C.: McFarland, 2006. 5–40. Busse, Kristina, and Karen Hellekson. 
“Identity, Ethics, and Fan Privacy.” In Katherine Larsen and Lynn Zubernis, eds., Fan Culture: Theory/Practice. Newcastle upon Tyne: Cambridge Scholars Publishing, 2012. 38-56. Chander, Anupam, and Madhavi Sunder. “Everyone’s a Superhero: A Cultural Theory of ‘Mary Sue’ Fan Fiction as Fair Use.” California Law Review 95 (2007): 597-626. Coppa, Francesca. “A Brief History of Media Fandom.” In Karen Hellekson and Kristina Busse, eds., Fan Fiction and Fan Communities in the Age of the Internet: New Essays. Jefferson, N.C.: McFarland, 2006. 41–59. Fiesler, Casey. “Everything I Need to Know I Learned from Fandom: How Existing Social Norms Can Help Shape the Next Generation of User-Generated Content.” Vanderbilt Journal of Entertainment and Technology Law 10 (2008): 729-62. Gabaldon, Diana. “Fan Fiction and Moral Conundrums.” Voyages of the Artemis. Blog. 3 May 2010. 7 May 2010 http://voyagesoftheartemis.blogspot.com/2010/05/fan-fiction-and-moral-conundrums.html. Hellekson, Karen. “A Fannish Field of Value: Online Fan Gift Culture.” Cinema Journal 48.4 (2009): 113–18. Hobbs, Robin. “The Fan Fiction Rant.” Robin Hobb’s Home. 2005. 14 May 2006 http://www.robinhobb.com/rant.html. Jenkins, Henry, Sam Ford, and Joshua Green. Spreadable Media: Creating Value and Meaning in a Networked Culture. New York: New York University Press, 2013. Katyal, Sonia. “Performance, Property, and the Slashing of Gender in Fan Fiction.” Journal of Gender, Social Policy, and the Law 14 (2006): 463-518. Lessig, Lawrence. Remix: Making Art and Commerce Thrive in a Hybrid Economy. New York: Penguin, 2008. Mann, Denise. “It’s Not TV, It’s Brand Management.” In Vicki Mayer, Miranda Banks, and John Thornton Caldwell, eds., Production Studies: Cultural Studies of Media Industries. New York: Routledge, 2009. 99-114. Martin, George R.R. “Someone is Angry on the Internet.” LiveJournal. 7 May 2010. 15 May 2013. http://grrm.livejournal.com/151914.html. McCardle, Meredith. “Fandom, Fan Fiction and Fanfare: What’s All the Fuss?” Boston University Journal of Science and Technology Law 9 (2003): 443-68. Rice, Anne. “Important Message From Anne on ‘Fan Fiction’.” n.d. 15 May 2013. http://www.annerice.com/readerinteraction-messagestofans.html. Scott, Suzanne. “Repackaging Fan Culture: The Regifting Economy of Ancillary Content Models.” Transformative Works and Cultures 3 (2009). http://dx.doi.org/10.3983/twc.2009.0150. Sellors, C. Paul. Film Authorship: Auteurs and Other Myths. London: Wallflower, 2010. Tushnet, Rebecca. “Copyright Law, Fan Practices, and the Rights of the Author.” In Jonathan Gray, Cornel Sandvoss, and C. Lee Harrington, eds., Fandom: Identities and Communities in a Mediated World. New York: New York University Press, 2007. 60-71.
APA, Harvard, Vancouver, ISO, and other styles
43

Banks, John. "From Fetish to Factish and Back Again." M/C Journal 2, no. 5 (1999). http://dx.doi.org/10.5204/mcj.1769.

Full text
Abstract:
Introduction This essay is very much an anxious response to an earlier article, "Controlling Gameplay", that I wrote for M/C about gameplay: the immersive, visceral experience of playing computer and video games. I argued that gameplay concerns the event status of playing computer and video games, and that as such it exceeds the symbolic content of games. Now, I continue to be troubled by the implications of this assertion -- does it not give up too much ground gained by the understanding that social practices such as gaming are socially constructed? Does it not return us to all of the problems associated with claims of access to an essential, authentic experience? In short, it becomes very difficult to contest or question such claims. The term gameplay may well function to depoliticise computer gaming; at least if the domain of the properly sociopolitical is understood as the symbolic field! -- and perhaps we shouldn't concede this point too quickly. In the previous article did I almost against myself end up fetishising the technological through the postulation of this sublime experience? The Fetish & Desire You may well be wondering what any of this has got to do with desire. Well, first let me fill you in on the research context out of which these essays emerge. For the past three years I have been undertaking ethnographic research on computer gaming: first, by participating in online gamer fan activities; and second, in an enterprise ethnographic study of Auran, a computer game development company situated in Brisbane, Australia. "Controlling Gameplay" is clearly marked by my immersion and entanglement in an ethnographic relationship with online gamers. The material on which it is based came from spending up to 25 hours a week online playing and discussing games. The point of these comments is not simply to establish my credentials as a gamer, nor to embarrassingly distance myself from 'going native' by making the appropriate gestures about reflexivity. Rather, I insist on these moments of fetishistic disavowal and illusion as a necessary condition of doing ethnographies. This shifts us from the domain of desire to what Slavoj Zizek, following Lacan, theorises as enjoyment. In the introduction to "Controlling Gameplay" I made the banal point that computer game software is a commodity. Computer games offer an example of the informational commodity circulating through the networks of informational capitalism. This is basically the bottom line of gaming: big business. Zizek carefully outlines that central to the Marxist understanding of commodity fetishism -- the displacement of relations among people onto relations among things -- is a fascination for some kind of mysterious 'content' that is presumed to be hidden by the form of commodities (Sublime Object 16-22). An example of this is the cultural studies academic doing ethnographic research, and believing that his work offers "something more", a potential critical edge, than just the commodification and corporatisation of academic work. It would appear, at least initially, that this is precisely how gameplay is working: the hidden technological sublime behind the empty form of the informational commodity. The problem for critical analysis then becomes that of insisting on asking the question of why this 'content' of gameplay is affirmed in the game's particular status of the commodity form. We are not interested in disclosing "the secret behind the form but the secret of this form itself" (Sublime Object 15). 
In discussions many gamers would insist on the fact that gameplay is simply the fun factor of playing computer games: nothing more and nothing less. Others would insist on refusing to finally fill in this secret content. After describing gameplay as having something to do with an immersive experience of escapism a gamer would invariably move on to suggest that it perhaps involved the design of a good interface that allows the player to seamlessly participate in the game; or it is to do with quality game-design, a careful balancing of various features that define a particular genre. Or it is a skill developed and honed by many hours of gaming; intense gameplay is an insider's experience that is used to define your belonging as a 'hard-core gamer with cred' -- if it has to be explained and described to you, well, you just won't get it, will you? In the movement of these discussions and exchanges desire is not so much to be found or discovered in the hidden content of an essential, authentic experience that is gameplay, but rather it is right there on the surface, in the work of these displacements. If anything then, unconscious desire is not a deep interior experience of gameplay but in the very form of this movement, in the work that is done to elaborate and produce the effect of a hidden content. And the question arises: what is being avoided or obfuscated in this movement that perhaps has nothing at all to do with an experience of gameplay or even desire for that matter? I will return to this question in a moment. The important step here is not to become overly dazzled by this 'content' of gameplay, but instead to ask the question of why it assumes the form of a commodity. But why this focus on the commodity-form, and the process of fetishistic inversion. After all there is a lot more at stake here than simply the commodity-form or some kind of economic reductionism, essentialism or substantialism. There is also the fascinating power of attraction that this "something more" can exert on academic work. This has to do with the status of a sublime materiality that persists beyond the physical materiality of an object in the networks of business, or even that of an object-cause for intersubjective desire played out in the game of ethnographic research. It is precisely this persistence that is so troubling. But is this interest in fetishistic disavowal, the insistence on "something more", simply a more refined type of traditional ideology critique? That is, is it once more a matter of the illusory knowledge or beliefs of misguided naive gamers which the critical intellectual will come along and tear down, to reveal the true state of affairs -- that there is really nothing there except perhaps a complex, overdetermined effect of socioeconomic processes, a social construction if you like? Is all of this concern with the fetish simply an epistemological and monstrous game played out in the interiority of the thinking subject that has in fact very little, if anything, to do with the effective materiality of the complex assemblage that is computer gaming. Perhaps a shift to the materiality of the processes and objects involved in the production of computer gaming would help us to leave behind the problem of the fetish as some tired epistemological quandary about illusory belief. After all, is not the very idea of commodity fetishism based on a rather tired and limiting opposition between people and things? 
The Factish In his recent Pandora's Hope: Essays on the Reality of Science Studies, Bruno Latour attacks the notion of the fetish and the modern critical subject that he believes is behind it. Latour's actor-network theory (nicely explained in Sean Aylward Smith's recent article for M/C, "Where Does the Body End?") works to displace the assumed divide between subjects and objects, particularly humans and nonhumans. This is often theorised through richly detailed ethnographic studies that follow the associations between humans and nonhumans that make up the assemblages and collectivities of scientific practice and technological projects. In Pandora's Hope Latour takes aim at the critical gesture of the iconoclast, the modern critic, who seeks to expose the fetish as "something that is nothing in itself, but simply the blank screen onto which we have projected, erroneously, our fancies, our labor, our hopes and passions". A problem for the anti-fetishist is the assumption that people naively believe in the inherent, mysterious qualities of the object in the first place. Anti-fetishism is not so much about the qualities or status of the object and our relations to it, but more a mode of argument: "it is always an accusation. Some person, or some people, are accused of being taken in -- or worse, of cynically manipulating credulous believers -- by someone who is sure of escaping from this illusion and wants to free the others as well: either from naive belief or from being manipulative. But if anti-fetishism is clearly an accusation, it is not a description of what happens with those who believe or are manipulated" (270). Latour argues that the problem of fetishism is all in the mind of the critical thinker. Believing himself disconnected from the realm of things and objects, this monstrous "mind in the vat" "invents the notion of belief and manipulation and projects this notion upon a situation in which the fetish plays an entirely different role" (270). Latour proposes that we shift our attention to the status of the fetish as a quasi-object or factish. The factish has to be fabricated, made, and invented; as such it has a complex and variable ontology in which it is entangled within collective practice. The status of the factish is all about the associations between humans and nonhumans and refuses the disabling opposition between subject and object, epistemology and ontology, internal belief and external world. The modern critic's belief that others believe functions to render invisible the complicated practice through which the categories are mixed and factishes are constructed. To replace all of this Latour suggests that we adopt a heterogeneous ontology in which we externalise belief "among the multiplicity of nonhumans" (284) -- in short that we recognise the ontological content of beliefs, and grant ontology back to nonhuman entities (273-88). By taking up the approach of actor-network theory I could now follow the diverse actors, both human and nonhuman, that make up the network and practices of computer gaming. When the gamers assert that gameplay is this and that and so on, I can take them at their word. They are not telling me, in some hysterical cycle, 'no it's not that, no, not that'. But instead 'yes it is that, and that as well'. They are affirming the multiple and heterogeneous ontology of humans and nonhumans. So I took this toolbox of concepts with me into my fieldwork encounters and interventions at Auran. And not surprisingly it worked really well. 
I could now quite easily and comfortably follow the entangled materiality of humans and nonhumans; the multiple shifting ontologies of objects such as game engines that function as representations in design reports; key elements in long term corporate business plans; links in relations with other corporations; development tools for game designers; the focus of licensing agreements; and programming problems and challenges for programmers. Game designers, programmers, CEOs, and public relations officers were more than happy to describe and show me the complex entanglements of humans and nonhumans involved in producing computer games. Now, throughout the period of my fieldwork I have been quite anxious and worried about negotiating the conditions of access, about the control exercised by senior Auran management. But at each stage or period of my research I have been amazed by the level of cooperation and access that has been given to me. Nor has Auran management shown much concern about my access to 'problem areas' of the company as it went through various periods of restructuring. I have had open and what I believe to be frank discussions with disgruntled employees who were very uncomfortable and openly critical of various aspects of Auran. And there has been very little effort to control or restrict my use of this material. My impression is that Auran has been more than pleased to put on display for the dazzled gaze of the ethnographer the corporate processes and mechanisms involved in producing computer game software. Initially I was rubbing my hands with glee at this research opportunity. I can see publication potential and career opportunities emerging from this ethnographic entanglement with Auran. The Fetish and Enjoyment But I have become increasingly anxious and worried about how well the fieldwork at Auran has gone, and how well actor-network theory works in explaining the multiple and heterogeneous ontologies of the humans and nonhumans that I have been mixing with for the past two years. And this worry brings me back to the fetish. I think Latour is correct: belief is not something internal, but more a matter of practice, externalised in the relations among humans and nonhumans. But is this not precisely the more useful and correct definition of the fetish, at least under the conditions of informational capitalism? Far from moving us out of the domain of the fetish into the ontological materiality of the factish, Latour is perhaps describing the fetishistic inversion perfectly. It is not at the level of some kind of internal knowledge, belief or deep mysterious unconscious that the misrecognition of the fetishistic inversion takes place. Rather, it is at the properly social level of our acts, what we do, that we overlook the fetishistic 'repressed' social dimension (Sublime Object 20). This nonknowledge of reality is part of the very effectivity of our social acts, "a kind of reality which is possible only on condition that the individuals partaking in it are not aware of its proper logic; that is, a kind of reality whose very ontological consistency implies a certain non-knowledge of its participants" (Sublime Object 21). The further point to recognise, as Zizek points out, is that commodity fetishism is not just the replacement of people with things, or our overlooking the properly social relations between humans behind things. 
More importantly, it is that this misrecognition occurs precisely at the level of the network of relations among things -- what is a structural effect of this network of relations starts to appear as the immediate property of one of the elements (Sublime Object 23-4). So from all of this the important point for my purposes is that fetishism is not really about what people know. Of course gamers know very well that their software is a commodity, and that capitalist business interests are basically running the show: they talk about the business of gaming all the time. The point is rather the fact that the fetishistic inversion occurs in the very activity of playing. This misrecognition, or illusion if you will, is not about false knowledge: the illusion is structuring reality, our real social activity: "they know very well how things really are, but still they are doing it as if they did not know" (Sublime Object 32). So Latour is quite insightful, belief is radically exterior and as Zizek points out one of the uptakes of this is that things, commodities, end up believing for us -- "it is belief which is radically exterior, embodied in the practical, effective procedure of people" (34). But does not Latour's focus on the complex ontology of objects, and our entanglement with them, at least in some way work to challenge this fetishistic inversion? Is not this obfuscation of the process of production, even if we shift that misrecognition to the relations among things, questioned by the process of exposing or opening the black box of the production mechanism? After all, isn't this precisely what we are trying to do with ethnographies? The difficulty, as Zizek writes in "Fetishism and Its Vicissitudes", is that "far from destroying the 'fetishist' illusion, the insight into the production mechanism in fact even strengthens it". It is the disclosure of the production process itself that "serves as the fetish which fascinates with its presence". And what is being concealed, and persists through all this display of disclosure is "the social mode of production" (102). Zizek warns us "the transparency of the process of production is false in so far as it obfuscates the immaterial virtual order which effectively runs the show ... . Capital functions as the sublime irrepresentable Thing, present only in its effects, in contrast to a commodity, a particular material object which miraculously 'comes to life', starts to move as if endowed with an invisible spirit" (103). Time for me to get back to the question of desire. One of the more fascinating and disturbing uptakes of this approach to the fetish is that the fetishistic misrecognition persists and insists beyond any interpretative intervention. This is the necessary conclusion of the fact that fetishism is not about what we know, but what we do: 'I know all too well that computer games are informational commodities generating profits for capitalist enterprises, but damn, they are fun to play.' The problem with gameplay is not one of explaining it, symbolising it, or even finding the appropriate theoretical vocabulary in which to talk about it. Gamers have come up with a range of different and flexible ways of discussing (dare I say, quite reflexively) the experience of gameplay. The problem is that I can never quite get rid of this problem of gameplay, it insists on sticking and attaching itself to my ethnography. 
Bruno Latour picks up on this dilemma with the observation that despite all the best efforts of the anti-fetishist critic "somehow the fetish gains in strength ... . The more you want it to be nothing, the more action springs back from it" (270). Even the attempt to generate a kind of critical distance through the process of 'writing up' the dissertation is smeared with the rather disgusting, perverted Enjoyment taken in disclosing and robbing the other of their Enjoyment. It is as if we are compelled, interpellated, by an anonymous superegoic injunction to 'Enjoy our gaming'. As Slavoj Zizek argues in his recent work (including the magnificent The Ticklish Subject: The Absent Centre of Political Ontology) the order of capital no longer functions according to the matrix of desire, a prohibitive injunction that sets in motion the impossibility of satisfying desire that is "reflexively inverted into the desire for nonsatisfaction" (345). Instead we get a corporate "little brother" commanding us to Enjoy ourselves! (The Ticklish Subject 347) Perhaps the only response open to us in these circumstances is in the act of insisting on a bottom line: $. References Banks, John. "Controlling Gameplay." M/C: A Journal of Media and Culture 1.5 (1998). 22 July 1999 <http://www.uq.edu.au/mc/9812/game.php>. Latour, Bruno. Pandora's Hope: Essays on the Reality of Science Studies. Cambridge, Mass.: Harvard UP, 1999. Smith, Sean Aylward. "Where Does the Body End?" M/C: A Journal of Media and Culture 2.3 (1999). 22 July 1999 <http://www.uq.edu.au/mc/9905/end.php>. Zizek, Slavoj. "Fetishism and Its Vicissitudes." The Plague of Fantasies. London: Verso, 1997. 86-126. ---. The Sublime Object of Ideology. London: Verso, 1989. ---. The Ticklish Subject: The Absent Centre of Political Ontology. London: Verso, 1999. Citation reference for this article MLA style: John Banks. "From Fetish to Factish and Back Again." M/C: A Journal of Media and Culture 2.5 (1999). [your date of access] <http://www.uq.edu.au/mc/9907/games.php>. Chicago style: John Banks, "From Fetish to Factish and Back Again," M/C: A Journal of Media and Culture 2, no. 5 (1999), <http://www.uq.edu.au/mc/9907/games.php> ([your date of access]). APA style: John Banks. (1999) From fetish to factish and back again. M/C: A Journal of Media and Culture 2(5). <http://www.uq.edu.au/mc/9907/games.php> ([your date of access]).
APA, Harvard, Vancouver, ISO, and other styles
44

Grandinetti, Justin Joseph. "A Question of Time: HQ Trivia and Mobile Streaming Temporality." M/C Journal 22, no. 6 (2019). http://dx.doi.org/10.5204/mcj.1601.

Full text
Abstract:
One of the commonplace and myopic reactions to the rise of televisual time-shifting via video-on-demand, DVD rental services, illegal downloads, and streaming media was to decree “the death of the communal television experience”. For many, new forms of watching television unconstrained by time-bound, regularly scheduled programming meant the demise of the predominant form of media liveness that existed commercially since the 1950s. Nevertheless, as time-shifting practices evolved, so have attendant notions of televisual temporality—including changing forms of liveness, shared experience, and the plastic and flexible nature of new viewing patterns (Bury & Li; Irani, Jefferies, & Knight; Turner; Couldry). Although these temporal conceptualisations are relevant to streaming media, in the few years since the launch of platforms such as Netflix, Hulu, and Amazon, what it means “to stream” has rapidly expanded. Social media platforms like Twitter, Facebook, Snapchat, YouTube, and TikTok allow users to record, share, and livestream their own content. Not only does social media add to the growing definition of streaming, but these streaming interactions are also predominately mobile (Munson; Droesch). Taken together, a live and social experience of time via audio-visual media is not lost but is instead reactivated through the increasingly mobile nature of streaming. In the following article, I examine how mobile streaming media practices are part of a construction of shared temporality that both draws upon and departs from conceptualisations of televisual and fixed streaming liveness. Accordingly, HQ Trivia—a mobile-specific streaming gameshow app launched in August 2017—demonstrates novel attempts at reimagining the temporally-bound live televisual experience while simultaneously offering new monetisation strategies via mobile streaming technologies. Through this example, I argue that pervasive Web-connectivity, streaming platforms, data collection, mobile devices, and mobile streaming practices form arrangements of valorisation that are temporally bound yet concomitantly mobile, allowing new forms of social cohesion and temporal control. A Brief History of Televisual Temporality Time is at once something infinitely mysterious and inherently understood. As John Durham Peters concisely explains, “time lies at the heart of the meaning of our lives” (175). It is precisely due to the myriad ontological, phenomenological, and epistemological dimensions of time that the subject has long been the focus of critical inquiry. As part of the so-called spatial turn, Michel Foucault argues that theory formerly treated space as “the dead, the fixed, the undialectical, the immobile. Time, on the contrary, was richness, fecundity, life, dialectic” (70). While scholarly turns toward space and later mobility have shifted the emphasis of critical inquiry, time is not rendered irrelevant. For example, Doreen Massey defines spaces as the product of interrelations, as sphere of possibility and heterogeneous multiplicity, and as always under construction (9). Critical to these conceptualisations of space, then, is the element of time. Considering space not as a static container in which individual actors enter and leave but instead as a production of ongoing becoming demonstrates how space, mobility, and time are inexorably intertwined. Time, space, and mobility are also interrelated when it comes to conversations of power. 
Judy Wajcman and Nigel Dodd contend that temporal control is related to dynamics of power, in that the powerful are fast and the powerless slow (3). Questions of speed, mobility, and the control of time itself, however, require attention to the media that help construct time. Aspects of time may always escape human comprehension, yet, “Whatever time is, calendars and clocks measure, control, and constitute it” (Peters 176). Time is a sociotechnical construction, but temporal experience is bound up in more than just time-keeping apparatuses. As Sarah Sharma elucidates, temporalities are not experienced as uniform time, but instead produced within larger economies of labor and temporal worth (8). To reach a more productive understanding of temporalities, Sharma offers power-chronography, which conceptualises time as experiential, political, and produced by social differences and institutions (15). Put another way, time is an experience structured by the social, economic, political, and technical toward forms of social cohesion and control. Time has always been central to the televisual. Though it is often placed in a genealogy with film, William Uricchio contends that early discursive imaginings and material experiments in television are more indebted to technologies such as the telegraph and telephone in promising live and simultaneous communication across distances (289-291). In essence, film is a technology of storage, related to 18th- and 19th-century traditions of conceptualising time as fragmented; the televisual is instead associated with the “contrasting notion of time conceived as a continuous present, as flow, as seamless” (Uricchio 295). Responding to Uricchio, Doron Galili asserts that the relationship between film and television is dialectical and not hierarchical. For Galili, the desire for simultaneity and storage oscillates—both are present, both remain separate from one another. It is the synthesis of simultaneity and storage that allows both to operate together as a technological and mediated vision of mastering time. Despite disagreements regarding how best to conceptualise early film and television, it is clear that the televisual furthered a desire for spatial and temporal coordination, liveness, and simultaneity. In recent years, forms of televisual “time-shifting” allow viewers to escape temporally-bound scheduling. In what is commonly periodised as TVIII, the proliferation of digital platforms, video-on-demand, legal and illegal downloads, DVD players, and streaming media displaced more traditional forms of watching live television (Jenner 259). It is important to note that while streaming is often related to the televisual, the televisual-to-streaming shift is not a clean linear evolution. Televisual-style content persists in streaming, but streaming might be better defined as matrix media, where content is made available away from the television set (Jenner 260). Regardless, the rise of streaming media platforms such as Netflix, Hulu, and Amazon Prime is commonly framed as part of televisual temporal disruption, as scholars note the growing plurality of televisual-type viewing options (Bury and Li 594). 
Further still, streaming platforms are often defined as television, a recent example occurring when Netflix CEO Reed Hastings called the service a “global Internet TV network” in 2016. The changing landscape of streaming and time-shifting notwithstanding, individuals remain aware of the viewing patterns of others, and this anticipation impacts the coordination and production of the collective television experience (Irani, Jeffries, and Knight 621). Related to this goal is how liveness connects viewers to shared social realities as they are occurring and helps to create a collective sense of time (Couldry 355-356). This shared experience of the social is still readily available in a time-shifted landscape, in that even shows released via an all-at-once format (for example, Netflix’s Stranger Things) can rapidly become a cultural phenomenon. Moreover, livestreaming has become commonplace as an alternative to cable television for live events and sports, along with new uses for gaming and social media. As Graeme Turner notes, “if liveness includes a sense of the shrinking temporal gap between oneself and the rest of the world, as well as a palpable sense of immediacy, then this is something we can find as readily online as in television”. To this end, the claim that streaming media is a harbinger of the “death of liveness” is far too simplistic. Liveness vis-à-vis streaming is not something that ceases to exist—shared temporal experiences simply occur in new forms.
HQ Trivia
One such strategy to reactivate a more traditional form of televisual liveness through streaming is to make streaming more social and mobile. Launched in August 2017, HQ Trivia (later retitled HQ Trivia and Words) requires users, known as HQties, to download the app and log in at 3.00 pm and 9.00 pm Eastern Standard Time to join a live gameshow. In each session, gameshow hosts ask a series of 12 single-elimination questions with three answer choices. Any users who successfully answer all 12 questions correctly split the prize pool for the show, which ranges from $250 to $250,000. Though these monetary prizes appear substantial, the per-person winnings paid out are often quite low based on the number of winners splitting the pool. In the short time since its inception, HQ has had high and low audience participation numbers and has also spawned a myriad of imitators, including Facebook’s “Confetti” gameshow. Mobile streaming via trivia gameshows is a return to forms of televisual liveness and participation often disrupted by the flexible nature of streaming. HQ’s twice-a-day events require users to re-adapt to temporal constraints to play and participate. Just as intriguing is that “HQ sees its biggest user participation—and largest prizes—on Sundays, especially if games coincide with national events, such as holidays, sports games or award shows” (Alcantara). Though it is difficult to draw conclusions from this correlation, the fact that HQ garners more players and attention during events and holidays complicates notions of mobile trivia as a primary form of entertainment. It is possible, perhaps, that HQ is an evolution of the so-called second screen experience, in which a mobile device is used simultaneously with a television. As noted by Hye-Jin Lee and Mark Andrejevic, the rise of the second screen often enables real-time monitoring, customisation, and targeting that is envisioned by the promoters of the interactive commercial economy (41).
Second screens are a way to reestablish live-viewing and, by extension, advertising through the importance of affective economies (46). Affect, or a preconscious structure of feeling, is critical to platform monetisation, in that the capture of big data requires an infrastructuralisation of desire—in streaming media often a desire for entertainment (Cockayne 6). Through affective capture, users become willing to repeat certain actions via love for and connection to a platform. Put another way, big data collection and processing is often the central monetisation strategy of platforms, but capturing this data requires first cultivating user attachment and repeat actions.To this end, many platforms operate by encouraging as much user engagement as possible. HQ certainly endeavors for strong affective investment by users (a video search for “HQ Trivia winner reactions” demonstrates the often-zealous nature of HQties, even when winning relatively low amounts of prize money). However, HQ departs from the typical platform streaming model in that engagement with the app is limited to two games per day. These comparatively diminutive temporal appointments have substantial implications for HQ’s strategies of valorisation, or the process of apprehending and making productive the user as laborer in new times and spaces (Franklin 13). Media theorists have long acknowledged the “work of watching” television, in which the televisual is “a real economic process, a value-creating process, and a metaphor, a reflection of value creation in the economy as a whole” (Jhally and Livant 125). Televisual monetisation is predominately based on the advertising model, which functions to accelerate the selling of commodities. This configuration of capital accumulation is enabled by a lineage of privatisation of broadcasting; television is heralded as a triumph of deregulation, but in practice is an oligopolistic, advertising-supported system of electronic media aided by government policies (Streeter 175). By contrast, streaming media accomplishes capitalistic accumulation through the collection, storage, and processing of big data via cloud infrastructure. Cloud infrastructure enables unprecedented storage and analytic capacity, and is heavily utilised in streaming media to compress and transmit data packets.Although the metaphor of the cloud situates user data as ephemeral and free, these infrastructures are better conceptualised as a “digital enclosure”, which invokes the importance of privatisation and commodification, as well as the materiality and spatiality of data collection (Andrejevic 297). As such, streaming monetisation is often achieved through the multitude of monetisation possibilities that occur through the collection of vast amounts of user data. Streaming and mobile streaming, then, are similar to the televisual in that these processes monetise the work of watching; yet, the ubiquitous data collection of streaming permits more efficient forms of computational commodification.Mobile streaming media continues the lineage of ubiquitous immaterial labor—a labor form that can, and commonly is, accomplished by “filling the cracks” of non-work time with content engagement and accompanying data collection. HQ Trivia, nevertheless, functions as a notable departure from this model in that company has made public claims that the platform will not utilise the myriad user identification and location data collected by the app. 
Instead, HQ has engaged in brand promotions that include Warner Brothers movies Ready Player One and Rampage, along with a brief Nike partnership (Feldman; Perry). Here, mobile and temporal valorisation occurs through monetisation strategies more akin to traditional televisual advertising than the techniques of big data collection often utilised by platforms. Whether or not eschewing the proclivity toward monetising user data for a more traditional form of brand promotion will yield rewards for HQ remains to be seen. Nonetheless, this return to more conventional televisual monetisation strategies sets HQ apart from many other applications that rely on data collection and subsequent sale of user data for targeted advertisements.Affective attachment and the transformation of leisure times through mobile devices is critical not just to value generation, but also to the relationship between mobile streaming and temporal and mobile control. As previously noted, Sharma elucidates that time is part of biopolitical forms of control, produced and experienced differently. Nick Couldry echoes these sentiments, in that there are rival forms of liveness stemming from a desire for connectivity, and that these “types of liveness are now pulling in different directions” (360). Despite common positionings, the relationship between television and streaming media is not a neat linear evolution—television, streaming, and mobile streaming continue to operate both side-by-side and in conjunction with one another. The experience of time, nevertheless, operates differently in these media forms. Explained by Wendy Chun, television structures temporality through steady streams of information, the condensation of time that demands response in crisis, and the most powerful moments of “touching the real” via catastrophe (74). New media differs by instead fostering crisis as the norm, in that “crises promise to move users from banal to the crucial by offering the experience of something like responsibility; something like the consequences and joys of ‘being in touch’” (Chun 75). New media crisis is often felt via reminders and other increasingly pervasive prompts that require an immediate user response. HQ differs from other forms of streaming and mobile streaming in that the plastic and flexible nature of viewing is replaced by mobile notifications and reminders that one must be ready for twice-daily games or risk losing a chance to win.In contributing to a sense of new media crisis, HQ fosters novel expectations for the mobile streaming subject. Through temporally-bound mobile livestreaming, “networked smart screens are the mechanism by which time and space will be both overcome and reanimated” as the “real world” is transformed into a magical landscape of mobile desire (Oswald and Packer 286). There is a double-edged element to this transformation, however, in that power of HQ Trivia is the ability to reanimate space through a promise that users are able to win substantial prize money only if one remembers to tune in at certain times. Within HQ Trivia, the much-emphasised temporal freedom of streaming time-shifting is eschewed for more traditional forms of televisual liveness; at the same time, smartphone technologies permit mobile on-the-go forms of engagement. Accordingly, a more traditional televisual simultaneity reemerges even as the spaces of streaming are untethered from the living room. 
It is in this reemphasis of liveness and sharedness that the user is simultaneously empowered vis-à-vis mobile devices and made mobile streaming subject through new temporal expectations and forms of monetisation.As mobile streaming becomes increasingly pervasive, new experimental applications jockey for user attention and time. HQ Trivia’s model of eschewing data collection for more traditional televisual monetisation represents attempts to recreate mobile media engagement not through individual isolated audio-visual practices, but instead through a live and mobile experience. Consequently, HQ Trivia and other temporally-bound gameshow apps demonstrate a reimagined live televisual experience, and, in turn, a monetisation of mobile engagement through affective investment.ReferencesAlcantara, Chris. “Diving into HQ Trivia: The Toughest Rounds, the Best Time to Play and How Some Users Beat the Odds.” The Washington Post 5 Mar. 2018. <http://www.washingtonpost.com/graphics/2018/business/hq-trivia/?utm_term=.02dc389ae3a9>.Andrejevic, Mark. “Surveillance in the Digital Enclosure.” The Communication Review 10.4 (2007): 295-317.Bury, Rhiannon, and Johnson Li. “Is It Live or Is It Timeshifted, Streamed or Downloaded? Watching Television in the Era of Multiple Screens.” New Media & Society 17.4 (2013): 592-610.Chun, Wendy Hui Kyong. Updating to Remain the Same: Habitual New Media. Cambridge: MIT Press, 2017.Cockayne, Daniel G. “Affect and Value in Critical Examinations of the Production and ‘Prosumption’ of Big Data.” Big Data & Society 3.2 (2016): 1-11.Couldry, Nick. “Liveness, ‘Reality,’ and the Mediated Habitus from Television to the Mobile Phone.” Communication Review 7.4 (2004): 353-361.Droesch, Blake. “More than Half of US Social Network Users Will Be Mobile-Only in 2019.” EMarketer 26 Apr. 2019. <http://www.emarketer.com/content/more-than-half-of-social-network-users-will-be-mobile-only-in-2019>.Franklin, Seb. Control: Digitality as Cultural Logic. Cambridge: MIT Press, 2015.Galili, Doron. “Seeing by Electricity: The Emergence of Television and the Modern Mediascape, 1878—1939.” PhD dissertation. Chicago: U of Chicago, 2011.Irani, Lilly, Robin Jeffries, and Andrea Knight. “Rhythms and Plasticity: Television Temporality at Home.” Personal and Ubiquitous Computing 14.7 (2010): 621-632.Jenner, Mareike. “Is This TVIV? On Netflix, TVIII and Binge-Watching.” New Media & Society 18.2 (2014): 257-273.Jhally, Sut, and Bill Livant. “Watching as Working: The Valorization of Audience Consciousness.” Journal of Communication 36.3 (1986): 124-143.Lee, Hye-Jin, and Mark Andrejevic. “Second-Screen Theory: From Democratic Surround to the Digital Enclosure.” Connected Viewing: Selling, Streaming & Sharing Media in the Digital Age. Eds. Jennifer Holt and Kevin Sanson. New York: Routledge, 2014. 40-62.Massey, Doreen. For Space. London: Sage, 2005.Munson, Ben. “More than Half of Global Video Views Start on Mobile.” Fierce Video 24 Sep. 2019. <https://www.fiercevideo.com/video/more-than-half-global-video-views-start-mobile-report-says>.Oswald, Kathleen, and Jeremy Packer. “Flow and Mobile Media.” Communication Matters: Materialist Approaches to Media, Mobility and Networks. Eds. Jeremy Packer and Stephen B. Crofts Wiley. New York: Routledge, 2012. 276-287.Perry, Erica. “Here's How HQ Trivia Is Finally Monetizing Its Massive Audience.” Social Media Week 29 Mar. 2018. <http://socialmediaweek.org/blog/2018/03/heres-how-hq-trivia-is-finally-monetizing-its-massive-audience/>.Peters, John Durham. 
The Marvelous Clouds: Toward a Philosophy of Elemental Media. Chicago: U of Chicago P, 2016.Sharma, Sarah. In the Meantime: Temporality and Cultural Politics. Durham: Duke UP, 2014.Sterling, Greg. “Nearly 80 Percent of Social Media Time Now Spent on Mobile Devices.” Marketing Land 4 Apr. 2016. <http://marketingland.com/facebook-usage-accounts-1-5-minutes-spent-mobile-171561>.Streeter, Thomas. Selling the Air. Chicago: U of Chicago P, 1996.Turner, Graeme. “'Liveness' and 'Sharedness' Outside the Box” Flow Journal 8 (2011). <https://www.flowjournal.org/2011/04/liveness-and-sharedness-outside-the-box/>.Uricchio, William. “Television's First Seventy-Five Years: The Interpretive Flexibility of a Medium in Transition.” The Oxford Handbook of Film and Media Studies. Ed. Robert Kolker. Oxford: Oxford UP, 2008. 286-305.Wajcman, Judy, and Nigel Dodd. “Introduction: The Powerful Are Fast, The Powerless Are Slow.” The Sociology of Speed: Digital, Organizational, and Social Temporalities. Eds. Judy Wajcman and Nigel Dodd. Oxford: Oxford UP, 2017. 1-12.
APA, Harvard, Vancouver, ISO, and other styles
45

Wagman, Ira. "Wasteaminute.com: Notes on Office Work and Digital Distraction." M/C Journal 13, no. 4 (2010). http://dx.doi.org/10.5204/mcj.243.

Full text
Abstract:
For those seeking a diversion from the drudgery of work there are a number of websites offering to take you away. Consider the case of wasteaminute.com. On the site there is everything from flash video games, soft-core pornography and animated nudity, to puzzles and parlour games like poker. In addition, the site offers links to video clips grouped in categories such as “funny,” “accidents,” or “strange.” With its bright yellow bubble letters and elementary design, wasteaminute will never win any Webby awards. It is also unlikely to be part of a lucrative initial public offering for its owner, a web marketing company based in Lexington, Kentucky. The internet ratings company Alexa gives wasteaminute a ranking of 5,880,401 when it comes to the most popular sites online over the last three months, quite some way behind sites like Wikipedia, Facebook, and Windows Live.Wasteaminute is not unique. There exists a group of websites, a micro-genre of sorts, that go out of their way to offer momentary escape from the more serious work at hand, with a similar menu of offerings. These include sites with names such as ishouldbeworking.com, i-am-bored.com, boredatwork.com, and drivenbyboredom.com. These web destinations represent only the most overtly named time-wasting opportunities. Video sharing sites like YouTube or France’s DailyMotion, personalised home pages like iGoogle, and the range of applications available on mobile devices offer similar opportunities for escape. Wasteaminute inspired me to think about the relationship between digital media technologies and waste. In one sense, the site’s offerings remind us of the Internet’s capacity to re-purpose old media forms from earlier phases in the digital revolution, like the retro video game PacMan, or from aspects of print culture, like crosswords (Bolter and Grusin; Straw). For my purposes, though, wasteaminute permits the opportunity to meditate, albeit briefly, on the ways media facilitate wasting time at work, particularly for those working in white- and no-collar work environments. In contemporary work environments work activity and wasteful activity exist on the same platform. With a click of a mouse or a keyboard shortcut, work and diversion can be easily interchanged on the screen, an experience of computing I know intimately from first-hand experience. The blurring of lines between work and waste has accompanied the extension of the ‘working day,’ a concept once tethered to the standardised work-week associated with modernity. Now people working in a range of professions take work out of the office and find themselves working in cafes, on public transportation, and at times once reserved for leisure, like weekends (Basso). In response to the indeterminate nature of when and where we are at work, the mainstream media routinely report about the wasteful use of computer technology for non-work purposes. Stories such as a recent one in the Washington Post which claimed that increased employee use of social media sites like Facebook and Twitter led to decreased productivity at work have become quite common in traditional media outlets (Casciato). Media technologies have always offered the prospect of making office work more efficient or the means for management to exercise control over employees. However, those same technologies have also served as the platforms on which one can engage in dilatory acts, stealing time from behind the boss’s back. 
I suggest stealing time at work may well be a “tactic,” in the sense used by Michel de Certeau, as a means to resist the rules and regulations that structure work and the working life. However, I also consider it to be a tactic in a different sense: websites and other digital applications offer users the means to take time back, in the form of ‘quick hits,’ providing immediate visual or narrative pleasures, or through interfaces which make the time-wasting look like work (Wagman). Reading sites like wasteaminute as examples of ‘office entertainment,’ reminds us of the importance of workers as audiences for web content. An analysis of a few case studies also reveals how the forms of address of these sites themselves recognise and capitalise on an understanding of the rhythms of the working day, as well as those elements of contemporary office culture characterised by interruption, monotony and surveillance. Work, Media, Waste A mass of literature documents the transformations of work brought on by industrialisation and urbanisation. A recent biography of Franz Kafka outlines the rigors imposed upon the writer while working as an insurance agent: his first contract stipulated that “no employee has the right to keep any objects other than those belonging to the office under lock in the desk and files assigned for its use” (Murray 66). Siegfried Kracauer’s collection of writings on salaried workers in Germany in the 1930s argues that mass entertainment offers distractions that inhibit social change. Such restrictions and inducements are exemplary of the attempts to make work succumb to managerial regimes which are intended to maximise productivity and minimise waste, and to establish a division between ‘company time’ and ‘free time’. One does not have to be an industrial sociologist to know the efforts of Frederick W. Taylor, and the disciplines of “scientific management” in the early twentieth century which were based on the idea of making work more efficient, or of the workplace sociology scholarship from the 1950s that drew attention to the ways that office work can be monotonous or de-personalising (Friedmann; Mills; Whyte). Historian JoAnne Yates has documented the ways those transformations, and what she calls an accompanying “philosophy of system and efficiency,” have been made possible through information and communication technologies, from the typewriter to carbon paper (107). Yates evokes the work of James Carey in identifying these developments, for example, the locating of workers in orderly locations such as offices, as spatial in nature. The changing meaning of work, particularly white-collar or bureaucratic labour in an age of precarious employment and neo-liberal economic regimes, and aggressive administrative “auditing technologies,” has subjected employees to more strenuous regimes of surveillance to ensure employee compliance and to protect against waste of company resources (Power). As Andrew Ross notes, after a deep period of self-criticism over the drudgery of work in North American settings in the 1960s, the subsequent years saw a re-thinking of the meaning of work, one that gradually traded greater work flexibility and self-management for more assertive forms of workplace control (9). As Ross notes, this too has changed, an after-effect of “the shareholder revolution,” which forced companies to deliver short-term profitability to its investors at any social cost. 
With so much at stake, Ross explains, the freedom of employees assumed a lower priority within corporate cultures, and “the introduction of information technologies in the workplace of the new capitalism resulted in the intensified surveillance of employees” (12). Others, like Dale Bradley, have drawn attention to the ways that the design of the office itself has always concerned itself with the bureaucratic and disciplinary control of bodies in space (77). The move away from physical workspaces such as ‘the pen’ to the cubicle and now from the cubicle to the virtual office is for Bradley a move from “construction” to “connection.” This spatial shift in the way in which control over employees is exercised is symbolic of the liquid forms in which bodies are now “integrated with flows of money, culture, knowledge, and power” in the post-industrial global economies of the twenty-first century. As Christena Nippert-Eng points out, receiving office space was seen as a marker of trust, since it provided employees with a sense of privacy to carry out affairs—both of a professional or of a personal matter—out of earshot of others. Privacy means a lot of things, she points out, including “a relative lack of accountability for our immediate whereabouts and actions” (163). Yet those same modalities of control which characterise communication technologies in workspaces may also serve as the platforms for people to waste time while working. In other words, wasteful practices utilize the same technology that is used to regulate and manage time spent in the workplace. The telephone has permitted efficient communication between units in an office building or between the office and outside, but ‘personal business’ can also be conducted on the same line. Radio stations offer ‘easy listening’ formats, providing unobtrusive music so as not to disturb work settings. However, they can easily be tuned to other stations for breaking news, live sports events, or other matters having to do with the outside world. Photocopiers and fax machines facilitate the reproduction and dissemination of communication regardless of whether it is it work or non-work related. The same, of course, is true for computerised applications. Companies may encourage their employees to use Facebook or Twitter to reach out to potential clients or customers, but those same applications may be used for personal social networking as well. Since the activities of work and play can now be found on the same platform, employers routinely remind their employees that their surfing activities, along with their e-mails and company documents, will be recorded on the company server, itself subject to auditing and review whenever the company sees fit. Employees must be careful to practice image management, in order to ensure that contradictory evidence does not appear online when they call in sick to the office. Over time the dynamics of e-mail and Internet etiquette have changed in response to such developments. Those most aware of the distractive and professionally destructive features of downloading a funny or comedic e-mail attachment have come to adopt the acronym “NSFW” (Not Safe for Work). Even those of us who don’t worry about those things are well aware that the cache and “history” function of web browsers threaten to reveal the extent to which our time online is spent in unproductive ways. 
Many companies and public institutions, for example libraries, have taken things one step further by filtering out access to websites that may be peripheral to the primary work at hand. At the same time, contemporary workplace settings have sought to mix both work and play, or better yet to use play in the service of work, to make “work” more enjoyable for its workers. Professional development seminars, team-building exercises, company softball games, or group outings are examples intended to build morale and loyalty to the company among workers. Some companies offer their employees access to gyms, to game rooms, and to big screen TVs, in return for long and arduous—indeed, punishing—hours of time at the office (Dyer-Witheford and Sharman; Ross). In this manner, acts of not working are reconfigured as a form of work, or at least as a productive experience for the company at large. Such perks are offered with an assumption of personal self-discipline, a feature of what Nippert-Eng characterises as the “discretionary workplace” (154). Of course, this also comes with an expectation that workers will stay close to the office, and to their work. As Sarah Sharma recently argued in this journal, such thinking is part of the way that late capitalism constructs “innovative ways to control people’s time and regulate their movement in space.” At the same time, however, there are plenty of moments of gentle resistance, in which the same machines of control and depersonalisation can be customised, and where individual expressions find their own platforms. A photo essay by Anna McCarthy in the Journal of Visual Culture records the inspirational messages and other personalised objects with which workers adorn their computers and work stations. McCarthy’s photographs represent the way people express themselves in relation to their work, making it a “place where workplace politics and power relations play out, often quite visibly” (McCarthy 214).
Screen Secrets
If McCarthy’s photo essay illustrates the overt ways in which people bring personal expression or gentle resistance to anodyne workplaces, there are also a series of other ‘screen acts’ that create opportunities to waste time in ways that are disguised as work. During the Olympics and US college basketball playoffs, both American broadcast networks CBS and NBC offered a “boss button,” a graphic link that a user could immediately click “if the boss was coming by” that transformed the screen into something that was associated with the culture of work, such as a spreadsheet. Other purveyors of networked time-wasting make use of the spreadsheet to mask distraction. The website cantyouseeimbored turns a spreadsheet into a game of “Breakout!” while other sites, like Spreadtweet, convert your Twitter updates into the form of a spreadsheet. Such boss buttons and screen interfaces that mimic work are the present-day avatars of the “panic button,” a graphic image found at the bottom of websites back in the days of Web 1.0. A click of the panic button transported users away from an offending website and towards something more legitimate, like Yahoo! Even if it is unlikely that boss keys actually convince one’s superiors that one is really working—clicking to a spreadsheet only makes sense for a worker who might be expected to be working on those kinds of documents—they are an index of how notions of personal space and privacy play out in the digitalised workplace.
David Kiely, an employee at an Australian investment bank, experienced this first hand when he opened an e-mail attachment sent to him by his co-workers featuring a scantily-clad model (Cuneo and Barrett). Unfortunately for Kiely, at the time he opened the attachment his computer screen was visible in the background of a network television interview with another of the bank’s employees. Kiely’s inauspicious click (which made him the subject of an investigation by his employer) continues to circulate on the Internet, and it spawned a number of articles highlighting the precarious nature of work in a digitalised environment where what might seem to be private can suddenly become very public, and thus able to be disseminated without restraint. At the same time, the public appetite for Kiely’s story indicates that not working at work, and using the Internet to do it, represents a mode of media consumption that is familiar to many of us, even if it is only the servers on the company computer that can account for how much time we spend doing it. Community attitudes towards time spent unproductively online remind us that waste carries with it a range of negative signifiers. We talk about wasting time in terms of theft, “stealing time,” or even more dramatically as “killing time.” The popular construction of television as the “boob tube” distinguishes it from more ‘productive’ activities, like spending time with family, or exercise, or involvement in one’s community. The message is simple: life is too short to be “wasted” on such ephemera. If this kind of language is less familiar in the digital age, the discourse of ‘distraction’ is more prevalent. Yet, instead of judging distraction a negative symptom of the digital age, perhaps we should reinterpret wasting time as the worker’s attempt to assert some agency in an increasingly controlled workplace.
References
Basso, Pietro. Modern Times, Ancient Hours: Working Lives in the Twenty-First Century. London: Verso, 2003.
Bolter, Jay David, and Richard Grusin. Remediation: Understanding New Media. Cambridge: MIT Press, 2000.
Bradley, Dale. “Dimensions Vary: Technology, Space, and Power in the 20th Century Office”. Topia 11 (2004): 67-82.
Casciato, Paul. “Facebook and Other Social Media Cost UK Billions”. Washington Post, 5 Aug. 2010. 11 Aug. 2010 ‹http://www.washingtonpost.com/wp-dyn/content/article/2010/08/05/AR2010080503951.html›.
Cuneo, Clementine, and David Barrett. “Was Banker Set Up Over Saucy Miranda”. The Daily Telegraph 4 Feb. 2010. 21 May 2010 ‹http://www.dailytelegraph.com.au/entertainment/sydney-confidential/was-banker-set-up-over-saucy-miranda/story-e6frewz0-1225826576571›.
De Certeau, Michel. The Practice of Everyday Life. Vol. 1. Berkeley: U of California P, 1988.
Dyer-Witheford, Nick, and Zena Sharman. “The Political Economy of Canada's Video and Computer Game Industry”. Canadian Journal of Communication 30.2 (2005). 1 May 2010 ‹http://www.cjc-online.ca/index.php/journal/article/view/1575/1728›.
Friedmann, Georges. Industrial Society. Glencoe, Ill.: Free Press, 1955.
Kracauer, Siegfried. The Salaried Masses. London: Verso, 1998.
McCarthy, Anna. Ambient Television. Durham: Duke UP, 2001.
———. “Geekospheres: Visual Culture and Material Culture at Work”. Journal of Visual Culture 3 (2004): 213-21.
Mills, C. Wright. White Collar. Oxford: Oxford UP, 1951.
Murray, Nicholas. Kafka: A Biography. New Haven: Yale UP, 2004.
Newman, Michael. “Ze Frank and the Poetics of Web Video”. First Monday 13.5 (2008). 1 Aug. 2010 ‹http://www.uic.edu/htbin/cgiwrap/bin/ojs/index.php/fm/article/view/2102/1962›.
Nippert-Eng, Christena. Home and Work: Negotiating Boundaries through Everyday Life. Chicago: U of Chicago P, 1996.
Power, Michael. The Audit Society. Oxford: Oxford UP, 1997.
Ross, Andrew. No Collar: The Humane Workplace and Its Hidden Costs. Philadelphia: Temple UP, 2004.
Sharma, Sarah. “The Great American Staycation and the Risk of Stillness”. M/C Journal 12.1 (2009). 11 May 2010 ‹http://journal.media-culture.org.au/index.php/mcjournal/article/viewArticle/122›.
Straw, Will. “Embedded Memories”. Residual Media. Ed. Charles Acland. U of Minnesota P, 2007. 3-15.
Wagman, Ira. “Log On, Goof Off, Look Up: Facebook and the Rhythms of Canadian Internet Use”. How Canadians Communicate III: Contexts for Popular Culture. Eds. Bart Beaty, Derek Briton, Gloria Filax, and Rebecca Sullivan. Athabasca: Athabasca UP, 2009. 55-77. ‹http://www2.carleton.ca/jc/ccms/wp-content/ccms-files/02_Beaty_et_al-How_Canadians_Communicate.pdf›.
Whyte, William. The Organisation Man. New York: Simon and Schuster, 1957.
Yates, JoAnne. “Business Use of Information Technology during the Industrial Age”. A Nation Transformed by Information. Eds. Alfred D. Chandler & James W. Cortada. Oxford: Oxford UP, 2000. 107-36.
APA, Harvard, Vancouver, ISO, and other styles
46

Bayes, Chantelle. "The Cyborg Flâneur: Reimagining Urban Nature through the Act of Walking." M/C Journal 21, no. 4 (2018). http://dx.doi.org/10.5204/mcj.1444.

Full text
Abstract:
The concept of the “writer flâneur”, as developed by Walter Benjamin, sought to make sense of the seemingly chaotic nineteenth century city. While the flâneur provided a way for new urban structures to be ordered, it was also a transgressive act that involved engaging with urban spaces in new ways. In the contemporary city, where spaces are now heavily controlled and ordered, some members of the city’s socio-ecological community suffer as a result of idealistic notions of who and what belongs in the city, and how we must behave as urban citizens. Many of these ideals emerge from nineteenth century conceptions of the city in contrast to the country (Williams). However, a reimagining of the flâneur can allow for new transgressions of urban space and result in new literary imaginaries that capture the complexity of urban environments, question some of the more damaging processes and systems, offer new ways of connecting with the city, and propose alternative ways of living with the non-human in such places. With reference to the work of Debra Benita Shaw, Rob Shields and Donna Haraway, I will examine how the urban walking figure might be reimagined as cyborg, complicating boundaries between the real and imagined, the organic and inorganic, and between the human and non-human (Haraway Cyborgs). I will argue that the cyborg flâneur allows for new ways of writing and reading the urban and can work to reimagine the city as posthuman multispecies community. As one example of cyborg flânerie, I look to the app Story City to show how a writer can develop new environmental imaginaries in situ as an act of resistance against the anthropocentric ordering of the city. This article intends to begin a conversation about the ethical, political and epistemological potential of cyborg flânerie and leads to several questions which will require further research.Shaping the City: Environmental ImaginariesIn a sense, the flâneur is the product of a utopian imaginary of the city. According to Shields, Walter Benjamin used the flâneur as a literary device to make sense of the changing modern city of Paris: The flâneur is a hero who excels under the stress of coming to terms with a changing ‘social spatialisation’ of everyday social and economic relations which in the nineteenth century increasingly extended the world of the average person further and further to include rival mass tourism destinations linked by railroad, news of other European powers and distant colonies. This expanding spatialization took the form of economic realities such as changing labour markets and commodity prices and social encounters with strangers and foreigners which impinged on the life world of Europeans. (Fancy Footwork 67)Through his writing, these new spaces and inhabitants were made familiar again to those that lived there. In consequence, the flâneur was seen as a heroic figure who approached the city like a wilderness to be studied and tamed:Even to early 20th-century sociologists the flâneur was a heroic everyman—masculine, controlled and as in tune with his environment as James Fenimore Cooper’s Mohican braves were in their native forests. Anticipating the hardboiled hero of the detective novel, the flâneur pursued clues to the truth of the metropolis, attempting to think through its historical specificity, to inhabit it, even as the truth of empire and commodity capitalism was hidden from him. (Shields Flanerie 210)In this way, the flâneur was a stabilising force, categorising and therefore ordering the city. 
However, flânerie was also a transgressive act as the walker engaged in eccentric and idle wandering against the usual purposeful walking practices of the time (Coates). Drawing on this aspect, flânerie has increasingly been employed in the humanities and social sciences as a practice of resistance as Jamie Coates has shown. This makes the flâneur, albeit in a refigured form, a useful tool for transgressing strict socio-ecological conventions that affect the contemporary city.Marginalised groups are usually the most impacted by the strict control and ordering of contemporary urban spaces in response to utopian imaginaries of who and what belong. Marginalised people are discouraged and excluded from living in particular areas of the city through urban policy and commercial practices (Shaw 7). Likewise, certain non-human others, like birds, are allowed to inhabit our cities while those that don’t fit ideal urban imaginaries, like bats or snakes, are controlled, excluded or killed (Low). Defensive architecture, CCTV, and audio deterrents are often employed in cities to control public spaces. In London, the spiked corridor of a shop entrance designed to keep homeless people from sleeping there (Andreou; Borromeo) mirrors the spiked ledges that keep pigeons from resting on buildings (observed 2012/2014). On the Gold Coast youths are deterred from loitering in public spaces with classical music (observed 2013–17), while in Brisbane predatory bird calls are played near outdoor restaurants to discourage ibis from pestering customers (Hinchliffe and Begley). In contrast, bright lights, calming music and inviting scents are used to welcome orderly consumers into shopping centres while certain kinds of plants are cultivated in urban parks and gardens to attract acceptable wildlife like butterflies and lorikeets (Wilson; Low). These ways of managing public spaces are built on utopian conceptions of the city as a “civilising” force—a place of order, consumption and safety.As environmental concerns become more urgent, it is important to re-examine these conceptions of urban environments and the assemblage of environmental imaginaries that interact and continue to shape understandings of and attitudes towards human and non-human nature. The network of goods, people and natural entities that feed into and support the city mean that imaginaries shaped in urban areas influence both urban and surrounding peoples and ecologies (Braun). Local ecologies also become threatened as urban structures and processes continue to encompass more of the world’s populations and locales, often displacing and damaging entangled natural/cultural entities in the process. Furthermore, conceptions and attitudes shaped in the city often feed into global systems and as such can have far reaching implications for the way local ecologies are governed, built, and managed. There has already been much research, including work by Lawrence Buell and Ursula Heise, on the contribution that art and literature can make to the development of environmental imaginaries, whether intentional or unintentional, and resulting in both positive and negative associations with urban inhabitants (Yusoff and Gabrys; Buell; Heise). Imaginaries might be understood as social constructs through which we make sense of the world and through which we determine cultural and personal values, attitudes and beliefs. 
According to Neimanis et al., environmental imaginaries help us to make sense of the way physical environments shape “one’s sense of social belonging” as well as how we “formulate—and enact—our values and attitudes towards ‘nature’” (5). These environmental imaginaries underlie urban structures and work to determine which aspects of the city are valued, who is welcomed into the city, and who is excluded from participation in urban systems and processes. The development of new narrative imaginaries can question some of the underlying assumptions about who or what belongs in the city and how we might settle conflicts in ecologically diverse communities. The reimagined flâneur then might be employed to transgress traditional notions of belonging in the city and replace this with a sense of “becoming” in relation with the myriad of others inhabiting the city (Haraway The Trouble). Like the Benjaminian flâneur, the postmodern version enacts a similar transgressive walking practice. However, the postmodern flâneur serves to resist dominant narratives, with a “greater focus on the tactile and grounded qualities of walking” than the traditional flâneur—and, as opposed to the lone detached wanderer, postmodern flâneurs engage in a network of social relationships and may even wander in groups (Coates 32). By employing the notion of the postmodern flâneur, writers might find ways to address problematic urban imaginaries and question dominant narratives about who should and should not inhabit the city. Building on this and in reference to Haraway (Cyborgs), the notion of a cyborg flâneur might take this resistance one step further, not only seeking to counter the dominant social narratives that control urban spaces but also resisting anthropocentric notions of the city. Where the traditional flâneur walked a pet tortoise on a leash, the cyborg flâneur walks with a companion species (Shields Fancy Footwork; Haraway Companion Species). The distinction is subtle. The traditional flâneur walks a pet, an object of display that showcases the eccentric status of the owner. The cyborg flâneur walks in mutual enjoyment with a companion (perhaps a domestic companion, perhaps not); their path negotiated together, tracked, and mapped via GPS. The two acts may at first appear the same, but the difference is in the relationship between the human, non-human, and the multi-modal spaces they occupy. As Coates argues, not everyone who walks is a flâneur and similarly, not everyone who engages in relational walking is a cyborg flâneur. Rather, a cyborg flâneur enacts a deliberate practice of walking in relation with naturecultures to transgress boundaries between human and non-human, cultural and natural, and the virtual, material and imagined spaces that make up a place.
The Posthuman City: Cyborgs, Hybrids, and Entanglements
In developing new environmental imaginaries, posthuman conceptions of the city can be drawn upon to readdress urban space as complex, questioning utopian notions of the city particularly as they relate to the exclusion of certain others, and allowing for diverse socio-ecological communities. The posthuman city might be understood in opposition to anthropocentric notions where the non-human is seen as something separate to culture and in need of management and control within the human sphere of the city. Instead, the posthuman city is a complex entanglement of hybrid non-human, cultural and technological entities (Braun; Haraway Companion Species).
The flâneur who experiences the city through a posthuman lens acknowledges the human as already embodied and embedded in the non-human world. Key to re-imagining the city is recognising the myriad ways in which non-human nature also acts upon us and influences decisions on how we live in cities (Schliephake 140). This constitutes a “becoming-with each other”, in Haraway’s terms, which recognises the interdependency of urban inhabitants (The Trouble 3). In re-considering the city as a negotiated process between nature and culture rather than a colonisation of nature by culture, the agency of non-humans to contribute to the construction of cities and indeed environmental imaginaries must be acknowledged. Living in the posthuman city requires us humans to engage with the city on multiple levels as we navigate the virtual, corporeal, and imagined spaces that make up the contemporary urban experience. The virtual city is made up of narratives projected through media productions such as tourism campaigns, informational plaques, site markers, and images on Google map locations, all of which privilege certain understandings of the city. Virtual narratives serve to define the city through a network of historical and spatially determined locales. Closely bound up with the virtual is the imagined city that draws on urban ideals, potential developments, mythical or alternative versions of particular cities as well as literary interpretations of cities. These narratives are overlaid on the places that we engage with in our everyday lived experiences. Embodied encounters with the city serve to reinforce or counteract certain virtual and imagined versions while imagined and virtual narratives enhance locales by placing current experience within a temporal narrative that extends into the past as well as the future. Walking the City: The Cyber/Cyborg FlâneurThe notion of the cyber flâneur emerged in the twenty-first century from the practices of idly surfing the Internet, which in many ways has become an extension of the cityscape. In the contemporary world where we exist in both physical and digital spaces, the cyber flâneur (and indeed its cousin the virtual flâneur) have been employed to make sense of new digital sites of connection, voyeurism, and consumption. Metaphors that evoke the city have often been used to describe the experience of the digital including “chat rooms”, “cyber space”, and “home pages” while new notions of digital tourism, the rise of online shopping, and meeting apps have become substitutes for engaging with the physical sites of cities such as shopping malls, pubs, and attractions. The flâneur and cyberflâneur have helped to make sense of the complexities and chaos of urban life so that it might become more palatable to the inhabitants, reducing anxieties about safety and disorder. However, as with the concept of the flâneur, implicit in the cyberflâneur is a reinforcement of traditional urban hierarchies and social structures. This categorising has also worked to solidify notions of who belongs and who does not. Therefore, as Debra Benita Shaw argues, the cyberflâneur is not able to represent the complexities of “how we inhabit and experience the hybrid spaces of contemporary cities” (3). Here, Shaw suggests that Haraway’s cyborg might be used to interrupt settled boundaries and to reimagine the urban walking figure. In both Shaw and Shields (Flanerie), the cyborg is invoked as a solution to the problematic figure of the flâneur. 
While Shaw presents these figures in opposition and proposes that the flâneur be laid to rest as the cyborg takes its place, I argue that the idea of the flâneur may still have some use, particularly when applied to new multi-modal narratives. As Shields demonstrates, the cyborg operates in the virtual space of simulation rather than at the material level (217). Instead of setting up an opposition between the cyborg and flâneur, these figures might be merged to bring the cyborg into being through the material practice of flânerie, while refiguring the flâneur as posthuman. The traditional flâneur sought to define space, but the cyborg flâneur might be seen to perform space in relation to an entangled natural/cultural community. By drawing on this notion of the cyborg, it becomes possible to circumvent some of the traditional associations with the urban walking figure and imagine a new kind of flâneur, one that walks the streets as an act to complicate rather than compartmentalise urban space. As we emerge into a post-truth world where facts and fictions blur, creative practitioners can find opportunities to forge new ways of knowing, and new ways of connecting with the city through the cyborg flâneur. The development of new literary imaginaries can reconstruct natural/cultural relationships and propose alternative ways of living in a posthuman and multispecies community. The rise of smart-phone apps like Story City provides cyborg flâneurs with the ability to create digital narratives overlaid on real places and has the potential to encourage real connections with urban environments. While these apps are by no means the only activity that a cyborg flâneur might participate in, they offer the writer a platform to engage audiences in a purposeful and transgressive practice of cyborg flânerie. Such narratives produced through cyborg flânerie would conflate virtual, corporeal, and imagined experiences of the city and allow for new environmental imaginaries to be created in situ. The “readers” of these narratives can also become cyborg flâneurs as the traditional urban wanderer is combined with the virtual and imagined space of the contemporary city. As opposed to wandering the virtual city online, readers are encouraged to physically walk the city and engage with the narrative in situ. For example, in one narrative, readers are directed to walk a trail along the Brisbane river or through the CBD to chase a sea monster (Wilkins and Diskett). The reader can choose different pre-set paths which influence the outcome of each story and embed the story in a physical location. In this way, the narrative is layered onto the real streets and spaces of the cityscape. As the reader is directed to walk particular routes through the city, the narratives which unfold are also partly constructed by the natural/cultural entities which make up those locales establishing a narrative practice which engages with the urban on a posthuman level. The murky water of the Brisbane River could easily conceal monsters. Occasional sightings of crocodiles (Hall), fish that leap from the water, and shadows cast by rippling waves as the City Cat moves across the surface impact the experience of the story (observed 2016–2017). Potential exists to capitalise on this narrative form and develop new environmental imaginaries that pay attention to the city as a posthuman place. 
For example, a narrative might direct the reader’s attention to the networks of water that hydrate people and animals, allow transportation, and remove wastes from the city. People may also be directed to explore their senses within place, be encouraged to participate in sensory gardens, or respond to features of the city in new ways. The cyborg flâneur might be employed in much the same way as the flâneur, to help the “reader” make sense of the posthuman city, where boundaries are shifted, and increasing rates of social and ecological change are transforming contemporary urban sites and structures. Shields asks whether the cyborg might also act as “a stabilising figure amidst the collapse of dualisms, polluted categories, transgressive hybrids, and unstable fluidity” (Flanerie 211). As opposed to the traditional flâneur however, this “stabilising” figure doesn’t sort urban inhabitants into discrete categories but maps the many relations between organisms and technologies, fictions and realities, and the human and non-human. The cyborg flâneur allows for other kinds of “reading” of the city to take place—including those by women, families, and non-Western inhabitants. As opposed to the nineteenth century reader-flâneur, those who read the city through the Story City app are also participants in the making of the story, co-constructing the narrative along with the author and locale. I would argue this participation is a key feature of the cyborg flâneur narrative along with the transience of the narratives which may alter and eventually expire as urban structures and environments change. Not all those who engage with these narratives will necessarily enact a posthuman understanding and not all writers of these narratives will do so as cyborg flâneurs. Nevertheless, platforms such as Story City provide writers with an opportunity to engage participants to question dominant narratives of the city and to reimagine themselves within a multispecies community. In addition, by bringing readers into contact with the human and non-human entities that make up the city, there is potential for real relationships to be established. Through new digital platforms such as apps, writers can develop new environmental imaginaries that question urban ideals including conceptions about who belongs in the city and who does not. The notion of the cyborg is a useful concept through which to reimagine the city as a negotiated process between nature and culture, and to reimagine the flâneur as performer who becomes part of the posthuman city as they walk the streets. This article provides one example of cyborg flânerie in smart-phone apps like Story City that allow writers to construct new urban imaginaries, bring the virtual and imagined city into the physical spaces of the urban environment, and can act to re-place the reader in diverse socio-ecological communities. The reader then becomes both product and constructer of urban space, a cyborg flâneur in the cyborg city. This conversation raises further questions about the cyborg flâneur, including: how might cyborg flânerie be enacted in other spaces (rural, virtual, more-than-human)? What other platforms and narrative forms might cyborg flâneurs use to share their posthuman narratives? How might cyborg flânerie operate in other cities, other cultures and when adopted by marginalised groups? In answering these questions, the potential and limitations of the cyborg flâneur might be refined. 
The hope is that one day the notion of a cyborg flâneur will no longer be necessary as the posthuman city becomes a space of negotiation rather than exclusion.
References
Andreou, Alex. “Anti-Homeless Spikes: ‘Sleeping Rough Opened My Eyes to the City’s Barbed Cruelty.’” The Guardian 19 Feb. 2015. 25 Aug. 2017 <https://www.theguardian.com/society/2015/feb/18/defensive-architecture-keeps-poverty-undeen-and-makes-us-more-hostile>.
Borromeo, Leah. “These Anti-Homeless Spikes Are Brutal. We Need to Get Rid of Them.” The Guardian 23 Jul. 2015. 25 Aug. 2017 <https://www.theguardian.com/commentisfree/2015/jul/23/anti-homeless-spikes-inhumane-defensive-architecture>.
Braun, Bruce. “Environmental Issues: Writing a More-than-Human Urban Geography.” Progress in Human Geography 29.5 (2005): 635–50.
Buell, Lawrence. The Future of Environmental Criticism: Environmental Crisis and Literary Imagination. Malden: Blackwell, 2005.
Coates, Jamie. “Key Figure of Mobility: The Flâneur.” Social Anthropology 25.1 (2017): 28–41.
Hall, Peter. “Crocodiles Spotted in Queensland: A Brief History of Sightings and Captures in the Southeast.” The Courier Mail 4 Jan. 2017. 20 Aug. 2017 <http://www.couriermail.com.au/news/queensland/crocodiles-spotted-in-queensland-a-brief-history-of-sightings-and-captures-in-the-southeast/news-story/5fbb2d44bf3537b8a6d1f6c8613e2789>.
Haraway, Donna J. Staying with the Trouble: Making Kin in the Chthulucene. Durham: Duke UP, 2016.
———. The Companion Species Manifesto: Dogs, People, and Significant Otherness. Vol. 1. Chicago: Prickly Paradigm Press, 2003.
———. Simians, Cyborgs, and Women: The Reinvention of Nature. Oxon: Routledge, 1991.
Heise, Ursula K. Sense of Place and Sense of Planet: The Environmental Imagination of the Global. Oxford: Oxford UP, 2008.
Hinchliffe, Jessica, and Terri Begley. “Brisbane’s Angry Birds: Recordings No Deterrent for Nosey Ibis at South Bank.” ABC News 2 Jun. 2015. 25 Aug. 2017 <http://www.abc.net.au/news/2015-02-06/recorded-bird-noise-not-detering-south-banks-angry-birds/6065610>.
Low, Tim. The New Nature: Winners and Losers in Wild Australia. London: Penguin, 2002.
Neimanis, Astrid, Cecilia Asberg, and Suzi Hayes. “Posthumanist Imaginaries.” Research Handbook on Climate Governance. Eds. K. Bäckstrand and E. Lövbrand. Massachusetts: Edward Elgar Publishing, 2015. 480–90.
Schliephake, Christopher. Urban Ecologies: City Space, Material Agency, and Environmental Politics in Contemporary Culture. Maryland: Lexington Books, 2014.
Shaw, Debra Benita. “Streets for Cyborgs: The Electronic Flâneur and the Posthuman City.” Space and Culture 18.3 (2015): 230–42.
Shields, Rob. “Fancy Footwork: Walter Benjamin’s Notes on Flânerie.” The Flâneur. Ed. Keith Tester. London: Routledge, 2014. 61–80.
———. “Flânerie for Cyborgs.” Theory, Culture & Society 23.7-8 (2006): 209–20.
Wilkins, Kim, and Joseph Diskett. 9 Fathom Deep. Brisbane: Story City, 2014.
Williams, Raymond. The Country and the City. New York: Oxford UP, 1975.
Wilson, Alexander. The Culture of Nature: North American Landscape from Disney to the Exxon Valdez. Toronto: Between the Lines, 1991.
Yusoff, Kathryn, and Jennifer Gabrys. “Climate Change and the Imagination.” Wiley Interdisciplinary Reviews: Climate Change 2.4 (2011): 516–34.
APA, Harvard, Vancouver, ISO, and other styles
47

Ibrahim, Yasmin. "Commodifying Terrorism." M/C Journal 10, no. 3 (2007). http://dx.doi.org/10.5204/mcj.2665.

Full text
Abstract:

 
 
 Introduction Figure 1 The counter-Terrorism advertising campaign of London’s Metropolitan Police commodifies some everyday items such as mobile phones, computers, passports and credit cards as having the potential to sustain terrorist activities. The process of ascribing cultural values and symbolic meanings to some everyday technical gadgets objectifies and situates Terrorism into the everyday life. The police, in urging people to look out for ‘the unusual’ in their normal day-to-day lives, juxtapose the everyday with the unusual, where day-to-day consumption, routines and flows of human activity can seemingly house insidious and atavistic elements. This again is reiterated in the Met police press release: Terrorists live within our communities making their plans whilst doing everything they can to blend in, and trying not to raise suspicions about their activities. (MPA Website) The commodification of Terrorism through uncommon and everyday objects situates Terrorism as a phenomenon which occupies a liminal space within the everyday. It resides, breathes and co-exists within the taken-for-granted routines and objects of ‘the everyday’ where it has the potential to explode and disrupt without warning. Since 9/11 and the 7/7 bombings Terrorism has been narrated through the disruption of mobility, whether in mid-air or in the deep recesses of the Underground. The resonant thread of disruption to human mobility evokes a powerful meta-narrative where acts of Terrorism can halt human agency amidst the backdrop of the metropolis, which is often a metaphor for speed and accelerated activities. If globalisation and the interconnected nature of the world are understood through discourses of risk, Terrorism bears the same footprint in urban spaces of modernity, narrating the vulnerability of the human condition in an inter-linked world where ideological struggles and resistance are manifested through inexplicable violence and destruction of lives, where the everyday is suspended to embrace the unexpected. As a consequence ambient fear “saturates the social spaces of everyday life” (Hubbard 2). The commodification of Terrorism through everyday items of consumption inevitably creates an intertextuality with real and media events, which constantly corrode the security of the metropolis. Paddy Scannell alludes to a doubling of place in our mediated world where “public events now occur simultaneously in two different places; the place of the event itself and that in which it is watched and heard. The media then vacillates between the two sites and creates experiences of simultaneity, liveness and immediacy” (qtd. in Moores 22). The doubling of place through media constructs a pervasive environment of risk and fear. Mark Danner (qtd. in Bauman 106) points out that the most powerful weapon of the 9/11 terrorists was that innocuous and “most American of technological creations: the television set” which provided a global platform to constantly replay and remember the dreadful scenes of the day, enabling the terrorist to appear invincible and to narrate fear as ubiquitous and omnipresent. Philip Abrams argues that ‘big events’ (such as 9/11 and 7/7) do make a difference in the social world for such events function as a transformative device between the past and future, forcing society to alter or transform its perspectives. 
David Altheide points out that since September 11 and the ensuing war on terror, a new discourse of Terrorism has emerged as a way of expressing how the world has changed and defining a state of constant alert through a media logic and format that shapes the nature of discourse itself. Consequently, the intensity and centralisation of surveillance in Western countries increased dramatically, placing the emphasis on expanding the forms of the already existing range of surveillance processes and practices that circumscribe and help shape our social existence (Lyon, Terrorism 2). Normalisation of Surveillance The role of technologies, particularly information and communication technologies (ICTs), and other infrastructures to unevenly distribute access to the goods and services necessary for modern life, while facilitating data collection on and control of the public, are significant characteristics of modernity (Reiman; Graham and Marvin; Monahan). The embedding of technological surveillance into spaces and infrastructures not only augment social control but also redefine data as a form of capital which can be shared between public and private sectors (Gandy, Data Mining; O’Harrow; Monahan). The scale, complexity and limitations of omnipresent and omnipotent surveillance, nevertheless, offer room for both subversion as well as new forms of domination and oppression (Marx). In surveillance studies, Foucault’s analysis is often heavily employed to explain lines of continuity and change between earlier forms of surveillance and data assemblage and contemporary forms in the shape of closed-circuit television (CCTV) and other surveillance modes (Dee). It establishes the need to discern patterns of power and normalisation and the subliminal or obvious cultural codes and categories that emerge through these arrangements (Fopp; Lyon, Electronic; Norris and Armstrong). In their study of CCTV surveillance, Norris and Armstrong (cf. in Dee) point out that when added to the daily minutiae of surveillance, CCTV cameras in public spaces, along with other camera surveillance in work places, capture human beings on a database constantly. The normalisation of surveillance, particularly with reference to CCTV, the popularisation of surveillance through television formats such as ‘Big Brother’ (Dee), and the expansion of online platforms to publish private images, has created a contradictory, complex and contested nature of spatial and power relationships in society. The UK, for example, has the most developed system of both urban and public space cameras in the world and this growth of camera surveillance and, as Lyon (Surveillance) points out, this has been achieved with very little, if any, public debate as to their benefits or otherwise. There may now be as many as 4.2 million CCTV cameras in Britain (cf. Lyon, Surveillance). That is one for every fourteen people and a person can be captured on over 300 cameras every day. An estimated £500m of public money has been invested in CCTV infrastructure over the last decade but, according to a Home Office study, CCTV schemes that have been assessed had little overall effect on crime levels (Wood and Ball). In spatial terms, these statistics reiterate Foucault’s emphasis on the power economy of the unseen gaze. Michel Foucault in analysing the links between power, information and surveillance inspired by Bentham’s idea of the Panopticon, indicated that it is possible to sanction or reward an individual through the act of surveillance without their knowledge (155). 
It is this unseen and unknown gaze of surveillance that is fundamental to the exercise of power. The design and arrangement of buildings can be engineered so that the “surveillance is permanent in its effects, even if it is discontinuous in its action” (Foucault 201). Lyon (Terrorism), in tracing the trajectory of surveillance studies, points out that much of surveillance literature has focused on understanding it as a centralised bureaucratic relationship between the powerful and the governed. Invisible forms of surveillance have also been viewed as a class weapon in some societies. With the advancements in and proliferation of surveillance technologies as well as convergence with other technologies, Lyon argues that it is no longer feasible to view surveillance as a linear or centralised process. In our contemporary globalised world, there is a need to reconcile the dialectical strands that mediate surveillance as a process. In acknowledging this, Giles Deleuze and Felix Guattari have constructed surveillance as a rhizome that defies linearity to appropriate a more convoluted and malleable form where the coding of bodies and data can be enmeshed to produce intricate power relationships and hierarchies within societies. Latour draws on the notion of assemblage by propounding that data is amalgamated from scattered centres of calculation where these can range from state and commercial institutions to scientific laboratories which scrutinise data to conceive governance and control strategies. Both the Latourian and Deleuzian ideas of surveillance highlight the disparate arrays of people, technologies and organisations that become connected to make “surveillance assemblages” in contrast to the static, unidirectional Panopticon metaphor (Ball, “Organization” 93). In a similar vein, Gandy (Panoptic) infers that it is misleading to assume that surveillance in practice is as complete and totalising as the Panoptic ideal type would have us believe. Co-optation of Millions The Metropolitan Police’s counter-Terrorism strategy seeks to co-opt millions where the corporeal body can complement the landscape of technological surveillance that already co-exists within modernity. In its press release, the role of civilian bodies in ensuring security of the city is stressed; Keeping Londoners safe from Terrorism is not a job solely for governments, security services or police. If we are to make London the safest major city in the world, we must mobilise against Terrorism not only the resources of the state, but also the active support of the millions of people who live and work in the capita. (MPA Website). Surveillance is increasingly simulated through the millions of corporeal entities where seeing in advance is the goal even before technology records and codes these images (William). Bodies understand and code risk and images through the cultural narratives which circulate in society. Compared to CCTV technology images, which require cultural and political interpretations and interventions, bodies as surveillance organisms implicitly code other bodies and activities. The travel bag in the Metropolitan Police poster reinforces the images of the 7/7 bombers and the renewed attempts to bomb the London Underground on the 21st of July. It reiterates the CCTV footage revealing images of the bombers wearing rucksacks. The image of the rucksack both embodies the everyday as well as the potential for evil in everyday objects. 
It also inevitably reproduces the cultural biases and prejudices where the rucksack is subliminally associated with a specific type of body. The rucksack in these terms is a laden image which symbolically captures the context and culture of risk discourses in society. The co-optation of the population as a surveillance entity also recasts new forms of social responsibility within the democratic polity, where privacy is increasingly mediated by the greater need to monitor, trace and record the activities of one another. Nikolas Rose, in discussing the increasing ‘responsibilisation’ of individuals in modern societies, describes the process in which the individual accepts responsibility for personal actions across a wide range of fields of social and economic activity as in the choice of diet, savings and pension arrangements, health care decisions and choices, home security measures and personal investment choices (qtd. in Dee). While surveillance in individualistic terms is often viewed as a threat to privacy, Rose argues that the state of ‘advanced liberalism’ within modernity and post-modernity requires considerable degrees of self-governance, regulation and surveillance whereby the individual is constructed both as a ‘new citizen’ and a key site of self management. By co-opting and recasting the role of the citizen in the age of Terrorism, the citizen to a degree accepts responsibility for both surveillance and security. In our sociological imagination the body is constructed both as lived as well as a social object. Erving Goffman uses the word ‘umwelt’ to stress that human embodiment is central to the constitution of the social world. Goffman defines ‘umwelt’ as “the region around an individual from which signs of alarm can come” and employs it to capture how people as social actors perceive and manage their settings when interacting in public places (252). Goffman’s ‘umwelt’ can be traced to Immanuel Kant’s idea that it is the a priori categories of space and time that make it possible for a subject to perceive a world (Umiker-Sebeok; qtd. in Ball, “Organization”). Anthony Giddens adapted the term Umwelt to refer to “a phenomenal world with which the individual is routinely ‘in touch’ in respect of potential dangers and alarms which then formed a core of (accomplished) normalcy with which individuals and groups surround themselves” (244). Benjamin Smith, in considering the body as an integral component of the link between our consciousness and our material world, observes that the body is continuously inscribed by culture. These inscriptions, he argues, encompass a wide range of cultural practices and will imply knowledge of a variety of social constructs. The inscribing of the body will produce cultural meanings as well as create forms of subjectivity while locating and situating the body within a cultural matrix (Smith). Drawing on Derrida’s work, Pugliese employs the term ‘Somatechnics’ to conceptualise the body as a culturally intelligible construct and to address the techniques in and through which the body is formed and transformed (qtd. in Osuri). These techniques can encompass signification systems such as race and gender and equally technologies which mediate our sense of reality. These technologies of thinking, seeing, hearing, signifying, visualising and positioning produce the very conditions for the cultural intelligibility of the body (Osuri). The body is then continuously inscribed and interpreted through mediated signifying systems. 
Similarly, Hayles, while not intending to impose a Cartesian dichotomy between the physical body and its cognitive presence, contends that the use and interactions with technology incorporate the body as a material entity but it also equally inscribes it by marking, recording and tracing its actions in various terrains. According to Gayatri Spivak (qtd. in Ball, “Organization”) new habits and experiences are embedded into the corporeal entity which then mediates its reactions and responses to the social world. This means one’s body is not completely one’s own and the presence of ideological forces or influences then inscribe the body with meanings, codes and cultural values. In our modern condition, the body and data are intimately and intricately bound. Outside the home, it is difficult for the body to avoid entering into relationships that produce electronic personal data (Stalder). According to Felix Stalder our physical bodies are shadowed by a ‘data body’ which follows the physical body of the consuming citizen and sometimes precedes it by constructing the individual through data (12). Before we arrive somewhere, we have already been measured and classified. Thus, upon arrival, the citizen will be treated according to the criteria ‘connected with the profile that represents us’ (Gandy, Panoptic; William). Following September 11, Lyon (Terrorism) reveals that surveillance data from a myriad of sources, such as supermarkets, motels, traffic control points, credit card transactions records and so on, was used to trace the activities of terrorists in the days and hours before their attacks, confirming that the body leaves data traces and trails. Surveillance works by abstracting bodies from places and splitting them into flows to be reassembled as virtual data-doubles, and in the process can replicate hierarchies and centralise power (Lyon, Terrorism). Mike Dee points out that the nature of surveillance taking place in modern societies is complex and far-reaching and in many ways insidious as surveillance needs to be situated within the broadest context of everyday human acts whether it is shopping with loyalty cards or paying utility bills. Physical vulnerability of the body becomes more complex in the time-space distanciated surveillance systems to which the body has become increasingly exposed. As such, each transaction – whether it be a phone call, credit card transaction, or Internet search – leaves a ‘data trail’ linkable to an individual person or place. Haggerty and Ericson, drawing from Deleuze and Guattari’s concept of the assemblage, describe the convergence and spread of data-gathering systems between different social domains and multiple levels (qtd. in Hier). They argue that the target of the generic ‘surveillance assemblage’ is the human body, which is broken into a series of data flows on which surveillance process is based. The thrust of the focus is the data individuals can yield and the categories to which they can contribute. These are then reapplied to the body. In this sense, surveillance is rhizomatic for it is diverse and connected to an underlying, invisible infrastructure which concerns interconnected technologies in multiple contexts (Ball, “Elements”). The co-opted body in the schema of counter-Terrorism enters a power arrangement where it constitutes both the unseen gaze as well as the data that will be implicated and captured in this arrangement. 
It is capable of producing surveillance data for those in power while creating new data through its transactions and movements in its everyday life. The body is unequivocally constructed through this data and is also entrapped by it in terms of representation and categorisation. The corporeal body is therefore part of the machinery of surveillance while being vulnerable to its discriminatory powers of categorisation and victimisation. As Hannah Arendt (qtd. in Bauman 91) had warned, “we terrestrial creatures bidding for cosmic significance will shortly be unable to comprehend and articulate the things we are capable of doing” Arendt’s caution conveys the complexity, vulnerability as well as the complicity of the human condition in the surveillance society. Equally it exemplifies how the corporeal body can be co-opted as a surveillance entity sustaining a new ‘banality’ (Arendt) in the machinery of surveillance. Social Consequences of Surveillance Lyon (Terrorism) observed that the events of 9/11 and 7/7 in the UK have inevitably become a prism through which aspects of social structure and processes may be viewed. This prism helps to illuminate the already existing vast range of surveillance practices and processes that touch everyday life in so-called information societies. As Lyon (Terrorism) points out surveillance is always ambiguous and can encompass genuine benefits and plausible rationales as well as palpable disadvantages. There are elements of representation to consider in terms of how surveillance technologies can re-present data that are collected at source or gathered from another technological medium, and these representations bring different meanings and enable different interpretations of life and surveillance (Ball, “Elements”). As such surveillance needs to be viewed in a number of ways: practice, knowledge and protection from threat. As data can be manipulated and interpreted according to cultural values and norms it reflects the inevitability of power relations to forge its identity in a surveillance society. In this sense, Ball (“Elements”) concludes surveillance practices capture and create different versions of life as lived by surveilled subjects. She refers to actors within the surveilled domain as ‘intermediaries’, where meaning is inscribed, where technologies re-present information, where power/resistance operates, and where networks are bound together to sometimes distort as well as reiterate patterns of hegemony (“Elements” 93). While surveillance is often connected with technology, it does not however determine nor decide how we code or employ our data. New technologies rarely enter passive environments of total inequality for they become enmeshed in complex pre-existing power and value systems (Marx). With surveillance there is an emphasis on the classificatory powers in our contemporary world “as persons and groups are often risk-profiled in the commercial sphere which rates their social contributions and sorts them into systems” (Lyon, Terrorism 2). Lyon (Terrorism) contends that the surveillance society is one that is organised and structured using surveillance-based techniques recorded by technologies, on behalf of the organisations and governments that structure our society. This information is then sorted, sifted and categorised and used as a basis for decisions which affect our life chances (Wood and Ball). 
The emergence of pervasive, automated and discriminatory mechanisms for risk profiling and social categorising constitutes a significant mechanism for reproducing and reinforcing social, economic and cultural divisions in information societies. Such automated categorisation, Lyon (Terrorism) warns, has consequences for everyone, especially in the face of the new anti-terror measures enacted after September 11. In tandem with this, Bauman points out that a few suicidal murderers on the loose will be quite enough to recycle thousands of innocents into the "usual suspects". In no time, a few iniquitous individual choices will be reprocessed into the attributes of a "category"; a category easily recognisable by, for instance, a suspiciously dark skin or a suspiciously bulky rucksack – the kind of object which CCTV cameras are designed to note and passers-by are told to be vigilant about. And passers-by are keen to oblige. Since the terrorist atrocities on the London Underground, the volume of incidents classified as "racist attacks" rose sharply around the country. (122; emphasis added) Bauman, drawing on Lyon, asserts that the understandable desire for security combined with the pressure to adopt different kinds of systems "will create a culture of control that will colonise more areas of life with or without the consent of the citizen" (123). This means that the inhabitants of the urban space, whether citizen, worker or consumer, who have no terrorist ambitions whatsoever, will discover that their opportunities are more circumscribed by the subject positions or categories which are imposed on them. Bauman cautions that for some these categories may be extremely prejudicial, restricting them from consumer choices because of credit ratings, or more insidiously, relegating them to second-class status because of their colour or ethnic background (124). Joseph Pugliese, in linking visual regimes of racial profiling and the shooting of Jean Charles de Menezes in the aftermath of the 7/7 bombings in London, suggests that the discursive relations of power and visuality are inextricably bound. Pugliese argues that racial profiling creates a regime of visuality which fundamentally inscribes our physiology of perceptions with stereotypical images. He applies this analogy to Menezes running down the platform, in which the retina transforms him into the "hallucinogenic figure of an Asian Terrorist" (Pugliese 8). With globalisation and the proliferation of ICTs, borders and boundaries are no longer sacrosanct and as such risks are managed by enacting 'smart borders' through new technologies, with huge databases behind the scenes processing information about individuals and their journeys through the profiling of body parts with, for example, iris scans (Wood and Ball 31). Such body profiling technologies are used to create watch lists of dangerous passengers or identity groups who might be of greater 'risk'. The body in a surveillance society can be dissected into parts and profiled and coded through technology. These disparate codings of body parts can be assembled (or selectively omitted) to construct and represent whole bodies in our information society to ascertain risk. The selection and circulation of knowledge will also determine who gets slotted into the various categories that a surveillance society creates.
Conclusion
When the corporeal body is subsumed into a web of surveillance it often raises questions about the deterministic nature of technology.
The question is a long-standing one in our modern consciousness. We are apprehensive about according technology too much power and yet it is implicated in the contemporary power relationships where it is suspended amidst human motive, agency and anxiety. The emergence of surveillance societies, the co-optation of bodies in surveillance schemas, as well as the construction of the body through data in everyday transactions, conveys both the vulnerabilities of the human condition as well as its complicity in maintaining the power arrangements in society. Bauman, in citing Jacques Ellul and Hannah Arendt, points out that we suffer a ‘moral lag’ in so far as technology and society are concerned, for often we ruminate on the consequences of our actions and motives only as afterthoughts without realising at this point of existence that the “actions we take are most commonly prompted by the resources (including technology) at our disposal” (91). References Abrams, Philip. Historical Sociology. Shepton Mallet, UK: Open Books, 1982. Altheide, David. “Consuming Terrorism.” Symbolic Interaction 27.3 (2004): 289-308. Arendt, Hannah. Eichmann in Jerusalem: A Report on the Banality of Evil. London: Faber & Faber, 1963. Bauman, Zygmunt. Liquid Fear. Cambridge, UK: Polity, 2006. Ball, Kristie. “Elements of Surveillance: A New Framework and Future Research Direction.” Information, Communication and Society 5.4 (2002): 573-90 ———. “Organization, Surveillance and the Body: Towards a Politics of Resistance.” Organization 12 (2005): 89-108. Dee, Mike. “The New Citizenship of the Risk and Surveillance Society – From a Citizenship of Hope to a Citizenship of Fear?” Paper Presented to the Social Change in the 21st Century Conference, Queensland University of Technology, Queensland, Australia, 22 Nov. 2002. 14 April 2007 http://eprints.qut.edu.au/archive/00005508/02/5508.pdf>. Deleuze, Gilles, and Felix Guattari. A Thousand Plateaus. Minneapolis: U of Minnesota P, 1987. Fopp, Rodney. “Increasing the Potential for Gaze, Surveillance and Normalization: The Transformation of an Australian Policy for People and Homeless.” Surveillance and Society 1.1 (2002): 48-65. Foucault, Michel. Discipline and Punish: The Birth of the Prison. London: Allen Lane, 1977. Giddens, Anthony. Modernity and Self-Identity. Self and Society in the Late Modern Age. Stanford: Stanford UP, 1991. Gandy, Oscar. The Panoptic Sort: A Political Economy of Personal Information. Boulder, CO: Westview, 1997. ———. “Data Mining and Surveillance in the Post 9/11 Environment.” The Intensification of Surveillance: Crime, Terrorism and War in the Information Age. Eds. Kristie Ball and Frank Webster. Sterling, VA: Pluto Press, 2003. Goffman, Erving. Relations in Public. Harmondsworth: Penguin, 1971. Graham, Stephen, and Simon Marvin. Splintering Urbanism: Networked Infrastructures, Technological Mobilities and the Urban Condition. New York: Routledge, 2001. Hier, Sean. “Probing Surveillance Assemblage: On the Dialectics of Surveillance Practices as Process of Social Control.” Surveillance and Society 1.3 (2003): 399-411. Hayles, Katherine. How We Became Posthuman: Virtual Bodies in Cybernetics, Literature and Informatics. Chicago: U of Chicago P, 1999. Hubbard, Phil. “Fear and Loathing at the Multiplex: Everyday Anxiety in the Post-Industrial City.” Capital & Class 80 (2003). Latour, Bruno. Science in Action. Cambridge, Mass: Harvard UP, 1987 Lyon, David. The Electronic Eye – The Rise of Surveillance Society. Oxford: Polity Press, 1994. ———. 
“Terrorism and Surveillance: Security, Freedom and Justice after September 11 2001.” Privacy Lecture Series, Queens University, 12 Nov 2001. 16 April 2007 http://privacy.openflows.org/lyon_paper.html>. ———. “Surveillance Studies: Understanding Visibility, Mobility and the Phonetic Fix.” Surveillance and Society 1.1 (2002): 1-7. Metropolitan Police Authority (MPA). “Counter Terrorism: The London Debate.” Press Release. 21 June 2006. 18 April 2007 http://www.mpa.gov.uk.access/issues/comeng/Terrorism.htm>. Pugliese, Joseph. “Asymmetries of Terror: Visual Regimes of Racial Profiling and the Shooting of Jean Charles de Menezes in the Context of the War in Iraq.” Borderlands 5.1 (2006). 30 May 2007 http://www.borderlandsejournal.adelaide.edu.au/vol15no1_2006/ pugliese.htm>. Marx, Gary. “A Tack in the Shoe: Neutralizing and Resisting the New Surveillance.” Journal of Social Issues 59.2 (2003). 18 April 2007 http://web.mit.edu/gtmarx/www/tack.html>. Moores, Shaun. “Doubling of Place.” Mediaspace: Place Scale and Culture in a Media Age. Eds. Nick Couldry and Anna McCarthy. Routledge, London, 2004. Monahan, Teri, ed. Surveillance and Security: Technological Politics and Power in Everyday Life. Routledge: London, 2006. Norris, Clive, and Gary Armstrong. The Maximum Surveillance Society: The Rise of CCTV. Oxford: Berg, 1999. O’Harrow, Robert. No Place to Hide. New York: Free Press, 2005. Osuri, Goldie. “Media Necropower: Australian Media Reception and the Somatechnics of Mamdouh Habib.” Borderlands 5.1 (2006). 30 May 2007 http://www.borderlandsejournal.adelaide.edu.au/vol5no1_2006 osuri_necropower.htm>. Rose, Nikolas. “Government and Control.” British Journal of Criminology 40 (2000): 321–399. Scannell, Paddy. Radio, Television and Modern Life. Oxford: Blackwell, 1996. Smith, Benjamin. “In What Ways, and for What Reasons, Do We Inscribe Our Bodies?” 15 Nov. 1998. 30 May 2007 http:www.bmezine.com/ritual/981115/Whatways.html>. Stalder, Felix. “Privacy Is Not the Antidote to Surveillance.” Surveillance and Society 1.1 (2002): 120-124. Umiker-Sebeok, Jean. “Power and the Construction of Gendered Spaces.” Indiana University-Bloomington. 14 April 2007 http://www.slis.indiana.edu/faculty/umikerse/papers/power.html>. William, Bogard. The Simulation of Surveillance: Hypercontrol in Telematic Societies. Cambridge: Cambridge UP, 1996. Wood, Kristie, and David M. Ball, eds. “A Report on the Surveillance Society.” Surveillance Studies Network, UK, Sep. 2006. 14 April 2007 http://www.ico.gov.uk/upload/documents/library/data_protection/ practical_application/surveillance_society_full_report_2006.pdf>. 
 
 
 
Citation reference for this article
MLA Style
Ibrahim, Yasmin. "Commodifying Terrorism: Body, Surveillance and the Everyday." M/C Journal 10.3 (2007). <http://journal.media-culture.org.au/0706/05-ibrahim.php>.
APA Style
Ibrahim, Y. (Jun. 2007). "Commodifying Terrorism: Body, Surveillance and the Everyday," M/C Journal, 10(3). Retrieved from <http://journal.media-culture.org.au/0706/05-ibrahim.php>.
APA, Harvard, Vancouver, ISO, and other styles
48

Cutler, Ella Rebecca Barrowclough, Jacqueline Gothe, and Alexandra Crosby. "Design Microprotests." M/C Journal 21, no. 3 (2018). http://dx.doi.org/10.5204/mcj.1421.

Full text
Abstract:
IntroductionThis essay considers three design projects as microprotests. Reflecting on the ways design practice can generate spaces, sites and methods of protest, we use the concept of microprotest to consider how we, as designers ourselves, can protest by scaling down, focussing, slowing down and paying attention to the edges of our practice. Design microprotest is a form of design activism that is always collaborative, takes place within a community, and involves careful translation of a political conversation. While microprotest can manifest in any design discipline, in this essay we focus on visual communication design. In particular we consider the deep, reflexive practice of listening as the foundation of microprotests in visual communication design.While small in scale and fleeting in duration, these projects express rich and deep political engagements through conversations that create and maintain safe spaces. While many design theorists (Julier; Fuad-Luke; Clarke; Irwin et al.) have done important work to contextualise activist design as a broad movement with overlapping branches (social design, community design, eco-design, participatory design, critical design, and transition design etc.), the scope of our study takes ‘micro’ as a starting point. We focus on the kind of activism that takes shape in moments of careful design; these are moments when designers move politically, rather than necessarily within political movements. These microprotests respond to community needs through design more than they articulate a broad activist design movement. As such, the impacts of these microprotests often go unnoticed outside of the communities within which they take place. We propose, and test in this essay, a mode of analysis for design microprotests that takes design activism as a starting point but pays more attention to community and translation than designers and their global reach.In his analysis of design activism, Julier proposes “four possible conceptual tactics for the activist designer that are also to be found in particular qualities in the mainstream design culture and economy” (Julier, Introduction 149). We use two of these tactics to begin exploring a selection of attributes common to design microprotests: temporality – which describes the way that speed, slowness, progress and incompletion are dealt with; and territorialisation – which describes the scale at which responsibility and impact is conceived (227). In each of three projects to which we apply these tactics, one of us had a role as a visual communicator. As such, the research is framed by the knowledge creating paradigm described by Jonas as “research through design”.We also draw on other conceptualisations of design activism, and the rich design literature that has emerged in recent times to challenge the colonial legacies of design studies (Schultz; Tristan et al.; Escobar). Some analyses of design activism already focus on the micro or the minor. For example, in their design of social change within organisations as an experimental and iterative process, Lensjkold, Olander and Hasse refer to Deleuze and Guattari’s minoritarian: “minor design activism is ‘a position in co-design engagements that strives to continuously maintain experimentation” (67). Like minor activism, design microprotests are linked to the continuous mobilisation of actors and networks in processes of collective experimentation. However microprotests do not necessarily focus on organisational change. 
Rather, they create new (and often tiny) spaces of protest within which new voices can be heard and different kinds of listening can be done. In the first of our three cases, we discuss a representation of transdisciplinary listening. This piece of visual communication is a design microprotest in itself. This section helps to frame what we mean by a safe space by paying attention to the listening mode of communication. In the next sections we explore temporality and territorialisation through the design microprotests Just Spaces, which documents the collective imagining of safe places for LBPQ (Lesbian, Bisexual, Pansexual, and Queer) women and non-binary identities through a series of graphic objects, and Conversation Piece, a book written, designed and published over three days as a proposition for a collective future.
A Representation of Transdisciplinary Listening
The design artefact we present in this section is a representation of listening and can be understood as a microprotest emerging from a collective experiment that materialises firstly as a visual document asking questions of the visual communication discipline and its role in a research collaboration, and also as a mirror for the interdisciplinary team to reflexively develop transdisciplinary perspectives on the risks associated with the release of environmental flows in the upper reaches of the Hawkesbury Nepean River in NSW, Australia. This research project was funded through a Challenge Grant Scheme to encourage transdisciplinarity within the University. The project team worked with the Hawkesbury Nepean Catchment Management Authority in response to the question: What are the risks to maximising the benefits expected from increased environmental flows? Listening and visual communication design practice are inescapably linked. Renowned American graphic designer and activist Sheila de Bretteville describes a consciousness and a commitment to listening as an openness, rather than antagonism and argument. Fiumara describes listening as a nascent or emerging skill and points to listening as the antithesis of the Western culture of saying and expression. For a visual communication designer there is a very specific listening that can be described as visual hearing. This practice materialises the act of hearing through a visualisation of the information or knowledge that is shared. This act of visual hearing is a performative process tracing the actors' perspectives. This tracing is used as content, which is then translated into a transcultural representation constituted by the designerly act of perceiving multiple perspectives. The interpretation contributes to a shared project of transdisciplinary understanding. This transrepresentation (Fig. 1) is a manifestation of a small interaction among a research team comprised of a water engineer, sustainable governance researcher, water resource management researcher, environmental economist and a designer. This visualisation is a materialisation of a structured conversation in response to the question: What are the risks to maximising the benefits expected from increased environmental flows? It represents a small contribution that provides an opportunity for reflexivity and documents a moment in time in response to a significant challenge. In this translation of a conversation as a visual representation, a design microprotest is made against reduction, simplification, antagonism and argument.
This may seem intangible, but as a protest through design, “it involves the development of artifacts that exist in real time and space, it is situated within everyday contexts and processes of social and economic life” (Julier 226). This representation locates conversation in a visual order that responds to particular categorisations of the political, the institutional, the socio-economic and the physical in a transdisciplinary process that focusses on multiple perspectives.Figure 1: Transrepresentation of responses by an interdisciplinary research team to the question: What are the risks to maximising the benefits expected from increased environmental flows in the Upper Hawkesbury Nepean River? (2006) Just Spaces: Translating Safe SpacesListening is the foundation of design microprotest. Just Spaces emerged out of a collaborative listening project It’s OK! An Anthology of LBPQ (Lesbian, Bisexual, Pansexual and Queer) Women’s and Non-Binary Identities’ Stories and Advice. By visually communicating the way a community practices supportive listening (both in a physical form as a book and as an online resource), It’s OK! opens conversations about how LBPQ women and non-binary identities can imagine and help facilitate safe spaces. These conversations led to thinking about the effects of breaches of safe spaces on young LBPQ women and non-binary identities. In her book The Cultural Politics of Emotion, Sara Ahmed presents Queer Feelings as a new way of thinking about Queer bodies and the way they use and impress upon space. She makes an argument for creating and imagining new ways of creating and navigating public and private spaces. As a design microprotest, Just Spaces opens up Queer ways of navigating space through a process Ahmed describes as “the ‘non-fitting’ or discomfort .... an opening up which can be difficult and exciting” (Ahmed 154). Just Spaces is a series of workshops, translated into a graphic design object, and presented at an exhibition in the stairwell of the library at the University of Technology Sydney. It protests the requirement of navigating heteronormative environments by suggesting ‘Queer’ ways of being in and designing in space. The work offers solutions, suggestions, and new ways of doing and making by offering design methods as tools of microprotest to its participants. For instance, Just Spaces provides a framework for sensitive translation, through the introduction of a structure that helps build personas based on the game Dungeons and Dragons (a game popular among certain LGBTQIA+ communities in Sydney). Figure 2: Exhibition: Just Spaces, held at UTS Library from 5 to 27 April 2018. By focussing the design process on deep listening and rendering voices into visual translations, these workshops responded to Linda Tuhiwai Smith’s idea of the “outsider within”, articulating the way research should be navigated in vulnerable groups that have a history of being exploited as part of research. Through reciprocity and generosity, trust was generated in the design process which included a shared dinner; opening up participant-controlled safe spaces.To open up and explore ideas of discomfort and safety, two workshops were designed to provide safe and sensitive spaces for the group of seven LBPQ participants and collaborators. Design methods such as drawing, group imagining and futuring using a central prototype as a prompt drew out discussions of safe spaces. 
The prototype itself was a small folded house (representative of shelter) printed with a number of questions, such as:Our spaces are often unsafe. We take that as a given. But where do these breaches of safety take place? How was your safe space breached in those spaces?The workshops resulted in tangible objects, made by the participants, but these could not be made public because of privacy implications. So the next step was to use visual communication design to create sensitive and honest visual translations of the conversations. The translations trace images from the participants’ words, sketches and notes. For example, handwritten notes are transcribed and reproduced with a font chosen by the designer based on the tone of the comment and by considering how design can retain the essence of person as well as their anonymity. The translations focus on the micro: the micro breaches of safety; the interactions that take place between participants and their environment; and the everyday denigrating experiences that LBPQ women and non-binary identities go through on an ongoing basis. This translation process requires precise skills, sensitivity, care and deep knowledge of context. These skills operate at the smallest of scales through minute observation and detailed work. This micro-ness translates to the potential for truthfulness and care within the community, as it establishes a precedent through the translations for others to use and adapt for their own communities.The production of the work for exhibition also occurred on a micro level, using a Risograph, a screenprinting photocopier often found in schools, community groups and activist spaces. The machine (ME9350) used for this project is collectively owned by a co-op of Sydney creatives called Rizzeria. Each translation was printed only five times on butter paper. Butter paper is a sensitive surface but difficult to work with making the process slow and painstaking and with a lot of care.All aspects of this process and project are small: the pieced-together translations made by assembling segments of conversations; zines that can be kept in a pocket and read intimately; the group of participants; and the workshop and exhibition spaces. These small spaces of safety and their translations make possible conversations but also enable other safe spaces that move and intervene as design microprotests. Figure 3: Piecing the translations together. Figure 4: Pulling the translation off the drum; this was done every print making the process slow and requiring gentleness. This project was and is about slowing down, listening and visually translating in order to generate and imagine safe spaces. In this slowness, as Julier describes “...the activist is working in a more open-ended way that goes beyond the materialization of the design” (229). It creates methods for listening and collaboratively generating ways to navigate spaces that are fraught with micro conflict. As an act of territorialisation, it created tiny and important spaces as a design microprotest. Conversation Piece: A Fast and Slow BookConversation Piece is an experiment in collective self-publishing. It was made over three days by Frontyard, an activist space in Marrickville, NSW, involved in community “futuring”. Futuring for Frontyard is intended to empower people with tools to imagine and enact preferred futures, in contrast to what design theorist Tony Fry describes as “defuturing”, the systematic destruction of possible futures by design. 
Materialised as a book, Conversation Piece is also an act of collective futuring. It is a carefully designed process for producing dialogues between unlikely parties, using an image archive as a starting point. Conversation Piece was designed with the book sprint format as a starting point. Founded by software designer Adam Hyde, book sprints are a method of collectively generating a book in just a few days and then publishing it. Book sprints are related to the programming sprints common in agile software development or Scrum, which are often used to make FLOSS (Free and Open Source Software) manuals. Frontyard had used these techniques in a previous project to develop the Non Cash Arts Asset Platform. Conversation Piece was also modelled on two participatory books made during sprints that focussed on articulating alternative futures: Collaborative Futures, made during Transmediale in 2009, and Futurish: Thinking Out Loud about Futures (2015). The design for Conversation Piece began when Frontyard was invited to participate in the Hobiennale in 2017, a free festival emerging from the "national climate of uncertainty within the arts, influenced by changes to the structure of major arts organisations and diminishing funding opportunities." The Hobiennale was the first Biennale held in Hobart, Tasmania, but rather than producing a standard large art survey, it focussed on artist-run spaces and initiatives, emergent practices, and marginalised voices in the arts. Frontyard is not an artist collective and does not work for commissions. Rather, the response to the invitation was based on how much energy there was in the group to contribute to the Hobiennale. At Frontyard one of the ways collective and individual energy is accounted for is using spoon theory, a disability metaphor used to describe the planning that many people have to do to conserve and ration energy reserves in their daily lives (Miserandino). As outlined in the glossary of Conversation Piece, spoon theory is: A way of accounting for our emotional or physical energy and therefore our ability to participate in activities. Spoon theory can be used to collaborate with care and avoid guilt and burn out. Usually spoon theory is applied at an individual level, but it can also be used by organisations. For example, Hobiennale had enough spoons to participate in the Hobiennale so we decided to give it a go. (180) To make the book, Frontyard invited visitors to Hobiennale to participate in a series of open conversations that began with the photographic archive of the organisation over the two years of its existence. During a prototyping session, Frontyard designed nine diagrams that propositioned ways to begin conversations by combining images in different ways.
Figure 5: Diagram 9. Conversation Piece: pp. 32-33.
One of the purposes of the diagrams, and the book itself, was to bring attention to the micro dynamics of conversation over time, and to create a safe space to explore the implications of these. While the production process and the book itself are micro (ten copies were printed and immediately given away), the decisions made in regards to licensing (a creative commons license is used), distribution (via the Internet Archive) and content generation (through participatory design processes), and the project's commitment to open design processes (Van Abel, Evers, Klaassen and Troxler), mean its impact is unpredictable.
Counter-logical to the conventional copyright of books, open design borrows its definition - and at times its technologies and here its methods - from open source software design, to advocate the production of design objects based on fluid and shared circulation of design information. The tension between the abundance produced by an open approach to making, and the attention to the detail of relationships produced by slowing down and scaling down communication processes, is made apparent in Conversation Piece: We challenge ourselves at Frontyard to keep bureaucratic processes as minimal and open as possible. We don't have an application or acquittal process: we prefer to meet people over a cup of tea. A conversation is a way to work through questions. (7) As well as focussing on the micro dynamics of conversations, this project protests the authority of archives. It works to dismantle the hierarchies of art and publishing through the design of an open, transparent, participatory publishing process. It offers a range of propositions about alternative economies, the agency of people working together at small scales, and the many possible futures in the collective imaginaries of people rethinking time, outcomes, results and progress. The contributors to the book are those in conversation – a complex network of actors that are relationally configured and themselves in constant change, so, as Julier explains, "the object is subject to constant transformations, either literally or in its meaning. The designer is working within this instability" (230). This is true of all design, but in this design microprotest, Frontyard works within this instability in order to redirect it. The book functions as a series of propositions about temporality and territorialisation, focussing on micro interventions rather than radical political movements. In one section, two Frontyard residents offer a story of migration that also serves as a recipe for purslane soup, a traditional Portuguese dish (Rodriguez and Brison). Another lifts all the images of hand gestures from the Frontyard digital image archive and represents them in a photo essay.
Figure 6: Talking to Rocks. Conversation Piece: p. 143.
Conclusion
This article is an invitation to momentarily suspend the framing of design activism as a global movement in order to slow down the analysis of design protests and start paying attention to the brief moments and small spaces of protest that energise social change in design practice. We offered three examples of design microprotests, opening with a representation of transdisciplinary listening in order to frame design as a way of interpreting and listening as well as generating and producing. The two following projects we describe are collective acts of translation: small, momentary conversations designed into graphic forms that can be shared, reproduced, analysed, and remixed. Such protests have their limitations. Beyond the artefacts, the outcomes generated by design microprotests are difficult to identify. While they push and pull at the temporality and territorialisation of design, they operate at a small scale. How design microprotests connect to global networks of protest is an important question yet to be explored.
The design practices of transdisciplinary listening, Queer Feelings and translations, and collaborative book sprinting, identified in these design microprotests change the thoughts and feelings of those who participate in ways that are impossible to measure in real time, and sometimes cannot be measured at all. Yet these practices are important now, as they shift the way designers design, and the way others understand what is designed. By identifying the common attributes of design microprotests, we can begin to understand the way necessary political conversations emerge in design practice, for instance about safe spaces, transdisciplinarity, and archives. Taking a research through design approach these can be understood over time, rather than just in the moment, and in specific territories that belong to community. They can be reconfigured into different conversations that change our world for the better. References Ahmed, Sara. “Queer Feelings.” The Cultural Politics of Emotion. Edinburgh: Edinburgh UP, 2004. 143-167.Clarke, Alison J. "'Actions Speak Louder': Victor Papanek and the Legacy of Design Activism." Design and Culture 5.2 (2013): 151-168.De Bretteville, Sheila L. Design beyond Design: Critical Reflection and the Practice of Visual Communication. Ed. Jan van Toorn. Maastricht: Jan van Eyck Akademie Editions, 1998. 115-127.Evers, L., et al. Open Design Now: Why Design Cannot Remain Exclusive. Amsterdam: BIS Publishers, 2011.Escobar, Arturo. Designs for the Pluriverse: Radical Interdependence, Autonomy, and the Making of Worlds. Duke UP, 2018.Fiumara, G.C. The Other Side of Language: A Philosophy of Listening. London: Routledge, 1995.Fuad-Luke, Alastair. Design Activism: Beautiful Strangeness for a Sustainable World. London: Routledge, 2013.Frontyard Projects. 2018. Conversation Piece. Marrickville: Frontyard Projects. Fry, Tony. A New Design Philosophy: An Introduction to Defuturing. Sydney: UNSW P, 1999.Hanna, Julian, Alkan Chipperfield, Peter von Stackelberg, Trevor Haldenby, Nik Gaffney, Maja Kuzmanovic, Tim Boykett, Tina Auer, Marta Peirano, and Istvan Szakats. Futurish: Thinking Out Loud about Futures. Linz: Times Up, 2014. Irwin, Terry, Gideon Kossoff, and Cameron Tonkinwise. "Transition Design Provocation." Design Philosophy Papers 13.1 (2015): 3-11.Julier, Guy. "From Design Culture to Design Activism." Design and Culture 5.2 (2013): 215-236.Julier, Guy. "Introduction: Material Preference and Design Activism." Design and Culture 5.2 (2013): 145-150.Jonas, W. “Exploring the Swampy Ground.” Mapping Design Research. Eds. S. Grand and W. Jonas. Basel: Birkhauser, 2012. 11-41.Kagan, S. Art and Sustainability. Bielefeld: Transcript, 2011.Lenskjold, Tau Ulv, Sissel Olander, and Joachim Halse. “Minor Design Activism: Prompting Change from Within.” Design Issues 31.4 (2015): 67–78. doi:10.1162/DESI_a_00352.Max-Neef, M.A. "Foundations of Transdisciplinarity." Ecological Economics 53.53 (2005): 5-16.Miserandino, C. "The Spoon Theory." <http://www.butyoudontlooksick.com>.Nicolescu, B. "Methodology of Transdisciplinarity – Levels of Reality, Logic of the Included Middle and Complexity." Transdisciplinary Journal of Engineering and Science 1.1 (2010): 19-38.Palmer, C., J. Gothe, C. Mitchell, K. Sweetapple, S. McLaughlin, G. Hose, M. Lowe, H. Goodall, T. Green, D. Sharma, S. Fane, K. Brew, and P. Jones. 
“Finding Integration Pathways: Developing a Transdisciplinary (TD) Approach for the Upper Nepean Catchment.” Proceedings of the 5th Australian Stream Management Conference: Australian Rivers: Making a Difference. Thurgoona, NSW: Charles Sturt University, 2008.Rodriguez and Brison. "Purslane Soup." Conversation Piece. Eds. Frontyard Projects. Marrickville: Frontyard Projects, 2018. 34-41.Schultz, Tristan, et al. "What Is at Stake with Decolonizing Design? A Roundtable." Design and Culture 10.1 (2018): 81-101.Smith, Linda Tuhiwai. Decolonising Methodologies: Research and Indigenous Peoples. New York: ZED Books, 1998. Van Abel, Bas, et al. Open Design Now: Why Design Cannot Remain Exclusive. Bis Publishers, 2014.Wing Sue, Derald. Microaggressions in Everyday Life: Race, Gender, and Sexual Orientation. London: John Wiley & Sons, 2010. XV-XX.
APA, Harvard, Vancouver, ISO, and other styles
