Journal articles on the topic 'Machine learning. Heuristic programming. Set theory'

1

CHEN, PO-CHI, and SUH-YIN LEE. "AN IMPROVEMENT TO TOP-DOWN CLAUSE SPECIALIZATION." International Journal on Artificial Intelligence Tools 07, no. 01 (March 1998): 71–102. http://dx.doi.org/10.1142/s0218213098000068.

Abstract:
One remarkable advance in recent machine-learning research is inductive logic programming (ILP). In most ILP systems, clause specialization is one of the most important tasks. Usually, clause specialization is performed by adding one literal at a time using hill-climbing heuristics. However, single-literal addition can be trapped in local optima when more than one literal needs to be added at a time to increase accuracy. Several techniques have been proposed for this problem, but they are restricted to relational domains. In this paper, we propose a technique called structure subtraction to construct a set of candidate literals to add, whether single literals or multiple literals. This technique can be employed in any ILP system using top-down specialization and is not restricted to relational domains. A theory revision system is described to illustrate the use of structure subtraction.
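To make the hill-climbing step concrete, here is a toy sketch of greedy single-literal specialization; the function names and scoring are hypothetical, and the paper's structure-subtraction method for proposing multi-literal candidates is not shown.

    # Toy greedy top-down clause specialization (hypothetical names).
    def coverage_score(clause, positives, negatives, covers):
        """Positive examples covered minus negative examples covered."""
        return (sum(covers(clause, e) for e in positives)
                - sum(covers(clause, e) for e in negatives))

    def specialize(clause, candidate_literals, positives, negatives, covers):
        """Add one literal at a time while the score improves. This can
        stall in a local optimum when only a *pair* of literals helps,
        which is the failure mode the paper addresses."""
        best = clause
        improved = True
        while improved:
            improved = False
            for lit in candidate_literals:
                trial = best + [lit]
                if (coverage_score(trial, positives, negatives, covers)
                        > coverage_score(best, positives, negatives, covers)):
                    best, improved = trial, True
        return best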
2

WANG, CHAO, MINGHU HA, JIQIANG CHEN, and HONGJIE XING. "SUPPORT VECTOR MACHINE BASED ON RANDOM SET SAMPLES." International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems 21, supp01 (July 2013): 101–12. http://dx.doi.org/10.1142/s0218488513400084.

Abstract:
In order to deal with learning problems over random set samples encountered in the real world, a new support vector machine based on random set samples is constructed using random set theory and convex quadratic programming. Experimental results show that the new support vector machine is feasible and effective.
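For readers unfamiliar with the quadratic-programming view of SVMs that the abstract builds on, here is a minimal sketch of the standard soft-margin dual solved with cvxopt; the paper's random-set extension is not shown.

    # Standard linear soft-margin SVM dual as a convex QP (cvxopt).
    import numpy as np
    from cvxopt import matrix, solvers

    def svm_dual(X, y, C=1.0):
        y = y.astype(float)                  # labels in {-1, +1}
        n = len(y)
        K = X @ X.T                          # linear kernel matrix
        P = matrix(np.outer(y, y) * K)
        q = matrix(-np.ones(n))
        G = matrix(np.vstack([-np.eye(n), np.eye(n)]))   # 0 <= alpha <= C
        h = matrix(np.hstack([np.zeros(n), C * np.ones(n)]))
        A = matrix(y.reshape(1, -1))         # equality: sum_i alpha_i y_i = 0
        b = matrix(0.0)
        alpha = np.ravel(solvers.qp(P, q, G, h, A, b)["x"])
        w = (alpha * y) @ X                  # recover primal weights
        sv = alpha > 1e-6                    # the support vectors
        b0 = float(np.mean(y[sv] - X[sv] @ w))
        return w, b0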
3

Sokolov, I. A. "Theory and practice in artificial intelligence." Вестник Российской академии наук 89, no. 4 (April 24, 2019): 365–70. http://dx.doi.org/10.31857/s0869-5873894365-370.

Abstract:
Artificial Intelligence is an interdisciplinary field that formed about 60 years ago at the intersection of mathematical methods, computer science, psychology, and linguistics. Artificial Intelligence is an experimental science and today features a number of internally developed theoretical methods: knowledge representation, modeling of reasoning and behavior, textual analysis, and data mining. Within the framework of Artificial Intelligence, novel scientific domains have arisen: non-monotonic logic, description logic, heuristic programming, expert systems, and knowledge-based software engineering. The increasing interest in Artificial Intelligence in recent years is related to the development of promising new technologies based on specific methods such as knowledge discovery (or machine learning), natural language processing, autonomous unmanned intelligent systems, and hybrid human-machine intelligence.
4

Masrom, Suraya, Masurah Mohamad, Shahirah Mohamed Hatim, Norhayati Baharun, Nasiroh Omar, and Abdullah Sani Abd. Rahman. "Different mutation and crossover set of genetic programming in an automated machine learning." IAES International Journal of Artificial Intelligence (IJ-AI) 9, no. 3 (September 1, 2020): 402. http://dx.doi.org/10.11591/ijai.v9.i3.pp402-408.

Abstract:
Automated machine learning is a promising approach widely used to solve classification and prediction problems, and it currently receives much attention for modification and improvement. One line of work on improving automated machine learning is the inclusion of evolutionary algorithms such as Genetic Programming. The function of Genetic Programming is to find the best combination of solutions from the possible machine learning modelling pipelines, including the selection of algorithms and the optimization of the selected algorithm's parameters. As a member of the family of evolutionary algorithms, Genetic Programming's effectiveness in providing the best machine learning pipelines for a given problem or dataset depends substantially on its parameterization, including the mutation and crossover rates. This paper presents the effect of different pairs of mutation and crossover rates on automated machine learning performance, tested on different types of datasets. The findings support the theory that higher crossover rates tend to improve the algorithm's accuracy score, while lower crossover rates may cause the algorithm to converge at an earlier stage.
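The abstract does not name the AutoML tool used; as an illustration, TPOT is a representative genetic-programming-based AutoML library that exposes exactly the two rates under study (note that in TPOT mutation_rate + crossover_rate must not exceed 1.0):

    # Hedged sketch with TPOT, a GP-based AutoML library.
    from tpot import TPOTClassifier
    from sklearn.datasets import load_digits
    from sklearn.model_selection import train_test_split

    X, y = load_digits(return_X_y=True)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    tpot = TPOTClassifier(generations=5, population_size=20,
                          mutation_rate=0.7, crossover_rate=0.3,  # the pair under study
                          random_state=0, verbosity=2)
    tpot.fit(X_tr, y_tr)              # GP evolves whole sklearn pipelines
    print(tpot.score(X_te, y_te))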
5

MARATEA, MARCO, LUCA PULINA, and FRANCESCO RICCA. "A multi-engine approach to answer-set programming." Theory and Practice of Logic Programming 14, no. 6 (August 15, 2013): 841–68. http://dx.doi.org/10.1017/s1471068413000094.

Abstract:
Answer-set programming (ASP) is a truly declarative programming paradigm proposed in the area of non-monotonic reasoning and logic programming, which has recently been employed in many applications. The development of efficient ASP systems is, thus, crucial. With the task of improving the solving methods for ASP in mind, there are two usual ways to reach this goal: (i) extending state-of-the-art techniques and ASP solvers, or (ii) designing a new ASP solver from scratch. An alternative to these trends is to build on top of state-of-the-art solvers and to apply machine learning techniques for automatically choosing the “best” available solver on a per-instance basis. In this paper, we pursue this latter direction. We first define a set of cheap-to-compute syntactic features that characterize several aspects of ASP programs. Then, we apply classification methods that, given the features of the instances in a training set and the solvers' performance on these instances, inductively learn algorithm selection strategies to be applied to a test set. We report the results of a number of experiments considering solvers and different training and test sets of instances taken from the ones submitted to the “System Track” of the Third ASP Competition. Our analysis shows that by applying machine learning techniques to ASP solving, it is possible to obtain very robust performance: our approach can solve more instances compared with any solver that entered the Third ASP Competition.
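The per-instance selection scheme reduces to ordinary supervised classification, as in this minimal sketch (the features and solver labels here are synthetic placeholders, not the paper's feature set):

    # Per-instance algorithm selection as classification (synthetic data).
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(0)
    features = rng.random((200, 8))        # cheap syntactic features per program
    best_solver = rng.integers(0, 3, 200)  # index of the fastest solver per instance

    selector = RandomForestClassifier(random_state=0)
    selector.fit(features[:150], best_solver[:150])      # training set
    print(selector.predict(features[150:155]))           # solver chosen per test instance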
6

Suehiro, Daiki, Kohei Hatano, Eiji Takimoto, Shuji Yamamoto, Kenichi Bannai, and Akiko Takeda. "Theory and Algorithms for Shapelet-Based Multiple-Instance Learning." Neural Computation 32, no. 8 (August 2020): 1580–613. http://dx.doi.org/10.1162/neco_a_01297.

Abstract:
We propose a new formulation of multiple-instance learning (MIL), in which a unit of data consists of a set of instances called a bag. The goal is to find a good classifier of bags based on the similarity with a “shapelet” (or pattern), where the similarity of a bag with a shapelet is the maximum similarity of the instances in the bag. In previous work, some of the training instances have been chosen as shapelets with no theoretical justification. In our formulation, we use all possible, and thus infinitely many, shapelets, resulting in a richer class of classifiers. We show that the formulation is tractable, that is, it can be reduced through linear programming boosting (LPBoost) to difference of convex (DC) programs of finite (actually polynomial) size. Our theoretical result also gives justification to the heuristics of some previous work. The time complexity of the proposed algorithm depends strongly on the size of the set of all instances in the training sample. To apply the algorithm to data containing a large number of instances, we also propose a heuristic variant that preserves the theoretical guarantee. Our empirical study demonstrates that our algorithm works uniformly well on shapelet learning tasks in time-series classification and on various MIL tasks, with accuracy comparable to the existing methods. Moreover, we show that the proposed heuristics allow us to achieve the result in reasonable computational time.
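The core similarity notion is easy to state in code: a bag's similarity to a shapelet is the maximum over the bag's instances, as in this sketch (the Gaussian similarity is an illustrative choice):

    # Bag-to-shapelet similarity: max over the bag's instances.
    import numpy as np

    def bag_similarity(bag, shapelet, gamma=1.0):
        """bag: (n_instances, d) array; shapelet: (d,) pattern."""
        dists = np.linalg.norm(bag - shapelet, axis=1)
        return np.exp(-gamma * dists ** 2).max()

    bag = np.random.rand(5, 3)
    print(bag_similarity(bag, np.zeros(3)))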
7

MITRA, ARINDAM, and CHITTA BARAL. "Incremental and Iterative Learning of Answer Set Programs from Mutually Distinct Examples." Theory and Practice of Logic Programming 18, no. 3-4 (July 2018): 623–37. http://dx.doi.org/10.1017/s1471068418000248.

Abstract:
Over the years, the Artificial Intelligence (AI) community has produced several datasets which have given machine learning algorithms the opportunity to learn various skills across various domains. However, a subclass of these machine learning algorithms aimed at learning logic programs, namely the Inductive Logic Programming algorithms, has often failed at the task due to the vastness of these datasets. This has impacted the usability of knowledge representation and reasoning techniques in the development of AI systems. In this research, we try to address this scalability issue for the algorithms that learn answer set programs. We present a sound and complete algorithm which takes the input in a slightly different manner and performs an efficient and more user-controlled search for a solution. We show via experiments that our algorithm can learn from two popular datasets from the machine learning community, namely bAbI (a question answering dataset) and MNIST (a dataset for handwritten digit recognition), which to the best of our knowledge was not previously possible. The system is publicly available at https://goo.gl/KdWAcV.
8

El'shin, L. A., A. M. Gil'manov, and V. V. Banderov. "Forecasting trends in the cryptocurrency exchange rate through the machine learning theory." Financial Analytics: Science and Experience 13, no. 1 (February 28, 2020): 97–113. http://dx.doi.org/10.24891/fa.13.1.97.

Abstract:
Subject. The study discusses methodological approaches to forecasting trends in the development of the cryptocurrency market (bitcoin). Objectives. The study aims to discover and explain tools and mechanisms for predicting how the cryptocurrency market may evolve in the short run through time series modeling methods and machine learning methods based on LSTM artificial neural networks. Methods. Using Python-based programming methods, we constructed and substantiated a neural network model for the analyzed series describing how the stock exchange rate of bitcoin develops. Results. By matching loss functions, the optimizer, and parameters for constructing a neural network that predicts the BTC/USD exchange rate for the coming day, we proved its applicability and feasibility, which is confirmed by the lowest number of errors on the test and validation sets. Conclusions and Relevance. The findings mainly prove that the above mechanism, based on algorithms for constructing LSTM networks, is feasible for predicting the cryptocurrency market. The approach should be used to analyze and evaluate the current and future parameters of the cryptocurrency market's development. The tools can be of interest to investors who operate in new e-money markets.
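A minimal next-day LSTM forecaster of the kind described might look as follows in Keras; the window size, layer width, and training setup are illustrative assumptions, since the abstract does not specify the architecture:

    # Minimal LSTM next-day forecaster (illustrative hyperparameters).
    import numpy as np
    import tensorflow as tf

    window = 30
    prices = np.cumsum(np.random.randn(1000)).astype("float32")  # stand-in for BTC/USD
    X = np.stack([prices[i:i + window] for i in range(len(prices) - window)])[..., None]
    y = prices[window:]

    model = tf.keras.Sequential([
        tf.keras.layers.LSTM(32, input_shape=(window, 1)),
        tf.keras.layers.Dense(1),          # next-day price
    ])
    model.compile(optimizer="adam", loss="mse")
    model.fit(X, y, epochs=5, validation_split=0.2, verbose=0)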
9

KATZOURIS, NIKOS, ALEXANDER ARTIKIS, and GEORGIOS PALIOURAS. "Online learning of event definitions." Theory and Practice of Logic Programming 16, no. 5-6 (September 2016): 817–33. http://dx.doi.org/10.1017/s1471068416000260.

Abstract:
Systems for symbolic event recognition infer occurrences of events in time using a set of event definitions in the form of first-order rules. The Event Calculus is a temporal logic that has been used as a basis in event recognition applications, providing, among other things, direct connections to machine learning via Inductive Logic Programming (ILP). We present an ILP system for online learning of Event Calculus theories. To allow for a single-pass learning strategy, we use the Hoeffding bound for evaluating clauses on a subset of the input stream. We employ a decoupling scheme for the Event Calculus axioms during the learning process that allows each clause to be learned in isolation. Moreover, we use abductive-inductive logic programming techniques to handle unobserved target predicates. We evaluate our approach on an activity recognition application and compare it to a number of batch learning techniques. We obtain results of comparable predictive accuracy with significant speed-ups in training time. We also outperform hand-crafted rules and match the performance of a sound incremental learner that can only operate on noise-free datasets.
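The Hoeffding bound used for single-pass clause evaluation is simple to apply: with probability 1 - δ, an observed mean over n examples lies within ε of the true mean, where ε = sqrt(R² ln(1/δ) / 2n) and R is the range of the score. A sketch:

    # Hoeffding test for committing to the best clause on a stream.
    import math

    def hoeffding_epsilon(R, n, delta=1e-4):
        return math.sqrt(R ** 2 * math.log(1.0 / delta) / (2.0 * n))

    # Commit once the best clause's lead over the runner-up exceeds epsilon.
    best, second, n = 0.83, 0.74, 500
    if best - second > hoeffding_epsilon(R=1.0, n=n):
        print("enough stream evidence: keep the best clause")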
10

Qu, Qiang, Ming Qi Chang, Lei Xu, Yue Wang, and Shao Hua Lu. "Support Vector Machine-Based Aqueduct Safety Assessment." Advanced Materials Research 368-373 (October 2011): 531–36. http://dx.doi.org/10.4028/www.scientific.net/amr.368-373.531.

Abstract:
Based on the hydraulic, structural, and foundation conditions of aqueducts, an aqueduct safety assessment indicator system and standards are established. Grounded in statistical learning theory, the support vector machine transforms learning problems into a convex quadratic programming problem under the structural risk minimization criterion, which yields the global optimal solution and is applicable to small-sample, nonlinear classification and regression problems. In order to evaluate the safety condition of aqueducts, an aqueduct safety assessment model based on the support vector machine is established. Safety standards are divided into normal, basically normal, abnormal, and dangerous. According to the aqueduct safety assessment standards and the respective evaluation levels, a sample set is generated randomly and used to build pairwise classifiers with many support vectors. The results show that the method is feasible and has good application prospects in the safety assessment of irrigation district canal structures.
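As an illustration of the pairwise classification setup, scikit-learn's SVC builds one-vs-one (pairwise) classifiers by default; the indicator values and grade labels below are synthetic stand-ins:

    # Four-grade aqueduct safety classifier with a multi-class SVM (sketch).
    import numpy as np
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    GRADES = ["normal", "basically normal", "abnormal", "dangerous"]
    X = rng.random((400, 6))                   # six safety indicators per sample
    y = rng.integers(0, 4, 400)                # expert-assigned grade index

    clf = SVC(kernel="rbf", C=10.0).fit(X, y)  # one-vs-one pairwise classifiers
    print(GRADES[clf.predict(rng.random((1, 6)))[0]])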
11

Zhang, Nan, Xueyi Gao, and Tianyou Yu. "Heuristic Approaches to Attribute Reduction for Generalized Decision Preservation." Applied Sciences 9, no. 14 (July 16, 2019): 2841. http://dx.doi.org/10.3390/app9142841.

Abstract:
Attribute reduction is a challenging problem in rough set theory, which has been applied in many research fields, including knowledge representation, machine learning, and artificial intelligence. The main objective of attribute reduction is to obtain a minimal attribute subset that can retain the same classification or discernibility properties as the original information system. Recently, many attribute reduction algorithms, such as positive region preservation, generalized decision preservation, and distribution preservation, have been proposed. The existing attribute reduction algorithms for generalized decision preservation are mainly based on the discernibility matrix and are, thus, computationally very expensive and hard to use in large-scale and high-dimensional data sets. To overcome this problem, we introduce the similarity degree for generalized decision preservation. On this basis, the inner and outer significance measures are proposed. By using heuristic strategies, we develop two quick reduction algorithms for generalized decision preservation. Finally, theoretical and experimental results show that the proposed heuristic reduction algorithms are effective and efficient.
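For intuition, here is a generic greedy reduction sketch built on the classical rough-set dependency degree (positive-region size); the paper's similarity-degree and inner/outer significance measures refine this scheme, so treat the scoring below as illustrative:

    # Greedy attribute reduction by rough-set dependency degree (sketch).
    from collections import defaultdict

    def dependency(rows, attrs, decision):
        """Fraction of rows whose attrs-values determine the decision."""
        groups = defaultdict(set)
        for r in rows:
            groups[tuple(r[a] for a in attrs)].add(r[decision])
        consistent = sum(1 for r in rows
                         if len(groups[tuple(r[a] for a in attrs)]) == 1)
        return consistent / len(rows)

    def greedy_reduct(rows, attrs, decision):
        target, reduct = dependency(rows, attrs, decision), []
        while dependency(rows, reduct, decision) < target:
            # outer significance: the gain from adding each remaining attribute
            reduct.append(max((a for a in attrs if a not in reduct),
                              key=lambda a: dependency(rows, reduct + [a], decision)))
        return reduct

    rows = [{"a": 0, "b": 1, "d": "yes"}, {"a": 0, "b": 0, "d": "no"},
            {"a": 1, "b": 1, "d": "yes"}]
    print(greedy_reduct(rows, ["a", "b"], "d"))   # -> ['b']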
12

Giang, Nguyen Long, Demetrovics Janos, Vu Duc Thi, and Phan Dang Khoa. "Some Properties Related to Reduct of Consistent Decision Systems." Cybernetics and Information Technologies 21, no. 2 (June 1, 2021): 3–9. http://dx.doi.org/10.2478/cait-2021-0015.

Abstract:
Reducts of decision systems are a topic that has been attracting the interest of many researchers in data mining and machine learning for more than two decades. So far, many algorithms for finding reducts of decision systems via rough set theory have been proposed. However, most of the proposed algorithms are heuristic algorithms that find one reduct with the best classification quality, and the study of the properties of reducts of decision systems remains limited. In this paper, we discover equivalence properties of reducts of consistent decision systems related to Sperner systems. As a result, the study of the family of reducts of a consistent decision system is the study of Sperner systems.
13

CALIMERI, FRANCESCO, CARMINE DODARO, DAVIDE FUSCÀ, SIMONA PERRI, and JESSICA ZANGARI. "Efficiently Coupling the I-DLV Grounder with ASP Solvers." Theory and Practice of Logic Programming 20, no. 2 (December 4, 2018): 205–24. http://dx.doi.org/10.1017/s1471068418000546.

Abstract:
We present I-DLV+MS, a new answer set programming (ASP) system that integrates an efficient grounder, namely I-DLV, with an automatic selector that inductively chooses a solver: depending on some inherent features of the instantiation produced by I-DLV, machine learning techniques guide the selection of the most appropriate solver. The system participated in the latest (7th) ASP competition, winning the regular track, category SP (i.e., one processor allowed).
14

Burke, James V., and Abraham Engle. "Strong Metric (Sub)regularity of Karush–Kuhn–Tucker Mappings for Piecewise Linear-Quadratic Convex-Composite Optimization and the Quadratic Convergence of Newton’s Method." Mathematics of Operations Research 45, no. 3 (August 2020): 1164–92. http://dx.doi.org/10.1287/moor.2019.1027.

Abstract:
This work concerns the local convergence theory of Newton and quasi-Newton methods for convex-composite optimization, in which one minimizes an objective that can be written as the composition of a convex function with a continuously differentiable one. We focus on the case in which the convex function is a potentially infinite-valued piecewise linear-quadratic function. Such problems include nonlinear programming, mini-max optimization, and estimation of nonlinear dynamics with non-Gaussian noise, as well as many modern approaches to large-scale data analysis and machine learning. Our approach embeds the optimality conditions for convex-composite optimization problems into a generalized equation. We establish conditions for strong metric subregularity and strong metric regularity of the corresponding set-valued mappings. This allows us to extend the classical convergence theory of Newton and quasi-Newton methods to the broader class of non-finite-valued piecewise linear-quadratic convex-composite optimization problems. In particular, we establish local quadratic convergence of the Newton method under conditions that parallel those in nonlinear programming.
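In symbols, the convex-composite template the abstract refers to is

    \min_{x \in \mathbb{R}^n} f(x) := h(c(x)),

with h : \mathbb{R}^m \to \overline{\mathbb{R}} convex and piecewise linear-quadratic (possibly taking the value +\infty) and c : \mathbb{R}^n \to \mathbb{R}^m continuously differentiable; nonlinear programming is recovered, for instance, by letting h encode the objective plus indicator terms for the constraints.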
15

Li, Yiwei, Edison Tsai, Marek Perkowski, and Xiaoyu Song. "Grover-based Ashenhurst-Curtis decomposition using quantum language quipper." Quantum Information and Computation 19, no. 1&2 (February 2019): 35–66. http://dx.doi.org/10.26421/qic19.1-2-4.

Abstract:
Functional decomposition plays a key role in several areas such as system design, digital circuits, database systems, and Machine Learning. This paper presents a novel quantum computing approach based on Grover’s search algorithm for a generalized Ashenhurst-Curtis decomposition. The method models the decomposition problem as a search problem and constructs the oracle circuit based on the set-theoretic partition algebra. A hybrid quantum-based algorithm takes advantage of the quadratic speedup achieved by Grover’s search algorithm with quantum oracles for finding the minimum-cost decomposition. The method is implemented and simulated in the quantum programming language Quipper. This work constitutes the first attempt to apply quantum computing to functional decomposition.
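For context on the quadratic speedup: to find one of M valid decompositions among N candidates, Grover's algorithm needs about

    k \approx \left\lfloor \frac{\pi}{4}\sqrt{N/M} \right\rfloor

oracle queries, versus the O(N/M) evaluations expected of classical search; here the oracle is the circuit that tests a candidate against the set-theoretic partition-algebra conditions.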
16

Mahmood, Sozan Abdulla, and Qani Qabil Qasim. "Big Data Sentimental Analysis Using Document to Vector and Optimized Support Vector Machine." UHD Journal of Science and Technology 4, no. 1 (February 13, 2020): 18. http://dx.doi.org/10.21928/uhdjst.v4n1y2020.pp18-28.

Abstract:
With the rapid evolution of the internet, using social media networks such as Twitter, Facebook, and Tumblr has become so common that they have made a great impact on every aspect of human life. Twitter is one of the most popular micro-blogging social media platforms that allows people to share their emotions in short texts about a variety of topics such as a company's products, people, politics, and services. Analyzing sentiment is possible because emotions and reviews on different topics are shared every second, which makes social media a useful source of information in fields such as business, politics, applications, and services. The Twitter Application Programming Interface (Twitter-API), an interface between developers and Twitter, allows developers to search for tweets based on a desired keyword using secret keys and tokens. In this work, the Twitter-API was used to download the most recent tweets about four keywords, namely (Trump, Bitcoin, IoT, and Toyota), with a different number of tweets per keyword. “Vader”, a lexicon rule-based method, was used to categorize the downloaded tweets into “Positive” and “Negative” based on their polarity; the tweets were then stored in a Mongo database for the subsequent processing. After pre-processing, the hold-out technique was used to split each dataset into 80% as the “training set” and the remaining 20% as the “testing set.” After that, a deep learning-based Document to Vector model was used for feature extraction. To perform the classification task, a Radial Basis Function kernel-based support vector machine (RBF-SVM) was used. The accuracy of the RBF-SVM depends mainly on the values of the soft-margin penalty “C” and the kernel coefficient γ “gamma”. The main goal of this work is to select the best values for those parameters in order to improve the accuracy of the RBF-SVM classifier. The objective of this study is to show the impact of four meta-heuristic optimizer algorithms, namely particle swarm optimizer (PSO), modified PSO (MPSO), grey wolf optimizer (GWO), and a hybrid of PSO-GWO, on improving SVM classification accuracy by selecting the best values for those parameters. To the best of our knowledge, hybrid PSO-GWO had never before been used in SVM optimization. The results show that these optimizers have a significant impact on increasing SVM accuracy. The best accuracy of the model with traditional SVM was 87.885%. After optimization, the highest accuracy, obtained with GWO, is 91.053%, while the best accuracies of PSO, hybrid PSO-GWO, and MPSO are 90.736%, 90.657%, and 90.557%, respectively.
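The tuning objective the paper optimizes is cross-validated accuracy over (C, γ). The sketch below uses a simple random search in log-space as a stand-in for the PSO/GWO metaheuristics (same objective, simpler optimizer; the dataset and search ranges are illustrative):

    # Tuning RBF-SVM C and gamma by cross-validated accuracy (sketch).
    import numpy as np
    from sklearn.datasets import load_breast_cancer
    from sklearn.model_selection import cross_val_score
    from sklearn.svm import SVC

    X, y = load_breast_cancer(return_X_y=True)
    rng = np.random.default_rng(0)

    def objective(log_C, log_gamma):
        clf = SVC(kernel="rbf", C=10 ** log_C, gamma=10 ** log_gamma)
        return cross_val_score(clf, X, y, cv=3).mean()

    # Random search in log-space; PSO/GWO would steer these samples instead.
    best = max(((rng.uniform(-2, 3), rng.uniform(-5, 0)) for _ in range(30)),
               key=lambda p: objective(*p))
    print("best C=%.3g gamma=%.3g" % (10 ** best[0], 10 ** best[1]))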
17

Sgurev, Vassil, Vladimir Jotsov, and Mincho Hadjiski. "Intelligent Systems: Methodology, Models, and Applications in Emerging Technologies." Journal of Advanced Computational Intelligence and Intelligent Informatics 9, no. 1 (January 20, 2005): 3–4. http://dx.doi.org/10.20965/jaciii.2005.p0003.

Abstract:
From year to year the number of investigations on intelligent systems grows rapidly. For example, this year 245 papers from 45 countries were submitted to the Second International IEEE Conference on Intelligent Systems (www.ieee-is.org; www.fnts-bg.org/is), an increase of more than 50% by all indicators. The papers presented on intelligent systems drew big audiences and provoked significant interest that ultimately led to vivid discussions, exchanges of ideas, and, locally, the creation of working groups for different applied projects. All this reflects the worldwide tendency for research on intelligent systems to play a leading role, both theoretically and practically. The greater part of the presented research dealt with problems traditional for intelligent systems, like artificial intelligence, knowledge engineering, intelligent agents, neural and fuzzy networks, intelligent data processing, intelligent control and decision-making systems, as well as new interdisciplinary problems like ontology and semantics on the Internet and intuitionistic fuzzy logic. The majority of papers from the European and American researchers are dedicated to the theory and applications of intelligent systems with machine learning, fuzzy inference, or uncertainty. Another big group of papers focuses on building and integrating ontologies of applications with heterogeneous multiagent systems. A great number of papers on intelligent systems deal with fuzzy sets. The papers of many other researchers underscore the significance of contemporary perception-oriented methods and of different applications in intelligent systems; in the first place this is valid for L. A. Zadeh's paradigm of 'computing with words'. In this special journal volume, the Guest Editors would like to introduce a wealth of applied and theoretical research with one common characteristic: these are the conference's best papers, complemented and updated with the authors' new elaborations over the last half year. A short description of the papers presented in the volume follows. In 'Combining Local and Global Access to Ontologies in a Multiagent System', R. Brena and H. Ceballos (Mexico) propose an original way of working with ontologies in which part of the ontology is processed by a client component and the rest is handled by an ontology agent, improving inter-agent communication. In 'Fuzzy Querying of Evolutive Situations: Application to Driving Situations', S. Ould Yahia and S. Loriette-Rougegrez (France) present an approach to the analysis of driving situations using multimedia images and fuzzy estimates that will improve driver security. In 'Remembering What You Forget in an Online Shopping Context', M. Halvey and M. Keane (Ireland) present their approach to constructing an online system that predicts the items for future shopping sessions using a novel idea called Memory Zones. In 'Reinforcement Learning for Online Industrial Process Control', J. Govindhasamy et al. (Ireland) use a synthesis of dynamic programming, reinforcement learning, and backpropagation to model and control an industrial grinding process; the felicitous combination of methods makes the application more effective than existing controllers. In 'Dynamic Visualization of Information: From Database to Dataspace', C. St-Jacques and L. Paquin (Canada) suggest friendly online access to large multimedia databases. In 'Towards Context-Aware Knowledge Management in e-Enterprises', W. Huang (UK) redefines the concept of context in intelligent systems and proposes a set of meta-information elements for context description in a business environment; his approach is applicable to e-business, the Semantic Web, and the Semantic Grid. In 'Block-Based Change Detection in the Presence of Ambient Illumination Variations', T. Alexandropoulos et al. (Greece) use statistical analysis, clustering, pattern recognition algorithms, and related tools for noise extraction and global illumination correction. In 'Combining Argumentation and Web Search Technology: Towards a Qualitative Approach for Ranking Results', C. Chesñevar (Spain) and A. Maguitman (USA) propose a recommender system for improving Web search; defeasible argumentation and decision support methods are used in the system. In 'Modified Axiomatic Basis of Subjective Probability', K. Tenekedjiev et al. (Bulgaria) contribute to the axiomatic approach to subjective uncertainty by introducing a modified set of six axioms for subjective probabilities. In 'Fuzzy Rationality in Quantitative Decision Analysis', N. Nikolova et al. (Bulgaria) present a discussion of fuzzy rationality in the elicitation of subjective probabilities and utilities. The opportunity to prepare this special issue was kindly offered to the Guest Editors by Prof. Kaoru Hirota and Prof. Toshio Fukuda, and we thank them for it. The help of Kenta Uchino, together with the new elaborations presented by researchers from Europe and America, made this special issue possible.
18

Nayyar, Anand, Rudra Rameshwar, and Piyush Kanti Dutta. "Special Issue on Recent Trends and Future of Fog and Edge Computing, Services and Enabling Technologies." Scalable Computing: Practice and Experience 20, no. 2 (May 2, 2019): iii–vi. http://dx.doi.org/10.12694/scpe.v20i2.1558.

Abstract:
Recent Trends and Future of Fog and Edge Computing, Services, and Enabling Technologies. Cloud computing has been established as the most popular and suitable computing infrastructure, providing on-demand, scalable, and pay-as-you-go computing resources and services for state-of-the-art ICT applications, which generate a massive amount of data. Though the Cloud is certainly the most fitting solution for most applications with respect to processing capability and storage, it may not be so for real-time applications. The main problem with the Cloud is latency, as Cloud data centres are typically very far from the data sources as well as the data consumers. This latency is acceptable for application domains such as enterprise or web applications, but not for modern Internet of Things (IoT)-based pervasive and ubiquitous application domains such as autonomous vehicles, smart and pervasive healthcare, real-time traffic monitoring, unmanned aerial vehicles, smart buildings, smart cities, smart manufacturing, cognitive IoT, and so on. The prerequisite for these types of applications is that the latency between data generation and consumption should be minimal. For that, the generated data need to be processed locally instead of being sent to the Cloud. This approach is known as Edge computing, where the data processing is done at the network edge, in edge devices such as set-top boxes, access points, routers, switches, and base stations, which are typically located at the edge of the network. These devices are increasingly being equipped with significant computing and storage capacity to cater to the need for local Big Data processing. The enabling of Edge computing can be attributed to emerging network technologies, such as 4G and cognitive radios, high-speed wireless networks, and energy-efficient sophisticated sensors. Different Edge computing architectures have been proposed (e.g., Fog computing, mobile edge computing (MEC), cloudlets, etc.). All of these enable IoT and sensor data to be processed closer to the data sources. But, among them, Fog computing, a Cisco initiative, has attracted the most attention from both academia and industry and has emerged as a new computing-infrastructural paradigm in recent years. Though Fog computing has been proposed as a computing architecture different from the Cloud, it is not meant to replace the Cloud. Rather, Fog computing extends Cloud services to the network edge, providing computation, networking, and storage services between end devices and data centres. Ideally, Fog nodes (edge devices) are supposed to pre-process the data, serve the needs of the associated applications preliminarily, and forward the data to the Cloud if the data need to be stored and analysed further. Fog computing extends the benefits of smart devices operating not only at the network perimeter but also under cloud servers. Fog-enabled services can be deployed anywhere in the network, and with the provisioning and management of these services comes huge potential to enhance intelligence within computing networks, realizing context awareness, fast response, and network traffic offloading. Several possibilities of Fog computing are already established, for example, sustainable smart cities, smart grids, smart logistics, environment monitoring, video surveillance, etc.
To design and implement Fog computing systems, various challenges concerning system design and implementation, computing and communication, system architecture and integration, application-based implementations, fault tolerance, the design of efficient algorithms and protocols, availability and reliability, security and privacy, energy efficiency and sustainability, etc., need to be addressed. Also, to make Fog compatible with the Cloud, several factors such as Fog and Cloud system integration, service collaboration between Fog and Cloud, workload balance between Fog and Cloud, and so on need to be taken care of. It is our great privilege to present to you Volume 20, Issue 2 of Scalable Computing: Practice and Experience. We received 20 research papers, of which 14 were selected for publication. The aim of this special issue is to highlight recent trends and the future of Fog and Edge computing, services, and enabling technologies. The special issue presents new dimensions of research on Fog computing, Cloud computing, and Edge computing to researchers and industry professionals. Sujata Dash et al. contributed a paper titled “Edge and Fog Computing in Healthcare- A Review”, in which an in-depth review of fog and mist computing in the area of healthcare informatics is analysed, classified, and discussed. The review presented in this paper is primarily focused on three main aspects: the requirements of an IoT-based healthcare model and the description of services provided by fog computing to address them; the architecture of an IoT-based healthcare system embedding a fog computing layer; and the implementation of fog computing layer services along with their performance and advantages. In addition, the researchers have highlighted the trade-offs when allocating computational tasks to different levels of the network and elaborated various challenges and security issues of fog and edge computing related to healthcare applications. Parminder Singh et al., in the paper titled “Triangulation Resource Provisioning for Web Applications in Cloud Computing: A Profit-Aware”, proposed a novel triangulation resource provisioning (TRP) technique with a profit-aware surplus VM selection policy to ensure fair resource utilization within the hourly billing cycle while providing quality of service to end users. The proposed technique uses time series workload forecasting, CPU utilization, and response time in the analysis phase. The technique is tested using the CloudSim simulator, and the R language is used to implement the prediction model on the ClarkNet weblog. The proposed approach is compared with two baseline approaches, i.e., cost-aware (LRM) and (ARMA). The response time, CPU utilization, and predicted requests are applied in the analysis and planning phases for scaling decisions. The profit-aware surplus VM selection policy is used in the execution phase to select the appropriate VM for scale-down. The results show that the proposed model for web applications provides fair utilization of resources at minimum cost, thus providing maximum profit to the application provider and QoE to end users.
Akshi Kumar and Abhilasha Sharma, in the paper titled “Ontology driven Social Big Data Analytics for Fog enabled Sentic-Social Governance”, utilized a semantic knowledge model to investigate public opinion towards the adoption of fog-enabled services for governance and to comprehend the significance of the two s-components (sentic and social) in the aforesaid structure, which specifically visualizes fog-enabled Sentic-Social Governance. The results using conventional TF-IDF (Term Frequency-Inverse Document Frequency) feature extraction are empirically compared with ontology-driven TF-IDF feature extraction to find the best opinion mining model with optimal accuracy. The results show that ontology-driven opinion mining for feature extraction in polarity classification outperforms the traditional TF-IDF method, validated over baseline supervised learning algorithms, with an average improvement of 7.3% in accuracy and approximately a 38% reduction in features. Avinash Kaur and Pooja Gupta, in the paper titled “Hybrid Balanced Task Clustering Algorithm for Scientific workflows in Cloud Computing”, proposed a novel hybrid balanced task clustering algorithm using the impact factor of workflows along with the structure of the workflow; using this technique, tasks can be considered for clustering either vertically or horizontally based on the value of the impact factor. The proposed algorithm is tested on WorkflowSim, an extension of CloudSim, executing a DAG model of the workflow. The algorithm was evaluated on two variables, workflow execution time and performance gain, and compared with four clustering methods: Horizontal Runtime Balancing (HRB), Horizontal Clustering (HC), Horizontal Distance Balancing (HDB), and Horizontal Impact Factor Balancing (HIFB); the results show that the proposed algorithm achieves almost 5-10% better workflow makespan, depending on the workflow used. Pijush Kanti Dutta Pramanik et al., in the paper titled “Green and Sustainable High-Performance Computing with Smartphone Crowd Computing: Benefits, Enablers and Challenges”, presented a comprehensive statistical survey of the various commercial CPUs, GPUs, and SoCs for smartphones, confirming the capability of SCC as an alternative to HPC. An exhaustive survey is presented on the present state and optimistic future of continuous improvement and research on different aspects of smartphone batteries and other alternative power sources, which will allow users to use their smartphones for SCC without worrying about the battery running out. Dhanapal and P. Nithyanandam, in the paper titled “The Slow HTTP Distributed Denial of Service (DDOS) Attack Detection in Cloud”, proposed a novel method to detect slow HTTP DDoS attacks in the cloud, addressing the problem of such attacks consuming all available server resources and making the server unavailable to real users. The proposed method is implemented on the OpenStack cloud platform with the slowHTTPTest tool. The results show that the proposed technique detects the attack efficiently. Mandeep Kaur and Rajni Mohana, in the paper titled “Static Load Balancing Technique for Geographically partitioned Public Cloud”, proposed a novel approach focused on load balancing in a partitioned public cloud by combining centralized and decentralized approaches, assuming the presence of a fog layer. A load balancer entity is used for decentralized load balancing at the partitions, and a controller entity is used at the centralized level to balance the overall load across the partitions.
Results are compared with the First Come First Serve (FCFS) and Shortest Job First (SJF) algorithms. In this work, the researchers compared the waiting time, finish time, and actual run time of tasks under these algorithms. To reduce the number of unhandled jobs, a new load state is introduced which checks load beyond the conventional load states. The major objective of this approach is to reduce the need for runtime virtual machine migration and the wastage of resources that may occur due to predefined threshold values. Mukta and Neeraj Gupta, in the paper titled “Analytical Available Bandwidth Estimation in Wireless Ad-Hoc Networks considering Mobility in 3-Dimensional Space”, propose an analytical approach named Analytical Available Bandwidth Estimation Including Mobility (AABWM) to estimate the available bandwidth (ABW) on a link. The major contributions of the proposed work are: i) it uses mathematical models based on renewal theory to calculate the collision probability of data packets, which makes the process simple and accurate; ii) it considers mobility in 3-D space to predict link failure and provides accurate admission control. To test the proposed technique, the researchers used the NS-2 simulator to compare AABWM with AODV, ABE, IAB, and IBEM on throughput, packet loss ratio, and data delivery. Results show that AABWM outperforms the other approaches. R. Sridharan and S. Domnic, in the paper titled “Placement Strategy for Intercommunicating Tasks of an Elastic Request in Fog-Cloud Environment”, proposed a novel heuristic algorithm, IcAPER (Inter-communication Aware Placement for Elastic Requests). The proposed algorithm uses a machine in the network neighborhood for placement once the current resource is fully utilized by the application. The performance of the IcAPER algorithm is compared with the First Come First Serve (FCFS), Random, and First Fit Decreasing (FFD) algorithms on (a) resource utilization, (b) resource fragmentation, and (c) the number of requests having intercommunicating tasks placed on the same PM, using the CloudSim simulator. Simulation results show that IcAPER maps 34% more tasks onto the same PM and increases resource utilization by 13% while decreasing resource fragmentation by 37.8% when compared to the other algorithms. Velliangiri S. et al., in the paper titled “Trust factor based key distribution protocol in Hybrid Cloud Environment”, proposed a novel security protocol comprising two stages: in the first stage, groups are created using the trust factor and a key distribution security protocol is developed that handles communication among the virtual machine nodes, creating several groups based on cluster and trust-factor methods; in the second stage, an ECC (Elliptic Curve Cryptography) based distribution security protocol is developed. The performance of the trust factor based key distribution protocol is compared with the existing ECC and Diffie-Hellman key exchange techniques. The results show that the proposed security protocol provides more secure communication and better resource utilization than the ECC and Diffie-Hellman key exchange techniques in the hybrid cloud. Vivek Kumar Prasad et al., in the paper titled “Influence of Monitoring: Fog and Edge Computing”, discussed various techniques for monitoring in edge and fog computing and their advantages, in addition to a case study based on a healthcare monitoring system.
Avinash Kaur et al. elaborated a comprehensive view of the existing data placement schemes proposed in the literature for cloud computing, classifying them based on their access capabilities and objectives and comparing them. Parminder Singh et al. presented a comprehensive review of auto-scaling techniques for web applications in cloud computing. The complete taxonomy of the reviewed articles is built on varied parameters such as auto-scaling, approach, resources, monitoring tool, experiment, workload, metric, etc. Simar Preet Singh et al., in the paper titled “Dynamic Task Scheduling using Balanced VM Allocation Policy for Fog Computing Platform”, proposed a novel scheme to improve user contentment by improving the cost-to-operation-length ratio, reducing customer churn, and boosting operational revenue. The proposed scheme was found to reduce the queue size by effectively allocating the resources, which resulted in quicker completion of user workflows. The results of the proposed method are evaluated against a state-of-the-art non-power-aware task scheduling mechanism. The results were analyzed using the parameters energy, SLA infringement, and workflow execution delay. The performance of the proposed scheme was analyzed in various experiments particularly designed to examine various aspects of workflow processing on the given fog resources. The LRR (35.85 kWh) model was found to be the most efficient on the basis of average energy consumption in comparison to the LR (34.86 kWh), THR (41.97 kWh), MAD (45.73 kWh), and IQR (47.87 kWh) models. The LRR model was also observed to be the leader on the basis of the number of VM migrations: LRR (2520 VMs) was the best contender on the mean number of VM migrations in comparison with LR (2555 VMs), THR (4769 VMs), MAD (5138 VMs), and IQR (5352 VMs).
19

KATZOURIS, NIKOS, GEORGIOS PALIOURAS, and ALEXANDER ARTIKIS. "Online Learning Probabilistic Event Calculus Theories in Answer Set Programming." Theory and Practice of Logic Programming, August 1, 2021, 1–25. http://dx.doi.org/10.1017/s1471068421000107.

Abstract:
Complex Event Recognition (CER) systems detect event occurrences in streaming, time-stamped input using predefined event patterns. Logic-based approaches are of special interest in CER since, via Statistical Relational AI, they combine uncertainty-resilient reasoning about time and change with machine learning, thus alleviating the cost of manual event pattern authoring. We present a system based on Answer Set Programming (ASP), capable of probabilistic reasoning with complex event patterns in the form of weighted rules in the Event Calculus, whose structure and weights are learnt online. We compare our ASP-based implementation with a Markov Logic-based one and with a number of state-of-the-art batch learning algorithms on CER datasets for activity recognition, maritime surveillance, and fleet management. Our results demonstrate the superiority of our novel approach, both in terms of efficiency and predictive performance. This paper is under consideration for publication in Theory and Practice of Logic Programming (TPLP).
20

Arrigoni, Marco, and Georg K. H. Madsen. "Evolutionary computing and machine learning for discovering of low-energy defect configurations." npj Computational Materials 7, no. 1 (May 20, 2021). http://dx.doi.org/10.1038/s41524-021-00537-1.

Abstract:
Density functional theory (DFT) has become a standard tool for the study of point defects in materials. However, finding the most stable defective structures remains a very challenging task, as it involves the solution of a multimodal optimization problem with a high-dimensional objective function. Hitherto, the approaches most commonly used to tackle this problem have been largely empirical, heuristic, and/or based on domain knowledge. In this contribution, we describe an approach for exploring the potential energy surface (PES) based on the covariance matrix adaptation evolution strategy (CMA-ES) and supervised and unsupervised machine learning models. The resulting algorithm depends only on a limited set of physically interpretable hyperparameters, and the approach offers a systematic way of finding low-energy configurations of isolated point defects in solids. We demonstrate its applicability on different systems and show its ability to find known low-energy structures and to discover additional ones as well.
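A minimal CMA-ES loop with the `cma` Python package, the strategy named in the abstract, looks as follows; the objective here is a toy stand-in for the defect-energy model:

    # Ask/tell CMA-ES loop (the objective is a placeholder).
    import cma

    def energy(x):                      # stand-in for a DFT or surrogate energy
        return sum((xi - 1.0) ** 2 for xi in x)

    es = cma.CMAEvolutionStrategy(x0=[0.0] * 6, sigma0=0.5)
    while not es.stop():
        candidates = es.ask()           # sample candidate configurations
        es.tell(candidates, [energy(c) for c in candidates])
    print(es.result.xbest)              # best configuration found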
21

Garbulowski, Mateusz, Klev Diamanti, Karolina Smolińska, Nicholas Baltzer, Patricia Stoll, Susanne Bornelöv, Aleksander Øhrn, Lars Feuk, and Jan Komorowski. "R.ROSETTA: an interpretable machine learning framework." BMC Bioinformatics 22, no. 1 (March 6, 2021). http://dx.doi.org/10.1186/s12859-021-04049-z.

Abstract:
Background: Machine learning involves strategies and algorithms that may assist bioinformatics analyses in terms of data mining and knowledge discovery. In several applications, viz. in the Life Sciences, it is often more important to understand how a prediction was obtained than to know what prediction was made. To this end, so-called interpretable machine learning has recently been advocated. In this study, we implemented an interpretable machine learning package based on rough set theory. An important aim of our work was the provision of statistical properties of the models and their components. Results: We present the R.ROSETTA package, which is an R wrapper of the ROSETTA framework. The original ROSETTA functions have been improved and adapted to the R programming environment. The package allows for building and analyzing non-linear interpretable machine learning models. R.ROSETTA gathers combinatorial statistics via rule-based modelling for accessible and transparent results, well-suited for adoption within the greater scientific community. The package also provides statistics and visualization tools that facilitate the minimization of analysis bias and noise. The R.ROSETTA package is freely available at https://github.com/komorowskilab/R.ROSETTA. To illustrate the usage of the package, we applied it to a transcriptome dataset from an autism case-control study. Our tool provided hypotheses for potential co-predictive mechanisms among features that discerned phenotype classes. These co-predictors represented neurodevelopmental and autism-related genes. Conclusions: R.ROSETTA provides new insights for interpretable machine learning analyses and knowledge-based systems. We demonstrated that our package facilitated the detection of dependencies for autism-related genes. Although the sample application of R.ROSETTA illustrates transcriptome data analysis, the package can be used to analyze any data organized in decision tables.
22

Hernández-Orozco, Santiago, Hector Zenil, Jürgen Riedel, Adam Uccello, Narsis A. Kiani, and Jesper Tegnér. "Algorithmic Probability-Guided Machine Learning on Non-Differentiable Spaces." Frontiers in Artificial Intelligence 3 (January 25, 2021). http://dx.doi.org/10.3389/frai.2020.567356.

Abstract:
We show how complexity theory can be introduced in machine learning to help bring together apparently disparate areas of current research. We show that this model-driven approach may require less training data and can potentially be more generalizable, as it shows greater resilience to random attacks. In an algorithmic space, the order of its elements is given by their algorithmic probability, which arises naturally from computable processes. We investigate the shape of a discrete algorithmic space when performing regression or classification using a loss function parametrized by algorithmic complexity, demonstrating that differentiability is not required to achieve results similar to those obtained using differentiable programming approaches such as deep learning. In doing so, we use examples which enable the two approaches to be compared (small ones, given the computational power required to estimate algorithmic complexity). We find and report that 1) machine learning can successfully be performed on a non-smooth surface using algorithmic complexity; 2) solutions can be found using an algorithmic-probability classifier, establishing a bridge between a fundamentally discrete theory of computability and a fundamentally continuous mathematical theory of optimization methods; 3) a formulation of an algorithmically directed search technique for non-smooth manifolds can be defined and conducted; and 4) exploitation techniques and numerical methods for algorithmic search can be used to navigate these discrete non-differentiable spaces, in application to (a) the identification of generative rules from data observations; (b) solutions to image classification problems that are more resilient against pixel attacks than neural networks; (c) the identification of equation parameters from a small dataset in the presence of noise in a continuous ODE system; and (d) the classification of Boolean NK networks by (1) network topology, (2) underlying Boolean function, and (3) number of incoming edges.
23

Wan, Min. "Research on economic system based on fuzzy set comprehensive evaluation model." Journal of Intelligent & Fuzzy Systems, December 21, 2020, 1–11. http://dx.doi.org/10.3233/jifs-189569.

Abstract:
The development of an economic system is affected by many factors, and the stability of traditional economic analysis models is difficult to maintain. In order to explore an efficient and stable economic system evaluation and analysis model, this study, based on machine learning ideas, uses a rough set algorithm as the basic algorithm and applies methods from rough set and catastrophe model theory to the evaluation of ecological economic development levels. Moreover, this study reduces the redundant indices of the index system and calculates the importance of the indices after reduction. Based on the catastrophe set model, this study uses MATLAB programming to comprehensively quantify the ecological economy and finally assigns ecological-economic grades. In addition, this study combines rough set theory with fuzzy mathematics and initially establishes a two-branch fuzzy evaluation model. Finally, the established model is applied to the evaluation of a regional eco-economic system under actual conditions. The research results show that the method proposed in this paper is effective and can provide a reference for subsequent related research.
24

Hernandez, Alberto, Adarsh Balasubramanian, Fenglin Yuan, Simon A. M. Mason, and Tim Mueller. "Fast, accurate, and transferable many-body interatomic potentials by symbolic regression." npj Computational Materials 5, no. 1 (November 18, 2019). http://dx.doi.org/10.1038/s41524-019-0249-1.

Abstract:
The length and time scales of atomistic simulations are limited by the computational cost of the methods used to predict material properties. In recent years there has been great progress in the use of machine-learning algorithms to develop fast and accurate interatomic potential models, but it remains a challenge to develop models that generalize well and are fast enough to be used at extreme time and length scales. To address this challenge, we have developed a machine-learning algorithm based on symbolic regression in the form of genetic programming that is capable of discovering accurate, computationally efficient many-body potential models. The key to our approach is to explore a hypothesis space of models based on fundamental physical principles and select models within this hypothesis space based on their accuracy, speed, and simplicity. The focus on simplicity reduces the risk of overfitting the training data and increases the chances of discovering a model that generalizes well. Our algorithm was validated by rediscovering an exact Lennard-Jones potential and a Sutton-Chen embedded-atom method potential from training data generated using these models. By using training data generated from density functional theory calculations, we found potential models for elemental copper that are simple, as fast as embedded-atom models, and capable of accurately predicting properties outside of their training set. Our approach requires relatively small sets of training data, making it possible to generate training data using highly accurate methods at a reasonable computational cost. We present our approach, the forms of the discovered models, and assessments of their transferability, accuracy, and speed.
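For reference, the Lennard-Jones form used in the validation is E(r) = 4ε[(σ/r)¹² − (σ/r)⁶]. The sketch below merely refits its parameters from synthetic data; the paper's genetic-programming search discovers the functional form itself:

    # Refit Lennard-Jones parameters from synthetic data (sketch).
    import numpy as np
    from scipy.optimize import curve_fit

    def lennard_jones(r, eps, sigma):
        return 4 * eps * ((sigma / r) ** 12 - (sigma / r) ** 6)

    r = np.linspace(0.9, 3.0, 50)
    E = lennard_jones(r, 1.0, 1.0)                  # synthetic training data
    params, _ = curve_fit(lennard_jones, r, E, p0=[0.5, 0.8])
    print(params)                                   # ~ [1.0, 1.0]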
25

Saeed, Muhammad Usman, Zuoyu Sun, and Said Elias. "Research developments in adaptive intelligent vibration control of smart civil structures." Journal of Low Frequency Noise, Vibration and Active Control, August 17, 2021, 146134842110327. http://dx.doi.org/10.1177/14613484211032758.

Full text
Abstract:
Control algorithms are the most critical aspect in the successful control of civil structures subjected to earthquake and wind forces. In recent years, adaptive intelligent control algorithms have emerged as an acceptable substitute for conventional model-based control algorithms. These algorithms mainly work on the principles of artificial intelligence (AI) and soft computing (SC), which makes them highly efficient in controlling highly nonlinear, time-varying, and time-delayed complex civil structures. Prompted by current research into control algorithms, this article sets forth an inclusive state-of-the-art review of adaptive intelligent control (AIC) algorithms for vibration control of smart civil structures. First, a general introduction to adaptive intelligent control is presented along with its advantages over conventional control algorithms. Second, a classification based on artificial intelligence and soft computing methods is provided, consisting mainly of artificial neural network-based controllers, brain emotional learning-based intelligent controllers, replicator dynamics-based controllers, multi-agent system-based controllers, support vector machine-based controllers, fuzzy logic controllers, adaptive neuro-fuzzy inference system-based controllers, adaptive filter-based controllers, and meta-heuristic algorithm-based hybrid controllers. Third, a brief review of these algorithms and their theoretical and applied developments is provided. Fourth, a summarized overview of the cited literature is presented together with a brief trend analysis. Finally, the study draws on these innovative AIC methods to outline future research directions. The contribution of this article is its detailed, in-depth discussion of advances in AI- and SC-based AIC methods that have enabled practical applications in attenuating the vibration response of smart civil structures. Moreover, the review demonstrates the computing advantages of AIC over conventional controllers, which are important in creating the next generation of smart civil structures.
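By way of illustration, the sketch below implements one of the simplest members of the families listed above, an adaptive filter-based (LMS) controller, in Python. The signal model, tap count and step size are assumptions made for this example rather than details drawn from any of the reviewed works.

import numpy as np

def lms_control(reference, disturbance, n_taps=8, mu=0.05):
    # Adapt FIR weights online so the control output cancels the disturbance.
    w = np.zeros(n_taps)
    buf = np.zeros(n_taps)
    residuals = []
    for x, d in zip(reference, disturbance):
        buf = np.roll(buf, 1)
        buf[0] = x
        u = w @ buf            # control force from the adaptive FIR filter
        e = d - u              # residual structural response to be minimised
        w += mu * e * buf      # LMS gradient step on the filter weights
        residuals.append(e)
    return np.array(residuals)

# Synthetic single-frequency excitation: the residual shrinks as the
# weights converge.
t = np.arange(2000) * 0.01
x = np.sin(2 * np.pi * 1.5 * t)                 # measured reference signal
d = 0.8 * np.sin(2 * np.pi * 1.5 * t + 0.3)     # disturbance at the structure
res = lms_control(x, d)
print("initial RMS:", round(float(np.sqrt(np.mean(res[:200] ** 2))), 3),
      "final RMS:", round(float(np.sqrt(np.mean(res[-200:] ** 2))), 3))

The point of the example is the online adaptation: no structural model is identified in advance, which is the property the review credits to AIC methods for time-varying structures.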
APA, Harvard, Vancouver, ISO, and other styles
26

Burns, Alex. "Select Issues with New Media Theories of Citizen Journalism." M/C Journal 10, no. 6 (April 1, 2008). http://dx.doi.org/10.5204/mcj.2723.

Full text
Abstract:
“Journalists have to begin a new type of journalism, sometimes being the guide on the side of the civic conversation as well as the filter and gatekeeper.” (Kolodzy 218) “In many respects, citizen journalism is simply public journalism removed from the journalism profession.” (Barlow 181) 1. Citizen Journalism — The Latest Innovation? New Media theorists such as Dan Gillmor, Henry Jenkins, Jay Rosen and Jeff Howe have recently touted Citizen Journalism (CJ) as the latest innovation in 21st century journalism. “Participatory journalism” and “user-driven journalism” are other terms to describe CJ, which its proponents argue is a disruptive innovation (Christensen) to the agenda-setting media institutions, news values and “objective” reportage. In this essay I offer a “contrarian” view, informed by two perspectives: (1) a three-stage model of theory-building (Carlile & Christensen) to evaluate the claims made about CJ; and (2) self-reflexive research insights (Etherington) from editing the US-based news site Disinformation between November 1999 and February 2008. New media theories can potentially create “cognitive dissonance” (Festinger) when their explanations of CJ practices are compared with what actually happens (Feyerabend). First I summarise Carlile & Christensen’s model and the dangers of “bad theory” (Ghoshal). Next I consider several problems in new media theories about CJ: the notion of ‘citizen’, new media populism, parallels in event-driven and civic journalism, and mergers and acquisitions. Two ‘self-reflexive’ issues are considered: ‘pro-ams’ or ‘professional amateurs’ as a challenge to professional journalists, and CJ’s deployment in new media operations and production environments. Finally, some exploratory questions are offered for future researchers. 2. An Evaluative Framework for New Media Theories on Citizen Journalism Paul Carlile and Clayton M. Christensen’s model offers one framework with which to evaluate new media theories on CJ. This framework is used below to highlight select issues and gaps in CJ’s current frameworks and theories. Carlile & Christensen suggest that robust theory-building emerges via three stages: Descriptive, Categorisation and Normative (Carlile & Christensen). There are three sub-stages in Descriptive theory-building; namely, the observation of phenomena, inductive classification into schemas and taxonomies, and correlative relationships to develop models (Carlile & Christensen 2-5). Once causation is established, Normative theory evolves through deductive logic which is subject to Kuhnian paradigm shifts and Popperian falsifiability (Carlile & Christensen 6). Its proponents situate CJ as a Categorisation or new journalism agenda that poses a Normative challenge and Kuhnian paradigm shift to traditional journalism. Existing CJ theories jump from the Descriptive phase of observations like “smart mobs” in Japanese youth subcultures (Rheingold) to make broad claims for Categorisation, such as that IndyMedia, blogs and wiki publishing systems are new media alternatives to traditional media. CJ theories then underpin normative beliefs, values and worldviews. Correlative relationships are also used to differentiate CJ from the demand side of microeconomic analysis, from the top-down editorial models of traditional media outlets, and to adopt a vanguard stance.
To support this, CJ proponents cite research on emergent collective behaviour such as the “wisdom of crowds” hypothesis (Surowiecki) or peer-to-peer network “swarms” (Pesce) to provide scientific justification for their Normative theories. However, further evaluative research is needed for three reasons: the emergent collective behaviour hypothesis may not actually inform CJ practices, existing theories may have “correlation not cause” errors, and the link may be due to citation network effects between CJ theorists. Collectively, this research base also frames CJ as an “ought to” Categorisation and then proceeds to Normative theory-building (Carlile & Christensen 7). However, I argue below that this Categorisation may be premature: its observations and correlative relationships might reinforce a ‘weak’ Normative theory with limited generalisation. CJ proponents seem to imply that it can be applied anywhere and under any condition—a “statement of causality” that almost makes it a fad (Carlile & Christensen 8). CJ that relies on Classification and Normative claims will be problematic without a strong grounding in Descriptive observation. To understand what’s potentially at stake for CJ’s future, consider the parallel debate about curricula renewal for the Masters of Business Administration in the wake of high-profile corporate collapses such as Enron, Worldcom, HIH and OneTel. The MBA evolved as a sociological and institutional construct to justify management as a profession that is codified, differentiated and has entry barriers (Khurana). This process might partly explain the pushback that some media professionals have to CJ as one alternative. MBA programs faced criticism if they had student cohorts with little business know-how or experiential learning (Mintzberg). Enron’s collapse illustrated the ethical dilemmas and unintended consequences that occurred when “bad theories” were implemented (Ghoshal). Professional journalists are aware of this: MBA-educated managers challenged the “craft” tradition in the early 1980s (Underwood). This meant that journalism’s ‘self-image’ (Morgan; Smith) is intertwined with managerial anxieties about media conglomerates in highly competitive markets. Ironically, as noted below, Citizen Journalists who adopt a vanguard position vis-a-vis media professionals step into a more complex game with other players. However, current theories have a naïve idealism about CJ’s promise of normative social change in the face of Machiavellian agency in business, the media and politics. 3. Citizen Who? Who is the “citizen” in CJ? What is their self-awareness as a political agent? CJ proponents who use the ‘self-image’ of ‘citizen’ draw on observations from the participatory vision of open source software, peer-to-peer networks, and case studies such as Howard Dean’s 2004 bid for the Democrat Party nomination in the US Presidential election campaign (Trippi). Recent theorists note Alexander Hamilton’s tradition of civic activism (Barlow 178) which links contemporary bloggers with the Federalist Papers and early newspaper pamphlets. One unsurfaced assumption in these observations and correlations is that most bloggers will adopt a coherent political philosophy as informed citizens: a variation on Lockean utilitarianism, Rawlsian liberalism or Nader consumer activism.
To date there is little discussion about how political philosophy could deepen CJ’s ‘self-image’: how to critically evaluate sources, audit and investigation processes, or strategies to deal with elites, deterrence and power. For example, although bloggers kept Valerie Plame’s ‘outing’ as a covert intelligence operative highly visible in the issues-attention cycle, it was agenda-setting media like The New York Times who the Bush Administration targeted to silence (Pearlstine). To be viable, CJ needs to evolve beyond a new media populism, perhaps into a constructivist model of agency, norms and social change (Finnemore). 4. Citizen Journalism as New Media Populism Several “precursor trends” foreshadowed CJ, notably the mid-1990s interest in “cool-hunting” by new media analysts and subculture marketeers (Gibson; Gladwell). Whilst this audience focus waned with the 1995-2000 dotcom bubble, it resurfaced in CJ and publisher Tim O’Reilly’s Web 2.0 vision. Thus, CJ might be viewed as new media populism that has flourished with the Web 2.0 boom. Yet if the boom becomes a macroeconomic bubble (Gross; Spar) then CJ could be written off as a “silver bullet” that ultimately failed to deliver on its promises (Brooks, Jr.). The reputations of uncritical proponents who adopted a “true believer” stance would also be damaged (Hoffer). This risk is evident if CJ is compared with a parallel trend that shares its audience focus and populist view: day traders and technical analysts who speculate on financial markets. This parallel trend provides an alternative discipline in which the populism surfaced in an earlier form (Carlile & Christensen 12). Fidelity’s Peter Lynch argues that stock pickers can use their Main Street knowledge to beat Wall Street by exploiting information asymmetries (Lynch & Rothchild). Yet Lynch’s examples came from the mid-1970s to early 1980s when indexed mutual fund strategies worked, before deregulation and macroeconomic volatility. A change in the Web 2.0 boom might similarly trigger a reconsideration of Citizen Journalism. Hedge fund maven Victor Niederhoffer contends that investors who rely on technical analysis are practicing a Comtean religion (Niederhoffer & Kenner 72-74) instead of Efficient Market Hypothesis traders who use statistical arbitrage to deal with ‘random walks’ or Behavioural Finance experts who build on Amos Tversky and Daniel Kahneman’s Prospect Theory (Kahneman & Tversky). Niederhoffer’s deeper point is that technical analysts’ belief that the “trend is your friend” is no match for the other schools, despite a mini-publishing industry and computer trading systems. There are also ontological and epistemological differences between the schools. Similarly, CJ proponents who adopt a ‘Professional Amateur’ or ‘Pro-Am’ stance (Leadbeater & Miller) may face a similar gulf when making comparisons with professional journalists and the production environments in media organisations. CJ also thrives as new media populism because of institutional vested interests. When media conglomerates cut back on cadetships and internships, CJ might fill the market demand as one alternative. New media programs at New York University and others can use CJ to differentiate themselves from “hyperlocal” competitors (Christensen; Slywotzky; Christensen, Curtis & Horn). This transforms CJ from new media populism to new media institution. 5.
Parallels: Event-driven & Civic Journalism For new media programs, CJ builds on two earlier traditions: the Event-driven journalism of crises like the 1991 Gulf War (Wark) and the Civic Journalism school that emerged in the 1960s social upheavals. Civic Journalism’s awareness of minorities and social issues provides the character ethic and political philosophy for many Citizen Journalists. Jay Rosen and others suggest that CJ is the next-generation heir to Civic Journalism, tracing a thread from the 1968 Chicago Democratic Convention to IndyMedia’s coverage of the 1999 “Battle in Seattle” (Rosen). Rosen’s observation could yield an interesting historiography or genealogy. Events such as the Southeast Asian tsunami on 26 December 2004 or Al Qaeda’s London bombings on 7 July 2005 are cited as examples of CJ as event-driven journalism and “pro-am collaboration” (Kolodzy 229-230). Having covered these events and Al Qaeda’s attacks on 11th September 2001, I have a slightly different view: this was more a variation on “first responder” status and handicam video footage that journalists have sourced for the past three decades when covering major disasters. This different view means that the “salience of categories” used to justify CJ and “pro-am collaboration” in these events does not completely hold. Furthermore, when Citizen Journalism proponents tout Flickr and Wikipedia as models of real-time media they are building on a broader phenomenon that includes CNN’s Gulf War coverage and Bloomberg’s dominance of financial news (Loomis). 6. The Mergers & Acquisitions Scenario CJ proponents often express anxieties about the resilience of their outlets in the face of predatory venture capital firms who initiate Mergers & Acquisitions (M&A) activities. Ironically, these venture capital firms have core competencies and expertise in the event-driven infrastructure and real-time media that CJ aspires to. Sequoia Capital and other venture capital firms have evaluative frameworks that likely surpass Carlile & Christensen in sophistication, and they exploit parallels, information asymmetries and market populism. Furthermore, although venture capital firms such as Union Street Ventures have funded Web 2.0 firms, they are absent from the explanations of some theorists, whose examples of Citizen Journalism and Web 2.0 success may be the result of survivorship bias. Thus, the venture capital market remains an untapped data source for researchers who want to evaluate the impact of CJ outlets and institutions. The M&A scenario further problematises CJ in several ways. First, CJ is framed as “oppositional” to traditional media, yet this may be used as a stratagem in a game theory framework with multiple stakeholders. Drexel Burnham Lambert’s financier Michael Milken used market populism to sell ‘high-yield’ or ‘junk’ bonds to investors whilst disrupting the Wall Street establishment in the late 1980s (Curtis) and CJ could fulfil a similar tactical purpose. Second, the M&A goal of some Web 2.0 firms could undermine the participatory goals of a site’s community if post-merger integration fails. Jason Calacanis’s sale of Weblogs, Inc to America Online in 2005 and MSNBC’s acquisition of Newsvine on 5 October 2007 (Newsvine) might be success stories. However, this raises issues of digital “property rights” if you contribute to a community that is then sold in an M&A transaction—an outcome closer to business process outsourcing.
Third, media “buzz” can create an unrealistic vision when a CJ site fails to grow beyond its start-up phase. Backfence.com’s demise as a “hyperlocal” initiative (Caverly) is one cautionary event that recalls the 2000 dotcom crash. The M&A scenarios outlined above are market dystopias for CJ purists. The major lesson for CJ proponents is to include other market players in hypotheses about causation and correlation factors. 7. ‘Pro-Ams’ & Professional Journalism’s Crisis CJ emerged during a period when Professional Journalism faced a major crisis of ‘self-image’. The Demos report The Pro-Am Revolution (Leadbeater & Miller) popularised the notion of ‘professional amateurs’ which some CJ theorists adopt to strengthen their categorisation. In turn, this triggers a response from cultural theorists who fear bloggers are new media’s barbarians (Keen). I concede Leadbeater and Miller have identified an important category. However, how some CJ theorists then generalise from ‘Pro-Ams’ illustrates the danger of ‘weak’ theory referred to above. Leadbeater and Miller’s categorisation does not really include a counter-view on the strengths of professionals, as illustrated in humanistic consulting (Block), professional service firms (Maister; Maister, Green & Galford), and software development (McConnell). The signs of professionalism these authors mention include a commitment to learning and communal verification, mastery of a discipline and domain application, awareness of methodology creation, participation in mentoring, and cultivation of ethical awareness. Two key differences are discernment and quality of attention, as illustrated in how the legendary Hollywood film editor Walter Murch used Apple’s Final Cut Pro software to edit the 2003 film Cold Mountain (Koppelman). ‘Pro-Ams’ might not aspire to these criteria but Citizen Journalists shouldn’t throw out these standards, either. Doing so would be making the same mistake of overconfidence that technical analysts make against statistical arbitrageurs. Key processes—fact-checking, sub-editing and editorial decision-making—are invisible to the end-user, even if traceable in a blog or wiki publishing system, because of the judgments involved. One post-mortem insight from Assignment Zero was that these processes were vital to create the climate of authenticity and trust to sustain a Citizen Journalist community (Howe). CJ’s trouble with “objectivity” might also overlook some complexities, including the similarity of many bloggers to “noise traders” in financial markets and to op-ed columnists. Methodologies and reportage practices have evolved to deal with the objections that CJ proponents raise, from New Journalism’s radical subjectivity and creative non-fiction techniques (Wolfe & Johnson) to Precision Journalism that used descriptive statistics (Meyer). Finally, journalism frameworks could be updated with current research on how phenomenological awareness shapes our judgments and perceptions (Thompson). 8. Strategic Execution For me, one of CJ’s major weaknesses as a new media theory is its lack of “rich description” (Geertz) about the strategic execution of projects. As Disinfo.com site editor I encountered situations ranging from ‘denial of service’ attacks and spam to site migration, publishing systems that go offline, and ensuring editorial consistency. Yet the messiness of these processes is missing from CJ theories and accounts.
Theories that included this detail as “second-order interactions” (Carlile & Christensen 13) would offer a richer view of CJ. Many CJ and Web 2.0 projects fall into the categories of mini-projects, demonstration prototypes and start-ups, even when using a programming language such as Ajax or Ruby on Rails. Whilst the “bootstrap” process is a benefit, more longitudinal analysis and testing needs to occur, to ensure these projects are scalable and sustainable. For example, South Korea’s OhmyNews is cited as an exemplar that started with “727 citizen reporters and 4 editors” and now has “38,000 citizen reporters” and “a dozen editors” (Kolodzy 231). How does OhmyNews’s mix of hard and soft news change over time? Or, how does OhmyNews deal with a complex issue that might require major resources, such as security negotiations between North and South Korea? Such examples could do with further research. We need to go beyond “the vision thing” and look at the messiness of execution for deeper observations and counterintuitive correlations, to build new descriptive theories. 9. Future Research This essay argues that CJ needs re-evaluation. Its immediate legacy might be to splinter ‘journalism’ into micro-trends: Washington University’s Steve Boriss proclaims “citizen journalism is dead. Expert journalism is the future.” (Boriss; Mensching). The half-lives of such micro-trends demand new categorisations, which in turn prematurely feeds the theory-building cycle. Instead, future researchers could reinvigorate 21st century journalism if they ask deeper questions and return to the observation stage of building descriptive theories. In closing, below are some possible questions that future researchers might explore: Where are the “rich descriptions” of journalistic experience—“citizen”, “convergent”, “digital”, “Pro-Am” or otherwise in new media? How could practice-based approaches inform this research instead of relying on espoused theories-in-use? What new methodologies could be developed for CJ implementation? What role can the “heroic” individual reporter or editor have in “the swarm”? Do the claims about OhmyNews and other sites stand up to longitudinal observation? Are the theories used to justify Citizen Journalism’s normative stance (Rheingold; Surowiecki; Pesce) truly robust generalisations for strategic execution or do they reflect the biases of their creators? How could developers tap the conceptual dimensions of information technology innovation (Shasha) to create the next Facebook, MySpace or Wikipedia? References Argyris, Chris, and Donald Schon. Theory in Practice. San Francisco: Jossey-Bass Publishers, 1976. Barlow, Aaron. The Rise of the Blogosphere. Westport, CN: Praeger Publishers, 2007. Block, Peter. Flawless Consulting. 2nd ed. San Francisco, CA: Jossey-Bass/Pfeiffer, 2000. Boriss, Steve. “Citizen Journalism Is Dead. Expert Journalism Is the Future.” The Future of News. 28 Nov. 2007. 20 Feb. 2008 <http://thefutureofnews.com/2007/11/28/citizen-journalism-is-dead-expert-journalism-is-the-future/>. Brooks, Jr., Frederick P. The Mythical Man-Month: Essays on Software Engineering. Rev. ed. Reading, MA: Addison-Wesley Publishing Company, 1995. Campbell, Vincent. Information Age Journalism: Journalism in an International Context. New York: Arnold, 2004. Carlile, Paul R., and Clayton M. Christensen. “The Cycles of Building Theory in Management Research.” Innosight working paper draft 6. 6 Jan. 2005. 19 Feb. 2008 <http://www.innosight.com/documents/Theory%20Building.pdf>. Caverly, Doug.
“Hyperlocal News Site Takes A Hit.” WebProNews.com 6 July 2007. 19 Feb. 2008 <http://www.webpronews.com/topnews/2007/07/06/hyperlocal-news-sites-take-a-hit>. Chenoweth, Neil. Virtual Murdoch: Reality Wars on the Information Superhighway. Sydney: Random House Australia, 2001. Christensen, Clayton M. The Innovator’s Dilemma: When New Technologies Cause Great Firms to Fail. Boston, MA: Harvard Business School Press, 1997. Christensen, Clayton M., Curtis Johnson, and Michael Horn. Disrupting Class: How Disruptive Innovation Will Change the Way the World Learns. New York: McGraw-Hill, 2008. Curtis, Adam. The Mayfair Set. London: British Broadcasting Corporation, 1999. Etherington, Kim. Becoming a Reflexive Researcher: Using Ourselves in Research. London: Jessica Kingsley Publishers, 2004. Festinger, Leon. A Theory of Cognitive Dissonance. Stanford, CA: Stanford University Press, 1962. Feyerabend, Paul. Against Method. 3rd ed. London: Verso, 1993. Finnemore, Martha. National Interests in International Society. Ithaca, NY: Cornell University Press, 1996. Geertz, Clifford. The Interpretation of Cultures. New York: Basic Books, 1973. Ghoshal, Sumantra. “Bad Management Theories Are Destroying Good Management Practices.” Academy of Management Learning & Education 4.1 (2005): 75-91. Gibson, William. Pattern Recognition. London: Viking, 2003. Gladwell, Malcolm. “The Cool-Hunt.” The New Yorker Magazine 17 March 1997. 20 Feb. 2008 <http://www.gladwell.com/1997/1997_03_17_a_cool.htm>. Gross, Daniel. Pop! Why Bubbles Are Great for the Economy. New York: Collins, 2007. Hoffer, Eric. The True Believer. New York: Harper, 1951. Howe, Jeff. “Did Assignment Zero Fail? A Look Back, and Lessons Learned.” Wired News 16 July 2007. 19 Feb. 2008 <http://www.wired.com/techbiz/media/news/2007/07/assignment_zero_final?currentPage=all>. Kahneman, Daniel, and Amos Tversky. Choices, Values and Frames. Cambridge: Cambridge UP, 2000. Keen, Andrew. The Cult of the Amateur. New York: Doubleday Currency, 2007. Khurana, Rakesh. From Higher Aims to Hired Hands. Princeton, NJ: Princeton UP, 2007. Kolodzy, Janet. Convergence Journalism: Writing and Reporting across the News Media. Oxford: Rowman & Littlefield, 2006. Koppelman, Charles. Behind the Seen: How Walter Murch Edited Cold Mountain Using Apple’s Final Cut Pro and What This Means for Cinema. Upper Saddle River, NJ: New Rider, 2004. Leadbeater, Charles, and Paul Miller. “The Pro-Am Revolution”. London: Demos, 24 Nov. 2004. 19 Feb. 2008 <http://www.demos.co.uk/publications/proameconomy>. Loomis, Carol J. “Bloomberg’s Money Machine.” Fortune 5 April 2007. 20 Feb. 2008 <http://money.cnn.com/magazines/fortune/fortune_archive/2007/04/16/8404302/index.htm>. Lynch, Peter, and John Rothchild. Beating the Street. Rev. ed. New York: Simon & Schuster, 1994. Maister, David. True Professionalism. New York: The Free Press, 1997. Maister, David, Charles H. Green, and Robert M. Galford. The Trusted Advisor. New York: The Free Press, 2004. Mensching, Leah McBride. “Citizen Journalism on Its Way Out?” SFN Blog, 30 Nov. 2007. 20 Feb. 2008 <http://www.sfnblog.com/index.php/2007/11/30/940-citizen-journalism-on-its-way-out>. Meyer, Philip. Precision Journalism. 4th ed. Lanham, MD: Rowman & Littlefield, 2002. McConnell, Steve. Professional Software Development. Boston, MA: Addison-Wesley, 2004. Mintzberg, Henry. Managers Not MBAs. San Francisco, CA: Berrett-Koehler, 2004. Morgan, Gareth. Images of Organisation. Rev. ed. Thousand Oaks, CA: Sage, 2006. Newsvine. “Msnbc.com Acquires Newsvine.” 7 Oct.
2007. 20 Feb. 2008 <http://blog.newsvine.com/_news/2007/10/07/1008889-msnbccom-acquires-newsvine>. Niederhoffer, Victor, and Laurel Kenner. Practical Speculation. New York: John Wiley & Sons, 2003. Pearlstine, Norman. Off the Record: The Press, the Government, and the War over Anonymous Sources. New York: Farrar, Straus & Giroux, 2007. Pesce, Mark D. “Mob Rules (The Law of Fives).” The Human Network 28 Sep. 2007. 20 Feb. 2008 <http://blog.futurestreetconsulting.com/?p=39>. Rheingold, Howard. Smart Mobs: The Next Social Revolution. Cambridge MA: Basic Books, 2002. Rosen, Jay. What Are Journalists For? Princeton NJ: Yale UP, 2001. Shasha, Dennis Elliott. Out of Their Minds: The Lives and Discoveries of 15 Great Computer Scientists. New York: Copernicus, 1995. Slywotzky, Adrian. Value Migration: How to Think Several Moves Ahead of the Competition. Boston, MA: Harvard Business School Press, 1996. Smith, Steve. “The Self-Image of a Discipline: The Genealogy of International Relations Theory.” Eds. Steve Smith and Ken Booth. International Relations Theory Today. Cambridge, UK: Polity Press, 1995. 1-37. Spar, Debora L. Ruling the Waves: Cycles of Discovery, Chaos and Wealth from the Compass to the Internet. New York: Harcourt, 2001. Surowiecki, James. The Wisdom of Crowds. New York: Doubleday, 2004. Thompson, Evan. Mind in Life: Biology, Phenomenology, and the Sciences of Mind. Cambridge, MA: Belknap Press, 2007. Trippi, Joe. The Revolution Will Not Be Televised. New York: ReganBooks, 2004. Underwood, Doug. When MBA’s Rule the Newsroom. New York: Columbia University Press, 1993. Wark, McKenzie. Virtual Geography: Living with Global Media Events. Bloomington IN: Indiana UP, 1994. Wolfe, Tom, and E.W. Johnson. The New Journalism. New York: Harper & Row, 1973.
APA, Harvard, Vancouver, ISO, and other styles
27

Burns, Alex. "Select Issues with New Media Theories of Citizen Journalism." M/C Journal 11, no. 1 (June 1, 2008). http://dx.doi.org/10.5204/mcj.30.

APA, Harvard, Vancouver, ISO, and other styles
28

Humphry, Justine, and César Albarrán Torres. "A Tap on the Shoulder: The Disciplinary Techniques and Logics of Anti-Pokie Apps." M/C Journal 18, no. 2 (April 29, 2015). http://dx.doi.org/10.5204/mcj.962.

Full text
Abstract:
In this paper we explore the rise of anti-gambling apps in the context of the massive expansion of gambling in new spheres of life (online and offline) and an acceleration in strategies of anticipatory and individualised management of harm caused by gambling. These apps, and the techniques and forms of labour they demand, are examples of and a mechanism through which a mode of governance premised on ‘self-care’ and ‘self-control’ is articulated and put into practice. To support this argument, we explore two government initiatives in the Australian context. Quit Pokies, a mobile app project between the Moreland City Council, North East Primary Care Partnership and the Victorian Local Governance Association, is an example of an emerging service paradigm of ‘self-care’ that uses online and mobile platforms with geo-location to deliver real time health and support interventions. A similar mobile app, Gambling Terminator, was launched by the NSW government in late 2012. Both apps work on the premise that interrupting a gaming session through a trigger, described by Quit Pokies’ creator as a “tap on the shoulder”, provides gamblers with the opportunity to take a reflexive stance and cut short their gambling practice in the course of play. We critically examine these apps as self-disciplining techniques of contemporary neo-liberalism directed towards anticipating and reducing the personal harm and social risk associated with gambling. We analyse the material and discursive elements, and new forms of user labour, through which this consumable media is framed and assembled. We argue that understanding the role of these apps, and mobile media more generally, in generating new techniques and technologies of the self, is important for identifying emerging modes of governance and their implications at a time when gambling is going through an immense period of cultural normalisation in online and offline environments. The Australian context is particularly germane for the way gambling permeates everyday spaces of sociality and leisure, and the potential of gambling interventions to interrupt and re-configure these spaces and institute a new kind of subject-state relation. Gambling in Australia Though a global phenomenon, the growth and expansion of gambling manifests distinctly in Australia because of its long cultural and historical attachment to games of chance. Australians are among the biggest bettors and losers in the world (Ziolkowski), mainly on Electronic Gaming Machines (EGM) or pokies. As of 2013, according to The World Count of Gaming Machine (Ziolkowski), there were 198,150 EGMs in the country, of which 197,274 were slot machines, with the rest being electronic table games of roulette, blackjack and poker. There are 118 persons per machine in Australia. New South Wales is the jurisdiction with most EGMs (95,799), followed by Queensland (46,680) and Victoria (28,758) (Ziolkowski). Gambling is significant in Australian cultural history and average Australian households spend at least some money on different forms of gambling, from pokies to scratch cards, every year (Worthington et al.). In 1985, long-time gambling researcher Geoffrey Caldwell stated that Australians seem to take a pride in the belief that we are a nation of gamblers. Thus we do not appear to be ashamed of our gambling instincts, habits and practices.
Gambling is regarded by most Australians as a normal, everyday practice in contrast to the view that gambling is a sinful activity which weakens the moral fibre of the individual and the community. (Caldwell 18)

The omnipresence of gambling opportunities in most Australian states has been further facilitated by the availability of online and mobile gambling and gambling-like spaces. Social casino apps, for instance, are widely popular in Australia. The slots social casino app Slotomania was the most downloaded product in the iTunes store in 2012 (Metherell). In response to the high rate of different forms of gambling in Australia, a range of disparate interest groups have identified the expansion of gambling as a concerning trend. Health researchers have pointed out that online gamblers have a higher risk of experiencing problems with gambling (at 30%) compared to 15% in offline bettors (Hastings). The incidence of gambling problems is also disproportionately high in specific vulnerable demographics, including university students (Cervini), young adults prone to substance abuse problems (Hayatbakhsh et al.), migrants (Tanasornnarong et al.; Scull & Woolcock; Ohtsuka & Ohtsuka), pensioners (Hing & Breen), female players (Lee), Aboriginal communities (Young et al.; McMillen & Donnelly) and individuals experiencing homelessness (Holdsworth et al.). While there is general recognition of the personal and public health impacts of gambling in Australia, there is a contradiction in the approach to gambling at a governance level. On one hand, its expansion is promoted and even encouraged by the federal and state governments, as gambling is an enormous source of revenue, as evidenced, for example, by the construction of the new Crown casino in Barangaroo in Sydney (Markham & Young). Campaigns trying to limit the use of poker machines, which are associated with concerns over problem gambling and addiction, are deemed by the gambling lobby as un-Australian. Paradoxically, efforts to restrict gambling or control gambling winnings have also been described as un-Australian, such as in the Australian Taxation Office's campaign against MONA's founder, David Walsh, whose immense art collection was acquired with the funds from a gambling scheme (Global Mail). On the other hand, people experiencing problems with gambling are often categorised as addicts and the ultimate blame (and responsibility) is attributed to the individual. In Australia, attitudes towards people who are arguably addicted to gambling differ from those towards individuals afflicted by alcohol or drug abuse (Jean). While "Australians tend to be sympathetic towards people with alcohol and other drug addictions who seek help," unless it is seen as one of the more socially acceptable forms of occasional, controlled gambling (such as sports betting, gambling on the Melbourne Cup or celebrating ANZAC Day with Two-Up), gambling is framed as an individual "problem" and "moral failing" (Jean). The expansion of gambling is the backdrop to another development in health care and public health discourse, which has for some time now been devoted to the ideal of what Lupton has called the "digitally engaged patient" (Lupton). Technologies are central to the delivery of this model of health service provision that puts the patient at the centre of, and responsible for, their own health and medical care.
Lupton has pointed out how this discourse, while appearing new, is in fact the latest version of the 1970s emphasis on the 'patient as consumer', an idea given an extra injection by the massive development and availability of digital and interactive web-based and mobile platforms, many of these directed towards the provision of health and health-related information and services. What this means for patients is that, rather than relying solely on professional medical expertise and care, the patient is encouraged to take on some of this medical/health work and to conduct practices of 'self-care' (Lupton).

The Discourse of 'Self-Management' and 'Self-Care'

The model of 'self-care' and 'self-management' by 'empowering' digital technology has now become a dominant discourse within health and medicine, and is increasingly deployed across a range of related sectors such as welfare services. In recent research conducted on homelessness and mobile media, for example, government department staff involved in the reform of welfare services referred to 'self-management' as the new service paradigm that underpins their digital reform strategy. Echoing ideas and language similar to the "digitally engaged patient", customers of Centrelink, Medicare and other 'human services' are being encouraged (through planned strategic initiatives aimed at shifting targeted customer groups online) to transact with government services digitally and manage their own personal profiles and health information. One departmental staff member described this in terms of an "opportunity cost": the savings in time otherwise spent standing in long queues in service centres (Humphry). Rather than view these examples as isolated incidents taking place within or across sectors or disciplines, they are better understood as features of an emerging 'discursive formation', a term Foucault used to describe the way in which particular institutions and/or the state establish a regime of truth, or an accepted social reality, which gives definition to a new historical episteme and subject: in this case that of the self-disciplined and "digitally engaged medical/health patient". As Foucault explained, once this subject has become fully integrated into and across the social field, it is no longer easy to excavate, since it lies below the surface of articulation and is held together through everyday actions, habits and institutional routines and techniques that appear to be universal, necessary and/or normal. The way in which this citizen subject becomes a universal model and norm, however, is not a straightforward or linear story and, since we are in the midst of its rise, is not a story with a foretold conclusion. Nevertheless, across a range of different fields of governance (medicine, health and welfare) we can see signs of this emerging figure of the self-caring "digitally engaged patient" constituted from a range of different techniques and practices of self-governance. In Australia, this figure is at the centre of a concerted strategy of service digitisation involving a number of cross-sector initiatives such as Australia's National E-Health Strategy (2008), the National Digital Economy Strategy (2011) and the Australian Public Service Mobile Roadmap (2013). This figure of the self-caring "digitally engaged" patient aligns well and is entirely compatible with neo-liberal formulations of the individual and the reduced role of the state as a provider of welfare and care.
Berry refers to Foucault's definition of neoliberalism as outlined in his lectures at the Collège de France as a "particular form of post-welfare state politics in which the state essentially outsources the responsibility of the 'well-being' of the population" (65). In the case of gambling, the neoliberally defined state enables the wedding of two seemingly contradictory stances: promoting gambling as a major source of revenue and capitalisation on the one hand, and identifying and treating gambling addiction as an individual pursuit and potential risk on the other. Risk avoidance strategies are focused on particular groups of people who are targeted for self-treatment to avoid the harm of gambling addiction, which is similarly framed as individual rather than socially and systemically produced. What unites and makes possible this alignment of neoliberalism and the new "digitally engaged subject/patient" is first and foremost the construction of a subject in a chronic state of ill health. This figure is positioned as terminal from the start. They are 'sick', a 'patient', an 'addict': in need of immediate and continuous treatment. Secondly, this neoliberal patient/addict is enabled (we could even go so far as to say 'empowered') by digital technology, especially smartphones and the apps available through these devices in the form of a myriad of applications for intervening in and treating one's afflictions. These apps range from self-tracking programs such as mood regulators through to social media interventions.

Anti-Pokie Apps and the Neoliberal Gambler

We now turn to two examples which illustrate this alignment between neoliberalism and the new "digitally engaged subject/patient" in relation to gambling. Anti-gambling apps function to both replace or 'take the place' of institutions and individuals actively involved in the treatment of problem gambling and to re-engineer this service through the logics of 'self-care' and 'self-management'. Here, we depart somewhat from Foucault's model of disciplinary power summed up in the institution (with the prison exemplifying this disciplinary logic) and move towards Deleuze's understanding of power as exerted by the State not through enclosures but through diffuse and rhizomatic information flows and technologies (Deleuze). At the same time, we retain Foucault's attention to the role and agency of the user in this power-dynamic, identifiable in the technics of self-regulation and in his ideas on governmentality. We now analyse these apps more closely, and explore the way in which they articulate and perform these disciplinary logics. The app Quit Pokies was a joint venture of the North East Primary Care Partnership, the Victorian Local Governance Association and the Moreland City Council, launched in early 2014. The idea of the rational, self-reflexive and agentic user is evident in the description of the app by its developer Susan Rennie, who described it this way: "What they need is for someone to tap them on the shoulder and tell them to get out of there… I thought the phone could be that tap on the shoulder." The "tap on the shoulder" feature uses geolocation and works by emitting a sound alert when the user enters a gaming venue. It also provides information about each user's losses at that venue. This "tap on the shoulder" is both an alert and a reprimand drawing on past gambling sessions. Through the Responsible Gambling Fund, the NSW government also launched an anti-pokie app in 2013, Gambling Terminator, which includes a similar feature.
The app runs on Apple and Android smartphone platforms, and when a person is inside a gambling venue in New South Wales it "sends reminder messages that interrupt gaming-machine play and gives you a chance to re-think your choices. It also provides instant access to live phone and online counselling services which operate 24 hours a day, seven days a week" (Google Play Store). Yet an approach that tries to prevent harm by anticipating the harm that will come from gambling at the point of entering a venue also eliminates the chance of potential negotiations and encounters a user might have during a visit to the pub, and forecloses how this experience will unfold. It reduces the "tap on the shoulder", which may involve a far wider set of interactions and affects, to a software operation, and it frames the pub or the club (which under some conditions function as hubs for socialisation and community building) as dangerous places that should be avoided. This has the potential to lead to further stigmatisation of gamblers, their isolation and their exclusion from everyday spaces. Moreland Mayor, Councillor Tapinos, captures the implicit framing of self-care as a private act in his explanation of the app as a method for problem gamblers to avoid being stigmatised by, for example, publicly attending group meetings. Yet, curiously, the app has the potential to create a new kind of public stigmatisation through potentially drawing other people's attention to users' gambling play (as the alarm is triggered), generating embarrassment and humiliation at being "caught out" in an act framed as aberrant and, literally, "alarming". Both Quit Pokies and Gambling Terminator require their users to perform 'acts' of physical and affective labour aimed at behaviour change and developing the skills of self-control. After downloading Quit Pokies on the iPhone and launching the app, the user is presented with an initial request: "Before you set up this app, please write a list of the pokies venues that you regularly use because the app will ask you to identify these venues so it can send you alerts if you spend time in these locations. It will also use your set up location to identify other venues you might use so we recommend that you set up the App in the location where you spend most time. Congratulations on choosing Quit Pokies." Self-performed processes include installation, setting up, updating the app software, programming in gambling venues to be detected by the smartphone's inbuilt GPS, monitoring and responding to the program's alerts and engaging in alternate "legitimate" forms of leisure such as going to the movies or the library, having coffee with a friend or browsing Facebook. These self-performed labours can be understood as 'technologies of the self', a term used by Foucault to describe the way in which social members are obliged to regulate and police their 'selves' through a range of different techniques. While Foucault traces the origins of 'technologies of the self' to Greco-Roman texts, with their emphasis on "care of oneself" as one of the duties of citizenry, he notes the shift to "self-knowledge" under Christianity around the 8th century, where it became bound up in ideals of self-renunciation and truth. Quit Pokies and Gambling Terminator may signal a recuperation of the ideal of self-care over confession and disclosure.
These apps institute a set of bodily activities and obligations directed to the user's health and wellbeing, aided through activities of self-examination such as charting your recovery through a Recovery Diary and implementing a number of suggested "Strategies for Change" such as "writing a list" and "learning about ways to manage your money better". Writing is central to these acts of self-examination. As Jeremy Prangnell, gambling counsellor with Mission Australia for the Wollongong and Shellharbour regions, explained, the app is "like an electronic diary, which is a really common tool for people who are trying to change their behaviour" (Thompson). The labours required of users are also implicated in the functionality and performance of the platform itself, suggesting the way in which 'technologies of the self' simultaneously function as a form of platform work: user labour that supports and sustains the operation of digital systems and is central to the performance and continuation of digital capitalism in general (Humphry, Demanding Media). In addition to the acts of labour performed on the self and the platform, bodies are themselves potentially mobilised (and put into new circuits of consumption and production) as a result of triggers that nudge users away from gambling venues towards a range of other cultural practices in alternative social spaces considered to be more legitimate.

Conclusion

Whether or not these technological interventions are effective or successful is yet to be tested. Indeed, the lack of recent activity in the community forums and the preponderance of reported installation and usage issues suggest otherwise, pointing to a need for more empirical research into these developments. Regardless, what we have tried to identify is the way in which apps such as these embody a new kind of subject-state relation that emphasises self-control of gambling harm and hastens the divestment of institutional and social responsibility at a time when gambling is going through an immense period of expansion, in many respects backed and sanctioned by the state. Patterns of smartphone take-up in the mainstream population and the rise of the so-called 'mobile only population' (ACMA) provide support for this new subject and service paradigm and are often cited as the rationale for digital service reform (APSMR). Media convergence feeds into these dynamics: service delivery becomes the new frontier for the merging of previously separate media distribution systems (Dwyer). Letters, customer service centres, face-to-face meetings and web sites are combined, and in some instances replaced, with online and mobile media platforms accessible from multiple and mobile devices. These changes are not, however, simply the migration of services to a digital medium with little effective change to the service itself. Health and medical services are re-invented through their technological re-assemblage, bringing into play new meanings, practices and negotiations among the state, industry and neoliberal subjects (in the case of problem gambling apps, a new subjectivity: the 'neoliberal addict'). These new assemblages are as much about bringing forth a new kind of subject and mode of governance as they are a solution to problem gambling. This figure of the self-treating "gambler addict" can be seen as a template for, and prototype of, a more generalised and universalised self-governing citizen: one who no longer needs or makes demands on the state but who can help themselves and manage their own harm.
Paradoxically, this shift carries the potential for new risks and harms to the very same users: their outright exclusion as a result of deprivation of basic and assumed digital access and literacy, the further stigmatisation of gamblers, the elimination of opportunities for proximal support, and their exclusion from everyday spaces.

References

Albarrán-Torres, César. "Gambling-Machines and the Automation of Desire." Platform: Journal of Media and Communication 5.1 (2013).
Australian Communications and Media Authority. "Australians Cut the Cord." Research Snapshots. Sydney: ACMA, 2013.
Berry, David. Critical Theory and the Digital. New York: Bloomsbury Academic, 2014.
Berry, David. Stunlaw: A Critical Review of Politics, Arts and Technology. 2012. ‹http://stunlaw.blogspot.com.au/2012/03/code-foucault-and-neoliberal.html›.
Caldwell, G. "Some Historical and Sociological Characteristics of Australian Gambling." Gambling in Australia. Eds. G. Caldwell, B. Haig, M. Dickerson, and L. Sylvan. Sydney: Croom Helm Australia, 1985. 18-27.
Cervini, E. "High Stakes for Gambling Students." The Age 8 Nov. 2013. ‹http://www.theage.com.au/national/education/high-stakes-for-gambling-students-20131108-2x5cl.html›.
Deleuze, Gilles. "Postscript on the Societies of Control." October (1992): 3-7.
Foucault, Michel. "Technologies of the Self." Eds. Luther H. Martin, Huck Gutman and Patrick H. Hutton. Boston: University of Massachusetts Press, 1988.
Hastings, E. "Online Gamblers More at Risk of Addiction." Herald Sun 13 Oct. 2013. ‹http://www.heraldsun.com.au/news/online-gamblers-more-at-risk-of-addiction/story-fni0fiyv-1226739184629#!›.
Hayatbakhsh, Mohammad R., et al. "Young Adults' Gambling and Its Association with Mental Health and Substance Use Problems." Australian and New Zealand Journal of Public Health 36.2 (2012): 160-166.
Hing, Nerilee, and Helen Breen. "A Profile of Gaming Machine Players in Clubs in Sydney, Australia." Journal of Gambling Studies 18.2 (2002): 185-205.
Holdsworth, Louise, Margaret Tiyce, and Nerilee Hing. "Exploring the Relationship between Problem Gambling and Homelessness: Becoming and Being Homeless." Gambling Research 23.2 (2012): 39.
Humphry, Justine. "Demanding Media: Platform Work and the Shaping of Work and Play." Scan: Journal of Media Arts Culture 10.2 (2013): 1-13.
Humphry, Justine. "Homeless and Connected: Mobile Phones and the Internet in the Lives of Homeless Australians." Australian Communications Consumer Action Network, Sep. 2014. ‹https://www.accan.org.au/grants/completed-grants/619-homeless-and-connected›.
Lee, Timothy Jeonglyeol. "Distinctive Features of the Australian Gambling Industry and Problems Faced by Australian Women Gamblers." Tourism Analysis 14.6 (2009): 867-876.
Lupton, D. "The Digitally Engaged Patient: Self-Monitoring and Self-Care in the Digital Health Era." Social Theory & Health 11.3 (2013): 256-70.
Markham, Francis, and Martin Young. "Packer's Barangaroo Casino and the Inevitability of Pokies." The Conversation 9 July 2013. ‹http://theconversation.com/packers-barangaroo-casino-and-the-inevitability-of-pokies-15892›.
Markham, Francis, and Martin Young. "Who Wins from 'Big Gambling' in Australia?" The Conversation 6 Mar. 2014. ‹http://theconversation.com/who-wins-from-big-gambling-in-australia-22930›.
McMillen, Jan, and Katie Donnelly. "Gambling in Australian Indigenous Communities: The State of Play." The Australian Journal of Social Issues 43.3 (2008): 397.
Metherell, Mark. "Virtual Pokie App a Hit - But 'Not Gambling.'" Sydney Morning Herald 13 Jan. 2013. ‹http://www.smh.com.au/digital-life/smartphone-apps/virtual-pokie-app-a-hit--but-not-gambling-20130112-2cmev.html›.
Ohtsuka, Keis, and Thai Ohtsuka. "Vietnamese Australian Gamblers' Views on Luck and Winning: Universal versus Culture-Specific Schemas." Asian Journal of Gambling Issues and Public Health 1.1 (2010): 34-46.
Scull, Sue, and Geoffrey Woolcock. "Problem Gambling in Non-English Speaking Background Communities in Queensland, Australia: A Qualitative Exploration." International Gambling Studies 5.1 (2005): 29-44.
Tanasornnarong, Nattaporn, Alun Jackson, and Shane Thomas. "Gambling among Young Thai People in Melbourne, Australia: An Exploratory Study." International Gambling Studies 4.2 (2004): 189-203.
Thompson, Angela. "Live Gambling Odds Tipped for the Chop." Illawarra Mercury 22 May 2013: 6.
Worthington, Andrew, et al. "Gambling Participation in Australia: Findings from the National Household Expenditure Survey." Review of Economics of the Household 5.2 (2007): 209-221.
Young, Martin, et al. "The Changing Landscape of Indigenous Gambling in Northern Australia: Current Knowledge and Future Directions." International Gambling Studies 7.3 (2007): 327-343.
Ziolkowski, S. "The World Count of Gaming Machines 2013." Gaming Technologies Association, 2014. ‹http://www.gamingta.com/pdf/World_Count_2014.pdf›.
APA, Harvard, Vancouver, ISO, and other styles
29

Jethani, Suneel, and Robbie Fordyce. "Darkness, Datafication, and Provenance as an Illuminating Methodology." M/C Journal 24, no. 2 (April 27, 2021). http://dx.doi.org/10.5204/mcj.2758.

Full text
Abstract:
Data are generated and employed for many ends, including governing societies, managing organisations, leveraging profit, and regulating places. In all these cases, data are key inputs into systems that paradoxically are implemented in the name of making societies more secure, safe, competitive, productive, efficient, transparent and accountable, yet do so through processes that monitor, discipline, repress, coerce, and exploit people. (Kitchin, 165)

Introduction

Provenance refers to the place of origin or earliest known history of a thing. It refers to the custodial history of objects. It is a term that is commonly used in the art world but has also come into the language of other disciplines such as computer science. It has also been applied in reference to the transactional nature of objects in supply chains and circular economies. In an interview with Scotland's Institute for Public Policy Research, Adam Greenfield suggests that provenance has a role to play in the "establishment of reliability", given that if a "transaction or artifact has a specified provenance, then that assertion can be tested and verified to the satisfaction of all parties" (Lawrence). Recent debates on the unrecognised effects of digital media have convincingly argued that data is fully embroiled within capitalism, but it is necessary to remember that data is more than just a transactable commodity. One challenge in bringing processes of datafication into critical light is how we understand what happens to data from its point of acquisition to the point where it becomes instrumental in the production of outcomes that are of ethical concern. All data gather their meaning through relationality, whether acting as a representation of an exterior world or representing relations between other data points. Data objectifies relations, and despite any higher-order complexities, at its core, data is involved in factualising a relation into a binary. Assumptions like these about data shape reasoning, decision-making and evidence-based practice in private, personal and economic contexts. If processes of datafication are to be better understood, then we need to seek out conceptual frameworks that are adequate to the way that data is used and understood by its users. Deborah Lupton suggests that often we give data "other vital capacities because they are about human life itself, have implications for human life opportunities and livelihoods, [and] can have recursive effects on human lives (shaping action and concepts of embodiment ... selfhood [and subjectivity]) and generate economic value". But when data are afforded such capacities, the analysis of their politics also calls for us to "consider context" and to make "the labour [of datafication] visible" (D'Ignazio and Klein). For Jenny L. Davis, getting beyond simply thinking about what data affords involves bringing to light how an artifact continually and dynamically requests, demands, encourages, discourages, and refuses certain operations and interpretations. It is in this re-orientation of the question, from what to how, that "practical analytical tool[s]" (Davis) can be found. Davis writes:

requests and demands are bids placed by technological objects, on user-subjects. Encourage, discourage and refuse are the ways technologies respond to bids user-subjects place upon them. Allow pertains equally to bids from technological objects and the object's response to user-subjects. (Davis)
Building on Lupton, Davis, and D'Ignazio and Klein, we see three principles that we consider crucial for work on data, darkness and light: data is not simply a technological object that exists within sociotechnical systems without having undergone any priming or processing, and as a consequence the data-collecting entity imposes standards and ways of imagining data before it comes into contact with user-subjects; data is not neutral and does not possess qualities that make it equivalent to the things that it comes to represent; data is partial, situated, and contingent on technical processes, but the outcomes of its use afford it properties beyond those that are purely informational. This article builds from these principles and traces a framework for investigating the complications arising when data moves from one context to another. We draw on "data provenance" as it is applied in the computing and informational sciences, where it is used to query the location and accuracy of data in databases. In developing "data provenance", we adapt provenance from an approach that solely focuses on the technical infrastructures and material processes that move data from one place to another, and turn to the sociotechnical, institutional, and discursive forces that bring about data acquisition, sharing, interpretation, and re-use. As data passes through open, opaque, and darkened spaces within sociotechnical systems, we argue that provenance can shed light on gaps and overlaps in technical, legal, ethical, and ideological forms of data governance. Whether data becomes exclusive by moving from light to dark (as has happened with the removal of many pages and links from Facebook around the Australian news revenue-sharing bill), or is publicised by shifting from dark to light (such as the Australian government releasing investigative journalist Andie Fox's welfare history to the press), or even recontextualised from one dark space to another (as with genetic data shifting from medical to legal contexts, or the theft of personal financial data), there is still a process of transmission here that we can assess and critique through provenance. These different modalities, which guide data acquisition, sharing, interpretation, and re-use, cascade and influence different elements and apparatuses within data-driven sociotechnical systems to different extents depending on context. Attempts to illuminate and make sense of these complex forces, we argue, expose data-driven practices as inherently political in terms of whose interests they serve.

Provenance in Darkness and in Light

When processes of data capture, sharing, interpretation, and re-use are obscured, it impacts on the extent to which we might retrospectively examine cases where malpractice in responsible data custodianship and stewardship has occurred, because it makes it difficult to see how things have been rendered real and knowable, changed over time, had causality ascribed to them, and to what degree of confidence a decision has been made based on a given dataset. To borrow from this issue's concerns, the paradigm of dark spaces covers a range of different kinds of valences on the idea of private, secret, or exclusive contexts. We can parallel it with the idea of 'light' spaces, which equally holds a range of different concepts about what is open, public, or accessible.
For instance, in the use of social data garnered from online platforms, the practices of academic researchers and analysts working in the private sector often fall within a grey zone when it comes to consent and transparency. Here the binary notion of public and private is complicated by the passage of data from light to dark (and back to light). Writing in a different context, Michael Warner complicates the notion of publicness. He observes that the idea of something being public is in and of itself always sectioned off, divorced from being fully generalisable, and it is “just whatever people in a given context think it is” (11). Michael Hardt and Antonio Negri argue that publicness is already shadowed by an idea of state ownership, leaving us in a situation where public and private already both sit on the same side of the propertied/commons divide as if the “only alternative to the private is the public, that is, what is managed and regulated by states and other governmental authorities” (vii). The same can be said about the way data is conceived as a public good or common asset. These ideas of light and dark are useful categorisations for deliberately moving past the tensions that arise when trying to qualify different subspecies of privacy and openness. The problem with specific linguistic dyads of private vs. public, or open vs. closed, and so on, is that they are embedded within legal, moral, technical, economic, or rhetorical distinctions that already involve normative judgements on whether such categories are appropriate or valid. Data may be located in a dark space for legal reasons that fall under the legal domain of ‘private’ or it may be dark because it has been stolen. It may simply be inaccessible, encrypted away behind a lost password on a forgotten external drive. Equally, there are distinctions around lightness that can be glossed – the openness of Open Data (see: theodi.org) is of an entirely separate category to the AACS encryption key, which was illegally but enthusiastically shared across the internet in 2007 to the point where it is now accessible on Wikipedia. The language of light and dark spaces allows us to cut across these distinctions and discuss in deliberately loose terms the degree to which something is accessed, with any normative judgments reserved for the cases themselves. Data provenance, in this sense, can be used as a methodology to critique the way that data is recontextualised from light to dark, dark to light, and even within these distinctions. Data provenance critiques the way that data is presented as if it were “there for the taking”. This also suggests that when data is used for some or another secondary purpose – generally for value creation – some form of closure or darkening is to be expected. Data in the public domain is more than simply a specific informational thing: there is always context, and this contextual specificity, we argue, extends far beyond anything that can be captured in a metadata schema or a licensing model. Even the transfer of data from one open, public, or light context to another will evoke new degrees of openness and luminosity that should not be assumed to be straightforward. And with this a new set of relations between data-user-subjects and stewards emerges. 
Personal information is increasingly generated through the traces left behind as people make use of digitised services in their everyday lives. The movement of this data between public and private contexts means that data-motile processes are constantly occurring behind the scenes, in darkness, where data comes into the view, or possession, of third parties without obvious mechanisms of consent, disclosure, or justification. Given that there are "many hands" (D'Ignazio and Klein) involved in making data portable between light and dark spaces, there can equally be diversity in the approaches taken to generate critical literacies of these relations. There are two complexities that we argue are important for considering the ethics of data motility from light to dark, and these differ from the concerns that we might have when we think about other illuminating tactics such as open data publishing, freedom-of-information requests, or when data is anonymously leaked in the public interest. The first is that the terms of ethics must be communicable to individuals and groups whose data literacy may be low, effectively non-existent, or not oriented around the objective of upholding or generating data-luminosity as an element of a wider, more general form of responsible data stewardship. Historically, a productive approach to data literacy has been finding appropriate metaphors from adjacent fields that can help add depth, by way of analogy, to understanding data motility. Here we return to our earlier assertion that data is more than simply a transactable commodity. Consider the notion of "giving" and "taking" in the context of darkness and light. The analogy of giving and taking is deeply embedded in the notion of data acquisition and sharing by virtue of the etymology of the word data itself: in Latin, "things having been given", while in French données suggests a natural gift, perhaps one that is given to those that attempt capture for the purposes of empiricism; representation in quantitative form is a quality that is given to phenomena being brought into the light. However, in the contemporary parlance of "analytics", data is "taken" in the form of recording, measuring, and tracking. Data is considered to be something valuable enough to give or take because of its capacity to stand in for real things. The empiricist's preferred method is to take rather than to accept what is given (Kitchin, 2); the data-capitalist's is to incentivise the act of giving or to take what is already given (or yet to be taken). Because data-motile processes are not simply passive forms of reading what is contained within a dataset, the materiality and subjectivity of data extraction and interpretation is something that should not be ignored. These processes represent the recontextualisation of data from one space to another and are expressed in the landmark case of Cambridge Analytica, where a private research company extracted data from Facebook and used it to engage in psychometric analysis of unknowing users.

Table 1: Mechanisms of Data Capture (data capture mechanism: characteristics and approach to data stewardship)

Historical: Information created, recorded, or gathered about people or things directly from the source or a delegate but accessed for secondary purposes.
Observational: Represents patterns and realities of everyday life, collected by subjects by their own choice and with some degree of discretion over the methods. Third parties access this data through reciprocal arrangement with the subject (e.g., in exchange for providing a digital service such as online shopping, banking, healthcare, or social networking).
Purposeful: Data gathered with a specific purpose in mind and collected with the objective to manipulate its analysis to achieve certain ends.
Integrative: Places less emphasis on specific data types but rather looks towards social and cultural factors that afford access to and facilitate the integration and linkage of disparate datasets.

There are ethical challenges associated with data that has been sourced from pre-existing sets or that has been extracted from websites and online platforms through scraping and then enriched through cleaning, annotation, de-identification, aggregation, or linking to other data sources (tab. 1). As a way to address this challenge, our suggestion of "data provenance" can be defined as where a data point comes from, how it came into being, and how it became valuable for some or another purpose. In developing this idea, we borrow from both the computational and biological sciences (Buneman et al.), where provenance, as a form of qualitative inquiry into data-motile processes, centres around understanding the origin of a data point as part of a broader, almost forensic, analysis of quality and error-potential in datasets. Provenance is an evaluation of a priori computational inputs and outputs from the results of database queries and audits. Provenance can also be applied to other contexts where data passes through sociotechnical systems, such as behavioural analytics, targeted advertising, machine learning, and algorithmic decision-making. Conventionally, data provenance is based on understanding where data has come from and why it was collected. Both these questions are concerned with the evaluation of the nature of a data point within the wider context of a database that is itself situated within a larger sociotechnical system where the data is made available for use. In its conventional sense, provenance is a means of ensuring that a data point is maintained as a single source of truth (Buneman, 89), and, by way of a reproducible mechanism which allows its path through a set of technical processes to be retraced, it affords the assessment of how reliable a system's output might be by sheer virtue of the ability for one to retrace the steps from point A to point B. "Where" and "why" questions are illuminating because they offer an ends-and-means view of the relation between the origins and ultimate uses of a given data point or set. Provenance is interesting when studying data luminosity because means and ends have much to tell us about the origins and uses of data in ways that gesture towards a more accurate and structured research agenda for data ethics that takes the emphasis away from individual moral patients and reorients it towards practices that occur within information management environments. Provenance offers researchers seeking to study data-driven practices a heuristic similar to a journalist's line of questioning: who, what, when, where, why, and how? This last question of how is something that can be incorporated into conventional models of provenance to make them useful in data ethics. The question of how data comes into being extends questions of power, legality, literacy, permission-seeking, and harm in an entangled way, and notes how these factors shape the nature of personal data as it moves between contexts.
Forms of provenance accumulate from transaction to transaction, cascading along as a dataset 'picks up' the types of provenance that have led to its creation. This may involve multiple forms of overlapping provenance, methodological and epistemological, legal and illegal, which modulate different elements and apparatuses. Provenance, we argue, is an important methodological consideration for workers in the humanities and social sciences. Provenance provides a set of shared questions on which models of transparency, accountability, and trust may be established. It points us towards tactics that might help data-subjects understand privacy in a contextual manner (Nissenbaum) and even establish practices of obfuscation and "informational self-defence" against regimes of datafication (Brunton and Nissenbaum). Here provenance is not just a declaration of what the means and ends of data capture, sharing, linkage, and analysis are. We sketch the outlines of a provenance model in table 2 below.

Table 2: Forms of Data Provenance

What? (Metaphorical frame: the epistemological structure of a database determines the accuracy of subsequent decisions. Data must be consistent.)
Dark: What data is asked of a person beyond what is strictly needed for service delivery.
Light: Data that is collected for a specific stated purpose with informed consent from the data-subject.
Questions: How does the decision about what to collect disrupt existing polities and communities? What demands for conformity does the database make of its subjects?

Where? (Metaphorical frame: the contents of a database are important for making informed decisions. Data must be represented.)
Dark: The parameters of inclusion/exclusion that create unjust risks or costs to people because of their inclusion or exclusion in a dataset.
Light: The parameters of inclusion or exclusion that afford individuals representation or acknowledgement by being included or excluded from a dataset.
Questions: How are populations recruited into a dataset? What divides exist that systematically exclude individuals?

Who? (Metaphorical frame: who has access to data, and how privacy is framed, is important for the security of data-subjects. Data access is political.)
Dark: Access to the data by parties not disclosed to the data-subject.
Light: Who has collected the data and who has or will access it?
Questions: How is the data made available to those beyond the data subjects?

How? (Metaphorical frame: data is created with a purpose and is never neutral. Data is instrumental.)
Dark: How the data is used, to what ends, discursively, practically, instrumentally. Is it a private record, a source of value creation, the subject of extortion or blackmail?
Light: How the data was intended to be used at the time that it was collected.

Why? (Metaphorical frame: data is created by people who are shaped by ideological factors. Data has potential.)
Dark: The political rationality that shapes data governance with regard to technological innovation.
Light: The trade-offs that are made known to individuals when they contribute data into sociotechnical systems over which they have limited control.

Conclusion

As an illuminating methodology, provenance offers a specific line of questioning for practices that take information through darkness and light.
The emphasis that it places on a narrative for data assets themselves (asking what, when, who, how, and why) offers a mechanism for traceability and has potential for application across contexts and cases. It allows us to see data malpractice as something that can be productively generalised and understood as a series of ideologically driven technical events with social and political consequences, without being marred by perceptions of the exceptionality of individual, localised cases of data harm or data violence.

References

Brunton, Finn, and Helen Nissenbaum. "Political and Ethical Perspectives on Data Obfuscation." Privacy, Due Process and the Computational Turn: The Philosophy of Law Meets the Philosophy of Technology. Eds. Mireille Hildebrandt and Katja de Vries. New York: Routledge, 2013. 171-195.
Buneman, Peter, Sanjeev Khanna, and Wang-Chiew Tan. "Data Provenance: Some Basic Issues." International Conference on Foundations of Software Technology and Theoretical Computer Science. Berlin: Springer, 2000.
Davis, Jenny L. How Artifacts Afford: The Power and Politics of Everyday Things. Cambridge: MIT Press, 2020.
D'Ignazio, Catherine, and Lauren F. Klein. Data Feminism. Cambridge: MIT Press, 2020.
Hardt, Michael, and Antonio Negri. Commonwealth. Cambridge: Harvard UP, 2009.
Kitchin, Rob. "Big Data, New Epistemologies and Paradigm Shifts." Big Data & Society 1.1 (2014).
Lawrence, Matthew. "Emerging Technology: An Interview with Adam Greenfield. 'God Forbid That Anyone Stopped to Ask What Harm This Might Do to Us.'" Institute for Public Policy Research, 13 Oct. 2017. ‹https://www.ippr.org/juncture-item/emerging-technology-an-interview-with-adam-greenfield-god-forbid-that-anyone-stopped-to-ask-what-harm-this-might-do-us›.
Lupton, Deborah. "Vital Materialism and the Thing-Power of Lively Digital Data." Social Theory, Health and Education. Eds. Deana Leahy, Katie Fitzpatrick, and Jan Wright. London: Routledge, 2018.
Nissenbaum, Helen F. Privacy in Context: Technology, Policy, and the Integrity of Social Life. Stanford: Stanford Law Books, 2010.
Warner, Michael. "Publics and Counterpublics." Public Culture 14.1 (2002): 49-90.
APA, Harvard, Vancouver, ISO, and other styles
30

McQuillan, Dan. "The Countercultural Potential of Citizen Science." M/C Journal 17, no. 6 (October 12, 2014). http://dx.doi.org/10.5204/mcj.919.

Full text
Abstract:
What is the countercultural potential of citizen science? As a participant in the wider citizen science movement, I can attest that contemporary citizen science initiatives rarely characterise themselves as countercultural. Rather, the goal of most citizen science projects is to be seen as producing orthodox scientific knowledge: the ethos is respectability rather than rebellion (NERC). I will suggest instead that there are resonances with the counterculture that emerged in the 1960s, most visibly through an emphasis on participatory experimentation and the principles of environmental sustainability and social justice. This will be illustrated by example, through two citizen science projects that have a commitment to combining social values with scientific practice. I will then describe the explicitly countercultural organisation, Science for the People, which arose from within the scientific community itself, out of opposition to the Vietnam War. Methodological and conceptual weaknesses in the authoritative model of science are explored, suggesting that there is an opportunity for citizen science to become anti-hegemonic by challenging the hegemony of science itself. This reformulation will be expressed through Deleuze and Guattari's notion of nomadic science, the means through which citizen science could become countercultural.

Counterculture

Before examining the countercultural potential of citizen science, I set out some of the grounds for identifying a counterculture, drawing on the ideas of Theodore Roszak, who invented the term counterculture to describe the new forms of youth movements that emerged in the 1960s (Roszak). This was a perspective that allowed the carnivalesque procession of beatniks, hippies and the New Left to be seen as a single paradigm shift combining psychic and social revolution. But just as striking, and more often forgotten, is the way Roszak characterised the role of the counterculture as mobilising a vital critique of the scientific worldview (Roszak 273-274). The concept of counterculture has been taken up in diverse ways since its original formulation. We can draw, for example, on Lawrence Grossberg's more contemporary analysis of counterculture (Grossberg) to clarify the main concepts and contrast them with a scientific approach. Firstly, a counterculture works on and through cultural formations. This positions it as something the scientific community would see as the other, as the opposite to the objective, repeatable and quantitative truth-seeking of science. Secondly, a counterculture is a diverse and hybrid space without a unitary identity. Again, scientists would often see science as a singular activity applied in modulated forms depending on the context, although in practice the different sciences can experience each other as different tribes. Thirdly, a counterculture is lived as a transformative experience where the participant is fundamentally changed at a psychic level through participation in unique events. Contrast this with the scientific idea of the separation of observer and observed, and the objective repeatability of the experiment irrespective of the experimenter. Fourthly, a counterculture is associated with a unique moment in time, a point of shift from the old to the new. For the counterculture of the 1960s this was the Age of Aquarius.
In general, the aim of science and scientists is to contribute to a form of truth that is essentially timeless, in that a physical law is assumed to hold across all time (and space), although science also has moments of radical change with regard to scientific paradigms. Finally, and significantly for the conclusions of this paper, according to Roszak a counterculture stands against the mainstream. It offers a challenge not at the level of detail but to the fundamental assumptions of the status quo. This is what "science" cannot do, inasmuch as science itself has become the mainstream. It was the character of science as the bedrock of all values that Roszak himself opposed and for which he named and welcomed the counterculture. Although critical of some of the more shallow aspects of its psychedelic experimentation or political militancy, he shared its criticism of the technocratic society (the technocracy) and the egocentric mode of consciousness. His hope was that the counterculture could help restore a visionary imagination along with a more human sense of community.

What Is Citizen Science?

In recent years the concept of citizen science has grown massively in popularity, but it is still an open and unstable term with many variants. Current moves towards institutionalisation (Citizen Science Association) are attempting to marry growth and stabilisation, with the first Annual General Meeting of the European Citizen Science Association securing a tentative agreement on the common principles of citizen science (Haklay, "European"). Key papers and presentations in the mainstream of the movement emphasise that citizen science is not a new activity (Bonney et al.), with much being made of the fact that the National Audubon Society started its annual Christmas Bird Count in 1900 (National Audubon Society). However, this elides the key role of the Internet in the current surge, which takes two distinct forms: the organisation of distributed fieldwork, and the online crowdsourcing of data analysis. To scientists, the appeal of citizen science fieldwork follows from its distributed character; they can research patterns over large scales and across latitudes in ways that would be impossible for a researcher at a single study site (Toomey). Gathering together the volunteer observations is made possible by an infrastructure of web tools. The role of the citizen in this is to be a careful observer; the eyes and ears of the scientist in cyberspace. In online crowdsourcing, the internet is used to present pattern recognition tasks, enrolling users in searching images for signs of new planets or the jets of material from black holes. The growth of science crowdsourcing is exponential; one of the largest sites facilitating this kind of citizen science now has well in excess of a million registered users (Zooniverse). Such is the force of the technological aura around crowdsourced science that mainstream publications often conflate it with the whole of citizen science (Parr). There are projects within citizen science which share core values with the counterculture as originally defined by Roszak, in particular open participation and social justice. These projects also show characteristics from Grossberg's analysis of counterculture; they are diverse and hybrid spaces, carry a sense of moving from an old era to a new one, and have cultural forms of their own. They open up the full range of the scientific method to participation, including problem definition, research design, analysis and action.
Citizen science projects that aim for participation in all these areas include the Extreme Citizen Science research group (ExCiteS) at University College London (UCL), the associated social enterprise Mapping for Change (Mapping for Change), and the Public Laboratory for Open Technology and Science (Public Lab). ExCiteS sees its version of citizen science as "a situated, bottom-up practice" that "takes into account local needs, practices and culture". Public Lab, meanwhile, argue that many citizen science projects only offer non-scientists token forms of participation in scientific inquiry that rarely amount to more than data collection and record keeping. They counter this through an open process which tries to involve communities all the way from framing the research questions, to prototyping tools, to collating and interpreting the measurements. ExCiteS and Public Lab also share an implicit commitment to social justice through scientific activity. The Public Lab mission is to "put scientific inquiry at the heart of civic life" and the UCL research group strive for "new devices and knowledge creation processes that can transform the world". All of their work is framed by environmental sustainability and care for the planet, whether it is enabling environmental monitoring by indigenous communities in the Congo (ExCiteS) or developing do-it-yourself spectrometry kits to detect crude oil pollution (Public Lab, "Homebrew"). Having provided a case for elements of countercultural DNA being present in bottom-up and problem-driven citizen science, we can contrast this with Science for the People, a scientific movement that was born out of the counterculture.

Countercultural Science from the 1970s: Science for the People

Science for the People (SftP) was a scientific movement seeded by a rebellion of young physicists against the role of US science in the Vietnam War. Young members of the American Physical Society (APS) lobbied for it to take a position against the war but were heavily criticised by other members, whose written complaints in the communications of the APS focused on the importance of scientific neutrality and the need to maintain the association's purely scientific nature rather than allowing science to become contaminated by politics (Sarah Bridger, in Plenary 2, 0:46 to 1:04). The counter-narrative from the dissidents argued that science is not neutral, invoking the example of Nazi science as a justification for taking a stand. After losing the internal vote the young radicals left to form Scientists and Engineers for Social and Political Action (SESPA), which later became Science for the People (SftP). As well as opposition to the Vietnam War, SftP embodied from the start other key themes of the counterculture, such as civil rights and feminism. For example, the first edition of Science for the People magazine (appearing as Vol. 2, No. 2 of the SESPA Newsletter) included an article about leading Black Panther, Bobby Seale, alongside a piece entitled "Women Demand Equality in Science." The final articles in the same issue are indicators of SftP's dual approach to science and change: both the radicalisation of professionals ("Computer Professionals for Peace") and the demystification of technical practices ("Statistics for the People") (Science for the People). Science for the People was by no means just a magazine.
For example, their technical assistance programme provided practical support to street health clinics run by the Black Panthers, and brought SftP under FBI surveillance (Herb Fox, in Plenary 1, 0:25 to 0:35). Both as a magazine and as a movement, SftP showed a tenacious longevity, with the publication being produced every two months between August 1970 and May/June 1989. It mutated through a network of affiliated local groups and international links, and was deeply involved in constructing early critiques of nuclear power and genetic determinism. SftP itself seems to have had a consistent commitment to non-hierarchical processes and, as one of the founders expressed it, a "shit kicking" approach to putting its principles into practice (Al Weinrub, in Plenary 1, 0:25 to 0:35). SftP criticised power, front and centre. It is this opposition to hegemony that puts the "counter" into counterculture, and it is missing from citizen science as currently practised. Cracks in the authority of orthodox science, which can be traced to both methodologies and basic concepts, are explored below. These can be seen as an opportunity for citizen science to directly challenge orthodox science and thus establish an anti-hegemonic stance of its own.

Weaknesses of Scientific Hegemony

In this section I argue that the weaknesses of scientific hegemony are in proportion to its claims to authority (Feyerabend). Through my scientific training as an experimental particle physicist I have participated in many discussions about the ontological and epistemological grounds for scientific authority. While most scientists choose to present their practice publicly as an infallible machine for the production of truths, the opinions behind the curtain are far more mixed. Physicist Lee Smolin has written a devastating critique of science-in-practice that focuses on the capture of the institutional economy of science by an ideological grouping of string theorists (Smolin), and his account is replete with questions about science itself and ethnographic details that bring to life the messy behind-the-scenes conflicts of scientific knowledge-making. Knowledge of this messiness has prompted some citizen science advocates to take science to task, for example for demanding higher standards in data consistency from citizen science than is often the case in orthodox science (Haklay, "Assertions"; Freitag, "Good Science"). Scientists will also invariably refer to reproducibility as the basis for the authority of scientific truths. The principle that the same experiments always get the same results, irrespective of who is doing the experiment, and as long as they follow the same method, is a foundation of scientific objectivity. However, a 2012 study of landmark results in cancer science was able to reproduce only 11 per cent of the original findings (Begley and Ellis). While this may be an outlier case, there are broader issues with statistics and falsification, a bias towards positive results, weaknesses in peer review and the "publish or perish" academic culture (The Economist). While the pressures are all too human, the resulting distortions are rarely acknowledged in public by scientists themselves. On the other hand, citizen science has been slow to pick up the gauntlet.
For example, while some scientists involved in citizen science have commented on the inequality and inappropriateness of orthodox peer review for citizen science papers (Freitag, “What Is the Role”), there has been no direct challenge to any significant part of the scientific edifice. I argue that the nearest thing to a real challenge to orthodox science is the proposal for a post-normal science, which pre-dates the current wave of citizen science. Post-normal science tries to accommodate the philosophical implications of post-structuralism and at the same time to position science to tackle problems, such as climate change, that are intractable to reproducibility (Funtowicz and Ravetz). It accomplishes this by extending the domains in which science can provide meaningful answers to include issues such as global warming, which involve high decision stakes and high uncertainty. It extends traditional peer review into an extended peer community, which includes all the stakeholders in an issue and may involve active research as well as quality assessment. The idea of extended peer review has obvious overlaps with community-oriented citizen science, but it has yet to be widely mobilised as a theoretical buttress for citizen-led science.

Prior even to post-normal science are the potential cracks in the core philosophy of science. In her book Cosmopolitics, Isabelle Stengers characterises the essential nature of scientific truth as the ability to disqualify and exclude other truth claims. This, she asserts, is the hegemony of physics and its singular claim to decide what is real and what is true. Stengers traces this, in part, to the confrontation more than one hundred years ago between Max Planck and Ernst Mach, in which the latter argued that claims to an absolute truth should be replaced by formulations that tied physical laws to the human practices that produced them. Planck stood firmly for knowledge forms that were unbounded by time, space or specific social-material procedures (Stengers). Although contemporary understandings of science are based on Planck's version, citizen science has the potential to re-open these questions in a productive manner for its own practices, if it can re-conceive of itself as what Deleuze and Guattari would call nomadic science (Deleuze; Deleuze and Guattari).

Citizen Science as Nomadic Science

Deleuze and Guattari referred to orthodox science as Royal Science or Striated Science, referring in part to its state-like form of authority and practice, as well as its psycho-social character. Their alternative is a smooth or nomadic science that, importantly for citizen science, does not have the ambition to totalise knowledge. Nomadic science is a form of empirical investigation that has no need to be hooked up to a grand narrative. The concept of nomadic science is a natural fit for bottom-up citizen science because it can valorise truths that are non-dual and that go beyond objectivity to include the experiential. In this sense it is like the extended peer review of post-normal science but without the need to be limited to high-risk, high-stakes questions. As there is no a priori problem with provisional knowledges, it naturally inclines towards the local, the situated and the culturally reflective.
The apparent unreliability of citizen science in terms of participants and tools, currently seen solely as a source of anxiety, can become heuristic for nomadic science when re-cast through forgotten alternatives like Mach's formulation: that truths are never separated from the specifics of the context and process that produced them (Stengers 6-18; 223). Nomadic science, I believe, will start to emerge through projects that are prepared to tackle toxic epistemology as much as toxic pollutants. For example, the Community Based Auditing (CBA) developed by environmental activists in Tasmania (Tattersall) challenges local alliances of state and extractive industries by undermining their own truth claims with regards to environmental impact, a process described in the CBA Toolbox as disconfirmation. In CBA, this mixture of post-normal science and Stengers's critique is combined with forms of data collection and analysis known as Community Based Sampling (Tattersall et al.), which would be recognisable to any citizen science project.

The change from citizen science to nomadic science is not a total rupture but a shift in the starting point: it is based on an overt critique of power. One way to bring this about is being tested in the “Kosovo Science for Change” project (Science for Change Kosovo), where I am a researcher and where we have adopted the critical pedagogy of Paulo Freire as the starting point for our empirical investigations (Freire). Critical pedagogy is learning as the co-operative activity of understanding how our lived experience is constructed by power, and how to make a difference in the world. Taking a position such as nomadic science, openly critical of Royal Science, is the anti-hegemonic stance that could qualify citizen science as properly countercultural.

Citizen Science and Counterculture

Counterculture, as I have expressed it, stands against or rejects the hegemonic culture. However, there is a strong tendency in contemporary social movements to take a stance not only against the dominant structures but against hegemony itself. They contest what Richard Day calls the hegemony of hegemony (Day). I witnessed this during the counter-G8 mobilisation of 2001. Having been an activist in the 1980s and 1990s, I was wearily familiar with the sectarian competitiveness of various radical narratives, each seeking to establish itself as the correct path. So it was a strongly affective experience to stand in the convergence centre and listen to so many divergent social groups and movements agree to support each other's tactics, expressing a solidarity based on a non-judgemental pluralism. Since then we have seen the emergence of similarly anti-hegemonic countercultures around the Occupy and Anonymous movements. It is in this context of counterculture that I will try to summarise and evaluate the countercultural potential of citizen science, and what being countercultural might offer to citizen science itself.

To be countercultural it is not enough for citizen science to counterpose participation against the institutional and hierarchical aspects of professional science. As an activity defined purely by engagement, it offers to plug the legitimacy gap for science while still being wholly dependent on it. A countercultural citizen science must pose a strong challenge to the status quo, and I have suggested that a route to this would be to develop as nomadic science. This does not mean replacing or overthrowing science but constructing an other to science with its own claim to empirical methods.
It is fair to ask what this would offer citizen science that it does not already have. At an abstract level it would gain a freedom of movement; an ability to occupy Deleuzian smooth spaces rather than be constrained by the striation of established science. The founders of Science for the People are clear that it could never have existed if it had not been able to draw on the mass movements of its time. Being countercultural would give citizen science an affinity with the bottom-up, local and community-based issues where empirical methods are likely to have the most social impact. One of many examples is the movement against fracking (the hydraulic fracturing of deep rock formations to release shale gas). Together, these benefits of being countercultural open up the possibility for forms of citizen science to spread rhizomatically, in a way that is not about immaterial virtual labour but is itself part of a wider cultural change. The possibility of a nomadic science stands as a doorway to the change that Roszak saw at the heart of the counterculture: a renewal of the visionary imagination.

References

Begley, C. Glenn, and Lee M. Ellis. "Drug Development: Raise Standards for Preclinical Cancer Research." Nature 483.7391 (2012): 531–533. 8 Oct. 2014 ‹http://www.nature.com/nature/journal/v483/n7391/full/483531a.html›.

Bonney, Rick, et al. "Citizen Science: A Developing Tool for Expanding Science Knowledge and Scientific Literacy." BioScience 59.11 (2009): 977–984. 6 Oct. 2014 ‹http://bioscience.oxfordjournals.org/content/59/11/977›.

Citizen Science Association. "Citizen Science Association." 2014. 6 Oct. 2014 ‹http://citizenscienceassociation.org/›.

Day, Richard J.F. Gramsci Is Dead: Anarchist Currents in the Newest Social Movements. London: Pluto Press, 2005.

Deleuze, Gilles. Nomadology: The War Machine. New York, NY: MIT Press, 1986.

Deleuze, Gilles, and Felix Guattari. A Thousand Plateaus. London: Bloomsbury Academic, 2013.

ExCiteS. "From Non-Literate Data Collection to Intelligent Maps." 26 Aug. 2013. 8 Oct. 2014 ‹http://www.ucl.ac.uk/excites/projects/excites-projects/intelligent-maps/intelligent-maps›.

Feyerabend, Paul K. Against Method. 4th ed. London: Verso, 2010.

Freire, Paulo. Pedagogy of the Oppressed. Continuum International Publishing Group, 2000.

Freitag, Amy. "Good Science and Bad Science in Democratized Science." Oceanspaces 22 Jan. 2014. 9 Oct. 2014 ‹http://oceanspaces.org/blog/good-science-and-bad-science-democratized-science›.

---. "What Is the Role of Peer-Reviewed Literature in Citizen Science?" Oceanspaces 29 Jan. 2014. 10 Oct. 2014 ‹http://oceanspaces.org/blog/what-role-peer-reviewed-literature-citizen-science›.

Funtowicz, Silvio O., and Jerome R. Ravetz. "Science for the Post-Normal Age." Futures 25.7 (1993): 739–755. 8 Oct. 2014 ‹http://www.sciencedirect.com/science/article/pii/001632879390022L›.

Grossberg, Lawrence. "Some Preliminary Conjunctural Thoughts on Countercultures." Journal of Gender and Power 1.1 (2014). 3 Nov. 2014 ‹http://gender-power.amu.edu.pl/?page_id=20›.

Haklay, Muki. "Assertions on Crowdsourced Geographic Information & Citizen Science #2." Po Ve Sham - Muki Haklay's Personal Blog 16 Jan. 2014. 8 Oct. 2014 ‹http://povesham.wordpress.com/2014/01/16/assertions-on-crowdsourced-geographic-information-citizen-science-2/›.

---. "European Citizen Science Association Suggestion for 10 Principles of Citizen Science." Po Ve Sham - Muki Haklay's Personal Blog 14 May 2014. 6 Oct. 2014 ‹http://povesham.wordpress.com/2014/05/14/european-citizen-science-association-suggestion-for-10-principles-of-citizen-science/›.

Mapping for Change. "Mapping for Change." 2014. 6 June 2014 ‹http://www.mappingforchange.org.uk/›.

National Audubon Society. "Christmas Bird Count." 2014. 6 Oct. 2014 ‹http://birds.audubon.org/christmas-bird-count›.

NERC. "Best Practice Guides to Choosing and Using Citizen Science for Environmental Projects." Centre for Ecology & Hydrology May 2014. 9 Oct. 2014 ‹http://www.ceh.ac.uk/products/publications/understanding-citizen-science.html›.

Parr, Chris. "Why Citizen Scientists Help and How to Keep Them Hooked." Times Higher Education 6 June 2013. 6 Oct. 2014 ‹http://www.timeshighereducation.co.uk/news/why-citizen-scientists-help-and-how-to-keep-them-hooked/2004321.article›.

Plenary 1: Stories from the Movement. Film. Science for the People, 2014.

Plenary 2: The History and Lasting Significance of Science for the People. Film. Science for the People, 2014.

Public Lab. "Public Lab: A DIY Environmental Science Community." 2014. 6 June 2014 ‹http://publiclab.org/›.

---. "The Homebrew Oil Testing Kit." Kickstarter 24 Sep. 2014. 8 Oct. 2014 ‹https://www.kickstarter.com/projects/publiclab/the-homebrew-oil-testing-kit›.

Roszak, Theodore. The Making of a Counter Culture. Garden City, N.Y.: Anchor Books/Doubleday, 1969.

Science for Change Kosovo. "Citizen Science Kosovo." Facebook, n.d. 17 Aug. 2014 ‹https://www.facebook.com/CitSciKS›.

Science for the People. "SftP Magazine." 2013. 8 Oct. 2014 ‹http://science-for-the-people.org/sftp-resources/magazine/›.

Smolin, Lee. The Trouble with Physics: The Rise of String Theory, the Fall of a Science, and What Comes Next. Reprint ed. Boston: Mariner Books, 2007.

Stengers, Isabelle. Cosmopolitics I. Trans. Robert Bononno. Minneapolis: U of Minnesota P, 2010.

Tattersall, Philip J. "What Is Community Based Auditing and How Does It Work?" Futures 42.5 (2010): 466–474. 9 Oct. 2014 ‹http://www.sciencedirect.com/science/article/pii/S0016328709002055›.

---, Kim Eastman, and Tasmanian Community Resource Auditors. Community Based Auditing: Tool Boxes: Training and Support Guides. Beauty Point, Tas.: Resource Publications, 2010.

The Economist. "Trouble at the Lab." 19 Oct. 2013. 8 Oct. 2014 ‹http://www.economist.com/news/briefing/21588057-scientists-think-science-self-correcting-alarming-degree-it-not-trouble›.

Toomey, Diane. "How Rise of Citizen Science Is Democratizing Research." Yale Environment 360 28 Jan. 2014. 6 Oct. 2014 ‹http://e360.yale.edu/feature/interview_caren_cooper_how_rise_of_citizen_science_is_democratizing_research/2733/›.

UCL. "Extreme Citizen Science (ExCiteS)." July 2013. 6 June 2014 ‹http://www.ucl.ac.uk/excites/›.

Zooniverse. "The Ever-Expanding Zooniverse - Updated." Daily Zooniverse 3 Feb. 2014. 6 Oct. 2014 ‹http://daily.zooniverse.org/2014/02/03/the-ever-expanding-zooniverse-updated/›.