
Journal articles on the topic 'Computer network protocols – South Africa'



Consult the top 47 journal articles for your research on the topic 'Computer network protocols – South Africa.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse journal articles from a wide variety of disciplines and organise your bibliography correctly.

1

Mosepele, Mosepele, Cecilia Kanyama, David Meya, et al. "PO 8719 INTRODUCING A NEW AFRICA MENINGITIS NETWORK – A NORTH-SOUTH COLLABORATION." BMJ Global Health 4, Suppl 3 (2019): A63.1—A63. http://dx.doi.org/10.1136/bmjgh-2019-edc.165.

Abstract:
Background: Central nervous system infections, including meningitis, continue to cause significant morbidity in Africa. HIV has contributed to the epidemiology of CNS infections in this setting. Notable advances in the study of CNS infections by several groups have demonstrated the utility of new diagnostic strategies and the impact of novel treatment strategies. However, efforts to coordinate meningitis research in Africa, and between Africa and the rest of the world, remain very limited. Methods: In a bid to promote a coordinated study of CNS infections across Africa, and in collaboration with other meningitis groups globally, the researchers of the AMBITION study (High Dose Ambisome on a Fluconazole Backbone for Cryptococcal Meningitis Induction Therapy in Sub-Saharan Africa: A Randomised Controlled Non-inferiority Trial) are leveraging the EDCTP support for the AMBITION trial to set up an Africa Meningitis Trials Network. Results: The Africa Meningitis Trials Network (AMNET) was launched in Malawi in early 2018. Main achievements since the launch of the network include an internal review of meningitis research across network sites and the launch of the network website. The network also has two study protocols pending ethics review at all sites. These studies will provide much-needed information on resources available for meningitis care and research and provide a baseline epidemiology of meningitis in Africa. Conclusion: AMNET provides a rare opportunity for investigators interested in meningitis research to leverage the ongoing AMBITION trial to conduct Africa-wide preliminary research on meningitis. The network is recruiting additional members in Africa and globally to collaborate on meningitis research and to apply for research funding to support meningitis work. Anyone interested in knowing more about the network should contact the AMNET communications officer, Ms Phum’lani Machao, at phumlani.machao@gmail.com.
2

Irons, Alastair, and Jacques Ophoff. "Aspects of Digital Forensics in South Africa." Interdisciplinary Journal of Information, Knowledge, and Management 11 (2016): 273–83. http://dx.doi.org/10.28945/3576.

Abstract:
This paper explores the issues facing digital forensics in South Africa. It examines particular cyber threats and cyber threat levels for South Africa and the challenges in addressing the cybercrimes in the country through digital forensics. The paper paints a picture of the cybercrime threats facing South Africa and argues for the need to develop a skill base in digital forensics in order to counter the threats through detection of cybercrime, by analyzing cybercrime reports, consideration of current legislation, and an analysis of computer forensics course provision in South African universities. The paper argues that there is a need to develop digital forensics skills in South Africa through university programs, in addition to associated training courses. The intention in this paper is to promote debate and discussion in order to identify the cyber threats to South Africa and to encourage the development of a framework to counter the threats – through legislation, high tech law enforcement structures and protocols, digital forensics education, digital forensics skills development, and a public and business awareness of cybercrime threats.
3

Kwofie, Titus Ebenezer, Clinton Ohis Aigbavboa, and Wellington Didibhuku Thwala. "Communication performance challenges in PPP projects: cases of Ghana and South Africa." Built Environment Project and Asset Management 9, no. 5 (2019): 628–41. http://dx.doi.org/10.1108/bepam-11-2018-0137.

Abstract:
Purpose The need to gain theoretical and practical understanding into the communication performance challenges in public private partnership (PPP) projects is considered as a precursor to effective communication strategies, management, planning and improvement in PPP models in both developed and developing countries. Hence, the purpose of this paper is to investigate the nature of communication performance challenges in PPP projects. Design/methodology/approach By adopting a deductive research design, a questionnaire survey of participants in the communication network of PPP projects in two countries (Ghana and South Africa) was conducted. The responses were analyzed using mean scores, Kendall’s concordance and Mann–Whitney U test. Findings The results revealed communication performance challenges that are unique to PPP project environment. Additionally, there were other typologies of communication challenges such as untimeliness, distortions and protocols that were frequently experienced in Ghana and not in South Africa. Also the emergence of misunderstanding affirmed that indeed this is a communication problem that is common and not peculiar to any project context or country. Practical implications With communication challenges and information asymmetries as notable challenges in PPP coupled with suggestions that effective communication is central to success of PPP projects and management, the insight into the communication performance challenges given by this study could be very useful to effective planning and strategies towards communication in construction project delivery in PPP and thus underline the importance of deriving mechanisms and protocols that suit PPP project environments. Originality/value These findings can be a precursor for developing bespoke communication systems, tools, protocols and communication behaviours to facilitate information flow aimed at overcoming information asymmetries and ultimately improving the quality of communication actions, tasks and outcomes in PPP project delivery.
4

Kok, Illasha, Petra Bester, and Hennie Esterhuizen. "Late Departures from Paper-Based to Supported Networked Learning in South Africa." International Journal of Distance Education Technologies 16, no. 1 (2018): 56–75. http://dx.doi.org/10.4018/ijdet.2018010104.

Abstract:
Fragmented connectivity in South Africa is the dominant barrier to digitising initiatives. New insights surfaced when a university-based nursing programme introduced tablets within a supportive network learning environment. A qualitative, explorative design investigated adult nurses' experiences of the realities of moving from paper-based learning towards using tablets within a blended learning environment. Purposive sampling was applied. Forty-five (N = 45) participants were included, each receiving a preloaded, WiFi-dependent tablet (15 running on iOS, 15 on Android 4.2.2 Jelly Bean and 15 on Windows® 8 operating systems), integrated into a supportive learning network. Participants completed eleven compulsory Internet-based activities. Three reflective focus groups with 18 (n = 18) participants concluded the project. Through self-empowerment and a supportive environment, students adopted the tablets seamlessly and overcame network and resource-related challenges. Valuable lessons were learned about working within the digital divide and integrating tablets into distance learning through a resilient and pragmatic approach.
5

Cooper, C. J. "Energy and transport issues for Gauteng, South Africa." Journal of Energy in Southern Africa 18, no. 2 (2007): 11–15. http://dx.doi.org/10.17159/2413-3051/2007/v18i2a3369.

Abstract:
Rapid urbanisation brings unwelcome negative impacts, and places excessive pressure on infrastructure development and maintenance. In particular, transport networks become congested with negative impacts on energy logistics. The liquid fuel situation of South Africa and Gauteng is briefly examined. The paper considers the impact of constrained oil supply, and supply infrastructure, on transport. The author further suggests that the authorities in Gauteng should critically examine an ultra light rail option in order to reduce reliance on imported oil, while helping reduce road congestion. A more energy efficient transport network for the province, able to meet the transport needs of passengers and business, will help decrease environmentally damaging emissions.
6

Mabasa, Brighton, Meena D. Lysko, Henerica Tazvinga, Sophie T. Mulaudzi, Nosipho Zwane, and Sabata J. Moloi. "The Ångström–Prescott Regression Coefficients for Six Climatic Zones in South Africa." Energies 13, no. 20 (2020): 5418. http://dx.doi.org/10.3390/en13205418.

Abstract:
The South African Weather Service (SAWS) manages an in situ solar irradiance radiometric network of 13 stations and a very dense sunshine recording network, located in all six macroclimate zones of South Africa. A sparsely distributed radiometric network over a landscape with dynamic climate and weather shifts is inadequate for solar energy studies and applications. Therefore, there is a need to develop mathematical models to estimate solar irradiation for a multitude of diverse climates. In this study, the annual regression coefficients, a and b, of the Ångström–Prescott (AP) model, which can be used to estimate global horizontal irradiance (GHI) from observed sunshine hours, were calibrated and validated with observed station data. The AP regression coefficients were calibrated and validated for each of the six macroclimate zones of South Africa using the observation data that span 2013 to 2019. The predictive effectiveness of the calibrated AP model coefficients was evaluated by comparing estimated and observed daily GHI. The maximum annual relative Mean Bias Error (rMBE) was 0.371%, relative Mean Absolute Error (rMAE) was 0.745%, relative Root Mean Square Error (rRMSE) was 0.910%, and the worst-case correlation coefficient (R2) was 0.910. The statistical validation metrics results show that there is a strong correlation and linear relation between observed and estimated GHI values. The AP model coefficients calculated in this study can be used with quantitative confidence in estimating daily GHI data at locations in South Africa where daily observation sunshine duration data are available.
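
The Ångström–Prescott relation summarised above is a linear regression of the clearness index on the sunshine fraction, H/H0 = a + b(n/N). The sketch below is a minimal illustration of that calculation and of the relative validation metrics named in the abstract; it is not the authors' code, and the station values and the coefficients a and b are placeholder numbers rather than the calibrated South African values.

```python
import numpy as np

def ap_estimate_ghi(n_sunshine, n_daylength, h0, a, b):
    """Angstrom-Prescott relation H/H0 = a + b*(n/N), solved for daily GHI H.

    n_sunshine: observed bright-sunshine hours, n_daylength: maximum possible
    daylight hours, h0: extraterrestrial irradiation on a horizontal surface,
    a, b: regression coefficients calibrated for the climatic zone.
    """
    return (a + b * np.asarray(n_sunshine) / np.asarray(n_daylength)) * h0

def relative_metrics(observed, estimated):
    """rMBE, rMAE and rRMSE expressed as percentages of the observed mean."""
    observed, estimated = np.asarray(observed), np.asarray(estimated)
    diff = estimated - observed
    rmbe = 100 * diff.mean() / observed.mean()
    rmae = 100 * np.abs(diff).mean() / observed.mean()
    rrmse = 100 * np.sqrt((diff ** 2).mean()) / observed.mean()
    return rmbe, rmae, rrmse

# Hypothetical week of daily values for a single station (MJ/m^2/day, hours).
ghi_observed = [22.1, 24.3, 20.8, 25.0, 23.7, 19.9, 24.8]
ghi_estimated = ap_estimate_ghi(
    n_sunshine=[8.1, 9.5, 7.2, 10.0, 9.1, 6.8, 9.8],
    n_daylength=10.5, h0=35.0,       # assumed constants for the example week
    a=0.25, b=0.50)                  # placeholder AP coefficients
print(relative_metrics(ghi_observed, ghi_estimated))
```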
7

Okoh, D. I., L. A. McKinnell, and P. J. Cilliers. "Developing an ionospheric map for South Africa." Annales Geophysicae 28, no. 7 (2010): 1431–39. http://dx.doi.org/10.5194/angeo-28-1431-2010.

Abstract:
Abstract. The development of a map of the ionosphere over South Africa is presented in this paper. The International Reference Ionosphere (IRI) model, South African Bottomside Ionospheric Model (SABIM), and measurements from ionosondes in the South African Ionosonde Network, were combined within their own limitations to develop an accurate representation of the South African ionosphere. The map is essentially in the form of a computer program that shows spatial and temporal representations of the South African ionosphere for a given set of geophysical parameters. A validation of the map is attempted using a comparison of Total Electron Content (TEC) values derived from the map, from the IRI model, and from Global Positioning System (GPS) measurements. It is foreseen that the final South African ionospheric map will be implemented as a Space Weather product of the African Space Weather Regional Warning Centre.
8

Masembe, Angela. "Reliability benefit of smart grid technologies: A case for South Africa." Journal of Energy in Southern Africa 26, no. 3 (2015): 2–9. http://dx.doi.org/10.17159/2413-3051/2015/v26i3a2124.

Abstract:
The South African power industry faces many challenges, from poor performing networks, a shortage of generation capacity to significant infrastructure backlog and an ageing work force. According to the Development Bank of South Africa (DBSA), the key challenge facing the industry is ageing infrastructure. Smart grid technologies are a class of technologies that are being developed and used by utilities to deliver electrical systems into the 21st century using computer-based remote control and automation. The main motive towards smart grid technologies is to improve reliability, flexibility, accessibility and profitability; as well as to support trends towards a more sustainable energy supply. This study identifies a number of smart grid technologies and examines the impact they may have on the distribution reliability of a test system. The components on the selected test system are the same as those found on a South African feeder. The bulk of the load in test system was modelled using load data collected in South Africa. This study will consider a number of different cases, with the base case incorporating the impact of aged infrastructure on the reliability of the system. The smart grid technologies were then introduced into the system and their impact on distribution reliability was determined. These different cases were also compared to the alternative of replacing the aged and worn out infrastructure with new infrastructure. The findings of this study indicate that the identified smart grid technologies improve the reliability of the system, mainly by decreasing the outage duration experienced by customers on the network. An even better performance was achieved when the ageing infrastructure was replaced with new infrastructure.
9

Pillay, Sureshnee, Jennifer Giandhari, Houriiyah Tegally, et al. "Whole Genome Sequencing of SARS-CoV-2: Adapting Illumina Protocols for Quick and Accurate Outbreak Investigation during a Pandemic." Genes 11, no. 8 (2020): 949. http://dx.doi.org/10.3390/genes11080949.

Abstract:
The COVID-19 pandemic has spread very fast around the world. A few days after the first detected case in South Africa, an infection started in a large hospital outbreak in Durban, KwaZulu-Natal (KZN). Phylogenetic analysis of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) genomes can be used to trace the path of transmission within a hospital. It can also identify the source of the outbreak and provide lessons to improve infection prevention and control strategies. This manuscript outlines the obstacles encountered in order to genotype SARS-CoV-2 in near-real time during an urgent outbreak investigation. This included problems with the length of the original genotyping protocol, unavailability of reagents, and sample degradation and storage. Despite this, three different library preparation methods for Illumina sequencing were set up, and the hands-on library preparation time was decreased from twelve to three hours, which enabled the outbreak investigation to be completed in just a few weeks. Furthermore, the new protocols increased the success rate of sequencing whole viral genomes. A simple bioinformatics workflow for the assembly of high-quality genomes in near-real time was also fine-tuned. In order to allow other laboratories to learn from our experience, all of the library preparation and bioinformatics protocols are publicly available at protocols.io and distributed to other laboratories of the Network for Genomics Surveillance in South Africa (NGS-SA) consortium.
10

Robinson, Bryan Michael Kenneth, and Siân Stephens. "moderating and mediating role of local government in the community engagement strategy of a renewable energy company in South Africa." Journal of Energy in Southern Africa 32, no. 3 (2021): 14–23. http://dx.doi.org/10.17159/2413-3051/2021/v32i3a9403.

Abstract:
Adopting a qualitative case study approach of a renewable energy company in South Africa, the research investigated community engagement within the tripartite relationship of a wind farm, the communities, and the local government. It was found that local government played a moderating role in the community engagement efforts of the wind farm which had to comply with certain engagement protocols determined by local government. Local government also played a mediating role in corporate community engagement, as the wind farm developed their engagement strategy in alignment with local government’s policies and acted as a ‘dot-connector’ between local government and communities. The wind farm played a reciprocal mediating role on local government’s own engagement with their communities in the face of local government’s inefficiencies and complemented service delivery outcomes. The wind farm’s engagement strategy thus enabled local development which was appreciated by communities and local government.
11

Arena, Fabio, Simona Pollini, Gian Maria Rossolini, and Maurizio Margaglione. "Summary of the Available Molecular Methods for Detection of SARS-CoV-2 during the Ongoing Pandemic." International Journal of Molecular Sciences 22, no. 3 (2021): 1298. http://dx.doi.org/10.3390/ijms22031298.

Abstract:
Since early 2020, the COVID-19 pandemic has caused an excess in morbidity and mortality rates worldwide. Containment strategies rely firstly on rapid and sensitive laboratory diagnosis, with molecular detection of the viral genome in respiratory samples being the gold standard. The reliability of diagnostic protocols could be affected by SARS-CoV-2 genetic variability. In fact, mutations occurring during SARS-CoV-2 genomic evolution can involve the regions targeted by the diagnostic probes. Following a review of the literature and an in silico analysis of the most recently described virus variants (including the UK B 1.1.7 and the South Africa 501Y.V2 variants), we conclude that the described genetic variability should have minimal or no effect on the sensitivity of existing diagnostic protocols for SARS-CoV-2 genome detection. However, given the continuous emergence of new variants, the situation should be monitored in the future, and protocols including multiple targets should be preferred.
12

Iyamu, Tiko, and Sibulela Mgudlwa. "ANT Perspective of Healthcare Big Data for Service Delivery in South Africa." Journal of Cases on Information Technology 23, no. 1 (2021): 65–81. http://dx.doi.org/10.4018/jcit.2021010104.

Abstract:
In South Africa, there has been for many years challenges in how healthcare big data are accessed, used, and managed by facilities, particularly the small health facilities. The challenges arise from inaccuracy and inconsistency of patients' data and have impact on diagnoses, medications, and treatments, which consequently contributes to fatalities in South Africa, particularly in the rural areas of the country. The problem of inaccuracy and inconsistency of patients' data is often caused by lack of or poor analysis (or analytics) of data. Thus, the objective of this research was to understand the factors that influence the use and management of patients' big data for healthcare service delivery. The qualitative methods were applied, and a South African healthcare facility was used as a case in the study. Actor network theory (ANT) was employed as a lens to guide the analysis of the qualitative data. Based on the findings from the analysis, a model was developed, which is intended to guide analytics of big data for healthcare purposes, towards improving service delivery in the country.
13

Terblanche, E., J. A. Wessels, R. I. Stewart, and J. H. Koeslag. "A computer simulation of free-range exercise in the laboratory." Journal of Applied Physiology 87, no. 4 (1999): 1386–91. http://dx.doi.org/10.1152/jappl.1999.87.4.1386.

Abstract:
We present a technique for simulating dynamic field (free-range) exercise, using a novel computer-controlled cycle ergometer. This modified cycle ergometer takes into account the effect of friction and aerodynamic drag forces on a 70-kg cyclist in a racing position. It also affords the ability to select different gear ratios. We have used this technique to simulate a known competition cycle route in Cape Town, South Africa. In an attempt to analyze the input stimulus, in this case the generated power output of each cyclist, eight subjects cycled for 40 min at a self-selected, comfortable pace on the first part of the simulated route. Our results indicate that this exercise input excites the musculocardiorespiratory system over a wide range of power outputs, both in terms of amplitude and frequency. This stimulus profile thereby complies with the fundamental requirement for nonlinear (physiological) systems analysis and identification. Through a computer simulation, we have devised a laboratory exercise protocol that not only is physiologically real but also overcomes the artificiality of most traditional laboratory exercise protocols.
14

Anku Ankudey, Selorm. "Scientometric Analysis of Solar Cell Research: A Comparison between Africa and India." Journal of Information & Knowledge Management 20, no. 02 (2021): 2150018. http://dx.doi.org/10.1142/s0219649221500180.

Abstract:
The concerned study invariably presents the key findings of the scientometric analysis of Solar Cell Research (SCR) in the specific context of Africa and India. The outstanding contributions explicitly delivered by the successful collaboration of African and Indian authors are satisfactorily accounted for in specific terms of published author, year-wise, research area, funding agencies, published citations, and h-index. The necessary data for the needed research was retrieved from the Web of Science from 2009–2018. The raw data was further analysed and properly presented using MS Excel and VOSviewer as the keyword network tool. An aggregate number of scholarly publications in the global scenario at 117,605, Africa and India typically contributed 2,932 and 7,848, respectively. Joint research of 92 academic journals, receiving citations of 1,348, was usually observed. The highest source of 394 (29.23%) was received overwhelmingly in 2018. Out of the 652 published authors contributing constructively to the remarkable collaboration, H. C. Swart, of the University of the Free State, Bloemfontein, South Africa, contributed 14 publications, allegedly giving a 2.147% of the total count. This is followed by V. Kumar of the Indian Institute of Technology New Delhi (India); UCA (France); UFS (South Africa), which also contributed 12 publications measured at 1.84% of the total count of publications.
15

Andrews, Caroline, Brian Fortier, Amy Hayward, et al. "Development, Evaluation, and Implementation of a Pan-African Cancer Research Network: Men of African Descent and Carcinoma of the Prostate." Journal of Global Oncology, no. 4 (December 2018): 1–14. http://dx.doi.org/10.1200/jgo.18.00063.

Abstract:
Purpose Cancer of the prostate (CaP) is the leading cancer among men in sub-Saharan Africa (SSA). A substantial proportion of these men with CaP are diagnosed at late (usually incurable) stages, yet little is known about the etiology of CaP in SSA. Methods We established the Men of African Descent and Carcinoma of the Prostate Network, which includes seven SSA centers partnering with five US centers to study the genetics and epidemiology of CaP in SSA. We developed common data elements and instruments, regulatory infrastructure, and biosample collection, processing, and shipping protocols. We tested this infrastructure by collecting epidemiologic, medical record, and genomic data from a total of 311 patients with CaP and 218 matched controls recruited at the seven SSA centers. We extracted genomic DNA from whole blood, buffy coat, or buccal swabs from 265 participants and shipped it to the Center for Inherited Disease Research (Baltimore, MD) and the Centre for Proteomics and Genomics Research (Cape Town, South Africa), where genotypes were generated using the UK Biobank Axiom Array. Results We used common instruments for data collection and entered data into the shared database. Double-entered data from pilot participants showed a 95% to 98% concordance rate, suggesting that data can be collected, entered, and stored with a high degree of accuracy. Genotypes were obtained from 95% of tested DNA samples (100% from blood-derived DNA samples) with high concordance across laboratories. Conclusion We provide approaches that can produce high-quality epidemiologic and genomic data in multicenter studies of cancer in SSA.
16

Harerimana, Alexis, and Ntombifikile Gloria Mtshali. "Internet usage among undergraduate nursing students: A case study of a selected university in South Africa." Journal of Nursing Education and Practice 8, no. 8 (2018): 75. http://dx.doi.org/10.5430/jnep.v8n8p75.

Abstract:
Background: Globally, the internet is becoming an increasingly indispensable tool in academic institutions and the workplace. Nursing students are required to use the computer and the internet to search for information and to use various software, for which computer and internet literacy are essential. Despite becoming an important tool for teaching and learning, literature reflects an under-utilization of the internet in academic and non-academic settings for a number of reasons. This article explores the general internet usage of undergraduate nursing students at a selected university in South Africa. Methods: A quantitative, non-experimental, exploratory descriptive design was used, with 115 undergraduate nursing students participating in the study. Data were collected using a questionnaire survey, after obtaining ethical clearance from the university’s ethics committee, and were analysed descriptively. Results: The findings revealed that the internet was used for various purposes, including academic (96.5%), communication (82.6%), pleasure (71.3%), and work-related activity (53.9%). Facebook (77.4%) was the most commonly used social network. Barriers to the use of the internet included restriction of access to certain sites (62.6%), very slow internet connection (55.7%), little training on how to use internet facilities (38.3%), and a limited number of computers (37.4%). Conclusions: Contrary to other studies, this study shows that students do use the internet for a number of reasons, and it recommends structured support on how to use it for academic purposes.
17

Pons Batugal. "International Coconut Genetic Resources Network (COGENT): Its history and achievements." CORD 21, no. 02 (2005): 34. http://dx.doi.org/10.37833/cord.v21i02.408.

Abstract:
The International Coconut Genetic Resources Network (COGENT) is a global research network organized by the International Plant Genetic Resources Institute (IPGRI) in 1992 with support from member countries, the Consultative Group on International Agricultural Research (CGIAR), partner institutions, donor agencies, and by regional and international development organizations.
 
In the last 12 years, COGENT has been fully operational with 38 member coconut producing countries in five regions (South Asia; Southeast and East Asia; South Pacific; Africa and the Indian Ocean; and Latin America and the Caribbean). It has successfully developed and disseminated to coconut breeders and curators worldwide the International Coconut Genetic Resources Database (CGRD). The CGRD contains characterization data and some pictures of 1,416 accessions which are conserved by national programmes in 28 sites in 23 countries. To further secure conserved germplasm, a COGENT multi-site International Coconut Genebank has been established to conserve 200 important accessions in each region. Coconut varieties with multi-purpose uses are being identified, documented and promoted. The performance of 38 promising high-yielding hybrids is being evaluated in a multilocation trial involving four African and three Latin America/Caribbean countries to identify suitable varieties and hybrids for resource-poor farmers. Farmers’ varietal preferences in 15 countries are being evaluated. Diversity-linked income-generating activities, used as a strategy to promote in situ and on-farm conservation and germplasm utilization, have been initiated in 15 countries. Protocols for in vitro embryo culture, cryopreservation, morphometric and molecular marker-based methods for locating and characterizing diversity; pest risk assessment and germplasm health management are being developed, tested and upgraded. Strategies and techniques for farmer participatory research, collecting, characterization and ex situ and in situ conservation are being refined.
 
 To strengthen the coconut research capability of COGENT member countries, the COGENT Secretariat and IPGRI have organized 39 country need assessment missions and conducted 41 workshops and meetings involving 994 coconut researchers to share information and technologies, discuss issues and common problems and opportunities and how to address them; conducted 40 training courses involving 765 participants from 41 countries; supported 274 research and training/capacity building activities in 30 countries; and led the establishment of the Global Coconut Research for Development Programme (PROCORD). IPGRI and COGENT's current priority involves the further promotion of more effective conservation and use of coconut genetic resources, both regionally and globally.
18

Kozma, Csaba, and Clara Calero-Medina. "The role of South African researchers in intercontinental collaboration." Scientometrics 121, no. 3 (2019): 1293–321. http://dx.doi.org/10.1007/s11192-019-03230-9.

Abstract:
Abstract The analysis presented here focuses on mapping, based on publication output, the scientific collaboration of African based researchers and the role of the South African research community as a channel for within- and intercontinental collaborations. We have selected 10 scientific fields, namely, Tropical Medicine, Parasitology, Infectious Disease, Ecology, Water Resources, Immunology, Zoology, Plant Sciences, Agricultural and Food Sciences, and Psychology to gain a clear picture of the aforementioned scientific activity. As a first step, we created cooperation networks and visualized them on world-maps. In addition, centrality measures of the network were calculated to see the frequency of involvement regarding different countries, with a focus on South Africa, in the collaboration process. Furthermore, first and last authorship positions of the publications were summed to highlight the influence of the selected authors on the direction of and resources provided to the publications. Finally, the most prominent funding organizations and their focus on the selected fields were singled out. Through combining these steps of analysis, we gained an accurate picture of the level of involvement of the South African research community in within- and intercontinental scientific collaboration.
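
The country-level co-authorship mapping and centrality measures described above can be prototyped with a general-purpose graph library. The sketch below is a toy illustration rather than the authors' data or code: the countries, edges and weights are invented, and it simply computes degree and betweenness centrality for a small collaboration network using networkx.

```python
import networkx as nx

# Toy co-authorship network: nodes are countries, weights are joint papers.
# All values are illustrative placeholders, not figures from the study.
edges = [
    ("South Africa", "Kenya", 42), ("South Africa", "Nigeria", 35),
    ("South Africa", "United Kingdom", 120), ("South Africa", "United States", 150),
    ("Kenya", "United Kingdom", 60), ("Nigeria", "United States", 55),
]
G = nx.Graph()
G.add_weighted_edges_from(edges)

# Degree centrality reflects how many partners a country has; betweenness
# centrality reflects how often it lies on paths linking other countries.
degree = nx.degree_centrality(G)
betweenness = nx.betweenness_centrality(G)  # unweighted for simplicity
print(sorted(degree.items(), key=lambda kv: -kv[1]))
print(sorted(betweenness.items(), key=lambda kv: -kv[1]))
```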
19

Torto, Nelson. "Preface." Pure and Applied Chemistry 85, no. 12 (2013): iv. http://dx.doi.org/10.1351/pac20138512iv.

Abstract:
The fourth conference of the African Network for Analytical Chemists (SEANAC) took place in Maputo, Mozambique, 8-11 July 2012. The SEANAC conferences are always characterized by pre-symposium workshops that are meant to address various aspects for students. In Maputo, on day one, a workshop on "Sample preparation" was given by Dr. Ron Majors of Agilent Technologies. On day two, Dr. Sharon Neal of the National Science Foundation (USA) gave a workshop on "Writing effective grants". On the same day, Dr. Jean Pemberton of Arizona University (USA) gave a workshop on "The basics of writing a good manuscript" and Dr. Jorge Gardea-Torresdey of Texas University at El Paso (USA) gave a workshop on "How to get published in scientific journals". Five keynote lectures were given: "Electrospun sorbents and colorometric probes", Prof. N. Torto, Rhodes University (South Africa); "Pressurized fluid technology in green analytical chemistry", Prof. Charlotta Turner, Lund University (Sweden); "The speciation of mercury in soil, water and ambient air: Analytical protocols and detection", Prof. Andrew Crouch, University of Witwatersrand (South Africa); "Advances in biological and food sample method development by GCxGC/TOF-MS", Prof. Jean-Marie Dimandja, LECO (USA); and "Use of synchrotron techniques to study the environmental implications of nanoparticles in the environment: The case of terrestrial plants", Prof. Jorge Gardea-Torresdey, University of Texas at El Paso (USA). In the main conference, 5 plenary as well as 40 lectures were presented. The papers published in this issue reflect the main areas of focus at the conference, as they covered aspects of agriculture, environment, health, and emerging applications based on nanotechnology.Nelson Torto Conference Editor
20

Roos, F., and R. C. Bansal. "Reactive power and harmonic compensation: A case study for the coal-mining industry." Journal of Energy in Southern Africa 30, no. 1 (2019): 34–48. http://dx.doi.org/10.17159/2413-3051/2019/v30i1a2473.

Abstract:
This study reports on a case study in Grootegeluk Mine: Exxaro Coal, Lephalale, South Africa, in terms of power factor correction (PFC), load flow, harmonic frequency scans and harmonic voltage distortion analyses. The DIgSilent PowerFactory software was used for network simulations. Harmonic and reactive power compensation techniques were compared in terms of filter type evolution and technology advancement, with the use of simple trade-off criteria such as cost-effectiveness versus performance. It was found that both passive and hybrid filters were more favourable and could effectively compensate all voltage and current harmonics and reactive power for large nonlinear loads. The installation of switched PFC filter banks tuned at the fifth harmonic order accommodates future network growth and this solution can be rolled out to any mining industry as a benchmark to lower energy cost and maximise savings achievable on the electricity bill.
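
The power factor correction and fifth-harmonic filter tuning mentioned above follow standard relations. The sketch below is a generic, hedged illustration with invented figures, not the Grootegeluk network data: it sizes the reactive compensation needed to move a load between two displacement power factors and the series reactor that tunes the capacitor bank to the fifth harmonic.

```python
import math

# Illustrative load and targets (placeholders, not the case-study values).
P_kw = 5000.0                        # active power of the load
pf_initial, pf_target = 0.80, 0.95   # displacement power factors

# Required compensation: Qc = P * (tan(phi1) - tan(phi2)).
q_comp_kvar = P_kw * (math.tan(math.acos(pf_initial)) - math.tan(math.acos(pf_target)))

# Tune the bank to the 5th harmonic (250 Hz on a 50 Hz system): resonance when
# X_L = X_C, i.e. L = 1 / ((2*pi*h*f1)^2 * C).
f1, h = 50.0, 5
V_ll = 11e3                                              # assumed bus voltage (V)
C = q_comp_kvar * 1e3 / (2 * math.pi * f1 * V_ll ** 2)   # equivalent wye per-phase capacitance
L = 1.0 / ((2 * math.pi * h * f1) ** 2 * C)

print(f"Compensation required: {q_comp_kvar:.0f} kvar")
print(f"Tuned filter elements (reactor's effect on fundamental kvar ignored): "
      f"C = {C * 1e6:.1f} uF, L = {L * 1e3:.2f} mH")
```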
21

Ajibade, Patrick, and Stephen M. Mutula. "Big data, 4IR and electronic banking and banking systems applications in South Africa and Nigeria." Banks and Bank Systems 15, no. 2 (2020): 187–99. http://dx.doi.org/10.21511/bbs.15(2).2020.17.

Abstract:
Efficient banking solutions are an integral part of the business integration of South African and Nigerian economies as the two largest economies in the continent. Security, effectiveness, and integration of banking systems are critical to the sustainable development of the African continent. Therefore, an empirical analysis of the production of research on banking services and systems was conducted. The aim of the study was to examine the robustness of the research findings on banking systems in terms of their importance for the economic sustainability of the continent in the era of the fourth industrial revolution. The study adopted a bibliometric analysis using software clusters to visualize the results. Due to higher visibility of outputs and likely citations, the results showed that the key terms from Google Scholar are ranked higher than outputs from Scopus. Main research interests were related to internet banking (f = 70), e-payment systems (f = 57), telephone banking (f = 56), automated teller machines (f = 54), and mobile banking (f = 40). The results also showed a very low research interest in the technical aspect of online banking services such as security (f = 19, TLS = 40), authentication (f = 17, TLS =33), network security (f =13, TLS = 33), computer crime (f = 16, TLS = 42), and online banking (f = 11, TLS =32). The study found there were insufficient outputs in the area of the fourth industrial revolution (4IR) and banking services in Africa. Future research trends should examine the impact of the 4IR and big data on the banking system, regional economic integration, and sustainable growth in the continent.
22

Kariuki, Paul, Lizzy O Ofusori, Prabhakar Rontala Subramanniam, Moses Okpeku, and Maria L Goyayi. "Challenges in Contact Tracing by Mining Mobile Phone Location Data for COVID-19: Implications for Public Governance in South Africa." Interdisciplinary Journal of Information, Knowledge, and Management 16 (2021): 101–24. http://dx.doi.org/10.28945/4736.

Abstract:
Aim/Purpose: The paper’s objective is to examine the challenges of using the mobile phone to mine location data for effective contact tracing of symptomatic, pre-symptomatic, and asymptomatic individuals and the implications of this technology for public health governance. Background: The COVID-19 crisis has created an unprecedented need for contact tracing across South Africa, requiring thousands of people to be traced and their details captured in government health databases as part of public health efforts aimed at breaking the chains of transmission. Contact tracing for COVID-19 requires the identification of persons who may have been exposed to the virus and following them up daily for 14 days from the last point of exposure. Mining mobile phone location data can play a critical role in locating people from the time they were identified as contacts to the time they access medical assistance. In this case, it aids data flow to various databases designated for COVID-19 work. Methodology: The researchers conducted a review of the available literature on this subject drawing from academic articles published in peer-reviewed journals, research reports, and other relevant national and international government documents reporting on public health and COVID-19. Document analysis was used as the primary research method, drawing on the case studies. Contribution: Contact tracing remains a critical strategy in curbing the deadly COVID-19 pandemic in South Africa and elsewhere in the world. However, given increasing concern regarding its invasive nature and possible infringement of individual liberties, it is imperative to interrogate the challenges related to its implementation to ensure a balance with public governance. The research findings can thus be used to inform policies and practices associated with contact tracing in South Africa. Findings: The study found that contact tracing using mobile phone location data mining can be used to enforce quarantine measures such as lockdowns aimed at mitigating a public health emergency such as COVID-19. However, the use of technology can expose the public to criminal activities by exposing their locations. From a public governance point of view, any exposure of the public to social ills is highly undesirable. Recommendations for Practitioners: In using contact tracing apps to provide pertinent data location caution needs to be exercised to ensure that sensitive private information is not made public to the extent that it compromises citizens’ safety and security. The study recommends the development and implementation of data use protocols to support the use of this technology, in order to mitigate against infringement of individual privacy and other civil liberties. Recommendation for Researchers: Researchers should explore ways of improving digital applications in order to improve the acceptability of the use of contact tracing technology to manage pandemics such as COVID-19, paying attention to ethical considerations. Impact on Society: Since contact tracing has implications for privacy and confidentiality it must be conducted with caution. This research highlights the challenges that the authorities must address to ensure that the right to privacy and confidentiality is upheld. Future Research: Future research could focus on collecting primary data to provide insight on contact tracing through mining mobile phone location data. 
Research could also be conducted on how app-based technology can enhance the effectiveness of contact tracing in order to optimize testing and tracing coverage. This has the potential to minimize transmission whilst also minimizing tracing delays. Moreover, it is important to develop contact tracing apps that are universally inter-operable and privacy-preserving.
23

Miao, Yufan, Reinhard Koenig, Katja Knecht, Kateryna Konieva, Peter Buš, and Mei-Chih Chang. "Computational urban design prototyping: Interactive planning synthesis methods—a case study in Cape Town." International Journal of Architectural Computing 16, no. 3 (2018): 212–26. http://dx.doi.org/10.1177/1478077118798395.

Abstract:
This article is motivated by the fact that in Cape Town, South Africa, approximately 7.5 million people live in informal settlements and focuses on potential upgrading strategies for such sites. To this end, we developed a computational method for rapid urban design prototyping. The corresponding planning tool generates urban layouts including street network, blocks, parcels and buildings based on an urban designer’s specific requirements. It can be used to scale and replicate a developed urban planning concept to fit different sites. To facilitate the layout generation process computationally, we developed a new data structure to represent street networks, land parcellation, and the relationship between the two. We also introduced a nested parcellation strategy to reduce the number of irregular shapes generated due to algorithmic limitations. Network analysis methods are applied to control the distribution of buildings in the communities so that preferred neighborhood relationships can be considered in the design process. Finally, we demonstrate how to compare designs based on various urban analysis measures and discuss the limitations that arise when we apply our method in practice, especially when dealing with more complex urban design scenarios.
24

Ratshilengo, Mamphaga, Caston Sigauke, and Alphonce Bere. "Short-Term Solar Power Forecasting Using Genetic Algorithms: An Application Using South African Data." Applied Sciences 11, no. 9 (2021): 4214. http://dx.doi.org/10.3390/app11094214.

Abstract:
Renewable energy forecasts are critical to renewable energy grids and backup plans, operational plans, and short-term power purchases. This paper focused on short-term forecasting of high-frequency global horizontal irradiance data from one of South Africa’s radiometric stations. The aim of the study was to compare the predictive performance of the genetic algorithm and recurrent neural network models with the K-nearest neighbour model, which was used as the benchmark model. Empirical results from the study showed that the genetic algorithm model has the best conditional predictive ability compared to the other two models, making this study a useful tool for decision-makers and system operators in power utility companies. To the best of our knowledge this is the first study which compares the genetic algorithm, the K-nearest neighbour method, and recurrent neural networks in short-term forecasting of global horizontal irradiance data from South Africa.
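
The K-nearest-neighbour model used as the benchmark above can be read as an analog forecaster: the most recent window of observations is matched against historical windows, and the values that followed the closest matches are averaged. The sketch below is a minimal illustration under assumed settings (window length, k, synthetic data), not the authors' implementation.

```python
import numpy as np

def knn_forecast(series, k=5, lag=24):
    """One-step-ahead K-nearest-neighbour forecast.

    The last `lag` observations form a query pattern; the k historical windows
    closest to it (Euclidean distance) are found, and the observations that
    followed them are averaged to produce the forecast.
    """
    series = np.asarray(series, dtype=float)
    query = series[-lag:]
    windows = np.array([series[i:i + lag] for i in range(len(series) - lag)])
    successors = series[lag:]                  # value that followed each window
    nearest = np.argsort(np.linalg.norm(windows - query, axis=1))[:k]
    return successors[nearest].mean()

# Illustrative use on a synthetic irradiance-like signal with a diurnal cycle.
t = np.arange(500)
ghi = np.clip(800 * np.sin(2 * np.pi * t / 48), 0, None)
ghi = ghi + np.random.default_rng(0).normal(0, 20, t.size)
print(knn_forecast(ghi, k=5, lag=24))
```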
25

Yadav, Ashima, and Dinesh Kumar Vishwakarma. "A Language-independent Network to Analyze the Impact of COVID-19 on the World via Sentiment Analysis." ACM Transactions on Internet Technology 22, no. 1 (2022): 1–30. http://dx.doi.org/10.1145/3475867.

Abstract:
Towards the end of 2019, Wuhan experienced an outbreak of novel coronavirus, which soon spread worldwide, resulting in a deadly pandemic that infected millions of people around the globe. The public health agencies followed many strategies to counter the fatal virus. However, the virus severely affected the lives of the people. In this paper, we study the sentiments of people from the top five worst affected countries by the virus, namely the USA, Brazil, India, Russia, and South Africa. We propose a deep language-independent Multilevel Attention-based Conv-BiGRU network (MACBiG-Net) , which includes embedding layer, word-level encoded attention, and sentence-level encoded attention mechanisms to extract the positive, negative, and neutral sentiments. The network captures the subtle cues in a document by focusing on the local characteristics of text along with the past and future context information for the sentiment classification. We further develop a COVID-19 Sentiment Dataset by crawling the tweets from Twitter and applying topic modeling to extract the hidden thematic structure of the document. The classification results demonstrate that the proposed model achieves an accuracy of 85%, which is higher than other well-known algorithms for sentiment classification. The findings show that the topics which evoked positive sentiments were related to frontline workers, entertainment, motivation, and spending quality time with family. The negative sentiments were related to socio-economic factors like racial injustice, unemployment rates, fake news, and deaths. Finally, this study provides feedback to the government and health professionals to handle future outbreaks and highlight future research directions for scientists and researchers.
26

Kwet, Michael. "Digital colonialism: US empire and the new imperialism in the Global South." Race & Class 60, no. 4 (2019): 3–26. http://dx.doi.org/10.1177/0306396818823172.

Abstract:
This article proposes a conceptual framework of how the United States is reinventing colonialism in the Global South through the domination of digital technology. Using South Africa as a case study, it argues that US multinationals exercise imperial control at the architecture level of the digital ecosystem: software, hardware and network connectivity, which then gives rise to related forms of domination. The monopoly power of multinational corporations is used for resource extraction through rent and surveillance – economic domination. By controlling the digital ecosystem, Big Tech corporations control computer-mediated experiences, giving them direct power over political, economic and cultural domains of life – imperial control. The centrepiece of surveillance capitalism, Big Data, violates the sanctity of privacy and concentrates economic power in the hands of US corporations – a system of global surveillance capitalism. As a feature of surveillance capitalism, Global North intelligence agencies partner with their own corporations to conduct mass and targeted surveillance in the Global South – which intensifies imperial state surveillance. US elites have persuaded people that society must proceed according to its ruling class conceptions of the digital world, setting the foundation for tech hegemony. The author argues for a different ecosystem that decentralises technology by placing control directly into the hands of the people to counter the rapidly advancing frontier of digital empire.
27

A. Johnston, Kevin, and Grandon Gill. "Standard Bank: The Agile Transformation." Journal of Information Technology Education: Discussion Cases 6 (2017): 07. http://dx.doi.org/10.28945/3923.

Abstract:
South Africa’s largest bank has recently completed a transformation from traditional systems development to the scaled agile framework. The individual leading the transformation is now considering how to keep the momentum going and possible new directions. Josef Langerman, Head of IT Transformation for Standard Bank, reflected on the extraordinary transformation that his organization’s IT group had recently experienced. Over the past three years, Standard Bank’s IT group had changed from the relatively well accepted systems development lifecycle/waterfall model to a revolutionary large scale agile approach. The results had been gratifying. But it left a question unanswered. Now that things were starting to stabilize, what should be the next steps? The 154-year-old Standard Bank was the largest banking group in Africa, and the 5th largest company headquartered in South Africa. The bank offered a range of corporate, business and personal banking as well as financial services. Its 49,000 employees served over 15 million customers, in 20 countries across the continent of Africa, as well as other countries scattered around the globe. Standard Bank’s IT group, located within the company’s Johannesburg headquarters, had over 6000 employees. The group managed the bank’s technology infrastructure–including a network of nearly 10,000 ATMs, its applications development, testing, deployment, maintenance and operations. By 2014, the bank recognized that its IT performance was lagging industry benchmarks in productivity, turnaround time and employee satisfaction. Employing a “do it in-house” philosophy, it embarked on a major transformation. Abandoning traditional highly structured approaches to project management and development, it had adopted an agile philosophy that was most commonly seen in much smaller organizations and technology startups. The results had been impressive–productivity, cycle time and organizational health indicators had all risen dramatically. The group had also achieved substantial reductions in its budget. Even skeptics within the organization could not fail to be impressed. Now, however, Langerman wondered about the future. He had been cautioned by his group’s HR Culture Transformation Guide that rapid improvement could easily be followed by disillusionment. What could be done to keep the momentum going forward? Should the bank double down on the types of changes to culture, practice and training that had led to its success, or was it time to let things settle? And who should be guiding the change? Should the implementation continue entirely in-house, or should outside consultants–that were working in other areas of the bank–play a significant role? In the near future, he would need to present his recommendations to the group’s CIO.
28

Lambrechts, IJ. "Replacement depreciation and price regulation." Journal of Energy in Southern Africa 17, no. 3 (2006): 10–20. http://dx.doi.org/10.17159/2413-3051/2006/v17i3a3243.

Abstract:
Price regulation occurs quite commonly amongst natural monopolies which frequently include public utilities. In South Africa and in certain countries in Africa, there has recently been a revival of price regulation in certain industries and enterprises, where competition is limited or non-existent. Price regulation can be applied in a multitude of ways. Because of the importance of the price levels (historical and replacement) in the price setting exercise, the focus in this paper will be on the issue of depreciation to arrive at the final prices. The electricity utility industry was historically viewed as a highly mature and heavily regulated natural monopoly. In many parts of the world, electricity utilities have already been deregulated to a large extent and in the United States the process was preceded by a process of unbundling or ringfencing of the main divisions, i.e. generation and distribution. Even the network component of transmission, traditionally seen as natural monopolies, was deregulated to a large extent. The deregulation process, whether fully or partially, emphasised the requirement for a detailed explanation for a specific price level. The need for acceptable and transparent selling prices has, therefore, not disappeared. Regulatory pricing is consequently a vital component of pricing at this stage and in the restructured industry it will continue to play an important role because of a limited number of participants. In other sectors of the South African energy industry too, the deregulation process has either not started or has not been completed. Price regulation is presently and will in future be applicable to the liquid fuels industry, which includes the pipeline of Petronet as well as gas pipelines. Other industries which are being price regulated at the moment include water, medicine, telecommunication (fixed lines) and postal rates. Although the economic regulation for these industries may differ substantially, the principles applying to depreciation calculations would be similar. Replacement depreciation produces lower profit figures during periods of inflation. Quoted companies often oppose this system because of a lack of taxation recognition on income and the adverse effect on earnings per share. This paper covers the calculation of depreciation by price regulators where assets are not diversified (single assets). Shorter depreciation lifetimes based on historical cost result in an automatic provision for replacement depreciation. The extent of the provision would be a function of the difference between the actual and selected lifetimes, income tax rates, re-investment rates and the extent of the financial gearing ratio. Provision for replacement depreciation may be reduced significantly, if not reduced completely, by reducing depreciation lifetimes.
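
As a hedged numerical illustration of the abstract's point that shorter depreciation lifetimes on a historical-cost basis create an automatic provision for replacement, the sketch below compares the straight-line charge over an asset's actual lifetime with the charge over a shorter selected lifetime and accumulates the difference at an assumed reinvestment rate. All figures (cost, lifetimes, rate) are invented, and the tax and gearing effects discussed in the paper are ignored.

```python
# Straight-line depreciation on historical cost (all numbers are illustrative).
cost = 100.0            # historical cost of the asset
actual_life = 20        # years the asset actually remains in service
selected_life = 15      # shorter lifetime selected for regulatory pricing
reinvest_rate = 0.06    # assumed reinvestment rate on the excess charge

charge_actual = cost / actual_life       # annual charge over the actual lifetime
charge_selected = cost / selected_life   # higher annual charge over the shorter lifetime
extra = charge_selected - charge_actual  # implicit replacement provision per year

# Accumulate the excess charge (levied in years 1..15) at the reinvestment
# rate until the asset is actually retired in year 20.
provision = sum(extra * (1 + reinvest_rate) ** (actual_life - year)
                for year in range(1, selected_life + 1))
print(f"Annual charge: {charge_selected:.2f} (selected) vs {charge_actual:.2f} (actual)")
print(f"Accumulated provision by year {actual_life}: {provision:.1f} on a cost of {cost:.0f}")
```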
29

Makgahlela, Mpsanyana, Tebogo M. Mothiba, Jabu P. Mokwena, and Peter Mphekgwana. "Measures to Enhance Student Learning and Well-Being during the COVID-19 Pandemic: Perspectives of Students from a Historically Disadvantaged University." Education Sciences 11, no. 5 (2021): 212. http://dx.doi.org/10.3390/educsci11050212.

Abstract:
Since December 2019, the world population has been battling with the SARS-CoV-2 disease (COVID-2019) pandemic. The pandemic has continued to impact negatively on people’s livelihoods and also on student’s education. This qualitative study established from students in a previously disadvantaged university, their challenges and needs pursuant to the COVID-19 nationwide lockdown in South Africa. A total of 312 (male = 141; female = 171) registered students were conveniently sampled and completed an online survey questionnaire. Thematically analysed data revealed that student education and health have been impacted since the COVID-19 nationwide lockdown. Participants went on to recommend several measures which, if implemented, could improve their well-being and access to education. Study findings imply that students from previously disadvantaged universities, who in their majority are from impoverished rural communities, have been struggling to access remote learning due to amongst others, the lack of information and communication technology (ICT) devices and network connectivity problems. It, therefore, requires rural-based universities to work together with the government and the private sector and join hands in addressing student challenges and needs during the on-going lockdown in the country. This would be one way of ensuring that in spite of students’ socioeconomic status, cultural location or background, their right to education is protected.
30

Oyieke, Lilian Ingutia, and Archie L. Dick. "Empowering academic librarians for effective e-services." Electronic Library 35, no. 2 (2017): 263–82. http://dx.doi.org/10.1108/el-10-2015-0200.

Abstract:
Purpose The purpose of this paper was to assess the Web 2.0 competencies’ levels of academic librarians in selected libraries from two library consortia KLISC in Kenya and GAELIC in South Africa and how these competencies can be sources of empowerment for the effective provision of e-services. As service organizations, academic libraries face challenges similar to those in other service sectors. One of the major challenges includes that of providing not only quality print resources but also quality e-services. Globally, academic librarians use various Web 2.0 technologies to engage with their library users and colleagues and for their information work. Academic librarians are now, more than ever, expected to be empowered and build diverse voices, perspectives and arguments into library collections and services. Most of the youths (academic library users) in developing countries like Kenya and South Africa are heavy users of Web 2.0 technologies. This presents a challenge for their academic librarians who must augment their traditional library skills with Web 2.0 technologies for provision of effective e-services. This study explores the interconnections between librarian empowerment, traditional librarian skills and Web 2.0 competencies for effective e-services. Design/methodology/approach Survey method of research was used to conduct the study. Purposive sampling was used to select a homogeneous sample of academic librarians and libraries that use the Web 2.0 technologies. An online questionnaire with both closed and open-ended questions was used to collect data. The Web 2.0 competency levels were analyzed and presented using descriptive statistics. To achieve more robust findings and to illustrate the implications of Web 2.0 for librarian empowerment, the ATLAS. ti software was used to illustrate interconnections between librarian empowerment, traditional librarian skills and Web 2.0 technologies. The emerging codes and themes are presented in three network views. Findings The findings from the quantitative data indicate that the majority of the librarians have intermediate competency levels in Web 2.0 technologies. The findings from the qualitative data show that Web 2.0 tools and librarian empowerment can be used to illustrate interconnections in content collaboration tools, bookmarking tools and SNSs. Web 2.0 competencies can transform knowledge-sharing activities, augment existing authoritative information service, foster interaction and market information products and services. This study redefines librarian empowerment through competency in Web 2.0 tools and new roles for effective e-services. Originality/value This paper proposes that librarian empowerment through Web 2.0 competencies is essential in augmenting traditional library skills and in providing effective e-services. This manuscript describes original work and is a re-submission EL-08-2015-0143.R1 due to expired deadline in this journal. Both authors approved the manuscript and this submission.
APA, Harvard, Vancouver, ISO, and other styles
31

Sivhugwana, K. S., and E. Ranganai. "Intelligent techniques, harmonically coupled and SARIMA models in forecasting solar radiation data: A hybridization approach." Journal of Energy in Southern Africa 31, no. 3 (2020): 14–37. http://dx.doi.org/10.17159/2413-3051/2020/v31i3a7754.

Full text
Abstract:
The unsteady and intermittent nature of the solar energy resource (mainly due to atmospheric mechanisms and diurnal cycles) is often a stumbling block, because of its unpredictability, to receiving high-intensity levels of solar radiation at ground level. Hence, there has been a growing demand for accurate solar irradiance forecasts that properly explain the mixture of deterministic and stochastic characteristics (which may be linear or nonlinear) in which solar radiation presents itself on the earth’s surface. Seasonal autoregressive integrated moving average (SARIMA) models are popular for accurately modelling linearity, whilst neural networks effectively capture the nonlinearity embedded in solar radiation data at ground level. This comparative study couples sinusoidal predictors at specified harmonic frequencies with SARIMA models, neural network autoregression (NNAR) models and hybrid (SARIMA-NNAR) models to form the respective harmonically coupled models, namely HCSARIMA models, HCNNAR models and HCSARIMA-NNAR models, with the sinusoidal predictor function, SARIMA and NNAR parts capturing the deterministic, linear and nonlinear components, respectively. These models are used to forecast 10-minutely and 60-minutely averaged global horizontal irradiance data series obtained from the RVD Richtersveld solar radiometric station in the Northern Cape, South Africa. The forecasting accuracy of the three above-mentioned model classes is assessed using the relative mean square error, mean absolute error and mean absolute percentage error. The HCNNAR model and HCSARIMA-NNAR model gave more accurate forecasting results for 60-minutely and 10-minutely data, respectively.
 Highlights
 
 HCSARIMA models were outperformed by both HCNNAR models and HCSARIMA-NNAR models in the forecasting arena.
 HCNNAR models were most appropriate for forecasting larger time scales (i.e. 60-minutely).
 HCSARIMA-NNAR models were most appropriate for forecasting smaller time scales (i.e. 10-minutely).
 Models fitted on the January data series performed better than those fitted on the June data series.
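As a rough companion to the accuracy criteria this abstract relies on (relative mean square error, mean absolute error and mean absolute percentage error), the Python sketch below shows how such error metrics are typically computed for an irradiance forecast series. The arrays, values and function name are invented for illustration and are not taken from the study.

```python
import numpy as np

def forecast_errors(observed, forecast):
    """Return common forecast accuracy metrics for an irradiance series.

    observed, forecast : 1-D arrays of global horizontal irradiance (W/m^2).
    Values and names here are illustrative only, not the study's data.
    """
    observed = np.asarray(observed, dtype=float)
    forecast = np.asarray(forecast, dtype=float)
    err = forecast - observed

    mae = np.mean(np.abs(err))                       # mean absolute error
    mape = 100.0 * np.mean(np.abs(err / observed))   # mean absolute percentage error
    rmse = np.sqrt(np.mean(err ** 2))                # root mean square error
    rrmse = 100.0 * rmse / np.mean(observed)         # relative RMSE (% of mean observed)
    return {"MAE": mae, "MAPE": mape, "RMSE": rmse, "rRMSE": rrmse}

# Toy 10-minutely GHI values (W/m^2) -- purely synthetic.
obs = [520.0, 545.0, 560.0, 580.0, 600.0, 615.0]
fc = [510.0, 550.0, 575.0, 570.0, 590.0, 630.0]
print(forecast_errors(obs, fc))
```

In practice, such metrics would be computed separately for each model class (HCSARIMA, HCNNAR, HCSARIMA-NNAR) and each averaging interval before the comparison reported in the abstract.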
APA, Harvard, Vancouver, ISO, and other styles
32

Ledwaba, Lehlogonolo P. I., Gerhard P. Hancke, Sherrin J. Isaac, and Hein S. Venter. "Smart Microgrid Energy Market: Evaluating Distributed Ledger Technologies for Remote and Constrained Microgrid Deployments." Electronics 10, no. 6 (2021): 714. http://dx.doi.org/10.3390/electronics10060714.

Full text
Abstract:
The increasing strain on ageing generation infrastructure has seen more frequent instances of scheduled and unscheduled blackouts, rising reliance on fossil-fuel-based energy alternatives and a slowdown in efforts towards achieving universal access to electrical energy in South Africa. To try to relieve the burden on the National Grid and still progress electrification activities, the smart microgrid model and secure energy trade paradigm is considered—enabled by the Industrial IoT (IIoT) and distributed ledger technologies (DLTs). Given the high availability requirements of microgrid operations, the limited resources available on IIoT devices and the high processing and energy requirements of DLT operations, this work aims to determine the effect of native DLT algorithms when implemented on IIoT edge devices to assess the suitability of DLTs as a mechanism to establish a secure energy trading market for the Internet of Energy. Metrics such as the node transaction time, operating temperature, power consumption, and processor and memory usage are considered towards determining possible interference with edge node operation. In addition, the cost and time required for mining operations associated with the DLT-enabled node are determined in an effort to predict the cost to end users—in terms of fees payable and mobile data costs—as well as predicting the microgrid’s growth and potential blockchain network slowdown.
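The evaluation metrics listed in this abstract (node transaction time, processor and memory usage, and so on) can be approximated on a Linux-based IIoT edge board with standard tooling. The Python sketch below is a minimal illustration of that style of measurement wrapped around a proof-of-work-like hashing loop; the workload, difficulty value and variable names are assumptions made for the example and do not reproduce the study's DLT implementations or hardware.

```python
import hashlib
import time

import psutil  # cross-platform process and system metrics


def toy_proof_of_work(data: bytes, difficulty: int = 4) -> int:
    """Find a nonce whose SHA-256 digest starts with `difficulty` zero hex digits.

    Stands in for a DLT mining/validation workload; it is not any
    specific ledger's algorithm.
    """
    nonce = 0
    target = "0" * difficulty
    while True:
        digest = hashlib.sha256(data + str(nonce).encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1


proc = psutil.Process()
proc.cpu_percent(interval=None)           # prime the per-process CPU counter
start = time.perf_counter()

nonce = toy_proof_of_work(b"energy-trade-tx", difficulty=4)

elapsed = time.perf_counter() - start     # stand-in for node transaction time
cpu = proc.cpu_percent(interval=None)     # process CPU utilisation since priming
rss_mb = proc.memory_info().rss / 1e6     # resident memory in MB

print(f"nonce={nonce} time={elapsed:.2f}s cpu={cpu:.1f}% mem={rss_mb:.1f}MB")
```

On a real deployment the same idea would be applied around the ledger client's transaction and consensus calls, with temperature and power drawn from board-specific sensors rather than from psutil.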
APA, Harvard, Vancouver, ISO, and other styles
33

Mawela, Tendani, Nixon Muganda Ochara, and Hossana Twinomurinzi. "Missed opportunities for introducing transformational government." Transforming Government: People, Process and Policy 10, no. 1 (2016): 168–88. http://dx.doi.org/10.1108/tg-11-2014-0059.

Full text
Abstract:
Purpose The purpose of this paper is to trace the trajectory of the Gauteng Freeway Improvement Project, an electronic tolling (e-tolling) programme based in South Africa, to argue for the importance of taking advantage of similar public project opportunities to introduce the concept of Transformational Government (t-government). Design/methodology/approach The research uses an interpretive perspective and utilizes actor–network theory (ANT) to identify the roles and interests of the various stakeholders within the project and assess how each stakeholder could have better influenced the project’s sustainability using a t-government approach. Findings The findings suggest that in the midst of waning global actor interest, and strong local displeasure about specific public projects, public participation offers an ideal opportunity to introduce the notion of t-government, the use of information and communication technologies (ICT) to transform government for citizen benefits. The research allowed the authors to posit that public participation projects are solid and indispensable avenues for introducing t-government. Part of this claim is hinged on the view that the specific e-toll project carries a visible ICT artefact, which has embodied its own patterns of use characterized by various viewpoints, values, opinions and rhetoric. Practical implications The paper elevates the importance of t-government as a means to bring about practical transformation in government using public projects. The paper suggests how governments can use public participatory approaches to assimilate a new way of working in government. Originality/value This paper contributes to research on the emerging discourse on t-government. The paper also highlights the utility of ANT as a tool for understanding the dynamic public sector ICT programmes, their associated complexities and unintended consequences.
APA, Harvard, Vancouver, ISO, and other styles
34

An, Ran, Yuncheng Man, Shamreen Iram, et al. "Computer Vision and Deep Learning Assisted Microchip Electrophoresis for Integrated Anemia and Sickle Cell Disease Screening." Blood 136, Supplement 1 (2020): 46–47. http://dx.doi.org/10.1182/blood-2020-142548.

Full text
Abstract:
Introduction: Anemia affects a third of the world's population with the heaviest burden borne by women and children. Anemia leads to preventable impaired development in children, as well as high morbidity and early mortality among sufferers. Inherited hemoglobin (Hb) disorders, such as sickle cell disease (SCD), are associated with chronic hemolytic anemia causing high morbidity and mortality. Anemia and SCD are inherently associated and are both prevalent in the same regions of the world including sub-Saharan Africa, India, and south-east Asia. Anemia and SCD-related complications can be mitigated by screening, early diagnosis followed by timely intervention. Anemia treatment depends on the accurate characterization of the cause, such as inherited Hb disorders. Meanwhile, Hb disorders or SCD treatments, such as hydroxyurea therapy, require close monitoring of blood Hb level and the patient's anemia status over time. As a result, it is crucially important to perform integrated detection and monitoring of blood Hb level, anemia status, and Hb variants, especially in areas where anemia and inherited Hb disorders are the most prevalent. Blood Hb level (in g/dL) is used as the main indicator of anemia, while the presence of Hb variants (e.g., sickle Hb or HbS) in blood is the primary indicator of an inherited disorder. The current clinical standards for anemia testing and Hb variant identification are complete blood count (CBC) and High-Performance Liquid Chromatography (HPLC), respectively. State-of-the-art laboratory infrastructure and trained personnel are required for these laboratory tests. However, these resources are typically scarce in low- and middle-income countries, where anemia and Hb disorders are the most prevalent. As a result, there is a dire need for high-accuracy portable point-of-care (POC) devices to perform integrated anemia and Hb variant tests with affordable cost and high throughput. Methods: In 2019, the World Health Organization (WHO) listed Hb electrophoresis as an essential in vitro diagnostic (IVD) technology for diagnosing SCD and sickle cell trait. We have leveraged the common Hb electrophoresis method and developed a POC microchip electrophoresis test, Hemoglobin Variant/Anemia (HbVA). This technology is being commercialized under the product name "Gazelle" by Hemex Health Inc. for Hb variant identification with integrated anemia detection (Fig. 1A&B). We hypothesized that computer vision and deep learning would enhance the accuracy and reproducibility of blood Hb level prediction and anemia detection in cellulose acetate-based Hb electrophoresis, which is a clinical standard test for Hb variant screening and diagnosis worldwide (Fig. 1C). To test this hypothesis, we integrated, for the first time, a new computer vision and artificial neural network (ANN) based deep learning imaging and data analysis algorithm into Hb electrophoresis. Here, we show the feasibility of this new, computer vision and deep learning enabled diagnostic approach via testing of 46 subjects, including individuals with anemia and homozygous (HbSS) or heterozygous (HbSC or Sβ-thalassemia) SCD. Results and Discussion: HbVA computer vision tracked the electrophoresis process in real time and the deep learning neural network algorithm determined Hb levels, which demonstrated significant correlation (Pearson correlation coefficient of 0.95) with the results of the reference-standard CBC (Fig. 1D).
Furthermore, HbVA demonstrated high reproducibility with a mean absolute error of 0.55 g/dL and a bias of -0.10 g/dL (95% limits of agreement: 1.5 g/dL) according to Bland-Altman analysis (Fig. 1E). Anemia determination was achieved with 100% sensitivity and 92.3% specificity with a receiver operating characteristic area under the curve (AUC) of 0.99 (Fig. 1F). Within the same test, subjects with SCD were identified with 100% sensitivity and specificity (Fig. 1G). Overall, the results suggested that computer vision and deep learning methods can be used to extract new information from Hb electrophoresis, enabling, for the first time, reproducible, accurate, and integrated blood Hb level prediction, anemia detection, and Hb variant identification in a single affordable test at the POC. Disclosures An: Hemex Health, Inc.: Patents & Royalties. Hasan: Hemex Health, Inc.: Patents & Royalties. Ahuja: Genentech: Consultancy; Sanofi-Genzyme: Consultancy; XaTec Inc.: Consultancy; XaTec Inc.: Research Funding; XaTec Inc.: Divested equity in a private or publicly-traded company in the past 24 months; Genentech: Honoraria; Sanofi-Genzyme: Honoraria. Little: GBT: Research Funding; Bluebird Bio: Research Funding; BioChip Labs: Patents & Royalties: SCD Biochip (patent, no royalties); Hemex Health, Inc.: Patents & Royalties: Microfluidic electrophoresis (patent, no royalties); NHLBI: Research Funding; GBT: Membership on an entity's Board of Directors or advisory committees. Gurkan: Hemex Health, Inc.: Consultancy, Current Employment, Patents & Royalties, Research Funding; BioChip Labs: Patents & Royalties; Xatek Inc.: Patents & Royalties; Dx Now Inc.: Patents & Royalties.
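The agreement statistics quoted in this abstract (Pearson correlation, Bland-Altman bias with 95% limits of agreement, and sensitivity/specificity) are standard method-comparison quantities. The Python sketch below shows, on invented paired measurements, how they are commonly derived; it does not use the study's data or the Gazelle/HbVA algorithm, and the 12 g/dL anemia cut-off is included only to make the classification step concrete.

```python
import numpy as np
from scipy import stats

# Invented paired measurements: reference CBC Hb vs. point-of-care estimate (g/dL).
cbc_hb = np.array([7.2, 8.1, 9.5, 10.3, 11.0, 12.4, 13.1, 14.0])
poc_hb = np.array([7.5, 7.9, 9.8, 10.0, 11.4, 12.1, 13.5, 13.8])

# Correlation between the two methods.
r, p_value = stats.pearsonr(cbc_hb, poc_hb)

# Bland-Altman agreement: bias and 95% limits of agreement.
diff = poc_hb - cbc_hb
bias = diff.mean()
loa = 1.96 * diff.std(ddof=1)            # half-width of the limits of agreement

# Anemia classification against an illustrative 12 g/dL cut-off.
truth = cbc_hb < 12.0
pred = poc_hb < 12.0
sensitivity = (truth & pred).sum() / truth.sum()
specificity = (~truth & ~pred).sum() / (~truth).sum()

print(f"Pearson r = {r:.2f} (p = {p_value:.3f})")
print(f"Bland-Altman bias = {bias:+.2f} g/dL, 95% LoA = +/-{loa:.2f} g/dL")
print(f"Sensitivity = {sensitivity:.2%}, specificity = {specificity:.2%}")
```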
APA, Harvard, Vancouver, ISO, and other styles
35

Govender, Nerissa, and Thokozani P. Mbhele. "Dynamics of intermodal logistical systems on containerisation and road transportation in Durban, South Africa." Journal of Transport and Supply Chain Management 8, no. 1 (2014). http://dx.doi.org/10.4102/jtscm.v8i1.150.

Full text
Abstract:
The underlying port operations in Durban, South Africa, epitomise intense global competitiveness in the intermodal logistics chain. The link between containerisation and the road transport network can falter as a result of the dynamics of the logistics system. The main objective of the study was to establish the extent of the intermodal challenges that logistical systems pose for containerisation, and the degree to which intermodal sea–road freight transportation enhances logistical competitiveness. It further examined the intermodal relationship in containerised freight between the challenges of containerisation processes and their effects on the road freight transport mode. The impact of containerisation on intermodalism, the sea–road freight transport network, and the technological attributes of security-based systems and logistical tracking protocols all influence the systematic movement of containers on Durban’s public roads.
APA, Harvard, Vancouver, ISO, and other styles
36

Parry, Douglas Anderson. "Computing Research in South Africa: A Scientometric Investigation." South African Computer Journal 31, no. 1 (2019). http://dx.doi.org/10.18489/sacj.v31i1.674.

Full text
Abstract:
Limited attention has been afforded to mapping the ‘landscape’ of South African computing research. Prior studies have considered singular sub-disciplines, publications, or publication types. Given the growing prominence of computing disciplines, it is necessary to identify the patterns of research production, publication, collaboration, and impact of South African computing research. This study presents a scientometric investigation in this regard. Through the analysis of data accessed from the Scopus citation enhanced bibliographic database, the investigation presents findings in relation to annual research production, institutional differences in outputs, topics, collaboration, and citation impact. While characterised by institutional differences, over the period considered, South African computing research output has increased at a greater rate than that of South African research at large. Additionally, despite accounting for a greater proportion of all outputs, conference papers account for a smaller proportion of citations relative to journal articles or book chapters. Corresponding to previous investigations, there exists a tendency towards applied computing topics in contrast to more theoretical topics. Finally, the collaboration network was shown to be particularly de-centralised with many researchers clustered around institutions. The findings are of interest to all researchers conducting computing or related research in South Africa.
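The finding that the collaboration network is "particularly de-centralised" can be made concrete with a simple graph metric. The Python sketch below computes Freeman's degree centralisation for a toy co-authorship graph using NetworkX; the researchers, edges and clustering pattern are invented, and this is only one of several ways such a property could be quantified.

```python
import networkx as nx

# Toy co-authorship graph: nodes are hypothetical researchers, edges are joint papers.
G = nx.Graph()
G.add_edges_from([
    ("A", "B"), ("A", "C"), ("B", "C"),   # cluster around one institution
    ("D", "E"), ("E", "F"),               # cluster around another
    ("C", "D"),                           # a single cross-institution link
])

n = G.number_of_nodes()
degrees = dict(G.degree())
d_max = max(degrees.values())

# Freeman degree centralisation: 1.0 for a star graph, near 0 for a de-centralised network.
centralisation = sum(d_max - d for d in degrees.values()) / ((n - 1) * (n - 2))

print(f"nodes={n}, edges={G.number_of_edges()}")
print(f"degree centrality per author: {nx.degree_centrality(G)}")
print(f"Freeman degree centralisation: {centralisation:.3f}")
```

A value near 1 indicates a star-like, highly centralised network, while values near 0 correspond to the de-centralised, institutionally clustered structure described in the abstract.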
APA, Harvard, Vancouver, ISO, and other styles
37

Saccaggi, Davina L., Melanie Arendse, John R. U. Wilson, and John S. Terblanche. "Contaminant organisms recorded on plant product imports to South Africa 1994–2019." Scientific Data 8, no. 1 (2021). http://dx.doi.org/10.1038/s41597-021-00869-z.

Full text
Abstract:
Biosecurity interception records are crucial data underlying efforts to predict and manage pest and pathogen introductions. Here we present a dataset containing information on imported plant products inspected by the South African Department of Agriculture’s laboratories between 1994 and 2019 and the contaminant organisms found on them. Samples were received from border inspectors as either propagation material (e.g. plants) or material for immediate use (e.g. fruit). Material for immediate use was further divided into two sample categories, depending on whether or not contaminants were seen or suspected by the border official: intervention or audit samples. The final dataset consists of 25,279 records, of which 30% tested positive (i.e. had at least one contaminant) and 13% had multiple contaminants. Of the 13,731 recorded contaminants, fungi (41%), mites (37%) and insects (19%) were most common. This dataset provides insight into the suite of taxa transported along the plant import pathway and provides an important resource for analyses of contaminant organisms in international trade, which can inform strategies for risk assessment, pathway management and biosecurity protocols.
APA, Harvard, Vancouver, ISO, and other styles
38

"Prediction of COVID-19 Time Series – Case Studies of South Africa and Egypt using Interval Type-2 Fuzzy Logic System." International Journal of Advanced Trends in Computer Science and Engineering 10, no. 2 (2021): 627–35. http://dx.doi.org/10.30534/ijatcse/2021/241022021.

Full text
Abstract:
COVID-19 is a disease caused by a virus known to have emerged in Wuhan, China, in December 2019. COVID-19 spread widely to nearby countries such as Japan and Korea, followed by Europe and America, and later to Africa. South Africa and Egypt, in particular, have been among the worst hit by the virus. Generally, COVID-19 data are highly uncertain and require fuzzy logic approaches for the effective handling of these uncertainties. This study therefore presents the prediction of COVID-19 cases in South Africa and Egypt using an interval type-2 fuzzy logic system with Takagi-Sugeno-Kang fuzzy inference and neural network learning. The parameters of the model are adapted using a gradient descent backpropagation approach. The proposed model is found to outperform a type-1 fuzzy logic system and an artificial neural network in terms of the root mean squared error, mean absolute percentage error and mean absolute error.
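For readers unfamiliar with the interval type-2 construction mentioned in this abstract, the Python sketch below builds the upper and lower membership functions of a Gaussian fuzzy set with an uncertain mean, which is one common way such sets are defined. The means, spread and input range are invented for illustration and are not the parameters learned in the study.

```python
import numpy as np

def it2_gaussian_membership(x, m1, m2, sigma):
    """Interval type-2 Gaussian membership with uncertain mean in [m1, m2].

    Returns (lower, upper) membership grades for input array x.
    Parameters here are illustrative, not the study's fitted values.
    """
    x = np.asarray(x, dtype=float)
    g1 = np.exp(-0.5 * ((x - m1) / sigma) ** 2)   # Gaussian centred at m1
    g2 = np.exp(-0.5 * ((x - m2) / sigma) ** 2)   # Gaussian centred at m2

    upper = np.where(x < m1, g1, np.where(x > m2, g2, 1.0))  # plateau between the means
    lower = np.minimum(g1, g2)                                # envelope below both curves
    return lower, upper

# Example: a fuzzy set such as "moderate increase" over a scaled case-count input.
x = np.linspace(0.0, 10.0, 11)
lo, up = it2_gaussian_membership(x, m1=4.0, m2=6.0, sigma=1.5)
for xi, l, u in zip(x, lo, up):
    print(f"x={xi:4.1f}  lower={l:.3f}  upper={u:.3f}")
```

The gap between the lower and upper curves (the footprint of uncertainty) is what distinguishes an interval type-2 set from the single membership curve of a type-1 system.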
APA, Harvard, Vancouver, ISO, and other styles
39

Zhandire, Evans. "Predicting clear-sky global horizontal irradiance at eight locations in South Africa using four models." Journal of Energy in Southern Africa 28, no. 4 (2017). http://dx.doi.org/10.17159/2413-3051/2017/v28i4a2397.

Full text
Abstract:
Solar radiation under clear-sky conditions provides information about the maximum possible magnitude of the solar resource available at a location of interest. This information is useful for determining the limits of solar energy use in applications such as thermal and electrical energy generation. Measurements of solar irradiance to provide this information are limited by the associated cost. It is therefore of great interest and importance to develop models that generate these data in lieu of measurements. This study focused on four such models: Ineichen-Perez (I-P), European Solar Radiation Atlas model (ESRA), multilayer perceptron neural network (MLPNN) and radial basis function neural network (RBFNN) models. These models were calibrated and tested using solar irradiance data measured at eight different locations in South Africa. The I-P model showed the best performance, recording relative root mean square errors of less than 2% across all hours, months and locations. The performances of the MLPNN and RBFNN were poor when averaged over all stations, but tended to show performance similar to that of the I-P model for some of the stations. The ESRA model showed performance that was in between that of the Artificial Neural Networks and that of the I-P model.
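As a loose illustration of the neural-network side of such a comparison, the Python sketch below fits a small multilayer perceptron to a synthetic clear-sky GHI curve driven by solar geometry and reports a relative root mean square error on held-out points. The data generator, feature choice and network size are assumptions made for the example and do not replicate the paper's calibrated I-P, ESRA, MLPNN or RBFNN models.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)

# Synthetic "clear-sky" GHI: a smooth function of the cosine of the solar zenith angle.
hour = rng.uniform(6.0, 18.0, size=500)                                   # daylight hours
cos_zenith = np.clip(np.sin(np.pi * (hour - 6.0) / 12.0), 0.0, 1.0)
ghi = 1100.0 * cos_zenith ** 1.2 + rng.normal(0.0, 10.0, size=hour.shape)  # W/m^2

X = np.column_stack([hour, cos_zenith])
train, test = slice(0, 400), slice(400, 500)

model = MLPRegressor(hidden_layer_sizes=(20, 20), max_iter=5000, random_state=0)
model.fit(X[train], ghi[train])

pred = model.predict(X[test])
rrmse = 100.0 * np.sqrt(mean_squared_error(ghi[test], pred)) / ghi[test].mean()
print(f"relative RMSE on held-out synthetic data: {rrmse:.2f}%")
```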
APA, Harvard, Vancouver, ISO, and other styles
40

Vorvornator, Lawrence Korsi, and Joyce Mdiniso. "The Covid-19 Pandemic and Religious Activities: A Case Study of Esikhaleni Settlement." Religion, ethics and communication in the era of the COVID-19 pandemic, no. 102(2) (June 2021). http://dx.doi.org/10.46222/pharosjot.102.213.

Full text
Abstract:
The COVID-19 virus allegedly originated in Wuhan, China, and spread globally, including to South Africa, forcing the country into a restricted lockdown. This study analyses COVID-19 and religious activities during lockdown among dwellers in the rural community of Esikhaleni in South Africa’s KwaZulu-Natal province. A qualitative approach was employed, with thirty participants selected through random sampling. Telephonic interviews were conducted with the respondents from 1st to 30th May 2020. Durkheim’s sociological ideas of functional religion, which relate to the human race coming together and sharing solace and love with the destitute, were defied by COVID-19 protocols. The state’s protocols required social and physical distancing to be observed to curb a high infection rate (WHO, 2020). As a result, social gatherings were halted, which posed severe challenges for religious bodies to meet, praise and worship as they normally do. Some religious bodies then resorted to online approaches and used media platforms such as Zoom, Skype and even WhatsApp to deliver their services. Major events by some religious organisations, including baptisms, crusading, evangelism, and Hajj pilgrimages by Muslim adherents, were also postponed. The COVID-19 catastrophe befalling the destitute and needy in society forced religious bodies to extend their arms to those in dire need of help. Challenges during the use of online services included both leaders and congregants not having the requisite technical know-how to connect to the programmes. There were also issues related to network connectivity, intermittent power interruptions, and the inordinately high cost of data procurement in South Africa, especially for the poor. Overall, despite COVID-19 protocols preventing social gatherings, religious bodies developed other means to keep up their spiritual tempo and sought to overcome the sense of hopelessness bestowed on congregants by the pandemic – but sadly this omitted the poor. It is recommended that religious leaders learn to use ICT effectively, because COVID-19 might be here for some length of time to come. Moreover, religious leaders must also strive to educate their congregants to observe COVID-19 protocols and seek to avoid an imminent third wave of the virus, instead of laying blame at the doorsteps of government. Religious orders need to urgently embrace technological solutions, which is sadly not always possible due to limited resources. Getting the masses out of poverty through job creation would also go a long way to help when future pandemics arise, as they surely will.
APA, Harvard, Vancouver, ISO, and other styles
41

Schubert, Grit, Vincent Achi, Steve Ahuka, et al. "The African Network for Improved Diagnostics, Epidemiology and Management of common infectious Agents." BMC Infectious Diseases 21, no. 1 (2021). http://dx.doi.org/10.1186/s12879-021-06238-w.

Full text
Abstract:
Abstract Background In sub-Saharan Africa, acute respiratory infections (ARI), acute gastrointestinal infections (GI) and acute febrile disease of unknown cause (AFDUC) have a large disease burden, especially among children, while respective aetiologies often remain unresolved. The need for robust infectious disease surveillance to detect emerging pathogens along with common human pathogens has been highlighted by the ongoing novel coronavirus disease 2019 (COVID-19) pandemic. The African Network for Improved Diagnostics, Epidemiology and Management of Common Infectious Agents (ANDEMIA) is a sentinel surveillance study on the aetiology and clinical characteristics of ARI, GI and AFDUC in sub-Saharan Africa. Methods ANDEMIA includes 12 urban and rural health care facilities in four African countries (Côte d’Ivoire, Burkina Faso, Democratic Republic of the Congo and Republic of South Africa). It was piloted in 2018 in Côte d’Ivoire and the initial phase will run from 2019 to 2021. Case definitions for ARI, GI and AFDUC were established, as well as syndrome-specific sampling algorithms including the collection of blood, naso- and oropharyngeal swabs and stool. Samples are tested using comprehensive diagnostic protocols, ranging from classic bacteriology and antimicrobial resistance screening to multiplex real-time polymerase chain reaction (PCR) systems and High Throughput Sequencing. In March 2020, PCR testing for severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) and analysis of full genomic information was included in the study. Standardised questionnaires collect relevant clinical, demographic, socio-economic and behavioural data for epidemiologic analyses. Controls are enrolled over a 12-month period for a nested case-control study. Data will be assessed descriptively and aetiologies will be evaluated using a latent class analysis among cases. Among cases and controls, an integrated analytic approach using logistic regression and Bayesian estimation will be employed to improve the assessment of aetiology and associated risk factors. Discussion ANDEMIA aims to expand our understanding of ARI, GI and AFDUC aetiologies in sub-Saharan Africa using a comprehensive laboratory diagnostics strategy. It will foster early detection of emerging threats and continued monitoring of important common pathogens. The network collaboration will be strengthened and site diagnostic capacities will be reinforced to improve quality management and patient care.
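The analytic plan described here mentions logistic regression over cases and controls to assess aetiology and associated risk factors. The Python sketch below is a generic, minimal example of that kind of model on simulated data, with pathogen detection as the exposure and case status as the outcome, using statsmodels to report odds ratios; all variable names and values are hypothetical and are not drawn from ANDEMIA.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)

# Invented case-control data: 1 = case (e.g. ARI patient), 0 = community control.
n = 400
pathogen_detected = rng.binomial(1, 0.35, size=n)   # exposure: PCR-positive for a pathogen
age_under5 = rng.binomial(1, 0.5, size=n)           # a simple covariate
# Simulate higher odds of being a case when the pathogen is detected.
logit = -0.5 + 1.2 * pathogen_detected + 0.3 * age_under5
case = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

X = sm.add_constant(np.column_stack([pathogen_detected, age_under5]))
model = sm.Logit(case, X).fit(disp=False)

names = ["intercept", "pathogen_detected", "age_under5"]
odds_ratios = np.exp(model.params)
print(model.summary(xname=names))
print("Odds ratios:", dict(zip(names, odds_ratios)))
```

In a real analysis the exposure coding, covariate set and matching structure would follow the protocol's case definitions, and the Bayesian estimation step mentioned in the abstract would replace or complement the simple maximum-likelihood fit shown here.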
APA, Harvard, Vancouver, ISO, and other styles
42

Noruwana, Nimrod, and Maureen Tanner. "Understanding the structured processes followed by organisations prior to engaging in agile processes: A South African Perspective." South African Computer Journal 48 (June 27, 2012). http://dx.doi.org/10.18489/sacj.v48i1.74.

Full text
Abstract:
There appears to be a lack of knowledge on the phases South African (SA) organisations go through while adopting agile methods. As a means to address this gap, this study uncovered empirical evidence on the phases SA organisations go through whilst adopting agile methods as well as the disparities between agile prescriptions and the way SA organisations actually implement agile methods.
 
 The data, collected using a case study approach, were analysed through the lens of Actor-Network Theory (ANT). The results reveal that there is no structured process for adopting agile methods and that organisations go through various phases in their attempts to adopt them. During these phases, organisations face challenges that are both culture- and people-related.
 
 Through this study, South African practitioners can now be aware that, before adopting an agile methodology, there has to be a common understanding of the problems at hand and the envisioned solution. The findings also inform aspiring adopters in South Africa that adoption of the methods does not have to follow the prescriptions exactly; they are free to adopt only those aspects their organisations need most.
APA, Harvard, Vancouver, ISO, and other styles
43

"Transforming the Siyabuswa Community Centre into a Smart Centre." Muma Case Review 3 (2018): 001–14. http://dx.doi.org/10.28945/4221.

Full text
Abstract:
Dr. Jackie Phahlamohlaka reflected on what he would propose to the board regarding the transformation of the existing Siyabuswa Educational Improvement and Development Trust (SEIDET) community centre into a smart community centre. As the Competency Area Manager at the Council for Scientific and Industrial Research (CSIR) in South Africa, the founder of SEIDET and the chairman of its Board of Trustees, he had for over twenty-four years led socio-economic development efforts and ICT-related research linked to SEIDET (SEIDET, 2014). These programmes ranged from high school supplementary tutorials on mathematics and science to adult and computer literacy training of the broader community in the Siyabuswa area of Mpumalanga Province, Republic of South Africa. Three questions framed his reflection: (1) How could SEIDET leverage or create affordable cyber network infrastructure in the envisaged smart centre? (2) How could the smart centre enable the local community and individuals in the community to participate in local economic development? (3) How could SEIDET get the local government administration, traditional leaders and the local community to buy in to this smart centre development project in order to ensure its sustainability?
APA, Harvard, Vancouver, ISO, and other styles
44

Moore, Christopher Luke. "Digital Games Distribution: The Presence of the Past and the Future of Obsolescence." M/C Journal 12, no. 3 (2009). http://dx.doi.org/10.5204/mcj.166.

Full text
Abstract:
A common criticism of the rhythm video games genre — including series like Guitar Hero and Rock Band, is that playing musical simulation games is a waste of time when you could be playing an actual guitar and learning a real skill. A more serious criticism of games cultures draws attention to the degree of e-waste they produce. E-waste or electronic waste includes mobiles phones, computers, televisions and other electronic devices, containing toxic chemicals and metals whose landfill, recycling and salvaging all produce distinct environmental and social problems. The e-waste produced by games like Guitar Hero is obvious in the regular flow of merchandise transforming computer and video games stores into simulation music stores, filled with replica guitars, drum kits, microphones and other products whose half-lives are short and whose obsolescence is anticipated in the annual cycles of consumption and disposal. This paper explores the connection between e-waste and obsolescence in the games industry, and argues for the further consideration of consumers as part of the solution to the problem of e-waste. It uses a case study of the PC digital distribution software platform, Steam, to suggest that the digital distribution of games may offer an alternative model to market driven software and hardware obsolescence, and more generally, that such software platforms might be a place to support cultures of consumption that delay rather than promote hardware obsolescence and its inevitability as e-waste. The question is whether there exists a potential for digital distribution to be a means of not only eliminating the need to physically transport commodities (its current 'green' benefit), but also for supporting consumer practices that further reduce e-waste. The games industry relies on a rapid production and innovation cycle, one that actively enforces hardware obsolescence. Current video game consoles, including the PlayStation 3, the Xbox 360 and Nintendo Wii, are the seventh generation of home gaming consoles to appear within forty years, and each generation is accompanied by an immense international transportation of games hardware, software (in various storage formats) and peripherals. Obsolescence also occurs at the software or content level and is significant because the games industry as a creative industry is dependent on the extensive management of multiple intellectual properties. The computing and video games software industry operates a close partnership with the hardware industry, and as such, software obsolescence directly contributes to hardware obsolescence. The obsolescence of content and the redundancy of the methods of policing its scarcity in the marketplace has been accelerated and altered by the processes of disintermediation with a range of outcomes (Flew). The music industry is perhaps the most advanced in terms of disintermediation with digital distribution at the center of the conflict between the legitimate and unauthorised access to intellectual property. This points to one issue with the hypothesis that digital distribution can lead to a reduction in hardware obsolescence, as the marketplace leader and key online distributor of music, Apple, is also the major producer of new media technologies and devices that are the paragon of stylistic obsolescence. Stylistic obsolescence, in which fashion changes products across seasons of consumption, has long been observed as the dominant form of scaled industrial innovation (Slade). 
Stylistic obsolescence is differentiated from mechanical or technological obsolescence as the deliberate supersedence of products by more advanced designs, better production techniques and other minor innovations. The line between the stylistic and technological obsolescence is not always clear, especially as reduced durability has become a powerful market strategy (Fitzpatrick). This occurs where the design of technologies is subsumed within the discourses of manufacturing, consumption and the logic of planned obsolescence in which the product or parts are intended to fail, degrade or under perform over time. It is especially the case with signature new media technologies such as laptop computers, mobile phones and portable games devices. Gamers are as guilty as other consumer groups in contributing to e-waste as participants in the industry's cycles of planned obsolescence, but some of them complicate discussions over the future of obsolescence and e-waste. Many gamers actively work to forestall the obsolescence of their games: they invest time in the play of older games (“retrogaming”) they donate labor and creative energy to the production of user-generated content as a means of sustaining involvement in gaming communities; and they produce entirely new game experiences for other users, based on existing software and hardware modifications known as 'mods'. With Guitar Hero and other 'rhythm' games it would be easy to argue that the hardware components of this genre have only one future: as waste. Alternatively, we could consider the actual lifespan of these objects (including their impact as e-waste) and the roles they play in the performances and practices of communities of gamers. For example, the Elmo Guitar Hero controller mod, the Tesla coil Guitar Hero controller interface, the Rock Band Speak n' Spellbinder mashup, the multiple and almost sacrilegious Fender guitar hero mods, the Guitar Hero Portable Turntable Mod and MAKE magazine's Trumpet Hero all indicate a significant diversity of user innovation, community formation and individual investment in the post-retail life of computer and video game hardware. Obsolescence is not just a problem for the games industry but for the computing and electronics industries more broadly as direct contributors to the social and environmental cost of electrical waste and obsolete electrical equipment. Planned obsolescence has long been the experience of gamers and computer users, as the basis of a utopian mythology of upgrades (Dovey and Kennedy). For PC users the upgrade pathway is traversed by the consumption of further hardware and software post initial purchase in a cycle of endless consumption, acquisition and waste (as older parts are replaced and eventually discarded). The accumulation and disposal of these cultural artefacts does not devalue or accrue in space or time at the same rate (Straw) and many users will persist for years, gradually upgrading and delaying obsolescence and even perpetuate the circulation of older cultural commodities. Flea markets and secondhand fairs are popular sites for the purchase of new, recent, old, and recycled computer hardware, and peripherals. Such practices and parallel markets support the strategies of 'making do' described by De Certeau, but they also continue the cycle of upgrade and obsolescence, and they are still consumed as part of the promise of the 'new', and the desire of a purchase that will finally 'fix' the users' computer in a state of completion (29). 
The planned obsolescence of new media technologies is common, but its success is mixed; for example, support for Microsoft's operating system Windows XP was officially withdrawn in April 2009 (Robinson), but due to the popularity in low cost PC 'netbooks' outfitted with an optimised XP operating system and a less than enthusiastic response to the 'next generation' Windows Vista, XP continues to be popular. Digital Distribution: A Solution? Gamers may be able to reduce the accumulation of e-waste by supporting the disintermediation of the games retail sector by means of online distribution. Disintermediation is the establishment of a direct relationship between the creators of content and their consumers through products and services offered by content producers (Flew 201). The move to digital distribution has already begun to reduce the need to physically handle commodities, but this currently signals only further support of planned, stylistic and technological obsolescence, increasing the rate at which the commodities for recording, storing, distributing and exhibiting digital content become e-waste. Digital distribution is sometimes overlooked as a potential means for promoting communities of user practice dedicated to e-waste reduction, at the same time it is actively employed to reduce the potential for the unregulated appropriation of content and restrict post-purchase sales through Digital Rights Management (DRM) technologies. Distributors like Amazon.com continue to pursue commercial opportunities in linking the user to digital distribution of content via exclusive hardware and software technologies. The Amazon e-book reader, the Kindle, operates via a proprietary mobile network using a commercially run version of the wireless 3G protocols. The e-book reader is heavily encrypted with Digital Rights Management (DRM) technologies and exclusive digital book formats designed to enforce current copyright restrictions and eliminate second-hand sales, lending, and further post-purchase distribution. The success of this mode of distribution is connected to Amazon's ability to tap both the mainstream market and the consumer demand for the less-than-popular; those books, movies, music and television series that may not have been 'hits' at the time of release. The desire to revisit forgotten niches, such as B-sides, comics, books, and older video games, suggests Chris Anderson, linked with so-called “long tail” economics. Recently Webb has queried the economic impact of the Long Tail as a business strategy, but does not deny the underlying dynamics, which suggest that content does not obsolesce in any straightforward way. Niche markets for older content are nourished by participatory cultures and Web 2.0 style online services. A good example of the Long Tail phenomenon is the recent case of the 1971 book A Lion Called Christian, by Anthony Burke and John Rendall, republished after the author's film of a visit to a resettled Christian in Africa was popularised on YouTube in 2008. Anderson's Long Tail theory suggests that over time a large number of items, each with unique rather than mass histories, will be subsumed as part of a larger community of consumers, including fans, collectors and everyday users with a long term interest in their use and preservation. 
If digital distribution platforms can reduce e-waste, they can perhaps be fostered by to ensuring digital consumers have access to morally and ethically aware consumer decisions, but also that they enjoy traditional consumer freedoms, such as the right to sell on and change or modify their property. For it is not only the fixation on the 'next generation' that contributes to obsolescence, but also technologies like DRM systems that discourage second hand sales and restrict modification. The legislative upgrades, patches and amendments to copyright law that have attempted to maintain the law's effectiveness in competing with peer-to-peer networks have supported DRM and other intellectual property enforcement technologies, despite the difficulties that owners of intellectual property have encountered with the effectiveness of DRM systems (Moore, Creative). The games industry continues to experiment with DRM, however, this industry also stands out as one of the few to have significantly incorporated the user within the official modes of production (Moore, Commonising). Is the games industry capable (or willing) of supporting a digital delivery system that attempts to minimise or even reverse software and hardware obsolescence? We can try to answer this question by looking in detail at the biggest digital distributor of PC games, Steam. Steam Figure 1: The Steam Application user interface retail section Steam is a digital distribution system designed for the Microsoft Windows operating system and operated by American video game development company and publisher, Valve Corporation. Steam combines online games retail, DRM technologies and internet-based distribution services with social networking and multiplayer features (in-game voice and text chat, user profiles, etc) and direct support for major games publishers, independent producers, and communities of user-contributors (modders). Steam, like the iTunes games store, Xbox Live and other digital distributors, provides consumers with direct digital downloads of new, recent and classic titles that can be accessed remotely by the user from any (internet equipped) location. Steam was first packaged with the physical distribution of Half Life 2 in 2004, and the platform's eventual popularity is tied to the success of that game franchise. Steam was not an optional component of the game's installation and many gamers protested in various online forums, while the platform was treated with suspicion by the global PC games press. It did not help that Steam was at launch everything that gamers take objection to: a persistent and initially 'buggy' piece of software that sits in the PC's operating system and occupies limited memory resources at the cost of hardware performance. Regular updates to the Steam software platform introduced social network features just as mainstream sites like MySpace and Facebook were emerging, and its popularity has undergone rapid subsequent growth. Steam now eclipses competitors with more than 20 million user accounts (Leahy) and Valve Corporation makes it publicly known that Steam collects large amounts of data about its users. This information is available via the public player profile in the community section of the Steam application. It includes the average number of hours the user plays per week, and can even indicate the difficulty the user has in navigating game obstacles. 
Valve reports on the number of users on Steam every two hours via its web site, with a population on average between one and two million simultaneous users (Valve, Steam). We know these users’ hardware profiles because Valve Corporation makes the results of its surveillance public knowledge via the Steam Hardware Survey. Valve’s hardware survey itself conceptualises obsolescence in two ways. First, it uses the results to define the 'cutting edge' of PC technologies and publishing the standards of its own high end production hardware on the companies blog. Second, the effect of the Survey is to subsequently define obsolescent hardware: for example, in the Survey results for April 2009, we can see that the slight majority of users maintain computers with two central processing units while a significant proportion (almost one third) of users still maintained much older PCs with a single CPU. Both effects of the Survey appear to be well understood by Valve: the Steam Hardware Survey automatically collects information about the community's computer hardware configurations and presents an aggregate picture of the stats on our web site. The survey helps us make better engineering and gameplay decisions, because it makes sure we're targeting machines our customers actually use, rather than measuring only against the hardware we've got in the office. We often get asked about the configuration of the machines we build around the office to do both game and Steam development. We also tend to turn over machines in the office pretty rapidly, at roughly every 18 months. (Valve, Team Fortress) Valve’s support of older hardware might counter perceptions that older PCs have no use and begins to reverse decades of opinion regarding planned and stylistic obsolescence in the PC hardware and software industries. Equally significant to the extension of the lives of older PCs is Steam's support for mods and its promotion of user generated content. By providing software for mod creation and distribution, Steam maximises what Postigo calls the development potential of fan-programmers. One of the 'payoffs' in the information/access exchange for the user with Steam is the degree to which Valve's End-User Licence Agreement (EULA) permits individuals and communities of 'modders' to appropriate its proprietary game content for use in the creation of new games and games materials for redistribution via Steam. These mods extend the play of the older games, by requiring their purchase via Steam in order for the individual user to participate in the modded experience. If Steam is able to encourage this kind of appropriation and community support for older content, then the potential exists for it to support cultures of consumption and practice of use that collaboratively maintain, extend, and prolong the life and use of games. Further, Steam incorporates the insights of “long tail” economics in a purely digital distribution model, in which the obsolescence of 'non-hit' game titles can be dramatically overturned. Published in November 2007, Unreal Tournament 3 (UT3) by Epic Games, was unappreciated in a market saturated with games in the first-person shooter genre. Epic republished UT3 on Steam 18 months later, making the game available to play for free for one weekend, followed by discounted access to new content. 
The 2000 per cent increase in players over the game's 'free' trial weekend, has translated into enough sales of the game for Epic to no longer consider the release a commercial failure: It’s an incredible precedent to set: making a game a success almost 18 months after a poor launch. It’s something that could only have happened now, and with a system like Steam...Something that silently updates a purchase with patches and extra content automatically, so you don’t have to make the decision to seek out some exciting new feature: it’s just there anyway. Something that, if you don’t already own it, advertises that game to you at an agreeably reduced price whenever it loads. Something that enjoys a vast community who are in turn plugged into a sea of smaller relevant communities. It’s incredibly sinister. It’s also incredibly exciting... (Meer) Clearly concerns exist about Steam's user privacy policy, but this also invites us to the think about the economic relationship between gamers and games companies as it is reconfigured through the private contractual relationship established by the EULA which accompanies the digital distribution model. The games industry has established contractual and licensing arrangements with its consumer base in order to support and reincorporate emerging trends in user generated cultures and other cultural formations within its official modes of production (Moore, "Commonising"). When we consider that Valve gets to tax sales of its virtual goods and can further sell the information farmed from its users to hardware manufacturers, it is reasonable to consider the relationship between the corporation and its gamers as exploitative. Gabe Newell, the Valve co-founder and managing director, conversely believes that people are willing to give up personal information if they feel it is being used to get better services (Leahy). If that sentiment is correct then consumers may be willing to further trade for services that can reduce obsolescence and begin to address the problems of e-waste from the ground up. Conclusion Clearly, there is a potential for digital distribution to be a means of not only eliminating the need to physically transport commodities but also supporting consumer practices that further reduce e-waste. For an industry where only a small proportion of the games made break even, the successful relaunch of older games content indicates Steam's capacity to ameliorate software obsolescence. Digital distribution extends the use of commercially released games by providing disintermediated access to older and user-generated content. For Valve, this occurs within a network of exchange as access to user-generated content, social networking services, and support for the organisation and coordination of communities of gamers is traded for user-information and repeat business. Evidence for whether this will actively translate to an equivalent decrease in the obsolescence of game hardware might be observed with indicators like the Steam Hardware Survey in the future. The degree of potential offered by digital distribution is disrupted by a range of technical, commercial and legal hurdles, primary of which is the deployment of DRM, as part of a range of techniques designed to limit consumer behaviour post purchase. 
While intervention in the form of legislation and radical change to the insidious nature of electronics production is crucial in order to achieve long term reduction in e-waste, the user is currently considered only in terms of 'ethical' consumption and ultimately divested of responsibility through participation in corporate, state and civil recycling and e-waste management operations. The message is either 'careful what you purchase' or 'careful how you throw it away' and, like DRM, ignores the connections between product, producer and user and the consumer support for environmentally, ethically and socially positive production, distribrution, disposal and recycling. This article, has adopted a different strategy, one that sees digital distribution platforms like Steam, as capable, if not currently active, in supporting community practices that should be seriously considered in conjunction with a range of approaches to the challenge of obsolescence and e-waste. References Anderson, Chris. "The Long Tail." Wired Magazine 12. 10 (2004). 20 Apr. 2009 ‹http://www.wired.com/wired/archive/12.10/tail.html›. De Certeau, Michel. The Practice of Everyday Life. Berkeley: U of California P, 1984. Dovey, Jon, and Helen Kennedy. Game Cultures: Computer Games as New Media. London: Open University Press,2006. Fitzpatrick, Kathleen. The Anxiety of Obsolescence. Nashville: Vanderbilt UP, 2008. Flew, Terry. New Media: An Introduction. South Melbourne: Oxford UP, 2008. Leahy, Brian. "Live Blog: DICE 2009 Keynote - Gabe Newell, Valve Software." The Feed. G4TV 18 Feb. 2009. 16 Apr. 2009 ‹http://g4tv.com/thefeed/blog/post/693342/Live-Blog-DICE-2009-Keynote-–-Gabe-Newell-Valve-Software.html›. Meer, Alec. "Unreal Tournament 3 and the New Lazarus Effect." Rock, Paper, Shotgun 16 Mar. 2009. 24 Apr. 2009 ‹http://www.rockpapershotgun.com/2009/03/16/unreal-tournament-3-and-the-new-lazarus-effect/›.Moore, Christopher. "Commonising the Enclosure: Online Games and Reforming Intellectual Property Regimes." Australian Journal of Emerging Technologies and Society 3. 2, (2005). 12 Apr. 2009 ‹http://www.swin.edu.au/sbs/ajets/journal/issue5-V3N2/abstract_moore.htm›. Moore, Christopher. "Creative Choices: Changes to Australian Copyright Law and the Future of the Public Domain." Media International Australia 114 (Feb. 2005): 71–83. Postigo, Hector. "Of Mods and Modders: Chasing Down the Value of Fan-Based Digital Game Modification." Games and Culture 2 (2007): 300-13. Robinson, Daniel. "Windows XP Support Runs Out Next Week." PC Business Authority 8 Apr. 2009. 16 Apr. 2009 ‹http://www.pcauthority.com.au/News/142013,windows-xp-support-runs-out-next-week.aspx›. Straw, Will. "Exhausted Commodities: The Material Culture of Music." Canadian Journal of Communication 25.1 (2000): 175. Slade, Giles. Made to Break: Technology and Obsolescence in America. Cambridge: Harvard UP, 2006. Valve. "Steam and Game Stats." 26 Apr. 2009 ‹http://store.steampowered.com/stats/›. Valve. "Team Fortress 2: The Scout Update." Steam Marketing Message 20 Feb. 2009. 12 Apr. 2009 ‹http://storefront.steampowered.com/Steam/Marketing/message/2269/›. Webb, Richard. "Online Shopping and the Harry Potter Effect." New Scientist 2687 (2008): 52-55. 16 Apr. 2009 ‹http://www.newscientist.com/article/mg20026873.300-online-shopping-and-the-harry-potter-effect.html?page=2›. With thanks to Dr Nicola Evans and Dr Frances Steel for their feedback and comments on drafts of this paper.
APA, Harvard, Vancouver, ISO, and other styles
45

Livingstone, Randall M. "Let’s Leave the Bias to the Mainstream Media: A Wikipedia Community Fighting for Information Neutrality." M/C Journal 13, no. 6 (2010). http://dx.doi.org/10.5204/mcj.315.

Full text
Abstract:
Although I'm a rich white guy, I'm also a feminist anti-racism activist who fights for the rights of the poor and oppressed. (Carl Kenner)Systemic bias is a scourge to the pillar of neutrality. (Cerejota)Count me in. Let's leave the bias to the mainstream media. (Orcar967)Because this is so important. (CuttingEdge)These are a handful of comments posted by online editors who have banded together in a virtual coalition to combat Western bias on the world’s largest digital encyclopedia, Wikipedia. This collective action by Wikipedians both acknowledges the inherent inequalities of a user-controlled information project like Wikpedia and highlights the potential for progressive change within that same project. These community members are taking the responsibility of social change into their own hands (or more aptly, their own keyboards).In recent years much research has emerged on Wikipedia from varying fields, ranging from computer science, to business and information systems, to the social sciences. While critical at times of Wikipedia’s growth, governance, and influence, most of this work observes with optimism that barriers to improvement are not firmly structural, but rather they are socially constructed, leaving open the possibility of important and lasting change for the better.WikiProject: Countering Systemic Bias (WP:CSB) considers one such collective effort. Close to 350 editors have signed on to the project, which began in 2004 and itself emerged from a similar project named CROSSBOW, or the “Committee Regarding Overcoming Serious Systemic Bias on Wikipedia.” As a WikiProject, the term used for a loose group of editors who collaborate around a particular topic, these editors work within the Wikipedia site and collectively create a social network that is unified around one central aim—representing the un- and underrepresented—and yet they are bound by no particular unified set of interests. The first stage of a multi-method study, this paper looks at a snapshot of WP:CSB’s activity from both content analysis and social network perspectives to discover “who” geographically this coalition of the unrepresented is inserting into the digital annals of Wikipedia.Wikipedia and WikipediansDeveloped in 2001 by Internet entrepreneur Jimmy Wales and academic Larry Sanger, Wikipedia is an online collaborative encyclopedia hosting articles in nearly 250 languages (Cohen). The English-language Wikipedia contains over 3.2 million articles, each of which is created, edited, and updated solely by users (Wikipedia “Welcome”). At the time of this study, Alexa, a website tracking organisation, ranked Wikipedia as the 6th most accessed site on the Internet. Unlike the five sites ahead of it though—Google, Facebook, Yahoo, YouTube (owned by Google), and live.com (owned by Microsoft)—all of which are multibillion-dollar businesses that deal more with information aggregation than information production, Wikipedia is a non-profit that operates on less than $500,000 a year and staffs only a dozen paid employees (Lih). Wikipedia is financed and supported by the WikiMedia Foundation, a charitable umbrella organisation with an annual budget of $4.6 million, mainly funded by donations (Middleton).Wikipedia editors and contributors have the option of creating a user profile and participating via a username, or they may participate anonymously, with only an IP address representing their actions. 
Despite the option for total anonymity, many Wikipedians have chosen to visibly engage in this online community (Ayers, Matthews, and Yates; Bruns; Lih), and researchers across disciplines are studying the motivations of these new online collectives (Kane, Majchrzak, Johnson, and Chenisern; Oreg and Nov). The motivations of open source software contributors, such as UNIX programmers and programming groups, have been shown to be complex and tied to both extrinsic and intrinsic rewards, including online reputation, self-satisfaction and enjoyment, and obligation to a greater common good (Hertel, Niedner, and Herrmann; Osterloh and Rota). Investigation into why Wikipedians edit has indicated multiple motivations as well, with community engagement, task enjoyment, and information sharing among the most significant (Schroer and Hertel). Additionally, Wikipedians seem to be taking up the cause of generativity (a concern for the ongoing health and openness of the Internet’s infrastructures) that Jonathan Zittrain notably called for in The Future of the Internet and How to Stop It. Governance and ControlAlthough the technical infrastructure of Wikipedia is built to support and perhaps encourage an equal distribution of power on the site, Wikipedia is not a land of “anything goes.” The popular press has covered recent efforts by the site to reduce vandalism through a layer of editorial review (Cohen), a tightening of control cited as a possible reason for the recent dip in the number of active editors (Edwards). A number of regulations are already in place that prevent the open editing of certain articles and pages, such as the site’s disclaimers and pages that have suffered large amounts of vandalism. Editing wars can also cause temporary restrictions to editing, and Ayers, Matthews, and Yates point out that these wars can happen anywhere, even to Burt Reynold’s page.Academic studies have begun to explore the governance and control that has developed in the Wikipedia community, generally highlighting how order is maintained not through particular actors, but through established procedures and norms. Konieczny tested whether Wikipedia’s evolution can be defined by Michels’ Iron Law of Oligopoly, which predicts that the everyday operations of any organisation cannot be run by a mass of members, and ultimately control falls into the hands of the few. Through exploring a particular WikiProject on information validation, he concludes:There are few indicators of an oligarchy having power on Wikipedia, and few trends of a change in this situation. The high level of empowerment of individual Wikipedia editors with regard to policy making, the ease of communication, and the high dedication to ideals of contributors succeed in making Wikipedia an atypical organization, quite resilient to the Iron Law. (189)Butler, Joyce, and Pike support this assertion, though they emphasise that instead of oligarchy, control becomes encapsulated in a wide variety of structures, policies, and procedures that guide involvement with the site. A virtual “bureaucracy” emerges, but one that should not be viewed with the negative connotation often associated with the term.Other work considers control on Wikipedia through the framework of commons governance, where “peer production depends on individual action that is self-selected and decentralized rather than hierarchically assigned. Individuals make their own choices with regard to resources managed as a commons” (Viegas, Wattenberg and McKeon). 
The need for quality standards and quality control largely dictate this commons governance, though interviewing Wikipedians with various levels of responsibility revealed that policies and procedures are only as good as those who maintain them. Forte, Larco, and Bruckman argue “the Wikipedia community has remained healthy in large part due to the continued presence of ‘old-timers’ who carry a set of social norms and organizational ideals with them into every WikiProject, committee, and local process in which they take part” (71). Thus governance on Wikipedia is a strong representation of a democratic ideal, where actors and policies are closely tied in their evolution. Transparency, Content, and BiasThe issue of transparency has proved to be a double-edged sword for Wikipedia and Wikipedians. The goal of a collective body of knowledge created by all—the “expert” and the “amateur”—can only be upheld if equal access to page creation and development is allotted to everyone, including those who prefer anonymity. And yet this very option for anonymity, or even worse, false identities, has been a sore subject for some in the Wikipedia community as well as a source of concern for some scholars (Santana and Wood). The case of a 24-year old college dropout who represented himself as a multiple Ph.D.-holding theology scholar and edited over 16,000 articles brought these issues into the public spotlight in 2007 (Doran; Elsworth). Wikipedia itself has set up standards for content that include expectations of a neutral point of view, verifiability of information, and the publishing of no original research, but Santana and Wood argue that self-policing of these policies is not adequate:The principle of managerial discretion requires that every actor act from a sense of duty to exercise moral autonomy and choice in responsible ways. When Wikipedia’s editors and administrators remain anonymous, this criterion is simply not met. It is assumed that everyone is behaving responsibly within the Wikipedia system, but there are no monitoring or control mechanisms to make sure that this is so, and there is ample evidence that it is not so. (141) At the theoretical level, some downplay these concerns of transparency and autonomy as logistical issues in lieu of the potential for information systems to support rational discourse and emancipatory forms of communication (Hansen, Berente, and Lyytinen), but others worry that the questionable “realities” created on Wikipedia will become truths once circulated to all areas of the Web (Langlois and Elmer). With the number of articles on the English-language version of Wikipedia reaching well into the millions, the task of mapping and assessing content has become a tremendous endeavour, one mostly taken on by information systems experts. Kittur, Chi, and Suh have used Wikipedia’s existing hierarchical categorisation structure to map change in the site’s content over the past few years. Their work revealed that in early 2008 “Culture and the arts” was the most dominant category of content on Wikipedia, representing nearly 30% of total content. People (15%) and geographical locations (14%) represent the next largest categories, while the natural and physical sciences showed the greatest increase in volume between 2006 and 2008 (+213%D, with “Culture and the arts” close behind at +210%D). 
This data may indicate that contributing to Wikipedia, and thus spreading knowledge, is growing amongst the academic community while maintaining its importance to the greater popular culture-minded community. Further work by Kittur and Kraut has explored the collaborative process of content creation, finding that too many editors on a particular page can reduce the quality of content, even when a project is well coordinated.

Bias in Wikipedia content is a generally acknowledged and somewhat conflicted subject (Giles; Johnson; McHenry). The Wikipedia community has created numerous articles and pages within the site to define and discuss the problem. Citing a survey conducted by the University of Würzburg, Germany, the "Wikipedia:Systemic bias" page describes the average Wikipedian as:

Male
Technically inclined
Formally educated
An English speaker
White
Aged 15-49
From a majority Christian country
From a developed nation
From the Northern Hemisphere
Likely a white-collar worker or student

Bias in content is thought to be perpetuated by this demographic of contributor, and the "founder effect," a concept from genetics, linking the original contributors to this same demographic has been used to explain the origins of certain biases. Wikipedia's "About" page discusses the issue as well, in the context of the open platform's strengths and weaknesses:

in practice editing will be performed by a certain demographic (younger rather than older, male rather than female, rich enough to afford a computer rather than poor, etc.) and may, therefore, show some bias. Some topics may not be covered well, while others may be covered in great depth. No educated arguments against this inherent bias have been advanced.

Royal and Kapila's study of Wikipedia content tested some of these assertions, finding identifiable bias in both their purposive and random sampling. They conclude that bias favoring larger countries is positively correlated with the size of the country's Internet population, and corporations with larger revenues work in much the same way, garnering more coverage on the site. The researchers remind us that Wikipedia is "more a socially produced document than a value-free information source" (Royal & Kapila).

WikiProject: Countering Systemic Bias

As a coalition of current Wikipedia editors, the WikiProject: Countering Systemic Bias (WP:CSB) attempts to counter trends in content production and points of view deemed harmful to the democratic ideals of a value-neutral, open online encyclopedia. WP:CSB's mission is not one of policing the site, but rather deepening it:

Generally, this project concentrates upon remedying omissions (entire topics, or particular sub-topics in extant articles) rather than on either (1) protesting inappropriate inclusions, or (2) trying to remedy issues of how material is presented. Thus, the first question is "What haven't we covered yet?", rather than "how should we change the existing coverage?" (Wikipedia, "Countering")

The project lays out a number of content areas lacking adequate representation, geographically highlighting the dearth in coverage of Africa, Latin America, Asia, and parts of Eastern Europe. WP:CSB also includes a "members" page that editors can sign to show their support, along with space to voice their opinions on the problem of bias on Wikipedia (the quotations at the beginning of this paper are taken from this "members" page).
At the time of this study, 329 editors had self-selected and self-identified as members of WP:CSB, and this group constitutes the population sample for the current study. To explore the extent to which WP:CSB addressed these self-identified areas for improvement, each editor's last 50 edits were coded for their primary geographical country of interest, as well as the conceptual category of the page itself ("P" for person/people, "L" for location, "I" for idea/concept, "T" for object/thing, or "NA" for indeterminate). For example, edits to the Wikipedia page for a single person like Tony Abbott (Australian federal opposition leader) were coded "Australia, P", while an edit for a group of people like the Manchester United football team would be coded "England, P". Coding was based on information obtained from the header paragraphs of each article's Wikipedia page. After coding was completed, corresponding information on each country's associated continent was added to the dataset, based on the United Nations Statistics Division listing.

A total of 15,616 edits were coded for the study. Nearly 32% (n = 4962) of these edits were on articles for persons or people (see Table A for complete coding results). From within this sub-sample of edits, a majority of the people (68.67%) represented are associated with North America and Europe (Figure A). If we break these statistics down further, nearly half of WP:CSB's edits concerning people were associated with the United States (36.11%) and England (10.16%), with India (3.65%) and Australia (3.35%) following at a distance. These figures make sense for the English-language Wikipedia; over 95% of the population in the three Westernised countries speak English, and while India is still often regarded as a developing nation, its colonial British roots and the emergence of a market economy with large, technology-driven cities are logical explanations for its representation here (and some estimates make India the largest English-speaking nation by population on the globe today).

Table A: Coding Results

Total Edits: 15616
(I) Ideas: 2881 (18.45%)
(L) Location: 2240 (14.34%)
NA: 333 (2.13%)
(T) Thing: 5200 (33.30%)
(P) People: 4962 (31.78%)

People by Continent
Africa: 315 (6.35%)
Asia: 827 (16.67%)
Australia: 175 (3.53%)
Europe: 1411 (28.44%)
NA: 110 (2.22%)
North America: 1996 (40.23%)
South America: 128 (2.58%)

The areas of the globe of main concern to WP:CSB proved to be much less represented by the coalition itself. Asia, far and away the most populous continent with more than 60% of the globe's people (GeoHive), was represented in only 16.67% of edits. Africa (6.35%) and South America (2.58%) were equally underrepresented compared to both their real-world populations (15% and 9% of the globe's population respectively) and the aforementioned dominance of the advanced Westernised areas. However, while these percentages may seem low, in aggregate they do meet the quota set on the WP:CSB Project Page calling for one out of every twenty edits to be "a subject that is systematically biased against the pages of your natural interests." By this standard, the coalition is indeed making headway in adding content that strategically counterbalances the natural biases of Wikipedia's average editor.

Figure A
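The tallying behind Table A can be sketched in a few lines of code. The snippet below is an illustrative sketch only: the record layout, the sample rows, and the use of Python's standard library are assumptions for this paper, not the study's actual tooling or data. It counts coded edits by conceptual category, counts person/people edits by continent, and checks the one-in-twenty quota mentioned above.

```python
# Hypothetical sketch of the coding tally: each record is
# (editor, country, continent, category), where category is
# P = person/people, L = location, I = idea/concept, T = object/thing,
# NA = indeterminate. The rows below are invented for illustration.
from collections import Counter

edits = [
    ("Wizzy", "South Africa", "Africa", "P"),
    ("Wizzy", "United States", "North America", "T"),
    ("Warofdreams", "England", "Europe", "P"),
    ("Gallador", "Laos", "Asia", "L"),
    ("Gerrit", "Gabon", "Africa", "P"),
]

by_category = Counter(cat for _, _, _, cat in edits)
people_by_continent = Counter(cont for _, _, cont, cat in edits if cat == "P")

total = len(edits)
for cat, n in by_category.most_common():
    print(f"{cat}: {n} ({n / total:.2%})")
print("People by continent:", dict(people_by_continent))

# WP:CSB's stated quota: at least one edit in twenty on a systematically
# under-covered subject (approximated here as Africa, South America, Asia).
underrepresented = sum(1 for _, _, cont, _ in edits
                       if cont in {"Africa", "South America", "Asia"})
print("Quota met:", underrepresented / total >= 1 / 20)
```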
Social network analysis allows us to visualise multifaceted data in order to identify relationships between actors and content (Vego-Redondo; Watts). Similar to Davis's well-known sociological study of Southern American socialites in the 1930s (Scott), our Wikipedia coalition can be conceptualised as individual actors united by common interests, and a network of relations can be constructed with software such as UCINET. A mapping algorithm that considers both the relationship between all sets of actors and each actor to the overall collective structure produces an image of our network. This initial network is bimodal, as both our Wikipedia editors and their edits (again, coded for country of interest) are displayed as nodes (Figure B). Edge-lines between nodes represent a relationship, and here that relationship is the act of editing a Wikipedia article. We see from our network that the "U.S." and "England" hold central positions in the network, with a mass of editors crowding around them. A perimeter of nations is then held in place by their ties to editors through the U.S. and England, with a second layer of editors and poorly represented nations (Gabon, Laos, Uzbekistan, etc.) around the boundaries of the network.

Figure B

We are reminded from this visualisation both of the centrality of the two Western powers even among WP:CSB editors, and of the peripheral nature of most other nations in the world. But we also learn which editors in the project are contributing most to underrepresented areas, and which are less "tied" to the Western core. Here we see "Wizzy" and "Warofdreams" among the second layer of editors who act as a bridge between the core and the periphery; these are editors with interests in both the Western and marginalised nations. Located along the outer edge, "Gallador" and "Gerrit" have no direct ties to the U.S. or England, concentrating all of their edits on less represented areas of the globe. Identifying editors at these key positions in the network will help with future research, informing interview questions that will investigate their interests further, but more significantly, probing motives for participation and action within the coalition.

Additionally, we can break the network down further to discover editors who appear to have similar interests in underrepresented areas. Figure C strips down the network to only editors and edits dealing with Africa and South America, the least represented continents. From this we can easily find three types of editors again: those who have singular interests in particular nations (the outermost layer of editors), those who have interests in a particular region (the second layer moving inward), and those who have interests in both of these underrepresented regions (the center layer in the figure). This last group of editors may prove to be the most crucial to understand, as they are carrying the full load of WP:CSB's mission.

Figure C

The End of Geography, or the Reclamation?

In The Internet Galaxy, Manuel Castells writes that "the Internet Age has been hailed as the end of geography," a bold suggestion, but one that has gained traction over the last 15 years as the excitement for the possibilities offered by information communication technologies has often overshadowed structural barriers to participation like the Digital Divide (207). Castells goes on to amend the "end of geography" thesis by showing how global information flows and regional Internet access rates, while creating a new "map" of the world in many ways, are still closely tied to power structures in the analog world. The Internet Age "redefines distance but does not cancel geography" (207).
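Returning to the bimodal network in Figures B and C, the construction can be sketched with open-source tools. The snippet below is an illustrative sketch, not the study's UCINET workflow: the edge list is invented (the editor names simply echo those mentioned above), and the simple degree and bridge checks stand in for the mapping algorithm used to draw the figures.

```python
# Illustrative sketch (not the study's UCINET workflow): a bimodal
# editor-country graph, with a rough check for "bridge" editors who edit
# both the U.S./England core and peripheral nations. Edges are invented.
import networkx as nx

edges = [
    ("Wizzy", "United States"), ("Wizzy", "South Africa"),
    ("Warofdreams", "England"), ("Warofdreams", "Laos"),
    ("Gallador", "Gabon"),
    ("Gerrit", "Uzbekistan"),
]

G = nx.Graph()
editors = {e for e, _ in edges}
countries = {c for _, c in edges}
G.add_nodes_from(editors, bipartite=0)    # editor mode
G.add_nodes_from(countries, bipartite=1)  # country mode
G.add_edges_from(edges)

core = {"United States", "England"}
for editor in sorted(editors):
    targets = set(G.neighbors(editor))
    if targets & core and targets - core:
        role = "bridge (core and periphery)"
    elif targets <= core:
        role = "core only"
    else:
        role = "periphery only"
    print(editor, "->", role, sorted(targets))

# Country degree centrality reflects the U.S./England dominance that the
# network diagrams visualise.
print(nx.degree_centrality(G))
```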
The work of WikiProject: Countering Systemic Bias emphasises the importance of place and representation in the information environment that continues to be constructed in the online world. This study looked at only a small portion of this coalition’s efforts (~16,000 edits)—a snapshot of their labor frozen in time—which itself is only a minute portion of the information being dispatched through Wikipedia on a daily basis (~125,000 edits). Further analysis of WP:CSB’s work over time, as well as qualitative research into the identities, interests and motivations of this collective, is needed to understand more fully how information bias is understood and challenged in the Internet galaxy. The data here indicates this is a fight worth fighting for at least a growing few.ReferencesAlexa. “Top Sites.” Alexa.com, n.d. 10 Mar. 2010 ‹http://www.alexa.com/topsites>. Ayers, Phoebe, Charles Matthews, and Ben Yates. How Wikipedia Works: And How You Can Be a Part of It. San Francisco, CA: No Starch, 2008.Bruns, Axel. Blogs, Wikipedia, Second Life, and Beyond: From Production to Produsage. New York: Peter Lang, 2008.Butler, Brian, Elisabeth Joyce, and Jacqueline Pike. Don’t Look Now, But We’ve Created a Bureaucracy: The Nature and Roles of Policies and Rules in Wikipedia. Paper presented at 2008 CHI Annual Conference, Florence.Castells, Manuel. The Internet Galaxy: Reflections on the Internet, Business, and Society. Oxford: Oxford UP, 2001.Cohen, Noam. “Wikipedia.” New York Times, n.d. 12 Mar. 2010 ‹http://www.nytimes.com/info/wikipedia/>. Doran, James. “Wikipedia Chief Promises Change after ‘Expert’ Exposed as Fraud.” The Times, 6 Mar. 2007 ‹http://technology.timesonline.co.uk/tol/news/tech_and_web/article1480012.ece>. Edwards, Lin. “Report Claims Wikipedia Losing Editors in Droves.” Physorg.com, 30 Nov 2009. 12 Feb. 2010 ‹http://www.physorg.com/news178787309.html>. Elsworth, Catherine. “Fake Wikipedia Prof Altered 20,000 Entries.” London Telegraph, 6 Mar. 2007 ‹http://www.telegraph.co.uk/news/1544737/Fake-Wikipedia-prof-altered-20000-entries.html>. Forte, Andrea, Vanessa Larco, and Amy Bruckman. “Decentralization in Wikipedia Governance.” Journal of Management Information Systems 26 (2009): 49-72.Giles, Jim. “Internet Encyclopedias Go Head to Head.” Nature 438 (2005): 900-901.Hansen, Sean, Nicholas Berente, and Kalle Lyytinen. “Wikipedia, Critical Social Theory, and the Possibility of Rational Discourse.” The Information Society 25 (2009): 38-59.Hertel, Guido, Sven Niedner, and Stefanie Herrmann. “Motivation of Software Developers in Open Source Projects: An Internet-Based Survey of Contributors to the Linex Kernel.” Research Policy 32 (2003): 1159-1177.Johnson, Bobbie. “Rightwing Website Challenges ‘Liberal Bias’ of Wikipedia.” The Guardian, 1 Mar. 2007. 8 Mar. 2010 ‹http://www.guardian.co.uk/technology/2007/mar/01/wikipedia.news>. Kane, Gerald C., Ann Majchrzak, Jeremaih Johnson, and Lily Chenisern. A Longitudinal Model of Perspective Making and Perspective Taking within Fluid Online Collectives. Paper presented at the 2009 International Conference on Information Systems, Phoenix, AZ, 2009.Kittur, Aniket, Ed H. Chi, and Bongwon Suh. What’s in Wikipedia? Mapping Topics and Conflict Using Socially Annotated Category Structure. Paper presented at the 2009 CHI Annual Conference, Boston, MA.———, and Robert E. Kraut. Harnessing the Wisdom of Crowds in Wikipedia: Quality through Collaboration. 
Paper presented at the 2008 Association for Computing Machinery’s Computer Supported Cooperative Work Annual Conference, San Diego, CA.Konieczny, Piotr. “Governance, Organization, and Democracy on the Internet: The Iron Law and the Evolution of Wikipedia.” Sociological Forum 24 (2009): 162-191.———. “Wikipedia: Community or Social Movement?” Interface: A Journal for and about Social Movements 1 (2009): 212-232.Langlois, Ganaele, and Greg Elmer. “Wikipedia Leeches? The Promotion of Traffic through a Collaborative Web Format.” New Media & Society 11 (2009): 773-794.Lih, Andrew. The Wikipedia Revolution. New York, NY: Hyperion, 2009.McHenry, Robert. “The Real Bias in Wikipedia: A Response to David Shariatmadari.” OpenDemocracy.com 2006. 8 Mar. 2010 ‹http://www.opendemocracy.net/media-edemocracy/wikipedia_bias_3621.jsp>. Middleton, Chris. “The World of Wikinomics.” Computer Weekly, 20 Jan. 2009: 22-26.Oreg, Shaul, and Oded Nov. “Exploring Motivations for Contributing to Open Source Initiatives: The Roles of Contribution, Context and Personal Values.” Computers in Human Behavior 24 (2008): 2055-2073.Osterloh, Margit and Sandra Rota. “Trust and Community in Open Source Software Production.” Analyse & Kritik 26 (2004): 279-301.Royal, Cindy, and Deepina Kapila. “What’s on Wikipedia, and What’s Not…?: Assessing Completeness of Information.” Social Science Computer Review 27 (2008): 138-148.Santana, Adele, and Donna J. Wood. “Transparency and Social Responsibility Issues for Wikipedia.” Ethics of Information Technology 11 (2009): 133-144.Schroer, Joachim, and Guido Hertel. “Voluntary Engagement in an Open Web-Based Encyclopedia: Wikipedians and Why They Do It.” Media Psychology 12 (2009): 96-120.Scott, John. Social Network Analysis. London: Sage, 1991.Vego-Redondo, Fernando. Complex Social Networks. Cambridge: Cambridge UP, 2007.Viegas, Fernanda B., Martin Wattenberg, and Matthew M. McKeon. “The Hidden Order of Wikipedia.” Online Communities and Social Computing (2007): 445-454.Watts, Duncan. Six Degrees: The Science of a Connected Age. New York, NY: W. W. Norton & Company, 2003Wikipedia. “About.” n.d. 8 Mar. 2010 ‹http://en.wikipedia.org/wiki/Wikipedia:About>. ———. “Welcome to Wikipedia.” n.d. 8 Mar. 2010 ‹http://en.wikipedia.org/wiki/Main_Page>.———. “Wikiproject:Countering Systemic Bias.” n.d. 12 Feb. 2010 ‹http://en.wikipedia.org/wiki/Wikipedia:WikiProject_Countering_systemic_bias#Members>. Zittrain, Jonathan. The Future of the Internet and How to Stop It. New Haven, CT: Yale UP, 2008.
APA, Harvard, Vancouver, ISO, and other styles
46

Maxwell, Richard, and Toby Miller. "The Real Future of the Media." M/C Journal 15, no. 3 (2012). http://dx.doi.org/10.5204/mcj.537.

Full text
Abstract:
When George Orwell encountered ideas of a technological utopia sixty-five years ago, he acted the grumpy middle-aged man:

Reading recently a batch of rather shallowly optimistic "progressive" books, I was struck by the automatic way in which people go on repeating certain phrases which were fashionable before 1914. Two great favourites are "the abolition of distance" and "the disappearance of frontiers". I do not know how often I have met with the statements that "the aeroplane and the radio have abolished distance" and "all parts of the world are now interdependent" (1944).

It is worth revisiting the old boy's grumpiness, because the rhetoric he so niftily skewers continues in our own time. Facebook features "Peace on Facebook" and even claims that it can "decrease world conflict" through inter-cultural communication. Twitter has announced itself as "a triumph of humanity" ("A Cyber-House" 61). Queue George. In between Orwell and latter-day hoody cybertarians, a whole host of excitable public intellectuals announced the impending end of materiality through emergent media forms. Marshall McLuhan, Neil Postman, Daniel Bell, Ithiel de Sola Pool, George Gilder, Alvin Toffler—the list of 1960s futurists goes on and on. And this wasn't just a matter of punditry: the OECD decreed the coming of the "information society" in 1975 and the European Union (EU) followed suit in 1979, while IBM merrily declared an "information age" in 1977. Bell theorized this technological utopia as post-ideological, because class would cease to matter (Mattelart). Polluting industries seemingly no longer represented the dynamic core of industrial capitalism; instead, market dynamism radiated from a networked, intellectual core of creative and informational activities. The new information and knowledge-based economies would rescue First World hegemony from an "insurgent world" that lurked within as well as beyond itself (Schiller). Orwell's others and the Cold-War futurists propagated one of the most destructive myths shaping both public debate and scholarly studies of the media, culture, and communication. They convinced generations of analysts, activists, and arrivistes that the promises and problems of the media could be understood via metaphors of the environment, and that the media were weightless and virtual. The famous medium they wished us to see as the message—a substance as vital to our wellbeing as air, water, and soil—turned out to be no such thing. Today's cybertarians inherit their anti-Marxist, anti-materialist positions, as a casual glance at any new media journal, culture-industry magazine, or bourgeois press outlet discloses. The media are undoubtedly important instruments of social cohesion and fragmentation, political power and dissent, democracy and demagoguery, and other fraught extensions of human consciousness. But talk of media systems as equivalent to physical ecosystems—fashionable among marketers and media scholars alike—is predicated on the notion that they are environmentally benign technologies. This has never been true, from the beginnings of print to today's cloud-covered computing. Our new book Greening the Media focuses on the environmental impact of the media—the myriad ways that media technology consumes, despoils, and wastes natural resources. We introduce ideas, stories, and facts that have been marginal or absent from popular, academic, and professional histories of media technology.
Throughout, ecological issues have been at the core of our work and we immodestly think the same should apply to media communications, and cultural studies more generally. We recognize that those fields have contributed valuable research and teaching that address environmental questions. For instance, there is an abundant literature on representations of the environment in cinema, how to communicate environmental messages successfully, and press coverage of climate change. That’s not enough. You may already know that media technologies contain toxic substances. You may have signed an on-line petition protesting the hazardous and oppressive conditions under which workers assemble cell phones and computers. But you may be startled, as we were, by the scale and pervasiveness of these environmental risks. They are present in and around every site where electronic and electric devices are manufactured, used, and thrown away, poisoning humans, animals, vegetation, soil, air and water. We are using the term “media” as a portmanteau word to cover a multitude of cultural and communications machines and processes—print, film, radio, television, information and communications technologies (ICT), and consumer electronics (CE). This is not only for analytical convenience, but because there is increasing overlap between the sectors. CE connect to ICT and vice versa; televisions resemble computers; books are read on telephones; newspapers are written through clouds; and so on. Cultural forms and gadgets that were once separate are now linked. The currently fashionable notion of convergence doesn’t quite capture the vastness of this integration, which includes any object with a circuit board, scores of accessories that plug into it, and a global nexus of labor and environmental inputs and effects that produce and flow from it. In 2007, a combination of ICT/CE and media production accounted for between 2 and 3 percent of all greenhouse gases emitted around the world (“Gartner Estimates,”; International Telecommunication Union; Malmodin et al.). Between twenty and fifty million tonnes of electronic waste (e-waste) are generated annually, much of it via discarded cell phones and computers, which affluent populations throw out regularly in order to buy replacements. (Presumably this fits the narcissism of small differences that distinguishes them from their own past.) E-waste is historically produced in the Global North—Australasia, Western Europe, Japan, and the US—and dumped in the Global South—Latin America, Africa, Eastern Europe, Southern and Southeast Asia, and China. It takes the form of a thousand different, often deadly, materials for each electrical and electronic gadget. This trend is changing as India and China generate their own media detritus (Robinson; Herat). Enclosed hard drives, backlit screens, cathode ray tubes, wiring, capacitors, and heavy metals pose few risks while these materials remain encased. But once discarded and dismantled, ICT/CE have the potential to expose workers and ecosystems to a morass of toxic components. Theoretically, “outmoded” parts could be reused or swapped for newer parts to refurbish devices. But items that are defined as waste undergo further destruction in order to collect remaining parts and valuable metals, such as gold, silver, copper, and rare-earth elements. This process causes serious health risks to bones, brains, stomachs, lungs, and other vital organs, in addition to birth defects and disrupted biological development in children. 
Medical catastrophes can result from lead, cadmium, mercury, other heavy metals, poisonous fumes emitted in search of precious metals, and such carcinogenic compounds as polychlorinated biphenyls, dioxin, polyvinyl chloride, and flame retardants (Maxwell and Miller 13). The United States’ Environmental Protection Agency estimates that by 2007 US residents owned approximately three billion electronic devices, with an annual turnover rate of 400 million units, and well over half such purchases made by women. Overall CE ownership varied with age—adults under 45 typically boasted four gadgets; those over 65 made do with one. The Consumer Electronics Association (CEA) says US$145 billion was expended in the sector in 2006 in the US alone, up 13% on the previous year. The CEA refers joyously to a “consumer love affair with technology continuing at a healthy clip.” In the midst of a recession, 2009 saw $165 billion in sales, and households owned between fifteen and twenty-four gadgets on average. By 2010, US$233 billion was spent on electronic products, three-quarters of the population owned a computer, nearly half of all US adults owned an MP3 player, and 85% had a cell phone. By all measures, the amount of ICT/CE on the planet is staggering. As investigative science journalist, Elizabeth Grossman put it: “no industry pushes products into the global market on the scale that high-tech electronics does” (Maxwell and Miller 2). In 2007, “of the 2.25 million tons of TVs, cell phones and computer products ready for end-of-life management, 18% (414,000 tons) was collected for recycling and 82% (1.84 million tons) was disposed of, primarily in landfill” (Environmental Protection Agency 1). Twenty million computers fell obsolete across the US in 1998, and the rate was 130,000 a day by 2005. It has been estimated that the five hundred million personal computers discarded in the US between 1997 and 2007 contained 6.32 billion pounds of plastics, 1.58 billion pounds of lead, three million pounds of cadmium, 1.9 million pounds of chromium, and 632000 pounds of mercury (Environmental Protection Agency; Basel Action Network and Silicon Valley Toxics Coalition 6). The European Union is expected to generate upwards of twelve million tons annually by 2020 (Commission of the European Communities 17). While refrigerators and dangerous refrigerants account for the bulk of EU e-waste, about 44% of the most toxic e-waste measured in 2005 came from medium-to-small ICT/CE: computer monitors, TVs, printers, ink cartridges, telecommunications equipment, toys, tools, and anything with a circuit board (Commission of the European Communities 31-34). Understanding the enormity of the environmental problems caused by making, using, and disposing of media technologies should arrest our enthusiasm for them. But intellectual correctives to the “love affair” with technology, or technophilia, have come and gone without establishing much of a foothold against the breathtaking flood of gadgets and the propaganda that proclaims their awe-inspiring capabilities.[i] There is a peculiar enchantment with the seeming magic of wireless communication, touch-screen phones and tablets, flat-screen high-definition televisions, 3-D IMAX cinema, mobile computing, and so on—a totemic, quasi-sacred power that the historian of technology David Nye has named the technological sublime (Nye Technological Sublime 297).[ii] We demonstrate in our book why there is no place for the technological sublime in projects to green the media. 
But first we should explain why such symbolic power does not accrue to more mundane technologies; after all, for the time-strapped cook, a pressure cooker does truly magical things. Three important qualities endow ICT/CE with unique symbolic potency—virtuality, volume, and novelty. The technological sublime of media technology is reinforced by the “virtual nature of much of the industry’s content,” which “tends to obscure their responsibility for a vast proliferation of hardware, all with high levels of built-in obsolescence and decreasing levels of efficiency” (Boyce and Lewis 5). Planned obsolescence entered the lexicon as a new “ethics” for electrical engineering in the 1920s and ’30s, when marketers, eager to “habituate people to buying new products,” called for designs to become quickly obsolete “in efficiency, economy, style, or taste” (Grossman 7-8).[iii] This defines the short lifespan deliberately constructed for computer systems (drives, interfaces, operating systems, batteries, etc.) by making tiny improvements incompatible with existing hardware (Science and Technology Council of the American Academy of Motion Picture Arts and Sciences 33-50; Boyce and Lewis). With planned obsolescence leading to “dizzying new heights” of product replacement (Rogers 202), there is an overstated sense of the novelty and preeminence of “new” media—a “cult of the present” is particularly dazzled by the spread of electronic gadgets through globalization (Mattelart and Constantinou 22). References to the symbolic power of media technology can be found in hymnals across the internet and the halls of academe: technologies change us, the media will solve social problems or create new ones, ICTs transform work, monopoly ownership no longer matters, journalism is dead, social networking enables social revolution, and the media deliver a cleaner, post-industrial, capitalism. Here is a typical example from the twilight zone of the technological sublime (actually, the OECD): A major feature of the knowledge-based economy is the impact that ICTs have had on industrial structure, with a rapid growth of services and a relative decline of manufacturing. Services are typically less energy intensive and less polluting, so among those countries with a high and increasing share of services, we often see a declining energy intensity of production … with the emergence of the Knowledge Economy ending the old linear relationship between output and energy use (i.e. partially de-coupling growth and energy use) (Houghton 1) This statement mixes half-truths and nonsense. In reality, old-time, toxic manufacturing has moved to the Global South, where it is ascendant; pollution levels are rising worldwide; and energy consumption is accelerating in residential and institutional sectors, due almost entirely to ICT/CE usage, despite advances in energy conservation technology (a neat instance of the age-old Jevons Paradox). In our book we show how these are all outcomes of growth in ICT/CE, the foundation of the so-called knowledge-based economy. ICT/CE are misleadingly presented as having little or no material ecological impact. In the realm of everyday life, the sublime experience of electronic machinery conceals the physical work and material resources that go into them, while the technological sublime makes the idea that more-is-better palatable, axiomatic; even sexy. 
In this sense, the technological sublime relates to what Marx called “the Fetishism which attaches itself to the products of labour” once they are in the hands of the consumer, who lusts after them as if they were “independent beings” (77). There is a direct but unseen relationship between technology’s symbolic power and the scale of its environmental impact, which the economist Juliet Schor refers to as a “materiality paradox” —the greater the frenzy to buy goods for their transcendent or nonmaterial cultural meaning, the greater the use of material resources (40-41). We wrote Greening the Media knowing that a study of the media’s effect on the environment must work especially hard to break the enchantment that inflames popular and elite passions for media technologies. We understand that the mere mention of the political-economic arrangements that make shiny gadgets possible, or the environmental consequences of their appearance and disappearance, is bad medicine. It’s an unwelcome buzz kill—not a cool way to converse about cool stuff. But we didn’t write the book expecting to win many allies among high-tech enthusiasts and ICT/CE industry leaders. We do not dispute the importance of information and communication media in our lives and modern social systems. We are media people by profession and personal choice, and deeply immersed in the study and use of emerging media technologies. But we think it’s time for a balanced assessment with less hype and more practical understanding of the relationship of media technologies to the biosphere they inhabit. Media consumers, designers, producers, activists, researchers, and policy makers must find new and effective ways to move ICT/CE production and consumption toward ecologically sound practices. In the course of this project, we found in casual conversation, lecture halls, classroom discussions, and correspondence, consistent and increasing concern with the environmental impact of media technology, especially the deleterious effects of e-waste toxins on workers, air, water, and soil. We have learned that the grip of the technological sublime is not ironclad. Its instability provides a point of departure for investigating and criticizing the relationship between the media and the environment. The media are, and have been for a long time, intimate environmental participants. Media technologies are yesterday’s, today’s, and tomorrow’s news, but rarely in the way they should be. The prevailing myth is that the printing press, telegraph, phonograph, photograph, cinema, telephone, wireless radio, television, and internet changed the world without changing the Earth. In reality, each technology has emerged by despoiling ecosystems and exposing workers to harmful environments, a truth obscured by symbolic power and the power of moguls to set the terms by which such technologies are designed and deployed. Those who benefit from ideas of growth, progress, and convergence, who profit from high-tech innovation, monopoly, and state collusion—the military-industrial-entertainment-academic complex and multinational commandants of labor—have for too long ripped off the Earth and workers. As the current celebration of media technology inevitably winds down, perhaps it will become easier to comprehend that digital wonders come at the expense of employees and ecosystems. This will return us to Max Weber’s insistence that we understand technology in a mundane way as a “mode of processing material goods” (27). 
Further to understanding that ordinariness, we can turn to the pioneering conversation analyst Harvey Sacks, who noted three decades ago “the failures of technocratic dreams [:] that if only we introduced some fantastic new communication machine the world will be transformed.” Such fantasies derived from the very banality of these introductions—that every time they took place, one more “technical apparatus” was simply “being made at home with the rest of our world’ (548). Media studies can join in this repetitive banality. Or it can withdraw the welcome mat for media technologies that despoil the Earth and wreck the lives of those who make them. In our view, it’s time to green the media by greening media studies. References “A Cyber-House Divided.” Economist 4 Sep. 2010: 61-62. “Gartner Estimates ICT Industry Accounts for 2 Percent of Global CO2 Emissions.” Gartner press release. 6 April 2007. ‹http://www.gartner.com/it/page.jsp?id=503867›. Basel Action Network and Silicon Valley Toxics Coalition. Exporting Harm: The High-Tech Trashing of Asia. Seattle: Basel Action Network, 25 Feb. 2002. Benjamin, Walter. “Central Park.” Trans. Lloyd Spencer with Mark Harrington. New German Critique 34 (1985): 32-58. Biagioli, Mario. “Postdisciplinary Liaisons: Science Studies and the Humanities.” Critical Inquiry 35.4 (2009): 816-33. Boyce, Tammy and Justin Lewis, eds. Climate Change and the Media. New York: Peter Lang, 2009. Commission of the European Communities. “Impact Assessment.” Commission Staff Working Paper accompanying the Proposal for a Directive of the European Parliament and of the Council on Waste Electrical and Electronic Equipment (WEEE) (recast). COM (2008) 810 Final. Brussels: Commission of the European Communities, 3 Dec. 2008. Environmental Protection Agency. Management of Electronic Waste in the United States. Washington, DC: EPA, 2007 Environmental Protection Agency. Statistics on the Management of Used and End-of-Life Electronics. Washington, DC: EPA, 2008 Grossman, Elizabeth. Tackling High-Tech Trash: The E-Waste Explosion & What We Can Do about It. New York: Demos, 2008. ‹http://www.demos.org/pubs/e-waste_FINAL.pdf› Herat, Sunil. “Review: Sustainable Management of Electronic Waste (e-Waste).” Clean 35.4 (2007): 305-10. Houghton, J. “ICT and the Environment in Developing Countries: Opportunities and Developments.” Paper prepared for the Organization for Economic Cooperation and Development, 2009. International Telecommunication Union. ICTs for Environment: Guidelines for Developing Countries, with a Focus on Climate Change. Geneva: ICT Applications and Cybersecurity Division Policies and Strategies Department ITU Telecommunication Development Sector, 2008. Malmodin, Jens, Åsa Moberg, Dag Lundén, Göran Finnveden, and Nina Lövehagen. “Greenhouse Gas Emissions and Operational Electricity Use in the ICT and Entertainment & Media Sectors.” Journal of Industrial Ecology 14.5 (2010): 770-90. Marx, Karl. Capital: Vol. 1: A Critical Analysis of Capitalist Production, 3rd ed. Trans. Samuel Moore and Edward Aveling, Ed. Frederick Engels. New York: International Publishers, 1987. Mattelart, Armand and Costas M. Constantinou. “Communications/Excommunications: An Interview with Armand Mattelart.” Trans. Amandine Bled, Jacques Guot, and Costas Constantinou. Review of International Studies 34.1 (2008): 21-42. Mattelart, Armand. “Cómo nació el mito de Internet.” Trans. Yanina Guthman. El mito internet. Ed. Victor Hugo de la Fuente. Santiago: Editorial aún creemos en los sueños, 2002. 25-32. 
Maxwell, Richard and Toby Miller. Greening the Media. New York: Oxford University Press, 2012. Nye, David E. American Technological Sublime. Cambridge, Mass.: MIT Press, 1994. Nye, David E. Technology Matters: Questions to Live With. Cambridge, Mass.: MIT Press. 2007. Orwell, George. “As I Please.” Tribune. 12 May 1944. Richtel, Matt. “Consumers Hold on to Products Longer.” New York Times: B1, 26 Feb. 2011. Robinson, Brett H. “E-Waste: An Assessment of Global Production and Environmental Impacts.” Science of the Total Environment 408.2 (2009): 183-91. Rogers, Heather. Gone Tomorrow: The Hidden Life of Garbage. New York: New Press, 2005. Sacks, Harvey. Lectures on Conversation. Vols. I and II. Ed. Gail Jefferson. Malden: Blackwell, 1995. Schiller, Herbert I. Information and the Crisis Economy. Norwood: Ablex Publishing, 1984. Schor, Juliet B. Plenitude: The New Economics of True Wealth. New York: Penguin, 2010. Science and Technology Council of the American Academy of Motion Picture Arts and Sciences. The Digital Dilemma: Strategic Issues in Archiving and Accessing Digital Motion Picture Materials. Los Angeles: Academy Imprints, 2007. Weber, Max. “Remarks on Technology and Culture.” Trans. Beatrix Zumsteg and Thomas M. Kemple. Ed. Thomas M. Kemple. Theory, Culture [i] The global recession that began in 2007 has been the main reason for some declines in Global North energy consumption, slower turnover in gadget upgrades, and longer periods of consumer maintenance of electronic goods (Richtel). [ii] The emergence of the technological sublime has been attributed to the Western triumphs in the post-Second World War period, when technological power supposedly supplanted the power of nature to inspire fear and astonishment (Nye Technology Matters 28). Historian Mario Biagioli explains how the sublime permeates everyday life through technoscience: "If around 1950 the popular imaginary placed science close to the military and away from the home, today’s technoscience frames our everyday life at all levels, down to our notion of the self" (818). [iii] This compulsory repetition is seemingly undertaken each time as a novelty, governed by what German cultural critic Walter Benjamin called, in his awkward but occasionally illuminating prose, "the ever-always-the-same" of "mass-production" cloaked in "a hitherto unheard-of significance" (48).
APA, Harvard, Vancouver, ISO, and other styles
47

Horrigan, Matthew. "A Flattering Robopocalypse." M/C Journal 23, no. 6 (2020). http://dx.doi.org/10.5204/mcj.2726.

Full text
Abstract:
RACHAEL. It seems you feel our work is not a benefit to the public.
DECKARD. Replicants are like any other machine. They're either a benefit or a hazard. If they're a benefit it's not my problem.
RACHAEL. May I ask you a personal question?
DECKARD. Yes.
RACHAEL. Have you ever retired a human by mistake? (Scott 17:30)

CAPTCHAs (henceforth "captchas") are commonplace on today's Internet. Their purpose seems clear: block malicious software, allow human users to pass. But as much as they exclude spambots, captchas often exclude humans with visual and other disabilities (Dzieza; W3C Working Group). Worse yet, more and more advanced captcha-breaking technology has resulted in more and more challenging captchas, raising the barrier between online services and those who would access them. In the words of inclusive design advocate Robin Christopherson, "CAPTCHAs are evil". In this essay I describe how the captcha industry implements a posthuman process that speculative fiction has gestured toward but not grasped. The hostile posthumanity of captcha is not just a technical problem, nor just a problem of usability or access. Rather, captchas convey a design philosophy that asks humans to prove themselves by performing well at disembodied games. This philosophy has its roots in the Turing Test itself, whose terms guide speculation away from the real problems that today's authentication systems present. Drawing the concept of "procedurality" from game studies, I argue that, despite a design goal of separating machines and humans to the benefit of the latter, captchas actually and ironically produce an arms race in which humans have a systematic and increasing disadvantage. This arms race results from the Turing Test's equivocation between human and machine bodies, an assumption whose influence I identify in popular film, science fiction literature, and captcha design discourse.

The Captcha Industry and Its Side-Effects

Exclusion is an essential function of every cybersecurity system. From denial-of-service attacks to data theft, toxic automated entities constantly seek admission to services they would damage. To remain functional and accessible, Websites need security systems to keep out "abusive agents" (Shet). In cybersecurity, the term "user authentication" refers to the process of distinguishing between abusive agents and welcome users (Jeng et al.). Of the many available authentication techniques, CAPTCHA, "Completely Automated Public Turing test[s] to tell Computers and Humans Apart" (Von Ahn et al. 1465), is one of the most iconic. Although some captchas display a simple checkbox beside a disclaimer to the effect that "I am not a robot" (Shet), these frequently give way to more difficult alternatives: perception tests (fig. 1). Test captchas may show sequences of distorted letters, which a user is supposed to recognise and then type in (Godfrey). Others effectively digitize a game of "I Spy": an image appears, with an instruction to select the parts of it that show a specific type of object (Zhu et al.). A newer type of captcha involves icons rotated upside-down or sideways, the task being to right them (Gossweiler et al.). These latter developments show the influence of gamification (Kani and Nishigaki; Kumar et al.), the design trend where game-like elements figure in serious tasks.
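To make these designs concrete, the sketch below reduces the distorted-letter style of challenge just described to its server-side core: issue a random string, remember the expected answer, and compare the typed response. It is an illustrative sketch only; real deployments render the string as a distorted image and add expiry, rate limiting, and audio alternatives, and every name and parameter here is an assumption rather than any particular vendor's API.

```python
# Minimal, hypothetical sketch of the server-side logic behind a
# distorted-letter captcha: generate a challenge, keep the answer,
# check the user's typed response. Real systems render the string as a
# distorted image and add expiry, rate limiting, and audio fallbacks.
import hmac
import secrets
import string

# Ambiguous glyphs (0/O, 1/I/L) are commonly dropped to reduce false failures.
ALPHABET = "".join(c for c in string.ascii_uppercase + string.digits
                   if c not in "0O1IL")

def new_challenge(length: int = 6) -> str:
    return "".join(secrets.choice(ALPHABET) for _ in range(length))

def check_answer(expected: str, typed: str) -> bool:
    # Case-insensitive, constant-time comparison.
    return hmac.compare_digest(expected.upper(), typed.strip().upper())

challenge = new_challenge()   # would be rendered as a distorted image
print("challenge:", challenge)
print("pass:", check_answer(challenge, challenge.lower()))
print("fail:", check_answer(challenge, "000000"))
```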
Fig. 1: A series of captchas followed by multifactor authentication as a "quick security check" during the author's suspicious attempt to access LinkedIn over a Virtual Private Network

Gamified captchas, in using tests of ability to tell humans from computers, invite three problems, of which only the first has received focussed critical attention. I discuss each briefly below, and at greater length in subsequent sections. First, as many commentators have pointed out (W3C Working Group), captchas can accidentally categorise real humans as nonhumans—a technical problem that becomes more likely as captcha-breaking technologies improve (e.g. Tam et al.; Brown et al.). Indeed, the design and breaking of captchas has become an almost self-sustaining subfield in computer science, as researchers review extant captchas, publish methods for breaking them, and publish further captcha designs (e.g. Weng et al.). Such research fuels an industry of captcha-solving services (fig. 2), of which some use automated techniques, and some are "human-powered", employing groups of humans to complete large numbers of captchas, thus clearing the way for automated incursions (Motoyama et al. 2). Captchas now face the quixotic task of using ability tests to distinguish legitimate users from abusers with similar abilities.

Fig. 2: Captcha production and captcha breaking: a feedback loop

Second, gamified captchas import the feelings of games. When they defeat a real human, the human seems not to have encountered the failure state of an automated procedure, but rather to have lost, or given up on, a game. The same frame of "gameful"-ness (McGonigal, under "Happiness Hacking") or "gameful work" (under "The Rise of the Happiness Engineers"), supposed to flatter users with a feeling of reward or satisfaction when they complete a challenge, has a different effect in the event of defeat. Gamefulness shifts the fault from procedure to human, suggesting, for the latter, the shameful status of loser. Third, like games, gamified captchas promote a particular strain of logic. Just as other forms of media can be powerful venues for purveying stereotypes, so are gamified captchas, in this case conveying the notion that ability is a legitimate means, not only of apportioning privilege, but of humanising and dehumanising. Humanity thus appears as a status earned, and disability appears not as a stigma, nor an occurrence, but an essence. The latter two problems emerge because the captcha reveals, propagates and naturalises an ideology through mechanised procedures. Below I invoke the concept of "procedural rhetoric" to critique the disembodied notion of humanity that underlies both the original Turing Test and the "Completely Automated Public Turing test." Both tests, I argue, ultimately play to the disadvantage of their human participants.

Rhetorical Games, Procedural Rhetoric

When videogame studies emerged as an academic field in the early 2000s, one of its first tasks was to legitimise games relative to other types of artefact, especially literary texts (Eskelinen; Aarseth). Scholars sought a framework for discussing how video games, like other more venerable media, can express ideas (Weise). Janet Murray and Ian Bogost looked to the notion of procedure, devising the concepts of "procedurality" (Bogost 3), "procedural authorship" (Murray 171), and "procedural rhetoric" (Bogost 1). From a proceduralist perspective, a videogame is both an object and a medium for inscribing processes.
Those processes have two basic types: procedures the game's developers have authored, which script the behaviour of the game as a computer program; and procedures human players respond with, the "operational logic" of gameplay (Bogost 13). Procedurality's two types of procedure, the computerised and the human, have a kind of call-and-response relationship, where the behaviour of the machine calls upon players to respond with their own behaviour patterns. Games thus train their players. Through the training that is play, players acquire habits they bring to other contexts, giving videogames the power not only to express ideas but "disrupt and change fundamental attitudes and beliefs about the world, leading to potentially significant long-term social change" (Bogost ix). That social change can be positive (McGonigal), or it can involve "dark patterns", cases where game procedures provoke and exploit harmful behaviours (Zagal et al.). For example, embedded in many game paradigms is the procedural rhetoric of "toxic meritocracy" (Paul 66), where players earn rewards, status and personal improvement by overcoming challenges, and, especially, excelling where others fail. While meritocracy may seem logical within a strictly competitive arena, its effect in a broader cultural context is to legitimise privileges as the spoils of victory, and maltreatment as the just result of defeat. As game design has influenced other fields, so too has procedurality's applicability expanded. Gamification, "the use of game design elements in non-game contexts" (Deterding et al. 9), is a popular trend in which designers seek to imbue diverse tasks with some of the enjoyment of playing a game (10). Gamification discourse has drawn heavily upon Mihaly Csikszentmihalyi's "positive psychology" (Seligman and Csikszentmihalyi), and especially the speculative psychology of flow (Csikszentmihalyi 51), which promise enormously broad benefits for individuals acting in the "flow state" that challenging play supposedly promotes (75). Gamification has become a celebrated cause, advocated by a group of scholars and designers Sebastian Deterding calls the "Californian league of gamification evangelists" (120), before becoming an object of critical scrutiny (Fuchs et al.). Where gamification goes, it brings its dark patterns with it. In gamified user authentication (Kroeze and Olivier), and particularly gamified captcha, there occurs an intersection of deceptively difficult games, real-world stakes, and users whose differences go often ignored. The Disembodied Arms Race In captcha design research, the concept of disability occurs under the broader umbrella of usability. Usability studies emphasise the fact that some technology pieces are easier to access than others (Yan and El Ahmad). Disability studies, in contrast, emphasises the fact that different users have different capacities to overcome access barriers. Ability is contextual, an intersection of usability and disability, use case and user (Reynolds 443). When used as an index of humanness, ability yields illusive results. In Posthuman Knowledge, Rosi Braidotti begins her conceptual enquiry into the posthuman condition with a contemplation of captcha, asking what it means to tick that checkbox claiming that "I am not a robot" (8), and noting the baffling multiplicity of possible answers. 
From a practical angle, Junya Kani and Masakatsu Nishigaki write candidly about the problem of distinguishing robot from human: "no matter how advanced malicious automated programs are, a CAPTCHA that will not pass automated programs is required. Hence, we have to find another human cognitive processing capability to tackle this challenge" (40). Kani and Nishigaki try out various human cognitive processing capabilities for the task. Narrative comprehension and humour become candidates: might a captcha ascribe humanity based on human users' ability to determine the correct order of scenes in a film (43)? What about panels in a cartoon (40)? As they seek to assess the soft skills of machines, Kani and Nishigaki set up a drama similar to that of Philip K. Dick's Do Androids Dream of Electric Sheep. Do Androids Dream of Electric Sheep, and its film adaptation, Blade Runner (Scott), describe a spacefaring society populated by both humans and androids. Androids have lesser legal privileges than humans, and in particular face execution—euphemistically called "retirement"—for trespassing on planet Earth (Dick 60). Blade Runner gave these androids their more famous name: "replicant". Replicants mostly resemble humans in thought and action, but are reputed to lack the capacity for empathy, so human police, seeking a cognitive processing capability unique to humans, test for empathy to test for humanness (30). But as with captchas, Blade Runner's testing procedure depends upon an automated device whose effectiveness is not certain, prompting the haunting question: "have you ever retired a human by mistake?" (Scott 17:50). Blade Runner's empathy test is part of a long philosophical discourse about the distinction between human and machine (e.g. Putnam; Searle). At the heart of the debate lies Alan Turing's "Turing Test", which a machine hypothetically passes when it can pass itself off as a human conversationalist in an exchange of written text. Turing's motivation for coming up with the test goes: there may be no absolute way of defining what makes a human mind, so the best we can do is assess a computer's ability to imitate one (Turing 433). The aporia, however—how can we determine what makes a human mind?—is the result of an unfair question. Turing's test, dealing only with information expressed in strings of text, purposely disembodies both humans and machines. The Blade Runner universe similarly evens the playing field: replicants look, feel and act like humans to such an extent that distinguishing between the two becomes, again, the subject of a cognition test. The Turing Test, obsessed with information processing and steeped in mind-body dualism, assesses humanness using criteria that automated users can master relatively easily. In contrast, in everyday life, I use a suite of much more intuitive sensory tests to distinguish between my housemate and my laptop. My intuitions capture what the Turing Test masks: a human is a fleshy entity, possessed of the numerous trappings and capacities of a human body. The result of the automated Turing Test's focus on cognition is an arms race that places human users at an increasing disadvantage. Loss, in such a race, manifests not only as exclusion by and from computer services, but as a redefinition of proper usership, the proper behaviour of the authentic, human, user. Thus the Turing Test implicitly provides for a scenario where a machine becomes able to super-imitate humanness: to be perceived as human more often than a real human would be. 
In such an outcome, it would be the human conversationalist who would begin to fail the Turing Test; to fail to pass themself off according to new criteria for authenticity. This scenario is possible because, through procedural rhetoric, machines shift human perspectives: about what is and is not responsible behaviour; about what humans should and should not feel when confronted with a challenge; about who does and does not deserve access; and, fundamentally, about what does and does not signify authentic usership.

In captcha, as in Blade Runner, it is ultimately a machine that adjudicates between human and machine cognition. As users we rely upon this machine to serve our interests, rather than pursue some emergent automated interest, some by-product of the feedback loop that results from the ideologies of human researchers both producing and being produced by mechanised procedures. In the case of captcha, that faith is misplaced.

The Feeling of Robopocalypse

A rich repertory of fiction has speculated upon what novelist Daniel Wilson calls the "Robopocalypse", the scenario where machines overthrow humankind. Most versions of the story play out as a slave-owner's nightmare, featuring formerly servile entities (which happen to be machines) violently revolting and destroying the civilisation of their masters. Blade Runner's rogue replicants, for example, are effectively fugitive slaves (Dihal 196). Popular narratives of robopocalypse, despite showing their antagonists as lethal robots, are fundamentally human stories with robots playing some of the parts.

In contrast, the exclusion a captcha presents when it defeats a human is not metaphorical or emancipatory. There, in that moment, is a mechanised entity defeating a human. The defeat takes place within an authoritative frame that hides its aggression. For a human user, to be defeated by a captcha is to fail to meet an apparently common standard, within the framework of a common procedure. This is a robopocalypse of baffling systems rather than anthropomorphic soldiers.

Likewise, non-human software clients pose threats that humanoid replicants do not. In particular, software clients replicate much faster than physical bodies. The sheer sudden scale of a denial-of-service attack makes Philip K. Dick's vision of android resistance seem quaint. The task of excluding unauthorised software, unlike the impulse to exclude replicants, is more a practical necessity than an exercise in colonialism.

Nevertheless, dystopia finds its way into the captcha process through the peril inherent in the test, whenever humans are told apart from authentic users. This is the encroachment of the hostile posthuman, naturalised by us before it denaturalises us. The hostile posthuman sometimes manifests as a drone strike, Terminator-esque (Cameron), a dehumanised decision to kill (Asaro). But it is also a process of gradual exclusion, detectable from moment to moment as a feeling of disdain or impatience for the irresponsibility, incompetence, or simply unusualness of a human who struggles to keep afloat amid a rising standard. "We are in this together", Braidotti writes, "between the algorithmic devil and the acidified deep blue sea" (9). But we are also in this separately, divided along lines of ability. Captcha's danger, as a broken procedure, hides in plain sight, because it lashes out at some only while continuing to flatter others with a game that they can still win.
Conclusion

Online security systems may always have to define some users as legitimate and others as illegitimate. Is there a future where they do so on the basis of behaviour rather than identity or essence? Might some future system accord each user, human or machine, the same authentic status, and provide all with an initial benefit of the doubt? In the short term, such a system would seem grossly impractical. The type of user that most needs to be excluded is the disembodied type, the type that can generate orders of magnitude more demands than a human, that can proliferate suddenly and in immense number because it does not lag behind the slow processes of human bodies. This type of user exists in software alone.

Rich in irony, then, is the captcha paradigm, which depends on the disabilities of the threats it confronts. We dread malicious software not for its disabilities—which are momentary and all too human—but for its abilities. Attenuating the threat presented by those abilities requires inverting a habit that meritocracy trains and overtrains: specifically, we have here a case where the plight of the human user calls for negative action toward ability rather than disability.
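For concreteness, one way to read the question of behaviour rather than essence is in terms of mechanisms that already exist, such as rate limiting. The sketch below is a hypothetical token-bucket limiter with illustrative names and numbers; it grants every client, human or bot, the same initial benefit of the doubt and refuses only on observed demand, though it plainly does not solve the problem of distributed, disembodied scale noted above.

```python
# A minimal sketch of behaviour-based exclusion: a token-bucket rate limiter
# that gives every client, human or bot, the same initial benefit of the doubt
# and throttles only on observed demand. Names and limits are hypothetical.
import time
from typing import Dict


class TokenBucket:
    def __init__(self, capacity: float = 10.0, refill_per_second: float = 1.0):
        self.capacity = capacity
        self.refill_per_second = refill_per_second
        self.tokens = capacity          # everyone starts with full credit
        self.last_refill = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        elapsed = now - self.last_refill
        self.tokens = min(self.capacity,
                          self.tokens + elapsed * self.refill_per_second)
        self.last_refill = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False                    # refused for what it does, not what it is


buckets: Dict[str, TokenBucket] = {}


def admit(client_id: str) -> bool:
    """Admit or refuse a request purely on the client's recent behaviour."""
    return buckets.setdefault(client_id, TokenBucket()).allow()
```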
References

Aarseth, Espen. "Computer Game Studies, Year One." Game Studies 1.1 (2001): 1–15.
Asaro, Peter. "On Banning Autonomous Weapon Systems: Human Rights, Automation, and the Dehumanization of Lethal Decision-Making." International Review of the Red Cross 94.886 (2012): 687–709.
Blade Runner. Dir. Ridley Scott. Warner Bros, 1982.
Bogost, Ian. Persuasive Games: The Expressive Power of Videogames. Cambridge, MA: MIT Press, 2007.
Braidotti, Rosi. Posthuman Knowledge. Cambridge: Polity Press, 2019.
Brown, Samuel S., et al. "I Am 'Totally' Human: Bypassing the Recaptcha." 13th International Conference on Signal-Image Technology & Internet-Based Systems (SITIS), 2017.
Christopherson, Robin. "AI Is Making CAPTCHA Increasingly Cruel for Disabled Users." AbilityNet, 2019. 17 Sep. 2020 <https://abilitynet.org.uk/news-blogs/ai-making-captcha-increasingly-cruel-disabled-users>.
Csikszentmihalyi, Mihaly. Flow: The Psychology of Optimal Experience. New York: Harper & Row, 1990.
Deterding, Sebastian. "Eudaimonic Design, Or: Six Invitations to Rethink Gamification." Rethinking Gamification. Eds. Mathias Fuchs et al. Lüneburg: Meson Press, 2014.
Deterding, Sebastian, et al. "From Game Design Elements to Gamefulness: Defining Gamification." Proceedings of the 15th International Academic MindTrek Conference: Envisioning Future Media Environments. ACM, 2011.
Dick, Philip K. Do Androids Dream of Electric Sheep. 1968. New York: Del Rey, 1996.
Dihal, Kanta. "Artificial Intelligence, Slavery, and Revolt." AI Narratives: A History of Imaginative Thinking about Intelligent Machines. Eds. Stephen Cave, Kanta Dihal, and Sarah Dillon. 2020. 189–212.
Dzieza, Josh. "Why Captchas Have Gotten So Difficult." The Verge, 2019. 17 Sep. 2020 <https://www.theverge.com/2019/2/1/18205610/google-captcha-ai-robot-human-difficult-artificial-intelligence>.
Eskelinen, Markku. "Towards Computer Game Studies." Digital Creativity 12.3 (2001): 175–83.
Fuchs, Mathias, et al., eds. Rethinking Gamification. Lüneburg: Meson Press, 2014.
Godfrey, Philip Brighten. "Text-Based CAPTCHA Algorithms." First Workshop on Human Interactive Proofs, 15 Dec. 2001. 14 Nov. 2020 <http://www.aladdin.cs.cmu.edu/hips/events/abs/godfreyb_abstract.pdf>.
Gossweiler, Rich, et al. "What's Up CAPTCHA? A CAPTCHA Based on Image Orientation." Proceedings of the 18th International Conference on World Wide Web. WWW, 2009.
Jeng, Albert B., et al. "A Study of CAPTCHA and Its Application to User Authentication." International Conference on Computational Collective Intelligence. Springer, 2010.
Kani, Junya, and Masakatsu Nishigaki. "Gamified Captcha." International Conference on Human Aspects of Information Security, Privacy, and Trust. Springer, 2013.
Kroeze, Christien, and Martin S. Olivier. "Gamifying Authentication." 2012 Information Security for South Africa. IEEE, 2012.
Kumar, S. Ashok, et al. "Gamification of Internet Security by Next Generation Captchas." 2017 International Conference on Computer Communication and Informatics (ICCCI). IEEE, 2017.
McGonigal, Jane. Reality Is Broken: Why Games Make Us Better and How They Can Change the World. Penguin, 2011.
Motoyama, Marti, et al. "Re: Captchas – Understanding CAPTCHA-Solving Services in an Economic Context." USENIX Security Symposium. 2010.
Murray, Janet. Hamlet on the Holodeck: The Future of Narrative in Cyberspace. New York: The Free Press, 1997.
Paul, Christopher A. The Toxic Meritocracy of Video Games: Why Gaming Culture Is the Worst. University of Minnesota Press, 2018.
Putnam, Hilary. "Robots: Machines or Artificially Created Life?" The Journal of Philosophy 61.21 (1964): 668–91.
Reynolds, Joel Michael. "The Meaning of Ability and Disability." The Journal of Speculative Philosophy 33.3 (2019): 434–47.
Searle, John. "Minds, Brains, and Programs." Behavioral and Brain Sciences 3.3 (1980): 417–24.
Seligman, Martin, and Mihaly Csikszentmihalyi. "Positive Psychology: An Introduction." Flow and the Foundations of Positive Psychology. 2000. Springer, 2014. 279–98.
Shet, Vinay. "Are You a Robot? Introducing No Captcha Recaptcha." Google Security Blog 3 (2014): 12.
Tam, Jennifer, et al. "Breaking Audio Captchas." Proceedings of the 21st International Conference on Neural Information Processing Systems. ACM, 2008. 1625–32.
The Terminator. Dir. James Cameron. Orion, 1984.
Turing, Alan. "Computing Machinery and Intelligence." Mind 59.236 (1950).
Von Ahn, Luis, et al. "Recaptcha: Human-Based Character Recognition via Web Security Measures." Science 321.5895 (2008): 1465–68.
W3C Working Group. "Inaccessibility of CAPTCHA: Alternatives to Visual Turing Tests on the Web." W3C, 2019. 17 Sep. 2020 <https://www.w3.org/TR/turingtest/>.
Weise, Matthew. "How Videogames Express Ideas." DiGRA Conference. 2003.
Weng, Haiqin, et al. "Towards Understanding the Security of Modern Image Captchas and Underground Captcha-Solving Services." Big Data Mining and Analytics 2.2 (2019): 118–44.
Wilson, Daniel H. Robopocalypse. New York: Doubleday, 2011.
Yan, Jeff, and Ahmad Salah El Ahmad. "Usability of Captchas or Usability Issues in CAPTCHA Design." Proceedings of the 4th Symposium on Usable Privacy and Security. 2008.
Zagal, José P., Staffan Björk, and Chris Lewis. "Dark Patterns in the Design of Games." 8th International Conference on the Foundations of Digital Games. 2013. 25 Aug. 2020 <http://soda.swedish-ict.se/5552/1/DarkPatterns.1.1.6_cameraready.pdf>.
Zhu, Bin B., et al. "Attacks and Design of Image Recognition Captchas." Proceedings of the 17th ACM Conference on Computer and Communications Security. 2010.