
Journal articles on the topic 'Central authentication service'


Consult the top 43 journal articles for your research on the topic 'Central authentication service.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse journal articles on a wide variety of disciplines and organise your bibliography correctly.

1

Li, Wei, Yuan Bo Zhu, and Ming Zou. "Applied Research of Single Sign on Technology in Cloud Services." Applied Mechanics and Materials 602-605 (August 2014): 3552–55. http://dx.doi.org/10.4028/www.scientific.net/amm.602-605.3552.

Abstract:
The industrial cloud service platform launched a series of industrial services based on cloud computing for small and medium-sized enterprises. These services include CAD, CAE and other software services used in industrial design and manufacturing. Because the platform integrates various application systems, an efficient authentication technology was urgently required to realize the exchange and sharing of identity information across the platform. Based on an analysis and comparison of the mainstream Single Sign-On (SSO) technologies, this paper proposes an SSO solution using the Central Authentication Service (CAS) protocol. The proposed system makes login authentication and retrieval of identity information easy and convenient across the whole platform, and it significantly improves the user experience.
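
As a concrete illustration of the CAS flow the paper adopts: after login, the CAS server redirects the user back to the application with a service ticket, and the application validates that ticket against the CAS server's /serviceValidate endpoint. A minimal Python sketch, assuming a hypothetical server URL and the standard CAS 2.0 XML response:

import requests
import xml.etree.ElementTree as ET

CAS_SERVER = "https://cas.example.edu/cas"  # hypothetical CAS server

def validate_service_ticket(ticket: str, service_url: str) -> str | None:
    """Validate a CAS service ticket; return the username on success."""
    resp = requests.get(
        f"{CAS_SERVER}/serviceValidate",
        params={"ticket": ticket, "service": service_url},
        timeout=5,
    )
    ns = {"cas": "http://www.yale.edu/tp/cas"}
    root = ET.fromstring(resp.text)
    success = root.find("cas:authenticationSuccess", ns)
    if success is not None:
        return success.find("cas:user", ns).text
    return None  # ticket rejected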
2

Orbán, Anna, and Annamária Beláz. "eIdentification - Renewable regulated electronic administration services." Central and Eastern European eDem and eGov Days 325 (February 14, 2018): 463–76. http://dx.doi.org/10.24989/ocg.v325.38.

Abstract:
Since 2005, Hungary has had a comprehensive central identification solution. The Client Gate is capable of identifying citizens for any public authority that connects to it, and it is a very popular and useful identification tool for electronic transactions: today approximately 2.4 million clients have a Client Gate account. However, many citizens have an aversion to online administration; they can choose in-person administration or use their mobile phone instead. Since the beginning of 2016, the new electronic ID card has integrated personal identification, social security and tax identification information, and it is also suitable for providing an electronic signature. Two new identification options are thus available to citizens: the newly introduced national eID card and Partial Code Telephone Authentication. Within about half a year, roughly a million new eID cards were issued to citizens, whereas telephone authentication has proven less popular. Since 2012, the new electronic administration in Hungary has been based on the "Regulated Electronic Administrative Services" (Hungarian abbreviation SZEÜSZ). A new central identification solution, the Central Authentication Agent, has been launched as one of the Regulated Electronic Administrative Services; it supports the use of different electronic identification and authentication services. The familiar Client Gate has now been replaced by the Central Authentication Agent in the Web Assistant application to implement fully electronic administration procedures. The aim of the study is to present the experience with the various methods of identification through comparative analysis.
3

Niewolski, Wojciech, Tomasz W. Nowak, Mariusz Sepczuk, and Zbigniew Kotulski. "Token-Based Authentication Framework for 5G MEC Mobile Networks." Electronics 10, no. 14 (July 18, 2021): 1724. http://dx.doi.org/10.3390/electronics10141724.

Abstract:
MEC technology provides a distributed computing environment in 5G mobile networks for application and service hosting. It allows customers with different requirements and professional competencies to use the services offered by external suppliers. We consider a service access control framework on 5G MEC networks that is efficient, flexible, and user-friendly. Its central element is the MEC Enabler, which handles AAA requests for stakeholders accessing services hosted on the edge servers. The JSON Web Token (JWT) open standard is a suitable tool for the MEC Enabler to manage access control credentials and transfer them securely between parties. In this paper, in the context of access control, we propose the token reference pattern called JSON MEC Access Token (JMAT) and analyze the effectiveness of its available protection methods in compliance with the standard requirements of MEC-hosted services in 5G networks.
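
To make the JWT approach concrete, here is a hedged sketch of issuing and verifying a short-lived access token with the PyJWT library. It illustrates the generic JWT mechanics, not the paper's JMAT profile itself; the claim names and shared secret are assumptions (real deployments would typically use asymmetric keys):

import time
import jwt  # PyJWT

SECRET = "mec-enabler-demo-secret"  # illustrative only

def issue_token(subject: str, service: str) -> str:
    """Issue a short-lived access token for a MEC-hosted service."""
    claims = {
        "sub": subject,
        "aud": service,                 # the edge service the token is for
        "iat": int(time.time()),
        "exp": int(time.time()) + 300,  # 5-minute lifetime
    }
    return jwt.encode(claims, SECRET, algorithm="HS256")

def verify_token(token: str, service: str) -> dict:
    """Raises jwt.InvalidTokenError if the signature, audience or expiry is bad."""
    return jwt.decode(token, SECRET, algorithms=["HS256"], audience=service)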
4

KIM, YOUNG-GAB, CHANG-JOO MOON, DONGWON JEONG, and DOO-KWON BAIK. "FORMAL VERIFICATION OF BUNDLE AUTHENTICATION MECHANISM IN OSGi SERVICE PLATFORM: BAN LOGIC." International Journal of Software Engineering and Knowledge Engineering 16, no. 02 (April 2006): 153–73. http://dx.doi.org/10.1142/s0218194006002793.

Abstract:
Security is critical in a home gateway environment. Robust security mechanisms must be put in place to protect information transferred through a central location. Considering the characteristics of the home gateway environment, this paper proposes a bundle authentication mechanism. We designed a key exchange mechanism for transferring a shared secret key; it transports a service bundle safely in the bootstrapping step by recognizing and initializing the various components. We then propose a bundle authentication mechanism based on a MAC that uses the shared secret key created in the bootstrapping step. In addition, we verify the safety of the key exchange mechanism and the bundle authentication mechanism using BAN logic. The verification shows that the goals of authentication are achieved: the operator can trust the bundle provided by the service provider, and the user of the service gateway can likewise trust and use the bundle provided by the operator.
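
MAC-based bundle authentication of the kind described can be illustrated with Python's standard library. This sketch assumes the shared key has already been established in the bootstrapping step, which is not modeled here:

import hmac
import hashlib

def sign_bundle(bundle_bytes: bytes, shared_key: bytes) -> bytes:
    """Compute a MAC over the service bundle with the bootstrapped shared key."""
    return hmac.new(shared_key, bundle_bytes, hashlib.sha256).digest()

def verify_bundle(bundle_bytes: bytes, tag: bytes, shared_key: bytes) -> bool:
    """Constant-time comparison prevents timing side channels."""
    expected = hmac.new(shared_key, bundle_bytes, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)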
5

Brandão, Luís T. A. N., Nicolas Christin, and George Danezis. "Toward Mending Two Nation-Scale Brokered Identification Systems." Proceedings on Privacy Enhancing Technologies 2015, no. 2 (June 1, 2015): 135–55. http://dx.doi.org/10.1515/popets-2015-0022.

Abstract:
Available online public/governmental services requiring authentication by citizens have considerably expanded in recent years. This has hindered the usability and security associated with credential management by users and service providers. To address the problem, some countries have proposed nation-scale identification/authentication systems that intend to greatly reduce the burden of credential management, while seemingly offering desirable privacy benefits. In this paper we analyze two such systems: the Federal Cloud Credential Exchange (FCCX) in the United States and GOV.UK Verify in the United Kingdom, which altogether aim at serving more than a hundred million citizens. Both systems propose a brokered identification architecture, where an online central hub mediates user authentications between identity providers and service providers. We show that both FCCX and GOV.UK Verify suffer from serious privacy and security shortcomings, fail to comply with privacy-preserving guidelines they are meant to follow, and may actually degrade user privacy. Notably, the hub can link interactions of the same user across different service providers and has visibility over private identifiable information of citizens. In case of malicious compromise it is also able to undetectably impersonate users. Within the structural design constraints placed on these nation-scale brokered identification systems, we propose feasible technical solutions to the privacy and security issues we identified. We conclude with a strong recommendation that FCCX and GOV.UK Verify be subject to a more in-depth technical and public review, based on a defined and comprehensive threat model, and adopt adequate structural adjustments.
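
A standard building block for limiting cross-provider linkability in brokered identification (a generic mitigation for illustration, not the specific fix the authors propose, and one that does not by itself blind the hub, which still holds the secret) is to hand each service provider a pairwise pseudonym instead of a global user identifier:

import hmac
import hashlib

HUB_SECRET = b"hub-pseudonym-key"  # illustrative; held only by the hub

def pairwise_pseudonym(user_id: str, service_provider_id: str) -> str:
    """Derive a stable, provider-specific pseudonym for the user.
    Two service providers receive unlinkable identifiers for the same user,
    so their records cannot be joined by comparing identifiers."""
    msg = f"{user_id}|{service_provider_id}".encode()
    return hmac.new(HUB_SECRET, msg, hashlib.sha256).hexdigest()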
6

Kim, MyeongHyun, KiSung Park, SungJin Yu, JoonYoung Lee, YoungHo Park, Sang-Woo Lee, and BoHeung Chung. "A Secure Charging System for Electric Vehicles Based on Blockchain." Sensors 19, no. 13 (July 9, 2019): 3028. http://dx.doi.org/10.3390/s19133028.

Abstract:
Smart grids incorporating the internet-of-things are emerging solutions to provide a reliable, sustainable and efficient electricity supply, and electric vehicle drivers can access efficient charging services in the smart grid. However, traditional electric vehicle charging systems are vulnerable to distributed denial-of-service and privileged-insider attacks when the central charging server is attacked. Blockchain-based charging systems have been proposed to resolve these problems. In 2018, Huang et al. proposed an electric vehicle charging system using the lightning network and smart contracts. However, their system has an inefficient charging mechanism and does not guarantee the security of keys. We propose a secure charging system for electric vehicles based on blockchain to resolve these security flaws. Our charging system ensures key security, secure mutual authentication, anonymity, and perfect forward secrecy, and also provides efficient charging. We demonstrate that our proposed system provides secure mutual authentication using Burrows–Abadi–Needham logic and prevents replay and man-in-the-middle attacks using the Automated Validation of Internet Security Protocols and Applications (AVISPA) simulation tool. Furthermore, we compare computation and communication costs with previous schemes. The proposed charging system therefore applies efficiently to practical charging systems for electric vehicles.
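
The mutual-authentication property being verified can be illustrated by a generic shared-key challenge-response exchange. This is an illustration of the property only, not the paper's blockchain-based protocol:

import hmac
import hashlib
import os

def respond(shared_key: bytes, challenge: bytes, party_id: bytes) -> bytes:
    """Prove knowledge of the shared key over a fresh challenge."""
    return hmac.new(shared_key, party_id + challenge, hashlib.sha256).digest()

# Vehicle and charging server each verify the other over fresh nonces,
# so replayed responses from old sessions are rejected.
key = os.urandom(32)
n_server, n_vehicle = os.urandom(16), os.urandom(16)
vehicle_proof = respond(key, n_server, b"vehicle")
server_proof = respond(key, n_vehicle, b"server")
assert hmac.compare_digest(vehicle_proof, respond(key, n_server, b"vehicle"))
assert hmac.compare_digest(server_proof, respond(key, n_vehicle, b"server"))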
7

Mishra, Arjit, Surendra Gupta, and Swarnim Soni. "Designing Information System for Private Network using RBAC, FGAC and Micro service Architecture." International Journal of Engineering and Advanced Technology 10, no. 4 (April 30, 2021): 195–200. http://dx.doi.org/10.35940/ijeat.d2474.0410421.

Abstract:
Microservice architecture is used to develop enterprise-level applications with the intent of modularising deployment: the application is built as a collection of smaller applications known as microservices. An information system is one such ever-growing application, and it therefore needs an architectural solution that addresses this growth. Microservice architecture addresses it through low coupling among microservices, future scalability of the system, and convenience in developing, deploying, and integrating new microservices. For all its benefits, however, microservice architecture complicates the consistent implementation of security policies in a distributed system. The current industry standard is to use protocols that delegate authentication and authorization to a third-party server, e.g., OAuth. Delegating these processes to a third party is not suitable for some web applications deployed in less resourceful environments, e.g., organizations with high internet downtime, or organizations with heavy traffic from non-working personnel, such as people sitting exams in a college or attending workshops. This paper surveys proposed solutions, existing frameworks, and technologies for implementing security policies in an information system suitable for the above two scenarios. We use authentication, role-based access control (RBAC) on every request, and fine-grained access control (FGAC) at the implementation-method level to achieve greater access control and the flexibility of adding new microservices without changing the whole security policy. We also propose a pre-registration condition, which allows only certain people, whose data is already present in the system, to register with the application, and we discuss the scenario in which a protocol like OAuth is unsuitable. The solution is based on creating a central single entry point for authentication and implementing an RBAC policy that filters every request based on the requesting user's access roles. We further use FGAC at the method level in microservices to enforce even finer restrictions on the resources to be accessed. This solution will be implemented as a part of the Department Information System (DIS) in the following two steps:
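
A minimal sketch of per-request RBAC filtering at a single entry point, assuming hypothetical role names; FGAC would add a second, method-level check inside each microservice:

from functools import wraps

class Forbidden(Exception):
    pass

def require_roles(*allowed):
    """Gate a request handler on the roles carried by the authenticated user."""
    def decorator(handler):
        @wraps(handler)
        def wrapper(user, *args, **kwargs):
            if not set(user["roles"]) & set(allowed):
                raise Forbidden(f"needs one of {allowed}")
            return handler(user, *args, **kwargs)
        return wrapper
    return decorator

@require_roles("faculty", "admin")   # hypothetical roles
def view_grades(user, student_id):
    return f"grades of {student_id}"

print(view_grades({"roles": ["faculty"]}, "s42"))  # allowed
# view_grades({"roles": ["student"]}, "s42")       # raises Forbidden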
8

Weitzel, Derek, Brian Bockelman, Jim Basney, Todd Tannenbaum, Zach Miller, and Jeff Gaynor. "Capability-Based Authorization for HEP." EPJ Web of Conferences 214 (2019): 04014. http://dx.doi.org/10.1051/epjconf/201921404014.

Abstract:
Outside the HEP computing ecosystem, it is vanishingly rare to encounter user X509 certificate authentication (and proxy certificates are even more rare). The web never widely adopted the user certificate model, but increasingly sees the need for federated identity services and distributed authorization. For example, Dropbox, Google and Box instead use bearer tokens issued via the OAuth2 protocol to authorize actions on their services. Thus, the HEP ecosystem has the opportunity to reuse recent work in industry that now covers our needs. We present a token-based ecosystem for authorization tailored for use by CMS. We base the tokens on the SciTokens profile for the standardized JSON Web Token (JWT) format. The token embeds a signed description of what capabilities the VO grants the bearer; the site-level service can verify the VO’s signature without contacting a central service. In this paper, we describe the modifications done to enable token-based authorization in various software packages used by CMS, including XRootD, CVMFS, and HTCondor. We describe the token-issuing workflows that would be used to get tokens to running jobs in order to authorize data access and file stageout, and explain the advantages for hosted web services. Finally, we outline what the transition would look like for an experiment like CMS.
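
The key property, verifying the VO's signature without contacting a central service, can be sketched with PyJWT and a public key distributed out of band. The file name, scope string and simplified audience handling are illustrative assumptions, not the actual CMS configuration:

import jwt  # PyJWT, with the 'cryptography' package installed for RS256

VO_PUBLIC_KEY_PEM = open("vo_public_key.pem", "rb").read()  # distributed out of band

def authorize(token: str, required_scope: str) -> bool:
    """Check a SciTokens-style capability locally: valid signature + needed scope."""
    claims = jwt.decode(
        token,
        VO_PUBLIC_KEY_PEM,
        algorithms=["RS256"],
        options={"verify_aud": False},  # audience handling simplified for the sketch
    )
    granted = claims.get("scope", "").split()
    return required_scope in granted  # e.g. "read:/store/user"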
9

Udayana, I. Putu Agus Eka Darma. "Integrasi Sistem Single Sign On Pada Sistem Informasi Akademik, Web Information System Dan Learning Management System Berbasis Central Authentication Service." Jurnal RESISTOR (Rekayasa Sistem Komputer) 1, no. 1 (April 21, 2018): 48–54. http://dx.doi.org/10.31598/jurnalresistor.v1i1.265.

Abstract:
E-learning and web-based information systems are a means to communicate and exchange information for academic purposes. Nowadays, the Lightweight Directory Access Protocol (LDAP) is the state-of-the-art method of choice for user management: with LDAP, a user needs only one username and password to access multiple web-based applications. The problem is that the user still has to enter those credentials again and again for each application. The single sign-on (SSO) mechanism was invented to solve this problem: with SSO, users log in once and carry the same credentials to all integrated applications within the campus. To implement SSO, we use the Central Authentication Service (CAS) as the central authentication authority, with LDAP as the user management structure. In this research, the single sign-on system is integrated into the student management system, the e-learning system and the internal blog system, covering both database-based and LDAP-based user stores.
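
In a CAS-plus-LDAP deployment like the one described, the CAS server typically verifies a submitted password by attempting an LDAP bind as that user. A sketch using the third-party ldap3 package; the server URL and directory layout are illustrative assumptions:

from ldap3 import Server, Connection, ALL

LDAP_URL = "ldap://ldap.campus.example"      # hypothetical directory server
USER_DN_TEMPLATE = "uid={username},ou=people,dc=campus,dc=example"

def check_credentials(username: str, password: str) -> bool:
    """Return True if an LDAP simple bind succeeds with these credentials."""
    server = Server(LDAP_URL, get_info=ALL)
    conn = Connection(
        server,
        user=USER_DN_TEMPLATE.format(username=username),
        password=password,
    )
    ok = conn.bind()   # False on invalid credentials rather than raising
    conn.unbind()
    return ok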
10

Zeng, Lijun, Xiaoxia Yao, Juanjuan Liu, and Qiang Zhu. "Construction of a one-stop document supply service platform." Interlending & Document Supply 42, no. 2/3 (August 12, 2014): 120–24. http://dx.doi.org/10.1108/ilds-01-2014-0013.

Abstract:
Purpose – The purpose of this paper is to provide a detailed overview of the China Academic Library and Information System (CALIS) document supply service platform (CDSSP) – its historical development, network structure and future development plans – and discuss how its members make use of and benefit from its various components. Design/methodology/approach – The authors provide a first-person account based on their professional positions at the CALIS Administrative Center. Findings – CDSSP comprises five application systems: a unified authentication system, a SaaS-based interlibrary loan (ILL) and document delivery (DD) service system, an ILL central scheduling and settlement system, a File Transfer Protocol (FTP) service system and a service integration interface system. These systems work together to meet the needs of member libraries, other information service institutions, and their end users. CDSSP is widely used by more than 1,100 libraries based on a cloud service strategy. Each year more than 100,000 ILL and DD transactions are processed by this platform. Originality/value – The development of CDSSP makes it possible for CALIS to provide a one-stop information retrieval and supply service. At the same time, it promotes resource sharing among member libraries to a great degree.
11

Martinez Pedreira, Miguel, Costin Grigoras, Volodymyr Yurchenko, and Maksim Melnik Storetvedt. "The Security model of the ALICE next generation Grid framework." EPJ Web of Conferences 214 (2019): 03042. http://dx.doi.org/10.1051/epjconf/201921403042.

Abstract:
JAliEn (Java-AliEn) is the ALICE next-generation Grid framework which will be used for top-level distributed computing resources management during LHC Run 3 and onward. While preserving an interface familiar to ALICE users, its performance and scalability are an order of magnitude better than the currently used framework. To implement the JAliEn security model, we have developed so-called Token Certificates – short-lived full Grid certificates, generated by central services automatically or on the client's request. Token Certificates allow fine-grained control over user/client authorization, e.g. filtering out unauthorized requests based on the client's type: end user, job agent, job payload. These and other parameters (like the job ID) are encrypted in the token by the issuing service and cannot be altered. The client-side security implementation is further described in terms of the interaction between user jobs and job agents. User jobs use JAliEn tokens for authentication and authorization by the central JAliEn services. These tokens are passed from the job agent through a pipe stream, not stored on disk, and thus readily available only to the intended job process. The level of isolation of user payloads is further improved by running them in containers. While JAliEn doesn't rely on X.509 proxies, backward compatibility is kept to assure interoperability with services that require them.
12

Corre, Kevin, Olivier Barais, Gerson Sunyé, Vincent Frey, and Jean-Michel Crom. "Why can’t users choose their identity providers on the web?" Proceedings on Privacy Enhancing Technologies 2017, no. 3 (July 1, 2017): 75–89. http://dx.doi.org/10.1515/popets-2017-0029.

Abstract:
Authentication delegation is a major function of the modern web. Identity Providers (IdP) have acquired a central role by providing this function to other web services. By knowing which web services or web applications access its service, an IdP can violate end-user privacy by discovering information that the user did not want to share with its IdP. For instance, WebRTC introduces a new field of usage, as authentication delegation happens during the call session establishment between two users. As a result, an IdP can easily discover that Bob has a meeting with Alice. A second issue that increases the privacy violation is the lack of choice for end users to select their own IdP. Indeed, on many web applications, the end user can only select from a subset of IdPs, in most cases Facebook or Google. In this paper, we analyze this phenomenon, in particular why the end user cannot easily select a preferred IdP even though standards such as OpenID Connect and OAuth 2 exist in this field. To lead this analysis, we conduct three investigations. The first is a field survey on OAuth 2 and OpenID Connect scope usage by websites, to understand whether the scopes requested by websites could allow for user-defined IdPs. The second tries to understand whether the problem comes from the OAuth 2 protocol or from its implementations by IdPs. The last tries to understand whether trust relations between websites and IdPs could prevent end users from selecting their own IdP. Finally, we sketch a possible architecture for web-browser-based identity management and report on the implementation of a prototype.
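
For reference, the scope parameter the survey examines appears in the OpenID Connect authorization request. A sketch of constructing such a request; the client ID, redirect URI and endpoint are placeholders:

from urllib.parse import urlencode

AUTHZ_ENDPOINT = "https://idp.example/authorize"   # the user-chosen IdP, in principle

def authorization_url(client_id: str, redirect_uri: str, scopes: list[str]) -> str:
    """Build an OIDC authorization-code request; the 'openid' scope is mandatory."""
    params = {
        "response_type": "code",
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "scope": " ".join(["openid"] + scopes),  # e.g. profile, email
        "state": "anti-csrf-value",              # binds the request to the session
    }
    return f"{AUTHZ_ENDPOINT}?{urlencode(params)}"

print(authorization_url("demo-client", "https://rp.example/cb", ["profile", "email"]))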
13

Vlasenko, Aleksandra, and Irina Korkh. "Issues of Security of Remote Banking Systems." NBI Technologies, no. 2 (December 2018): 6–10. http://dx.doi.org/10.15688/nbit.jvolsu.2018.2.1.

Abstract:
The banking system of the Russian Federation is based on federal legislation, as well as industry and inter-industry standards developed by the international community and the Central Bank of the Russian Federation. The introduction and development of remote banking services is the most popular direction in the modern banking system. The term 'remote banking service' refers to the technology of providing services for the remote transmission of orders by credit institutions (without the client's presence in a bank office), using different communication channels. Information security in bank-client systems is ensured by several types of protection. Client authentication is confirmed by the use of an electronic signature, and both the data channel and the data itself are encrypted. To implement the above-described functions of the bank-client system in any software, the client must first generate and register cryptographic keys with the bank and obtain electronic digital certificates. In addition to the protection of the transferred data, personal data entered into the remote banking system also need protection. The purpose of the research is to analyze the system of remote banking services, to identify the regulatory and methodological documents governing information security issues, and to consider options for meeting the requirements of regulators. Confirmation or refutation of the issue's urgency constitutes the research result.
14

Ceccanti, Andrea, Enrico Vianello, and Diego Michelotto. "Token-based authorization in StoRM WebDAV." EPJ Web of Conferences 245 (2020): 04020. http://dx.doi.org/10.1051/epjconf/202024504020.

Abstract:
At the end of May 2017 the Globus Alliance announced that the open-source Globus Toolkit (GT) would no longer be supported by the Globus team at the University of Chicago. This announcement had an obvious impact on WLCG, given the central role of the Globus Security Infrastructure (GSI) and GridFTP in the WLCG data management framework, so discussions started in the appropriate forums on the search for alternatives. At the same time, support for token-based authentication and authorization has emerged as a key requirement for storage elements powering WLCG data centers. In this contribution, we describe the work done to enable token-based authentication and authorization in the StoRM WebDAV service, describing and highlighting the differences between support for external OpenID Connect providers, group-based and capability-based authorization schemes, and locally issued authorization tokens. We discuss how StoRM WebDAV token-based authorization is being exploited in several contexts, from WLCG DOMA activities to other scientific experiments hosted at the INFN Tier-1 data center. We also describe the methodology used to compare Globus GridFTP and StoRM WebDAV, and we present initial results confirming that HTTP represents a viable alternative to GridFTP for data transfers, also performance-wise.
15

Deeptha, R., and Rajeswari Mukesh. "Extending OpenID Connect Towards Mission Critical Applications." Cybernetics and Information Technologies 18, no. 3 (September 1, 2018): 93–110. http://dx.doi.org/10.2478/cait-2018-0041.

Abstract:
Single Sign-On (SSO) decreases complexity and eases the burden of managing many accounts with a single authentication mechanism. Mission-critical applications such as banking demand a highly trusted identity provider to authenticate their users. Existing SSO protocols such as OpenID Connect provide secure SSO, but they are applicable only in consumer-to-social-network scenarios. Owing to stringent security requirements, SSO for banking services necessitates a highly trusted identity provider and a secured private channel for user access. The banking system depends on a dedicated central banking authority which controls the monetary policy, and that authority must assume the role of the identity provider. This paper proposes an extension of the OpenID Connect protocol that establishes a central identity provider for bank users, which facilitates access to different accounts using a single login. The proposed Enhanced OpenID Connect (EOIDC) modifies the authorization code flow of OpenID Connect to build a secure channel from a single trusted identity provider that supports multiple banking services. Moreover, EOIDC tightens the security mechanism with the help of SAT to avoid impersonation attacks using replay and redirect. The formal security analysis and validation demonstrate the strength of EOIDC against possible attacks such as impersonation, eavesdropping, and brute-force login. The experimental results reveal that the proposed EOIDC system is efficient in providing a secured SSO protocol for banking services.
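
The authorization code flow the paper modifies ends with the client exchanging the code for tokens at the identity provider's token endpoint. A generic sketch with the requests library; the endpoint and credentials are placeholders, and the paper's SAT extension is not modeled:

import requests

TOKEN_ENDPOINT = "https://idp.bank.example/token"   # hypothetical central IdP

def exchange_code(code: str, client_id: str, client_secret: str, redirect_uri: str) -> dict:
    """Swap an authorization code for ID and access tokens (OIDC code flow)."""
    resp = requests.post(
        TOKEN_ENDPOINT,
        data={
            "grant_type": "authorization_code",
            "code": code,
            "redirect_uri": redirect_uri,  # must match the original request
        },
        auth=(client_id, client_secret),   # HTTP Basic client authentication
        timeout=5,
    )
    resp.raise_for_status()
    return resp.json()  # contains id_token, access_token, expires_in, ...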
16

Watini, Sri, Pipit Nursaputri, and Muhammad Iqbal. "Comparison of CAS and Manage Oauth in Single Sign on (SSO) Client Applications." IAIC Transactions on Sustainable Digital Innovation (ITSDI) 1, no. 2 (April 29, 2020): 152–59. http://dx.doi.org/10.34306/itsdi.v1i2.147.

Abstract:
Single Sign On is a system that was developed long ago to meet developers' expectations of easy and convenient data access. As such systems developed, methods and protocols took shape in varied forms to suit developers' needs, and among this variety a developer can choose the architecture and protocols used to build the system. Central Authentication Service and Open Authorization (OAuth) are the two Single Sign On systems most widely used for building web logins. Both can serve as the basis for applying Single Sign On, for developers who intend to design a login system that is safe and comfortable, so that they can create a system that suits their needs.
17

Su, Tianhong, Sujie Shao, Shaoyong Guo, and Min Lei. "Blockchain-Based Internet of Vehicles Privacy Protection System." Wireless Communications and Mobile Computing 2020 (September 7, 2020): 1–10. http://dx.doi.org/10.1155/2020/8870438.

Abstract:
With the development of wireless local area networks and intelligent transportation technologies, the Internet of Vehicles is considered an effective way to alleviate the severe condition of the current transportation system. Vehicles in the Internet of Vehicles build Vehicular Ad Hoc Networks through wireless communication technology and dynamically provide different services through real-time driving information broadcast by the vehicles. Vehicle drivers can control the distance between vehicles and plan the driving route according to the current traffic environment, which improves the overall safety and efficiency of the traffic system. Due to the particularity of Internet of Vehicles services, vehicles need to broadcast their location information frequently. Attackers can collect and analyze vehicle broadcast information to steal private information and even track the owner through the driving trajectory, bringing serious security risks. This paper proposes a blockchain-based privacy protection system for the Internet of Vehicles. The system combines the blockchain with the Internet of Vehicles to design a safe and efficient two-way authentication and key agreement algorithm based on encryption and signature algorithms, which also solves the central-dependency problem of the traditional Internet of Vehicles system.
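
Two-way authenticated key agreement of the general kind described can be sketched with the cryptography package: each party signs a fresh X25519 key share with a long-term Ed25519 identity key. This is a generic construction for illustration, not the paper's exact algorithm:

from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey, X25519PublicKey

def raw(pub) -> bytes:
    return pub.public_bytes(serialization.Encoding.Raw,
                            serialization.PublicFormat.Raw)

# Long-term identity keys (published on-chain or by a CA in a real system)
# and fresh ephemeral shares for this session.
v_id, v_eph = Ed25519PrivateKey.generate(), X25519PrivateKey.generate()
s_id, s_eph = Ed25519PrivateKey.generate(), X25519PrivateKey.generate()

v_share, v_sig = raw(v_eph.public_key()), v_id.sign(raw(v_eph.public_key()))
s_share, s_sig = raw(s_eph.public_key()), s_id.sign(raw(s_eph.public_key()))

# Each side verifies the peer's signed share (verify raises InvalidSignature
# on tampering), then both derive the same session key.
v_id.public_key().verify(v_sig, v_share)   # done by the server side
s_id.public_key().verify(s_sig, s_share)   # done by the vehicle side
k_vehicle = v_eph.exchange(X25519PublicKey.from_public_bytes(s_share))
k_server = s_eph.exchange(X25519PublicKey.from_public_bytes(v_share))
assert k_vehicle == k_server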
18

Purkayastha, Saptarshi, Judy W. Gichoya, and Abhishek Siva Addepally. "Implementation of a single sign-on system between practice, research and learning systems." Applied Clinical Informatics 26, no. 01 (2017): 306–12. http://dx.doi.org/10.4338/aci-2016-10-cr-0171.

Abstract:
Summary – Background: Multiple specialized electronic medical systems are utilized in the health enterprise. Each of these systems has its own user management, authentication and authorization process, which makes a complex web to navigate and use without a coherent process workflow. Users often have to remember multiple passwords and log in and out between systems, which disrupts their clinical workflow, and challenges exist in managing permissions for various cadres of health care providers. Objectives: This case report describes our experience of implementing a single sign-on system between an electronic medical records system and a learning management system at a large academic institution, with an informatics department responsible for student education and a medical school affiliated with a hospital system caring for patients and conducting research. Methods: At our institution, we use OpenMRS for research registry tracking of interventional radiology patients as well as to provide access to medical records to students studying health informatics. To provide authentication across different users of the system with different permissions, we developed a Central Authentication Service (CAS) module for OpenMRS, released under the Mozilla Public License, and deployed it for single sign-on across the academic enterprise. The module has been in implementation since August 2015, and we assessed usability of the registry and education system before and after implementation of the CAS module; 54 students and 3 researchers were interviewed. Results: The module authenticates users with appropriate privileges in the medical records system, providing secure access with minimal disruption to their workflow. No password requests were sent, and users reported ease of use with a streamlined workflow. Conclusions: The project demonstrates that enterprise-wide single sign-on systems should be used in healthcare to reduce complexity like "password hell" and to improve usability and user navigation. We plan to extend this work to other systems used in the health care enterprise.
19

Afonso, Jose A., Helder G. Duarte, Luiz A. Lisboa Cardoso, Vitor Monteiro, and Joao L. Afonso. "Wireless Communication and Management System for E-Bike Dynamic Inductive Power Transfer Lanes." Electronics 9, no. 9 (September 10, 2020): 1485. http://dx.doi.org/10.3390/electronics9091485.

Abstract:
This paper presents the design, implementation, and testing of a wireless communication system for automatic identification of e-bikes and management of their battery charging in the context of dynamic inductive wireless power transfer (DIWPT) lanes. The proposed system checks if an e-bike, uniquely identified by its RFID tag, is authorized to receive energy from the lane coils and acts accordingly. An authentication mechanism was developed based on the use of embedded Wi-Fi boards attached to the coils and communicating with a central HTTP server with a MySQL database. The developed management system also provides other features, such as the recording of the number of lane coils used by each e-bike for billing purposes. The results from experimental tests on a laboratory prototype were used to validate the developed functionalities and assess the quality of service provided by the proposed system.
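
The authorization check the coils perform can be sketched as a small HTTP client on the embedded Wi-Fi board querying the central server. The URL and API shape are assumptions for illustration:

import requests

SERVER = "https://lane-server.example"   # hypothetical central HTTP server

def coil_may_energize(coil_id: str, rfid_tag: str) -> bool:
    """Ask the central server whether this e-bike may draw power from this coil.
    The server checks the tag against its database and records coil usage
    for billing."""
    resp = requests.post(
        f"{SERVER}/authorize",
        json={"coil": coil_id, "tag": rfid_tag},
        timeout=2,   # the coil must decide quickly as the e-bike approaches
    )
    return resp.status_code == 200 and resp.json().get("authorized", False)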
20

Mamidisetti, Gowtham, and Ramesh Makala. "A Proposed Model for Trust Management: Insights from a Simulation Study in the Context of Cloud Computing." Journal of Computational and Theoretical Nanoscience 17, no. 7 (July 1, 2020): 2983–88. http://dx.doi.org/10.1166/jctn.2020.9121.

Abstract:
In computing systems, cloud computing is one of the central topics. This dominance is attributed to the crucial role the concept plays in individuals' daily lives, especially in the wake of the increasing adoption of technology by individuals and organizations. Indeed, the motivation behind the establishment, adoption, and implementation of cloud computing has been the need to offer low-cost and quick consumer service provision, as well as data manipulation and storage. However, the cloud environment continues to face security threats, a trend that calls for further investigations and analyses that could provide room for new system improvements. The current simulation study presents a dynamic model for security management in a cloud computing environment whose central parameter is electronic trust. The proposed study examines interactions between the data provider, the data owner, and the end user. Specifically, the proposed model ensures that trust values are continuously updated for authentication purposes and access permissions. The results show that the model is flexible with respect to providing dynamic access control, a positive trend that points to a promising level of efficiency.
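
Continuous trust updating of the kind described is often realized as a weighted update over observed interactions. The following sketch uses a simple exponential moving average, an illustrative formula rather than the paper's model:

def update_trust(current_trust: float, interaction_score: float, alpha: float = 0.2) -> float:
    """Blend the latest interaction outcome (0.0 bad .. 1.0 good) into the
    running trust value; alpha controls how fast trust reacts."""
    return (1 - alpha) * current_trust + alpha * interaction_score

def access_granted(trust: float, threshold: float = 0.6) -> bool:
    """Dynamic access control: permissions follow the current trust value."""
    return trust >= threshold

trust = 0.5
for outcome in [1.0, 1.0, 0.0, 1.0]:   # simulated interaction history
    trust = update_trust(trust, outcome)
print(round(trust, 3), access_granted(trust))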
21

Dobbins, Nicholas J., Clifford H. Spital, Robert A. Black, Jason M. Morrison, Bas de Veer, Elizabeth Zampino, Robert D. Harrington, et al. "Leaf: an open-source, model-agnostic, data-driven web application for cohort discovery and translational biomedical research." Journal of the American Medical Informatics Association 27, no. 1 (October 8, 2019): 109–18. http://dx.doi.org/10.1093/jamia/ocz165.

Abstract:
Objective – Academic medical centers and health systems are increasingly challenged with supporting appropriate secondary use of clinical data. Enterprise data warehouses have emerged as central resources for these data, but often require an informatician to extract meaningful information, limiting direct access by end users. To overcome this challenge, we have developed Leaf, a lightweight self-service web application for querying clinical data from heterogeneous data models and sources. Materials and Methods – Leaf utilizes a flexible biomedical concept system to define hierarchical concepts and ontologies. Each Leaf concept contains both textual representations and SQL query building blocks, exposed by a simple drag-and-drop user interface. Leaf generates abstract syntax trees which are compiled into dynamic SQL queries. Results – Leaf is a successful production-supported tool at the University of Washington, which hosts a central Leaf instance querying an enterprise data warehouse with over 300 active users. Through the support of UW Medicine (https://uwmedicine.org), the Institute of Translational Health Sciences (https://www.iths.org), and the National Center for Data to Health (https://ctsa.ncats.nih.gov/cd2h/), Leaf source code has been released into the public domain at https://github.com/uwrit/leaf. Discussion – Leaf allows the querying of single or multiple clinical databases simultaneously, even those of different data models. This enables fast installation without costly extraction or duplication. Conclusions – Leaf differs from existing cohort discovery tools because it does not specify a required data model and is designed to seamlessly leverage existing user authentication systems and clinical databases in situ. We believe Leaf to be useful for health system analytics, clinical research data warehouses, precision medicine biobanks, and clinical studies involving large patient cohorts.
22

Yang, Haotian, Shuming Xiong, Samuel Akwasi Frimpong, and Mingzheng Zhang. "A Consortium Blockchain-Based Agricultural Machinery Scheduling System." Sensors 20, no. 9 (May 6, 2020): 2643. http://dx.doi.org/10.3390/s20092643.

Abstract:
The introduction of a consortium blockchain-based agricultural machinery scheduling system will help improve the transparency and efficiency of the data flow within the sector. Traditional centralized agricultural machinery scheduling systems suffer from single points of failure, high management costs, low transparency, and wasted resources. This paper proposes a consortium blockchain-based agricultural machinery scheduling system to solve these problems. The system eliminates the central server of the traditional approach and optimizes the matching function and scheduling algorithm in the smart contract, improving scheduling efficiency. Data in the system can be traced, which increases transparency and improves the efficiency of decision-making during scheduling. In addition, the system adopts a crowdsourcing scheduling mode, making full use of idle agricultural machinery in society, which can effectively address the waste of resources. The proposed system implements authentication and access mechanisms and admits only authorized users. Transactions are based on digital currency, eliminating third-party platforms that charge service fees; participating organizations thus have the opportunity to obtain benefits and reduce transaction costs. Finally, upper-layer supervision improves the efficiency and security of the consensus algorithm, allows supervisors to block users with malicious motives, and ensures system security at all times.
23

Melnychenko, Svitlana, Svitlana Volosovych, and Yurii Baraniuk. "DOMINANT IDEAS OF FINANCIAL TECHNOLOGIES IN DIGITAL BANKING." Baltic Journal of Economic Studies 6, no. 1 (March 16, 2020): 92. http://dx.doi.org/10.30525/2256-0742/2020-6-1-92-99.

Abstract:
The purpose of the research is to define the dominant ideas of financial technologies in digital banking. The methods of theoretical generalization; qualitative, quantitative and correlation analysis; causality tests; and description and explanation are used. These made it possible to establish the relationship between the volume of investments in financial technologies and the performance of the banking system, identify the areas of application of financial technologies in the activities of a bank, determine the dominant ideas of financial technologies in digital banking, and uncover the factors and prospects for intensifying the use of financial technologies in digital banking in Ukraine. The results of the research substantiate the impact of artificial intelligence, biometrics, cloud services, big data, blockchain and open banking services on digital banking. Owing to financial technologies, digital banking can generate and store large amounts of data, simultaneously analyze and apply the results of that analysis, provide personalized banking services, and act as central storage for financial and non-financial client information, which facilitates effective investment and credit decision-making and improves the information security of banking operations. Practical implications. Financial services markets are being transformed by financial technologies. The development of financial technology instruments by non-banking institutions makes it necessary to identify opportunities for their use in banks. The set of financial technologies used by banks forms the digital banking system, whose level of development is the bank's main competitive advantage in the business environment. Digital banking is characterized by the continuity and security of banking services, which give consumers the ability to receive them online anywhere around the clock, by personalization of banking services, by digital authentication of users, and by digitization of banking transactions with the replacement of paperwork. The use of financial technologies in digital banking makes it possible to automate customer segmentation processes, reduce the cost of payment transactions, optimize accounting, financial and tax accounting, improve customer service and expand the customer base while maximizing revenue in certain business segments. Value/originality. The basic spheres of use of financial technologies in digital banking, as well as the factors and prospects for intensifying the use of their instruments in Ukraine, are revealed. The main areas of use of financial technologies in digital banking are customer behavior analysis, transaction monitoring, customer identification and segmentation, fraud management, personalization of banking services, risk assessment and regulatory compliance, customer response analysis, process automation, financial advice, investment decision-making, trade facilitation, syndicated loan services, and P2P transfers. The prospects for developing financial technology tools in digital banking include strengthening the interaction between regulators, banks and financial technology companies, increased use of biometrics, and the development of neo-banking and open banking services.
24

Gruntz, Dominik, Christof Arnosti, and Marco Hauri. "MOONACS: a mobile on-/offline NFC-based physical access control system." International Journal of Pervasive Computing and Communications 12, no. 1 (April 4, 2016): 2–22. http://dx.doi.org/10.1108/ijpcc-01-2016-0012.

Abstract:
Purpose – The purpose of this paper is to present a smartphone-based physical access control system in which the access points are not directly connected to a central authorization server, but rather use the connectivity of the mobile phone to authorize a user access request online by a central access server. The access points ask the mobile phone whether a particular user has access or not. The mobile phone then relays such a request to the access server or presents an offline ticket. One of the basic requirements of our solution is independence from third parties like mobile network operators, trusted service managers and handset manufacturers. Design/methodology/approach – The authentication of the smartphone is based on public key cryptography. This requires that the private key is stored in a secure element or in a trusted execution environment to prevent identity theft. However, due to the intended independence from third parties, subscriber identity module (SIM)-based secure elements and embedded secure elements (i.e. separate hardware chips on the handset) were not an option and only one of the remaining secure element architectures could be used: host card emulation (HCE) or a microSD-based secure element. Findings – This paper describes the implementation of such a physical access control system and discusses its security properties. In particular, it is shown that the HCE approach cannot solve the relay attack under conservative security assumptions, and an implementation based on a microSD secure element is presented and discussed. Moreover, the paper also describes an offline solution which can be used if the smartphone is not connected to the access server. In this case, an access token is sent to the access point in response to an access request. These tokens are renewed regularly and automatically whenever the smartphone is connected. Originality/value – In this paper, a physical access control system is presented which operates as fast as existing card-based solutions. By using a microSD-based secure element (SE), the authors were able to prevent the software relay attack. This solution is not restricted to microSD-based SEs; it could also be implemented with SIM-based or embedded secure elements (with the consequence that the solution depends on third parties).
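
The offline tickets described, access tokens renewed while connectivity lasts and presented when it does not, can be sketched as signed, expiring blobs that the access point verifies with the access server's public key. The token format and field names are assumptions for illustration:

import base64
import json
import time
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

server_key = Ed25519PrivateKey.generate()           # held by the central access server
door_copy_of_public_key = server_key.public_key()   # provisioned into access points

def issue_offline_token(user_id: str, door_id: str, ttl_s: int = 86400) -> bytes:
    """Signed, expiring token the phone stores while online and presents offline."""
    body = json.dumps({"user": user_id, "door": door_id,
                       "exp": int(time.time()) + ttl_s}).encode()
    return body + b"." + base64.b64encode(server_key.sign(body))

def door_accepts(token: bytes, door_id: str) -> bool:
    body, sig_b64 = token.rsplit(b".", 1)   # base64 never contains '.'
    try:
        door_copy_of_public_key.verify(base64.b64decode(sig_b64), body)
    except InvalidSignature:
        return False
    claims = json.loads(body)
    return claims["door"] == door_id and claims["exp"] > time.time()

token = issue_offline_token("alice", "lab-3")
print(door_accepts(token, "lab-3"))   # True until the token expires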
25

Randell, Brian. "A Computer Scientist's Reactions to NPfIT." Journal of Information Technology 22, no. 3 (September 2007): 222–34. http://dx.doi.org/10.1057/palgrave.jit.2000106.

Abstract:
This paper contains a set of personal views relating to NHS Connecting for Health's National Programme for IT (NPfIT), and in particular its Care Records Service, written from the point of view of a computer scientist, not a medical informatics expert. The principal points made are as follows: Centralisation: Pulling lots of data together (for individual patients and then for large patient populations) harms safety and privacy – it is one byproduct of excessive use of identification when in fact all that is usually needed is authentication. Large centralized data storage facilities can be useful for reliability, but risk exchanging lots of small failures for a lesser number of much larger failures. A much more decentralised approach to electronic patient record (EPR) data and its storage should be investigated. Evolutionary acquisition: Specifying, implementing, deploying and evaluating a sequence of ever more complete IT systems is the best way of ending up with well-accepted and well-trusted systems – especially when this process is controlled by the stakeholders who are most directly involved, rather than by some distant central bureaucracy. Thus authority as well as responsibility should be left with hospital and general practitioner trusts to acquire IT systems that suit their environments and priorities – subject to adherence to minimal interoperability constraints – and to use centralized services (e.g., for system support and back-up) as if and when they choose. Socio-technical issues: Ill-chosen imposed medical IT systems impede patient care, are resisted, result in lots of accidental faults, and lose user support and trust. All these points are attested to by rigorous studies involving expertise from the social sciences (psychology, ethnography, etc.) as well as by technical (medical and computer) experts – much more attention needs to be paid to such studies, and more such studies encouraged. Constructive reviews: A constructive expert review, working closely with Connecting for Health, could be very helpful, but should be evidently independent and open and thus essentially different in nature to past and current inquiries. A review of this nature could not just recommend appropriate changes of plan, and speed progress. It could also contribute to the vital task of helping to restore the trust and confidence of the public and the media in the programme and in the government officials involved.
26

Iqbal, Jawaid, Arif Iqbal Umar, Noor Ul Amin, Abdul Waheed, Saleem Abdullah, Mahdi Zareei, and Muazzam Ali Khan Khattak. "Efficient network selection using multi fuzzy criteria for confidential data transmission in wireless body sensor networks." Journal of Intelligent & Fuzzy Systems 41, no. 1 (August 11, 2021): 37–55. http://dx.doi.org/10.3233/jifs-191104.

Abstract:
In the last decade, owing to enhancements in wireless technology, people's interest in Wireless Body Sensor Networks (WBSNs) has increased greatly. WBSNs consist of many tiny biosensor nodes that continuously monitor diverse physiological signals such as BP (systolic and diastolic), ECG, EMG, SpO2, and activity recognition, and transmit the sensed, sensitive patient information to the central node, which communicates directly with the controller. Disseminating this sensitive patient information from the controller to a remote Medical Server (MS) requires high-speed wireless technology, i.e., LTE, UMTS, WiMAX, WiFi, or satellite communication. It is a challenging task for the controller to choose the optimal network for disseminating the various patient vital signs, i.e., emergency data, normal data, and delay-sensitive data. According to the nature of the various biosensor nodes, WBSNs monitor patient vital signs and provide complete, intelligent treatment when any abnormality occurs in the human body, i.e., accurate insulin injection when the patient's sugar level increases. In this paper, we first select the optimal network from the accessible networks using four different fuzzy attribute-based decision-making techniques (Triangular Cubic Hesitant Fuzzy Weighted Averaging Operator, Neutrosophic Linguistic TOPSIS, Triangular Cubic Hesitant Fuzzy Hamacher Weighted Averaging Operator and Cubic Grey Relational Analysis), depending upon the quality-of-service requirements of the various WBSN applications, to prolong human life, enhance medical treatment in society and improve people's quality of life. Similarly, leakage and misuse of patient data can be a security threat to human life, so confidential data transmission is of great importance. For this purpose, our proposed scheme uses HECC for secure key exchange and the AES algorithm to secure patient vital signs against illegal usage. Furthermore, a MAC protocol is used for mutual authentication among sensor nodes and Base Stations (BS). Mathematical results show that our scheme is efficient for optimal network selection in circumstances where conflicts arise among the diverse QoS requirements of different WBSN applications.
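
The AES protection of vital signs can be illustrated with authenticated AES-GCM encryption from the cryptography package. In the described scheme the session key would come from the HECC exchange, which is not modeled here:

import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)  # stand-in for the HECC-agreed key
aesgcm = AESGCM(key)

def protect_vitals(reading: bytes, patient_id: bytes) -> tuple[bytes, bytes]:
    """Encrypt a sensor reading; the patient ID is bound as associated data."""
    nonce = os.urandom(12)               # unique per message, never reused
    return nonce, aesgcm.encrypt(nonce, reading, patient_id)

nonce, blob = protect_vitals(b"BP=120/80;SpO2=98", b"patient-17")
print(aesgcm.decrypt(nonce, blob, b"patient-17"))  # recovers the original reading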
27

S, Rajasekaran, Kalifulla Y, Murugesan S, Ezhilvendan M, and Gunasekaran J. "Authentication Based Cloud Storage and Secure Data Forwarding." INTERNATIONAL JOURNAL OF COMPUTERS & TECHNOLOGY 4, no. 1 (February 1, 2013): 106–10. http://dx.doi.org/10.24297/ijct.v4i1b.3068.

Abstract:
A cloud storage system, consisting of a collection of storage servers, provides long-term storage services over the Internet. Storing data in a third party's cloud system causes serious concern over data confidentiality. General encryption schemes protect data confidentiality but also limit the functionality of the storage system, because only a few operations are supported over encrypted data. Constructing a secure storage system that supports multiple functions is challenging when the storage system is distributed and has no central authority. We propose a threshold proxy re-encryption scheme and integrate it with a decentralized erasure code such that a secure distributed storage system is formulated. The distributed storage system not only supports secure and robust data storage and retrieval, but also lets a user forward his data in the storage servers to another user without retrieving the data back. The main technical contribution is that the proxy re-encryption scheme supports encoding operations over encrypted messages as well as forwarding operations over encoded and encrypted messages. Our method fully integrates encrypting, encoding, and forwarding. We analyze and suggest suitable parameters for the number of copies of a message dispatched to storage servers and the number of storage servers queried by a key server. These parameters allow more flexible adjustment between the number of storage servers and robustness.
28

Roskó, Tibor. "A központosított felhasználó azonosítás jelene és jövője: biztonságos infrastruktúra vagy időzített bomba?" Információs Társadalom 19, no. 2 (December 17, 2019): 52. http://dx.doi.org/10.22503/inftars.xix.2019.2.4.

Abstract:
The present and future of centralized user authentication: secure infrastructure or a time bomb? The goal of our research project is to examine the possibility of introducing globally centralized user authentication and to develop support models. Our hypothesis is that globally centralized user identification can effectively increase security and contribute to the effective, practical implementation of data protection regulations. In this publication, we guide the reader along a path whose destination is the proof of our hypothesis, while offering a comprehensive understanding of the effects of recent data protection incidents and providing guidance, both to users and to service providers, on increasing security when sharing personal information, without neglecting the goals of our research. We will publish more detailed theoretical and implementation descriptions of our models in a separate paper.
29

Yoo, Soonduck. "Blockchain based financial case analysis and its implications." Asia Pacific Journal of Innovation and Entrepreneurship 11, no. 3 (December 4, 2017): 312–21. http://dx.doi.org/10.1108/apjie-12-2017-036.

Abstract:
Purpose – This paper investigates the use of blockchains in the financial sector in Korea and abroad. The study aims to examine how blockchains are applied to the financial sector and how to respond to Korean conditions. Design/methodology/approach – This paper investigates the movements of the financial sector and related services using the blockchain in the current market. Findings – First, as a result of examining domestic and foreign cases, it can be seen that the areas where blockchains are most actively applied in the financial sector are expanding into settlement, remittance, securities and smart contracts. In Korea, authentication procedures largely rely on equipment possessed by the consumer, so the introduction of the blockchain for authentication is prominent. Second, the move to introduce a closed (private) distributed ledger that does not go through the central bank is accelerating in payments between banks. Third, domestic financial institutions also need joint action through a blockchain consortium to apply blockchain technology to the financial sector. Fourth, consumer needs and technological developments are changing; at the same time, as opportunities to infringe on the information held by individuals have expanded, the need for blockchain technology is emerging strongly from organizations' efforts to defend that information. Originality/value – This paper contributes to understanding the changes in the financial sector using the blockchain.
30

Kalyan Chakravathy, P., K. Vasavi Devi, T. Sai Sri, and SK Abu Saleha. "An efficient and secured approach for sequential and transparent identity validation of user in internet services." International Journal of Engineering & Technology 7, no. 1.1 (December 21, 2017): 664. http://dx.doi.org/10.14419/ijet.v7i1.1.10823.

Full text
Abstract:
Providing stronger security for web services has become a serious concern, and secure user verification is the central task of security systems. Traditionally, most systems rely on a username and password pair, which verifies the identity of the user only at the authentication stage: once the user has logged in with a username and password, no further checks are performed during the working session. Emerging biometric solutions replace the username and password with biometric data of the user. In such approaches, however, single-shot verification is less effective because the user's identity is assumed to persist for the whole session. One possible remedy is to use short session timeouts and periodically ask the user to re-enter his or her credentials, but this is not a proper solution since it heavily degrades service usability and ultimately the satisfaction of users. This paper investigates a framework for continuous verification of the user based on credentials such as biometric traits. A continuous biometric verification system acquires credentials transparently, without explicitly notifying the user or requiring user interaction, which is essential for good performance and service usability.
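To make the continuous-verification idea concrete, here is a minimal Python sketch, assuming a session trust score that decays over time and is refreshed by transparently captured biometric match scores; the session ends when trust falls below a threshold. The class, constants and update rule are hypothetical illustrations, not the authors' implementation.

    import time

    TRUST_THRESHOLD = 0.4    # below this the session is terminated
    DECAY_PER_SECOND = 0.01  # trust lost while no fresh evidence arrives

    class ContinuousAuthSession:
        def __init__(self):
            self.trust = 1.0             # full trust right after explicit login
            self.last_update = time.time()

        def _decay(self):
            now = time.time()
            elapsed = now - self.last_update
            self.trust = max(0.0, self.trust - DECAY_PER_SECOND * elapsed)
            self.last_update = now

        def observe_biometric(self, match_score):
            # Fold a transparently captured biometric match score (0..1)
            # into the trust level: strong matches restore trust.
            self._decay()
            self.trust = min(1.0, 0.7 * self.trust + 0.3 * match_score)

        def is_alive(self):
            self._decay()
            return self.trust >= TRUST_THRESHOLD

A monitor built this way never interrupts the user: it only terminates the session when the decayed, evidence-weighted trust drops under the threshold.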
APA, Harvard, Vancouver, ISO, and other styles
31

Goertzen, Melissa. "Longitudinal Analysis of Undergraduate E-book Use Finds that Knowledge of Local Communities Drives Format Selection and Collection Development Activities." Evidence Based Library and Information Practice 12, no. 1 (March 15, 2017): 112. http://dx.doi.org/10.18438/b8bw5q.

Full text
Abstract:
A Review of: Hobbs, K., & Klare, D. (2016). Are we there yet?: A longitudinal look at e-books through students’ eyes. Journal of Electronic Resources Librarianship, 28(1), 9-24. http://dx.doi.org/10.1080/1941126X.2016.1130451 Abstract Objective – To determine undergraduate students’ opinions of, use of, and facility with e-books. Design – A qualitative study that incorporated annual interview and usability sessions over a period of four years. The protocol was informed by interview techniques used in prior studies at Wesleyan University. To supplement the body of qualitative data, the 2014 Measuring Information Service Outcomes (MISO) survey was distributed; the researchers built five campus-specific e-book questions into the survey. Setting – A small university in the Northeastern United States of America. Subjects – 28 undergraduate students (7 per year) who attended summer session between the years of 2011-2014 recruited for interview and usability sessions; 700 full-time undergraduate students recruited for the 2014 MISO survey. Methods – The method was designed by a library consortium in the Northeastern United States of America. The study itself was conducted by two librarians based at the single university. To recruit students for interview and usability sessions, librarians sent invitations via email to a random list of students enrolled in the university’s summer sessions. Recruitment for the 2014 MISO survey was also conducted via email; the survey was sent to a stratified, random sample of undergraduate students in February 2014. Interview sessions were structured around five open-ended questions that examined students’ familiarity with e-books and whether the format supports academic work. These sessions were followed by the students’ evaluation of specific book titles available on MyiLibrary and ebrary, platforms accessible to all libraries in the CTW Consortium. Participants were asked to locate e-books on given topics, answer two research questions using preselected e-books, explain their research process using the above mentioned platforms, and comment on the overall usability experience. Instead of taking notes during interview and usability sessions, the researchers recorded interviews and captured screen activity. Following sessions, they watched recordings, took notes independently, and compared notes to ensure salient points were captured. Due to concerns that a small pool of interview and usability candidates might not capture the overall attitude of students towards e-books, the researchers distributed the 2014 MISO survey between the third and fourth interview years. Five additional campus-specific e-book questions were included. The final response rate was 33%. Main Results – The results of the interviews, usability studies, and MISO survey suggest that although students use print and electronic formats for complementary functions, 86% would still select print if they had to choose between the formats. Findings indicate that e-books promote discovery and convenient access to information, but print supports established and successful study habits, such as adding sticky notes to pages or creating annotations in margins. With that being said, most students do not attempt to locate one specific format over another. Rather, their two central concerns are that content is relevant to search terms and the full-text is readily available. Study findings also suggest that students approach content through the lens of a particular assignment. 
Regardless of format, they want to get in, locate specific information, and move on to the next source. Also, students want all sources – regardless of format – readily at hand and arranged in personal organization systems. PDF files were the preferred electronic format because they best support this research behaviour; content can be arranged in filing systems on personal devices or printed when necessary. Because of these research habits, digital rights management (DRM) restrictions created extreme frustration and were said to impede work. In some cases, students created workarounds for the purpose of accessing information in a usable form. This included visiting file sharing sites like Pirate Bay in order to locate DRM-free content. Findings demonstrated a significant increase in student e-book use over the course of four years. However, this trend did not correspond to increased levels of sophistication in e-book use or facility with built-in functions on e-book platforms. The researchers discovered that students create workarounds instead of seeking out menu options that save time in the long run. This behaviour was consistent across the study group regardless of individual levels of experience working with e-books. Students commented that additional features slow down work rather than creating efficiency. For instance, when keyboard shortcuts used to copy and paste text did not function, students preferred to type out a passage rather than spend time searching for copy functions available on the e-book platform. Conclusion – Academic e-books continue to evolve in a fluid and dynamic environment. While the researchers saw improvements over the course of four years (e.g., fewer DRM restrictions), access barriers remain, such as required authentication to access platform content. They also identified areas where training sessions led by librarians could demonstrate how e-books support student research and learning activities. The researchers also found that user experiences are local in nature and specific to campus cultures and expectations. They concluded that knowledge of local user communities should drive book format selection. Whenever possible, libraries should provide access to multiple formats to support a variety of learning needs and research behaviours.
APA, Harvard, Vancouver, ISO, and other styles
32

Gaikar, Dipak Damodar, Bijith Marakarkandy, and Chandan Dasgupta. "Using Twitter data to predict the performance of Bollywood movies." Industrial Management & Data Systems 115, no. 9 (October 19, 2015): 1604–21. http://dx.doi.org/10.1108/imds-04-2015-0145.

Full text
Abstract:
Purpose – The purpose of this paper is to address the shortcomings of limited research in forecasting the power of social media in India. Design/methodology/approach – This paper uses sentiment analysis and prediction algorithms to analyze the performance of Indian movies based on data obtained from social media sites. The authors used the Twitter4j Java API to extract tweets through an authenticated connection with Twitter web sites, stored the extracted data in a MySQL database and used the data for sentiment analysis. To perform sentiment analysis of Twitter data, the Probabilistic Latent Semantic Analysis classification model is used to find the sentiment score in the form of positive, negative and neutral. The data mining algorithm Fuzzy Inference System is used to implement sentiment analysis and predict movie performance, which is classified into three categories: hit, flop and average. Findings – In this study the authors obtained results on movie performance at the box office, based on the fuzzy inference system algorithm for prediction. The fuzzy inference system uses two factors, namely, sentiment score and actor rating, to obtain an accurate result. By calculating the opening weekend collection, the authors found that the predicted values were approximately the same as the actual values. For the movie Singham Returns, the method of prediction gave a box office collection of 84 crores, while the actual collection turned out to be 88 crores. Research limitations/implications – The current study suffers from the limitation of not having enough computing resources to crawl the data. For predicting box office collection, there is no reliable availability of ticket price information, total number of seats per screen and total number of shows per day on all screens. In future work the authors can add several other inputs, like the budget of the movie, Central Board of Film Certification rating, movie genre and target audience, that will improve the accuracy and quality of the prediction. Originality/value – The authors used different factors for predicting box office movie performance which had not been used in previous literature. This work is valuable for promoting the products and services of firms.
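Since the abstract names a fuzzy inference system over two inputs (sentiment score and actor rating) with three output classes, a toy Python sketch may help fix the idea. The membership functions and rule base below are invented for illustration and are not the paper's actual system.

    # Toy fuzzy classification of a movie as hit / average / flop from two
    # inputs assumed to lie on a 0..1 scale. All parameters are hypothetical.

    def tri(x, a, b, c):
        # Triangular membership function peaking at b.
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x < b else (c - x) / (c - b)

    def classify(sentiment, actor_rating):
        low_s, high_s = tri(sentiment, -0.2, 0.0, 0.5), tri(sentiment, 0.3, 1.0, 1.2)
        low_a, high_a = tri(actor_rating, -0.2, 0.0, 0.5), tri(actor_rating, 0.3, 1.0, 1.2)
        # Toy rule base: both high -> hit, both low -> flop, mixed -> average.
        scores = {
            "hit": min(high_s, high_a),
            "flop": min(low_s, low_a),
            "average": max(min(high_s, low_a), min(low_s, high_a)),
        }
        return max(scores, key=scores.get)

    print(classify(0.8, 0.9))  # -> 'hit'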
APA, Harvard, Vancouver, ISO, and other styles
33

"Distributed Key Management for IT Infrastructure using Block Chain and Hash Graph." International Journal of Recent Technology and Engineering 8, no. 2S3 (August 10, 2019): 533–37. http://dx.doi.org/10.35940/ijrte.b1096.0782s319.

Full text
Abstract:
The cloud computing paradigm simplified IT infrastructure management, prompting organizations and enterprises to rapidly migrate from the conventional model to a service-driven model. The service-driven model allows users to access, maintain and back up files from remote locations over the internet through centralized cloud service providers. Despite its major advantages, there are concerns about the safety and privacy risks to organizational and user-sensitive information in cloud storage. The existing service-oriented model uses distributed key management, which encrypts the data and stores it in cloud servers to provide authentication of data, but it faces challenges in identifying the ownership of data, the bootstrapping problem and securing the keys. The proposed system uses a blockchain, a distributed ledger, for user authentication in the cloud server, and a hashgraph with an asynchronous Byzantine fault tolerance algorithm is proposed for replicated state machines with guaranteed Byzantine fault tolerance; it can be utilized in the IT infrastructure environment and removes the hassles of maintaining security keys.
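The core idea, anchoring key-ownership events in a tamper-evident distributed ledger, can be sketched with a minimal hash-chained structure in Python. This is an illustrative toy, not the proposed system: it omits consensus entirely (the paper's hashgraph with asynchronous Byzantine fault tolerance), and all names are hypothetical.

    import hashlib, json, time

    def block_hash(block):
        return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

    class KeyLedger:
        def __init__(self):
            self.chain = [{"index": 0, "prev": "0" * 64,
                           "event": "genesis", "timestamp": 0}]

        def register_key(self, owner_id, key_fingerprint):
            # Append a key-ownership event that commits to its predecessor.
            prev = self.chain[-1]
            self.chain.append({"index": prev["index"] + 1,
                               "prev": block_hash(prev),
                               "event": {"owner": owner_id, "key": key_fingerprint},
                               "timestamp": time.time()})

        def verify(self):
            # Tamper detection: every block must commit to the hash of the
            # block before it, so rewriting history breaks the chain.
            return all(self.chain[i]["prev"] == block_hash(self.chain[i - 1])
                       for i in range(1, len(self.chain)))

Because each block commits to the previous block's hash, altering any recorded ownership event invalidates every later block, which is what makes the ownership record auditable.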
APA, Harvard, Vancouver, ISO, and other styles
34

Princy, D. Jegalakshmi, and M. Shobana. "Secure Virtualisation Using Hash Key Authentication." International Journal of Scientific Research in Computer Science, Engineering and Information Technology, November 12, 2018, 1–8. http://dx.doi.org/10.32628/cseit18386.

Full text
Abstract:
In the current scenario, cloud computing is a new kind of computing paradigm which enables the sharing of computing resources over the internet. Cloud characteristics include on-demand self-service, location-independent network access, ubiquitous network access and usage-based pay. Because of these attractive features, private and public enterprises are outsourcing large quantities of data to cloud storage. Organizations are encouraged to migrate their data from local sites to central commercial public cloud servers. By outsourcing data to the cloud, customers get relief from storage maintenance. Although there are many benefits to migrating data to cloud storage, it brings many security problems; therefore, data owners hesitate to migrate their sensitive data. The existing technique makes use of Virtual Machines and a Hypervisor Intrusion Detection System for detecting and preventing hypervisor attacks in the virtualized cloud environment. In this work, we propose a model which searches for and accesses files that are uploaded in a network. Elliptic Curve Cryptography and an Attribute-Based Encryption method are implemented for secure data communication. The proposed method is implemented in real time and the results prove that it is more efficient than the existing methods.
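To illustrate the elliptic-curve half of the approach described above, the following hedged sketch uses the Python cryptography package to derive a shared symmetric key via ECDH and encrypt a file payload with it. The attribute-based encryption layer is omitted, and the curve choice and labels are assumptions rather than the paper's exact construction.

    import base64
    from cryptography.hazmat.primitives.asymmetric import ec
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.kdf.hkdf import HKDF
    from cryptography.fernet import Fernet

    # Each side generates an EC key pair (the curve is an assumed choice).
    owner_priv = ec.generate_private_key(ec.SECP256R1())
    user_priv = ec.generate_private_key(ec.SECP256R1())

    def derive_fernet_key(own_private, peer_public):
        # ECDH shared secret, stretched to a 32-byte symmetric key.
        shared = own_private.exchange(ec.ECDH(), peer_public)
        key = HKDF(algorithm=hashes.SHA256(), length=32,
                   salt=None, info=b"file-sharing-demo").derive(shared)
        return base64.urlsafe_b64encode(key)

    # Both derivations yield the same symmetric key.
    k1 = derive_fernet_key(owner_priv, user_priv.public_key())
    k2 = derive_fernet_key(user_priv, owner_priv.public_key())
    assert k1 == k2

    ciphertext = Fernet(k1).encrypt(b"sensitive file contents")
    print(Fernet(k2).decrypt(ciphertext))  # b'sensitive file contents'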
APA, Harvard, Vancouver, ISO, and other styles
35

Wesner, Stefan. "Towards an Architecture for the Mobile Grid (Architektur für ein Mobiles Grid)." it - Information Technology 47, no. 6 (January 1, 2005). http://dx.doi.org/10.1524/itit.2005.47.6.336.

Full text
Abstract:
Summary Mobility is becoming a central aspect of everyday life and must not be ignored by the ongoing efforts in defining the Next Generation Grid architectures. Currently existing network-agnostic Grid middleware solutions duplicate functionality available from lower layers and cannot benefit from the richer set of additional information available, such as the user or device context. In this article we motivate why an integration of the infrastructure services built for supporting mobile users can be beneficial for the realization of a Next Generation Grid. Starting from a definition of the term Mobile Grid, it is shown how new kinds of adaptive applications can be realized and how the major obstacles to the wide take-up of Grid solutions can be addressed. In particular, it is covered how grid solutions can benefit from user authentication models and cross-organizational accounting, auditing and billing. Besides the opportunities of this integration, several new challenges, in particular those related to workflows and the management of Service Level Agreements, are outlined. This so-called Cross-Layer Cooperation is seen as one of the major differences from other Next Generation Grids based on the Open Grid Service Architecture (OGSA) [10] approach. In the next section, considerations related to the deployment of Mobile Grids are presented, along with why the approach outlined in the OGSA architecture of modelling grids as the composition of individual services is not adequate for this.
APA, Harvard, Vancouver, ISO, and other styles
36

"Empirical Data on Mobile Money Hesitation Factors in Somalia." International Journal of Engineering and Advanced Technology 9, no. 3 (February 29, 2020): 3719–27. http://dx.doi.org/10.35940/ijeat.c6307.029320.

Full text
Abstract:
Mobile money is an electronic system for transferring money from person to person. The mobile money service has expanded its coverage all over the world, and there is hardly any country that does not practice some form of mobile money transfer. Somalia is one of the countries that embraced mobile money unconditionally, as there has been a lack of traditional financial institutions providing financial services since the collapse of the central government in 1991. Somalis accepted mobile money because it has made money transfer easier for paying bills and shopping. However, there are hesitation factors that hinder the full-scale functioning of the system and make people hesitate to use mobile money. Currently, mobile money users practice very limited mobile money functions, such as sending and receiving, withdrawal, top-up and internet recharge. Other mobile money functions, such as paying tuition fees, payrolls, payments for purchases, utility payments and saving money in a mobile money account, lag behind. This empirical study explores the inconvenience factors that lead people to hesitate to use mobile money on a large scale. In this study, 650 survey questionnaires were distributed among mobile money users in Somalia through an online Google form. A total of 375 respondents submitted their responses and all the answers were recorded in SPSS. IBM SPSS Statistics 22 was used to statistically analyse the data: factor analysis for data validity, scale analysis for data reliability, and frequency and descriptive statistics. The study found numerous hesitation factors that make Somali people hesitate to practice the system fully. These hesitation factors include the perceived risk of financial loss, the perceived risk of system error, the perceived risk of authentication weaknesses, the lack of regulation and policy, and the lack of interoperability between the mobile money service providers. This study concludes that these hesitation factors need to be addressed in order to raise the level of mobile money usage to full scale. Among the factors that may reduce hesitation in the usage of mobile money services in Somalia are a highly accurate mobile money authentication system, an operative interoperability platform, a highly effective compensation system and functioning mobile money regulations and policy.
APA, Harvard, Vancouver, ISO, and other styles
37

Kumar, Sanjay, Binod Kumar Singh, Akshita, Sonika Pundir, Rashi Joshi, and Simran Batra. "Role of Digital Watermarking in Wireless Sensor Network." Recent Advances in Computer Science and Communications 13 (July 30, 2020). http://dx.doi.org/10.2174/2666255813999200730230731.

Full text
Abstract:
Abstract: WSNs have been employed in many application areas such as military, medical and environmental monitoring. The rapid increase in applications has brought a proportional increase in security threats. Since the nodes used are mostly beyond human reach and depend on their limited resources, the major challenges can be framed as energy consumption and resource reliability. Introduction: Due to the limitation of resources in the sensor nodes, traditional computation-intensive security mechanisms are not feasible for WSNs. This limitation brought the concept of digital watermarking into existence. The intent of this paper is to investigate the security-related issues and the role of watermarking in wireless sensor networks. Watermarking is an effective way to provide security, integrity, data aggregation and robustness in WSN. Method: Digital watermarking is essential in WSN since it provides security in various forms such as confidentiality, service availability and data freshness. We briefly discuss the various security requirements in wireless sensor networks. Several issues and challenges, as well as the various threats of WMSN, are also considered. The related work suggests that much research is still required to improve the security and authentication of WMSN through digital watermarking. This survey will help researchers to develop effective watermarking schemes for WSN. Result: A WSN is a collection of sensors spread in the environment. They measure and monitor physical conditions like temperature, sound, pressure and humidity, and organize the collected data at a main location. The base station then forwards the data to end users who analyze it and make strategic decisions. Nowadays the modern network is bi-directional, i.e., we can control the activity of the sensors. The central stations, unlike the sensor nodes, have ample power, plenty of memory, powerful processors and a high-bandwidth link, whereas sensors are small in size, have fewer computation abilities, transfer data wirelessly and are powered by small batteries. Conclusion: Watermarking is an effective way to provide security, integrity, data aggregation and robustness in WSN. Digital watermarking is essential in WSN since it provides security in various forms such as confidentiality, data integrity, service availability and data freshness. In this paper, we briefly discuss the various security requirements in wireless sensor networks, as well as the work done by several authors on watermarking techniques for WSN. Discussion: Wireless networks face more challenges than wired networks. In WSN, large amounts of data and information can be stolen or effortlessly accessed during transmission. The nodes in a wireless sensor network (WSN) are resource-constrained, so WSNs face certain challenges: processing power, storage and computational capacity, and high bandwidth demand. Security schemes are not easy to design since the network tends to be large and ad hoc. Privacy and security have been explored by authors such as J. Zhu et al. In recent years, cryptography-based security designs have been proposed for WSNs. Cryptography is the process of concealing information in ciphertext by encrypting it.
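As one concrete, hypothetical example of the kind of lightweight scheme such surveys cover, the Python sketch below embeds a keyed fragile watermark into the least significant bits of a group of sensor readings; tampering with the readings is then very likely to break the recomputed mark. The key, values and grouping are assumptions, not a scheme from the paper.

    import hmac, hashlib

    SECRET = b"shared-sensor-key"   # hypothetical key shared with the sink node

    def mark_bits(readings, n):
        # Derive n watermark bits from the readings with their LSBs zeroed out.
        cover = bytes((r & ~1) & 0xFF for r in readings)
        digest = hmac.new(SECRET, cover, hashlib.sha256).digest()
        return [(digest[i // 8] >> (i % 8)) & 1 for i in range(n)]

    def embed(readings):
        bits = mark_bits(readings, len(readings))
        return [(r & ~1) | b for r, b in zip(readings, bits)]

    def verify(readings):
        return [r & 1 for r in readings] == mark_bits(readings, len(readings))

    data = [200, 198, 202, 205, 207, 204, 199, 201]   # e.g. scaled temperatures
    wm = embed(data)
    print(verify(wm))    # True: mark intact
    wm[2] += 2           # tamper with one reading (LSB unchanged)
    print(verify(wm))    # almost certainly False: the keyed mark breaks

The appeal for constrained nodes is that embedding costs only one HMAC per group of readings, with no extra payload beyond the hijacked low-order bits.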
APA, Harvard, Vancouver, ISO, and other styles
38

Pawar, Ankush Balaram, Dr Shashikant U. Ghumbre, and Dr Rashmi M. Jogdand. "Privacy preserving model-based authentication and data security in cloud computing." International Journal of Pervasive Computing and Communications ahead-of-print, ahead-of-print (June 17, 2021). http://dx.doi.org/10.1108/ijpcc-11-2020-0193.

Full text
Abstract:
Purpose Cloud computing plays a significant role in establishing secure communication between users. The technology offers several services, such as platforms, resources and network access; furthermore, cloud computing is a broad technology of communication convergence. In cloud computing architecture, data security and authentication are the main concerns. Design/methodology/approach The purpose of this study is to design and develop an authentication and data security model for cloud computing. The method includes six units: cloud server, data owner, cloud user, inspection authority, attribute authority and central certified authority. The developed privacy preservation method includes several stages, namely a setup phase, a key generation phase, an authentication phase and a data sharing phase. Initially, the setup phase is performed by the owner, where the input is the security attributes, and the system master key and the public parameter are produced in the key generation stage. After that, the authentication process is performed to identify the security controls of the information system. Finally, the data is decrypted in the data sharing phase for sharing data and achieving privacy for confidential data. Additionally, dynamic splicing is utilized, and security functions such as hashing, Elliptic Curve Cryptography (ECC), Data Encryption Standard-3 (3DES), interpolation, a polynomial kernel and XOR are employed to protect sensitive data. Findings The effectiveness of the developed privacy preservation method was evaluated against other approaches and showed efficient outcomes, with a better privacy factor of 0.83 and detection rate of 0.65, and time reduced by 2815 ms on the Cleveland dataset. Originality/value This paper presents a privacy preservation technique for initiating authenticated encrypted access in clouds, designed for mutual authentication of the requester and the data owner in the system.
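The four phases the abstract names (setup, key generation, authentication, data sharing) can be illustrated with a deliberately simplified Python sketch. This is not the paper's scheme: SHA-256 challenge-response and an XOR keystream stand in for its ECC/3DES/interpolation machinery, and all attribute values are hypothetical.

    import os, hashlib

    # Setup phase: the owner fixes security attributes (hypothetical values).
    attributes = b"role=clinician;dept=cardiology"

    # Key generation phase: a system master key and a public parameter.
    master_key = os.urandom(32)
    public_param = hashlib.sha256(master_key + attributes).hexdigest()

    # Authentication phase: the requester proves knowledge of the shared key
    # via challenge-response, so the key itself is never transmitted.
    challenge = os.urandom(16)                                     # from verifier
    response = hashlib.sha256(master_key + challenge).hexdigest()  # from requester
    assert response == hashlib.sha256(master_key + challenge).hexdigest()

    # Data sharing phase: encrypt with a keystream derived from the master
    # key, combined by XOR (a stand-in for the paper's cipher suite).
    def xor_stream(key, data):
        stream = b""
        counter = 0
        while len(stream) < len(data):
            stream += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
            counter += 1
        return bytes(d ^ s for d, s in zip(data, stream))

    record = b"patient record: confidential"
    ciphertext = xor_stream(master_key, record)
    assert xor_stream(master_key, ciphertext) == record            # round trip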
APA, Harvard, Vancouver, ISO, and other styles
39

Pereira, André Albino, João Bosco M. Sobral, and Carla M. Westphall. "Towards Scalability for Federated Identity Systems for Cloud-Based Environments." CLEI Electronic Journal, December 1, 2015. http://dx.doi.org/10.19153/cleiej.18.3.2.

Full text
Abstract:
As multi-tenant authorization and federated identity management systems for cloud computing mature, the provisioning of services using this paradigm allows maximum efficiency for business that requires access control. However, regarding scalability support, mainly horizontal, some characteristics of those approaches based on central authentication protocols are problematic. The objective of this work is to address these issues by providing an adapted sticky-session mechanism for a Shibboleth architecture using JASIG CAS. This alternative, compared with the recommended distributed-memory approach, showed improved efficiency and less overall infrastructure complexity, demanding 58% less computational resources and improving throughput (requests per second) by 11%.
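The sticky-session idea the abstract evaluates can be sketched in a few lines of Python: rather than replicating CAS ticket state across nodes (the distributed-memory approach), each session is deterministically pinned to one backend, so the node that issued a ticket also validates it. The hash-modulo routing and node names below are illustrative assumptions; production load balancers typically pin via a session cookie instead.

    import hashlib

    BACKENDS = ["cas-node-1", "cas-node-2", "cas-node-3"]   # hypothetical nodes

    def route(session_id):
        # Deterministically map a session identifier to one backend node.
        digest = hashlib.sha256(session_id.encode()).digest()
        return BACKENDS[int.from_bytes(digest[:4], "big") % len(BACKENDS)]

    # The same session always lands on the node that issued its ticket, so each
    # node's in-memory ticket registry suffices and no replication is needed.
    assert route("JSESSIONID=abc123") == route("JSESSIONID=abc123")
    print(route("JSESSIONID=abc123"))

The trade-off is the usual one: pinning avoids the cost of a replicated ticket store, but losing a node invalidates the sessions pinned to it.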
APA, Harvard, Vancouver, ISO, and other styles
40

Bombace, Michael P. "Blazing Trails: A New Way Forward for Virtual Currencies and Money Laundering." Journal For Virtual Worlds Research 6, no. 3 (September 16, 2013). http://dx.doi.org/10.4101/jvwr.v6i3.7039.

Full text
Abstract:
Virtual currencies grew up in virtual worlds. They were a central element in the game experience. They remain so and now represent a widespread form of value exchange on the Internet. They are an increasingly effective way to monetize games. Because of their versatility within games as part of game play and as a monetization method, they are a central tool of innovation for game developers. In tandem with their rise in use and complexity come anti-money laundering concerns. Their use for illegal acts is predicted to grow. Because of their still nascent state there is a window of opportunity to get regulation right and balance the cost of constraining innovation and online trade with the benefits of addressing anti-money laundering concerns. There is now some urgency because of recent regulatory guidance issued by the Financial Crimes Enforcement Network, a bureau of the United States Treasury Department. This paper presents a new approach. First, a data retention policy that includes identity authentication requirements. Second, restrictions on the use of payment systems at a high risk for abuse. Third, a safe harbor granting criminal and civil immunity for good faith efforts by game companies to help reduce the cost of compliance. Absent from this proposal are suspicious activity reports, which are expensive and place a burden that is handled better, and already done, by payment systems that connect to game companies, such as PayPal, and traditional services such as bank accounts or credit cards. Virtual currencies are an important tool for game developers that in turn provide real economic development and creativity that require unique treatment in the law. Regulation will occur—the question is how it will be crafted. This paper presents a path forward in that discussion.
APA, Harvard, Vancouver, ISO, and other styles
41

Ruch, Adam, and Steve Collins. "Zoning Laws: Facebook and Google+." M/C Journal 14, no. 5 (October 18, 2011). http://dx.doi.org/10.5204/mcj.411.

Full text
Abstract:
As the single most successful social-networking Website to date, Facebook has caused a shift in both practice and perception of online socialisation, and its relationship to the offline world. While not the first online social networking service, Facebook’s user base dwarfs its nearest competitors. Mark Zuckerberg’s creation boasts more than 750 million users (Facebook). The currently ailing MySpace claimed a ceiling of 100 million users in 2006 (Cashmore). Further, the accuracy of this number has been contested due to a high proportion of fake or inactive accounts. Facebook, by contrast, claims 50% of its user base logs in at least once a day (Facebook). The popular and mainstream uptake of Facebook has shifted social use of the Internet from various and fragmented niche groups towards a common hub or portal around which much everyday Internet use is centred. The implications are many, but this paper will focus on the progress of what Mimi Marinucci terms the “Facebook effect” (70) and the evolution of lists as a filtering mechanism representing one’s social zones within Facebook. This is in part inspired by the launch of Google’s new social networking service Google+ which includes “circles” as a fundamental design feature for sorting contacts. Circles are an acknowledgement of the shortcomings of a single, unified friends list that defines the Facebook experience. These lists and circles are both manifestations of the same essential concept: our social lives are, in fact, divided into various zones not defined by an online/offline dichotomy, by fantasy role-play, deviant sexual practices, or other marginal or minority interests. What the lists and circles demonstrate is that even very common, mainstream people occupy different roles in everyday life, and that to be effective social tools, social networking sites must grant users control over their various identities and over who knows what about them. Even so, the very nature of computer-based social tools leads to problematic definitions of identities and relationships using discrete terms, in contrast to more fluid, performative constructions of an individual and their relations to others. Building the Monolith In 1995, Sherry Turkle wrote that “the Internet has become a significant social laboratory for experimenting with the constructions and reconstructions of self that characterize postmodern life” (180). Turkle describes the various deliberate acts of personnae creation possible online in contrast to earlier constraints placed upon the “cycling through different identities” (179). In the past, Turkle argues, “lifelong involvement with families and communities kept such cycling through under fairly stringent control” (180). In effect, Turkle was documenting the proliferation of identity games early adopters of Internet technologies played through various means. Much of what Turkle focused on were MUDs (Multi-User Dungeons) and MOOs (MUD Object Oriented), explicit play-spaces that encouraged identity-play of various kinds. Her contemporary Howard Rheingold focused on what may be described as the more “true to life” communities of the WELL (Whole Earth ‘Lectronic Link) (1–38). In particular, Rheingold explored a community established around the shared experience of parenting, especially of young children. While that community was not explicitly built on the notion of role-play, the parental identity was an important quality of community members.
Unlike contemporary social media networks, these early communities were built on discreet platforms. MUDs, MOOs, Bulletin Board Systems, UseNet Groups and other early Internet communication platforms were generally hosted independently of one another, and even had to be dialled into via modem separately in some cases (such as the WELL). The Internet was a truly disparate entity in 1995. The discreetness of each community supported the cordoning off of individual roles or identities between them. Thus, an individual could quite easily be “Pete” a member of the parental WELL group and “Gorak the Destroyer,” a role-player on a fantasy MUD without the two roles ever being associated with each other. As Turkle points out, even within each MUD ample opportunity existed to play multiple characters (183–192). With only a screen name and associated description to identify an individual within the MUD environment, nothing technical existed to connect one player’s multiple identities, even within the same community. As the Internet has matured, however, the tendency has been shifting towards monolithic hubs, a notion of collecting all of “the Internet” together. From a purely technical and operational perspective, this has led to the emergence of the ISP (Internet service provider). Users can make a connection to one point, and then be connected to everything “on the Net” instead of individually dialling into servers and services one at a time as was the case in the early 1980s with companies such as Prodigy, the Source, CompuServe, and America On-Line (AOL). The early information service providers were largely walled gardens. A CompuServe user could only access information on the CompuServe network. Eventually the Internet became the network of choice and services migrated to it. Standards such as HTTP for Web page delivery and SMTP for email became established and dominate the Internet today. Technically, this has made the Internet much easier to use. The services that have developed on this more rationalised and unified platform have also tended toward monolithic, centralised architectures, despite the Internet’s apparent fundamental lack of a hierarchy. As the Internet replaced the closed networks, the wider Web of HTTP pages, forums, mailing lists and other forms of Internet communication and community thrived. Perhaps they required slightly more technological savvy than the carefully designed experience of walled-garden ISPs such as AOL, but these fora and IRC (Internet Relay Chat) rooms still provided the discreet environments within which to role-play. An individual could hold dozens of login names to as many different communities. These various niches could be simply hobby sites and forums where a user would deploy their identity as model train enthusiast, musician, or pet owner. They could also be explicitly about role-play, continuing the tradition of MUDs and MOOs into the new millennium. Pseudo- and polynymity were still very much part of the Internet experience. Even into the early parts of the so-called Web 2.0 explosion of more interactive Websites which allowed for easier dialog between site owner and viewer, a given identity would be very much tied to a single site, blog or even individual comments. There was no “single sign on” to link my thread from a music forum to the comments I made on a videogame blog to my aquarium photos at an image gallery site. Today, Facebook and Google, among others, seek to change all that. 
The Facebook Effect Working from a psychological background Turkle explored the multiplicity of online identities as a valuable learning, even therapeutic, experience. She assessed the experiences of individuals who were coming to terms with aspects of their own personalities, from simple shyness to exploring their sexuality. In “You Can’t Front on Facebook,” Mimi Marinucci summarizes an analysis of online behaviour by another psychologist, John Suler (67–70). Suler observed an “online disinhibition effect” characterised by users’ tendency to express themselves more openly online than offline (321). Awareness of this effect was drawn (no pun intended) into popular culture by cartoonist Mike Krahulik’s protagonist John Gabriel. Although Krahulik’s summation is straight to the point, Suler offers a more considered explanation. There are six general reasons for the online disinhibition effect: being anonymous, being invisible, the communications being out of sync, the strange sensation that a virtual interlocutor is all in the mind of the user, the general sense that the online world simply is not real and the minimisation of status and authority (321–325). Of the six, the notion of anonymity is most problematic, as briefly explored above in the case of AOL. The role of pseudonymity has been explored in more detail in Ruch, and will be considered with regard to Facebook and Google+ below. The Facebook effect, Marinucci argues, mitigates all six of these issues. Though Marinucci explains the mitigation of each factor individually, her final conclusion is the most compelling reason: “Facebook often facilitates what is best described as an integration of identities, and this integration of identities in turn functions as something of an inhibiting factor” (73). Ruch identifies this phenomenon as the “aggregation of identities” (219). Similarly, Brady Robards observes that “social network sites such as MySpace and Facebook collapse the entire array of social relationships into just one category, that of ‘Friend’” (20). Unlike earlier community sites, Ruch notes “Facebook rejects both the mythical anonymity of the Internet, but also the actual pseudo- or polynonymous potential of the technologies” (219). Essentially, Facebook works to bring the offline social world online, along with all the conventional baggage that accompanies the individual’s real-world social life. Facebook, and now Google+, present a hard, dichotomous approach to online identity: anonymous and authentic. Their socially networked individual is the “real” one, using a person’s given name, and bringing all (or as many as the sites can capture) their contacts from the offline world into the online one, regardless of context. The Facebook experience is one of “friending” everyone one has any social contact with into one homogeneous group. Not only is Facebook avoiding the multiple online identities that interested Turkle, but it is disregarding any multiplicity of identity anywhere, including any online/offline split. David Kirkpatrick reports Mark Zuckerberg’s rejection of this construction of identity is explained by his belief that “You have one identity … having two identities for yourself is an example of a lack of integrity” (199). Arguably, Zuckerberg’s calls for accountability through identity continue a perennial concern for anonymity online fuelled by “on the Internet no one knows you’re a dog” style moral panics. 
Over two decades ago Lindsy Van Gelder recounted the now infamous case of “Joan and Alex” (533) and Julian Dibbell recounted “a rape in cyberspace” (11). More recent anxieties concern the hacking escapades of Anonymous and LulzSec. Zuckerberg’s approach has been criticised by Christopher Poole, the founder of 4Chan—a bastion of Internet anonymity. During his keynote presentation at South by SouthWest 2011 Poole argued that Zuckerberg “equates anonymity with a lack of authenticity, almost a cowardice.” Yet in spite of these objections, Facebook has mainstream appeal. From a social constructivist perspective, this approach to identity would be satisfying the (perceived?) need for a mainstream, context-free, general social space online to cater for the hundreds of millions of people who now use the Internet. There is no specific, pre-defined reason to join Facebook in the way there is a particular reason to join a heavy metal music message board. Facebook is catering to the need to bring “real” social life online generally, with “real” in this case meaning “offline and pre-existing.” Very real risks of missing “real life” social events (engagements, new babies, party invitations etc) that were shared primarily via Facebook became salient to large groups of individuals not consciously concerned with some particular facet of identity performance. The commercial imperatives towards monolithic Internet and identity are obvious. Given that both Facebook and Google+ are in the business of facilitating the sale of advertising, their core business value is the demographic information they can sell to various companies for target advertising. Knowing a user’s individual identity and tastes is extremely important to those in the business of selling consumers what they currently want as well as predicting their future desires. The problem with this is the dawning realisation that even for the average person, role-playing is part of everyday life. We simply aren’t the same person in all contexts. None of the roles we play need to be particularly scandalous for this to be true, but we have different comfort zones with people that are fuelled by context. Suler proposes and Marinucci confirms that inhibition may be just as much part of our authentic self as the uninhibited expression experienced in more anonymous circumstances. Further, different contexts will inform what we inhibit and what we express. It is not as though there is a simple binary between two different groups and two different personal characteristics to oscillate between. The inhibited personnae one occupies at one’s grandmother’s home is a different inhibited self one plays at a job interview or in a heated discussion with faculty members at a university. One is politeness, the second professionalism, the third scholarly—yet they all restrain the individual in different ways. The Importance of Control over Circles Google+ is Google’s latest foray into the social networking arena. Its previous ventures Orkut and Google Buzz did not fare well, both were variously marred by legal issues concerning privacy, security, SPAM and hate groups. Buzz in particular fell afoul of associating Google accounts with users” real life identities, and (as noted earlier), all the baggage that comes with it. “One user blogged about how Buzz automatically added her abusive ex-boyfriend as a follower and exposed her communications with a current partner to him. 
Other bloggers commented that repressive governments in countries such as China or Iran could use Buzz to expose dissidents” (Novak). Google+ takes a different approach to its predecessors and its main rival, Facebook. Facebook allows for the organisation of “friends” into lists. Individuals can span more than one list. This is an exercise analogous to what Erving Goffman refers to as “audience segregation” (139). According to the site’s own statistics the average Facebook user has 130 friends, we anticipate it would be time-consuming to organise one’s friends according to real life social contexts. Yet without such organisation, Facebook overlooks the social structures and concomitant behaviours inherent in everyday life. Even broad groups offer little assistance. For example, an academic’s “Work People” list may include the Head of Department as well as numerous other lecturers with whom a workspace is shared. There are things one might share with immediate colleagues that should not be shared with the Head of Department. As Goffman states, “when audience segregation fails and an outsider happens upon a performance that was not meant for him, difficult problems in impression management arise” (139). By homogenising “friends” and social contexts users are either inhibited or run the risk of some future awkward encounters. Google+ utilises “circles” as its method for organising contacts. The graphical user interface is intuitive, facilitated by an easy drag and drop function. Use of “circles” already exists in the vocabulary used to describe our social structures. “List” by contrast reduces the subject matter to simple data. The utility of Facebook’s friends lists is hindered by usability issues—an unintuitive and convoluted process that was added to Facebook well after its launch, perhaps a reaction to privacy concerns rather than a genuine attempt to emulate social organisation. For a cogent breakdown of these technical and design problems see Augusto Sellhorn. Organising friends into lists is a function offered by Facebook, but Google+ takes a different approach: organising friends in circles is a central feature; the whole experience is centred around attempting to mirror the social relations of real life. Google’s promotional video explains the centrality of emulating “real life relationships” (Google). Effectively, Facebook and Google+ have adopted two different systemic approaches to dealing with the same issue. Facebook places the burden of organising a homogeneous mass of “friends” into lists on the user as an afterthought of connecting with another user. In contrast, Google+ builds organisation into the act of connecting. Whilst Google+’s approach is more intuitive and designed to facilitate social networking that more accurately reflects how real life social relationships are structured, it suffers from forcing direct correlation between an account and the account holder. That is, use of Google+ mandates bringing online the offline. Google+ operates a real names policy and on the weekend of 23 July 2011 suspended a number of accounts for violation of Google’s Community Standards. A suspension notice posted by Violet Blue reads: “After reviewing your profile, we determined the name you provided violates our Community Standards.” Open Source technologist Kirrily Robert polled 119 Google+ users about their experiences with the real names policy. 
The results posted to her blog reveal that users desire pseudonymity, many for reasons of privacy and/or safety rather than the lack of integrity thought by Zuckerberg. boyd argues that Google’s real names policy is an abuse of power and poses danger to those users employing “nicks” for reasons including being a government employee or the victim of stalking, rape or domestic abuse. A comprehensive list of those at risk has been posted to the Geek Feminism Wiki (ironically, the Wiki utilises “Connect”, Facebook’s attempt at a single sign-on solution for the Web that connects users’ movements with their Facebook profile). Facebook has a culture of real names stemming from its early adopters drawn from trusted communities, and this culture became a norm for that service (boyd). But as boyd also points out, “[r]eal names are by no means universal on Facebook.” Google+ demands real names, a demand justified by rhetoric of designing a social networking system that is more like real life. “Real”, in this case, is represented by one’s given name—irrespective of the authenticity of one’s pseudonym or the complications and dangers of using one’s given name. Conclusion There is a multiplicity of issues concerning social networks and identities, privacy and safety. This paper has outlined the challenges involved in moving real life to the online environment and the contests in trying to designate zones of social context. Where some earlier research into the social Internet has had a positive (even utopian) feel, the contemporary Internet is increasingly influenced by powerful and competing corporations. As a result, the experience of the Internet is not necessarily as flexible as Turkle or Rheingold might have envisioned. Rather than conducting identity experimentation or exercising multiple personnae, we are increasingly obligated to perform identity as it is defined by the monolithic service providers such as Facebook and Google+. This is not purely an indictment of Facebook or Google’s corporate drive, though they are obviously implicated, but has as much to do with the new social practice of “being online.” So, while there are myriad benefits to participating in this new social context, as Poole noted, the “cost of failure is really high when you’re contributing as yourself.” Areas for further exploration include the implications of Facebook positioning itself as a general-purpose user authentication tool whereby users can log into a wide array of Websites using their Facebook credentials. If Google were to take a similar action the implications would be even more convoluted, given the range of other services Google offers, from GMail to the Google Checkout payment service. While the monolithic centralisation of these services will have obvious benefits, there will be many more subtle problems which must be addressed. References Blue, Violet. “Google Plus Deleting Accounts en Masse: No Clear Answers.” zdnet.com (2011). 10 Aug. 2011 ‹http://www.zdnet.com/blog/violetblue/google-plus-deleting-accounts-en-masse-no-clear-answers/56›. boyd, danah. “Real Names Policies Are an Abuse of Power.” zephoria.org (2011). 10 Aug. 2011 ‹http://www.zephoria.org/thoughts/archives/2011/08/04/real-names.html›. Cashmore, Pete. “MySpace Hits 100 Million Accounts.” mashable.com (2006). 10 Aug. 2011 ‹http://mashable.com/2006/08/09/myspace-hits-100-million-accounts›. Dibbell, Julian. My Tiny Life: Crime and Passion in a Virtual World. New York: Henry Holt & Company, 1998. Facebook. “Fact Sheet.” Facebook (2011).
10 Aug. 2011 ‹http://www.facebook.com/press/info.php?statistic›. Geek Feminism Wiki. “Who Is Harmed by a Real Names Policy?” 2011. 10 Aug. 2011 ‹http://geekfeminism.wikia.com/wiki/Who_is_harmed_by_a_%22Real_Names%22_policy› Goffman, Erving. The Presentation of Self in Everyday Life. London: Penguin, 1959. Google. “The Google+ Project: Explore Circles.” Youtube.com (2011). 10 Aug. 2011 ‹http://www.youtube.com/watch?v=ocPeAdpe_A8›. Kirkpatrick, David. The Facebook Effect. New York: Simon & Schuster, 2010. Marinucci, Mimi. “You Can’t Front on Facebook.” Facebook and Philosophy. Ed. Dylan Wittkower. Chicago & La Salle, Illinois: Open Court, 2010. 65–74. Novak, Peter. “Privacy Commissioner Reviewing Google Buzz.” CBC News: Technology and Science (2010). 10 Aug. 2011 ‹http://www.cbc.ca/news/technology/story/2010/02/16/google-buzz-privacy.html›. Poole, Christopher. Keynote presentation. South by SouthWest. Texas, Austin, 2011. Robards, Brady. “Negotiating Identity and Integrity on Social Network Sites for Educators.” International Journal for Educational Integrity 6.2 (2010): 19–23. Robert, Kirrily. “Preliminary Results of My Survey of Suspended Google Accounts.” 2011. 10 Aug. 2011 ‹http://infotrope.net/2011/07/25/preliminary-results-of-my-survey-of-suspended-google-accounts/›. Rheingold, Howard. The Virtual Community: Homesteading on the Electronic Frontier. New York: Harper Perennial, 1993. Ruch, Adam. “The Decline of Pseudonymity.” Posthumanity. Eds. Adam Ruch and Ewan Kirkland. Oxford: Inter-Disciplinary.net Press, 2010: 211–220. Sellhorn, Augusto. “Facebook Friend Lists Suck When Compared to Google+ Circles.” sellmic.com (2011). 10 Aug. 2011 ‹http://sellmic.com/blog/2011/07/01/facebook-friend-lists-suck-when-compared-to-googleplus-circles›. Suler, John. “The Online Disinhibition Effect.” CyberPsychology and Behavior 7 (2004): 321–326. Turkle, Sherry. Life on the Screen: Identity in the Age of the Internet. New York: Simon & Schuster, 1995. Van Gelder, Lindsy. “The Strange Case of the Electronic Lover.” Computerization and Controversy: Value Conflicts and Social Choices. Ed. Rob Kling. New York: Academic Press, 1996: 533–46.
APA, Harvard, Vancouver, ISO, and other styles
42

Champion, Katherine M. "A Risky Business? The Role of Incentives and Runaway Production in Securing a Screen Industries Production Base in Scotland." M/C Journal 19, no. 3 (June 22, 2016). http://dx.doi.org/10.5204/mcj.1101.

Full text
Abstract:
IntroductionDespite claims that the importance of distance has been reduced due to technological and communications improvements (Cairncross; Friedman; O’Brien), the ‘power of place’ still resonates, often intensifying the role of geography (Christopherson et al.; Morgan; Pratt; Scott and Storper). Within the film industry, there has been a decentralisation of production from Hollywood, but there remains a spatial logic which has preferenced particular centres, such as Toronto, Vancouver, Sydney and Prague often led by a combination of incentives (Christopherson and Storper; Goldsmith and O’Regan; Goldsmith et al.; Miller et al.; Mould). The emergence of high end television, television programming for which the production budget is more than £1 million per television hour, has presented new opportunities for screen hubs sharing a very similar value chain to the film industry (OlsbergSPI with Nordicity).In recent years, interventions have proliferated with the aim of capitalising on the decentralisation of certain activities in order to attract international screen industries production and embed it within local hubs. Tools for building capacity and expertise have proliferated, including support for studio complex facilities, infrastructural investments, tax breaks and other economic incentives (Cucco; Goldsmith and O’Regan; Jensen; Goldsmith et al.; McDonald; Miller et al.; Mould). Yet experience tells us that these will not succeed everywhere. There is a need for a better understanding of both the capacity for places to build a distinctive and competitive advantage within a highly globalised landscape and the relative merits of alternative interventions designed to generate a sustainable production base.This article first sets out the rationale for the appetite identified in the screen industries for co-location, or clustering and concentration in a tightly drawn physical area, in global hubs of production. It goes on to explore the latest trends of decentralisation and examines the upturn in interventions aimed at attracting mobile screen industries capital and labour. Finally it introduces the Scottish screen industries and explores some of the ways in which Scotland has sought to position itself as a recipient of screen industries activity. The paper identifies some key gaps in infrastructure, most notably a studio, and calls for closer examination of the essential ingredients of, and possible interventions needed for, a vibrant and sustainable industry.A Compulsion for ProximityIt has been argued that particular spatial and place-based factors are central to the development and organisation of the screen industries. The film and television sector, the particular focus of this article, exhibit an extraordinarily high degree of spatial agglomeration, especially favouring centres with global status. It is worth noting that the computer games sector, not explored in this article, slightly diverges from this trend displaying more spatial patterns of decentralisation (Vallance), although key physical hubs of activity have been identified (Champion). Creative products often possess a cachet that is directly associated with their point of origin, for example fashion from Paris, films from Hollywood and country music from Nashville – although it can also be acknowledged that these are often strategic commercial constructions (Pecknold). 
The place of production represents a unique component of the final product as well as an authentication of substantive and symbolic quality (Scott, “Creative cities”). Place can act as part of a brand or image for creative industries, often reinforcing the advantage of being based in particular centres of production.Very localised historical, cultural, social and physical factors may also influence the success of creative production in particular places. Place-based factors relating to the built environment, including cheap space, public-sector support framework, connectivity, local identity, institutional environment and availability of amenities, are seen as possible influences in the locational choices of creative industry firms (see, for example, Drake; Helbrecht; Hutton; Leadbeater and Oakley; Markusen).Employment trends are notoriously difficult to measure in the screen industries (Christopherson, “Hollywood in decline?”), but the sector does contain large numbers of very small firms and freelancers. This allows them to be flexible but poses certain problems that can be somewhat offset by co-location. The findings of Antcliff et al.’s study of workers in the audiovisual industry in the UK suggested that individuals sought to reconstruct stable employment relations through their involvement in and use of networks. The trust and reciprocity engendered by stable networks, built up over time, were used to offset the risk associated with the erosion of stable employment. These findings are echoed by a study of TV content production in two media regions in Germany by Sydow and Staber who found that, although firms come together to work on particular projects, typically their business relations extend for a much longer period than this. Commonly, firms and individuals who have worked together previously will reassemble for further project work aided by their past experiences and expectations.Co-location allows the development of shared structures: language, technical attitudes, interpretative schemes and ‘communities of practice’ (Bathelt, et al.). Grabher describes this process as ‘hanging out’. Deep local pools of creative and skilled labour are advantageous both to firms and employees (Reimer et al.) by allowing flexibility, developing networks and offsetting risk (Banks et al.; Scott, “Global City Regions”). For example in Cook and Pandit’s study comparing the broadcasting industry in three city-regions, London was found to be hugely advantaged by its unrivalled talent pool, high financial rewards and prestigious projects. As Barnes and Hutton assert in relation to the wider creative industries, “if place matters, it matters most to them” (1251). This is certainly true for the screen industries and their spatial logic points towards a compulsion for proximity in large global hubs.Decentralisation and ‘Sticky’ PlacesDespite the attraction of global production hubs, there has been a decentralisation of screen industries from key centres, starting with the film industry and the vertical disintegration of Hollywood studios (Christopherson and Storper). There are instances of ‘runaway production’ from the 1920s onwards with around 40 per cent of all features being accounted for by offshore production in 1960 (Miller et al., 133). This trend has been increasing significantly in the last 20 years, leading to the genesis of new hubs of screen activity such as Toronto, Vancouver, Sydney and Prague (Christopherson, “Project work in context”; Goldsmith et al.; Mould; Miller et al.; Szczepanik). 
This development has been prompted by a multiplicity of reasons, including favourable currency value differentials and economic incentives. Subsidies and tax breaks have been offered to secure international productions, with most countries demanding that, in order to qualify for tax relief, productions spend a certain amount of their budget within the local economy, employ local crew and use domestic creative talent (Hill). Extensive infrastructure has been developed, including studio complexes that attempt to lure productions with the advantage of a full-service offering (Goldsmith and O’Regan).

Internationally, Canada has been the greatest beneficiary of ‘runaway production’, with a state-led enactment of generous film incentives since the late 1990s (McDonald). Vancouver and Toronto are the busiest locations for North American screen production after Los Angeles and New York, due to exchange rates and tax rebates on labour costs (Miller et al., 141). 80 per cent of Vancouver’s production is attributable to runaway production (Jensen, 27) and the city is considered by some to have crossed a threshold as:

It now possesses sufficient depth and breadth of talent to undertake the full array of pre-production, production and post-production services for the delivery of major motion pictures and TV programmes. (Barnes and Coe, 19)

Similarly, Toronto is considered to have established a “comprehensive set of horizontal and vertical media capabilities” to ensure its status as a “full function media centre” (Davis, 98). These cities have successfully engaged in entrepreneurial activity to attract production (Christopherson, “Project Work in Context”), and in Vancouver the proactive role of provincial government and labour unions is, in part, credited with its success (Barnes and Coe). Studio-complex infrastructure has also been used to lure global productions, with Toronto, Melbourne and Sydney all seen as key examples of where such developments have been used as a strategic priority to take local production capacity to the next level (Goldsmith and O’Regan).

Studies which provide a historiography of the development of screen-industry hubs emphasise a complex interplay of social, cultural and physical conditions. In the complex and global flows of the screen industries, ‘sticky’ hubs have emerged with the ability to attract and retain capital and skilled labour. Despite being principally organised to attract international production, most studio complexes, especially those outside global centres, need to have a strong relationship with local or national film and television production to ensure the sustainability and depth of the labour pool (Goldsmith and O’Regan). Many have a broadcaster on site as well as a range of companies with a media orientation and training facilities (Goldsmith and O’Regan; Picard). The emergence of film studio complexes on the Australian Gold Coast and in Vancouver was accompanied by an increasing role for television production, and this multi-purpose nature was important for the continuity of production.

Fostering a strong community of below-the-line workers, such as set designers, locations managers, make-up artists and props manufacturers, can also be a clear advantage in attracting international productions. For example, the expertise of set designers at Cinecittà in Italy and of experienced crews at the Barrandov Studios in Prague are regarded as major selling points of those studio complexes (Goldsmith and O’Regan; Miller et al.; Szczepanik).
Natural and built environments are also considered very important for film and television firms, and it is a useful advantage in capturing international production when cities can double for other locations, as in the cases of Toronto, Vancouver and Prague (Evans; Goldsmith and O’Regan; Szczepanik). Toronto, for instance, has doubled for New York in over 100 films, and with regard to television Due South’s (1994-1998) use of Toronto as Chicago was estimated to have saved 40 per cent in costs (Miller et al., 141).

The Scottish Screen Industries

Within mobile flows of capital and labour, Scotland has sought to position itself as a recipient of screen industries activity through multiple interventions, including investment in institutional frameworks, direct and indirect economic subsidies and the development of physical infrastructure. Traditionally, creative industry activity in the UK has been concentrated in London and the South East, which together account for 43% of the creative economy workforce (Bakhshi et al.). In order, in part, to redress this imbalance, and more generally to encourage the attraction and retention of international production, a range of policies focused on the screen industries has been introduced. A revised Film Tax Relief was introduced in 2007 to encourage inward investment and prevent offshoring of indigenous production, and this has since been extended to high-end television, animation and children’s programming. Broadcasting has also experienced a push for decentralisation, led by public funding with a responsibility to be regionally representative. The BBC (“BBC Annual Report and Accounts 2014/15”) is currently exceeding its target of 50% network spend outside London by 2016, with 17% spent in Scotland, Wales and Northern Ireland. Channel 4 has similarly committed to commission at least 9% of its original spend from the nations by 2020. Studios have also been developed across the UK, including at Roath Lock (Cardiff), Titanic Studios (Belfast), MediaCityUK (Salford) and The Sharp Project (Manchester).

The creative industries have been identified by the government as one of seven growth sectors for Scotland (Scottish Government). In 2010, the film and video sector employed 3,500 people and contributed £120 million GVA and £120 million adjusted GVA to the economy, and the radio and TV sector employed 3,500 people and contributed £50 million GVA and £400 million adjusted GVA (The Scottish Parliament). Beyond the direct economic benefits of these sectors, the on-screen representation of Scotland has been claimed to boost visitor numbers to the country (EKOS), and high-profile international film productions have been attracted, including Skyfall (2012) and World War Z (2013).

Scotland has historically attracted international film and TV productions due to its natural locations (VisitScotland), and on average, between 2009 and 2014, six big-budget films a year used Scottish locations, both urban and rural (BOP Consulting). In all, a total of £20 million was generated by film-making in Glasgow during 2011 (Balkind), with the city representing Philadelphia in World War Z (2013) and San Francisco in Cloud Atlas (2013), as well as doubling for Edinburgh in the recent acclaimed Scottish films Filth (2013) and Sunshine on Leith (2013). Sanson (80) asserts that the use of the city as a site for international productions not only brings in direct revenue from production money but also promotes the city as a “fashionable place to live, work and visit.
Creativity makes the city both profitable and ‘cool’”.

Nonetheless, issues persist, and it has been suggested that Scotland lacks a stable and sustainable film industry, with low indigenous production levels and variable success from year to year in attracting inward investment (BOP Consulting). With regard to crew, an insufficient production base has been identified as a problem in maintaining a pipeline of skills (BOP Consulting). Developing ‘talent’ is a central aspect of the Scottish Government’s Strategy for the Creative Industries, yet there remains the core challenge of retaining skills and encouraging new talent into the industry (BOP Consulting).

With regard to film, a lack of substantial funding incentives and the absence of a studio have been identified as key concerns for the sector. Within the film industry, for example, the majority of inward investment filming in Scotland is location work, as the country lacks the studio facilities that would enable it to sustain a big-budget production in its entirety (BOP Consulting). The absence of such infrastructure has been seen as contributing to a drain of Scottish talent from these industries to other areas and countries where there is a more vibrant sector (BOP Consulting). The loss of Scottish talent to Northern Ireland was attributed to the longevity of the work provided by Game of Thrones (2011-), which has now completed six series at the Titanic Studios in Belfast (EKOS), although this may have been stemmed somewhat recently with the attraction of the US high-end TV series Outlander (2014-), which has been based at Wardpark in Cumbernauld since 2013.

Television, both high-end production and local broadcasting, appears crucial to the sustainability of screen production in Scotland. Outlander has been estimated to have contributed to Scotland’s production spend figures reaching a historic high of £45.8 million in 2014 (Creative Scotland, “Creative Scotland Screen Strategy Update”). The arrival of the programme has almost doubled production spend in Scotland, offering the chance of increased stability for screen industries workers. Qualifying for UK High-End Television Tax Relief, Outlander has engaged a crew of approximately 300 across props, filming and set build, and cast over 2,000 supporting artist roles from within Scotland and the UK.

Long-running drama, in particular, offers key opportunities both for those cutting their teeth in the screen industries and for existing workers, by providing more consistent and longer-term employment. The BBC television soap River City (2002-) has been identified as a key example of such an opportunity, and the programme has been credited with providing a springboard for developing the skills of local actors, writers and production crew (Hibberd). This kind of pipeline of production is critical given the work patterns of the sector. According to Creative Skillset, of the 4,000 people employed in the film and television industries in Scotland, 40% of television workers are freelance, and 90% of film production work is freelance (EKOS).

In an attempt to address skills gaps, the Outlander Trainee Placement Scheme has been devised in collaboration with Creative Scotland and Creative Skillset. During filming of Season One, thirty-eight trainees were supported across a range of production and craft roles, followed by a further twenty-five in Season Two.
Encouragingly, Outlander, and the books on which it is based, is set in Scotland, so the authenticity of place has played a strong part in the decision to locate production there. Producer David Brown began his career on the Bill Forsyth films Gregory’s Girl (1981), Local Hero (1983) and Comfort and Joy (1984) and has a strong existing relationship with Scotland. He has been very vocal in his support for the trainee programme, contending that “training is the future of our industry and we at Outlander see the growth of talent and opportunities as part of our mission here in Scotland” (Creative Scotland, “Outlander Fast Tracks Next Generation of Skilled Screen Talent”).

Conclusions

This article has aimed to explore the relationship between place and the screen industries and, taking Scotland as its focus, has outlined a need to examine more closely the ways in which the sector can be supported. Despite the possible gains in terms of building a sustainable industry, the state-led funding of the global screen industries is contested. The use of tax breaks and incentives has been problematised, and critiques range from the use of public funding to attract footloose media industries to the increasingly zero-sum game of competition between rival places (Morawetz; McDonald). In relation to broadcasting, there have been critiques of a ‘lift and shift’ approach to policy in the UK, with TV production companies moving to the nations and regions temporarily to meet the quota and leaving once a production has finished (House of Commons). Further to this, issues have been raised regarding how far such interventions can seed and develop a rich production ecology that offers opportunities for indigenous talent (Christopherson and Rightor).

Nonetheless, recent success for the screen industries in Scotland can, at least in part, be attributed to interventions including the increased decentralisation of broadcasting and the high-end television tax incentives. This article has identified gaps in infrastructure which continue to stymie growth and have led to a drain of production to other centres. Important gaps in knowledge also warrant further investigation and unpacking, including the relationship between film, high-end television and broadcasting, especially in terms of the opportunities they offer for screen industries workers to build a career in Scotland, and the impact of the notable gaps in infrastructure on the loss of production.

References

Antcliff, Valerie, Richard Saundry, and Mark Stuart. Freelance Worker Networks in Audio-Visual Industries. University of Central Lancashire, 2004.
Bakhshi, Hasan, John Davies, Alan Freeman, and Peter Higgs. "The Geography of the UK’s Creative and High-Tech Economies." 2015.
Balkind, Nicola. World Film Locations: Glasgow. Intellect Books, 2013.
Banks, Mark, Andy Lovatt, Justin O’Connor, and Carlo Raffo. "Risk and Trust in the Cultural Industries." Geoforum 31.4 (2000): 453-464.
Barnes, Trevor, and Neil M. Coe. "Vancouver as Media Cluster: The Cases of Video Games and Film/TV." Media Clusters: Spatial Agglomeration and Content Capabilities (2011): 251-277.
Barnes, Trevor, and Thomas Hutton. "Situating the New Economy: Contingencies of Regeneration and Dislocation in Vancouver's Inner City." Urban Studies 46.5-6 (2009): 1247-1269.
Bathelt, Harald, Anders Malmberg, and Peter Maskell. "Clusters and Knowledge: Local Buzz, Global Pipelines and the Process of Knowledge Creation." Progress in Human Geography 28.1 (2004): 31-56.
BBC. BBC Annual Report and Accounts 2014/15. London: BBC, 2015.
BOP Consulting. Review of the Film Sector in Glasgow: Report for Creative Scotland. Edinburgh: BOP Consulting, 2014.
Cairncross, Frances. The Death of Distance. London: Orion Business, 1997.
Champion, Katherine. "Problematizing a Homogeneous Spatial Logic for the Creative Industries: The Case of the Digital Games Industry." Changing the Rules of the Game. Palgrave Macmillan UK, 2013. 9-27.
Channel 4. Annual Report. London: Channel 4, 2014.
Christopherson, Susan. "Project Work in Context: Regulatory Change and the New Geography of Media." Environment and Planning A 34.11 (2002): 2003-2015.
———. "Hollywood in Decline? US Film and Television Producers beyond the Era of Fiscal Crisis." Cambridge Journal of Regions, Economy and Society 6.1 (2013): 141-157.
Christopherson, Susan, and Michael Storper. "The City as Studio; the World as Back Lot: The Impact of Vertical Disintegration on the Location of the Motion Picture Industry." Environment and Planning D: Society and Space 4.3 (1986): 305-320.
Christopherson, Susan, and Ned Rightor. "The Creative Economy as 'Big Business': Evaluating State Strategies to Lure Filmmakers." Journal of Planning Education and Research 29.3 (2010): 336-352.
Christopherson, Susan, Harry Garretsen, and Ron Martin. "The World Is Not Flat: Putting Globalization in Its Place." Cambridge Journal of Regions, Economy and Society 1.3 (2008): 343-349.
Cook, Gary A.S., and Naresh R. Pandit. "Service Industry Clustering: A Comparison of Broadcasting in Three City-Regions." The Service Industries Journal 27.4 (2007): 453-469.
Creative Scotland. Creative Scotland Screen Strategy Update. 2016. <http://www.creativescotland.com/__data/assets/pdf_file/0008/33992/Creative-Scotland-Screen-Strategy-Update-Feb2016.pdf>.
———. Outlander Fast Tracks Next Generation of Skilled Screen Talent. 2016. <http://www.creativescotland.com/what-we-do/latest-news/archive/2016/02/outlander-fast-tracks-next-generation-of-skilled-screen-talent>.
Cucco, Marco. "Blockbuster Outsourcing: Is There Really No Place like Home?" Film Studies 13.1 (2015): 73-93.
Davis, Charles H. "Media Industry Clusters and Public Policy." Media Clusters: Spatial Agglomeration and Content Capabilities (2011): 72-98.
Drake, Graham. "'This Place Gives Me Space': Place and Creativity in the Creative Industries." Geoforum 34.4 (2003): 511-524.
EKOS. "Options for a Film and TV Production Space: Report for Scottish Enterprise." Glasgow: EKOS, March 2014.
Evans, Graeme. "Creative Cities, Creative Spaces and Urban Policy." Urban Studies 46.5-6 (2009): 1003-1040.
Friedman, Thomas. The World Is Flat. New York: Farrar, Straus and Giroux, 2006.
Goldsmith, Ben, and Tom O’Regan. "Cinema Cities, Media Cities: The Contemporary International Studio Complex." Screen Industry, Culture and Policy Research Series. Sydney: Australian Film Commission, Sep. 2003.
Goldsmith, Ben, Susan Ward, and Tom O’Regan. "Global and Local Hollywood." InMedia: The French Journal of Media and Media Representations in the English-Speaking World 1 (2012).
Grabher, Gernot. "The Project Ecology of Advertising: Tasks, Talents and Teams." Regional Studies 36.3 (2002): 245-262.
Helbrecht, Ilse. "The Creative Metropolis: Services, Symbols and Spaces." Zeitschrift für Kanada-Studien 18 (1998): 79-93.
Hibberd, Lynne. "Devolution in Policy and Practice: A Study of River City and BBC Scotland." Westminster Papers in Communication and Culture 4.3 (2007): 107-205.
Hill, John. "'This Is for the Batmans as Well as the Vera Drakes': Economics, Culture and UK Government Film Production Policy in the 2000s." Journal of British Cinema and Television 9.3 (2012): 333-356.
House of Commons Scottish Affairs Committee. "Creative Industries in Scotland." Second Report of Session 2015-16. London: House of Commons, 2016.
Hutton, Thomas A. "The New Economy of the Inner City." Cities 21.2 (2004): 89-108.
Jensen, Rodney J.C. "The Spatial and Economic Contribution of Sydney's Visual Entertainment Industries." Australian Planner 48.1 (2011): 24-36.
Leadbeater, Charles, and Kate Oakley. Surfing the Long Wave: Knowledge Entrepreneurship in Britain. London: Demos, 2001.
Markusen, Ann. "Sticky Places in Slippery Space: A Typology of Industrial Districts." Economic Geography (1996): 293-313.
———. "Urban Development and the Politics of a Creative Class: Evidence from a Study of Artists." Environment and Planning A 38.10 (2006): 1921-1940.
McDonald, Adrian H. "Down the Rabbit Hole: The Madness of State Film Incentives as a 'Solution' to Runaway Production." University of Pennsylvania Journal of Business Law 14.85 (2011): 85-163.
Miller, Toby, Nitin Govil, John McMurria, Richard Maxwell, and Ting Wang. Global Hollywood 2. London: BFI, 2005.
Morawetz, Norbert, et al. "Finance, Policy and Industrial Dynamics: The Rise of Co-productions in the Film Industry." Industry and Innovation 14.4 (2007): 421-443.
Morgan, Kevin. "The Exaggerated Death of Geography: Learning, Proximity and Territorial Innovation Systems." Journal of Economic Geography 4.1 (2004): 3-21.
Mould, Oli. "Mission Impossible? Reconsidering the Research into Sydney's Film Industry." Studies in Australasian Cinema 1.1 (2007): 47-60.
O’Brien, Richard. Global Financial Integration: The End of Geography. London: Royal Institute of International Affairs, Pinter Publishers, 2002.
OlsbergSPI with Nordicity. "Economic Contribution of the UK’s Film, High-End TV, Video Game, and Animation Programming Sectors." Report presented to the BFI, Pinewood Shepperton plc, Ukie, the British Film Commission and Pact. London: BFI, Feb. 2015.
Pecknold, Diane. "Heart of the Country? The Construction of Nashville as the Capital of Country Music." Sounds and the City. London: Palgrave Macmillan UK, 2014. 19-37.
Picard, Robert G. Media Clusters: Local Agglomeration in an Industry Developing Networked Virtual Clusters. Jönköping International Business School, 2008.
Pratt, Andy C. "New Media, the New Economy and New Spaces." Geoforum 31.4 (2000): 425-436.
Reimer, Suzanne, Steven Pinch, and Peter Sunley. "Design Spaces: Agglomeration and Creativity in British Design Agencies." Geografiska Annaler: Series B, Human Geography 90.2 (2008): 151-172.
Sanson, Kevin. Goodbye Brigadoon: Place, Production, and Identity in Global Glasgow. Diss. University of Texas at Austin, 2011.
Scott, Allen J. "Creative Cities: Conceptual Issues and Policy Questions." Journal of Urban Affairs 28.1 (2006): 1-17.
———. Global City-Regions: Trends, Theory, Policy. Oxford: Oxford University Press, 2002.
Scott, Allen J., and Michael Storper. "Regions, Globalization, Development." Regional Studies 41.S1 (2007): S191-S205.
The Scottish Government. The Scottish Government Economic Strategy. Edinburgh: Scottish Government, 2015.
———. Growth, Talent, Ambition: The Government’s Strategy for the Creative Industries. Edinburgh: Scottish Government, 2011.
The Scottish Parliament Economy, Energy and Tourism Committee. The Economic Impact of the Film, TV and Video Games Industries. Edinburgh: Scottish Parliament, 2015.
Sydow, Jörg, and Udo Staber. "The Institutional Embeddedness of Project Networks: The Case of Content Production in German Television." Regional Studies 36.3 (2002): 215-227.
Szczepanik, Petr. "Globalization through the Eyes of Runners: Student Interns as Ethnographers on Runaway Productions in Prague." Media Industries 1.1 (2014).
Vallance, Paul. "Creative Knowing, Organisational Learning, and Socio-Spatial Expansion in UK Videogame Development Studios." Geoforum 51 (2014): 15-26.
VisitScotland. "Scotland Voted Best Cinematic Destination in the World." 2015. <https://www.visitscotland.com/blog/films/scotland-voted-best-cinematic-destination-in-the-world/>.
43

Cesarini, Paul. "‘Opening’ the Xbox." M/C Journal 7, no. 3 (July 1, 2004). http://dx.doi.org/10.5204/mcj.2371.

Full text
Abstract:
“As the old technologies become automatic and invisible, we find ourselves more concerned with fighting or embracing what’s new” —Dennis Baron, From Pencils to Pixels: The Stages of Literacy Technologies

What constitutes a computer, as we have come to expect it? Are they necessarily monolithic “beige boxes”, connected to computer monitors, sitting on computer desks, located in computer rooms or computer labs? In order for a device to be considered a true computer, does it need to have a keyboard and mouse? If this were 1991 or earlier, our collective perception of what computers are and are not would largely be framed by this “beige box” model: computers are stationary, slab-like, and heavy, and their natural habitats must be in rooms specifically designated for that purpose. In 1992, when Apple introduced the first PowerBook, our perception began to change. Certainly there had been other portable computers prior to that, such as the Osborne 1, but these were more luggable than portable, weighing just slightly less than a typical sewing machine. The PowerBook and subsequent waves of laptops, personal digital assistants (PDAs), and so-called smart phones from numerous other companies have steadily forced us to rethink and redefine what a computer is and is not, how we interact with them, and the manner in which these tools might be used in the classroom. However, this reconceptualization of computers is far from over, and is in fact steadily evolving as new devices are introduced, adopted, and subsequently adapted for uses beyond their original purpose. Pat Crowe’s Book Reader project, for example, has morphed Nintendo’s GameBoy and GameBoy Advance into a viable electronic book platform, complete with images, sound, and multi-language support. (Crowe, 2003) His goal was to take this existing technology, previously framed only within the context of proprietary adolescent entertainment, and repurpose it for open, flexible uses typically associated with learning and literacy. Similar efforts are underway to repurpose Microsoft’s Xbox, perhaps the ultimate symbol of “closed” technology given Microsoft’s propensity for proprietary code, in order to make it a viable platform for Open Source Software (OSS). However, these efforts are not foregone conclusions, and are in fact typical of the ongoing battle over who controls the technology we own in our homes, and how open source solutions are often at odds with a largely proprietary world. In late 2001, Microsoft launched the Xbox with a multimillion dollar publicity drive featuring events, commercials, live models, and statements claiming this new console gaming platform would “change video games the way MTV changed music”. (Chan, 2001) The Xbox launched with the following technical specifications:

733MHz Pentium III processor
64MB RAM
8 or 10GB internal hard disk drive
CD/DVD-ROM drive (speed unknown)
Nvidia graphics processor, with HDTV support
4 USB 1.1 ports (adapter required), AC3 audio
10/100 Ethernet port, optional 56K modem
(TechTV, 2001)

While current computers dwarf these specifications in virtually all areas now, for 2001 these were roughly on par with many desktop systems. The retail price at the time was $299, but steadily dropped to nearly half that, with additional price cuts anticipated. Based on these features, the preponderance of “off the shelf” parts and components used, and the relatively reasonable price, numerous programmers quickly became interested in seeing if it was possible to run Linux and additional OSS on the Xbox.
In each case, the goal has been similar: exceed the original purpose of the Xbox, to determine if and how well it might be used for basic computing tasks. If these attempts prove successful, the Xbox could allow institutions to dramatically improve the student-to-computer ratio in select environments, or allow individuals who could not otherwise afford a computer to instead buy an Xbox, download and install Linux, and use this new device to write, create, and innovate. This drive to literally and metaphorically “open” the Xbox comes from many directions. Such efforts include Andrew Huang’s self-published “Hacking the Xbox” book, in which, under the auspices of reverse engineering, Huang analyzes the architecture of the Xbox, detailing step-by-step instructions for flashing the ROM, upgrading the hard drive and/or RAM, and generally prepping the device for use as an information appliance. Additional initiatives include Lindows CEO Michael Robertson’s $200,000 prize to encourage Linux development on the Xbox, and the Xbox Linux Project at SourceForge.

What Is Linux?

Linux is an alternative operating system initially developed in 1991 by Linus Benedict Torvalds. Linux was based off a derivative of the MINIX operating system, which in turn was a derivative of UNIX. (Hasan 2003) Linux is currently available for Intel-based systems that would normally run versions of Windows, PowerPC-based systems that would normally run Apple’s Mac OS, and a host of other handheld, cell phone, or so-called “embedded” systems. Linux distributions are based almost exclusively on open source software, graphical user interfaces, and middleware components. While there are commercial Linux distributions available, these mainly just package the freely available operating system with bundled technical support, manuals, some exclusive or proprietary commercial applications, and related services. Anyone can still download and install numerous Linux distributions at no cost, provided they do not need technical support beyond the community / enthusiast level. Typical Linux distributions come with open source web browsers, word processors and related productivity applications (such as those found in OpenOffice.org), and related tools for accessing email, organizing schedules and contacts, etc. Certain Linux distributions are more or less designed for network administrators, system engineers, and similar “power users” somewhat distanced from our students. However, several distributions, including Lycoris, Mandrake, LindowsOS, and others, are specifically tailored as regular, desktop operating systems, with regular, everyday computer users in mind. As Linux has no draconian “product activation key” method of authentication, or digital rights management-laden features associated with installation and implementation on typical desktop and laptop systems, Linux is becoming an ideal choice both individually and institutionally. It still faces an uphill battle in terms of achieving widespread acceptance as a desktop operating system. As Finnie points out in Desktop Linux Edges Into The Mainstream: “to attract users, you need ease of installation, ease of device configuration, and intuitive, full-featured desktop user controls. It’s all coming, but slowly. With each new version, desktop Linux comes closer to entering the mainstream.
It’s anyone’s guess as to when critical mass will be reached, but you can feel the inevitability: There’s pent-up demand for something different.” (Finnie 2003)

Linux is already spreading rapidly in numerous capacities, in numerous countries. Linux has “taken hold wherever computer users desire freedom, and wherever there is demand for inexpensive software.” Reports from technology research company IDG indicate that roughly a third of computers in Central and South America run Linux. Several countries, including Mexico, Brazil, and Argentina, have all but mandated that state-owned institutions adopt open source software whenever possible to “give their people the tools and education to compete with the rest of the world.” (Hills 2001)

The Goal

Less than a year after Microsoft introduced the Xbox, the Xbox Linux Project formed. The Xbox Linux Project has a goal of developing and distributing Linux for the Xbox gaming console, “so that it can be used for many tasks that Microsoft don’t want you to be able to do. ...as a desktop computer, for email and browsing the web from your TV, as a (web) server” (Xbox Linux Project 2002). Since the Linux operating system is open source, meaning it can freely be tinkered with and distributed, those who opt to download and install Linux on their Xbox can do so with relatively little overhead in terms of cost or time. Additionally, Linux itself looks very “windows-like”, making for a fairly low learning curve. To help increase overall awareness of this project and assist in diffusing it, the Xbox Linux Project offers step-by-step installation instructions, with the end result being a system capable of using common peripherals such as a keyboard and mouse, scanner, printer, a “webcam and a DVD burner, connected to a VGA monitor; 100% compatible with a standard Linux PC, all PC (USB) hardware and PC software that works with Linux.” (Xbox Linux Project 2002) Such a system could have tremendous potential for technology literacy. Pairing an Xbox with Linux and OpenOffice.org, for example, would provide our students essentially the same capability any of them would expect from a regular desktop computer. They could send and receive email, communicate using instant messaging, IRC, or newsgroup clients, and browse Internet sites just as they normally would. In fact, the overall browsing experience for Linux users is substantially better than that for most Windows users. Internet Explorer, the default browser on all systems running Windows-based operating systems, lacks basic features standard in virtually all competing browsers. Native blocking of “pop-up” advertisements is still not possible in Internet Explorer without the aid of a third-party utility. Tabbed browsing, which involves the ability to easily open and sort through multiple Web pages in the same window, often with a single mouse click, is also missing from Internet Explorer. The same can be said for a robust download manager, “find as you type”, and a variety of additional features. Mozilla, Netscape, Firefox, Konqueror, and essentially all other OSS browsers for Linux have these features. Of course, most of these browsers are also available for Windows, but Internet Explorer is still considered the standard browser for the platform. If the Xbox Linux Project becomes widely diffused, our students could edit and save Microsoft Word files in OpenOffice.org’s Writer program, and do the same with PowerPoint and Excel files in similar OpenOffice.org components.
They could access instructor comments originally created in Microsoft Word documents, and in turn could add their own comments and send the documents back to their instructors. They could even perform many functions not yet possible in Microsoft Office, including saving files in PDF or Flash format without needing Adobe’s Acrobat product or Macromedia’s Flash Studio MX. Additionally, by way of this project, the Xbox can also serve as “a Linux server for HTTP/FTP/SMB/NFS, serving data such as MP3/MPEG4/DivX, or a router, or both; without a monitor or keyboard or mouse connected.” (Xbox Linux Project 2003) In a very real sense, our students could use these inexpensive systems, previously framed only within the context of entertainment, for educational purposes typically associated with computer-mediated learning.

Problems: Control and Access

The existing rhetoric of technological control surrounding current and emerging technologies appears to be stifling many of these efforts before they can even be brought to the public. This rhetoric of control is largely typified by overly-restrictive digital rights management (DRM) schemes antithetical to education, and by the Digital Millennium Copyright Act (DMCA). Combined, both are currently being used as technical and legal clubs against these efforts. Microsoft, for example, has taken a dim view of any efforts to adapt the Xbox to Linux. Microsoft CEO Steve Ballmer, who has repeatedly referred to Linux as a cancer and has characterized OSS as un-American, stated, “Given the way the economic model works - and that is a subsidy followed, essentially, by fees for every piece of software sold - our license framework has to do that.” (Becker 2003) Since the Xbox is based on a subsidy model, meaning that Microsoft actually sells the hardware at a loss and instead generates revenue off software sales, Ballmer launched a series of concerted legal attacks against the Xbox Linux Project and similar efforts. In 2002, Nintendo, Sony, and Microsoft simultaneously sued Lik Sang, Inc., a Hong Kong-based company that produces programmable cartridges and “mod chips” for the PlayStation 2, Xbox, and GameCube. Nintendo states that its company alone loses over $650 million each year due to piracy of its console gaming titles, which typically originate in China, Paraguay, and Mexico. (GameIndustry.biz) Currently, many attempts to “mod” the Xbox require the use of such chips. As Lik Sang is one of the only suppliers, initial efforts to adapt the Xbox to Linux slowed considerably. Despite the fact that such chips can still be ordered and shipped here by less conventional means, it does not change the fact that the chips themselves would be illegal in the U.S. due to the anticircumvention clause in the DMCA itself, which is designed specifically to protect any DRM-wrapped content, regardless of context. The Xbox Linux Project then attempted to get Microsoft to officially sanction their efforts. They were not only rebuffed, but Microsoft then opted to hire programmers specifically to create technological countermeasures for the Xbox, to defeat additional attempts at installing OSS on it. Undeterred, the Xbox Linux Project eventually arrived at a method of installing and booting Linux without the use of mod chips, and has taken a more defiant tone with Microsoft regarding its circumvention efforts.
(Lettice 2002) They state that “Microsoft does not want you to use the Xbox as a Linux computer, therefore it has some anti-Linux-protection built in, but it can be circumvented easily, so that an Xbox can be used as what it is: an IBM PC.” (Xbox Linux Project 2003)

Problems: Learning Curves and Usability

In spite of the difficulties imposed by the combined technological and legal attacks on this project, it has succeeded at infiltrating this closed system with OSS. It has done so beyond the mere prototype level, too, as evidenced by the Xbox Linux Project now having both complete, step-by-step instructions available for users to modify their own Xbox systems, and an alternate plan catering to those who have the interest in modifying their systems, but not the time or technical inclination. Specifically, this option involves users mailing their Xbox systems to community volunteers within the Xbox Linux Project, and basically having these volunteers perform the necessary software preparation or actually do the full Linux installation for them, free of charge (presumably not including shipping). This particular aspect of the project, dubbed “Users Help Users”, appears to be fairly new. Yet it already lists over sixty volunteers capable and willing to perform this service, since “Many users don’t have the possibility, expertise or hardware” to perform these modifications. Amazingly enough, in some cases these volunteers are barely out of junior high school. One such volunteer stipulates that those seeking his assistance keep in mind that he is “just 14” and that when performing these modifications he “...will not always be finished by the next day”. (Steil 2003) In addition to this interesting, if somewhat unusual, level of community-driven support, there are currently several Linux-based options available for the Xbox. The two that are perhaps the most developed are GentooX, which is based on the popular Gentoo Linux distribution, and Ed’s Debian, based on the Debian GNU/Linux distribution. Both Gentoo and Debian are “seasoned” distributions that have been available for some time now, though Daniel Robbins, Chief Architect of Gentoo, refers to the product as actually being a “metadistribution” of Linux, due to its high degree of adaptability and configurability. (Gentoo 2004) Specifically, Robbins asserts that Gentoo is capable of being “customized for just about any application or need. ...an ideal secure server, development workstation, professional desktop, gaming system, embedded solution or something else—whatever you need it to be.” (Robbins 2004) He further states that the whole point of Gentoo is to provide a better, more usable Linux experience than that found in many other distributions. Robbins states that: “The goal of Gentoo is to design tools and systems that allow a user to do their work pleasantly and efficiently as possible, as they see fit. Our tools should be a joy to use, and should help the user to appreciate the richness of the Linux and free software community, and the flexibility of free software. ...Put another way, the Gentoo philosophy is to create better tools. When a tool is doing its job perfectly, you might not even be very aware of its presence, because it does not interfere and make its presence known, nor does it force you to interact with it when you don’t want it to.
The tool serves the user rather than the user serving the tool.” (Robbins 2004) There is also a so-called “live CD” Linux distribution suitable for the Xbox, called dyne:bolic, as well as an in-progress release of Slackware Linux. According to the Xbox Linux Project, the only difference between the standard releases of these distributions and their Xbox counterparts is that “...the install process – and naturally the bootloader, the kernel and the kernel modules – are all customized for the Xbox.” (Xbox Linux Project, 2003) Of course, even if Gentoo is as user-friendly as Robbins purports, even if the Linux kernel itself has become significantly more robust and efficient, and even if Microsoft again drops the retail price of the Xbox, is this really a feasible solution in the classroom? Does the Xbox Linux Project have an army of 14-year-olds willing to modify dozens, perhaps hundreds of these systems for use in secondary schools and higher education? Of course not. If such an institutional rollout were to be undertaken, it would require significant support from not only faculty, but Department Chairs, Deans, IT staff, and quite possibly Chief Information Officers. Disk images would need to be customized for each institution to reflect their respective needs, ranging from setting specific home pages on web browsers, to bookmarks, to custom back-up and/or disk re-imaging scripts, to network authentication (a brief sketch of one such customization step appears below). This would be no small task. Yet the steps mentioned above are essentially no different than what would be required of any IT staff when creating a new disk image for a computer lab, be it one for a Windows-based system or a Mac OS X-based one. The primary difference would be Linux itself—nothing more, nothing less. The institutional difficulties in undertaking such an effort would likely be encountered prior to even purchasing a single Xbox, in that they would involve the same difficulties associated with any new hardware or software initiative: staffing, budget, and support. If the institution in question is either unwilling or unable to address these three factors, it would not matter if the Xbox itself was as free as Linux.

An Open Future, or a Closed One?

It is unclear how far the Xbox Linux Project will be allowed to go in its efforts to invade an essentially proprietary system with OSS. Unlike Sony, which has made deliberate steps to commercialize similar efforts for its PlayStation 2 console, Microsoft appears resolute in fighting OSS on the Xbox by any means necessary. It will continue to crack down on any companies selling so-called mod chips, and will continue to employ technological protections to keep the Xbox “closed”. Despite clear evidence to the contrary, in all likelihood Microsoft will continue to equate any OSS efforts directed at the Xbox with piracy-related motivations. Additionally, Microsoft’s successor to the Xbox would likely include additional anticircumvention technologies that could set the Xbox Linux Project back by months or years, or could stop it cold. Of course, it is difficult to say with any degree of certainty how this “Xbox 2” (perhaps a more appropriate name might be “Nextbox”) will impact this project. Regardless of how this device evolves, there can be little doubt of the value of Linux, OpenOffice.org, and other OSS to teaching and learning with technology. This value exists not only in terms of price, but in increased freedom from policies and technologies of control.
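As a concrete illustration of the disk-image customization discussed above, the following is a minimal, hypothetical sketch written in Python. It is not part of the Xbox Linux Project's tooling; the mount point, profile path, and institution URL are illustrative assumptions. Only two details are standard: on Linux, the contents of /etc/skel are copied into each newly created user's home directory, and Mozilla-family browsers read user_pref lines (including browser.startup.homepage) from a profile's user.js file.

# A minimal, hypothetical sketch: stamp an institution-specific browser
# home page into the skeleton profile of an unpacked Linux disk image.
# Paths and the URL below are illustrative assumptions only.
from pathlib import Path

def set_default_homepage(image_root: str, homepage: str) -> None:
    """Write a Mozilla-style user.js that sets the browser start page."""
    # /etc/skel is copied into every new user's home directory on Linux,
    # so a preference placed here applies to each account on the image.
    profile_dir = Path(image_root) / "etc/skel/.mozilla/default"
    profile_dir.mkdir(parents=True, exist_ok=True)
    # browser.startup.homepage is the Mozilla-family preference key for
    # the page loaded when the browser starts.
    (profile_dir / "user.js").write_text(
        'user_pref("browser.startup.homepage", "%s");\n' % homepage
    )

if __name__ == "__main__":
    # Hypothetical mount point for the unpacked disk image.
    set_default_homepage("/mnt/xbox-image", "http://www.example.edu/")

In practice, an IT department would presumably chain dozens of such steps, covering bookmarks, authentication settings, and re-imaging hooks, in a single script run once before the image is duplicated across machines.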
New Linux distributions from Gentoo, Mandrake, Lycoris, Lindows, and other companies are just now starting to focus their efforts on Linux as a user-friendly, easy-to-use desktop operating system, rather than just a server or “techno-geek” environment suitable for advanced programmers and computer operators. While metaphorically opening the Xbox may not be for everyone, and may not be a suitable computing solution for all, I believe we as educators must promote and encourage such efforts whenever possible. I suggest this because I believe we need to exercise our professional influence and ultimately shape the future of technology literacy, either individually as faculty or collectively as departments, colleges, or institutions. Moran and Fitzsimmons-Hunter argue this very point in Writing Teachers, Schools, Access, and Change. One of the fundamental provisions they use to define “access” asserts that there must be a willingness for teachers and students to “fight for the technologies that they need to pursue their goals for their own teaching and learning.” (Taylor / Ward 160) Regardless of whether or not this debate is grounded in the “beige boxes” of the past, or the Xboxes of the present, much is at stake. Private corporations should not be in a position to control the manner in which we use legally-purchased technologies, regardless of whether or not these technologies are then repurposed for literacy uses. I believe the exigency associated with this control, and the ongoing evolution of what is and is not a computer, dictates that we assert ourselves more actively into this discussion. We must take steps to provide our students with the best possible computer-mediated learning experience, however seemingly unorthodox the technological means might be, so that they may think critically, communicate effectively, and participate actively in society and in their future careers.

About the Author

Paul Cesarini is an Assistant Professor in the Department of Visual Communication & Technology Education, Bowling Green State University, Ohio. Email: pcesari@bgnet.bgsu.edu

Works Cited

<http://xbox-linux.sourceforge.net/docs/debian.php>.
Baron, Dennis. “From Pencils to Pixels: The Stages of Literacy Technologies.” Passions, Pedagogies and 21st Century Technologies. Eds. Gail E. Hawisher and Cynthia L. Selfe. Utah: Utah State University Press, 1999. 15-33.
Becker, David. “Ballmer: Mod Chips Threaten Xbox.” News.com. 21 Oct 2002. <http://news.com.com/2100-1040-962797.php>.
<http://news.com.com/2100-1040-978957.html?tag=nl>.
<http://archive.infoworld.com/articles/hn/xml/02/08/13/020813hnchina.xml>.
<http://www.neoseeker.com/news/story/1062/>.
<http://www.bookreader.co.uk>.
Finnie, Scott. “Desktop Linux Edges Into The Mainstream.” TechWeb. 8 Apr 2003. <http://www.techweb.com/tech/software/20030408_software>.
<http://www.theregister.co.uk/content/archive/29439.html>.
<http://gentoox.shallax.com/>.
<http://ragib.hypermart.net/linux/>.
<http://www.itworld.com/Comp/2362/LWD010424latinlinux/pfindex.html>.
<http://www.xbox-linux.sourceforge.net>.
<http://www.theregister.co.uk/content/archive/27487.html>.
<http://www.theregister.co.uk/content/archive/26078.html>.
<http://www.us.playstation.com/peripherals.aspx?id=SCPH-97047>.
<http://www.techtv.com/extendedplay/reviews/story/0,24330,3356862,00.html>.
<http://www.wired.com/news/business/0,1367,61984,00.html>.
<http://www.gentoo.org/main/en/about.xml>.
<http://www.gentoo.org/main/en/philosophy.xml>.
<http://techupdate.zdnet.com/techupdate/stories/main/0,14179,2869075,00.html>.
<http://xbox-linux.sourceforge.net/docs/usershelpusers.html>.
<http://www.cnn.com/2002/TECH/fun.games/12/16/gamers.liksang/>.

Citation reference for this article

MLA Style
Cesarini, Paul. “‘Opening’ the Xbox.” M/C: A Journal of Media and Culture 7.3 (2004). <http://www.media-culture.org.au/0406/08_Cesarini.php>.

APA Style
Cesarini, P. (2004, Jul. 1). ‘Opening’ the Xbox. M/C: A Journal of Media and Culture, 7. <http://www.media-culture.org.au/0406/08_Cesarini.php>