Journal articles on the topic 'Strategic alliances (Business) – Computer network resources'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Select a source type:

Consult the top 29 journal articles for your research on the topic 'Strategic alliances (Business) – Computer network resources.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click on it, and we will automatically generate the bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse journal articles in a wide variety of disciplines and organise your bibliography correctly.

1

Zhao, Fang. "Taking a Strategic Alliance Approach to Enhance M-Commerce Development." International Journal of E-Business Research 6, no. 4 (October 2010): 26–37. http://dx.doi.org/10.4018/jebr.2010100103.

Full text
Abstract:
Current m-commerce business models show that m-commerce depends on complex networks of business relationships, which often comprise telecommunications service providers, mobile device makers, the banking industry, Internet search engine providers, and various third-party value-adding companies. Due to the nature of m-commerce, the key to success lies predominantly in managing a network of alliances. This paper answers research questions such as: why do companies team up for m-commerce? What are the key challenges facing the alliances? How can companies address the challenges? What does the future hold for the study of strategic alliances, including m-commerce alliances? The paper extends strategic alliance theories to the study of m-commerce alliances formed against various cultural and national backgrounds. The authors examine both strategic and operational strategies for m-commerce alliances and discuss a wide range of issues in the formulation and implementation of m-commerce alliance strategy.
APA, Harvard, Vancouver, ISO, and other styles
2

Liginlal, Divakaran, Lara Khansa, and Stella C. Chia. "Using Real Options Theory to Evaluate Strategic Investment Options for Mobile Content Delivery." International Journal of Business Data Communications and Networking 6, no. 1 (January 2010): 17–37. http://dx.doi.org/10.4018/jbdcn.2010010102.

Full text
Abstract:
With a rich fare of localized content but limited regional media outlet channels, mobile content generates new business opportunities for Media News, a small media company with considerable growth potential. Two business models are considered: partnering with wireless service providers, and strategic alliances with mobile content syndicators. First, the models are evaluated based on their resource requirements, market share acquisition, revenue generation, and the nature, scope and control of content and bandwidth. Then, real options analysis is used to value Media News’ managerial flexibility in responding to uncertainty in investment choices specific to the media industry. The modeling approach, analytical methods, and decision support tools employed in this paper serve as an exemplar for engineering managers involved in strategic investment decisions, especially in emerging areas such as mobile commerce.
APA, Harvard, Vancouver, ISO, and other styles
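As an aside for readers unfamiliar with the technique, the sketch below illustrates the general idea behind real options analysis using a simple Cox-Ross-Rubinstein binomial lattice for an option to defer an investment. It is not the authors' model: the function name and all figures are hypothetical, and the paper's actual valuation of Media News' investment choices may differ substantially.

```python
# Minimal binomial-lattice sketch of a real (deferral) option, in the spirit of
# the real-options analysis described above. All parameters are hypothetical.
import math

def deferral_option_value(pv_cash_flows, investment_cost, volatility,
                          risk_free_rate, years, steps=100):
    """Value the option to defer an investment using a Cox-Ross-Rubinstein lattice."""
    dt = years / steps
    u = math.exp(volatility * math.sqrt(dt))                 # up factor
    d = 1 / u                                                # down factor
    p = (math.exp(risk_free_rate * dt) - d) / (u - d)        # risk-neutral probability
    discount = math.exp(-risk_free_rate * dt)

    # Terminal payoffs: invest only if the project value exceeds the cost.
    values = [max(pv_cash_flows * (u ** j) * (d ** (steps - j)) - investment_cost, 0.0)
              for j in range(steps + 1)]

    # Roll back through the lattice to the present.
    for _ in range(steps):
        values = [discount * (p * values[j + 1] + (1 - p) * values[j])
                  for j in range(len(values) - 1)]
    return values[0]

if __name__ == "__main__":
    # Hypothetical figures for a mobile-content investment decision.
    print(round(deferral_option_value(pv_cash_flows=10.0, investment_cost=9.0,
                                      volatility=0.45, risk_free_rate=0.03,
                                      years=2.0), 3))
```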
3

Chen, I.-Fen, and Shao-Chi Chang. "The intra business group effects of alliance network extensions." Management Decision 54, no. 6 (July 11, 2016): 1420–42. http://dx.doi.org/10.1108/md-06-2015-0223.

Full text
Abstract:
Purpose – The purpose of this paper is to better understand the influence of business group membership by exploring how actions by a member firm influence other firms in the business group. Specifically, the authors ask two questions in this study: when a member firm forms strategic alliances with partners outside of the business group, how does the alliance influence other members in the business group? Moreover, which types of member firms are more affected than others? Design/methodology/approach – The authors employ standard event-study methodology to examine the stock price responses of the focal and member firms to the announcement of an alliance. Moreover, the authors employ cross-sectional regression analyses to test hypotheses concerning the impact of alliance, group, and firm characteristics on the cumulative abnormal returns of non-announcing members. All regressions are estimated using ordinary least squares. Findings – The results show that, on average, alliance-announcing member firms experience significantly positive share price responses to announcements of strategic alliances. Moreover, the impact of alliance formation spills over to other non-announcing members in the business group. The authors also find that the influences on the non-announcing members are dissimilar. The non-announcing members are more strongly affected when they are in different industries from the non-member partner, and when the ownership of the business group is more concentrated. Originality/value – This study extends the resource complementarities perspective, which may help firms to more effectively configure their network portfolios in order to develop synergies among related network resources. The study thus extends the alliance portfolio literature to the literature on business groups. Since the inter-firm networks within business groups are more complex than those in alliance portfolios, the authors are able to study how the structure of a business group, such as ownership concentration, can influence the intra-network effect.
APA, Harvard, Vancouver, ISO, and other styles
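For orientation, a minimal sketch of the standard event-study steps the abstract refers to (estimate a market model over a pre-event window, then cumulate abnormal returns over the announcement window) follows. It is illustrative only: the data are simulated, the window lengths and function names are invented, and the paper's cross-sectional regressions on the resulting CARs are not reproduced.

```python
# Illustrative sketch (not the authors' code) of the standard event-study steps:
# estimate a market model over a pre-event window, then cumulate abnormal
# returns over the announcement window. Data and window lengths are hypothetical.
import numpy as np

def cumulative_abnormal_return(firm_ret, market_ret, estimation_end, event_start, event_end):
    """CAR from a market model R_i = alpha + beta * R_m + eps."""
    est_firm = firm_ret[:estimation_end]
    est_mkt = market_ret[:estimation_end]
    beta, alpha = np.polyfit(est_mkt, est_firm, 1)           # OLS slope and intercept
    window = slice(event_start, event_end + 1)
    abnormal = firm_ret[window] - (alpha + beta * market_ret[window])
    return abnormal.sum()

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    market = rng.normal(0.0005, 0.01, 260)                   # ~1 trading year of returns
    firm = 0.0002 + 1.1 * market + rng.normal(0, 0.012, 260)
    firm[250:253] += 0.01                                     # hypothetical announcement effect
    print(round(cumulative_abnormal_return(firm, market, 250, 250, 252), 4))
```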
4

Parkhomenko, N. O. "A Comparative Characterization of the Strategies for Organizational Development of Business Systems in Global Environment." Business Inform 11, no. 514 (2020): 385–92. http://dx.doi.org/10.32983/2222-4459-2020-11-385-392.

Full text
Abstract:
The research is aimed at comparing the strategies for organizational development of business systems in the global environment and determining the priorities of their application in business practice. In analyzing approaches to strategic management, the peculiarities of applying the strategies for organizational development of business systems are considered, and the strategies are systematized into the following three groups: non-interference, development, and decline. Considerable attention is paid to development strategies: internal growth; external growth – horizontal and vertical integration, concentric and conglomerate diversification, network structures, strategic alliances, outsourcing. As a result of the research, it is determined that the global environment is characterized by a significant degree of interstate and inter-firm competition. In order to make the most of market opportunities, minimize threats, and adapt to changes in a timely manner, taking into account business conditions in different countries, it is proposed to use innovative strategies of organizational development more actively. The strengths and weaknesses of each of the strategies considered are identified. The feasibility of using growth strategies is substantiated, namely strategies of network structures and strategic alliances for powerful business systems, and outsourcing strategies for medium and small enterprises limited in resources. At the same time, it is proposed to combine various strategies for the organizational development of business systems to achieve various goals. Prospects for further research in this direction lie in forming methodological support for evaluating strategic alternatives when substantiating the choice of a strategy for the organizational development of business systems in the global environment.
APA, Harvard, Vancouver, ISO, and other styles
5

Neretina, Evgeniya, and Anastasiya Grishneva. "METHODOLOGICAL BASES AND PRACTICAL EXPERIENCE OF THE FOREIGN COMPANIES IN THE ORGANIZATION OF NETWORK INTERACTION OF SUBJECTS OF INNOVATIVE BUSINESS." Russian Journal of Management 4, no. 4 (December 8, 2016): 575–80. http://dx.doi.org/10.12737/22558.

Full text
Abstract:
Under the globalization of social and economic processes and the growing dynamism and uncertainty of environmental factors, network interaction among economic agents, especially in the innovation sphere, becomes an important instrument for building competitive advantage through access to various combinations of ideas, advanced knowledge and technologies, and unique resources. The article therefore covers the methodological basics of creating inter-company networks: the categories that describe the content of inter-subject networks, the dual approach to the structural interaction of subjects of innovative activity, and the theoretical prerequisites for the mechanisms that form network structures (the theories of information, chaos, synergetics and non-equilibrium thermodynamics). The practice of network interaction within the alliance portfolios created by the international companies Sony and Samsung is studied, revealing features of the organization of intra-firm interaction among structural divisions and company management, as well as of the construction of alliance portfolios. Various network configurations, with their advantages and shortcomings, are analysed, and the factors influencing the success of network interaction among subjects of innovative business, and their competitive advantages, are established. It is shown that Russian business still lacks an understanding of the importance of changing the organizational forms of both intra-firm and inter-company interaction, and of making wide use of the experience in creating inter-industry and international networks, based on strategic partnerships and alliances, accumulated by foreign companies.
APA, Harvard, Vancouver, ISO, and other styles
6

STAVSKA, Yulia. "THE DIRECTIONS OF INNOVATIVE CHANGES IN THE DEVELOPMENT OF TOURISM IN UKRAINE." "ECONOMY. FINANCES. MANAGEMENT: Topical issues of science and practical activity", no. 2 (42) (February 2019): 45–57. http://dx.doi.org/10.37128/2411-4413-2019-2-4.

Full text
Abstract:
In the context of globalization, the sphere of tourism becomes more and more important, since it is one of the most profitable and dynamic branches of the world economy. Tourism contributes to the growth of employment and accelerates the diversification of the economy, because this sphere covers more than 50 branches of the national economy; innovation should therefore become a constant component of the tourist sphere. In addition, tourism not only preserves but also develops the cultural potential of the country and the population, harmonizes relations between different peoples and contributes to the friendly use of the environment. Indicators of its profitability show the importance of tourism for the development of the economy, namely: 8% of world exports and 30% of world services sales, as well as annual growth of world tourist flows by 4-5%. However, Ukraine loses significantly in the competition, lagging behind the leading countries of the world in terms of the development of tourist infrastructure and the quality of tourist services. The financial and economic crisis of recent years and the events associated with the annexation of the Autonomous Republic of Crimea and the anti-terrorist operation on the territory of the Donetsk and Luhansk regions have had a negative impact on tourism flows, the structure of tourism and the tourist opportunities of the country on both the domestic and the external tourism market. In addition, Ukrainians have recently received the possibility of visa-free travel to the countries of the European Union, which will also lead to an increase in outbound tourism and a reduction in inbound tourism. In this regard, it is necessary to introduce innovations in the development of tourism in Ukraine in order to attract visitors. Innovation in tourism "involves developing an original approach, developing new ways to use existing resources while seeking new ones". A typology of innovation in tourism was developed by Abernathy and Clark, who distinguish four types of innovation: regular, niche, revolutionary and architectural. Regular innovations refer to the continuous improvement of the quality of services and the improvement of personnel skills and productivity. Niche innovations usually change the structure of cooperation, but not basic knowledge and skills; they combine existing services in a new way. Revolutionary innovations are associated with the use of new technologies in firms and the development of new methods; they have a radical impact on the key knowledge and skills of the staff, up to the appearance of new occupations in the sector. Architectural innovations change the structure, business model and rules in tourism; they create new events and objects that require reorganization, change physical or institutional infrastructure, and create research and training facilities. One of the main directions of innovation is the development of sustainable tourism. Destinations (tourist areas) are beginning to create management systems for tourism resources and to use them in the planning of territorial development. Recently it has become clear that the directions of innovation in tourism are increasingly systemic in nature, as tourism turns from an industry into a "public enterprise" that involves a variety of institutional stakeholders: government, business, local communities, science and education.
At the same time, it is remarkable that the more the system of tourism management shifts from industrial to post-industrial, the greater the innovative potential of the tourism sphere becomes, and the more tourism shifts from mass to individualized tourism. This is due to finer segmentation of the market and the emergence of many niche, specialized services and tourist products. Tourism infrastructure, buildings and equipment are being adapted to the needs of such new, growing groups of tourists as children, the elderly and the disabled. Innovations in the tourism industry, according to Novikov V.S., “are the result of actions aimed at creating a new or changing existing tourist product, developing new markets, using advanced information and telecommunication technologies, improving the provision of tourist, transport and hotel services, creating strategic alliances for realization of tourist business, introduction of modern forms of organizational and managerial activity of tourist enterprises”. In general, tourism is a multi-vector industry, which can create some difficulties in introducing innovations. At the same time, certain areas of tourism activity are impossible without the use of innovative technologies, in particular information and computer technologies. Effective administrative work of tourism enterprises involves the application of various facilities, and Ukrainian travel agencies widely use the information sector, in the form of messages, and the virtual sector of travel in global networks. Studies conducted in the field of tourism demonstrate the "consumer nature" of information innovation: tourism is not a producer in this field. According to preliminary estimates, the costs incurred by initiative tour operators in Ukraine for maintaining an on-line booking system amount to about 2% of the cost of services sold. Sales of entertainment and recreation tours provide about 3.3 million US dollars annually; money is spent on developing, maintaining and filling in the information support system. Common directions of the use of innovative technologies in tourism are: the mobile Internet, catalogues of electronic offers, and on-line booking not only for retail agencies but also directly for clients. Further development of innovation activity lies in developing new ideas for the promotion of the tourist product, developing new tourist routes, making information available to the consumer, software, etc.
APA, Harvard, Vancouver, ISO, and other styles
7

Napoli, Francesco. "Corporate governance and firm networks: An empirical research based on Italy." Corporate Ownership and Control 15 (2018): 231–47. http://dx.doi.org/10.22495/cocv15i2c1p9.

Full text
Abstract:
We examine problems of strategic change and innovation in Italian firms which develop cooperative relationships with other firms. The inter-firm network phenomenon has taken on such importance in Italy that, in 2009, the State issued a law (Decreto Legge 5/2009) specifically to regulate the concluding of cooperative contracts for the formation of inter-firm networks. This law offers firms that wish to keep their groups of owners separate the possibility to establish a multiplicity of inter-firm relationships through the signing of just one single contract, named “Contratto di rete”, which, in this paper, we will refer to as a “network contract”. For historical reasons, all firms in Italy, even those quoted on the stock market (Milan Stock Exchange), exhibit a high level of ownership concentration. The largest class of blockholders is that of families who are active in the family firm. As regards the size of firms that maintain cooperative relationships, data on network contracts show that 95% of the firms stipulating these contracts are small- or medium-sized enterprises (SMEs), so categorised because they have fewer than 50 employees. Through strategic alliances and collaborative relationships, Italian family firms have been able to develop business ideas that, as a consequence of the companies’ small dimensions, would have been impossible otherwise. On the basis of this premise, we considered it convenient to analyse small- or medium-sized family firms that developed relationships of cooperation regulated by network contracts in the period between 1/1/2013 and 31/12/2016. With reference to this category of firm, we analysed data on strategic change and innovation for a sample of 391 firms that accepted to be interviewed by us. Some of these firms had opened their top management teams (TMT) and/or their Boards of Directors to the participation of individuals from outside the dominant family, while others had not. The results of this research show that the firm that extends participation in the board or the Top Management Team by involving individuals from outside the dominant family, so as to gain better access to critical resources controlled by partners, creates a more favourable context for strategic change and innovation.
APA, Harvard, Vancouver, ISO, and other styles
8

Morandi, Valentina, and Francesca Sgobbi. "Learning in Networks of SMEs." International Journal of Human Capital and Information Technology Professionals 2, no. 1 (January 2011): 66–79. http://dx.doi.org/10.4018/jhcitp.2011010105.

Full text
Abstract:
This paper contributes to the debate on the participation of SMEs in voluntary business networks by framing the relationship between the different types of network-based learning. Learning about networking, which concerns the capability to set, manage, and terminate a strategic alliance, is opposed to learning by networking, which involves the sharing and the joint creation of technical knowledge. The proposed framework is tested in the case of a network of Italian SMEs in the ICT sector. Empirical evidence confirms that learning about networking enables learning by networking and helps to balance those tensions and conflicts that inevitably mark the existence of inter-firm networks. Learning about alliance management provides networked IT entrepreneurs with the capabilities to compete against larger competitors. As learning paths also drive the evolution of inter-firm alliances, networked entrepreneurs would benefit from choosing collective goals in line with their alliance management capabilities.
APA, Harvard, Vancouver, ISO, and other styles
9

Colapinto, Cinzia, Laura Gavinelli, Mariangela Zenga, and Angelo Di Gregorio. "Different approaches to the pursuit of internationalization by Italian SMEs." Journal of Research in Marketing and Entrepreneurship 17, no. 2 (October 19, 2015): 229–48. http://dx.doi.org/10.1108/jrme-11-2014-0030.

Full text
Abstract:
Purpose – The aim of this paper is to analyse why Italian small and medium enterprises (SMEs) pursue internationalization (current and future entry modes, motivations, advantages and difficulties) and how they go about it, with reference to four key areas: innovation and technology, networking, environmental approach and human resource (HR) competences. Design/methodology/approach – A questionnaire was distributed to 792 enterprises with a response rate of 24.37 per cent. Data were collected using the computer assisted web interviewing (CAWI) method and processed with Rasch analysis, Principal Components Analysis and Cluster analysis methods. Findings – The paper presents the results of a quantitative research on SMEs located in the Province of Monza and Brianza – one of the most productive territories in Italy. Four different clusters emerged with specific approaches. Briefly, this paper points out that: innovation is mostly linked to the product and is incremental; HR and their competences are crucial for facing complex markets; the green issue is not dominant (it is considered only for saving energy and reducing cost production); and networking is not a key issue (except informal relations, contractual agreements and strategic alliances). Research limitations/implications – The research could be extended: through a longitudinal survey on the same sample; by covering different territories on the same topics. The cluster analysis identifies potential guidelines for entrepreneurial behaviour in respect to key factors for exiting from the economic and financial crisis: innovation and technology, formal and informal networks, the “green” approach, HR training. Originality/value – This paper presents a new interdisciplinary approach that may work beyond country boundaries, providing a new basis to the debate on the internationalization of SMEs.
APA, Harvard, Vancouver, ISO, and other styles
10

Daoping, Wang, Wei Xiaoyan, and Fang Fang. "The resource evolution of standard alliance by technology standardization." Chinese Management Studies 10, no. 4 (November 7, 2016): 787–801. http://dx.doi.org/10.1108/cms-08-2016-0169.

Full text
Abstract:
Purpose This paper aims to explore the evolution mechanism of resources in a standard alliance that are matched with resources required at different standardization stages from the viewpoint of dynamic matching. How core enterprises in an alliance allocate resources, select member enterprises and maintain the normal operation of an alliance, according to the resource evolution of a standard alliance, is an important issue when dealing with the implementation of technology standardization. Design/methodology/approach The authors have chosen the Intelligent Grouping and Resource Sharing (IGRS) standard alliance of computer companies in China as the object of this study. The authors have built indices to identify core enterprises in the alliance from the viewpoint of network organization. The authors also collected data from authoritative news websites concerning patents and cooperative projects undertaken by 216 enterprises in the IGRS alliance during the period from 2002 to 2016, and they have computed and analyzed these data by using UCINET 6.0 software and social network analysis methodology to identify core enterprises at different standardization stages, thus revealing the evolution mechanism for resources in the standard alliance. Findings Technology standardization is divided into R&D, industrialization and marketization stages, and the standard alliance requires different resources to satisfy what is required at each of those different standardization stages. While technology standardization is a process during which technology systems standards are continuously being perfected and the standard product market is continuously expanding, the development of technology standardization affects the evolutionary processes of the core enterprises and affects the selection of member enterprises in the standard alliance. Practical implications The results obtained will assist the standard alliance to select proper member enterprises and dynamically match the alliance’s resources with the resources required at different standardization stages to speed up the implementation of independent standardization in China. Originality/value This study demonstrates the evolution mechanism of resources in technology standard alliances at different standardization stages by using quantitative analysis methodology, and it enriches the research on which elements are influential for technology standardization’s development in the context of China’s social, economic and cultural characteristics.
APA, Harvard, Vancouver, ISO, and other styles
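The study identifies core enterprises with network indices computed in UCINET from cooperation data. As a rough illustration of that idea only (not the paper's actual indices or data), the sketch below ranks firms in a small, invented cooperation network by standard centrality measures using the networkx library.

```python
# Hypothetical sketch of identifying "core" enterprises in an alliance network
# from pairwise cooperation records, using standard centrality indices
# (the paper uses UCINET; this only illustrates the general idea).
import networkx as nx

cooperation_records = [            # hypothetical (firm_a, firm_b, joint_projects)
    ("FirmA", "FirmB", 5), ("FirmA", "FirmC", 3), ("FirmB", "FirmC", 1),
    ("FirmA", "FirmD", 4), ("FirmC", "FirmD", 2), ("FirmE", "FirmD", 1),
]

G = nx.Graph()
for a, b, w in cooperation_records:
    G.add_edge(a, b, weight=w)

degree = nx.degree_centrality(G)
betweenness = nx.betweenness_centrality(G, weight="weight")

# Rank firms by a simple composite score; the real study would use its own indices.
score = {n: degree[n] + betweenness[n] for n in G}
for firm, s in sorted(score.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{firm}: {s:.3f}")
```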
11

Popovich, Aleksey. "Innovative Approach to Management Education’s Content and Structure Improving." Administration 3, no. 4 (December 10, 2015): 96–102. http://dx.doi.org/10.12737/16703.

Full text
Abstract:
For the formation and development of tomorrow’s highly effective manager, competences drawn only from classic management and/or classic emotional and even business leadership are no longer enough. An optimum combination of universal values, strategic anticipation, social leadership, and universal and special management technologies is necessary. The paper considers aspects of managing the innovation process in an organization’s holistic development, identifies the main groups of higher-school development strategies, and characterizes two types of innovative processes in the education system. The author emphasizes that the traditional pooling of effort in the training of specialists through a formal merger of several structures is ultimately ineffective. More productive is the consolidation of efforts (in parallel with formal associations) through contractual relationships and the creation of basic chairs at enterprises, strategic alliances, consortia, network organizations, educational and industrial groups, technology parks, business parks, and innovative educational and industrial clusters, in other words, through the creation of formal virtual structures. These chairs should become growth points for the future organizational structures that train graduates, in other words, mini-academies providing high-level training and the development of science. The paper discusses the creation of new forms, and the experience of organizing the training of a new generation of managerial human resources, through formal virtual structures created together with authorities, higher educational institutions, business representatives, secondary schools and preschool facilities. Implementing this project requires several stages. The first is the creation of an Institute of Management (as a faculty) as the core organizational structure, in the form of a unit of a higher educational institution. A sample structure of such an institute is proposed, and the practical experience of forming such structures is presented. In the course of their development the institutes can be transformed into more complex structures (e.g., academies or management universities), combining traditional organizational structures with the creation of strategic, formally virtual associations with various educational and industrial structures at the municipal, regional, federal and international levels.
APA, Harvard, Vancouver, ISO, and other styles
12

Della Corte, Valentina. "The light side and the dark side of inter-firm collaboration: How to govern distrust in business networks." Corporate Ownership and Control 6, no. 4 (2009): 407–26. http://dx.doi.org/10.22495/cocv6i4c3p6.

Full text
Abstract:
In the face of globalization, hypercompetition and turbulence (D’Aveni, 1994, 1995), inter-firm relationships are increasing exponentially: alliances, partnerships, social groups, clans. Networks are becoming a prevailing organizational form in the 21st century (Cravens, Piercy, 1994). The unit of analysis in this article is strategic systems, and more precisely the strategic network that develops within a territory (business districts, destinations) or a virtual set and that is even denser and more complex than ordinary networks: local resources can be relevant for the whole aggregate, and relations are also physically or virtually particularly close. Strategic networks and inter-firm collaborations have often been analysed with respect to their main success factors. Less attention has been paid to the more obscure and less satisfying aspects that somehow explain why, in some cases, they fail or at least do not take off. Even the theoretical frameworks usually adopted, such as Resource-Based Theory (Rumelt, 1982; Wernerfelt, 1984; Barney, 1991, 2007), Transaction Cost Economics (Williamson, 1975, 1981) and Social Network Theory (Granovetter, 1973, 1982; Liebeskind et al., 1996; Wasserman, Faust, 1999), are used according to a positive approach, aimed at finding and analyzing mainly successful initiatives. The aim of this article is to analyse, in particular, situations of distrust, which can either persist, pushing firms not to cooperate, or evolve towards more trustful situations with more chances of really developing business networks. A specific model is proposed to manage distrust and to evolve towards trustful situations. The process, however, requires the specific intervention of a network governance actor that can stimulate it. This actor must have distinctive capabilities and competences to manage the process. The proposed model is developed with the help of Game Theory (Fudenberg, Tirole, 1991; Gibbons, 1992; Myerson, 2002, 2006) and can be applied empirically to verify what prevents actors from cooperating and how the governance actor can lead the process towards trust. Game theory is also used to study the possible level of coopetition (Brandenburger, Nalebuff, 1996), that is, the collaboration that can be put forward among competitors. Firms involved in these processes vary their approaches both in terms of realizing the opportunities brought about by collaboration and of assuming a positive versus an opportunistic behaviour, and the latter often prevails. The results offered by the model also have managerial implications, since they should give useful hints to decision makers on how to govern distrust. The model will be tested empirically on a sample of firms operating in the tourism sector in Southern Italy, involved in local networks. In the tourism industry, cooperation between players operating in the same destination is needed to compete against global destinations. The model will then be applied to other industries characterized by small and medium enterprises that have invested in the same area/district, with a high potential for collaboration.
APA, Harvard, Vancouver, ISO, and other styles
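Since the abstract grounds its distrust/trust model in game theory, a toy repeated prisoner's dilemma may help convey the underlying intuition: one-shot incentives favour opportunism, but repeated interaction (which a network governance actor can enforce) makes cooperation pay. The payoffs and strategies below are illustrative assumptions, not the article's model.

```python
# Toy repeated prisoner's dilemma, in the spirit of the game-theoretic framing
# above: repeated interaction can make cooperation pay even where one-shot
# incentives favour distrust. Payoffs and strategies are illustrative only.
PAYOFFS = {  # (my move, partner's move) -> my payoff
    ("C", "C"): 3, ("C", "D"): 0, ("D", "C"): 5, ("D", "D"): 1,
}

def tit_for_tat(history):
    return "C" if not history else history[-1][1]   # copy the partner's last move

def always_defect(history):
    return "D"

def play(strategy_a, strategy_b, rounds=20):
    history_a, history_b, score_a, score_b = [], [], 0, 0
    for _ in range(rounds):
        move_a, move_b = strategy_a(history_a), strategy_b(history_b)
        score_a += PAYOFFS[(move_a, move_b)]
        score_b += PAYOFFS[(move_b, move_a)]
        history_a.append((move_a, move_b))
        history_b.append((move_b, move_a))
    return score_a, score_b

print(play(tit_for_tat, tit_for_tat))      # mutual cooperation: (60, 60)
print(play(tit_for_tat, always_defect))    # distrust locks both into low payoffs
```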
13

Zang, Jinjuan. "Structural holes, exploratory innovation and exploitative innovation." Management Decision 56, no. 8 (August 13, 2018): 1682–95. http://dx.doi.org/10.1108/md-05-2017-0485.

Full text
Abstract:
Purpose Existing research has demonstrated that the innovation implications of structural holes are inconsistent. The diverse and broad resources associated with structural holes facilitate innovation. On the other hand, brokerage also hinders trust and increases opportunistic behavior among partners, which damages innovation. Inspired by these conflicting conclusions, the purpose of this paper is to analyze the roles of structural holes in exploratory innovation and exploitative innovation. Design/methodology/approach To test the model, the paper used a panel of 305 US computer focal firms and 6,894 alliances from the period spanning 1993 to 2004, and adopted the Heckman two-stage selection procedure in predicting the results. Findings The results show that structural holes help firms to develop exploratory innovation while negatively impacting exploitative innovation. Originality/value This study offers precise insights into the inconsistent understanding of structural holes and innovation by differentiating exploratory innovation from exploitative innovation. Furthermore, it contributes to the burgeoning literature on exploration and exploitation from the network perspective.
APA, Harvard, Vancouver, ISO, and other styles
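For readers unfamiliar with the Heckman two-stage selection procedure mentioned in the abstract, the sketch below shows the generic two-step logic (a probit selection equation, then an outcome regression augmented with the inverse Mills ratio) on synthetic data. It is an assumption-laden illustration of the method in general, not the paper's specification, variables, or sample.

```python
# Minimal, illustrative Heckman two-step (selection then outcome) of the kind
# the abstract mentions; synthetic data, not the paper's sample of alliances.
import numpy as np
import statsmodels.api as sm
from scipy.stats import norm

rng = np.random.default_rng(1)
n = 1000
z = rng.normal(size=(n, 2))                                   # selection covariates
x = z[:, :1]                                                  # outcome covariate (subset of z)
u = rng.normal(size=n)
select = (0.5 + z @ np.array([1.0, 0.8]) + u) > 0             # e.g., alliance formed or not

# Stage 1: probit for selection, then inverse Mills ratio from the linear predictor.
probit = sm.Probit(select.astype(int), sm.add_constant(z)).fit(disp=False)
xb = probit.fittedvalues
imr = norm.pdf(xb) / norm.cdf(xb)

# Stage 2: outcome regression (e.g., an innovation proxy) with the IMR as regressor.
y = 1.0 + 2.0 * x[:, 0] + 0.6 * u + rng.normal(size=n)        # errors correlated with selection
X2 = sm.add_constant(np.column_stack([x[select], imr[select]]))
ols = sm.OLS(y[select], X2).fit()
print(ols.params)   # constant, slope, coefficient on the inverse Mills ratio
```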
14

Lam, H. Y., G. T. S. Ho, C. H. Wu, and K. L. Choy. "Customer relationship mining system for effective strategies formulation." Industrial Management & Data Systems 114, no. 5 (June 3, 2014): 711–33. http://dx.doi.org/10.1108/imds-08-2013-0329.

Full text
Abstract:
Purpose – The purpose of this paper is to propose a customer relationship mining system (CRMS) to analyze the data collected from franchisees and formulate a marketing strategy based on customer demand and behavior. Design/methodology/approach – The system makes use of cloud technology to collect and manage data among the franchisees. An integrated approach of association rule mining and the neural network technique is adopted to investigate customer behavioral patterns and to forecast sales demand, respectively. Findings – The significance and contribution of this paper are demonstrated by adopting the CRMS in the education industry in Hong Kong. The findings led to the identification of student learning intentions, such as course preferences, and the forecasting of enrolment demand. It is believed that better resource allocation can be achieved and an increase in customer satisfaction is foreseeable. Research limitations/implications – The proposed CRMS could be applied to various franchising industries for effective marketing strategy formulation. However, since the data in this study are extracted from a specific industry, modifications may be required before the CRMS can be applied to other franchising industries. Originality/value – This study presents a new application to convert data into useful knowledge, and provides useful insights for delivering strategic promotional plans under a franchising business model. Through the pilot study conducted in a franchising education center, the results demonstrate that the proposed CRMS is valuable in providing effective promotion to attract more customers, better preparation in resource allocation and more standardized methods to formulate marketing strategies in the franchising industry.
APA, Harvard, Vancouver, ISO, and other styles
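To make the association-rule component of the CRMS concrete, the toy sketch below computes support and confidence for item pairs in a few invented transactions. The neural-network demand forecast that the paper combines with this step is not reproduced, and all item names and thresholds are hypothetical.

```python
# Toy sketch of the association-rule step described above: compute support and
# confidence for item pairs in franchisee transaction data. Data are invented;
# the paper's CRMS also adds a neural-network demand forecast on top of this.
from itertools import combinations
from collections import Counter

transactions = [                       # hypothetical course enrolments per student
    {"maths", "physics"}, {"maths", "english"}, {"maths", "physics", "chemistry"},
    {"english", "chemistry"}, {"maths", "physics"},
]

n = len(transactions)
pair_counts = Counter()
item_counts = Counter()
for t in transactions:
    item_counts.update(t)
    pair_counts.update(combinations(sorted(t), 2))

min_support, min_confidence = 0.4, 0.6
for (a, b), count in pair_counts.items():
    support = count / n
    if support < min_support:
        continue
    for antecedent, consequent in ((a, b), (b, a)):
        confidence = count / item_counts[antecedent]
        if confidence >= min_confidence:
            print(f"{antecedent} -> {consequent}: support={support:.2f}, confidence={confidence:.2f}")
```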
15

Bila, Svitlana. "Strategic priorities of social production digitalization: world experience." University Economic Bulletin, no. 48 (March 30, 2021): 40–55. http://dx.doi.org/10.31470/2306-546x-2021-48-40-55.

Full text
Abstract:
Actual importance of the study. At the beginning of the 2020s the developed countries and the leaders of world economic development faced the challenge of a radical structural reformation of social production (from industry to the service system) based on digitalization. In world science and business practice, digital technologies are considered an essential part of a complex technological phenomenon, ‘Industry 4.0’. Digitalization should cover the development of all business processes and management processes at the micro-, meso- and macro-levels, as well as the processes of managing social production at the level of national and world economies. In general, in the 21st century the world is shifting rapidly to strategies of digital technology application. The countries that introduce these strategies will gain guaranteed competitive advantages: from reduced production costs and improved quality of goods and services to the development of new sales markets and guaranteed super-profits. The countries that stand aside from digitalization processes risk being among the outsiders of socio-economic development. Such a problem statement highlights the importance of determining the directions, trends and strategic priorities of social production digitalization. This issue is crucial for all countries, including Ukraine, which is in the midst of a profound structural reformation of its entire national production system. Problem statement. The digital economy shapes the ground for ‘Industry 4.0’; information, IT technologies and large databases become its key technologies. The main asset of ‘Industry 4.0’ is information, and the major tool of production is cyber-physical systems, which lead to the formation of a single, unified, highly productive environmental system for collecting, analyzing and applying data in production and other processes. Cyber-physical systems provide the integration of ‘smart machines’ (productive machines, tools and equipment which are programmed) via their connection to the Internet, or the creation of a special network, the ‘Industrial Internet’ (IIoT), which is regarded as a productive analogue of the consumer-oriented ‘Internet of Things’ (IoT). The ‘Internet of Things’ can be connected with ‘smart factories’ which use the ‘Industrial Internet’ to adjust production processes quickly, taking into account changes in the cost and availability of resources as well as demand for the output produced. One of the most essential tasks for current economics, and for researchers of the systems and processes that will organize and maintain world production in the future, is to determine the main strategic priorities of social production digitalization. Analysis of the latest studies and publications. A valuable contribution to the study of the core and directions of strategic priorities of social production digitalization was made by such foreign scientists as the Canadian researcher Tapscott D. [1], Sun, L. and Zhao, L. [2], Mcdowell, M. [3] and others. Yet issues of social production digitalization are mainly studied by teams of authors, as such issues are complicated and multi-hierarchical. Furthermore, the problem of social production digitalization is closely linked to the transition to sustainable development, which is reflected in the works of Ukrainian scholars such as Khrapkin V., Ustimenko V., Kudrin O., Sagirov A. and others in the monograph “Determinants of sustainable economy development” [4].
The publication of the first Ukrainian inter-disciplinary textbook on the Internet economy by a group of scientists including Tatomyr I., Kvasniy L., Poyda S. and others [5] should also be mentioned. Still, the challenges of social production digitalization remain a constant focus of theoretical scientists, analysts and practitioners of these processes. Determining unexplored parts of the general problem. Defining the strategic priorities of social production digitalization requires a clear understanding of the prospective spheres of their application and of the economic advantages and risks which the mass transition of social production from traditional (industrial and post-industrial) to digital technologies bears. The new system of technological equipment (production digitalization, the Internet economy, ‘Industry 4.0’ technologies, NBIC technologies and the circular economy) has a number of economic advantages for commodity producers and countries, and also leads to dramatic changes in the whole social security system, changes in the labour market, and the reformation of the entire system of social relations in society. Tasks and objectives of the study. The objective of the study is to highlight the core and define the main strategic priorities of social production digitalization, as they drive the radical structural reformation of industrial production, services and the social spheres of the national economies of world countries and of the world economy in general. To achieve this objective, the following tasks are determined and solved in the article: to define the main priorities of development of digital technologies, which radically modify all business processes of social production; to study the essence and role of the circular economy in the transition to sustainable development, taking the EU countries as an example; to identify the strategic priorities of robotization of production processes and the priority spheres of application of industrial and service robots; to define the role of NBIC technologies in the structural reformation of social production and its transition to new digital technologies in the 21st century. Method and methodology of the study. Theoretical and empirical methods, such as the historical and logical, analysis and synthesis, abstract and specific, and causal (cause-and-effect) methods, are used to study the strategic priorities of social production digitalization. All of them helped to trace the evolution of digital technologies and their impact on the structural reformation of social production. A synergetic approach, the method of expert estimates and causal methods are applied to substantiate the systemic influence of digital technologies, ‘Industry 4.0’ and their materialization as the ‘circular economy’ on the whole complicated and multi-hierarchical system of social production. Basic material (the results of the study). The basis of social production digitalization is the digital economy, i.e. an economy in which virtual rather than material or physical assets and transactions are of the greatest value, and an institutional environment in which business processes as well as all managerial processes are developed on the basis of digital computer technologies and information and communication technologies (ICT). The ICT sphere involves the production of electronic equipment, computing hardware, software and services, and also provides various information services. Information technology serves as the material basis for the digital economy and the development of digital technologies.
Among the basic digital technologies the following play the profound role: blockchain technology, 3D printing, unmanned aerial vehicles and flying drones, virtual reality (VR), augmented reality (AR), the Internet of Things (IoT), the Industrial Internet of Things (IIoT), the Internet of Value (IoV) founded on IT and blockchain technology, the Internet of Everything (IoE), artificial intelligence (AI), neural networks and robots. In business processes and management practice these basic digital technologies are applied not singly but in synergy, complexity and system. The systemic combination of digital technologies gives the maximal economic effect from their practical application in all spheres of social production, from industry to all kinds of services. For instance, in education digital technologies allow study materials to be illustrated and virtually supplemented; in the tourism trade they enable virtual guides, the transport and logistics security of tourist routes, virtual advertising and trip arrangements, virtual guidebooks, virtual demonstration of services, and IT brochures and leaflets. Digital technologies radically change the gambling and show businesses, in particular by providing virtual games with a ‘being there’ effect. They drastically modify retail trade, advertising and publishing, management and marketing, and provide many opportunities for collecting unbiased data on changes in market conditions in real time. Digital technologies also lie at the basis of the ‘circular economy’, whose essence rests in the non-linear, secondary, circular use of all existing natural and material resources to provide production and consumption without loss of quality and availability of the goods and services developed on the grounds of innovations, IT technologies and ‘Industry 4.0’. Among the priority applications of the circular economy the following should be mentioned: municipal services, solid household waste management and recycling, the mass transition to smart houses and smart towns, circular agriculture, and circular and renewable energy. The potential of the circular economy fully corresponds to the demands for energy efficiency and the rational consumption of limited natural resources, so it is widely applied in EU countries in the transition to sustainable development. In the 21st century the robotization of social production draws the maximal attention of society. A division is made between industrial and service robots, which combine artificial intelligence and various other digital technologies in synergy. Industrial robots are widely used in production, including the automotive industry, the processing industry, the energy and construction sectors, and agriculture. Service robots are applied in all other spheres and sectors of national and world economies, from the military-industrial complex (for instance, for mining and demining areas, or military drones) to robot cleaners (robot vacuum cleaners), robot taxis, and robots engaged in health care and serving as nurses (providing the ill person with water, tidying up, bringing meals). Social production robotization is proceeding apace. According to the “World Robotic Report 2020”, within 2014–2019 the total quantity of industrial robots increased by 85%. By 2020 the share of robots in automated industrial production worldwide had reached 34%, in electronics 25%, and in metallurgy 10%.
These indicators are constantly growing, which results in the structural reformation of the whole system of economic and industrial processes and radical changes in the world labour market and the social sphere of the world economy in general. Alongside the generally recognized types of digital technologies and robotization processes, an innovative segment of the digital economy, NBIC technologies (Nanotechnology, Biotechnology, Information technology, Cognitive science), is spreading rapidly. Among the priorities of NBIC technology development a special place belongs to the interaction between information and cognitive technologies. The creation of neural networks, artificial intelligence and artificial cyber brains for robots serves as the material basis for this synergy within NBIC technologies. It is estimated to be one of the most promising and important achievements of the digital economy, determining the basic, innovational vector of the structural reformation of social production in the 21st century. The sphere of application of the results. International economic relations and the world economy, and the development of competitive strategies of digitalization of national social production and of the world economy in general. Conclusions. Digital technologies radically change all spheres of social production and social life, including business and managerial processes at all levels. Digital technologies are constantly developing and being modified, which promotes the emergence of new spheres and new kinds of business activity and management. The 21st century has witnessed the establishment of the digital economy, the smart economy, the circular economy, the green economy and various other arrangements of social production based on digital technologies. Social production digitalization and innovative digital technologies give business flexible systems of arrangement and management, production and sales, grounded in the permanent processing of large Big Data and on-line monitoring in real time. Grounded in digital technologies, business processes massive Big Data in real time and, on this basis, makes smart decisions in all business spheres and in business-process management. Radical shifts in social production digitalization provide the businesses of the states which introduce digital technologies in practice with significant competitive advantages, from decreases in the production cost of goods and services to the targeted meeting of specific consumer needs. At the same time, the rapid introduction of digital technologies in the countries leading world economic development results in a set of systemic socio-economic and socio-political challenges, including the following: a crucial reformatting of the world labour market and a rise in mass unemployment; a shift away from the traditional export specialization of developing countries; the breakup of the traditional production networks in force since the end of the 20th century, the so-called ‘chains of additional value shaping’; and the breakup of traditional cooperation links among world countries and the shaping of new ones based on ‘Industry 4.0’ and the ‘Industrial Internet’. The socio-economic and political consequences of the radical structural reformation of all spheres of the national and world economy in the 21st century will undoubtedly be driven by the processes of social production digitalization. This will require further systemic and fundamental scientific studies of this complicated and multi-hierarchical process.
APA, Harvard, Vancouver, ISO, and other styles
16

Solovyev, V. S., and M. N. Urda. "Importance of the Internet in Illegal Migration and Migration Crimes Determination." Actual Problems of Russian Law 1, no. 12 (January 20, 2020): 114–22. http://dx.doi.org/10.17803/1994-1471.2019.109.12.114-122.

Full text
Abstract:
The purpose of the paper is to identify the correlation between illegal migration, migration crimes and the use of Internet resources. A content analysis of advertising sites and social networks, combined with other research methods, has revealed that in the virtual space there is both active promotion of services facilitating illegal migration and a wide demand for them. Based on the results of the study, the authors develop the following proposals for improving measures to combat illegal migration in the global network. 1. A strategic direction for countering illegal migration should be the adjustment of the state’s migration policy through establishing the correlation between this phenomenon and the use of modern information and telecommunication technologies for organizing and developing a criminal business that provides services for illegal entry, stay (residence), fictitious registration for migration records, etc. 2. The counteraction to illegal migration in cyberspace should be implemented through an integrated approach. From the perspective of criminal law, it is necessary to add the element “using the media or information and telecommunication networks (including the Internet)” to Part 2 of Art. 322.1 of the Criminal Code of the Russian Federation, which provides for liability for the organization of illegal migration. From the perspective of criminology, it is important to identify and record the real link between the development of illegal migration, migration crimes and the use of the Internet to commit them, and to improve statistical reporting on crime by including information on the organization of illegal migration committed via the media or information and telecommunication networks (including the Internet) in the report of the State Automated Police Center of the Ministry of Internal Affairs of the Russian Federation on crimes committed in the field of telecommunication and computer information.
APA, Harvard, Vancouver, ISO, and other styles
17

Cassaniti, Jarret. "Influence Networks Relating to Health Knowledge Among Nairobi’s Micro-Retailers and Their Clients." Electronic Journal of Knowledge Management 18, no. 3 (April 23, 2021): 302–324. http://dx.doi.org/10.34190/ejkm.18.3.2068.

Full text
Abstract:
TRANSFORM, founded in 2015 by Unilever and the UK’s Department for International Development, supports innovative social enterprises by combining public sector resources with private sector technical capabilities and networks. Digital programs have enabled social enterprise partnerships to expand the reach of their initiatives to broader audiences, including specifically defined groups that were hitherto untapped or difficult to reach. Unilever partnered with TRANSFORM and Every1Mobile to develop UJoin and UAfya in informal settlements of Nairobi, Kenya. UJoin is a social enterprise initiative for promoting business growth among underserved neighborhood shops called dukas. UAfya focuses on young expectant and new mothers, and women interested in family and maternal health topics. Each initiative uses an online community network to discuss and improve knowledge and behaviors regarding livelihoods and health. Online communities provide opportunities to reach specific groups with targeted behavior change messages and campaigns. However, little systematic knowledge is currently available on how to develop and scale up effective behavior change programs for digital communities in low-income markets. There is also little information about the key guiding principles and best practices that underlie successful digital and online social networking models. A systematic and participatory tool known as Net-Map was used to explore and understand potential frameworks for establishing digital-based, community-driven partnerships with the private sector for health promotion through behavior change. The Net-Map approach was used to help individuals and groups clarify their view of a situation (including networks and power structures), foster discussion, and develop a strategic approach to their networking activities. Eight Net-Maps were constructed, stratified by groups based on location and digital platform. Each map was constructed by an average of 9-10 people, for a total of 76 participants, who identified actors (stakeholders and groups of people involved) and influential links (ways actors are connected) through the Net-Map activity. Among UAfya participants, local government, family and friends, and the media were identified as the most important actor types. A comparison of the discussions associated with the creation of the maps by UAfya members shows that the two most important link types are conflict and collaboration/partnership. Among UJoin participants, the three most important actor types were local government, business and financial institutions, and customers. UJoin members identified regulation, conflict and competition, collaboration, and information sharing as key links between actors. Recommendations based on the findings support a vision for scaling up the UJoin and UAfya programs through accreditation and branding of a novel type of duka. Shopkeepers would be trained and knowledgeable to provide high-quality services that improve customer health while also selling health products that benefit the bottom line.
APA, Harvard, Vancouver, ISO, and other styles
19

Fang, Yin-Ying, Chi-Fang Chen, and Sheng-Ju Wu. "Feature identification using acoustic signature of Ocean Researcher III (ORIII) of Taiwan." ANZIAM Journal 59 (July 25, 2019): C318–C357. http://dx.doi.org/10.21914/anziamj.v59i0.12655.

Full text
Abstract:
Underwater acoustic signature identification has been employed as a technique for detecting underwater vehicles, such as in anti-submarine warfare or harbour security systems. The underwater sound channel, however, has interference due to spatial variations in topography or sea state conditions and temporal variations in water column properties, which cause multipath and scattering in acoustic propagation. Thus, acoustic data quality control can be very challenging. One of the challenges for an identification system is how to recognise the same target signature from measurements taken under different temporal and spatial settings. This paper deals with these challenges by establishing an identification system composed of feature extraction, classification algorithms, and feature selection, with two approaches to recognise the target signature of the underwater radiated noise from a research vessel, Ocean Researcher III, recorded with a bottom-mounted hydrophone in five cruises in 2016 and 2017. The fundamental frequency and its power spectral density are known to be significant features for classification. In feature extraction, both features are extracted before deciding which of the two is the more significant. The first approach utilises Polynomial Regression (PR) classifiers with feature selection by the Taguchi method and analysis of variance under different combinations of factors and levels. The second approach utilises a Radial Basis Function Neural Network (RBFNN), selecting the optimised classifier parameters via a genetic algorithm. The real-time PR classifier proves robust and superior to the RBFNN model in this paper. This suggests that the Automatic Identification System for Vehicles using Acoustic Signature developed here can be carried out by utilising harmonic frequency features extracted by unmasking the frequency bandwidth of ship noises, and it demonstrates that the feature extraction is appropriate for our targets.
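As an editorial illustration of the kind of pipeline the abstract describes (extract the fundamental frequency and its power spectral density, then classify), a minimal Python sketch follows. The sampling rate, frequency band, synthetic signals and the plain least-squares polynomial regression are all assumptions; they stand in for, and do not reproduce, the authors' Taguchi-tuned PR and RBFNN classifiers.

import numpy as np
from scipy.signal import welch
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

FS = 8000  # assumed sampling rate (Hz)

def extract_features(signal, fs=FS):
    """Return [fundamental frequency, its PSD level in dB] for a 1-D signal."""
    freqs, psd = welch(signal, fs=fs, nperseg=2048)
    band = (freqs > 10) & (freqs < 1000)      # assumed search band for ship tonals
    idx = np.argmax(psd[band])
    return np.array([freqs[band][idx], 10 * np.log10(psd[band][idx])])

# Synthetic "recordings": two vessels with different tonal components plus noise.
rng = np.random.default_rng(0)
def make_signal(f0, seconds=2.0):
    t = np.arange(0, seconds, 1 / FS)
    return np.sin(2 * np.pi * f0 * t) + 0.5 * rng.standard_normal(t.size)

X = np.array([extract_features(make_signal(f)) for f in (60, 62, 58, 120, 118, 122)])
y = np.array([0, 0, 0, 1, 1, 1])              # class labels for the two vessels

# Second-order polynomial regression used as a crude classifier:
# fit y ~ poly(features), then threshold the prediction at 0.5.
model = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
model.fit(X, y)
predictions = (model.predict(X) > 0.5).astype(int)
print("training accuracy:", np.mean(predictions == y))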
APA, Harvard, Vancouver, ISO, and other styles
20

Tatarinov, Vadym V., Vadym S. Tatarinov, and Valentina А. Pavlova. "SMALL AND MEDIUM BUSINESS AS A MEMBER OF INTERNATIONAL STRATEGIC ALLIANCES." Academic Review 1, no. 54 (June 2021). http://dx.doi.org/10.32342/2074-5354-2021-1-54-10.

Full text
Abstract:
The article considers the peculiarities of the construction and functioning of international strategic alliances (ISAs) and the reasons that hinder the participation of Ukrainian small and medium-sized businesses in them. Recommendations are given for developing the knowledge-intensive components of small and medium-sized business (venture, innovation and manufacturing businesses) and preparing them for participation in international strategic alliances: 1. In conditions of globalization and intensifying competition, companies look for ways to preserve economic efficiency and increase competitiveness. An international strategic alliance can be one such way. An ISA is an organizational agreement on long-term cooperation, which provides for the joint use of resources and management structures of two or more independent firms from different countries to implement tasks related to the mission of each of them. An ISA is able to promote the innovative development of individual enterprises and of the economy of a developing country as a whole, and ISAs are an important link in the transfer of technology and other scientific and technical developments of national companies. Alliances of SMEs with large companies may spread because, on the one hand, SMEs lack the financial and managerial capacity to develop their business and compete with global players in the industry, while, on the other hand, large members of the alliance can count on the flexibility of SMEs and on obtaining the results of their scientific and technical work, as well as on expanding their own scientific and technical base through mergers with SMEs or their acquisition. The highest achievement of an ISA can be the creation of innovations and the modernization of production at a level that was not even envisaged when the alliance was formed. 2. In order to increase innovation activity and to develop and prepare venture, innovation and production small and medium-sized businesses for participation in ISAs, the state must, after the end of quarantine, ensure: increasing the investment attractiveness of Ukraine by maintaining political and economic stability and lasting peace in the country; the legislative creation and stimulation of a competitive environment in the field of venture, innovative and industrial small and medium business in promising branches of the economy; the possibility of diversifying effective financial support for innovative enterprises based on the experience of developed countries; stimulating bank capital to provide preferential financing to innovative SMEs in promising industries; creation of a legislative basis for the development of a business angel network in Ukraine; mandatory financial training, through online education, for start-ups in order to reduce the risk of misuse of their financial support; creation and development of a regional infrastructure of resource, information, scientific and technical support for SMEs; training of high-class specialists in supporting venture business and in creating and implementing promising projects; sustainable development of the Ukrainian stock market; improvement of the legal framework to clarify the functions and principles of venture funds and venture firms in order to eliminate misuse of tax benefits; and protection of Ukrainian intellectual property abroad.
APA, Harvard, Vancouver, ISO, and other styles
21

Meisel, Jose D., Felipe Montes, Angie M. Ramirez, Pablo D. Lemoine, Juan A. Valdivia, and Roberto Zarama. "Network analysis of collaboration in networked universities." Kybernetes ahead-of-print, ahead-of-print (June 1, 2021). http://dx.doi.org/10.1108/k-10-2020-0648.

Full text
Abstract:
Purpose – In Latin America and the Caribbean, student access to higher education has grown extraordinarily over the past fifteen years. This rapid growth has presented a challenge for increasing the system's resources and capabilities while maintaining its quality. As a result, networked universities (NUs) organized themselves as collaborative networks, and they have become an interesting model for facing the complexity driven by globalization, rapidly changing technology, the dynamic growth of knowledge and highly specialized areas of expertise. In this article, we study the NU named Red Universitaria Mutis (Red Mutis) with the aim of characterizing the collaboration and integration structure of the network. Design/methodology/approach – Network analytic methods (visual analysis, positional analysis and a stochastic network method) were used to characterize the organizational structure and robustness of the network, and to identify which variables or structural tendencies are related to the likelihood that specific areas of a university would collaborate. Findings – Red Mutis is a good example of a regional NU that can take advantage of the strengths, partnerships, information and knowledge of the regional and international universities that form the network. Analyses showed that Red Mutis has a differentiated structure consisting of academic and non-academic university areas with vertical coordination (by steering and management) of the different university areas. Originality/value – The methodology could be used as a framework to analyze and strengthen other strategic alliances between universities and as a model for the development of other NUs in local and global contexts.
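As a rough editorial illustration of the network-analytic workflow described above (structural characterisation plus a robustness check), the following Python sketch uses networkx on a hypothetical set of collaborating university areas. The node names, edges and the choice of betweenness centrality are assumptions, not the paper's Red Mutis data or its stochastic network model.

import networkx as nx

# Hypothetical collaboration ties between university areas in a networked university.
edges = [
    ("steering committee", "engineering faculty A"),
    ("steering committee", "engineering faculty B"),
    ("engineering faculty A", "engineering faculty B"),
    ("steering committee", "administration A"),
    ("administration A", "administration B"),
    ("engineering faculty B", "research office C"),
]
G = nx.Graph(edges)

# Positional analysis: which area occupies the main coordinating position?
betweenness = nx.betweenness_centrality(G)
coordinator = max(betweenness, key=betweenness.get)
print("most central area:", coordinator)

# Simple robustness check: remove the coordinating area and see whether
# the collaboration network stays connected.
H = G.copy()
H.remove_node(coordinator)
print("connected without it:", nx.is_connected(H))
print("components without it:", nx.number_connected_components(H))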
APA, Harvard, Vancouver, ISO, and other styles
22

Basile, Gianpaolo, M. Simona Andreano, Laura Martiniello, and Andrea Mazzitelli. "Drivers of performance in a complex environment." Kybernetes ahead-of-print, ahead-of-print (June 15, 2019). http://dx.doi.org/10.1108/k-07-2018-0410.

Full text
Abstract:
Purpose – The aim of the paper is to analyse the business network contract (BNC) as a model of voluntary holarchy in which the holons are isomorphically linked to one another through managerial choices and laws in order to reach a condition of communal and individual survival. Design/methodology/approach – The Italian SME manufacturing firms signing a BNC are seen as holonic elements in an adaptive system. Data drawn from the Italian business register are analyzed to understand the driving factors of the firms’ adaptation and survival, using descriptive, causality and analysis of variance (ANOVA) statistical techniques. Findings – The main findings of the paper support the holonic approach by demonstrating that BNCs are alliances based on strategic relations able to create synergies and increase performance. Empirical results suggest that “internal and external efficiency”, given by knowledge-sharing practices and firms’ geographical proximity, positively influences BN firms’ productivity, even without resorting to investments (tangible or intangible). Practical implications – The BNC is an instrument able to introduce common rules and finalized isomorphic behaviors, making firms act as a holon, with positive effects on performance. Originality/value – This work enriches the existing literature by joining the systemic approach with network theories and providing evidence of the suitability of the “holon” construct as the basis for a multi-level framework for the study of organizational networking.
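For readers unfamiliar with the ANOVA step mentioned in the methodology, the minimal Python sketch below shows the general form of such a comparison: productivity figures for three groups of network firms, distinguished by hypothetical levels of knowledge sharing, are compared with a one-way ANOVA. The numbers are synthetic illustrations, not data from the Italian business register.

from scipy import stats

# Synthetic productivity figures for firms grouped by knowledge-sharing intensity.
low_sharing = [0.92, 1.01, 0.88, 0.95, 0.99]
medium_sharing = [1.05, 1.12, 0.98, 1.08, 1.10]
high_sharing = [1.15, 1.22, 1.18, 1.09, 1.25]

# One-way ANOVA: do mean productivities differ across the three groups?
f_stat, p_value = stats.f_oneway(low_sharing, medium_sharing, high_sharing)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")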
APA, Harvard, Vancouver, ISO, and other styles
23

Wang, Pengwu. "A study on the intellectual capital management over cloud computing using analytic hierarchy process and partial least squares." Kybernetes ahead-of-print, ahead-of-print (September 23, 2021). http://dx.doi.org/10.1108/k-03-2021-0241.

Full text
Abstract:
Purpose – In the age of a knowledge-based economy and following extensive socio-economic changes, the success of organizations is not limited to gaining financial and material resources. Instead, it depends on the acquisition of intangible assets that can be used to achieve a sustainable competitive advantage. In the new strategic environment, organizations will thrive when they see themselves as learning organizations whose goal is to improve intellectual capital continually; an organization that cannot increase its intellectual capital cannot survive. The term intellectual capital covers all the intangible assets, intangible resources and non-physical resources of an organization, including processes, innovation capacity and the implicit and explicit knowledge of its members and partner network. However, despite the growing importance of intellectual capital and cloud computing as vital resources for organizations' competitive advantage, there is a limited understanding of them. At the same time, the management of intellectual capital enables organizational managers to create, nurture, control and preserve a strong source of competitive advantage, one that competitors will not easily capture. So, the main objective of the present investigation is to examine the factors affecting the adoption of intellectual capital management systems based on cloud computing in hospitals. Design/methodology/approach – In the last two decades, we have moved toward an economy in which investment in Information Technology (IT), human resources, development, research and advertising is essential to maintain competitive advantage and certify the sustainability of organizations. Therefore, it can be stated that economic value lies in the creation and management of intangible assets, which are referred to as intellectual capital. On the other hand, cloud computing is presented as a new paradigm for hosting and providing services through the Internet. Cloud computing can bring many benefits to organizations, including cost reduction, flexibility and improved performance. The present article examines how optimal intellectual capital management can be achieved using cloud computing. Seven hypotheses were developed covering the dimensions of technology, environment, organization and innovation. In this study, the path analysis was performed using the Analytic Hierarchy Process (AHP) and Partial Least Squares (PLS). By reviewing the literature related to the technology-organization-environment model and innovation diffusion theory, four main criteria and 15 sub-criteria were identified based on the opinions of specialists, professors and IT experts, using the AHP and PLS methods. Findings – The results of this investigation confirmed all the hypotheses. The results illustrated that environmental and technological factors should be given more attention when adopting intellectual capital management systems based on cloud computing. The results also indicated that intellectual capital strongly influences performance improvement. Furthermore, cloud apps, like other disruptive technologies, deliver superior benefits while still presenting a slew of practical challenges that must be tackled. In order to draw a growing customer base to this business model, software vendors should resolve these concerns. The literature revealed that the computing industry is making tremendous strides around the world. Nevertheless, in order to achieve a faster and smoother adoption, newer and more advanced techniques are still required. Research limitations/implications – The research outcomes can significantly impact a wide range of organizations, such as health-related organizations. However, there are some limitations; for example, the sample is limited to one country. Therefore, future studies can measure the data of this study in different samples in different countries. Future researchers can also boost the model's predictive capability for adopting cloud computing in other organizations by adding environmental, organizational, innovation and other technical factors. Practical implications – Managers will use these emerging innovations to minimize costs and maximize profits in the intellectual capital management competition. An effective cloud-computing-based electronic human resource management system can significantly increase system performance in industries. The investigators expect that the results will guide clinicians and scholars into a more advanced and developed age of cloud-based apps. Originality/value – Investigations of the impact of cloud computing on intellectual capital management are rare. Accordingly, this investigation provides a new experience of intellectual capital in the field of cloud computing. This study fills the scientific research gap in understanding the factors affecting intellectual capital management systems based on cloud computing, and it provides better insight into the power of organizational and environmental structure in adopting this technology in hospitals.
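To illustrate the AHP component of the method in concrete terms, the Python sketch below derives priority weights and a consistency ratio from a pairwise comparison matrix using the standard principal-eigenvector procedure. The four criteria and all comparison values are hypothetical; the sketch is not the authors' AHP/PLS pipeline.

import numpy as np

# Hypothetical pairwise comparison matrix (Saaty 1-9 scale) for four criteria:
# technology, organization, environment, innovation.
A = np.array([
    [1.0, 3.0, 0.5, 2.0],
    [1/3, 1.0, 0.25, 1.0],
    [2.0, 4.0, 1.0, 3.0],
    [0.5, 1.0, 1/3, 1.0],
])

eigenvalues, eigenvectors = np.linalg.eig(A)
k = np.argmax(eigenvalues.real)            # principal eigenvalue
weights = np.abs(eigenvectors[:, k].real)
weights /= weights.sum()                   # normalised priority vector

n = A.shape[0]
lambda_max = eigenvalues.real[k]
consistency_index = (lambda_max - n) / (n - 1)
random_index = 0.90                        # Saaty's random index for n = 4
consistency_ratio = consistency_index / random_index   # below 0.10 is usually acceptable

for criterion, w in zip(["technology", "organization", "environment", "innovation"], weights):
    print(f"{criterion}: {w:.3f}")
print(f"consistency ratio: {consistency_ratio:.3f}")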
APA, Harvard, Vancouver, ISO, and other styles
24

Burns, Alex. "Select Issues with New Media Theories of Citizen Journalism." M/C Journal 10, no. 6 (April 1, 2008). http://dx.doi.org/10.5204/mcj.2723.

Full text
Abstract:
“Journalists have to begin a new type of journalism, sometimes being the guide on the side of the civic conversation as well as the filter and gatekeeper.” (Kolodzy 218) “In many respects, citizen journalism is simply public journalism removed from the journalism profession.” (Barlow 181) 1. Citizen Journalism — The Latest Innovation? New Media theorists such as Dan Gillmor, Henry Jenkins, Jay Rosen and Jeff Howe have recently touted Citizen Journalism (CJ) as the latest innovation in 21st century journalism. “Participatory journalism” and “user-driven journalism” are other terms to describe CJ, which its proponents argue is a disruptive innovation (Christensen) to the agenda-setting media institutions, news values and “objective” reportage. In this essay I offer a “contrarian” view, informed by two perspectives: (1) a three-stage model of theory-building (Carlile & Christensen) to evaluate the claims made about CJ; and (2) self-reflexive research insights (Etherington) from editing the US-based news site Disinformation between November 1999 and February 2008. New media theories can potentially create “cognitive dissonance” (Festinger) when their explanations of CJ practices are compared with what actually happens (Feyerabend). First I summarise Carlile & Christensen’s model and the dangers of “bad theory” (Ghoshal). Next I consider several problems in new media theories about CJ: the notion of ‘citizen’, new media populism, parallels in event-driven and civic journalism, and mergers and acquisitions. Two ‘self-reflexive’ issues are considered: ‘pro-ams’ or ‘professional amateurs’ as a challenge to professional journalists, and CJ’s deployment in new media operations and production environments. Finally, some exploratory questions are offered for future researchers. 2. An Evaluative Framework for New Media Theories on Citizen Journalism Paul Carlile and Clayton M. Christensen’s model offers one framework with which to evaluate new media theories on CJ. This framework is used below to highlight select issues and gaps in CJ’s current frameworks and theories. Carlile & Christensen suggest that robust theory-building emerges via three stages: Descriptive, Categorisation and Normative (Carlile & Christensen). There are three sub-stages in Descriptive theory-building; namely, the observation of phenomena, inductive classification into schemas and taxonomies, and correlative relationships to develop models (Carlile & Christensen 2-5). Once causation is established, Normative theory evolves through deductive logic which is subject to Kuhnian paradigm shifts and Popperian falsifiability (Carlile & Christensen 6). Its proponents situate CJ as a Categorisation or new journalism agenda that poses a Normative challenge and Kuhnian paradigm shift to traditional journalism. Existing CJ theories jump from the Descriptive phase of observations like “smart mobs” in Japanese youth subcultures (Rheingold) to make broad claims for Categorisation, such as that IndyMedia, blogs and wiki publishing systems are new media alternatives to traditional media. CJ theories then underpin normative beliefs, values and worldviews. Correlative relationships are also used to differentiate CJ from the demand side of microeconomic analysis, from the top-down editorial models of traditional media outlets, and to adopt a vanguard stance.
To support this, CJ proponents cite research on emergent collective behaviour such as the “wisdom of crowds” hypothesis (Surowiecki) or peer-to-peer network “swarms” (Pesce) to provide scientific justification for their Normative theories. However, further evaluative research is needed for three reasons: the emergent collective behaviour hypothesis may not actually inform CJ practices, existing theories may have “correlation not cause” errors, and the link may be due to citation network effects between CJ theorists. Collectively, this research base also frames CJ as an “ought to” Categorisation and then proceeds to Normative theory-building (Carlile & Christensen 7). However, I argue below that this Categorisation may be premature: its observations and correlative relationships might reinforce a ‘weak’ Normative theory with limited generalisation. CJ proponents seem to imply that it can be applied anywhere and under any condition—a “statement of causality” that almost makes it a fad (Carlile & Christensen 8). CJ that relies on Classification and Normative claims will be problematic without a strong grounding in Descriptive observation. To understand what’s potentially at stake for CJ’s future, consider the parallel debate about curricula renewal for the Masters of Business Administration in the wake of high-profile corporate collapses such as Enron, Worldcom, HIH and OneTel. The MBA evolved as a sociological and institutional construct to justify management as a profession that is codified, differentiated and has entry barriers (Khurana). This process might partly explain the pushback that some media professionals have to CJ as one alternative. MBA programs faced criticism if they had student cohorts with little business know-how or experiential learning (Mintzberg). Enron’s collapse illustrated the ethical dilemmas and unintended consequences that occurred when “bad theories” were implemented (Ghoshal). Professional journalists are aware of this: MBA-educated managers challenged the “craft” tradition in the early 1980s (Underwood). This meant that journalism’s ‘self-image’ (Morgan; Smith) is intertwined with managerial anxieties about media conglomerates in highly competitive markets. Ironically, as noted below, Citizen Journalists who adopt a vanguard position vis-a-vis media professionals step into a more complex game with other players. However, current theories have a naïve idealism about CJ’s promise of normative social change in the face of Machiavellian agency in business, the media and politics. 3. Citizen Who? Who is the “citizen” in CJ? What is their self-awareness as a political agent? CJ proponents who use the ‘self-image’ of ‘citizen’ draw on observations from the participatory vision of open source software, peer-to-peer networks, and case studies such as Howard Dean’s 2004 bid for the Democratic Party nomination in the US Presidential election campaign (Trippi). Recent theorists note Alexander Hamilton’s tradition of civic activism (Barlow 178) which links contemporary bloggers with the Federalist Papers and early newspaper pamphlets. One unsurfaced assumption in these observations and correlations is that most bloggers will adopt a coherent political philosophy as informed citizens: a variation on Lockean utilitarianism, Rawlsian liberalism or Nader consumer activism.
To date there is little discussion about how political philosophy could deepen CJ’s ‘self-image’: how to critically evaluate sources, audit and investigation processes, or strategies to deal with elites, deterrence and power. For example, although bloggers kept Valerie Plame’s ‘outing’ as a covert intelligence operative highly visible in the issues-attention cycle, it was agenda-setting media like The New York Times who the Bush Administration targeted to silence (Pearlstine). To be viable, CJ needs to evolve beyond a new media populism, perhaps into a constructivist model of agency, norms and social change (Finnemore). 4. Citizen Journalism as New Media Populism Several “precursor trends” foreshadowed CJ notably the mid-1990s interest in “cool-hunting” by new media analysts and subculture marketeers (Gibson; Gladwell). Whilst this audience focus waned with the 1995-2000 dotcom bubble it resurfaced in CJ and publisher Tim O’Reilly’s Web 2.0 vision. Thus, CJ might be viewed as new media populism that has flourished with the Web 2.0 boom. Yet if the boom becomes a macroeconomic bubble (Gross; Spar) then CJ could be written off as a “silver bullet” that ultimately failed to deliver on its promises (Brooks, Jr.). The reputations of uncritical proponents who adopted a “true believer” stance would also be damaged (Hoffer). This risk is evident if CJ is compared with a parallel trend that shares its audience focus and populist view: day traders and technical analysts who speculate on financial markets. This parallel trend provides an alternative discipline in which the populism surfaced in an earlier form (Carlile & Christensen 12). Fidelity’s Peter Lynch argues that stock pickers can use their Main Street knowledge to beat Wall Street by exploiting information asymmetries (Lynch & Rothchild). Yet Lynch’s examples came from the mid-1970s to early 1980s when indexed mutual fund strategies worked, before deregulation and macroeconomic volatility. A change in the Web 2.0 boom might similarly trigger a reconsideration of Citizen Journalism. Hedge fund maven Victor Niederhoffer contends that investors who rely on technical analysis are practicing a Comtean religion (Niederhoffer & Kenner 72-74) instead of Efficient Market Hypothesis traders who use statistical arbitrage to deal with ‘random walks’ or Behavioural Finance experts who build on Amos Tversky and Daniel Kahneman’s Prospect Theory (Kahneman & Tversky). Niederhoffer’s deeper point is that technical analysts’ belief that the “trend is your friend” is no match for the other schools, despite a mini-publishing industry and computer trading systems. There are also ontological and epistemological differences between the schools. Similarly, CJ proponents who adopt a ‘Professional Amateur’ or ‘Pro-Am’ stance (Leadbeater & Miller) may face a similar gulf when making comparisons with professional journalists and the production environments in media organisations. CJ also thrives as new media populism because of institutional vested interests. When media conglomerates cut back on cadetships and internships CJ might fill the market demand as one alternative. New media programs at New York University and others can use CJ to differentiate themselves from “hyperlocal” competitors (Christensen; Slywotzky; Christensen, Curtis & Horn). This transforms CJ from new media populism to new media institution. 5. 
Parallels: Event-driven & Civic Journalism For new media programs, CJ builds on two earlier traditions: the Event-driven journalism of crises like the 1991 Gulf War (Wark) and the Civic Journalism school that emerged in the 1960s social upheavals. Civic Journalism’s awareness of minorities and social issues provides the character ethic and political philosophy for many Citizen Journalists. Jay Rosen and others suggest that CJ is the next-generation heir to Civic Journalism, tracing a thread from the 1968 Chicago Democratic Convention to IndyMedia’s coverage of the 1999 “Battle in Seattle” (Rosen). Rosen’s observation could yield an interesting historiography or genealogy. Events such as the Southeast Asian tsunami on 26 December 2004 or Al Qaeda’s London bombings on 7 July 2005 are cited as examples of CJ as event-driven journalism and “pro-am collaboration” (Kolodzy 229-230). Having covered these events and Al Qaeda’s attacks on 11th September 2001, I have a slightly different view: this was more a variation on “first responder” status and handicam video footage that journalists have sourced for the past three decades when covering major disasters. This different view means that the “salience of categories” used to justify CJ and “pro-am collaboration” these events does not completely hold. Furthermore, when Citizen Journalism proponents tout Flickr and Wikipedia as models of real-time media they are building on a broader phenomenon that includes CNN’s Gulf War coverage and Bloomberg’s dominance of financial news (Loomis). 6. The Mergers & Acquisitions Scenario CJ proponents often express anxieties about the resilience of their outlets in the face of predatory venture capital firms who initiate Mergers & Acquisitions (M&A) activities. Ironically, these venture capital firms have core competencies and expertise in the event-driven infrastructure and real-time media that CJ aspires to. Sequoia Capital and other venture capital firms have evaluative frameworks that likely surpass Carlile & Christensen in sophistication, and they exploit parallels, information asymmetries and market populism. Furthermore, although venture capital firms such as Union Street Ventures have funded Web 2.0 firms, they are absent from the explanations of some theorists, whose examples of Citizen Journalism and Web 2.0 success may be the result of survivorship bias. Thus, the venture capital market remains an untapped data source for researchers who want to evaluate the impact of CJ outlets and institutions. The M&A scenario further problematises CJ in several ways. First, CJ is framed as “oppositional” to traditional media, yet this may be used as a stratagem in a game theory framework with multiple stakeholders. Drexel Burnham Lambert’s financier Michael Milken used market populism to sell ‘high-yield’ or ‘junk’ bonds to investors whilst disrupting the Wall Street establishment in the late 1980s (Curtis) and CJ could fulfil a similar tactical purpose. Second, the M&A goal of some Web 2.0 firms could undermine the participatory goals of a site’s community if post-merger integration fails. Jason Calacanis’s sale of Weblogs, Inc to America Online in 2005 and MSNBC’s acquisition of Newsvine on 5 October 2007 (Newsvine) might be success stories. However, this raises issues of digital “property rights” if you contribute to a community that is then sold in an M&A transaction—an outcome closer to business process outsourcing. 
Third, media “buzz” can create an unrealistic vision when a CJ site fails to grow beyond its start-up phase. Backfence.com’s demise as a “hyperlocal” initiative (Caverly) is one cautionary event that recalls the 2000 dotcom crash. The M&A scenarios outlined above are market dystopias for CJ purists. The major lesson for CJ proponents is to include other market players in hypotheses about causation and correlation factors. 7. ‘Pro-Ams’ & Professional Journalism’s Crisis CJ emerged during a period when Professional Journalism faced a major crisis of ‘self-image’. The Demos report The Pro-Am Revolution (Leadbeater & Miller) popularised the notion of ‘professional amateurs’ which some CJ theorists adopt to strengthen their categorisation. In turn, this triggers a response from cultural theorists who fear bloggers are new media’s barbarians (Keen). I concede Leadbeater and Miller have identified an important category. However, how some CJ theorists then generalise from ‘Pro-Ams’ illustrates the danger of ‘weak’ theory referred to above. Leadbeater and Miller’s categorisation does not really include a counter-view on the strengths of professionals, as illustrated in humanistic consulting (Block), professional service firms (Maister; Maister, Green & Galford), and software development (McConnell). The signs of professionalism these authors mention include a commitment to learning and communal verification, mastery of a discipline and domain application, awareness of methodology creation, participation in mentoring, and cultivation of ethical awareness. Two key differences are discernment and quality of attention, as illustrated in how the legendary Hollywood film editor Walter Murch used Apple’s Final Cut Pro software to edit the 2003 film Cold Mountain (Koppelman). ‘Pro-Ams’ might not aspire to these criteria but Citizen Journalists shouldn’t throw out these standards, either. Doing so would be making the same mistake of overconfidence that technical analysts make against statistical arbitrageurs. Key processes—fact-checking, sub-editing and editorial decision-making—are invisible to the end-user, even if traceable in a blog or wiki publishing system, because of the judgments involved. One post-mortem insight from Assignment Zero was that these processes were vital to create the climate of authenticity and trust to sustain a Citizen Journalist community (Howe). CJ’s trouble with “objectivity” might also overlook some complexities, including the similarity of many bloggers to “noise traders” in financial markets and to op-ed columnists. Methodologies and reportage practices have evolved to deal with the objections that CJ proponents raise, from New Journalism’s radical subjectivity and creative non-fiction techniques (Wolfe & Johnson) to Precision Journalism that used descriptive statistics (Meyer). Finally, journalism frameworks could be updated with current research on how phenomenological awareness shapes our judgments and perceptions (Thompson). 8. Strategic Execution For me, one of CJ’s major weaknesses as a new media theory is its lack of “rich description” (Geertz) about the strategic execution of projects. As Disinfo.com site editor I encountered situations ranging from ‘denial of service’ attacks and spam to site migration, publishing systems that go offline, and ensuring an editorial consistency. Yet the messiness of these processes is missing from CJ theories and accounts. 
Theories that included this detail as “second-order interactions” (Carlile & Christensen 13) would offer a richer view of CJ. Many CJ and Web 2.0 projects fall into the categories of mini-projects, demonstration prototypes and start-ups, even when using a programming language such as Ajax or Ruby on Rails. Whilst the “bootstrap” process is a benefit, more longitudinal analysis and testing needs to occur, to ensure these projects are scalable and sustainable. For example, South Korea’s OhmyNews is cited as an exemplar that started with “727 citizen reporters and 4 editors” and now has “38,000 citizen reporters” and “a dozen editors” (Kolodzy 231). How does OhmyNews’s mix of hard and soft news change over time? Or, how does OhmyNews deal with a complex issue that might require major resources, such as security negotiations between North and South Korea? Such examples could do with further research. We need to go beyond “the vision thing” and look at the messiness of execution for deeper observations and counterintuitive correlations, to build new descriptive theories. 9. Future Research This essay argues that CJ needs re-evaluation. Its immediate legacy might be to splinter ‘journalism’ into micro-trends: Washington University’s Steve Boriss proclaims “citizen journalism is dead. Expert journalism is the future.” (Boriss; Mensching). The half-lives of such micro-trends demand new categorisations, which in turn prematurely feeds the theory-building cycle. Instead, future researchers could reinvigorate 21st century journalism if they ask deeper questions and return to the observation stage of building descriptive theories. In closing, below are some possible questions that future researchers might explore: Where are the “rich descriptions” of journalistic experience—“citizen”, “convergent”, “digital”, “Pro-Am” or otherwise in new media? How could practice-based approaches inform this research instead of relying on espoused theories-in-use? What new methodologies could be developed for CJ implementation? What role can the “heroic” individual reporter or editor have in “the swarm”? Do the claims about OhmyNews and other sites stand up to longitudinal observation? Are the theories used to justify Citizen Journalism’s normative stance (Rheingold; Surowiecki; Pesce) truly robust generalisations for strategic execution or do they reflect the biases of their creators? How could developers tap the conceptual dimensions of information technology innovation (Shasha) to create the next Facebook, MySpace or Wikipedia? References Argyris, Chris, and Donald Schon. Theory in Practice. San Francisco: Jossey-Bass Publishers, 1976. Barlow, Aaron. The Rise of the Blogosphere. Westport, CN: Praeger Publishers, 2007. Block, Peter. Flawless Consulting. 2nd ed. San Francisco, CA: Jossey-Bass/Pfeiffer, 2000. Boriss, Steve. “Citizen Journalism Is Dead. Expert Journalism Is the Future.” The Future of News. 28 Nov. 2007. 20 Feb. 2008 http://thefutureofnews.com/2007/11/28/citizen-journalism-is-dead- expert-journalism-is-the-future/>. Brooks, Jr., Frederick P. The Mythical Man-Month: Essays on Software Engineering. Rev. ed. Reading, MA: Addison-Wesley Publishing Company, 1995. Campbell, Vincent. Information Age Journalism: Journalism in an International Context. New York: Arnold, 2004. Carlile, Paul R., and Clayton M. Christensen. “The Cycles of Building Theory in Management Research.” Innosight working paper draft 6. 6 Jan. 2005. 19 Feb. 2008 http://www.innosight.com/documents/Theory%20Building.pdf>. Caverly, Doug. 
“Hyperlocal News Site Takes A Hit.” WebProNews.com 6 July 2007. 19 Feb. 2008 http://www.webpronews.com/topnews/2007/07/06/hyperlocal-news- sites-take-a-hit>. Chenoweth, Neil. Virtual Murdoch: Reality Wars on the Information Superhighway. Sydney: Random House Australia, 2001. Christensen, Clayton M. The Innovator’s Dilemma: When New Technologies Cause Great Firms to Fail. Boston, MA: Harvard Business School Press, 1997. Christensen, Clayton M., Curtis Johnson, and Michael Horn. Disrupting Class: How Disruptive Innovation Will Change the Way the World Learns. New York: McGraw-Hill, 2008. Curtis, Adam. The Mayfair Set. London: British Broadcasting Corporation, 1999. Etherington, Kim. Becoming a Reflexive Researcher: Using Ourselves in Research. London: Jessica Kingsley Publishers, 2004. Festinger, Leon. A Theory of Cognitive Dissonance. Stanford, CA: Stanford University Press, 1962. Feyerabend, Paul. Against Method. 3rd ed. London: Verso, 1993. Finnemore, Martha. National Interests in International Society. Ithaca, NY: Cornell University Press, 1996. Geertz, Clifford. The Interpretation of Cultures. New York: Basic Books, 1973. Ghoshal, Sumantra. “Bad Management Theories Are Destroying Good Management Practices.” Academy of Management Learning & Education 4.1 (2005): 75-91. Gibson, William. Pattern Recognition. London: Viking, 2003. Gladwell, Malcolm. “The Cool-Hunt.” The New Yorker Magazine 17 March 1997. 20 Feb. 2008 http://www.gladwell.com/1997/1997_03_17_a_cool.htm>. Gross, Daniel. Pop! Why Bubbles Are Great for the Economy. New York: Collins, 2007. Hoffer, Eric. The True Believer. New York: Harper, 1951. Howe, Jeff. “Did Assignment Zero Fail? A Look Back, and Lessons Learned.” Wired News 16 July 2007. 19 Feb. 2008 http://www.wired.com/techbiz/media/news/2007/07/assignment_ zero_final?currentPage=all>. Kahneman, Daniel, and Amos Tversky. Choices, Values and Frames. Cambridge: Cambridge UP, 2000. Keen, Andrew. The Cult of the Amateur. New York: Doubleday Currency, 2007. Khurana, Rakesh. From Higher Aims to Hired Hands. Princeton, NJ: Princeton UP, 2007. Kolodzy, Janet. Convergence Journalism: Writing and Reporting across the News Media. Oxford: Rowman & Littlefield, 2006. Koppelman, Charles. Behind the Seen: How Walter Murch Edited Cold Mountain Using Apple’s Final Cut Pro and What This Means for Cinema. Upper Saddle River, NJ: New Rider, 2004. Leadbeater, Charles, and Paul Miller. “The Pro-Am Revolution”. London: Demos, 24 Nov. 2004. 19 Feb. 2008 http://www.demos.co.uk/publications/proameconomy>. Loomis, Carol J. “Bloomberg’s Money Machine.” Fortune 5 April 2007. 20 Feb. 2008 http://money.cnn.com/magazines/fortune/fortune_archive/2007/04/16/ 8404302/index.htm>. Lynch, Peter, and John Rothchild. Beating the Street. Rev. ed. New York: Simon & Schuster, 1994. Maister, David. True Professionalism. New York: The Free Press, 1997. Maister, David, Charles H. Green, and Robert M. Galford. The Trusted Advisor. New York: The Free Press, 2004. Mensching, Leah McBride. “Citizen Journalism on Its Way Out?” SFN Blog, 30 Nov. 2007. 20 Feb. 2008 http://www.sfnblog.com/index.php/2007/11/30/940-citizen-journalism- on-its-way-out>. Meyer, Philip. Precision Journalism. 4th ed. Lanham, MD: Rowman & Littlefield, 2002. McConnell, Steve. Professional Software Development. Boston, MA: Addison-Wesley, 2004. Mintzberg, Henry. Managers Not MBAs. San Francisco, CA: Berrett-Koehler, 2004. Morgan, Gareth. Images of Organisation. Rev. ed. Thousand Oaks, CA: Sage, 2006. Newsvine. “Msnbc.com Acquires Newsvine.” 7 Oct. 
2007. 20 Feb. 2008 http://blog.newsvine.com/_news/2007/10/07/1008889-msnbccom-acquires-newsvine>. Niederhoffer, Victor, and Laurel Kenner. Practical Speculation. New York: John Wiley & Sons, 2003. Pearlstine, Norman. Off the Record: The Press, the Government, and the War over Anonymous Sources. New York: Farrar, Straus & Giroux, 2007. Pesce, Mark D. “Mob Rules (The Law of Fives).” The Human Network 28 Sep. 2007. 20 Feb. 2008 http://blog.futurestreetconsulting.com/?p=39>. Rheingold, Howard. Smart Mobs: The Next Social Revolution. Cambridge MA: Basic Books, 2002. Rosen, Jay. What Are Journalists For? Princeton NJ: Yale UP, 2001. Shasha, Dennis Elliott. Out of Their Minds: The Lives and Discoveries of 15 Great Computer Scientists. New York: Copernicus, 1995. Slywotzky, Adrian. Value Migration: How to Think Several Moves Ahead of the Competition. Boston, MA: Harvard Business School Press, 1996. Smith, Steve. “The Self-Image of a Discipline: The Genealogy of International Relations Theory.” Eds. Steve Smith and Ken Booth. International Relations Theory Today. Cambridge, UK: Polity Press, 1995. 1-37. Spar, Debora L. Ruling the Waves: Cycles of Discovery, Chaos and Wealth from the Compass to the Internet. New York: Harcourt, 2001. Surowiecki, James. The Wisdom of Crowds. New York: Doubleday, 2004. Thompson, Evan. Mind in Life: Biology, Phenomenology, and the Sciences of Mind. Cambridge, MA: Belknap Press, 2007. Trippi, Joe. The Revolution Will Not Be Televised. New York: ReganBooks, 2004. Underwood, Doug. When MBA’s Rule the Newsroom. New York: Columbia University Press, 1993. Wark, McKenzie. Virtual Geography: Living with Global Media Events. Bloomington IN: Indiana UP, 1994. Wolfe, Tom, and E.W. Johnson. The New Journalism. New York: Harper & Row, 1973.
APA, Harvard, Vancouver, ISO, and other styles
25

Burns, Alex. "Select Issues with New Media Theories of Citizen Journalism." M/C Journal 11, no. 1 (June 1, 2008). http://dx.doi.org/10.5204/mcj.30.

Full text
Abstract:
“Journalists have to begin a new type of journalism, sometimes being the guide on the side of the civic conversation as well as the filter and gatekeeper.” (Kolodzy 218) “In many respects, citizen journalism is simply public journalism removed from the journalism profession.” (Barlow 181) 1. Citizen Journalism — The Latest Innovation? New Media theorists such as Dan Gillmor, Henry Jenkins, Jay Rosen and Jeff Howe have recently touted Citizen Journalism (CJ) as the latest innovation in 21st century journalism. “Participatory journalism” and “user-driven journalism” are other terms to describe CJ, which its proponents argue is a disruptive innovation (Christensen) to the agenda-setting media institutions, news values and “objective” reportage. In this essay I offer a “contrarian” view, informed by two perspectives: (1) a three-stage model of theory-building (Carlile & Christensen) to evaluate the claims made about CJ; and (2) self-reflexive research insights (Etherington) from editing the US-based news site Disinformation between November 1999 and February 2008. New media theories can potentially create “cognitive dissonance” (Festinger) when their explanations of CJ practices are compared with what actually happens (Feyerabend). First I summarise Carlile & Christensen’s model and the dangers of “bad theory” (Ghoshal). Next I consider several problems in new media theories about CJ: the notion of ‘citizen’, new media populism, parallels in event-driven and civic journalism, and mergers and acquisitions. Two ‘self-reflexive’ issues are considered: ‘pro-ams’ or ‘professional amateurs’ as a challenge to professional journalists, and CJ’s deployment in new media operations and production environments. Finally, some exploratory questions are offered for future researchers. 2. An Evaluative Framework for New Media Theories on Citizen Journalism Paul Carlile and Clayton M. Christensen’s model offers one framework with which to evaluate new media theories on CJ. This framework is used below to highlight select issues and gaps in CJ’s current frameworks and theories. Carlile & Christensen suggest that robust theory-building emerges via three stages: Descriptive, Categorisation and Normative (Carlile & Christensen). There are three sub-stages in Descriptive theory-building; namely, the observation of phenomena, inductive classification into schemas and taxonomies, and correlative relationships to develop models (Carlile & Christensen 2-5). Once causation is established, Normative theory evolves through deductive logic which is subject to Kuhnian paradigm shifts and Popperian falsifiability (Carlile & Christensen 6). Its proponents situate CJ as a Categorisation or new journalism agenda that poses a Normative challenge and Kuhnian paradigm shift to traditional journalism. Existing CJ theories jump from the Descriptive phase of observations like “smart mobs” in Japanese youth subcultures (Rheingold) to make broad claims for Categorisation, such as that IndyMedia, blogs and wiki publishing systems are new media alternatives to traditional media. CJ theories then underpin normative beliefs, values and worldviews. Correlative relationships are also used to differentiate CJ from the demand side of microeconomic analysis, from the top-down editorial models of traditional media outlets, and to adopt a vanguard stance.
To support this, CJ proponents cite research on emergent collective behaviour such as the “wisdom of crowds” hypothesis (Surowiecki) or peer-to-peer network “swarms” (Pesce) to provide scientific justification for their Normative theories. However, further evaluative research is needed for three reasons: the emergent collective behaviour hypothesis may not actually inform CJ practices, existing theories may have “correlation not cause” errors, and the link may be due to citation network effects between CJ theorists. Collectively, this research base also frames CJ as an “ought to” Categorisation and then proceeds to Normative theory-building (Carlile & Christensen 7). However, I argue below that this Categorisation may be premature: its observations and correlative relationships might reinforce a ‘weak’ Normative theory with limited generalisation. CJ proponents seem to imply that it can be applied anywhere and under any condition—a “statement of causality” that almost makes it a fad (Carlile & Christensen 8). CJ that relies on Classification and Normative claims will be problematic without a strong grounding in Descriptive observation. To understand what’s potentially at stake for CJ’s future, consider the parallel debate about curricula renewal for the Masters of Business Administration in the wake of high-profile corporate collapses such as Enron, Worldcom, HIH and OneTel. The MBA evolved as a sociological and institutional construct to justify management as a profession that is codified, differentiated and has entry barriers (Khurana). This process might partly explain the pushback that some media professionals have to CJ as one alternative. MBA programs faced criticism if they had student cohorts with little business know-how or experiential learning (Mintzberg). Enron’s collapse illustrated the ethical dilemmas and unintended consequences that occurred when “bad theories” were implemented (Ghoshal). Professional journalists are aware of this: MBA-educated managers challenged the “craft” tradition in the early 1980s (Underwood). This meant that journalism’s ‘self-image’ (Morgan; Smith) is intertwined with managerial anxieties about media conglomerates in highly competitive markets. Ironically, as noted below, Citizen Journalists who adopt a vanguard position vis-a-vis media professionals step into a more complex game with other players. However, current theories have a naïve idealism about CJ’s promise of normative social change in the face of Machiavellian agency in business, the media and politics. 3. Citizen Who? Who is the “citizen” in CJ? What is their self-awareness as a political agent? CJ proponents who use the ‘self-image’ of ‘citizen’ draw on observations from the participatory vision of open source software, peer-to-peer networks, and case studies such as Howard Dean’s 2004 bid for the Democratic Party nomination in the US Presidential election campaign (Trippi). Recent theorists note Alexander Hamilton’s tradition of civic activism (Barlow 178) which links contemporary bloggers with the Federalist Papers and early newspaper pamphlets. One unsurfaced assumption in these observations and correlations is that most bloggers will adopt a coherent political philosophy as informed citizens: a variation on Lockean utilitarianism, Rawlsian liberalism or Nader consumer activism.
To date there is little discussion about how political philosophy could deepen CJ’s ‘self-image’: how to critically evaluate sources, audit and investigation processes, or strategies to deal with elites, deterrence and power. For example, although bloggers kept Valerie Plame’s ‘outing’ as a covert intelligence operative highly visible in the issues-attention cycle, it was agenda-setting media like The New York Times who the Bush Administration targeted to silence (Pearlstine). To be viable, CJ needs to evolve beyond a new media populism, perhaps into a constructivist model of agency, norms and social change (Finnemore). 4. Citizen Journalism as New Media Populism Several “precursor trends” foreshadowed CJ notably the mid-1990s interest in “cool-hunting” by new media analysts and subculture marketeers (Gibson; Gladwell). Whilst this audience focus waned with the 1995-2000 dotcom bubble it resurfaced in CJ and publisher Tim O’Reilly’s Web 2.0 vision. Thus, CJ might be viewed as new media populism that has flourished with the Web 2.0 boom. Yet if the boom becomes a macroeconomic bubble (Gross; Spar) then CJ could be written off as a “silver bullet” that ultimately failed to deliver on its promises (Brooks, Jr.). The reputations of uncritical proponents who adopted a “true believer” stance would also be damaged (Hoffer). This risk is evident if CJ is compared with a parallel trend that shares its audience focus and populist view: day traders and technical analysts who speculate on financial markets. This parallel trend provides an alternative discipline in which the populism surfaced in an earlier form (Carlile & Christensen 12). Fidelity’s Peter Lynch argues that stock pickers can use their Main Street knowledge to beat Wall Street by exploiting information asymmetries (Lynch & Rothchild). Yet Lynch’s examples came from the mid-1970s to early 1980s when indexed mutual fund strategies worked, before deregulation and macroeconomic volatility. A change in the Web 2.0 boom might similarly trigger a reconsideration of Citizen Journalism. Hedge fund maven Victor Niederhoffer contends that investors who rely on technical analysis are practicing a Comtean religion (Niederhoffer & Kenner 72-74) instead of Efficient Market Hypothesis traders who use statistical arbitrage to deal with ‘random walks’ or Behavioural Finance experts who build on Amos Tversky and Daniel Kahneman’s Prospect Theory (Kahneman & Tversky). Niederhoffer’s deeper point is that technical analysts’ belief that the “trend is your friend” is no match for the other schools, despite a mini-publishing industry and computer trading systems. There are also ontological and epistemological differences between the schools. Similarly, CJ proponents who adopt a ‘Professional Amateur’ or ‘Pro-Am’ stance (Leadbeater & Miller) may face a similar gulf when making comparisons with professional journalists and the production environments in media organisations. CJ also thrives as new media populism because of institutional vested interests. When media conglomerates cut back on cadetships and internships CJ might fill the market demand as one alternative. New media programs at New York University and others can use CJ to differentiate themselves from “hyperlocal” competitors (Christensen; Slywotzky; Christensen, Curtis & Horn). This transforms CJ from new media populism to new media institution. 5. 
Parallels: Event-driven & Civic Journalism For new media programs, CJ builds on two earlier traditions: the Event-driven journalism of crises like the 1991 Gulf War (Wark) and the Civic Journalism school that emerged in the 1960s social upheavals. Civic Journalism’s awareness of minorities and social issues provides the character ethic and political philosophy for many Citizen Journalists. Jay Rosen and others suggest that CJ is the next-generation heir to Civic Journalism, tracing a thread from the 1968 Chicago Democratic Convention to IndyMedia’s coverage of the 1999 “Battle in Seattle” (Rosen). Rosen’s observation could yield an interesting historiography or genealogy. Events such as the Southeast Asian tsunami on 26 December 2004 or Al Qaeda’s London bombings on 7 July 2005 are cited as examples of CJ as event-driven journalism and “pro-am collaboration” (Kolodzy 229-230). Having covered these events and Al Qaeda’s attacks on 11 September 2001, I have a slightly different view: this was more a variation on “first responder” status and handicam video footage that journalists have sourced for the past three decades when covering major disasters. This different view means that the “salience of categories” used to justify CJ and “pro-am collaboration” in these events does not completely hold. Furthermore, when Citizen Journalism proponents tout Flickr and Wikipedia as models of real-time media they are building on a broader phenomenon that includes CNN’s Gulf War coverage and Bloomberg’s dominance of financial news (Loomis). 6. The Mergers & Acquisitions Scenario CJ proponents often express anxieties about the resilience of their outlets in the face of predatory venture capital firms that initiate Mergers & Acquisitions (M&A) activities. Ironically, these venture capital firms have core competencies and expertise in the event-driven infrastructure and real-time media that CJ aspires to. Sequoia Capital and other venture capital firms have evaluative frameworks that likely surpass Carlile & Christensen in sophistication, and they exploit parallels, information asymmetries and market populism. Furthermore, although venture capital firms such as Union Square Ventures have funded Web 2.0 firms, they are absent from the explanations of some theorists, whose examples of Citizen Journalism and Web 2.0 success may be the result of survivorship bias. Thus, the venture capital market remains an untapped data source for researchers who want to evaluate the impact of CJ outlets and institutions. The M&A scenario further problematises CJ in several ways. First, CJ is framed as “oppositional” to traditional media, yet this may be used as a stratagem in a game theory framework with multiple stakeholders. Drexel Burnham Lambert’s financier Michael Milken used market populism to sell ‘high-yield’ or ‘junk’ bonds to investors whilst disrupting the Wall Street establishment in the late 1980s (Curtis), and CJ could fulfil a similar tactical purpose. Second, the M&A goal of some Web 2.0 firms could undermine the participatory goals of a site’s community if post-merger integration fails. Jason Calacanis’s sale of Weblogs, Inc. to America Online in 2005 and MSNBC’s acquisition of Newsvine in October 2007 (Newsvine) might be success stories. However, this raises issues of digital “property rights” if you contribute to a community that is then sold in an M&A transaction—an outcome closer to business process outsourcing.
Third, media “buzz” can create an unrealistic vision when a CJ site fails to grow beyond its start-up phase. Backfence.com’s demise as a “hyperlocal” initiative (Caverly) is one cautionary event that recalls the 2000 dotcom crash. The M&A scenarios outlined above are market dystopias for CJ purists. The major lesson for CJ proponents is to include other market players in hypotheses about causation and correlation factors. 7. ‘Pro-Ams’ & Professional Journalism’s Crisis CJ emerged during a period when Professional Journalism faced a major crisis of ‘self-image’. The Demos report The Pro-Am Revolution (Leadbeater & Miller) popularised the notion of ‘professional amateurs’ which some CJ theorists adopt to strengthen their categorisation. In turn, this triggers a response from cultural theorists who fear bloggers are new media’s barbarians (Keen). I concede Leadbeater and Miller have identified an important category. However, how some CJ theorists then generalise from ‘Pro-Ams’ illustrates the danger of ‘weak’ theory referred to above. Leadbeater and Miller’s categorisation does not really include a counter-view on the strengths of professionals, as illustrated in humanistic consulting (Block), professional service firms (Maister; Maister, Green & Galford), and software development (McConnell). The signs of professionalism these authors mention include a commitment to learning and communal verification, mastery of a discipline and domain application, awareness of methodology creation, participation in mentoring, and cultivation of ethical awareness. Two key differences are discernment and quality of attention, as illustrated in how the legendary Hollywood film editor Walter Murch used Apple’s Final Cut Pro software to edit the 2003 film Cold Mountain (Koppelman). ‘Pro-Ams’ might not aspire to these criteria but Citizen Journalists shouldn’t throw out these standards, either. Doing so would be making the same mistake of overconfidence that technical analysts make against statistical arbitrageurs. Key processes—fact-checking, sub-editing and editorial decision-making—are invisible to the end-user, even if traceable in a blog or wiki publishing system, because of the judgments involved. One post-mortem insight from Assignment Zero was that these processes were vital to create the climate of authenticity and trust to sustain a Citizen Journalist community (Howe). CJ’s trouble with “objectivity” might also overlook some complexities, including the similarity of many bloggers to “noise traders” in financial markets and to op-ed columnists. Methodologies and reportage practices have evolved to deal with the objections that CJ proponents raise, from New Journalism’s radical subjectivity and creative non-fiction techniques (Wolfe & Johnson) to Precision Journalism that used descriptive statistics (Meyer). Finally, journalism frameworks could be updated with current research on how phenomenological awareness shapes our judgments and perceptions (Thompson). 8. Strategic Execution For me, one of CJ’s major weaknesses as a new media theory is its lack of “rich description” (Geertz) about the strategic execution of projects. As Disinfo.com site editor I encountered situations ranging from ‘denial of service’ attacks and spam to site migration, publishing systems that go offline, and ensuring an editorial consistency. Yet the messiness of these processes is missing from CJ theories and accounts. 
Theories that included this detail as “second-order interactions” (Carlile & Christensen 13) would offer a richer view of CJ. Many CJ and Web 2.0 projects fall into the categories of mini-projects, demonstration prototypes and start-ups, even when using technologies such as Ajax or Ruby on Rails. Whilst the “bootstrap” process is a benefit, more longitudinal analysis and testing need to occur to ensure these projects are scalable and sustainable. For example, South Korea’s OhmyNews is cited as an exemplar that started with “727 citizen reporters and 4 editors” and now has “38,000 citizen reporters” and “a dozen editors” (Kolodzy 231). How does OhmyNews’s mix of hard and soft news change over time? Or, how does OhmyNews deal with a complex issue that might require major resources, such as security negotiations between North and South Korea? Such examples could do with further research. We need to go beyond “the vision thing” and look at the messiness of execution for deeper observations and counterintuitive correlations, to build new descriptive theories. 9. Future Research This essay argues that CJ needs re-evaluation. Its immediate legacy might be to splinter ‘journalism’ into micro-trends: Washington University’s Steve Boriss proclaims “citizen journalism is dead. Expert journalism is the future” (Boriss; Mensching). The half-lives of such micro-trends demand new categorisations, which in turn prematurely feed the theory-building cycle. Instead, future researchers could reinvigorate 21st century journalism if they ask deeper questions and return to the observation stage of building descriptive theories. In closing, below are some possible questions that future researchers might explore: Where are the “rich descriptions” of journalistic experience—“citizen”, “convergent”, “digital”, “Pro-Am” or otherwise in new media? How could practice-based approaches inform this research instead of relying on espoused theories-in-use? What new methodologies could be developed for CJ implementation? What role can the “heroic” individual reporter or editor have in “the swarm”? Do the claims about OhmyNews and other sites stand up to longitudinal observation? Are the theories used to justify Citizen Journalism’s normative stance (Rheingold; Surowiecki; Pesce) truly robust generalisations for strategic execution, or do they reflect the biases of their creators? How could developers tap the conceptual dimensions of information technology innovation (Shasha) to create the next Facebook, MySpace or Wikipedia? References Argyris, Chris, and Donald Schon. Theory in Practice. San Francisco: Jossey-Bass Publishers, 1976. Barlow, Aaron. The Rise of the Blogosphere. Westport, CT: Praeger Publishers, 2007. Block, Peter. Flawless Consulting. 2nd ed. San Francisco, CA: Jossey-Bass/Pfeiffer, 2000. Boriss, Steve. “Citizen Journalism Is Dead. Expert Journalism Is the Future.” The Future of News. 28 Nov. 2007. 20 Feb. 2008 <http://thefutureofnews.com/2007/11/28/citizen-journalism-is-dead-expert-journalism-is-the-future/>. Brooks, Jr., Frederick P. The Mythical Man-Month: Essays on Software Engineering. Rev. ed. Reading, MA: Addison-Wesley Publishing Company, 1995. Campbell, Vincent. Information Age Journalism: Journalism in an International Context. New York: Arnold, 2004. Carlile, Paul R., and Clayton M. Christensen. “The Cycles of Building Theory in Management Research.” Innosight working paper draft 6. 6 Jan. 2005. 19 Feb. 2008 <http://www.innosight.com/documents/Theory%20Building.pdf>. Caverly, Doug.
“Hyperlocal News Site Takes A Hit.” WebProNews.com 6 July 2007. 19 Feb. 2008 < http://www.webpronews.com/topnews/2007/07/06/hyperlocal-news- sites-take-a-hit >. Chenoweth, Neil. Virtual Murdoch: Reality Wars on the Information Superhighway. Sydney: Random House Australia, 2001. Christensen, Clayton M. The Innovator’s Dilemma: When New Technologies Cause Great Firms to Fail. Boston, MA: Harvard Business School Press, 1997. Christensen, Clayton M., Curtis Johnson, and Michael Horn. Disrupting Class: How Disruptive Innovation Will Change the Way the World Learns. New York: McGraw-Hill, 2008. Curtis, Adam. The Mayfair Set. London: British Broadcasting Corporation, 1999. Etherington, Kim. Becoming a Reflexive Researcher: Using Ourselves in Research. London: Jessica Kingsley Publishers, 2004. Festinger, Leon. A Theory of Cognitive Dissonance. Stanford, CA: Stanford University Press, 1962. Feyerabend, Paul. Against Method. 3rd ed. London: Verso, 1993. Finnemore, Martha. National Interests in International Society. Ithaca, NY: Cornell University Press, 1996. Geertz, Clifford. The Interpretation of Cultures. New York: Basic Books, 1973. Ghoshal, Sumantra. “Bad Management Theories Are Destroying Good Management Practices.” Academy of Management Learning & Education 4.1 (2005): 75-91. Gibson, William. Pattern Recognition. London: Viking, 2003. Gladwell, Malcolm. “The Cool-Hunt.” The New Yorker Magazine 17 March 1997. 20 Feb. 2008 < http://www.gladwell.com/1997/1997_03_17_a_cool.htm >. Gross, Daniel. Pop! Why Bubbles Are Great for the Economy. New York: Collins, 2007. Hoffer, Eric. The True Believer. New York: Harper, 1951. Howe, Jeff. “Did Assignment Zero Fail? A Look Back, and Lessons Learned.” Wired News 16 July 2007. 19 Feb. 2008 < http://www.wired.com/techbiz/media/news/2007/07/assignment_ zero_final?currentPage=all >. Kahneman, Daniel, and Amos Tversky. Choices, Values and Frames. Cambridge: Cambridge UP, 2000. Keen, Andrew. The Cult of the Amateur. New York: Doubleday Currency, 2007. Khurana, Rakesh. From Higher Aims to Hired Hands. Princeton, NJ: Princeton UP, 2007. Kolodzy, Janet. Convergence Journalism: Writing and Reporting across the News Media. Oxford: Rowman & Littlefield, 2006. Koppelman, Charles. Behind the Seen: How Walter Murch Edited Cold Mountain Using Apple’s Final Cut Pro and What This Means for Cinema. Upper Saddle River, NJ: New Rider, 2004. Leadbeater, Charles, and Paul Miller. “The Pro-Am Revolution”. London: Demos, 24 Nov. 2004. 19 Feb. 2008 < http://www.demos.co.uk/publications/proameconomy >. Loomis, Carol J. “Bloomberg’s Money Machine.” Fortune 5 April 2007. 20 Feb. 2008 < http://money.cnn.com/magazines/fortune/fortune_archive/2007/04/16/ 8404302/index.htm >. Lynch, Peter, and John Rothchild. Beating the Street. Rev. ed. New York: Simon & Schuster, 1994. Maister, David. True Professionalism. New York: The Free Press, 1997. Maister, David, Charles H. Green, and Robert M. Galford. The Trusted Advisor. New York: The Free Press, 2004. Mensching, Leah McBride. “Citizen Journalism on Its Way Out?” SFN Blog, 30 Nov. 2007. 20 Feb. 2008 < http://www.sfnblog.com/index.php/2007/11/30/940-citizen-journalism- on-its-way-out >. Meyer, Philip. Precision Journalism. 4th ed. Lanham, MD: Rowman & Littlefield, 2002. McConnell, Steve. Professional Software Development. Boston, MA: Addison-Wesley, 2004. Mintzberg, Henry. Managers Not MBAs. San Francisco, CA: Berrett-Koehler, 2004. Morgan, Gareth. Images of Organisation. Rev. ed. Thousand Oaks, CA: Sage, 2006. Newsvine. 
“Msnbc.com Acquires Newsvine.” 7 Oct. 2007. 20 Feb. 2008 < http://blog.newsvine.com/_news/2007/10/07/1008889-msnbccom- acquires-newsvine >. Niederhoffer, Victor, and Laurel Kenner. Practical Speculation. New York: John Wiley & Sons, 2003. Pearlstine, Norman. Off the Record: The Press, the Government, and the War over Anonymous Sources. New York: Farrar, Straus & Giroux, 2007. Pesce, Mark D. “Mob Rules (The Law of Fives).” The Human Network 28 Sep. 2007. 20 Feb. 2008 < http://blog.futurestreetconsulting.com/?p=39 >. Rheingold, Howard. Smart Mobs: The Next Social Revolution. Cambridge MA: Basic Books, 2002. Rosen, Jay. What Are Journalists For? Princeton NJ: Yale UP, 2001. Shasha, Dennis Elliott. Out of Their Minds: The Lives and Discoveries of 15 Great Computer Scientists. New York: Copernicus, 1995. Slywotzky, Adrian. Value Migration: How to Think Several Moves Ahead of the Competition. Boston, MA: Harvard Business School Press, 1996. Smith, Steve. “The Self-Image of a Discipline: The Genealogy of International Relations Theory.” Eds. Steve Smith and Ken Booth. International Relations Theory Today. Cambridge, UK: Polity Press, 1995. 1-37. Spar, Debora L. Ruling the Waves: Cycles of Discovery, Chaos and Wealth from the Compass to the Internet. New York: Harcourt, 2001. Surowiecki, James. The Wisdom of Crowds. New York: Doubleday, 2004. Thompson, Evan. Mind in Life: Biology, Phenomenology, and the Sciences of Mind. Cambridge, MA: Belknap Press, 2007. Trippi, Joe. The Revolution Will Not Be Televised. New York: ReganBooks, 2004. Underwood, Doug. When MBA’s Rule the Newsroom. New York: Columbia University Press, 1993. Wark, McKenzie. Virtual Geography: Living with Global Media Events. Bloomington IN: Indiana UP, 1994. Wolfe, Tom, and E.W. Johnson. The New Journalism. New York: Harper & Row, 1973.
APA, Harvard, Vancouver, ISO, and other styles
26

Deck, Andy. "Treadmill Culture." M/C Journal 6, no. 2 (April 1, 2003). http://dx.doi.org/10.5204/mcj.2157.

Full text
Abstract:
Since the first days of the World Wide Web, artists like myself have been exploring the new possibilities of network interactivity. Some good tools and languages have been developed and made available free for the public to use. This has empowered individuals to participate in the media in ways that are quite remarkable. Nonetheless, the future of independent media is clouded by legal, regulatory, and organisational challenges that need to be addressed. It is not clear to what extent independent content producers will be able to build upon the successes of the 90s – it is yet to be seen whether their efforts will be largely nullified by the anticyclones of a hostile media market. Not so long ago, American news magazines were covering the Browser War. Several real wars later, the terms of surrender are becoming clearer. Now both of the major Internet browsers are owned by huge media corporations, and most of the states (and Reagan-appointed judges) that were demanding the break-up of Microsoft have given up. A curious about-face occurred in U.S. Justice Department policy when John Ashcroft decided to drop the federal case. Maybe Microsoft's value as a partner in covert activity appealed to Ashcroft more than free competition. Regardless, Microsoft is now turning its wrath on new competitors, people who are doing something very, very bad: sharing the products of their own labour. This practice of sharing source code and building free software infrastructure is epitomised by the continuing development of Linux. Everything in the Linux kernel is free, publicly accessible information. As a rule, the people building this "open source" operating system software believe that maintaining transparency is important. But U.S. courts are not doing much to help. In a case brought by the Motion Picture Association of America against Eric Corley, a federal district court blocked the distribution of source code that enables these systems to play DVDs. In addition to censoring Corley's journal, the court ruled that any programmer who writes a program that plays a DVD must comply with a host of license restrictions. In short, an established and popular media format (the DVD) cannot be used under open source operating systems without sacrificing the principle that software source code should remain in the public domain. Should the contents of operating systems be tightly guarded secrets, or subject to public review? If there are capable programmers willing to create good, free operating systems, should the law stand in their way? The question concerning what type of software infrastructure will dominate personal computers in the future is being answered as much by disappointing legal decisions as it is by consumer choice. Rather than ensuring the necessary conditions for innovation and cooperation, the courts permit a monopoly to continue. Rather than endorsing transparency, secrecy prevails. Rather than aiming to preserve a balance between the commercial economy and the gift-economy, sharing is being undermined by the law. Part of the mystery of the Internet for a lot of newcomers must be that it seems to disprove the old adage that you can't get something for nothing. Free games, free music, free pornography, free art. Media corporations are doing their best to change this situation. The FBI and trade groups have blitzed the American news media with alarmist reports about how children don't understand that sharing digital information is a crime. 
Teacher Gail Chmura, the star of one such media campaign, says of her students, "It's always been interesting that they don't see a connection between the two. They just don't get it" (Hopper). Perhaps the confusion arises because the kids do understand that digital duplication lets two people have the same thing. Theft is at best a metaphor for the copying of data, because the original is not stolen in the same sense as a material object. In the effort to liken all copying to theft, legal provisions for the fair use of intellectual property are neglected. Teachers could just as easily emphasise the importance of sharing and the development of an electronic commons that is free for all to use. The values advanced by the trade groups are not beyond question and are not historical constants. According to Donald Krueckeberg, Rutgers University Professor of Urban Planning, native Americans tied the concept of property not to ownership but to use. "One used it, one moved on, and use was shared with others" (qtd. in Batt). Perhaps it is necessary for individuals to have dominion over some private data. But who owns the land, wind, sun, and sky of the Internet – the infrastructure? Given that publicly-funded research and free software have been as important to the development of the Internet as have business and commercial software, it is not surprising that some ambiguity remains about the property status of the dataverse. For many the Internet is as much a medium for expression and the interplay of languages as it is a framework for monetary transaction. In the case involving DVD software mentioned previously, there emerged a grass-roots campaign in opposition to censorship. Dozens of philosophical programmers and computer scientists asserted the expressive and linguistic bases of software by creating variations on the algorithm needed to play DVDs. The forbidden lines of symbols were printed on T-shirts, translated into different computer languages, translated into legal rhetoric, and even embedded into DNA and pictures of MPAA president Jack Valenti (see e.g. Touretzky). These efforts were inspired by a shared conviction that important liberties were at stake. Supporting the MPAA's position would do more than protect movies from piracy. The use of the algorithm was not clearly linked to an intent to pirate movies. Many felt that outlawing the DVD algorithm, which had been experimentally developed by a Norwegian teenager, represented a suppression of gumption and ingenuity. The court's decision rejected established principles of fair use, denied the established legality of reverse engineering software to achieve compatibility, and asserted that journalists and scientists had no right to publish a bit of code if it might be misused. In a similar case in April 2000, a U.S. court of appeals found that First Amendment protections did apply to software (Junger). Noting that source code has both an expressive feature and a functional feature, this court held that First Amendment protection is not reserved only for purely expressive communication. Yet in the DVD case, the court opposed this view and enforced the inflexible demands of the Digital Millennium Copyright Act. Notwithstanding Ted Nelson's characterisation of computers as literary machines, the decision meant that the linguistic and expressive aspects of software would be subordinated to other concerns. A simple series of symbols were thereby cast under a veil of legal secrecy. 
Although they were easy to discover, and capable of being committed to memory or translated to other languages, fair use and other intuitive freedoms were deemed expendable. These sorts of legal obstacles are serious challenges to the continued viability of free software like Linux. The central value proposition of Linux-based operating systems – free, open source code – is threatening to commercial competitors. Some corporations are intent on stifling further development of free alternatives. Patents offer another vulnerability. The writing of free software has become a minefield of potential patent lawsuits. Corporations have repeatedly chosen to pursue patent litigation years after the alleged infringements have been incorporated into widely used free software. For example, although it was designed to avoid patent problems by an array of international experts, the image file format known as JPEG (Joint Photographic Experts Group) has recently been dogged by patent infringement charges. Despite good intentions, low-budget initiatives and ad hoc organisations are ill equipped to fight profiteering patent lawsuits. One wonders whether software innovation is directed more by lawyers or computer scientists. The present copyright and patent regimes may serve the needs of the larger corporations, but it is doubtful that they are the best means of fostering software innovation and quality. Orwell wrote in his Homage to Catalonia, There was a new rule that censored portions of the newspaper must not be left blank but filled up with other matter; as a result it was often impossible to tell when something had been cut out. The development of the Internet has a similar character: new diversions spring up to replace what might have been so that the lost potential is hardly felt. The process of retrofitting Internet software to suit ideological and commercial agendas is already well underway. For example, Microsoft has announced recently that it will discontinue support for the Java language in 2004. The problem with Java, from Microsoft's perspective, is that it provides portable programming tools that work under all operating systems, not just Windows. With Java, programmers can develop software for the large number of Windows users, while simultaneously offering software to users of other operating systems. Java is an important piece of the software infrastructure for Internet content developers. Yet, in the interest of coercing people to use only their operating systems, Microsoft is willing to undermine thousands of existing Java-language projects. Their marketing hype calls this progress. The software industry relies on sales to survive, so if it means laying waste to good products and millions of hours of work in order to sell something new, well, that's business. The consequent infrastructure instability keeps software developers, and other creative people, on a treadmill. From Progressive Load by Andy Deck, artcontext.org/progload As an Internet content producer, one does not appeal directly to the hearts and minds of the public; one appeals through the medium of software and hardware. Since most people are understandably reluctant to modify the software running on their computers, the software installed initially is a critical determinant of what is possible. Unconventional, independent, and artistic uses of the Internet are diminished when the media infrastructure is effectively established by decree. 
Unaccountable corporate control over infrastructure software tilts the playing field against smaller content producers who have neither the advance warning of industrial machinations, nor the employees and resources necessary to keep up with a regime of strategic, cyclical obsolescence. It seems that independent content producers must conform to the distribution technologies and content formats favoured by the entertainment and marketing sectors, or else resign themselves to occupying the margins of media activity. It is no secret that highly diversified media corporations can leverage their assets to favour their own media offerings and confound their competitors. Yet when media giants AOL and Time-Warner announced their plans to merge in 2000, the claim of CEOs Steve Case and Gerald Levin that the merged companies would "operate in the public interest" was hardly challenged by American journalists. Time-Warner has since fought to end all ownership limits in the cable industry; and Case, who formerly championed third-party access to cable broadband markets, changed his tune abruptly after the merger. Now that Case has been ousted, it is unclear whether he still favours oligopoly. According to Levin, "global media will be and is fast becoming the predominant business of the 21st century ... more important than government. It's more important than educational institutions and non-profits. We're going to need to have these corporations redefined as instruments of public service, and that may be a more efficient way to deal with society's problems than bureaucratic governments. Corporate dominance is going to be forced anyhow because when you have a system that is instantly available everywhere in the world immediately, then the old-fashioned regulatory system has to give way" (Levin). It doesn't require a lot of insight to understand that this "redefinition," this sleight of hand, does not protect the public from abuses of power: the dissolution of the "old-fashioned regulatory system" does not serve the public interest. From Lexicon by Andy Deck, artcontext.org/lexicon. As an artist who has adopted telecommunications networks and software as his medium, it disappoints me that a mercenary vision of electronic media's future seems to be the prevailing blueprint. The giantism of media corporations, and the ongoing deregulation of media consolidation (Ahrens), underscore the critical need for independent media sources. If it were just a matter of which cola to drink, it would not be of much concern, but media corporations control content. In this hyper-mediated age, content – whether produced by artists or journalists – crucially affects what people think about and how they understand the world. Content is not impervious to the software, protocols, and chicanery that surround its delivery. It is about time that people interested in independent voices stop believing that laissez-faire capitalism is building a better media infrastructure. The German writer Hans Magnus Enzensberger reminds us that the media tyrannies that affect us are social products. The media industry relies on thousands of people to make the compromises necessary to maintain its course. The rapid development of the mind industry, its rise to a key position in modern society, has profoundly changed the role of the intellectual. He finds himself confronted with new threats and new opportunities.
Whether he knows it or not, whether he likes it or not, he has become the accomplice of a huge industrial complex which depends for its survival on him, as he depends on it for his own. He must try, at any cost, to use it for his own purposes, which are incompatible with the purposes of the mind machine. What it upholds he must subvert. He may play it crooked or straight, he may win or lose the game; but he would do well to remember that there is more at stake than his own fortune (Enzensberger 18). Some cultural leaders have recognised the important role that free software already plays in the infrastructure of the Internet. Among intellectuals there is undoubtedly a genuine concern about the emerging contours of corporate, global media. But more effective solidarity is needed. Interest in open source has tended to remain superficial, leading to trendy, cosmetic, and symbolic uses of terms like "open source" rather than to a deeper commitment to an open, public information infrastructure. Too much attention is focussed on what's "cool" and not enough on the road ahead. Various media specialists – designers, programmers, artists, and technical directors – make important decisions that affect the continuing development of electronic media. Many developers have failed to recognise (or care) that their decisions regarding media formats can have long reaching consequences. Web sites that use media formats which are unworkable for open source operating systems should be actively discouraged. Comparable technologies are usually available to solve compatibility problems. Going with the market flow is not really giving people what they want: it often opposes the work of thousands of activists who are trying to develop open source alternatives (see e.g. Greene). Average Internet users can contribute to a more innovative, free, open, and independent media – and being conscientious is not always difficult or unpleasant. One project worthy of support is the Internet browser Mozilla. Currently, many content developers create their Websites so that they will look good only in Microsoft's Internet Explorer. While somewhat understandable given the market dominance of Internet Explorer, this disregard for interoperability undercuts attempts to popularise standards-compliant alternatives. Mozilla, written by a loose-knit group of activists and programmers (some of whom are paid by AOL/Time-Warner), can be used as an alternative to Microsoft's browser. If more people use Mozilla, it will be harder for content providers to ignore the way their Web pages appear in standards-compliant browsers. The Mozilla browser, which is an open source initiative, can be downloaded from http://www.mozilla.org/. While there are many people working to create real and lasting alternatives to the monopolistic and technocratic dynamics that are emerging, it takes a great deal of cooperation to resist the media titans, the FCC, and the courts. Oddly enough, corporate interests sometimes overlap with those of the public. Some industrial players, such as IBM, now support open source software. For them it is mostly a business decision. Frustrated by the coercive control of Microsoft, they support efforts to develop another operating system platform. For others, including this writer, the open source movement is interesting for the potential it holds to foster a more heterogeneous and less authoritarian communications infrastructure. 
Many people can find common cause in this resistance to globalised uniformity and consolidated media ownership. The biggest challenge may be to get people to believe that their choices really matter, that by endorsing certain products and operating systems and not others, they can actually make a difference. But it's unlikely that this idea will flourish if artists and intellectuals don't view their own actions as consequential. There is a troubling tendency for people to see themselves as powerless in the face of the market. This paralysing habit of mind must be abandoned before the media will be free. Works Cited Ahrens, Frank. "Policy Watch." Washington Post (23 June 2002): H03. 30 March 2003 <http://www.washingtonpost.com/ac2/wp-dyn/A27015-2002Jun22?la... ...nguage=printer>. Batt, William. "How Our Towns Got That Way." 7 Oct. 1996. 31 March 2003 <http://www.esb.utexas.edu/drnrm/WhatIs/LandValue.htm>. Chester, Jeff. "Gerald Levin's Negative Legacy." Alternet.org 6 Dec. 2001. 5 March 2003 <http://www.democraticmedia.org/resources/editorials/levin.php>. Enzensberger, Hans Magnus. "The Industrialisation of the Mind." Raids and Reconstructions. London: Pluto Press, 1975. 18. Greene, Thomas C. "MS to Eradicate GPL, Hence Linux." 25 June 2002. 5 March 2003 <http://www.theregus.com/content/4/25378.php>. Hopper, D. Ian. "FBI Pushes for Cyber Ethics Education." Associated Press 10 Oct. 2000. 29 March 2003 <http://www.billingsgazette.com/computing/20001010_cethics.php>. Junger v. Daley. U.S. Court of Appeals for 6th Circuit. 00a0117p.06. 2000. 31 March 2003 <http://pacer.ca6.uscourts.gov/cgi-bin/getopn.pl?OPINION=00a0... ...117p.06>. Levin, Gerald. "Millennium 2000 Special." CNN 2 Jan. 2000. Touretzky, D. S. "Gallery of CSS Descramblers." 2000. 29 March 2003 <http://www.cs.cmu.edu/~dst/DeCSS/Gallery>. Links http://artcontext.org/lexicon/ http://artcontext.org/progload http://pacer.ca6.uscourts.gov/cgi-bin/getopn.pl?OPINION=00a0117p.06 http://www.billingsgazette.com/computing/20001010_cethics.html http://www.cs.cmu.edu/~dst/DeCSS/Gallery http://www.democraticmedia.org/resources/editorials/levin.html http://www.esb.utexas.edu/drnrm/WhatIs/LandValue.htm http://www.mozilla.org/ http://www.theregus.com/content/4/25378.html http://www.washingtonpost.com/ac2/wp-dyn/A27015-2002Jun22?language=printer Citation reference for this article Substitute your date of access for Dn Month Year etc... MLA Style Deck, Andy. "Treadmill Culture " M/C: A Journal of Media and Culture< http://www.media-culture.org.au/0304/04-treadmillculture.php>. APA Style Deck, A. (2003, Apr 23). Treadmill Culture . M/C: A Journal of Media and Culture, 6,< http://www.media-culture.org.au/0304/04-treadmillculture.php>
APA, Harvard, Vancouver, ISO, and other styles
27

Goggin, Gerard. "Broadband." M/C Journal 6, no. 4 (August 1, 2003). http://dx.doi.org/10.5204/mcj.2219.

Full text
Abstract:
Connecting I’ve moved house on the weekend, closer to the centre of an Australian capital city. I had recently signed up for broadband, with a major Australian Internet company (my first contact, cf. Turner). Now I am the proud owner of a larger modem than any I have owned before: a white cable modem. I gaze out into our new street: two thick black cables cosseted in silver wire. I am relieved. My new home is located in one of those streets, double-cabled by Telstra and Optus in the data-rush of the mid-1990s. Otherwise, I’d be moth-balling the cable modem, and the thrill of my data percolating down coaxial cable. And it would be off to the computer supermarket to buy an ADSL modem, then to pick a provider, to squeeze some twenty-first century connectivity out of old copper (the phone network our grandparents and great-grandparents built). If I still lived in the country, or the outskirts of the city, or anywhere else more than four kilometres from the phone exchange, and somewhere that cable pay TV will never reach, it would be a dish for me — satellite. Our digital lives are premised upon infrastructure, the networks through which we shape what we do, fashion the meanings of our customs and practices, and exchange signs with others. Infrastructure is not simply the material or the technical (Lamberton), but the dense, fibrous knotting together of social visions, cultural resources, individual desires, and connections. No longer can one easily distinguish between ‘society’ and ‘technology’, ‘carriage’ and ‘content’, ‘base’ and ‘superstructure’, or ‘infrastructure’ and ‘applications’ (or ‘services’ or ‘content’). To understand telecommunications in action, or the vectors of fibre, we need to consider the long and heterogeneous list of links among different human and non-human actors — the long networks, to take Bruno Latour’s evocative concept, that confect our broadband networks (Latour). The co-ordinates of our infrastructure still build on a century-long history of telecommunications networks, on the nineteenth-century centrality of telegraphy preceding this, and on the histories of the public and private so inscribed. Yet we are in the midst of a long, slow dismantling of the post-telegraph-telephone (PTT) model of the monopoly carrier for each nation that dominated the twentieth century, with its deep colonial foundations. Instead, our New World Information and Communication Order is not the decolonising UNESCO vision of the late 1970s and early 1980s (MacBride, Maitland). Rather, it is the neoliberal, free trade, market access model, its symbol the 1984 US judicial decision to require the break-up of AT&T and the UK legislation in the same year that underpinned the Thatcherite twin move to privatize British Telecom and introduce telecommunications competition. Between 1984 and 1999, 110 telecommunications companies were privatized, and the ‘acquisition of privatized PTOs [public telecommunications operators] by European and American operators does follow colonial lines’ (Winseck 396; see also Mody, Bauer & Straubhaar). The competitive market has now been uneasily installed as the paradigm for convergent communications networks, not least with the World Trade Organisation’s 1994 General Agreement on Trade in Services and Annex on Telecommunications. As the citizen is recast as consumer and customer (Goggin, ‘Citizens and Beyond’), we rethink our cultural and political axioms as well as the axes that orient our understandings in this area.
Information might travel close to the speed of light, and we might fantasise about optical fibre to the home (or pillow), but our terrain, our band where the struggle lies today, is narrower than we wish. Begging for broadband, it seems, is a long way from warchalking for WiFi. Policy Circuits The dreary everyday business of getting connected plugs the individual netizen into a tangled mess of policy circuits, as much as tricky network negotiations. Broadband in mid-2003 in Australia is a curious chimera, welded together from a patchwork of technologies, old and newer communications industries, emerging economies and patterns of use. Broadband conjures up grander visions, however, of communication and cultural cornucopia. Broadband is high-speed, high-bandwidth, ‘always-on’, networked communications. People can send and receive video, engage in multimedia exchanges of all sorts, make the most of online education, realise the vision of home-based work and trading, have access to telemedicine, and entertainment. Broadband really entered the lexicon with the mass takeup of the Internet in the early to mid-1990s, and with the debates about something called the ‘information superhighway’. The rise of the Internet, the deregulation of telecommunications, and the involuted convergence of communications and media technologies saw broadband positioned at the centre of policy debates nearly a decade ago. In 1993-1994, Australia had its Broadband Services Expert Group (BSEG), established by the then Labor government. The BSEG was charged with inquiring into ‘issues relating to the delivery of broadband services to homes, schools and businesses’. Stung by criticisms of elite composition (a narrow membership, with only one woman among its twelve members, and no consumer or citizen group representation), the BSEG was prompted into wider public discussion and consultation (Goggin & Newell). The then Bureau of Transport and Communications Economics (BTCE), since transmogrified into the Communications Research Unit of the Department of Communications, Information Technology and the Arts (DCITA), conducted its large-scale Communications Futures Project (BTCE and Luck). The BSEG Final report posed the question starkly: As a society we have choices to make. If we ignore the opportunities we run the risk of being left behind as other countries introduce new services and make themselves more competitive: we will become consumers of other countries’ content, culture and technologies rather than our own. Or we could adopt new technologies at any cost…This report puts forward a different approach, one based on developing a new, user-oriented strategy for communications. The emphasis will be on communication among people... (BSEG v) The BSEG proposed a ‘National Strategy for New Communications Networks’ based on three aspects: education and community access, industry development, and the role of government (BSEG x). Ironically, while the nation, or at least its policy elites, pondered the weighty question of broadband, Australia’s two largest telcos were doing it. The commercial decision of Telstra/Foxtel and Optus Vision, and their various television partners, was to nail their colours (black) to the mast, or rather telegraph pole, and to lay cable in the major capital cities. In fact, they duplicated the infrastructure in cities such as Sydney and Melbourne, then deciding it would not be profitable to cable up even regional centres, let alone small country towns or settlements. 
As Terry Flew and Christina Spurgeon observe: This wasteful duplication contrasted with many other parts of the country that would never have access to this infrastructure, or to the social and economic benefits that it was perceived to deliver. (Flew & Spurgeon 72) The implications of this decision for Australia’s telecommunications and television were profound, but there was little, if any, public input into this. Then Minister Michael Lee was very proud of his anti-siphoning list of programs, such as national sporting events, that would remain on free-to-air television rather than screen on pay, but was unwilling, or unable, to develop policy on broadband and pay TV cable infrastructure (on the ironies of Australia’s television history, see Given’s masterly account). During this period also, it may be remembered, Australia’s Internet was being passed into private hands, with the tendering out of AARNET (see Spurgeon for discussion). No such national strategy on broadband really emerged in the intervening years, nor has the market provided integrated, accessible broadband services. In 1997, landmark telecommunications legislation was enacted that provided a comprehensive framework for competition in telecommunications, as well as consolidating and extending consumer protection, universal service, customer service standards, and other reforms (CLC). Carrier and reseller competition had commenced in 1991, and the 1997 legislation gave it further impetus. Effective competition is now well established in long distance telephone markets, and in mobiles. Rivalrous competition exists in the market for local-call services, though viable alternatives to Telstra’s dominance are still few (Fels). Broadband too is an area where there is symbolic rivalry rather than effective competition. This is most visible in advertised ADSL offerings in large cities, yet most of the infrastructure for these services is comprised by Telstra’s copper, fixed-line network. Facilities-based duopoly competition exists principally where Telstra/Foxtel and Optus cable networks have been laid, though there are quite a number of ventures underway by regional telcos, power companies, and, most substantial perhaps, the ACT government’s TransACT broadband network. Policymakers and industry have been greatly concerned about what they see as slow takeup of broadband, compared to other countries, and by barriers to broadband competition and access to ‘bottleneck’ facilities (such as Telstra or Optus’s networks) by potential competitors. The government has alternated between trying to talk up broadband benefits and rates of take up and recognising the real difficulties Australia faces as a large country with a relative small and dispersed population. In March 2003, Minister Alston directed the ACCC to implement new monitoring and reporting arrangements on competition in the broadband industry. A key site for discussion of these matters has been the competition policy institution, the Australian Competition and Consumer Commission, and its various inquiries, reports, and considerations (consult ACCC’s telecommunications homepage at http://www.accc.gov.au/telco/fs-telecom.htm). Another key site has been the Productivity Commission (http://www.pc.gov.au), while a third is the National Office on the Information Economy (NOIE - http://www.noie.gov.au/projects/access/access/broadband1.htm). Others have questioned whether even the most perfectly competitive market in broadband will actually provide access to citizens and consumers. 
A great deal of work on this issue has been undertaken by DCITA, NOIE, the regulators, and industry bodies, not to mention consumer and public interest groups. Since 1997, there have been a number of governmental inquiries undertaken or in progress concerning the takeup of broadband and networked new media (for example, a House of Representatives Wireless Broadband Inquiry), as well as important inquiries into the still most strategically important of Australia’s companies in this area, Telstra. Much of this effort on an ersatz broadband policy has been piecemeal and fragmented. There are fundamental difficulties with the large size of the Australian continent and its harsh terrain, the small size of the Australian market, the number of providers, and the dominant position effectively still held by Telstra, as well as Singtel Optus (Optus’s previous overseas investors included Cable & Wireless and Bell South), and the larger telecommunications and Internet companies (such as Ozemail). Many consumers living in metropolitan Australia still face real difficulties in realising the slogan ‘bandwidth for all’, but the situation in parts of rural Australia is far worse. Satellite ‘broadband’ solutions are available, through Telstra Countrywide or other providers, but these offer limited two-way interactivity. Data can be received at reasonable speeds (though at far lower data rates than how ‘broadband’ used to be defined), but can only be sent at far slower rates (Goggin, Rural Communities Online). The cultural implications of these digital constraints may well be considerable. Computer gamers, for instance, are frustrated by slow return paths. In this light, the final report of the January 2003 Broadband Advisory Group (BAG) is very timely. The BAG report opens with a broadband rhapsody: Broadband communications technologies can deliver substantial economic and social benefits to Australia…As well as producing productivity gains in traditional and new industries, advanced connectivity can enrich community life, particularly in rural and regional areas. It provides the basis for integration of remote communities into national economic, cultural and social life. (BAG 1, 7) Its prescriptions include: Australia will be a world leader in the availability and effective use of broadband...and to capture the economic and social benefits of broadband connectivity...Broadband should be available to all Australians at fair and reasonable prices…Market arrangements should be pro-competitive and encourage investment...The Government should adopt a National Broadband Strategy (BAG 1) And, like its predecessor nine years earlier, the BAG report does make reference to a national broadband strategy aiming to maximise “choice in work and recreation activities available to all Australians independent of location, background, age or interests” (17). However, the idea of a national broadband strategy is not something the BAG really comes to grips with. The final report is keen on encouraging broadband adoption, but not explicit on how barriers to broadband can be addressed. Perhaps this is not surprising given that the membership of the BAG, dominated by representatives of large corporations and senior bureaucrats was even less representative than its BSEG predecessor. Some months after the BAG report, the Federal government did declare a broadband strategy. 
It did so, intriguingly enough, under the rubric of its response to the Regional Telecommunications Inquiry report (Estens), the second inquiry responsible for reassuring citizens nervous about the full-privatisation of Telstra (the first inquiry being Besley). The government’s grand $142.8 million National Broadband Strategy focusses on the ‘broadband needs of regional Australians, in partnership with all levels of government’ (Alston, ‘National Broadband Strategy’). Among other things, the government claims that the Strategy will result in “improved outcomes in terms of services and prices for regional broadband access; [and] the development of national broadband infrastructure assets.” (Alston, ‘National Broadband Strategy’) At the same time, the government announced an overall response to the Estens Inquiry, with specific safeguards for Telstra’s role in regional communications — a preliminary to the full Telstra sale (Alston, ‘Future Proofing’). Less publicised was the government’s further initiative in indigenous telecommunications, complementing its Telecommunications Action Plan for Remote Indigenous Communities (DCITA). Indigenous people, it can be argued, were never really contemplated as citizens with the ken of the universal service policy taken to underpin the twentieth-century government monopoly PTT project. In Australia during the deregulatory and re-regulatory 1990s, there was a great reluctance on the part of Labor and Coalition Federal governments, Telstra and other industry participants, even to research issues of access to and use of telecommunications by indigenous communicators. Telstra, and to a lesser extent Optus (who had purchased AUSSAT as part of their licence arrangements), shrouded the issue of indigenous communications in mystery that policymakers were very reluctant to uncover, let alone systematically address. Then regulator, the Australian Telecommunications Authority (AUSTEL), had raised grave concerns about indigenous telecommunications access in its 1991 Rural Communications inquiry. However, there was no government consideration of, nor research upon, these issues until Alston commissioned a study in 2001 — the basis for the TAPRIC strategy (DCITA). The elision of indigenous telecommunications from mainstream industry and government policy is all the more puzzling, if one considers the extraordinarily varied and significant experiments by indigenous Australians in telecommunications and Internet (not least in the early work of the Tanami community, made famous in media and cultural studies by the writings of anthropologist Eric Michaels). While the government’s mid-2003 moves on a ‘National Broadband Strategy’ attend to some details of the broadband predicament, they fall well short of an integrated framework that grasps the shortcomings of the neoliberal communications model. The funding offered is a token amount. The view from the seat of government is a glance from the rear-view mirror: taking a snapshot of rural communications in the years 2000-2002 and projecting this tableau into a safety-net ‘future proofing’ for the inevitable turning away of a fully-privately-owned Telstra from its previously universal, ‘carrier of last resort’ responsibilities. In this aetiolated, residualist policy gaze, citizens remain constructed as consumers in a very narrow sense in this incremental, quietist version of state securing of market arrangements. 
What is missing is any more expansive notion of citizens, their varied needs, expectations, uses, and cultural imaginings of ‘always on’ broadband networks. Hybrid Networks “Most people on earth will eventually have access to networks that are all switched, interactive, and broadband”, wrote Frances Cairncross in 1998. ‘Eventually’ is a very appropriate word to describe the parlous state of broadband technology implementation. Broadband is in a slow state of evolution and invention. The story of broadband so far underscores the predicament for Australian access to bandwidth, when we lack any comprehensive, integrated, effective, and fair policy in communications and information technology. We have only begun to experiment with broadband technologies and understand their evolving uses, cultural forms, and the sense in which they rework us as subjects. Our communications networks are not superhighways, to invoke an enduring artefact from an older technology. Nor any longer are they a single ‘public’ switched telecommunications network, like those presided over by the post-telegraph-telephone monopolies of old. Like roads themselves, or the nascent postal system of the sixteenth century, broadband is a patchwork quilt. The ‘fibre’ of our communications networks is hybrid. To be sure, powerful corporations dominate, like the Tassis or Taxis who served as postmasters to the Habsburg emperors (Briggs & Burke 25). Activating broadband today provides a perspective on the path dependency of technology history, and how we can open up new threads of a communications fabric. Our options for transforming our multitudinous networked lives emerge as much from everyday tactics and strategies as they do from grander schemes and unifying policies. We may care to reflect on the waning potential for nation-building technology, in the wake of globalisation. We no longer gather our imagined community around a Community Telephone Plan as it was called in 1960 (Barr, Moyal, and PMG). Yet we do require national and international strategies to get and stay connected (Barr), ideas and funding that concretely address the wider dimensions of access and use. We do need to debate the respective roles of Telstra, the state, community initiatives, and industry competition in fair telecommunications futures. Networks have global reach and require global and national integration. Here vision, co-ordination, and resources are urgently required for our commonweal and moral fibre. To feel the width of the band we desire, we need to plug into and activate the policy circuits. Thanks to Grayson Cooke, Patrick Lichty, Ned Rossiter, John Pace, and an anonymous reviewer for helpful comments. Works Cited Alston, Richard. ‘ “Future Proofing” Regional Communications.’ Department of Communications, Information Technology and the Arts, Canberra, 2003. 17 July 2003 <http://www.dcita.gov.au/Article/0,,0_1-2_3-4_115485,00.php> —. ‘A National Broadband Strategy.’ Department of Communications, Information Technology and the Arts, Canberra, 2003. 17 July 2003 <http://www.dcita.gov.au/Article/0,,0_1-2_3-4_115486,00.php>. Australian Competition and Consumer Commission (ACCC). Broadband Services Report March 2003. Canberra: ACCC, 2003. 17 July 2003 <http://www.accc.gov.au/telco/fs-telecom.htm>. —. Emerging Market Structures in the Communications Sector. Canberra: ACCC, 2003. 15 July 2003 <http://www.accc.gov.au/pubs/publications/utilities/telecommu... ...nications/Emerg_mar_struc.doc>. Barr, Trevor. 
new media.com: The Changing Face of Australia’s Media and Telecommunications. Sydney: Allen & Unwin, 2000. Besley, Tim (Telecommunications Service Inquiry). Connecting Australia: Telecommunications Service Inquiry. Canberra: Department of Information, Communications and the Arts, 2000. 17 July 2003 <http://www.telinquiry.gov.au/final_report.php>. Briggs, Asa, and Burke, Peter. A Social History of the Media: From Gutenberg to the Internet. Cambridge: Polity, 2002. Broadband Advisory Group. Australia’s Broadband Connectivity: The Broadband Advisory Group’s Report to Government. Melbourne: National Office on the Information Economy, 2003. 15 July 2003 <http://www.noie.gov.au/publications/NOIE/BAG/report/index.htm>. Broadband Services Expert Group. Networking Australia’s Future: Final Report. Canberra: Australian Government Publishing Service (AGPS), 1994. Bureau of Transport and Communications Economics (BTCE). Communications Futures Final Project. Canberra: AGPS, 1994. Cairncross, Frances. The Death of Distance: How the Communications Revolution Will Change Our Lives. London: Orion Business Books, 1997. Communications Law Centre (CLC). Australian Telecommunications Regulation: The Communications Law Centre Guide. 2nd edition. Sydney: Communications Law Centre, University of NSW, 2001. Department of Communications, Information Technology and the Arts (DCITA). Telecommunications Action Plan for Remote Indigenous Communities: Report on the Strategic Study for Improving Telecommunications in Remote Indigenous Communities. Canberra: DCITA, 2002. Estens, D. Connecting Regional Australia: The Report of the Regional Telecommunications Inquiry. Canberra: DCITA, 2002. <http://www.telinquiry.gov.au/rti-report.php>, accessed 17 July 2003. Fels, Alan. ‘Competition in Telecommunications.’ Speech to the Australian Telecommunications Users Group 19th Annual Conference, 6 March 2003, Sydney. <http://www.accc.gov.au/speeches/2003/Fels_ATUG_6March03.doc>, accessed 15 July 2003. Flew, Terry, and Spurgeon, Christina. ‘Television After Broadcasting.’ In The Australian TV Book. Ed. Graeme Turner and Stuart Cunningham. Sydney: Allen & Unwin, 2000. 69-85. Given, Jock. Turning Off the Television. Sydney: UNSW Press, 2003. Goggin, Gerard. ‘Citizens and Beyond: Universal Service in the Twilight of the Nation-State.’ In All Connected?: Universal Service in Telecommunications, ed. Bruce Langtry. Melbourne: University of Melbourne Press, 1998. 49-77. —. Rural Communities Online: Networking to Link Consumers to Providers. Melbourne: Telstra Consumer Consultative Council, 2003. Goggin, Gerard, and Newell, Christopher. Digital Disability: The Social Construction of Disability in New Media. Lanham, MD: Rowman & Littlefield, 2003. House of Representatives Standing Committee on Communications, Information Technology and the Arts (HoR). Connecting Australia!: Wireless Broadband. Report of Inquiry into Wireless Broadband Technologies. Canberra: Parliament House, 2002. <http://www.aph.gov.au/house/committee/cita/Wbt/report.htm>, accessed 17 July 2003. Lamberton, Don. ‘A Telecommunications Infrastructure Is Not an Information Infrastructure.’ Prometheus: Journal of Issues in Technological Change, Innovation, Information Economics, Communication and Science Policy 14 (1996): 31-38. Latour, Bruno. Science in Action: How to Follow Scientists and Engineers Through Society. Cambridge, MA: Harvard University Press, 1987. Luck, David.
‘Revisiting the Future: Assessing the 1994 BTCE Communications Futures Project.’ Media International Australia 96 (2000): 109-119. MacBride, Sean (Chair of the International Commission for the Study of Communication Problems). Many Voices, One World: Towards a New More Just and More Efficient World Information and Communication Order. Paris: UNESCO; London: Kogan Page, 1980. Maitland Commission (Independent Commission on Worldwide Telecommunications Development). The Missing Link. Geneva: International Telecommunications Union, 1985. Michaels, Eric. Bad Aboriginal Art: Tradition, Media, and Technological Horizons. Sydney: Allen & Unwin, 1994. Mody, Bella, Bauer, Johannes M., and Straubhaar, Joseph D., eds. Telecommunications Politics: Ownership and Control of the Information Highway in Developing Countries. Mahwah, NJ: Erlbaum, 1995. Moyal, Ann. Clear Across Australia: A History of Telecommunications. Melbourne: Thomas Nelson, 1984. Postmaster-General’s Department (PMG). Community Telephone Plan for Australia. Melbourne: PMG, 1960. Productivity Commission (PC). Telecommunications Competition Regulation: Inquiry Report. Report No. 16. Melbourne: Productivity Commission, 2001. <http://www.pc.gov.au/inquiry/telecommunications/finalreport/>, accessed 17 July 2003. Spurgeon, Christina. ‘National Culture, Communications and the Information Economy.’ Media International Australia 87 (1998): 23-34. Turner, Graeme. ‘First Contact: Coming to Terms with the Cable Guy.’ UTS Review 3 (1997): 109-21. Winseck, Dwayne. ‘Wired Cities and Transnational Communications: New Forms of Governance for Telecommunications and the New Media.’ In The Handbook of New Media: Social Shaping and Consequences of ICTs, ed. Leah A. Lievrouw and Sonia Livingstone. London: Sage, 2002. 393-409. World Trade Organisation. General Agreement on Trade in Services: Annex on Telecommunications. Geneva: World Trade Organisation, 1994. 17 July 2003 <http://www.wto.org/english/tratop_e/serv_e/12-tel_e.htm>. —. Fourth Protocol to the General Agreement on Trade in Services. Geneva: World Trade Organisation. 17 July 2003 <http://www.wto.org/english/tratop_e/serv_e/4prote_e.htm>. Citation reference for this article (substitute your date of access for Dn Month Year): MLA Style: Goggin, Gerard. "Broadband." M/C: A Journal of Media and Culture <http://www.media-culture.org.au/0308/02-featurebroadband.php>. APA Style: Goggin, G. (2003, Aug 26). Broadband. M/C: A Journal of Media and Culture, 6, <http://www.media-culture.org.au/0308/02-featurebroadband.php>
APA, Harvard, Vancouver, ISO, and other styles
28

Ashton, Daniel. "Digital Gaming Upgrade and Recovery: Enrolling Memories and Technologies as a Strategy for the Future." M/C Journal 11, no. 6 (November 30, 2008). http://dx.doi.org/10.5204/mcj.86.

Full text
Abstract:
Introduction The tagline for the 2008 Game On exhibition at the Australian Centre for the Moving Image in Melbourne invites visitors to “play your way through the history of videogames.” The Melbourne hosting follows on from exhibitions that have included the Barbican (London), the Royal Museum (Edinburgh) and the Science Museum (London). The Game On exhibition presents an exemplary instance of how digital games and digital games culture are recovered, organised and presented. The Science Museum exhibition offered visitors a walkthrough from the earliest to the latest consoles and games (Pong to Wii Sports) with opportunities for game play framed by curatorial plaques. This article will explore some of the themes and narratives embodied within the exhibition that see digital games technologies enrolled within a media teleology that emphasises technological advancement and upgrade. Narratives of Technological Upgrade The Game On exhibition employed a “social contextualisation” approach, connecting digital gaming developments with historical events and phenomena such as the 1969 moon landing and the Spice Girls. Whilst including thematic strands such as games and violence and games in education, the exhibition’s chronological ordering highlighted technological advancement. In doing so, the exhibition captured a broader tension around celebrating past technological advancement in gaming, whilst at the same time emphasising the quaint shortfalls and looking to future possibilities. That technological advancements stand out, particularly as a means of organising a narrative of digital gaming, resonates with Stephen Kline, Nick Dyer-Witheford and Greig de Peuter’s analysis of digital gaming as a “perpetual innovation economy.” For Kline et al., corporations “devote a growing share of their resources to the continual alteration and upgrading of their products” (66). Technological upgrade and advancement were described by the Game On curator as an engaging aspect of the exhibition: When we had a BBC news presenter come in, she was talking about ‘here we have the PDP 1 and here I have the Nintendo DS’. She was just sort of comparing and contrasting. I know certainly that journalists were very keen on: ‘yeah, but how much processing power does the PDP 1 have?’, ‘what does it compare to today?’ – and it is very hard to compare. How do you compare Space War on the PDP 1 with something that runs on your mobile phone? They are very different systems. (Lee) This account of journalistic interest in technological progression and the curator’s subsequent interpretation raise a significant tension around understanding digital gaming. The concern with situating past gaming technologies and comparing capacities and capabilities emphasises both the fascination with advancement and technological progress in the field and how the impressiveness of this advancement depends on remembering what has come before. Questions of remembering, recovering and forgetting are clear in the histories that console manufacturers offer when they describe past innovation and pioneering developments. For example, the company history provided by Nintendo on its website is exclusively a history of games technologies with no reference to the preceding business of playing-card games from the late nineteenth century.
Its website-published history only starts with the 1985 release of the NES (Nintendo Entertainment System), “an instant hit [that] over the course of the next two years, it almost single-handedly revitalized the video game industry” (Nintendo, ‘History’), and thereby overlooks the earlier, less successful 1983 Famicom system. Past technologies are selectively remembered and recovered as part of the foundations for future success. This is a tension that can be unpacked in a number of ways across current industry transformations and strategies that potentially erase the past whilst simultaneously seeking to recover it as part of an evidence-base for future development. The following discussion develops an analysis of how digital gaming history is recovered and constructed. Industry Wind Change and Granny on the Wii There is “unease, almost embarrassment”, James Newman suggests, “about the videogames industry within certain quarters of the industry itself” (6). Newman goes on to suggest: Various euphemisms have passed into common parlance, all seemingly motivated by a desire to avoid the use of the word ‘game’ and perhaps even ‘computer’, thereby adding a veneer of respectability, distancing the products and experiences from the childish pursuits of game, play and toys, and downplaying the technology connection with its unwanted resonances of nerds in bedrooms hunched over ZX Spectrums and Commodore 64s and the amateurism of hobbyist production. (6-7) The attempted move away from the resonances of “nerds in bedrooms” has been a strategic decision for Nintendo especially. This is illustrated by the naming of consoles: ‘family’ in Famicom, ‘entertainment’ in NES and, more recently, the renaming of the Wii from ‘Revolution’. The seventh-generation Nintendo Wii console, released in November and December 2006, may be seen as industry-leading in efforts to broaden gaming demographics. In describing the console, for instance, Satoru Iwata, the President of Nintendo, stated, “we want to appeal to mothers who don't want consoles in their living rooms, and to the elderly and to young women. It’s a challenge, like trying to sell cosmetics to men” (Edge Online). This position illustrates a digital games industry strategy to expand marketing to demographic groups previously marginalised. A few examples from the marketing and advertising campaigns for the Nintendo Wii help to illustrate this strategy. The marketing associated with the Wii can be seen as part of a longer lineage of Nintendo marketing, with Kline et al. suggesting, “it was under Nintendo’s hegemony that the video game industry began to see the systematic development of a high-intensity marketing apparatus, involving massive media budgets, ingenious event marketing, ground breaking advertising and spin off merchandising” (118). The “First Experiences” show on the Wii website mocks up domestic settings as the backdrop to the Wii playing experience to present an ideal, potential Wii-play scenario. These advertisements can be seen to position the player within an imagined home and game-play environment and speak for the Wii. As Keith Grint and Steve Woolgar suggest, “technology does not speak for itself but has to be spoken for” (32). As part of their concern with addressing “the particular regime of truth which surrounds, upholds, impales and represents technology” (32), Grint and Woolgar “analyse the way certain technologies gain specific attributes” (33).
Across advertisements for the Wii there are a range of domestic environments and groups playing. Of these, the power to bring the family together and facilitate ease of game-play for the novice is most noticeable. David Morley’s comment that, “‘hi-tech’ discourse is often carefully framed and domesticated by a rather nostalgic vision of ‘family values’” (438) is borne out here.A television advertisement aired on Nickelodeon illustrates the extent to which the Wii was at the forefront in motioning forward a strategy of industry and gaming inclusiveness around the family: “the 60-second spot shows a dad mistaking the Wii Remote for his television remote control. Dad becomes immersed in the game and soon the whole family joins in” (Nintendo World Report). From confused fathers to family bonding, the Wii is presented as the easy-to-use and accessible device that brings the family together. The father confusing the Wii remote with a television remote control is an important gesture to foreground the accessibility of the Wii remotes compared to previous “joypads”, and emphasize the Wii as an accessible device with no bedroom, technical wizardry required. Within the emerging industry inclusivity agenda, the ‘over technological’ past of digital gaming is something to move away from. The forms of ‘geek’ or ‘hardcore’ that epitomise previous dominant representations of gaming have seemingly stood in the way of the industry reaching its full market potential. This industry wind change is captured in the comments of a number of current industry professionals.For Matthew Jeffrey, head of European Recruitment for Electronic Arts (EA), speaking at the London Games Week Career Fair, the shift in the accessibility and inclusivity of digital gaming is closely bound up with Nintendo’s efforts and these have impacted upon EA’s strategy: There is going to be a huge swathe of new things and the great thing in the industry, as you are all easy to identify, is that Nindento DS and the Wii have revolutionised the way we look at the way things are going on.Jeffrey goes on to add, “hopefully some of you have seen that your eighty year old grandparent is quite happy to play a game”, pointing to the figure of the grandparent as a game-player to emphasise the inclusivity shift within gaming.Similarly, at Edinburgh Interactive Festival 2007, the CEO of Ubisoft Yves Guillemot in his “The New Generation of Gaming: Facing the Challenges of a Changing Market” speech outlined the development of a family friendly portfolio to please a new, non-gamer population that would include the recruitment of subject experts for “non-game” titles. This instance of the accessibility and inclusivity strategy being advocated is notable for it being part of a keynote speech at the Edinburgh Interactive Festival, an event associated with the Edinburgh festival that is both an important industry gathering and receives mainstream press coverage. The approaches taken by the other leading console manufacturers Sony and Microsoft, illustrate that whilst this is by no means a total shift, there is nevertheless an industry-wide engagement. 
The ‘World of Playstation: family and friends’ for example suggests that, “with PlayStation, games have never been more family-friendly” and that “you can even team up as a family to challenge your overseas relatives to a round of online quizzing over the PLAYSTATION Network” (Playstation).What follows from these accounts and transformations is a consideration of where the “geeky” past resides in the future of gaming as inclusive and accessible. Where do these developments leave digital gaming’s “subcultural past” (“subcultural” as it now becomes even within the games industry), as the industry forges on into mainstream culture? Past digital games technologies are clearly important in indicating technological progression and advancement, but what of the bedroom culture of gaming? How does “geek game culture” fit within a maturing future for the industry?Bedroom Programmers and Subcultural Memories There is a tension between business strategy directed towards making gaming accessible and thereby fostering new markets, and the games those in industry would design for people like themselves. This is not to deny the willingness or commitment of games developers to work on a variety of games, but instead to highlight transformation and tension. In their research into games development, Dovey and Kennedy suggest that, the “generation, now nearing middle-age and finding themselves in the driving seat of cultures of new media, have to reconcile a subcultural history and a dominant present” (145). Pierre Bourdieu’s account of symbolic capital is influential in tracing this shift, and Dovey and Kennedy note Bourdieu’s comment around, “the subjective image of the occupational project and the objective function of the occupation” (145). This shift is highly significant for ways of understanding maturation and inclusivity strategies within digital gaming.Bourdieu’s account of the “conservative functions attached” to an occupation for Dovey and Kennedy: Precisely describes the tensions between designers’ sense of themselves as ‘outsiders’ and rebels (‘the subjective image of the occupational project’) on the one hand and their position within a very tight production machine (‘the objective function of the occupation’) on the other. (145) I would suggest the “production machine”, that is to say the broader corporate management structures by which games development companies are increasingly operated, has a growing role in understandings of the industry. This approach was implicit in Iwata’s comments on selling cosmetics to men and broadening demographics, and Jeffrey’s comments pointing to how EA’s outlook would be influenced by the accessibility and inclusivity strategy championed by Nintendo. It may be suggested that as the occupational project of gaming is negotiated and shifts towards an emphasis on accessibility and inclusivity, the subjective image must be similarly reoriented. That previous industry models are being replaced, is highlighted in this excerpt from a Managing Director of a ‘leisure software’ company in the Staying ahead report on the creative industries by the Work Foundation:The first game that came from us was literally two schoolchildren making a game in their bedroom … the game hadn’t been funded, but made for fun … As those days are gone, the biggest challenges nowadays for game developers are finding funding that doesn’t impinge on creativity, and holding onto IP [intellectual property], which is so important if you want a business that is going to have any value. 
(27)This account suggests a hugely important transition from bedroom production, the days that ‘are gone’, towards Intellectual Property-aware production. The creative industries context for these comments should not be overlooked and is insightful for further recognising the shifts and negotiations taking place in digital gaming, notably, around the maturation of the games industry. The creative industries context is made explicit in creative industries reports such as Staying ahead and in the comments of Shaun Woodward (former Parliamentary Under-Secretary of State at the Department for Culture, Media and Sport) in a keynote speech at the 2006 British Video Games Academy Awards, in which he referred to the games industry as “one of our most important creative industries”. The forms of collaboration between, for instance, The Independent Games Developers Association (TIGA) and the Department of Culture, Media and Sport (see Gamasutra), further indicate the creative industries context to the maturation of the UK games industry.The creative industries context also presents the anchor through which tensions between a subcultural history and professional future and the complex forms of recovery can be more fully engaged with. The Game On curator’s indication that making comparisons between different games technologies systems was a delicate balance insightfully provides cautions to any attempt to mark out a strict departure from the ‘subcultural’ to the ‘professional’. Clearly put, the accessibility and inclusivity strategy that shifts away from geek culture and technical wizardry remains in conversation with geek elements as the foundation for the future. As technologies are recovered within a lineage of technological development and upgrade, the geek bedroom culture of gaming is almost mythologized to offer the industry history creative credentials and future potential. Recovering and Combining: Technologies and Memories for a Professional Future Emphasised thus far has been a shift from the days gone by of bedroom programming towards an inclusive and accessible professional and mature future. This is a teleological shift in the sense that the latest technological developments can be located within a past replete with innovation and pioneering spirit. In relation to the Wii for example, a Nintendo employee states:Nintendo is a company where you are praised for doing something different from everyone else. In this company, when an individual wants to do something different, everyone else lends their support to help them overcome any hurdles. I think this is how we made the challenge of Wii a possibility. (Nintendo)Nintendo’s history, alluded to here and implicit throughout the interviews with Nintendo staff from which this comment is taken, and previous and existing ‘culture’ of experimentation is offered here as the catalyst and enabler of the Wii. A further example may be offered in relation to Nintendo’s competitor Sony.A hugely significant transformation in digital gaming, further to the accessibility and inclusivity agenda, is the ability of players to develop their own games using games engines. For Phil Harrison (Sony), gaming technology is creating a, “‘virtual community’ of collaborative digital production, marking a return to the ‘golden age of video game development, which was at home, on your own with a couple of friends, designing a game yourself’” (Kline et al., 204). 
Bedroom gaming that in the earlier comments was regarded as days gone by for professionals, takes on a new significance as a form of user-engagement. The previous model of bedroom production, now outmoded compared to industry production, is relocated as available for users and recovered as the ‘golden age of gaming’. It is recovered as a model for users to aspire to. The significance of this for business strategy is made clear by Kline et al. who suggest that, “thousands of bright bulbs have essentially become Sony’s junior development community” (204). An obsolete model of past production is recovered and deployed within a future vision of the games industry that sees users participating and extending forms of games engagement and consumption. Similarly, the potential of ‘bedroom’ production and its recovery in relation to growth areas such as games for mobile phones, is carefully framed by Intellectual Property Rights (Edwards and Coulton). In this respect, forms of bedroom production are carefully situated in terms of industry strategies.The “Scarce Talent Seminars” as part of the London Games Week 2008 “Skills Week” further illustrate this continual recovery of ‘past’, or more accurately alternative, forms of production in line with narratives of professionalisation and industry innovation. The seminars were stated as offering advice on bridging the gap between the “bedroom programmer” and the “professional developer”. The discourse of ‘talent’ framed this seminar, and the bedroom programmer is held up as being (not having) raw talent with creative energies and love and commitment for gaming that can be shaped for the future of the industry. This discourse of bedroom programmers as talent emphasises the industry as an enabler of individual talent through access to professional development and technological resources. This then sits alongside the recovery of historical narratives in which bedroom gaming culture is celebrated for its pioneering spirit, but is ultimately recovered in terms of current achievements and future possibilities. “Skills Week” and guidance for those wanting to work in the industry connects with the recovery of past technologies and ways of making games visible amongst the potential industry workers of the future – students. The professional future of the industry is intertwined with graduates with professional qualifications. Those qualifications need not be, and sometimes preferably should not be, in ‘gaming’ courses. What is important is the love of games and this may be seen through the appreciation of gaming’s history. During research conducted with games design students in higher education courses in the UK, many students professed a love of games dating back to the Spectrum console in the 1980s. There was legitimacy and evidence of professing long-seated interests in consoles. At the same time as acknowledging the significant, embryonic power these consoles had in stimulating their interests, many students engaged in learning games design skills with the latest software packages. Similarly, they engaged in bedroom design activities themselves, as in the days gone by, but mainly as training and to develop skills useful to securing employment within a professional development studio. Broadly, students could be said to be recovering both technologies and ways of working that are then enrolled within their development as professional workers of the future. 
The professional future of the gaming industry is presented as part of a teleological trajectory that mirrors the technological progression of the industry’s upgrade culture. The days of bedroom programming are cast as periods of incubation and experimentation, and part of the journey that has brought gaming to where it is now. Bedroom programming is incorporated into the evidence-base of creative industries policy reports. Other accounts of bedroom programming, independent production and attempts to explore alternative publishing avenues do not feature as readily.In the 2000 Scratchware Manifesto for example, the authors declare, “the machinery of gaming has run amok”, and say, “Basta! Enough!” (Scratchware). The Scratchware Manifesto puts forward Scratchware as a response: “a computer game, created by a microteam, with pro-quality art, game design, programming and sound to be sold at paperback prices” (Scratchware). The manifesto goes on to say, “we need Scratchware because there is more than one way to develop good computer games” (Scratchware, 2000). Using a term readily associated with the Zapatista Army of National Liberation, the Scratchware Manifesto called for a revolution in gaming and stated, “we will strive for […] originality over the tried and tested” (Scratchware). These are the experiences and accounts of the games industry that seem to fall well outside of the technological and upgrade focused agenda of professional games development.The recovery and framing of past technologies and industry practices, in ways supportive to current models of technological upgrade and advancement, legitimises these models and marginalizes others. A eulogized and potentially mythical past is recovered to point to cultures of innovation and creative vibrancy and to emphasize current and future technological prowess. We must therefore be cautious of the instrumental dangers of recovery in which ‘bright bulbs’ are enrolled and alternative forms of production marginalised.As digital gaming establishes a secure footing with increased markets, the growing pains of the industry can be celebrated and recovered as part of the ongoing narratives of the industry. Recovery is vital to make sense of both the past and future. Within digital gaming, the PDP-1 and the bedroom geek both exist in the past, present and future as part of an industry strategy and trajectory that seeks to move away from them but also relies on them. They are the legitimacy, the evidence and the potential for affirming industry models. The extent to which other narratives can be told and technologies and memories recovered as alternative forms of evidence and potential is a question I, and hopefully others, will leave open.ReferencesDovey, John, and Helen W. Kennedy. Game Cultures. Maidenhead: Open University Press, 2006.Edge-Online. "Iwata: Wii Is 'Like Selling Make-Up to Men.'" Edge-Online 19 Sep. 2006. 29 Sep. 2006 ‹http://www.edge-online.com/news/iwata-wii-like-selling-make-up-men›.Edwards, Reuben, and Paul Coulton. "Providing the Skills Required for Innovative Mobile Game Development Using Industry/Academic Partnerships." Italics e journal 5.3 (2006). ‹http://www.ics.heacademy.ac.uk/italics/vol5iss3/edwardscoulton.pdf›.Gamasutra. "TIGA Pushing for Continued UK Industry Government Support." Gamasutra Industry News 3 July 2007. 8 July 2007 ‹http://www.gamasutra.com/php-bin/news_index.php?story=14504›Grint, Keith, and Steve Woolgar. The Machine at Work. London: Blackwell, 1997.Jeffrey, Matthew. Transcribed Speech. 
24 October 2007.Kline, Stephen, Nick Dyer-Witheford, and Greig De Peuter. Digital Play. London: McGill-Queen’s University Press, 2003.Lee, Gaetan. Personal Interview. 27 July 2007.Morley, David. "What’s ‘Home’ Got to Do with It? Contradictory Dynamics in the Domestication of Technology and the Dislocation of Domesticity." European Journal of Cultural Studies 6.4 (2003): 435-458.Newman, James. Videogames. London: Routledge, 2004.Nintendo. "Company History." Nintendo. 2007. 3 Nov. 2008 ‹http://www.nintendo.com/corp/history.jsp›.Nintendo. "Wii Remote." Nintendo. 2006. 29 Sep. 2008 ‹http://wiiportal.nintendo-europe.com/97.html›.Nintendo World Report. "Nintendo’s Marketing Blitz: Wii Play for All!" Nintendo World Report 13 Nov. 2006. 29 Sep. 2008 ‹http://www.nintendoworldreport.com/newsArt.cfm?artid=12383›.Playstation. "World of Playstation: Family and Friends." Sony Playstation. 3 Nov. 2008 ‹http://uk.playstation.com/home/news/articles/detail/item103208/World-of-PlayStation:-Family-&-Friends/›.Scratchware. "The Scratchware Manifesto." 2000. 14 June 2006 ‹http://www.the-underdogs.info/scratch.php›.Work Foundation. Staying Ahead: The Economic Performance of the UK’s Creative Industries. London: Department of Culture, Media and Sport, 2007.
APA, Harvard, Vancouver, ISO, and other styles
29

Jones, Steve. "Seeing Sound, Hearing Image." M/C Journal 2, no. 4 (June 1, 1999). http://dx.doi.org/10.5204/mcj.1763.

Full text
Abstract:
“As the old technologies become automatic and invisible, we find ourselves more concerned with fighting or embracing what’s new”—Dennis Baron, From Pencils to Pixels: The Stage of Literacy Technologies Popular music is firmly rooted within realist practice, or what has been called the "culture of authenticity" associated with modernism. As Lawrence Grossberg notes, the accelleration of the rate of change in modern life caused, in post-war youth culture, an identity crisis or "lived contradiction" that gave rock (particularly) and popular music (generally) a peculiar position in regard to notions of authenticity. Grossberg places rock's authenticity within the "difference" it maintains from other cultural forms, and notes that its difference "can be justified aesthetically or ideologically, or in terms of the social position of the audiences, or by the economics of its production, or through the measure of its popularity or the statement of its politics" (205-6). Popular music scholars have not adequately addressed issues of authenticity and individuality. Two of the most important questions to be asked are: How is authenticity communicated in popular music? What is the site of the interpretation of authenticity? It is important to ask about sound, technology, about the attempt to understand the ideal and the image, the natural and artificial. It is these that make clear the strongest connections between popular music and contemporary culture. Popular music is a particularly appropriate site for the study of authenticity as a cultural category, for several reasons. For one thing, other media do not follow us, as aural media do, into malls, elevators, cars, planes. Nor do they wait for us, as a tape player paused and ready to play. What is important is not that music is "everywhere" but, to borrow from Vivian Sobchack, that it creates a "here" that can be transported anywhere. In fact, we are able to walk around enveloped by a personal aural environment, thanks to a Sony Walkman.1 Also, it is more difficult to shut out the aural than the visual. Closing one's ears does not entirely shut out sound. There is, additionally, the sense that sound and music are interpreted from within, that is, that they resonate through and within the body, and as such engage with one's self in a fashion that coincides with Charles Taylor's claim that the "ideal of authenticity" is an inner-directed one. It must be noted that authenticity is not, however, communicated only via music, but via text and image. Grossberg noted the "primacy of sound" in rock music, and the important link between music, visual image, and authenticity: Visual style as conceived in rock culture is usually the stage for an outrageous and self-conscious inauthenticity... . It was here -- in its visual presentation -- that rock often most explicitly manifested both an ironic resistance to the dominant culture and its sympathies with the business of entertainment ... . The demand for live performance has always expressed the desire for the visual mark (and proof) of authenticity. (208) But that relationship can also be reversed: Music and sound serve in some instances to provide the aural mark and proof of authenticity. Consider, for instance, the "tear" in the voice that Jensen identifies in Hank Williams's singing, and in that of Patsy Cline. 
For the latter, voicing, in this sense, was particularly important, as it meant more than a singing style, it also involved matters of self-identity, as Jensen appropriately associates with the move of country music from "hometown" to "uptown" (101). Cline's move toward a more "uptown" style involved her visual image, too. At a significant turning point in her career, Faron Young noted, Cline "left that country girl look in those western outfits behind and opted for a slicker appearance in dresses and high fashion gowns" (Jensen 101). Popular music has forged a link with visual media, and in some sense music itself has become more visual (though not necessarily less aural) the more it has engaged with industrial processes in the entertainment industry. For example, engagement with music videos and film soundtracks has made music a part of the larger convergence of mass media forms. Alongside that convergence, the use of music in visual media has come to serve as adjunct to visual symbolisation. One only need observe the increasingly commercial uses to which music is put (as in advertising, film soundtracks and music videos) to note ways in which music serves image. In the literature from a variety of disciplines, including communication, art and music, it has been argued that music videos are the visualisation of music. But in many respects the opposite is true. Music videos are the auralisation of the visual. Music serves many of the same purposes as sound does generally in visual media. One can find a strong argument for the use of sound as supplement to visual media in Silverman's and Altman's work. For Silverman, sound in cinema has largely been overlooked (pun intended) in favor of the visual image, but sound is a more effective (and perhaps necessary) element for willful suspension of disbelief. One may see this as well in the development of Dolby Surround Sound, and in increased emphasis on sound engineering among video and computer game makers, as well as the development of sub-woofers and high-fidelity speakers as computer peripherals. Another way that sound has become more closely associated with the visual is through the ongoing evolution of marketing demands within the popular music industry that increasingly rely on visual media and force image to the front. Internet technologies, particularly the WorldWideWeb (WWW), are also evidence of a merging of the visual and aural (see Hayward). The development of low-cost desktop video equipment and WWW publishing, CD-i, CD-ROM, DVD, and other technologies, has meant that visual images continue to form part of the industrial routine of the music business. The decrease in cost of many of these technologies has also led to the adoption of such routines among individual musicians, small/independent labels, and producers seeking to mimic the resources of major labels (a practice that has become considerably easier via the Internet, as it is difficult to determine capital resources solely from a WWW site). Yet there is another facet to the evolution of the link between the aural and visual. Sound has become more visual by way of its representation during its production (a representation, and process, that has largely been ignored in popular music studies). That representation has to do with the digitisation of sound, and the subsequent transformation sound and music can undergo after being digitised and portrayed on a computer screen. 
Once digitised, sound can be made visual in any number of ways, through traditional methods like music notation, through representation as audio waveform, by way of MIDI notation, bit streams, or through representation as shapes and colors (as in recent software applications particularly for children, like Making Music by Morton Subotnick). The impetus for these representations comes from the desire for increased control over sound (see Jones, Rock Formation) and such control seems most easily accomplished by way of computers and their concomitant visual technologies (monitors, printers). To make computers useful tools for sound recording it is necessary to employ some form of visual representation for the aural, and the flexibility of modern computers allows for new modes of predominately visual representation. Each of these connections between the aural and visual is in turn related to technology, for as audio technology develops within the entertainment industry it makes sense for synergistic development to occur with visual media technologies. Yet popular music scholars routinely analyse aural and visual media in isolation from one another. The challenge for popular music studies and music philosophy posed by visual media technologies, that they must attend to spatiality and context (both visual and aural), has not been taken up. Until such time as it is, it will be difficult, if not impossible, to engage issues of authenticity, because they will remain rootless instead of situated within the experience of music as fully sensual (in some cases even synaesthetic). Most of the traditional judgments of authenticity among music critics and many popular music scholars involve space and time, the former in terms of the movement of music across cultures and the latter in terms of history. None rely on notions of the "situatedness" of the listener or musicmaker in a particular aural, visual and historical space. Part of the reason for the lack of such an understanding arises from the very means by which popular music is created. We have become accustomed to understanding music as manipulation of sound, and so far as most modern music production is concerned such manipulation occurs as much visually as aurally, by cutting, pasting and otherwise altering audio waveforms on a computer screen. Musicians no more record music than they record fingering; they engage in sound recording. And recording engineers and producers rely less and less on sound and more on sight to determine whether a recording conforms to the demands of digital reproduction.2 Sound, particularly when joined with the visual, becomes a means to build and manipulate the environment, virtual and non-virtual (see Jones, "Sound"). Sound & Music As we construct space through sound, both in terms of audio production (e.g., the use of reverberation devices in recording studios) and in terms of everyday life (e.g., perception of aural stimuli, whether by ear or vibration in the body, from points surrounding us), we centre it within experience. Sound combines the psychological and physiological. Audio engineer George Massenburg noted that in film theaters: You couldn't utilise the full 360-degree sound space for music because there was an "exit sign" phenomena [sic]. If you had a lot of audio going on in the back, people would have a natural inclination to turn around and stare at the back of the room. 
(Massenburg 79-80) However, he went on to say, beyond observations of such reactions to multichannel sound technology, "we don't know very much". Research in psychoacoustics being used to develop virtual audio systems relies on such reactions and on a notion of human hardwiring for stimulus response (see Jones, "Sense"). But a major stumbling block toward the development of those systems is that none are able to account for individual listeners' perceptions. It is therefore important to consider the individual along with the social dimension in discussions of sound and music. For instance, the term "sound" is deployed in popular music to signify several things, all of which have to do with music or musical performance, but none of which is music. So, for instance, musical groups or performers can have a "sound", but it is distinguishable from what notes they play. Entire music scenes can have "sounds", but the music within such scenes is clearly distinct and differentiated. For the study of popular music this is a significant but often overlooked dimension. As Grossberg argues, "the authenticity of rock was measured by its sound" (207). Visually, he says, popular music is suspect and often inauthentic (sometimes purposefully so), and it is grounded in the aural. Similarly in country music Jensen notes that the "Nashville Sound" continually evoked conflicting definitions among fans and musicians, but that: The music itself was the arena in and through which claims about the Nashville Sound's authenticity were played out. A certain sound (steel guitar, with fiddle) was deemed "hard" or "pure" country, in spite of its own commercial history. (84) One should, therefore, attend to the interpretive acts associated with sound and its meaning. But why has not popular music studies engaged in systematic analysis of sound at the level of the individual as well as the social? As John Shepherd put it, "little cultural theoretical work in music is concerned with music's sounds" ("Value" 174). Why should this be a cause for concern? First, because Shepherd claims that sound is not "meaningful" in the traditional sense. Second, because it leads us to re-examine the question long set to the side in popular music studies: What is music? The structural homology, the connection between meaning and social formation, is a foundation upon which the concept of authenticity in popular music stands. Yet the ability to label a particular piece of music "good" shifts from moment to moment, and place to place. Frith understates the problem when he writes that "it is difficult ... to say how musical texts mean or represent something, and it is difficult to isolate structures of musical creation or control" (56). Shepherd attempts to overcome this difficulty by emphasising that: Music is a social medium in sound. What [this] means ... is that the sounds of music provide constantly moving and complex matrices of sounds in which individuals may invest their own meanings ... [however] while the matrices of sounds which seemingly constitute an individual "piece" of music can accommodate a range of meanings, and thereby allow for negotiability of meaning, they cannot accommodate all possible meanings. (Shepherd, "Art") It must be acknowledged that authenticity is constructed, and that in itself is an argument against the most common way to think of authenticity. 
If authenticity implies something about the "pure" state of an object or symbol then surely such a state is connected to some "objective" rendering, one not possible according to Shepherd's claims. In some sense, then, authenticity is autonomous, its materialisation springs not from any necessary connection to sound, image, text, but from individual acts of interpretation, typically within what in literary criticism has come to be known as "interpretive communities". It is not hard to illustrate the point by generalising and observing that rock's notion of authenticity is captured in terms of songwriting, but that songwriters are typically identified with places (e.g. Tin Pan Alley, the Brill Building, Liverpool, etc.). In this way there is an obvious connection between authenticity and authorship (see Jones, "Popular Music Studies") and geography (as well in terms of musical "scenes", e.g. the "Philly Sound", the "Sun Sound", etc.). The important thing to note is the resultant connection between the symbolic and the physical worlds rooted (pun intended) in geography. As Redhead & Street put it: The idea of "roots" refers to a number of aspects of the musical process. There is the audience in which the musician's career is rooted ... . Another notion of roots refers to music. Here the idea is that the sounds and the style of the music should continue to resemble the source from which it sprang ... . The issue ... can be detected in the argument of those who raise doubts about the use of musical high-technology by African artists. A final version of roots applies to the artist's sociological origins. (180) It is important, consequently, to note that new technologies, particularly ones associated with the distribution of music, are of increasing importance in regulating the tension between alienation and progress mentioned earlier, as they are technologies not simply of musical production and consumption, but of geography. That the tension they mediate is most readily apparent in legal skirmishes during an unsettled era for copyright law (see Brown) should not distract scholars from understanding their cultural significance. These technologies are, on the one hand, "liberating" (see Hayward, Young, and Marsh) insofar as they permit greater geographical "reach" and thus greater marketing opportunities (see Fromartz), but on the other hand they permit less commercial control, insofar as they permit digitised music to freely circulate without restriction or compensation, to the chagrin of copyright enthusiasts. They also create opportunities for musical collaboration (see Hayward) between performers in different zones of time and space, on a scale unmatched since the development of multitracking enabled the layering of sound. Most importantly, these technologies open spaces for the construction of authenticity that have hitherto been unavailable, particularly across distances that have largely separated cultures and fan communities (see Paul). The technologies of Internetworking provide yet another way to make connections between authenticity, music and sound. Community and locality (as Redhead & Street, as well as others like Sara Cohen and Ruth Finnegan, note) are the elements used by audience and artist alike to understand the authenticity of a performer or performance. The lived experience of an artist, in a particular nexus of time and space, is to be somehow communicated via music and interpreted "properly" by an audience. 
But technologies of Internetworking permit the construction of alternative spaces, times and identities. In no small way that has also been the situation with the mediation of music via most recordings. They are constructed with a sense of space, consumed within particular spaces, at particular times, in individual, most often private, settings. What the network technologies have wrought is a networked audience for music that is linked globally but rooted in the local. To put it another way, the range of possibilities when it comes to interpretive communities has widened, but the experience of music has not significantly shifted, that is, the listener experiences music individually, and locally. Musical activity, whether it is defined as cultural or commercial practice, is neither flat nor autonomous. It is marked by ever-changing tastes (hence not flat) but within an interpretive structure (via "interpretive communities"). Musical activity must be understood within the nexus of the complex relations between technical, commercial and cultural processes. As Jensen put it in her analysis of Patsy Cline's career: Those who write about culture production can treat it as a mechanical process, a strategic construction of material within technical or institutional systems, logical, rational, and calculated. But Patsy Cline's recording career shows, among other things, how this commodity production view must be linked to an understanding of culture as meaning something -- as defining, connecting, expressing, mattering to those who participate with it. (101) To achieve that type of understanding will require that popular music scholars understand authenticity and music in a symbolic realm. Rather than conceiving of authenticity as a limited resource (that is, there is only so much that is "pure" that can go around), it is important to foreground its symbolic and ever-changing character. Put another way, authenticity is not used by musician or audience simply to label something as such, but rather to mean something about music that matters at that moment. Authenticity therefore does not somehow "slip away", nor does a "pure" authentic exist. Authenticity in this regard is, as Baudrillard explains concerning mechanical reproduction, "conceived according to (its) very reproducibility ... there are models from which all forms proceed according to modulated differences" (56). Popular music scholars must carefully assess the affective dimensions of fans, musicians, and also record company executives, recording producers, and so on, to be sensitive to the deeply rooted construction of authenticity and authentic experience throughout musical processes. Only then will there emerge an understanding of the structures of feeling that are central to the experience of music. Footnotes For analyses of the Walkman's role in social settings and popular music consumption see du Gay; Hosokawa; and Chen. It has been thus since the advent of disc recording, when engineers would watch a record's grooves through a microscope lens as it was being cut to ensure grooves would not cross over one into another. References Altman, Rick. "Television/Sound." Studies in Entertainment. Ed. Tania Modleski. Bloomington: Indiana UP, 1986. 39-54. Baudrillard, Jean. Symbolic Death and Exchange. London: Sage, 1993. Brown, Ronald. Intellectual Property and the National Information Infrastructure: The Report of the Working Group on Intellectual Property Rights. Washington, DC: U.S. Department of Commerce, 1995. Chen, Shing-Ling. 
"Electronic Narcissism: College Students' Experiences of Walkman Listening." Annual meeting of the International Communication Association. Washington, D.C. 1993. Du Gay, Paul, et al. Doing Cultural Studies. London: Sage, 1997. Frith, Simon. Sound Effects. New York: Pantheon, 1981. Fromartz, Steven. "Starts-ups Sell Garage Bands, Bowie on Web." Reuters newswire, 4 Dec. 1996. Grossberg, Lawrence. We Gotta Get Out of This Place. London: Routledge, 1992. Hayward, Philip. "Enterprise on the New Frontier." Convergence 1.2 (Winter 1995): 29-44. Hosokawa, Shuhei. "The Walkman Effect." Popular Music 4 (1984). Jensen, Joli. The Nashville Sound: Authenticity, Commercialisation and Country Music. Nashville, Vanderbilt UP, 1998. Jones, Steve. Rock Formation: Music, Technology and Mass Communication. Newbury Park, CA: Sage, 1992. ---. "Popular Music Studies and Critical Legal Studies" Stanford Humanities Review 3.2 (Fall 1993): 77-90. ---. "A Sense of Space: Virtual Reality, Authenticity and the Aural." Critical Studies in Mass Communication 10.3 (Sep. 1993), 238-52. ---. "Sound, Space & Digitisation." Media Information Australia 67 (Feb. 1993): 83-91. Marrsh, Brian. "Musicians Adopt Technology to Market Their Skills." Wall Street Journal 14 Oct. 1994: C2. Massenburg, George. "Recording the Future." EQ (Apr. 1997): 79-80. Paul, Frank. "R&B: Soul Music Fans Make Cyberspace Their Meeting Place." Reuters newswire, 11 July 1996. Redhead, Steve, and John Street. "Have I the Right? Legitimacy, Authenticity and Community in Folk's Politics." Popular Music 8.2 (1989). Shepherd, John. "Art, Culture and Interdisciplinarity." Davidson Dunston Research Lecture. Carleton University, Canada. 3 May 1992. ---. "Value and Power in Music." The Sound of Music: Meaning and Power in Culture. Eds. John Shepherd and Peter Wicke. Cambridge: Polity, 1993. Silverman, Kaja. The Acoustic Mirror. Bloomington: Indiana UP, 1988. Sobchack, Vivian. Screening Space. New York: Ungar, 1982. Young, Charles. "Aussie Artists Use Internet and Bootleg CDs to Protect Rights." Pro Sound News July 1995. Citation reference for this article MLA style: Steve Jones. "Seeing Sound, Hearing Image: 'Remixing' Authenticity in Popular Music Studies." M/C: A Journal of Media and Culture 2.4 (1999). [your date of access] <http://www.uq.edu.au/mc/9906/remix.php>. Chicago style: Steve Jones, "Seeing Sound, Hearing Image: 'Remixing' Authenticity in Popular Music Studies," M/C: A Journal of Media and Culture 2, no. 4 (1999), <http://www.uq.edu.au/mc/9906/remix.php> ([your date of access]). APA style: Steve Jones. (1999) Seeing Sound, Hearing Image: "Remixing" Authenticity in Popular Music Studies. M/C: A Journal of Media and Culture 2(4). <http://www.uq.edu.au/mc/9906/remix.php> ([your date of access]).
APA, Harvard, Vancouver, ISO, and other styles