Follow this link to see other types of publications on the topic: E-content credibility.

Journal articles on the topic "E-content credibility"

Create a precise citation in APA, MLA, Chicago, Harvard and other styles


Consult the top 50 journal articles for your research on the topic "E-content credibility".

Next to each source in the reference list there is an "Add to bibliography" button. Click this button and we will automatically generate the bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Vancouver, Chicago, etc.

You can also download the full text of the academic publication in PDF format and read its abstract online whenever it is available in the metadata.

Browse journal articles on a wide variety of disciplines and organize your bibliography correctly.

1

Tashtoush, Yahya M., Aisha Zaidan and Izzat M. Alsmadi. "Implications for Website Trust and Credibility Assessment". International Journal of E-Entrepreneurship and Innovation 3, no. 4 (October 2012): 17–33. http://dx.doi.org/10.4018/jeei.2012100102.

Full text
Abstract
With the expansion of Internet services to cover almost all areas once dominated by traditional face-to-face and location-based businesses, one of the major challenges is security and its related concerns. Customers or users need to trust the websites they visit in terms of their information or content. This research proposes a new formula, called XD TRank, for evaluating the credibility of websites. A case study of 40 selected websites in Jordan is used to assess the proposed credibility metric. The metrics required to assess the credibility of websites and pages were collected and evaluated on the basis of 25 existing metrics, and a model was built in SPSS using stepwise linear regression analysis to predict XD TRank. Results showed that a broad range of metrics affect the credibility of a website or webpage, and that their impact varies in significance for the trust rank metric. For e-business in particular, trust rank metrics can be used as part of quality assurance and auditing processes. They can be important assets for users, helping them distinguish known, popular and reliable e-commerce websites from spammers or websites that try to trick novice users. Trust rank can also be displayed like a logo on all website pages to alert users if they are redirected to phishing pages.
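The abstract above describes selecting among 25 candidate site metrics with stepwise linear regression (in SPSS) to predict the XD TRank credibility score. As an illustration only, with synthetic data rather than the paper's metrics, a minimal forward stepwise selection might look like this:

```python
import numpy as np

def forward_stepwise(X, y, max_features=3):
    """Greedy forward selection: at each step add the predictor column
    that most reduces the residual sum of squares of an OLS fit
    (intercept always included); stop when nothing improves the fit."""
    n, p = X.shape

    def rss(cols):
        A = np.column_stack([np.ones(n)] + [X[:, c] for c in cols])
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)
        resid = y - A @ beta
        return float(resid @ resid)

    selected, remaining = [], list(range(p))
    while remaining and len(selected) < max_features:
        best = min(remaining, key=lambda c: rss(selected + [c]))
        if rss(selected + [best]) >= rss(selected):
            break  # no candidate improves the fit
        selected.append(best)
        remaining.remove(best)
    return selected

# Synthetic data: 5 candidate "site metrics"; the score actually
# depends only on metrics 0 and 2.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = 2.0 * X[:, 0] - 1.5 * X[:, 2] + rng.normal(scale=0.1, size=200)
print(sorted(forward_stepwise(X, y, max_features=2)))
```

SPSS's stepwise procedure additionally uses F-to-enter/F-to-remove significance thresholds; the greedy RSS criterion here is a simplified stand-in.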
2

Maggio, Lauren A., Melinda Krakow and Laura L. Moorhead. "‘There were some clues’: a qualitative study of heuristics used by parents of adolescents to make credibility judgements of online health news articles citing research". BMJ Open 10, no. 8 (August 2020): e039692. http://dx.doi.org/10.1136/bmjopen-2020-039692.

Full text
Abstract
Objective: To identify how parents judge the credibility of online health news stories with links to scientific research. Design: This qualitative study interviewed parents who read online stories about e-cigarettes and human papillomavirus (HPV) vaccination published by top-tier US news organisations. Researchers asked participants to describe elements of a story that influenced their judgement about content credibility. Researchers analysed transcripts using inductive and deductive techniques. Deductive analysis drew on cognitive heuristics previously identified as being used by the public to judge online health information. Inductive analysis allowed the emergence of new heuristics, especially relating to health. Setting: The US National Cancer Institute's Audience Research Lab in Maryland, in August–November 2018. Participants: Sixty-four parents with at least one child between the ages of 9 and 17 residing in Maryland, Virginia, or the District of Columbia participated. Researchers randomly assigned 31 parents to the HPV vaccination story and 33 to the e-cigarette story. Results: Evidence of existing heuristics, including reputation, endorsement, consistency, self-confirmation, expectancy violation and persuasive intent, emerged from the interviews, with participants deeming stories credible when they mentioned physicians (reputation heuristic) and/or were consistent with information provided by personal physicians (consistency heuristic). Participants also described making credibility judgements based on the presence of statistics, links to scientific research and their general feelings about the news media. Participants reported that the presence of statistics and links increased the credibility of a news story, whereas their feelings about the news media decreased their credibility judgement. Conclusions: Parents used a constellation of heuristics to judge the credibility of online health news stories. Previously identified heuristics for online health information are also applicable in the context of health news stories. The findings have implications for initiatives in education, health communication and journalism directed towards increasing the public's engagement with health news and their credibility judgements.
3

Owusu, Richard A., Crispin M. Mutshinda, Imoh Antai, Kofi Q. Dadzie and Evelyn M. Winston. "Which UGC features drive web purchase intent? A spike-and-slab Bayesian Variable Selection Approach". Internet Research 26, no. 1 (February 1, 2016): 22–37. http://dx.doi.org/10.1108/intr-06-2014-0166.

Full text
Abstract
Purpose – The purpose of this paper is to identify user-generated content (UGC) features that determine web purchase decision making. Design/methodology/approach – The authors embed a spike-and-slab Bayesian variable selection mechanism into a logistic regression model to identify the UGC features that are critical to web purchase intent. This enables a highly reliable analysis of survey data. Findings – The results indicate that the web purchase decision is driven by the relevance, up-to-dateness and credibility of the UGC information content. Research limitations/implications – The results show that the characteristics of UGC are seen as positive and that the medium enables consumers to sort information and concentrate on aspects of the message that are similar to traditional word-of-mouth (WOM). One important implication is the relative importance of credibility, which had previously been hypothesized to be lower in the electronic word-of-mouth (e-WOM) context. The results show that consumers consider credibility important, as improved technology provides more possibilities to assess that factor. A limitation is that the data are not fully representative of the general population, but the Bayesian method provides high analytical quality. Practical implications – The study shows that UGC impacts consumer online purchase intentions. Marketers should understand the wide range of media that provide UGC, and they should concentrate on the relevance, up-to-dateness and credibility of the product information that they provide. Originality/value – The analytical quality of the spike-and-slab Bayesian method suggests a new way of understanding the impact of aspects of UGC on consumers.
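The paper embeds spike-and-slab variable selection in a logistic regression fitted with Bayesian machinery, which is too involved to reproduce here. As a simplified, self-contained stand-in that conveys the same idea, the following sketch computes posterior inclusion probabilities for a Gaussian outcome by enumerating all predictor subsets, weighting each model by exp(-BIC/2) times an independent Bernoulli inclusion prior (all data synthetic, not the authors' survey):

```python
import itertools
import numpy as np

def inclusion_probabilities(X, y, prior_inclusion=0.5):
    """Posterior inclusion probability of each predictor under a
    spike-and-slab-style prior: enumerate every subset of columns,
    weight each model by exp(-BIC/2) times an independent
    Bernoulli(prior_inclusion) prior on inclusion, then sum the
    normalized weights of the models that contain each predictor."""
    n, p = X.shape
    log_w, subsets = [], []
    for k in range(p + 1):
        for subset in itertools.combinations(range(p), k):
            A = np.column_stack([np.ones(n)] + [X[:, j] for j in subset])
            beta, *_ = np.linalg.lstsq(A, y, rcond=None)
            rss = float(np.sum((y - A @ beta) ** 2))
            bic = n * np.log(rss / n) + A.shape[1] * np.log(n)
            log_prior = (k * np.log(prior_inclusion)
                         + (p - k) * np.log(1.0 - prior_inclusion))
            log_w.append(-0.5 * bic + log_prior)
            subsets.append(subset)
    w = np.exp(np.array(log_w) - max(log_w))  # log-sum-exp for stability
    w /= w.sum()
    incl = np.zeros(p)
    for wi, s in zip(w, subsets):
        for j in s:
            incl[j] += wi
    return incl

# Synthetic data: only features 0 and 2 actually drive the outcome.
rng = np.random.default_rng(1)
X = rng.normal(size=(150, 5))
y = 2.0 * X[:, 0] + 1.5 * X[:, 2] + rng.normal(scale=0.5, size=150)
print(np.round(inclusion_probabilities(X, y), 3))
```

Exhaustive enumeration is feasible only for a handful of predictors; the paper's approach samples the inclusion indicators instead, which scales to larger feature sets.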
4

Park, Jae-Jin and Fritz Cropp. "An Exploratory Study of Marketers’ Perceptions of the Internet". Communication and Culture in Korea 13, no. 1 (June 6, 2003): 119–39. http://dx.doi.org/10.1075/japc.13.1.08par.

Full text
Abstract
The Internet has dramatically changed the way organizations communicate with consumers. This study examines South Korean marketers’ perceptions of reciprocal communication forms (e.g., e-mail, comments, chat with the webmaster, bulletin boards, and surveys) in terms of extent of use, marketing cost reduction, usefulness, informativeness, credibility, barriers, and predicted future use. A survey of Korean marketers found that the content of consumer feedback is more important than the Internet form in which it is delivered. Even marketers using at least one of the reciprocal communication forms question the credibility of the information they receive and note the lack of response. These phenomena may reflect that those forms have not yet reached the stage where they are seen as useful replacements for traditional communication with consumers. Managerial implications are discussed.
5

Imlawi, Jehad. "E-WOM Adoption and Sharing Behavior in Social Network Sites: The Impact of Engagement in SNSs". International Business Research 10, no. 6 (May 13, 2017): 87. http://dx.doi.org/10.5539/ibr.v10n6p87.

Full text
Abstract
Social network sites (SNSs) are becoming a credible source of online information. Despite the increasing use of social networks in the message-persuasion literature, there is still a need to investigate the role they can play in users’ adoption of online information and the impact on users' intention to share that information. This research uses the peripheral route of the elaboration likelihood model to investigate the impact of source credibility on engagement in SNSs and on e-WOM adoption, the impact of engagement in SNSs and recommendation rating on e-WOM adoption, and the impact of e-WOM adoption on sharing behavioral intention. The findings suggest that factors not directly related to the online message content, such as source credibility, recommendation rating, and online users' engagement in SNS groups, positively impact online information adoption by SNS users and their intention to share this information. The study concludes with theoretical implications and with strategies firms can use to adjust their online activities in order to improve their customers’ engagement and their customers' adoption of information about the firms' products and services.
6

Nosita, Firda and Tina Lestari. "The Influence of User Generated Content and Purchase Intention on Beauty Products". GATR Journal of Management and Marketing Review 4, no. 3 (September 21, 2019): 171–83. http://dx.doi.org/10.35609/jmmr.2019.4.3(2).

Full text
Abstract
Objective – The rise of social media enables everyone to share their purchase and consumption experiences, including with beauty products. The study aims to determine whether attitude towards UGC, perceived credibility and user activity around UGC on YouTube influence purchase intention toward a beauty product. Methodology/Technique – Questionnaires were distributed online to 200 people who had watched beauty product review videos on YouTube at least once and were at least 18 years old. The data were analyzed using multiple regression. Findings – The results indicate that attitudes towards UGC content on YouTube and perceived credibility affect purchase intentions, whereas user activities do not correlate with purchase intentions for beauty products. UGC content usually provides information as well as tips and tricks about using beauty products. The more attractive the content, the more people want to see it and the more likely they are to use it to fulfill their information needs. Beauty vloggers are considered more credible than producer-generated content. Activities such as searching, liking, subscribing or commenting do not necessarily indicate purchase intentions; they simply represent people fulfilling their social need to interact with each other. Novelty – Companies could provide training or facilities for UGC creators in order to create more attractive content. The most important finding of this study is that companies should continually improve the quality of their products, because the credibility of content makers relies on their experience with the products themselves. Marketers should monitor community discussions to learn more about public interest in their products. In addition, marketers can identify the shortcomings of their products by reviewing comments on UGC. Type of Paper: Empirical.
Keywords: User Generated Content (UGC); Beauty Vlogger; Beauty Product; E-WoM; YouTube. Reference to this paper should be made as follows: Nosita, F.; Lestari, T. 2019. The Influence of User Generated Content and Purchase Intention on Beauty Products, J. Mgt. Mkt. Review 4 (3) 171–183. https://doi.org/10.35609/jmmr.2019.4.3(2) JEL Classification: M31, M37, M39.
7

Träsel, Marcelo, Sílvia Lisboa and Giulia Reis Vinciprova. "Post-truth and trust in journalism: an analysis of credibility indicators in Brazilian venues". Brazilian Journalism Research 15, no. 3 (December 30, 2019): 452–73. http://dx.doi.org/10.25200/bjr.v15n3.2019.1211.

Full text
Abstract
The terms ‘fake news’ and ‘post-truth’ have been used to describe the augmented dissemination potential of misinformation in digital networks in the second decade of the 2000s. In Brazil, different actors have been exploiting digital social networks for political purposes, disseminating content that imitates legitimate journalistic material, often obtaining better audience metrics than the news stories published by mainstream media. This article is divided into two parts. First, it defines the term pseudojournalism to classify fraudulent texts that use journalistic narrative resources to deceive the audience. Second, it presents the results of an analysis of the 23 political content producers with the greatest audience on Facebook in Brazil, based on the credibility indicators developed by Projeto Credibilidade (Trust Project). The results suggest that, in the current scenario, it is not possible to distinguish quality journalism from pseudojournalism based on the characteristics of the websites and articles published by political content producers.
8

Sturloni, Giancarlo and Nico Pitrelli. "Conflicting interests: research, profits, information, health". Journal of Science Communication 03, no. 01 (March 21, 2004): F01. http://dx.doi.org/10.22323/2.03010901.

Full text
Abstract
On 15 September 2001, thirteen major international journals, coordinated by the International Committee of Medical Journal Editors (ICMJE), published a joint editorial titled "Sponsorship, authorship, and accountability". Unfortunately, coming only four days after the tragedy of 9/11, it found no room in the media alongside other news. In the scientific world, however, the content of that editorial set off an alarm: conflicts of interest undermine the objectivity of biomedical research and the credibility of the international journals that vouch for the quality of that research. (Translated by Andrea Cavatorti, Scuola Superiore di Lingue Moderne per Interpreti e Traduttori, Trieste, Italy.)
9

Khafajeh, Hayel and Issam Jebreen. "A Proposed Assessment Criterion for E-Learning Sites Evaluation: An Experts’ Opinion". Computer and Information Science 9, no. 4 (October 25, 2016): 37. http://dx.doi.org/10.5539/cis.v9n4p37.

Full text
Abstract
<p>The growing number of e-learning sites at universities and other educational institutions has made it necessary to develop and adopt standard elements for assessing these sites to ensure their efficiency, competitiveness, and educational quality. This study therefore proposes assessment criteria for evaluating e-learning sites, as a guide for decision-makers when purchasing and developing e-learning sites, such that the criteria are commensurate with the learning process. The opinions of expert university professors who specialize in teaching at different Jordanian universities were gathered to develop the proposed assessment criteria. The result shows that the assessment criteria for evaluating e-learning sites comprise twenty-six criteria under five main categories: website design, scientific knowledge content, technical elements, operational elements, and credibility of the information on the sites. The proposed assessment criteria serve as a guide for students, teachers, owners and developers regarding the benefits of e-learning sites.</p>
10

Kravchuk, M. "The concept of knowledge credibility and problem of uncertainty in empirical sociological study". Bulletin of Taras Shevchenko National University of Kyiv. Sociology, no. 7 (2016): 43–47. http://dx.doi.org/10.17721/2413-7979/7.118.

Full text
Abstract
The article considers the problem of the credibility of knowledge in empirical sociological research. The author refines the concept of credibility, analyzing it as an indicator of true knowledge. Building on the work of researchers such as E. Borel, P. Kopnin and V. Volovich, the author determines the relation between probability and reliability as characteristics of the evidential status of knowledge. Particular attention is paid to the relation between the concepts of uncertainty and certainty of knowledge. The content and scope of the notions of error and uncertainty are also clarified. On the basis of the theoretical work of A. Seidel, Y. Kemnits and L. Brillouin, the causes of errors are analyzed and classified. The author analyzes the concepts of elementary and complex error and points to the additive nature of error. The publication notes that the notion of error refers to the incompleteness of knowledge related to the data, whereas uncertainty covers the entire scope of the incompleteness of new knowledge resulting from the study as a whole and cannot be reduced to the sum of complex errors. The concept of uncertainty is considered among the indicators relevant to the probabilistic characterization of the evidential status of knowledge. The methodological background of the research is V. Stepin's concept of post-non-classical science.
11

Lehkyy, Oleh and Oleksandra Martsinkovska. "NATIVE ADVERTISING AS A PRIORITY WAY OF CONTENT STRATEGY MANAGEMENT". Regional’ni aspekti rozvitku produktivnih sil Ukraїni, no. 24 (2019): 53–62. http://dx.doi.org/10.35774/rarrpsu2019.24.053.

Full text
Abstract
The article outlines the current state of the management of communication channels on the Internet, especially the PPC (Pay Per Click) model. Nowadays this process is characterized by the prevalence of the consumer value of messages, which occurs in the context of content marketing; in turn, the article highlights how content marketing initiatives compete with more traditional means (contextual and display advertising). Modern principles and requirements for a company's content strategy are systematized, based primarily on Google's assessment recommendations and the quality of the website visitor's experience. These include the profile and authority of the author, specifications regarding video on web pages, the "Your Money or Your Life" criteria (which focus on goods and services that may influence the welfare or health of the consumer or visitor of the web page), the author's reputation, and the E-A-T criterion (Expertise, Authoritativeness, Trustworthiness). The article identifies the essence of native advertising as a new hybrid digital marketing tool that combines the consumer usefulness of the material with the commercial component of the advertisement module to achieve tactical goals (generating inbound traffic, increasing campaign reach) and strategic goals (applying the image-building and, in part, reminder functions of this tool). It highlights the basic factors of success for native advertising under modern conditions of competition between information portals and content projects in social networks, and suggests a method for calculating native advertising performance that includes several indicators of the returns on specific native advertising. Finally, tendencies in the development of native advertising and ways of determining its effectiveness are outlined.
12

Geraldo, Denilson. "O processo canônico sobre os delitos contra menores". Revista Eclesiástica Brasileira 72, no. 287 (February 15, 2019): 604. http://dx.doi.org/10.29386/reb.v72i287.853.

Full text
Abstract
News of crimes against minors involving clerics has been strongly reflected in the press. In recent times, the ecclesiastical legislation on the subject has been refined in two directions: to ensure the fulfilment of justice and the protection of victims, and to ensure the right of defense of accused clergy. A report of an offence to the Ordinary must meet the following requirements: the content of an offence against the sixth commandment and the credibility of the complaint. The initial act of the Ordinary on learning of the offence is to protect minors, so prudence is always the first virtue, along with safeguarding the good reputation of the cleric being investigated. In this sense, the Church has developed rules on the method of the preliminary investigation, which may result in the removal of the accused from sacred ministry and the initiation of administrative or judicial penal proceedings. However, the lack of credibility of the accusation can lead to the closure and filing of the complaint. In fact, the canonical process is an instrument of justice, and all involved in this ecclesial work are called to witness to charity.
13

Barbosa, Maria Luciene Sampaio and Vilso Junior Santi. "A INTENCIONALIDADE NAS NOTÍCIAS FALSAS: A NOTA DE REPÚDIO COMO ESTRATÉGIA DE DEFESA DO JORNALISMO NA ERA DAS FAKES NEWS". Aturá - Revista Pan-Amazônica de Comunicação 3, no. 3 (September 1, 2019): 93–109. http://dx.doi.org/10.20873/uft.2526-8031.2019v3n3p93.

Full text
Abstract
This article uses a case study to discuss the use of the Note of Repudiation as a strategy of the online newspaper Roraima em Tempo for defending and restoring the credibility of the news produced and disseminated by digital journalism in the era of fake news. With the release of the repudiation note, the question arose as to whether fake news causes concern and undermines the credibility of online journalism. Analyzing this defense and repudiation mechanism used by the newspaper opens the discussion about the intentionality of the fake news disseminated on the web. The analysis draws on the thought of Norbert Elias and John L. Scotson (2000), who deal with power relations, and Pierre Lévy (2003; 2007), who points to the changes in the way of communicating and in the relations brought about by cyberspace. The note of repudiation in the newspaper Roraima em Tempo made room for a latent discussion about the proliferation of fake news, forcing the media outlet to use defense strategies to reaffirm that the content it conveys is true and deserves credibility. KEYWORDS: Intentionality in the news; Fake news; Note of repudiation.
14

Gomes, Almiralva Ferraz, Beatriz Rodrigues Silva Bockorni, Aline Záide Pinheiro Matos Santos and Kelliane De Jesus Nascimento. "As contribuições da Análise de Conteúdo e do Discurso para os estudos em Administração". Revista Foco 13, no. 1 (March 10, 2020): 146. http://dx.doi.org/10.28950/1981-223x_revistafocoadm/2020.v13i1.695.

Full text
Abstract
This article aims to analyze the contributions that Content Analysis and Discourse Analysis bring to management studies, in view of the increasing adoption of each of them, mainly as forms of textual analysis in the scientific production of the area. To this end, publications from 2016 and 2017 were collected and analyzed in the Scielo and ANPAD databases, chosen for the credibility each of them holds in academia and, above all, in Management. In these databases, a total of 61 articles were identified and analyzed. From a methodological point of view, this research is therefore bibliographic and adopted a qualitative approach, since it did not give statistical treatment to the collected data and was restricted to analyzing the contributions of academic production in the period. The survey and analysis show that Content Analysis and Discourse Analysis encompass a variety of approaches, which allows texts to be fully understood, as well as an understanding of how a text is embedded in and perceived by society.
15

Baseman, Janet, Debra Revere, Ian Painter, Mark Oberle, Jeffrey Duchin, Hanne Thiede, Randall Nett, Dorothy MacEachern and Andy Stergachis. "A Randomized Controlled Trial of the Effectiveness of Traditional and Mobile Public Health Communications With Health Care Providers". Disaster Medicine and Public Health Preparedness 10, no. 1 (December 22, 2015): 98–107. http://dx.doi.org/10.1017/dmp.2015.139.

Full text
Abstract
Abstract. Objectives: Health care providers play an essential role in public health emergency preparedness and response. We conducted a 4-year randomized controlled trial to systematically compare the effectiveness of traditional and mobile communication strategies for sending time-sensitive public health messages to providers. Methods: Subjects (N=848) included providers who might be leveraged to assist with emergency preparedness and response activities, such as physicians, pharmacists, nurse practitioners, physician's assistants, and veterinarians. Providers were randomly assigned to a group that received time-sensitive quarterly messages via e-mail, fax, or cell phone text messaging (SMS) or to a no-message control group. Follow-up phone interviews elicited information about message receipt, topic recall, and perceived credibility and trustworthiness of message and source. Results: Our main outcome measures were awareness and recall of message content, which were compared across delivery methods. Per-protocol analysis revealed that e-mail messages were recalled at a higher rate than were messages delivered by fax or SMS, whereas the as-treated analysis found that the e-mail and fax groups had similar recall rates and both had higher recall rates than the SMS group. Conclusions: This is the first study to systematically evaluate the relative effectiveness of public health message delivery systems. Our findings provide guidance to improve public health agency communications with providers before, during, and after a public health emergency. (Disaster Med Public Health Preparedness. 2016;10:98–107)
16

Rager, Jessica L., Julie Cavallario, Dorice A. Hankemeier, Cailee E. Welch Bacon y Stacy E. Walker. "The Preparation and Development of Preceptors in Professional Graduate Athletic Training Programs". Athletic Training Education Journal 14, n.º 3 (1 de julio de 2019): 156–66. http://dx.doi.org/10.4085/1403156.

Context As professional athletic training programs transition to the graduate level, administrators will need to prepare preceptors to teach advanced learners. Currently, preceptor development is variable among programs and ideal content has yet to be identified. Exploring the development of preceptors teaching graduate learners can lead to an understanding of effective preceptorships. Objective To explore graduate professional athletic training program administrators' (ie, program directors', clinical education coordinators') experiences preparing and implementing preceptor development. Design Consensual qualitative research. Setting Individual phone interviews. Patients or Other Participants Eighteen program administrators (11 women, 7 men; 5.92 ± 4.19 years of experience; 17 clinical education coordinators, 1 program director). Participants were recruited and interviewed until data saturation was achieved. Main Outcome Measure(s) Interviews were conducted using a semistructured interview guide, and were recorded and transcribed verbatim. Data were analyzed by a 4-person research team and coded into themes and categories based on a consensus process. Credibility was established by using multiple researchers, an external auditor, and member checks. Results Participants reported the delivery of preceptor development occurs formally (eg, in person, online) and informally (eg, phone calls, e-mail). The content typically included programmatic policies, expectations of preceptors, clinical teaching methods, and new clinical skills that had been added to the curriculum. Adaptations to content were made depending on several factors, including experience level of preceptors, years precepting with a specific program, and geographical location of the program. The process of determining content involved obtaining feedback from program stakeholders when planning future preceptor development. Conclusions Complex decision making occurs during planning of preceptor development. 
Preceptor development is modified based on programmatic needs, stakeholder feedback, and the evolution of professional education. Future research should explore the challenges associated with developing preceptors, and which aspects of preceptor development are effective at facilitating student learning and readiness for clinical practice.
17

Sayogo, Djoko Sigit. "SOCIAL MEDIA AND USERS’ PARTICIPATION: IDENTIFYING THE ATTRIBUTES AND ITS IMPACT ON INFORMATION POLICY". Jurnal Reviu Akuntansi dan Keuangan 9, n.º 1 (2 de mayo de 2019): 114. http://dx.doi.org/10.22219/jrak.v9i1.8324.

Social media has grown remarkably and transformed communication mechanisms, raising hopes of better and higher-quality user participation. This paper therefore studies the impact of the determinants of social media usage on information policies, in particular information stewardship and usefulness. A combination of literary analysis, descriptive statistics of data from PEW Internet, and content analysis of interviews was used to support the arguments. The study identifies five challenges related to the technological and user-related aspects of social media usage: a) clarity of users' identity; b) conflict between information ownership and confidentiality; c) distinguishing public from private information; d) definition of records and creation of metadata; and e) longevity of data and retention of records. Furthermore, the study proposes four critical factors that should be considered in the design and development of information policy governing the stewardship and use of information: a) information credibility, integrity, and accuracy; b) security, privacy, and confidentiality of information; c) ensuring information access and availability; and d) data sharing and public-private partnership.
18

Rahmat, Abdul. "EVALUASI PROGRAM PENDIDIKAN KESETARAAN PAKET B UNTUK MENDUKUNG WAJAR DIKDAS 9 TAHUN DI KABUPATEN GORONTALO". JIV 6, n.º 2 (30 de diciembre de 2011): 189–201. http://dx.doi.org/10.21009/jiv.0602.9.

This research describes the implementation and evaluation of the Paket B equivalency education program supporting the 9-year compulsory basic education (wajar dikdas) in Gorontalo regency. The design is qualitative, using a phenomenological approach and a multi-case design across two settings with different backgrounds. This approach was chosen because (1) the research was conducted in natural settings across two different cases; (2) it uses the human researcher as the key instrument; and (3) it focuses on process rather than results. The multi-case design follows the view that a multi-case study examines in detail two or more settings with different characteristics, whether subjects, documents, or events. Data were collected through snowball sampling using (1) in-depth interviews, (2) participant observation, and (3) document study. Informants were selected purposively with source triangulation, and the credibility, dependability, and confirmability of the data were then evaluated. Data analysis included (1) individual case analysis and (2) cross-case analysis. The results cover (1) program content: (a) curriculum-centered, (b) applied to skills, (c) program goals oriented to student potential; (2) learning: (a) community-centered, (b) attentive to the students' environment, (c) flexible program structure, (d) student-centered, (e) use of local resources; and (3) program assurance: (a) organizational initiative with participatory and decentralized student involvement, and (b) democracy.
19

Rodríguez Aguilar, Susana. "Constantes y variantes en la generación y publicación de la fotografía periodística. Instrumento de investigación". HiSTOReLo. Revista de Historia Regional y Local 4, n.º 8 (1 de julio de 2012): 401–18. http://dx.doi.org/10.15446/historelo.v4n8.30352.

By applying the proposed research tool, composed of 20 categories, to print media, it is possible to identify, in time and space, the journalistic photographs of the main national and international events taken by photojournalists: informative images with historical content that reflect the technique and cultural representation of the worker behind the lens, as well as the editorial criteria and ideology of the written press of the time. The proposal makes it possible to identify indicators such as page placement, whether or not the photograph is credited, size, section, journalistic genre, geographical location (district, city, and country), the identifiable person or persons, the characteristics of the shot, and the title or caption assigned. These documentary sets have indicative value: their repetitive character and common traits make it possible to homogenize the information. Keywords: photos, techniques, photojournalists, press, cultural representation.
20

Luo, Biao, Zheyu Zhang, Yong Liu y Weihe Gao. "What does it say and who said it? The contingent effects of online word of mouth in China". Nankai Business Review International 7, n.º 4 (7 de noviembre de 2016): 474–90. http://dx.doi.org/10.1108/nbri-12-2015-0035.

Purpose The purpose of this paper is to examine how consumers respond to online word of mouth (WOM) with different valence (i.e. what does it say) and from different sources (i.e. who said it) in an important emerging economy, China. Design/methodology/approach Theory with experiments. Findings The authors find that Chinese consumers seek confirmatory information and pay greater attention to WOM that agrees with their initial attitude. Consumers with a high (vs low) need for cognition are more likely to rate WOM from far (vs closer) social distance as more impactful on themselves. For public-consumption products, the consumers are influenced more by “who said it” (source) than by “what does it say” (valence). The reverse holds for private consumption. Research limitations/implications The paper could be extended to other online behaviors. It can also be extended to empirical testing using market data. Practical implications Since Chinese consumers tend to focus on online information that is consistent with their initial attitude, it can be more difficult for either the seller or third-party website to utilize online WOM as a persuasive tool in China than in other countries. Firms may also customize their online strategies based on product category. For products that are consumed in private, WOM content is more important than source. If the firm wants to facilitate consumer interaction and influence, greater attention should be paid to make the content easy to access and utilize. Social implications Due to the explosive growth of e-Commerce in China, many global and Chinese firms rushed to set up online communities to facilitate information exchange among consumers. Our findings indicate that the impact of these communities may have been overvalued. Chinese consumers are influenced by online information, but if the majority of the online messages are from anonymous strangers, consumers tend to discount their credibility. 
Originality/value Our study represents an earlier effort to predict, and test, how online WOM can be associated with the specific cultural and market environments. It provides direct implications for both consumer behavior and firm strategy.
21

Barbosa, Jane Kelly Dantas, Carolina Machado Saraiva de Albuquerque Maranhão y Ana Flávia Rezende. "A POLI (MONO)FONIA DO TELETRABALHO". Revista Foco 10, n.º 3 (4 de diciembre de 2017): 146. http://dx.doi.org/10.28950/1981-223x_revistafocoadm/2017.v10i3.454.

Teleworking has been gaining prominence and a considerable number of adherents in the face of the demands for greater flexibility and cost reduction in the globalized and unstable market in which organizations and professionals operate. It is therefore a topic that deserves attention in the corporate and academic worlds, being a multifaceted and still little-explored field. The objective of this article is to analyze the meanings of telework found in the magazines Você S.A. and Exame, exemplars of popular management literature (pop management), identifying the words and expressions that appear most frequently in publications on the theme and analyzing them as categories of meaning. The study is qualitative and exploratory in nature, using content analysis to treat the data. Publications on teleworking in the cited magazines from 2010 to 2016 were analyzed, revealing the views commonly associated with the topic and contributing to a better understanding of the telework phenomenon. From the results obtained, it is possible to infer that teleworking still needs attention regarding its delimitation, conceptualization, and legal regulation, in order to give more security to those involved and more credibility to the statistics and studies carried out. Although the subject is still being constructed conceptually and legally, there is a body of pop-management publications that contributes to a more homogeneous concept of teleworking, characterizing a "monophony" and revealing the need for studies that address its specificities and existing gaps.
22

LIBER, ARKADIUSZ. "The issues connected with the anonymization of medical data. Part 1. The introduction to the anonymization of medical data. Ensuring the protection of sensitive information with the use of such methods as f(a) and f(a,b)". Medical Science Pulse 8, n.º 1 (31 de marzo de 2014): 13–21. http://dx.doi.org/10.5604/01.3001.0003.3155.

Introduction: Medical documentation must be protected against damage or loss, in compliance with its integrity and credibility, while ensuring permanent access for authorized staff and protection against access by unauthorized persons. Anonymization is one of the methods of safeguarding data against disclosure. Aim of the study: The study aims at the analysis of methods of anonymization, the analysis of methods of protecting anonymized data, and the study of a new type of privacy protection enabling control of sensitive data by the entity the data concerns. Material and methods: Analytical and algebraic methods were used. Results: The study delivers material supporting the choice and analysis of ways of anonymizing medical data, and develops a new privacy protection solution enabling control of sensitive data by the entities the data concerns. Conclusions: In the paper, an analysis of the data anonymization solutions used for medical data privacy protection was conducted. The methods k-Anonymity, (X,y)-Anonymity, (a,k)-Anonymity, (k,e)-Anonymity, (X,y)-Privacy, LKC-Privacy, l-Diversity, (X,y)-Linkability, t-Closeness, Confidence Bounding and Personalized Privacy were described, explained and analyzed. An analysis of solutions allowing owners to control their sensitive data was also conducted. Apart from the existing anonymization methods, methods of protecting anonymized data were analyzed, in particular d-Presence, e-Differential Privacy, (d,g)-Privacy, (a,b)-Distributing Privacy and protection against (c,t)-Isolation. The author introduces a new solution for the controlled protection of privacy, based on marking a protected field and multi-key encryption of the sensitive value. The suggested way of marking fields conforms to the XML standard. For the encryption, a cipher with n different keys was selected, of which p keys are used to decipher the content. The proposed solution enables brand-new methods of controlling the privacy of disclosed sensitive data.
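As an aside on the k-Anonymity notion listed above: a table is k-anonymous when every combination of quasi-identifier values is shared by at least k records. A minimal sketch of such a check (hypothetical, already-generalized records and column names; not the paper's implementation):

```python
from collections import Counter

def is_k_anonymous(rows, quasi_identifiers, k):
    """Check k-anonymity: every quasi-identifier combination
    must be shared by at least k records."""
    counts = Counter(tuple(row[q] for q in quasi_identifiers) for row in rows)
    return all(c >= k for c in counts.values())

# Hypothetical, already-generalized medical records.
records = [
    {"zip": "481**", "age": "30-39", "diagnosis": "flu"},
    {"zip": "481**", "age": "30-39", "diagnosis": "asthma"},
    {"zip": "607**", "age": "40-49", "diagnosis": "flu"},
    {"zip": "607**", "age": "40-49", "diagnosis": "diabetes"},
]

print(is_k_anonymous(records, ["zip", "age"], k=2))  # → True
print(is_k_anonymous(records, ["zip", "age"], k=3))  # → False
```

Each quasi-identifier combination above appears twice, so the table is 2-anonymous but not 3-anonymous; the `diagnosis` column is the sensitive attribute and is not counted.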
23

Silva, Cássio Faria da, Amanda Pontes Rassi, Jackson Wilke da Cruz Souza, Renata Ramisch, Roger Alfredo de Marci Rodrigues Antunes y Helena De Medeiros Caseli. "Quality of argumentation in political tweets: what is and how to measure it / Qualidade da argumentação em tweets de política: o que e como avaliar". REVISTA DE ESTUDOS DA LINGUAGEM 29, n.º 4 (28 de julio de 2021): 2537. http://dx.doi.org/10.17851/2237-2083.29.4.2537-2586.

Abstract: Argumentation is inherent to human beings and essential to written and spoken communication. With the popularization of Internet access, social media have become one of the main channels for the creation and spread of argumentative texts in various domains, such as politics. As a contribution to research on assessing the quality of argumentation in Portuguese, this paper proposes and validates criteria and guidelines for assessing the quality of argumentation in Twitter posts in the domain of politics. For this purpose, a corpus of tweets related to the Brazilian political scenario was produced and annotated. The texts were collected in the first months of 2021, resulting in 1,649,674 posts. From the analysis of a sample, we defined linguistic criteria that potentially characterize relevant aspects of the rhetorical dimension of argumentation, namely: (i) Clarity, (ii) Arrangement, (iii) Credibility, and (iv) Emotional appeal. After this analysis phase, a new set of 400 tweets was annotated by four annotators, yielding agreement of around 70% for three out of the four annotators. To our knowledge, this is the first work to propose linguistic criteria for evaluating the quality of argumentation in social media for Brazilian Portuguese. Based on the linguistic criteria, annotation guidelines, and annotated corpus, we intend to build a computational model that can automatically evaluate the quality of argumentation in social media messages, such as those on Twitter. Keywords: argumentation; corpus; quality; rhetorical dimension; tweets; politics.
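The agreement figure reported above (around 70% for three of the four annotators) is raw percent agreement; a common chance-corrected alternative for a pair of annotators is Cohen's kappa. A minimal sketch, with hypothetical labels (the statistic is standard, but the data and the "high"/"low" scale are invented, not taken from the paper):

```python
from collections import Counter

def cohens_kappa(labels_a, labels_b):
    """Pairwise inter-annotator agreement corrected for chance."""
    assert len(labels_a) == len(labels_b)
    n = len(labels_a)
    # Observed agreement: fraction of items both annotators label the same.
    observed = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    # Chance agreement: from each annotator's marginal label frequencies.
    freq_a, freq_b = Counter(labels_a), Counter(labels_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical per-tweet judgments from two annotators.
a = ["high", "high", "low", "low", "high", "low", "high", "low"]
b = ["high", "high", "low", "high", "high", "low", "high", "low"]
print(round(cohens_kappa(a, b), 3))  # → 0.75
```

Here the annotators agree on 7 of 8 items (87.5% raw agreement), but kappa discounts the agreement expected by chance (50% given these marginals), giving 0.75.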
24

del-Moral-Pérez, María Esther y Lourdes Villalustre-Martínez. "Media literacy, participation and accountability for the media of generation of silence". Comunicar 20, n.º 40 (1 de marzo de 2013): 173–81. http://dx.doi.org/10.3916/c40-2013-03-08.

The purpose of this research is to study the level of media literacy in a sample of elderly women, the so-called «silent generation», belonging to the Asturian Housewives Association, by means of a questionnaire collecting data on the women's critical awareness. The questionnaire seeks information on the persuasive effects of advertising; the women's evaluation of the information conveyed by the media; and their training, commitment, and participation as media consumers. The study also identifies, through focus group discussions, the women's most pressing demands and concerns regarding the media they usually use. Findings show that the women surveyed believe that advertising lacks credibility and claim that some TV stations offer information and content that is biased or manipulated to the extent of breaking the law. Although these women know the channels for citizen participation, they do not know how to exercise their rights in the face of illegal content, and certain training gaps were detected. The research points to the need to design a media literacy training plan that will empower them with critical skills and foster their participation as active and responsible consumers, while also equipping them with specific knowledge about the media and the psychological strategies, technical resources, and audiovisual language the media use.
25

Reia-Baptista, Vitor. "Education for media, a necessary, urgent and with future question". Comunicar 13, n.º 25 (1 de octubre de 2005): 153–59. http://dx.doi.org/10.3916/c25-2005-021.

One of the problems of the information society is the credibility of the information, and of its sources, that we find in the media. The amount of online information is so gigantic that selecting accurate information is often a problem. The information we find on the Internet, but also in other media, such as television, is immeasurable, and in most cases quantity wins over quality. What matters most is that users are alert, conscious, and critical, and concerned with questioning the quality and credibility of both contents and forms, especially when they have difficulty contextualizing all that information. Mediated memory and credibility are topics of some controversy that may help us develop an educational approach to the pedagogical dimensions of media literacy problems. From the perspective of developing, in the receivers of communicative processes and users of media platforms, the capacity to read and analyze the media (that is, media literacy), it is equally necessary to develop a good analytical capacity for the pedagogical contexts in question, in a very broad sense of the term pedagogy, as it has been used and developed as a fundamental component of a pedagogy of the media and of its subsequent media education strategies, with the main objective of contextualizing informative and communicative processes across their pedagogical dimensions, as well as debating and reflecting on their most polemical and problematic aspects.
26

Huang, Gang, Xiu Ying Wu, Man Yuan y Rui Fang Li. "Research on Data Quality of E&P Database Base on Metadata-Driven Data Quality Assessment Architecture". Applied Mechanics and Materials 530-531 (febrero de 2014): 813–17. http://dx.doi.org/10.4028/www.scientific.net/amm.530-531.813.

To date, there are no systematic evaluation criteria or complete assessment systems for data quality internationally. Based on research on both international and domestic work on data quality, this article analyzes the data quality requirements of large enterprises. First, the paper proposes and builds a complete data quality assessment system. Second, it defines data quality assessment indicators and their specific algorithms, and builds the architecture and processes for data quality analysis and evaluation. A frame structure for a data quality meta-model is presented. In addition, the paper designs an evaluation system covering the classification and definition of data quality, the algorithms for the evaluation indices, and the system and process of data quality evaluation. This provides enterprises with a credible basis for evaluating data quality.
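To illustrate what a data quality assessment indicator can look like, here is a minimal sketch of a "completeness" indicator, one of the standard data quality dimensions (the field names and records are hypothetical and not taken from the article's E&P database):

```python
def completeness(records, fields):
    """Data-quality 'completeness' indicator: share of non-missing
    values per field across a set of records."""
    scores = {}
    for f in fields:
        filled = sum(1 for r in records if r.get(f) not in (None, ""))
        scores[f] = filled / len(records)
    return scores

# Hypothetical well records from an E&P (exploration & production) database.
wells = [
    {"well_id": "W-001", "depth_m": 2450, "operator": "A"},
    {"well_id": "W-002", "depth_m": None, "operator": "A"},
    {"well_id": "W-003", "depth_m": 2710, "operator": ""},
    {"well_id": "W-004", "depth_m": 2390, "operator": "B"},
]

print(completeness(wells, ["well_id", "depth_m", "operator"]))
# → {'well_id': 1.0, 'depth_m': 0.75, 'operator': 0.75}
```

An assessment system of the kind the article describes would combine several such per-indicator scores (completeness, accuracy, consistency, timeliness) into an overall evaluation.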
27

Burke, Thomas F. "Hybrid Telecoaching for Corporate Speech Training and Potential Applications to Clinical Practice". Perspectives of the ASHA Special Interest Groups 4, n.º 2 (15 de abril de 2019): 322–24. http://dx.doi.org/10.1044/2019_pers-sig3-2018-0012.

Purpose The purpose of this article was to describe a model for “hybrid speech telecoaching” developed for a Fortune 100 organization and offer a “thought starter” on how clinicians might think of applying these corporate strategies within future clinical practice. Conclusion The author contends in this article that corporate telecommunications and best practices gleaned from software development engineering teams can lend credibility to e-mail, messaging apps, phone calls, or other emerging technology as viable means of hybrid telepractice delivery models and offer ideas about the future of more scalable speech-language pathology services.
28

Bruni, C., M. H. Buch, P. Seferovic y M. Matucci-Cerinic. "AB0556 PRIMARY SYSTEMIC SCLEROSIS HEART INVOLVEMENT (PSSCHI): A SYSTEMATIC LITERATURE REVIEW (SLR), CONSENSUS-BASED DEFINITION AND PRELIMINARY VALIDATION." Annals of the Rheumatic Diseases 79, Suppl 1 (junio de 2020): 1574.1–1575. http://dx.doi.org/10.1136/annrheumdis-2020-eular.1964.

Background: pSScHI may cause tissue, functional and conduction abnormalities with varied clinical manifestations. The absence of a clear definition of pSScHI impairs the significance and ability of focussed research, frequently not allowing the distinction between primary and secondary involvement.
Objectives: We aimed to establish an expert consensus definition for pSScHI, to be used in clinical trials and everyday clinical practice, and to start its validation process.
Methods: A SLR for cardiac manifestations and alterations in SSc was conducted using PubMed, Web of Science and Embase. Articles published from inception to December 31st, 2018 were identified. Inclusion criteria included papers in English on adult SSc patients, with heart involvement as outcome. We excluded non-human studies, secondary heart involvement (e.g. PAH, drugs, infections), reviews and case reports. PRISMA recommendations were followed where applicable. Extracted data were categorized into relevant domains (signs, symptoms, anatomical site involved, physiological abnormalities, pathological changes, prognostic outcomes), which informed the consensus definition. Sixteen senior experts (7 rheumatologists, 8 cardiologists, 1 pathologist) discussed the data and, using a nominal group technique, added expert opinion, provided statements to consider and ranked them. Consensus was attained for agreement >70%. Sixteen clinical cases were evaluated in two rounds to test for face validity, feasibility, inter- and intra-rater reliability and criterion validity (gold standard set by agreed evaluation between expert rheumatologist, cardiologist and methodologist).
Results: 2593 publications were identified and screened, 251 full texts were evaluated, and 172 met eligibility criteria. Data from the 7 domains were extracted and used to develop the World Scleroderma Foundation – Heart Failure Association (WSF-HFA) consensus-derived definition of pSScHI, as follows: "pSScHI comprises cardiac abnormalities that are predominantly attributable to SSc rather than other causes and/or complications*. pSScHI may be sub-clinical and must be confirmed through diagnostic investigation. The pathogenesis of pSScHI comprises one or more of inflammation, fibrosis and vasculopathy. *Non-SSc-specific cardiac conditions (e.g. ischaemic heart disease, arterial hypertension, drug toxicity, other cardiomyopathy, primary valvular disease) and/or SSc non-cardiac conditions (e.g. PAH, renal involvement, ILD)." Face validity was determined by a 100% agreement on credibility; application was feasible, with a median 60 (5-600) seconds taken per case; inter-rater agreement was moderate [mKappa (95%CI) 0.56 (0.46-1.00) and 0.55 (0.44-1.00) for the two rounds] and intra-rater agreement was good [mKappa (95%CI) 0.77 (0.47-1.00)]. Content validity was reached based on the wide variety of patients in the SLR; criterion validity was reached with 78 (73-84)% correctness.
Conclusion: Using a SLR and a modified nominal group technique, we have developed a preliminary pSScHI consensus-based definition and started a validation process for it to be used in clinical research and clinical practice.
Acknowledgments: Aleksandra Djokovic, Giacomo De Luca, Raluca B. Dumitru, Alessandro Giollo, Marija Polovina, Yossra Atef Suliman, Kostantinos Bratis, Alexia Steelandt, Ivan Milinkovic, Anna Baritussio, Ghadeer Hasan, Anastasia Xintarakou, Yohei Isomura, George Markousis-Mavrogenis, Silvia Bellando-Randone, Lorenzo Tofani, Sophie Mavrogeni, Luna Gargani, Alida L.P. Caforio, Carsten Tschoepe, Arsen Ristic, Karin Klingel, Sven Plein, Elijah Behr, Yannick Allanore, Masataka Kuwana, Christopher Denton, Daniel E. Furst, Dinesh Khanna, Thomas Krieg, Renzo Marcolongo.
Disclosure of Interests: Cosimo Bruni Speakers bureau: Actelion, Eli Lilly; Maya H Buch Grant/research support from: Pfizer, Roche, and UCB, Consultant of: Pfizer, AbbVie, Eli Lilly, Gilead Sciences, Inc., Merck-Serono, Sandoz, and Sanofi; Petar Seferovic: None declared; Marco Matucci-Cerinic Grant/research support from: Actelion, MSD, Bristol-Myers Squibb, Speakers bureau: Actelion, Lilly, Boehringer Ingelheim
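The inter- and intra-rater agreement figures reported above use a modified kappa (mKappa). The underlying chance-corrected agreement statistic for two raters is Cohen's kappa, which can be sketched as follows; this is a minimal illustration, not the study's modified variant, and the example labels are hypothetical:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters on the same cases."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    # expected agreement if the raters labelled cases independently at random
    expected = sum((freq_a[label] / n) * (freq_b[label] / n)
                   for label in set(freq_a) | set(freq_b))
    return (observed - expected) / (1 - expected)
```

A value near 0 means agreement no better than chance, 1 means perfect agreement; the 0.5-0.8 range of the reported mKappas corresponds to moderate-to-good agreement on this scale.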
APA, Harvard, Vancouver, ISO, etc. citation styles
29

Robinson, Anthony C. "Design, Dissemination, and Disinformation in Viral Maps". Abstracts of the ICA 1 (15 July 2019): 1–2. http://dx.doi.org/10.5194/ica-abs-1-314-2019.

Full text
Abstract
Social media has made it possible for maps to reach massive audiences outside of traditional media sources. In some cases, social media maps are original designs crafted by users; in other cases they are modified or replicated from previous sources. It is now relatively easy for novice Internet users to create new maps or manipulate existing images, and social media provides a vehicle for these maps to become visible in ways that were simply not possible even a decade ago. In addition, traditional media sources now harvest content from social streams, and in some cases may amplify what was originally a socially-shared map.

Maps that rapidly reach popularity via social media can be considered viral maps. A key element of virality in social media is the structure of how content becomes viral. The concept of structural virality suggests that the nature of how media are shared is more important than the raw population that might see something (Goel, Anderson et al. 2016). For example, a social media user with millions of followers can broadcast their content to a large audience, but structurally viral content is media that does not require a major broadcaster in order to reach a large audience.

Previous work on viral cartography has shown how viral maps may develop conditions in which their audiences begin creating and repurposing maps in response, resulting in large collections of social media maps. For example, Robinson (2018) showed how a viral election map resulted in hundreds of maps shared by social media users in response to the original work.

Viral maps, and the maps that emerge in subsequent responses from social media users, pose interesting challenges for cartographers to address. Understanding their design dimensions and the ways in which these maps are disseminated (often outside of the social media stream where they may have originated) are two key areas of potential research inquiry. Knowledge of design and dissemination in social mapping is also necessary if we wish to understand the capability of social media maps to inform or actively disinform the public. We argue that the latter topic is of utmost importance given the relative ease of making maps today and their clear rhetorical power in public discussion and debate.

New methods are emerging to characterize the design elements of social media maps and their context on the internet. For example, proprietary machine learning services such as Google Cloud Vision and Amazon Rekognition are used for real-time detection of faces, text, sentiment, image structure, and relevant web results. While the primary use cases for these services are to support image moderation on social media, to improve search results, and to support marketing activities, these methods can also be applied to the study of social media maps in support of cartographic research.

For example, we have used Google Cloud Vision to characterize the design and dissemination of a viral map created and shared by Kenneth Field, a cartographer at Esri. In March of 2018, Field tweeted an image of a dot-density map showing the 2016 United States Presidential Election results. A unique aspect of this map was its ability to show one dot for each of the more than 60 million votes cast in the 2016 election. Field's tweet was liked more than 10,000 times and retweeted over 4,000 times, reaching millions of potential viewers.

Google Cloud Vision analysis of Field's map highlights a range of election and cartographic entities that it finds relevant to the original posting (Figure 1). Field's map generated website content that focused both on its meaning in terms of interpreting the 2016 election and on its technical execution in terms of cartography. It could be argued that these are not terribly surprising results, but this nevertheless demonstrates that an automated routine has the power to deliver sensible contextual information about map images. Extrapolating from one map to the millions that appear each year on social media, it becomes plausible to apply machine learning methods to characterize their design and web context, even from streaming sources, as these methods are already built to support real-time analysis of streaming data.

The dissemination of a viral map can be characterized by the number of engagements via social means, in both direct and indirect forms. Direct forms of engagement may include user actions to like, share, or reply directly to a social media post. Indirect types of engagement can include the number of people who saw an item in their social media feed, and the potential audience who may have the opportunity to see an item in their social media feeds. In addition, viral maps can become the focus of media attention from traditional news sources and be amplified further to their respective audiences. Finally, users may blog about a viral map or share it in private messages or group chats.

One way to understand the dissemination of a viral map is to take advantage of image analysis services' capability to produce URLs that show full and partially matching versions of an image. Google Cloud Vision provides this capability along with its other image analysis functions. In the case of the Field dot-density map of the 2016 election, webpages that reference the exact image from Field's original tweet include media stories about his map, blog postings, e-commerce sites that sell printed versions of the map, and message forum discussions that reference the map. Partial image matching results reveal only a few sites that have derived versions of Field's original map, and all of those we reviewed were simply resampled versions of the original. Other partial image matching results included other types of dasymetric and thematic maps located on the web. For example, multiple cellular phone coverage maps are highlighted as partial matches to Field's original work (Figure 2).

We hypothesize that there is considerable potential for social media maps to be sources of disinformation. Maps remain a powerful means of communication, and it is easier than ever to create a new map or modify an existing map to convey misleading information. Future research may be able to leverage the attributes and links derived from machine learning image analysis services such as Google Cloud Vision to assess the potential for a viral map to be an agent of disinformation. For example, being able to quickly identify the original source for a map image and to characterize the constellation of websites on which it has been shared may aid users in evaluating the credibility of what they are seeing.

In November 2018, climate scientist Brian Brettschneider shared a map on Twitter that purported to show regions of the United States and their preferred Thanksgiving pie. This map went viral, drawing attention from traditional media sources as well as Twitter users with large audiences of their own, including one U.S. Senator. Many who saw this absurd map argued about its content because they incorrectly assumed it was based on real data. Brettschneider reflected on the power of creating and sharing fake viral maps in a subsequent article for Forbes (Brettschneider 2018), stating, "We cannot let maps, as a medium for communicating information, be co-opted by people with nefarious intentions. I pledge to do my part by clearly noting if a map is a parody in the future."
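The structural virality measure cited above (Goel, Anderson et al. 2016) is defined as the mean shortest-path distance between all pairs of distinct nodes in the diffusion (sharing) tree. A minimal sketch, where the edge list, node labels, and function name are illustrative assumptions rather than anything from the abstract:

```python
from collections import defaultdict, deque

def structural_virality(edges):
    """Mean shortest-path distance over all ordered pairs of distinct nodes
    in a sharing tree; edges is a list of (sharer, resharer) pairs."""
    adj = defaultdict(list)
    nodes = set()
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)
        nodes.update((u, v))
    total = 0
    for src in nodes:                      # BFS from every node
        dist = {src: 0}
        queue = deque([src])
        while queue:
            x = queue.popleft()
            for y in adj[x]:
                if y not in dist:
                    dist[y] = dist[x] + 1
                    queue.append(y)
        total += sum(dist.values())
    n = len(nodes)
    return total / (n * (n - 1))
```

A pure broadcast (one account reaching everyone directly) yields a low value, while long chains of person-to-person resharing, the structurally viral case, yield a high one.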
APA, Harvard, Vancouver, ISO, etc. citation styles
30

Masilamani, Vaageessan, Arulchelvan Sriram and Ann-Maria Rozario. "eHealth literacy of late adolescents: Credibility and quality of health information through smartphones in India". Comunicar 28, no. 64 (1 July 2020): 86–95. http://dx.doi.org/10.3916/c64-2020-08.

Full text
Abstract
The introduction of smartphones has revolutionized how late adolescents (aged 18-21 years) access and use the internet. Vast troves of health information are today just a tap or swipe away, with smartphones and internet connectivity becoming increasingly accessible. The need for eHealth literacy among late adolescents is now gaining importance, as it ensures an effective use of health information. This study conducted a survey among 427 late adolescents in order to evaluate their eHealth literacy levels, their perceptions of the quality of online health information, and their level of trust and credibility in online health, and checked whether acquiring health information through the online medium led to a change in their behavior intention. The results showed that most of the late adolescents preferred viewing multiple websites for their health information needs. Health information in the form of text and images was preferred over video content, and most preferred accessing online health information in their native language. Cancer and obesity are the common health issues of interest to both genders. Mobile applications (apps) were the least preferred mode of accessing health information despite the high usage of smartphones. eHealth literacy and credibility positively predicted behavior intention, while quality of health information did not predict behavior intention.
APA, Harvard, Vancouver, ISO, etc. citation styles
31

Mahboob, Usman. "Deliberations on the contemporary assessment system". Health Professions Educator Journal 2, no. 2 (30 June 2019): 66–69. http://dx.doi.org/10.53708/hpej.v2i2.235.

Full text
Abstract
There are different apprehensions regarding the contemporary assessment system. Often I hear colleagues say that multiple-choice questions are easier to score: why can't all assessments be multiple-choice tests? Others ask whether the tests given reflect what students will need to know as competent professionals, and what evidence can be collected to make sure that test content is relevant. Others raise the concern that students perceive some examiners as harsher than others, and some tasks as easier than others: what can be done to evaluate whether this is the case? Sometimes students query why they need to be observed when interacting with patients: what rationale is there for using workplace-based assessment? Some students worry whether the pass marks for the assessments are 'correct', and what the evidence is for the cut-off scores. All these questions are important, and I will deliberate upon them with evidence from the literature.

Deliberating on the first query, of using multiple-choice questions for everything: we know that assessment of a medical student is a complex process, as there are multiple domains of learning such as cognition, skills, and behaviors (Norcini and McKinley, 2007; Boulet and Raymond, 2018). Each domain has multiple levels, from simple to complex tasks (Norcini and McKinley, 2007). For example, cognition is divided into six levels, starting from recall (cognition level 1, or C1) up to creativity (cognition level 6, or C6) (Norcini and McKinley, 2007). Similarly, skills and behaviors have levels starting from observation up to performance and practice (Norcini and McKinley, 2007). Moreover, there are different competencies within each domain, which further complicates our task as assessors (Boulet and Raymond, 2018). For instance, within the cognitive domain, simply writing learning objectives based on Bloom's taxonomy does not simplify our task, because the literature suggests that individuals use different thinking mechanisms, such as fast and slow thinking, to perform a task (Kahneman, 2011). We as educationalists do not know what sort of cognitive mechanism we have triggered through our exam items (Swanson and Case, 1998). Multiple-choice questions are one of the assessment instruments for measuring competencies related to the cognitive domain. We cannot use them to measure the skills and behaviors domains, so clearly multiple-choice questions cannot assess all domains of learning (van der Vleuten et al., 2010). Within the cognitive domain there are multiple levels and different thinking mechanisms (Kahneman, 2011). Each assessment instrument has its strengths and limitations, and the multiple-choice question is no different (Swanson and Case, 1998). Certain competencies can easily be assessed using multiple-choice questions (Swanson and Case, 1998): for example, content that requires recall, application, and analysis. However, creativity or synthesis, which is cognition level six (C6) in Bloom's taxonomy, cannot be assessed with closed-ended questions such as multiple-choice questions. This means that we need additional assessment instruments to measure the higher levels of cognition within the cognitive domain.
For example, asking students to explore an open-ended question as a research project can assess the higher levels of cognition, because the students gather information from different sources of literature and then synthesize it to answer the question. It is reported that reading and marking essay questions is time-consuming for teachers (McLean and Gale, 2018). Hence, the teacher-to-student ratio in assessing the higher levels of cognition needs to be monitored, so that teachers or assessors can give appropriate time to assessing their students' higher levels of cognition. We therefore have to use other forms of assessment alongside multiple-choice questions to assess the cognitive domain; this helps to assess the different levels of cognition and also engages the different thinking mechanisms.

Regarding the concern of whether the tests given reflect what students will need to know as competent professionals, and what evidence can be collected to make sure that test content is relevant: it is an important issue for medical education and assessment directors whether their tests are reflective of students becoming competent practitioners. It is also quite challenging, as some competencies, such as professionalism or professional identity formation, are difficult to measure quantitatively with traditional assessment instruments (Cruess, Cruess and Steinert, 2016). Moreover, there is also the question of whether all the competencies required of a medical graduate can be assessed with the instruments presently available. Hence, we as educationalists have to provide evidence for the assessment of required competencies and relevant content. One option is to carefully align the required content with relevant assessment instruments. This can be done with the help of assessment blueprints, also known in some of the literature as tables of specifications (Norcini and McKinley, 2013). An assessment blueprint enables us to demonstrate our planned curriculum: what our planned objectives are, and how we are going to teach and assess them (Boulet and Raymond, 2018). We can also use the validity construct, in addition to assessment blueprints, to provide evidence for testing the relevant content. Validity means that the test measures what it is supposed to measure (Boulet and Raymond, 2018). There are different types of validity, but the one required in this situation to establish the appropriateness of the content is content validity. Content validity is established by a number of subject experts who comment on the appropriateness and relevance of the content (Lawshe, 1975). A third method by which the relevance of content can be established is standard-setting. A standard is a single cut-off score that qualitatively declares a student competent or incompetent based on the judgment of subject experts (Norcini and McKinley, 2013). There are different methods of standard-setting, for example the Angoff, Ebel, and borderline methods (Norcini and McKinley, 2013). Although the main purpose of the process is to establish the cut-off score, the experts also debate the appropriateness and relevance of the content. This means that the standard-setting methods have validity procedures built into their process of establishing the cut-off score. These are some of the methods by which we can provide evidence of the relevance of the content required to produce a competent practitioner.

The next issue is the perception amongst students that some examiners are harsher than others and some tasks are easier than others.
Both these observations have quite a lot of truth in them and can be evaluated using contemporary medical education evaluation techniques. The first issue is that some examiners are harsher than others; in assessment, this has been reported in the literature as the 'hawk-dove effect' (McManus et al., 2006; Murphy et al., 2009). The literature identifies different reasons why some examiners are more stringent than others, such as age, ethnic background, behavioral reasons, educational background, and years of experience (McManus et al., 2006). Specifically, examiners who are from ethnic minorities and have more experience show more stringency (McManus et al., 2006). Interestingly, it has been reported elsewhere how glucose levels affect pass-fail decision making (Kahneman, 2011). Psychometric methods reported in the literature, such as Rasch modeling, can help determine the hawk-dove effect of different examiners and whether it is too extreme or within a zone of normal deviation (McManus et al., 2006; Murphy et al., 2009). The literature also suggests minimizing the hawk-dove effect by identifying and pairing such examiners, so that the strictness of one can be compensated by the leniency of the other (McManus et al., 2006). The other issue is that students find some tasks easier than others. This depends on the complexity of the tasks and also on the competence level of the students. For example, a medical student may measure blood pressure independently in his or her first year, but even a consultant surgeon may not be able to perform complex surgery such as a Whipple procedure. This means that while developing tasks, we as educationalists have to consider both the competence level of our students and the complexity of the tasks. One way to understand this theoretically is with the help of cognitive load theory (Merrienboer, 2013). Cognitive load theory suggests that there are three types of cognitive load: intrinsic, extraneous, and germane (Merrienboer, 2013). Intrinsic load is associated with the complexity of the task. Extraneous load is added to the working memory of students by a teacher who does not plan the teaching session according to students' needs (Merrienboer, 2013). The third, germane load is the 'good' load that helps the student to understand the task, and is added by using suitable teaching methods (Merrienboer, 2013). Teachers can use instructional designs such as the 4C/ID model to plan teaching sessions for complex tasks (Merrienboer, 2013). One way to understand the difficulty of a task is to pilot test it with a few students or junior colleagues. Another way is through standard-setting methods, where the experts discuss each task and determine its cut-off score based on their judgments (Norcini and McKinley, 2013); however, it is important that the experts called for setting standards have relevant experience, so as to make credible judgments (Norcini and McKinley, 2013). A third way to evaluate the complexity of tasks is to apply post-exam item analysis techniques. The difficulty of each task is evaluated after students' performance in the exam: each item's difficulty can be measured, and items can be placed on a scale from extremely easy (100% of students answered the item correctly) to extremely difficult (100% of students failed on that item). Item analysis enables teachers to determine which tasks in an exam were easier and which more difficult.
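The item difficulty index just described is simple arithmetic, the proportion of examinees answering an item correctly. A minimal sketch, where the banding thresholds in `classify` are illustrative assumptions (institutional cut points vary) rather than figures from the text:

```python
def item_difficulty(responses):
    """Difficulty index: proportion of examinees answering the item correctly.
    responses: iterable of booleans, one per examinee."""
    responses = list(responses)
    return sum(responses) / len(responses)

def classify(p):
    """Illustrative bands only (assumed, not from the text)."""
    if p >= 0.9:
        return "very easy"
    if p <= 0.3:
        return "difficult"
    return "acceptable"
```

Running this per item over the response matrix gives exactly the easy-to-difficult ordering of tasks the text describes.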
Another concern from students is about being observed when interacting with patients. Health professions training programs require students to interact with patients. Student-patient interactions are infrequent in the initial years of training, due to patient safety concerns and the heavy workload on clinical faculty; however, as training progresses, these interactions increase. There is also a strong theoretical basis for better learning when students are placed in a context or a given situation (Wenger, 1998). For example, infection control can be taught through a lecture, but the learning can be more effective if students learn it practically in an operating theatre. Moreover, undergraduate students and foundation-year house officers are not yet competent to practice independently and require supervision, for the obvious reasons of patient safety. Although some students may not like being observed, it is one of the requirements of their training. The examiners observing them can give constructive feedback to further improve their performance (Etheridge and Boursicot, 2013). Feedback is one of the essential components of workplace-based assessment, and it is suggested in the literature that the time for feedback should be about one-third of the procedure or task time (Etheridge and Boursicot, 2013); that is, for a fifteen-minute task there should be at least five minutes of feedback, making twenty minutes in total. Further, it is important for examiners and senior colleagues to establish trust in the competence of their students or trainees; 'trust' is a behavioral construct that also starts with observation (Etheridge and Boursicot, 2013). Hence, observation of students or house officers by senior colleagues or teachers during clinical encounters is important to establish trust in students' competence levels. Additionally, in the workplace there are different skills that students are required to demonstrate, and each skill is quite different from the others. There are different workplace-based assessment instruments, and each assesses only certain aspects of a student's performance during clinical practice. For instance, the Mini Clinical Evaluation Exercise (Mini-CEX) primarily assesses students' history-taking and physical examination skills (Etheridge and Boursicot, 2013). Similarly, Directly Observed Procedural Skills (DOPS) assesses students' technical and procedural skills (Etheridge and Boursicot, 2013). The Case-based Discussion (CBD) assesses clinical reasoning, decision-making, ethics, and professionalism (Etheridge and Boursicot, 2013). Further, multi-source feedback (MSF), or 360-degree assessment, collects feedback on a student's performance from multiple sources such as patients, senior and junior colleagues, nursing staff, and administrative staff (Etheridge and Boursicot, 2013). All these workplace-based assessments require observation of students so they can be given appropriate feedback on their technical and non-technical skills (Etheridge and Boursicot, 2013). Clinical encounters at the workplace are thus quite complex, and fully training students in their different aspects cannot be accomplished without observation.

Some students also worry whether the pass marks for their assessments are 'correct', and what the evidence is for the cut-off score in their exams. A standard is a single cut-off score that determines the competence of a student in a particular exam (Norcini and McKinley, 2013).
The cut-off score is decided by experts who make a qualitative judgment (Norcini and McKinley, 2013). The purpose is not to establish an absolute truth but to demonstrate the credibility of pass-fail decisions in an exam (Norcini and McKinley, 2013). Certain variables related to standard setters may affect the credibility of the standard-setting process, such as age, gender, ethnicity, their understanding of the learners, their educational qualifications, and their place of work. Moreover, the definition of competence varies with time, place and person (Norcini and McKinley, 2013). Hence, it is important that the standard setters know the learners and the competence level expected of them, and that the standard setters be drawn from different places. Keeping a profile of the standard setters is therefore one of the first requirements for establishing credibility. The selection of the standard-setting method also matters, as does how familiar the standard setters are with it. There are many standard-setting methods for different assessment instruments and types of exam (Norcini and McKinley, 2013). It is essential to use the appropriate method, and to train the standard setters on it so they know the procedure. Training can be done by providing them with data to work through, following the steps of the standard-setting procedure; records of these exercises are important and may be required at later stages to show the standard setters' experience. Further, every standard setter writes a cut-off score for each item (Norcini and McKinley, 2013). The mean score across all standard setters is calculated to determine the cut-off score for each item, and the total cut-off score is calculated by adding the pass marks of the individual items (Norcini and McKinley, 2013).
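The per-item averaging and summation just described (as in an Angoff-style exercise) can be sketched in a few lines; the function name and the example judge ratings below are hypothetical, not from the text:

```python
def angoff_cutoff(ratings):
    """ratings maps each item to the judges' estimates (0..1) of the chance
    that a borderline candidate answers it correctly.
    Returns the per-item mean and the summed exam-level cut-off score."""
    per_item = {item: sum(judged) / len(judged)
                for item, judged in ratings.items()}
    return per_item, sum(per_item.values())
```

Keeping the per-item means, as the text recommends, also makes it easy to compare each judge's ratings against the panel mean later.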
The cut-off scores for items would also help in differentiating the hawks from the doves, that is, the examiners who are quite strict from those who are lenient (McManus et al., 2006). Hence, it is important to keep a record of these item cut-off scores for future reference and to assemble a balanced standard-setting team for future exams (Norcini and McKinley, 2013). Additionally, the meeting minutes are an important document recording the decisions made during the meeting. Lastly, the exam results and post-exam item analysis are important documents for reviewing the performance of students on each item and for making comparisons with the standard-setting meeting (Norcini and McKinley, 2013). It is important to document the items that behaved as the standard setters predicted and those that showed unexpected responses; for example, where the majority of borderline students scored either far above or far below the cut-off score (Norcini and McKinley, 2013). All the documents mentioned above would ensure the credibility of the standard-setting process and would also improve the quality of exam items. There are many other aspects that could not be discussed in this debate on the contemporary assessment system in medical education. Another area that needs deliberation is the futuristic assessment system and how it would address the limitations of the current one. Disclaimer: This work is derived from one of the assignments submitted by the author for his certificate from Keele University.
--------------------------------------------------------------------------
References
Boulet, J. and Raymond, M. (2018) 'Blueprinting: Planning your tests', FAIMER-Keele Master's in Health Professions Education: Accreditation and Assessment. Module 1, Unit 2. 6th edn. London: FAIMER Centre for Distance Learning, CenMEDIC, pp. 7–90.
Cruess, R. L., Cruess, S. R. and Steinert, Y. (2016) 'Amending Miller's pyramid to include professional identity formation', Academic Medicine, 91(2), pp. 180–185.
Etheridge, L. and Boursicot, K. (2013) 'Performance and workplace assessment', in Dent, J. A. and Harden, R. M. (eds) A Practical Guide for Medical Teachers. 4th edn. London: Elsevier Limited.
Kahneman, D. (2011) Thinking, Fast and Slow. New York: Farrar, Straus and Giroux.
Lawshe, C. H. (1975) 'A quantitative approach to content validity', Personnel Psychology, 28(4), pp. 563–575.
McLean, M. and Gale, R. (2018) 'Essays and short answer questions', FAIMER-Keele Master's in Health Professions Education: Accreditation and Assessment. Module 1, Unit 5. 5th edn. London: FAIMER Centre for Distance Learning, CenMEDIC.
McManus, I. C., Thompson, M. and Mollon, J. (2006) 'Assessment of examiner leniency and stringency ("hawk-dove effect") in the MRCP(UK) clinical examination (PACES) using multi-facet Rasch modelling', BMC Medical Education, 6(42). doi: 10.1186/1472-6920-6-42.
Merrienboer, J. J. G. (2013) 'Instructional design', in Dent, J. A. and Harden, R. M. (eds) A Practical Guide for Medical Teachers. 4th edn. London: Elsevier Limited.
Murphy, J. M., Seneviratne, R., Remers, O. and Davis, M. (2009) '"Hawks" and "doves": effect of feedback on grades awarded by supervisors of student selected components', Medical Teacher, 31(10), pp. e484–e488. doi: 10.3109/01421590903258670.
Norcini, J. and McKinley, D. W. (2007) 'Assessment methods in medical education', Teaching and Teacher Education, 23(3), pp. 239–250. doi: 10.1016/j.tate.2006.12.021.
Norcini, J. and Troncon, L. (2018) 'Foundations of assessment', FAIMER-Keele Master's in Health Professions Education: Accreditation and Assessment. Module 1, Unit 1. 6th edn. London: FAIMER Centre for Distance Learning, CenMEDIC.
Norcini, J. and McKinley, D. W. (2013) 'Standard setting', in Dent, J. A. and Harden, R. M. (eds) A Practical Guide for Medical Teachers. 4th edn. London: Elsevier Limited.
Swanson, D. and Case, S. (1998) Constructing Written Test Questions for the Basic and Clinical Sciences. 3rd edn. Philadelphia, PA: National Board of Medical Examiners.
Van Der Vleuten, C., Schuwirth, L., Scheele, F., Driessen, E. and Hodges, B. (2010) 'The assessment of professional competence: building blocks for theory development', Best Practice & Research Clinical Obstetrics and Gynecology, pp. 1–17. doi: 10.1016/j.bpobgyn.2010.04.001.
Wenger, E. (1998) Communities of Practice: Learning, Meaning, and Identity. Cambridge: Cambridge University Press.
32

Rasmussen, Karsten Boye. "As open as possible and as closed as needed". IASSIST Quarterly 43, no. 3 (September 26, 2019): 1–2. http://dx.doi.org/10.29173/iq965.

Abstract
Welcome to the third issue of volume 43 of the IASSIST Quarterly (IQ 43:3, 2019). Yes, we are open! Open data is good. Just a click away. Downloadable 24/7 for everybody. An open government would make the decision-makers’ data open to the public and the opposition. As an example, communal data on bicycle paths could be open, so more navigation apps would flourish and embed the information in maps, which could suggest more safe bicycle routes. However, as demonstrated by all three articles in this IQ issue, very often research data include information that requires restrictions concerning data access. The second paper states that data should be ‘as open as possible and as closed as needed’. This phrase originates from a European Union Horizon 2020 project called the Open Research Data Pilot, in ‘Guidelines on FAIR Data Management in Horizon 2020’ (July 2016). Some data need to be closed and not freely available. So once more it shows that a simple solution of total openness and one-size-fits-all is not possible. We have to deal with more complicated schemes depending on the content of data. Luckily, experienced people at data institutions are capable of producing adapted solutions. The first article ‘Restricting data’s use: A spectrum of concerns in need of flexible approaches’ describes how data producers have legitimate needs for restricting data access for users. This understanding is quite important as some users might have an automatic objection towards all restrictions on use of data. The authors Dharma Akmon and Susan Jekielek are at ICPSR at the University of Michigan. ICPSR has been the U.S. research archive since 1962, so they have much practice in long-term storage of digital information. From a short-term perspective you might think that their primary task is to get the data in use and thus would be opposed to any kind of access restrictions.
However, both producers and custodians of data are very well aware of their responsibility for determining restrictions and access. The caveat concerns the potential harm through disclosure, often exemplified by personal data of identifiable individuals. The article explains how dissemination options differ in where data are accessed and what is required for access. If you are new to IASSIST, the article also gives an excellent short introduction to ICPSR and how this institution guards itself and its users against the hazards of data sharing. In the second article ‘Managing data in cross-institutional projects’, the reader gains insight into how FAIR data usage benefits a cross-institutional project. The starting point for the authors - Zaza Nadja Lee Hansen, Filip Kruse, and Jesper Boserup Thestrup – is the FAIR principles that data should be: findable, accessible, interoperable, and re-useable. The authors state that this implies that the data should be as open as possible. However, as expressed in the ICPSR article above, data should at the same time be as closed as needed. Within the EU, the mention of GDPR (General Data Protection Regulation) will always catch the attention of those with financial responsibility at any institution, because data breaches can now be very severely fined. The authors share their experience with implementation of the FAIR principles with data from several cross-institutional projects. The key is to ensure that from the beginning there is agreement on following the specific guidelines, standards and formats throughout the project. The issues to agree on are, among other things, storage and sharing of data and metadata, responsibilities for updating data, and deciding which data format to use. The benefits of FAIR data usage are summarized, and the article also describes the cross-institutional projects.
The authors work as a senior consultant/project manager at the Danish National Archives, senior advisor at The Royal Danish Library, and communications officer at The Royal Danish Library. The cross-institutional projects mentioned here stretch from Kierkegaard’s writings to wind energy. While this issue started by mentioning that ICPSR was founded in 1962, we end with a more recent addition to the archive world, established at Qatar University’s Social and Economic Survey Research Institute (SESRI) in 2017. The paper ‘Data archiving for dissemination within a Gulf nation’ addresses the experience of this new institution in an environment of cultural and political sensitivity. With a positive view you can regard the benefits as expanding. The start is that archive staff get experience concerning policies for data selection, restrictions, security and metadata. This generates benefits and expands to the broader group of research staff where awareness and improvements relate to issues like design, collection and documentation of studies. Furthermore, data sharing can be seen as expanding in the Middle East and North Africa region and generating a general improvement in the relevance and credibility of statistics generated in the region. Again, the FAIR principles of findable, accessible, interoperable, and re-useable are gaining momentum and being adopted by government offices and data collection agencies. In the article, the story of SESRI at Qatar University is described ahead of sections concerning data sharing culture and challenges as well as issues of staff recruitment, architecture and workflow. Many of the observations and considerations in the article will be of value to staff at both older and infant archives. The authors of the paper are the senior researcher and lead archivist at the archive of the Qatar University Brian W. Mandikiana, and Lois Timms-Ferrara and Marc Maynard – CEO and director of technology at Data Independence (Connecticut, USA). 
Submissions of papers for the IASSIST Quarterly are always very welcome. We welcome input from IASSIST conferences or other conferences and workshops, from local presentations or papers especially written for the IQ. When you are preparing such a presentation, give a thought to turning your one-time presentation into a lasting contribution. Doing that after the event also gives you the opportunity of improving your work after feedback. We encourage you to login or create an author login to https://www.iassistquarterly.com (our Open Journal System application). We permit authors 'deep links' into the IQ as well as deposition of the paper in your local repository. Chairing a conference session with the purpose of aggregating and integrating papers for a special issue IQ is also much appreciated as the information reaches many more people than the limited number of session participants and will be readily available on the IASSIST Quarterly website at https://www.iassistquarterly.com. Authors are very welcome to take a look at the instructions and layout: https://www.iassistquarterly.com/index.php/iassist/about/submissions Authors can also contact me directly via e-mail: kbr@sam.sdu.dk. Should you be interested in compiling a special issue for the IQ as guest editor(s) I will also be delighted to hear from you. Karsten Boye Rasmussen - September 2019
33

Pinuji, Muhammad Fariz. "PENGARUH USER-GENERATED CONTENT SEBAGAI ALAT KOMUNIKASI BISNIS TERHADAP KREDIBILITAS INFORMASI WEBSITE “TOKOPEDIA”". Inter Script: Journal of Creative Communication 1, no. 1 (December 11, 2019). http://dx.doi.org/10.33376/is.v1i1.347.

Abstract
This study aims to determine the effect of user-generated content, as a business communication tool, on the credibility of 'Tokopedia' website information. This matters because content created by other users can influence consumers in making purchasing decisions: product content or reviews that are objective, accurate, and relevant to the product, posted by other users, make the information look credible to other parties. The theory used for the independent variable is E-WOM theory, with user-generated content measured by four dimensions: reciprocity, responsiveness, non-verbal information, and response speed. The dependent variable draws on the information adoption model, with information credibility measured by two dimensions, namely argument quality and source credibility. The research findings show that user-generated content as a business communication tool has a significant effect on the credibility of Tokopedia website information. This means that the content or comments made by other users on the Tokopedia website look credible, so other consumers who read them trust the information contained in the comments column. The effect of user-generated content as a business communication tool on the credibility of Tokopedia website information was 0.624, which falls into the strong-influence category. This means that the content or product reviews made by other users on the Tokopedia website look like credible information. The study concludes that user-generated content as a business communication tool has a significant effect on the credibility of Tokopedia website information, and that this influence is strong.
34

Tanase, Madalina F. "Developing Teacher Credibility in Urban Environments: A B.E.A.R. Necessity". Journal of Education, November 11, 2020, 002205742096943. http://dx.doi.org/10.1177/0022057420969430.

Abstract
Teacher credibility is a prerequisite of effective instruction. A credible teacher is honest, knowledgeable, and caring. Credible teachers do the right thing when no one is watching; they are in control of the learning environment, but they do this in an enthusiastic and engaging way. Credibility takes time to develop. This study investigated ways in which urban teachers develop and maintain credibility in their classrooms. Participants were 22 secondary mathematics and science teachers in their first year of teaching. The researcher used the B.E.A.R. framework developed by Riner in 2008 to interpret the results. In this framework, B stands for believability; E stands for expertise; A stands for attractive power, and R stands for relationships. Results show that the participants developed credibility by being believable, by possessing content and pedagogical content knowledge, by being in charge of the classroom in a positive way, and by developing relationships with their students.
35

Verma, Deepak and Prem Prakash Dewani. "eWOM credibility: a comprehensive framework and literature review". Online Information Review ahead-of-print, ahead-of-print (December 22, 2020). http://dx.doi.org/10.1108/oir-06-2020-0263.

Abstract
Purpose: The purpose of this paper is to provide a comprehensive review on electronic word-of-mouth (eWOM) credibility. Further, the authors propose a comprehensive and integrated model on eWOM credibility.
Design/methodology/approach: The authors conducted a systematic review of the extant literature on marketing, sociology and psychology to identify the factors that affect eWOM credibility. Further, the authors developed themes and identified factors which lead to eWOM credibility.
Findings: Four factors were identified, i.e. content, communicator, context and consumer, which affect eWOM credibility. Several variables associated with these four factors were identified, which result in eWOM credibility. Further, the authors developed 22 propositions to explain the causal relationship between these variables and eWOM credibility.
Research limitations/implications: The conceptual model needs empirical validation across various eWOM platforms, i.e. social networking websites, e-commerce websites, etc.
Practical implications: Managers and e-commerce vendors can use these inputs to develop specific design elements and assessment tools which can help consumers to identify credible eWOM messages. Credible eWOM messages, in turn, will increase the “trust” and “loyalty” of the customers on e-commerce vendors.
Originality/value: This paper provides a conclusive takeaway of eWOM credibility literature by integrating multiple perspectives and arguments from the extant literature. This study also presents an integrated model, which provides a theoretical framework for researchers to further examine the interaction effect of various variables, which results in eWOM credibility.
Peer review: The peer review history for this article is available at: https://publons.com/publon/10.1108/OIR-06-2020-0263
36

Sebastianelli, Rose, Nabil Tamimi and Murli Rajan. "How Shopping Frequency And Product Type Affect Consumers Perceptions Of E-Tailing Quality". Journal of Business & Economics Research (JBER) 5, no. 1 (February 7, 2011). http://dx.doi.org/10.19030/jber.v5i1.2515.

Abstract
We survey a national sample of US online consumers about their perceptions regarding the quality of online shopping experiences. Our intent is to examine whether the frequency with which they purchase products online and the types of products they purchase affect their perceptions of internet retailer quality. In this study, the quality of online shopping is measured using a set of items that represent the four phases encountered when shopping via the Internet: (1) the retailer’s homepage, (2) online product catalog, (3) order form and (4) customer service and support. Factor analysis of these items uncovers the following seven underlying e-tailing quality dimensions: reliability, accessibility, ordering services, convenience, product content, assurance and credibility. We find that frequent online shoppers consider both reliability and product content significantly more important than infrequent online shoppers; ordering services is significantly more important to infrequent online shoppers. With regard to product type, we find some significant differences between online shoppers who purchase “search” versus “experience” products. Specifically, those who purchase experience products online rate ordering services and product content significantly more important than those who buy search products.
37

Jayawardena, Nirma Sadamali. "The e-learning persuasion through gamification: an elaboration likelihood model perspective". Young Consumers ahead-of-print, ahead-of-print (December 28, 2020). http://dx.doi.org/10.1108/yc-08-2020-1201.

Abstract
Purpose: The purpose of this theoretical paper is to introduce a conceptual model to investigate e-learning persuasion through gamification elements using the social psychology theory of the elaboration likelihood model (ELM).
Design/methodology/approach: The author systematically reviewed several theoretical and empirical papers which applied the ELM in various settings. Based on the literature, the author identified six research propositions which facilitate investigation of e-learning persuasion through gamification.
Findings: This study contributes to the existing literature by identifying an ELM-based conceptual model which can be used to empirically investigate e-learning persuasion using gamification elements. Accordingly, central route persuasion could be conducted through argument quality, demographic differences and technology context, facilitated through gamification elements. Peripheral route persuasion could be conducted through variables such as source credibility, social presence and message content.
Practical implications: This study contributes important findings to e-learning research by introducing a conceptual model based on the social psychology theory of ELM. Thereby, this study introduces a method for future researchers to investigate e-learning persuasion using gamification elements. Further, future researchers can use this model to investigate e-learning persuasion through gamification in different contexts, including primary, secondary and tertiary educational levels.
Originality/value: To the best of the author’s knowledge, this study can be considered the first theoretical paper to develop an ELM-based conceptual model to investigate e-learning persuasion through gamification in an education context.
38

Ayyakkannu, Purushothaman, Ganesh A, Meenatchi Packirisamy, Sundaram Ramalingam and Venkataramanan S. "Antioxidant potential of Eclipta alba, a traditional medicinal herb attenuates oxidative DNA damage in vitro". Nusantara Bioscience 12, no. 1 (June 1, 2020). http://dx.doi.org/10.13057/nusbiosci/n120113.

Abstract
Abstract. Purushothaman A, Ganesh A, Meenatchi P, Sundaram R, Venkataramanan. 2020. Antioxidant potential of Eclipta alba, a traditional medicinal herb attenuates oxidative DNA damage in vitro. Nusantara Bioscience 12: 73-78. Eclipta alba (L.) Hassk. is an important plant used in the traditional Ayurvedic and Unani systems of holistic health and herbal medicine of the East. This study aimed to evaluate the antioxidant and DNA damage protection activities of an ethanolic extract of E. alba. Quantitative analysis of total phenolic content (TPC) and identification of bioactive components using gas chromatography-mass spectrometry (GC-MS) were performed to provide a scientific basis for the traditional usage of this plant. To investigate the antioxidant potential, extracts were tested for their capacity to scavenge 1,1-diphenyl-2-picrylhydrazyl (DPPH·), hydrogen peroxide (H2O2) and superoxide (O2•-) radicals. The DNA damage protective activity of the ethanol extract of E. alba was tested on pBluescript M13+ plasmid DNA. The plasmid DNA was oxidized with H2O2 + UV treatment in the absence and presence of different concentrations of E. alba extract (75, 150, and 300 μg/mL). Electrophoresis was performed using 1% agarose at 40 V for 3 h in the presence of ethidium bromide. The gel was scanned on a gel documentation system, and bands corresponding to supercoiled circular, relaxed circular, and linearized DNA were quantified. Preliminary phytochemical screening of the E. alba extract showed the presence of flavonoids, saponins, steroids, terpenoids, and tannins. The extract was found to have a rich phenolic content of 26.38 ± 2.45 mg gallic acid equivalents per gram (mg GAE/g) and exhibited excellent antioxidant activities. GC-MS analysis of the extract confirmed the presence of major active principles. Furthermore, the extract significantly inhibited DNA damage induced by reactive oxygen species (ROS). Altogether, the results of the current study reveal that E. alba is a potential source of antioxidants and provide pharmacological credibility to the ethnomedicinal use of this plant in the traditional system of medicine, also justifying its therapeutic application in oxidative damage-induced diseases such as cancer, diabetes, and neurological disorders.
39

Guo, Xunhua, Guoqing Chen, Cong Wang, Qiang Wei and Zunqiang Zhang. "Calibration of Voting-Based Helpfulness Measurement for Online Reviews: An Iterative Bayesian Probability Approach". INFORMS Journal on Computing, June 25, 2020. http://dx.doi.org/10.1287/ijoc.2019.0951.

Abstract
Voting mechanisms are widely adopted for evaluating the quality and credibility of user-generated content, such as online product reviews. For the reviews that do not receive sufficient votes, techniques and models are developed to automatically assess their helpfulness levels. Existing methods serving this purpose are mostly centered on feature analysis, ignoring the information conveyed in the frequencies and patterns of user votes. Consequently, the accuracy of helpfulness measurement is limited. Inspired by related findings from prediction theories and consumer behavior research, we propose a novel approach characterized by the technique of iterative Bayesian distribution estimation, aiming to more accurately measure the helpfulness levels of reviews used for training prediction models. Using synthetic data and a real-world data set involving 1.67 million reviews and 5.18 million votes from Amazon, a simulation experiment and a two-stage data experiment show that the proposed approach outperforms existing methods on accuracy measures. Moreover, an out-of-sample user study is conducted on Amazon Mechanical Turk. The results further illustrate the predictive power of the new approach. Practically, the research contributes to e-commerce by providing an enhanced method for exploiting the value of user-generated content. Academically, we contribute to the design science literature with a novel approach that may be adapted to a wide range of research topics, such as recommender systems and social media analytics.
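The abstract does not spell out the authors' iterative Bayesian procedure, but the underlying problem it tackles — raw vote ratios over-rewarding reviews with very few votes — can be illustrated with a generic Beta-Binomial smoothing sketch. This is not the paper's method; the function name and prior values are assumptions for illustration only:

```python
# Illustrative Beta-Binomial smoothing of helpfulness votes (NOT the paper's
# iterative Bayesian estimation). A Beta(a, b) prior pulls small-sample
# ratios toward a global mean, so sparsely voted reviews are not over-ranked.

def smoothed_helpfulness(helpful, total, prior_mean=0.5, prior_strength=10):
    a = prior_mean * prior_strength          # prior "helpful" pseudo-votes
    b = (1 - prior_mean) * prior_strength    # prior "unhelpful" pseudo-votes
    return (helpful + a) / (total + a + b)   # posterior mean of the Beta

# A review with 2/2 helpful votes no longer outranks one with 90/100:
few = smoothed_helpfulness(2, 2)       # pulled strongly toward the prior
many = smoothed_helpfulness(90, 100)   # dominated by the observed votes
```

The paper's contribution is a more sophisticated, iteratively estimated version of this idea, calibrated against real vote distributions.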
40

"ICT & Online Education – Factors affecting Employees’ preference for Online Education". International Journal of Innovative Technology and Exploring Engineering 8, no. 9 (July 10, 2019): 994–1000. http://dx.doi.org/10.35940/ijitee.h6734.078919.

Abstract
The potential of ICTs in promoting the development and reach of educational avenues in India is unambiguously clear in the light of the challenges facing the country. The role of ICTs, with enhanced focus on the development of content and applications to provide enhanced quality of education, must be synchronized with the various initiatives for using ICT for education and should be guided by adequate guidelines and frameworks. Provisioning of ICT is limited by infrastructure, especially in rural areas, where Internet access and electrification are major concerns. It is well known that higher penetration of mobile phones, radio and TV implies increased development and delivery of innovative content via these media. This paper focuses on the necessity of incorporating ICT into the curriculum and of using it to strengthen the teaching-learning process. The paper explores the key factors that drive growth in the E-Learning sector. The authors undertook research to identify the factors that may affect employees' choice of, and preference for, online education as a measure for career/knowledge enhancement. The study indicates that the online education market in India is currently booming. The growth of the market depends on the field of study, the willingness to pay, the credibility of the offering organization and the acceptability of the learning in the corporate sector.
41

Su, Jing Jing and Doris Sau Fung Yu. "Effectiveness of eHealth cardiac rehabilitation on health outcomes of coronary heart disease patients: a randomized controlled trial protocol". BMC Cardiovascular Disorders 19, no. 1 (November 29, 2019). http://dx.doi.org/10.1186/s12872-019-1262-5.

Abstract
Background: Cardiac rehabilitation (CR) uptake and adherence remain sub-optimal despite the apparent health benefits of modifying health behavior and slowing disease progression. eHealth is the use of information and communication technology (ICT) for health. eHealth lifestyle interventions and disease management have emerged as modalities to enhance CR accessibility, enable an individualized progress page, and enrich real-time contact, video-based information, and technology-monitored functionality. This study aims to develop a nurse-led eHealth cardiac rehabilitation (NeCR) intervention and investigate its effectiveness on coronary heart disease (CHD) patients’ health outcomes.
Methods: This single-blinded, two-arm parallel randomized controlled trial will randomize 146 patients from the inpatient cardiovascular units of a hospital in Wuhan, China to receive either the NeCR or the usual care. The NeCR intervention uses a hybrid approach consisting of a brief face-to-face preparatory phase and an empowerment phase delivered by health technology. The preparatory phase aims at identifying self-care needs, developing a goal-oriented, patient-centered action plan, incorporating a peer support network and orientation to the use of the e-platform. The empowerment phase includes use of the multi-media interactive NeCR for promoting symptom management, monitoring lifestyle changes and offering psychological support. A tele-care platform is also integrated to enhance health care dialogue with health professionals and peer groups. The control group will receive the usual care. An evaluation of lifestyle behavioral changes, self-efficacy, health-related quality of life, anxiety and depression, cardiovascular risk parameters, and unplanned health services use will be conducted at baseline, 6 weeks and 12 weeks post-intervention.
Discussion: This protocol proposes an individualized, comprehensive, and interactive NeCR delivered using a hybrid approach and guided by an empowerment model to optimize health outcomes of CHD patients. The intervention content and web design are based on international health guidelines to improve credibility, comprehensibility and implementation. This study also proposes a new method of peer support in which the researcher shares participants’ progress toward goal attainment with the peer group. Results of this research have the potential to increase accessibility and availability of CR, improve cardiac rehabilitation service development in China, and inform eHealth lifestyle interventions.
Trial registration: Chinese Clinical Trial Registry: ChiCTR1800020411; date of registration: December 28, 2018.
42

Ashton, Daniel and Martin Couzins. "Content Curators as Cultural Intermediaries: “My reputation as a curator is based on what I curate, right?”". M/C Journal 18, no. 4 (August 11, 2015). http://dx.doi.org/10.5204/mcj.1005.

Abstract
In 2011 The Economist alerted us to the claim that “digital data will flood the planet.” The exponential increase in data such as e-mails, Tweets and Instagram pictures underpins claims that we are living in an age of ‘infoglut’ (Andrejevic) and information superabundance (Internet Live Stats). Several years earlier, Shirky posed this as an issue not of “information overload” but of “filter failure” (Asay). Shirky’s claim suggests that we should not despair in the face of unmanageable volumes of content, but develop ways to make sense of this information – to curate. Reflecting on his experiences of curating the Meltdown Festival, David Byrne addressed the emergence of everyday curating practices: “Nowadays, everything and everyone can be curated. There are curators of socks, menus and dirt bike trails […] Anyone who has come up with a top-ten list is, in effect, a curator. And anyone who clicks ‘Like’ is a curator.” Byrne’s comments on socks and top ten lists capture how curating can be personal. In their discussion of curating as a new literacy practice, Potter and Gilje highlight how “as well as the institutional and professional contexts for such work through the centuries and across cultures, many people have made personal collections of texts and artefacts that have stood for them in the world” (123). The emergence of easily, and often freely, available content curating tools is linked to practices of accessible curating (Good). There has been a proliferation of content curating platforms and tools. Notwithstanding that accessibility and everyday usage are often the hallmark of content curating (for example, see Villi on social curating and user-distributed content), this article specifically focuses on content curating as a service.
Defining the content curator as “someone who continually finds, groups, organizes and shares the best and most relevant content on a specific issue online”, Bhargava in 2009 described content curating as the next big social media job of the future. Popova stresses the importance of authorship and approaching curating as a “form of creative labor in and of itself” and identifies content curators as “human sense-makers” in a culture of “information overload”. By addressing curating ‘content for others’ rather than other curatorship practices such as ‘content for me’ and ‘content about me’, we aim to offer insights into the professional and commercial practices of content curating. Through connecting autoethnographic research with academic literature on the concept of ‘cultural intermediaries’, we identify two ways of understanding professional content curating - connected cultural intermediation, and curating literacies. Researching Content Curators as Creative Labour In his introduction to Curation Nation, Rosenbaum suggests that there is “both amateur and professional curation, and the emergence of amateur or prosumer curators isn’t in any way a threat to professionals” (3). Likewise, we do not see a threat or tension between amateur and professional curating. We are, though, keen to address ‘professional’ strategic content curating for an intended audience as a notable difference and departure. To generate detailed insights into the role of the professional content curator we employed an autoethnographic approach. Holt’s review of the literature and his own experiences of autoethnography provide a helpful overview: “autoethnography is a genre of writing and research that connects the personal to the cultural, placing the self within a social context” (2). Specifically, we focus on Couzins’ personal experiences of content curating, his professional practices and his ‘cultural milieu’ (Reed-Danahay).
Couzins was a business-to-business journalist for 17 years before starting a content and communications agency that: helps organisations tell their story through curated and created stories; runs a media brand for corporate learning, which features curated content and a weekly curated e-mail; designs and delivers massive open online courses on the Curatr platform (a social learning platform designed for curating content). The research and writing process for our analysis was informed by Anderson’s approach to analytical autoethnography, and from this we stress that Couzins is a full member of the research setting. Our focus on his experiences also resonates with the use of first-hand narratives in media industries research (Holt and Perren). Following preliminary exchanges, including collaborative note taking and face-to-face conversations, Ashton created an interview schedule that was then reviewed and revised with Couzins. This schedule was used as the basis for a semi-structured interview of around 90 minutes. Both authors transcribed and coded the interview data. Through thematic analysis we identified and agreed on five codes: industry developments and business models; relationships with technologies; identifying and sharing information sources; curating literacies; expertise and working with/for clients. This research paper was then co-written. As a conversation with only two participants, our account runs up against the widely stated concern associated with autoethnography of observing too few cultural members and not spending enough time with others (Coffey; Ellis, Adams and Bochner). However, we would argue that the processes of dyadic interviewing underpinned by self-analysis provide accessible and “useful stories” (Ellis, Adams and Bochner).
Specifically, Anderson’s five features helped to guide our research and writing from documenting personal experience and providing insider perspectives, to broader generalisation and “theoretical development, refinement, and extension” (387). Indeed, we see this research as complementing and contributing to the large-scale survey research undertaken by Liu, which provides excerpts on how “technology bloggers and other professionals explain the value of curating in a networked world” (20). The major theme emerging from the interview exchange, perhaps not unexpectedly, is how professional content curating revolves around making sense of specific materials. In acting as a bridge between the content and publications of some and its reception by others, literature on cultural intermediaries was identified as a helpful conceptual pointer. The relevance of this concept and literature for exploring professional content curators is illuminated by Smith-Maguire and Matthews’ comments that “cultural intermediaries impact upon notions of what, and thereby who, is legitimate, desirable and worthy, and thus by definition what and who is not” (552). The process of curating content necessarily involves judgements on what is deemed to be desirable and worthy for clients. Scholarship on cultural intermediaries was explicitly explored in the interview and co-writing stages, and the following covers some of the meeting points between our research and this concept.

Content Curators as Cultural Intermediaries: Taste, Expertise and Value

The concept of cultural intermediaries has been explored by academics in relation to a range of industries. This paper does not necessarily seek to add content curators to the expanding list of occupations analysed through the cultural intermediaries’ lens. There are, though, a range of questions and prompts from studies on cultural intermediaries helpful for understanding the ways in which content is made sense of and circulated.
Smith-Maguire and Matthews’s 2012 article ‘Are We All Cultural Intermediaries Now?’ is particularly helpful for connecting content curating with debates on cultural intermediaries. They consider how cultural intermediaries “effect others’ orientation” (552), and the question they pose is directly relevant for thinking through distinctions in curating ‘for/about me’ and ‘for others’. The following statement by Couzins on a client relationship with a private membership network provides a useful account of what the job of content curating involves:

Each week I curate a set of articles or videos on hot topics that have been identified by network members. Once I have identified suitable content I upload links to their website, including a reading time and a short summary. These two elements serve to help members decide whether or not to read it and when to read it. For example, if they have a short train journey they might have time to read a ten-minute article. All articles are tagged so that the curated links become a deeper resource over time and members are alerted by e-mail each week when new links have been published.

The reference to “suitable content” highlights how the curator can shape a narrative by intentionally deciding what to keep in and what to leave out. Beyond this choice of what the client is directed to, there is also the importance of the “short summary” and thus how this curated content is packaged and made sense of by the curator for the client. McFall, in her contribution to the Cultural Intermediaries Reader, offers a specific lens for examining the distinctive filtering practice of content curators as acts of ‘economization’. McFall outlines how economization “involves the work of ‘qualifying’ behaviours, organizations and institutions as economic. This is positioned in contrast to the idea that there is some kind of mystery “x-factor” which defines things as inherently economic” (46). McFall explains how “things are rendered (i.e.
they become) economic through the actions of producers, governments, research organisations, media, consumers, and so forth. Economization allows for the ways things may, throughout their life cycle, move in and out of being economic” (46). Whilst McFall’s comments recognise how things may be rendered economic through, for example, a ‘top ten list’, we want to specifically examine what this rendering looks like with the ‘professional’ content curator. The act of filtering is one of rendering, and the content that is curated and shared (whether it be articles, videos, links, etc.) becomes economic within this specific context. Whilst there are many organisations that would provide the regular service of producing curated content, two distinctive approaches were revealed in our exchange. The first approach we identify concerns content curating as connected cultural intermediation, and the second concerns facilitating curating literacies and co-creation with clients.

Connected Cultural Intermediation

Connected cultural intermediation refers to how content curators can connect with their own clients and with producers of content. As the following explores, these connections are built around being explicit and open about the content that curators identify and how they filter it. Being open with producers of content was important as these connections could lead to future opportunities for Couzins to identify content for his clients. Couzins addresses the connections he makes in terms of transparency, stating: “you just need to have some more transparency around you as the curator, like who you are, who you represent, why you are doing it, and the scope of what you are looking at.” Part of this involves identifying his impact and influences as a taste-shaper.
Couzins remarks, “I'm creating a story […] but my point of view will be based on my interest in what I bring to the curating process.” Transparency was further presented as a part of the process in which judgements and validations are made: “My reputation as a curator is based on what I curate, right? So therefore it has to be as sound as it can be, and I try to be as dispassionate as I can be about this.” These comments capture how transparency is integral for how Couzins establishes his reputation as a content curator. Couzins promotes transparency in content usage by alerting content producers to where content is curated:

I also share on Twitter, so they [producers] know it's been shared because they are included in the retweets, for example. I would tell some people that I've linked to their stuff as well, and sometimes I would also say "Thank you," to so and so for linking to that. It's like thanking my supply chain, if you like. Because it's a network.

The act of retweeting also operates as a means to develop connections with producers of content. As well as indicating to producers that content is being used, Couzins’ reference to the “supply chain” indicates the importance he invests in establishing connections and the wide circulation of curated content. One approach to content curating as a commercial practice could be to limit access and create a “pay wall” style scenario in which the curated content can only be accessed after a payment. Curated materials could be sent directly to e-mail or uploaded to private websites. For Couzins, however, it is access to the flows of content and the connections with others that underpin and enable his content curating commercial practice. It is important for Couzins that curated content is available to the producers, clients and more publicly through his free-to-access website and Twitter feed.
The earlier reference to reputation as a curator in part concerned being dispassionate and enabling verification through open and explicit acknowledgements and links. This comment also addresses generating a reputation for “sound” information. The following comments pick this up and point to the need for engaging with different sources: “I have got my own bubble that I operate in, and it's really challenging to get out of that bubble, bring new stuff in, or review what's in there. I find myself sharing stuff quite a lot from certain places, sometimes. You've got to work at that. It's hard.” Working hard to escape from the bubble was part of Couzins’ work to build his reputation. Acknowledging producers is both a way for Couzins to promote transparency around his filtering and a way to foster new sources of content to escape the bubble: “I do build some relationships with some of the producers, because I get to know them, and I thank them, and I say, ‘That's really good,’ so I have a lot of relationships with people just through their content. But it's not a commercial relationship.” Whilst Couzins suggests there may not be a commercial relationship with the producers of content, economic significance can be seen in two ways. Firstly, moving outside of the ‘bubble’ can help the content curator make more diverse contributions and establish a reputation for this. Secondly, these connections can be of benefit to content producers as their material further circulates. Whilst there is no payment to the content producer, Couzins directs this curated content to clients and does not restrict wider public access to it in this curated form. McFall’s comments on economization stress the role of cultural intermediaries in the life cycle of how things “move in and out of being economic.” With content curating there is a ‘rendering’ of content that sees it become significant in new contexts. Here, the obvious relationship may be between Couzins and his clients. 
The references to transparent relationships and thanking the ‘supply chain’ show that relationships with content producers are also crucial and that the content curator needs to be continually connecting.

Curating Literacies and Co-Creation

In the earlier discussion of cultural intermediaries we addressed framing and how judgements can shape legitimacy, desirability and worth. With the content curating cultural intermediary practices under discussion here, a different perspective is possible in which subject specialist knowledge and expertise may not necessarily be the primary driver behind clients’ needs. The role of the content curator as cultural intermediary here is still in the rendering of content. However, it does not specifically involve selecting content, but rather guiding others in the framing and circulation of content. Where the concern of the client is finding appropriate ways to share the materials that they identify, then the content curator may not need subject-specialist knowledge. Here the interest in the content curating service is on the curating processes and practices, rather than content knowledge. Couzins’ account revealed that in some cases the intermediary’s role is not about the selection of material, but in the framing of material already selected by clients for wider engagement: “People internally will decide what to share and tell you why it’s worth looking at. Basically, it’s making them look like they know what they’re talking about, building their credibility.” The content curator role here does not concern selecting content, but offering guidance on how to frame it. As such, there remains a crucial “rendering” role in which the content selected by clients becomes meaningful through the guidance and input of the professional content curator. Our interview exchange also identified another scenario of relationships in which the content curator has little involvement in either selecting or framing content.
Part of the commercial activity explored in our interview included supporting staff within a client’s company to showcase their expertise and knowledge through both selecting and framing content. Specifically, as Couzins outlines, this could be undertaken as a bespoke face-to-face service with in-house training:

I helped one client put a curation tool into their website enabling them to become the curators. I spent a lot of time talking to them about curation and their role as curators. We split curation into topics of interest – based on the client’s area of expertise – which would be useful both internally for personal/professional development and externally for business development by sharing content relevant for their customers.

In this respect, the service provided by the content curator involves the sharing of their own curatorial expertise with clients in order that clients may undertake their own curatorial practice. There is for the client a similar concern with enhancing their social reputation and profile, but this approach stresses the expertise of clients in identifying and responding to their own content curating needs. In part this emphasis on the client’s selection of material is about the challenges of establishing and maintaining legitimacy as a content curator across several fields: “You can't begin to say you're an expert when you are not, because you'll be found out.” More than this though, it was an approach to content curating literacy. As Couzins states, “my view is that if people with the domain expertise have an interest in doing this […] then they should be doing it themselves. If you gave it to me I hold the keys to all your knowledge. Why would you want that?” The content curator and client exchange here is not restricted to gathering interests and then providing content.
This scenario sees content curating as accessible, but also sees the curator in their continued role as cultural intermediary, where expertise is about mediation more than content. Returning to McFall’s comments on rendering as how things “move in and out of being economic”, our second understanding of content curating intermediaries directs attention away from what the things/content are to instead how those things move.

Conclusion

Set within debates and transformations around content and information abundance and filtering, this paper explored how the practices of filtering, finding, and sharing at the heart of content curating have much in common with the work of cultural intermediaries. Specifically, this paper identified two ways of understanding commercial content curating. Firstly, content curating involves the rendering of content, and the ability to succeed here relies on developing connections outside the curator-client dynamic. Secondly, professional content curators can approach their relationships with clients as one of facilitation in which the expertise of the content curating cultural intermediary does not rest with the content, but on the curating and the intermediating.

References

Anderson, Leon. “Analytical Autoethnography.” Journal of Contemporary Ethnography 35.4 (2006): 373-395.
Andrejevic, Mark. Infoglut: How Too Much Information Is Changing the Way We Think and Know. London: Routledge, 2013.
Asay, Matt. “Problem Is Filter Failure, Not Info Overload.” CNet Jan. 2009. 1 Jun. 2015 ‹http://www.cnet.com/uk/news/shirky-problem-is-filter-failure-not-info-overload/›.
Bhargava, Rohit. “Manifesto for the Content Curator: The Next Big Social Media Job of the Future?” 2009. 2 June 2015 ‹http://rohitbhargava.typepad.com/weblog/2009/09/manifesto-for-the-content-curator-the-next-big-social-media-job-of-the-future-.html›.
Byrne, David. “David Byrne: A Great Curator Beats Any Big Company’s Algorithm.” New Statesman June 2015. 8 June 2015.
‹http://www.newstatesman.com/2015/05/man-versus-algorithm›.
Coffey, Amanda. The Ethnographic Self. London: Sage, 1999.
The Economist. “Drowning in Numbers.” 2011. 4 June 2015 ‹http://www.economist.com/blogs/dailychart/2011/11/big-data-0›.
Ellis, Carolyn, Tony E. Adams, and Arthur P. Bochner. “Autoethnography: An Overview.” Forum: Qualitative Social Research 12.1 (2011).
Good, Robin. “Content Curation Tools Supermap.” Pearl Trees Apr. 2015. 4 June 2015 ‹http://www.pearltrees.com/robingood/content-curating-supermap/id5947231›.
Holt, Jennifer, and Alisa Perren. “Introduction: Does the World Really Need One More Field of Study?” Media Industries: History, Theory, and Methods. Eds. Jennifer Holt and Alisa Perren. Chichester: Wiley-Blackwell, 2009. 1-16.
Holt, Nicholas L. “Representation, Legitimation, and Autoethnography: An Autoethnographic Writing Story.” International Journal of Qualitative Methods 2.1 (2003): 1-22.
Internet Live Stats. 5 June 2015 ‹http://www.internetlivestats.com/one-second/›.
Krysa, Joasia. Curating Immateriality: The Work of the Curator in the Age of Network Systems. New York: Autonomedia, 2006.
Liu, Sophia B. “Trends in Distributed Curatorial Technology to Manage Data Deluge in a Networked World.” Upgrade: The European Journal for the Informatics Professional 11.4 (2010): 18-24.
Matthews, Julian, and Jennifer Smith-Maguire. “Introduction: Thinking with Cultural Intermediaries.” The Cultural Intermediaries Reader. Eds. Julian Matthews and Jennifer Smith-Maguire. London: Sage, 2014. 1-11.
McFall, Liz. “The Problem of Cultural Intermediaries in the Economy of Qualities.” The Cultural Intermediaries Reader. Eds. Julian Matthews and Jennifer Smith-Maguire. London: Sage, 2014. 42-51.
Popova, Maria. “In a New World of Informational Abundance, Content Curation Is a New Kind of Authorship.” Nieman Lab June 2011.
6 June 2015 ‹http://www.niemanlab.org/2011/06/maria-popova-in-a-new-world-of-informational-abundance-content-curation-is-a-new-kind-of-authorship/›.
Potter, John, and Øystein Gilje. “Curatorship as a New Literacy Practice.” E-Learning and Digital Media 12.2 (2015): 123-127.
Reed-Danahay, Deborah. Auto/Ethnography. New York: Berg, 1997.
Rosenbaum, Steve. Curation Nation: How to Win in a World Where Consumers Are Creators. New York: McGraw-Hill Professional, 2011.
Smith-Maguire, Jennifer, and Julian Matthews. “Are We All Cultural Intermediaries Now?” European Journal of Cultural Studies 15.5 (2012): 551-562.
Sun, Will. “Introducing LinkedIn Elevate: Helping Companies Empower Their Employees to Share Content.” Apr. 2015. 29 May 2015 ‹http://blog.linkedin.com/2015/04/13/elevate/›.
Villi, Mikko. “Social Curation in Audience Communities: UDC (User-Distributed Content) in the Networked Media Ecosystem.” Participations 9.2 (2012): 614-632.
43

Reifegerste, Doreen, and Annemarie Wiedicke. "Quality (Health Coverage)". DOCA - Database of Variables for Content Analysis, 26 March 2021. http://dx.doi.org/10.34778/2a.

Full text
Abstract
To judge the quality of the media coverage of health information, research mostly focuses on ten criteria: adequate discussion of costs, quantification of benefits, adequate explanation and quantification of potential harms, comparison of the new idea with existing alternatives, independence of sources and discussion of potential conflicts of interest, avoidance of disease mongering, review of the methodology or the quality of the evidence, discussion of the true novelty and availability of the idea, approach or product, as well as giving information that goes beyond a news release (Schwitzer, 2008, 2014; Smith et al., 2005). Other quality dimensions applied in content analyses of health news coverage are diversity, completeness, relevance, understandability and objectiveness (Reineck, 2014; Reineck & Hölig, 2013). These criteria are increasingly relevant as people use online health information more frequently and in addition to the information from their physician for medical decision making (Wang, Xiu, & Shahzad, 2019). Thus, analyzing the quality of health content in media coverage becomes even more relevant. As Schwitzer (2017) points out, there is a variety of quality problems due to hurried, incomplete, poorly researched news. To measure quality, the content of health news coverage can be compared to the content of the original research paper (e.g., Ashorkhani et al., 2012), or the quality of media content is continuously judged by journalists, medical experts or independent organizations such as HealthNewsReview with respect to different criteria (e.g., Schwitzer, 2008; Selvaraj et al., 2014).
Field of application/theoretical foundation: Online health information, medical decision making, journalism studies

References/combination with other methods: Focus group discussions with journalists, editors-in-chief and news gatekeepers (Ashorkhani et al., 2012); focus group discussions with consumers of health information (Marshall & Williams, 2006)

Example studies: Anhäuser & Wormer (2012); Schwitzer (2008); Wormer (2014); Reineck & Hölig (2013); Reineck (2014)

Information on Reineck & Hölig, 2013

Authors: Dennis Reineck, Sascha Hölig
Research question: Which factors contribute to the quality of health journalism?
Object of analysis: Sample of all health-related articles in four German newspapers: Süddeutsche Zeitung (n = 167), Die Welt (n = 426), Frankfurter Rundschau (n = 219) and die tageszeitung (n = 84)
Time frame of analysis: March 1, 2010 to February 28, 2011

Info about variables

Variables: Variables defining five dimensions of quality for health-related newspaper articles, plus the deduction of a quality index: 0 to 100 points are coded for each indicator of the different variables, and a quality index is deduced for each article based on these points.
Level of analysis: news article

Quality dimensions, variables and indicator(s):

Diversity (rH = 0.78)
- Quantitative diversity: length of the article
- Source diversity: number of sources
- Opinion diversity: discussion of contrary opinions

Completeness (rH = 0.86)
- Journalistic completeness: for diseases, information about prevention, symptoms and remedies
- Scientific completeness: for research studies, information about method, sample and results
- Risks: for treatment options, addressing of risks and side effects

Relevance (rH = 0.85)
- Source credibility: sources with the highest reputation
- Usefulness: take-home messages, references to additional information
- Newsworthiness: news factors (e.g., topicality)

Understandability (rH = 0.86)
- Simplicity: simplicity vs. complexity of language
- Structure: well-structured vs. inadequately structured presentation
- Conciseness: concise vs. circuitous presentation
- Storytelling: storytelling vs. matter-of-fact presentation

Objectiveness (rH = 0.95)
- Emotionalization: emotional language
- Dramatization: dramatization of information

References

Anhäuser, M., & Wormer, H. (2012). A question of quality: Criteria for the evaluation of science and medical reporting and testing their applicability. PCST 2012 Book of Papers: Quality, Honesty and Beauty in Science and Technology Communication. http://www.medien-doktor.de/medizin/wp-content/uploads/sites/3/downloads/2014/04/Paper-Florenz.pdf
Ashorkhani, M., Gholami, J., Maleki, K., Nedjat, S., Mortazavi, J., & Majdzadeh, R. (2012). Quality of health news disseminated in the print media in developing countries: A case study in Iran. BMC Public Health, 12, 627. https://doi.org/10.1186/1471-2458-12-627
Marshall, L. A., & Williams, D. (2006). Health information: Does quality count for the consumer? Journal of Librarianship and Information Science, 38(3), 141–156. https://doi.org/10.1177/0961000606066575
Reineck, D. (2014). Placebo oder Aufklärung mit Wirkpotenzial? Eine Diagnose der Qualität der Gesundheitsberichterstattung in überregionalen Tageszeitungen. In V. Lilienthal (Ed.), Qualität im Gesundheitsjournalismus: Perspektiven aus Wissenschaft und Praxis (Vol. 325, pp. 39–60). Springer VS. https://doi.org/10.1007/978-3-658-02427-7_3
Reineck, D., & Hölig, S. (2013). Patient Gesundheitsjournalismus: Eine inhaltsanalytische Untersuchung der Qualität in überregionalen Tageszeitungen. In C. Rossmann & M. R. Hastall (Eds.), Medien + Gesundheit: Band 6. Medien und Gesundheitskommunikation: Befunde, Entwicklungen, Herausforderungen (1st ed., pp. 19–31). Nomos.
Schwitzer, G. (2008). How do US journalists cover treatments, tests, products, and procedures? An evaluation of 500 stories. PLoS Medicine, 5(5), e95.
Schwitzer, G. (2014). A guide to reading health care news stories. JAMA Internal Medicine, 174(7), 1183–1186. https://doi.org/10.1001/jamainternmed.2014.1359
Schwitzer, G. (2017). Pollution of health news. BMJ (Clinical Research Ed.), 356, j1262. https://doi.org/10.1136/bmj.j1262
Selvaraj, S., Borkar, D. S., & Prasad, V. (2014). Media coverage of medical journals: Do the best articles make the news? PloS One, 9(1), e85355. https://doi.org/10.1371/journal.pone.0085355
Smith, D. E., Wilson, A. J., & Henry, D. A. (2005). Monitoring the quality of medical news reporting: Early experience with media doctor. The Medical Journal of Australia, 183(4), 190–193.
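The quality-index construction described in the Reineck & Hölig (2013) entry (indicators coded 0 to 100, grouped into five quality dimensions, and reduced to one index per article) can be sketched in a few lines of Python. Note this is only an illustrative sketch: the exact aggregation rule is not stated in the entry, so simple means are assumed here, and all indicator scores below are invented example values, not data from the study.

```python
# Hypothetical sketch of a per-article quality index in the style of
# Reineck & Hölig (2013). Assumption: each dimension score is the mean of
# its indicator codes, and the article index is the mean of the dimensions.
from statistics import mean

# Invented indicator codes (0-100) for a single article, grouped by dimension.
article_codes = {
    "diversity": {"quantitative_diversity": 70, "source_diversity": 55,
                  "opinion_diversity": 40},
    "completeness": {"journalistic_completeness": 80,
                     "scientific_completeness": 60, "risks": 55},
    "relevance": {"source_credibility": 90, "usefulness": 65,
                  "newsworthiness": 70},
    "understandability": {"simplicity": 85, "structure": 70,
                          "conciseness": 60, "storytelling": 55},
    "objectiveness": {"emotionalization": 95, "dramatization": 90},
}

def dimension_scores(codes):
    """Average the indicator codes within each quality dimension."""
    return {dim: mean(indicators.values()) for dim, indicators in codes.items()}

def quality_index(codes):
    """Collapse the dimension scores into one 0-100 index for the article."""
    return mean(dimension_scores(codes).values())

print(dimension_scores(article_codes))
print(quality_index(article_codes))  # → 71.0 for these example codes
```

A weighted mean (or a different scheme per dimension) could be substituted without changing the overall shape of the computation.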
44

Antonio, Amy Brooke, and David Tuffley. "Promoting Information Literacy in Higher Education through Digital Curation". M/C Journal 18, no. 4 (10 August 2015). http://dx.doi.org/10.5204/mcj.987.

Full text
Abstract
This article argues that digital curation—the art and science of searching, analysing, selecting, and organising content—can be used to promote the development of digital information literacy skills among higher education students. Rather than relying on institutionally approved journal articles that have been pre-ordained as suitable for a given purpose, digital curation tools allow students to evaluate the quality of Web-based content and then present it in an attractive form, all of which contributes to the cultivation of their digital literacy skills. We draw on a case study in which first-year information and communications technology (ICT) students used the digital curation platform Scoop.it to curate an annotated collection of resources pertaining to a particular topic. The notion of curation has undergone a significant transformation in the wake of an increasingly digital society. The verb “to curate,” which traditionally referred to “taking care,” has morphed into a process of cataloguing, accessing, and representing artefacts. In the digital age, curation is a way of sifting, organising, and making sense of the plethora of information; it has become an important life skill without which one cannot fully participate in digital life. Moreover, the ready availability of information, made possible by the ubiquity of Internet technology, makes digital curation an essential skill for the 21st century learner. In answer to this need, we are seeing the emergence of suites of digital tools, dubbed “curation” tools, that meet the perceived need to locate, select, and synthesise Web content into open, user-organised collections. With information overload, a distinctive feature of the Internet, the ability to sift through the noise and dross to select high-quality, relevant content—selected on the basis of authority, currency, and fitness-for-purpose—is indeed a valuable skill.
To examine this issue, we performed a case study in which a group of first-year Information and Communication Technology (ICT) students curated Web-based resources to inform an assessment task. We argue that curation platforms, such as Scoop.it, can be effective at cultivating the digital information literacy skills of higher education students.

Digital Curation

Traditionally, curation is a practice most commonly associated with the art world—something reserved for the curators of art exhibitions and museums. However, in today’s world, digital curation tools, such as Scoop.it, make it possible for the amateur curator to collect and arrange content pertaining to a particular topic in a professional way. While definitions of curation in the context of the online environment have been proposed (Scime; Wheeler; Rosenbaum), these have not been aligned to the building of core digital information literacy competencies. The digital curator must give due consideration to the materials they choose to include in a digital collection, which necessitates engaging in a certain amount of metacognitive reasoning. For the purpose of this article, the following definition of digital curation is proposed: “Curation can be summarised as an active process whereby content/artefacts are purposely selected to be preserved for future access. In the digital environment, additional elements can be leveraged, such as the inclusion of social media to disseminate collected content, the ability for other users to suggest content or leave comments and the critical evaluation and selection of the aggregated content” (Antonio, Martin, and Stagg). This definition exemplifies the digital information literacy skills at work in the curation of digital content. It can be further broken down to elucidate the core competencies involved: “Curation can be summarised as an active process whereby content/artefacts are purposely selected” (Antonio, Martin, and Stagg).
The user, who curates a particular topic, actively chooses the content they want to appear in their collection. The content must be relevant, up-to-date, and from reputable sources or databases. Achieving this requires a degree of information literacy both in terms of justifying the content that is selected and, conversely, that which is not. The second part of the definition is: “In the digital environment, additional elements can be leveraged, such as the inclusion of social media to disseminate collected content, the ability for other users to suggest content or leave comments” (Antonio, Martin, and Stagg). The digital curator is engaged and immersed in Web 2.0 technologies, ranging from the curation tools themselves to social media platforms such as Facebook and Twitter. The use of these tools thus requires at least basic digital literacy skills, which can potentially be further developed through continued engagement with them. Finally, curation involves the “human-mediated automation of content collection” (Antonio, Martin, and Stagg). The curator must accept or reject the content generated by the search algorithm, which necessitates a level of metacognitive analysis to determine the value of a piece of content. While there are countless tools laying claim to the digital curation label, including Pinterest, Storify, and Pearltrees, Scoop.it was selected for this study, as the authors consider that it adheres most closely to the stated definition of curation. Scoop.it requires the user to define the sources from which content will be suggested and to make an informed decision about which pieces of content are appropriate for the collection they are creating. This requires the curator to critically evaluate the relevance, currency, and validity (information literacy) of the suggested materials. Additionally, users can include content from other Scoop.it pages, which is referred to as “re-scooping”.
Scoop.it therefore relies on an active editorial role undertaken by the user in the selection, or rejection, of content. That is, the owner of a particular collection makes the final decision regarding what will appear on their Scoop.it page. The content is then displayed visually, with the collection growing as new content is added. The successful use of Scoop.it depends on the curator’s ability to interpret and critically assess digital information. This study is thus built on the premise that the metacognitive processes inherent in the discovery of traditional, non-Web-based information are transferable to the digital environment and that Scoop.it can, as such, be utilised for the cultivation of digital information literacy skills.

Digital Information Literacy

According to the Laboratory for Innovative Technology in Education at the University of Houston, “digital information literacy” refers to the ability to effectively analyse and evaluate evidence; to analyse and evaluate alternate points of view; to synthesise and make connections between information and arguments; and to reflect critically, interpret, and draw conclusions based on analysis. Research suggests that the digital information literacy skills of higher education students are inadequate (White; Antonio, Tuffley and Martin) and that further training in how to assess the value, credibility, and reliability of information is required. According to CIBER’s Information Behaviour report, students often believe that they are information literate (based on their ability to check the validity of sources) and yet, in reality, their methods may not be sufficiently rigorous to qualify. Students may not be adequately equipped with the information literacy skills required to retrieve and critically evaluate sources outside of those that are institutionally provided, such as textbooks and assigned readings.
Moreover, a report by the Committee of Inquiry (Hughes) addresses both the digital divide among students and the responsibility of the higher education sector to ensure that students are equipped with the information literacy skills required to search, authenticate, and critically evaluate material from multiple sources. Throughout history, educators have been teaching traditional literacy skills—reading, writing, finding information in libraries—to students. However, in an increasingly digital society, where a wealth of information is available online, higher education institutions need to teach students how to apply these metacognitive skills—searching, retrieving, authenticating, critically evaluating, and attributing material—to the online environment. Many institutions continue to adhere to the age-old practice of exclusive use of peer-reviewed sources for assessment tasks (Antonio and Tuffley). We argue that this is an unnecessary limitation; when students are denied access to non-peer-reviewed Web-based resources, they are not developing the skills they need to determine the credibility of digital information. While it is not suggested that the solution is to simply allow students to use Wikipedia as their primary reference point, we acknowledge that printed texts and journal articles are not the only sources of credible, authoritative information. The current study is thus built on the premise that students need opportunities to help them develop their digital information literacy skills and, in order to do this, they must interact with and utilise Web-based content. The desirability of using curation tools for developing students’ digital information literacy skills thus forms the foundation of this article.

Method

For the purpose of this study, a group of 258 first-year students enrolled in a Communications for ICT course curated digital content for the research component of an assessment task.
These ICT students were selected, firstly, because a level of proficiency with digital technology was assumed and, secondly, because previous course evaluations indicated a desire on the part of the students for technology to be integrated into the course, as a traditional essay was deemed unsuitable for ICT students. The assignment consisted of two parts: a written essay about an emerging technology and an annotated bibliography. The students were required to create a Scoop.it presentation on a particular area of technology and curate content that would assist the essay-writing component of the task. On completion of the assessment task, the students submitted their Scoop.it URL to the course lecturer and were invited to complete an anonymous online survey. The survey consisted of 20 questions—eight addressed demographic factors, three were open-ended (qualitative), and nine multiple-choice items specifically assessed the students’ beliefs about whether or not the digital curation task had helped them develop their digital information literacy skills. The analysis below pertains to these nine multiple-choice items.

Results and Discussion

Of the 258 students who completed the assessment task, 89 participated in the survey. The students were asked: “What were the primary benefits of using the curation tool Scoop.it?” The students were permitted to select multiple responses for this item: 69% of participants said the primary benefit of using Scoop.it was “Engaging with my topic”, while 62% said “Learning how to use a new tool”, and 53% said “Learning how to assess the value of Web-based content” was the primary benefit of the curation task. This suggests that the process of digital curation as described in this project could, potentially, be used to enhance students’ digital information literacy skills. It is noteworthy that the participants in this study were not given any specific instructions on how to assess online information before doing the assignment.
They were presented with a one-page summary of what constitutes an annotated bibliography; however, a specific set of guidelines for the types of processes that could be considered indicative of digital information literacy skills was not provided. This might have included checking the date, for currency; author credentials; cross-checking with other sources, etc. It is therefore remarkable that more than 50% of respondents believed that the act of curation had positively impacted their performance on this assessment task and enhanced their ability to critically assess the value of Web-based content. This strongly suggests that the simple act of being exposed to online information, and using it in a purposeful way (in this case to research an emerging technology), can aid the development of critical thinking skills. The students were asked to indicate the extent to which they agreed or disagreed with a series of eight statements, each of which addressed a specific component of digital information literacy. Responses were presented on a Likert scale ranging from strongly agree to strongly disagree. The students’ responses for strongly agree and agree, and for strongly disagree and disagree, were conflated.

Statement 1: The use of Scoop.it helped me develop my critical thinking skills.

44% of respondents agreed that the curation tool Scoop.it had helped them develop their critical thinking skills and 30% disagreed.

Statement 2: As a result of using Scoop.it, I feel I can make judgments about the value of digital content.

43% of respondents agreed, compared to 22% who disagreed, that the curation tool Scoop.it had helped them make judgments about the value of digital content.

Statement 3: As a result of using Scoop.it, I feel I can synthesise and organise ideas and information.

58% of respondents agreed that curation via Scoop.it helped them synthesise and organise ideas and information, while 14% disagreed.
Statement 4: As a result of using Scoop.it, I feel I can make judgments about the currency of information.

43% of respondents agreed and 21% disagreed that using Scoop.it had assisted them in their ability to make judgments about the currency of information.

Statement 5: As a result of using Scoop.it, I feel I can analyse content in depth.

37% of respondents agreed that the curation task had helped them analyse content in depth. In contrast, 21% disagreed.

Statements 1 to 5 each address a specific component of digital information literacy—the ability to think critically; to make judgments about the value of content; to synthesise and organise ideas and information; to make judgments about the currency of the information; and to analyse content in depth. In response to each of these five components, a greater percentage of students agreed than disagreed that the Scoop.it task helped them develop their digital information literacy skills. By its very nature, Scoop.it generates content based on the keyword parameters entered by the user when creating a given topic. The user is then responsible for trawling through and evaluating this content in order to make an informed decision about what content they wish to appear on their Scoop.it page. As such, it is perhaps not particularly surprising that the students in this study indicated that the practice of curating content helped them develop their digital information literacy skills. It would, however, be interesting to explore whether or not these students were confident in their abilities prior to undertaking the Scoop.it task, as previous research (CIBER) suggests. Without this information, it is difficult to draw conclusions about the success, or otherwise, of the curation task for cultivating the digital information literacy skills of higher education students.

Statement 6: As a result of using Scoop.it, I feel able to cite Web-based information.
48% of respondents agreed that using Scoop.it had assisted them in citing Web-based information, while 24% disagreed.

Statement 7: As a result of using Scoop.it, I feel confident in my ability to use Web-based content in my assignments.

52% of respondents believed that using Scoop.it to curate resources had positively contributed to their confidence in using Web-based content for their assignments, compared to 17% who disagreed. The results of statements 6 and 7 indicate that further instruction in using and citing non-peer-reviewed online resources may be required; however, this will not be possible if higher education institutions continue to mandate the exclusive use of journal articles and textbooks, to the exclusion of other non-peer-reviewed Web-based information, such as blogs and wikis. More than half of the students were more confident using digital information following the Scoop.it task, which suggests that the opportunity to engage with the alternate sources of information generated by the Scoop.it platform (such as blogs, wikis, and digital newspapers) encouraged the students to think critically about how such sources can be incorporated into academic writing.

Statement 8: As a result of using Scoop.it, I feel I can distinguish between good and bad Web-based content.

While 38% of students who responded to the survey said that using Scoop.it to curate content had enabled them to distinguish between high- and low-quality information, 25% did not believe that this was the case, and an additional 37% were neither confident nor unconfident about distinguishing between good and bad Web-based content. In keeping with previous research (White), the results of this case study suggest that, while many students believed that the Scoop.it task encouraged them to think critically about the quality of non-peer-reviewed digital resources, they were not necessarily confident in their ability to distinguish good from poor content.
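The conflation step reported in these results (collapsing strongly agree with agree, and strongly disagree with disagree, then expressing each group as a percentage of respondents) is simple arithmetic. A minimal sketch, using made-up counts rather than the study's raw data:

```python
# Hypothetical Likert-scale counts for one survey statement
# (illustrative only; the study reports percentages, not raw counts).
responses = {
    "strongly agree": 12,
    "agree": 31,
    "neither": 22,
    "disagree": 18,
    "strongly disagree": 6,
}

def conflate(responses):
    """Collapse a five-point Likert distribution into agree/neither/disagree
    percentages, as described in the analysis above."""
    total = sum(responses.values())
    agree = responses["strongly agree"] + responses["agree"]
    disagree = responses["strongly disagree"] + responses["disagree"]
    return {
        "agree": round(100 * agree / total),
        "neither": round(100 * responses["neither"] / total),
        "disagree": round(100 * disagree / total),
    }

print(conflate(responses))  # → {'agree': 48, 'neither': 25, 'disagree': 27}
```

Reporting only the two conflated extremes, as the abstract does, leaves the neutral middle implicit; including it makes clear why agree and disagree percentages need not sum to 100.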
The implication, as Hughes contends, is that there is a need for educators to ensure that higher education students are equipped with these metacognitive skills prior to leaving university, as it is imperative that we produce graduates who can function in an increasingly digital society.

Conclusion

The rising tide of digital information in the twenty-first century necessitates the development of new approaches to making sense of the information found on the World Wide Web. Such is the exponentially expanding volume of this information on the Web—so-called ‘big data’—that, unless a new breed of tools for sifting and arranging information is made available to those who use the Web for information-gathering, their capacity to deal with the volume will be overwhelmed. The new breed of digital curation tools, such as Scoop.it, is a rational response to this emerging issue. We have made the case that, by using digital curation tools to make sense of data, users are able to discern, at least to some extent, the quality and reliability of information. While this is not a substitute for peer-reviewed, academically rigorous sources, digital curation tools arguably have a supplementary role in an educational context—perhaps as a preliminary method for gathering general information about a topic area before diving deeper with peer-reviewed articles in the second pass. This dual perspective may prove to be a beneficial approach, as it has the virtue of considering both the breadth and depth of a topic. Even without formal instruction on assessing the value of Web content, no less than 53% of participants felt the primary benefit of using the digital curation tool was assessing the value of such content. This result strongly indicates the potential benefits of combining digital curation tools with formal content-evaluation instruction. This represents a promising avenue for future research.

References

Antonio, A., N. Martin, and A. Stagg.
“Engaging Higher Education Students via Digital Curation.” Future Challenges, Sustainable Futures (2012). 10 Feb. 2013 ‹http://www.ascilite.org.au/conferences/wellington12/2012/pagec16a.html›.
Antonio, A., D. Tuffley, and N. Martin. “Creating Active Engagement and Cultivating Information Literacy Skills via Scoop.it.” Electric Dreams (2013). 4 May 2015 ‹http://www.ascilite.org.au/conferences/sydney13/program/proceedings.pdf›.
Antonio, A., and D. Tuffley. “Creating Educational Networking Opportunities with Scoop.it.” Journal of Creative Communications 9.2 (2014): 185-97.
CIBER. “Information Behaviour of the Researcher of the Future.” JISC (2008). 5 May 2015 ‹http://partners.becta.org.uk/index.php?section=rh&catcode=_re_rp_02&rid=15879›.
Hughes, A. “Higher Education in a Web 2.0 World.” JISC (2009). 6 May 2015 ‹http://www.jisc.ac.uk/publications/generalpublications/2009/heweb2.aspx›.
Laboratory for Innovative Technology in Education. “New Technologies and 21st Century Skills.” (2013). 5 May 2015 ‹http://newtech.coe.uh.edu/›.
Rosenbaum, S. “Why Content Curation Is Here to Stay.” Mashable 3 Mar. 2010. 7 May 2015 ‹http://mashable.com/2010/05/03/content-curation-creation/›.
Scime, E. “The Content Strategist as Digital Curator.” Dopedata 8 Dec. 2009. 30 May 2012 ‹http://www.dopedata.com/2009/12/08/the-content-strategits-as-digital-curator/›.
Wheeler, S. “The Great Collective.” 2011. ‹http://stevewheeler.blogspot.com.au/2011/07/great-collective.html#!/2011/07/great-collective.html›.
White, D. “Digital Visitors and Residents: Progress Report.” JISC (2012). 28 May 2012 ‹http://www.oclc.org/research/activities/vandr.html›.
45

Stojanovski, Jadranka. "New directions in scholarly publishing: journal articles beyond the present". Septentrio Conference Series, no. 1 (8 December 2014). http://dx.doi.org/10.7557/5.3228.

Full text
Abstract
>> See video of presentation (28 min.) The primary goal of scholarly communication is improving human knowledge, and sharing is the key to achieving this goal: sharing ideas, sharing methodologies, sharing results, sharing data, information, and knowledge. Although the concept of sharing applies to all phases of scholarly communication, most often the only visible part is the final publication, with the journal article as the most common type. The traditional characteristics of present journals allow only limited possibilities for sharing knowledge. The basic functions (registration, dissemination, certification, and storage) are still present, but they are no longer effective in the network environment. Registration is too slow, there are various barriers to dissemination, the certification system has many shortcomings, and the formats used are not suitable for long-term preservation and storage. Although journals today are digital and various powerful technologies are available, they are still focused on their unaltered printed versions. This presentation will discuss the possible evolution of the journal article to become more compliant with users' needs and to enable “the four R’s of openness” – reuse, redistribute, revise and remix (Hilton, Wiley, Stein, & Johnson, 2010). Several aspects of openness will be presented and discussed: open access, open data, open peer review, open authorship, and open formats. With digital technology, which has become indispensable in the creation, collection, processing and storage of data in all scientific disciplines, the way of conducting scientific research has changed and the concept of "data-driven science" has been introduced (Ware & Mabe, 2009).
Sharing research data enhances the capability of reproducing results; reuse maximizes the value of research, accelerating the advancement of science, ensuring the transparency of scientific research, reducing the possibility of bias in the interpretation of results, and increasing the credibility of published scientific knowledge. Open peer review can ensure full transparency of the entire assessment process and help to solve many problems in present scholarly publishing. Through the process of open peer review, each manuscript can be immediately accessible, reviewers can publicly demonstrate their expertise and could be rewarded, and readers can be encouraged to offer comments and views and to become an active part of the scholarly communication process. The trend to describe the author's contribution is also present, which will certainly lead to a reduced number of "ghost", "guest" and "honorary" authors, and will help to establish better standards for author identification. Various web technologies can also be used for the semantic enhancement of the article. One of the most important aspects of semantic publication is the inclusion of research data, to make them available to the user as active data that can be manipulated. It is possible to integrate data from external sources, or to merge data from different resources (data fusion) (Shotton, 2012), so the reader can gain further understanding of the presented data. Additional options provide for merging data from different articles, with the addition of the component of time. Other semantic enhancements can include an enriched bibliography, interactive graphical presentations, hyperlinks to external resources, tagged text, etc. Instead of mostly static content, journals can offer readers dynamic content that includes multimedia, "living mathematics", “executable articles”, etc.
Videos highlighting critical points in the research process, 3D representations of chemical compounds or art works, audio clips with the author's reflections and interviews, and animated simulations or models of ocean currents, tides, temperature and salinity structure could soon become a common part of every research article. The diversity of content and media, operating systems (GNU/Linux, Apple Mac OS X, Microsoft Windows), and software tools that are available to researchers suggests the usage of appropriate open formats. Different formats have their advantages and disadvantages, and it would be necessary to make multiple formats available, some of which are suitable for "human" reading (including printing on paper), and some for machine reading, so that they can be used by computers without human intervention. The characteristics and possibilities of several formats will be discussed, including XML as the most recommended format, which can enable a granular document structure as well as deliver semantics to the human reader or to the computer.

Literature:
Hilton, J. I., Wiley, D., Stein, J., & Johnson, A. (2010). The Four R’s of Openness and ALMS Analysis: Frameworks for Open Educational Resources. Open Learning: The Journal of Open, Distance and E-Learning, 25(1), 37–44. doi:10.1080/02680510903482132
Shotton, D. (2012). The Five Stars of Online Journal Articles - a Framework for Article Evaluation. D-Lib Magazine, 18(1/2), 1–16. doi:10.1045/january2012-shotton
Ware, M., & Mabe, M. (2009). The STM Report (p. 68).
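As a rough illustration of the granular, machine-readable structure the presentation attributes to XML, the following Python sketch builds and queries a tiny article fragment with an inline tagged data value. The element and attribute names here are hypothetical (loosely JATS-flavoured), not ones the presentation prescribes:

```python
import xml.etree.ElementTree as ET

# Build a minimal article with a measurement tagged inside running text.
article = ET.Element("article")
front = ET.SubElement(article, "front")
ET.SubElement(front, "article-title").text = "Sharing Research Data"
body = ET.SubElement(article, "body")
para = ET.SubElement(body, "p")
para.text = "Mean salinity was "
value = ET.SubElement(para, "data", unit="PSU", type="measurement")
value.text = "35.2"
value.tail = " across all stations."

xml_string = ET.tostring(article, encoding="unicode")
print(xml_string)

# The same markup is machine-actionable: every tagged measurement can be
# pulled out of the prose without human intervention.
measurements = [(d.text, d.get("unit")) for d in article.iter("data")]
print(measurements)
```

The point of the sketch is the dual delivery the abstract describes: the same document prints as readable prose for a human, while a computer can harvest the tagged values directly.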
46

Nalabandian, Taleen, Roman Taraban, Jessica C. Pittman and Sage Maliepaard. "Assessing College Writing: Do Students Connect with the Text?" East European Journal of Psycholinguistics 7, no. 1 (30 June 2020). http://dx.doi.org/10.29038/eejpl.2020.7.1.nal.

Full text
Abstract
Reading-response research has shown that students respond to a text by engaging various cognitive and emotional processes. The aim of the current study was to examine students’ written reactions to an assigned reading as a way to determine (1) whether students connect with the reading and (2) the differing cognitive styles they may utilize in their reactions. The methods applied two text-analytic procedures to 238 student reactions to an ethics case study. The procedures were language style matching, which is a metric of engagement, and the categorical-dynamic index, which is a metric of analytic and experiential processing. We predicted that students who more strongly connected—or engaged—with the text would also demonstrate greater analytic thinking in their written response and, conversely, that those who weakly connected with the text would express a more informal response based on experience. The data were analyzed using correlation statistics. The results showed that students whose writing more closely matched the linguistic style of the case study were more likely to use an analytical style of writing, and students whose writing weakly matched the linguistic style of the case study were more likely to use an informal narrative style of writing. Future research should examine the extent to which language style matching and an associated analytic cognitive style are emergent skills that develop over the course of a college experience.

References

Baddeley, J.L. (2012). E-mail communications among people with and without major depressive disorder (Unpublished doctoral dissertation). University of Texas at Austin, Austin, TX. Blackburn, K.G. (2015). The narrative arc: Exploring the linguistic structure of the narrative (Unpublished doctoral dissertation). University of Texas at Austin, Austin, Texas. Chung, C., & Pennebaker, J. W. (2007). The psychological functions of function words. Social Communication, 1, 343-359. Ireland, M. E., & Pennebaker, J. W. (2010).
Language style matching in writing: Synchrony in essays, correspondence, and poetry. Journal of Personality and Social Psychology, 99(3), 549. https://doi.org/10.1037/a0020386 Ireland, M.E., Slatcher, R.B., Eastwick, P.W., Scissors, L.E., Finkel, E.J., & Pennebaker, J.W. (2011). Language style matching predicts relationship initiation and stability. Psychological Science, 22(1), 39-44. https://doi.org/10.1177/0956797610392928 Inbar, Y., Cone, J., & Gilovich, T. (2010). People’s intuitions about intuitive insight and intuitive choice. Journal of Personality and Social Psychology, 99, 232–247. https://doi.org/10.1037/a0020215 Jordan, K. N., & Pennebaker, J. W. (2017). The exception or the rule: Using words to assess analytic thinking, Donald Trump, and the American presidency. Translational Issues in Psychological Science, 3(3), 312-316. https://doi.org/10.1037/tps0000125 Jordan, K. N., Sterling, J., Pennebaker, J. W., & Boyd, R. L. (2019). Examining long-term trends in politics and culture through language of political leaders and cultural institutions. Proceedings of the National Academy of Sciences, 116(9), 3476-3481. https://doi.org/10.1073/pnas.1811987116 Kacewicz, E., Pennebaker, J. W., Davis, M., Jeon, M., & Graesser, A. C. (2014). Pronoun use reflects standings in social hierarchies. Journal of Language and Social Psychology, 33(2), 125-143. https://doi.org/10.1177/0261927X13502654 Lance, G. N., & Williams, W. T. (1967). Mixed-data classificatory programs, I. Agglomerative systems. Australian Computer Journal, 1, 15-20. Leaper, C. (2014). Gender similarities and differences in language. In T. M. Holtgraves (Ed.), The Oxford handbook of language and social psychology (pp. 62-81). Oxford University Press. https://doi.org/10.1093/oxfordhb/9780199838639.013.002 Ludwig, S., de Ruyter, K., Mahr, D., Wetzels, M., Brüggen, E., & De Ruyck, T. (2014). Take their word for it: The symbolic role of linguistic style matches in user communities.
MIS Quarterly: Management Information Systems, 38(4), 1201-1217. Mart, C. T. (2019). Reader-response theory and literature discussions: A springboard for exploring literary texts. The New Educational Review, 56, 78-87. https://doi.org/10.15804/tner.2019.56.2.06 Niederhoffer, K. G., & Pennebaker, J. W. (2002). Linguistic style matching in social interaction. Journal of Language and Social Psychology, 21, 337-360. https://doi.org/10.1177/026192702237953 Pennebaker, J. W. (2011). The secret life of pronouns: How our words reflect who we are. New York, NY: Bloomsbury. Pennebaker, J.W., Booth, R.J., Boyd, R.L., & Francis, M.E. (2015). Linguistic Inquiry and Word Count: LIWC2015. Austin, TX: Pennebaker Conglomerates. Pennebaker, J.W., Chung, C.K., Frazee, J., Lavergne, G.M., & Beaver, D.I. (2014). When small words foretell academic success: The case of college admissions essays. PLoS ONE, 9. https://doi.org/10.1371/journal.pone.0115844 Pulvermüller, F., Shtyrov, Y., Hasting, A. S., & Carlyon, R. P. (2008). Syntax as a reflex: Neurophysiological evidence for early automaticity of grammatical processing. Brain and Language, 104, 244-253. https://doi.org/10.1016/j.bandl.2007.05.002 Richardson, B. H., Taylor, P. J., Snook, B., Conchie, S. M., & Bennell, C. (2014). Language style matching and police interrogation outcomes. Law and Human Behavior, 38(4), 357-366. https://doi.org/10.1037/lhb0000077 Rosenblatt, L. M. (2016). Literature as exploration. Modern Language Association. Segalowitz, S. J., & Lane, K. C. (2000). Lexical access of function versus content words. Brain and Language, 75, 376-389. https://doi.org/10.1006/brln.2000.2361 Segrin, C. (2000). Social skills deficits associated with depression. Clinical Psychology Review, 20, 379-403. https://doi.org/10.1016/S0272-7358(98)00104-4 Segrin, C., & Abramson, L. Y. (1994). Negative reactions to depressive behaviors: A communication theories analysis. Journal of Abnormal Psychology, 103, 655-668.
https://doi.org/10.1037/0021-843X.103.4.655 Shaw, H., Taylor, P., Conchie, S., & Ellis, D. A. (2019, March 6). Language style matching: A comprehensive list of articles and tools. https://doi.org/10.31234/osf.io/yz4br Wyatt, D., Pressley, M., El-Dinary, P. B., Stein, S., Evans, P., & Brown, R. (1993). Comprehension strategies, worth and credibility monitoring, and evaluations: Cold and hot cognition when experts read professional articles that are important to them. Learning and Individual Differences, 5(1), 49-72. https://doi.org/10.1016/1041-6080(93)90026-O
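The language style matching metric used in this study is commonly computed (following Ireland and Pennebaker, cited above) by comparing the rates at which two texts use each function-word category, then averaging across categories. A minimal sketch, with made-up rates standing in for LIWC output:

```python
def lsm(doc1_rates, doc2_rates):
    """Language style matching between two texts.

    Per function-word category: 1 - |r1 - r2| / (r1 + r2 + 0.0001),
    averaged over categories. Inputs are usage rates (% of total words)
    per category, such as LIWC produces. The 0.0001 guards against
    division by zero when a category is absent from both texts.
    """
    scores = []
    for cat in doc1_rates:
        r1, r2 = doc1_rates[cat], doc2_rates[cat]
        scores.append(1 - abs(r1 - r2) / (r1 + r2 + 0.0001))
    return sum(scores) / len(scores)

# Illustrative rates for a case-study text and one student response
# (made-up numbers, not the study's data; only three of the usual
# function-word categories are shown).
case_study = {"pronouns": 9.5, "articles": 8.1, "prepositions": 13.2}
response = {"pronouns": 12.0, "articles": 6.4, "prepositions": 12.5}

print(round(lsm(case_study, response), 3))  # → 0.913
```

Scores near 1 indicate closely matched function-word usage (strong engagement, in the study's interpretation); identical texts score exactly 1.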
47

Fraim, John. "Friendly Persuasion". M/C Journal 3, no. 1 (1 March 2000). http://dx.doi.org/10.5204/mcj.1825.

Full text
Abstract
"If people don't trust their information, it's not much better than a Marxist-Leninist society." -- Orville Schell, Dean, Graduate School of Journalism, UC Berkeley

"Most people aren't very discerning. Maybe they need good financial information, but I don't think people know what good information is when you get into culture, society, and politics." -- Steven Brill, Chairman and Editor-in-chief, Brill's Content

Once upon a time, not very long ago, advertisements were easy to recognise. They had simple personalities with goals not much more complicated than selling you a bar of soap or a box of cereal. And they possessed the reassuring familiarity of old friends or relatives you've known all your life. They were Pilgrims who smiled at you from Quaker Oats boxes or little tablets named "Speedy" who joyfully danced into a glass of water with the sole purpose of giving up their short life to help lessen your indigestion from overindulgence. Yes, sometimes they could be a little obnoxious but, hey, it was a predictable annoyance. And once, not very long ago, advertisements also knew their place in the landscape of popular culture; their boundaries were the ad space of magazines or the commercial time of television programs. When the ads got too annoying, you could toss the magazine aside or change the TV channel. The ease and quickness of their dispatch had the abruptness of slamming your front door in the face of an old door-to-door salesman. This all began to change around the 1950s, when advertisements acquired a more complex and subtle personality and began straying outside of their familiar media neighborhoods. The social observer Vance Packard wrote a best-selling book in the late 50s called The Hidden Persuaders, which identified this change in advertising's personality as coming from hanging around Professor Freud's psychoanalysis and learning his hidden, subliminal methods of trickery.
Ice cubes in a glass for a liquor ad were no longer seen as simple props to help sell a brand of whiskey but were now subliminal suggestions of female anatomy. The curved fronts of automobiles were more than aesthetic streamlined design features but rather suggestive of a particular feature of the male anatomy. Forgotten by the new subliminal types of ads was the simple salesmanship preached by founders of the ad industry like David Ogilvy and John Caples. The word "sales" became a dirty word and was replaced with modern psychological buzzwords like subliminal persuasion.

The Evolution of Subliminal Techniques

The book Hidden Persuaders made quite a stir at the time, bringing about congressional hearings and even the introduction of legislation. Prominent motivation researchers Louis Cheskin and Ernest Dichter utilised the new ad methods and were publicly admonished as traitors to their profession. The life of the new subliminal advertising seemed short indeed. Even Vance Packard predicted its coming demise. "Eventually, say by A.D. 2000," he wrote in the preface to the paperback edition of his book, "all this depth manipulation of the psychological variety will seem amusingly old-fashioned". Yet, 40 years later, any half-awake observer of popular culture knows that things haven't exactly worked out the way Packard predicted. In fact, what seems old-fashioned today is the belief that ads are those simpletons they once were before the 50s and that products are sold for features and benefits rather than for images. Even Vance Packard expresses amazement at the evolution of advertising since the 50s, noting that today ads for watches have nothing to do with watches or that ads for shoes scarcely mention shoes. Packard remarks "it used to be the brand identified the product. In today's advertising the brand is the product".
Modern advertising, he notes, has an almost total obsession with images and feelings and an almost total lack of any concrete claims about the product and why anyone should buy it. Packard admits puzzlement. "Commercials seem totally unrelated to selling any product at all". Jeff DeJoseph of the J. Walter Thompson firm underlines Packard's comments. "We are just trying to convey a sensory impression of the brand, and we're out of there". Subliminal advertising techniques have today infiltrated the heart of corporate America. As Ruth Shalit notes in her article "The Return of the Hidden Persuaders" from the 27 September 1999 issue of Salon magazine, "far from being consigned to the maverick fringe, the new psycho-persuaders of corporate America have colonized the marketing departments of mainstream conglomerates. At companies like Kraft, Coca-Cola, Procter & Gamble and Daimler-Chrysler, the most sought-after consultants hail not from McKinsey & Company, but from brand consultancies with names like Archetype Discoveries, PsychoLogics and Semiotic Solutions". Shalit notes a growing number of CEOs have become convinced they cannot sell their brands until they first explore the "Jungian substrata of four-wheel drive; unlock the discourse codes of female power sweating; or deconstruct the sexual politics of bologna". The result, as Shalit observes, is a "charmingly retro school of brand psychoanalysis, which holds that all advertising is simply a variation on the themes of the Oedipus complex, the death instinct, or toilet training, and that the goal of effective communications should be to compensate the consumer for the fact that he was insufficiently nursed as an infant"; this school, she writes, "has taken corporate America by storm".

The Growing Ubiquity of Advertising

Yet pervasive as the subliminal techniques of advertising have become, the emerging power of modern advertising ultimately centres around "where" it is rather than "what" it is or "how" it works.
The power of modern advertising is within this growing ubiquity or "everywhereness" of advertising rather than the technology and methodology of advertising. The ultimate power of advertising will be arrived at when ads cannot be distinguished from their background environment. When this happens, the environment will become a great continuous ad. In the process, ads have wandered away from their well-known hangouts in magazines and TV shows. Like alien-infected pod-people of early science fiction movies, they have stumbled out of these familiar media playgrounds and suddenly sprouted up everywhere. The ubiquity of advertising is not being driven by corporations searching for new ways to sell products but by media searching for new ways to make money. Traditionally, media made money by selling subscriptions and advertising space. But these two key income sources are quickly drying up in the new world of online media. Journalist Mike France wisely takes notice of this change in an important article "Journalism's Online Credibility Gap" from the 11 October 1999 issue of Business Week. France notes that subscription fees have not worked because "Web surfers are used to getting content for free, and they have been reluctant to shell out any money for it". Advertising sales and their Internet incarnation in banner ads have also been a failure so far, France observes, because companies don't like paying a flat fee for online advertising since it's difficult to track the effectiveness of their marketing dollars. Instead, they only want to pay for actual sales leads, which can be easily monitored on the Web as readers click from site to site. Faced with the above situation, media companies have gone on the prowl for new ways to make money. This search underpins the emerging ubiquity of advertising: the fact that it is increasingly appearing everywhere.
In the process, traditional boundaries between advertising and other societal institutions are being overrun by these media forces on the prowl for new "territory" to exploit. That time when advertisements knew their place in the landscape of popular culture and confined themselves to just magazines or TV commercials is a fading memory. And today, as each of us is bombarded by thousands of ads each day, it is impossible to "slam" the door and keep them out of our house as we could once slam the door in the face of the old door-to-door salesmen. Of course you can find them on the matchbook cover of your favorite bar, on t-shirts sold at some roadside tourist trap or on those logo baseball caps you always pick up at trade shows. But now they have got a little more personal and stare at you over urinals in the men's room. They have even wedged themselves onto the narrow little bars at the check-out counter conveyor belts of supermarkets or onto the handles of gasoline pumps at filling stations. The list goes on and on. (No, this article is not an ad.)

Advertising and Entertainment

In advertising's march to ubiquity, two major boundaries have been crossed. They are crucial boundaries which greatly enhance advertising's search for the invisibility of ubiquity. Yet they are also largely invisible themselves. These are the boundaries separating advertising from entertainment and those separating advertising from journalism. The incursion of advertising into entertainment is a result of the increasing merger of business and entertainment, a phenomenon pointed out in best-selling business books like Michael Wolf's Entertainment Economy and Joseph Pine's The Experience Economy. Wolf, a consultant for Viacom, Newscorp, and other media heavyweights, argues business is becoming synonymous with entertainment: "we have come to expect that we will be entertained all the time. Products and brands that deliver on this expectation are succeeding.
Products that do not will disappear". And, in The Experience Economy, Pine notes the increasing need for businesses to provide entertaining experiences. "Those businesses that relegate themselves to the diminishing world of goods and services will be rendered irrelevant. To avoid this fate, you must learn to stage a rich, compelling experience". Yet entertainment, whether provided by businesses or the traditional entertainment industry, is increasingly weighted down with the "baggage" of advertising. In a large sense, entertainment is a form of new media that carries ads. Increasingly, this seems to be the overriding purpose of entertainment. Once, not long ago, when ads were simple and confined, entertainment was also simple and its purpose was to entertain rather than to sell. There was money enough in packed movie houses or full theme parks to make a healthy profit. But all this has changed with advertising's ubiquity. Like media corporations searching for new revenue streams, the entertainment industry has responded to flat growth by finding new ways to squeeze money out of entertainment content. Films now feature products in paid-for scenes and most forms of entertainment use product tie-ins to other areas such as retail stores or fast-food restaurants. Also popular with the entertainment industry is what might be termed the "versioning" of entertainment products into various sub-species where entertainment content is transformed into other media so it can be sold more than once. A film may not make a profit on just the theatrical release, but there is a good chance it doesn't matter because it stands to make a profit in video rentals.

Advertising and Journalism

The merger of advertising and entertainment goes a long way towards a world of ubiquitous advertising. Yet the merger of advertising and journalism is the real "promised land" in the evolution of ubiquitous advertising.
This fundamental shift in the way news media make money provides the final frontier to be conquered by advertising, a final "promised land" for advertising. As Mike France observes in Business Week, this merger "could potentially change the way they cover the news. The more the press gets in the business of hawking products, the harder it will be to criticize those goods -- and the companies making them". Of course, there is that persistent myth, perpetuated by news organisations, that they attempt to preserve editorial independence by keeping the institutions they cover and their advertisers at arm's length. But this is proving more and more difficult, particularly for online media. Observers like France have pointed out a number of reasons for this. One is the growth of ads in news media that look more like editorial content than ads. While long-standing ethical rules bar magazines and newspapers from printing advertisements that look like editorial copy, these rules become fuzzy for many online publications. Another reason making it difficult to separate advertising from journalism is the growing merger and consolidation of media corporations. Fewer and fewer corporations control more and more entertainment, news and ultimately advertising. It becomes difficult for a journalist to criticise a product when it has a connection to the large media conglomerate the journalist works for. Traditionally, it has been rare for media corporations to make direct investments in the corporations they cover. However, as Mike France notes, CNBC crossed this line when it acquired a stake in Archipelago in September 1999. CNBC, which runs a business-news Website, acquired a 12.4% stake in Archipelago Holdings, an electronic communications network for trading stock. Long-term plans are likely to include allowing visitors to cnbc.com to link directly to Archipelago. That means CNBC could be in the awkward position of both providing coverage of online trading and profiting from it.
France adds that other business news outlets, such as Dow Jones (DJ), Reuters, and Bloomberg, already have indirect ties to their own electronic stock-trading networks. And, in news organisations, a popular method of cutting down on the expense of paying journalists for content is the growing practice of accepting advertiser-written content or "sponsored edit" stories. The confusion to readers violates the spirit of a long-standing American Society of Magazine Editors (ASME) rule prohibiting advertisements with "an editorial appearance". But as France notes, this practice is thriving online. This change happens in ever so subtle ways. "A bit of puffery inserted here," notes France, "a negative adjective deleted there -- it doesn't take a lot to turn a review or story about, say, smart phones, into something approaching highbrow ad copy". He offers an example in forbes.com, whose Microsoft ads could easily be mistaken for staff-written articles. Media critic James Fallows points out that consumers have been swift to discipline sites that are caught acting unethically and using "sponsored edits". He notes that when it was revealed that amazon.com was taking fees of up to $10,000 for books that it labelled as "destined for greatness", its customers were outraged, and the company quickly agreed to disclose future promotional payments. Unfortunately, though, the lesson episodes like these teach online companies like Amazon centres around more effective ways to be less "revealing" rather than abstention from the practice of "sponsored edits". France reminds us that journalism is built on trust. In the age of the Internet, though, trust is quickly becoming an elusive quality. He writes "as magazines, newspapers, radio stations, and television networks rush to colonize the Internet, the Great Wall between content and commerce is beginning to erode". In the end, he ponders whether there is an irrevocable conflict between e-commerce and ethical journalism.
When you can't trust journalists to be ethical, just who can you trust?

Transaction Fees & Affiliate Programs - Advertising's Final Promised Land?

The engine driving the growing ubiquity of advertising, though, is not the increasing merger of advertising with other industries (like entertainment and journalism) but rather a new business model of online commerce and Internet technology called transaction fees. This emerging and potentially dominant Internet e-commerce technology provides for the ability to track transactions electronically on Websites and to garner transaction fees. Through these fees, many media Websites take a percentage of payment through online product sales. In effect, a media site becomes one pervasive direct mail ad for every product mentioned on its site. This of course puts them in a much closer economic partnership with advertisers than is the case with traditional fixed-rate ads, where there is little connection between product sales and the advertising media carrying them. Transaction fees are the new online version of direct marketing, and the emerging Internet technology for their application is one of the great economic driving forces of the entire Internet commerce apparatus. The promise of transaction fees is that a number of people, besides product manufacturers and advertisers, might gain a percentage of profit from selling products via hypertext links. Once upon a time, the manufacturer of a product was the one that gained (or lost) from marketing it. Now, however, there is the possibility that journalists, news organisations and entertainment companies might also gain from marketing via transaction fees. The spread of transaction fees outside media into the general population provides an even greater boost to the growing ubiquity of advertising. This is done through the handmaiden of media transaction fees: "affiliate programs" for the general populace.
Through the growing magic of Internet technology, it becomes possible for all of us to earn money through affiliate program links to products and transaction fee percentages in the sale of these products. Given this scenario, it is not surprising that advertisers are most likely to increasingly pressure media Websites to support themselves with e-commerce transaction fees. Charlene Li, Senior Analyst for New Media at Forrester Research, estimates that by the year 2003, media sites will receive $25 billion in revenue from transaction fees, compared with $17 billion from ads and $5 billion from subscriptions. The possibility is great that all media will become like great direct response advertisements taking a transaction fee percentage for anything sold on their sites. And there is the more dangerous possibility that all of us will become the new "promised land" for a ubiquitous advertising. All of us will have some cut in selling somebody else's product. When this happens and there is a direct economic incentive for all of us to say nice things about products, what is the need and importance of subliminal techniques and methods creating advertising based on images which try to trick us into buying things?

A Society Without Critics?

It is for these reasons that criticism and straight news are becoming an increasingly endangered species. Everyone has to eat, but what happens when one can no longer make meal money by criticising current culture? Cultural critics become a dying breed. There is no money in criticism because it is based around disconnection rather than connection to products. No links to products or Websites are involved here. Critics are becoming lonely icebergs floating in the middle of a cyber-sea of transaction fees, watching everyone else (except themselves) make money on transaction fees. The subliminal focus of the current consultancies is little more than a repackaging of an old theme discovered long ago by Vance Packard.
But the growing "everywhereness" and "everyoneness" of modern advertising through transaction fees may mark the beginning of a revolutionary new era. Everyone might become their own "brand", a point well made in Tom Peters's article "A Brand Called You". Media critic James Fallows is somewhat optimistic that there still may remain "niche" markets for truthful information and honest cultural criticism. He suggests that surely people looking for mortgages, voting for a politician, or trying to decide what movie to see will continue to need unbiased information to help them make decisions. But one must ask what happens when a number of people have some "affiliate" relationship with suggesting particular movies, politicians or mortgages? Orville Schell, dean of the Graduate School of Journalism at the University of California at Berkeley, has summarised this growing ubiquity of advertising in a rather simple and elegant manner, saying "at a certain point, people won't be able to differentiate between what's trustworthy and what isn't". Over the long run, this loss of credibility could have a corrosive effect on society in general -- especially given the media's importance as a political, cultural, and economic watchdog. Schell warns, "if people don't trust their information, it's not much better than a Marxist-Leninist society". Yet, will we be able to realise this simple fact when we all become types of Marxists and Leninists? Still, there is the great challenge to America to learn how to utilise transaction fees in a democratic manner. In effect, a combination of the technological promise of the new economy with that old promise, and perhaps even myth, of a democratic America. America stands on the verge of a great threshold and challenge in the growing ubiquity of advertising. In a way, as with most great opportunities or threats, this challenge centres on a peculiar paradox.
On the one hand, there is the promise of the emerging Internet business model and its centre around the technology of transaction fees. At the same time, there is the threat posed by transaction fees to America's democratic society in the early years of the new millennium. Yes, once upon a time, not very long ago, advertisements were easy to recognise and also knew their place in the landscape of popular culture. Their greatest, yet silent, evolution (especially in the age of the Internet) has really been in their spread into all areas of culture rather than in methods of trickery and deceit. Now, it is more difficult to slam that front door in the face of that old door-to-door salesman. Or toss that magazine and its ad aside, or switch off commercials on television. We have become that door-to-door salesman, that magazine ad, that television commercial. The current cultural landscape takes on some of the characteristics of the theme of that old science fiction movie The Invasion of the Body Snatchers. A current advertising campaign from RJ Reynolds has a humorous take on the current zeitgeist fad of alien abduction with copy reading "if aliens are smart enough to travel through space then why do they keep abducting the dumbest people on earth?" One might add that when Americans allow advertising to travel through all our space, perhaps we all become the dumbest people on earth, abducted by a new alien culture so far away from a simplistic nostalgia of yesterday. (Please press below for your links to a world of fantastic products which can make a new you.)

References

Brill, Steven. Quoted by Mike France in "Journalism's Online Credibility Gap." Business Week 11 Oct. 1999.
France, Mike. "Journalism's Online Credibility Gap." Business Week 11 Oct. 1999. <http://www.businessweek.com/1999/99_41/b3650163.htm>.
Packard, Vance. The Hidden Persuaders. Out of print, 1957.
Pine, Joseph, and James Gilmore. The Experience Economy. Harvard Business School P, 1999.
Schell, Orville. Quoted by Mike France in "Journalism's Online Credibility Gap." Business Week 11 Oct. 1999.
Shalit, Ruth. "The Return of the Hidden Persuaders." Salon Magazine 27 Sep. 1999. <http://www.salon.com/media/col/shal/1999/09/27/persuaders/index.php>.
Wolf, Michael. Entertainment Economy. Times Books, 1999.

Citation reference for this article

MLA style: John Fraim. "Friendly Persuasion: The Growing Ubiquity of Advertising, or What Happens When Everyone Becomes an Ad?." M/C: A Journal of Media and Culture 3.1 (2000). [your date of access] <http://www.uq.edu.au/mc/0003/ads.php>.

Chicago style: John Fraim, "Friendly Persuasion: The Growing Ubiquity of Advertising, or What Happens When Everyone Becomes an Ad?," M/C: A Journal of Media and Culture 3, no. 1 (2000), <http://www.uq.edu.au/mc/0003/ads.php> ([your date of access]).

APA style: John Fraim. (2000) Friendly Persuasion: The Growing Ubiquity of Advertising, or What Happens When Everyone Becomes an Ad?. M/C: A Journal of Media and Culture 3(1). <http://www.uq.edu.au/mc/0003/ads.php> ([your date of access]).
48

Soares, Felipe, and Raquel Recuero. "How the Mainstream Media Help to Spread Disinformation about Covid-19". M/C Journal 24, no. 1 (15 March 2021). http://dx.doi.org/10.5204/mcj.2735.

Full text
Abstract
Introduction

In this article, we hypothesise how mainstream media coverage can promote the spread of disinformation about Covid-19. Mainstream media are often discussed as opposed to disinformation (Glasser; Benkler et al.). While the disinformation phenomenon is related to the intentional production and spread of misleading and false information to influence public opinion (Fallis; Benkler et al.), mainstream media news is expected to be based on facts and investigation and focussed on values such as authenticity, accountability, and autonomy (Hayes et al.). However, journalists might contribute to the spread of disinformation when they skip some stage of information processing and reproduce false or misleading information (Himma-Kadakas). Besides, even when the purpose of the news is to correct disinformation, media coverage might contribute to its dissemination by amplifying it (Tsfati et al.). This could be particularly problematic in the context of social media, as users often just read headlines while scrolling through their timelines (Newman et al.; Ofcom). Thus, some users might share news from the mainstream media to legitimate disinformation about Covid-19. The pandemic creates a delicate context, as journalists are often pressured to produce more information and, therefore, are more susceptible to errors. In this research, we focus on the hypothesis that legitimate news can contribute to the spread of disinformation on social media through headlines that reinforce disinformation discourses, even though the actual piece may frame the story differently. The research questions that guide this study are: are URLs with headlines that reinforce disinformation discourses and other mainstream media links shared into the same Facebook groups? Are the headlines that support disinformation discourses shared by Facebook users to reinforce disinformation narratives? As a case study, we look at the Brazilian disinformation context on Covid-19.
The discussion about the disease in the country has been highly polarised and politically framed, often with government agents and scientists disputing the truth about facts on the disease (Araújo and Oliveira; Recuero and Soares; Recuero et al.). Particularly, the social media ecosystem seems to play an important role in these disputes, as Brazilian President Jair Bolsonaro and his supporters use it as a key channel to spread disinformation about the virus (Lisboa et al.; Soares et al.). We use data from public groups on Facebook collected through CrowdTangle and a combination of social network analysis and content analysis to analyse the spread and the content of URLs and posts.

Theoretical Background

Disinformation has been central to the Covid-19 “infodemic”, created by the overabundance of information about the pandemic, which makes it hard for people to find reliable guidance and exacerbates the outbreak (Tangcharoensathien et al.). We consider disinformation as distorted, manipulated, or false information intentionally created to mislead someone (Fallis; Benkler et al.). Disinformation is often used to strengthen radical political ideologies (Benkler et al.). Around the world, political actors politically framed the discussion about the pandemic, which created a polarised public debate about Covid-19 (Allcott et al.; Gruzd and Mai; Recuero and Soares). On social media, contexts of polarisation between two different political views often present opposed narratives about the same fact that dispute public attention (Soares et al.). This polarisation creates a suitable environment for disinformation to thrive (Benkler et al.). The polarised discussions are often associated with the idea of “bubbles”, as the different political groups tend to share and legitimate only discourses that are aligned with the group's ideological views. Consequently, these groups might turn into ideological bubbles (Pariser).
In these cases, content shared within one group is not shared within the other and vice versa. Pariser argues that users within the bubbles are exposed exclusively to content with which they tend to agree. However, research has shown that Pariser’s concept of bubbles has limitations (Bruns), as most social media users are exposed to a variety of sources of information (Guess et al.). Nevertheless, polarisation might lead to different media diets and disinformation consumption (Benkler et al.). That is, users would have contact with different types of information, but they would choose to share certain content over others because of their political alignment (Bruns). Therefore, we understand that bubbles are created by the action of social media users who give preference to circulate (through retweets, likes, comments, or shares) content that supports their political views, including disinformation (Recuero et al.). Thus, bubbles are ephemeral structures (created by users’ actions in the context of a particular political discussion) with permeable boundaries (users are exposed to content from the outside) in discussions on social media. This type of ephemeral bubble might use disinformation as a tool to create a unique discourse that supports its views. However, it does not mean that actors within a “disinformation bubble” do not have access to other content, such as the news from the mainstream media. It means that the group acts to discredit and to overlap this content with an “alternative” story (Larsson). In addition, the mainstream media might disseminate false or inaccurate information (Tsfati et al.). Particularly, we focus on inaccurate headlines that reinforce disinformation narratives, as social media users often only read news headlines (Newman et al.; Ofcom).
This is especially problematic because a large number of social media users are exposed to mainstream media content, while exposure to disinformation websites is heavily concentrated on only a few users (Guess et al.; Tsfati et al.). Therefore, when the mainstream media disseminate disinformation, it is more likely that a larger number of social media users will be exposed to this content and share it into ideological bubbles. Based on this discussion, we aim to understand how the mainstream media contribute to the spread of disinformation discourses about Covid-19.

Methods

This study is about how mainstream media coverage might contribute to the spread of disinformation about Covid-19 on Facebook. We propose two hypotheses, as follows:

H1: When mainstream media headlines frame the information in a way that reinforces the disinformation narrative, the links go into a “disinformation bubble”.
H2: In these cases, Facebook users might use mainstream media coverage to legitimate disinformation narratives.

We selected three case studies based on events that created both political debate and high media coverage in Brazil. We chose them based on the hypothesis that part of the mainstream media links could have produced headlines that support disinformation discourses, as the political debate was high. The events are:

1. On 24 March 2020, Brazilian President Jair Bolsonaro made a public pronouncement on live television. In the week before the pronouncement, Brazilian governors decided to follow World Health Organisation (WHO) protocols and closed non-essential business. In his speech, Bolsonaro criticised social distancing measures. The mainstream media reproduced some of his claims and claims from other public personalities, such as entrepreneurs who also said the protocols would harm the economy.
2. On 8 June 2020, a WHO official said that it “seems to be rare that an asymptomatic person transmits [Covid-19] onward to a secondary individual”. Part of the mainstream media reproduced the claim out of context, which could promote the misperception that both asymptomatic and pre-symptomatic persons (early stages of an illness, before the first symptoms) do not transmit Covid-19 at all.
3. On 9 November 2020, Brazil’s national sanitary watchdog Anvisa reported that it had halted the clinical studies on the CoronaVac vaccine, developed by the Chinese company Sinovac. Bolsonaro often criticised CoronaVac because it was being produced in partnership with São Paulo’s Butantan Institute and became the subject of a political dispute between Bolsonaro and the Governor of São Paulo, João Dória. Bolsonaro said the halt of the CoronaVac trial was "another victory for Jair Bolsonaro". Anvisa halted the trial after a "severe adverse event". The mainstream media rapidly reverberated the decision. Later, it was revealed that the incident was a death that had nothing to do with the vaccine.

Before we created our final dataset that includes links from the three events together, we explored the most shared URLs in each event. We used keywords to collect posts shared in the public groups monitored by CrowdTangle, a tool owned by Facebook that tracks publicly available posts on the platform. We collected posts in a timeframe of three days for each event to prevent the collection of links unrelated to the cases. We collected only posts containing URLs. Table 1 summarises the data collected.

Table 1: Data collected
Dates              | Keywords                                                  | Number of posts
March 24-26 2020   | “Covid-19” or “coronavirus” and “isolation” or “economy”  | 4780
June 8-10 2020     | “Covid-19” or “coronavirus” and “asymptomatic”            | 2060
November 9-11 2020 | “vaccine” and “Anvisa” or “CoronaVac”                     | 3273

From this original dataset, we selected the 60 most shared links from each period (n=180). We then filtered for those whose sources were mainstream media outlets (n=74).
We used content analysis (Krippendorff) to observe which of these URLs' headlines could reinforce disinformation narratives (two independent coders, Krippendorff’s Alpha = 0.76). We focussed on headlines because when these links are shared on Facebook, often it is the headline that appears to other users. We considered that a headline reinforced disinformation discourses only when it was flagged by both coders (n=21; some examples are provided in Table 3 in the Results section). Table 2 provides a breakdown of this analysis.

Table 2: Content analysis
Event                  | Mainstream media links | Links with headlines that support disinformation | Posts
Economy and quarantine | 24                     | 7                                                | 112
Asymptomatic           | 22                     | 7                                                | 163
Vaccine trial          | 28                     | 7                                                | 120
Total                  | 74                     | 21                                               | 395

As the number of posts that shared URLs with headlines that supported disinformation was low (n=395), we conducted another CrowdTangle search to create our final dataset. We used a sample of the links we classified to create a “balanced” dataset. Out of the 21 links with headlines that reinforced disinformation, we collected the 10 most shared in public groups monitored by CrowdTangle (this time, without any particular timeframe) (n=1346 posts). In addition, we created a “control group” with the 10 most shared links that neither of the coders considered could reinforce disinformation (n=1416 posts). The purpose of the “control group” was to identify which Facebook groups tend to share mainstream media links without headlines that reinforce disinformation narratives. Therefore, our final dataset comprises 20 links and 2762 posts. We then used social network analysis (Wasserman and Faust) to map the spread of the 20 links. We created a bipartite network, in which nodes are (1) Facebook groups and (2) URLs; and edges represent when a post within a group includes a URL from our dataset. We applied a modularity metric (Blondel et al.) to identify clusters.
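The intercoder reliability figures reported above (Krippendorff's Alpha = 0.76 for the headline coding) can be computed for two coders and nominal labels with a short routine. This is a generic sketch of the statistic, not the authors' actual code, and the toy labels are purely illustrative.

```python
from collections import Counter
from itertools import permutations

def krippendorff_alpha_nominal(coder_a, coder_b):
    """Krippendorff's alpha for two coders and nominal (e.g. binary) labels."""
    # Coincidence matrix: each coded unit contributes both ordered label pairs.
    o = Counter()
    for a, b in zip(coder_a, coder_b):
        o[(a, b)] += 1
        o[(b, a)] += 1
    # Marginal label frequencies across both coders (row sums of the matrix).
    n_c = Counter()
    for (a, _b), count in o.items():
        n_c[a] += count
    n = sum(n_c.values())  # total pairable values = 2 * number of units
    d_observed = sum(count for (a, b), count in o.items() if a != b) / n
    d_expected = sum(n_c[a] * n_c[b] for a, b in permutations(n_c, 2)) / (n * (n - 1))
    return 1 - d_observed / d_expected

# Perfect agreement yields alpha = 1; disagreements lower it.
print(krippendorff_alpha_nominal([1, 1, 0, 0], [1, 1, 0, 0]))  # 1.0
```

A "flagged by both coders" rule like the one used for the n=21 headlines is a conservative conjunction of the two codings; alpha quantifies how much that agreement exceeds chance.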
The modularity metric allows us to identify “communities” that share the same or similar links in the network map. Thus, it helped us to identify if there was a “bubble” that only shares the links with headlines that support disinformation (H1). To understand if the disinformation was supporting a larger narrative shared by the groups, we explored the political alignments of each cluster (H2). We used Textometrica (Lindgren and Palm) to create word clouds with the most frequent words in the names of the cluster groups (at least five mentions) and their connections. Finally, we also analysed the posts that shared each of the 10 links with headlines that reinforced disinformation. This also helped us to identify how the mainstream media links could legitimate disinformation narratives (H2). Out of the 1346 posts, only 373 included some message (the other 973 posts only shared the link). We used content analysis to see if these posts reinforced the disinformation (two independent coders, Krippendorff’s Alpha = 0.723). There were disagreements in the categorisation of 27 posts. The two coders reviewed and discussed the classification of these posts to reach an agreement.

Results

Bubbles of Information

In the graph (Figure 1), red nodes are links with headlines that support disinformation discourses, blue nodes are the other mainstream media links, and black nodes are Facebook groups. Our first finding is that groups that shared headlines that support disinformation rarely shared the other mainstream media links. Out of the 1623 groups in the network, only 174 (10.7%) shared both a headline that supports disinformation discourse and another mainstream media link; 712 groups (43.8%) only shared headlines that support disinformation; and 739 groups (45.5%) only shared other links from the mainstream media. Therefore, users’ actions created two bubbles of information.
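The three-way breakdown of groups described above (shared both link types, only flagged links, or only other mainstream links) is a simple set operation over the bipartite edge list. A minimal sketch, with hypothetical group and URL names standing in for the real data:

```python
def classify_groups(posts, disinfo_urls):
    """Classify Facebook groups by the kind of mainstream media links they share.

    posts: iterable of (group, url) pairs, i.e. the bipartite edge list;
    disinfo_urls: set of URLs whose headlines were flagged as supporting
    disinformation.
    """
    kinds_by_group = {}
    for group, url in posts:
        kinds_by_group.setdefault(group, set()).add(
            "flagged" if url in disinfo_urls else "other"
        )
    counts = {"both": 0, "flagged_only": 0, "other_only": 0}
    for kinds in kinds_by_group.values():
        if len(kinds) == 2:
            counts["both"] += 1
        elif "flagged" in kinds:
            counts["flagged_only"] += 1
        else:
            counts["other_only"] += 1
    return counts

# Toy edge list (all names hypothetical): group_a shares both kinds,
# group_b only a flagged link, group_c only another mainstream link.
edges = [("group_a", "u1"), ("group_a", "u2"), ("group_b", "u1"), ("group_c", "u2")]
print(classify_groups(edges, {"u1"}))
```

Dividing each count by the total number of groups gives proportions of the kind reported in the paragraph above (10.7%, 43.8%, 45.5% for the authors' 1623 groups).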
Figure 1: Network graph

The modularity metric confirmed this tendency towards two “bubbles” in the network (Figure 2). The purple cluster includes seven URLs with headlines that support disinformation discourse. The green cluster includes three headlines that support disinformation discourse and the other 10 links from the mainstream media. This result partially supports H1: when mainstream media headlines frame the information in a way that reinforces the disinformation narrative, the links go into a “disinformation bubble”. As we identified, most of the headlines that support disinformation discourse went into a separate “bubble”, as users within the groups of this bubble did not share the other links from the mainstream media.

Figure 2: Network graph with modularity

This result shows that users’ actions boost the creation of bubbles (Bakshy et al.), as they choose to share one type of content over the other. The mainstream media are the source of all the URLs we analysed. However, users from the purple cluster chose to share only links with headlines that supported disinformation discourses. This result is also related to the political framing of the discussions, as we explore below.

Disinformation and Political Discourse

We used word clouds (Lindgreen and Palm) to analyse the Facebook groups’ names to explore the ideological affiliation of the bubbles. The purple bubble is strongly related to Bolsonaro and his discourse (Figure 3). Bolsonaro is the most frequent word. Other prevalent words are Brazil, patriots (both related to his nationalist discourse), right-wing, conservative, military (three words related to his conservative discourse and his support of the military dictatorship that ruled Brazil from 1964 to 1985), President, support, and Alliance [for Brazil] (the name of his party).
Some of the most active groups within the purple bubble are “Alliance for Brazil”, “Bolsonaro 2022 [next presidential election]”, “Bolsonaro’s nation 2022”, and “I am right-wing with pride”.

Figure 3: Purple cluster word cloud

Bolsonaro is also a central word in the green cluster word cloud (Figure 4). However, it is connected to other words such as “against” and “out”, as many groups are anti-Bolsonaro. Furthermore, words such as left-wing, Workers’ Party (centre-left party), Lula and Dilma Rousseff (two Workers’ Party ex-presidents) show a different ideological alignment in general. In addition, there are many local groups (related to locations such as Rio de Janeiro, São Paulo, Rio Grande do Sul, Minas Gerais, and others), and groups to share news (news, newspaper, radio, portal). “We are 70 per cent [anti-Bolsonaro movement]”, “Union of the Left”, “Lula president”, and “Anti-Bolsonaro” are some of the most active groups within the green cluster.

Figure 4: Green cluster word cloud

Then, we analysed how users shared the mainstream media links with headlines that support disinformation discourses. In total, we found that 81.8% of the messages in the posts that shared these links also reproduced disinformation narratives. The frequency was higher (86.2%) when considering only posts that shared one of the seven links from the purple cluster (based on the modularity metric). Conversely, it was lower (64%) in the messages that shared one of the other three links. The messages often showed support for Bolsonaro; criticised other political and health authorities (the WHO, São Paulo Governor João Dória, and others), China, and the “leftists” (all opposition to Bolsonaro); claimed that quarantine and social distancing measures were unnecessary; and framed vaccines as dangerous. We provide some examples of headlines and posts in Table 3 (we selected the most-shared URL for each event to illustrate).
This result supports H2, as we found that users shared mainstream media headlines that reinforce disinformation discourse to legitimate the disinformation narrative, and that this was more prevalent in the purple bubble.

Table 3: Examples of headlines and posts

Headline: “Unemployment is a crisis much worse than coronavirus”, says Bolsonaro
Posts: “Go to social media to support the President. Unemployment kills. More than any virus... hunger, depression, despair and everything”; “UNEMPLOYMENT, THE DEPUTIES CHAMBER, THE SENATE AND THE SUPREME COURT KILL MORE THAN COVID19”

Headline: Asymptomatic patients do not boost coronavirus, says WHO
Posts: “QUARANTINE IS FAKE”; “#StayHome, the lie of the century! THIS GOES TO THE PUPPETS OF THE COMMUNIST PARTIES AND THE FUNERARY MEDIA”

Headline: Anvisa halts Coronavac vaccine trial after “serious adverse event”
Posts: “[The event] is adverse and serious, so the vaccine killed the person by covid”; “And Doria [Governor of São Paulo and political adversary of Bolsonaro] wants to force you to take this shit”

This result shows that mainstream media headlines that support disinformation narratives may be used to reinforce disinformation discourses when shared on Facebook, making journalists potential agents of disinformation (Himma-Kadakas; Tsfati et al.). In particular, the credibility of mainstream news is used to support an opposing discourse, that is, a disinformation discourse. This is especially problematic in the context of Covid-19 because the mainstream media end up fuelling the infodemic (Tangcharoensathien et al.) by sharing inaccurate information or reverberating false claims from political actors.

Conclusion

In this article, we analysed how the mainstream media contribute to the spread of disinformation about Covid-19. In particular, we looked at how links from the mainstream media with headlines that support disinformation discourse spread on Facebook, compared to other links from the mainstream media.
Two research questions guided this study: Are URLs with headlines that reinforce disinformation discourses and other mainstream media links shared in the same Facebook groups? Are the headlines that support disinformation discourses shared by Facebook users to reinforce disinformation narratives? We identified that (1) some Facebook groups only shared links with headlines that support disinformation narratives. This created a “disinformation bubble”. In this bubble, (2) Facebook users shared mainstream media links to reinforce disinformation – in particular, pro-Bolsonaro disinformation, as many of these groups had a pro-Bolsonaro alignment. In these cases, the mainstream media contributed to the spread of disinformation. Consequently, journalists ought to take extra care when producing news, especially headlines, which are often the most visible part of the stories on social media. This study has limitations. We analysed only a sample of links (n=20) based on three events in Brazil. Other events and other political contexts might result in different outcomes. Furthermore, we used CrowdTangle for data collection. CrowdTangle only provides information about public posts in groups monitored by the tool. Therefore, our results do not represent the whole of Facebook.

References

Allcott, Hunt, et al. “Polarization and Public Health: Partisan Differences in Social Distancing during the Coronavirus Pandemic.” National Bureau of Economic Research, Working Paper No. 26946 (2020). 6 Jan. 2021 <https://doi.org/10.3386/w26946>. Araújo, Ronaldo Ferreira, and Thaiane Moreira Oliveira. “Desinformação e Mensagens Sobre a Hidroxicloroquina no Twitter: Da Pressão Política à Disputa Científica.” Atoz – Novas Práticas em Informação e Conhecimento 9.2 (2020). 6 Jan. 2021 <http://dx.doi.org/10.5380/atoz.v9i2.75929>. Bakshy, Eytan, et al. “Exposure to Ideologically Diverse News and Opinion on Facebook.” Science 348.6239 (2015). 6 Jan.
2021 <https://science.sciencemag.org/content/348/6239/1130>. Benkler, Yochai, et al. Network Propaganda: Manipulation, Disinformation, and Radicalization in American Politics. New York: Oxford University Press, 2018. Blondel, Vincent D., et al. “Fast Unfolding of Communities in Large Networks.” Physics.soc-ph (2008). 6 Jan. 2021 <http://lanl.arxiv.org/abs/0803.0476>. Bruns, Axel. Are Filter Bubbles Real? Cambridge: Polity Press, 2019. CrowdTangle Team. CrowdTangle. Menlo Park, Calif.: Facebook, 2020. <https://apps.crowdtangle.com/search/>. Fallis, Don. “What Is Disinformation?” Library Trends 63.3 (2015): 401-426. Glasser, Susan B. “Covering Politics in a ‘Post-Truth’ America.” Brookings Institution Press, 2 Dec. 2016. 22 Feb. 2021 <https://www.brookings.edu/essay/covering-politics-in-a-post-truth-america/>. Gruzd, Anatoliy, and Philip Mai. “Going Viral: How a Single Tweet Spawned a COVID-19 Conspiracy Theory on Twitter.” Big Data & Society 7.2 (2020). 6 Jan. 2021 <https://doi.org/10.1177/2053951720938405>. Guess, Andrew, et al. Avoiding the Echo Chamber about Echo Chambers: Why Selective Exposure to Like-Minded Political News Is Less Prevalent than You Think. Miami: John S. and James L. Knight Foundation, 2018. Hayes, Arthur S., et al. “Shifting Roles, Enduring Values: The Credible Journalist in a Digital Age.” Journal of Mass Media Ethics 22.4 (2007): 262-279. 22 Feb. 2021 <https://doi.org/10.1080/08900520701583545>. Himma-Kadakas, Marju. “Alternative Facts and Fake News Entering Journalistic Content Production Cycle.” Cosmopolitan Civil Societies: An Interdisciplinary Journal 9.2 (2017). 6 Jan. 2021 <https://doi.org/10.5130/ccs.v9i2.5469>. Krippendorff, Klaus. Content Analysis: An Introduction to Its Methodology. California: Sage Publications, 2013. Larsson, Anders Olof. “News Use as Amplification – Norwegian National, Regional and Hyperpartisan Media on Facebook.” Journalism & Mass Communication Quarterly 96 (2019). 6 Jan.
2021 <https://doi.org/10.1177/1077699019831439>. Lindgreen, Simon, and Fredrik Palm. Textometrica Service Package (2011). 6 Jan. 2021 <http://textometrica.humlab.umu.se>. Lisboa, Lucas A., et al. “A Disseminação da Desinformação Promovida por Líderes Estatais na Pandemia da COVID-19.” Proceedings of the Workshop Sobre as Implicações da Computação na Sociedade (WICS), Porto Alegre: Sociedade Brasileira de Computação, 2020. 6 Jan. 2021 <https://doi.org/10.5753/wics.2020.11042>. Newman, Nic, et al. Reuters Institute Digital News Report 2018. Oxford: Oxford University, 2018. Ofcom. “Scrolling News: The Changing Face of Online News Consumption.” 2016. 23 Feb. 2021 <https://www.ofcom.org.uk/__data/assets/pdf_file/0022/115915/Scrolling-News.pdf>. Pariser, Eli. The Filter Bubble. New York: Penguin, 2011. Recuero, Raquel, and Felipe Soares. “O Discurso Desinformativo sobre a Cura do COVID-19 no Twitter: Estudo de Caso.” E-Compós (2020). 23 Feb. 2021 <https://doi.org/10.30962/ec.2127>. Recuero, Raquel, et al. “Polarization, Hyperpartisanship, and Echo Chambers: How the Disinformation about COVID-19 Circulates on Twitter.” Contracampo (2021, in press). 23 Feb. 2021 <https://doi.org/10.1590/SciELOPreprints.1154>. Soares, Felipe Bonow, et al. “Disputas discursivas e desinformação no Instagram sobre o uso da hidroxicloroquina como tratamento para o Covid-19.” Proceedings of the 43º Congresso Brasileiro de Ciências da Comunicação, Salvador: Intercom, 2020. 23 Feb. 2021 <http://www.intercom.org.br/sis/eventos/2020/resumos/R15-0550-1.pdf>. Tangcharoensathien, Viroj, et al. “Framework for Managing the COVID-19 Infodemic: Methods and Results of an Online Crowdsourced WHO Technical Consultation.” J Med Internet Res 22.6 (2020). 6 Jan. 2021 <https://doi.org/10.2196/19659>. Tsfati, Yariv, et al. “Causes and Consequences of Mainstream Media Dissemination of Fake News: Literature Review and Synthesis.” Annals of the International Communication Association 44.2 (2020): 157-173. 22 Feb. 
2021 <https://doi.org/10.1080/23808985.2020.1759443>. Wasserman, Stanley, and Katherine Faust. Social Network Analysis: Methods and Applications. Cambridge: Cambridge UP, 1994.
49

Thankavadivel, Ramanan, and Norman Ravikumar Muthurajah. "Open Access to Sri Lankan Scholarly Publications: a web survey of Sri Lankan Journals Online and e-Repositories". Septentrio Conference Series, no. 5 (24 November 2015). http://dx.doi.org/10.7557/5.3655.

Full text
Abstract
Scholarly publication has remained a challenge for young academics who yearn to share their work in high-impact journals. One barrier to accessing reputed databases is their high subscription rates. Another discouraging factor is that many of these journals charge article processing fees of varying amounts, which are unaffordable for authors with limited income. In general, universities/institutions do not cover publication fees. Thirdly, not every author succeeds at his/her first attempt to get their work accepted, since mastering academic writing requires training and experience. Even when authors find ways to publish their research in renowned journals, their universities find it difficult to subscribe to those expensive information resources. Therefore, open access has become not only an advantage to academic institutions that are deprived of adequate budgetary allocations for subscriptions but also a relief to novice academic writers. In this line, Sri Lankan Journals Online (SLJOL) and the e-repositories of universities in the country are making scholarly publications available as open access. SLJOL is a product of the International Network for the Availability of Scientific Publications (INASP) under the joint project of Journals Online. Since its inception in August 2008, SLJOL has set its objectives to widen the access to and visibility of research published in Sri Lanka. Since it uses Open Journal Systems, created by the Public Knowledge Project (based in Canada), journal contents are indexed through the Open Archives Initiative, which harvests article metadata. Thus, SLJOL provides an avenue for local authors to reach a global audience. Presently, SLJOL provides open access to 60 peer-reviewed journal titles published in Sri Lanka, covering a wide spectrum of disciplines.
Of the 60 journals, 36.66% are published by universities and affiliated institutions, 35% by institutes and learned societies, and 21.66% by professional organizations and associations, while 6.66% are contributed by research institutes. The subject coverage of the journals is as follows: Agriculture 13.33% (8), Architecture, Building and Planning 3.33% (2), Science 10% (6), Education 1.66% (1), Law 1.66% (1), IT and Computer Sciences 1.66% (1), Medicine and subjects allied to medicine 40% (24), Multidisciplinary 6.66% (4), Physical Sciences 1.66% (1), Humanities and Social Studies 5% (3), Management 8.33% (5), Environmental Science 1.66% (1), and Library and Information Science 3.33% (2). As for electronic repositories, most universities, research institutes and other academic societies in Sri Lanka have their own freely accessible digital collections. Currently, 17 digital repositories are listed on the Directory of Sri Lankan Institutional Repositories, which together contain 53,185 articles in various disciplines. The National Science Foundation (NSF) provides a national e-repository covering full-text scholarly literature of Sri Lankan origin. Research articles and reports, theses and dissertations, conference proceedings, abstracts of research sessions, and articles authored by academics and researchers of the institutions are stored in these repositories. Finally, the authors intend to raise awareness about the ever-proliferating journal business across academia worldwide. There has been skepticism about the authenticity and credibility of these ‘mushroom’ journals, and eminent academics are hesitant to publish their work in them. As a result, a gap has emerged between knowledge shared in well-known sources and easily affordable mushroom journals.
Therefore, scholarly publications on open access sources need to comply with standards, whilst giving research the opportunity to reach the grassroots level of society.
50

Foith, Michael. "Virtually Witness Augmentation Now: Video Games and the Future of Human Enhancement". M/C Journal 16, no. 6 (6 November 2013). http://dx.doi.org/10.5204/mcj.729.

Full text
Abstract
Introduction

Ever-enduring advancements in science and technology promise to offer solutions to problems or simply to make life a bit easier. However, not every advancement has only positive effects; some can also have undesired, negative ramifications. This article will take a closer look at Deus Ex: Human Revolution (DXHR), a dystopian video game which promises to put players in the position of deciding whether the science of human enhancement is a way to try to play God, or whether it enables us “to become the Gods we’ve always been striving to be” (Eidos Montreal, “Deus Ex: Human Revolution”). In this article I will argue that DXHR creates a space in which players can virtually witness future technologies for human performance enhancement without the need to alter their own bodies. DXHR is special in two respects in particular: first, the developers have achieved a high credibility and scientific realism for the enhancement technologies depicted in the game, which can be described as “diegetic prototypes” (Kirby, “The Future Is Now” 43); second, the game directly invites players to reflect upon the impact and morality of human enhancement. It does so through a story in line with the cyberpunk genre, which envisions not only the potential benefits of an emergent technology, but has an even stronger focus on the negative contingencies. The game and its developers foresee a near-future society that is split into two factions due to human enhancement technologies, which come in the form of neuro-implants and mechanical prosthetics; and they foresee a near-future setting in which people are socially and economically forced to undergo enhancement surgery in order to keep up with the augmented competition. DXHR is set in the year 2027 and the player takes control of Adam Jensen, an ex-SWAT police officer who is now the chief of security of Sarif Industries, one of the world's leading biotechnology companies that produce enhancement technologies.
Augmented terrorists attack Sarif Industries, abduct the head scientists, and nearly kill Jensen. Jensen only survives because his boss puts him through enhancement surgery, which replaces many parts of his body with mechanical augmentations. In the course of the game it becomes clear that Jensen has been augmented beyond any life-saving necessity, which grants him superhuman abilities and allows him to find and defeat the terrorists, but the augmentations also challenge his humanity. Is Jensen a human, a cyborg, or has he become more machine than man? DXHR grants players the illusion of immersion into a virtual world in which augmentations exist as a matter of fact and in which a certain level of control can be practiced. Players take up the role of a character distinctly more powerful and capable than the person in control, exceeding the limits of human abilities. The superior abilities are a result of scientific and technological advancements, implying that every man or woman is able to attain the same abilities by simply acquiring augmentations. Thus, with the help of the playable character, Adam Jensen, the game lets players experience augmentations without any irreparable damage done to their bodies, but the experience will leave a lasting impression on players regarding the science of human enhancement. The experience with augmentations happens through and benefits from the effect of “virtual witnessing”: The technology of virtual witnessing involves the production in a reader’s mind of such an image of an experimental scene as obviates the necessity for either direct witness or replication. Through virtual witnessing the multiplication of witnesses could be, in principle, unlimited. (Shapin and Schaffer 60) In other words, simply by reading about and/or seeing scientific advancements, audiences can witness them without having to be present at the site of creation.
The video game, hereby, is itself the medium of virtual witnessing whereby audiences can experience scientific advancements. Nevertheless, the video game is not just about reading or seeing potential future enhancement technologies, but permits players to virtually test-drive augmentations—to actually try out three-dimensionally rendered prototypes on a virtual body. In order to justify this thesis, a couple of things need to be clarified that explain in which ways the virtual witnessing of fictional enhancements in DXHR is a valid claim.

Getting into the Game

First I want to briefly describe how I investigated the stated issue. I have undertaken an auto-ethnography (Ellis, Adams, and Bochner) of DXHR, which concretely means that I have analytically played DXHR in an explorative fashion (Aarseth), trying to discover as many elements on human enhancement as the game has to offer. This method requires not only close observation of the virtual environment and documentation through field notes and screenshots, but also self-reflection on the actions that I chose to take and that were offered to me in the course of the game. An essential part of analytically playing a game is to be aware that the material requires “the activity of an actual player in order to be accessible for scrutiny” (Iversen), and that the player’s input fundamentally shapes the gaming experience (Juul 42). The meaning of the game is contingent upon the contribution of the player, especially in times in which digital games grant players more and more freedom in terms of narrative construction. In contrast to traditional narrative, the game poses an active challenge to the player which entails the need to become better in relation to the game’s mechanics, and hence “studying games … implies interacting with the game rules and exploring the possibilities created by these rules, in addition to studying the graphical codes or the narration that unfolds” (Malliet).
It is important to highlight that, although the visual representation of human enhancement technologies has an enormous potential impact on the player’s experience, it is not the only crucial element. Next to the representational shell, the core of the game, i.e. “how game rules and interactions with game objects and other players are structured” (Mäyrä 165), shapes the virtual witnessing of the augmentations in just as important a way. Finally, the empirical material that was collected was analyzed and interpreted with the help of close-reading (Bizzocchi and Tanenbaum 395). In addition to the game itself, I have enriched my empirical material with interviews with developers of the game, which are partly freely available on the Internet, and with promotional material such as the trailers and a website (Eidos Montreal, “Sarif Industries”) that was released prior to the game.

Sociotechnical Imaginaries

In this case study of DXHR I have not only investigated how augmented bodies and enhancement technologies are represented in this specific video game, but also attempted to uncover which “sociotechnical imaginaries” (Jasanoff and Kim) underlie the game and support the virtual witnessing experience. Sociotechnical imaginaries are defined as “collectively imagined forms of social life and social order reflected in the design and fulfillment of nation-specific scientific and/or technological projects” (Jasanoff and Kim 120). The concept appeared suitable for this study as it covers and includes “promises, visions and expectations of future possibilities” (Jasanoff and Kim 122) of a technology as well as “implicit understandings of what is good or desirable in the social world writ large” (Jasanoff and Kim 122–23). The game draws upon several imaginaries of human enhancement.
For example, the most basic imaginary in the game is that advanced engineered prosthetics and implants will be able not only to remedy dysfunctional parts of the human body, but to upgrade them. Apart from this idea, the two prevailing sociotechnical imaginaries that forward the narrative can be subsumed as the transhumanist and the purist imaginary. The latter views human enhancement, with the help of science and technology, as unnatural and as a threat to humanity, particularly through the power that it grants to individuals, while the former conveys the opposing view. Transhumanism is: the intellectual and cultural movement that affirms the possibility and desirability of fundamentally improving the human condition through applied reason, especially by developing and making widely available technologies to eliminate aging and to greatly enhance human intellectual, physical, and psychological capacities. (Chrislenko et al.) The transhumanist imaginary in the game views technological development of the body as another step in human evolution, not as something abhorrent to nature, but as a fundamental human quality. Similar ideas can be found in the writings of Sigmund Freud and Arnold Gehlen, who both view the human being’s need to improve as part of its culture. Gehlen described the human as a “Mängelwesen”—a ‘deficient’ creature—who is, in contrast to other species, not specialized to a specific environment, but has the ability to adapt to nearly every situation because of this deficiency (Menne, Trutwin, and Türk). Freud even denoted the human a “Prothesengott”—a god of prostheses: By means of all his tools, man makes his own organs more perfect—both the motor and the sensory—or else removes the obstacles in the way of their activity. Machinery places gigantic power at his disposal which, like his muscles, he can employ in any direction; ships and aircraft have the effect that neither air nor water can prevent his traversing them.
With spectacles he corrects the defects of the lens in his own eyes; with telescopes he looks at far distances; with the microscope he overcomes the limitations in visibility due to the structure of his retina. (Freud 15)

Returning to DXHR, how do the sociotechnical imaginaries matter for the player? Primarily, the imaginaries cannot be avoided, as they pervade nearly every element in the game, from the main story that hinges upon human enhancement, over the many optional side missions, to contextual elements such as a conference on “the next steps in human evolution” (Eidos Montreal, “Deus Ex: Human Revolution”). Most importantly, they impact the player’s view in a crucial way. Human enhancement technologies are presented as controversial, neither exclusively good nor bad, requiring reflection and perhaps even legal regulation. In this way, DXHR can be seen as offering the player a restricted building set of sociotechnical imaginaries of human enhancement, whereby the protagonist, Adam Jensen, becomes the player’s vessel for constructing one’s own individual imaginary. In the end the player is forced to choose one of four outcomes to complete the game, and this choice can be quite difficult to make.

Anticipation of the Future

It is not unusual for video games to feature futuristic technologies that do not exist in the real world, but what makes DXHR distinct from others is that the developers have included an extent of information that goes beyond any game-playing necessity (see Figures 1 & 2). Moreover, the information is not fictional; the developers have taken strategic steps to make it credible. Mary DeMarle, the narrative designer, explained at the San Diego Comic-Con in 2011 that a timeline of augmentation was created during the production phase in which the present state of technology was extrapolated into the future. In small incremental steps the developers anticipated which enhancement technologies might be potentially feasible by the year 2027.
Their efforts were supported by the science consultant, Will Rosellini, who voluntarily approached the development team to help. Being a neuroscientist, he could not have been a more fitting candidate for the job, as he is actively working and researching in the biotechnology sector. He has co-founded two companies: MicroTransponder Inc., which produces tiny implantable wireless devices that interface with the nervous system to remedy diseases (see Rosellini’s presentation at the 2011 Comic-Con), and Rosellini Scientific, which funds, researches and develops advanced technological healthcare solutions (Rosellini; Rosellini Scientific). Due to the timeline, which has been embedded explicitly and implicitly, no augmentation appears in the game as a disembodied technology without history. For example, although the protagonist wears top-notch military arm prostheses that appear very human-like, this prosthesis is depicted as one of the latest iterations, and many non-playable characters possess arm prostheses that appear a lot older, cruder and more industrial than those of Jensen. Furthermore, an extensive description employing scientific jargon can be read for each of the augmentations on the augmentation overview screen, including details about the material composition and bodily locations of the augmentations.

Figure 1: More Info Section of the Cybernetic Arm Prosthesis as it appears in-game (all screenshots taken with permission from Deus Ex: Human Revolution (2011), courtesy of Eidos Montreal)

More details are provided through eBooks, which are presented in the form of scientific articles or conference proceedings; upon finding these, the explorative gamer is also rewarded with valuable experience points, which are used to activate and upgrade augmentations. The eBooks also reflect the timeline, as each eBook is equipped with a year of publication between 2001 and 2022.
Despite the fact that these articles have supposedly been written by a fictional character, the information is authentic and taken from actual scientific research papers, and some of these articles even include a proper scientific citation.

Figure 2: Example of a Darrow eBook

The fact that a scientist was involved in the production of the game allows classifying the augmentations as “diegetic prototypes”, which are “cinematic depictions of future technologies … that demonstrate to large public audiences a technology’s need, benevolence and viability” (Kirby, “The Future Is Now” 43). Diegetic prototypes are fictional, on-screen depictions of technologies that do not exist in that form in real life and have been created with the help of a science consultant. They have been placed in movies to allay anxieties and doubts and perhaps even to provoke a longing in audiences to see the depicted technologies become reality (Kirby, “The Future Is Now” 43). Of course the aesthetic appearance of the prototypes has an impact on audiences’ desire, particularly the artificial arms of Jensen, which have been designed in an alluring fashion, as can be seen in the following figure:

Figure 3: Adam Jensen and arm prosthesis

An important fact about diegetic prototypes—and about prototypes (see Suchman, Trigg, and Blomberg) in general—is that they are put to specific use and are embedded and presented in an identifiable social context. Technological objects in cinema are at once both completely artificial—all aspects of their depiction are controlled—and normalized as practical objects. Characters treat these technologies as a ‘natural’ part of their landscape and interact with these prototypes as if they are everyday parts of their world. … fictional characters are ‘socializing’ technological artifacts by creating meanings for the audience, ‘which is tantamount to making the artifacts socially relevant’.
(Kirby, “Lab Coats” 196)

The power of DXHR is that the diegetic prototypes—the augmentations—are not only based on real-world scientific developments and contextualized in a virtual social space, but that the player has the opportunity to handle the augmentations.

Virtual Testing

Virtual witnessing of the not-yet-existent augmentations is supported by the scientific descriptions, articles, and appearance of the technologies in DXHR, but the moral and ethical engagement is established by the player’s ability to actively use the augmentations and by the provision of choice in how to use them. As mentioned, most of the augmentations are inactive and must first be activated by accumulating and spending experience points on them. This requires the player to reflect on the potential usage and how a particular augmentation will lead to the successful completion of a mission. This means that the player has to constantly decide how s/he wants to play the game. Do I want to be able to hack terminals and computers, or do I prefer getting mission-critical information by confronting people in conversation? Do I want to search for routes where I can avoid enemy detection, or do I prefer taking the direct route through the enemy lines with heavy guns in hand? This recurring reflection on which augmentation to choose, and their continuous usage throughout the game, causes the selected augmentations to become valuable and precious to the player, because they transform from augmentations into frequently used tools that facilitate challenge and reduce the difficulty of certain situations. In addition, the developers have ensured that no matter which approach is taken, it will always lead to success. This way the role-playing elements of the game are accentuated and each player will construct their own version of Jensen. However, it may be argued that DXHR goes beyond mere character building.
Not only is a breadth of information and opinions on human enhancement offered; the choices that players make also invite them to reflect upon the topic of human enhancement. Among the most conspicuous instances in the game that involve the player’s choice are the conversations with non-playable characters. These are events which require the player to choose one out of three responses for Jensen, and these responses determine, to some extent, Jensen’s attitude towards human enhancement. Thus, in the course of the game players might discover their own convictions and compose their own imaginary of human enhancement.

Conclusion

This article has shown how DXHR enables players to experience augmentations without being modified themselves. The game is filled with various sociotechnical imaginaries of prosthetic and neurological human enhancement technologies. The relevance of these imaginaries is increased by a high degree of credibility, as a science consultant has ensured that the fictional augmentations are founded upon real-world scientific advancements. The main story, and much of the virtual world, hinge upon the existence and controversy of these sorts of technologies. Finally, the medium of the videogame allows players to take control of an individual who is heavily augmented with diegetic prototypes of future enhancement technologies, and to use and test the increased abilities in various situations and challenges. All these elements combined enable players to virtually witness not-yet-existent, future augmentations safely in the present, without the need to undertake any alterations of their own bodies. This, in addition to the fact that the technologies are depicted in an appealing fashion, may create a desire in players to see these augmentations become reality. Nevertheless, DXHR provides an important incentive to think critically about the future of human enhancement technologies.

References

Aarseth, Espen. “Playing Research: Methodological Approaches to Game Analysis.” DAC Conference, Melbourne, 2003. 14 Apr. 2013 ‹http://hypertext.rmit.edu.au/dac/papers/Aarseth.pdf›.
Bizzocchi, J., and J. Tanenbaum. “Mass Effect 2: A Case Study in the Design of Game Narrative.” Bulletin of Science, Technology & Society 32.5 (2012): 393-404.
Chrislenko, Alexander, et al. “Transhumanist FAQ.” humanity+. 2001. 18 July 2013 ‹http://humanityplus.org/philosophy/transhumanist-faq/#top›.
Eidos Montreal. “Deus Ex: Human Revolution.” Square Enix. 2011. PC.
———. “Welcome to Sarif Industries: Envisioning a New Future.” 2011. 14 Apr. 2013 ‹http://www.sarifindustries.com›.
Ellis, Carolyn, Tony E. Adams, and Arthur P. Bochner. “Autoethnography: An Overview.” Forum Qualitative Sozialforschung 12.1 (2010): n. pag. 9 July 2013 ‹http://www.qualitative-research.net/index.php/fqs/article/view/1589/3095›.
Freud, Sigmund. Civilization and Its Discontents. Aylesbury, England: Chrysoma Associates Limited, 1929.
Iversen, Sara Mosberg. “In the Double Grip of the Game: Challenge and Fallout 3.” Game Studies 12.2 (2012): n. pag. 5 Feb. 2013 ‹http://gamestudies.org/1202/articles/in_the_double_grip_of_the_game›.
Jasanoff, Sheila, and Sang-Hyun Kim. “Containing the Atom: Sociotechnical Imaginaries and Nuclear Power in the United States and South Korea.” Minerva 47.2 (2009): 119-146.
Juul, Jesper. “A Clash between Game and Narrative.” MA thesis. U of Copenhagen, 1999. 29 May 2013 ‹http://www.jesperjuul.net/thesis/›.
Kirby, David A. Lab Coats in Hollywood. Cambridge, Massachusetts: MIT Press, 2011.
———. “The Future Is Now: Diegetic Prototypes and the Role of Popular Films in Generating Real-World Technological Development.” Social Studies of Science 40.1 (2010): 41-70.
Malliet, Steven. “Adapting the Principles of Ludology to the Method of Video Game Content Analysis.” Game Studies 7.1 (2007): n. pag. 28 May 2013 ‹http://gamestudies.org/0701/articles/malliet›.
Mäyrä, F. An Introduction to Game Studies. London: Sage, 2008.
Menne, Erwin, Werner Trutwin, and Hans J. Türk. Philosophisches Kolleg Band 4: Anthropologie. Düsseldorf: Patmos, 1986.
Rosellini, Will, and Mary DeMarle. “Deus Ex: Human Revolution.” Comic Con. San Diego, 2011. Panel.
Rosellini Scientific. “Prevent. Restore. Enhance.” 2013. 25 May 2013 ‹http://www.roselliniscientific.com›.
Shapin, Steven, and Simon Schaffer. Leviathan and the Air-Pump: Hobbes, Boyle and the Experimental Life. Princeton: Princeton University Press, 1985.
Suchman, Lucy, Randall Trigg, and Jeanette Blomberg. “Working Artefacts: Ethnomethods of the Prototype.” The British Journal of Sociology 53.2 (2002): 163-79.

Image Credits

All screenshots taken with permission from Deus Ex: Human Revolution (2011), courtesy of Eidos Montreal.