Academic literature on the topic 'Human and organizational errors'
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Human and organizational errors.'
Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.
Journal articles on the topic "Human and organizational errors"
Gronewold, Ulfert, and Michaela Donle. "Organizational Error Climate and Auditors' Predispositions toward Handling Errors." Behavioral Research in Accounting 23, no. 2 (November 1, 2011): 69–92. http://dx.doi.org/10.2308/bria-10061.
Göktürk, Söheyda, Oguzhan Bozoglu, and Gizem Günçavdi. "Error management practices interacting with national and organizational culture." Learning Organization 24, no. 4 (May 8, 2017): 245–56. http://dx.doi.org/10.1108/tlo-07-2016-0041.
Tomić, Silvia, Milovan Lazarević, Leposava Grubić-Nešić, Danijela Ćirić Lalić, and Jelena Kanjuh. "Human error management approach in practice: the use of HERCA tool for a systematic analysis of human errors." Journal of East European Management Studies 27, no. 4 (2022): 637–61. http://dx.doi.org/10.5771/0949-6181-2022-4-637.
Tang, Jun Xi, Li Cheng Wang, Peng Jia Shi, Zhao Li, Su Hong Pang, and Chuang Xin Guo. "Research on the Influence Factors System of Human Error in Power System." Advanced Materials Research 988 (July 2014): 687–90. http://dx.doi.org/10.4028/www.scientific.net/amr.988.687.
Paramanantham, Shampave, and Sidath Liyanage. "Assessing the Impact of Human Error Assessment on Organization Performance in the Software Industry." International Journal of Information Systems and Social Change 14, no. 1 (January 1, 2023): 1–32. http://dx.doi.org/10.4018/ijissc.314563.
Emby, Craig, Bin Zhao, and Jost Sieweke. "Audit Senior Modeling Fallibility: The Effects of Reduced Error Strain and Enhanced Error-Related Self-Efficacy on Audit Juniors' Responses to Self-Discovered Errors." Behavioral Research in Accounting 31, no. 2 (June 1, 2019): 17–30. http://dx.doi.org/10.2308/bria-52471.
Shi, Xiaobo, Yan Liu, Dongyan Zhang, Ruixu Li, Yaning Qiao, Alex Opoku, and Caiyun Cui. "Influencing Factors of Human Errors in Metro Construction Based on Structural Equation Modeling (SEM)." Buildings 12, no. 10 (September 21, 2022): 1498. http://dx.doi.org/10.3390/buildings12101498.
Ramanujam, Rangaraj, and Paul S. Goodman. "Latent errors and adverse organizational consequences: a conceptualization." Journal of Organizational Behavior 24, no. 7 (2003): 815–36. http://dx.doi.org/10.1002/job.218.
Grohnert, Therese, Roger H. G. Meuwissen, and Wim H. Gijselaers. "Valuing errors for learning: espouse or enact?" Journal of Workplace Learning 29, no. 5 (July 10, 2017): 394–408. http://dx.doi.org/10.1108/jwl-11-2016-0102.
Asgarian, Azadeh, Keivan Ghassami, Farahnaz Heshmat, Abolfazl Mohammadbeigi, and Mohammad Abbasinia. "Barriers and Facilitators of Reporting Medical Errors in a Hospital: A Qualitative Study." Archives of Hygiene Sciences 10, no. 4 (October 1, 2021): 279–88. http://dx.doi.org/10.32598/ahs.10.4.251.2.
Dissertations / Theses on the topic "Human and organizational errors"
Taylor-Hyde, Mary Ellen. "Human Resource Strategies for Improving Organizational Performance to Reduce Medical Errors." ScholarWorks, 2017. https://scholarworks.waldenu.edu/dissertations/3580.
Baltazar, Ana Rita Duarte Gomes Simões. "Erro humano e erro organizacional nas atividades de manutenção das aeronaves na perspetiva da Grounded Theory : o caso nacional." Doctoral thesis, Instituto Superior de Economia e Gestão, 2020. http://hdl.handle.net/10400.5/20577.
Full textNos últimos anos ocorreram situações que demonstram que os acidentes em organizações de elevada fiabilidade têm consequências catastróficas que precisam de ser contidas ou evitadas. As medidas para a contenção e prevenção do erro estão estabelecidas nesse tipo de organizações, mas focalizam-se em evitar as consequências negativas dos erros, não analisando as consequências positivas dos mesmos (quando existem). A literatura aponta como consequências positivas a aprendizagem, a inovação e a resiliência. O trabalho conclui que de forma conceptual a consequência positiva dos erros é um aumento da Segurança Organizacional através de processos de melhoria associados à Aprendizagem Organizacional. O erro humano não deve ser primariamente entendido como a principal causa dos acidentes, mas antes como uma possível consequência da atividade organizacional. Foi necessário compreender como (How) ocorre e porque (Why) ocorre o erro organizacional; e, ainda, qual a relação entre os diferentes níveis de erro (humano, de equipa e organizacional) e os fatores organizacionais. Esta abordagem transportou o investigador para a necessidade de uma análise aprofundada do conceito de condições/erros latentes. O conhecimento das causas primárias de um incidente/acidente poderá levar a que se criem indicadores que sirvam de alertas em situações futuras e/ou se alterem essas mesmas condições para que se evitem situações idênticas. Verificou-se neste trabalho que cada incidente/acidente, depois de estudado, é uma fonte de informação absolutamente essencial para a melhoria do sistema. No entanto, existem outras fontes que necessitam de ser mais estimuladas, nomeadamente, o reporte de ocorrências e a correspondente análise e partilha de resultados na Organização. A investigação recorre a uma metodologia qualitativa e os resultados aplicam-se apenas à Organização em estudo. 
O modelo final explica como através do erro de manutenção aeronáutica, na Força Aérea Portuguesa, se aumenta a Segurança Organizacional.
In recent years, situations have occurred which demonstrate that accidents in High Reliability Organizations have catastrophic consequences that need to be contained or avoided. Measures to contain and prevent errors are established in this type of organization, but they focus on avoiding the negative consequences of errors and do not analyze their positive consequences (when these exist). The literature points to learning, innovation, and resilience as positive consequences. The study concludes that, conceptually, the positive consequence of errors is an increase in Organizational Safety through improvement processes associated with Organizational Learning. Human error should not be understood primarily as the main cause of accidents, but rather as a possible consequence of organizational activity. It was necessary to understand how and why organizational errors occur, and what the relationship is between the different levels of error (human, team, and organizational) and organizational factors. This approach led the researcher to an in-depth analysis of the concept of latent conditions/errors. Knowing the root cause of an incident/accident may lead to the creation of indicators that serve as warnings in future situations, and/or to changing those conditions so that similar situations are avoided. This study found that each incident/accident, once studied, is an absolutely essential source of information for improving the system. However, other sources need to be stimulated further, namely the reporting of occurrences and the corresponding analysis and sharing of results within the organization. The research uses a qualitative methodology and the results apply only to the organization under study. The final model explains how Organizational Safety is increased through aeronautical maintenance error in the Portuguese Air Force.
Barbarini, Luiz Henrique Maiorino. "Análise de risco para embarcações com sistemas de alarmes com foco nos fatores humanos e organizacionais." Universidade de São Paulo, 2012. http://www.teses.usp.br/teses/disponiveis/3/3135/tde-19102012-104521/.
This work presents a risk analysis model for ships, focusing on scenarios where the crew interacts with the alarm and monitoring system. According to statistics from classification societies, humans are largely responsible for accidents on board and are therefore considered a major component of vessel safety. The relevance of the human element follows from the fact that human decisions and actions are related both to the cause of accidents, either as the direct causative factor of a failure or by influencing the probability of failure, and to the prevention of accidents or the mitigation of their consequences. The alarm system is a mandatory component of certified vessels and plays a direct part in accident scenarios: it supplies information for the crew's decision-making process as they act to recover the system. Studying the interactions between this automation equipment and the human element on board provides guidelines for managers and owners to invest in proper safety systems and in policies that influence human behavior, and therefore safety on board. The model, inspired by accident reports, takes as its starting point a sequential structure of the accident and considers a typical, simplified sequence of events beginning with a failure in the physical system. The human element is incorporated into the risk analysis through human reliability analysis techniques, which treat the human as another component of the system, the "liveware" interacting with software and hardware. From this point of view, a socio-technical approach is applied, considering that a ship is composed not only of its structure and machinery but also of its entire crew. To illustrate the steps and assumptions an analyst would make in applying the proposed model, the accident of the vessel Maersk Doha, which occurred in October 2006 in the United States, is analyzed.
The report on the investigation of this accident is public and accessible via the website of the Marine Accident Investigation Branch (MAIB).
Videira, Rogerio Luiz da Rocha. "Acurácia diagnóstica, análise da decisão e heurísticas relacionadas à decisão clínica intuitiva de usar antagonista de bloqueador neuromuscular." Universidade de São Paulo, 2010. http://www.teses.usp.br/teses/disponiveis/5/5152/tde-01022011-165044/.
BACKGROUND: Residual curarization is associated with a higher risk of death after anesthesia. Diagnostic errors after the use of neuromuscular blocking agents (NMBA) are related to a 65-88% prevalence of preextubation residual curarization (PERC). This study analyzed the clinical intuitive decision to antagonize NMBA before tracheal extubation. METHODS: After IRB approval, this clinical decision was audited in 150 patients. Participation in the study was voluntary and anonymous. Decisions, treated as a diagnostic test, were compared to acceleromyography, with TOF<0.9 defined as PERC. A decision tree was structured to compare different decision strategies. A sequential survey (Delphi) was conducted among 108 anesthesiologists to elicit the most frequently used heuristics (rules of thumb). RESULTS: PERC prevalence was 77%. Clinical intuition presented a sensitivity of 0.35 (0.23-0.49) and a specificity of 0.80 (0.54-0.94) (P=0.0001). On a 0-10 rating scale, the expected utility of intuition was lower than that of antagonizing all patients (4.1 ± 4.4 vs. 8.4 ± 3.0, P<0.05). The most salient heuristics were "Short interval since the last NMBA dose" and "Breathing pattern is inadequate," stated by 73% and 71% of the anesthesiologists, respectively. One hour after a single dose of atracurium compared with rocuronium, 69.3% vs. 47.1% (P=0.0035) of the anesthesiologists do not use an antagonist before tracheal extubation. They perceive the prevalence of clinically significant residual curarization to be higher in their colleagues' practice than in their own clinical practice (odds ratio 7.8 (3.8-16.2), P<0.0001). CONCLUSIONS: Clinical intuition should not be used to rule out residual curarization. Routine antagonism is a better strategy than the use of clinical intuition to make this decision. Clinicians make this intuitive decision based on a forecast of the duration of NMBA effects and on a qualitative judgement about the adequacy of the patient's breathing pattern.
They consider themselves more capable of avoiding residual curarization than their colleagues. They are overconfident in their own capacity to predict NMBA duration and to intuitively rule PERC out.
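As a back-of-envelope illustration of why a test with these properties cannot rule out PERC, one can compute predictive values from the reported prevalence, sensitivity, and specificity via Bayes' rule. The sketch below uses the abstract's numbers (prevalence 0.77, sensitivity 0.35, specificity 0.80); the function itself is an illustration added here, not part of the thesis.

```python
def predictive_values(prevalence, sensitivity, specificity):
    """Return (PPV, NPV) of a diagnostic test via Bayes' rule."""
    tp = prevalence * sensitivity            # true positives
    fn = prevalence * (1 - sensitivity)      # false negatives (missed PERC)
    tn = (1 - prevalence) * specificity      # true negatives
    fp = (1 - prevalence) * (1 - specificity)
    return tp / (tp + fp), tn / (tn + fn)

# Numbers reported in the abstract
ppv, npv = predictive_values(0.77, 0.35, 0.80)
print(f"PPV = {ppv:.2f}, NPV = {npv:.2f}")  # → PPV = 0.85, NPV = 0.27
```

With a negative predictive value of only about 0.27, a clinician's intuition that "no residual curarization is present" is wrong roughly three times out of four in this population, which is consistent with the abstract's conclusion.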
Humanson, Richard, and Patrik Nordeman. "Proactive Crisis Management (PCM) : Perceptions of crisis-awareness and crisis-readiness in organizations in relation with their actual strategic initiatives against industrial crises caused by human errors." Thesis, Blekinge Tekniska Högskola, Institutionen för industriell ekonomi, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-15519.
Al-Shirawi, Ali. "Medical errors: defining the confines of system weaknesses and human errors." Thesis, McGill University, 2011. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=97142.
Despite innovative changes in biotechnology, medical equipment, and other therapeutic approaches, errors in the practice of medicine continue to cause medical problems for a significant number of patients. Current definitions of medical error do not reflect the full reality of error causation. The taxonomy of medical error is also strictly focused on system weaknesses within healthcare institutions and on human error. Weaknesses in the systems that authorize and oversee organizations, public healthcare providers, the regulation of the health professions, governmental regulatory bodies for the health professions and the conduct of health professionals, and the risks of the medical research industry all cause significant problems that are not currently and explicitly recognized for their responsibility in medical errors. These actors do not exercise the authority they currently hold. The evidence shows negligence, incompetence, unethical conduct, institutional interest, and personal interest in the decision-making processes of these bodies. That is, the thesis takes the position that the principles of institutional ethics are powerful instruments for enforcing the accountability of all actors. The contemporary view of medical errors is deficient and unsustainable; it has not contributed to reducing medical errors. A formulation of the necessary definitions of the limitations of systems linked to human beings is required. This thesis proposes a way of viewing medical errors that reaches the many agents of error and harm within a system, thereby emphasizing accountability and opening the way to reform.
Abu Hawwach, Mohammed. "Human errors in industrial operations and maintenance." Thesis, Mälardalens högskola, Innovation och produktrealisering, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:mdh:diva-54794.
Khan, Mohammad Ali, and Majid Nasir. "Human Errors and Learnability Evaluation of Authentication System." Thesis, Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation, 2011. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-4054.
This study addressed users' usability experiences by exploring human errors and the learnability of authentication systems. The authors conducted a case study, gathering data through observation and interviews, and then analyzed it with SHERPA (to evaluate human errors) and the Grossman et al. learnability metrics (to evaluate learnability). First, the authors identified human errors and learnability issues in the authentication systems from the users' perspective in the gathered raw data. Further analysis of the summarized data then identified the features of the authentication systems that affect these human errors and learnability issues. Using the gathered information, the authors compared two categories of authentication systems, 1-factor and multi-factor. Finally, they argued for possible updates to SHERPA's human error metrics and for additional measurable learnability issues beyond the Grossman et al. metrics. The studied authentication systems are not free of human errors: the authors identified eight human errors associated with them and three system features that influence those errors. These errors occurred when participants took too long to locate the login menu or button or to select the correct login method, and eventually took too long to log in. Errors also occurred when participants failed to operate the code-generating devices, failed to retrieve information from error messages or supporting documents, and/or eventually failed to log in. As these human errors are identifiable and predictable through SHERPA, they can be addressed as well.
The authors also found that the studied authentication systems have learnability issues and identified nine of them. These issues were identified when very few users could complete the task optimally or without any help from the documentation. Issues were also identified by analyzing participants' task completion times after reviewing the documentation, their operation of the code-generating devices, and their average errors while performing the task. These learnability issues were identified through the Grossman et al. learnability metrics, and the authors believe further study of them can improve the learnability of authentication systems. Overall, the authors believe more studies should be conducted on the identified human errors and learnability issues to improve the current situation of the studied authentication systems, and that these issues should also be taken into consideration when developing future authentication systems. They believe the outcome of this study will help researchers propose more usable, yet still secure, authentication systems. Finally, the authors proposed some potential research areas which they believe will make important contributions to current knowledge. In this study, the authors used SHERPA to identify human errors. Though SHERPA (and its metrics) is arguably one of the best methods for evaluating human errors, the authors believe there is room for improvement in its metrics: human perception and knowledge change over time, and SHERPA's human error metrics can be updated to meet that challenge. The Grossman et al. learnability metrics were used to identify learnability issues; the authors believe that improving the current metrics and adding new ones may identify more.
The evaluation of learnability issues would be improved if researchers could agree upon a single definition of learnability; the authors believe more studies should be conducted toward a more widely accepted definition for further research. Finally, more studies should be conducted on remedial strategies for the identified human errors and on improving the identified learnability issues, which the authors believe will help researchers propose more usable, yet still secure, authentication systems.
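The completion-time and error-count measures mentioned in this abstract can be aggregated very simply across repeated attempts. The sketch below is a hypothetical illustration of such learnability indicators (the trial data and the aggregation are invented for illustration, not the exact Grossman et al. formulation):

```python
from statistics import mean

def learnability_indicators(trials):
    """Aggregate per-attempt measurements into simple learnability indicators.

    trials: list of dicts with 'time_s' (task completion time in seconds)
    and 'errors' (error count) for each successive attempt.
    """
    times = [t["time_s"] for t in trials]
    # Relative speed-up from first to last attempt: a crude learning signal
    improvement = (times[0] - times[-1]) / times[0]
    return {
        "mean_time_s": mean(times),
        "mean_errors": mean(t["errors"] for t in trials),
        "relative_improvement": improvement,
    }

# Hypothetical login attempts with a multi-factor authentication system
trials = [
    {"time_s": 120, "errors": 3},  # first attempt, unfamiliar UI
    {"time_s": 75, "errors": 1},
    {"time_s": 40, "errors": 0},   # after reading the documentation
]
print(learnability_indicators(trials))
```

A system whose `relative_improvement` stays near zero across attempts would be one where users are not learning, which is the kind of signal the thesis's learnability analysis looks for.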
Seastrunk, Chad Stephen. "Algorithm to Systematically Reduce Human Errors in Healthcare." NCSU, 2005. http://www.lib.ncsu.edu/theses/available/etd-12012005-073356/.
Barroso, Monica Frias da Costa Paz. "Human error and disturbance occurrence in manufacturing systems." Thesis, University of Nottingham, 2000. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.342060.
Books on the topic "Human and organizational errors"
Hofinger, Gesine, and Cornelius Buerschaper, eds. Crisis management in acute care settings: Human factors and team psychology in a high stakes environment. Berlin: Springer, 2008.
Advances in human factors and ergonomics in healthcare. Boca Raton: CRC Press, 2011.
Gill, Geoffrey W. Maritime error management: Discussing and remediating factors contributory to casualties. Atglen, PA: Cornell Maritime Press, 2011.
Turkstra, Carl J. Human error and organization factors in bridge design and construction. Downsview, Ont: Research and Development Branch, Ontario Ministry of Transportation, 1991.
Tannenbaum, Robert, Newton Margulies, and Fred Massarik, eds. Human systems development. San Francisco: Jossey-Bass Publishers, 1985.
Human performance consulting: Transforming human potential into productive business performance. Houston, Tex: Gulf Pub., 2000.
Bogner, Marilyn Sue, ed. Human error in medicine. Hillsdale, N.J: L. Erlbaum Associates, 1994.
Jones, Terry L., RN, ed. Creating a just culture: A nurse leader's guide. Danvers, MA: HCPro, 2011.
Book chapters on the topic "Human and organizational errors"
Licao, Dai, Li Hu, Chen Jianhua, Lu Wenjie, and Li Pengcheng. "Organizational Resilience Model in a Nuclear Power Plant." In Advances in Human Error, Reliability, Resilience, and Performance, 235–43. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-20037-4_21.
Lu, Yi, Huayan Huangfu, Shuguang Zhang, and Shan Fu. "Organizational Risk Dynamics Archetypes for Unmanned Aerial System Maintenance and Human Error Shaping Factors." In Advances in Human Error, Reliability, Resilience, and Performance, 75–87. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-20037-4_7.
Kontogiannis, Tom, and Stathis Malakis. "Human Error Detection and Recovery." In Cognitive Engineering and Safety Organization in Air Traffic Management, 163–96. Boca Raton: CRC Press, 2017. http://dx.doi.org/10.1201/b22178-6.
Kore, Akshay. "Handling Errors." In Designing Human-Centric AI Experiences, 281–323. Berkeley, CA: Apress, 2022. http://dx.doi.org/10.1007/978-1-4842-8088-1_6.
Leacock, Claudia, Martin Chodorow, Michael Gamon, and Joel Tetreault. "Collocation Errors." In Synthesis Lectures on Human Language Technologies, 63–71. Cham: Springer International Publishing, 2010. http://dx.doi.org/10.1007/978-3-031-02137-4_7.
Carbery, Ronan. "Organizational Learning." In Human Resource Development, 84–102. London: Macmillan Education UK, 2015. http://dx.doi.org/10.1007/978-1-137-36010-6_5.
Lovallo, Dan. "From Individual Biases to Organizational Errors." In Organization and Strategy in the Evolution of the Enterprise, 103–23. London: Palgrave Macmillan UK, 1996. http://dx.doi.org/10.1007/978-1-349-13389-5_5.
Chèze, Laurence. "Errors in Measurement." In Kinematic Analysis of Human Movement, 59–72. Hoboken, NJ, USA: John Wiley & Sons, Inc., 2014. http://dx.doi.org/10.1002/9781119058144.ch4.
Philip, Pierre, Cyril Chaufton, Lino Nobili, and Sergio Garbarino. "Errors and Accidents." In Sleepiness and Human Impact Assessment, 81–92. Milano: Springer Milan, 2014. http://dx.doi.org/10.1007/978-88-470-5388-5_7.
Leacock, Claudia, Martin Chodorow, Michael Gamon, and Joel Tetreault. "Annotating Learner Errors." In Synthesis Lectures on Human Language Technologies, 81–90. Cham: Springer International Publishing, 2010. http://dx.doi.org/10.1007/978-3-031-02137-4_9.
Full textConference papers on the topic "Human and organizational errors"
Terwel, Karel. "Should we focus on human or organizational factors?" In IABSE Workshop, Helsinki 2017: Ignorance, Uncertainty, and Human Errors in Structural Engineering. Zurich, Switzerland: International Association for Bridge and Structural Engineering (IABSE), 2017. http://dx.doi.org/10.2749/helsinki.2017.017.
Hohberg, Jörg-Martin. "Risk-Based Thinking and Knowledge in Engineering Organizations." In IABSE Workshop, Helsinki 2017: Ignorance, Uncertainty, and Human Errors in Structural Engineering. Zurich, Switzerland: International Association for Bridge and Structural Engineering (IABSE), 2017. http://dx.doi.org/10.2749/helsinki.2017.009.
Passalacqua, Roberto, and Fumiaki Yamada. "Human Reliability and the Current Dilemma in Human-Machine Interface Design Strategies." In 10th International Conference on Nuclear Engineering. ASMEDC, 2002. http://dx.doi.org/10.1115/icone10-22061.
Yang, Shen, Geng Bo, and Li Dan. "Based on Human Behavior Process of Human Error Defensive Management Research for NPP." In 2017 25th International Conference on Nuclear Engineering. American Society of Mechanical Engineers, 2017. http://dx.doi.org/10.1115/icone25-66007.
Torres, Yaniel, Sylvie Nadeau, and Kurt Landau. "Analysis of Assembly Errors using Systems Thinking Approach: Application of the HFACS Framework." In 13th International Conference on Applied Human Factors and Ergonomics (AHFE 2022). AHFE International, 2022. http://dx.doi.org/10.54941/ahfe1001568.
Ujita, Hiroshi, and Naoko Matsuo. "Human Performance Improvement Activities for Risk Reduction." In 13th International Conference on Applied Human Factors and Ergonomics (AHFE 2022). AHFE International, 2022. http://dx.doi.org/10.54941/ahfe1002642.
Bridle, Peter Vincent. "Catastrophic Events and Human Error: A Few Rotten Apples or Organizational Dysfunction?" In SPE Trinidad and Tobago Section Energy Resources Conference. SPE, 2021. http://dx.doi.org/10.2118/200942-ms.
Bridle, Peter. "Catastrophic Events and Human Error: A Few Rotten Apples or Organizational Dysfunction?" In SPE Annual Technical Conference and Exhibition. SPE, 2021. http://dx.doi.org/10.2118/205858-ms.
Kim, Yoonik, Kwang-Won Ahn, Chang-Hyun Chung, Kil Yoo Kim, and Joon-Eon Yang. "Use of Influence Diagrams and Fuzzy Theory to Develop Assessment Method of Organizational Influences on Component Maintenance." In 10th International Conference on Nuclear Engineering. ASMEDC, 2002. http://dx.doi.org/10.1115/icone10-22323.
Matahri, Naoëlle. "Link Between Operational Experience Data and Pre-Accidental Data." In 16th International Conference on Nuclear Engineering. ASMEDC, 2008. http://dx.doi.org/10.1115/icone16-48488.
Reports on the topic "Human and organizational errors"
Petrowski, Michael, Joe Lockwood, and Jason Smith. Human Performance Improvement Task Group Task 21-1 Best Practice: Using virtual capabilities or options for HPI application (to reduce errors, strengthening defenses, strengthening the organization, and/or increasing capacity). Office of Scientific and Technical Information (OSTI), February 2021. http://dx.doi.org/10.2172/1766973.
Murrell, Emily. Organizational Culture Change Resulting From Human Resources Outsourcing. Portland State University Library, January 2015. http://dx.doi.org/10.15760/honors.144.
Pond, Daniel J., F. Kay Houghton, and Walter E. Gilmore. Contributors to Human Errors and Breaches in National Security Applications. Office of Scientific and Technical Information (OSTI), August 2002. http://dx.doi.org/10.2172/801246.
Bartel, Ann, Ciaran Phibbs, Nancy Beaulieu, and Patricia Stone. Human Capital and Organizational Performance: Evidence from the Healthcare Sector. Cambridge, MA: National Bureau of Economic Research, September 2011. http://dx.doi.org/10.3386/w17474.
Henriksen, K., R. D. Kaye, R. Jones, D. S. Morisseau, and D. I. Serig. Human factors evaluation of teletherapy: Training and organizational analysis. Volume 4. Office of Scientific and Technical Information (OSTI), July 1995. http://dx.doi.org/10.2172/91921.
Daniellou, François, Marcel Simard, and Ivan Boissières. Human and organizational factors of safety: a state of the art. Fondation pour une culture de sécurité industrielle, January 2011. http://dx.doi.org/10.57071/429dze.
Bushway, Shawn, Emily Owens, and Anne Morrison Piehl. Sentencing Guidelines and Judicial Discretion: Quasi-experimental Evidence from Human Calculation Errors. Cambridge, MA: National Bureau of Economic Research, April 2011. http://dx.doi.org/10.3386/w16961.
Barriere, M. T., W. J. Luckas, J. Wreathall, S. E. Cooper, D. C. Bley, and A. Ramey-Smith. Multidisciplinary framework for human reliability analysis with an application to errors of commission and dependencies. Office of Scientific and Technical Information (OSTI), August 1995. http://dx.doi.org/10.2172/106594.
Drillings, Michael, Leonard Adelman, Angel Manzo, and Michael D. Shaler. Human and Organizational Issues in the Army After Next: A Conference Held 13-15 November 1997. Fort Belvoir, VA: Defense Technical Information Center, November 1998. http://dx.doi.org/10.21236/ada357651.
Callan, J. R., R. T. Kelly, and M. L. Quinn. Human factors evaluation of remote afterloading brachytherapy. Supporting analyses of human-system interfaces, procedures and practices, training and organizational practices and policies. Volume 3. Office of Scientific and Technical Information (OSTI), July 1995. http://dx.doi.org/10.2172/93757.