Academic literature on the topic 'Contrôle qualité proactif'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Contrôle qualité proactif.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Contrôle qualité proactif":

1

Zahedi Fard, Seyed Yahya, Mohammad Karim Sohrabi, and Vahid Ghods. "Energy-Aware and Proactive Host Load Detection in Virtual Machine Consolidation." Information Technology and Control 50, no. 2 (June 17, 2021): 332–41. http://dx.doi.org/10.5755/j01.itc.50.2.28056.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
With the expansion and enhancement of cloud data centers in recent years, increasing the energy consumption and the costs of the users have become the major concerns in the cloud research area. Service quality parameters should be guaranteed to meet the demands of the users of the cloud, to support cloud service providers, and to reduce the energy consumption of the data centers. Therefore, the data center's resources must be managed efficiently to improve energy utilization. Using the virtual machine (VM) consolidation technique is an important approach to enhance energy utilization in cloud computing. Since users generally do not use all the power of a VM, the VM consolidation technique on the physical server improves the energy consumption and resource efficiency of the physical server, and thus improves the quality of service (QoS). In this article, a server threshold prediction method is proposed that focuses on server overload and server underload detection to improve server utilization and to reduce the number of VM migrations, which consequently improves the VM's QoS. Since the VM integration problem is very complex, the exponential smoothing technique is utilized for predicting server utilization. The results of the experiments show that the proposed method goes beyond existing methods in terms of power efficiency and the number of VM migrations.
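To make the threshold-prediction idea in this abstract concrete, here is a minimal Python sketch of single exponential smoothing applied to recent host utilisation, with the forecast compared against overload and underload thresholds. The smoothing factor, thresholds and sample values are illustrative assumptions, not figures from the article.

```python
# Minimal sketch: predict next-interval host utilisation with single
# exponential smoothing and classify the host as overloaded/underloaded.
# The smoothing factor and thresholds below are illustrative assumptions,
# not values from the cited article.

def smoothed_forecast(history, alpha=0.3):
    """Single exponential smoothing; returns the one-step-ahead forecast."""
    forecast = history[0]
    for observed in history[1:]:
        forecast = alpha * observed + (1 - alpha) * forecast
    return forecast

def classify_host(history, overload=0.85, underload=0.25):
    """Flag a host for VM migration decisions based on predicted load."""
    predicted = smoothed_forecast(history)
    if predicted >= overload:
        return predicted, "overloaded"    # candidate source for VM migration
    if predicted <= underload:
        return predicted, "underloaded"   # candidate for consolidation/switch-off
    return predicted, "normal"

if __name__ == "__main__":
    cpu_history = [0.62, 0.70, 0.78, 0.83, 0.88]  # recent utilisation samples
    print(classify_host(cpu_history))              # e.g. (0.77..., 'normal')
```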
2

Owen, Tony. "Quality Through People: A Blueprint for Proactive Total Quality Management by John Choppin IFS Publications, Bedford, UK, 1991, 461 pages incl. index (£29.95)." Robotica 9, no. 4 (December 1991): 448–49. http://dx.doi.org/10.1017/s0263574700000679.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Lou, Helen H., and Yinlun L. Huang. "Hierarchical decision making for proactive quality control: system development for defect reduction in automotive coating operations." Engineering Applications of Artificial Intelligence 16, no. 3 (April 2003): 237–50. http://dx.doi.org/10.1016/s0952-1976(03)00060-5.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Tiele, Akira, Siavash Esfahani, and James Covington. "Design and Development of a Low-Cost, Portable Monitoring Device for Indoor Environment Quality." Journal of Sensors 2018 (2018): 1–14. http://dx.doi.org/10.1155/2018/5353816.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
This article describes the design and development of a low-cost, portable monitoring system for indoor environment quality (IEQ). IEQ is a holistic concept that encompasses elements of indoor air quality (IAQ), indoor lighting quality (ILQ), acoustic comfort, and thermal comfort (temperature and relative humidity). The unit is intended for the monitoring of temperature, humidity, PM2.5, PM10, total VOCs (×3), CO2, CO, illuminance, and sound levels. Experiments were conducted in various environments, including a typical indoor working environment and outdoor pollution, to evaluate the unit’s potential to monitor IEQ parameters. The developed system was successfully able to monitor parameter variations, based on specific events. A custom IEQ index was devised to rate the parameter readings with a simple scoring system to calculate an overall IEQ percentage. The advantages of the proposed system, with respect to commercial units, are associated with better customisation and flexibility to implement a variety of low-cost sensors. Moreover, low-cost sensor modules reduce the overall cost to provide a comprehensive, portable, and real-time monitoring solution. This development enables researchers and interested enthusiasts to become engaged and proactive in the study, management, and improvement of IEQ.
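The article's custom IEQ index is not reproduced in the abstract; the hedged sketch below only illustrates the general pattern of scoring each parameter against bands and averaging the scores into an overall percentage. The bands, parameter names and readings are invented placeholders.

```python
# Minimal sketch of an overall IEQ percentage computed from per-parameter
# scores. The scoring bands below are illustrative placeholders, not the
# index defined in the cited article.

# (lower_bound, upper_bound, score) bands per parameter; score is 0-100.
SCORING_BANDS = {
    "co2_ppm":    [(0, 800, 100), (800, 1200, 60), (1200, float("inf"), 20)],
    "pm2_5_ugm3": [(0, 12, 100), (12, 35, 60), (35, float("inf"), 20)],
    "temp_c":     [(20, 24, 100), (18, 26, 60), (float("-inf"), float("inf"), 20)],
}

def score_parameter(name, value):
    for low, high, score in SCORING_BANDS[name]:
        if low <= value < high:
            return score
    return 0

def ieq_percentage(readings):
    """Average the per-parameter scores into an overall IEQ percentage."""
    scores = [score_parameter(name, value) for name, value in readings.items()]
    return sum(scores) / len(scores)

readings = {"co2_ppm": 950, "pm2_5_ugm3": 10, "temp_c": 22.5}
print(f"IEQ = {ieq_percentage(readings):.0f}%")  # IEQ = 87% with these bands
```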
5

Schadd, Maarten P. D., Tjeerd A. J. Schoonderwoerd, Karel van den Bosch, Olaf H. Visker, Tjalling Haije, and Kim H. J. Veltman. "“I’m Afraid I Can’t Do That, Dave”; Getting to Know Your Buddies in a Human–Agent Team." Systems 10, no. 1 (February 12, 2022): 15. http://dx.doi.org/10.3390/systems10010015.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
The rapid progress in artificial intelligence enables technology to more and more become a partner of humans in a team, rather than being a tool. Even more than in human teams, partners of human–agent teams have different strengths and weaknesses, and they must acknowledge and utilize their respective capabilities. Coordinated team collaboration can be accomplished by smartly designing the interactions within human–agent teams. Such designs are called Team Design Patterns (TDPs). We investigated the effects of a specific TDP on proactive task reassignment. This TDP supports team members to dynamically allocate tasks by utilizing their knowledge about the task demands and about the capabilities of team members. In a pilot study, agent–agent teams were used to study the effectiveness of proactive task reassignment. Results showed that this TDP improves a team’s performance, provided that partners have accurate knowledge representations of each member’s skill level. The main study of this paper addresses the effects of task reassignments in a human–agent team. It was hypothesized that when agents provide explanations when issuing and responding to task reassignment requests, this will enhance the quality of the human’s mental model. Results confirmed that participants developed more accurate mental models when agent-partners provide explanations. This did not result in a higher performance of the human–agent team, however. The study contributes to our understanding of designing effective collaboration in human–agent teams.
6

Khatib, Emil Jatib, and Raquel Barco. "Optimization of 5G Networks for Smart Logistics." Energies 14, no. 6 (March 22, 2021): 1758. http://dx.doi.org/10.3390/en14061758.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Industry 4.0 is generalizing the use of wireless connectivity in manufacturing and logistics. Specifically, in Smart Logistics, novel Industry 4.0 technologies are used to enable agile supply chains, with reduced management, energy and storage costs. Cellular networks allow connectivity throughout all the scenarios where logistics processes take place, each having their own challenges. This paper explores such scenarios and challenges, and proposes 5G technology as a global unified connectivity solution. Moreover, this paper proposes a system for exploiting the application-specific optimization capabilities of 5G networks to better cater for the needs of Smart Logistics. An application traffic modeling process is proposed, along with a proactive approach to network optimization that can improve the Quality of Service and reduce connectivity costs.
7

Li, Weijun, Yibo Sun, Qinggui Cao, Min He, and Yuquan Cui. "A proactive process risk assessment approach based on job hazard analysis and resilient engineering." Journal of Loss Prevention in the Process Industries 59 (May 2019): 54–62. http://dx.doi.org/10.1016/j.jlp.2019.02.007.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Zheng, Zengwei, Chenwei Zhao, and Jianwei Zhang. "Robust and Fast Converging Cross-Layer Failure Correction in Segment-Routed Networks." Electronics 10, no. 22 (November 22, 2021): 2874. http://dx.doi.org/10.3390/electronics10222874.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Due to overlay technologies, service providers have a logical view of the underlay network and can optimize the experience quality without modifying the physical network. However, the cross-layer interaction inevitably causes network fluctuation due to their inconsistent optimization objectives. Aside from that, network failures that occur in both layers not only cause network performance degradation but also significantly increase the frequency of cross-layer interaction. These problems make the network fluctuate for a long time, reduce the network performance, and influence the user experience, especially for time-sensitive applications. In this paper, we design a cross-layer architecture in which the logical layer can satisfy the service function chain demands and maximize the user experience, while the physical layer can optimize the overall network performance. Our cross-layer architecture can make proactive corrections in both layers. Furthermore, we investigate the cross-layer interaction and design two strategies to eliminate fluctuations and make the network converge quickly.
9

Yasin, Qadeer, Zeshan Iqbal, Muhammad Attique Khan, Seifedine Kadry, and Yunyoung Nam. "Reliable Multipath Flow for Link Failure Recovery in 5G Networks Using SDN Paradigm." Information Technology and Control 51, no. 1 (March 26, 2022): 5–17. http://dx.doi.org/10.5755/j01.itc.51.1.29408.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
In modern networks and cloud evolution, the number of internetworked nodes is growing rapidly, and gaming and video streaming over the network require high availability with very low latency. 5G networks provide much faster services than 4G, but link failures can affect the quality of service. In 5G networks, environmental factors also affect the efficiency of wireless signals. To overcome such issues, base stations are placed in a distributed manner around urban areas when placement decisions are required. However, some scenarios share similarities; in general, highways are alike. In existing systems, signal distribution is performed homogeneously, which generates fractal and environmental issues and can cause great economic loss. As an emerging network paradigm, Software-Defined Networking (SDN) is easy to manage due to the logical separation of the control plane and the data plane. SDN offers numerous advantages, one of which is link robustness to avoid service unavailability. Many mechanisms exist to cope with network link failures, such as proactive and reactive mechanisms, but they calculate multiple paths and store them in flow tables without considering link reliability. This causes a high latency rate due to the calculation of many alternative paths, as well as increased traffic overhead. To overcome these issues, we propose a new approach in which the number of multipaths depends on the reliability of the primary path. The number of alternative paths decreases as link reliability increases, which reduces the time required to calculate alternative paths. In addition, traffic overhead decreases compared to existing approaches. Secondly, we also integrate the shortest-distance factor with the reliability factor and obtain better results than existing approaches. Our proposed system helps increase the availability of services in 5G networks owing to the low latency and small traffic overhead required for link failure recovery.
10

Guzik, Robert, Arkadiusz Kołoś, Jakub Taczanowski, Łukasz Fiedeń, Krzysztof Gwosdz, Katarzyna Hetmańczyk, and Jakub Łodziński. "The Second Generation Electromobility in Polish Urban Public Transport: The Factors and Mechanisms of Spatial Development." Energies 14, no. 22 (November 18, 2021): 7751. http://dx.doi.org/10.3390/en14227751.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
One of the key challenges on the road to sustainable mobility is the development of low/zero emission urban public transport (UPT). This is crucial in order to meet environmental requirements aiming at reducing greenhouse gas (GHG) emission. In some countries (e.g., Poland) reduction of air pollution is also an important reason behind the implementation of low/zero emission UPT. The aim of this study is to investigate the factors and mechanisms influencing the development of modern electromobility in Polish UPT. We have examined all 242 UPT systems in the country in terms of the characteristics of the relevant urban municipalities, such as size, economic prosperity, level of human and social capital, development paths of urban public transport in the long term as well as the institutional context and proximity and connections to other cities with experience in electromobility. Classification and statistical methods are used based on a variety of approaches, as assigning a score to various preliminarily identified indicators or applying correlation between quantities to verify the formulated hypotheses. Our analysis demonstrates that electromobility adoption is the result of a combination of favourable economic, urban, social and technological characteristic features of a given city. Zero or low emission buses are more common in large cities which are highly positioned in urban hierarchy, economically sound and which are characterized by a well-developed tertiary economy as well as by high human capital. An additional factor that positively influences the implementation of electromobility—in particular at the very first stage—is proximity to the location of low emission bus producers. The leadership in modern electromobility can be understood as part of a broader, proactive development policy of the cities aimed at improving the quality of life of their residents. This is especially important in medium-sized towns where utilizing electric vehicles can be an instrument to maintain or even develop their role and status. The results of the article may provide a basis for creating sustainable urban policies, especially sustainable mobility and improving environmental quality.

Dissertations / Theses on the topic "Contrôle qualité proactif":

1

Chouhad, Hassan. "Towards online metrology for proactive quality control in smart manufacturing." Thesis, Paris, HESAM, 2022. http://www.theses.fr/2022HESAE021.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
In the traditional manufacturing industry, metrology is an essential element in sanctioning quality at the end of the production line. The innovation brought by the concept of smart manufacturing leads to a repositioning of metrology to be proactive at the heart of production by performing so-called first-time-right manufacturing of parts. The goal of this thesis is therefore to propose a methodological approach for the development of a proactive system, enhanced by AI models, to control the conformity of a product to a specification during machining and to characterize its defects. For this purpose, a first study on the surface aspect was carried out by collecting high-resolution images of coated and cut copper wires that may present defects. The images, taken by a computer vision system based on chromatic confocal imaging, were used to generate different artificial intelligence models. These models perform segmentation and classification of the observed defects. When comparing the accuracy and processing time of the AI models, transfer learning using the mobile-net model showed the best performance. To extend the study of surface quality assessment, surface profile measurements on machine tools were performed using non-contact chromatic confocal sensors. Two approaches were used: i) milling aluminum without a tool wear signature, and ii) milling titanium with a tool wear signature. In both cutting configurations, machining parameters, surface roughness profiles, and cutting forces were measured to build a dataset for training machine learning prediction models. The results showed that the XGBoost model presented the best prediction performance for both scenarios. By considering the cutting time in titanium milling, the autoregressive integrated moving average (ARIMA) time series prediction model was applied to track the evolution of roughness with tool wear.
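As a rough illustration of the ARIMA-based roughness tracking mentioned above, the following sketch fits an ARIMA model to a synthetic Ra series and flags the forecasts against a hypothetical specification limit. The model order, limit and data are assumptions, not the thesis's actual pipeline.

```python
# Minimal sketch: forecast surface roughness (Ra) as tool wear accumulates,
# using an ARIMA time-series model. The series is synthetic and the ARIMA
# order and Ra limit are illustrative assumptions.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

# Ra measured after successive passes (µm): drifts upward with tool wear.
ra_series = np.array([0.42, 0.44, 0.43, 0.47, 0.50, 0.52, 0.55, 0.59, 0.63, 0.66])

model = ARIMA(ra_series, order=(1, 1, 1))  # (p, d, q) chosen for illustration
fitted = model.fit()

# Forecast roughness for the next three passes; checking the forecast against
# a limit is what makes the control proactive (act before a defect occurs).
forecast = fitted.forecast(steps=3)
ra_limit = 0.8  # hypothetical specification limit
for step, ra in enumerate(forecast, start=1):
    flag = "OK" if ra < ra_limit else "tool change / parameter adjustment needed"
    print(f"pass +{step}: predicted Ra = {ra:.2f} µm -> {flag}")
```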
2

Joshi, Raoul, and Per Sundström. "WCDMA Cell Load Control in a High-speed Train Scenario : Development of Proactive Load Control Strategies." Thesis, Linköpings universitet, Kommunikationssystem, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-84635.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Load control design is one of the major cornerstones of radio resource management in today's UMTS networks. A WCDMA cell's ability to utilize available spectrum efficiently, maintain system stability and deliver minimum quality of service (QoS) requirements to in-cell users builds on the algorithms employed to manage the load. Admission control (AC) and congestion control (CC) are the two foremost techniques used for regulating the load, and differing environments will place varying requirements on the AC and CC schemes to optimize the QoS for the entire radio network. This thesis studies a real-life situation where cells are put under strenuous conditions, investigates the degrading effects a high-speed train has on the cell's ability to maintain acceptable levels of QoS, and proposes methods for mitigating these effects. The scenario is studied with regard to voice traffic where the limiting radio resource is downlink power. CC schemes that take levels of fairness into account between on-board train users and outdoor users are proposed and evaluated through simulation. Methods to anticipatorily adapt radio resource management (RRM) in a cell to prepare for a train are proposed and evaluated through simulation. A method to detect a high-speed train in a cell, and the users on it, is outlined and motivated but not simulated. Simulation results are promising but not conclusive. The suggested CC schemes show a surprising tendency towards an increase in congestion avoidance performance. Proactive RRM shows a significant increase in QoS for on-board users. No negative effects on users in the macro environment were noticed, with regard to the studied metrics.
3

Villamayor, Ibarra José Fernando. "Integração de modelos de processo e produto na fase de construção para o controle da produção e da qualidade com o apoio de BIM." reponame:Biblioteca Digital de Teses e Dissertações da UFRGS, 2016. http://hdl.handle.net/10183/149814.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Production and quality controls undertaken on construction sites have often been associated with bureaucratic procedures, being limited to simple administrative tasks instead of adding value for internal and external stakeholders. For that reason, some research studies have proposed approaches and indicators for the integration of production and quality controls. One of these approaches pointed out in the literature is the use of mobile devices to support the implementation of those controls. Those devices allow the management of large information batches, reducing rework and errors derived from manual paper-based data collection and further transcription to digital media. Other efforts have focused on the integration between product and process models so that information can be used more effectively during the construction phase of the project. Process models are necessary for carrying out production planning and control, while product models are concerned with 2D or 3D design and, most recently, with BIM models. Nonetheless, the literature on the use of information technology for implementing integrated production and quality control is scarce. The aim of this research work is to develop a model for the integration of management processes, composed of production planning and control and quality management processes, with a BIM product model, in order to facilitate the monitoring of project execution, including performance in terms of quality and the occurrence of waste. Two empirical studies were carried out in different construction companies, using the design science research approach. The main contributions of this investigation are related to the systematic integration of proactive control results from the construction site with the BIM model of the project. Furthermore, the proposed model can be adapted for diverse uses, such as incorporating information from other phases of the construction project life cycle.

Books on the topic "Contrôle qualité proactif":

1

Smith, Preston G. Proactive risk management. New York: Productivity Press, 2002.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
2

Latino, Robert J. Patient safety: The PROACT root cause analysis approach. Boca Raton: Taylor & Francis, 2008.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
3

Smith, Preston G., and Guy M. Merritt. Proactive Risk Management : Controlling Uncertainty in Product Development. Productivity Press, 2002.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
4

Merritt, Guy M., and Preston G. Smith. Proactive Risk Management: Controlling Uncertainty in Product Development. Productivity Press, 2020.

Find full text
APA, Harvard, Vancouver, ISO, and other styles

Book chapters on the topic "Contrôle qualité proactif":

1

Tavrov, Dan, Olena Temnikova, and Volodymyr Temnikov. "Method for Proactive Quality Control of Aviation Security of Airports Using Inverse Interval Weighted Average." In Recent Developments and the New Direction in Soft-Computing Foundations and Applications, 471–80. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-47124-8_38.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Yau, Stephen S., and Dazhi Huang. "Proactive Monitoring and Control of Workflow Execution in Adaptive Service-Based Systems." In Adaptive Control Approach for Software Quality Improvement, 239–66. World Scientific, 2011. http://dx.doi.org/10.1142/9789814340922_0008.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Sharma, Lalitsen, and Supriya Gupta. "Analyzing Performance of Ad hoc Routing Protocols under Various Constraints." In Technological Advancements and Applications in Mobile Ad-Hoc Networks, 152–66. IGI Global, 2012. http://dx.doi.org/10.4018/978-1-4666-0321-9.ch009.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
The mobility of nodes in mobile ad hoc networks and the absence of any centralized control cause unpredictable changes in the network topologies. This makes routing a challenging task. Several routing protocols for mobile ad-hoc networks have come into existence. The protocols are classified mainly into three categories: proactive, reactive, and hybrid. In this chapter, a study of one protocol from each of the proactive and reactive categories (respectively, Destination Sequence Distance Vector routing [DSDV] and Dynamic Source Routing [DSR]) is presented. The performance of the above-said protocols has been measured under a varying mobility environment using the NS-2 simulator, based upon three quality metrics: average end-to-end delay, throughput, and jitter.
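For readers unfamiliar with the three quality metrics named here, the sketch below shows one common way to compute average end-to-end delay, throughput and jitter from per-packet send/receive records. The record format and sample values are assumptions, not the chapter's NS-2 trace format.

```python
# Minimal sketch of the three quality metrics named in the chapter
# (average end-to-end delay, throughput, jitter), computed from per-packet
# records of delivered packets. The data below are illustrative only.

packets = [  # (send_time_s, recv_time_s, size_bytes) for delivered packets
    (0.00, 0.050, 512),
    (0.10, 0.155, 512),
    (0.20, 0.262, 512),
    (0.30, 0.360, 512),
]

delays = [recv - send for send, recv, _ in packets]
avg_delay = sum(delays) / len(delays)

duration = packets[-1][1] - packets[0][0]                 # observation window
throughput_bps = sum(size * 8 for _, _, size in packets) / duration

# Jitter as the mean absolute variation between consecutive packet delays.
jitter = sum(abs(b - a) for a, b in zip(delays, delays[1:])) / (len(delays) - 1)

print(f"avg end-to-end delay: {avg_delay*1000:.1f} ms")
print(f"throughput: {throughput_bps/1000:.1f} kbit/s")
print(f"jitter: {jitter*1000:.1f} ms")
```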
4

Delaney, Scott. "Data Profiling and Data Quality Metric Measurement as a Proactive Input into the Operation of Business Intelligence Systems." In Information Quality and Governance for Business Intelligence, 253–70. IGI Global, 2014. http://dx.doi.org/10.4018/978-1-4666-4892-0.ch013.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Business intelligence systems have reached business critical status within many companies. It is not uncommon for such systems to be central to the decision-making effectiveness of these enterprises. However, the processes used to load data into these systems often do not exhibit a level of robustness in line with their criticality to the organisation. The processes of loading business intelligence systems with data are subject to compromised execution, delays, or failures as a result of changes in the source system data. These ETL processes are not designed to recognise nor deal with such shifts in data shape. This chapter proposes the use of data profiling techniques as a means of early discovery of issues and changes within the source system data and examines how this knowledge can be applied to guard against reductions in the decision making capability and effectiveness of an organisation caused by interruptions to business intelligence system availability or compromised data quality. It does so by examining issues such as where profiling can best be applied to get appropriate benefit and value, the techniques of establishing profiling, and the types of actions that may be taken once the results of profiling are available. The chapter describes components able to be drawn together to provide a system of control that can be applied around a business intelligence system to enhance the quality of organisational decision making through monitoring the characteristics of arriving data and taking action when values are materially different than those expected.
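A minimal sketch of the profiling-as-a-gate idea described in this chapter, assuming a pandas data frame as the arriving extract: profile a few columns, compare them with baseline expectations, and hold the load when values differ materially. Column names, baselines and tolerances are illustrative assumptions, not the chapter's components.

```python
# Minimal sketch of data profiling as a proactive gate in front of a BI load:
# profile the arriving extract, compare against expected characteristics, and
# flag material differences before the ETL job runs. Column names, baseline
# values and tolerances are illustrative assumptions.
import pandas as pd

EXPECTED = {  # baseline profile, e.g. derived from previous loads
    "order_amount": {"null_rate": 0.01, "mean": 120.0, "tolerance": 0.25},
    "customer_id":  {"null_rate": 0.00, "distinct_ratio": 0.80, "tolerance": 0.25},
}

def profile_column(series: pd.Series) -> dict:
    return {
        "null_rate": series.isna().mean(),
        "mean": series.mean() if pd.api.types.is_numeric_dtype(series) else None,
        "distinct_ratio": series.nunique(dropna=True) / max(len(series), 1),
    }

def drift_warnings(df: pd.DataFrame) -> list[str]:
    warnings = []
    for column, expected in EXPECTED.items():
        observed = profile_column(df[column])
        tol = expected["tolerance"]
        for metric, baseline in expected.items():
            if metric == "tolerance" or observed.get(metric) is None:
                continue
            if abs(observed[metric] - baseline) > tol * max(baseline, 1e-9):
                warnings.append(f"{column}.{metric}: expected ~{baseline}, got {observed[metric]:.3f}")
    return warnings

arriving = pd.DataFrame({"order_amount": [100.0, None, 300.0, 95.0],
                         "customer_id": ["A1", "A1", "B2", "C3"]})
issues = drift_warnings(arriving)
print("HOLD load for review" if issues else "OK to load", issues)
```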
5

Delaney, Scott. "Data Profiling and Data Quality Metric Measurement as a Proactive Input into the Operation of Business Intelligence Systems." In Business Intelligence, 2171–88. IGI Global, 2016. http://dx.doi.org/10.4018/978-1-4666-9562-7.ch107.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Business intelligence systems have reached business critical status within many companies. It is not uncommon for such systems to be central to the decision-making effectiveness of these enterprises. However, the processes used to load data into these systems often do not exhibit a level of robustness in line with their criticality to the organisation. The processes of loading business intelligence systems with data are subject to compromised execution, delays, or failures as a result of changes in the source system data. These ETL processes are not designed to recognise nor deal with such shifts in data shape. This chapter proposes the use of data profiling techniques as a means of early discovery of issues and changes within the source system data and examines how this knowledge can be applied to guard against reductions in the decision making capability and effectiveness of an organisation caused by interruptions to business intelligence system availability or compromised data quality. It does so by examining issues such as where profiling can best be applied to get appropriate benefit and value, the techniques of establishing profiling, and the types of actions that may be taken once the results of profiling are available. The chapter describes components able to be drawn together to provide a system of control that can be applied around a business intelligence system to enhance the quality of organisational decision making through monitoring the characteristics of arriving data and taking action when values are materially different than those expected.
6

Michaelides, Zenon, and Richard Forster. "The Use of RFID Technologies for E-Enabling Logistics Supply Chains." In E-Logistics and E-Supply Chain Management, 198–217. IGI Global, 2013. http://dx.doi.org/10.4018/978-1-4666-3914-0.ch011.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
This chapter reviews the potential benefits and challenges of introducing Radio Frequency Identification (RFID) technologies as a means of e-enabling logistics supply and distribution systems. It introduces RFID and associated technologies as a catalyst for e-enabling optimised supply and distribution activities. In particular, the emerging role of RFID in integrating logistics supply chains is considered key to aligning tasks and achieving operational efficiencies. Other benefits include better visibility resulting from proactive task and process management, and improved risk assessment associated with better data accuracy/quality. In addition, the optimisation of planning and control functions is enhanced through the introduction of key RFID technologies and their integration into logistics systems and operations. Finally, the use of RFID technologies is reviewed in a variety of diverse sectors and areas, from assisting humanitarian efforts through solutions aimed at recovering from the effects of natural disasters to providing accurate and effective methods of recording race times for the Los Angeles marathon.
7

Junior, Walter Coelho Pereira de Magalhães, Marcelo Bonnet, Leandro Diamantino Feijó, and Marilde Terezinha Prado Santos. "Risk-Off Method." In Cases on SMEs and Open Innovation, 40–64. IGI Global, 2012. http://dx.doi.org/10.4018/978-1-61350-314-0.ch003.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Here the Risk-Off Method is presented as a contribution to improve the quality of data and information using milk chemical safety as a model, as overseen by the National Plan for Control of Residues and Contaminants (PNCRC) of the Brazilian Ministry of Agriculture, Livestock and Supply (MAPA). In particular, Small and Medium Enterprises (SMEs), which notably lack internal expertise, could benefit from the Risk-Off method, given that SMEs worldwide contribute significant amounts of food to meet global needs. This study develops an innovative tool to help countries provide robust and transparent chemical safety guarantees for their food products. Creating a flexible base platform to appropriately pre-classify results generated by laboratory testing of food samples, the method pre-processes data undergoing the process of Knowledge Discovery in Databases – KDD, producing systemic intelligence deriving from effective, proactive assessment and management of chemical safety risks in foods, a complex issue of increasingly global concern.

Conference papers on the topic "Contrôle qualité proactif":

1

Radfar, Reza, Javad Jassbi, Felora Ghoreishi, Sohrab Khanmohammadi, and Mahmood Alborzi. "Proactive quality paint thickness control using ANFIS." In 2010 IEEE International Conference on Systems, Man and Cybernetics - SMC. IEEE, 2010. http://dx.doi.org/10.1109/icsmc.2010.5642189.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Adeleke, Jude A., and Deshendran Moodley. "An Ontology for Proactive Indoor Environmental Quality Monitoring and Control." In the 2015 Annual Research Conference. New York, New York, USA: ACM Press, 2015. http://dx.doi.org/10.1145/2815782.2815816.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Ho, Yao-Hua, Pei-En Li, Ling-Jyh Chen, and Yu-Lun Liu. "Indoor air quality monitoring system for proactive control of respiratory infectious diseases." In SenSys '20: The 18th ACM Conference on Embedded Networked Sensor Systems. New York, NY, USA: ACM, 2020. http://dx.doi.org/10.1145/3384419.3430456.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Sarkar, Debasis. "Advanced proactive statistical process control for online quality monitoring of ready mixed concrete." In Annual International Conference on Architecture and Civil Engineering. Global Science & Technology Forum (GSTF), 2013. http://dx.doi.org/10.5176/2301-394x_ace13.18.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Coupek, D., A. Verl, J. Aichele, and M. Colledani. "Proactive quality control system for defect reduction in the production of electric drives." In 2013 3rd International Electric Drives Production Conference (EDPC). IEEE, 2013. http://dx.doi.org/10.1109/edpc.2013.6689762.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Junek, Lubomir, Jaroslav Bartonicek, and Milan Vrana. "Degradation Mechanisms Control of Mechanical Components During Operation." In ASME 2009 Pressure Vessels and Piping Conference. ASMEDC, 2009. http://dx.doi.org/10.1115/pvp2009-77655.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
There are important mechanical systems, structures and components (SSC) in industrial equipment and nuclear power plants. These SSC are decisive for safe and economical operation and must not fail. Principles are applied to these components to ensure their integrity during operation. Their fundamentals are as follows: achievement of the required quality during design, manufacturing and assembly; quality assurance in subsequent operation; and an appropriate quality certificate in operation. The design should include all degradation mechanisms, but some of them are very difficult to specify exactly and analyze before operation. Degradation mechanisms that are hard to specify have to be excluded from this concept, and appropriate operational measures have to be determined. Typical examples are operational vibration, material corrosion phenomena, thermal stratification, striping, and dynamic loading during valve operation. A design specification can only determine the basic design temperature, pressure, sustained loads and time-history loadings during normal, abnormal and emergency operation. The efficiency of the measures during operation has to be verified, and the measures necessary for verification and control of the real causes during SSC service have to be determined during design. Both the certificates and the designed structural properties (material properties, dimensions, shapes, etc.) shall allow the exclusion of systematic mistakes during design, manufacturing and assembly. Control of specified and unspecified degradation causes during operation is the first redundant measure for ensuring the required quality during operation, together with timely assessment of inspection results regarding their influence on component quality. Appropriate measures shall be determined in time to minimize this influence if necessary. Checking degradation consequences is the second important measure: relevant consequences have to be discovered in time, before SSC failure. These measures shall be periodically updated in accordance with the latest knowledge and their evaluation regarding the equipment. If relevant consequences are detected, the necessary measures shall be undertaken to exclude them during subsequent operation. A certificate of sufficient quality in operation has to be issued before an SSC is put into operation or after long-term operation. The basis for this certificate is the actual design and relevant loads, the real geometry, the installed attachments, the piping supports and the relevant operational loadings including the media, together with the specified postulated ruptures according to current knowledge. Potential degradation mechanisms during SSC operation need to be determined, and sufficient control of their causes has to be proven. Three levels are assigned in the concept regarding SSC quality in future operation: quality has to be guaranteed (prevent failure - proactive approach) by monitoring the causes of damage; quality has to be maintained (preventative maintenance - proactive approach) by monitoring damage results; or there are no specific demands on quality (maintenance triggered by damage - reactive approach), using a statistical approach to damage. The paper deals with the theoretical background of the integrity concept at Czech NPPs, and an example of practical application is presented.
7

Bazzi, Loda. "Integrating a Proactive Quality Control Concept into Machining Operation of a Crankshaft Manufacturing Process." In WCX SAE World Congress Experience. 400 Commonwealth Drive, Warrendale, PA, United States: SAE International, 2019. http://dx.doi.org/10.4271/2019-01-0507.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Bartonicek, Jaroslav, Lubomir Junek, Milan Vrana, and Stanislav Vejvoda. "Safety Approach and Ageing Management in Czech Rules and Standards." In ASME 2009 Pressure Vessels and Piping Conference. ASMEDC, 2009. http://dx.doi.org/10.1115/pvp2009-77653.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Basic technical (protection) objectives are determined for the assurance of nuclear power plant safety, and the following generally belong among them: reactor shutdown, long-term maintenance of a sub-critical state, long-term cooling, and prevention of radioactivity leakage. To ensure these objectives, a multi-step defence-in-depth concept is used for the design of a nuclear power plant, and it includes: prevention of failures during normal and abnormal operation, control of failures and their consequences, and minimization of risks during accidents. Failures of operating systems, such as piping breaks, are conservatively postulated for the determination of the requirements on the systems used to cope with them. Coverage of these postulated failures comes under the multilevel safety approach. Failure consequences should mainly be covered by design measures, such as separation of high-energy piping, pipe whip restraints, etc., and the efficiency of these design measures has to be demonstrated. This passive safety procedure can be applied during the design of a new NPP; applying it to an operating NPP can lead to technical and economic problems, caused by imprecise and insufficient requirements in current standards and documents. The leak-before-break (LBB) concept is often ruled out because the operating conditions at break do not allow its successful use. The break preclusion concept was defined in Germany thirty years ago and has been developed since. The required quality of SSC is the basis of this concept: the quality has to be achieved during manufacturing and assembly of new components into a system, or a quality passport has to be documented for SSC in operation before enrolment in the concept. During subsequent operation, sufficient and redundant measures are necessary to control and manage ageing phenomena (conceptual, technological, and physical) in order to exclude premature ageing. This proactive approach is also the basis of recent documents requiring ageing and lifetime management. In Czech NPPs, postulated failures and their consequences were covered in accordance with the manufacturer's state of knowledge at that time, partly by design measures and partly by the quality assumed in the design. It is very difficult, and in some cases impracticable, to implement new requirements on the needed design provisions in an NPP already in operation. The national law needed for applying the approach has existed in the Czech Republic since 1997, and a regulation on lifetime management and national nuclear standards with specific requirements also exist. There is thus a basis for applying the proactive approach as it is used in German NPPs. A new safety approach was introduced in Czech NPPs, in which SSC are separated into three groups: SSC that must not fail (guarantee of quality), SSC that may fail in rare cases (preventative maintenance), and SSC that may fail (failure-oriented maintenance). The contribution deals with the aspects of the new Czech safety concept, its boundary conditions, the needed documents and the proactive measures.
9

Kirsal, Yonal, Vishnu Vardhan Paranthaman, and Glenford Mapp. "Exploring analytical models to maintain quality-of-service for resource management using a proactive approach in highly mobile environments." In 2018 7th International Conference on Computers Communications and Control (ICCCC). IEEE, 2018. http://dx.doi.org/10.1109/icccc.2018.8390456.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Ogunsanya, B. O., and A. J. Ifebajo. "Developing a Proactive Environmental Management System (PEMS) in Offshore West Africa." In ASME 2001 Engineering Technology Conference on Energy. American Society of Mechanical Engineers, 2001. http://dx.doi.org/10.1115/etce2001-17082.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Abstract Every industry that seeks to remain efficient and relevant in this millennium should constantly be looking for ways of becoming more environmentally responsible — no business may call itself efficient if it threatens the environment within which it operates. As the quest for hydrocarbons intensifies in our deeper waters, we see environmental performance quality playing an increasingly critical role in every company’s business performance. In the last couple of years, reports from onshore E&P activities in Nigeria have shown that operating in the Niger Delta region poses some of the toughest challenges in the world. This region has witnessed a spate of attacks on oil and gas facilities, staff and contractors. Consequently, major oil and gas players have to contend with complex operational uncertainties due to increased pressures from the local communities for improved environmental control measures. In this project, we have outlined safe and effective plans, actions, and procedures to help pre-empt these pressures; maintain harmony with local communities, and effectively manage operational uncertainties within complex environmental settings like the Nigerian Niger Delta area. A proactive environmental management style based on continuous consultation, goal-oriented monitoring, as well as a continuous improvement attitude (CIA) are some of the various solutions proposed in this work. Finally, we are confident that this kind of environmental management system will undoubtedly enhance the economic viability, as well as the global competitiveness of our deep-water fields in offshore West Africa.

Reports on the topic "Contrôle qualité proactif":

1

Apiyo, Eric, Zita Ekeocha, Stephen Robert Byrn, and Kari L. Clase. Improving Pharmacovigilance Quality Management System in the Pharmacy and Poisons Board of Kenya. Purdue University, December 2021. http://dx.doi.org/10.5703/1288284317444.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
The purpose of this study was to explore ways of improving the pharmacovigilance quality system employed by the Pharmacy and Poisons Board of Kenya. The Pharmacy and Poisons Board of Kenya employs a hybrid system of pharmacovigilance that utilizes an online system of reporting pharmacovigilance incidences and a physical system, where a yellow book is physically filled in by the healthcare worker and sent to the Pharmacy and Poisons Board for onward processing. This system, even though it has been relatively effective compared to other systems employed in Africa, has one major flaw: it is a slow and delayed system that captures the data much later after the fact, and the agency will always be behind the curve in controlling adverse incidents and events. This means that the incidences might continue to arise or go out of control. This project attempts to develop a system that would be more proactive in the collection of pharmacovigilance data and more predictive of pharmacovigilance incidences. The pharmacovigilance system should have the capacity to detect and analyze subtle changes in reporting frequencies and in patterns of clinical symptoms and signs that are reported as suspected adverse drug reactions. The method involved carrying out a thorough literature review of the latest trends in pharmacovigilance employed by different regulatory agencies across the world, especially the more stringent regulatory authorities. A review of the system employed by the Pharmacy and Poisons Board of Kenya was also done. Pharmacovigilance data, both primary and secondary, were collected and reviewed. Media reports on adverse drug reactions and poor-quality medicines over the period were also collected and reviewed. An appropriate predictive pharmacovigilance tool was also researched and identified. It was found that the Pharmacy and Poisons Board had a robust system of collecting historical pharmacovigilance data both from healthcare workers and the general public. However, a more responsive data collection and evaluation system is proposed that will help the agency achieve its pharmacovigilance objectives. On analysis of the data it was found that just over half of all product complaints, about 55%, involved poor-quality medicines; 15% poor performance, 13% presentation, 8% adverse drug reactions, 7% market authorization, 2% expired drugs and 1% adulteration complaints. A regulatory pharmacovigilance prioritization tool employing a risk impact analysis was identified and proposed for regulatory action.
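The report identifies a prioritization tool employing risk impact analysis without specifying it; the sketch below shows a generic likelihood-times-impact scoring of complaint streams purely for illustration, with all categories, ratings and thresholds hypothetical.

```python
# Generic risk-impact prioritisation sketch (likelihood x impact scoring).
# The cited report identifies such a tool without specifying it; the
# categories, ratings and thresholds below are hypothetical.

complaint_streams = {
    # stream: (likelihood 1-5, impact 1-5) -- assumed expert ratings
    "poor quality medicines": (5, 4),
    "poor performance":       (3, 3),
    "adverse drug reactions": (2, 5),
    "expired drugs":          (1, 4),
}

ranked = sorted(
    ((name, likelihood * impact) for name, (likelihood, impact) in complaint_streams.items()),
    key=lambda item: item[1],
    reverse=True,
)

for name, score in ranked:
    action = "priority regulatory action" if score >= 12 else "routine monitoring"
    print(f"{score:>2}  {name:<25} -> {action}")
```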
2

McPhedran, R., K. Patel, B. Toombs, P. Menon, M. Patel, J. Disson, K. Porter, A. John, and A. Rayner. Food allergen communication in businesses feasibility trial. Food Standards Agency, March 2021. http://dx.doi.org/10.46756/sci.fsa.tpf160.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Background: Clear allergen communication in food business operators (FBOs) has been shown to have a positive impact on customers’ perceptions of businesses (Barnett et al., 2013). However, the precise size and nature of this effect is not known: there is a paucity of quantitative evidence in this area, particularly in the form of randomised controlled trials (RCTs). The Food Standards Agency (FSA), in collaboration with Kantar’s Behavioural Practice, conducted a feasibility trial to investigate whether a randomised cluster trial – involving the proactive communication of allergen information at the point of sale in FBOs – is feasible in the United Kingdom (UK). Objectives: The trial sought to establish: ease of recruitments of businesses into trials; customer response rates for in-store outcome surveys; fidelity of intervention delivery by FBO staff; sensitivity of outcome survey measures to change; and appropriateness of the chosen analytical approach. Method: Following a recruitment phase – in which one of fourteen multinational FBOs was successfully recruited – the execution of the feasibility trial involved a quasi-randomised matched-pairs clustered experiment. Each of the FBO’s ten participating branches underwent pair-wise matching, with similarity of branches judged according to four criteria: Food Hygiene Rating Scheme (FHRS) score, average weekly footfall, number of staff and customer satisfaction rating. The allocation ratio for this trial was 1:1: one branch in each pair was assigned to the treatment group by a representative from the FBO, while the other continued to operate in accordance with their standard operating procedure. As a business-based feasibility trial, customers at participating branches throughout the fieldwork period were automatically enrolled in the trial. The trial was single-blind: customers at treatment branches were not aware that they were receiving an intervention. All customers who visited participating branches throughout the fieldwork period were asked to complete a short in-store survey on a tablet affixed in branches. This survey contained four outcome measures which operationalised customers’: perceptions of food safety in the FBO; trust in the FBO; self-reported confidence to ask for allergen information in future visits; and overall satisfaction with their visit. Results: Fieldwork was conducted from the 3 – 20 March 2020, with cessation occurring prematurely due to the closure of outlets following the proliferation of COVID-19. n=177 participants took part in the trial across the ten branches; however, response rates (which ranged between 0.1 - 0.8%) were likely also adversely affected by COVID-19. Intervention fidelity was an issue in this study: while compliance with delivery of the intervention was relatively high in treatment branches (78.9%), erroneous delivery in control branches was also common (46.2%). Survey data were analysed using random-intercept multilevel linear regression models (due to the nesting of customers within branches). Despite the trial’s modest sample size, there was some evidence to suggest that the intervention had a positive effect for those suffering from allergies/intolerances for the ‘trust’ (β = 1.288, p<0.01) and ‘satisfaction’ (β = 0.945, p<0.01) outcome variables. Due to singularity within the fitted linear models, hierarchical Bayes models were used to corroborate the size of these interactions. 
Conclusions: The results of this trial suggest that a fully powered clustered RCT would likely be feasible in the UK. In this case, the primary challenge in the execution of the trial was the recruitment of FBOs: despite high levels of initial interest from four chains, only one took part. However, it is likely that the proliferation of COVID-19 adversely impacted chain participation – two other FBOs withdrew during branch eligibility assessment and selection, citing COVID-19 as a barrier. COVID-19 also likely lowered the on-site survey response rate: a significant negative Pearson correlation was observed between daily survey completions and COVID-19 cases in the UK, highlighting a likely relationship between the two. Limitations: The trial was quasi-random: selection of branches, pair matching and allocation to treatment/control groups were not systematically conducted. These processes were undertaken by a representative from the FBO’s Safety and Quality Assurance team (with oversight from Kantar representatives on pair matching), as a result of the chain’s internal operational restrictions.
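The analysis summarised above uses random-intercept multilevel linear models for customers nested within branches. A minimal sketch with statsmodels is shown below, assuming hypothetical variable names and data; it is not the study's analysis code, and with so few rows the fit may emit convergence warnings.

```python
# Minimal sketch of a random-intercept multilevel model for customers nested
# within branches, as in the trial's analysis. Variable names and the data
# frame are hypothetical; this is not the study's analysis code.
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical survey extract: one row per customer, clustered by branch.
df = pd.DataFrame({
    "trust":     [5.2, 4.8, 5.0, 4.6, 6.1, 5.9, 6.3, 5.7,
                  4.1, 4.5, 4.3, 4.9, 6.5, 6.0, 6.2, 5.8],
    "treatment": [0, 0, 0, 0, 1, 1, 1, 1, 0, 0, 0, 0, 1, 1, 1, 1],  # allergen info at point of sale
    "allergy":   [1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0],  # self-reported allergy/intolerance
    "branch":    ["A"] * 4 + ["B"] * 4 + ["C"] * 4 + ["D"] * 4,
})

# Random intercept per branch; the treatment x allergy interaction mirrors the
# reported effect for customers with allergies/intolerances.
model = smf.mixedlm("trust ~ treatment * allergy", df, groups=df["branch"])
result = model.fit()
print(result.summary())
```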
3

Innovative Solutions to Human-Wildlife Conflicts: National Wildlife Research Center Accomplishments, 2010. U.S. Department of Agriculture, Animal and Plant Health Inspection Service, April 2011. http://dx.doi.org/10.32747/2011.7291310.aphis.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
As the research arm of Wildlife Services, a program within the U.S. Department of Agriculture’s (USDA) Animal and Plant Health Inspection Service (APHIS), NWRC develops methods and information to address human-wildlife conflicts related to agriculture, human health and safety, property damage, invasive species, and threatened and endangered species. The NWRC is the only Federal research facility in the United States devoted entirely to the development of methods for effective wildlife damage management, and its research authority comes from the Animal Damage Control Act of 1931. The NWRC’s research priorities are based on nationwide research needs assessments, congressional directives, APHIS Wildlife Services program needs, and stakeholder input. The Center is committed to helping resolve the ever-expanding and changing issues associated with human-wildlife conflict management and remains well positioned to address new issues through proactive efforts and strategic planning activities. NWRC research falls under four principal areas that reflect APHIS’ commitment to “protecting agricultural and natural resources from agricultural animal and plant health threats, zoonotic diseases, invasive species, and wildlife conflicts and diseases”. In addition to the four main research areas, the NWRC maintains support functions related to animal care, administration, information transfer, archives, quality assurance, facility development, and legislative and public affairs.
