Academic literature on the topic 'Industry data tools'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Industry data tools.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Industry data tools"

1

Pender, Jocelyn. "Leveraging Industry Visualization Tools for Biodiversity Science." Biodiversity Information Science and Standards 2 (May 22, 2018): e25842. https://doi.org/10.3897/biss.2.25842.

Full text
Abstract:
Widespread technology usage has resulted in a deluge of data that is not limited to scientific domains. For example, technology companies accumulate vast amounts of data on their users to support their applications and platforms. The participation of many domains in big data collection, data analysis and visualization, and the need for fast data exploration has provided a stellar market opportunity for high-quality data visualization software to emerge. In this talk, leading industry visualization software (Tableau) will be used to explore a biodiversity dataset (Carex spp. distribution and morphology). The advantages and disadvantages of using Tableau for scientific exploration will be discussed, as well as how to integrate data visualization tools early into the data pipeline. Lastly, the potential for developing a data visualization "stack" (i.e., a combination of software products and programming languages) using available tools will be discussed, as well as what the future might look like for scientists looking to capitalize on the growth of industry tools.
APA, Harvard, Vancouver, ISO, and other styles
2

Larrucea, Xabier, Micha Moffie, and Dan Mor. "Enhancing GDPR compliance through data sensitivity and data hiding tools." JUCS - Journal of Universal Computer Science 27, no. 7 (2021): 650–66. https://doi.org/10.3897/jucs.70369.

Full text
Abstract:
Since the emergence of GDPR, several industries and sectors have been deploying informatics solutions to fulfil these rules. The health sector is considered a critical sector within Industry 4.0 because it manages sensitive data, and National Health Services are responsible for managing patients’ data. European NHS are converging towards a connected system allowing the exchange of sensitive information across different countries. This paper defines and implements a set of tools for extending the Reference Architectural Model Industry 4.0 to the healthcare sector, which are used to enhance GDPR compliance. These tools deal with data sensitivity and data hiding. A case study illustrates the use of these tools and how they are integrated with the reference architectural model.
3

Eldridge, Jeanette. "Data visualisation tools—a perspective from the pharmaceutical industry." World Patent Information 28, no. 1 (2006): 43–49. https://doi.org/10.1016/j.wpi.2005.10.007.

Full text
4

Zazdravnykh, Aleksey V., and Elena Yu Boitsova. "Big Data as a Factor of Industry Market Entry." Vestnik Tomskogo gosudarstvennogo universiteta. Ekonomika, no. 56 (2021): 50–66. https://doi.org/10.17223/19988648/56/4.

Full text
Abstract:
Over the past few years, the Russian expert and scientific community has registered a significant increase in professional interest in the problem of using the capabilities of big data and algorithms for processing them as effective tools for shaping the market advantages of firms, as well as tools for limiting competition in commodity markets. At the same time, in theoretical and practical terms, this urgent issue is still very poorly studied. This article examines certain aspects of the influence of big data control on the dynamics of markets. The authors aim to develop the theory of the modern typology of entry barriers and to expand scientific ideas about new tools for limiting competition in commodity markets. The authors state that the potential of big data control as a factor that restricts the entry of new operators is manifested in the restriction of access to big data sources and the technological infrastructure of their processing, in the effects of the “positive feedback loop”, and in new opportunities for cooperative behavior of firms. The authors are convinced that the ability of firms to qualitatively structure data and work in real time with relevant data sets is of fundamental importance for the stability of firms’ market positions today. In the development of the discussion, alternative opinions on the issue are also given. Separately, the authors discuss the effects of using big data on consumer welfare, as well as the associated privacy concerns. The authors note that the ability of firms to guarantee confidentiality in the consumption process creates new points of growth in competitiveness and new types of entry barriers.
5

Raouf, Nisreen Nizar, and Mohammad A. Taha Aldabbagh. "Cloud Data Integration Tools As Services." Jurnal Kendali Teknik dan Sains 2, no. 3 (2024): 8–19. https://doi.org/10.59581/jkts-widyakarya.v2i3.3223.

Full text
Abstract:
In this paper, a variety of data integration, cloud computing, and web services-related topics were discussed. The conversation covered the advantages of using cloud-based methods for data integration, such as enhanced data accuracy and completeness, as well as the challenges and considerations that must be addressed. In addition, the significance of document integrity and the use of auto-enhance document tools to ensure data accuracy were emphasized. The paper also provided a broad overview of the subject, touching on a variety of aspects and providing insights into the potential of cloud-based data integration methods in the cloud industry.
6

Kumar, Sunil, and Maninder Singh. "Big data analytics for healthcare industry: impact, applications, and tools." Big Data Mining and Analytics 2, no. 1 (2019): 48–57. https://doi.org/10.26599/bdma.2018.9020031.

Full text
7

Malviya, Shreekant. "AI-Powered Data Governance for Insurance: A Comparative Tool Evaluation." International Journal of Data Science and Machine Learning 5, no. 1 (2025): 280–99. https://doi.org/10.55640/ijdsml-05-01-24.

Full text
Abstract:
As insurers increasingly use artificial intelligence to automate underwriting, pricing, and claims processing end to end, open, industry-level data governance solutions have become a top priority. Although numerous AI-driven governance technologies are available, they are mostly purpose-built for generic corporate requirements and do not fully meet the decision-oriented, ethics-conscious, and regulation-compliant requirements of the insurance industry. This paper presents a comparative evaluation of six leading governance platforms (Collibra, Informatica CLAIRE, BigID, Immuta, IBM Watson Knowledge Catalog, and Alation) across eight dimensions, such as explainability, consent management, and insurance-specific flexibility. The research also illustrates the industry-specific adoption of AI-driven data governance in finance, healthcare, and insurance, along with comparative insights across these three data-centric industries. The study reviews insurance governance practices to assess capability gaps in the existing commercial tools and gives strategic recommendations to insurers and technology vendors. By addressing technical limitations and ethical dilemmas, the paper provides a basis for building AI governance systems that are compatible, scalable, fair, transparent, and flexible to the specific working context of the insurance data universe.
8

Ifrim, Ana Maria, Cătălin Ionuț Silvestru, Mihai-Alexandru Stoica, Cristina Vasilica Icociu, Ionica Oncioiu, and Marian Ernuț Lupescu. "Quality Tools and Their Applications in Industry." International Journal of Innovation in the Digital Economy 14, no. 1 (2023): 1–11. https://doi.org/10.4018/ijide.325068.

Full text
Abstract:
Almost all quality improvement methods require data collection and analysis to solve quality problems. The combination of Six Sigma and Agile creates a Six Sigma Agile methodology that aims to reach quality levels according to the Six Sigma requirement of 3.4 defects per million measurements. In order to achieve these objectives, it is necessary to know the industry well and, implicitly, the product or the analysed process. Thus, the correctness of these analyses depends on the collection of the data that will be analysed. The use of data analysis methods at each stage, especially in the measurement and analysis stages, is critically important for making sound decisions. The purpose of this article is to present the added value of integrating the Six Sigma and Agile methodologies for IT projects. The integration of the two methodologies will lead to faster decision-making without the risk of an increase in the number of failures.
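The Six Sigma target of 3.4 defects per million mentioned in the preceding abstract can be made concrete with a small calculation. The sketch below is purely illustrative (it comes from the standard Six Sigma convention, not from any of the papers listed here): it converts a defect count into DPMO and then into the corresponding sigma level, applying the customary 1.5-sigma shift.

```python
from statistics import NormalDist

def dpmo(defects: int, units: int, opportunities_per_unit: int) -> float:
    """Defects per million opportunities."""
    return defects / (units * opportunities_per_unit) * 1_000_000

def sigma_level(dpmo_value: float) -> float:
    """Short-term sigma level, applying the customary 1.5-sigma shift."""
    p = dpmo_value / 1_000_000          # long-term defect probability
    return NormalDist().inv_cdf(1 - p) + 1.5

# 34 defects over 100 units with 100 opportunities each:
print(dpmo(34, 100, 100))               # 3400.0
# The Six Sigma target of 3.4 DPMO corresponds to roughly a 6-sigma process:
print(round(sigma_level(3.4), 2))       # 6.0
```

The 3.4 DPMO figure corresponds to a 4.5-sigma long-term capability plus the assumed 1.5-sigma process drift, which is why the function reports roughly 6.0.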
More sources

Dissertations / Theses on the topic "Industry data tools"

1

Ilardi, Davide. "Data-driven solutions to enhance planning, operation and design tools in Industry 4.0 context." Doctoral thesis, Università degli studi di Genova, 2023. https://hdl.handle.net/11567/1104513.

Full text
Abstract:
This thesis proposes three data-driven solutions to be combined with state-of-the-art solvers and tools, primarily in order to enhance their computational performance. The problem of efficiently designing the open-sea floating platforms on which wind turbines can be mounted will be tackled, as well as the tuning of a data-driven engine-monitoring tool for maritime transportation. Finally, the activities of SAT and ASP solvers will be thoroughly studied, and a deep learning architecture will be proposed to enhance the heuristics-based solving approach adopted by such software. The covered domains are different, and the same is true for their respective targets. Nonetheless, the proposed Artificial Intelligence and Machine Learning algorithms are shared, as is the overall picture: promote industrial AI and meet the constraints imposed by the Industry 4.0 vision. A lesser presence of the human-in-the-loop, a data-driven approach to discover causalities otherwise ignored, special attention to the environmental impact of industries' emissions, and a real and efficient exploitation of the Big Data available today are just a subset of the latter. Hence, from a broader perspective, the experiments carried out within this thesis are driven towards the aforementioned targets, and the resulting outcomes are satisfactory enough to potentially convince the research community and industrialists that these are not just "visions" but can actually be put into practice. However, this is still an introduction to the topic, and the developed models are at what can be defined as a "pilot" stage. Nonetheless, the results are promising, and they pave the way towards further improvements and the consolidation of the dictates of Industry 4.0.
2

Hedin, Nathalie, and Adrian Zander. "Using KPIs in decision-making tools in the construction industry." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-253805.

Full text
Abstract:
The construction industry has a great opportunity to streamline its operations even further by making greater use of the digital revolution. The industry today relies on a lot of manual data management and analysis to get an overview of the business and to make decisions. This can be a time-consuming process that could be made more efficient through Business Intelligence (BI). BI is a technology that automatically, with the help of selected Key Performance Indicators (KPIs), shows the current status of how a business performs, allowing managers and executives to make decisions more easily and quickly. This study examines which KPIs are of common interest to companies and organizations in the construction industry, as well as how these KPIs can be presented to the end users of a BI application. To investigate this, data is collected through literature studies and interviews, resulting in a list of common KPIs for the industry. From this common list, a number of KPIs are selected to be represented visually. An analysis of the results indicates that KPIs are of different importance and relevance depending on which sector of the construction industry the interviewee belongs to. There also appear to be sector-specific KPIs, and the common list suggests that the profit margin is of great importance throughout the whole industry. KPIs can be represented in different types of charts and diagrams, depending on the purpose they serve, and should be designed so that they are intuitive and easy to understand.
3

An, Ping. "An investigation of the use of software development environments in the industry." Thesis, Linköping University, Department of Computer and Information Science, 2004. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-2662.

Full text
Abstract:
Software engineering tools are used in industry to improve the productivity and quality of the software development process, yet the properties of those tools are often perceived as unsatisfactory. For example, researchers have found that some problems are due to deficient integration among the tools. Furthermore, a continuing problem is the gap between IT education and the actual demand for tool skills from the IT industry. Consequently, knowledge is needed of the properties of software development tools, as well as an understanding of the tool skills demanded by industry.

The purpose of this study is to survey commercial software development environments (SDEs) that are used today in professional software engineering and to discuss their advantages and disadvantages. A secondary goal of the study is to identify the actual requirements from industry on IT education.

A questionnaire was sent out to 90 software developers and IT managers at 30 IT companies in Sweden. The results of the survey show that IT companies, for the most part, use SDEs from commercial software vendors. Respondents report that common problems of the SDEs are the following: bad integration among the tools, problems tracing software artifacts in the different phases of the programming cycle, and deficient support for version control and system configuration. Furthermore, some tools are difficult to use, which results in a time-consuming development process.

We conclude that future software development environments need to provide better support for integration, automation, and configuration management. Regarding the required tool skills, we believe that IT education would gain from including commercial tools that cover the whole software product lifecycle in the curriculum.
4

Berg, Martin, and Albin Eriksson. "Toward predictive maintenance in surface treatment processes : A DMAIC case study at Seco Tools." Thesis, Luleå tekniska universitet, Institutionen för ekonomi, teknik, konst och samhälle, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:ltu:diva-84923.

Full text
Abstract:
Surface treatments are often used in the manufacturing industry to change the surface of a product, including its related properties and functions. The occurrence of degradation and corrosion in surface treatment processes can lead to critical breakdowns over time. Critical breakdowns may impair the properties of the products and shorten their service life, which causes increased lead times or additional costs in the form of rework or scrapping. Prevention of critical breakdowns due to machine component failure requires a carefully selected maintenance policy. Predictive maintenance is used to anticipate equipment failures to allow for maintenance scheduling before component failure. Developing predictive maintenance policies for surface treatment processes is problematic due to the vast number of attributes to consider in modern surface treatment processes. The emergence of smart sensors and big data has led companies to pursue predictive maintenance. A company that strives for predictive maintenance of its surface treatment processes is Seco Tools in Fagersta. The purpose of this master's thesis has been to investigate the occurrence of critical breakdowns and failures in the machine components of the chemical vapor deposition (CVD) and post-treatment wet blasting processes by mapping the interaction between their respective process variables and their impact on critical breakdowns. The work has been conducted as a Six Sigma project utilizing the problem-solving methodology DMAIC. Critical breakdowns were investigated combining principal component analysis (PCA), computational fluid dynamics (CFD), and statistical process control (SPC) to create an understanding of the failures in both processes. For both processes, two predictive solutions were created: one short-term solution utilizing existing dashboards and one long-term solution utilizing a PCA model and an Orthogonal Partial Least Squares (OPLS) regression model for batch statistical process control (BSPC).
The short-term solutions were verified and implemented during the master's thesis at Seco Tools. Recommendations were given for future implementation of the long-term solutions. In this thesis, insights are shared regarding the applicability of OPLS and Partial Least Squares (PLS) regression models for batch monitoring of the CVD process. We also demonstrate that the prediction of a certain critical breakdown, clogging of the aluminum generator in the CVD process, can be accomplished through the use of SPC. For the wet blasting process, a PCA methodology is suggested to be effective for visualizing breakdowns.
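The batch statistical process control idea described in the abstract above (monitoring batches through a PCA model and flagging excursions) can be illustrated in miniature. The sketch below is a generic Hotelling's T² monitor on synthetic data, not the thesis's actual model; the variable names, the two-component choice, and the empirical 99th-percentile control limit are illustrative assumptions.

```python
import numpy as np

def fit_pca(X: np.ndarray, n_components: int):
    """Fit a PCA model on in-control reference batches (rows = batches)."""
    mu, sigma = X.mean(axis=0), X.std(axis=0)
    Z = (X - mu) / sigma                        # standardize each variable
    _, S, Vt = np.linalg.svd(Z, full_matrices=False)
    P = Vt[:n_components].T                     # loadings (variables x components)
    lam = S[:n_components] ** 2 / (len(X) - 1)  # variance of each score
    return mu, sigma, P, lam

def t2_scores(X: np.ndarray, model) -> np.ndarray:
    """Hotelling's T^2 statistic for each row of X."""
    mu, sigma, P, lam = model
    T = ((X - mu) / sigma) @ P                  # project onto the PCA scores
    return np.sum(T ** 2 / lam, axis=1)

rng = np.random.default_rng(0)
ref = rng.normal(size=(200, 5))                 # in-control reference batches
model = fit_pca(ref, n_components=2)
limit = np.percentile(t2_scores(ref, model), 99)  # empirical control limit

# A batch pushed far along the first principal direction trips the alarm:
mu, sigma, P, lam = model
fault = mu + sigma * (10 * P[:, 0])
print(bool(t2_scores(fault[None, :], model)[0] > limit))  # True
```

In practice the control limit would come from an F- or chi-square distribution rather than a reference percentile, and a residual (SPE/Q) chart would accompany the T² chart, but the monitoring logic is the same.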
5

Lewis, Paul Robert. "Cutting data for automated turning tool selection in industry." Thesis, Durham University, 1996. http://etheses.dur.ac.uk/5240/.

Full text
Abstract:
This thesis is concerned with the determination of cutting parameters (cutting speed, feed rate and depth of cut) in turning operations within an industrial environment. The parameters are required for the purposes of tool selection, working with a variety of batches of different materials. Previous work of this nature, little of which has been transferred into industry, has concentrated primarily on deriving optimum cutting conditions, based on a variety of deterministic and non-deterministic approaches, with a general reliance on experimentally-derived input variables. However, this work is only suited to tool selection for a single material. Under industrial conditions tools will frequently need to be selected for more than one material, in tool/material combinations not recommended by tool makers. Consequently, the objective of the research described in this thesis was to employ existing cutting data technology and to use it as the basis for a cutting data system, suitable for multi-batch tool selection. Two companies collaborated in this work, by making available suitable personnel and the provision of shop floor facilities on their premises. The initial work concentrated on the development of an algorithmic model, based on established theory. This was then tested industrially, using the concept of shop floor approved data as a substitute for optimum cutting data. The model was found to work reasonably, but required further development to make it suitable for multi-batch tool selection. This development took three main forms: a) a reduction of input data, particularly in the number of experimentally-derived variables, b) the removal of the tool/material-specific constraints traditionally used in cutting data optimisation, c) a method of data correction based on adjustment of the mean and standard deviation of the data. Further industrial testing was carried out using the resulting system.
It was demonstrated that it was possible for a relaxed system with reduced input variables and appropriate data correction to function within an industrial environment.
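The data-correction step mentioned in the abstract above, adjustment of the mean and standard deviation of the cutting data, can be pictured with a minimal sketch. This is a generic mean/standard-deviation rescaling written under the assumption that the thesis's correction works roughly this way; the function name and the sample values are illustrative, not taken from the thesis.

```python
import statistics

def correct(values, target_mean, target_std):
    """Rescale recorded cutting data so its mean and standard deviation
    match shop-floor-approved target values."""
    m = statistics.mean(values)
    s = statistics.stdev(values)
    # Shift to the target mean and scale spread by the ratio of std devs.
    return [target_mean + (v - m) * (target_std / s) for v in values]

# Recorded cutting speeds centred on 110 m/min, corrected to approved 200 +/- 5:
print(correct([100, 110, 120], 200, 5))  # [195.0, 200.0, 205.0]
```

The corrected list preserves the relative ordering of the measurements while matching the approved distribution's first two moments.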
6

Kumar, Rajat, and Santosh Shinde. "Simulation as decision support tool : A case study with data analysis, VSM and simulation applied to an ETO system." Thesis, Uppsala universitet, Industriell teknik, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-385918.

Full text
Abstract:
Digital transformation is rising rapidly, where absolutely any kind of data is readily available. With the rise in digital information and technologies, businesses are aiming at measuring their processes and thereby getting better control over their operations. Industries are striving to increase efficiency and productivity to survive in the highly competitive global market by adopting digital tools. Simulation has been one of the most common tools of Industry 4.0, allowing a virtual representation of the real system. The use of simulation has tremendously increased in manufacturing industries due to its advantages, where changes can be made and checked virtually before implementing them in the physical system. Data acquisition has been an important aspect that remains unfocused, especially in Engineer to Order (ETO) and Make to Order (MTO) environments where the production processes are complex, non-standardised and depend heavily on manual work, leading to a prolonged barrier towards digitalisation. So, can data create new value for companies? Are firms realising the importance of data? As the economy is moving towards a more data-driven state, it is essential for companies to realise the importance of data and find efficient ways of collecting it. The thesis is based on a case study at an electrical transformer manufacturing firm taking its first step towards digital technology to gain better control over its manufacturing processes. However, this transformation is primarily hindered by the limited availability of data and poor data quality due to manual data acquisition methods. Analysis of organisational documents and Value Stream Mapping (VSM) have been used to analyse missing primary and secondary production data. The results of this study conclude that modelling of the system can reveal additional data gaps, and that the simulation model can serve as a powerful decision support tool for the firm. This switch in approach from intuition-based to more fact-based decision making can result in better planning and control over production. This thesis will lay the foundation for studies related to data acquisition and simulation in an ETO environment, as this is not widely discussed in the existing literature.
7

Gustafson, Christopher, and Sam Florin. "Qualification of Tool for Static Code Analysis : Processes and Requirements for Approval of Static Code Analysis in the Aviation Industry." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-277941.

Full text
Abstract:
In the aviation industry, the use of software development tools is not as easily adopted as in other industries. Due to the catastrophic consequences of software errors in airborne systems, software development processes have rigorous requirements. One of these requirements is that a code standard must be followed. Code standards are used to exclude code constructions that could result in unwanted behaviour. The process of manually ensuring a specific code standard can be costly. This process could be automated by a tool for static code analysis; however, this requires a formal qualification. This thesis evaluates the process of qualifying a tool for static code analysis in accordance with the requirements of the major aviation authorities EASA and FAA. To describe the qualification process, a literature study was conducted. To further explain how an existing tool could be put through the qualification process, a case study of the existing tool Parasoft C/C++ test was conducted. The results of the literature study show what processes must be completed in order to qualify a static code analysis tool. Importantly, the study shows that no requirements are put on the development process of the tool itself. This was an important takeaway, as it meant that an existing tool could be qualified without any additional data from the developer of the tool. The case study of Parasoft C/C++ test showed how the tool could be configured and verified to analyze code in accordance with a small set of code rules. Furthermore, three documents including qualification data were produced, showing how the qualification process should be documented in order to communicate the process to an authority. The results of the thesis do not provide the full picture of how a tool could be qualified, as considerations specific to the software the tool is used to develop still need to be taken into account. The thesis does, however, provide guidance on the majority of the applicable requirements. Future research could be done to provide the complete picture of the qualification process, as well as what the process would look like for other types of tools.
APA, Harvard, Vancouver, ISO, and other styles
8

Fenollosa, Artés Felip. "Contribució a l'estudi de la impressió 3D per a la fabricació de models per facilitar l'assaig d'operacions quirúrgiques de tumors." Doctoral thesis, Universitat Politècnica de Catalunya, 2019. http://hdl.handle.net/10803/667421.

Full text
Abstract:
This doctoral thesis has focused on the challenge of obtaining, through Additive Manufacturing (AM), models for surgical rehearsal, under the premise that the equipment to produce them should be accessible to the hospital environment. The objective is to facilitate the wider use of prototypes as a tool for preparing surgical operations, transforming current medical practice in the same way that technologies such as those that enabled the use of X-rays once did. The reason for using AM, rather than more traditional technologies, is its capacity to directly materialize the digital data obtained from the patient's anatomy through three-dimensional scanning systems, making personalized models possible. The results focus on generating new knowledge on how to achieve accessible multi-material 3D printing equipment that yields models mimicking living tissues. To facilitate this sought-after extension of the technology, the work focuses on open-source technologies such as Fused Filament Fabrication (FFF) and similar ones based on catalyzable liquids. The research is aligned with the AM development activity at CIM UPC, and in this specific area with the collaboration with the Hospital Sant Joan de Déu de Barcelona (HSJD). The first block of the thesis describes the state of the art, detailing the existing technologies and their application to the medical environment. For the first time, bases for the characterization of living tissues, especially soft ones, have been established to support the selection of materials that can mimic them in an AM process, in order to improve the surgeons' rehearsal experience. The rigid character of the materials most commonly used in 3D printing makes them of little use for simulating tumors and other anatomical references. Parameters such as density, viscoelasticity, the characterization of soft materials in industry, the elastic modulus of soft tissues and vessels, their hardness, and requirements such as the sterilization of the models are then addressed in turn. The second block begins by exploring 3D printing using FFF. The process variants are classified from the point of view of multi-materiality, essential for making surgical rehearsal models, differentiating between multi-nozzle solutions and mixing-in-the-head solutions. The study includes materials (filaments and liquids) that would be most useful for mimicking soft tissues. It is found that liquids, compared with filaments, involve greater complexity in AM processes, and ways of printing very soft materials are determined. Finally, six real cases of collaboration with the HSJD are presented, a selection of those in which the doctoral candidate has been involved in recent years. The origin lies in the difficulty of approaching resection operations of pediatric tumors such as neuroblastoma, and in the initiative of Dr. Lucas Krauel. Lastly, Block 3 explores numerous concepts (up to 8), an activity completed over the last five years with the support of the resources of CIM UPC and of work associated with final-degree projects of UPC students, with experimental equipment built to validate them. This broad and systematic research brings a desktop multi-material 3D printing solution closer to being available. It is determined that the best way forward is a plurality of independent print heads, enabling the 3D printer to integrate several of the concepts studied, and a possible solution is realized. Closing the thesis, a 3D printing system for surgical rehearsal models is outlined, to serve as a basis for future developments.
APA, Harvard, Vancouver, ISO, and other styles
9

Ferreira, Gonçalo José Teixeira de Pinho. "Exploring augmented and data-driven digital modeling tools in product design and engineering." Master's thesis, 2019. http://hdl.handle.net/10773/28573.

Full text
Abstract:
Tools are indispensable to all diligent professional practice. New concepts and possibilities for paradigm shifts are emerging with recent computational developments in digital tools. However, new tools built on key concepts such as "Big Data", "Accessibility" and "Algorithmic Design" are fundamentally changing the input and position of the Product Engineer and Designer. After a contextual introduction, this dissertation begins by extracting three pivotal criteria from an analysis of the state of the art in Product Design Engineering. For each of these criteria, the most relevant and paradigmatic emerging concepts are explored and then positioned and compared within the Product Lifecycle Management wheel, where potential risks and gaps are identified for exploration in the experimental part. There are two types of empirical experiments: the first are case studies from architecture and urban planning, drawn from the student's professional experience, which served as a pretext and inspiration for the experiments carried out directly for Product Design Engineering. These begin with a set of isolated explorations and analyses, continue with a hypothetical experiment derived from them, and end with a deliberative section culminating in a list of risks and changes concluded from all the previous work. The urgency of reflecting on what will change in that role and position, and on what kind of ethical and/or conceptual reformulations should take place for the profession to maintain its intellectual integrity and, ultimately, to survive, is abundantly evident.<br>Master's in Product Engineering and Design
APA, Harvard, Vancouver, ISO, and other styles
10

Huang, Chung-Hao, and 黃崇豪. "A Study of Evaluating the Management Efficiency of Machine Tools Industry in Taiwan-An Application of Data Envelopment Analysis." Thesis, 2005. http://ndltd.ncl.edu.tw/handle/65339237241495227784.

Full text
Abstract:
Master's thesis<br>Chaoyang University of Technology<br>Graduate Institute of Industrial Engineering and Management<br>93<br>Applying the Data Envelopment Analysis (DEA) approach, this study conducts a performance analysis of sixteen machine tool manufacturers in Taiwan. The input factors are the number of staff, fixed assets, R&D expenses and processing costs; the output factors are net revenue and EPS. From these factors, computed with DEAP, performance-efficiency results are measured for each machine tool manufacturer, including the category of efficient units and the best input levels for each manufacturer. The study then applies the Malmquist productivity index (MPI) to measure the "total productivity change", "technological change" and "efficiency change" indexes, respectively. The results show that among the 16 evaluated units, 12 were inefficient. Six units did not achieve the most appropriate scale and were at the increasing-returns-to-scale stage; the other six, at the decreasing-returns-to-scale stage, reflect over-invested resources and should therefore reduce the amount of resources they invest.
APA, Harvard, Vancouver, ISO, and other styles
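The envelopment form behind a DEA study like the one above can be sketched as a small linear program. The sketch below is illustrative only: the data are invented toy numbers, not the sixteen Taiwanese manufacturers, and SciPy's `linprog` stands in for the DEAP software the thesis used.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Input-oriented CCR (constant returns to scale) efficiency of unit o.
    X: (n_units, n_inputs), Y: (n_units, n_outputs)."""
    n, m = X.shape
    s = Y.shape[1]
    # decision vector z = [theta, lambda_1, ..., lambda_n]; minimize theta
    c = np.zeros(n + 1)
    c[0] = 1.0
    A_ub, b_ub = [], []
    for i in range(m):   # sum_j lambda_j * x_ij - theta * x_io <= 0
        A_ub.append(np.concatenate(([-X[o, i]], X[:, i])))
        b_ub.append(0.0)
    for r in range(s):   # sum_j lambda_j * y_rj >= y_ro
        A_ub.append(np.concatenate(([0.0], -Y[:, r])))
        b_ub.append(-Y[o, r])
    bounds = [(0, None)] * (n + 1)
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  bounds=bounds, method="highs")
    return res.fun  # theta in (0, 1]; 1.0 means the unit is efficient

# toy data (hypothetical): 4 firms, inputs = (staff, capital), output = revenue
X = np.array([[2.0, 3.0], [4.0, 1.0], [4.0, 4.0], [6.0, 6.0]])
Y = np.array([[10.0], [10.0], [10.0], [10.0]])
scores = [round(ccr_efficiency(X, Y, o), 3) for o in range(len(X))]
# firms 1 and 2 span the efficient frontier; the other two score below 1
```

A score of 1.0 places a unit on the efficient frontier; scores below 1.0 indicate the proportional input reduction that a best-practice peer combination could achieve with the same output.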
More sources

Books on the topic "Industry data tools"

1

Alvarez, J. Microcomputers as management tools in the sugar cane industry. Elsevier, 1985.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
2

A, Vaĭsburg V., ed. Avtomatizat͡s︡ii͡a︡ prot͡s︡essov podgotovki aviat͡s︡ionnogo proizvodstva na baze ĖVM i oborudovanii͡a︡ s ChPU. "Mashinostroenie", 1985.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
3

Yakimovich, Sergey, and Yuriy Efimov. Modeling and scientific research tools in the timber industry based on LabVIEW. INFRA-M Academic Publishing LLC., 2024. http://dx.doi.org/10.12737/1851518.

Full text
Abstract:
The textbook describes modeling methods and the modern theory of the industrial experiment as applied to the timber industry: model design, test preparation, selection of measuring instruments, experiment planning, data processing methods and their analysis. For practical consolidation of the material, laboratory work is presented on experimental studies of random processes in the longitudinal sawing of wood, based on LabVIEW and spectral analysis. The book meets the requirements of the latest-generation federal state educational standards of higher education and is intended for bachelor's and master's students in the field of "Technology of logging and wood processing industries".
APA, Harvard, Vancouver, ISO, and other styles
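The spectral-analysis step described for the sawing experiments can be mirrored outside LabVIEW. A minimal Python sketch using SciPy's Welch estimator on a synthetic signal (all signal parameters here are invented for illustration, not taken from the textbook):

```python
import numpy as np
from scipy.signal import welch

# Synthetic stand-in for a sawing-force signal: a tooth-passing component
# at 120 Hz buried in broadband noise (hypothetical values).
fs = 2000.0                              # sampling rate, Hz
t = np.arange(0, 2.0, 1.0 / fs)          # 2 s of samples
rng = np.random.default_rng(0)
signal = np.sin(2 * np.pi * 120.0 * t) + 0.5 * rng.standard_normal(t.size)

f, psd = welch(signal, fs=fs, nperseg=1024)  # Welch power spectral density
peak_hz = f[np.argmax(psd)]                  # dominant spectral component
```

The estimated spectrum recovers the assumed 120 Hz component as its dominant peak, which is the kind of diagnostic the laboratory work applies to real sawing data.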
4

D, Smith Fred M., ed. Physician investigator handbook: GCP tools and techniques. Interpharm Press, 1997.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
5

United States. National Aeronautics and Space Administration., ed. Automated data acquisition technology development: Automated modeling and control development : final technical report. National Aeronautics and Space Administration, 1995.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
6

United, States Congress Senate Committee on Banking Housing and Urban Affairs Subcommittee on Security and International Trade and Finance. Equipping financial regulators with the tools necessary to monitor systemic risk: Hearing before the Subcommittee on Security and International Trade and Finance of the Committee on Banking, Housing, and Urban Affairs, United States Senate, One Hundred Eleventh Congress, second session, on examining the systemic risk aspect of regulatory reform, focusing on regulators' current capabilities to collect and analyze financial market data, and assessing what additional tools and resources are necessary to monitor and identify systemic risk, February 12, 2010. U.S. G.P.O., 2010.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
7

Whybark, D. Clay. An analysis of global data on the impact of the market on manufacturing practices. Indiana Center for Global Business, School of Business, Indiana University, 1990.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
8

Wilson, J. L. 3-D modeling as a tool to improve integrated design and construction. The Institute, 1994.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
9

Akhmedov, M. Z. Operativnyĭ uchet i analiz ispolʹzovanii͡a︡ materialov. "Finansy i statistika", 1986.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
10

Akhmedov, M. Z. Operativnyĭ uchet i analiz ispolʹzovanii︠a︡ materialov. "Finansy i statistika", 1986.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
More sources

Book chapters on the topic "Industry data tools"

1

Krishnamurthy, Vallidevi, V. S. Aprajitha, S. Prasanna, and Nuthalapati Rahul. "Big data fusion with GEN AI Tools." In Industry 6.0. CRC Press, 2024. http://dx.doi.org/10.1201/9781003517993-3.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Begum, Khadija, Md Mamunur Rashid, and Mohammad Arafath Uddin Shariff. "Comparative Study of Big Data Visualization Tools and Techniques." In Applied Informatics for Industry 4.0. Chapman and Hall/CRC, 2023. http://dx.doi.org/10.1201/9781003256069-16.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

König, Andreas. "Dimensionality Reduction and Interactive Visualization of Multivariate Data — Methods, Tools, Applications." In Soft Computing and Industry. Springer London, 2002. http://dx.doi.org/10.1007/978-1-4471-0123-9_39.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Hong, Yan, Xianyi Zeng, Pascal Brunixaux, and Yan Chen. "Evaluation of Fashion Design Using Artificial Intelligence Tools." In Artificial Intelligence for Fashion Industry in the Big Data Era. Springer Singapore, 2018. http://dx.doi.org/10.1007/978-981-13-0080-6_12.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Dasgupta, J. "Imparting Hands-on Industry 4.0 Education at Low Cost Using Open Source Tools and Python Eco-System." In Studies in Big Data. Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-25778-1_3.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Wörner, Daniel, Lukas Budde, and Thomas Friedli. "Towards Circular Business Models in the Punching Industry: Leveraging Smart Sensor Technology for Sustainable Manufacturing Processes." In Lecture Notes in Mechanical Engineering. Springer Nature Switzerland, 2025. https://doi.org/10.1007/978-3-031-77429-4_7.

Full text
Abstract:
The punching industry (PI) remains largely dependent on unsustainable resource use in its manufacturing process, consuming finite materials. Evolving market needs for increased environmental sustainability in the PI cause punching companies to reevaluate their existing manufacturing processes. Considering this trend, this research explores the prerequisites for transitioning to circular business models (CBMs), a subset of sustainable business models, by using smart sensor technology (SST) to support a sustainable manufacturing process (SMP). Prior research has shown that the use of different sensors on punching tools and machines is prevalent. Standard parts in the punching tool itself address the challenge of an imprecise set-up of the punching tool on the punching machine, resulting in process improvements and efficiency enhancements. Yet no comparable data-driven punching tool concepts have been implemented, highlighting the novelty of this work. To address this gap, this study examines how data can be gathered and successfully utilized to support SMPs fostering CBMs in the PI, based on data-driven punching tools. Mixed methods are applied in a practice-oriented project involving a Swiss-based manufacturer. The results demonstrate several key drivers supporting SMPs enabled by data-driven punching tools. Future research should involve a larger number of interviews and further field testing to mature SST-based data-driven punching tools.
APA, Harvard, Vancouver, ISO, and other styles
7

Ryckelynck, David, Fabien Casenave, and Nissrine Akkari. "Structured Data and Knowledge in Model-Based Engineering." In Manifold Learning. Springer Nature Switzerland, 2024. http://dx.doi.org/10.1007/978-3-031-52764-7_1.

Full text
Abstract:
Model-based engineering refers to the applied mathematical methods and tools used in industry during design processes. In this chapter we introduce how geometrical, thermal and mechanical models are used and combined in complex systems. These models are implemented in computer platforms and generate structured data that enable engineers to design future products.
APA, Harvard, Vancouver, ISO, and other styles
8

Fortin, Clément, Grant McSorley, Dominik Knoll, Alessandro Golkar, and Ralina Tsykunova. "Study of Data Structures and Tools for the Concurrent Conceptual Design of Complex Space Systems." In Product Lifecycle Management and the Industry of the Future. Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-72905-3_53.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Michaud, Marc-Antoine, and Roland Maranzana. "Cost Estimation Aided Software for Machined Parts: An Hybrid Model Based on PLM Tools and Data." In Product Lifecycle Management and the Industry of the Future. Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-72905-3_18.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Malanchuk, Oksana, Anatoliy Tryhuba, Ivan Rogovskii, Liudmyla Titova, Liudmyla Berezova, and Mykola Korobko. "Differential-symbolic approach and tools for management of medical support projects for the population of communities." In PROJECT MANAGEMENT: INDUSTRY SPECIFICS. TECHNOLOGY CENTER PC, 2024. https://doi.org/10.15587/978-617-8360-03-0.ch4.

Full text
Abstract:
The aim of the study is to propose a differential-symbolic approach to managing community medical support projects, to develop algorithms and computer models on its basis, and to use them to study how project-environment components affect the choice of the optimal project implementation scenario and risk assessment. The work uses project management methodology together with systems and differential-symbolic approaches, which underlie the developed algorithms and computer models for planning community health improvement projects and assessing their risks. To implement the proposed models, code was written in the Python programming language using libraries for solving differential equations, optimization and visualization of results: NumPy for numerical data and vectors, SciPy for numerically solving differential equations and optimizing the objective function, and Matplotlib for visualizing the results. The main stages of the proposed differential-symbolic approach to managing community medical support projects are presented. Mathematical models have been developed for the differential-symbolic planning of projects to improve community health and for the risk assessment of community medical support projects. They use differential equations to describe the dynamics of projects as a separate system, and symbolic expressions to represent individual parameters and their description. Algorithms for the differential-symbolic management of community health improvement projects and for the risk assessment of community medical support projects have been developed; their block diagrams comprise 16 and 9 interconnected steps, respectively.
Based on the proposed algorithms, computer models of differential-symbolic planning of projects for improving the health of the community population and risk assessment of projects for medical support of the community population have been developed. Based on the use of computer models for given conditions of the project environment, the results of optimizing the configuration of projects for improving the health of the community population and risk assessment of projects for medical support of the community population have been obtained. The prospect of further research is to expand the functionality of computer models, adding modules for the analysis of other component projects. For the first time, a differential-symbolic approach to managing projects for medical support of the population of communities has been proposed, which is based on methods of mathematical modeling, numerical analysis and optimization, which ensure the determination of a rational configuration of these projects and the assessment of risks for the given characteristics of the project environment. Based on the substantiated stages of the differential-symbolic approach, mathematical models, algorithms and computer models have been developed. The use of the proposed computer models makes it possible to obtain the dependence of the growth rate of the percentage of the healthy population participating in educational activities on the configuration of projects for improving the health of the population of communities, as well as to determine the optimal scenarios for the implementation of these projects in the community and the risks of projects for medical support of the population of communities. 
The proposed computer models are a tool for project managers that allows them to perform the labor-intensive calculations needed to form possible scenarios for implementing projects to improve community health, to determine the optimal scenario among them, and to assess the risks of community medical support projects.
APA, Harvard, Vancouver, ISO, and other styles
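The NumPy/SciPy workflow the chapter describes is easy to illustrate. The sketch below assumes a purely hypothetical logistic model linking a project-configuration parameter (a budget level) to the growth of health-education coverage; the rate law and all coefficients are invented for illustration, not taken from the chapter:

```python
import numpy as np
from scipy.integrate import solve_ivp

def coverage_after(budget, h0=0.05, t_end=24.0):
    """Share of the community reached by health-education activities
    after t_end months, for a given (hypothetical) budget level."""
    r = 0.1 + 0.05 * budget                      # assumed link: configuration -> growth rate
    rhs = lambda t, h: r * h[0] * (1.0 - h[0])   # logistic growth toward full coverage
    sol = solve_ivp(rhs, (0.0, t_end), [h0], rtol=1e-8)
    return float(sol.y[0, -1])

# compare three hypothetical project configurations
coverage = {b: round(coverage_after(b), 2) for b in (0, 1, 2)}
```

Sweeping the configuration parameter and comparing the resulting trajectories is the same scenario-comparison pattern the chapter's computer models perform, there with an added optimization and risk-assessment layer.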

Conference papers on the topic "Industry data tools"

1

Toqi, Shadha Al, Janardhan Rao Saithala, Monica Fernandez, and Talal Nabhani. "Data Visualization Aspects of Corrosion Management Strategy." In CONFERENCE 2023. AMPP, 2023. https://doi.org/10.5006/c2023-18906.

Full text
Abstract:
In the Fourth Industrial Revolution (IR 4.0) era, oil and gas operators are encouraged to transform the way work processes are executed to enhance safety, increase efficiency and reduce cost through digital transformation. Currently, Corrosion Barrier Management (CBM) performance is tracked manually (e.g., updating spreadsheets, extensive repetitive tasks), and there was a need for a "live" solution to continuously monitor facilities' CBM performance through dashboards. The objective of this project was therefore to develop, deploy and maintain a data visualization tool for online CBM. The vision was to make the CBM available to corrosion control, integrity and operations engineers, leadership teams and central engineering organizations, to help reduce high-risk corrosion events and to enhance corrosion mitigation strategies that keep corrosion rates within acceptable limits. CBM is challenging for existing and ageing facilities, as its different elements reside in scattered platforms that must be prepared for full integration and digitalization. Isolated systems do not consider all aspects of CBM strategies, which limits the overall understanding of CBM performance and prevents sound corrosion management decisions. The next evolution of CBM is moving it from a performance-monitoring tool to a proactive prediction tool, with the goal of zero asset integrity incidents. This paper discusses the development steps of the proposed CBM tool in detail.
APA, Harvard, Vancouver, ISO, and other styles
2

Carnes, Matt, Bruce Beighle, and Brian Yeagley. "ECDA Workflow Optimization Utilizing Software Tools." In CORROSION 2005. NACE International, 2005. https://doi.org/10.5006/c2005-05176.

Full text
Abstract:
The External Corrosion Direct Assessment (ECDA) process requires management and analysis of large amounts of data from a variety of sources. The use of commercially available software tools can greatly enhance the effectiveness and validity of the ECDA workflow process, as well as reduce resource requirements. This paper presents the following five examples of optimizing the NACE RP0502-2002 ECDA Recommended Practice1 workflow, with interpretations based on industry-developed protocols and proven decision support tools originally developed for pipeline risk assessment and resource allocation: (1) Pre-Assessment: Define ECDA Regions (3.5); (2) Indirect Inspection: Alignment and Comparison (4.3); (3) Classification (4.3.2.1); (4) Direct Examination: Prioritization (5.2); (5) Post Assessment: Assessment of ECDA Effectiveness (6.4).
APA, Harvard, Vancouver, ISO, and other styles
3

Ellor, James A., and J. Peter Ault. "Using Electronic Tools and Databases to Manage Coatings Used for Corrosion Control." In CORROSION 2008. NACE International, 2008. https://doi.org/10.5006/c2008-08197.

Full text
Abstract:
This paper explores the use of electronic tools and databases to facilitate the efficient use of protective coatings for corrosion control. Electronic tools and databases are available that may impact quality assurance, condition assessment, maintenance planning, and expert systems. For various reasons, few of the tools developed gain widespread acceptance or use, and collecting and managing data for corrosion control coatings remains a paper-intensive activity for most owners. The paper reviews the authors' experience in assisting the development of several different systems, and the lessons learned in implementing such systems within the Department of Defense (DoD), private industry, and the bridge industry. The intent is to help future designers developing such systems.
APA, Harvard, Vancouver, ISO, and other styles
4

Uzelac, Neb I., Michael Beller, Konrad Reber, and Otto Alfred Barbian. "New Generation of Ultrasonic In-Line Inspection Tools." In CORROSION 2004. NACE International, 2004. https://doi.org/10.5006/c2004-04162.

Full text
Abstract:
In-line inspection (ILI) of pipelines has established itself as the most efficient tool for evaluating the condition of a pipeline and an indispensable part of pipeline integrity management. This paper describes the new generation of ultrasonic ILI tools for both metal loss (corrosion) and crack detection. It shows how requirements on ILI data and the ensuing defect assessment procedures influence the development of new tools. The performance of these new tools is discussed: their detection and sizing performance, as well as their operational characteristics and improved reporting. It is demonstrated how the new ultrasonic (UT) metal loss tools provide superior results over others in fully exploiting the most advanced defect assessment algorithms. Inspection of pipelines for cracks and crack-like defects is another area where this technology offers the highest standards within the industry. Examples show how cracks and crack-like defects of different kinds can be reliably detected, classified and sized using this technology. The most recent results obtained with the described technology are shown, illustrating its performance and fields of application.
APA, Harvard, Vancouver, ISO, and other styles
5

Piwowarczyk, Zuzanna. "KNOWLEDGE SHARING IN DISTRIBUTED TEAMS - THE IMPACT OF VIRTUAL COLLABORATION TOOLS." In 24th SGEM International Multidisciplinary Scientific GeoConference 2024. STEF92 Technology, 2024. https://doi.org/10.5593/sgem2024/2.1/s07.07.

Full text
Abstract:
This study analyzes the impact of virtual collaboration tools on knowledge sharing in distributed and virtual teams, operating at the intersection of technology, human behavior, and organizational processes. It aims to unravel the complexities of knowledge sharing in modern workplaces by examining the role of these tools in facilitating knowledge exchange, identifying factors for effective practices, and exploring the impact of organizational culture and industry dynamics. Through empirical research, including a self-completed online questionnaire and secondary data sources, the study addresses key research objectives and hypotheses, seeking to maintain elevated levels of knowledge sharing as virtual collaboration increases. Building on previous research on IT management models and knowledge exchange, the study finds that virtual collaboration tools positively influence knowledge sharing, with specific tools and functions such as speed of access and file storage contributing significantly. However, challenges remain in the transparency of information structures and in integration with organizational tools. Despite limitations in sample size, the research offers valuable insights and practical strategies for organizations navigating today's dynamic work landscape, supporting best practices in user environment planning and IT tool design, and deepening the understanding of knowledge sharing in virtual teams.
APA, Harvard, Vancouver, ISO, and other styles
6

Luckow, Andre, Matthew Cook, Nathan Ashcraft, Edwin Weill, Emil Djerekarov, and Bennie Vorster. "Deep learning in the automotive industry: Applications and tools." In 2016 IEEE International Conference on Big Data (Big Data). IEEE, 2016. http://dx.doi.org/10.1109/bigdata.2016.7841045.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Yadav, Kusum, Manjusha Pandey, and Siddharth Swarup Rautaray. "Feedback analysis using big data tools." In 2016 International Conference on ICT in Business Industry & Government (ICTBIG). IEEE, 2016. http://dx.doi.org/10.1109/ictbig.2016.7892674.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Galotto, L., A. D. M. Brun, R. B. Godoy, F. R. R. Maciel, and J. O. P. Pinto. "Data based tools for sensors continuous monitoring in industry applications." In 2015 IEEE 24th International Symposium on Industrial Electronics (ISIE). IEEE, 2015. http://dx.doi.org/10.1109/isie.2015.7281536.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Kruse, Rudolf. "Probabilistic Graphical Models for Data Mining and Planning in Automotive Industry." In 19th IEEE International Conference on Tools with Artificial Intelligence(ICTAI 2007). IEEE, 2007. http://dx.doi.org/10.1109/ictai.2007.182.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Jaskula, Klaudia, Eleni Papadonikolaki, and Dimitrios Rovas. "Comparison of current common data environment tools in the construction industry." In 2023 European Conference on Computing in Construction and the 40th International CIB W78 Conference. European Council for Computing in Construction, 2023. http://dx.doi.org/10.35490/ec3.2023.315.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Reports on the topic "Industry data tools"

1

Tandon, Samarth, Ming Gao, and Ravi Krishnamurthy. PR-328-083501-R01 Evaluation of EMAT Tool Performance and Reliability by Monitoring Industry Experience. Pipeline Research Council International, Inc. (PRCI), 2017. http://dx.doi.org/10.55274/r0011442.

Full text
Abstract:
PRCI project SCC 3-7 consists of two phases: Phase I: "Evaluation of EMAT Tool Performance by Monitoring Industry Experience" (2008-2011) and Phase II: "Evaluation of the Reliability of EMAT Tool by Monitoring Industry Experience" (2012-2015). In this report, the performance of the EMAT tools, based on analysis of data from 15 pipeline segments from Phase I and 35 pipeline segments from Phase II, is presented. The performance is updated with the newly analyzed data in terms of the probability of detection, identification, false calls, and sizing. Categorization of these 35 segments is presented at three different levels of using EMAT ILI as an integrity tool alternative to hydrotesting for the management of SCC in gas pipelines. Finally, the reliability of EMAT for SCC management is evaluated, incorporating the performance measures and the assessment methodology. The reliability of the hydrotest is calculated based on historical data collected from one gas pipeline operator.
APA, Harvard, Vancouver, ISO, and other styles
2

bin Ahsan, Wahid, Md Tanvir Hasan, Danilson Placid Purification, Nilim Ahsan, Naima Haque Numa, and Mostain Billa Tusar. Challenges and Opportunities in Bangladesh’s Content Writing Industry: A Qualitative Exploration. Userhub, 2023. http://dx.doi.org/10.58947/ghkp-lxdn.

Full text
Abstract:
This research provides a qualitative, in-depth exploration of the content writing industry in Bangladesh, identifying prevalent trends, challenges, and growth potential. The study utilizes data from 44 participants, which include content writers and clients with diverse levels of industry experience, collected via online surveys and detailed interviews. Key findings suggest that while the industry is marked by a high demand for unique, engaging, and SEO-optimized content, issues pertaining to AI’s role, market saturation, and remuneration concerns persist. Despite these challenges, strategies for success emerged, such as continuous learning, effective client-writer communication, and strategic use of AI and social media tools. The study highlights the industry’s considerable potential for growth and recommends enhancing skill sets, promoting clear communication, creating unique value propositions, and encouraging supportive industry-wide policies. The study also signals future research directions, including the exploration of AI’s impact, pay practices, professional development programs, and the differential roles of social media platforms.
APA, Harvard, Vancouver, ISO, and other styles
3

Ukiwe and McDonnell. L52362 Assessing the Performance of Above Ground Coating Evaluation Surveys. Pipeline Research Council International, Inc. (PRCI), 2012. http://dx.doi.org/10.55274/r0010686.

Full text
Abstract:
The primary application area for above ground coating evaluation survey (AGCES) methods is unpiggable pipelines. For several reasons, however, AGCES methods are not necessarily limited along these narrow lines. For instance, most ILI tools are incapable of identifying external corrosion (which results from coating damage over time) until the wall loss damage has reached the measurable detection threshold of the ILI tool. Thus, even for piggable pipelines, AGCES methods still come in handy for pipeline integrity. In fact, the most proactive approach to pipeline corrosion integrity management in the future will include complementary surveys, such as ECDA and ICDA, all superimposed and integrated on a common GIS framework for comparative and comprehensive data analysis. Within the pipeline industry there exist divergent views about AGCES methods, their principles of operation, and their limitations. An important defining point about AGCES methods is simply this: they are indirect inspection methods for identifying and classifying external coating damage; they are not - and should not be - tied to the functionalities of any specific proprietary tools or hardware manufacturers. The second statement requires further qualification. Confusion is often created among users of AGCES methods when the names of certain coating survey tools are interchanged with the specific survey methods in question. For instance, company A manufactures a device C for detecting coating anomalies. The tendency within the industry is to call the coating survey method such names as "C Survey" or "C Coating Survey". The problem created in this instance is to limit AGCES methods to what device C can capture; the quality of the data becomes handicapped by that of device C. Industry standards have been created to avoid the foregoing pitfalls.
Hence, rather than limit survey methods to the specifications of a given proprietary tool, attempts should be made to design and manufacture tools to meet industry standards. The result will be uniformity of methodology, the only distinctions being speed of data acquisition, methods of data analysis, and possibly other subtle differences in technological advancement. Included in this distinction is the use of analog versus digital data acquisition methods. Of course, all that has been described so far assumes that the industry standards referenced capture all the details of an effective coating evaluation survey method, including known and even perceived limitations.
APA, Harvard, Vancouver, ISO, and other styles
4

Ersoy, Daniel. 693JK31810003 Non-Destructive Tools for Surface to Bulk Correlations of Yield Strength Toughness and Chemistry. Pipeline Research Council International, Inc. (PRCI), 2022. http://dx.doi.org/10.55274/r0012206.

Full text
Abstract:
Evaluates the use of non-destructive surface testing (micro indentation, micro-machining, in situ chemistry, and replicate microscopy analysis) as a means to perform pipe material confirmation. The results of thousands of lab and field material tests performed on actual pipeline samples have been used to develop models that account for pipe material thermo-mechanical process variations and through-wall variability of material, mechanical, and chemical properties. A separate "training set" of twenty pipelines was made available to GTI, Element Resources, and ASU to allow initial model testing and prove-out prior to the seventy primary samples that were used to fully characterize pipeline properties and the correlation of surface to bulk properties, as well as to develop predictive models of bulk properties based solely on surface-obtained pipeline data. A set of seventy pipeline samples (termed the Pipe Library) that had been in service in the natural gas industry was selected for the project's testing and modeling. Great care and effort were put into selecting a set that provided adequate breadth of variety, as typically encountered by the industry in the field.
APA, Harvard, Vancouver, ISO, and other styles
5

Hermansen, Anna, and Cailean Osborne. The Economic and Workforce Impacts of Open Source AI: Insights from Industry, Academia, and Open Source Research Publications. The Linux Foundation, 2025. https://doi.org/10.70828/itvq4899.

Full text
Abstract:
In a literature review commissioned by Meta, LF Research found that open source AI (OSAI) is widely adopted, cost effective, highly performing, and leads to faster and higher-quality development of tools and models. The study included a comprehensive analysis of academic and industry literature as well as empirical data from previous LF Research surveys to determine existing evidence of the economic and workforce impacts of OSAI. The study assessed these impacts in four areas:
- Adoption rates: a significant majority (89%) of organizations are using some form of open source in their AI stack, and almost two-thirds (63%) of companies are using an open model
- Economic benefits: OSAI is considered a cost-effective choice as compared to proprietary solutions while increasing productivity and accelerating collaborative innovation
- Workforce impacts: AI has nuanced impacts on the workforce and is poised to be more of a complement for jobs than a tool to replace jobs
- Sector-specific insights: AI has unique impacts on healthcare, agriculture, construction, manufacturing, and energy
APA, Harvard, Vancouver, ISO, and other styles
6

Tidd, Alexander N., Richard A. Ayers, Grant P. Course, and Guy R. Pasco. Scottish Inshore Fisheries Integrated Data System (SIFIDS): work package 6 final report development of a pilot relational data resource for the collation and interpretation of inshore fisheries data. Edited by Mark James and Hannah Ladd-Jones. Marine Alliance for Science and Technology for Scotland (MASTS), 2019. http://dx.doi.org/10.15664/10023.23452.

Full text
Abstract:
[Extract from Executive Summary] The competition for space from competing sectors in the coastal waters of Scotland has never been greater, and thus there is a growing need for interactive seascape planning tools that encompass all marine activities. Similarly, the need to gather data to inform decision makers, especially in the fishing industry, has become essential to provide advice on the economic impact on fishing fleets, both in terms of alternative conservation measures (e.g. effort limitations, temporal and spatial closures) and the overlap with other activities, thereby allowing stakeholders to derive a preferred option. The SIFIDS project was conceived to allow the different relevant data sources to be identified and to allow these data to be collated in one place, rather than as isolated data sets with multiple data owners. The online interactive tool developed as part of the project (Work Package 6) brought together relevant data sets and developed data storage facilities and a user interface to allow various types of user to view and interrogate the data. Some of these data sets were obtained as static layers which could sit as background data, e.g. substrate type and UK fishing limits, whilst other data came directly from electronic monitoring systems developed as part of the SIFIDS project. The main non-static data source was Work Package 2, which was collecting data from a sample of volunteer inshore fishing vessels (<12m). This included data on location; time; vessel speed; count, time, and position of deployment of strings of creels (also known as fleets and pots, respectively); and a count of how many creels were hauled on these strings. The interactive online tool allowed all the above data to be collated in a specially designed database and displayed in near real time on the web-based application.
APA, Harvard, Vancouver, ISO, and other styles
7

Janda and Scott. PR-314-103702-R01 Evaluate and Define CPCM Capabilities and Limitations. Pipeline Research Council International, Inc. (PRCI), 2011. http://dx.doi.org/10.55274/r0010822.

Full text
Abstract:
Recent technology advances have led to the development of ILI tools that monitor and record CP current through electric potential longitudinal gradient measurements at the inner pipe wall. This technique represents a potential improvement to the existing standard electrical survey methods by providing more accurate data and a safer method of acquiring it. The capabilities and limitations of this developing ILI technology have not been fully defined for application in the industry, owing to pipeline companies' limited experience running the tool. Through collective participation by PRCI member companies that have experience with the technology, a comprehensive analysis could be performed to clarify its capabilities and limitations so that the technology can be appropriately applied.
APA, Harvard, Vancouver, ISO, and other styles
8

Cazenave, Pablo. PR-328-153721-R01 Development of an Industry Test Facility and Qualification Process for ILI Technology. Pipeline Research Council International, Inc. (PRCI), 2016. http://dx.doi.org/10.55274/r0011020.

Full text
Abstract:
The project "Development of an Industry Test Facility and Qualification Processes for in-line inspection (ILI) technology Evaluation and Enhancements" aims to expand knowledge of ILI technology performance and identify gaps where new technology is needed. Additionally, this project aims to provide a continuing resource for ILI technology developers, researchers and pipeline operators to have access to test samples with a range of pipeline integrity threats and vintages and in-line technology test facilities at the Pipeline Research Council International, Inc. (PRCI) Technology Development and Deployment Center (TDC), a PRCI managed facility available for future industry and PHMSA research projects. An ILI pull test facility was designed and constructed as part of this project based on industry state of the art and opportunities for capability improvement. The major ILI technology providers, together with pipeline operator team members, reviewed the TDC sample inventory and designed a series of ILI performance tests illustrating one of multiple possible research objectives, culminating in 16 inch and 24 inch nominal diameter test strings. The ILI technology providers proposed appropriate inspection tools based on limited knowledge of the integrity conditions in the test strings, a series of pull tests of the provided ILI tools were performed, and the technology providers delivered reports of integrity anomaly location and physical dimensions for performance evaluation. PRCI engaged Blade Energy Partners, Ltd. (Blade) to conduct the evaluation of the ILI data obtained from repeated testing on the 16 and 24 inch pipeline strings at the TDC. Blade was also requested by the PRCI Project Team to incorporate prior work concerning the development of the PRCI ILI test facility to serve as a final report for the PRCI project. The resulting data was analyzed, aligned, compared to truth data, and evaluated by Blade, with the findings presented in this report.
Quantitative measures of detection and sizing performance were disclosed in confidence to the individual ILI technology providers. For instances where ILI predictions were outside of claimed performance, the vendors were given a limited sample of actual defect data to enable re-analysis, thus demonstrating the potential for improved integrity assessment with validation measurements. This report has a related webinar.
APA, Harvard, Vancouver, ISO, and other styles
9

Ruby, Jeffrey, Robert Fischer, Richard Massaro, et al. Optimization strategies for geospatial data on end-user devices. Engineer Research and Development Center (U.S.), 2024. http://dx.doi.org/10.21079/11681/49359.

Full text
Abstract:
The ability to quickly disseminate geospatial data across all echelons, particularly those at the tactical edge, is critical to meeting threats described by the Multi-Domain Operations doctrine. The US Army Engineer Research and Development Center, Geospatial Research Laboratory (ERDC-GRL), is researching the optimization of the formats, data models, file sizes, and quality of geospatial products to be exploited by end-user devices (EUDs). This report describes a processing methodology comprising custom software and open-source tools to optimize Army Geospatial Enterprise Standard Sharable Geospatial Foundation and industry-accepted products for exploitation on EUDs. The Integrated Visual Augmentation System (IVAS) was emphasized, but other devices, including the Nett Warrior and Program Executive Office—Soldier targeting systems, were also studied. Additionally, we developed a compression methodology that reduced the size of three-dimensional model data by a factor of 9 without a loss in data quality. A summary of the results describes steps to address remaining technical issues and considers future efforts to further optimize geospatial data for additional EUDs and tactical applications.
APA, Harvard, Vancouver, ISO, and other styles
10

Ruby, Jeffrey, Robert Fischer, Richard Massaro, et al. Optimization strategies for geospatial data on end-user devices. Engineer Research and Development Center (U.S.), 2024. http://dx.doi.org/10.21079/11681/49366.

Full text
Abstract:
The ability to quickly disseminate geospatial data across all echelons, particularly those at the tactical edge, is critical to meeting threats described by the Multi-Domain Operations doctrine. The US Army Engineer Research and Development Center, Geospatial Research Laboratory (ERDC-GRL), is researching the optimization of the formats, data models, file sizes, and quality of geospatial products to be exploited by end-user devices (EUDs). This report describes a processing methodology comprising custom software and open-source tools to optimize Army Geospatial Enterprise Standard Sharable Geospatial Foundation and industry-accepted products for exploitation on EUDs. The Integrated Visual Augmentation System (IVAS) was emphasized, but other devices, including the Nett Warrior and Program Executive Office—Soldier targeting systems, were also studied. Additionally, we developed a compression methodology that reduced the size of three-dimensional model data by a factor of 9 without a loss in data quality. A summary of the results describes steps to address remaining technical issues and considers future efforts to further optimize geospatial data for additional EUDs and tactical applications.
APA, Harvard, Vancouver, ISO, and other styles