Academic literature on the topic 'Windows-based statistics software package'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Windows-based statistics software package.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Windows-based statistics software package"

1. Acutis, Marco, Patrizia Trevisiol, Roberto Confalonieri, et al. "Analytical Method Performance Evaluation (AMPE): A Software Tool for Analytical Method Validation." Journal of AOAC International 90, no. 5 (2007): 1432–38. http://dx.doi.org/10.1093/jaoac/90.5.1432.

Abstract:
A Windows-based software tool [Analytical Method Performance Evaluation (AMPE)] was developed to support the validation of analytical methods. The software implements standard statistical approaches commonly adopted in validation studies to estimate analytical method performance (limits of detection and quantitation, accuracy, specificity, working range, and linearity of responses) according to ISO 5725. In addition, AMPE proposes innovative and unique approaches for the assessment of analytical method performance: difference-based indexes to quantify the agreement between measurements and reference values, pattern indexes to quantify method bias with respect to specific external variables, and fuzzy logic to aggregate into synthetic indicators the information collected independently via the different performance statistics traditionally estimated in validation studies. Aggregated measures are particularly useful for method comparison, when more than one method is available for a specific analysis and it may be of interest to identify the best-performing one while taking into account, simultaneously, the information available from different performance statistics. Illustrative examples of the type of outputs expected from AMPE-based validation sessions are given. The extensive data-handling capabilities and the wide range of statistics supplied in the software package make AMPE suitable for specific needs that may arise in different validation studies. The installation package, complete with a fully documented help file, is distributed free of charge to interested users, along with input files exemplary of the type of entry data required to run validation data analyses.
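
The difference-based agreement indexes described in the AMPE abstract can be illustrated with two standard statistics: root mean square error and mean bias error between measurements and reference values. This is a generic sketch of such indexes, not AMPE's actual implementation, and the example values are invented:

```python
import math

def agreement_indexes(measured, reference):
    """Difference-based agreement statistics between measured and
    reference values: root mean square error and mean bias error."""
    diffs = [m - r for m, r in zip(measured, reference)]
    n = len(diffs)
    rmse = math.sqrt(sum(d * d for d in diffs) / n)
    mbe = sum(diffs) / n  # positive = method overestimates on average
    return rmse, mbe

# Example: a method that reads slightly high against a 10.0 reference.
rmse, mbe = agreement_indexes([10.2, 9.9, 10.4], [10.0, 10.0, 10.0])
```

A positive mean bias error flags systematic overestimation, while RMSE captures overall disagreement regardless of direction.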

2. Desai, Trunil S., and Shireesh Srivastava. "FluxPyt: a Python-based free and open-source software for 13C-metabolic flux analyses." PeerJ 6 (April 27, 2018): e4716. http://dx.doi.org/10.7717/peerj.4716.

Abstract:
13C-metabolic flux analysis (MFA) is a powerful approach to estimating intracellular reaction rates that can be used in strain analysis and design. Processing and analysis of labeling data for the calculation of fluxes and associated statistics is an essential part of MFA. However, much of the software currently available for data analysis employs proprietary platforms and thus limits accessibility. We developed FluxPyt, a Python-based, truly open-source software package for conducting stationary 13C-MFA data analysis. The software is based on the efficient elementary metabolite unit framework. The standard deviations in the calculated fluxes are estimated using Monte Carlo analysis. FluxPyt also automatically creates flux maps based on a template for visualization of the MFA results. The flux distributions calculated by FluxPyt for two separate models, a small tricarboxylic acid cycle model and a larger Corynebacterium glutamicum model, were found to be in good agreement with those calculated by previously published software. FluxPyt was tested in Microsoft™ Windows 7 and 10, as well as in Linux Mint 18.2. The availability of free and open 13C-MFA software that works in various operating systems will enable more researchers to perform 13C-MFA and to further modify and develop the package.
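
The Monte Carlo approach to standard deviations that the FluxPyt abstract mentions (refit the model many times under resampled measurement noise, then take the spread of the refitted parameters) can be sketched on a toy one-parameter model. This is a generic illustration, not FluxPyt code; the linear model, data, and noise level are invented:

```python
import random
import statistics

def fit_slope(xs, ys):
    """Least-squares slope through the origin: minimizes sum (y - b*x)^2."""
    return sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)

def monte_carlo_sd(xs, ys, noise_sd, trials=2000, seed=42):
    """Refit after perturbing each y with Gaussian noise; the spread of
    the refitted slopes estimates the standard deviation of the fit."""
    rng = random.Random(seed)
    slopes = [
        fit_slope(xs, [y + rng.gauss(0.0, noise_sd) for y in ys])
        for _ in range(trials)
    ]
    return statistics.stdev(slopes)

xs = [1.0, 2.0, 3.0, 4.0]   # invented measurements
ys = [2.1, 3.9, 6.2, 7.8]
slope = fit_slope(xs, ys)
slope_sd = monte_carlo_sd(xs, ys, noise_sd=0.1)
```

In a real 13C-MFA setting the "fit" is a full flux-balance optimization rather than a one-line slope, but the resampling logic is the same.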

3. Rogers, Richard L. "A Microcomputer-based Statistics Course with Individualized Assignments." Teaching of Psychology 14, no. 2 (1987): 109–11. http://dx.doi.org/10.1207/s15328023top1402_13.

Abstract:
This article describes my use of a microcomputer software/textbook package (Elzey, 1985), including computer-generated individualized assignments, in an introductory statistics course. Advantages and limitations of these materials are summarized and students' reactions to them are mentioned.

4. Su, Shian, Vincent J. Carey, Lori Shepherd, Matthew Ritchie, Martin T. Morgan, and Sean Davis. "BiocPkgTools: Toolkit for mining the Bioconductor package ecosystem." F1000Research 8 (May 29, 2019): 752. http://dx.doi.org/10.12688/f1000research.19410.1.

Abstract:
Motivation: The Bioconductor project, a large collection of open source software for the comprehension of large-scale biological data, continues to grow with new packages added each week, motivating the development of software tools focused on exposing package metadata to developers and users. The resulting BiocPkgTools package facilitates access to extensive metadata in computable form covering the Bioconductor package ecosystem, facilitating downstream applications such as custom reporting, data and text mining of Bioconductor package text descriptions, graph analytics over package dependencies, and custom search approaches. Results: The BiocPkgTools package has been incorporated into the Bioconductor project, installs using standard procedures, and runs on any system supporting R. It provides functions to load detailed package metadata, longitudinal package download statistics, package dependencies, and Bioconductor build reports, all in "tidy data" form. BiocPkgTools can convert from tidy data structures to graph structures, enabling graph-based analytics and visualization. An end-user-friendly graphical package explorer aids in task-centric package discovery. Full documentation and example use cases are included. Availability: The BiocPkgTools software and complete documentation are available from Bioconductor (https://bioconductor.org/packages/BiocPkgTools).

5. Lemaire, E. "A CAD analysis programme for prosthetics and orthotics." Prosthetics and Orthotics International 18, no. 2 (1994): 112–17. http://dx.doi.org/10.3109/03093649409164393.

Abstract:
A CAD (computer aided design) analysis software package (CADVIEW) was designed for use with prosthetic and orthotic CAD CAM (computer aided design/computer aided manufacture) systems. Using the Microsoft Windows 3.1 environment, CADVIEW provides a series of anatomical shape viewing and analysis tools. These tools include simultaneous display of multiple sockets and multiple views, two dimensional (2D) and three dimensional (3D) measurement, shape statistics, multi-shape alignment, cross-sectional comparison, colour coded 3D comparison, resolution enhancement, and image copying capabilities. This programme should be of benefit to clinicians and researchers who wish to assess and/or compare CAD data generated by MS-DOS based CAD CAM systems.

6. Silva, A. Pedro Duarte, and Antonie Stam. "Nonparametric Two-Group Classification: Concepts and a SAS-Based Software Package." American Statistician 52, no. 2 (1998): 185. http://dx.doi.org/10.2307/2685479.

7. Lun, Aaron T. L., and Gordon K. Smyth. "csaw: a Bioconductor package for differential binding analysis of ChIP-seq data using sliding windows." Nucleic Acids Research 44, no. 5 (2015): e45. http://dx.doi.org/10.1093/nar/gkv1191.

Abstract:
Chromatin immunoprecipitation with massively parallel sequencing (ChIP-seq) is widely used to identify binding sites for a target protein in the genome. An important scientific application is to identify changes in protein binding between different treatment conditions, i.e., to detect differential binding. This can reveal potential mechanisms through which changes in binding may contribute to the treatment effect. The csaw package provides a framework for the de novo detection of differentially bound genomic regions. It uses a window-based strategy to summarize read counts across the genome. It exploits existing statistical software to test for significant differences in each window. Finally, it clusters windows into regions for output and properly controls the false discovery rate over all detected regions. The csaw package can handle arbitrarily complex experimental designs involving biological replicates. It can be applied to both transcription factor and histone mark datasets and, more generally, to any type of sequencing data measuring genomic coverage. csaw performs favorably against existing methods for de novo differential binding analyses on both simulated and real data. csaw is implemented as an R software package and is freely available from the open-source Bioconductor project.
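
The window-based counting strategy the csaw abstract describes can be shown in miniature: tile fixed-width windows along the genome and count the reads whose positions fall inside each window. This is a toy illustration of the general idea only (csaw itself is an R package with far more machinery; the read positions and window sizes below are invented):

```python
def window_counts(read_positions, genome_length, width, spacing):
    """Count reads in fixed-width windows tiled every `spacing` bases."""
    starts = list(range(0, genome_length - width + 1, spacing))
    counts = [
        sum(1 for p in read_positions if s <= p < s + width)
        for s in starts
    ]
    return starts, counts

# Invented read positions on a 100-base "genome".
reads = [5, 12, 13, 14, 40, 41, 95]
starts, counts = window_counts(reads, genome_length=100, width=20, spacing=10)
```

Because the spacing is smaller than the width, the windows overlap, so a binding event near a window boundary is still captured whole by a neighboring window.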

8. Mowinckel, Athanasia M., and Didac Vidal-Piñeiro. "Visualization of Brain Statistics With R Packages ggseg and ggseg3d." Advances in Methods and Practices in Psychological Science 3, no. 4 (2020): 466–83. http://dx.doi.org/10.1177/2515245920928009.

Abstract:
There is an increased emphasis on visualizing neuroimaging results in more intuitive ways. Common statistical tools for dissemination of these results, such as bar charts, lack the spatial dimension that is inherent in neuroimaging data. Here we present two packages for the statistical software R that integrate this spatial component. The ggseg and ggseg3d packages visualize predefined brain segmentations as 2D polygons and 3D meshes, respectively. Both packages are integrated with other well-established R packages, which allows great flexibility. In this Tutorial, we describe the main data and functions in the ggseg and ggseg3d packages for visualization of brain atlases. The highlighted functions are able to display brain-segmentation plots in R. Further, the accompanying ggsegExtra package includes a wider collection of atlases and is intended for community-based efforts to develop additional compatible atlases for ggseg and ggseg3d. Overall, the ggseg packages facilitate parcellation-based visualizations in R, improve and facilitate the dissemination of results, and increase the efficiency of workflows.

9. Rainer, Johannes, Laurent Gatto, and Christian X. Weichenberger. "ensembldb: an R package to create and use Ensembl-based annotation resources." Bioinformatics 35, no. 17 (2019): 3151–53. http://dx.doi.org/10.1093/bioinformatics/btz031.

Abstract:
Summary: Bioinformatics research frequently involves handling gene-centric data such as exons, transcripts, proteins, and their positions relative to a reference coordinate system. The ensembldb Bioconductor package retrieves and stores Ensembl-based genetic annotations and positional information, and furthermore offers identifier conversion and coordinate mapping for gene-associated data. In support of reproducible research, data are tied to Ensembl releases and are kept separately from the software. Premade data packages are available for a variety of genomes and Ensembl releases. Three examples demonstrate typical use cases of this software. Availability and implementation: ensembldb is part of Bioconductor (https://bioconductor.org/packages/ensembldb). Supplementary information: Supplementary data are available at Bioinformatics online.

10. Vida, A., B. L. Bodrogi, B. Balogh, and P. Bai. "Taxamat: Automated biodiversity data management tool – Implications for microbiome studies." Physiology International 107, no. 1 (2020): 12–17. http://dx.doi.org/10.1556/2060.2020.00004.

Abstract:
Working with biodiversity data is a computationally intensive process. Numerous applications and services provide options for dealing with sequencing and taxonomy data, and professional statistics software is also available to analyze these types of data. However, in between the two processes there is a substantial need to curate biodiversity sample files. Curation involves creating summed abundance values for chosen taxonomy ranks, excluding certain taxa from analysis, and finally merging and downsampling data files. Very few tools, if any, offer a solution to this problem; thus we present Taxamat, a simple data management application that allows for curation of biodiversity data files before they are imported into other statistics software. Taxamat is a downloadable application for automated curation of biodiversity data featuring taxonomic classification, taxon filtering, sample merging, and downsampling. Input and output files are compatible with most widely used programs. Taxamat is available on the web at http://www.taxamat.com either as a single executable or as an installable package for Microsoft Windows platforms.
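
The first curation step named in the Taxamat abstract, summing abundance values at a chosen taxonomy rank while excluding unwanted taxa, reduces to a small grouping operation. A generic sketch follows; the record layout and taxa are invented for illustration and are unrelated to Taxamat's actual file formats:

```python
def sum_by_rank(records, rank, exclude=()):
    """Sum abundances per taxon at the chosen rank, skipping excluded taxa."""
    totals = {}
    for lineage, abundance in records:
        name = lineage[rank]
        if name in exclude:
            continue
        totals[name] = totals.get(name, 0) + abundance
    return totals

# Toy records: (lineage tuple, abundance); rank 0 = phylum, rank 1 = genus.
records = [
    (("Firmicutes", "Lactobacillus"), 120),
    (("Firmicutes", "Bacillus"), 30),
    (("Proteobacteria", "Escherichia"), 50),
]
phyla = sum_by_rank(records, rank=0)
genera = sum_by_rank(records, rank=1, exclude=("Bacillus",))
```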

Dissertations / Theses on the topic "Windows-based statistics software package"

1. Amodeo, Joseph Leon. "A Windows®-based conceptual design and analysis package for next generation passenger aircraft." Thesis, Imperial College London, 1998. http://hdl.handle.net/10044/1/8347.

2. Yu, Kuan Tao. "Development of a PC software package using Windows 95 and Visual C++ to evaluate traffic safety improvements based upon accidents per unit time." Thesis, Ohio University / OhioLINK, 1996. http://rave.ohiolink.edu/etdc/view?acc_num=ohiou1177617333.

Books on the topic "Windows-based statistics software package"

1. Wakefield, Dorothy. The Minitab manual: Elementary statistics: picturing the world, second ed. 4th ed. Prentice Hall, 2009.

2. Patsias, Sophoclis. A Windows-based software package for machine monitoring. University of Manchester, 1995.

3. Corston, Rod. A crash course in SPSS for Windows: Updated for versions 10 and 11. 2nd ed. Blackwell Pub., 2003.

4. Láruson, Áki Jarl, and Floyd Allan Reed. Population Genetics with R. Oxford University Press, 2021. http://dx.doi.org/10.1093/oso/9780198829539.001.0001.

Abstract:
Population genetics is an inherently quantitative discipline. Because the focus of population genetics studies is usually on abstract concepts like the frequencies of genetic variants over time, it can at first glance be difficult to conceptualize and appropriately visualize. As more and more quantitative models and methods have become established in the discipline, it has become necessary for people just entering the field to quickly develop a good understanding of the many layers of complex approaches, so as to correctly interpret even basic results. An unfortunate side effect of the widespread implementation of ready-to-use quantitative software packages is that some facets of analysis can become rote, which at best might lead to implementation without the full understanding of the user and at worst, inappropriate application leading to misguided conclusions. In this book a “learning by doing” approach is employed to encourage readers to begin developing an intuitive understanding of population genetics concepts. The analytical software R, which has increasingly been the program of choice for early exposure to basic statistical programming, is freely available online, has cross-platform compatibility (Windows, Mac, and Linux all support distributions of R), and offers the potential for hands-on implementation by the students, in addition to using pre-packaged functions.

5. Ragas: A Windows-based software package for reviewed analysis of Gauribidanur array seismograms. Bhabha Atomic Research Centre, 2006.

6. A Crash Course in SPSS for Windows. Blackwell Publishers, 2000.

7. Corston, Rod, and Andrew Colman. Crash Course in SPSS for Windows: Versions 10 and 11. 2nd ed. Blackwell Publishing, Incorporated, 2002.

Book chapters on the topic "Windows-based statistics software package"

1. Temba, Pontian L., Noah M. Pauline, and Patrick M. Ndaki. "Living and responding to climate variability and change among coffee and banana farmers in the highlands of Moshi rural district, Tanzania." In Climate change impacts and sustainability: ecosystems of Tanzania. CABI, 2020. http://dx.doi.org/10.1079/9781789242966.0009.

Abstract:
The study aimed at exploring perceived impacts of climate variability on coffee and banana farming, and community responses, in the highlands of Moshi Rural District. A socio-economic survey employing qualitative and quantitative research approaches was used. Data were collected using questionnaires, key informant interviews, focus group discussions, and field observation. A total of 96 farmers were involved in the study. The SPSS Statistics software package and Microsoft Excel were used for data processing and analysis. Findings showed that communities are knowledgeable about climate variability. Their knowledge is based on perceptions of the impacts already felt and attributed to climate variability, including unpredictable patterns of rainy seasons. Climate variability is associated with a decrease in household food supply, an unpredictable farming calendar, and the drying of water sources for irrigation and domestic use. Coffee yields showed a decreasing trend (at the rate of R² = -0.494) during the years 1990-2016. This was contrary to bananas, which indicated an increasing trend (R² = 0.036) of production during the same period. Communities were responding to impacts of climate variability in various ways, including intercropping, planting early-maturing and drought-resistant varieties, and gravity canal irrigation. Projected climate changes showed that the future was uncertain for farmers depending on rain-fed farming. Therefore, further research on viable options would help farmers adapt to current and future climatic stresses. Options may include intensified irrigation of crops and conservation farming, which have the potential to increase banana and coffee production, thereby improving productivity and food security for communities.

2. Amadou, Zakou. "Agropastoralists' Climate Change Adaptation Strategy Modeling: Software and Coding Method Accuracies for Best-Worst Scaling Data." In African Handbook of Climate Change Adaptation. Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-45106-6_129.

Abstract:
Investigating software and coding method accuracies is still a challenge when dealing with best-worst scaling data. Comparing various climate change policy estimates and their relative importance across different statistical packages has received little attention. In this chapter, we use a best-worst scaling approach to determine agropastoralist preferences for 13 climate change adaptation policies across two popular statistical packages (R and SAS). Data were collected from 271 agropastoralists, and a mixed logit model was used to analyze them. Results reveal that the mean and standard deviation estimates for the 13 climate change adaptation policies from R are higher and statistically significant compared with the SAS estimates. Based on the R estimates, prolific animal selection, vaccination, settlement, strategic mobility, and strategic destocking are the most popular climate change adaptation policies, and more than two-thirds of respondents are in favor of these policies.

3. Kalcheva, Neli, Anna Zagorska, Nikolay Dukov, and Kristina Bliznakova. "Analysis of Suitability of Five Statistical Methods Applied for the Validation of a Monte Carlo X-Ray Based Software Packages." In Advances in Intelligent Systems and Computing. Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-68321-8_46.

4. Kleanthous, Irene, and Maria Meletiou-Mavrotheris. "Early Statistical Reasoning." In K-12 STEM Education. IGI Global, 2018. http://dx.doi.org/10.4018/978-1-5225-3832-5.ch018.

Abstract:
This paper explores the potential of dynamic statistics software for supporting the early teaching and learning of statistical and probabilistic concepts integrated within the mathematics curriculum. It shares the experiences from a case study that implemented a data-driven approach to mathematics instruction using the dynamic data-visualization software InspireData©, an educational package specifically designed to meet the learning needs of students in the middle and high school grades (Grades 4-12). The authors report on how a group of fourteen (n=14) Grade 4 (about 9-year-old) students used the affordances provided by the dynamic learning environment to gather, analyze, and interpret data, and to draw data-based conclusions and inferences. Findings from the study support the view that mathematics instruction can promote the development of learners' statistical reasoning at an early age, through an informal, data-based approach. They also suggest that the use of dynamic statistics software has the potential to enhance statistics instruction by scaffolding and extending young students' stochastical and mathematical reasoning.

5. Kleanthous, Irene, and Maria Meletiou-Mavrotheris. "A Case Study of Primary School Students' Use of a Dynamic Statistics Software Package for Analyzing and Interpreting Data." In Cases on Technology Integration in Mathematics Education. IGI Global, 2015. http://dx.doi.org/10.4018/978-1-4666-6497-5.ch002.

Abstract:
This chapter explores the potential of dynamic statistics software for supporting the teaching and learning of the Common Core Standards for Mathematics. It shares the experiences from a teaching experiment that implemented a data-driven approach to mathematics instruction using the dynamic data-visualization software InspireData© (Hancock, 2006), an educational package specifically designed to meet the learning needs of students in the middle and high school grades (Grades 4-12). We report on how a group of Grade 4 (about 9-year-old) students used the affordances provided by the dynamic learning environment to gather, analyze, and interpret data, and to draw data-based conclusions and inferences. The role of the technological tool in scaffolding and extending these young students' stochastical and mathematical reasoning is discussed.

6. "Laboratory Activities in Primary School Teaching-Learning Sequences (TLS)." In Advances in Early Childhood and K-12 Education. IGI Global, 2021. http://dx.doi.org/10.4018/978-1-7998-5718-1.ch006.

Abstract:
In this chapter, some practical activities for primary school classes are shown; these concern classic themes of arithmetic and geometry, but also more recent topics in statistics and probability. Naturally, all of them are based on the pedagogical-didactic aspects noted in the preceding chapters. In particular, the MatCos 3.0 environment is used. A complete TLS, from which the new methodology based on the MatCos programming environment emerges, will be presented. Finally, the simulation software package DAF is presented to illustrate the concept of fractions and their related operations.

7. Lamere, Alicia Taylor. "Cluster Analysis in R With Big Data Applications." In Open Source Software for Statistical Analysis of Big Data. IGI Global, 2020. http://dx.doi.org/10.4018/978-1-7998-2768-9.ch004.

Abstract:
This chapter discusses several popular clustering functions and open-source software packages in R and their feasibility of use on larger datasets. These include the kmeans() function, the pvclust package, and the DBSCAN (density-based spatial clustering of applications with noise) package, which implement K-means, hierarchical, and density-based clustering, respectively. Dimension-reduction methods such as PCA (principal component analysis) and SVD (singular value decomposition), as well as the choice of distance measure, are explored as ways to improve the performance of hierarchical and model-based clustering methods on larger datasets. These methods are illustrated through an application to a dataset of RNA-sequencing expression data for cancer patients obtained from the Cancer Genome Atlas Kidney Clear Cell Carcinoma (TCGA-KIRC) data collection from The Cancer Imaging Archive (TCIA).
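
The K-means procedure behind R's kmeans() function alternates two steps: assign each point to its nearest center, then move each center to the mean of its cluster. A deterministic toy version is sketched below in Python rather than R, purely for illustration; the points and starting centers are made up, and real implementations add smarter initialization and convergence checks:

```python
def kmeans(points, centers, iters=20):
    """Plain Lloyd's algorithm on 2-D points with user-supplied
    initial centers (deterministic, illustrative only)."""
    for _ in range(iters):
        # Assignment step: each point joins its nearest center.
        clusters = [[] for _ in centers]
        for p in points:
            d = [(p[0] - c[0]) ** 2 + (p[1] - c[1]) ** 2 for c in centers]
            clusters[d.index(min(d))].append(p)
        # Update step: move each center to its cluster's mean.
        centers = [
            (sum(p[0] for p in cl) / len(cl), sum(p[1] for p in cl) / len(cl))
            if cl else c
            for cl, c in zip(clusters, centers)
        ]
    return centers

# Two obvious groups: three points near the origin, two near (5, 5).
points = [(0.0, 0.1), (0.2, 0.0), (0.1, 0.2), (5.0, 5.1), (5.2, 4.9)]
centers = kmeans(points, centers=[(0.0, 0.0), (1.0, 1.0)])
```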

8. Mavi, Ahmet, Ahmet Özmen, and Mehmet Ertuğrul. "Analyzing and Presenting Data with LabVIEW." In LabVIEW - A Flexible Environment for Modeling and Daily Laboratory Use. IntechOpen, 2021. http://dx.doi.org/10.5772/intechopen.96130.

Abstract:
LabVIEW is an abbreviation of Laboratory Virtual Instrument Engineering Workbench, and it allows scientists and engineers to develop and implement interactive programs. LabVIEW has been specially developed to take measurements, analyze data, and present the results to the user: you, rather than the manufacturer, determine what the instrument looks like. LabVIEW has a very large library of functions and subprograms (subVIs) that can assist your programming, and the hidden programming problems you may encounter in traditional programming languages are less common in LabVIEW. LabVIEW also includes diverse applications such as serial device control, data analysis, data presentation, data storage, and communication over the internet. The analysis library includes versatile and useful functions such as signal generation, signal-processing filters, windows, statistics and regressions, linear algebra, and array arithmetic. Due to its graphical nature, LabVIEW is an innate data-presentation package: you can view the data in any form you want, with charts, graphs, and user-defined graphics among the output options. As a scientist or an engineer, you frequently measure physical quantities such as temperature, pressure, time, mass, electric current, light intensity, and radioactivity, and you generally need to analyze and present the resulting data. When you have large amounts of data, you need software to analyze and present it. LabVIEW makes these tasks easy, because it includes hundreds of built-in and add-on functions that make it simple to create a user-friendly interface. In this chapter, we focus on data analysis and presentation.

9. "Formulating of recipe for fireproof coatings via DOE based on statistical software package." In Information Technology and Computer Application Engineering. CRC Press, 2013. http://dx.doi.org/10.1201/b15936-124.

10. Ekker, Knut. "Emergency Management Training and Social Network Analysis." In Artificial Intelligence Technologies and the Evolution of Web 3.0. IGI Global, 2015. http://dx.doi.org/10.4018/978-1-4666-8147-7.ch013.

Abstract:
The chapter first presents a background review of the application of computer technology in simulations of natural hazard situations. It then presents the efforts of researchers at Mid Sweden University and Nord-Trøndelag University College to build a comprehensive emergency training tool with funding from the Interreg/EU ERDF (European Regional Development Fund). The main part of the chapter reports empirical data from this project, GSS (Gaining Security Symbiosis). The project developed a tool for training emergency personnel (police, fire, ambulance, and local officials) in handling crises in the border region between Norway and Sweden. The Web-based software incorporated complex scenarios that the emergency personnel had to contend with during 3-hour training sessions. The participants included employees at the operator, tactical, and strategic levels of the organizations. The training tool recorded all communications among participants, which were primarily text-based. The rich data source was analyzed "on the fly" with software from the R Project for statistical computing and the SNA (Social Network Analysis) package. The statistical software provided detailed graphs of the social networking of communications among the participants on both sides of the national border in central Scandinavia. The chapter concludes with a presentation of ideas toward using the social networking data as input into simulation models based on system dynamics. The empirical data from the project will naturally provide data for future training sessions. A planned future version of the comprehensive training tool, netAgora, will use experiential data in a Virtual Responder component in the training sessions of emergency personnel.
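
A basic social-network statistic of the kind computed over such communication logs is node degree: how many messages each participant sent or received. A minimal sketch over an invented edge list (this shows the underlying idea only, not the R SNA package's API):

```python
from collections import Counter

def degree_centrality(edges):
    """Node degree in an undirected graph given (sender, receiver) pairs."""
    deg = Counter()
    for a, b in edges:
        deg[a] += 1
        deg[b] += 1
    return dict(deg)

# Invented communication log: who exchanged messages with whom.
messages = [("police", "fire"), ("police", "ambulance"),
            ("fire", "ambulance"), ("police", "official")]
deg = degree_centrality(messages)
```

High-degree nodes correspond to the communication hubs that such training analyses try to surface.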

Conference papers on the topic "Windows-based statistics software package"

1. Chen, H. "Design of switched reluctance generator system based on the software package for Windows." In 6th International Conference on Advances in Power System Control, Operation and Management. Proceedings. APSCOM 2003. IEE, 2003. http://dx.doi.org/10.1049/cp:20030617.

2. Slavyanova, Yana, and Dmitriy Lagerev. "Development of the scoring model for assessing the probability of expulsion of university students." In International Conference "Computing for Physics and Technology - CPT2020". ANO «Scientific and Research Center for Information in Physics and Technique», 2020. http://dx.doi.org/10.30987/conferencearticle_5fd755c0420db0.31167163.

Abstract:
The work of most information systems involves the processing of data, its accumulation during operation, and subsequent analysis. However, the analysis of such a large amount of information by a person is impossible without preliminary automatic processing. For this purpose, data mining is used, which includes descriptive and predictive modeling. Statistical classification, one of the data analysis technologies most understandable to humans, relates to predictive modeling; the task consists of dividing a set of observations into classes based on their formal descriptions. One method for solving the classification problem is logistic regression, and scoring is a common area of its application. This article discusses the application of scoring to the problem of assessing the probability of students' expulsion from the university, based on data on their attendance and academic performance. Solving this problem will allow curators of groups, programs, and other interested parties to identify a tendency toward expulsion in time, identify a risk group among students, and take early measures to prevent the event predicted by the model from becoming a fact. The scoring model is to be published as a web service for further use in the software package supporting the work of a university teacher. In this case, the model input receives aggregated characteristics obtained by the software package from accumulated data on student performance and attendance, and the output is an integrated indicator of the probability of the event in question, namely expulsion. After the scoring model is built, its quality is assessed.
3

MOVSESIAN, Diana, and Olga MYSLYUK. "ASSESSMENT OF ACID-BASE BUFFERING PROPERTIES OF SOILS OF THE CITY OF CHERKASSY." In Conference for Junior Researchers „Science – Future of Lithuania“. VGTU Technika, 2017. http://dx.doi.org/10.3846/aainz.2017.011.

Full text
Abstract:
The research showed that the buffering capacity of the soils in the acid and alkaline ranges varied within 27–98% (from low to very high) and 31–81% (from medium to very high), respectively. A specific feature of the soils' acid-base buffering capacity is the asymmetry of the buffering areas. For the first time, computer-generated models were created using the SURFER software package; they will make it possible to monitor the state of the urban soils in time and space, to estimate the degree of their degradation under growing technogenic load, and to clarify the peculiarities of the formation of the ecogeochemical situation in the city. The city map was zoned by the acid-base buffering properties of the soils based on theoretical, statistical, and visual interpretation of the cartographic data.
APA, Harvard, Vancouver, ISO, and other styles
4

Motriuk, Roman W. "Modelling of Flexible Victaulic Couplings Using Basic Finite Element Software." In 2008 7th International Pipeline Conference. ASMEDC, 2008. http://dx.doi.org/10.1115/ipc2008-64301.

Full text
Abstract:
In the past decade, Victaulic couplings have gained significant recognition as important piping elements, used mainly in water and slurry transportation systems. Grooved flexible Victaulic couplings, for example, offer economical and reliable piping connections compared to other connecting elements such as flanges. Victaulic couplings are on average three times faster to install than welded piping connections. They are more reliable and cost-effective than flanges or threaded connectors. In addition, the speed and ease of their assembly and disassembly, as well as their flexibility and ability to provide thermal gaps, make the couplings desirable as piping elements. Furthermore, the couplings give stress designers a rare opportunity to compensate cheaply and reliably for piping loads that would otherwise be exerted on the equipment attached to the piping. For these reasons, Victaulic couplings are frequently used in current piping designs. In spite of their simple design and application, they pose a significant challenge for stress designers. The stress software packages based on piping finite element theory that are commonly used in industry do not provide the means to model Victaulic couplings adequately. These packages are based on linear stress theories, and Victaulic couplings, with their gaps, are distinctly non-linear elements. The approach to modelling these elements is therefore very approximate and is usually handled with the non-linear restraints built into the software. The stiffness and friction of Victaulic coupling "restraints" are rarely known, and their values must be assumed in order to carry out the calculations; the prescribed restraint values therefore directly influence the stress results. This work discusses assumptions based on several simple stress models.
The author attempts to minimize conservatism as far as practical in the modelling of Victaulic couplings, pending testing by the manufacturers of these elements and the provision of meaningful statistical information that could then be used to carry out stress predictions. The couplings' stiffness, bending-moment, and axial-force capabilities given in this work must not be used for design purposes unless verified and accepted by the couplings' manufacturers.
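The non-linear "gap restraint" idea the abstract describes can be sketched as a bilinear force-displacement law: no force while the coupling floats inside its gap, then a linear stiffness once the gap closes. The gap and stiffness values below are illustrative assumptions, not manufacturer data.

```python
# A hedged sketch of a gap-type non-linear restraint used to approximate a
# Victaulic coupling in a linear stress package. Gap and stiffness values are
# illustrative assumptions only, not design data.
def coupling_axial_force(displacement_mm, gap_mm=3.0, stiffness_n_per_mm=5.0e4):
    """Axial restraint force (N) for a given relative displacement (mm)."""
    if abs(displacement_mm) <= gap_mm:
        return 0.0                       # coupling floats freely inside the gap
    excess = abs(displacement_mm) - gap_mm
    sign = 1.0 if displacement_mm > 0 else -1.0
    return sign * stiffness_n_per_mm * excess

print(coupling_axial_force(2.0))   # inside the gap -> 0.0
print(coupling_axial_force(5.0))   # 2 mm past the gap -> 100000.0
```

Because the assumed gap and stiffness feed directly into the computed loads, exactly as the abstract warns, any such values must be confirmed by the coupling manufacturer before use in design.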
APA, Harvard, Vancouver, ISO, and other styles
5

Idris, A., B. P. Huynh, and Z. Abdullah. "The Simulation of Natural Ventilation of Buildings With Different Location of Windows/Openings." In ASME 2015 International Mechanical Engineering Congress and Exposition. American Society of Mechanical Engineers, 2015. http://dx.doi.org/10.1115/imece2015-51168.

Full text
Abstract:
Ventilation is the process of changing the air in an enclosed space. Air should continuously be withdrawn and replaced by fresh air from a clean external source to maintain good internal air quality, which refers to the air quality within and around the building structure. In natural ventilation the air flows through cracks in the building envelope or through purposely installed openings. Natural ventilation can save a significant amount of fossil-fuel-based energy by reducing the need for mechanical ventilation and air conditioning. Numerical predictions of the air velocities and flow patterns inside the building are determined. To achieve optimum efficiency of natural ventilation, the building design should start from the climatic conditions and orography of the site to ensure the building's permeability to the outside airflow, which absorbs heat from indoors and reduces temperatures. Effective ventilation in a building affects occupant health and productivity. In this work, a computational simulation is performed on a real-sized box room with dimensions 5 m × 5 m × 5 m. Single-sided ventilation is considered, whereby the openings are located on the same wall. Two openings with a total area of 4 m2 are arranged differently, resulting in 16 configurations to be investigated. A logarithmic wind profile upwind of the building is employed. A commercial Computational Fluid Dynamics (CFD) software package, CFD-ACE of the ESI Group, is used. A Reynolds-Averaged Navier-Stokes (RANS) turbulence model and an LES turbulence model are used to predict the air flow rate and flow pattern. The governing equations for the large eddy motion were obtained by filtering the Navier-Stokes and continuity equations. The computational domain had a height of 4H, a width of 9H, and a length of 13H (H = 5 m), sufficiently large to avoid disturbance of the air flow around the building.
Overall, the lowest and highest ventilation rates were obtained with the windward opening and the leeward opening, respectively. The location and arrangement of the openings affect ventilation and the air flow pattern.
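The logarithmic wind profile mentioned as the upwind boundary condition follows the standard log law u(z) = (u*/κ)·ln(z/z0). The friction velocity and roughness length below are illustrative values, not the paper's inlet parameters.

```python
# A minimal sketch of a log-law inlet wind profile; u* and z0 are assumed
# values for open terrain, not taken from the paper.
import math

KAPPA = 0.41          # von Karman constant
U_STAR = 0.4          # friction velocity, m/s (assumed)
Z0 = 0.03             # aerodynamic roughness length, m (assumed)

def wind_speed(z_m):
    """Mean wind speed (m/s) at height z above ground, log-law profile."""
    return (U_STAR / KAPPA) * math.log(z_m / Z0)

# Inlet profile sampled up to the 4H = 20 m domain height (H = 5 m).
for z in (1.0, 5.0, 10.0, 20.0):
    print(f"z = {z:5.1f} m  u = {wind_speed(z):.2f} m/s")
```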
APA, Harvard, Vancouver, ISO, and other styles
6

Qian, Zhengfang, and Xin Wu. "Seamless Tool Integration for Solder Joint Qualifications." In ASME 2008 International Mechanical Engineering Congress and Exposition. ASMEDC, 2008. http://dx.doi.org/10.1115/imece2008-66449.

Full text
Abstract:
This paper presents an approach, and its demonstration, for seamless tool integration for the virtual qualification and reliability prediction of solder joints of surface-mounted electronic components on a populated circuit board. The starting point is the software Surface Evolver, which can realistically reproduce the complicated geometry profiles of solder joints with various lead frames and pad specifications. User routines have been developed under the ANSYS platform for automatic geometry transfer from Surface Evolver and mesh regeneration of the solder joints inside ANSYS, as demonstrated in this paper. Moreover, a number of electronic packages with their solder joints on a typical printed circuit board (PCB) were also regenerated with parametric capability. Finite element analyses (FEAs) with the proper set-up of materials, boundary conditions, and constitutive models were performed automatically at the press of a single button. The reliability of the solder joints and the failure rate of the electronic components can be predicted based on the implemented failure criteria and test data. An in-house tool was developed for the whole procedure of solder-joint qualification and reliability assessment, from deterministic to statistical aspects, including the evaluation of defect impacts.
APA, Harvard, Vancouver, ISO, and other styles
7

Aziz, Rosman. "Development of an Integrated System for Cam Design and Manufacture With Graphical User Interface." In ASME 1996 Design Engineering Technical Conferences and Computers in Engineering Conference. American Society of Mechanical Engineers, 1996. http://dx.doi.org/10.1115/96-detc/mech-1017.

Full text
Abstract:
Abstract The development of a Windows-based software package for the design and manufacture of cams is presented. The software consists of four main modules: design, display, animation, and NC program generation. From the input data, the motion curves are constructed using a building-block approach in which proper blending at the joint points is ensured. The interactive graphical user interface provides easy data entry for the user, and the output display gives excellent graphics. B-spline curve fitting and techniques for offsetting free-form curves were employed to generate the cutter path and cam profile. To evaluate the software, test samples were machined using the generated NC programs. The test results were satisfactory.
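One common "building block" in cam design, used here purely as an illustration since the abstract does not name the motion laws employed, is the cycloidal rise, whose velocity and acceleration vanish at both ends so adjacent segments blend smoothly at the joint points.

```python
# A hedged sketch of one standard cam motion-curve building block (cycloidal
# rise); the actual motion laws used by the package are not specified.
import math

def cycloidal_rise(t, lift=10.0):
    """Follower displacement for normalized cam angle t in [0, 1]."""
    return lift * (t - math.sin(2 * math.pi * t) / (2 * math.pi))

def cycloidal_velocity(t, lift=10.0):
    """d(displacement)/dt; zero at t = 0 and t = 1, enabling smooth blending."""
    return lift * (1 - math.cos(2 * math.pi * t))

print(round(cycloidal_rise(0.0), 6), round(cycloidal_rise(1.0), 6))       # 0.0 10.0
print(round(cycloidal_velocity(0.0), 6), round(cycloidal_velocity(1.0), 6))  # 0.0 0.0
```

Because the boundary velocities are exactly zero, a dwell or another segment can be appended at either end without a discontinuity, which is the blending property the abstract refers to.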
APA, Harvard, Vancouver, ISO, and other styles
8

Fong, Jeffrey T., N. Alan Heckert, James J. Filliben, and Steven R. Doctor. "Three Approaches to Quantification of NDE Uncertainty and a Detailed Exposition of the Expert Panel Approach Using the Sheffield Elicitation Framework." In ASME 2018 Pressure Vessels and Piping Conference. American Society of Mechanical Engineers, 2018. http://dx.doi.org/10.1115/pvp2018-84771.

Full text
Abstract:
The ASME Boiler & Pressure Vessel Code Section XI Committee is currently developing a new Division 2 nuclear code entitled the “Reliability and Integrity Management (RIM) program,” with which one is able to arrive at a risk-informed, NDE-based engineering maintenance decision by estimating and managing all uncertainties for the entire life cycle including design, material selection, degradation processes, operation and non-destructive examination (NDE). This paper focuses on the uncertainty of the NDE methods employed for preservice and inservice inspections due to a large number of factors such as the NDE equipment type and age, the operator’s level and years of experience, the angle of probe, the flaw type, etc. In this paper, we describe three approaches with which uncertainty in NDE-risk-informed decision making can be quantified: (1) A regression model approach in analyzing round-robin experimental data such as the 1981–82 Piping Inspection Round Robin (PIRR), the 1986 Mini-Round Robin (MRR) on intergranular stress corrosion cracking (IGSCC) detection and sizing, and the 1989–90 international Programme for the Inspection of Steel Components III-Austenitic Steel Testing (PISC-AST). (2) A statistical design of experiments approach. (3) An expert knowledge elicitation approach. Based on a 2003 Pacific Northwest National Laboratory (PNNL) report by Heasler and Doctor (NUREG/CR-6795), we observe that the first approach utilized round robin studies that gave NDE uncertainty information on the state of the art of the NDE technology employed from the early 1980s to the early 1990s. This approach is very time-consuming and expensive to implement.
The second approach is based on a design-of-experiments (DEX) of eight field inspection exercises for finding the length of a subsurface crack in a pressure vessel head using ultrasonic testing (UT), where five factors (operator’s service experience, UT machine age, cable length, probe angle, and plastic shim thickness) were chosen to quantify the sizing uncertainty of the UT method. The DEX approach is also time-consuming and costly, but has the advantage that it can be tailored to a specific defect-detection and defect-sizing problem. The third approach, using an expert panel, is the most efficient and least costly. Using the crack length results of the second approach, we introduce in this paper how the expert panel approach can be implemented with the application of a software package named the Sheffield Elicitation Framework (SHELF). The crack-length estimates, with their uncertainties, from the three approaches are compared and discussed. The significance and limitations of the three uncertainty quantification approaches for risk assessment of NDE-based engineering decisions are presented and discussed.
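Eight runs covering five two-level factors is the footprint of a 2^(5-2) fractional factorial. The sketch below generates one such design with the conventional generators D = AB and E = AC; the actual design used in the inspection exercises is not specified in the abstract, so this layout is an assumption.

```python
# A hedged sketch of a 2^(5-2) fractional factorial: eight runs for five
# two-level factors. Generators D = AB, E = AC are a common convention; the
# paper's actual design may differ.
from itertools import product

factors = ["experience", "machine_age", "cable_length", "probe_angle", "shim"]
runs = []
for a, b, c in product((-1, 1), repeat=3):   # full factorial in A, B, C
    d = a * b                                 # generator D = AB
    e = a * c                                 # generator E = AC
    runs.append(dict(zip(factors, (a, b, c, d, e))))

for run in runs:
    print(run)
print(len(runs))   # eight runs, each factor balanced at both levels
```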
APA, Harvard, Vancouver, ISO, and other styles
9

Bott, Terry F., Stephen W. Eisenhawer, Jonathan Kingson, and Brian P. Key. "A New Graphical Tool for Building Logic-Gate Trees." In ASME 2003 Pressure Vessels and Piping Conference. ASMEDC, 2003. http://dx.doi.org/10.1115/pvp2003-1915.

Full text
Abstract:
Tree structures that use logic gates to model system behavior have proven very useful in safety and reliability studies. In particular, process trees are the basic structure used in a decision analysis methodology developed at Los Alamos called Logic Evolved Decision modeling (LED). LED TOOLS is the initial attempt to provide LED-based decision analysis tools in a state-of-the-art software package. The initial release of the software, Version 2.0, addresses the first step in LED: determination of the possibilities. LED TOOLS is an object-oriented application written in Visual Basic for Windows NT-based operating systems. It provides an innovative graphical user interface designed to emphasize the visual characteristics of logic trees and to make their development efficient and accessible to the subject-matter experts who possess the detailed knowledge incorporated in the process trees. This eliminates the need for the current interface between subject-matter experts and logic-modeling experts. This paper provides an introduction to LED TOOLS. We begin with a description of the programming environment. The construction of a process tree is described, and the simplicity and efficiency of the approach incorporated in the software are discussed. We consider the nature of the logical equations that the tree represents and show how solving the equations yields natural-language “paths.” Finally, we discuss planned improvements to the software.
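The way a logic-gate tree is "solved" into natural-language paths can be sketched in a few lines: OR gates branch the possibilities, while AND gates combine their children's paths via a cartesian product. This is a generic illustration of the idea, not LED TOOLS itself, and the example events are invented.

```python
# A minimal sketch (not LED TOOLS) of expanding an AND/OR logic tree into
# enumerated "paths" of basic events.
from itertools import product

def paths(node):
    """Return a list of paths; each path is a tuple of basic-event names."""
    if isinstance(node, str):                      # basic event (leaf)
        return [(node,)]
    gate, children = node
    child_paths = [paths(c) for c in children]
    if gate == "OR":                               # any one child suffices
        return [p for ps in child_paths for p in ps]
    if gate == "AND":                              # all children required
        return [sum(combo, ()) for combo in product(*child_paths)]
    raise ValueError(f"unknown gate {gate!r}")

# Hypothetical process tree for illustration.
tree = ("AND", ["access gained",
                ("OR", ["alarm disabled", "alarm ignored"])])
for p in paths(tree):
    print(" AND ".join(p))
```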
APA, Harvard, Vancouver, ISO, and other styles
10

Tolchard, A. C., M. R. Looman, and J. A. Mason. "A Very High Efficiency Neutron Counter for the Measurement of Plutonium in Decommissioning Wastes." In ASME 2003 9th International Conference on Radioactive Waste Management and Environmental Remediation. ASMEDC, 2003. http://dx.doi.org/10.1115/icem2003-4659.

Full text
Abstract:
The design of the ANTECH Model 2203 Very High Efficiency Neutron Counter (V-HENC) is a natural progression from the well-proven ANTECH Series 2200 Passive Neutron Drum Monitor used for measuring plutonium in intermediate-level and low-level waste (LLW) 200-litre drums. ANTECH has considerable experience in the implementation of the base design, originally licensed to ANTECH from the Joint Research Centre at Ispra, Italy. Three Series 2200 systems have been supplied by ANTECH: two are in operation at AWE Aldermaston for waste monitoring, and a third is installed at the SMP facility at BNFL Sellafield. Some 15 years' cumulative operating experience has been gained, and ANTECH provides technical support as part of continuing maintenance and support arrangements. The design has proven to be inherently reliable, safe, and easy to operate and maintain. Recently, both AWE instruments have been fully characterised and calibrated to function in conventional coincidence counting mode. ANTECH has used MCNP to optimise the design of the fast detector packages in order to achieve the lower detection levels required to measure Pu at USA TRU/LLW and UK Nirex LLW levels. With the aid of simulations, a typical detection efficiency of 36% with the Cd filters deployed, and up to 45% with the internal Cd liner removed, has been achieved. Statistical data filtering is used to decrease the cosmic-ray-induced neutron background, which is also minimised by the absence of steelwork within the drum measurement chamber. Outer shielding of 270 mm of polyethylene shields the system from external neutrons. A total of 16 fast detector modules are used, which consist of eight 7.5-atmosphere 3He tubes (25.4 mm diameter and 1033 mm active length) embedded in high-density polyethylene and arranged in a double row. The tubes are connected using HN connectors to a junction box at the top of the vertical modules or on the end of the horizontal modules.
The junction boxes are hermetically sealed and contain the high-voltage distribution and AMPTEK model A-111 charge-sensitive amplifiers. The operation of the V-HENC is based on passive neutron counting of the correlated neutrons from spontaneous fission of the even Pu nuclides, principally 240Pu, and is coupled with an ANTECH/Ortec Advanced Multiplicity Shift Register employing the Los Alamos INCC code. Alternatively, the multiple-gate ANTECH Time Correlation Analyser (TCA) may be used for enhanced data acquisition for multiplicity counting. The V-HENC can be operated in conventional shift-register coincidence counting (Reals) mode (with a calibration function), in absolute multiplicity counting mode (histogram function), or in totals counting mode. Plant-measured isotopic ratios can be used by the software to convert the 240Pu-effective mass to the total Pu mass.
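The 240Pu-effective to total-Pu conversion mentioned at the end uses the standard spontaneous-fission weighting from passive neutron assay practice, 240Pu-eff = 2.52·238Pu + 240Pu + 1.68·242Pu. The coefficients below are the conventional ones from that practice, not values stated in this paper, and the example isotopics are invented.

```python
# A hedged sketch of the conventional conversion from measured 240Pu-effective
# mass to total Pu mass via plant-declared isotopic weight fractions; the 2.52
# and 1.68 weights are the usual passive neutron assay coefficients.
def total_pu_mass(m240_effective_g, f238, f240, f242):
    """Total Pu mass (g) from 240Pu-effective mass and isotopic fractions."""
    f240_effective = 2.52 * f238 + f240 + 1.68 * f242
    return m240_effective_g / f240_effective

# Example with illustrative reactor-grade isotopics (weight fractions).
print(round(total_pu_mass(25.0, f238=0.01, f240=0.24, f242=0.05), 1))
```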
APA, Harvard, Vancouver, ISO, and other styles
