
Dissertations / Theses on the topic 'Meat operations'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the top 50 dissertations / theses for your research on the topic 'Meat operations.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.

1

Smith, Jennifer Lynn. "The Effect of Thermal Processing Schedules and Unit Operations on the Quality of Blue Crab (Callinectes sapidus) Meat." Thesis, Virginia Tech, 1998. http://hdl.handle.net/10919/45047.

Full text
Abstract:
The effects of initial thermal processing, plant sanitation, and employee habits on the microbiological quality of blue crab (Callinectes sapidus) meat were determined in a commercial crab processing facility. Thermal processing was evaluated at 5, 7, and 8 minutes at 250°F for the destruction of microorganisms, including Listeria monocytogenes. The calculated F-values indicated a sufficient reduction of L. monocytogenes at each processing time. Freshly picked crab meat was evaluated for microbial levels when exposed to ambient temperatures over a four-hour period. Time and temperature did not significantly influence the microbial populations except in the fourth hour. Plant sanitation was evaluated based on levels of adenosine triphosphate (ATP) and microbial counts. Areas found to have high levels of ATP typically had low microbial counts, suggesting that residual crab meat was the problem. The presence of Listeria species in the plant was determined using a commercial polyclonal antibody test. Listeria species were found under picking tables, on cooler doors, on employees' aprons, and on several employees' hands. In a laboratory setting, an automated hand wash was compared with a manual hand wash for the removal of Listeria innocua, used as a model for Listeria monocytogenes. A 15-second manual hand wash was found to be superior to an automated wash of equal duration. The microbial quality of crab meat was found to be affected by daily plant procedures, and could be changed by modifying those procedures.
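The thermal-lethality check described above can be sketched numerically: the F-value integrates the lethal rate over the process, and dividing by the target organism's D-value gives the decimal reductions achieved. The reference temperature, z-value, and D-value below are illustrative placeholders, not the parameters used in the study.

```python
def f_value(temps_f, dt_min, t_ref=250.0, z=12.0):
    """F-value: sum of 10**((T - t_ref)/z) * dt over the process.

    temps_f: product temperatures (deg F) sampled every dt_min minutes.
    t_ref and z here are illustrative placeholders, not the study's values.
    """
    return sum(10 ** ((t - t_ref) / z) * dt_min for t in temps_f)

def log_reductions(f, d_value_min):
    """Decimal reductions achieved: F divided by the D-value at t_ref."""
    return f / d_value_min

# A constant 250 deg F hold sampled once per minute for 5 minutes:
f = f_value([250.0] * 5, dt_min=1.0)             # 5.0 equivalent minutes
reductions = log_reductions(f, d_value_min=0.2)  # 25 decimal reductions
```

With a hypothetical D-value of 0.2 min at the reference temperature, a 5-minute hold delivers 25 decimal reductions, far beyond the usual 6-log target.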
Master of Science
APA, Harvard, Vancouver, ISO, and other styles
2

Shange, Nompumelelo. "Contamination of game carcasses during harvesting and slaughter operations at a South African abattoir." Thesis, Stellenbosch : Stellenbosch University, 2015. http://hdl.handle.net/10019.1/98112.

Full text
Abstract:
Thesis (MSc Food Sc)--Stellenbosch University, 2015.
ENGLISH ABSTRACT: The consumption of game meat and its by-products is increasing locally and internationally. This increase in consumption requires research focused on the microbiological quality of game meat. The harvesting and slaughter process of springbok carcasses revealed the presence of bacterial contamination. Swab samples taken after skinning showed the presence of Escherichia coli (E. coli) and Enterobacteriaceae. Springbok carcasses swabbed after chilling carried aerobic bacteria, Clostridium spp. and lactic acid bacteria. In contrast, swab samples taken at the evisceration incision area tended to have lower counts than swab samples taken after skinning and after chilling. Bacterial contamination was linked to poor hygienic practices during the harvesting and slaughter process. The results showed a need to investigate the slaughter process. To evaluate the slaughter process's impact on the microbial quality of game carcasses, black wildebeest (Connochaetes gnou) carcasses were sampled throughout the slaughter process. Before skinning, aerobic bacteria, Enterobacteriaceae, and E. coli were enumerated from hide samples; counts ranged from 0.92 to 7.84 log cfu/g. After skinning, bacterial counts ranged from 0.93 to 6.12 log cfu/g and decreased further after chilling. Clostridium spp. counts increased after skinning; however, statistical analysis detected no significant differences between counts. Salmonella spp. was not detected. The results indicate that bacterial contamination does occur during the slaughter process. The hygienic status during the production of game meat products was also determined. Bacterial counts from raw game meat ranged from 2.37 to 5.37 log cfu/g. Counts as high as 6.16 log cfu/g were enumerated from retail products. Aerobic plate counts (APC) from ≤ 2.62 log cfu/cm2 to ≤ 6.3 log cfu/cm2 were enumerated from surfaces, hands, and equipment during production.
The results highlighted the inefficiency of cleaning procedures and revealed that contaminated meat can allow for bacterial contamination. To determine whether muscle pH influences the colour stability and microbial spoilage of game meat, normal (n=6) and dark, firm and dry (DFD) (n=6) black wildebeest Longissimus thoracis et lumborum (LTL) muscles were studied. pH affected colour, as initial (day 0) L*, a*, b*, C* and Hab values from normal-pH samples were significantly higher than the values reported for DFD samples. Initial APC and Enterobacteriaceae counts from samples with normal pH were not significantly different from the counts reported for DFD samples. Initial contamination was linked to the harvesting and slaughter process. Further refrigeration (5±1ºC) for 12 days in an aerobic environment, with analysis of samples every third day, revealed that pH did not affect lightness and brownness: L* and b* values for DFD samples did not differ significantly over time, and the same trend was seen for samples with normal pH. Normal-pH samples showed a significant increase in a* and C* values until day 12, whilst Hab values decreased until day 12. The same trend was seen for a* and C* values for DFD samples until day 9, with values increasing on day 12. Similarly, Hab values for DFD samples decreased until day 9, then increased on day 12. Using the microbial spoilage limit of 6 log cfu/g, DFD meat reached this limit earlier than samples with normal pH. Overall, the study provides baseline information on the microbiological quality of game meat harvested in South Africa and slaughtered at a South African abattoir.
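The log cfu/g comparisons above are plain log10 arithmetic; a minimal sketch using the upper-end counts reported in the abstract and the 6 log cfu/g spoilage limit:

```python
# Upper-end counts reported in the abstract (log cfu/g):
hide_log = 7.84       # hides, before skinning
post_skin_log = 6.12  # carcass surface, after skinning

drop = hide_log - post_skin_log   # 1.72 log reduction at the upper end
cfu_per_g = 10 ** post_skin_log   # back-transform to cfu/g
spoiled = post_skin_log >= 6.0    # exceeds the 6 log cfu/g spoilage limit
```

Even after a 1.72-log drop from skinning, the upper-end count still sits above the spoilage threshold used in the study.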
APA, Harvard, Vancouver, ISO, and other styles
3

Ebert, Douglas Cezar. "Simulação da dinâmica operacional de um processo industrial de abate de aves." Universidade Estadual do Oeste do Parana, 2007. http://tede.unioeste.br:8080/tede/handle/tede/220.

Full text
Abstract:
The slaughter and meat processing of poultry occur in an environment called the poultry slaughter plant, where logically organized unit operations are carried out. According to Operations Research fundamentals, a poultry slaughter plant is characterized as a system with the following associated factors: (i) input variables, for example the daily number of poultry to be slaughtered and the daily shifts; (ii) system parameters, for example processing rates and water and steam availability; and (iii) output variables, for example production quantities of meat and derivatives, fixed and variable costs, and waste volumes. Because of the number of factors involved, and the fact that these can be stochastic, it is hard to define mental scenarios to support decision processes. For this reason, simulation is an appropriate technique, because it permits experiments such as sensitivity analysis, scenario analysis, optimization, and Monte Carlo simulation. Therefore, this work was carried out with the objective of developing a computational model, using the simulation language EXTENDTM, to (a) simulate the dynamics of a poultry slaughter plant and (b) carry out sensitivity analyses. The developed model was classified as dynamic, stochastic, and discrete. The real system modeled is located in the Southwest Region of Paraná State and has a daily slaughter capacity of 500,000 poultry, using three processing lines and operating in three daily shifts. For model validation, data were obtained for three shifts in which 174,239; 166,870 and 144,021 poultry were slaughtered, respectively. The output variables contrasted, considering data obtained from the system and generated by the model, were: (i) processing time; (ii) total live weight (kg); (iii) available live weight (kg); (iv) by-product weight (kg); (v) total production weight (kg); (vi) whole slaughtered poultry weight (kg); and (vii) total slaughtered poultry parts weight (kg).
The model proved applicable, since the mean percentage errors were below 1% for the compared variables. Sensitivity analyses, carried out by changing the line processing rates to 7,000; 8,000 and 9,000 poultry per hour, showed average processing times of 8.69, 7.86 and 7.86 hours, respectively. The results demonstrate that, for the current situation, increasing the processing rate to 9,000 poultry h-1 does not directly decrease processing time, because the current frequency of cargo arrivals can create idle periods at the poultry slaughter facility.
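The arrival-limited behaviour found in the sensitivity analysis can be illustrated with a toy model (not the EXTENDTM model from the thesis): a line that must wait for deliveries gains little from a higher processing rate. The truck arrival times and loads below are hypothetical.

```python
def shift_duration(arrivals, line_rate_per_h):
    """Hour at which the last bird is processed in a toy arrival-limited line.

    arrivals: list of (arrival_hour, birds) per truck, in arrival order.
    The line can only process birds already delivered, so a faster line
    simply sits idle between trucks.
    """
    clock = 0.0
    for arrive_h, birds in arrivals:
        clock = max(clock, arrive_h)        # wait idle until the truck arrives
        clock += birds / line_rate_per_h    # process its load
    return clock

trucks = [(0.0, 7000), (1.5, 7000), (3.0, 7000)]  # hypothetical cadence
t_8000 = shift_duration(trucks, 8000)
t_9000 = shift_duration(trucks, 9000)  # 12.5% faster line, almost no gain
```

Here the 9,000 birds/h line finishes only about six minutes earlier than the 8,000 birds/h line, because both spend most of the shift waiting for the next truck.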
APA, Harvard, Vancouver, ISO, and other styles
4

Allen, Thaddeus P. "Improving USAF Special Tactics readiness to meet the operational demands of the USAF and US Special Operations Command (SOCOM)." Thesis, Monterey, California. Naval Postgraduate School, 2002. http://hdl.handle.net/10945/5973.

Full text
Abstract:
Approved for public release; distribution is unlimited
The sometimes-divergent missions of the USAF and US SOCOM have strained the ability of USAF Special Tactics (ST) to meet the operational demands of each. This thesis determines whether ST can better meet the operational requirements of both the USAF and USSOCOM. It is not a manpower study but a study of the readiness training required to support the ST operational mission. The thesis identifies ST requirements as the capability to perform its core competencies, Terminal Control, Recovery, and Reconnaissance, and their nine associated core tasks. It quantifies the Training Time Required (TTR) and the Training Time Allotted (TTA) to accomplish the minimum essential training needed to meet ST operational demands. Although the TTR to meet this demand exceeds the TTA, there are strategies available to deal with this training shortfall. With an improved readiness system in place, ST can be better prepared to meet the operational demands of both the USAF and SOCOM. Choices must be made to implement a readiness system that best prepares for operational requirements, encourages innovative approaches, and maintains the flexibility to train for emerging missions.
APA, Harvard, Vancouver, ISO, and other styles
5

Allen, Thaddeus P. Fielden Patsy. "Improving USAF Special Tactics readiness to meet the operational demands of the USAF and US Special Operations Command (SOCOM) /." Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 2002. http://library.nps.navy.mil/uhtbin/hyperion-image/02Jun%5FAllen.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Camargo, Fernando Henrique Fernandes de. "Aplicação de meta heurísticas na otimização multiobjetivo de sistemas hidrotérmicos." Universidade Federal de Goiás, 2017. http://repositorio.bc.ufg.br/tede/handle/tede/7536.

Full text
Abstract:
Conselho Nacional de Pesquisa e Desenvolvimento Científico e Tecnológico - CNPq
For a country like Brazil, which has water resources as its major source of electricity, optimizing the operation of hydroelectric plants is extremely important and is studied recurrently. Adopting a known temporal-decomposition model of this optimization problem, this dissertation compares the best multiobjective algorithms in the current literature, applying them to the medium-term planning of hydroelectric plants. After several experiments, two algorithms are selected as the best options.
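Comparing multiobjective algorithms of the kind described rests on Pareto dominance; a minimal sketch with hypothetical (thermal generation cost, reservoir drawdown) pairs for candidate operating schedules:

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Non-dominated subset of a list of objective vectors."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q is not p)]

# Hypothetical (thermal generation cost, reservoir drawdown) pairs:
schedules = [(3.0, 1.0), (1.0, 3.0), (2.0, 2.0), (2.5, 2.5)]
front = pareto_front(schedules)  # (2.5, 2.5) is dominated by (2.0, 2.0)
```

Multiobjective optimizers such as those benchmarked in the dissertation return an approximation of this non-dominated front rather than a single best schedule.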
APA, Harvard, Vancouver, ISO, and other styles
7

Hepdogan, Seyhun. "META-RAPS: PARAMETER SETTING AND NEW APPLICATIONS." Doctoral diss., University of Central Florida, 2006. http://digital.library.ucf.edu/cdm/ref/collection/ETD/id/3493.

Full text
Abstract:
Meta-heuristics have recently become a popular solution methodology, in terms of both research and application, for solving combinatorial optimization problems. Meta-heuristic methods guide simple heuristics or priority rules designed to solve a particular problem, enhancing them with a higher-level strategy. The advantage of meta-heuristics over conventional optimization methods is that they are able to find good (near-optimal) solutions within a reasonable computation time. This line of research is justified because in most practical cases with medium- to large-scale problems, meta-heuristics are necessary to find a solution in a reasonable time. The specific meta-heuristic studied in this research is Meta-RaPS (Meta-heuristic for Randomized Priority Search), developed by DePuy and Whitehouse in 2001. Meta-RaPS is a generic, high-level strategy used to modify greedy algorithms based on the insertion of a random element (Moraga, 2002). To date, Meta-RaPS has been applied to different types of combinatorial optimization problems and has achieved solution performance comparable to other meta-heuristic techniques. The specific problem studied in this dissertation is parameter setting for Meta-RaPS. The topic of parameter setting for meta-heuristics has not been extensively studied in the literature. Although the parameter-setting method devised in this dissertation is used primarily on Meta-RaPS, it is applicable to any meta-heuristic's parameter-setting problem. This dissertation not only enhances the power of Meta-RaPS through parameter tuning but also introduces a robust parameter-selection technique with widespread utility for many meta-heuristics.
Because the distribution of solution values generated by meta-heuristics for combinatorial optimization problems is not normal, current parameter-setting techniques that employ a parametric approach based on the assumption of normality may not be appropriate. The proposed method is Non-parametric Based Genetic Algorithms (NPGA). Based on statistical tests, NPGA is able to enhance the solution quality of Meta-RaPS more than any other parameter-setting procedure benchmarked in this research. NPGA found the best parameter settings, of all the methods studied, for 38 of the 41 Early/Tardy Single Machine Scheduling with Common Due Date and Sequence-Dependent Setup Time (ETP) problems and 50 of the 54 0-1 Multidimensional Knapsack Problems (0-1 MKP). In addition to the parameter-setting procedure, this dissertation provides two Meta-RaPS combinatorial optimization applications, the 0-1 MKP and the ETP. For the ETP problem, the Meta-RaPS application in this dissertation currently gives the best meta-heuristic solution performance in the literature for common ETP test sets. For the large ETP test set, Meta-RaPS provided better solution performance than Simulated Annealing (SA) for 55 of the 60 problems. For the small test set, in all four small problem sets, Meta-RaPS outperformed existing algorithms in terms of average percent deviation from the optimal solution value. For the 0-1 MKP, the present Meta-RaPS application performs better than earlier Meta-RaPS applications on this problem, with better solution quality than the existing application (Moraga, 2005) found in the literature. Meta-RaPS gives 0.75% average percent deviation from the best known solutions for the 270 0-1 MKP test problems.
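The core Meta-RaPS idea, a greedy construction that sometimes inserts a random element from a restricted candidate list, can be sketched for the 0-1 knapsack as follows. The `priority` and `restriction` parameters and the instance data are illustrative, not the tuned values from the dissertation.

```python
import random

def meta_raps_knapsack(values, weights, capacity, priority=0.7,
                       restriction=0.2, iterations=200, seed=1):
    """Minimal Meta-RaPS-style construction for the 0-1 knapsack.

    At each step: with probability `priority`, insert the best remaining
    item (by value/weight ratio); otherwise pick randomly among candidates
    within `restriction` of the best ratio. The best of `iterations`
    constructions is kept.
    """
    rng = random.Random(seed)
    n = len(values)
    best_val, best_sol = 0, set()
    for _ in range(iterations):
        remaining, cap, val, sol = set(range(n)), capacity, 0, set()
        while True:
            fits = sorted((i for i in remaining if weights[i] <= cap),
                          key=lambda i: values[i] / weights[i], reverse=True)
            if not fits:
                break
            if rng.random() < priority:
                pick = fits[0]                      # pure greedy step
            else:
                top = values[fits[0]] / weights[fits[0]]
                cands = [i for i in fits
                         if values[i] / weights[i] >= (1 - restriction) * top]
                pick = rng.choice(cands)            # randomized step
            sol.add(pick)
            val += values[pick]
            cap -= weights[pick]
            remaining.remove(pick)
        if val > best_val:
            best_val, best_sol = val, sol
    return best_val, best_sol

best, items = meta_raps_knapsack([10, 8, 6, 4], [5, 4, 3, 2], capacity=9)
```

The randomized step is what lets the construction escape the deterministic greedy solution; the full method additionally applies an improvement phase to promising constructions.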
Ph.D.
Department of Industrial Engineering and Management Systems
Engineering and Computer Science
Industrial Engineering and Management Systems
APA, Harvard, Vancouver, ISO, and other styles
8

Ahmadi, Ehsan. "Optimization-based Decision Support Tools for Managing Surgical Supplies and Sterile Instruments." Ohio University / OhioLINK, 2019. http://rave.ohiolink.edu/etdc/view?acc_num=ohiou1564482727428522.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Fadnis, Kshitij Prakash. "Abductive Meta Hypothesis Plausibility Estimation and Selection Policies." The Ohio State University, 2013. http://rave.ohiolink.edu/etdc/view?acc_num=osu1374064363.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Salud, Ellen. "Developing a library of display effects on pilot performance| Methods, meta-analyses, and performance estimates." Thesis, San Jose State University, 2013. http://pqdtopen.proquest.com/#viewpdf?dispub=1547139.

Full text
Abstract:

The design of NextGen and current-day cockpit displays is critical for efficient pilot performance and situation awareness on the flight deck. Before a design is deployed into the cockpit, the costs and benefits it imposes on performance and situation awareness should be considered. In this thesis, a design tool was developed to support the design of NextGen displays for situation awareness and performance. This design tool is a library of pilot performance estimates. Through literature reviews and meta-analyses of empirical data, the library was developed to provide display designers 1) qualitative distinctions of display properties that either support or limit full situation awareness, and 2) quantitative estimates of the time to achieve situation awareness as a function of various display formats. A systematic method was also developed for future augmentation of the library.

APA, Harvard, Vancouver, ISO, and other styles
11

Hackett, Stacey Lynn Hyten Cloyd. "Improving administrative operations for better client service and appointment keeping in a medical/behavioral services clinic." [Denton, Tex.] : University of North Texas, 2008. http://digital.library.unt.edu/permalink/meta-dc-9099.

Full text
APA, Harvard, Vancouver, ISO, and other styles
12

Ramos, Edson da Silva. "Modelos de simulação e otimização para sistemas hidrotérmicos." Universidade Federal de Goiás, 2016. http://repositorio.bc.ufg.br/tede/handle/tede/7713.

Full text
Abstract:
The problem of planning the operation of hydrothermal systems is complex, dynamic, stochastic, interconnected, and nonlinear. In this work the problem is treated to meet one goal: to minimize the use of the water reservoir during a scenario of low natural river inflows. This work applies mono-objective meta-heuristics to the problem, using a set of eight real plants in the National Interconnected System over a five-year period. The algorithms used were PSO, ABeePSO, LSSPSO, and KFPSO. The experiments were compared with studies using Nonlinear Programming, and the work presents a simulation and optimization model for hydrothermal systems that is flexible and highly adaptable to different meta-heuristics, allowing a researcher to apply different algorithms and compare their results.
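The variants named above all build on the same global-best particle swarm update rule; a minimal PSO sketch minimizing a test function (coefficients are common textbook values, not those tuned for the hydrothermal model):

```python
import random

def pso(objective, dim, bounds, swarm=20, iters=100,
        w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal global-best PSO, the base of the PSO/ABeePSO/LSSPSO/KFPSO
    variants cited above. Minimizes `objective` over a box."""
    rng = random.Random(seed)
    lo, hi = bounds
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(swarm)]
    vel = [[0.0] * dim for _ in range(swarm)]
    pbest = [p[:] for p in pos]
    pbest_f = [objective(p) for p in pos]
    g = min(range(swarm), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]
    for _ in range(iters):
        for i in range(swarm):
            for d in range(dim):
                # inertia + cognitive pull + social pull, clamped to the box
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            f = objective(pos[i])
            if f < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], f
                if f < gbest_f:
                    gbest, gbest_f = pos[i][:], f
    return gbest, gbest_f

best, best_f = pso(lambda x: sum(v * v for v in x), dim=2, bounds=(-5, 5))
```

In the thesis setting, the objective would encode operating cost and reservoir drawdown rather than this sphere test function.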
APA, Harvard, Vancouver, ISO, and other styles
13

Vana, Laura, Ronald Hochreiter, and Kurt Hornik. "Computing a journal meta-ranking using paired comparisons and adaptive lasso estimators." Springer, 2016. http://epub.wu.ac.at/5392/1/ePub_rl_lvana.pdf.

Full text
Abstract:
In a "publish-or-perish culture", the ranking of scientific journals plays a central role in assessing the performance in the current research environment. With a wide range of existing methods for deriving journal rankings, meta-rankings have gained popularity as a means of aggregating different information sources. In this paper, we propose a method to create a meta-ranking using heterogeneous journal rankings. Employing a parametric model for paired comparison data we estimate quality scores for 58 journals in the OR/MS/POM community, which together with a shrinkage procedure allows for the identification of clusters of journals with similar quality. The use of paired comparisons provides a flexible framework for deriving an aggregated score while eliminating the problem of missing data.
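The parametric model for paired-comparison data described above is typically a Bradley-Terry model, which can be fitted with a simple MM (Zermelo) iteration. The sketch below omits the paper's adaptive-lasso shrinkage step, and the win counts are hypothetical.

```python
def bradley_terry(wins, n, iters=200):
    """MM (Zermelo) estimation of Bradley-Terry quality scores.

    wins[i][j]: times journal i beat journal j in pairwise comparisons.
    Returns scores normalized to sum to 1.
    """
    p = [1.0] * n
    for _ in range(iters):
        new = []
        for i in range(n):
            w_i = sum(wins[i])  # total wins of journal i
            denom = sum((wins[i][j] + wins[j][i]) / (p[i] + p[j])
                        for j in range(n) if j != i)
            new.append(w_i / denom if denom else p[i])
        s = sum(new)
        p = [x / s for x in new]  # renormalize each sweep
    return p

# 3 hypothetical journals: 0 usually beats 1, 1 usually beats 2.
wins = [[0, 8, 9],
        [2, 0, 7],
        [1, 3, 0]]
scores = bradley_terry(wins, 3)
```

Missing pairs simply contribute nothing to the sums, which is why the paired-comparison framing sidesteps the missing-data problem noted in the abstract.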
APA, Harvard, Vancouver, ISO, and other styles
14

Pulliam-Brown, Donna. "The Effects of Process Management on Stakeholder Performance: A Meta-Analysis." ScholarWorks, 2017. https://scholarworks.waldenu.edu/dissertations/3363.

Full text
Abstract:
In 2012, over 500,000 business management degrees were conferred at the undergraduate and graduate levels; however, the assessment of student performance has not kept pace with the growth of courses offered in both online and traditional formats. One objective of teaching is to ensure that all students, regardless of mode of instruction, receive a quality education. The purpose of this meta-analysis was to measure the efficiency of learning in a business discipline by evaluating the final course grades of 1,051 students. Ten traditional and 10 online courses provided final student outcomes that were used to generate an effect size estimate. The research question, framed by Simonson's equivalency theory, asked what knowledge-related effect on student performance online and traditional formats each have in a business discipline. This theoretical framework provided a context for understanding how information imparted in different environments may be equivalent in nature. This meta-analysis used effect size measurements to quantify the difference between online and traditional final grade assessments. The results indicated a low knowledge-related effect size on student performance outcomes attributable to how online students compare to traditional students. This research has the potential to assist in the evaluation of distance education in business and other disciplines by determining its effect size on student performance outcomes. The study contributes to social change by enabling universities to manage student outcomes, which can assist in improving the comparability of online and traditional business courses.
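An effect size of the kind the meta-analysis computes can be sketched as Cohen's d with a pooled standard deviation; the grade data below are hypothetical, not the study's.

```python
import math

def cohens_d(group_a, group_b):
    """Cohen's d with pooled standard deviation for two independent groups."""
    na, nb = len(group_a), len(group_b)
    ma = sum(group_a) / na
    mb = sum(group_b) / nb
    # Sample variances (n - 1 denominator), then pooled SD:
    va = sum((x - ma) ** 2 for x in group_a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in group_b) / (nb - 1)
    pooled = math.sqrt(((na - 1) * va + (nb - 1) * vb) / (na + nb - 2))
    return (ma - mb) / pooled

# Hypothetical final grades:
online = [78, 85, 90, 72, 88]
traditional = [80, 84, 91, 70, 86]
d = cohens_d(online, traditional)  # small |d| -> comparable outcomes
```

By the usual convention, |d| below roughly 0.2 is read as a small effect, which is the "low knowledge-related effect size" conclusion the abstract reports.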
APA, Harvard, Vancouver, ISO, and other styles
15

Fallah-Fini, Saeideh. "Measuring the Efficiency of Highway Maintenance Operations: Environmental and Dynamic Considerations." Diss., Virginia Tech, 2010. http://hdl.handle.net/10919/77284.

Full text
Abstract:
Highly deteriorated U.S. road infrastructure, major budgetary restrictions and the significant growth in traffic have led to an emerging need for improving efficiency and effectiveness of highway maintenance practices that preserve the road infrastructure so as to better support society's needs. Effectiveness and efficiency are relative terms in which the performance of a production unit or decision making unit (DMU) is compared with a benchmark (best practice). Constructing the benchmark requires making a choice between an "estimation approach" based on observed best practices (i.e., using data from input and output variables corresponding to observed production units (DMUs) to estimate the benchmark with no elaboration on the details of the production process inside the black box) or an "engineering approach" to find the superior blueprint (i.e., focusing on the transformation process inside the black box for a better understanding of the sources of inefficiencies). This research discusses: (i) the application of the estimation approach (non-parametric approach) for evaluating and comparing the performance of different highway maintenance contracting strategies (performance-based contracting versus traditional contracting) and proposes a five-stage meta-frontier and bootstrapping analytical approach to account for the heterogeneity in the DMUs, the resulting bias in the estimated efficiency scores, and the effect of uncontrollable variables; (ii) the application of the engineering approach by developing a dynamic micro-level simulation model for the highway deterioration and renewal processes and its coupling with calibration and optimization to find optimum maintenance policies that can be used as a benchmark for evaluating performance of road authorities. 
This research also recognizes and discusses the fact that utilization of the maintenance budget and treatments that are performed in a road section in a specific year directly affect the road condition and required maintenance operations in consecutive years. Given this dynamic nature of highway maintenance operations, any "static" efficiency measurement framework that ignores the inter-temporal effects of inputs and managerial decisions in future streams of outputs (i.e., future road conditions) is likely to be inaccurate. This research discusses the importance of developing a dynamic performance measurement framework that takes into account the time interdependence between the input utilization and output realization of a road authority in consecutive periods. Finally, this research provides an overview of the most relevant studies in the literature with respect to evaluating dynamic performance and proposes a classification taxonomy for dynamic performance measurement frameworks according to five issues. These issues account for major sources of the inter-temporal dependence between input and output levels over different time periods and include the following: (i) material and information delays; (ii) inventories; (iii) capital or generally quasi-fixed factors and the related topic of embodied technological change; (iv) adjustment costs; and (v) incremental improvement and learning models (disembodied technological change). In the long-term, this line of research could contribute to a more efficient use of societal resources, greater level of maintenance services, and a highway and roadway system that is not only safe and reliable, but also efficient.
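The non-parametric "estimation approach" described above is usually operationalized with Data Envelopment Analysis (DEA). As an illustrative sketch only — a basic input-oriented CCR envelopment model on a toy instance, not the thesis's five-stage meta-frontier and bootstrapping procedure — the efficiency score of one DMU can be computed as a linear program:

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input(X, Y, k):
    """Input-oriented CCR efficiency of DMU k.
    X: (m inputs x n DMUs), Y: (s outputs x n DMUs).
    Solves: min theta s.t. X@lam <= theta*x_k, Y@lam >= y_k, lam >= 0."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.zeros(n + 1)
    c[0] = 1.0                                   # minimize theta
    A_in = np.hstack([-X[:, [k]], X])            # X@lam - theta*x_k <= 0
    b_in = np.zeros(m)
    A_out = np.hstack([np.zeros((s, 1)), -Y])    # -Y@lam <= -y_k
    b_out = -Y[:, k]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([b_in, b_out]),
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun

# Toy data: 3 road authorities, 1 input (budget), 1 output (lane-miles kept).
X = np.array([[2.0, 4.0, 4.0]])
Y = np.array([[2.0, 4.0, 2.0]])
eff = [dea_ccr_input(X, Y, k) for k in range(3)]
```

DMU 2 uses twice the input of DMU 0 for the same output, so its score is 0.5 against the frontier spanned by the efficient DMUs.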
Ph. D.
APA, Harvard, Vancouver, ISO, and other styles
16

Patricksson, Øyvind Selnes. "Semi-Submersible Platform Design to Meet Uncertainty in Future Operating Scenarios." Thesis, Norges teknisk-naturvitenskapelige universitet, Institutt for marin teknikk, 2012. http://urn.kb.se/resolve?urn=urn:nbn:no:ntnu:diva-18563.

Full text
Abstract:
This master thesis in marine systems design is about how to assess future uncertainty in a design setting, or as the topic puts it: semi-submersible platform design to meet uncertainty in future operating scenarios. Central terms that will be discussed are robustness, flexibility, adaptability, and real options, so-called ilities. Also, methods for evaluating designs in relation to ilities and future uncertainty are presented. The background for this thesis is the ever-growing importance of sound assessment of investment projects in the offshore business in general, and more specifically in relation to designs subject to different forms of ilities. Now, more than ever, it is crucial to make the right decisions when designing an offshore construction, to ensure that an investment is viable. This thesis has used the concept of an intervention semi, provided by Aker Solutions, to assess problems related to these aspects. At first, design drivers for the concept were identified. These were found to be cost, weight and operability, where (total) cost and (total) weight are strictly correlated. Operability, meaning the ability to keep operations running in different conditions and situations, is mainly dependent on motion characteristics and layout, where vertical motions were found to be the most important. The properties of the intervention semi were presented as a functional breakdown, divided into five main categories: well intervention, drilling, power generation, station keeping and transit, and other functions. The last category, other functions, incorporated accommodation, ballast and bilge water systems, and the heave compensation system. Most relevant for the intervention concept are the intervention and drilling functions. Among well intervention procedures, the concept should support wireline and coiled tubing operations; for drilling, through-tubing rotary drilling will be the main procedure.
After presenting the properties of the intervention semi concept, aspects of changing requirements due to future uncertainty were discussed. The identified drivers of changing requirements were operation method and technology, environment and legislation, area of operation, and economics. Following this, a discussion of how to accommodate these changing requirements was presented, with focus on aspects regarding flexibility, robustness, adaptability, and real options. After these terms and aspects had been discussed, the concept was evaluated in relation to the ilities presented. Most relevant were the possibility of a development of the coiled tubing equipment, the aspect of managed pressure drilling as a function that might be needed in the future, and the use of rental equipment. Also, ilities were identified and discussed for a concept similar to the intervention semi presented in this thesis. From this, it was found that functions related to the environment (regarding emissions) would be a potential area for ilities, due to the continually increasing focus on such matters; by designing these functions with ilities, it would be easier to improve them at a later time. Also, the aspect of extra deck space was discussed, which gives the design better flexibility, and in general, it was found that flexibility in the procedures for intervention and drilling operations was important for this concept. Some functions and aspects were also found not to be relevant for any sort of ilities. Among these were functions related to heavy drilling, increased water depth and the aspect of ice class. To find the value of a design with functional ilities, different methods and aspects were presented. At first, economic aspects were discussed, and methods using net present value were found to be relevant for the valuation of ilities.
Another approach discussed was scenario development and assessment, where one method in particular was found relevant. This method proposes to find an optimal design for the scenario assumed most probable, and then test this design against the other possible scenarios (using the models as simulation models) to get an impression of the resilience of the designs. Two decision support models were proposed, Model 1 and Model 2. The first, Model 1, can be described as a "hybrid" decision model, part static, part dynamic, where an optimal design is found for a set of contracts, taking real options into consideration. The contracts should reflect the future, and from a set of base designs, with varying possibilities for functions and options, a design with an optimal combination of capabilities and options is the result of solving the problem. Model 2 is essentially a static variant of Model 1, where the possibility of real options is no longer available. The model will still find a design with an optimal combination of capabilities for a set of contracts, but all capabilities must be part of the construction initially. Further, the two models are implemented for use in a commercial solver, and parameters and constraints are discussed. These implemented models were then used for the illustrative cases. The case studies illustrate how the two models can be utilised, and in addition how the scenario assessment discussed earlier can be combined with the decision support models. Three main cases are presented: two where Model 1 is used, and a third where Model 2 is used. In Case 1 there are three base designs with different characteristics, and only one attribute (supplementary function) to be assessed. Three scenarios are presented as a basis for the contract generation. First, an optimal design solution was found for each scenario (Case 1a, Case 1b and Case 1c).
Secondly, a scenario assessment was done, where the solution from the scenario assumed most probable was tested against the other two scenarios, using the model as a simulation model rather than an optimisation model. Scenario 1 was assumed to be the most probable one, represented by Case 1a, and the optimal solution for this case was Design 1. This design was then tested against the two other scenarios and performed rather well, illustrating the resilience of the chosen design. Case 2 illustrated a more complex problem, where an optimal solution should be found among 16 different base designs and four possible attributes. The attributes could either be part of the design initially or made available as options that can be realised at a later time. The instance tested is assumed to be somewhat more complex than a commercial problem, but illustrates well the capability of Model 1. Case 3 is an example of how Model 2 can be used. In Case 3a, only one base design is available, and with a set of four possible attributes, an optimal design should be found. Due to the static character of Model 2, the attributes can only be part of the initial design. Case 3b is much the same, except that there are two base designs to choose among, in addition to the four attributes. A computational study was carried out using only Model 1, as it is assumed to be the more complex of the two models.
The test instance assumed most relevant, with 100 contracts, four base designs, and eight attributes, can be solved once in less than two seconds on average, and a full scenario analysis, consisting of about 1000 runs, takes about half an hour. As a concluding remark for this thesis, the main scope, which in my opinion was to discuss how different design solutions can be evaluated in relation to future uncertainty, was addressed well by the two decision models proposed, together with how these can be used in a scenario setting.
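The static Model 2 described above amounts to choosing one base design plus a set of attributes that best cover a portfolio of contracts. A toy enumeration — with hypothetical costs, attributes, and contracts, far simpler than the thesis's solver-based models — conveys the structure of the decision:

```python
from itertools import combinations

# Hypothetical build costs for two base designs and three attributes
# (names like "MPD" for managed pressure drilling are illustrative only).
designs = {"D1": 100.0, "D2": 130.0}
attrs = {"MPD": 20.0, "rental": 5.0, "low_emis": 15.0}
contracts = [  # (revenue, attributes required to win the contract)
    (90.0, {"MPD"}),
    (60.0, set()),
    (80.0, {"low_emis"}),
]

def best_static_design():
    """Enumerate design/attribute combinations; return (value, design, attrs)."""
    best = None
    names = list(attrs)
    for d, d_cost in designs.items():
        for k in range(len(names) + 1):
            for combo in combinations(names, k):
                have = set(combo)
                cost = d_cost + sum(attrs[a] for a in combo)
                revenue = sum(r for r, req in contracts if req <= have)
                value = revenue - cost
                if best is None or value > best[0]:
                    best = (value, d, have)
    return best

best = best_static_design()
```

Model 1's hybrid character would add a second stage in which an attribute can instead be bought later as an option; the enumeration above is the purely static special case.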
APA, Harvard, Vancouver, ISO, and other styles
17

Horie, Michael. "On secure, dynamic customizing of a meta-space-based operating system." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1999. http://www.collectionscanada.ca/obj/s4/f2/dsk2/ftp02/NQ37345.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
18

Veluscek, Marco. "Global supply chain optimization : a machine learning perspective to improve caterpillar's logistics operations." Thesis, Brunel University, 2016. http://bura.brunel.ac.uk/handle/2438/13050.

Full text
Abstract:
Supply chain optimization is one of the key components for the effective management of a company with a complex manufacturing process and distribution network. Companies with a global presence in particular are motivated to optimize their distribution plans in order to keep their operating costs low and competitive. Changing conditions in the global market and volatile energy prices increase the need for an automatic decision and optimization tool. In recent years, many techniques and applications have been proposed to address the problem of supply chain optimization. However, such techniques are often too problem-specific or too knowledge-intensive to be implemented as an inexpensive, easy-to-use computer system. The effort required to implement an optimization system for a new instance of the problem appears to be quite significant. The development process necessitates the involvement of expert personnel and the level of automation is low. The aim of this project is to develop a set of strategies capable of increasing the level of automation when developing a new optimization system. An increased level of automation is achieved by focusing on three areas: multi-objective optimization, optimization algorithm usability, and optimization model design. A literature review highlighted the great level of interest in multi-objective optimization in the research community. However, the review emphasized a lack of standardization in the area and insufficient understanding of the relationship between multi-objective strategies and problems. Experts in the areas of optimization and artificial intelligence are interested in improving the usability of the most recent optimization algorithms. They stated the concern that the large number of variants and parameters that characterizes such algorithms affects their potential applicability in real-world environments.
Such characteristics are seen as the root cause for the low success of the most recent optimization algorithms in industrial applications. A crucial task in the development of an optimization system is the design of the optimization model. This task is among the most complex in the development process, yet it is still performed mostly manually. The importance and complexity of the task strongly suggest the development of tools to aid the design of optimization models. In order to address these challenges, first the problem of multi-objective optimization is considered and the most widely adopted techniques to solve it are identified. These techniques are analyzed and described in detail to increase the level of standardization in the area. Empirical evidence is highlighted to suggest what type of relationship exists between strategies and problem instances. Regarding the optimization algorithm, a classification method is proposed to improve its usability and computational requirements by automatically tuning one of its key parameters, the termination condition. The algorithm estimates the problem complexity and automatically assigns the best termination condition to minimize runtime. The runtime of the optimization system has been reduced by more than 60%. Arguably, the usability of the algorithm has been improved as well, as one of the key configuration tasks can now be completed automatically. Finally, a system is presented to aid the definition of the optimization model through regression analysis. The purpose of the method is to gather as much knowledge about the problem as possible so that the task of defining the optimization model requires less user involvement. It is estimated that applying the proposed algorithm could have saved almost 1000 man-weeks on the project. The developed strategies have been applied to the problem of Caterpillar's global supply chain optimization.
This thesis also describes the process of developing an optimization system for Caterpillar and highlights the challenges and research opportunities identified while undertaking this work. It describes the optimization model designed for Caterpillar's supply chain and the implementation details of the Ant Colony System, the algorithm selected to optimize the supply chain. The system is now used to design the distribution plans of more than 7,000 products and has improved Caterpillar's marginal profit on these products by 4.6% on average.
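The Ant Colony System named above belongs to the ant colony optimization family. A minimal Ant System on a toy 4-city routing instance — an illustrative sketch of the mechanics only, not Caterpillar's distribution model or the thesis's exact variant — looks like this:

```python
import random

def ant_system_tsp(dist, n_ants=20, n_iter=100, alpha=1.0, beta=2.0,
                   rho=0.5, seed=0):
    """Minimal Ant System for a symmetric TSP distance matrix."""
    rng = random.Random(seed)
    n = len(dist)
    tau = [[1.0] * n for _ in range(n)]           # pheromone trails
    best_tour, best_len = None, float("inf")
    for _ in range(n_iter):
        tours = []
        for _ in range(n_ants):
            start = rng.randrange(n)
            tour, seen = [start], {start}
            while len(tour) < n:
                i = tour[-1]
                cand = [j for j in range(n) if j not in seen]
                w = [tau[i][j] ** alpha * (1.0 / dist[i][j]) ** beta
                     for j in cand]
                j = rng.choices(cand, weights=w)[0]
                tour.append(j)
                seen.add(j)
            length = sum(dist[tour[k]][tour[(k + 1) % n]] for k in range(n))
            tours.append((length, tour))
            if length < best_len:
                best_len, best_tour = length, tour
        for i in range(n):                        # evaporate...
            for j in range(n):
                tau[i][j] *= (1 - rho)
        for length, tour in tours:                # ...then deposit
            for k in range(n):
                a, b = tour[k], tour[(k + 1) % n]
                tau[a][b] += 1.0 / length
                tau[b][a] += 1.0 / length
    return best_tour, best_len

# Unit square with side 1 and "diagonal" 1.5; optimal tour is the perimeter.
d = [[0.0, 1.0, 1.5, 1.0],
     [1.0, 0.0, 1.0, 1.5],
     [1.5, 1.0, 0.0, 1.0],
     [1.0, 1.5, 1.0, 0.0]]
tour, best_len = ant_system_tsp(d)
```

The full Ant Colony System adds a greedier state-transition rule and local pheromone updates, but the evaporate-and-deposit loop above is the shared core.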
APA, Harvard, Vancouver, ISO, and other styles
19

Seresinhe, R. "Impact of aircraft systems within aircraft operation : a MEA trajectory optimisation study." Thesis, Cranfield University, 2014. http://dspace.lib.cranfield.ac.uk/handle/1826/9261.

Full text
Abstract:
Air transport has been a key component of socio-economic globalisation. The ever increasing demand for air travel and air transport is a testament to the success of the aircraft. But this growing demand presents many challenges, one of which is the environmental impact of aviation. The scope of the environmental impact of aircraft can be discussed from many viewpoints. This research focuses on the environmental impact due to aircraft operation. Aircraft operation causes many environmental penalties. The most obvious is the fossil-fuel-based fuel burn and the consequent greenhouse gas emissions: aircraft operations directly contribute to CO2 and NOX emissions, among others. The dependency on a limited natural resource such as fossil fuel presents the case for fuel-optimised operation. The by-products of burning fossil fuel, some of which are considered pollutants and greenhouse gases, present the case for emissions-optimised operations. Moreover, when considering the local impact of aircraft operation, aircraft noise is recognised as a pollutant, so noise-optimised aircraft operation needs to be considered with regard to local impacts. Whatever the objective, it is clear that optimised operation is key to improving the efficiency of the aircraft. The operational penalties have many different contributors. The most obvious is the way an aircraft is flown, which covers the scope of aircraft trajectory and trajectory optimisation. However, the design of the aircraft contributes to the operational penalties as well. For example, the more-electric aircraft is an improvement over the conventional aircraft in terms of overall efficiency. It has been proven by many studies that the more-electric concept is more fuel efficient than a comparable conventional aircraft. The classical approach to aircraft trajectory optimisation does not account for the fuel penalties caused by airframe systems operation.
Hence the classical approach cannot distinguish a conventional aircraft from a more-electric aircraft. With the more-electric aircraft expected to be more fuel efficient, it was clear that optimal operation for the two concepts would be different. This research presents a methodology that can be used to study optimised trajectories for more-electric aircraft. The study presents preliminary evidence of the environmental impact due to airframe systems operation and establishes the basis for an enhanced approach to aircraft trajectory optimisation which includes airframe system penalties within the optimisation loop. It then presents a suite of models, the individual modelling approaches and the validation needed to conduct the study. Finally, the research presents analyses and comparisons between the classical approach, where the aircraft has no penalty due to systems, the conventional aircraft and the more-electric aircraft. When the case studies were optimised for minimum fuel burn, the conventional airframe systems accounted for a 16.6% increase in fuel burn for a short haul flight and a 6.24% increase for a long haul flight. Compared to the conventional aircraft, the more-electric aircraft had a 9.9% lower fuel burn in the short haul flight and 5.35% lower fuel burn in the long haul flight. However, the key result was that the optimised operation for the more-electric aircraft was significantly different from that of the conventional aircraft. Hence this research contributes by presenting a methodology to bridge the gap between theoretical and real aircraft-applicable trajectory optimisation.

APA, Harvard, Vancouver, ISO, and other styles
20

Hernández, Adrian V., Roop Kaw, Vinay Pasupuleti, Pouya Bina, A. Ioannidis John P, Hector Bueno, Eric Boersma, and Marc Gillinov. "Association between obesity and postoperative atrial fibrillation in patients undergoing cardiac operations: a systematic review and meta-analysis." Elsevier B.V, 2014. http://hdl.handle.net/10757/322422.

Full text
Abstract:
In a systematic review and random-effects meta-analysis, we evaluated whether obesity is associated with postoperative atrial fibrillation (POAF) in patients undergoing cardiac surgery. Eighteen observational studies that excluded patients with preoperative AF were selected through December 2011 (n=36,147). Obese patients had a modestly higher risk of POAF than non-obese patients (OR 1.12, 95% CI 1.04-1.21, p=0.002). The association between obesity and POAF did not vary substantially by type of cardiac surgery, study design or year of publication. POAF was significantly associated with higher risk of stroke, respiratory failure, and operative mortality.
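Random-effects pooling of odds ratios like this is commonly done with the DerSimonian-Laird estimator. A generic sketch on made-up log-odds-ratios from three hypothetical studies — not the 18 studies of this meta-analysis — is:

```python
import math

def dersimonian_laird(effects, variances):
    """Random-effects pooled estimate via the DerSimonian-Laird tau^2."""
    w = [1.0 / v for v in variances]
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                 # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return pooled, se

# Hypothetical log-odds-ratios and within-study variances.
log_ors = [math.log(1.10), math.log(1.25), math.log(1.05)]
vars_ = [0.01, 0.02, 0.015]
est, se = dersimonian_laird(log_ors, vars_)
or_pooled = math.exp(est)
ci = (math.exp(est - 1.96 * se), math.exp(est + 1.96 * se))
```

Pooling is done on the log scale and exponentiated back, which is how a summary like OR 1.12 (95% CI 1.04-1.21) is produced.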
Peer reviewed
APA, Harvard, Vancouver, ISO, and other styles
21

Grossbier, Stephany. "The effectiveness of training and written sanitation standard operating procedures on overall sanitation in a meat processing plant." Online version, 1998. http://www.uwstout.edu/lib/thesis/1998/1998grossbiers.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
22

Downing, David L. Newsom Ron. "From reactionary to responsive applying the Internal Environmental Scan Protocol to lifelong learning strategic planning and operational model selection /." [Denton, Tex.] : University of North Texas, 2009. http://digital.library.unt.edu/permalink/meta-dc-9852.

Full text
APA, Harvard, Vancouver, ISO, and other styles
23

Sambaluk, Nicholas Michael Hurley Alfred F. "The actions and operational thinking of Generals Stratemeyer and Partridge during the Korean War adjusting to political restrictions on air campaigns /." [Denton, Tex.] : University of North Texas, 2008. http://digital.library.unt.edu/permalink/meta-dc-6056.

Full text
APA, Harvard, Vancouver, ISO, and other styles
24

Wang, Zeyu. "Reliability Analysis and Updating with Meta-models: An Adaptive Kriging-Based Approach." The Ohio State University, 2019. http://rave.ohiolink.edu/etdc/view?acc_num=osu1574789534726544.

Full text
APA, Harvard, Vancouver, ISO, and other styles
25

BARRIERA, VIRUET HERIBERTO. "EFFECT OF FORKLIFT OPERATION ON LOWER BACK PAIN - AN EVIDENCE-BASED APPROACH." University of Cincinnati / OhioLINK, 2006. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1148264126.

Full text
APA, Harvard, Vancouver, ISO, and other styles
26

Wonsowicz, Johanna Christine. "Establishing an inventory management process to meet high customer service levels in a vaccines organization." Thesis, Massachusetts Institute of Technology, 2010. http://hdl.handle.net/1721.1/59189.

Full text
Abstract:
Thesis (M.B.A.)--Massachusetts Institute of Technology, Sloan School of Management; and, (S.M.)--Massachusetts Institute of Technology, Dept. of Civil and Environmental Engineering; in conjunction with the Leaders for Manufacturing Program at MIT, 2010.
Cataloged from PDF version of thesis.
Includes bibliographical references (p. 76-77).
Inventory management is a complex aspect of Supply Chain Management that is frequently discussed and debated due to the fact that it has a high impact on customer satisfaction as well as financial performance. This thesis addresses how an inventory management policy was developed and established in a vaccines company where customer service is the top priority and product quantities are high. The work in this thesis is from a six month internship at Novartis Vaccines and Diagnostics in Marburg, Germany. Project work focused on three inventory management questions: What are the right inventory targets for each product? What is the process to manage, monitor and maintain the inventory targets? How should the inventory targets be measured and controlled? The results from this project show that an effective way to set inventory targets is through the combination of analytical inventory calculations and the strategic analysis of the business environment. A detailed inventory model was built in Microsoft Excel that uses common inventory formulas and considers critical product attributes such as shelf-life, process lead times, batch sizing, replenishment frequency and capacity constraints to calculate the inventory targets. The model results are part of the larger inventory management policy that was created and incorporated into the Supply Chain group's Sales & Operations Planning process. The complete inventory management policy addresses the details of regularly setting inventory targets, how they should be maintained and tracked and defines clear roles and responsibilities.
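The analytical side of the target-setting described above is typically the standard order-up-to formula: lead-time demand plus a service-level-driven safety stock. A minimal sketch with hypothetical demand numbers — not Novartis data, and omitting the shelf-life, batch-sizing and capacity refinements the thesis's Excel model adds — is:

```python
import math

def inventory_target(mean_weekly_demand, sd_weekly_demand,
                     lead_time_weeks, z=1.645):
    """Order-up-to target = cycle stock + safety stock.
    z=1.645 gives roughly a 95% cycle service level under normal demand."""
    cycle_stock = mean_weekly_demand * lead_time_weeks
    safety_stock = z * sd_weekly_demand * math.sqrt(lead_time_weeks)
    return cycle_stock + safety_stock

# Hypothetical vaccine SKU: 1000 doses/week demand, sd 250, 4-week lead time.
target = inventory_target(1000, 250, 4)
```

Attributes such as limited shelf life would cap this target from above, which is why the thesis combines the formula with strategic analysis rather than applying it mechanically.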
by Johanna Christine Wonsowicz.
S.M.
M.B.A.
APA, Harvard, Vancouver, ISO, and other styles
27

Postlethwaite, Bennett Eugene. "Fluid ability, crystallized ability, and performance across multiple domains: a meta-analysis." Diss., University of Iowa, 2011. https://ir.uiowa.edu/etd/1255.

Full text
Abstract:
Cognitive ability is one of the most frequently investigated individual differences in management and psychology. Countless studies have demonstrated that tests measuring cognitive ability or intelligence predict a number of important real-world outcomes such as academic performance, vocational training performance, and job performance. Although the relationship between intelligence and real-world performance is well established, there is a lack of consensus among scholars with regard to how intelligence should be conceptualized and measured. Of the more traditional theories of intelligence, two perspectives are particularly dominant: the Cattell-Horn model of fluid and crystallized intelligence and the theory of General Cognitive Ability (GCA or g). Fluid ability (Gf) represents novel or abstract problem solving capability and is believed to have a physiological basis. In contrast, crystallized ability (Gc) is associated with learned or acculturated knowledge. Drawing on recent research in neuroscience, as well as research on past performance, the nature of work, and expert performance, I argue that compared to measures of fluid ability, crystallized ability measures should more strongly predict real-world criteria in the classroom as well as the workplace. This idea was meta-analytically examined using a large, diverse set of over 400 primary studies spanning the past 100 years. With regard to academic performance, measures of fluid ability were found to positively predict learning (as measured by grades). However, as hypothesized, crystallized ability measures were found to be superior predictors of academic performance compared to their fluid ability counterparts. This finding was true for both high school and college students. Likewise, similar patterns of results were observed with regard to both training performance and job performance. Again, crystallized ability measures were found to be better predictors of performance than fluid measures. 
This finding was consistent at the overall level of analysis as well as for medium complexity jobs. These findings have important implications for both intelligence theory and selection practice. Contemporary intelligence theory has placed great emphasis on the role of fluid ability, and some researchers have argued that Gf and g are essentially the same construct. However, the results of this study, which are based on criterion-related validities rather than factor-analytic evidence, demonstrate that Gc measures are superior predictors in comparison to Gf measures. This is contrary to what one would expect if Gf and g were indeed the same construct. Rather, the findings of this study are more consistent with General Cognitive Ability theory, which predicts that Gc indicators will be the best predictors of future learning and performance. Given that Gc measures demonstrate higher criterion-related validities than Gf measures, Gc measures are likely to be preferred for selection purposes. Further, Gf scores are known to decline with age while Gc scores remain relatively stable over the lifespan. Thus, when used for selection purposes, Gf tests may underpredict the performance of older workers. In contrast, research has shown that Gc measures are predictively unbiased. Additional implications for theory and practice are discussed, along with study limitations and opportunities for future research.
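Comparisons of criterion-related validities like these are usually run as Hunter-Schmidt-style meta-analyses of correlations. A bare-bones sketch on invented validity coefficients — not the 400-study database of this dissertation, and without the artifact corrections a full analysis would apply — is:

```python
def bare_bones_meta(rs, ns):
    """Hunter-Schmidt bare-bones meta-analysis of correlations.
    Returns the sample-size-weighted mean r and the residual (between-study)
    variance after subtracting expected sampling-error variance."""
    n_tot = sum(ns)
    k = len(rs)
    r_bar = sum(n * r for r, n in zip(rs, ns)) / n_tot
    var_obs = sum(n * (r - r_bar) ** 2 for r, n in zip(rs, ns)) / n_tot
    # Sampling-error variance using the average-n approximation.
    var_err = (1 - r_bar ** 2) ** 2 / (n_tot / k - 1)
    return r_bar, max(0.0, var_obs - var_err)

# Hypothetical Gc-measure validities and sample sizes from four studies.
r_bar, var_rho = bare_bones_meta([0.45, 0.52, 0.38, 0.50], [120, 200, 90, 150])
```

Running the same pooling separately for Gf and Gc measures and comparing the two mean validities is the basic logic behind the predictor comparison described above.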
APA, Harvard, Vancouver, ISO, and other styles
28

Chou, Cheng-Lung (Cheng-Lung John). "A proposed approach to assess supply chain risks to meet the new challenges in the Defense industry." Thesis, Massachusetts Institute of Technology, 2012. http://hdl.handle.net/1721.1/73379.

Full text
Abstract:
Thesis (M.B.A.)--Massachusetts Institute of Technology, Sloan School of Management; and, (S.M.)--Massachusetts Institute of Technology, Engineering Systems Division; in conjunction with the Leaders for Global Operations Program at MIT, 2012.
Cataloged from PDF version of thesis. Page 68 is blank.
Includes bibliographical references (p. 65-66).
The Department of Defense (DoD) doubled its planned investments in new weapon systems from about $700 billion in 2001 to nearly $1.4 trillion in 2006. Despite the technical superiority of its weapon systems, DoD's weapon systems acquisition process has been plagued with cost increases, schedule delays, and performance shortfalls. To address the maturity gaps, in 2008 DoD mandated that all prime contractors (including Raytheon) for new US government-funded defense programs evaluate and document the technology and manufacturing readiness levels (T/MRL) of their supply base. There are 10 manufacturing and 9 technology readiness levels, and specific levels must be met at certain program milestones. DoD has released a set of questionnaires (Deskbooks) designed to evaluate the maturity levels of a supplier in areas such as engineering design, operations, manufacturing, and facilities. The goal of this thesis is to develop an assessment method, using the Deskbooks as a reference, to address the core issues in the defense acquisition process. The thesis also takes a deep dive into Raytheon's supply chain management philosophy and analyzes how Raytheon's strategic sourcing initiatives align with the new challenges in the defense industry.
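Checking assessed readiness levels against milestone gates is mechanically simple once the thresholds are fixed. The sketch below uses illustrative thresholds only — the actual required levels come from the DoD Deskbooks and milestone guidance, not from this example:

```python
# Illustrative (NOT official) minimum readiness levels per milestone review.
GATES = {"Milestone B": {"TRL": 6, "MRL": 6},
         "Milestone C": {"TRL": 7, "MRL": 8}}

def supplier_ready(milestone, trl, mrl):
    """Check a supplier's assessed levels against a milestone gate."""
    gate = GATES[milestone]
    return trl >= gate["TRL"] and mrl >= gate["MRL"]

# Hypothetical supplier assessments; flag maturity gaps before Milestone C.
assessments = {"Supplier A": (7, 8), "Supplier B": (6, 5)}
gaps = [name for name, (trl, mrl) in assessments.items()
        if not supplier_ready("Milestone C", trl, mrl)]
```

An assessment method of the kind the thesis proposes would sit upstream of this check, producing the per-supplier TRL/MRL scores from the Deskbook questionnaires.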
by Cheng-Lung Chou.
S.M.
M.B.A.
APA, Harvard, Vancouver, ISO, and other styles
29

Jacobs, Patricia A., Donald Paul Gaver, and Arthur Fries. "Prediction of changeover performance operational test (OT) parameters from developmental test (DT) parameters via meta-analysis." Monterey, California. Naval Postgraduate School, 1997. http://hdl.handle.net/10945/24476.

Full text
APA, Harvard, Vancouver, ISO, and other styles
30

Deus, Guilherme Resende. "Otimização de sistemas hidrotérmicos de geração por meio de meta-heurísticas baseadas em enxame de partículas." Universidade Federal de Goiás, 2016. http://repositorio.bc.ufg.br/tede/handle/tede/7530.

Full text
Abstract:
The objective of this work is to find reasonable solutions to the problem of optimizing hydrothermal generating systems by means of metaheuristics based on particle swarms. The problem is complex, dynamic, nonlinear, and presents some stochastic variables. The study consisted of implementing particle swarm algorithms, specifically three variants of the Particle Swarm Optimization (PSO) algorithm: LSSPSO, ABeePSO and KFPSO. The algorithms were run in a power plant simulator containing data from eight plants of the Brazilian National Interconnected System over a five-year period. The results were compared with studies using a Nonlinear Programming (NLP) algorithm, and it was concluded that although the presented metaheuristics were able to obtain a final stored energy value equal to that of NLP, they did not achieve a generation cost equal to or lower than that of the Nonlinear Programming method.
APA, Harvard, Vancouver, ISO, and other styles
31

Salimi-Khorshidi, Gholamreza. "Statistical models for neuroimaging meta-analytic inference." Thesis, University of Oxford, 2011. http://ora.ox.ac.uk/objects/uuid:40a10327-7f36-42e7-8120-ae04bd8be1d4.

Full text
Abstract:
A statistical meta-analysis combines the results of several studies that address a set of related research hypotheses, thus increasing the power and reliability of the inference. Meta-analytic methods are over 50 years old and play an important role in science, pooling evidence from many trials to provide answers that any one trial would have insufficient samples to address. On the other hand, the number of neuroimaging studies is growing dramatically, with many of these publications containing conflicting results, or being based on only a small number of subjects. Hence there has been increasing interest in using meta-analysis methods to find consistent results for a specific functional task, or for predicting the results of a study that has not been performed directly. The current state of neuroimaging meta-analysis is limited to coordinate-based meta-analysis (CBMA), i.e., using only the coordinates of activation peaks that are reported by a group of studies in order to "localize" the brain regions that respond to a certain type of stimulus. This class of meta-analysis suffers from a series of problems and hence cannot produce results as accurate as desired. In this research, we describe the problems that existing CBMA methods suffer from and introduce a hierarchical mixed-effects image-based meta-analysis (IBMA) solution that incorporates the sufficient statistics (i.e., voxel-wise effect size and its associated uncertainty) from each study. In order to improve the statistical-inference stage of our proposed IBMA method, we introduce a nonparametric technique that is capable of adjusting such an inference for spatial nonstationarity.
Given that, in common practice, neuroimaging studies rarely provide the full image data, and in an attempt to improve the existing CBMA techniques, we introduce a fully automatic model-based approach that employs Gaussian-process regression (GPR) for estimating the meta-analytic statistic image from its corresponding sparse and noisy observations (i.e., the collected foci). To conclude, we introduce a new way to approach neuroimaging meta-analysis that enables the analysis to yield information such as "functional connectivity" and networks of brain-region interactions, rather than just localizing the functions.
APA, Harvard, Vancouver, ISO, and other styles
32

Erte, Idil. "Bivariate Random Effects And Hierarchical Meta-analysis Of Summary Receiver Operating Characteristic Curve On Fine Needle Aspiration Cytology." Master's thesis, METU, 2011. http://etd.lib.metu.edu.tr/upload/12613619/index.pdf.

Full text
Abstract:
In this study, meta-analysis of diagnostic tests, the Summary Receiver Operating Characteristic (SROC) curve, bivariate random effects, and Hierarchical Summary Receiver Operating Characteristic (HSROC) curve theories are discussed, and the accuracy reported in the literature for Fine Needle Aspiration (FNA) biopsy, used in the diagnosis of breast masses (malignant or benign), is analyzed. FNA Cytological (FNAC) examination of breast tumors is easy, effective, effortless, and does not require special training for clinicians. Because of the uncertainty in publications about FNAC's accuracy, 25 FNAC studies have been gathered in the meta-analysis. For the plotting of the summary ROC curve, the logit differences and sums of the true positive rates and the false positive rates of the studies included in the meta-analysis have been generated with SAS. The formulation of the bivariate random effects model and the hierarchical summary ROC curve is presented in the context of the literature. A bivariate random effects implementation is then generated with the new SAS PROC GLIMMIX, and an HSROC implementation with SAS PROC NLMIXED. Curves are plotted with RevMan Version 5 (2008). The meta-analytic results of the bivariate random effects model are nearly identical to the results from the HSROC approach. The results achieved through both random effects meta-analytic methods show that FNA cytology is a diagnostic test with a high ability to discriminate breast tumors.
APA, Harvard, Vancouver, ISO, and other styles
33

Caetano, Daniel Jorge. "Modelagem integrada para a programação de voos e a alocação de frotas: abordagens baseadas em programação linear inteira e na meta-heurística colônia de formigas." Universidade de São Paulo, 2011. http://www.teses.usp.br/teses/disponiveis/3/3138/tde-29082011-112817/.

Full text
Abstract:
This research proposes mathematical models and heuristics to define the flight network of an airline, as part of its operational planning, aiming at greater operating efficiency under restrictions related to airports, equipment, and demand. In particular, an objective function based on transport momentum is proposed for the integrated modeling of the Flight Scheduling and Fleet Assignment problems, including specific elements to account for landing and takeoff slots at airports. The approach is especially relevant for small and medium airlines operating in regional markets, whose networks are composed mainly of short-haul flights, in general operated with small or medium-sized aircraft. These companies work with limited profit margins and can therefore benefit considerably from a more efficient and effective flight network. The proposed models, based on integer linear programming and on the Ant Colony Optimization metaheuristic, were successfully applied to the case of a regional airline operating in the Brazilian market, enabling the definition of alternative networks as well as providing information for assessing the impact on the network of introducing new aircraft.
APA, Harvard, Vancouver, ISO, and other styles
34

Aghaie, Joobani Hossein. "Meta-Geopolitics of Central Asia : A Comparative Study of the Regional Influence of the European Union and the Shanghai Co-operation Organization." Thesis, Linköpings universitet, Institutionen för ekonomisk och industriell utveckling, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-100397.

Full text
Abstract:
Central Asia has been the focal point of intense geopolitical power struggles throughout history. At the dawn of the 21st century, Central Asia has undergone major changes as the European Union and the China-led Shanghai Co-operation Organization have emerged as two normative powers, both seeking to influence the patterns of security governance in the region. This study aims to delve deep into 'the black boxes' of the EU's and China's foreign policies toward the five Central Asian republics. It starts from the premise that the bulk of research on Eurasian politics tends to concentrate mostly on realist and traditional geopolitical doctrine, which seems to have failed to properly explain the normative and ideational transformations that have taken place in the region as a result of the presence of these two emerging normative agents. By interweaving both realist and constructivist theories of International Relations (IR) into a new all-encompassing analytical framework, termed "meta-geopolitics", the thesis seeks to trace and examine how the geopolitical as well as normative components of the EU's and China's regional strategies have affected the contemporary power dynamics in the post-Soviet space. I argue that, in contrast to the geopolitical struggle of the 19th and 20th centuries, a clash of normative powers is brewing in the region between China, under the aegis of the SCO, and the EU. The research also concludes that China has been in a relatively better position than the EU to render its policies feasible, effective and legitimate to the Central Asian states.
APA, Harvard, Vancouver, ISO, and other styles
35

Bornia, Poulsen Camilo José. "Desenvolvimento de um modelo para o School Timetabling Problem baseado na Meta-Heurística Simulated Annealing." reponame:Biblioteca Digital de Teses e Dissertações da UFRGS, 2012. http://hdl.handle.net/10183/39522.

Full text
Abstract:
At the beginning of every term, educational institution managers face a typical problem: planning the classes' timetables according to the lesson demands of each subject, while considering the schedule constraints of all those involved. Known in the literature as the School Timetabling Problem (STP), this classic combinatorial optimization problem is remarkably complex due to its high number of variables and constraints. Because it depends on the rules of each country's educational system, the STP has countless variants, each with its own set of features. This dissertation offers a model for the STP under the Brazilian educational system, allocating not only teachers but also determining which subject each teacher should teach and assigning classrooms, laboratories, and the like. The proposed model, based on the simulated annealing metaheuristic, was conceived so that each educational institution using it is free to define the penalty applied to each possible kind of nonconformity or constraint violation, so that the algorithm can find a solution at the lowest possible cost.
APA, Harvard, Vancouver, ISO, and other styles
36

Carbol, Ladislav. "Analýza stropního vytápění." Master's thesis, Vysoké učení technické v Brně. Fakulta stavební, 2013. http://www.nusl.cz/ntk/nusl-226007.

Full text
Abstract:
This diploma thesis deals with the analysis of radiant ceiling heating in the VUT dormitories. The work contains a theoretical analysis of the radiant and convective heat transfer of ceiling heating. Part of this work is the creation of a mathematical model for the evaluation of variables typical of radiant ceiling heating. Model outputs are compared with data measured on a real building.
APA, Harvard, Vancouver, ISO, and other styles
37

Wang, Deyun. "Integrated Scheduling of Production and Transportation Operations with Stage-dependent Inventory Costs and Due Dates Considerations." Phd thesis, Université de Technologie de Belfort-Montbeliard, 2012. http://tel.archives-ouvertes.fr/tel-00720660.

Full text
Abstract:
Increasing global competition in the business world and heightened expectations of customers have forced companies to consider not only pricing and product quality, but the reliability and timeliness of deliveries as well. In manufacturing-centric industries such as automotive and electronics, distribution and inventory costs constitute the second and third largest cost components after production costs. Therefore, industrial and logistics companies need to continuously search for ways to lower inventory levels and distribution costs. This trend has created a closer interaction between the different stages of a supply chain, and increased the practical usefulness of integrated models. This thesis considers two categories of integrated scheduling problems: Integrated Scheduling of Production-Distribution-Inventory problems (ISPDI problems) and Integrated Scheduling of Production-Inventory-Distribution-Inventory problems (ISPIDI problems). Jobs are first processed on a single machine in the production stage, and then delivered to a pre-specified customer by a capacitated transporter. Each job has a distinct due date, and must be delivered to the customer before this due date. Each production batch requires a setup cost and a setup time before the first job of the batch is processed. Each round trip between the factory and the customer requires a delivery cost as well as a delivery time. Moreover, it is assumed that a job which is completed before its departure date or delivered to the customer before its due date incurs a corresponding inventory cost. Our objective is to minimize the total cost, involving setup, inventory and delivery costs, while guaranteeing a certain customer service level. For ISPDI problems, we first provide a mixed integer programming model for the multi-product, single-stage case, and develop an improved Genetic Algorithm (GA) for solving it.
We then extend this model to a single-product, multi-stage model, and provide two methods, a dominance-related greedy algorithm and a GA, for solving it. For ISPIDI problems, we establish a general nonlinear model for the single-product case and devise a special case from the general model. We then provide an optimality property relating the production and delivery schedules for the special case. Finally, a heuristic approach is developed for solving it. For each problem under study, in order to evaluate the performance of the proposed algorithms, interesting lower bounds on the corresponding objective functions are established using different methods, such as Lagrangian relaxation and a classical bin-packing-based method. Computational results show the efficiency of the proposed models and algorithms in terms of solution quality and running time.
APA, Harvard, Vancouver, ISO, and other styles
38

Whitman, Daniel S. "Emotional Intelligence and Leadership in Organization: A Meta-analytic Test of Process Mechanisms." FIU Digital Commons, 2009. http://digitalcommons.fiu.edu/etd/113.

Full text
Abstract:
The present study – employing psychometric meta-analysis of 92 independent studies with sample sizes ranging from 26 to 322 leaders – examined the relationship between EI and leadership effectiveness. Overall, the results supported a linkage between leader EI and effectiveness that was moderate in nature (ρ = .25). In addition, the positive manifold of the effect sizes presented in this study, ranging from .10 to .44, indicates that emotional intelligence has meaningful relations with myriad leadership outcomes including effectiveness, transformational leadership, LMX, follower job satisfaction, and others. Furthermore, this paper examined potential process mechanisms that may account for the EI-leadership effectiveness relationship and showed that both transformational leadership and LMX partially mediate this relationship. However, while the predictive validities of EI were moderate in nature, path analysis and hierarchical regression suggest that EI contributes 1% or less of the explained variance in leadership effectiveness once personality and intelligence are accounted for.
APA, Harvard, Vancouver, ISO, and other styles
39

Kumar, Vikas. "An empirical investigation of the linkage between dependability, quality and customer satisfaction in information intensive service firms." Thesis, University of Exeter, 2010. http://hdl.handle.net/10036/3011.

Full text
Abstract:
The information service sector (e.g., utilities, telecommunications and banking) has grown rapidly in recent years and is a significant contributor to the Gross Domestic Product (GDP) of the world's leading economies. Though the sector has grown significantly, there have been relatively few attempts by researchers to explore it. This lack of research has motivated my PhD research, which aims to explore the pre-established relationships between dependability, quality and customer satisfaction (RQ1) within the context of the information service sector. Literature looking at the interrelationship between dependability and quality (RQ2a), and their further impact on customer satisfaction (RQ2b), is also limited. With the understanding that Business to Business (B2B) and Business to Customer (B2C) businesses are different, exploring these relationships in these two different types of information firms will further add to the existing literature. This thesis also investigates the relative significance of dependability and quality in both B2B and B2C information service firms (RQ3a and RQ3b). To address these issues, this PhD research follows a theory-testing approach and uses multiple case studies to address the research questions. In total, five cases from different B2B and B2C information service firms are investigated. To explore causality, time-series data sets spanning 24 to 60 months and the 'Path Analysis' method have been used. For the generalization of the findings, the Cumulative Meta-Analysis method has been applied. The findings of this thesis indicate that dependability significantly affects customer satisfaction, and that an interrelationship exists between dependability and quality that further impacts customer satisfaction. The findings from the B2C cases challenge the traditional priority afforded to the relational aspect of quality by showing that dependability is the key driver of customer satisfaction.
The B2B case findings, however, show that both dependability and quality are key drivers of customer satisfaction. The findings of this thesis therefore add considerably to the literature in the B2B and B2C information services context.
APA, Harvard, Vancouver, ISO, and other styles
40

Cooke, Alan. "History and Evolution of Metadata Standards for the FTI Community." International Foundation for Telemetering, 2014. http://hdl.handle.net/10150/577517.

Full text
Abstract:
ITC/USA 2014 Conference Proceedings / The Fiftieth Annual International Telemetering Conference and Technical Exhibition / October 20-23, 2014 / Town and Country Resort & Convention Center, San Diego, CA
The paper discusses the history and background of metadata standards for the FTI community over the last 20 years and speculates on how they may develop in the future. It starts by highlighting the deficiencies of proprietary formats and the resulting problems. It then discusses the characteristics, features and levels of maturity of specific industry-standard metadata descriptions such as TMATS, iHAL, MDL and XidML. The attributes of a fully mature FTI metadata standard are then discussed; it is suggested that any standard must serve at least two functions, Configuration and Validation, and the paper outlines exactly what each means. Finally, it is argued that there is now a significant level of convergence and consensus in both the scope and application of metadata, and in the associated concept of operations (ConOps). The details of this concept of operations are then discussed, along with suggestions as to how it may evolve in the coming years.
APA, Harvard, Vancouver, ISO, and other styles
41

Sund, Fredrik. "Teknisk tillgänglighet och dess nyckeltal - utifrån en marin kontext." Thesis, Försvarshögskolan, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:fhs:diva-6251.

Full text
Abstract:
Availability is essential to creating effect with technological systems. This leads to a need to work actively with follow-up and key indicators in order to optimize organization and design so as to achieve the desired effect with the resources available. Fundamental to this is having a common terminology, good communication and an understanding of the roles of those involved. This study examines the concepts of availability and compares the definitions from the literature with reality. The aim is to clarify and highlight problems with the concepts of availability and their key indicators, and the work being done on them. This is partly done by testing whether or not existing concepts are still valid. The objective is for this work to support the expressed need for a discussion of availability concepts, and to be a prelude to further studies. The results of the study show that existing concepts of availability are still valid. The question that arises instead concerns which type of availability should be in focus. Regardless, it is a challenging task, from an availability perspective, to manage multifunctional platforms with different operating profiles. Furthermore, there are interpretation discrepancies both between the literature and reality, and within and between organizations. It was also noted that uniform follow-up procedures are lacking, and that this is primarily an issue for commanders and management.
APA, Harvard, Vancouver, ISO, and other styles
42

Fincannon, Thomas. "Visuo-spatial abilities in remote perception: A meta-analysis of empirical work." Doctoral diss., University of Central Florida, 2013. http://digital.library.ucf.edu/cdm/ref/collection/ETD/id/5632.

Full text
Abstract:
Meta-analysis was used to investigate the relationship between visuo-spatial ability and performance in remote environments. To be included, each study needed to examine the relationship between the use of an ego-centric perspective and various dimensions of performance (i.e., identification, localization, navigation, and mission completion time). The moderator analysis investigated relationships involving: (a) visuo-spatial construct, with an emphasis on Carroll's (1993) visualization (VZ) factor; (b) performance outcome (i.e., identification, localization, navigation, and mission completion time); (c) autonomy to support mission performance; (d) task type (i.e., navigation vs. reconnaissance); and (e) experimental testbed (i.e., physical vs. virtual environments). The process of searching and screening for published and unpublished analyses identified 81 works of interest that were found to represent 50 unique datasets, from which 518 effects were extracted for analysis. Analyses of aggregated effects (Hunter & Schmidt, 2004) found that visuo-spatial abilities were significantly associated with each construct, with effect sizes ranging from weak (r = .235) to moderately strong (r = .371). For meta-regression (Borenstein, Hedges, Higgins, & Rothstein, 2009; Kalaian & Raudenbush, 1996; Tabachnick & Fidell, 2007), moderation by visuo-spatial construct (i.e., focusing on visualization) was consistently supported for all outcomes. For at least one of the outcomes, support was found for moderation by test, the reliability coefficient of a test, autonomy (i.e., to support identification, localization, and navigation), testbed (i.e., physical vs. virtual environment), intended domain of application, and gender. These findings illustrate that the majority of what researchers refer to as "spatial ability" is actually assessed with measures that load onto Carroll's (1993) visualization (VZ) factor.
The associations between this predictor and all performance outcomes were significant, but the significant variation across moderators highlights important issues for the design of unmanned systems and the external validity of findings across domains. For example, higher levels of autonomy for supporting navigation decreased the association between visualization (VZ) and performance. In contrast, higher levels of autonomy for supporting identification and localization increased the association between visualization (VZ) and performance. Furthermore, moderation by testbed, intended domain of application, and gender challenged the degree to which findings can be expected to generalize across domains and sets of participants.
Ph.D.
Doctorate
Psychology
Sciences
Psychology; Human Factors Psychology
APA, Harvard, Vancouver, ISO, and other styles
43

Flores-Molina, Jose C. "A Total Quality Management Methodology for Universities." FIU Digital Commons, 2011. http://digitalcommons.fiu.edu/etd/375.

Full text
Abstract:
This research is motivated by the need for a systemic, efficient quality improvement methodology at universities. No existing methodology is designed for a total quality management (TQM) program in a university. The main objective of this study is to develop a TQM methodology that enables a university to efficiently develop an integral total quality improvement plan. Current research focuses on the need to improve the quality of universities, the study of the perceived best-quality universities, and the measurement of university quality through rankings. There is no evidence of research on how to plan an integral quality improvement initiative for the university as a whole, which is the main contribution of this study. This research builds on various reference TQM models and criteria provided by ISO 9000, Baldrige and Six Sigma, and on educational accreditation criteria found in ABET and SACS. The TQM methodology is developed by following a seven-step meta-methodology. The proposed methodology guides the user to develop a TQM plan in five sequential phases: initiation, assessment, analysis, preparation and acceptance. Each phase defines for the user its purpose, key activities, input requirements, controls, deliverables, and tools to use. The application of quality concepts in education, and higher education in particular, is distinctive, since there are unique factors in education which ought to be considered. These factors shape the quality dimensions in a university and are the main inputs to the methodology. The proposed TQM methodology guides the user to collect and transform appropriate inputs into a holistic TQM plan, ready to be implemented by the university. Different input data will lead to a unique TQM plan for the specific university at that time.
It may not necessarily transform the university into a world-class institution, but it aims to drive stakeholder-oriented improvements, leading to better alignment with the university's mission and total quality advancement. The proposed TQM methodology is validated in three steps. First, it is verified through a test activity that is part of the meta-methodology. Second, the methodology is applied to a case university to develop a TQM plan. Last, both the methodology and the TQM plan are verified by an expert group consisting of TQM specialists and university administrators. The proposed TQM methodology is applicable to any university at any level of advancement, regardless of changes in its long-term vision and short-term needs. It helps to assure the quality of a TQM plan, while making the process more systemic, efficient, and cost-effective. This research establishes a framework with a solid foundation for extending the proposed TQM methodology to other industries.
APA, Harvard, Vancouver, ISO, and other styles
44

Collings, John N. "The effect of training for field-independence on formal operations : the consequences for general ability and the effectiveness of developing an associated meta-cognitive language in combination with the training procedures." Thesis, University of Gloucestershire, 1987. http://eprints.glos.ac.uk/4589/.

Full text
Abstract:
After conducting a number of pilot studies, pre- and post-tests were given to three experimental classes of 11- to 13-year-old early adolescents, one taught by Collings and the other two by an inexperienced teacher. With one class the latter used materials designed to develop Field-independence only; with the other, the teacher followed a similar pattern to Collings, who incorporated a meta-cognitive aspect by encouraging students to analyse their own thinking strategies and to 'bridge' between the Field-independence lessons and the contexts of science. There were two control classes, and the overall intervention lasted one school year, with about 20% of the science teaching time used for the intervention. The tests used were the Group Embedded Figures Test (GEFT) for Field-independence, and Volume and Heaviness (SRTII), NFER (1979), for Piagetian operations. In the pre-/post-test comparisons between experimental and control groups, all the differences between the gains were statistically significant. Collings' own class showed an effect size of 1.53 σ on GEFT over the controls, and 0.92 σ on SRTII. The inexperienced teacher's class with Field-independence training only showed an effect size of 1.09 σ on GEFT and 0.36 σ on SRTII, whereas his class with meta-cognition added showed an effect size of 1.13 σ on GEFT and 0.63 σ on SRTII. There was no statistical difference between the 1.09 and 1.13 σ effects on GEFT, from which it was inferred that the Field-independence materials were fairly robust to teacher effects. The difference between 0.36 and 0.63 σ on SRTII was statistically significant, and this was interpreted as showing that the meta-cognitive aspect assisted transfer of training to Formal Operations.
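The effect sizes reported above (e.g. 1.53 σ on GEFT) are standardized mean differences: a group difference divided by a pooled standard deviation, expressed in σ units. A minimal sketch of that computation, using hypothetical numbers rather than the study's data:

```python
import math

def pooled_sd(sd1, n1, sd2, n2):
    # Pooled standard deviation of two groups
    return math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))

def effect_size(mean_exp, mean_ctrl, sd_pooled):
    # Standardized mean difference in sigma units
    return (mean_exp - mean_ctrl) / sd_pooled

# Hypothetical gain scores for an experimental and a control class
sd = pooled_sd(2.0, 30, 2.0, 30)
d = effect_size(6.0, 3.0, sd)
print(round(d, 2))  # → 1.5
```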
APA, Harvard, Vancouver, ISO, and other styles
45

Johnson, Gloria. "The Effect of Applying Design of Experiments Techniques to Software Performance Testing." ScholarWorks, 2015. https://scholarworks.waldenu.edu/dissertations/226.

Full text
Abstract:
Effective software performance testing is essential to the development and delivery of quality software products. Many software testing investigations have reported software performance testing improvements, but few have quantitatively validated measurable improvements across an aggregate of studies. This study addressed that gap by conducting a meta-analysis to assess the relationship between applying Design of Experiments (DOE) techniques in the software testing process and the reported software performance testing improvements. Software performance testing theories and DOE techniques composed the theoretical framework for this study. Software testing studies (n = 96) were analyzed, where half had DOE techniques applied and the other half did not. Five research hypotheses were tested, with findings measured in (a) the number of detected defects, (b) the rate of defect detection, (c) the phase in which the defect was detected, (d) the total number of hours it took to complete the testing, and (e) an overall hypothesis that included all measurements for all findings. The data were analyzed by first computing standardized-difference-in-means effect sizes, then by statistical comparisons through the Z test, the Q test, and the t test. Results of the meta-analysis showed that applying DOE techniques in the software testing process improved software performance testing (p < .05). These results have social implications for the software testing industry and software testing professionals, providing another empirically validated testing methodology. Software organizations can use this methodology to differentiate their software testing process, create higher-quality products, and benefit consumers and society in general.
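A meta-analysis of this kind typically pools per-study effect sizes with inverse-variance weights and tests heterogeneity with Cochran's Q. A generic fixed-effect sketch with hypothetical study data (not Johnson's 96 studies):

```python
def pooled_effect(effects, variances):
    # Fixed-effect (inverse-variance) weighted mean effect size
    weights = [1.0 / v for v in variances]
    return sum(w * e for w, e in zip(weights, effects)) / sum(weights)

def cochran_q(effects, variances):
    # Cochran's Q: weighted squared deviations of study effects
    # around the pooled mean (heterogeneity statistic)
    pooled = pooled_effect(effects, variances)
    return sum((e - pooled) ** 2 / v for e, v in zip(effects, variances))

effects = [0.4, 0.6, 0.5]       # hypothetical standardized mean differences
variances = [0.04, 0.04, 0.04]  # hypothetical sampling variances
print(round(pooled_effect(effects, variances), 2))  # → 0.5
print(round(cochran_q(effects, variances), 2))      # → 0.5
```

Q is then compared against a chi-squared distribution with (number of studies − 1) degrees of freedom to decide whether the studies share a common effect.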
APA, Harvard, Vancouver, ISO, and other styles
46

PINSON, ARQUETOUT SUZANNE. "Meta-modele et heuristiques de jugement : le systeme credex, application a l'evaluation du risque credit entreprise." Paris 6, 1987. http://www.theses.fr/1987PA066582.

Full text
Abstract:
Development of an expert system, called CREDEX, whose purpose is to help banks' financial analysts evaluate the risk involved in granting loans to companies. CREDEX is written in SNARK. Its originality lies in: 1) its use of multiattribute information-processing models to combine the risk elements; 2) its multi-expert structure controlled at the top level by a meta-model; 3) its ability to build an evaluation strategy adapted to each company applying for a loan.
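The simplest multiattribute information-processing model for combining risk elements is a weighted additive one. A generic sketch of that idea; the attribute names and weights below are illustrative, not taken from CREDEX:

```python
def multiattribute_score(ratings, weights):
    # Weighted additive combination of risk attribute ratings
    # (ratings on a 0..1 scale, 0 = low risk, 1 = high risk)
    assert set(ratings) == set(weights)
    total_weight = sum(weights.values())
    return sum(weights[a] * ratings[a] for a in ratings) / total_weight

# Hypothetical risk profile of one loan applicant
firm = {"liquidity": 0.2, "leverage": 0.6, "management": 0.3}
weights = {"liquidity": 2.0, "leverage": 3.0, "management": 1.0}
print(round(multiattribute_score(firm, weights), 2))  # → 0.42
```

In a multi-expert architecture like the one described, a meta-model could select which attribute set and weights to apply for each applicant before combining the experts' ratings.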
APA, Harvard, Vancouver, ISO, and other styles
47

Gyawali, Himal. "Parametric Study for Assessment of Bridges to Meet Specialized Hauling Vehicles Requirements in Ohio." University of Toledo / OhioLINK, 2018. http://rave.ohiolink.edu/etdc/view?acc_num=toledo154473302303056.

Full text
APA, Harvard, Vancouver, ISO, and other styles
48

Al, Shaalane Amir. "Improving asset care plans in mining : applying developments from aviation maintenance." Thesis, Stellenbosch : Stellenbosch University, 2012. http://hdl.handle.net/10019.1/71813.

Full text
Abstract:
Thesis (MScEng)--Stellenbosch University, 2012.
ENGLISH ABSTRACT: The aim of this thesis is to compare the aviation-derived reliability metric known as the Maintenance Free Operating Period (MFOP) with the traditionally used and commonly found reliability metric Mean Time Between Failure (MTBF), which has over the years shown some innate disadvantages in the field of maintenance. It will be shown that this is mainly due to MTBF's inherent acceptance of failure and the unscheduled maintenance directly connected with it. Moreover, MFOP is successfully applied to a mining-specific case study, as, to date, no other application of the MFOP concept to the mining sector is known. An extensive literature study is presented, which covers concepts relevant to the overall study, helps to contextualise the problem, and reveals the major shortcomings of the commonly accepted MTBF metric. A methodology to analyse a system's MFOP performance, making use of failure statistics to analyse both repairable and non-repairable systems, is presented. Validation makes use of a case study that applies the MFOP methodology to a system in the mining sector. It was shown that MFOP could be applied to the data obtained from the mining sector, producing estimates that were accurate representations of reality. These findings provide an exciting basis on which to begin to facilitate a paradigm shift in the mindset of maintenance personnel, in setting reliability targets, and in dealing with unscheduled maintenance stops. KEYWORDS: Maintenance Free Operating Period, Mean Time Between Failure, Maintenance, Mining
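The contrast between the two metrics can be made concrete: MTBF averages over failures it implicitly accepts, whereas MFOP asks for the probability of completing a fixed operating period with no failure at all. A minimal sketch with hypothetical numbers, assuming exponentially distributed failure times (the thesis's methodology works from failure statistics and need not assume this distribution):

```python
import math

def mtbf(total_operating_hours, n_failures):
    # Mean Time Between Failures from aggregate failure statistics
    return total_operating_hours / n_failures

def mfop_survivability(period_hours, mtbf_hours):
    # Probability of completing a maintenance-free operating period
    # without failure, under an exponential failure-time assumption
    return math.exp(-period_hours / mtbf_hours)

m = mtbf(10_000, 20)                      # hypothetical fleet data: 500 h
print(round(mfop_survivability(100, m), 3))  # → 0.819
```

Even a system with a healthy-looking MTBF can have a low probability of completing a long maintenance-free period, which is the shortcoming of MTBF the thesis highlights.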
APA, Harvard, Vancouver, ISO, and other styles
49

Reijers, Thayla Sara Soares Stivari. "Desenvolvimento de modelo computacional híbrido - baseado em agentes e em simulação de eventos discretos - para avaliação e planejamento da produção animal: uma aplicação na ovinocultura de corte." Universidade de São Paulo, 2016. http://www.teses.usp.br/teses/disponiveis/10/10135/tde-05122016-115209/.

Full text
Abstract:
Conducting an economic analysis of agricultural production is not trivial, whether because of the enormous heterogeneity between production units or because production uses many natural resources, some of which are difficult to measure. There are several methods available for calculating the cost of production, which is the key indicator for assessing the feasibility of a project. The challenge is to make the projection of the activity over the productive horizon as realistic and dynamic as possible. Computer simulation is currently one of the most powerful analysis tools available for the planning, design, and control of complex systems, and it is being increasingly used and disseminated. Simulation involves developing an experimental method that, by building models of a real system, seeks to describe behaviours, construct theories or hypotheses from what is observed, and predict future behaviour. The use of simulation models that incorporate uncertainty and probability in animal production can serve both a technical purpose - assisting in decision-making, management, and livestock planning - and a scientific one - allowing the evaluation of the effects of research results and the identification of limiting factors that may encourage the development of future research. The hybrid simulation model proposed here - based on discrete-event simulation and agent-based simulation - aimed to identify the husbandry coefficients and management criteria that most affect meat sheep production. The hybrid computer simulation model is dynamic and probabilistic, with events scheduled in time (breeding season, pregnancy, parturition, weaning, fattening, slaughter, among others) and complex enough that its agents change both over time and in response to whether or not the variables linked to them occur. 
The results of the experiments and scenarios studied showed that, among the zootechnical indexes of the ewes, the occurrence of abortion has the greatest impact on the slaughter rate and, financially, on the net operating margin. However, analyzing the variables for both ewes and lambs, neonatal mortality, up to the lambs' first five days of life, proved to be the fundamental point for the profitability of the activity. The stabilization of the herd was most affected by the presence of adult ewes in the flock, which culminated in an increase in the number of lambs per ewe. The results of the analysis of the 30-year cash flows, at a minimum attractiveness rate of 6.17% per year, showed that starting the activity with a very small number of ewes is the least interesting scenario, with a negative internal rate of return for the analysis period. The cash flow study identified 200 ewes as the most interesting initial herd size (IRR = 3.30% per year). Thus, for studies of national livestock production, the use of hybrid simulators based on discrete-event simulation and agent-based simulation proved to be a tool with great potential, making it possible to know the likely outcomes of the different technological combinations available. The model can also be used as a tool for studying and analyzing the production chain, contributing to the guidance of scientists and helping to direct their efforts in the development of future research.
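The internal rate of return used to compare the herd-size scenarios is the discount rate at which the net present value of a cash-flow series is zero. A generic NPV/IRR sketch using bisection, with hypothetical cash flows rather than the thesis's 30-year series:

```python
def npv(rate, cash_flows):
    # Net present value; cash_flows[0] occurs at time zero
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def irr(cash_flows, lo=-0.99, hi=1.0, tol=1e-9):
    # Internal rate of return by bisection: the rate where NPV crosses zero.
    # Assumes NPV is positive at `lo` and negative at `hi` (an initial
    # outlay followed by positive returns has this shape).
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid, cash_flows) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# Hypothetical project: one investment, then 30 equal annual returns
flows = [-1000] + [80] * 30
print(round(irr(flows), 4))
```

Comparing the resulting IRR against a minimum attractiveness rate (6.17% per year in the study) is what determines whether a given starting flock size is worthwhile.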
APA, Harvard, Vancouver, ISO, and other styles
50

Oliver, Laura A. "Work Breaks, Employee Morale, and Satisfaction in the Restaurant Industry." ScholarWorks, 2016. https://scholarworks.waldenu.edu/dissertations/3057.

Full text
Abstract:
Work breaks during an individual's shift can be a powerful motivational tool for management; however, not all individuals receive breaks during their shifts. The purpose of this phenomenological, qualitative study was to explore how work breaks affect employee morale and satisfaction in the casual and fine dining restaurant industry. The questions explored in this study related to how breaks affect employee satisfaction and morale in that industry. Thirteen participants with a minimum of 5 years' experience as wait staff, who worked more than 6 hours per day, were interviewed using semi-structured interviews. The results were analyzed using a modified version of van Kaam's method and MAXqda software. The results suggested that breaks did not directly affect employee satisfaction and morale; management style, however, did. The results from this study may help managers better understand how their interactions and dealings with employees affect employee satisfaction and morale. This research may spur field-changing management training, which could promote positive social change for employees.
APA, Harvard, Vancouver, ISO, and other styles
