Dissertations / Theses on the topic 'Isradipine and Quality Control'

Consult the top 50 dissertations / theses for your research on the topic 'Isradipine and Quality Control.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse dissertations / theses in a wide variety of disciplines and organise your bibliography correctly.

1

Aguiar, Fernando Armani. "Aplicações da eletroforese capilar na análise do biomarcador alfa-1 glicoproteína ácida, no controle de qualidade do biofármaco interferon alfa 2a e na avaliação da estabilidade enantiosseletiva do fármaco isradipina." Universidade de São Paulo, 2013. http://www.teses.usp.br/teses/disponiveis/60/60137/tde-22102013-160148/.

Full text
Abstract:
Electrophoresis is a separation technique based on the differential migration of charged compounds in a semi-conductive medium under the influence of an electric field. The introduction of this thesis discusses the principles, methods, and different types of capillary electromigration techniques. The first chapter shows the results of the optimization and validation of an electrophoretic method for determining the glycoforms of α1-acid glycoprotein (α1-AGP), a biomarker. The running buffer after optimization consisted of Tricine (10 mmol L-1), sodium chloride (10 mmol L-1), sodium acetate (10 mmol L-1), urea (7 mol L-1) and putrescine (3.9 mmol L-1) at pH 4.5; an applied voltage of 30 kV and an analysis temperature of 35 °C led to a resolution of at least 1.5 among the eight glycoforms found. All analyses were carried out in an uncoated fused-silica capillary with an internal diameter of 50 µm and an effective length of 50.0 cm. After optimization, the method was validated; linearity was obtained in the range of 0.125 to 2.5 mg mL-1 (r >= 0.993). The coefficient of variation (%) and relative errors (%) obtained in the precision and accuracy studies, respectively (intra-day and inter-day), were less than 15 %. After method validation, the analysis of α1-AGP in plasma of septic patients was performed, which showed variability in the concentration of the glycoforms. In the second chapter, a simple CE-based method was developed and validated for the determination of Interferon alpha-2a, a biopharmaceutical, in a pharmaceutical formulation. After optimization, the best results were obtained using 30 mmol L-1 tetraborate buffer at pH 8.50 with 50 mmol L-1 of sodium dodecyl sulfate. The applied voltage was 25 kV, and the sample injection was performed in the hydrodynamic mode. All analyses were carried out in an uncoated fused-silica capillary with an internal diameter of 75 µm and an effective length of 50.0 cm. Under these conditions, the analysis was achieved in less than 10 min. Linearity was obtained in the range 0.41-1.54 MIU mL-1 (r >= 0.997). The RSD (%) and relative errors (%) obtained in precision and accuracy studies (intra-day and inter-day) were lower than 5 %. Therefore, this method was found to be appropriate for the quality control of pharmaceutical formulations containing Interferon alpha-2a. In the third chapter, a simple enantioselective CE method using a cyclodextrin as chiral selector was developed and validated for the determination of isradipine (IRD) enantiomers, a calcium channel blocker, in a pharmaceutical formulation and in degradation studies. After optimization, the best results were obtained using 15 mmol L-1 borate buffer at pH 9.3 and sulfobutyl ether-β-cyclodextrin (SBE-β-CD) (2.5 %, w/v) as chiral selector. The applied voltage was 30 kV, and the sample injection was performed in the hydrodynamic mode. All analyses were carried out in an uncoated fused-silica capillary with an internal diameter of 50 µm and an effective length of 50 cm. Under these conditions, a complete separation of the IRD enantiomers was achieved in less than 7 min. Linearity was obtained in the range 25 - 150 µg mL-1 for both enantiomers (r >= 0.998). The RSD (%) and relative errors (%) obtained in precision and accuracy studies (intra-day and inter-day) were lower than 5 %. Therefore, this method was found to be appropriate for the quality control of pharmaceutical formulations containing IRD enantiomers, and the assay was considered stability indicating. The drug was subjected to oxidation, hydrolysis (acidic and alkaline) and photolysis, and it presented considerable degradation under all stress conditions.
According to these results, capillary electrophoresis proved to be a powerful separation technique for research and development, quality control, and stability studies of pharmaceuticals. CE offers several advantages over high-performance liquid chromatography (HPLC), a technique commonly used in pharmaceutical analysis. These include simplicity, rapid analysis, automation, different selectivity mechanisms, and low cost.
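As a concrete illustration of the validation figures reported above (calibration linearity r, RSD as the precision measure and relative error as the accuracy measure), the following Python sketch computes these quantities for a hypothetical set of replicate determinations; the data, acceptance limits and function names are assumptions made for illustration, not values or code from the thesis.

```python
import numpy as np

def validation_metrics(replicates, nominal):
    """RSD (%) as a precision measure and relative error (%) as an accuracy
    measure for replicate determinations of a known (nominal) concentration."""
    replicates = np.asarray(replicates, dtype=float)
    rsd = 100.0 * replicates.std(ddof=1) / replicates.mean()
    rel_error = 100.0 * (replicates.mean() - nominal) / nominal
    return rsd, rel_error

def linearity_r(concentrations, responses):
    """Correlation coefficient r of the calibration curve (e.g. r >= 0.997)."""
    return np.corrcoef(concentrations, responses)[0, 1]

# Hypothetical intra-day data for one concentration level (µg mL-1)
rsd, err = validation_metrics([49.2, 50.6, 50.1, 49.5, 50.9], nominal=50.0)
r = linearity_r([25, 50, 75, 100, 125, 150],
                [0.51, 1.02, 1.49, 2.05, 2.49, 3.02])
print(f"RSD = {rsd:.2f}%  relative error = {err:.2f}%  r = {r:.4f}")
assert rsd < 5 and abs(err) < 5   # acceptance criteria reported in the abstract
```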
APA, Harvard, Vancouver, ISO, and other styles
2

Jin, Ye. "Quality control of phytopharmaceuticals : assessment and quality control of traditional Chinese medicine." Thesis, Liverpool John Moores University, 2000. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.327675.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Bush, Helen Meyers. "Nonparametric multivariate quality control." Diss., Georgia Institute of Technology, 1996. http://hdl.handle.net/1853/25571.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Sepúlveda, Ariel. "The Minimax control chart for multivariate quality control." Diss., Virginia Tech, 1996. http://hdl.handle.net/10919/30230.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Crossman, S. H. "Quality control in developing epithelia." Thesis, University College London (University of London), 2018. http://discovery.ucl.ac.uk/10044689/.

Full text
Abstract:
In the fruit-fly Drosophila melanogaster, extensive apoptosis is observed throughout the embryonic epidermis upon the mutation of many essential patterning genes. The molecular basis of cell elimination in this context is poorly understood, although previous studies have suggested the existence of a cell-autonomous quality control mechanism, which detects cells unable to adopt an appropriate terminal fate and removes them through apoptosis. This hypothetical system is thought to protect against patterning errors in order to preserve the integrity of the developing epidermis. To identify factors required for apoptosis in mis-patterned cells, I performed a targeted genetic screen, which identified a potential role for the EGFR signalling pathway in this process. Excess EGFR signalling was shown to rescue the cell death phenotype of the archetypal patterning mutant fushi tarazu (ftz), whilst EGFR null alleles triggered extensive epidermal apoptosis. Upon further experimentation, I was able to show that patterning mutant embryos fail to express the major EGFR activating ligands in the correct spatial pattern. This causes local troughs in EGFR signalling, which trigger transcriptional upregulation of the pro-apoptotic gene hid and subsequent cell death. These results argue against a cell-autonomous mechanism of cell elimination in mis-patterned embryos and instead suggest that the tissue-wide landscape of EGFR activity is responsible for coordinating cell fate and cell survival in the embryonic epidermis. Building on these observations, I have been able to show that the EGFR pathway also regulates apoptosis during normal development, where it specifies the maximum dimensions of embryonic segments. Taken together, these findings provide a novel link between early patterning events, cell viability and compartment size in the developing Drosophila embryo.
APA, Harvard, Vancouver, ISO, and other styles
6

Gordon, Kara Leigh. "TorsinA and protein quality control." Diss., University of Iowa, 2011. https://ir.uiowa.edu/etd/2708.

Full text
Abstract:
DYT1 dystonia (DYT1) is a disabling inherited neurological disorder with juvenile onset. The genetic mutation in DYT1 leads to the deletion of a glutamic acid (E) residue in the protein torsinA. The function of torsinA and how the mutation leads to DYT1 is poorly understood. We hypothesize that how efficiently the disease-linked mutant protein is cleared may be critical for DYT1 pathogenesis. Therefore we explored mechanisms of torsinA catabolism, employing biochemical, cellular, and animal-based approaches. We asked if torsinA(wt) and torsinA(DE) are degraded preferentially through different catabolic mechanisms, specifically the ubiquitin proteasome pathway (UPP) and autophagy. We determined that torsinA(wt) is cleared by autophagy while torsinA(DE) is efficiently degraded by the UPP, suggesting degradation processes can modulate torsinA(DE) levels. Proteins implicated in recognizing motifs on torsinA(DE) for targeting to the UPP represent candidate proteins that may modify DYT1 pathogenesis. We examined how removal of the hydrophobic domain and mutation of glycosylated asparagine residues on torsinA altered stability and catabolic mechanism. We found that the glycosylation sites on torsinA are important for stability and modulate its degradation through the UPP. F-box G-domain protein 1 (FBG1) has been implicated in degradation of glycosylated ER proteins. We hypothesized that FBG1 would promote torsinA degradation and demonstrated that FBG1 modulates levels of torsinA in a non-canonical manner through the UPP and autophagy. We examined if lack of FBG1 in a torsinA(DE) mouse model altered motor phenotypes. We saw no effect, which suggests FBG1 does not alter DYT1 pathogenesis despite its promotion of torsinA(DE) degradation. In addition, we explored a potential mechanism for the previously described role of torsinA in modulating cytoplasmic protein aggregation. We hypothesized this endoplasmic reticulum (ER) resident protein would indirectly alter cytoplasmic protein aggregation through modulation of ER stress. We employed a poly-glutamine expanded repeat protein and pharmacological ER stressors to determine that torsinA does not alter poly-glutamine protein aggregation or ER stress in a mammalian system. In summary, this thesis suggests proteins involved in the catabolism of torsinA(DE) may modify DYT1 pathogenesis and that torsinA and its DYT1-linked mutant are model proteins for investigating ER protein degradation by the UPP and autophagy.
APA, Harvard, Vancouver, ISO, and other styles
7

Hughes, Anthony. "Quality control in radionuclide imaging." Thesis, University of Aberdeen, 1990. http://digitool.abdn.ac.uk/R?func=search-advanced-go&find_code1=WSN&request1=AAIU601994.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Lennard, Nicola S. "Quality control for carotid endarterectomy." Thesis, University of Leicester, 2004. http://hdl.handle.net/2381/29469.

Full text
Abstract:
The aims of this study are to assess whether the introduction of a rigorous quality control method could produce a sustained reduction in the intraoperative stroke rate in this unit and whether it was feasible and practical to implement such a programme. The second part of this study will assess the incidence of sustained embolisation in the early post-operative period and investigate whether the antiplatelet agent Dextran 40 can help stop this embolisation, potentially preventing carotid artery thrombosis. A prospective audit of all patients undergoing carotid endarterectomy was performed. The ability to monitor intraoperatively with TCD and perform completion angioscopy was assessed, as was the impact that these quality control techniques had on influencing the surgery. Patients were monitored postoperatively with TCD, and any patient who developed sustained embolisation was commenced on an infusion of Dextran 40. 91% had continuous intraoperative TCD monitoring and 94% underwent successful completion angioscopy; a technical error was identified in 5% of angioscopic assessments. The intraoperative stroke rate was 0% during this study. Postoperative monitoring revealed that 5% of patients develop significant embolisation following CEA; Dextran 40 appeared to stop this embolisation. The overall 30-day stroke or death rate following CEA has fallen from 6% prior to 1992 to 2.2% in 1998. It is possible to implement a quality control programme for CEA, and this has been associated with a fall in the overall 30-day death and any-stroke rate.
APA, Harvard, Vancouver, ISO, and other styles
9

Kenerson, Jonathan E. "Quality Assurance and Quality Control Methods for Resin Infusion." Fogler Library, University of Maine, 2010. http://www.library.umaine.edu/theses/pdf/KenersonJE2010.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Cassady, Charles Richard. "Statistical quality control techniques using multilevel discrete product quality measures." Diss., This resource online, 1996. http://scholar.lib.vt.edu/theses/available/etd-06062008-151120/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
11

Moffitt, Richard Austin. "Quality control for translational biomedical informatics." Diss., Georgia Institute of Technology, 2009. http://hdl.handle.net/1853/34721.

Full text
Abstract:
Translational biomedical informatics is the application of computational methods to facilitate the translation of basic biomedical science to clinical relevance. An example of this is the multi-step process in which large-scale microarray-based discovery experiments are refined into reliable clinical tests. Unfortunately, the quality of microarray data is a major issue that must be addressed before microarrays can reach their full potential as a clinical molecular profiling tool for personalized and predictive medicine. A new methodology, titled caCORRECT, has been developed to replace or augment existing microarray processing technologies, in order to improve the translation of microarray data to clinical relevance. Results of validation studies show that caCORRECT is able to improve the mean accuracy of microarray gene expression by as much as 60%, depending on the magnitude and size of artifacts on the array surface. As part of a case study to demonstrate the widespread usefulness of caCORRECT, the entire pipeline of biomarker discovery has been executed for the clinical problem of classifying Renal Cell Carcinoma (RCC) specimens into appropriate subtypes. As a result, we have discovered and validated a novel two-gene RT-PCR assay, which has the ability to diagnose between the Clear Cell and Oncocytoma RCC subtypes with near perfect accuracy. As an extension to this work, progress has been made towards a quantitative quantum dot immunohistochemical assay, which is expected to be more clinically viable than a PCR-based test.
APA, Harvard, Vancouver, ISO, and other styles
12

Smith, Kaleigh. "Towards quality control in DNA microarrays." Thesis, McGill University, 2003. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=79129.

Full text
Abstract:
We present a framework for detecting degenerate probes in a DNA microarray that may add to measurement error in hybridization experiments. We consider four types of behaviour: secondary structure formation, self-dimerization, cross-hybridization and dimerization. The framework uses a well-established model of nucleic acid sequence hybridization and a novel method for the detection of patterns in hybridization experiment data. Our primary result is the identification of unique patterns in hybridization experiment data that are correlated with each type of degenerate probe behaviour. The framework also contains a machine learning technique to learn from the hybridization experiment data. We implement the components of the framework and evaluate the ability of the framework to detect degenerate probes in the Affymetrix HuGeneFL GeneChip.
APA, Harvard, Vancouver, ISO, and other styles
13

Wändell, Johan. "Multistage gearboxes : vibration based quality control." Licentiate thesis, KTH, Aeronautical and Vehicle Engineering, 2006. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-3987.

Full text
Abstract:

In this thesis, vibration based techniques for detection of localised surface damages in multistage gearboxes are presented and evaluated.

A modern vehicle gearbox is a complex system and the number of potential errors is large. For instance, surface damages can be caused by rough handling during assembly. Large savings can be made in the production industry by assuring the quality of products such as gearboxes. An automated quality test as a final step in the production line is one way to achieve this.

A brief review of available methods for vibration based condition monitoring of gearboxes is given in the opening summary. In the appended papers, a selection of these methods is used to design signal processing procedures for detection of localised surface damages in gearboxes. The procedures include the Synchronous signal averaging technique (SSAT), residual calculation, filtering with a prediction error filter (PEF) based on an AR-model and the use of crest factor and kurtosis as state features. The procedures are fully automatic and require no manual input during calibration or testing. This makes them easy to adapt to new test objects.
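
To make the described procedure concrete, the following Python sketch shows the core feature-extraction steps (synchronous averaging, residual calculation, crest factor and kurtosis) on a synthetic signal; the AR-model prediction error filter is omitted, and all signal parameters and function names are assumptions for illustration rather than code from the thesis.

```python
import numpy as np

def synchronous_average(signal, samples_per_rev):
    """Synchronous signal averaging: average the signal over complete shaft
    revolutions so that periodic gear-mesh content is retained in the average."""
    n_rev = len(signal) // samples_per_rev
    revs = signal[:n_rev * samples_per_rev].reshape(n_rev, samples_per_rev)
    return revs, revs.mean(axis=0)

def state_features(signal, samples_per_rev):
    """Crest factor and kurtosis of the residual (raw minus synchronous average),
    used as simple indicators of impulsive content from a localised damage."""
    revs, avg = synchronous_average(signal, samples_per_rev)
    residual = (revs - avg).ravel()
    rms = np.sqrt(np.mean(residual ** 2))
    crest = np.max(np.abs(residual)) / rms
    z = (residual - residual.mean()) / residual.std()
    kurt = np.mean(z ** 4)                  # about 3 for Gaussian noise, higher when impulsive
    return crest, kurt

# Synthetic example: a meshing tone plus noise, with sparse impulses as a crude damage proxy
rng = np.random.default_rng(1)
spr = 256                                   # assumed samples per revolution
t = np.arange(50 * spr)
healthy = np.sin(2 * np.pi * 16 * t / spr) + 0.1 * rng.normal(size=t.size)
damaged = healthy.copy()
damaged[rng.choice(t.size, size=40, replace=False)] += 2.0
print(state_features(healthy, spr), state_features(damaged, spr))
```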

A numerical model, generating simulated gearbox vibration signals, is used to systematically evaluate the proposed procedures. The model originates from an existing model which is extended to include contributions from several gear stages as well as measurement noise. This enables simulation of difficulties likely to arise in quality testing such as varying background noise and modulation due to test rig misalignment. Without the numerical model, the evaluation would require extensive measurements. The numerical model is experimentally validated by comparing the simulated vibration signals to signals measured on a real gearbox.

In the experimental part of the study, vibration data is collected with accelerometers while the gearbox is running in an industrial test rig. In addition to the healthy condition, conditions including three different surface damage sizes are also considered.

The numerical and the experimental analysis show that the presented procedures are able to detect localised surface damages at an early stage. Previous studies of similar procedures have focused on gear crack detection and overall condition monitoring. The procedures can handle varying background noise and reasonable modulation changes due to misalignment.

The results show that the choice of sensor position and operating conditions during measurements has a significant impact on the efficiency of the fault detection procedures. A localised surface damage excites resonances in the transfer path between the gear mesh and the accelerometer. These resonances amplify the defect signal. The results indicate that it is favourable to choose a speed at which the resonant defect signals are well separated from the gear meshing harmonics in the order domain. This knowledge is of great importance when it comes to quality testing. When a quality test procedure is being developed, it is often possible to choose the operating conditions and sensor positions. It can in fact be more important to choose proper operating conditions than to apply an optimal signal processing procedure.

APA, Harvard, Vancouver, ISO, and other styles
14

Wändell, Johan. "Multistage gearboxes : vibration based quality control /." Stockholm : Royal Institute of Technology, 2006. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-3987.

Full text
APA, Harvard, Vancouver, ISO, and other styles
15

Ng, Gary K. L. "Quality control in laser percussion drilling." Thesis, University of Manchester, 2003. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.488033.

Full text
APA, Harvard, Vancouver, ISO, and other styles
16

Wort, Ralph George. "Integrated information system for quality control." Thesis, University of the West of England, Bristol, 1993. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.283909.

Full text
APA, Harvard, Vancouver, ISO, and other styles
17

Mehenni, B. "Fast visual inspection for quality control." Thesis, University of Nottingham, 1989. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.328421.

Full text
APA, Harvard, Vancouver, ISO, and other styles
18

Robinson, David Charles. "Computer based on-line quality control." Thesis, University of Portsmouth, 1991. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.292316.

Full text
APA, Harvard, Vancouver, ISO, and other styles
19

Tshenye, Thapelo Obed. "Quality control of astronomical CCD observations." Master's thesis, University of Cape Town, 2007. http://hdl.handle.net/11427/4409.

Full text
APA, Harvard, Vancouver, ISO, and other styles
20

Ghoudi, Kilani. "Multivariate non-parametric quality control statistics." Thesis, University of Ottawa (Canada), 1990. http://hdl.handle.net/10393/5658.

Full text
Abstract:
During the startup phase of a production process, while statistics on the product quality are being collected, it is useful to establish that the process is under control. Small samples $\{n_i\}_{i=1}^{q}$ are taken periodically for $q$ periods. We shall assume each measurement is bivariate. A process is under control or on-target if all the observations are deemed to be independent and identically distributed and moreover the distribution of each observation is a product distribution. This would be the case if each coordinate of an observation is a nominal value plus noise. Let $F^i$ represent the empirical distribution function of the $i$-th sample. Let $\overline{F}$ represent the empirical distribution function of all observations. Following Lehman (1951) we propose statistics of the form
$$\sum_{i=1}^{q}\int_{-\infty}^{\infty}\int_{-\infty}^{\infty}\bigl(F^i(s,t) - \overline{F}(s)\,\overline{F}(t)\bigr)^2\, d\overline{F}(s,t) \qquad (1)$$
The emphasis there, however, is on the case where $n_i \to \infty$ while $q$ stayed fixed. Here we study the following family of statistics
$$S_q = \sum_{i=1}^{q}\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} k_q\bigl(n, i, F^i(s,t), \overline{F}(s)\,\overline{F}(t)\bigr)\, n_i\, dF^i(s,t) \qquad (2)$$
in the above quality control situation, where $q\to\infty$ while $n_i$ stays fixed. (Abstract shortened by UMI.)
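For illustration only, this Python sketch evaluates an empirical version of statistic (1) on simulated bivariate subgroup data; the simulated samples and all names are assumptions rather than material from the dissertation.

```python
import numpy as np

def on_target_statistic(samples):
    """Statistic (1): for each periodic sample, compare its joint empirical CDF with
    the product of the pooled marginal empirical CDFs, integrating the squared
    difference against the pooled empirical distribution."""
    pooled = np.vstack(samples)                 # all bivariate observations, shape (N, 2)
    s, t = pooled[:, 0], pooled[:, 1]
    Fbar_s = (s[None, :] <= s[:, None]).mean(axis=1)   # pooled marginal ECDF at each point
    Fbar_t = (t[None, :] <= t[:, None]).mean(axis=1)
    product = Fbar_s * Fbar_t                   # the on-target "product distribution" reference

    total = 0.0
    for sample in samples:                      # one small sample per period
        x, y = sample[:, 0], sample[:, 1]
        Fi = ((x[None, :] <= s[:, None]) & (y[None, :] <= t[:, None])).mean(axis=1)
        total += np.mean((Fi - product) ** 2)   # integral with respect to d F-bar(s, t)
    return total

# Example: q = 20 periods of n_i = 5 bivariate measurements around a nominal value
rng = np.random.default_rng(0)
samples = [rng.normal(size=(5, 2)) for _ in range(20)]
print(on_target_statistic(samples))
```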
APA, Harvard, Vancouver, ISO, and other styles
21

Alexander, Adam Ross Washer Glenn A. "Guideline for implementing quality control and quality assurance for bridge inspection." Diss., Columbia, Mo. : University of Missouri--Columbia, 2009. http://hdl.handle.net/10355/6560.

Full text
Abstract:
The entire thesis text is included in the research.pdf file; the official abstract appears in the short.pdf file; a non-technical public abstract appears in the public.pdf file. Title from PDF of title page (University of Missouri--Columbia, viewed on October 13, 2009). Thesis advisor: Dr. Glenn Washer. Includes bibliographical references.
APA, Harvard, Vancouver, ISO, and other styles
22

Abrahamsson, Petter. "User Interface Design for Quality Control : Development of a user interface for quality control of industrial manufactured parts." Thesis, Luleå tekniska universitet, Institutionen för ekonomi, teknik och samhälle, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:ltu:diva-79724.

Full text
Abstract:
The expected quality of manufactured components in the automotive industry is high, often with an accuracy of tenths of a millimeter. The conventional methods used to inspect the manufactured components are very accurate, but they are both time-consuming and insufficient, and only a small part of the produced series is analyzed today. The measurement is performed manually in so-called measurement fixtures, where each component is fixed and predetermined points of investigation are checked with a dial indicator. These fixtures are very expensive to manufacture and they are only compatible with one specific kind of component. Nowadays, great volumes of material are scrapped because of these procedures in the automotive industry. Hence, there is a great need to increase the number of inspected components without affecting the production rate negatively. This project was carried out for the relatively new company Viospatia, which is a spin-off company based on research from Luleå University of Technology. They have developed a system that automatically measures each component directly at the production line with the use of photogrammetry technology. This makes it possible to discover erroneous components almost immediately, and the manufacturer gets a clearer view of their production and its capability. The aim of this thesis has been to investigate how a user interface should be developed to be as user-friendly as possible without limiting the system's functions. The objective has been to design a proposal for a user interface that is adapted to the intended user, creates value and is easy to use. The process has been structured around a human-centered approach suited to interaction design, where the development phase, consisting of analysis, design and validation, is performed through iterations with continuous feedback from users and the project's employer. The context where the intended solution is supposed to be used was investigated through interviews and observations at the involved companies. Three factories were involved in the project: Gestamp Hardtech and Scania Ferruform in Luleå and Volvo Cars in Olofström. These factories use similar production methods, sheet metal stamping, so their prerequisites and needs are similar for this type of quality control system. Creative methods have been applied throughout the project to generate as many ideas as possible while trying to satisfy all the important aspects. Initially, analog prototypes were created, but they were soon developed into digital interactive prototypes. A larger usability test was conducted with seven participants by using a web link to the digital prototype. Based on the feedback these tests generated, some adjustments were made and the final user interface was designed, separated into two levels: Supervisor and Operator. Through an extensive literature study and user testing it became clear that the operator needs to get an unmistakable message from the user interface. There should not be any doubts whatsoever and the operator should be able to react immediately. This message is delivered with the use of colors that have an established meaning. By identifying what needs the different actors have, the system's functions can be separated and made accessible only to the intended user. The functions can then be developed more specifically for the intended user instead of being modified into a compromise that fits everybody. This separation of functions is not something the user has to do actively; it is performed automatically by the user interface when the user signs in.
APA, Harvard, Vancouver, ISO, and other styles
23

Clarke-Pringle, Tracy Lee. "Product quality control in reduced dimension spaces." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 2000. http://www.collectionscanada.ca/obj/s4/f2/dsk1/tape3/PQDD_0034/NQ66260.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
24

Matos, Cristina Filipa Rodrigues de Oliveira. "Tat-mediated quality control in Escherichia coli." Thesis, University of Warwick, 2010. http://wrap.warwick.ac.uk/3769/.

Full text
Abstract:
The E. coli Twin-arginine translocation (Tat) pathway transports a subset of proteins from the cytosol, across the inner membrane to the periplasmic space. One of the unique features of this pathway is its ability to transport fully folded passenger proteins. These passenger proteins often contain redox co-factors such as the iron-sulphur (FeS) proteins. This feature of the pathway suggests that any quality control of passenger proteins must occur prior to export. In this study the question of Tat pathway quality control is addressed. Initial studies (Chapter 3) addressed the degree to which the Tat pathway would tolerate the misfolding of its passenger proteins. To this end, mutant forms of the FeS proteins NrfC and NapG, were generated with incremental impairment of FeS cluster formation. Expression of these mutants in E. coli revealed that the Tat system completely blocked the export of NrfC when even one of its four FeS centres was mutagenised. Furthermore, the rejected passenger proteins were rapidly degraded in a Tat dependent manner. Dissection of the components involved in this process led to the discovery that TatA/E were essential for the degradation (Chapter 3). Furthermore, the previously neglected subunit TatD also plays a central role in Tat-mediated quality control and degradation (Chapter 4). Interestingly, the data presented here demonstrate that this quality control of Tat passengers also extends to nonmutated and non-cofactor containing proteins that are not exported in a timely manner. Investigations into the mechanism of cytosolic degradation of rejected passenger proteins led to the discovery that the ClpAPS system is involved. Interestingly, in the case of FeS proteins, ClpP is not responsible for proteolysis yet ClpS and ClpA are required. However, the degradation of the non-cofactor containing passenger protein, FhuD, is dependent on the entire ClpAPS system (Chapter 5).
APA, Harvard, Vancouver, ISO, and other styles
25

Ratnam, Edward. "Indoor air quality simulation and feedback control." Thesis, Glasgow Caledonian University, 1998. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.388935.

Full text
APA, Harvard, Vancouver, ISO, and other styles
26

Nuttall, James. "Protein quality control mechanisms in plant cells." Thesis, University of Warwick, 2003. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.399507.

Full text
APA, Harvard, Vancouver, ISO, and other styles
27

Watts, Kate. "A study of quality control during galvannealing." Thesis, Swansea University, 1994. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.307558.

Full text
APA, Harvard, Vancouver, ISO, and other styles
28

Yang, Yi. "Injection molding control : from process to quality /." View abstract or full-text, 2004. http://library.ust.hk/cgi/db/thesis.pl?CENG%202004%20YANG.

Full text
Abstract:
Thesis (Ph. D.)--Hong Kong University of Science and Technology, 2004.
Includes bibliographical references (leaves 218-244). Also available in electronic version. Access restricted to campus users.
APA, Harvard, Vancouver, ISO, and other styles
29

Heath, Michael Lindsey. "Quality control improvement in global apparel sourcing." Thesis, Massachusetts Institute of Technology, 2016. http://hdl.handle.net/1721.1/104309.

Full text
Abstract:
Thesis: M.B.A., Massachusetts Institute of Technology, Sloan School of Management, 2016. In conjunction with the Leaders for Global Operations Program at MIT.
Thesis: S.M. in Engineering Systems, Massachusetts Institute of Technology, Department of Mechanical Engineering, 2016. In conjunction with the Leaders for Global Operations Program at MIT.
Cataloged from PDF version of thesis.
Includes bibliographical references (pages 49-50).
This project addressed challenges within the quality management process of one of the operating groups of Li & Fung. The primary goals were improved product quality and reduced quality control costs. The operating group works with thousands of factories across the world, producing a large variety of apparel and textile products. The industry trend of fast fashion, with small order sizes and shorter lead times, has placed considerable burden on the limited quality control resources. Understanding the current state of the quality management process was the first project step, and this was accomplished through factory visits and interviewing workers. The current inspection process was designed for large orders and performs sub-optimally with smaller orders. Second, the project took a broad view of the supplier base, performing statistical analysis of inspection and factory data. This revealed problems with the process that lead to high inspection costs and inaccurate inspection results. Next, the project identified technological solutions and process improvements to address the root causes of these issues and to increase the accuracy and efficiency of inspectors. Three specific technology solutions were developed: measurement digitization, label scanners, and improved management metrics. Each solution was prototyped and the critical functionality was tested to demonstrate the value of implementation. Business analysis of the solutions revealed time savings of 60,000 inspector hours/year and cost savings of more than $1 million. At the conclusion of the project, integration of the solutions within the current inspection mobile app was ongoing and expected to be rolled out across the quality organization in the first half of 2016. Finally, recommendations beyond the scope of the technology solutions are provided for further improvement of the quality management process.
by Michael Lindsey Heath.
M.B.A.
S.M. in Engineering Systems
APA, Harvard, Vancouver, ISO, and other styles
30

Karamancı, Kaan. "Exploratory data analysis for preemptive quality control." Thesis, Massachusetts Institute of Technology, 2009. http://hdl.handle.net/1721.1/53126.

Full text
Abstract:
Thesis (M. Eng.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2009.
Includes bibliographical references (p. 113).
In this thesis, I proposed and implemented a methodology to perform preemptive quality control on low-tech industrial processes with abundant process data. This involves a 4-stage process which includes understanding the process, interpreting and linking the available process parameter and quality control data, developing an exploratory data toolset and presenting the findings in a visual and easily implementable fashion. In particular, the exploratory data techniques used rely on visual human pattern recognition through data projection and machine learning techniques for clustering. The presentation of findings is achieved via software that visualizes high-dimensional data with Chernoff faces. Performance is tested on both simulated and real industry data. The data obtained from a company was not suitable, but suggestions on how to collect suitable data were given.
by Kaan Karamancı.
M.Eng.
APA, Harvard, Vancouver, ISO, and other styles
31

Wang, Wei. "WebRTC Quality Control in Contextual Communication Systems." Thesis, KTH, Radio Systems Laboratory (RS Lab), 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-232704.

Full text
Abstract:
Audio and video communication is a universal task with a long history of technologies. Recent examples of these technologies include Skype video calling, Apple's FaceTime, and Google Hangouts. Today, these services offer everyday users the ability to have an interactive conference with both audio and video streams. However, many of these solutions depend on extra plugins or applications installed on the user's personal computer or mobile device. Some of them are also subject to licensing, introducing a huge barrier for developers and restraining new companies from entering this area. The aim of Web Real-Time Communications (WebRTC) is to provide direct access to multimedia streams in the browser, thus making it possible to create rich media applications using web technology without the need for plugins or developers needing to pay technology license fees. Ericsson develops solutions for communication targeting professional and business users. With the increasing possibilities to gather data (via cloud-based applications) about the quality experienced by users in their video conferences, new demands are placed on the infrastructure to handle this data. Additionally, there is a question of how the stats should be utilized to automatically control the quality of service (QoS) in WebRTC communication systems. The thesis project deployed a WebRTC quality control service with methods of data processing and modeling to assess the perceived video quality of the ongoing session, and further to produce appropriate actions to remedy poor quality. Lastly, after evaluation on the Ericsson contextual test platform, the project verified that two of the stats parameters used for assessing QoS (network delay and packet loss percentage) have a negative effect on the perceived video quality, but to different degrees. Moreover, the available bandwidth turned out to be an important factor, which should be added as an additional stats parameter to improve the performance of a WebRTC quality control service.
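As a toy illustration of how such stats parameters could be turned into remedial actions, the Python sketch below maps network delay, packet loss and available bandwidth to an action; the thresholds, actions and names are invented for illustration and are not taken from the thesis or from the WebRTC API.

```python
from dataclasses import dataclass

@dataclass
class SessionStats:
    rtt_ms: float             # network round-trip delay reported for the session
    packet_loss_pct: float    # packet loss percentage
    available_kbps: float     # estimated available bandwidth

def assess_and_act(stats: SessionStats) -> str:
    """Rule-based controller sketch: worse stats trigger stronger remedies."""
    if stats.packet_loss_pct > 5 or stats.rtt_ms > 400:
        return "reduce resolution or fall back to audio only"
    if stats.available_kbps < 500:
        return "lower the target video bitrate"
    if stats.packet_loss_pct > 2 or stats.rtt_ms > 200:
        return "lower the frame rate"
    return "no action"

print(assess_and_act(SessionStats(rtt_ms=250, packet_loss_pct=3.1, available_kbps=1200)))
```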
APA, Harvard, Vancouver, ISO, and other styles
32

He, Baosheng. "New Bayesian methods for quality control applications." Diss., University of Iowa, 2018. https://ir.uiowa.edu/etd/6133.

Full text
Abstract:
In quality control applications, the most basic tasks are monitoring and fault diagnosis. Monitoring results determine if diagnosis is required, and conversely, diagnostic results aid better monitoring design. Quality monitoring and fault diagnosis are closely related but also differ significantly. Essentially, monitoring focuses on online changepoint detection, whilst the primary objective of diagnosis is to identify fault root causes as an offline method. Several critical problems arise in the research of quality control: firstly, whether process monitoring is able to distinguish systematic or assignable faults from occasional deviation; secondly, how to diagnose faults with coupled root causes in complex manufacturing systems; thirdly, whether the changepoint and root causes of faults can be diagnosed simultaneously. In Chapter 2, we propose a novel Bayesian statistical process control method for count data in the presence of outliers. That is, we discuss how to discern out-of-control status from temporary abnormal process behaviors in practice, which current SPC methodologies are incapable of. In this work, process states are modeled as latent variables and inferred by the sequential Monte Carlo method. The idea of Rao-Blackwellization is employed in the approach to control detection error and computational cost. Another contribution of this work is that our method possesses self-starting characteristics, which makes the method a more robust SPC tool for discrete data. Sensitivity analysis on monitoring parameter settings is also implemented to provide practical guidelines. In Chapter 3, we study the diagnosis of dimensional faults in manufacturing. A novel Bayesian variable selection oriented diagnostic framework is proposed. Dimensional fault sources are not explicitly measurable; instead, they are connected with dimensional measurements by a generalized linear mixed effect model, based on which we further construct a hierarchical quality-fault model to conduct Bayesian inference. A reversible jump Markov Chain Monte Carlo algorithm is developed to estimate the approximate posterior probability of fault patterns. Such a diagnostic procedure is superior to previous studies since no numeric regularization is required for decision making. The proposed Bayesian diagnosis can further lean towards sparse fault patterns by choosing suitable priors, in order to handle the challenge posed by the diagnosability of faults. Our work considers the diagnosability in building dimensional diagnostic methodologies. We explain that the diagnostic result is trustworthy for most manufacturing systems in practice. The convergence analysis is also implemented, considering the trans-dimensional nature of the diagnostic method. In Chapter 4 of the thesis, we consider the diagnosis of multivariate linear profile models. We assume linear profiles are piece-wise constant. We propose an integrated Bayesian diagnostic method to answer two problems: firstly, whether and when the process has shifted, and secondly, in which pattern the shift occurs. The method can be applied for both Phase I and Phase II needs. For Phase I diagnosis, the method is implemented with no knowledge of in-control profiles, whereas in Phase II diagnosis, the method only requires partial observations. To identify exactly which profile components deviate from their nominal values, the variability of the value of profile components is marginalized out through a fully Bayesian approach. To address computational difficulty, we implement a Monte Carlo method to alternate between inspecting the spaces of changepoint positions and fault patterns. The diagnostic method can be applied under multiple scenarios.
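As a simplified, self-contained illustration of Bayesian changepoint detection for count data (not the Rao-Blackwellized sequential Monte Carlo scheme developed in the thesis), the sketch below compares conjugate Gamma-Poisson marginal likelihoods for a 'no change' model against every possible single changepoint; the prior parameters and the example counts are assumptions.

```python
import math

def log_marginal(counts, a=1.0, b=1.0):
    """Log marginal likelihood of Poisson counts with a Gamma(a, b) prior on the rate."""
    n, s = len(counts), sum(counts)
    return (a * math.log(b) - math.lgamma(a)
            + math.lgamma(a + s) - (a + s) * math.log(b + n)
            - sum(math.lgamma(y + 1) for y in counts))

def changepoint_posterior(counts):
    """Posterior over 'no change' and 'change after subgroup k' under a uniform model prior."""
    log_ev = {"no change": log_marginal(counts)}
    for k in range(1, len(counts)):
        log_ev[f"change after {k}"] = log_marginal(counts[:k]) + log_marginal(counts[k:])
    m = max(log_ev.values())
    z = sum(math.exp(v - m) for v in log_ev.values())
    return {name: math.exp(v - m) / z for name, v in log_ev.items()}

# Hypothetical nonconformity counts whose rate roughly doubles after the 8th subgroup
counts = [2, 3, 1, 2, 4, 2, 3, 2, 6, 7, 5, 8, 6, 7]
post = changepoint_posterior(counts)
print(max(post, key=post.get), round(post["no change"], 4))
```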
APA, Harvard, Vancouver, ISO, and other styles
33

Thoutou, Sayi Mbani. "Quality control charts under random fuzzy measurements." Master's thesis, University of Cape Town, 2007. http://hdl.handle.net/11427/19140.

Full text
Abstract:
Includes bibliographical references.
We consider statistical process control charts as tools that statistical process control utilizes for monitoring changes, identifying process variations and their causes in industrial (manufacturing) processes, and helping manufacturers to take the appropriate action, rectify problems or improve manufacturing processes so as to produce good-quality products. As an essential tool, process control charts have always received attention from researchers. Also, the sample sizes required for establishing control charts are often under discussion, depending on the field of study. Of late, the problem of fuzziness and randomness often brought into modern manufacturing processes by shortening product life cycles and diversification (in product designs, raw material supply, etc.) has compelled researchers to invoke quality control methodologies in their search for high customer satisfaction and better market shares (Guo et al 2006). We herein focus our attention on small sample sizes and on the development of quality control charts in terms of the Economic Design of Quality Control Charts, based on credibility measure theory under Random Fuzzy Measurements and Small Sample Asymptotic Distribution Theory. Economic process data will be collected from the study of Duncan (1956) in terms of these new developments as an illustrative example.
APA, Harvard, Vancouver, ISO, and other styles
34

Kim, Sang Ik. "Contributions to experimental design for quality control." Diss., Virginia Polytechnic Institute and State University, 1988. http://hdl.handle.net/10919/53551.

Full text
Abstract:
A parameter design introduced by Taguchi provides a new quality control method which can cost-effectively reduce the product variation due to various uncontrollable noise factors such as product deterioration, manufacturing imperfections, and environmental factors under which a product is actually used. This experimental design technique identifies the optimal setting of the control factors which is least sensitive to the noise factors. Taguchi's method utilizes orthogonal arrays which allow the investigation of main effects only, under the assumption that interaction effects are negligible. In this work new techniques are developed to investigate two-factor interactions for 2^t and 3^t factorial parameter designs. The major objective is to be able to identify influential two-factor interactions and take those into account in properly assessing the optimal setting of the control factors. For 2^t factorial parameter designs, we develop some new designs for the control factors by using a partially balanced array. These designs are characterized by a small number of runs and some balancedness property of the variance-covariance matrix of the estimates of main effects and two-factor interactions. Methods of analyzing the new designs are also developed. For 3^t factorial parameter designs, a detection procedure consisting of two stages is developed by using a sequential method in order to reduce the number of runs needed to detect influential two-factor interactions. An extension of the parameter design to several quality characteristics is also developed by devising suitable statistics to be analyzed, depending on whether a proper loss function can be specified or not.
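For readers unfamiliar with the contrasts involved, the sketch below estimates main effects and a two-factor interaction from a two-level (2^t) factorial design; it illustrates the standard effect estimates only, with made-up factors and a made-up response, and is not the partially balanced arrays or the sequential procedure developed in this dissertation.

```python
import itertools
import numpy as np

def two_level_design(t):
    """All 2^t runs of a two-level factorial design, coded -1 / +1."""
    return np.array(list(itertools.product([-1, 1], repeat=t)))

def main_effect(design, y, j):
    """Main effect of factor j: mean response at level +1 minus mean at level -1."""
    return y[design[:, j] == 1].mean() - y[design[:, j] == -1].mean()

def interaction_effect(design, y, j, k):
    """Two-factor interaction: the same contrast applied to the product column x_j * x_k."""
    prod = design[:, j] * design[:, k]
    return y[prod == 1].mean() - y[prod == -1].mean()

# Hypothetical example with t = 3 control factors and a noisy response
X = two_level_design(3)
rng = np.random.default_rng(0)
y = 10 + 2.0 * X[:, 0] - 1.5 * X[:, 2] + 1.0 * X[:, 0] * X[:, 1] + 0.1 * rng.normal(size=len(X))
print([round(main_effect(X, y, j), 2) for j in range(3)])     # roughly [4.0, 0.0, -3.0]
print(round(interaction_effect(X, y, 0, 1), 2))               # roughly 2.0
```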
Ph. D.
APA, Harvard, Vancouver, ISO, and other styles
35

Akkinepally, Radha. "Quality control and quality assurance of hot mix asphalt construction in Delaware." Access to citation, abstract and download form provided by ProQuest Information and Learning Company; downloadable PDF file 2.68Mb, 136p, 2005. http://wwwlib.umi.com/dissertations/fullcit/1428173.

Full text
APA, Harvard, Vancouver, ISO, and other styles
36

McGrew, Don. "A review of sensory quality control and quality assurance for alcoholic beverages." Kansas State University, 2011. http://hdl.handle.net/2097/9160.

Full text
Abstract:
Master of Science
Food Science
Delores H. Chambers
Tools are available, through various reference books, to develop a purposeful sensory quality program. Some companies already have a strong sensory program in place; others may require a cultural change to facilitate the implementation. This paper indicates some of the challenges to be overcome, covers some current quality control (QC) sensory practices and addresses advantages and disadvantages of expert tasters. Some specific issues regarding sensory evaluations of alcohol beverages are discussed, and critical factors in production are reviewed with discussion of the potential for off-taint development.
APA, Harvard, Vancouver, ISO, and other styles
37

Andrus, Lauren Elizabeth. "To Control or Be Controlled: Sibling Control and Adolescent Sibling Relationship Quality." BYU ScholarsArchive, 2021. https://scholarsarchive.byu.edu/etd/8894.

Full text
Abstract:
The current body of research pertaining to sibling control dynamics looks specifically at either the absence or presence of control within the sibling relationship. Research to date has not differentiated between a sibling's experience of being controlling versus being controlled. This study examined adolescent sibling control dynamics and their links with sibling relationship quality (sibling closeness and sibling conflict), and how those links are moderated by birth order and having an agreeable personality. Data were analyzed from 327 families with two adolescent siblings between the ages of 12 and 18 (older sibling M = 17.17 years, SD = 0.94; younger sibling M = 14.52 years, SD = 1.27). Results from nested multilevel models revealed that adolescent siblings who are controlling perceive their sibling relationship to be close. Future research pertaining to the importance of differentiating between the experience of being controlling versus being controlled is discussed.
APA, Harvard, Vancouver, ISO, and other styles
38

Krishnamurthy, Janaki. "Quality Market: Design and Field Study of Prediction Market for Software Quality Control." NSUWorks, 2010. http://nsuworks.nova.edu/gscis_etd/352.

Full text
Abstract:
Given the increasing competition in the software industry and the critical consequences of software errors, it has become important for companies to achieve high levels of software quality. While cost reduction and timeliness of projects continue to be important measures, software companies are placing increasing attention on identifying user needs and better defining software quality from a customer perspective. Software quality goes beyond just correcting the defects that arise from any deviations from the functional requirements. System engineers also have to focus on a large number of quality requirements such as security, availability, reliability, maintainability, performance and temporal correctness. The fulfillment of these run-time observable quality requirements is important for customer satisfaction and project success. Generating early forecasts of potential quality problems can have significant benefits for quality improvement. One approach to better software quality is to improve the overall development cycle in order to prevent the introduction of defects and improve run-time quality factors. Many methods and techniques are available to forecast the quality of an ongoing project, such as statistical models, opinion polls and survey methods. These methods have known strengths and weaknesses, and accurate forecasting is still a major issue. This research utilized a novel approach using prediction markets, which has proved useful in a variety of situations. In a prediction market for software quality, individual estimates from diverse project stakeholders such as project managers, developers, testers, and users were collected at various points in time during the project. Analogous to the financial futures markets, a security (or contract) was defined that represents the quality requirements, and various stakeholders traded the securities using the prevailing market price and their private information. The equilibrium market price represents the best aggregate of diverse opinions. Among many software quality factors, this research focused on predicting software correctness. The goal of the study was to evaluate whether a suitably designed prediction market would generate a more accurate estimate of software quality than a survey method that polls subjects. Data were collected using a live software project in three stages: the requirements phase, an early release phase and a final release phase. The efficacy of the market was tested by (i) comparing the market outcomes to the final project outcome, and (ii) comparing the market outcomes to the results of an opinion poll. Analysis of the data suggests that predictions generated using the prediction market are significantly different from those generated using polls at the early release and final release stages. The prediction market estimates were also closer to the actual probability estimates for quality than the polls. Overall, the results suggest that suitably designed prediction markets provide better forecasts of potential quality problems than polls.
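The abstract above does not state which market-maker mechanism was used, so the Python sketch below illustrates price aggregation with a logarithmic market scoring rule (LMSR), a common prediction-market rule, purely as an assumed stand-in; the share quantities and liquidity parameter b are invented.

# Minimal sketch: price aggregation with a logarithmic market scoring rule (LMSR).
# LMSR is shown only as one common way a prediction-market price can aggregate
# trades; it is not claimed to be the mechanism used in the study above.
import math

def lmsr_price(q_yes, q_no, b=100.0):
    """Instantaneous price of the 'quality target met' security."""
    e_yes, e_no = math.exp(q_yes / b), math.exp(q_no / b)
    return e_yes / (e_yes + e_no)

# Outstanding shares after a sequence of stakeholder trades (invented numbers).
q_yes, q_no = 0.0, 0.0
for buy_yes, buy_no in [(30, 0), (20, 0), (0, 40), (25, 0)]:
    q_yes += buy_yes
    q_no += buy_no
    print(f"price(quality met) = {lmsr_price(q_yes, q_no):.3f}")

Each trade moves the price of the security toward the traders' aggregate belief, which is the aggregation property the study relies on.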
APA, Harvard, Vancouver, ISO, and other styles
39

Scior, Annika [Verfasser]. "Protein Quality Control during Protein Biosynthesis / Annika Scior." Konstanz : Bibliothek der Universität Konstanz, 2013. http://d-nb.info/1096331527/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
40

Myers, Joseph Kenneth. "Inverse doping profile analysis for semiconductor quality control." Diss., Wichita State University, 2009. http://hdl.handle.net/10057/2556.

Full text
Abstract:
Inverse doping profile problems are linked to inverse conductivity problems under the assumptions of zero space charge and low injection. Unipolar inverse conductivity problems are analyzed theoretically via three uniqueness proofs, the first of which has been published as a paper in Inverse Problems [34]. Also, optimized numerical methods are developed for solving the unipolar direct conductivity problem with a piecewise constant conductivity coefficient. Finally, the unipolar inverse conductivity problem is solved for inclusions defined by as many as 9 parameters, or by as many as 120 parameters when an initial guess for each parameter is known with less than 10% error. Our free boundary identification algorithm produces a sequence of improved approximations in a way that provides both regularization and accelerated convergence towards the solution.
Thesis (Ph.D.)--Wichita State University, College of Liberal Arts and Sciences, Dept. of Mathematics and Statistics
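As a toy analogue of the direct problem described above, the following Python sketch solves a one-dimensional conductivity equation -(sigma u')' = f with a piecewise constant coefficient by finite differences; the geometry, dimensions and inclusion parameterisation of the thesis are not reproduced.

# Minimal sketch: 1D direct conductivity problem -(sigma * u')' = f on (0, 1)
# with u(0) = u(1) = 0 and a piecewise constant sigma, solved by finite
# differences. A toy analogue only; the thesis treats a different setting.
import numpy as np

n = 200
h = 1.0 / n
x = np.linspace(0.0, 1.0, n + 1)

# Piecewise constant conductivity: an "inclusion" with sigma = 5 on (0.4, 0.6).
midpoints = x[:-1] + h / 2
sigma = np.where((midpoints > 0.4) & (midpoints < 0.6), 5.0, 1.0)

f = np.ones(n - 1)                      # constant source term at interior nodes
A = np.zeros((n - 1, n - 1))
for i in range(n - 1):
    A[i, i] = (sigma[i] + sigma[i + 1]) / h**2
    if i > 0:
        A[i, i - 1] = -sigma[i] / h**2
    if i < n - 2:
        A[i, i + 1] = -sigma[i + 1] / h**2

u = np.linalg.solve(A, f)               # interior potentials u(x_1), ..., u(x_{n-1})
print("max potential:", u.max())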
APA, Harvard, Vancouver, ISO, and other styles
41

Watkins, Daryl RaMond. "Autocorrelation and its effects on quality control charts." Thesis, Georgia Institute of Technology, 1995. http://hdl.handle.net/1853/23401.

Full text
APA, Harvard, Vancouver, ISO, and other styles
42

Sanhoty, Rafaat Mohamed Elsayed Seliman el. "Quality control for foods produced by genetic engineering." [S.l.] : [s.n.], 2004. http://deposit.ddb.de/cgi-bin/dokserv?idn=970670354.

Full text
APA, Harvard, Vancouver, ISO, and other styles
43

Altube, Vázquez Patricia. "Procedures for improved weather radar data quality control." Doctoral thesis, Universitat de Barcelona, 2016. http://hdl.handle.net/10803/400398.

Full text
Abstract:
Weather radar data and its downstream products are essential elements in weather surveillance and key parameters in the initialisation and validation of hydrological and meteorological models, among other downstream applications. Following the quality standards established by the European and global weather radar networking referents, the present thesis aims for the improvement of the base data quality control in the regional weather radar network operated by the Meteorological Service of Catalonia, the XRAD. This objective is accomplished through the analysis, development and implementation of new or existing procedures and algorithms for radar data quality assessment and improvement. Attending to the current radar technology and to the already implemented quality control procedures for the XRAD, the work is focused on the continuous evaluation of the radar system calibration status and on the correction of Doppler velocity data. The quality control algorithms and recommendations presented are easily translatable to any other operative weather radar networking environment. A Sun-based, fully automatic procedure for online monitoring of the antenna alignment and the receiver chain calibration is adapted and operationally implemented for the XRAD. This Sun-monitoring technique was developed at the Royal Netherlands and Finnish Meteorological Institutes and is included in the quality control flow of numerous weather radar networks around the world. The method is modified for a robust detection and characterisation of solar interferences in raw data at all scan elevations, even when only data at relatively short ranges are available. The modified detection algorithm is also suitable for detecting interferences from wireless devices, which are stored for monitoring their incidence in the XRAD. The solar interferences detected, in turn, are input observations for the inversion of a two-dimensional Gaussian model that yields estimates of the calibration parameters of interest. A complete theoretical derivation of the model establishes its validity limits and provides analytical estimates of the effective solar widths directly from radar parameters. Results of the application of this Sun-monitoring methodology to XRAD data reveal its ability to determine the accuracy of the antenna pointing and to detect changes in receiver calibration and radar system operation status. In order to facilitate the usage of the Sun-monitoring technique and the interpretation of its estimates, the methodology is reproduced under controlled conditions based on the distributions of solar observations collected by two of the XRAD radars. The analysis shows that the accuracy of the estimated calibration parameters is conditioned by the precision, number and distribution of the solar observations, which constitute key variables that need to be controlled to ensure reliable estimates. In addition, the Sun-monitoring technique is compared under actual operative conditions with two other common techniques for quantifying the antenna azimuth and elevation pointing offsets. Pointing bias estimates gathered in a dedicated short-term campaign are studied in a direct intercomparison of the methods that reflects the advantages and limitations in each case. The analysis of the bias estimates reported by the methods over the course of a one-year period reveals that the performance of the techniques depends on the antenna position at the time of the measurement.
After this study, a reanalysis of the Sun-monitoring method results is proposed, which additionally allows quantification of the antenna pedestal levelling error. Finally, a post-processing, spatial image filtering algorithm for identification and correction of unfolding errors in dual-PRF Doppler velocity data is proposed. The correction of these errors benefits the usage of radar velocity data in downstream applications such as wind-shear and mesocyclone detection algorithms or assimilation in numerical weather prediction models. The main strengths of the proposed algorithm, in comparison with existing correction techniques, are its robustness to the presence of clustered unfolding errors and that it can be employed independently of post-processing dealiasing algorithms. By means of simulated dual-PRF velocity fields, the correction ability of the algorithm is quantitatively analysed and discussed with particular emphasis on the correction of clustered errors. The quality improvement in real dual-PRF data brought about by the new algorithm is illustrated through application to three selected severe weather events registered by the XRAD.
Following the quality standards established for the European and global reference weather radar networks, this thesis aims to improve the quality control of the data from the regional weather radar network operated by the Meteorological Service of Catalonia (the XRAD). In view of the quality control procedures already implemented for the XRAD, the work focuses on the continuous evaluation of the calibration status of the radar system and on the correction of Doppler velocity data. A fully automatic, Sun-based procedure is adapted and applied to the XRAD that allows remote quantification of antenna alignment errors and of the radar receiver calibration. The method has been modified for robust detection and characterisation of solar interferences in raw radar data. The solar interferences are used to invert a physical model that provides estimates of the calibration parameters of interest. The modified detection algorithm is also suitable for identifying interferences from external electronic devices; these interferences are stored to monitor their incidence in the XRAD. The solar methodology is modelled under controlled conditions from the distribution of solar observations collected by two of the XRAD radars. The analysis shows that the precision, number and distribution of the solar observations are key variables that must be controlled to guarantee reliable estimates of the calibration parameters. In addition, the solar technique is compared, under real operating conditions, with two other techniques commonly used to quantify the antenna pointing error. From this study, a new method of analysing the solar interferences is proposed, which makes it possible to quantify the antenna pedestal levelling error. Finally, an image filtering algorithm is developed and validated for the identification and correction of the characteristic errors that occur in dual-PRF Doppler velocity data. The strengths of the proposed algorithm, compared with existing correction techniques, are its robustness in correcting clustered errors and the fact that it can be used independently of dealiasing algorithms. The improvement in the quality of real velocity data is illustrated by applying the algorithm to three severe weather events recorded by the XRAD.
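To illustrate the kind of model inversion described in this abstract, here is a hedged Python sketch that fits a two-dimensional Gaussian (in dB) to synthetic solar-interference readings as a function of azimuth and elevation offsets; the data, noise level and parameter values are invented, and the operational algorithm's quality controls and linearised solution are not reproduced.

# Minimal sketch: least-squares fit of a 2D Gaussian (in dB) to solar-interference
# power readings at (azimuth, elevation) offsets from the predicted Sun position.
# Synthetic data and parameter values are invented for illustration only.
import numpy as np
from scipy.optimize import curve_fit

def solar_model(xy, peak, d_az, d_el, w_az, w_el):
    """Peak power (dB), pointing offsets (deg) and effective widths (deg)."""
    az, el = xy
    return peak - 40 * np.log10(2) * (((az - d_az) / w_az) ** 2
                                      + ((el - d_el) / w_el) ** 2)

rng = np.random.default_rng(0)
az = rng.uniform(-1.5, 1.5, 200)              # azimuth offsets of detections (deg)
el = rng.uniform(-1.5, 1.5, 200)              # elevation offsets (deg)
true_params = (-110.0, 0.12, -0.08, 1.2, 1.1)  # "true" calibration parameters
power = solar_model((az, el), *true_params) + rng.normal(0.0, 0.3, az.size)

popt, _ = curve_fit(solar_model, (az, el), power, p0=(-105.0, 0.0, 0.0, 1.0, 1.0))
print("fitted peak power, az/el pointing offsets, widths:", np.round(popt, 3))

The fitted offsets play the role of the antenna pointing biases monitored operationally, and the fitted peak power tracks the receiver calibration.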
APA, Harvard, Vancouver, ISO, and other styles
44

Lau, Yuk-yee Sophia. "Seafood quality control and contamination in Hong Kong /." View the Table of Contents & Abstract, 2006. http://sunzi.lib.hku.hk/hkuto/record/B37120748.

Full text
APA, Harvard, Vancouver, ISO, and other styles
45

Khalidi, Mohammad Said Asem. "Multivariate quality control: statistical performance and economic feasibility." Diss., A link to full text of this thesis in SOAR, 2007. http://soar.wichita.edu/dspace/handle/10057/1077.

Full text
Abstract:
Thesis (Ph.D.)--Wichita State University, College of Engineering, Dept. of Industrial and Manufacturing Engineering.
"May 2007." Title from PDF title page (viewed on October 25, 2007). Thesis adviser: Gamal S. Weheba. Includes bibliographic references (leaves 96-101).
APA, Harvard, Vancouver, ISO, and other styles
46

Min, Jun Young. "Off-line quality control by robust parameter design." Manhattan, Kan. : Kansas State University, 2008. http://hdl.handle.net/2097/597.

Full text
APA, Harvard, Vancouver, ISO, and other styles
47

Nazir, Karnachi Nayeem A. "Control of the chemical quality of industrial wastewater." Thesis, Leeds Beckett University, 2008. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.500766.

Full text
Abstract:
Quality control of wastewater is an important treatment process, more so now than ever before. Due to the extremely unpredictable nature of the wastewater, which is a mixture of both inorganic and organic waste, it is very difficult to neutralise. Two approaches have been proposed in developing alternative control strategies as suggestions for the pH control of the wastewater in an industrial plant. The first is to develop a mathematical model of a continuously stirred tank reactor (CSTR), with possible use of MATLAB®. Three different control methods (linear, nonlinear and adaptive) are subjected to rigorous theoretical testing and are proposed as possible solutions. The second, parallel approach has been to build a laboratory-scale experimental reactor using a seven-litre continuously stirred tank with monitors for influent flow, influent pH and reactor tank pH. Results suggest that a more sophisticated controller than the simple PID control currently in operation could lend itself to overcoming the problem of persistent large spikes in the pH of the influent. Further work would consider the implementation of these results in the actual industrial wastewater treatment plant.
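For orientation only, the Python sketch below runs a discrete PID loop against a deliberately simple, linearised tank model; the gains, dosing model and disturbance are invented and do not reproduce the thesis's CSTR model or its nonlinear and adaptive controllers (real pH dynamics follow a strongly nonlinear titration curve, which is precisely why plain PID struggles).

# Minimal sketch: discrete PID regulation of pH in a toy, linearised tank model.
# Gains, dosing response and the acidic-influent disturbance are invented.
def pid_step(error, state, kp=2.0, ki=0.4, kd=0.1, dt=1.0):
    integral, prev_error = state
    integral += error * dt
    derivative = (error - prev_error) / dt
    u = kp * error + ki * integral + kd * derivative
    return u, (integral, error)

setpoint = 7.0
ph = 4.5                         # tank starts acidic
state = (0.0, 0.0)
for k in range(20):
    dose, state = pid_step(setpoint - ph, state)
    # Toy plant: reagent dose nudges tank pH up, while fresh acidic influent
    # pulls it back towards pH 4.5 each step.
    ph += 0.08 * dose - 0.05 * (ph - 4.5)
    print(f"step {k:2d}: dose = {dose:5.2f}, tank pH = {ph:4.2f}")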
APA, Harvard, Vancouver, ISO, and other styles
48

Boyi, Bukata Bala. "Evolutionary optimisation for Volt-VAR power quality control." Thesis, University of Glasgow, 2013. http://theses.gla.ac.uk/3882/.

Full text
Abstract:
With the more environmentally friendly smart grid initiatives of the past few years, intelligent operation and optimisation of the electricity distribution system have received increasing attention in power system research worldwide. Power flow from the distribution substation to the customer can be optimised at the Volt-Ampere-Reactive (VAR) level by reducing the reactive power. Distributed Generation (DG) and Renewable Energy Sources (RES) represent both the broadest potential and the broadest challenge for intelligent distribution systems and smart grid control. In general, the flexibility envisaged by integrating RES during the smart grid transformation is often surrounded by nonlinearities such as waveform deformations caused by harmonic currents or voltages, which implicitly increase control system complexity. Conventional controllers as presently implemented therefore need to be re-engineered in order to solve the power quality (PQ) problems therein. This work aims to improve the controllability of Distribution Static Compensators (DSTATCOMs) through the development of improved control systems using evolutionary-computation-enabled design automation and optimisation. The resultant Volt-VAR Control (VVC) optimises PQ in the presence of nonlinearities and uncertainties. It also aims to reduce the overall system's sensitivity to parameters not considered at the design stage, such as measurement noise, unmodelled dynamics and disturbances; this is otherwise known as the robustness of the system, and it offers valuable potential for future smart grid control, which is anticipated to present more nonlinearities due to virtual power plant (VPP) configurations. According to the European project FENIX, a Virtual Power Plant (VPP) aggregates the capacity of many diverse Distributed Energy Resources (DER); it creates a single operating profile from a composite of the parameters characterizing each DER and can incorporate the impact of the network on aggregate DER output. To solve PQ problems in particular, two objectives are realised in this thesis. First, a non-deterministic evolutionary algorithm (EA) is adopted to generate optimum fuzzy logic controllers for DSTATCOMs. This design methodology extends traditional computer-aided design (CAD) to computer-automated design (CAutoD), which provides a unified solution to diverse PQ problems automatically and efficiently. In realising this objective, the prediction ability of the derivative term in a proportional-derivative (PD) controller is improved by placing a rerouted derivative filter in the feedback path to tame ensuing oscillations. This method is then replicated in a fuzzy PD scheme and is automated through "generational" tuning using an evolutionary algorithm. Fuzzy logic controllers (FLCs) are rule-based systems designed around a fuzzy rule base (RB) related through an inference engine by means of fuzzy implication and compositional procedures. RBs are normally formulated in linguistic terms, in the form of if ... then rules, which can be derived through various techniques. Fundamentally, the correct choice of the membership functions of the linguistic set defines the performance of an FLC. In this context, a three-rule-base fuzzy mapping using the Macvicar-Whelan matrix has been incorporated in this scheme to reduce the computational cost and to avoid firing redundant rules.
The EA-Fuzzy strategy is proven to overcome the limitation of conventional optimisation, which may become trapped in local minima, as the optimisation problem is often multi-modal. The second objective of the thesis is the development of a novel advanced model-free predictive control (MFPC) system for DSTATCOMs through a deterministic non-gradient algorithm. The new method uses its "look-ahead" feature to predict and propose solutions to anticipated power quality problems before they occur. A describing-function-augmented DSTATCOM regime is arranged in a closed-loop fashion to locate limit cycles for settling the system's nonlinearities in a model-free zone. Predictive control is performed on the online-generated input-output data set through a non-gradient simplex algorithm. The strategy is to avoid the use of a system model, which is often based on gradient information and may thus be trapped in a local optimum or hindered by noisy data. As a model-free technique, the resultant system offers a reduction in system modelling or identification, which is often inaccurate, and also in computational load, since it operates directly on raw data from an online process while dealing with the partially known systems normally encountered in practical industrial problems. Steady-state and dynamic simulations of both control and simulation models in the Matlab/Simulink environment demonstrate the superiority of the new model-free approach over traditional trial-and-error based methods. The method has been verified to offer faster response and shorter settling time at zero overshoot when compared to existing methods. A SimPowerSystems simulation model is also developed to check the experimental validity of the designs, in which specific PQ problems such as harmonic distortion, voltage swells, voltage sags and flicker are solved; a noticeable, record-level THD reduction to 0.04% and 0.05%, respectively, has been achieved. It is therefore safe to recommend to industry the implementation of this model-free predictive control scheme at the distribution level. As the distribution system metamorphoses into a decentralised smart grid featuring connectivity of virtual power plants, mostly through power electronic converters such as the DSTATCOM, it stands to benefit from the fully automated Volt-VAR controllability afforded by the MFPC's low control rate. Based on CAutoD, the practical implementation of this technique is made possible through digital prototyping within the Real-Time Workshop, which automatically generates C or C++ code from Simulink and executes continuous- and discrete-time models directly on a vast range of computer platforms. Its overall wired closed-loop structure with the DSTATCOM would offer reliable and competitive advantages over its PID and SVC (CAD-based) counterparts, currently implemented through physical prototyping, in terms of quick product-to-market pace, reduced hardware size, small footprint and maintenance-free operation: since the scheme is model-free (and automated), adjusting controller timers and model contingencies is unnecessary, as it would be with conventional controllers. More importantly, the scheme performs the aforementioned control functions robustly at a high speed, in the range of 0.005 to 0.01 seconds, fast enough to capture and deal with any ensuing PQ problem emanating from changes in customer load and from system disturbances introduced by environmentally friendly but less grid-friendly renewable generators.
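As a loose, hedged illustration of the "non-gradient, model-free" tuning idea above, the Python sketch below scores candidate PD gains directly on simulated closed-loop response data and lets a Nelder-Mead simplex (no gradients) search for low-cost gains; the plant, gains and cost are invented, and none of the DSTATCOM, describing-function or fuzzy machinery of the thesis is reproduced.

# Non-gradient (Nelder-Mead simplex) tuning of PD gains on closed-loop data.
# The toy second-order plant and all constants are invented for illustration.
import numpy as np
from scipy.optimize import minimize

def closed_loop_cost(gains, dt=0.01, steps=500):
    kp, kd = gains
    x, v = 0.0, 0.0                       # plant state: position and velocity
    prev_err = 1.0                        # unit step demand, so initial error is 1
    cost = 0.0
    for _ in range(steps):
        err = 1.0 - x
        u = kp * err + kd * (err - prev_err) / dt
        prev_err = err
        v += dt * (u - 0.5 * v)           # toy plant: lightly damped double integrator
        x += dt * v
        if not np.isfinite(x) or abs(x) > 1e6:
            return 1e9                    # penalise unstable gain candidates
        cost += dt * err * err            # integral of squared error
    return cost

result = minimize(closed_loop_cost, x0=[1.0, 0.05], method="Nelder-Mead")
print("tuned (kp, kd):", np.round(result.x, 3), " ISE:", round(result.fun, 4))

Because the cost is evaluated only from simulated input-output data, no plant gradient or explicit model is needed, which is the property the thesis exploits with its simplex-based MFPC.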
APA, Harvard, Vancouver, ISO, and other styles
49

Lau, Yuk-yee Sophia, and 劉玉兒. "Seafood quality control and contamination in Hong Kong." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2006. http://hub.hku.hk/bib/B45013536.

Full text
APA, Harvard, Vancouver, ISO, and other styles
50

Hapuarachchi, Karunarathnage Piyasena. "Contributions to statistical quality control." 1988. http://hdl.handle.net/1993/16618.

Full text
APA, Harvard, Vancouver, ISO, and other styles
