Dissertations / Theses on the topic 'Virtual Square'

Consult the top 17 dissertations / theses for your research on the topic 'Virtual Square.'

1

Paulin, Jameel Amman. "Congo Square: Afrofuturism as a Space of Confrontation." The Ohio State University, 2020. http://rave.ohiolink.edu/etdc/view?acc_num=osu158712552620892.

2

Cardace, Antonio. "UMView, a Userspace Hypervisor Implementation." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2017. http://amslaurea.unibo.it/13184/.

Abstract:
UMView is a partial virtual machine and userspace hypervisor capable of intercepting system calls and modifying their behavior according to the calling process's view. To provide flexibility and modularity, UMView supports modules loadable at runtime through a plugin architecture. UMView is, in particular, an implementation of the View-OS concept, which rejects the global-view assumption so deeply entrenched in the world of operating systems and virtualization.
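The per-process "view" idea can be sketched in a few lines. The following Python model is purely illustrative: the class names, the module chain, and the toy `open` call are invented here for exposition, not UMView's actual C API.

```python
# Conceptual sketch (not UMView's real API): a per-process "view" is a
# chain of modules, each of which may claim and rewrite a system call.

class ViewModule:
    """A loadable module that virtualizes a subset of calls (hypothetical)."""
    def __init__(self, name, handlers):
        self.name = name
        self.handlers = handlers  # dict: syscall name -> callable

    def claims(self, syscall):
        return syscall in self.handlers

class ProcessView:
    """Per-process view: modules are consulted last-loaded-first; unclaimed
    calls fall through to the 'real' system (here, a plain dict FS)."""
    def __init__(self, real_fs):
        self.modules = []
        self.real_fs = real_fs

    def load(self, module):
        self.modules.append(module)

    def syscall(self, name, *args):
        for mod in reversed(self.modules):   # last-loaded module wins
            if mod.claims(name):
                return mod.handlers[name](*args)
        if name == "open":                   # fall through: the global view
            return self.real_fs[args[0]]
        raise NotImplementedError(name)

real_fs = {"/etc/hostname": "host-global"}
view = ProcessView(real_fs)
# A module that gives this one process its own /etc/hostname
view.load(ViewModule("fakehost", {"open": lambda path:
          "host-virtual" if path == "/etc/hostname" else real_fs[path]}))

print(view.syscall("open", "/etc/hostname"))  # prints 'host-virtual'
```

A process without the module loaded still sees the global value, which is exactly the per-process divergence that the global-view assumption forbids.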
3

Nalli, Michele. "Libvdeplug_agno: Ethernet encryption su reti virtuali." Bachelor's thesis, Alma Mater Studiorum - Università di Bologna, 2018. http://amslaurea.unibo.it/17340/.

Abstract:
This thesis describes the implementation of libvdeplug_agno, a cryptographic module for VDE4 that encrypts the traffic of a VDE virtual network. Libvdeplug_agno grew out of the need for a cryptographic tool usable alongside VXVDE, so that non-administrator users can be given shell access to a physical machine attached to the real network. To work effectively with VXVDE, the tool was designed to require minimal configuration. Libvdeplug_agno can also be used on point-to-point connections as a replacement for the older vde_cryptcab tool, since it is considerably more efficient, and, by exploiting TAP interfaces, it can even be used to encrypt physical networks. The code is available in the official repository: https://github.com/rd235/vdeplug_agno.
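As a rough illustration of Ethernet-level encryption (not agno's actual wire format, cipher, or key handling), the sketch below XORs a frame's payload with a counter-mode keystream while leaving the 14-byte Ethernet header in clear, one plausible design for an L2 encryptor that keeps switching usable:

```python
import hashlib
from itertools import count

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Counter-mode keystream from SHA-256 (illustrative, not agno's cipher)."""
    out = bytearray()
    for block in count():
        out += hashlib.sha256(key + nonce + block.to_bytes(8, "big")).digest()
        if len(out) >= length:
            return bytes(out[:length])

def encrypt_frame(key: bytes, nonce: bytes, frame: bytes) -> bytes:
    """XOR the payload with the keystream; the 14-byte Ethernet header
    (dst MAC, src MAC, EtherType) stays in clear so switches can forward."""
    header, payload = frame[:14], frame[14:]
    ks = keystream(key, nonce, len(payload))
    return header + bytes(a ^ b for a, b in zip(payload, ks))

decrypt_frame = encrypt_frame  # an XOR stream cipher is its own inverse

frame = bytes(14) + b"hello vde"
ct = encrypt_frame(b"k", b"n", frame)
assert decrypt_frame(b"k", b"n", ct) == frame  # round-trips
```

A real design would also authenticate the frame and manage per-peer nonces; this sketch only shows the payload-encryption step.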
4

Wilschutz, Seth Douglas. "Embodying Civil Society in Public Space: Re-Envisioning the Public Square of Mansfield, Ohio." University of Cincinnati / OhioLINK, 2005. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1112648545.

5

Pereira, Junior Ilton Ancelmo. "Sistema automatizado de medição e análise das propriedades magnéticas de materiais utilizando o quadro de Epstein." Universidade do Estado de Santa Catarina, 2011. http://tede.udesc.br/handle/handle/1907.

Abstract:
Funding: Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
This master's thesis presents a study of how frequency variation influences the magnetic properties of materials. Experiments are carried out on an Epstein frame (Epstein square), with measurements acquired through a PC-based data acquisition system controlled by software developed in LabVIEW. The results include the magnetization curve, the hysteresis loop, and the specific losses for several preset frequencies, together with an evaluation of the harmonic distortion of the current and voltage signals during the tests. The thesis closes with a methodology for cancelling the harmonics generated by steel saturation, using a voltage source with adjustable harmonic voltages.
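The harmonic-distortion evaluation mentioned in the abstract reduces to computing THD from the sampled voltage and current waveforms. A minimal stdlib-Python sketch (illustrative only, not the thesis's LabVIEW code):

```python
import math

def dft_mag(samples, k):
    """Magnitude of the k-th DFT bin (naive O(N) per bin)."""
    n = len(samples)
    re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(samples))
    im = -sum(s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(samples))
    return math.hypot(re, im)

def thd(samples, fundamental_bin, n_harmonics=10):
    """Total harmonic distortion: harmonic amplitude (RSS) over fundamental."""
    v1 = dft_mag(samples, fundamental_bin)
    vh = math.sqrt(sum(dft_mag(samples, fundamental_bin * h) ** 2
                       for h in range(2, n_harmonics + 2)))
    return vh / v1

# A test "voltage": fundamental plus a 20% third harmonic, one full period
N = 400
wave = [math.sin(2 * math.pi * i / N) + 0.2 * math.sin(2 * math.pi * 3 * i / N)
        for i in range(N)]
print(round(thd(wave, fundamental_bin=1), 3))  # 0.2
```

Sampling over an integer number of periods, as here, keeps the harmonics on exact DFT bins and avoids spectral leakage.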
6

Pareschi, Federico. "Applying partial virtualization on ELF binaries through dynamic loaders." Bachelor's thesis, Alma Mater Studiorum - Università di Bologna, 2013. http://amslaurea.unibo.it/5065/.

Abstract:
The technology of partial virtualization is a revolutionary approach to the world of virtualization. It lies directly in between full system virtual machines (like QEMU or Xen) and application-level virtual machines (like the JVM or the CLR). The ViewOS project is the flagship of this technique. Developed by the Virtual Square laboratory, it provides an abstract view of the underlying system resources on a per-process basis and works against the principle of the Global View Assumption. Virtual Square provides several different methods to achieve partial virtualization within the ViewOS system, at both user and kernel levels; each of these approaches has its own advantages and shortcomings. This thesis analyses the different virtualization methods and the problems related to both the generic and partial virtualization worlds. It is the result of an in-depth study and search for a new technology for partial virtualization based on dynamically linked ELF binaries. It starts with a brief analysis of currently available virtualization alternatives and then describes the ViewOS system, highlighting its current shortcomings. The vloader project is then proposed as a possible solution to some of these inconveniences, with a working proof of concept and examples that outline the potential of this new virtualization technique. By injecting specific code and libraries into the middle of the binary-loading mechanism provided by the ELF standard, the vloader project enables a streamlined and simplified approach to tracing system calls. With the advantages outlined here, this method offers better performance and portability than the currently available ViewOS implementations. Some of its disadvantages are also discussed, along with their possible solutions.
7

Ben, Salem Aymen. "The Application of Multiuser Detection to Spectrally Efficient MIMO or Virtual MIMO SC-FDMA Uplinks in LTE Systems." Thèse, Université d'Ottawa / University of Ottawa, 2013. http://hdl.handle.net/10393/30351.

Abstract:
Single-carrier frequency-division multiple access (SC-FDMA) is a multiple-access transmission scheme adopted in the 4th-generation 3GPP Long Term Evolution (LTE) of cellular systems. Its relatively low peak-to-average power ratio (PAPR) makes it ideal for the uplink, where transmit power efficiency is of paramount importance. Multiple access among users is achieved by assigning different users to different sets of non-overlapping subcarriers. With the current LTE specifications, if an SC-FDMA system is operating at full capacity and a new user requests channel access, the system redistributes the subcarriers so that it can accommodate all of the users. Having fewer subcarriers for transmission, every user must increase its modulation order (for example, from QPSK to 16QAM) to keep the same transmission rate. However, increasing the modulation order is not always possible in practice and may introduce considerable complexity to the system. The technique presented in this thesis describes a new way of adding users to an SC-FDMA system by assigning the same sets of subcarriers to different users. Its main advantage is that the system can accommodate more users than conventional SC-FDMA, which corresponds to increasing the spectral efficiency without requiring a higher modulation order or more bandwidth. In this work, special attention was paid to the cases where two and three source signals are transmitted on the same set of subcarriers, which doubles and triples the spectral efficiency, respectively. Simulation results show that, with the proposed technique, it is possible to add users to any SC-FDMA system without increasing the bandwidth or the modulation order while keeping the same bit error rate (BER) performance as conventional SC-FDMA. This is achieved by slightly increasing the energy-per-bit to noise-power-spectral-density ratio (Eb/N0) at the transmitters.
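The PAPR advantage of SC-FDMA over plain OFDMA that motivates this work can be demonstrated in a few lines. The sketch below uses a naive DFT and a full subcarrier allocation (M = N), the degenerate case in which DFT-spreading collapses the transmit signal back to the single-carrier symbol stream; the modulation and sizes are illustrative, not the thesis's simulation setup.

```python
import cmath, math, random

def dft(x, inverse=False):
    """Naive DFT/IDFT (O(n^2)); fine for a 64-point illustration."""
    n = len(x)
    s = 1 if inverse else -1
    out = [sum(v * cmath.exp(s * 2j * cmath.pi * k * i / n)
               for i, v in enumerate(x)) for k in range(n)]
    return [v / n for v in out] if inverse else out

def papr_db(signal):
    """Peak-to-average power ratio of a complex baseband signal, in dB."""
    powers = [abs(v) ** 2 for v in signal]
    return 10 * math.log10(max(powers) * len(powers) / sum(powers))

random.seed(1)
qpsk = [complex(random.choice((-1, 1)), random.choice((-1, 1)))
        for _ in range(64)]

# OFDMA: data symbols go straight onto subcarriers, then IDFT to time domain.
ofdma = dft(qpsk, inverse=True)
# SC-FDMA: DFT-spread the symbols first; with M == N the time-domain
# signal is just the QPSK stream again, so its envelope is constant.
scfdma = dft(dft(qpsk), inverse=True)

print(f"OFDMA PAPR:   {papr_db(ofdma):.1f} dB")
print(f"SC-FDMA PAPR: {papr_db(scfdma):.1f} dB")  # ~0 dB for QPSK
```

With a partial (localized) allocation the SC-FDMA PAPR is above 0 dB but still well below OFDMA's, which is the property the uplink exploits.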
8

Skehan, Daniel Patrick. "Virtual Training System for Diagnostic Ultrasound." Digital WPI, 2011. https://digitalcommons.wpi.edu/etd-theses/1068.

Abstract:
Ultrasound has become a widely used form of medical imaging because it is low-cost, safe, and portable. However, it is heavily dependent on the skill of the operator to capture quality images and properly detect abnormalities. Training is a key component of ultrasound, but the limited availability of training courses and programs presents a significant obstacle to the wider use of ultrasound systems. The goal of this work was to design and implement an interactive training system to help train and evaluate sonographers. The Virtual Training System for Diagnostic Ultrasound is an inexpensive, software-based training system in which the trainee scans a generic scan surface with a sham transducer containing position and orientation sensors. The observed ultrasound image is generated from a pre-stored 3D image volume and is controlled interactively by the user's movements of the sham transducer. The patient in the virtual environment, represented by the 3D image data, may depict normal anatomy, exhibit a specific trauma, or present a given physical condition. The training system provides a realistic scanning experience through an interactive real-time display with adjustable image parameters similar to those of an actual diagnostic ultrasound system. The system has been designed to require little hardware, keeping it low-cost and portable; the software runs on a PC. To represent the patient to be scanned, a specific scan surface was produced that allows an optical sensor to track the position of the sham transducer. The orientation of the sham transducer is tracked with an inexpensive inertial measurement unit whose quaternion output is integrated into the system. The lack of a physical manikin is overcome by a visual implementation of a virtual patient in the software, along with a virtual transducer that reflects the user's movements on the scan surface. Pre-processing is performed on the selected 3D image volume to provide coordinate-transformation parameters that yield a least-mean-square fit from the scan surface to the scanning region of the virtual patient. This thesis presents a prototype training system accomplishing the main goals of being low-cost, portable, and accurate. The ultrasound training system can provide cost-effective and convenient training of physicians and sonographers, and has the potential to become a powerful tool for training sonographers to recognize a wide variety of medical conditions.
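Orientation tracking with quaternions, as used for the sham transducer, reduces to rotating vectors by unit quaternions. A minimal stdlib sketch of that operation (illustrative only, not the thesis's implementation):

```python
import math

def q_mul(a, b):
    """Hamilton product of two quaternions (w, x, y, z)."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def q_conj(q):
    w, x, y, z = q
    return (w, -x, -y, -z)

def q_from_axis_angle(axis, angle):
    """Unit quaternion for a rotation of `angle` radians about a unit axis."""
    s = math.sin(angle / 2)
    return (math.cos(angle / 2), axis[0]*s, axis[1]*s, axis[2]*s)

def rotate(v, q):
    """Rotate vector v by unit quaternion q: q * (0, v) * q_conjugate."""
    w, x, y, z = q_mul(q_mul(q, (0.0, *v)), q_conj(q))
    return (x, y, z)

# A 90-degree turn about z takes the x axis to the y axis
q = q_from_axis_angle((0, 0, 1), math.pi / 2)
v = rotate((1, 0, 0), q)
print(tuple(round(c, 6) + 0.0 for c in v))  # (0.0, 1.0, 0.0)
```

An IMU typically integrates gyroscope rates into such a quaternion and corrects drift with the accelerometer; only the final vector-rotation step is shown here.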
9

Frémondière, Pierre. "L'évolution de l'accouchement dans la lignée humaine. Estimation de la contrainte fœto-pelvienne par deux méthodes complémentaires : la simulation numérique de l'accouchement et l'analyse discriminante des modalités d'accouchement au sein d'un échantillon obstétrical." Thesis, Aix-Marseille, 2015. http://www.theses.fr/2015AIXM5013.

Abstract:
The purpose of this thesis is to estimate delivery outcomes for extinct hominids, using two complementary methods: numerical simulation of childbirth and discriminant analysis of delivery outcomes in an obstetrical sample. First, kriging is used to construct meshes of pelves and neonatal skulls. The fossil hominid specimens included in the study are Australopithecines, early Homo (EH), and middle-to-late Pleistocene Homo (MEPH). Fetal cranial dimensions are estimated by applying human and chimpanzee cranial growth curves in reverse to measurements of juvenile skulls. "Virtual" dyads are formed from the pelves and neonatal skulls, and childbirth is simulated for each dyad, considering different levels of laxity of the sacro-iliac junction and different positions of the fetal head. Finally, an obstetrical sample is used: delivery outcome is recorded, CT scans provide maternal pelvic measurements, and the diameters of the fetal head are measured after delivery. A discriminant analysis on this sample separates delivery outcomes on the basis of fetal-pelvic measurements, and the fossil dyads are then entered into the analysis to assess the delivery outcomes to which they most likely belong. The results suggest a small fetal-pelvic constraint for Australopithecines, a moderate constraint for EH, and a more substantial constraint for MEPH. We suggest that rotational birth appeared with EH and that the curved trajectory of the fetal head appeared with MEPH; the emergence of both is probably explained by two major increases in brain size during the middle and late Pleistocene.
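The discriminant step can be illustrated with a deliberately simplified stand-in: a nearest-centroid classifier on hypothetical fetal-pelvic ratios. The numbers below are invented for illustration; the thesis applies a proper discriminant analysis to real obstetrical data.

```python
def centroid(rows):
    """Component-wise mean of a list of equal-length tuples."""
    n = len(rows)
    return [sum(r[i] for r in rows) / n for i in range(len(rows[0]))]

def classify(x, centroids):
    """Assign x to the class whose centroid is nearest (Euclidean)."""
    def d2(a, b):
        return sum((u - v) ** 2 for u, v in zip(a, b))
    return min(centroids, key=lambda label: d2(x, centroids[label]))

# Toy fetal-pelvic ratios (hypothetical): each row is the fetal head
# diameter divided by the pelvic inlet, midplane, and outlet diameters.
eutocic  = [(0.80, 0.78, 0.75), (0.82, 0.80, 0.77), (0.79, 0.81, 0.74)]
dystocic = [(0.97, 0.99, 0.96), (1.01, 0.98, 1.00), (0.99, 1.02, 0.97)]
cents = {"eutocic": centroid(eutocic), "dystocic": centroid(dystocic)}

# A "fossil dyad" is classified by dropping its ratios into the model
print(classify((0.81, 0.79, 0.76), cents))  # eutocic
```

This mirrors the thesis's workflow in miniature: fit class summaries on the obstetrical sample, then place each fossil dyad on the side of the boundary it falls nearest to.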
10

Jones, Jason M. "Games for training leveraging commercial off the shelf multiplayer gaming software for infantry squad collective training." Thesis, Monterey, California. Naval Postgraduate School, 2005. http://hdl.handle.net/10945/2047.

Abstract:
Combat arms units (both Marine and Army) often do not have enough people, time, and resources to properly train collective tasks at the squad level. Resources are often retained by higher headquarters because of tight deployment schedules, land restrictions, logistics constraints, and a myriad of other reasons. Given the current operational demands on combat arms brigades and regiments, the reality of limited resources is often a contributing factor in poor performance at the squad level, and leaders at all levels will need to look for innovative ways to sustain training at the small-unit level. This study examined the collective and leader tasks required for successful execution of Infantry squad missions (using the Army Training and Evaluation Plan, ARTEP 7-8 Drill) and how those tasks could be trained with commercial off-the-shelf multiplayer gaming software. The end state of this research is an initial analysis of which collective skills games can be used to train at the Infantry squad level, and a recommended training model for integrating this tool into existing unit plans.
11

Oqielat, Moa'ath Nasser. "Modelling water droplet movement on a leaf surface." Queensland University of Technology, 2009. http://eprints.qut.edu.au/30232/.

Abstract:
The central aim for the research undertaken in this PhD thesis is the development of a model for simulating water droplet movement on a leaf surface and to compare the model behavior with experimental observations. A series of five papers has been presented to explain systematically the way in which this droplet modelling work has been realised. Knowing the path of the droplet on the leaf surface is important for understanding how a droplet of water, pesticide, or nutrient will be absorbed through the leaf surface. An important aspect of the research is the generation of a leaf surface representation that acts as the foundation of the droplet model. Initially a laser scanner is used to capture the surface characteristics for two types of leaves in the form of a large scattered data set. After the identification of the leaf surface boundary, a set of internal points is chosen over which a triangulation of the surface is constructed. We present a novel hybrid approach for leaf surface fitting on this triangulation that combines Clough-Tocher (CT) and radial basis function (RBF) methods to achieve a surface with a continuously turning normal. The accuracy of the hybrid technique is assessed using numerical experimentation. The hybrid CT-RBF method is shown to give good representations of Frangipani and Anthurium leaves. Such leaf models facilitate an understanding of plant development and permit the modelling of the interaction of plants with their environment. The motion of a droplet traversing this virtual leaf surface is affected by various forces including gravity, friction and resistance between the surface and the droplet. The innovation of our model is the use of thin-film theory in the context of droplet movement to determine the thickness of the droplet as it moves on the surface. Experimental verification shows that the droplet model captures reality quite well and produces realistic droplet motion on the leaf surface. 
Most importantly, we observed that the simulated droplet motion follows the contours of the surface and spreads as a thin film. In the future, the model may be applied to determine the path of a droplet of pesticide along a leaf surface before it falls from or comes to a standstill on the surface. It will also be used to study the paths of many droplets of water or pesticide moving and colliding on the surface.
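The RBF half of the hybrid CT-RBF surface fit can be sketched with Gaussian basis functions and a small dense solve. This is illustrative only: the thesis combines RBFs with Clough-Tocher elements on a triangulated leaf scan, while the toy data below is a smooth saddle sampled on a grid.

```python
import math

def gauss_solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def rbf_fit(points, values, eps=2.0):
    """Interpolate scattered (x, y) -> z data with Gaussian RBFs."""
    phi = lambda r: math.exp(-(eps * r) ** 2)
    dist = lambda p, q: math.hypot(p[0] - q[0], p[1] - q[1])
    A = [[phi(dist(p, q)) for q in points] for p in points]
    w = gauss_solve(A, values)
    return lambda x, y: sum(wi * phi(dist((x, y), p))
                            for wi, p in zip(w, points))

# Height samples from a saddle-like "leaf patch" z = x * y (toy data)
pts = [(x * 0.5, y * 0.5) for x in range(-2, 3) for y in range(-2, 3)]
zs = [x * y for x, y in pts]
surf = rbf_fit(pts, zs)
print(round(surf(0.5, 0.5), 6))  # 0.25 -- exact at a data site
```

The shape parameter `eps` trades off smoothness against conditioning of the interpolation matrix; production codes pick it (and the basis) far more carefully than this sketch does.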
12

Durán, Alcaide Ángel. "Development of high-performance algorithms for a new generation of versatile molecular descriptors. The Pentacle software." Doctoral thesis, Universitat Pompeu Fabra, 2010. http://hdl.handle.net/10803/7201.

Abstract:
The work of this thesis focused on the development of high-performance algorithms for a new generation of molecular descriptors with many advantages over their predecessors, suitable for diverse applications in the field of drug design, and on their implementation in commercial-grade scientific software (Pentacle). As a first step, we developed a new algorithm (AMANDA) for discretizing molecular interaction fields, which efficiently extracts their most interesting regions. This algorithm was incorporated into a new generation of alignment-independent molecular descriptors, named GRIND-2. The computing speed and efficiency of the new algorithm allow these descriptors to be applied in virtual screening. In addition, we developed a new alignment-independent encoding algorithm (CLACC) that produces quantitative structure-activity relationship models with better predictive ability that are easier to interpret than those obtained with other methods.
13

Muriithi, Paul Mutuanyingi. "A case for memory enhancement : ethical, social, legal, and policy implications for enhancing the memory." Thesis, University of Manchester, 2014. https://www.research.manchester.ac.uk/portal/en/theses/a-case-for-memory-enhancement-ethical-social-legal-and-policy-implications-for-enhancing-the-memory(bf11d09d-6326-49d2-8ef3-a40340471acf).html.

Abstract:
The desire to enhance and make ourselves better is not a new one; it has continued to intrigue throughout the ages. Individuals have long sought ways to improve their well-being, for example through nutrition, physical exercise, and education. Crucial to this improvement is improving the ability to remember, so people interested in improving their well-being are often interested in memory as well; the desire to improve one's memory is almost certainly as old as the desire to improve one's well-being. Traditionally, people have used different means to enhance their memories: in learning, through storytelling, studying, and apprenticeship; in remembering, through practices like mnemonics, repetition, singing, and drumming; in retaining, storing, and consolidating memories, through nutrition and stimulants like coffee to help keep awake, and through external aids like notepads and computers; and in forgetting, through rituals and rites. Recent scientific advances in biotechnology, nanotechnology, molecular biology, neuroscience, and information technologies present a wide variety of technologies to enhance many different aspects of human functioning. Some commentators have thus identified human enhancement as central, and as one of the most fascinating subjects in bioethics in the last two decades. Within this period, most commentators have addressed the ethical, social, legal, and policy (ESLP) issues of human enhancement as a whole rather than of specific enhancements. This is problematic, and various commentators have recently found it deficient, calling instead for a contextualized, case-by-case analysis of particular enhancements, for example genetic enhancement, moral enhancement, and, in my case, memory enhancement (ME). The rationale is that the reasons for accepting or rejecting a particular enhancement vary depending on the enhancement itself. Given this enormous variation, moral and legal generalizations about all enhancement processes and technologies are unwise; they should instead be evaluated individually. Taking this as a point of departure, this research focuses specifically on making a case for ME and, in doing so, on assessing the ESLP implications arising from ME. My analysis draws on the existing literature for and against enhancement, especially in part two of this thesis, but is novel in providing a much more in-depth analysis of ME. From this perspective, I contribute to the ME debate through two reviews that address how we enhance the memory, and through four original papers, discussed in part three, where I critically examine and evaluate specific ESLP issues that arise with the use of ME. In the conclusion, I amalgamate all my contributions to the ME debate and suggest a future direction for it.
14

Fremondière, Pierre. "L'évolution de l'accouchement dans la lignée humaine. Estimation de la contrainte fœto-pelvienne par deux méthodes complémentaires : la simulation numérique de l'accouchement et l'analyse discriminante des modalités d'accouchement au sein d'un échantillon obstétrical." Thesis, 2015. http://www.theses.fr/2015AIXM5013/document.

Abstract:
The purpose of this thesis is to estimate delivery outcomes for extinct hominids, using two complementary methods: numerical simulation of childbirth and discriminant analysis of delivery outcomes in an obstetrical sample. First, kriging is used to construct meshes of pelves and neonatal skulls. The fossil hominid specimens included in the study are Australopithecines, early Homo (EH), and middle-to-late Pleistocene Homo (MEPH). Fetal cranial dimensions are estimated by applying human and chimpanzee cranial growth curves in reverse to measurements of juvenile skulls. "Virtual" dyads are formed from the pelves and neonatal skulls, and childbirth is simulated for each dyad, considering different levels of laxity of the sacro-iliac junction and different positions of the fetal head. Finally, an obstetrical sample is used: delivery outcome is recorded, CT scans provide maternal pelvic measurements, and the diameters of the fetal head are measured after delivery. A discriminant analysis on this sample separates delivery outcomes on the basis of fetal-pelvic measurements, and the fossil dyads are then entered into the analysis to assess the delivery outcomes to which they most likely belong. The results suggest a small fetal-pelvic constraint for Australopithecines, a moderate constraint for EH, and a more substantial constraint for MEPH. We suggest that rotational birth appeared with EH and that the curved trajectory of the fetal head appeared with MEPH; the emergence of both is probably explained by two major increases in brain size during the middle and late Pleistocene.
15

Yeh, Ya Lun [葉雅綸]. "Constructing a Virtual Metrology Framework for Halftone thickness based on Partial Least Squares and an Empirical Study." Thesis, 2016. http://ndltd.ncl.edu.tw/handle/60454192367212341929.

Abstract:
Master's thesis
National Tsing Hua University
Department of Industrial Engineering and Engineering Management
Academic year 104 (ROC calendar)
In a highly competitive and capital-intensive industry such as the panel industry, manufacturers have adopted halftone masks to keep their competitive advantage: using a halftone mask reduces the number of process steps to four. However, halftone masks easily lead to photoresist non-uniformity, so panel makers must control resist uniformity to ensure product quality. Given the cost in time and equipment, industry typically relies on sampling to monitor product quality, but sampling does not guarantee total quality management. In this study, we collect historical data and use partial least squares (PLS) to construct a virtual metrology framework that predicts halftone thickness. Once the prediction model is built, it not only reduces the measurement frequency but also helps a panel maker monitor all production equipment, react to deviation problems, and reduce product cycle time, thereby achieving high-capacity goals. The method was validated in cooperation with a well-known Taiwanese panel company; the MAPE on the validation data set was 3.96%, indicating a well-fitting prediction model.
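The reported 3.96% figure is the mean absolute percentage error of the virtual-metrology predictions against measured thickness; the metric itself is one line (the thickness values below are invented for illustration, not the thesis's data):

```python
def mape(actual, predicted):
    """Mean absolute percentage error, in percent."""
    return 100 * sum(abs((a - p) / a)
                     for a, p in zip(actual, predicted)) / len(actual)

# Hypothetical measured vs. virtual-metrology halftone thicknesses
measured  = [100.0, 102.0, 98.0, 101.0]
predicted = [ 97.0, 106.0, 99.0, 100.0]
print(round(mape(measured, predicted), 2))  # 2.23
```

MAPE is scale-free, which makes it a convenient acceptance criterion for a prediction model across products with different nominal thicknesses.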
16

Lu, Bo. "Improving process monitoring and modeling of batch-type plasma etching tools." Thesis, 2015. http://hdl.handle.net/2152/30486.

Abstract:
Manufacturing equipments in semiconductor factories (fabs) provide abundant data and opportunities for data-driven process monitoring and modeling. In particular, virtual metrology (VM) is an active area of research. Traditional monitoring techniques using univariate statistical process control charts do not provide immediate feedback to quality excursions, hindering the implementation of fab-wide advanced process control initiatives. VM models or inferential sensors aim to bridge this gap by predicting of quality measurements instantaneously using tool fault detection and classification (FDC) sensor measurements. The existing research in the field of inferential sensor and VM has focused on comparing regressions algorithms to demonstrate their feasibility in various applications. However, two important areas, data pretreatment and post-deployment model maintenance, are usually neglected in these discussions. Since it is well known that the industrial data collected is of poor quality, and that the semiconductor processes undergo drifts and periodic disturbances, these two issues are the roadblocks in furthering the adoption of inferential sensors and VM models. In data pretreatment, batch data collected from FDC systems usually contain inconsistent trajectories of various durations. Most analysis techniques requires the data from all batches to be of same duration with similar trajectory patterns. These inconsistencies, if unresolved, will propagate into the developed model and cause challenges in interpreting the modeling results and degrade model performance. To address this issue, a Constrained selective Derivative Dynamic Time Warping (CsDTW) method was developed to perform automatic alignment of trajectories. CsDTW is designed to preserve the key features that characterizes each batch and can be solved efficiently in polynomial time. Variable selection after trajectory alignment is another topic that requires improvement. 
To this end, the proposed Moving Window Variable Importance in Projection (MW-VIP) method yields a more robust set of variables with demonstrably stronger long-term correlation with the predicted output. In model maintenance, model adaptation has been the standard solution for dealing with drifting processes. However, most case studies preprocess the model-update data offline, implicitly assuming that the adaptation data are free of faults and outliers, which is often not true in practical implementations. To this end, a moving-window scheme using Total Projection to Latent Structures (T-PLS) decomposition screens incoming updates to separate harmless process noise from the outliers that negatively affect the model. The integrated approach was demonstrated to be more robust. In addition, model adaptation is very inefficient when there are multiplicities in the process; multiplicities can arise from process nonlinearity, switches in product grade, or different operating conditions. A growing-structure multiple-model system using local PLS and PCA models has been proposed to improve model performance around process conditions with multiplicity. The use of local PLS and PCA models allows the method to handle a much larger set of inputs and overcomes several challenges of mixture-model systems. In addition, fault detection sensitivity is improved by using the multivariate monitoring statistics of these local PLS/PCA models. These proposed methods are tested on two plasma etch data sets provided by Texas Instruments. In addition, a proof of concept using virtual metrology in a controller performance assessment application was also tested.
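The screened-adaptation idea can be sketched in a much simplified form, with ordinary least squares standing in for the T-PLS decomposition. The function name, window mechanics, and residual threshold are illustrative assumptions, not the thesis's method:

```python
import numpy as np

def screened_update(X_win, y_win, x_new, y_new, threshold=3.0):
    """Moving-window least-squares update that screens incoming samples.

    Fits the current window, then accepts the new sample only if its
    prediction residual lies within `threshold` standard deviations of
    the window residuals; otherwise the sample is rejected as an
    outlier and the model window is left unchanged.
    """
    beta, *_ = np.linalg.lstsq(X_win, y_win, rcond=None)
    resid = y_win - X_win @ beta
    sigma = resid.std(ddof=1)
    new_resid = y_new - x_new @ beta
    if abs(new_resid) > threshold * sigma:
        return X_win, y_win, False  # outlier: do not adapt
    # slide the window: drop the oldest sample, append the new one
    X_win = np.vstack([X_win[1:], x_new])
    y_win = np.append(y_win[1:], y_new)
    return X_win, y_win, True
```

The same gate-then-adapt pattern applies when the screening statistic comes from a T-PLS or PCA monitoring model instead of raw regression residuals.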
APA, Harvard, Vancouver, ISO, and other styles
17

Parker, Joana Ventura. "Process Analytics for Chemical Reactions Modelling." Master's thesis, 2020. http://hdl.handle.net/10316/92103.

Full text
Abstract:
Master's dissertation (Mestrado Integrado) in Chemical Engineering presented to the Faculdade de Ciências e Tecnologia
Patient safety and drug efficacy are the major concerns in the pharmaceutical industry, for which product quality assurance is essential. Conventionally, pharmaceutical manufacturing consists of batch processing with off-line laboratory testing conducted on collected samples to monitor the product's critical quality attributes (CQAs). However, throughout the years, several technologies with the potential to increase insight into the product and the manufacturing process have emerged, allowing for improved pharmaceutical development, manufacturing, and quality assurance through process control and analysis of process data. These technologies, process analytical technologies (PAT), employ instrumentation and mathematical modelling to continuously monitor the CQAs of a pharmaceutical product, allowing the shift from testing the quality of the finished drug product with off-line analytical methods to assuring the product's quality by continuous, real-time monitoring of its attributes. PAT on-line sensors provide the information needed to infer key quality-control variables of the system, albeit corrupted by noise, biases, and device inaccuracies. The quality of this information can be improved through the application of an optimal estimation algorithm that combines the available measurement data with prior knowledge of the system and of the measuring devices. In this project, infrared (IR) spectroscopy is used to monitor the concentration of the limiting reagent of a reaction in real time. A PLS model is calibrated from the collected IR data to predict the concentration of the reagent, and a kinetic model is developed to describe the behaviour of the system from first principles. This information is combined by applying the hybrid extended Kalman filter (HEKF) algorithm. This work demonstrates the favourable outcomes of applying Kalman-filter-type algorithms to combine sensor and mechanistic information, yielding predictions that are more accurate than those obtained from either source used individually. The application of the HEKF algorithm yields an accuracy improvement of 50.3 % with respect to the PLS model, and of 80.0 % with respect to the kinetic model. It also makes it possible to identify the reaction's end time 15 minutes before its occurrence.
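The sensor/model fusion can be sketched with a scalar Kalman filter for a first-order reaction. All names, the rate constant, and the noise variances below are illustrative assumptions; the thesis uses a hybrid *extended* Kalman filter with a full mechanistic model, and its measurements come from a PLS model on IR spectra rather than direct concentration readings:

```python
import numpy as np

def kf_concentration(z_meas, c0, k, dt, q=1e-4, r=1e-2):
    """Scalar Kalman filter fusing a first-order kinetic model with
    noisy concentration measurements.

    Predict step propagates the kinetic model dC/dt = -k*C
    (discretised as C <- C * exp(-k*dt)); update step blends in each
    measurement z via the Kalman gain. q and r are the assumed
    process- and measurement-noise variances.
    """
    c, p = c0, 1.0           # state estimate and its variance
    f = np.exp(-k * dt)      # state-transition factor per step
    estimates = []
    for z in z_meas:
        # predict with the kinetic model
        c, p = f * c, f * p * f + q
        # update with the measurement
        K = p / (p + r)      # Kalman gain
        c = c + K * (z - c)
        p = (1.0 - K) * p
        estimates.append(c)
    return np.array(estimates)
```

Because the prediction step trusts the mechanistic model between measurements while the update step corrects any model drift, the fused estimate is typically more accurate than either the raw sensor signal or the open-loop kinetic model alone, which is the qualitative result the abstract reports.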
APA, Harvard, Vancouver, ISO, and other styles
