
Dissertations / Theses on the topic 'Full automation'



Consult the top 50 dissertations / theses for your research on the topic 'Full automation.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Press it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.

1

Heralic, Almir. "Towards full Automation of Robotized Laser Metal-wire Deposition." Licentiate thesis, University West, Department of Engineering Science, 2009. http://urn.kb.se/resolve?urn=urn:nbn:se:hv:diva-2148.

Full text
Abstract:
Metal wire deposition by means of robotized laser welding offers great saving potential, i.e. reduced costs and reduced lead times, in many different applications, such as fabrication of complex components, repair or modification of high-value components, rapid prototyping and low-volume production, especially if the process can be automated. Metal deposition is a layered manufacturing technique that builds metal structures by melting metal wire into beads which are deposited side by side and layer upon layer. This thesis presents a system for on-line monitoring and control of robotized laser metal-wire deposition (RLMwD). The task is to ensure a stable deposition process with a correct geometrical profile of the resulting part and sound metallurgical properties. Issues regarding sensor calibration, system identification and control design are discussed. The suggested controller maintains a constant bead height and width throughout the deposition process. It is evaluated through real experiments, although these are limited to straight-line deposition. Solutions towards a more general controller, i.e. one that can handle different deposition paths, are suggested.

A method is also proposed for how an operator can use different sensor information for process understanding, process development and manual on-line control. The strategies are evaluated through different deposition tasks; the considered materials are tool steel and Ti-6Al-4V. The developed monitoring system enables an operator to control the process at a safe distance from the hazardous laser beam.

The results obtained in this work indicate promising steps towards full automation of the RLMwD process, i.e. without human intervention and for arbitrary deposition paths.
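The abstract describes a controller that holds bead height and width constant but does not give its structure here. The following is a minimal, hypothetical sketch of a decoupled PI scheme, assuming height is corrected through the wire feed rate and width through the travel speed; both pairings and all gains are illustrative assumptions, not the thesis's actual design.

```python
class PI:
    """Simple proportional-integral controller (illustrative gains only)."""
    def __init__(self, kp, ki, dt):
        self.kp, self.ki, self.dt = kp, ki, dt
        self.integral = 0.0

    def step(self, error):
        self.integral += error * self.dt
        return self.kp * error + self.ki * self.integral

height_loop = PI(kp=0.8, ki=0.2, dt=0.1)  # corrects bead height via wire feed rate
width_loop = PI(kp=0.5, ki=0.1, dt=0.1)   # corrects bead width via travel speed

def control_step(h_ref, w_ref, h_meas, w_meas, wire_feed, travel_speed):
    """One sampling instant: return updated (wire_feed, travel_speed) commands."""
    wire_feed += height_loop.step(h_ref - h_meas)
    # A bead that is too wide is countered by travelling faster, hence the minus sign.
    travel_speed -= width_loop.step(w_ref - w_meas)
    return wire_feed, travel_speed
```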
APA, Harvard, Vancouver, ISO, and other styles
2

Ehrmanntraut, Rüdiger. "Full Automation of Air Traffic Management in High Complexity Airspace." Doctoral thesis, Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2010. http://nbn-resolving.de/urn:nbn:de:bsz:14-qucosa-32811.

Full text
Abstract:
The thesis is that automation of en-route Air Traffic Management in high complexity airspace can be achieved with a combination of automated tactic planning in a look-ahead time horizon of up to two hours, complemented with automated tactic conflict resolution functions. The literature review reveals that no significant results have yet been obtained and that full automation could be approached with a complementary integration of automated tactic resolutions AND planning. The focus shifts to ‘planning for capacity’ and ‘planning for resolution’ and also – but not only – to ‘resolution’. The work encompasses a theoretical part on planning, and several small-scale studies of empirical, mathematical or simulated nature. The theoretical part of the thesis on planning under uncertainties attempts to conceive a theoretical model which abstracts specificities of planning in Air Traffic Management into a generic planning model. The resulting abstract model treats entities like the planner, the strategy, the plan and the actions, always considering the impact of uncertainties. The work innovates in specifying many links from the theory to the application in planning of air traffic management, and especially the new field of tactical capacity management. The second main part of the thesis comprises smaller self-contained works on different aspects of the concept, grouped into a section on complexity, another on tactic planning actions, and the last on planners. The produced studies are about empirical measures of conflicts and conflict densities to get a better understanding of the complexity of air traffic; studies on traffic organisation using tactical manoeuvres like speed control, lateral offset and tactical direct using fast-time simulation; and studies on airspace design like sector optimisation, dynamic sectorisation and its optimisation using optimisation techniques. In conclusion, it is believed that this work will contribute to further automation attempts, especially through its innovative focus on planning, based on a theory of planning, and its findings already influence newer developments.
APA, Harvard, Vancouver, ISO, and other styles
3

Sonnendecker, Paul Walter. "Quantitative analysis of time-resolved FTIR spectra : steps towards full automation." Diss., University of Pretoria, 2015. http://hdl.handle.net/2263/56109.

Full text
Abstract:
Inline, time-resolved FTIR spectra are commonly recorded after completion of the experiments. The abilities and versatility of FTIR spectroscopy can, however, also be utilised in the in situ quantification of absorbing mixtures. Recent developments in the laboratory where this investigation was conducted demand the inline quantification of PTFE pyrolysis products for process control purposes. This investigation is primarily focused on the development of a procedure and software capable of processing, fitting and quantifying real-time, time-resolved spectra. Processing methods were evaluated with respect to improvement in SNR, smoothing and baseline tracking of infrared spectra. Execution speed was also considered due to the need for real-time analysis. The asymmetric least squares method proved to be the optimal choice with respect to the mentioned criteria. An asymmetric lineshape fitting function together with a Levenberg-Marquardt nonlinear solver was introduced to represent pure component spectra mathematically. A method for quantitative analysis by means of solving a linear set of equations was developed. The software was applied to the batch pyrolysis of PTFE as a test case. Experiments were conducted to obtain sufficient samples of the components such that FTIR spectra could be captured. Infrared spectra of the perfluorobutenes were experimentally determined. These spectra could not be found in the available literature and are deemed to be novel. The ability of the software to perform real-time quantification of the PTFE pyrolysis stream was demonstrated over a range of experimental conditions spanning the temperature range 650 °C to 850 °C and pressures from <1 kPa to 70 kPa.

Dissertation (MEng)--University of Pretoria, 2015. Chemical Engineering. Unrestricted.
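The abstract singles out asymmetric least squares (AsLS) for baseline tracking. For readers unfamiliar with it, here is a minimal sketch of the classic Eilers-style iteration; the parameter values are illustrative and this is not the thesis's own code.

```python
import numpy as np
from scipy import sparse
from scipy.sparse.linalg import spsolve

def asls_baseline(y, lam=1e5, p=0.01, n_iter=10):
    """Asymmetric least squares baseline estimate for a 1D spectrum y.

    lam controls smoothness (second-difference penalty), p the asymmetry:
    points above the current baseline get weight p, points below get 1 - p,
    so the fit hugs the baseline and ignores absorption peaks.
    """
    n = len(y)
    D = sparse.csc_matrix(np.diff(np.eye(n), 2))   # second-difference operator
    w = np.ones(n)
    z = y
    for _ in range(n_iter):
        W = sparse.spdiags(w, 0, n, n)
        z = spsolve(W + lam * D @ D.T, w * y)      # penalized weighted least squares
        w = np.where(y > z, p, 1.0 - p)            # asymmetric reweighting
    return z

# corrected = y - asls_baseline(y) would then feed the lineshape-fitting step.
```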
APA, Harvard, Vancouver, ISO, and other styles
4

Rådman, Marcus. "Implementation and testing of a path tracker for a full-scale Unmanned Ground Vehicle." Thesis, Luleå tekniska universitet, Institutionen för system- och rymdteknik, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:ltu:diva-73079.

Full text
Abstract:
This project is the implementation and testing of a path tracker for a car-sized Unmanned Ground Vehicle. The vehicle, a Toyota Land Cruiser, was provided by SSC and had previously been modified for remote operation. The developed path tracker uses a "follow the carrot" algorithm, has been written in C using the Robot Operating System (ROS) framework, and has been integrated into the vehicle's existing ROS-powered software. During the implementation, the Gazebo rigid body simulator was used to simulate a simplified vehicle. Integration with the real sensors was performed using a small-scale car, both indoors with the aid of a Vicon motion capture system and outdoors utilizing only sensors available to the full-size car. The small-scale tests showed promise; however, when full-scale field tests were performed, the results showed some problems, and reasons for these are discussed.
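The "follow the carrot" tracker mentioned above is a standard geometric path follower. Since the thesis's C implementation is not reproduced here, the sketch below only illustrates the idea in Python, with hypothetical function and parameter names and a unit proportional steering gain.

```python
import math

def follow_the_carrot(pose, path, lookahead, max_steer=math.radians(30)):
    """One 'follow the carrot' step: steer toward a goal point placed a fixed
    lookahead distance ahead of the vehicle along the path.

    pose: (x, y, heading) of the vehicle in the world frame [m, m, rad]
    path: list of (x, y) waypoints ordered along the route
    """
    x, y, heading = pose
    # Pick the first waypoint at least `lookahead` metres away: the "carrot".
    carrot = path[-1]
    for wx, wy in path:
        if math.hypot(wx - x, wy - y) >= lookahead:
            carrot = (wx, wy)
            break
    # Heading error between vehicle orientation and bearing to the carrot,
    # wrapped to [-pi, pi].
    bearing = math.atan2(carrot[1] - y, carrot[0] - x)
    error = math.atan2(math.sin(bearing - heading), math.cos(bearing - heading))
    # Proportional steering command, saturated at the steering limit.
    return max(-max_steer, min(max_steer, error))
```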
APA, Harvard, Vancouver, ISO, and other styles
5

Costley, Austin D. "Platform Development and Path Following Controller Design for Full-Sized Vehicle Automation." DigitalCommons@USU, 2017. https://digitalcommons.usu.edu/etd/6429.

Full text
Abstract:
The purpose of this thesis is to discuss the design and development of a platform used to automate a stock 2013 Ford Focus EV. The platform is low-cost and open-source to encourage collaboration and provide a starting point for fellow researchers to advance the work in the field of automated vehicle control. This thesis starts by discussing the process of obtaining control of the vehicle by taking advantage of internal communication protocols. The controller design process is detailed and a description of the components and software used to control the vehicle is provided. The automated system is tested and the results of fully autonomous driving are discussed.
APA, Harvard, Vancouver, ISO, and other styles
6

Wang, Xiaoyu. "A data analytic approach to automatic fault diagnosis and prognosis for distribution automation." Thesis, University of Strathclyde, 2017. http://digitool.lib.strath.ac.uk:80/R/?func=dbin-jump-full&object_id=28772.

Full text
Abstract:
Distribution Automation (DA) is deployed to reduce outages and to rapidly reconnect customers following network faults. Recent developments in DA equipment have enabled the logging of load and fault event data, referred to as pick-up activity. This pick-up activity provides a picture of the underlying circuit activity occurring between successive DA operations over a period of time and has the potential to be accessed remotely for off-line or on-line analysis. The application of data analytics and automated analysis of this data supports reactive fault management and post fault investigation into anomalous network behavior. It also supports predictive capabilities that identify when potential network faults are evolving and offers the opportunity to take action in advance in order to mitigate any outages. This thesis details the design of a novel decision support system to achieve automatic fault diagnosis and prognosis for DA schemes. It combines detailed data from a specific DA device with SCADA data, by utilising rule-based, data science techniques (e.g. data mining and clustering techniques) to deliver the diagnostic and prognostic functions. These are applied to 11kV distribution network data captured from Pole Mounted Auto-Reclosers (PMARs) as provided by a leading UK network operator. This novel automated analysis system diagnoses the condition of device faults, the nature of a circuit's previous fault activity, identifies underlying anomalous circuit activity, and highlights indications of problematic events gradually evolving into a full scale circuit fault using prognostic functionality. The novel contributions also include the characterisation and identification of semi-permanent faults and a re-usable methodology and approach for applying data analytics to any DA device data sets in order to provide diagnostic decisions and mitigate potential fault scenarios.
APA, Harvard, Vancouver, ISO, and other styles
7

Varadhan, Aishwarya. "Design of Control Algorithms for Automation of a Full Dimension Continuous Haulage System." Thesis, Virginia Tech, 2000. http://hdl.handle.net/10919/10165.

Full text
Abstract:
The main theme of this research will be to develop solutions to the widely known three-part question in mobile robotics comprising "Where am I?", "Where should I be?" and "How do I get there?". This can be achieved by implementing automation algorithms. Automation algorithms, or control algorithms, are vital components of any autonomous vehicle. Design and development of both prototype and full-scale control algorithms for a Long-Airdox Full Dimension Continuous Haulage system will be the main focus. Automation is a highly complex task, which aims at achieving increased levels of equipment efficiency by eliminating errors that arise due to human interference. Achieving a fully autonomous operation of a machine involves a variety of high-level interlaced functions that work in harmony, and at the same time perform functions that mimic the human operator. Automation has expanded widely in the field of mobile robotics, thus leading to the development of autonomous robots, automated guided vehicles and other autonomous vehicles. An indispensable element of an autonomous vehicle is a navigation system that steers it to a required destination. The vehicle must be able to determine its relationship to the environment by sensing, and also must be able to decide what actions are required to achieve its goal(s) in the working environment. The goal of this research is to demonstrate a fully autonomous operation of the Continuous Haulage System, and to establish its potential advantages.

Master of Science
APA, Harvard, Vancouver, ISO, and other styles
8

Piera, Alejandro J. "Automation in facilitation of air transport." Thesis, McGill University, 2000. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=33364.

Full text
Abstract:
The air transport industry is at present subject to dramatic traffic growth, which is expected to triple in the next 20 years. The industry must attempt to meet this unavoidable challenge by somehow accommodating the increase in passenger flow. This thesis proposes to examine how automation devices may assist in meeting this challenge by facilitating passenger clearance. They would do so by improving the lengthy, formalistic and overly bureaucratic immigration and customs procedures. A myriad of different legal issues are engaged by these initiatives. Although many of them are mentioned throughout this thesis, the core legal analysis focuses on the challenge to privacy triggered by these endeavours, and the conflicting interests of individuals and industry players. Finally, a proposal to eliminate, or at least to reduce, this conflict is recommended.
APA, Harvard, Vancouver, ISO, and other styles
9

Aguilar, Cortés Carlos Ezequiel. "Air carrier liability and automation issues." Thesis, McGill University, 2002. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=78196.

Full text
Abstract:
Our intended topic is a general discussion of the basic elements of liability related to airline accidents to which fully automated cockpits have constituted an associated contributory factor. In addition we addressed the liability of air carriers arising from injuries or death caused to passengers traveling on international flights. For this purpose, we reviewed the Warsaw System and the different international instruments that constitute it. We also reviewed principles of common law applicable to aircraft manufacturers and the "Free Flight" as an example of the growing automation environment, which is a general benefit to commercial aviation but also a likely contributory cause for accidents in particular cases. In the last part we briefly discuss a personal view regarding the interplay between manufacturers and airlines under the 1999 Montreal Convention, which is an international treaty unifying the desegregated Warsaw System into one single instrument that is expected to enter into force in a few years.
APA, Harvard, Vancouver, ISO, and other styles
10

Baiden, Gregory Robert. "A study of underground mine automation." Thesis, McGill University, 1993. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=41320.

Full text
Abstract:
A review of automation, robotics and communications technology has established the need for the development of a communications infrastructure capable of supporting future underground hard rock mine automation systems. A series of underground experiments were undertaken at Copper Cliff North Mine to evaluate the design criteria and performance of several communications infrastructures. The work successfully demonstrated the capability of real-time operation of voice, data and stationary video communication as well as surface-to-underground tele-operation of a load-haul-dump machine. This was achieved with a communications system consisting of a broadband bus linked to leaky feeder coaxial cables by means of distributed antenna translators. The success of the trials permitted a strategy for mine automation to be devised. The economic benefits of mine automation were estimated by means of economic models developed for the mine. Projected benefits, evaluated in terms of mining cost reduction, throughput time and quality improvement, were concluded to be significant. As a result of the analysis, future research and development is concluded to be best targeted at improving ore grade, optimizing process productivity and maximizing machine utilization.
APA, Harvard, Vancouver, ISO, and other styles
11

Poole, Ross 1949. "Load-haul-dump machine automation at Inco's Ontario division." Thesis, McGill University, 1999. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=30268.

Full text
Abstract:
This thesis is based upon field studies of automation research in Inco mines of the Ontario Division. It considers the advancement of technology and practice for underground hardrock mining automation, emphasizing the Load-Haul-Dump machine and its evolution and automation. It attempts to define the requirements for future mining processes, including the potential for extended teleoperation and autonomous operation of machines from safe vantage points.

Design issues including effective underground communications, automation amenable equipment, and process and workplace suitability are analyzed in detail and then related to solutions in design and practice.

This thesis concludes with discussions and recommendations towards solutions for future autonomous haulage for extreme long distance situations. Conclusions will highlight the successes the LHD has enabled in Canadian underground hardrock mines and its suitability to the task of optimizing automated haulage for use in safe, higher productivity automated processes that will optimize underground hardrock mining in Canada.
APA, Harvard, Vancouver, ISO, and other styles
12

Abbasi, Mayar. "Semi automation of mean axis of rotation (MAR) analysis." Thesis, McGill University, 2013. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=119364.

Full text
Abstract:
The Mean Axis of Rotation (MAR) Analysis is an analysis procedure used to diagnose problems in the cervical spine. Researchers have shown that many patient symptoms can be correlated to abnormal MAR placement [1], making MAR analysis an important tool in medical diagnosis. However, the current method of calculating the MAR for a single patient is manual, and very labour intensive, thus rendering it impractical to use MAR Analysis in a general clinical setting. This work presents a computer vision based tool which performs the MAR Analysis in a semi-automatic manner, greatly reducing the effort required by the manual procedure. While the results show that further research is required to improve the accuracy results of the semi-automatic MAR tool, the overall results demonstrate a 90% savings in effort, while still achieving high accuracy. The major effort required in the semi-automatic MAR tool consists of manually tracing the vertebrae on the x-rays. To reduce this effort, we explore approaches to automatic vertebrae segmentation in the second part of this work. While fully automatic segmentation is not achieved, a semi-automated segmentation tool is presented, which reduces the effort of tracing a vertebra by an average of 65%. The tool first over-segments the entire image into Super-Pixel regions, and then proactively merges similar regions of the desired object until it is segmented. The accuracy of this approach depends on the accuracy of the initial Super-Pixel segmentation, and currently this is not accurate enough to be used for the MAR Analysis. However, the semi-automated segmentation tool still represents a great improvement over other segmentation approaches for low-contrast, noisy medical images.
APA, Harvard, Vancouver, ISO, and other styles
13

Yi, Kwan 1963. "Text classification using a hidden Markov model." Thesis, McGill University, 2005. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=85214.

Full text
Abstract:
Text categorization (TC) is the task of automatically categorizing textual digital documents into pre-set categories by analyzing their contents. The purpose of this study is to develop an effective TC model to resolve the difficulty of automatic classification. In this study, two primary goals are intended. First, a Hidden Markov Model (HMM) is proposed as a relatively new method for text categorization. HMM has been applied to a wide range of applications in text processing such as text segmentation and event tracking, information retrieval, and information extraction. Few, however, have applied HMM to TC. Second, the Library of Congress Classification (LCC) is adopted as a classification scheme for the HMM-based TC model for categorizing digital documents. LCC has been used only in a handful of experiments for the purpose of automatic classification. In the proposed framework, a general prototype for an HMM-based TC model is designed, and an experimental model based on the prototype is implemented so as to categorize digitalized documents into LCC. A sample of abstracts from the ProQuest Digital Dissertations database is used for the test-base. Dissertation abstracts, which are pre-classified by professional librarians, form an ideal test-base for evaluating the proposed model of automatic TC. For comparative purposes, a Naive Bayesian model, which has been extensively used in TC applications, is also implemented. Our experimental results show that the performance of our model surpasses that of the Naive Bayesian model as measured by comparing the automatic classification of abstracts to the manual classification performed by professionals.
APA, Harvard, Vancouver, ISO, and other styles
14

Sicard, Pierre. "Adaptive welding and seam tracking using laser vision." Thesis, McGill University, 1987. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=63837.

Full text
APA, Harvard, Vancouver, ISO, and other styles
15

Dipa, Fuad, and Erkan Ektiren. "Implementing Full Inventory Control in a Production Facility: A Case Study at Scania CV Engine Assembly." Thesis, Mälardalens högskola, Akademin för innovation, design och teknik, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:mdh:diva-45300.

Full text
Abstract:
The concept of inventory control has been around since the early 20th century and it is constantly evolving. The importance of inventory management and supply chain management is clear, and companies are constantly trying to evolve their systems and ways of handling inventory control. By having a proper inventory control system with adequate inventory record audits, a company could potentially gain several benefits such as reduced tied-up capital, reduced holding costs, reduced/redistributed work hours, better automation and more. Most organisations and companies have some form of inventory control, however not all have full control of their inventory. This includes automatic inventory balance updates, package traceability, automatic replenishment systems and more. To implement these ideas, a company would foremost need to find which factors are currently hindering it from obtaining this and consequently be able to adjust those factors. Since there are several ways to obtain an adequate automatic inventory record update, multiple proposals are discussed in this thesis project. This thesis project assessed the necessary steps that a company needs to perform through a case study at Scania CV Engine and a benchmarking at Scania Production Angers. Through a collection of scientific literature and empirical data, an attempt was made to identify the factors that determine whether a company can implement full inventory control or not. As a supplement to this, the thesis project also looked at what type of consequences an implementation of full inventory control could have in a company, both in terms of purely systemic consequences and economic consequences.
APA, Harvard, Vancouver, ISO, and other styles
16

Mackay, Stephen George. "The impact of blended learning in improving the reaction, achievement and return on investment of industrial automation training." Curtin University of Technology, Science and Mathematics Education Centre, 2008. http://espace.library.curtin.edu.au:80/R/?func=dbin-jump-full&object_id=14903.

Full text
Abstract:
There has been a significant increase in the level of remote or distance learning using the Internet, often referred to as e-learning or online education. E-learning is often combined with classroom instruction and on-the-job training, and this is referred to as blended learning. The purpose of this research is to investigate the impact blended learning has in improving engineering training in the engineering field of industrial automation, especially in improving the reaction, achievement and return on investment of learners compared to the traditional classroom-only or e-learning-only approaches. One of the gaps in current research is the examination of the impact of blended learning in improving engineering training. The research revealed significant growth in the use of e-learning for engineers and technicians. There would, however, appear to be a large number of engineers and technicians who were disappointed with their experiences of e-learning. Significant concerns were also identified regarding the efficacy of e-learning and the lack of hands-on experience in this form of training for engineers and technicians. As a result of the research, suggestions are made for addressing these issues.
APA, Harvard, Vancouver, ISO, and other styles
17

Bell, Peter. "Full covariance modelling for speech recognition." Thesis, University of Edinburgh, 2010. http://hdl.handle.net/1842/4912.

Full text
Abstract:
HMM-based systems for Automatic Speech Recognition typically model the acoustic features using mixtures of multivariate Gaussians. In this thesis, we consider the problem of learning a suitable covariance matrix for each Gaussian. A variety of schemes have been proposed for controlling the number of covariance parameters per Gaussian, and studies have shown that in general, the greater the number of parameters used in the models, the better the recognition performance. We therefore investigate systems with full covariance Gaussians. However, in this case, the obvious choice of parameters – given by the sample covariance matrix – leads to matrices that are poorly-conditioned, and do not generalise well to unseen test data. The problem is particularly acute when the amount of training data is limited. We propose two solutions to this problem: firstly, we impose the requirement that each matrix should take the form of a Gaussian graphical model, and introduce a method for learning the parameters and the model structure simultaneously. Secondly, we explain how an alternative estimator, the shrinkage estimator, is preferable to the standard maximum likelihood estimator, and derive formulae for the optimal shrinkage intensity within the context of a Gaussian mixture model. We show how this relates to the use of a diagonal covariance smoothing prior. We compare the effectiveness of these techniques to standard methods on a phone recognition task where the quantity of training data is artificially constrained. We then investigate the performance of the shrinkage estimator on a large-vocabulary conversational telephone speech recognition task. Discriminative training techniques can be used to compensate for the invalidity of the model correctness assumption underpinning maximum likelihood estimation. On the large-vocabulary task, we use discriminative training of the full covariance models and diagonal priors to yield improved recognition performance.
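The shrinkage estimator discussed in the abstract blends the poorly conditioned sample covariance with a simpler target. Below is a minimal sketch, assuming a diagonal target (which is what the "diagonal covariance smoothing prior" remark corresponds to) and treating the shrinkage intensity as a free parameter rather than using the thesis's derived optimum.

```python
import numpy as np

def shrinkage_covariance(X, lam):
    """Shrink the sample covariance of X (n_samples x n_features) toward its diagonal.

    lam in [0, 1]: lam = 0 returns the full sample covariance, lam = 1 the
    purely diagonal target; intermediate values trade a little bias for a
    better conditioned, lower-variance estimate.
    """
    S = np.cov(X, rowvar=False)            # full sample covariance
    target = np.diag(np.diag(S))           # diagonal shrinkage target
    return (1.0 - lam) * S + lam * target  # convex combination
```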
APA, Harvard, Vancouver, ISO, and other styles
18

Jones, Charles H. "TOWARDS FULLY AUTOMATED INSTRUMENTATION TEST SUPPORT." International Foundation for Telemetering, 2007. http://hdl.handle.net/10150/604521.

Full text
Abstract:
ITC/USA 2007 Conference Proceedings / The Forty-Third Annual International Telemetering Conference and Technical Exhibition / October 22-25, 2007 / Riviera Hotel & Convention Center, Las Vegas, Nevada

Imagine that a test vehicle has just arrived at your test facility and that it is fully instrumented with sensors and a data acquisition system (DAS). Imagine that a test engineer logs onto the vehicle’s DAS, submits a list of data requirements, and the DAS automatically configures itself to meet those data requirements. Imagine that the control room then contacts the DAS, downloads the configuration, and coordinates its own configuration with the vehicle’s setup. Imagine all of this done with no more human interaction than the original test engineer’s request. How close to this imaginary scenario is the instrumentation community? We’re not there yet, but through a variety of efforts, we are headed towards this fully automated scenario. This paper outlines the current status, current projects, and some missing pieces in the journey towards this end. This journey includes standards development in the Range Commander’s Council (RCC), smart sensor standards development through the Institute of Electrical and Electronics Engineers (IEEE), Small Business Innovation Research (SBIR) contracts, efforts by the integrated Network Enhanced Telemetry (iNET) project, and other projects involved in reaching this goal.
APA, Harvard, Vancouver, ISO, and other styles
19

Nicklin, Timothy J. "Automation of vehicle testing for fuel economy and emissions optimisation." Thesis, Brunel University, 1999. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.488732.

Full text
APA, Harvard, Vancouver, ISO, and other styles
20

Pepiot, Perrine. "Automatic strategies to model transportation fuel surrogates." May be available electronically, 2008. http://proquest.umi.com/login?COPT=REJTPTU1MTUmSU5UPTAmVkVSPTI=&clientId=12498.

Full text
APA, Harvard, Vancouver, ISO, and other styles
21

Larsson, Anders. "Fully automatic benchmarking of real-time operating systems." Thesis, University of Skövde, Department of Computer Science, 1998. http://urn.kb.se/resolve?urn=urn:nbn:se:his:diva-172.

Full text
Abstract:
Testing and evaluating the performance of different software solutions is important in order to compare them with each other. Measuring, or benchmarking, software is not a trivial task, and conducting tests in a real-time environment complicates it further. Still, measuring is the only way to provide useful information, for example, which real-time operating system is best suited for a specific hardware configuration.

The purpose of this project is to design a benchmark support system, which automatically performs benchmarks of a real-time operating system in a host-target environment. The benchmarks are conducted according to a user-defined specification and the support system also allows a developer to create configurable benchmarks.

The benchmark support system described also allows parameters to increase monotonically within a specified interval during benchmark execution. This is an important feature in order to detect unpredictable behavior of the real-time system.
APA, Harvard, Vancouver, ISO, and other styles
22

Pagalone, Vinod. "Automatic test pattern generator for full scan sequential circuits using limited scan operations /." Available to subscribers only, 2006. http://proquest.umi.com/pqdweb?did=1251871351&sid=8&Fmt=2&clientId=1509&RQT=309&VName=PQD.

Full text
APA, Harvard, Vancouver, ISO, and other styles
23

Salzmann, Roger. "Fuel staging for NOx reduction in automatic wood furnaces /." [S.l.] : [s.n.], 2000. http://e-collection.ethbib.ethz.ch/show?type=diss&nr=13531.

Full text
APA, Harvard, Vancouver, ISO, and other styles
24

Ergul, Mustafa. "A Fully Automatic Shape Based Geo-spatial Object Recognition." Master's thesis, METU, 2012. http://etd.lib.metu.edu.tr/upload/12614680/index.pdf.

Full text
Abstract:
A great number of methods based on local features or global appearances have been proposed in the literature for geospatial object detection and recognition from satellite images. However, since these approaches do not have enough discriminative capability between object and non-object classes, they produce results with innumerable false positives during their detection process. Moreover, due to the sliding window mechanisms, these algorithms cannot yield exact location information for the detected objects. Therefore, a geospatial object recognition algorithm based on the object shape mask is proposed to minimize the aforementioned imperfections. In order to develop such a robust recognition system, the foreground extraction performance of some of the popular fully and semi-automatic image segmentation algorithms, such as normalized cut, k-means clustering and mean-shift for fully automatic, and interactive Graph-cut, GrowCut and GrabCut for semi-automatic, is evaluated in terms of their subjective and objective qualities. After this evaluation, the retrieval performance of some shape description techniques, such as ART, Hu moments and Fourier descriptors, is investigated quantitatively. In the proposed system, first of all, some hypothesis points are generated for a given test image. Then, the foreground extraction operation is achieved via the GrabCut algorithm, utilizing these hypothesis points as if they were user inputs. Next, the extracted binary object masks are described by means of the integrated versions of the shape description techniques. Afterwards, an SVM classifier is used to identify the target objects. Finally, elimination of the multiple detections coming from the generation of hypothesis points is performed by some simple post-processing on the resultant masks. Experimental results reveal that the proposed algorithm has promising results in terms of accuracy in recognizing many geospatial objects, such as airplanes and ships, from high resolution satellite imagery.
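As a rough illustration of the pipeline described above (hypothesis region, GrabCut foreground extraction, shape description, SVM classification), here is a hedged OpenCV/Python sketch. It uses Hu moments only, whereas the thesis integrates several descriptors; all parameter values are assumptions and the OpenCV 4 API is assumed.

```python
import cv2
import numpy as np

def object_mask_and_descriptor(image, rect):
    """Run GrabCut around a hypothesised object box and describe the result.

    image: BGR satellite image chip (numpy array); rect: (x, y, w, h) hypothesis.
    Returns the binary foreground mask and a 7-element Hu-moment shape vector.
    """
    mask = np.zeros(image.shape[:2], np.uint8)
    bgd = np.zeros((1, 65), np.float64)
    fgd = np.zeros((1, 65), np.float64)
    cv2.grabCut(image, mask, rect, bgd, fgd, 5, cv2.GC_INIT_WITH_RECT)
    binary = np.where((mask == cv2.GC_FGD) | (mask == cv2.GC_PR_FGD), 255, 0).astype(np.uint8)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return binary, None
    largest = max(contours, key=cv2.contourArea)
    hu = cv2.HuMoments(cv2.moments(largest)).flatten()  # 7 shape invariants
    return binary, hu

# The per-hypothesis shape vectors would then be classified, e.g. with
# sklearn.svm.SVC(), trained on masks of known targets versus background.
```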
APA, Harvard, Vancouver, ISO, and other styles
25

Shan, Juan. "A Fully Automatic Segmentation Method for Breast Ultrasound Images." DigitalCommons@USU, 2011. https://digitalcommons.usu.edu/etd/905.

Full text
Abstract:
Breast cancer is the second leading cause of death of women worldwide. Accurate lesion boundary detection is important for breast cancer diagnosis. Since many crucial features for discriminating benign and malignant lesions are based on the contour, shape, and texture of the lesion, an accurate segmentation method is essential for a successful diagnosis. Ultrasound is an effective screening tool and primarily useful for differentiating benign and malignant lesions. However, due to inherent speckle noise and low contrast of breast ultrasound imaging, automatic lesion segmentation is still a challenging task. This research focuses on developing a novel, effective, and fully automatic lesion segmentation method for breast ultrasound images. By incorporating empirical domain knowledge of breast structure, a region of interest is generated. Then, a novel enhancement algorithm (using a novel phase feature) and a newly developed neutrosophic clustering method are developed to detect the precise lesion boundary. Neutrosophy is a recently introduced branch of philosophy that deals with paradoxes, contradictions, antitheses, and antinomies. When neutrosophy is used to segment images with vague boundaries, its unique ability to deal with uncertainty is brought to bear. In this work, we apply neutrosophy to breast ultrasound image segmentation and propose a new clustering method named neutrosophic l-means. We compare the proposed method with traditional fuzzy c-means clustering and three other well-developed segmentation methods for breast ultrasound images, using the same database. Both accuracy and time complexity are analyzed. The proposed method achieves the best accuracy (TP rate is 94.36%, FP rate is 8.08%, and similarity rate is 87.39%) with a fairly rapid processing speed (about 20 seconds). Sensitivity analysis shows the robustness of the proposed method as well. Cases with multiple-lesions and severe shadowing effect (shadow areas having similar intensity values of the lesion and tightly connected with the lesion) are not included in this study.
APA, Harvard, Vancouver, ISO, and other styles
26

Pastore, Danilo. "Study and development of a software architecture fully compliant with the ANSI/ISA-88 standard and PackML technical report for tablet coating machines." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2018.

Find full text
Abstract:
The purpose of this thesis is to study and develop a new software architecture to be applied, first of all, to the coating machines developed by the company IMA S.p.a., Active division. In particular, the new prototype machine developed by the same company, called CROMA, was used as a case study. This type of plant incorporates a new process concept called continuous batch, which required the introduction of a new way of managing the process units. It is in this environment that the proposed software architecture was conceived. To obtain the final result, an analysis of the current state of the software was first carried out, verifying its conformity with the ANSI/ISA-88 standard, and subsequently other elements were introduced, including the PackML standard, with the aim of exploiting the machine's potential to the fullest.
APA, Harvard, Vancouver, ISO, and other styles
27

Gransten, Johan. "Linear and Nonlinear Identification of Solid Fuel Furnace." Thesis, Linköping University, Department of Electrical Engineering, 2005. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-5182.

Full text
Abstract:
The aim of this thesis is to develop the knowledge about nonlinear and/or adaptive solid fuel boiler control at Vattenfall Utveckling AB. The aim is also to make a study of implemented and published control strategies.

A solid fuel boiler is a large-scale heat (and power) generating plant. The Idbäcken boiler studied in this work is a one hundred MW furnace mainly fired with wood chips. The control system consists of several linear PID controllers working together, and the furnace is a nonlinear system. That, and the fact that the fuel flow is not monitored, are the main reasons for the control problems. The system fluctuates periodically and the CO outlets sometimes rise high above the permitted level.

There is little work done in the area of advanced boiler control, but some interesting approaches are described in scientific articles. MPC (Model Predictive Control), nonlinear system identification using ANN (Artificial Neural Network), fuzzy logic, H∞ loop shaping and MIMO (Multiple Input Multiple Output) PID tuning methods have been tested with good results.

Both linear and nonlinear system identification is performed in the thesis. The linear models are able to explain about forty percent of the system behavior and the nonlinear models explain about sixty to eighty percent. The main result is that nonlinear models improve the performance and that there are considerable disturbances complicating the identification. Another identification issue was the feedback during the data collection.
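Where the abstract reports that models "explain about forty percent of the system behavior", system identification work usually means the normalized root-mean-square fit value; the thesis is not explicit here, so the following is only a sketch under that assumption.

```python
import numpy as np

def fit_percent(y_measured, y_model):
    """Model fit in percent: 100 * (1 - ||y - y_hat|| / ||y - mean(y)||).

    100 means a perfect reproduction of the measured output; 0 means the
    model does no better than predicting the mean of the data.
    """
    y = np.asarray(y_measured, dtype=float)
    y_hat = np.asarray(y_model, dtype=float)
    return 100.0 * (1.0 - np.linalg.norm(y - y_hat) / np.linalg.norm(y - y.mean()))
```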
APA, Harvard, Vancouver, ISO, and other styles
28

Cavalletti, Matteo. "Intelligent power train control laws in fuel cell electric vehicle." Doctoral thesis, Università Politecnica delle Marche, 2009. http://hdl.handle.net/11566/242332.

Full text
APA, Harvard, Vancouver, ISO, and other styles
29

Wong, Angela Sai On. "A fully automatic analytic approach to budget-constrained system upgrade." Thesis, University of British Columbia, 1987. http://hdl.handle.net/2429/26670.

Full text
Abstract:
This thesis describes the development of a software package to upgrade computer systems. The package, named OPTIMAL, solves the following problem: given an existing computer system and its workload, a budget, and the costs and descriptions of available upgrade alternatives for devices in the system, what is the most cost-effective way of upgrading and tuning the system to produce the optimal system throughput? To enhance the practicality of OPTIMAL, the research followed two criteria: i) input required by OPTIMAL must be system and workload characteristics directly measurable from the system under consideration; ii) other than gathering the appropriate input data, the package must be completely automated and must not require any specialized knowledge in systems performance evaluation to interpret the results. The output of OPTIMAL consists of the optimal system throughput under the budget constraint, the workload and system configuration (or upgrade strategy) that provide such throughput, and the cost of the upgrade. Various optimization techniques, including saturation analysis and fine tuning, have been applied to enhance the performance of OPTIMAL.

Faculty of Science, Department of Computer Science, Graduate
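The abstract mentions saturation analysis among OPTIMAL's optimization techniques but does not give its formulation here. Saturation analysis in queueing-based capacity planning usually rests on the classic asymptotic throughput bounds of operational analysis, sketched below with hypothetical names.

```python
def throughput_upper_bound(demands, n_jobs, think_time=0.0):
    """Asymptotic bound on system throughput X(N) from operational analysis.

    demands:    per-device service demands D_i (seconds of service per job)
    n_jobs:     number of concurrent jobs/users N
    think_time: average think time Z between requests
    """
    d_max = max(demands)        # demand at the bottleneck device
    d_total = sum(demands)
    return min(1.0 / d_max, n_jobs / (d_total + think_time))

# Upgrading the bottleneck device lowers d_max and therefore raises the
# saturation throughput 1/d_max -- the kind of reasoning a budget-constrained
# upgrade search can automate.
```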
APA, Harvard, Vancouver, ISO, and other styles
30

Thorn, Johan. "Utveckling av ramverk för FAB – Fully Automatic Bagging : Ett produktutvecklingsprojekt." Thesis, Karlstads universitet, Fakulteten för hälsa, natur- och teknikvetenskap (from 2013), 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:kau:diva-73571.

Full text
Abstract:
This report deals with a degree project done at Goodtech Solutions AB in Karlstad. The degree project is a conclusion to the Bachelor of Science program in mechanical engineering at Karlstad University. The aim of the project was to develop a new type of solution for a framework for a bulk handling line to make production more efficient and thus increase the profitability of the product. The product development process was used as the method for the project and, by following the different phases step by step, a number of concepts could be developed. From these, a final concept selection could be made and a CAD model created, with strength calculations as a basis. The new design solves the previous problems with a robust frame and simplified design and complete access to maintenance points. The framework is designed with regard to the inner dimensions of an ISO container and is also dimensioned for both black steel and stainless steel, which has resulted in an increased flexibility of the product.
APA, Harvard, Vancouver, ISO, and other styles
31

Smati, Z. "Development of a pulsed MIG system for fully automatic multipass welding." Thesis, Brunel University, 1985. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.355152.

Full text
APA, Harvard, Vancouver, ISO, and other styles
32

Camino, Irene [Verfasser]. "Fully automatic preprocessing of naval images for 3D reconstruction / Irene Camino." Hamburg : Helmut-Schmidt-Universität, Bibliothek, 2015. http://d-nb.info/1091569959/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
33

Ortiz, Martínez Daniel. "Advances in Fully-Automatic and Interactive Phrase-Based Statistical Machine Translation." Doctoral thesis, Universitat Politècnica de València, 2011. http://hdl.handle.net/10251/12127.

Full text
Abstract:
This thesis presents different contributions in the fields of fully-automatic statistical machine translation and interactive statistical machine translation. In the field of statistical machine translation there are three problems that are to be addressed, namely, the modelling problem, the training problem and the search problem. In this thesis we present contributions regarding these three problems. Regarding the modelling problem, an alternative derivation of phrase-based statistical translation models is proposed. Such derivation introduces a set of statistical submodels governing different aspects of the translation process. In addition to this, the resulting submodels can be introduced as components of a log-linear model. Regarding the training problem, an alternative estimation technique for phrase-based models that tries to reduce the strong heuristic component of the standard estimation technique is proposed. The proposed estimation technique considers the phrase pairs that compose the phrase model as part of complete bisegmentations of the source and target sentences. We theoretically and empirically demonstrate that the proposed estimation technique can be efficiently executed. Experimental results obtained with the open-source THOT toolkit, also presented in this thesis, show that the alternative estimation technique obtains phrase models with lower perplexity than those obtained by means of the standard estimation technique. However, the reduction in the perplexity of the model did not allow us to obtain improvements in the translation quality. To deal with the search problem, we propose a search algorithm which is based on the branch-and-bound search paradigm. The proposed algorithm generalises different search strategies that can be accessed by modifying the input parameters. We carried out experiments to evaluate the performance of the proposed search algorithm.
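For readers unfamiliar with the log-linear formulation into which the derived submodels are plugged, the conventional decision rule of phrase-based statistical machine translation is reproduced below; this is the standard textbook form, not a result specific to the thesis.

```latex
% Standard log-linear decision rule: the submodels enter as feature
% functions h_m(e, f) with weights \lambda_m tuned on development data.
\hat{e} \;=\; \arg\max_{e} \; p(e \mid f)
        \;=\; \arg\max_{e} \; \sum_{m=1}^{M} \lambda_m \, h_m(e, f)
```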
APA, Harvard, Vancouver, ISO, and other styles
34

Roskoff, Nathan. "Development of a Novel Fuel Burnup Methodology and Algorithm in RAPID and its Benchmarking and Automation." Diss., Virginia Tech, 2018. http://hdl.handle.net/10919/84487.

Full text
Abstract:
Fuel burnup calculations provide material concentrations and intrinsic neutron and gamma source strengths as a function of irradiation and cooling time. Detailed, full-core 3D burnup calculations are critical for nuclear fuel management studies, including core design and spent fuel storage safety and safeguards analysis. For core design, specifically during refueling, full-core pin-wise, axially-dependent burnup distributions are necessary to determine assembly positioning to efficiently utilize fuel resources. In spent fuel storage criticality safety analysis, detailed burnup distributions enable best-estimate analysis which allows for more effective utilization of storage space. Additionally, detailed knowledge of neutron and gamma source distributions provides the ability to ensure nuclear material safeguards. The need for accurate and efficient burnup calculations has become more urgent for the simulation of advanced reactors and the monitoring and safeguards of spent fuel pools. To this end, the Virginia Tech Transport Theory Group (VT3G) has been working on advanced computational tools for accurate modeling and simulation of nuclear systems in real-time. These tools are based on the Multi-stage Response-function Transport (MRT) methodology. For monitoring and safety evaluation of spent fuel pools and casks, the RAPID (Real-time Analysis for Particle transport and In-situ Detection) code system has been developed. This dissertation presents a novel methodology and algorithm for performing 3D fuel burnup calculations, referred to as bRAPID (Burnup with RAPID). bRAPID utilizes the existing RAPID code system for accurate calculation of 3D fission source distributions as the transport calculation tool to drive the 3D burnup calculation. bRAPID is capable of accurately and efficiently calculating assembly-wise, axially-dependent fission source and burnup distributions, and irradiated-fuel properties including material compositions, neutron source, gamma source, spontaneous fission source, and activities. bRAPID performs 3D burnup calculations in a fraction of the time required by state-of-the-art methodologies because it utilizes a pre-calculated database of response functions. The bRAPID database pre-calculation procedure, and its automation, is presented. The existing RAPID code is then benchmarked against the MCNP and Serpent Monte Carlo codes for a spent fuel pool and the U.S. Naval Academy Subcritical Reactor facility. RAPID is shown to accurately calculate eigenvalue, subcritical multiplication, and 3D fission source distributions. Finally, bRAPID is compared to traditional, state-of-the-art Serpent Monte Carlo burnup calculations and its performance is evaluated. It is important to note that the automated pre-calculation procedure is required for evaluating the performance of bRAPID. Additionally, benchmarking of the RAPID code is necessary to understand RAPID's ability to solve problems with variable burnup distributions and to assess its accuracy.

Ph. D.
APA, Harvard, Vancouver, ISO, and other styles
35

Růžičková, Martina. "UNIVERSAL BASIC OPRESSION." Master's thesis, Vysoké učení technické v Brně. Fakulta výtvarných umění, 2017. http://www.nusl.cz/ntk/nusl-295731.

Full text
Abstract:
Master's thesis Polyamory Design Unit (PDU) explores the possibilities of collaboration between experts active in fine arts, product design, graphic design, architecture and philosophy in order to create a speculative future scenario. Together with Jana Trundova, Simon Barak, Ondrej Mohyla and Lukas Likavcan, I create the concept and the presentation structure for a housing complex, which is designed for polyamorous coexistence of human and non-human entities. Such a coexistence is made possible by full automation of work and the global implementation of universal basic income. These initial parameters constitute a big emancipatory potential that could change the present meaning of the concept of polyamory and thus redefine networks of relations at bigger scales too.
APA, Harvard, Vancouver, ISO, and other styles
36

Osifo, Otasowie. "Automatic Gamma Scanning System for Measurement of Residual Heat in Nuclear Fuel." Licentiate thesis, Uppsala universitet, Institutionen för neutronforskning, 2007. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-85830.

Full text
Abstract:
In Sweden, spent nuclear fuel will be encapsulated and placed in a deep geological repository. In this procedure, reliable and accurate spent fuel data such as discharge burnup, cooling time and residual heat must be available. The gamma scanning method was proposed in earlier work as a fast and reliable method for the experimental determination of such spent fuel data. This thesis is focused on the recent achievements in the development of a pilot gamma scanning system and its application in measuring spent fuel residual heat. The achievements include the development of dedicated spectroscopic data-acquisition and analysis software and the use of a specially designed calorimeter for calibrating the gamma scanning system. The pilot system is described, including an evaluation of the performance of the spectrum analysis software. Also described are the gamma-scanning measurements on 31 spent PWR fuel assemblies performed using the pilot system. The results obtained for the determination of residual heat are presented, showing an agreement of (2-3) % with both calorimetric and calculated data. In addition, the ability to verify declared data such as discharge burnup and cooling time is demonstrated.
APA, Harvard, Vancouver, ISO, and other styles
37

Bürg, Markus [Verfasser], and W. [Akademischer Betreuer] Dörfler. "A Fully Automatic hp-Adaptive Refinement Strategy / Markus Bürg. Betreuer: W. Dörfler." Karlsruhe : KIT-Bibliothek, 2012. http://d-nb.info/1025114248/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
38

Huang, X. "XTRA : The design and implementation of a fully automatic machine translation system." Thesis, University of Essex, 1987. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.379393.

Full text
APA, Harvard, Vancouver, ISO, and other styles
39

Al, Arif S. M. M. R. "Fully automatic image analysis framework for cervical vertebra in X-ray images." Thesis, City, University of London, 2018. http://openaccess.city.ac.uk/19184/.

Full text
Abstract:
Despite the advancement in imaging technologies, a fifth of the injuries in the cervical spine remain unnoticed in the X-ray radiological exam. About two-thirds of the subjects with unnoticed injuries suffer tragic consequences. Based on the success of computer-aided systems in several medical image modalities in enhancing clinical interpretation, we have proposed a fully automatic image analysis framework for cervical vertebrae in X-ray images. The framework takes an X-ray image as input and highlights different vertebral features at the output. To the best of our knowledge, this is the first fully automatic system in the literature for the analysis of the cervical vertebrae. The complete framework has been built by cascading specialized modules, each of which addresses a specific computer vision problem. This dissertation explores data-driven supervised machine learning solutions to these problems. Given an input X-ray image, the first module localizes the spinal region. The second module predicts vertebral centers from the spinal region, which are then used to generate vertebral image patches. These patches are then passed through machine learning modules that detect vertebral corners, highlight vertebral boundaries, segment the vertebral body and predict vertebral shapes. In the process of building the complete framework, we have proposed and compared different solutions to the problems addressed by each of the modules. A novel region-aware dense classification deep neural network has been proposed for the first module to address the spine localization problem. The proposed network outperformed the standard dense classification network and random forest-based methods. The locations of the vertebral centers and corners vary based on human interpretation and thus are better represented by probability maps than single points. To learn the mapping between the vertebral image patches and the probability maps, a novel neural network capable of predicting a spatially distributed probability distribution has been proposed. The network achieved expert-level performance in localizing vertebral centers and outperformed the Harris corner detector and Hough forest-based methods for corner localization. The proposed network has also shown its capability for detecting vertebral boundaries and produced visually better results than the dense classification network-based boundary detectors. Segmentation of the vertebral body is a crucial part of the proposed framework. A new shape-aware loss function has been proposed for training a segmentation network to encourage the prediction of vertebra-like structures. The segmentation performance improved significantly; however, the pixel-wise nature of the proposed loss function was not able to constrain the predictions adequately. To solve this problem, a novel neural network was proposed which predicts vertebral shapes and trains on a loss function defined in the shape space. The proposed shape predictor network was capable of learning better topological information about the vertebra than the shape-aware segmentation network. The methods proposed in this dissertation have been trained and tested on a challenging dataset of X-ray images collected from medical emergency rooms. The proposed, first-of-its-kind, fully automatic framework produces state-of-the-art results both quantitatively and qualitatively.
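Because annotated vertebral centres and corners vary between human raters, the framework regresses spatially distributed probability maps rather than single points. A minimal sketch of how such a training target can be built is shown below, assuming a simple isotropic Gaussian around each annotated landmark; the patch size and sigma are illustrative choices, not the network or loss actually proposed.

```python
import numpy as np

def gaussian_heatmap(shape, center, sigma=3.0):
    """Spatial probability map for a landmark (e.g. a vertebral centre or
    corner): a 2D Gaussian centred on the annotated point, normalised to
    sum to one. Such maps are common regression targets when a landmark's
    position is better described by a distribution than by a single pixel."""
    h, w = shape
    ys, xs = np.mgrid[0:h, 0:w]
    cy, cx = center
    d2 = (ys - cy) ** 2 + (xs - cx) ** 2
    heatmap = np.exp(-d2 / (2.0 * sigma ** 2))
    return heatmap / heatmap.sum()

# Example: a 64x64 patch with the landmark annotated at (30, 22).
target = gaussian_heatmap((64, 64), (30, 22), sigma=2.5)
```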
APA, Harvard, Vancouver, ISO, and other styles
40

Samarakoon, Prasad. "Random Regression Forests for Fully Automatic Multi-Organ Localization in CT Images." Thesis, Université Grenoble Alpes (ComUE), 2016. http://www.theses.fr/2016GREAM039/document.

Full text
Abstract:
Locating an organ in a medical image by bounding that particular organ with respect to an entity such as a bounding box or sphere is termed organ localization. Multi-organ localization takes place when multiple organs are localized simultaneously. Organ localization is one of the most crucial steps involved in all the phases of patient treatment, from the diagnosis phase to the final follow-up phase. The use of the supervised machine learning technique called random forests has shown very encouraging results in many sub-disciplines of medical image analysis. Similarly, Random Regression Forests (RRF), a specialization of random forests for regression, have produced state-of-the-art results for fully automatic multi-organ localization. Although RRF have produced state-of-the-art results in multi-organ localization, the relative novelty of the method in this field still raises numerous questions about how to optimize its parameters for consistent and efficient usage. The first objective of this thesis is to acquire a thorough knowledge of the inner workings of RRF. After achieving the above-mentioned goal, we proposed a consistent and automatic parametrization of RRF. Then, we empirically proved the spatial independence hypothesis used by RRF. Finally, we proposed a novel RRF specialization called Light Random Regression Forests for multi-organ localization, improving the memory footprint and computational efficiency.
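As a rough illustration of the RRF idea, each voxel (or patch) contributes appearance features and regresses its displacement to an organ's bounding box; aggregating the per-voxel votes localizes the organ. The sketch below uses scikit-learn's multi-output random-forest regressor with random placeholder data, so the feature design, vote aggregation, and dimensions are assumptions for illustration only.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Hypothetical training data: each row holds appearance features sampled
# around one voxel (e.g. mean intensities of offset boxes); each target row
# holds the displacement from that voxel to the six faces of one organ's
# bounding box (in mm).
rng = np.random.default_rng(0)
X_train = rng.normal(size=(5000, 50))
y_train = rng.normal(size=(5000, 6))

forest = RandomForestRegressor(n_estimators=100, max_depth=12, random_state=0)
forest.fit(X_train, y_train)

# At test time every voxel votes for the bounding box; aggregating the
# per-voxel predictions (here a simple mean) localizes the organ.
X_test = rng.normal(size=(200, 50))
votes = forest.predict(X_test)
bounding_box = votes.mean(axis=0)
```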
APA, Harvard, Vancouver, ISO, and other styles
41

Zhang, Lawrence M. Eng Massachusetts Institute of Technology. "Bootstrapping fully-automatic temporal fetal brain segmentation in volumetric MRI time series." Thesis, Massachusetts Institute of Technology, 2019. https://hdl.handle.net/1721.1/122993.

Full text
Abstract:
Thesis: M. Eng., Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science, 2019. Cataloged from the student-submitted PDF version of the thesis; the certified thesis is available in the Institute Archives and Special Collections. Includes bibliographical references (pages 41-42).
We present a method for bootstrapping training data for the task of segmenting fetal brains in volumetric MRI time series data. Temporal analysis of MRI images requires accurate segmentation across frames, despite large amounts of unpredictable motion. We use the predicted segmentations of a baseline model and leverage the anatomical structure of the fetal brain to automatically select the "good frames" that have accurate segmentations. We use these good frames to bootstrap further model training. We also introduce a novel temporal segmentation model that predicts segmentations using a history of previous segmentations, thus utilizing the temporal nature of the data. Our results show that these two approaches do not provide conclusive improvements to the quality of segmentations. Further exploration into the automatic choice of good frames is needed.
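One simple way to implement the "good frame" selection is to exploit the fact that the fetal brain's volume should be nearly constant across the time series even when its pose is not. The sketch below uses that volume-consistency check as an illustrative stand-in for the anatomical criteria described in the thesis; the tolerance and mask format are assumptions.

```python
import numpy as np

def select_good_frames(masks, tol=0.1):
    """Pick frames whose predicted fetal-brain mask looks anatomically
    plausible, using a simple proxy: the segmented volume should stay close
    to the median volume across the time series (the brain itself does not
    change size between frames, only its pose does).

    masks: boolean array of shape (T, X, Y, Z), one predicted segmentation
    per frame. Returns the indices of the retained frames."""
    volumes = masks.reshape(masks.shape[0], -1).sum(axis=1).astype(float)
    median = np.median(volumes)
    keep = np.abs(volumes - median) / median < tol
    return np.flatnonzero(keep)
```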
APA, Harvard, Vancouver, ISO, and other styles
42

Li, Xiaojing School of Electrical Engineering & Telecommunications & School of Surveying & Spatial Information Systems UNSW. "Optimal integrated multi-sensor system for full-scale structural monitoring based on advanced signal processing." Awarded by: University of New South Wales. School of Electrical Engineering and Telecommunications & School of Surveying and Spatial Information Systems, 2006. http://handle.unsw.edu.au/1959.4/27284.

Full text
Abstract:
Modern civil structures as well as loads on them are still too complex to be accurately modeled or simulated. Therefore, structural failures and structural defects are NOT uncommon! More and more full-scale structural monitoring systems have been deployed in order to monitor how structures behave under various loading conditions. This research focuses on how to maximise benefits from such full-scale measurements by employing advanced digital signal processing techniques. This study is based on accelerometer and GPS data collected on three very different structures, namely, the steel tower in Tokyo, the long and slender suspension bridge in Hong Kong, and the tall office tower in Sydney, under a range of loading conditions, i.e., typhoon, earthquake, heavy traffic, and small-scale wind. Systematic analysis of accelerometer and GPS data has demonstrated that the two sensors complement each other in monitoring the static, quasi-static and dynamic movements of the structures. It has also been confirmed that the Finite Element Model could under-estimate the natural frequencies of structures by more than 40% in some cases. The effectiveness of using wavelets to de-noise GPS measurements has been demonstrated. The weaknesses and strengths of the accelerometer and GPS have been identified, and a framework has been developed on how to integrate the two as well as how to optimize the integration. A three-dimensional spectral analysis framework has been developed which can track the temporal evolution of all the frequency components and effectively represent the result in a 3D spectrogram of frequency, time and magnitude. The dominant frequency can also be tracked on the 3D mesh to vividly illustrate the damping signature of the structure. The frequency-domain coherence analysis based on this 3D framework can further enhance the detection of common signals between sensors. The developed framework can significantly improve the visualized performance of the integrated system without increasing hardware costs. Indoor experiments have shown the excellent characteristics of optical fibre Bragg gratings (FBGs) for deformation monitoring. An innovative and low-cost approach has been developed to measure the shift of the FBG's central wavelength. Furthermore, a schematic design has been completed to multiplex FBGs in order to enable distributed monitoring. In collaboration with the University of Sydney, the first Australian full-scale structural monitoring system of GPS and accelerometer has been deployed on the Latitude Tower in Sydney to support current and future research.
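The core of the 3D spectral analysis, a spectrogram of frequency, time and magnitude with the dominant frequency tracked across time slices, can be sketched with standard tools as below; the sampling rate, window length, and the synthetic decaying mode in the example are illustrative assumptions.

```python
import numpy as np
from scipy.signal import spectrogram

def dominant_frequency_track(acceleration, fs, nperseg=1024):
    """Compute the time-frequency spectrogram of an accelerometer channel
    and track the dominant frequency in each time slice -- the kind of 3D
    (frequency, time, magnitude) representation described above."""
    f, t, Sxx = spectrogram(acceleration, fs=fs, nperseg=nperseg)
    dominant = f[np.argmax(Sxx, axis=0)]
    return f, t, Sxx, dominant

# Example: a decaying 0.2 Hz structural mode sampled at 20 Hz.
fs = 20.0
time = np.arange(0, 600, 1 / fs)
signal = np.exp(-0.002 * time) * np.sin(2 * np.pi * 0.2 * time)
f, t, Sxx, dominant = dominant_frequency_track(signal, fs)
```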
APA, Harvard, Vancouver, ISO, and other styles
43

Simanaitis, Konrad. "Hur indsutri 4.0 påverkar den anställdas arbetsroll inom produktion." Thesis, Tekniska Högskolan, Jönköping University, JTH, Logistik och verksamhetsledning, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:hj:diva-49510.

Full text
Abstract:
Purpose - The purpose of this thesis is to investigate how Industry 4.0, with a focus on automation, can influence the need for production personnel in production. Industry 4.0 is upon us; it is therefore interesting to ask what importance production personnel have for production today, and how the production staff's role in production is affected by increased automation. Method - In order to fulfil the purpose of this thesis, a study was carried out to illuminate the problem area. Together with a literature study, a solid theoretical frame of reference was created that further grounds the problem area. Lastly, interviews were conducted with three businesses. Once all this information was gathered, an analysis was made that led to conclusions. Findings - The result of this study shows that employees today are not as interchangeable as feared. The first step is semi-automation, but that does not mean the end of the human worker. Instead, there will be cooperation between employees and machines. There is a risk that the number of employees in production will decrease, but machines will never replace them entirely. The result shows that production personnel have no reason to fear technology or see it as a threat, because the employee today is still an important resource. Implications - According to the result, automation as a whole is not as big a threat today as feared; the first step is instead semi-automation. Since the writer chose to look at bigger companies instead of focusing on smaller ones that lack the possibility to even out the difference between automation and production personnel, it might have been interesting to have their view of the matter; it might have shown a different result. Limitations - This thesis strives to answer the above-mentioned questions; focus has been put on bigger and more developed companies. Furthermore, no attempt has been made to look closer at the employees of these companies; instead, all interviews were conducted with personnel working at an executive level as well as those knowledgeable in automation.
APA, Harvard, Vancouver, ISO, and other styles
44

McKay, Cory. "Automatic music classification with jMIR." Thesis, McGill University, 2010. http://digitool.Library.McGill.CA:8881/R/?func=dbin-jump-full&object_id=92213.

Full text
APA, Harvard, Vancouver, ISO, and other styles
45

Ferguson, Robert W. III. "Automatic segmentation in concert recordings." Thesis, McGill University, 2004. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=81489.

Full text
Abstract:
"...music is an art that exists in point of time." Aaron Copland, What to Listen for in Music<br>Few definitions are adequate to describe music, but a "point of time" is a concept with which people are familiar. When musicians give concerts they try to create these points in a context, which allows the audience to observe each moment by itself. Concert practice has developed to define the edges of musical points, guided by cues such as clapping, pauses, and concert program notes.<br>This masters thesis investigates how to analyze concert recordings of Western music and their program notes to produce segments which best fit the boundaries of musical points. Modern segmentation techniques are reviewed and a new method specific to concert recordings is examined.
APA, Harvard, Vancouver, ISO, and other styles
46

Burlet, Gregory. "Automatic guitar tablature transcription online." Thesis, McGill University, 2013. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=117224.

Full text
Abstract:
Manually transcribing guitar tablature from an audio recording is a difficult and time-consuming process, even for experienced guitarists. While several algorithms have been developed to automatically extract the notes occurring in an audio recording, and several algorithms have been developed to produce guitar tablature arrangements of notes occurring in a music score, no frameworks have been developed to facilitate the combination of these algorithms. This work presents a web-based guitar tablature transcription framework capable of generating guitar tablature arrangements directly from an audio recording. The implemented transcription framework, entitled Robotaba, facilitates the creation of web applications in which polyphonic transcription and guitar tablature arrangement algorithms can be embedded. Such a web application is implemented, resulting in a unified system that is capable of transcribing guitar tablature from a digital audio recording and displaying the resulting tablature in the web browser. The performance of the implemented polyphonic transcription and guitar tablature arrangement algorithms is evaluated using several metrics on a new dataset of manual transcriptions gathered from tablature websites.
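The arrangement step of such a framework ultimately has to decide, for every transcribed pitch, which string and fret should sound it. The sketch below shows only that candidate-enumeration step for a standard-tuned guitar; the fret limit is an assumption, and a full arranger would additionally search these candidates for a playable, low-movement fingering.

```python
# Standard-tuning open-string MIDI pitches, low E to high E.
STANDARD_TUNING = (40, 45, 50, 55, 59, 64)
MAX_FRET = 19

def candidate_positions(midi_pitch, tuning=STANDARD_TUNING, max_fret=MAX_FRET):
    """All (string, fret) pairs that can sound the given MIDI pitch.
    String 0 is the low E string."""
    positions = []
    for string, open_pitch in enumerate(tuning):
        fret = midi_pitch - open_pitch
        if 0 <= fret <= max_fret:
            positions.append((string, fret))
    return positions

# Example: middle C (MIDI 60) can be played in several positions.
print(candidate_positions(60))  # [(1, 15), (2, 10), (3, 5), (4, 1)]
```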
APA, Harvard, Vancouver, ISO, and other styles
47

Sivertsson, Martin. "Optimization of Fuel Consumption in a Hybrid Powertrain." Thesis, Linköpings universitet, Fordonssystem, 2010. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-63563.

Full text
Abstract:
Increased environmental awareness together with new legislative demands on lowered emissions and rising fuel costs have put focus on increasing the fuel efficiency of new vehicles. Hybridization is a way to increase the efficiency of the powertrain. The Haldex electric Torque Vectoring Device is a rear axle with a built-in electric motor, designed to combine all-wheel drive with hybrid functionality. A method is developed for creating a real-time control algorithm that minimizes the fuel consumption. First, the consumption reduction potential of the system is investigated using Dynamic Programming. A real-time control algorithm is then devised that indicates a substantial consumption reduction potential compared to all-wheel drive, under the condition that the assumed and measured efficiencies are accurate. The control algorithm is created using an equivalent consumption minimization strategy and is implemented without any knowledge of the future driving mission. Two ways of adapting the control according to the battery state of charge are proposed and investigated. The controller optimizes the torque distribution for the current gear and also assists the driver by recommending the gear which would give the lowest consumption. The simulations indicate a substantial fuel consumption reduction potential even though the system is primarily an all-wheel drive concept. The results from vehicle tests show that the control system is charge sustaining and the driveability is deemed good by the test drivers.
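An equivalent consumption minimization strategy boils down to, at every instant, picking the torque split that minimizes fuel power plus an equivalence factor times battery power. The sketch below illustrates that single step over a discretized motor-torque grid; the fuel-rate model, limits, and equivalence-factor handling (including the state-of-charge adaptation discussed in the thesis) are placeholders rather than the actual controller.

```python
import numpy as np

def ecms_torque_split(torque_demand, speed, fuel_power, motor_torque_max, s):
    """One instant of an equivalent consumption minimization strategy:
    choose the electric-motor torque that minimizes
        fuel power + s * battery power
    for the current torque demand (N*m) and shaft speed (rad/s).
    `fuel_power(engine_torque, speed)` must return engine fuel power in W."""
    best_cost, best_t_motor = np.inf, 0.0
    for t_motor in np.linspace(-motor_torque_max, motor_torque_max, 101):
        t_engine = torque_demand - t_motor
        if t_engine < 0.0:
            continue  # engine braking not considered in this sketch
        battery_power = t_motor * speed          # > 0 when discharging
        cost = fuel_power(t_engine, speed) + s * battery_power
        if cost < best_cost:
            best_cost, best_t_motor = cost, t_motor
    return best_t_motor, torque_demand - best_t_motor
```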
APA, Harvard, Vancouver, ISO, and other styles
48

Cui, Shenshen. "Fully Automatic Segmentation of White Matter Lesions from Multispectral Magnetic Resonance Imaging Data." Thesis, Uppsala University, Department of Information Technology, 2010. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-122650.

Full text
Abstract:
A fully automatic white matter lesion segmentation method has been developed and evaluated. The method uses multispectral magnetic resonance imaging (MRI) data (T1, T2 and Proton Density). First, fuzzy c-means (FCM) was used to segment normal brain tissues (white matter, grey matter, and cerebrospinal fluid). The holes in normal white matter were used to sample the WML intensities in the different images. The segmentation of WML was optimized by a graph cut approach. The method was trained using 9 manually segmented datasets and evaluated by comparison to 11 other manually segmented, and visually evaluated, datasets. The graph cut part of the automatic segmentation requires, on average, 30 seconds per dataset. The results correlated well (r=0.954) with a manually created reference that was supervised by two neuroradiologists.
Key words: white matter lesion, automatic segmentation, graph cuts, MRI, PIVUS
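The first stage of the pipeline, soft clustering of the normal brain tissues, can be illustrated with a bare-bones fuzzy c-means implementation as below; the feature representation, number of clusters, and initialization are assumptions, and the subsequent lesion sampling and graph-cut refinement are not shown.

```python
import numpy as np

def fuzzy_c_means(x, n_clusters=3, m=2.0, n_iter=100, seed=0):
    """Bare-bones fuzzy c-means. x has shape (n_voxels, n_features), e.g.
    one row of (T1, T2, PD) intensities per brain voxel. Returns cluster
    centres and the soft membership matrix of shape (n_voxels, n_clusters)."""
    rng = np.random.default_rng(seed)
    u = rng.random((x.shape[0], n_clusters))
    u /= u.sum(axis=1, keepdims=True)
    for _ in range(n_iter):
        um = u ** m
        centres = (um.T @ x) / um.sum(axis=0)[:, None]
        # Squared distances of every voxel to every centre (floored to
        # avoid division by zero), then the standard membership update.
        d2 = np.maximum(((x[:, None, :] - centres[None, :, :]) ** 2).sum(-1), 1e-12)
        inv = d2 ** (-1.0 / (m - 1.0))
        u = inv / inv.sum(axis=1, keepdims=True)
    return centres, u
```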
APA, Harvard, Vancouver, ISO, and other styles
49

Taylor, William Patrick. "The design and fabrication of fully integrated magnetically actuated micromachined relays." Diss., Georgia Institute of Technology, 1997. http://hdl.handle.net/1853/13345.

Full text
APA, Harvard, Vancouver, ISO, and other styles
50

Kaya, Muammer. "The effect of air flow rate and froth thickness on batch and continuous flotation kinetics /." Thesis, McGill University, 1985. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=63358.

Full text
APA, Harvard, Vancouver, ISO, and other styles