Dissertations on the topic "Automated data processing systems"

To see the other types of publications on this topic, follow the link: Automated data processing systems.

Cite a source in APA, MLA, Chicago, Harvard, and other citation styles

Select a source type:

Consult the top 50 dissertations for your research on the topic "Automated data processing systems."

Next to every entry in the bibliography there is an "Add to bibliography" option. Use it, and a citation for the selected work will be generated automatically in the citation style you need (APA, MLA, Harvard, Chicago, Vancouver, etc.).

You can also download the full text of the scholarly publication as a PDF and read its online abstract, whenever these are available in the metadata.

Browse dissertations from a wide range of disciplines and compile your bibliography correctly.

1

李少彬 and Siu-pan Li. „The validity of the use of automated evaluation systems as architectural design aids“. Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2000. http://hub.hku.hk/bib/B4257562X.

2

Eskenazi, Cem. „An automated visual inspection system for bare hybrid boards“. Thesis, McGill University, 1985. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=63302.

3

Slabber, Frans Bresler. „Semi-automated extraction of structural orientation data from aerospace imagery combined with digital elevation models“. Thesis, Rhodes University, 1996. http://hdl.handle.net/10962/d1005614.

Annotation:
A computer-based method for determining the orientation of planar geological structures from remotely sensed images, utilizing digital geological images and digital elevation models (DEMs), is developed and assessed. The method relies on operator skill and experience to recognize geological structure traces on images, and then employs software routines (GEOSTRUC©) to calculate the orientation of selected structures. The operator selects three points on the trace of a planar geological feature as seen on a digital geological image that is co-registered with a DEM of the same area. The orientation of the plane that contains the three points is determined using vector algebra, as sketched below. The program generates an ASCII data file which contains the orientation data as well as the geographical location of the measurements. This ASCII file can then be utilized in further analysis of the orientation data. The software development kit (SDK) for TNTmips v5.00, from MicroImages Inc. and operating in the X Windows environment, was employed to construct the software. The Watcom C/C++ Development Environment was used to generate the executable program, GEOSTRUC©. GEOSTRUC© was tested in two case studies. The case studies utilized digital data derived using different techniques and from different sources, varying in scale and resolution, to illustrate the versatility of the program and its application to a wide range of data types. On the whole, the results obtained using the GEOSTRUC© analyses compare favourably to field data from each test area. Use of the method to determine the orientation of axial planes in the case study revealed its usefulness as a powerful analytic tool on a macroscopic scale. The method should not be applied in areas with low variation in relief, as it proved to be less accurate there. Advancements in imaging technology will serve to create images with better resolution, which will, in turn, improve the overall accuracy of the method.
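The three-point construction lends itself to a compact implementation. The following is a minimal sketch of the vector-algebra step, assuming local metric coordinates with x = east, y = north and z = elevation; it illustrates the idea only and does not reproduce the GEOSTRUC© code:

```python
import numpy as np

def plane_orientation(p1, p2, p3):
    """Dip and dip direction of the plane through three (x, y, z) points.

    Assumes a local metric frame with x = east, y = north, z = elevation
    (GEOSTRUC's own conventions are not given in the abstract).
    """
    p1, p2, p3 = (np.asarray(p, dtype=float) for p in (p1, p2, p3))
    n = np.cross(p2 - p1, p3 - p1)      # normal of the plane through the points
    if n[2] < 0:                        # force the normal to point upward
        n = -n
    nx, ny, nz = n / np.linalg.norm(n)
    dip = np.degrees(np.arccos(nz))     # 0 deg = horizontal plane
    # The horizontal component of the upward normal points down-dip
    dip_direction = (np.degrees(np.arctan2(nx, ny)) + 360.0) % 360.0
    return dip, dip_direction

# A plane dipping 45 degrees due east (azimuth 090)
print(plane_orientation((0, 0, 0), (0, 100, 0), (100, 0, -100)))
```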
4

Heywood, James K. „AUTOMATED TESTING OF THE ADVANCED DATA ACQUISITION AND PROCESSING SYSTEM“. International Foundation for Telemetering, 2001. http://hdl.handle.net/10150/606456.

Annotation:
International Telemetering Conference Proceedings / October 22-25, 2001 / Riviera Hotel and Convention Center, Las Vegas, Nevada
Software and techniques are described for testing the Advanced Data Acquisition and Processing System (ADAPS), the primary flight-test telemetry system used at Edwards AFB, California. The software acts as an additional simulation capability and moves the simulation definition process into a realm where data is formed by means of a high-order language, enabling the creation of more sophisticated simulated test data. Extension of the described techniques to applications other than testing is also discussed.
5

Bodner, Douglas Anthony. „Real-time control approaches to deadlock management in automated manufacturing systems“. Diss., Georgia Institute of Technology, 1996. http://hdl.handle.net/1853/25607.

6

Teske, Alexander. „Automated Risk Management Framework with Application to Big Maritime Data“. Thesis, Université d'Ottawa / University of Ottawa, 2018. http://hdl.handle.net/10393/38567.

Annotation:
Risk management is an essential tool for ensuring the safety and timeliness of maritime operations and transportation. Some of the many risk factors that can compromise the smooth operation of maritime activities include harsh weather and pirate activity. However, identifying and quantifying the extent of these risk factors for a particular vessel is not a trivial process. One challenge is that processing the vast amounts of automatic identification system (AIS) messages generated by the ships requires significant computational resources. Another is that the risk management process partially relies on human expertise, which can be time-consuming and error-prone. In this thesis, an existing Risk Management Framework (RMF) is augmented to address these issues. A parallel/distributed version of the RMF is developed to efficiently process large volumes of AIS data and assess the risk levels of the corresponding vessels in near real time. A genetic fuzzy system is added to the RMF's Risk Assessment module in order to automatically learn the fuzzy rule base governing the risk assessment process, thereby reducing the reliance on human domain experts. A new weather risk feature is proposed, and an existing regional hostility feature is extended to automatically learn about pirate activity by ingesting unstructured news articles and incident reports. Finally, a geovisualization tool is developed to display the position and risk levels of ships at sea. Together, these contributions pave the way towards truly automatic risk management, a crucial component of modern maritime solutions. The outcomes of this thesis will contribute to enhance Larus Technologies' Total::Insight, a risk-aware decision support system successfully deployed in maritime scenarios.
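As a toy illustration of the kind of fuzzy rule evaluation the Risk Assessment module automates, consider the sketch below; the membership breakpoints, feature names and the single rule are invented for illustration and are not taken from the RMF:

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular fuzzy membership: rises from a, peaks at b, falls to c."""
    return float(np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0))

def vessel_risk(wind_speed_ms, pirate_incidents_per_month):
    """Toy fuzzy risk score in [0, 1]; all breakpoints are hypothetical."""
    wind_high = tri(wind_speed_ms, 15, 30, 45)
    hostility_high = tri(pirate_incidents_per_month, 2, 10, 18)
    # Rule: IF wind is high OR the region is hostile THEN risk is high
    # (max acts as the fuzzy OR).
    return max(wind_high, hostility_high)

print(vessel_risk(wind_speed_ms=25.0, pirate_incidents_per_month=4))  # ~0.67
```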
7

Naik, Pranab Sabitru. „Design and implementation of a fully automated real-time s-parameter imaging system“. Thesis, 2004. http://sunzi.lib.hku.hk/hkuto/record/B30708758.

8

Xozwa, Thandolwethu. „Automated statistical audit system for a government regulatory authority“. Thesis, Nelson Mandela Metropolitan University, 2015. http://hdl.handle.net/10948/6061.

Annotation:
Governments all over the world face numerous challenges in running their countries on a daily basis, and the predominant challenges are those involving statistical methodologies. Official statistics are very important to South Africa's infrastructure, so an effort should be made to reduce the challenges that occur during their development. For official statistics to be developed successfully, quality standards need to be built into an organisational framework and form a system of architecture (Statistics New Zealand 2009:1). Therefore, this study seeks to develop a statistical methodology that is appropriate and scientifically correct, using an automated statistical system for audits in government regulatory authorities. The study makes use of Mathematica to provide guidelines on how to develop and use an automated statistical audit system. A comprehensive literature study was conducted using existing secondary sources. A quantitative research paradigm was adopted to empirically assess the demographic characteristics of tenants of Social Housing Estates and their perceptions of the rental units they inhabit; more specifically, a descriptive study was undertaken. Furthermore, a sample was selected by means of convenience sampling for a case study on the SHRA to assess the respondents' biographical information. From this sample, a pilot study was conducted investigating the general perceptions of the respondents regarding the physical conditions and quality of their units. The technical development of an automated statistical audit system is then discussed: the development and use of a questionnaire design tool, statistical analysis and reporting, and how the Mathematica software served as a platform for developing the system. The findings provide insights into how government regulatory authorities can best utilise automated statistical audits for regulation purposes. It is hoped that the findings will offer government regulatory authorities practical suggestions or solutions for generating official statistics for regulatory purposes, and that the suggestions for future research will inspire further investigation of automated statistical audit systems, statistical analysis, automated questionnaire development, and government regulatory authorities.
9

Rossman, Mark A. „Automated Detection of Hematological Abnormalities through Classification of Flow Cytometric Data Patterns“. FIU Digital Commons, 2011. http://digitalcommons.fiu.edu/etd/344.

Annotation:
Flow Cytometry analyzers have become trusted companions due to their ability to perform fast and accurate analyses of human blood. The aim of these analyses is to determine the possible existence of abnormalities in the blood that have been correlated with serious disease states, such as infectious mononucleosis, leukemia, and various cancers. Though these analyzers provide important feedback, it is always desired to improve the accuracy of the results. This is evidenced by the occurrences of misclassifications reported by some users of these devices. It is advantageous to provide a pattern interpretation framework that is able to provide better classification ability than is currently available. Toward this end, the purpose of this dissertation was to establish a feature extraction and pattern classification framework capable of providing improved accuracy for detecting specific hematological abnormalities in flow cytometric blood data. This involved extracting a unique and powerful set of shift-invariant statistical features from the multi-dimensional flow cytometry data and then using these features as inputs to a pattern classification engine composed of an artificial neural network (ANN). The contribution of this method consisted of developing a descriptor matrix that can be used to reliably assess if a donor’s blood pattern exhibits a clinically abnormal level of variant lymphocytes, which are blood cells that are potentially indicative of disorders such as leukemia and infectious mononucleosis. This study showed that the set of shift-and-rotation-invariant statistical features extracted from the eigensystem of the flow cytometric data pattern performs better than other commonly-used features in this type of disease detection, exhibiting an accuracy of 80.7%, a sensitivity of 72.3%, and a specificity of 89.2%. This performance represents a major improvement for this type of hematological classifier, which has historically been plagued by poor performance, with accuracies as low as 60% in some cases.
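The reported figures follow directly from the standard confusion-matrix definitions. The sketch below shows the arithmetic; the counts are hypothetical, chosen only to give numbers of the same order as those reported:

```python
def classifier_metrics(tp, fn, tn, fp):
    """Accuracy, sensitivity and specificity from confusion-matrix counts."""
    sensitivity = tp / (tp + fn)              # abnormal samples correctly flagged
    specificity = tn / (tn + fp)              # normal samples correctly passed
    accuracy = (tp + tn) / (tp + fn + tn + fp)
    return accuracy, sensitivity, specificity

# Hypothetical counts: 100 abnormal and 100 normal donors
print(classifier_metrics(tp=72, fn=28, tn=89, fp=11))  # ~(0.805, 0.72, 0.89)
```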
10

Kuehl, Phillip Anthony. „Real-time processing of electromyograms in an automated hand-forearm data collection and analysis system“. Thesis, Kansas State University, 2015. http://hdl.handle.net/2097/19087.

Annotation:
Master of Science
Department of Electrical and Computer Engineering
Steven Warren
Handgrip contractions are a useful exercise for assessing muscle fatigue in the forearm musculature. Most conventional hand-forearm ergometer systems require the researcher to manually guide subject activity, collect subject data, and assess subject fatigue after it has occurred. Since post-processing tools are not standardized for this type of experiment, researchers resort to building their own tools. This process can make comparing results between research groups difficult. This thesis presents updates to a hand-forearm ergometer system that automate the control, data-acquisition, and data-analysis mechanisms. The automated system utilizes a LabVIEW virtual instrument as the system centerpiece; it provides the subject/researcher interfaces and coordinates data acquisition from both traditional and new sensors. The system also processes the hand-forearm data within the LabVIEW environment as the data are collected. This allows the researcher to better understand the onset of subject fatigue while an experiment is in progress. System upgrades relative to prior work include the addition of new parameters to the researcher display, a change in the subject display from a binary up-down display to a sliding bar for better control over subject grip state, and a software update from a simple data acquisition and display system to a real-time processing system. The toolset has proven to be a viable support resource for experimental studies performed in the Kansas State University Human Exercise Physiology Laboratory that target muscle fatigue in human forearms. Initial data acquired during these tests indicate the viability of the system to acquire consistent and physiologically meaningful data while providing a useable toolset for follow-on data analyses.
11

Shelton, Debra Kay. „A selection model for automated guided vehicles“. Thesis, Virginia Polytechnic Institute and State University, 1985. http://hdl.handle.net/10919/101465.

Annotation:
This research identifies the attributes to be considered in the selection of an automated guided vehicle (AGV). A distinction is made between automated guided vehicles (AGVs) and an automated guided vehicle system (AGVS); this research is concerned only with the selection of the vehicles themselves. A selection model is developed which forces the user to evaluate his requirements and preferences for AGV attributes. The first step of the model allows the user to enter his specifications for AGV attributes which are applicable to his production environment. The second step is for the user to determine 8-15 attributes to use as selection criteria. In the third phase, the user inputs his preferences and priorities with respect to the attributes chosen as selection criteria in the second step. Based on this information, the selection model ranks the AGV models in the feasible set (a sketch of this ranking step follows below). A description of the model and a numerical example are included. Steps 1 and 2, described above, are implemented using an R:BASE™ program. The BASIC computer language was used to perform the interrogation of the user with respect to his priorities and preferences among attributes in Step 3. The IBM PC™ is the hardware chosen for running the selection model.
M.S.
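A minimal sketch of the preference-weighted ranking performed in Step 3 follows; the attribute names, weights and scores are invented for illustration and are not taken from the original R:BASE™/BASIC implementation:

```python
# Candidate AGV models with normalised attribute scores in [0, 1]
# (higher is better); values here are purely illustrative.
candidates = {
    "AGV-A": {"payload": 0.9, "speed": 0.6, "cost": 0.7},
    "AGV-B": {"payload": 0.5, "speed": 0.9, "cost": 0.8},
}
# User priorities elicited in Step 3, normalised to sum to 1
weights = {"payload": 0.5, "speed": 0.3, "cost": 0.2}

def score(attrs):
    """Weighted-sum utility of one candidate."""
    return sum(weights[name] * value for name, value in attrs.items())

ranking = sorted(candidates, key=lambda m: score(candidates[m]), reverse=True)
print(ranking)  # ['AGV-A', 'AGV-B'] with these illustrative numbers
```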
12

Van der Walt, Craig. „An investigation into the practical implementation of speech recognition for data capturing“. Thesis, Cape Technikon, 1993. http://hdl.handle.net/20.500.11838/1156.

Annotation:
Thesis (Master Diploma (Technology))--Cape Technikon, Cape Town, 1993
A study into the practical implementation of speech recognition for the purposes of data capturing within Telkom SA is described. As data capturing is increasing in demand, a more efficient method of capturing is sought. The technology relating to speech recognition is examined, and practical guidelines for selecting a speech recognition system are described. These guidelines are used to show how commercially available systems can be evaluated. Specific tests on a selected speech recognition system are described, relating to the accuracy and adaptability of the system. The results obtained illustrate why, at present, speech recognition systems are not advisable for the purpose of data capturing. The results also demonstrate how the selection of keywords can affect system performance. Areas of further research are highlighted relating to recognition performance and vocabulary selection.
13

Swanepoel, Petrus Johannes. „Omnidirectional image sensing for automated guided vehicle“. Thesis, Bloemfontein : Central University of Technology, Free State, 2009. http://hdl.handle.net/11462/39.

Annotation:
Thesis (M. Tech.) -- Central University of Technology, Free State, 2009
Automated Guided Vehicles (AGVs) have many different design specifications, although they all have certain design features in common: for instance, they are designed to follow predetermined paths, they need to be aware of their surroundings and changes to their surroundings, and they are designed to house sensors for navigation and obstacle avoidance. In this study an AGV platform was developed by modifying an electric wheelchair. A serial port interface was developed between a computer and the control unit of the electric wheelchair, which enables the computer to control the movements of the platform. Different sensors were investigated to determine which would be best suited and most effective to avoid collisions. The sensors chosen were mounted on the AGV and a programme was developed to enable the sensors to assist in avoiding obstacles. An imaging device as an additional sensor system for the AGV was investigated: the image produced by a camera and dome mirror was processed into a panoramic image representing an entire 360° view of the AGV's surroundings (a sketch of such an unwarping step follows below). The reason for this part of the research was to enable the user to make corrections to the AGV's path if it became stuck along the track it was following. The entire system was also made completely wireless to improve the flexibility of the AGV's applications.
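A generic polar-to-rectangular unwarping of the dome-mirror image can be sketched as follows; the thesis does not spell out its algorithm, so the image centre and the inner/outer mirror radii are assumed to be known from calibration:

```python
import numpy as np

def unwrap_omni(img, cx, cy, r_in, r_out, width=720):
    """Unwarp a circular mirror image into a 360-degree panoramic strip
    (nearest-neighbour sampling; a sketch, not the thesis's implementation)."""
    height = int(r_out - r_in)
    theta = np.linspace(0.0, 2.0 * np.pi, width, endpoint=False)
    radii = np.linspace(r_in, r_out, height)
    rr, tt = np.meshgrid(radii, theta, indexing="ij")
    xs = (cx + rr * np.cos(tt)).astype(int).clip(0, img.shape[1] - 1)
    ys = (cy + rr * np.sin(tt)).astype(int).clip(0, img.shape[0] - 1)
    return img[ys, xs]                      # shape (height, width)

demo = np.random.randint(0, 255, (480, 640), dtype=np.uint8)
print(unwrap_omni(demo, cx=320, cy=240, r_in=60, r_out=220).shape)  # (160, 720)
```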
14

Hernández, Correa Evelio. „Control of nonlinear systems using input-output information“. Diss., Georgia Institute of Technology, 1992. http://hdl.handle.net/1853/11176.

15

Munnecom, Lorenna, and Miguel Chaves de Lemos Pacheco. „Exploration of an Automated Motivation Letter Scoring System to Emulate Human Judgement“. Thesis, Högskolan Dalarna, Mikrodataanalys, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:du-34563.

Annotation:
As the popularity of the master's in data science at Dalarna University increases, so does the number of applicants. The aim of this thesis was to explore different approaches to an automated motivation letter scoring system which could emulate human judgement and automate the process of candidate selection. Several steps such as image processing and text processing were required to retrieve numerous features which could lead to the identification of the factors graded by the program managers. Grammatical features and advanced textual features were extracted from the motivation letters, followed by the application of topic modelling methods to extract the probability of each topic occurring within a motivation letter. Furthermore, correlation analysis was applied to quantify the association between the features and the different factors graded by the program managers, followed by ordinal logistic regression and random forests to build models with the most impactful variables. Finally, the naïve Bayes algorithm, random forests and support vector machines were used, first for classification and then for prediction purposes. These results were not promising, as the factors were not accurately identified. Nevertheless, the authors suspect that the factors may be strongly related to the prominence of specific topics within a motivation letter, which can lead to further research.
16

Palmer, David Donald. „Modeling uncertainty for information extraction from speech data /“. Thesis, Connect to this title online; UW restricted, 2001. http://hdl.handle.net/1773/5834.

17

Zhao, Guang, and 趙光. „Automatic boundary extraction in medical images based on constrained edge merging“. Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2000. http://hub.hku.hk/bib/B31223904.

18

Park, Jonghun. „Structural analysis and control of resource allocation systems using petri nets“. Diss., Georgia Institute of Technology, 2000. http://hdl.handle.net/1853/24529.

19

Goussard, George Willem. „Unsupervised clustering of audio data for acoustic modelling in automatic speech recognition systems“. Thesis, Stellenbosch : University of Stellenbosch, 2011. http://hdl.handle.net/10019.1/6686.

Annotation:
Thesis (MScEng (Electrical and Electronic Engineering))--University of Stellenbosch, 2011.
ENGLISH ABSTRACT: This thesis presents a system that is designed to replace the manual process of generating a pronunciation dictionary for use in automatic speech recognition. The proposed system has several stages. The first stage segments the audio into what will be known as the subword units, using a frequency domain method. In the second stage, dynamic time warping is used to determine the similarity between the segments of each possible pair of these acoustic segments. These similarities are used to cluster similar acoustic segments into acoustic clusters. The final stage derives a pronunciation dictionary from the orthography of the training data and corresponding sequence of acoustic clusters. This process begins with an initial mapping between words and their sequence of clusters, established by Viterbi alignment with the orthographic transcription. The dictionary is refined iteratively by pruning redundant mappings, hidden Markov model estimation and Viterbi re-alignment in each iteration. This approach is evaluated experimentally by applying it to two subsets of the TIMIT corpus. It is found that, when test words are repeated often in the training material, the approach leads to a system whose accuracy is almost as good as one trained using the phonetic transcriptions. When test words are not repeated often in the training set, the proposed approach leads to better results than those achieved using the phonetic transcriptions, although the recognition is poor overall in this case.
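The pairwise similarity step in the second stage is classical dynamic time warping; a self-contained sketch over generic frame vectors (the thesis's exact acoustic features are not given in the abstract) looks like this:

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic time warping distance between two sequences of feature frames,
    each of shape (n_frames, n_features), using Euclidean frame distances."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

seg1 = np.random.rand(40, 13)   # two acoustic segments of 13-dimensional frames
seg2 = np.random.rand(55, 13)
print(dtw_distance(seg1, seg2))
```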
20

Swarnkar, Divya. „Experience and analysis of the real time data acquisition system“. Access to citation, abstract and download form provided by ProQuest Information and Learning Company; downloadable PDF file, 59 p, 2005. http://proquest.umi.com/pqdweb?did=994252331&sid=12&Fmt=2&clientId=8331&RQT=309&VName=PQD.

Annotation:
Thesis (M.S.)--University of Delaware, 2005.
Principal faculty advisors: Martin Swany, Dept. of Computer & Information Sciences; and David Seckel, Dept. of Physics & Astronomy. Includes bibliographical references.
21

Wark, Timothy J. „Multi-modal speech processing for automatic speaker recognition“. Thesis, Queensland University of Technology, 2001.

22

Li, Xiaojing, School of Electrical Engineering & Telecommunications & School of Surveying & Spatial Information Systems, UNSW. „Optimal integrated multi-sensor system for full-scale structural monitoring based on advanced signal processing“. Awarded by: University of New South Wales, School of Electrical Engineering and Telecommunications & School of Surveying and Spatial Information Systems, 2006. http://handle.unsw.edu.au/1959.4/27284.

Annotation:
Modern civil structures, as well as the loads on them, are still too complex to be accurately modeled or simulated; structural failures and structural defects are therefore not uncommon. More and more full-scale structural monitoring systems have been deployed in order to monitor how structures behave under various loading conditions. This research focuses on how to maximise the benefit of such full-scale measurements by employing advanced digital signal processing techniques. The study is based on accelerometer and GPS data collected on three very different structures, namely a steel tower in Tokyo, a long and slender suspension bridge in Hong Kong, and a tall office tower in Sydney, under a range of loading conditions: typhoon, earthquake, heavy traffic, and small-scale wind. Systematic analysis of accelerometer and GPS data has demonstrated that the two sensors complement each other in monitoring the static, quasi-static and dynamic movements of the structures. It has also been confirmed that the finite element model could under-estimate the natural frequencies of structures by more than 40% in some cases. The effectiveness of using wavelets to de-noise GPS measurements has been demonstrated. The weaknesses and strengths of accelerometers and GPS have been identified, and a framework has been developed for how to integrate the two as well as how to optimize the integration. A three-dimensional spectral analysis framework has been developed which can track the temporal evolution of all the frequency components and effectively represent the result in a 3D spectrogram of frequency, time and magnitude. The dominant frequency can also be tracked on the 3D mesh to vividly illustrate the damping signature of the structure. Frequency-domain coherence analysis based on this 3D framework can further enhance the detection of common signals between sensors. The developed framework can significantly improve the visualized performance of the integrated system without increasing hardware costs. Indoor experiments have shown the excellent characteristics of optical fibre Bragg gratings (FBGs) for deformation monitoring. An innovative, low-cost approach has been developed to measure the shift of an FBG's central wavelength. Furthermore, a schematic design has been completed to multiplex FBGs in order to enable distributed monitoring. In collaboration with the University of Sydney, the first Australian full-scale structural monitoring system of GPS and accelerometers has been deployed on the Latitude Tower in Sydney to support current and future research.
23

Gurung, Sanjaya. „Integrating environmental data acquisition and low cost Wi-Fi data communication“. Thesis, University of North Texas, 2009. https://digital.library.unt.edu/ark:/67531/metadc12131/.

Annotation:
This thesis describes environmental data collection and transmission from the field to a server using Wi-Fi. Also discussed are components, radio wave propagation, received-power calculations, and throughput tests. Measured received power was close to calculated and simulated values, and throughput tests gave satisfactory results. The thesis provides detailed, systematic procedures for Wi-Fi radio link setup and techniques to optimize the quality of a radio link.
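Received-power calculations of this kind commonly follow the Friis free-space model; the sketch below assumes that model, and the link parameters are hypothetical rather than those measured in the thesis:

```python
import math

def received_power_dbm(pt_dbm, gt_dbi, gr_dbi, freq_mhz, dist_km):
    """Friis free-space link budget: Pr = Pt + Gt + Gr - FSPL,
    with FSPL(dB) = 32.44 + 20*log10(f_MHz) + 20*log10(d_km)."""
    fspl_db = 32.44 + 20.0 * math.log10(freq_mhz) + 20.0 * math.log10(dist_km)
    return pt_dbm + gt_dbi + gr_dbi - fspl_db

# Hypothetical 2.4 GHz point-to-point link over 2 km with 15 dBi antennas
print(received_power_dbm(pt_dbm=15, gt_dbi=15, gr_dbi=15,
                         freq_mhz=2400, dist_km=2.0))   # about -61 dBm
```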
24

Chan, Wing Sze. „Semantic search of multimedia data objects through collaborative intelligence“. HKBU Institutional Repository, 2010. http://repository.hkbu.edu.hk/etd_ra/1171.

25

Robinson, James Beresford. „Lntp: the implementation and performance of a new local area network transport protocol“. Thesis, University of British Columbia, 1987. http://hdl.handle.net/2429/26523.

Annotation:
In the past it has been convenient to adopt existing long haul network (LHN) protocols for use in local area networks (LANs). However, due to the different operating parameters that exist between these two types of networks, it is not possible for a LHN protocol to fully exploit the characteristics of a LAN. Thus, the need arises for a protocol designed specifically for use in a LAN environment. LNTP is one such transport level protocol. It was designed for exclusive use in LANs, and thus does not incorporate those features which are not relevant to a LAN environment. The result of this is a simpler and more efficient protocol. As well, LNTP employs a novel deferred flow control strategy which minimizes the time that a transmitting process will be blocked. This thesis examines the implementation of LNTP in the 4.2 BSD UNIX operating system. Various measurements are taken, and LNTP's performance is compared to that of TCP/IP, a LHN protocol which is often used in LAN environments. Several formulas are developed to determine the optimum values for various LNTP parameters, and these theoretical results are compared to the experimentally observed values. We conclude that LNTP does indeed outperform TCP/IP. However, due to the overhead of the non-LNTP specific protocol layers, this improvement is not as great as it might be. Nonetheless, LNTP proves itself to be a viable replacement for TCP/IP.
26

Knights, MS. „Flexible shape models for image analysis in an automated lobster catch assessment system“. Thesis, Honours thesis, University of Tasmania, 2007. https://eprints.utas.edu.au/3013/2/1_front_Knights.pdf.

Annotation:
Management of fisheries is an evolving science combining multiple techniques and strategies, and the involvement of the computer in industry management and research continues to grow. The area of image analysis is currently limited, but it continues to grow as computing equipment becomes faster and cheaper. Locating a particular object in an image and processing information about that object is a significant task that requires a great deal of processing power and finesse. A functioning automated task that processes data on an object, such as a lobster, simply from an image of that object would greatly enhance the ability to manage a fishery with accurate, up-to-date data. The Tasmanian Aquaculture and Fisheries Institute (TAFI) intends to create a lobster-sorting tray which can be used on lobster fishing vessels as standard equipment. This tray would include functionality to take an image of the current lobster and estimate its sex and weight from pertinent measurements on the lobster. This research demonstrates that, through the use of the Active Shape Modeller (ASM), these details can be identified and processed from an image of the lobster. The ASM is used within an image analysis process, which can be fully automated, to draw out the required salient details of a lobster from an area of interest in the images. A series of experiments showed that the ASM was able to draw out and fully identify 77.3% of images in a test set of 216 images. Pertinent lengths and a sex estimate were then derived from these measurements, and 90% of the matched lobsters were sexed correctly.
27

Jung, Uk. „Wavelet-based Data Reduction and Mining for Multiple Functional Data“. Diss., Georgia Institute of Technology, 2004. http://hdl.handle.net/1853/5084.

Annotation:
Advanced technology, such as various types of automatic data acquisition, management, and networking systems, has created a tremendous capability for managers to access valuable production information to improve their operation quality and efficiency. Signal processing and data mining techniques are more popular than ever in many fields, including intelligent manufacturing. As data sets increase in size, their exploration, manipulation, and analysis become more complicated and resource-consuming. Timely synthesized information, such as functional data, is needed for product design, process trouble-shooting, quality/efficiency improvement and resource allocation decisions. A major obstacle in such intelligent manufacturing systems is that tools for processing the large volume of information coming from numerous stages of manufacturing operations are not available. Thus, the underlying theme of this thesis is to reduce the size of data in a mathematically rigorous framework, and to apply existing or new procedures to the reduced-size data for various decision-making purposes. The thesis first proposes a Wavelet-based Random-effect Model which can generate multiple functional data signals with wide fluctuations (between-signal variations) in the time domain. The random-effect wavelet atom position in the model has a locally focused impact, which distinguishes it from other traditional random-effect models in the biological field. For data-size reduction, in order to deal with heterogeneously selected wavelet coefficients for different single curves, the thesis introduces the newly defined Wavelet Vertical Energy metric of multiple curves and utilizes it for an efficient data reduction method (a sketch of the idea follows below). The proposed method selects important positions for the whole set of multiple curves by comparing each vertical energy metric against a threshold (Vertical Energy Threshold, VET), which is decided optimally based on an objective function balancing the reconstruction error against the data reduction ratio. Based on the class membership information of each signal, the thesis then proposes a Vertical Group-Wise Threshold method to increase the discriminative capability of the reduced-size data, so that the reduced data set retains salient differences between classes as much as possible. A real-life example (tonnage data) shows the proposed method is promising.
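The Vertical Energy idea can be sketched as follows; for simplicity this version keeps a fixed top fraction of coefficient positions rather than optimising the VET against the reconstruction-error/reduction-ratio objective described above:

```python
import numpy as np
import pywt  # PyWavelets

def vertical_energy_reduce(curves, wavelet="db4", keep_ratio=0.1):
    """Keep, for every curve, the wavelet positions with the largest
    vertical energy (squared coefficients summed over all curves)."""
    coeffs = np.array([np.concatenate(pywt.wavedec(c, wavelet)) for c in curves])
    v_energy = (coeffs ** 2).sum(axis=0)          # vertical energy per position
    k = max(1, int(keep_ratio * v_energy.size))
    keep = np.sort(np.argsort(v_energy)[-k:])     # shared positions for all curves
    return keep, coeffs[:, keep]

t = np.linspace(0, 8 * np.pi, 256)
curves = [np.sin(t) + 0.1 * np.random.randn(t.size) for _ in range(20)]
positions, reduced = vertical_energy_reduce(curves)
print(positions.size, reduced.shape)              # number kept, reduced matrix
```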
28

Parshakov, Ilia. „Automatic class labeling of classified imagery using a hyperspectral library“. Thesis, Lethbridge, Alta. : University of Lethbridge, Dept. of Geography, c2012, 2012. http://hdl.handle.net/10133/3372.

Annotation:
Image classification is a fundamental information extraction procedure in remote sensing that is used in land-cover and land-use mapping. Despite being considered as a replacement for manual mapping, it still requires some degree of analyst intervention. This makes the process of image classification time consuming, subjective, and error prone. For example, in unsupervised classification, pixels are automatically grouped into classes, but the user has to manually label the classes as one land-cover type or another. As a general rule, the larger the number of classes, the more difficult it is to assign meaningful class labels. A fully automated post-classification procedure for class labeling was developed in an attempt to alleviate this problem. It labels spectral classes by matching their spectral characteristics with reference spectra. A Landsat TM image of an agricultural area was used for performance assessment. The algorithm was used to label a 20- and 100-class image generated by the ISODATA classifier. The 20-class image was used to compare the technique with the traditional manual labeling of classes, and the 100-class image was used to compare it with the Spectral Angle Mapper and Maximum Likelihood classifiers. The proposed technique produced a map that had an overall accuracy of 51%, outperforming the manual labeling (40% to 45% accuracy, depending on the analyst performing the labeling) and the Spectral Angle Mapper classifier (39%), but underperformed compared to the Maximum Likelihood technique (53% to 63%). The newly developed class-labeling algorithm provided better results for alfalfa, beans, corn, grass and sugar beet, whereas canola, corn, fallow, flax, potato, and wheat were identified with similar or lower accuracy, depending on the classifier it was compared with.
vii, 93 leaves : ill., maps (some col.) ; 29 cm
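The Spectral Angle Mapper matching referred to above reduces to measuring the angle between a class's mean spectrum and each library spectrum; a minimal sketch with hypothetical six-band reflectances (not the study's actual library):

```python
import numpy as np

def spectral_angle(s, r):
    """Angle in radians between spectra s and r (smaller = more similar)."""
    cos = np.dot(s, r) / (np.linalg.norm(s) * np.linalg.norm(r))
    return np.arccos(np.clip(cos, -1.0, 1.0))

def label_class(class_mean, library):
    """Assign the library label whose reference spectrum is closest in angle."""
    return min(library, key=lambda name: spectral_angle(class_mean, library[name]))

library = {  # hypothetical 6-band mean reflectances
    "alfalfa": np.array([0.05, 0.08, 0.06, 0.45, 0.30, 0.20]),
    "fallow":  np.array([0.12, 0.15, 0.18, 0.25, 0.30, 0.32]),
}
unlabeled_class = np.array([0.06, 0.09, 0.07, 0.42, 0.28, 0.19])
print(label_class(unlabeled_class, library))      # -> alfalfa
```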
29

Karunanidhi, Karthikeyan. „ARROS; distributed adaptive real-time network intrusion response“. Ohio : Ohio University, 2006. http://www.ohiolink.edu/etd/view.cgi?ohiou1141074467.

30

Ghosh, Sushmita. „Real time data acquisition for load management“. Thesis, Virginia Tech, 1985. http://hdl.handle.net/10919/45726.

Annotation:
Demand for data transfer between computers has increased ever since the introduction of the personal computer (PC). Data communication on the personal computer is much more productive because the PC is an intelligent terminal that can connect to various hosts on the same I/O hardware circuit as well as execute processes on its own as an isolated system. Yet the PC on its own is useless for data communication: it requires a hardware interface circuit and software for controlling the handshaking signals and setting up communication parameters. Often the data is distorted by noise in the line, and such transmission errors are embedded in the data and require careful filtering. The thesis deals with the development of a data acquisition system that collects real-time load and weather data and stores them as a historical database for use in a load forecast algorithm in a load management system. A filtering technique has been developed that checks for transmission errors in the raw data. The microcomputers used in this development are the IBM PC/XT and the AT&T 3B2 supermicro computer.
Master of Science
31

Schifiliti, Robert P. „Use of Fire Plume Theory in the Design and Analysis of Fire Detector and Sprinkler Response“. Digital WPI, 2000. https://digitalcommons.wpi.edu/etd-theses/1155.

Annotation:
This thesis demonstrates how the response of fire detection and automatic sprinkler systems can be designed or analyzed. The intended audience is engineers involved in the design and analysis of fire detection and suppression systems. The material presented may also be of interest to engineers and researchers involved in related fields. National Bureau of Standards furniture calorimeter test data is compared to heat release rates predicted by a power-law fire growth model. A model for calculating fire gas temperatures and velocities along a ceiling, resulting from power-law fires is reviewed. Numerical and analytical solutions to the model are outlined and discussed. Computer programs are included to design and analyze the response of detectors and sprinklers. A program is also included to generate tables which can be used for design and analysis, in lieu of a computer. Examples show how fire protection engineers can use the techniques presented. The examples show how systems can be designed to meet specific goals. They also show how to analyze a system to determine if its response meets established goals. The examples demonstrate how detector response is sensitive to the detector's environment and physical characteristics.
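The power-law fire growth model referred to above is commonly written as Q = αt^p, with p = 2 giving the familiar t-squared design fires; the sketch below uses the conventional "medium" growth coefficient rather than any value fitted in the thesis:

```python
def heat_release_rate(t_seconds, alpha=0.01172, p=2):
    """Power-law fire growth Q = alpha * t**p, in kW.

    alpha = 0.01172 kW/s^2 is the standard 'medium' t-squared growth
    coefficient (about 1055 kW at 300 s); p = 2 gives a t-squared fire.
    """
    return alpha * t_seconds ** p

for t in (60, 150, 300):
    print(f"t = {t:3d} s   Q = {heat_release_rate(t):7.1f} kW")
# t =  60 s  Q = 42.2 kW ... t = 300 s  Q = 1054.8 kW
```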
32

Yaman, Sibel. „A multi-objective programming perspective to statistical learning problems“. Diss., Atlanta, Ga. : Georgia Institute of Technology, 2008. http://hdl.handle.net/1853/26470.

Annotation:
Thesis (Ph.D)--Electrical and Computer Engineering, Georgia Institute of Technology, 2009.
Committee Chair: Chin-Hui Lee; Committee Member: Anthony Yezzi; Committee Member: Evans Harrell; Committee Member: Fred Juang; Committee Member: James H. McClellan. Part of the SMARTech Electronic Thesis and Dissertation Collection.
33

Hawkins, Kevin Michael. „Development of an automated anesthesia system for the stabilization of physiological parameters in rodents“. Link to electronic thesis, 2003. http://www.wpi.edu/Pubs/ETD/Available/etd-0424103-105500/.

34

Neumann, Stefan, and Holger Giese. „Scalable compatibility for embedded real-time components via language progressive timed automata“. Universität Potsdam, 2013. http://opus.kobv.de/ubp/volltexte/2013/6385/.

Annotation:
The proper composition of independently developed components of an embedded real-time system is complicated by the fact that, besides the functional behavior, the non-functional properties, and in particular the timing, also have to be compatible. Nowadays, related compatibility problems have to be addressed in a cumbersome integration and configuration phase at the end of the development process, which in the worst case may fail. Therefore, a number of formal approaches have been developed which try to guide the upfront decomposition of the embedded real-time system into components such that integration problems related to timing properties can be excluded and suitable configurations can be found. However, the proposed solutions require a number of strong assumptions that can hardly be fulfilled, or the required analysis does not scale well. In this paper, we present an approach based on timed automata that can provide the required guarantees for later integration without strong assumptions that are difficult to match in practice. The approach provides a modular reasoning scheme that permits establishing the required guarantees for the integration using only local checks, and therefore scales. It is also possible to determine potential configuration settings by means of timed game synthesis.
35

Shankar, Sanjeev. „Analysis of microprocessor based vehicular instrumentation and automatic passenger counting systems“. Thesis, Virginia Tech, 1985. http://hdl.handle.net/10919/41570.

Annotation:
Information on transit ridership and operations is essential for efficient management. With such a data base, transit management can confirm predictions about scheduling, receive warnings about potential dangers, and plan future operations on a much broader and more precise basis. Data from passenger counts provide essential information to marketing and scheduling personnel by identifying peak load-points and the like. Using manual collection methods for such data is expensive and prone to human error. Automatic Passenger Counting (APC) systems are viewed as an improved and economical technique for data collection. Such systems monitor the progress of a particular vehicle -- its position, the number of passengers getting on and off, and the times and distances between stops -- and make this data available for processing. These are state-of-the-art systems, mostly microprocessor based and often embracing a modular structure. The Red Pine system is one such system, with a different dedicated module for each bank of tasks. Multitasking software is seen to be a powerful tool for such systems and simplifies the architecture of the system hardware. A CHMOS hardware design, suited for multitasking software, is provided. Interfacing software for the Red Pine system has been developed and is explained. Debugging, testing and simulation of the Red Pine hardware are detailed. Modifications have been recorded and improvements suggested.
Master of Science
36

Sanford, Jerald Patrick. „An automatic system for converting digitized line drawings into highly compressed mathematical primitives“. Thesis, Virginia Polytechnic Institute and State University, 1985. http://hdl.handle.net/10919/101259.

Annotation:
The design of an efficient, low-cost system for automatically converting a hardcopy technical drawing into a highly compressed electronic representation is the motivation for this work. An improved method for extracting line and region information from a typical engineering drawing is presented. An efficient encoding method has also been proposed that takes advantage of the preprocessing done by the region and line extraction steps. Finally, a technique for creating a highly compressed mathematical representation (based on spline approximations) for the drawing is presented.
M.S.
37

Lewis, W. Ivan. „DACS: an interactive computer program to aid in the design and analysis of linear control systems“. Thesis, Virginia Polytechnic Institute and State University, 1985. http://hdl.handle.net/10919/76039.

Annotation:
DACS is an interactive computer program for the IBM-PC that aids in the design and analysis of linear control systems. Written in compiled BASIC, DACS includes root locus, Bode plots, Nyquist diagrams, Nichols charts, system simulation, and calculation of the system time response in closed form. The state-space description is used in the simulation and time response segments, while the system transfer function is used in the root locus and frequency response segments. Calculated data may be displayed in either graphical or tabular form. Graphics features of DACS include automatic scaling, zooming, graph coordinates derived from cursor location, graphics screen dumps to disk or printer, and redisplaying of disk files. DACS is menu driven and the majority of input/output is accomplished through the function keys. All system models and data may be saved on and recalled from disk. Help screens, three levels of sound, color, and session archiving are also provided. DACS provides a wide variety of linear control system analysis tools for the engineering desktop.
Master of Science
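A modern few-line equivalent of DACS's frequency-response segment can be written with scipy; the second-order plant below is an arbitrary example, not a system from the thesis:

```python
from scipy import signal

# Example plant G(s) = 10 / (s^2 + 2s + 10)
plant = signal.TransferFunction([10.0], [1.0, 2.0, 10.0])
w, mag_db, phase_deg = signal.bode(plant)      # Bode magnitude and phase

# Tabular display, in the spirit of DACS's tabular output mode
for wi, m, p in list(zip(w, mag_db, phase_deg))[::25]:
    print(f"w = {wi:8.3f} rad/s   |G| = {m:7.2f} dB   phase = {p:8.2f} deg")
```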
38

Hammam, Yasser. „Geographical vector agents“. University of Otago. Department of Information Science, 2008. http://adt.otago.ac.nz./public/adt-NZDU20080404.150839.

Annotation:
Simulating geographic phenomena in a realistic and plausible way requires real-world entities to be abstracted based on the dynamic physical characteristics they exhibit, and treated as individuals in a simulation domain. These processes cannot be adequately supported by the traditional spatial model based on cellular space, such as Cellular Automata (CA). Although this approach has received great attention as a favoured technique for simulating geographic phenomena, the need has been raised for a generic spatial model that overcomes its limitations, particularly in the way real-world entities are represented in a simulation domain with respect to their physical characteristics and temporal aspects. In this thesis, a new computational approach for a spatial model suitable for simulating geographic phenomena is presented: the vector agents model. The vector agent is goal-oriented, adaptable, physically defined by a Euclidean geometry and able to change its own geometric characteristics while interacting with other agents in its neighbourhood using a set of rules. The agent is modelled with sensor, state, and strategies. The successful implementation of the model's architecture allows the representation of the physical characteristics of real-world entities and the observation of their complex and dynamic behaviour in a simulation domain. Vector agents have developed out of a need to create a systematic basis for the geometric components of Geographic Automata Systems (GAS), as outlined by Torrens and Benenson (2005). A generic vector agents model was built, then tested and validated from different aspects, and the results demonstrated the model's efficiency. It is confirmed that vector agents are flexible in producing different complex shapes and patterns for recreating real geographic phenomena through the generic use of three algorithms of geometric manipulation: midpoint displacement using the relaxed Brownian Motion (fractal-like) algorithm (sketched below), edge displacement, and vertex displacement. The effectiveness of this was initially ascertained visually. A simple heuristic to govern shape growth rate and complexity was derived based on the interplay of the three algorithms. A further abstract model comparison against the cellular-agents environment showed that vector agents can produce patterns similar to those of cellular agents, with the advantage of representing entities as individuals with their own attributes and realistic geometric boundaries. On the other hand, the city, as a complex geographic phenomenon, was used as a specific domain for validating the model against a real-world system. The results of the urban land-use simulations (driven by simple rules based on three classical urban theories) confirmed that: (a) the model is flexible enough to incorporate various external rules based on real-world systems, and (b) the model is sufficiently capable of producing a variety of patterns, under several environments, close to actual patterns. The agent environment also proved to be an effective way of easily combining the rules associated with each urban theory (different agents behaved according to different theories). Finally, limitations raised through the development of this work are addressed, leading to an outline of possible extensions of both the model computation and the domain of applications.
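Recursive midpoint displacement, the first of the three manipulations, can be illustrated on a single boundary edge as follows; the roughness scaling and recursion depth are illustrative assumptions, not the thesis's relaxed-Brownian-motion parameters:

```python
import random

def midpoint_displace(p, q, roughness=0.3, depth=4):
    """Recursively displace the midpoint of edge (p, q) perpendicular to it,
    producing a fractal-like polyline from a straight segment."""
    if depth == 0:
        return [p, q]
    (x1, y1), (x2, y2) = p, q
    length = ((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5
    mx, my = (x1 + x2) / 2.0, (y1 + y2) / 2.0
    nx, ny = -(y2 - y1) / length, (x2 - x1) / length   # unit normal to the edge
    offset = random.uniform(-roughness, roughness) * length
    m = (mx + offset * nx, my + offset * ny)
    return (midpoint_displace(p, m, roughness, depth - 1)[:-1]
            + midpoint_displace(m, q, roughness, depth - 1))

boundary = midpoint_displace((0.0, 0.0), (100.0, 0.0))
print(len(boundary), boundary[0], boundary[-1])        # 17 vertices
```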
39

Kotze, Benjamin Johannes. „Navigation for automatic guided vehicles using omnidirectional optical sensing“. Thesis, Bloemfontein : Central University of Technology, Free State, 2013. http://hdl.handle.net/11462/185.

Annotation:
Thesis (M. Tech. (Engineering: Electrical)) -- Central University of Technology, Free State, 2013
Automatic Guided Vehicles (AGVs) are being used more frequently in the manufacturing environment. These AGVs are navigated in many different ways, utilising multiple types of sensors for detecting the environment: distance, obstacles, and a set route. Different algorithms or methods are then used to apply this environmental information to the control of the AGV. One of the aims of the research was to develop a platform that could be easily reconfigured for alternative route applications utilising vision. In this research, such environment-detecting sensors were replaced and/or minimised by the use of a single omnidirectional webcam picture stream, utilising a purpose-built mirror and Perspex tube setup. The area of interest in each frame was extracted, saving computational resources and time. By utilising image processing, the vehicle was navigated along a predetermined route. Different edge detection and segmentation methods were investigated on this vision signal for route and sign navigation. Prewitt edge detection was eventually implemented, Hough transforms were used for border detection, and Kalman filtering minimised border-detection noise to keep the vehicle on the navigated route (a sketch of this edge-and-line step follows below). Reconfigurability was added to the route layout by coloured signs incorporated in the navigation process. The result was the manipulation of a number of AGVs, each on its own designated colour-signed route. This route could be reconfigured by the operator with no programming alteration or intervention. The YCbCr colour space signal was used to detect specific control signs for alternative colour route navigation. The result was used to generate commands controlling the AGV through serial commands sent over a laptop's Universal Serial Bus (USB) port, with a PIC microcontroller interface board controlling the motors by means of pulse-width modulation (PWM). A complete MATLAB® software development platform was utilised, implementing written M-files, Simulink® models, masked function blocks and .mat files for sourcing the workspace variables and generating executable files. This continuous development system lends itself to speedy evaluation and implementation of image processing options on the AGV. All the work done in the thesis was validated by simulations using actual data and by physical experimentation.
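The Prewitt-plus-Hough chain can be sketched with OpenCV as follows; the thresholds and Hough parameters are illustrative, not the values tuned for the AGV:

```python
import numpy as np
import cv2

def detect_route_borders(gray):
    """Prewitt gradients, a simple edge threshold, then a probabilistic
    Hough transform to find straight route borders (illustrative parameters)."""
    kx = np.array([[1, 0, -1], [1, 0, -1], [1, 0, -1]], dtype=np.float32)
    gx = cv2.filter2D(gray.astype(np.float32), -1, kx)     # Prewitt, x direction
    gy = cv2.filter2D(gray.astype(np.float32), -1, kx.T)   # Prewitt, y direction
    magnitude = cv2.convertScaleAbs(np.sqrt(gx ** 2 + gy ** 2))
    edges = np.uint8(magnitude > 60) * 255                 # binary edge map
    return cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=50,
                           minLineLength=40, maxLineGap=5)

test = np.zeros((240, 320), dtype=np.uint8)
cv2.line(test, (20, 200), (300, 40), 255, 3)               # synthetic border
lines = detect_route_borders(test)
print("no lines" if lines is None else lines[0])           # first (x1,y1,x2,y2)
```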
40

Svensson, Pontus. „Automated Image Suggestions for News Articles : An Evaluation of Text and Image Representations in an Image Retrieval System“. Thesis, Linköpings universitet, Interaktiva och kognitiva system, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-166669.

The full text of the source
APA, Harvard, Vancouver, ISO and other citation styles
Annotation:
Multimodal machine learning is a subfield of machine learning that aims to relate data from different modalities, such as texts and images. One of the many applications that can be built on this technique is an image retrieval system that, given a text query, retrieves suitable images from a database. In this thesis, a retrieval system based on canonical correlation is used to suggest images for news articles. Different dense text representations produced by Word2vec and Doc2vec, and image representations produced by pre-trained convolutional neural networks, are explored to find out how they affect the suggestions. The thesis also studies which part of an article is best suited as a query to the system, and whether an article's date of publication can be used to improve the suggestions. The results show that Word2vec outperforms Doc2vec in the task, which indicates that the meaning of article texts is not as important as the individual words they consist of. Furthermore, the queries are improved by rewarding words that are particularly significant.
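The retrieval scheme can be sketched in a few lines with scikit-learn's CCA implementation. The feature matrices below are random stand-ins for the averaged Word2vec text vectors and pre-trained CNN image features described in the abstract, and the dimensionalities are illustrative:

```python
import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(0)
text_feats = rng.normal(size=(400, 300))   # stand-in for averaged Word2vec vectors
image_feats = rng.normal(size=(400, 512))  # stand-in for CNN image features

# Learn a shared space in which correlated text/image pairs lie close together.
cca = CCA(n_components=32, max_iter=1000)
cca.fit(text_feats, image_feats)

def retrieve(query_vec, k=5):
    """Rank all database images against one text query by cosine similarity
    in the shared CCA space."""
    q, db = cca.transform(query_vec.reshape(1, -1), image_feats)
    q = q.ravel() / np.linalg.norm(q)
    db = db / np.linalg.norm(db, axis=1, keepdims=True)
    return np.argsort(db @ q)[::-1][:k]

print(retrieve(text_feats[0]))  # indices of the top-5 suggested images
```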
41

Oh, Sang Min. „Switching linear dynamic systems with higher-order temporal structure“. Diss., Atlanta, Ga. : Georgia Institute of Technology, 2009. http://hdl.handle.net/1853/29698.

The full text of the source
APA, Harvard, Vancouver, ISO and other citation styles
Annotation:
Thesis (Ph.D)--Computing, Georgia Institute of Technology, 2010.
Committee Chair: Dellaert, Frank; Committee Co-Chair: Rehg, James; Committee Member: Bobick, Aaron; Committee Member: Essa, Irfan; Committee Member: Smyth, Padhraic. Part of the SMARTech Electronic Thesis and Dissertation Collection.
42

Daniel, Jérémie. „Trajectory generation and data fusion for control-oriented advanced driver assistance systems“. Phd thesis, Université de Haute Alsace - Mulhouse, 2010. http://tel.archives-ouvertes.fr/tel-00608549.

The full text of the source
APA, Harvard, Vancouver, ISO and other citation styles
Annotation:
Since the origin of the automobile at the end of the 19th century, traffic flow has been subject to a constant increase which, unfortunately, has been accompanied by a constant rise in road accidents. Research studies, such as one performed by the World Health Organization, show alarming figures for the injuries and fatalities due to these accidents. One way to reduce these figures lies in the development of Advanced Driver Assistance Systems (ADAS), whose purpose is to assist the driver in the driving task. This research topic has proven very dynamic and productive over the last decades. Indeed, several systems such as the Anti-lock Braking System (ABS), Electronic Stability Program (ESP), Adaptive Cruise Control (ACC), Parking Manoeuvre Assistant (PMA) and Dynamic Bending Light (DBL) are already available on the market, and their benefits are now recognized by most drivers. This first generation of ADAS is usually designed to perform a specific task in the Controller/Vehicle/Environment framework and thus requires only microscopic information, i.e. sensors giving only local information about an element of the Vehicle or of its Environment. By contrast, the next generation of ADAS will have to consider more aspects, i.e. information and constraints about both the Vehicle and its Environment. Indeed, as they are designed to perform more complex tasks, they need a global view of the road context and the Vehicle configuration. For example, longitudinal control requires information about the road configuration (straight line, bend, etc.) and about the eventual presence of other road users (vehicles, trucks, etc.) to determine the best reference speed. [...]
43

Castagno, Thomas A. „The effect of knee pads on gait and comfort“. Link to electronic thesis, 2004. http://www.wpi.edu/Pubs/ETD/Available/etd-0426104-174716.

The full text of the source
APA, Harvard, Vancouver, ISO and other citation styles
44

Hon, Wing-kai. „On the construction and application of compressed text indexes“. Click to view the E-thesis via HKUTO, 2004. http://sunzi.lib.hku.hk/hkuto/record/B31059739.

The full text of the source
APA, Harvard, Vancouver, ISO and other citation styles
45

Hon, Wing-kai, und 韓永楷. „On the construction and application of compressed text indexes“. Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2004. http://hub.hku.hk/bib/B31059739.

The full text of the source
APA, Harvard, Vancouver, ISO and other citation styles
46

Nduku, Nyaniso Prudent. „Development of methods for distribution network power quality variation monitoring“. Thesis, Cape Peninsula University of Technology, 2009. http://hdl.handle.net/20.500.11838/1144.

The full text of the source
APA, Harvard, Vancouver, ISO and other citation styles
Annotation:
Thesis (MTech (Electrical Engineering))--Cape Peninsula University of Technology, 2009
The purpose of this project is to develop methods for monitoring power quality variations in a distribution network. Power quality (PQ) has become a significant issue for both power suppliers and customers, and there have been important changes in power systems regarding power quality requirements. "Power quality" is the combination of voltage quality and current quality. The main research problem of the project is to investigate the power quality of a distribution network by selecting proper measurements and by applying and developing existing classic and modern signal-conditioning methods for extracting and monitoring the parameters of power disturbances. The research objectives are: to study the requirements of the IEC 61000-4-30 standard; to investigate the points of common coupling in the distribution network; to identify the points for measurement; to develop a MySQL database for the measurement data and MATLAB software for simulation of the network; to develop methods based on Fourier transforms for estimating the parameters of the disturbances; and to develop software implementing these methods. The influence of different loads on power quality disturbances is considered in the distribution network. Measurement points on the network and meters conforming to the IEC power quality standards are investigated and applied to the CPUT Bellville campus distribution network. The implementation of power quality monitoring for the CPUT Bellville campus helps improve the quality of the power supply and reduce the power used. MATLAB programs are developed to communicate with the database and to calculate the disturbance and power quality parameters.
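The Fourier-transform-based disturbance estimation named in the objectives might, in its simplest form, look like the sketch below; the waveform, sampling rate and harmonic content are synthetic illustrations, not campus measurement data:

```python
import numpy as np

fs = 3200.0                     # sampling rate: 64 samples per 50 Hz cycle
f0 = 50.0                       # fundamental frequency
t = np.arange(0, 0.2, 1 / fs)   # ten full cycles, so FFT bins align with harmonics

# Synthetic measurement: 230 V RMS fundamental plus a 5th-harmonic disturbance.
v = 230 * np.sqrt(2) * np.sin(2 * np.pi * f0 * t) \
    + 10 * np.sin(2 * np.pi * 5 * f0 * t)

spectrum = np.abs(np.fft.rfft(v)) * 2 / len(v)   # single-sided amplitude spectrum
freqs = np.fft.rfftfreq(len(v), 1 / fs)

def amplitude_at(f):
    """Peak amplitude of the spectral component nearest frequency f."""
    return spectrum[np.argmin(np.abs(freqs - f))]

fundamental = amplitude_at(f0)
harmonics = [amplitude_at(n * f0) for n in range(2, 11)]
thd = np.sqrt(sum(h * h for h in harmonics)) / fundamental
print(f"fundamental: {fundamental:.1f} V peak, THD: {100 * thd:.2f} %")
```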
47

He, Hai. „Towards automatic understanding and integration of web databases for developing large-scale unified access systems“. Diss., Online access via UMI:, 2006.

Find the full text of the source
APA, Harvard, Vancouver, ISO and other citation styles
48

Krause, Christian, und Holger Giese. „Quantitative modeling and analysis of service-oriented real-time systems using interval probabilistic timed automata“. Universität Potsdam, 2012. http://opus.kobv.de/ubp/volltexte/2012/5784/.

The full text of the source
APA, Harvard, Vancouver, ISO and other citation styles
Annotation:
One of the key challenges in service-oriented systems engineering is the prediction and assurance of non-functional properties, such as the reliability and the availability of composite interorganizational services. Such systems are often characterized by a variety of inherent uncertainties, which must be addressed in the modeling and the analysis approach. The different relevant types of uncertainties can be categorized into (1) epistemic uncertainties due to incomplete knowledge and (2) randomization as explicitly used in protocols or as a result of physical processes. In this report, we study a probabilistic timed model which allows us to quantitatively reason about non-functional properties for a restricted class of service-oriented real-time systems using formal methods. To properly motivate the choice of approach, we devise a requirements catalogue for the modeling and the analysis of probabilistic real-time systems with uncertainties, and provide evidence that uncertainties of types (1) and (2) in the targeted systems have a major impact on the models used and require distinct analysis approaches. The formal model we use in this report is Interval Probabilistic Timed Automata (IPTA). Based on the outlined requirements, we give evidence that this model provides both enough expressiveness for a realistic and modular specification of the targeted class of systems and suitable formal methods for analyzing properties, such as safety and reliability, in a quantitative manner. As technical means for the quantitative analysis, we build on probabilistic model checking, specifically on probabilistic time-bounded reachability analysis and the computation of expected reachability rewards and costs. To carry out the quantitative analysis using probabilistic model checking, we developed an extension of the Prism tool for modeling and analyzing IPTA. Our extension of Prism introduces a means for modeling probabilistic uncertainty in the form of probability intervals, as required for IPTA, and adds support for probabilistic reachability checking and the computation of expected rewards and costs. We discuss the performance of our extended version of Prism and compare the interval-based IPTA approach to models with fixed probabilities.
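To give a flavour of the interval-probability idea, independently of Prism and of timed behaviour, the following toy value iteration computes worst- and best-case probabilities of reaching a goal state in a three-state Markov chain whose transition probabilities are only known to lie in intervals; the model and all numbers are invented for illustration:

```python
# States: 0 = start, 1 = goal (absorbing), 2 = fail (absorbing). From the
# start state, the probabilities of staying, succeeding and failing are only
# known up to intervals (any realization must still sum to 1).
INTERVALS = [(0.1, 0.3), (0.5, 0.7), (0.0, 0.4)]  # to state 0, 1, 2

def extreme_distribution(intervals, values, maximize):
    """Choose a distribution within the per-successor intervals that maximizes
    (or minimizes) the expected value: give every successor its lower bound,
    then pour the remaining mass into successors in order of value."""
    probs = [lo for lo, _ in intervals]
    remaining = 1.0 - sum(probs)
    order = sorted(range(len(values)), key=lambda i: values[i], reverse=maximize)
    for i in order:
        lo, hi = intervals[i]
        take = min(hi - lo, remaining)
        probs[i] += take
        remaining -= take
    return probs

def reach_probability(maximize, iterations=1000):
    v = [0.0, 1.0, 0.0]  # value = probability of eventually reaching the goal
    for _ in range(iterations):
        p = extreme_distribution(INTERVALS, v, maximize)
        v[0] = sum(pi * vi for pi, vi in zip(p, v))
    return v[0]

print(f"reach goal: worst case {reach_probability(False):.3f}, "
      f"best case {reach_probability(True):.3f}")
```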
49

Teoh, Pek Loo. „A study of single laser interferometry-based sensing and measuring technique in robot manipulator control and guidance. Volume 1“. Monash University, Dept. of Mechanical Engineering, 2003. http://arrow.monash.edu.au/hdl/1959.1/9565.

The full text of the source
APA, Harvard, Vancouver, ISO and other citation styles
50

Du, Preez Hercule. „GrailKnights : an automaton mass manipulation package for enhanced pattern analysis“. Thesis, Stellenbosch : Stellenbosch University, 2008. http://hdl.handle.net/10019.1/2902.

The full text of the source
APA, Harvard, Vancouver, ISO and other citation styles
Annotation:
Thesis (MSc (Mathematical Sciences. Computer Science))--Stellenbosch University, 2008.
This thesis describes the design and implementation of an application named GrailKnights that allows for the mass manipulation of automata, with added visual pattern-analysis features. It comprises a database-driven backend for automata storage and a graphical user interface that allows the automata selected from the database to be filtered, with visual interpretation of the patterns visible over the resulting automata.
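The database-driven filtering idea can be illustrated with a minimal sketch over a hypothetical SQLite schema of summary attributes; this is not the actual GrailKnights backend, and the schema and attributes are assumptions made for illustration:

```python
import sqlite3

# In-memory database with illustrative summary attributes per automaton.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE automata (name TEXT, states INTEGER, "
            "final_states INTEGER, deterministic INTEGER)")
con.executemany("INSERT INTO automata VALUES (?, ?, ?, ?)", [
    ("a1", 4, 1, 1), ("a2", 12, 3, 0), ("a3", 7, 2, 1),
])

# Filter: deterministic automata with fewer than 10 states.
for row in con.execute("SELECT name, states FROM automata "
                       "WHERE deterministic = 1 AND states < 10"):
    print(row)
```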

To the bibliography