To see the other types of publications on this topic, follow the link: Component technology.

Dissertations / Theses on the topic 'Component technology'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 50 dissertations / theses for your research on the topic 'Component technology.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online, whenever these are available in the metadata.

Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.

1

Avkarogullari, Okan. "Representing Design Patterns As Super Components In Component Oriented Software Engineering." Master's thesis, METU, 2004. http://etd.lib.metu.edu.tr/upload/1305390/index.pdf.

Full text
Abstract:
It is widely believed and empirically shown that component reuse improves both the quality and productivity of software development. This brings the necessity of a graphical editor to model projects by using components. A graphical editor was implemented for Component Oriented software development. The editor facilitates modeling efforts through application of the graphical modeling language COSEML. Both design patterns and software components have come to play important roles in software development. The correlation between software components and design patterns is apparent. In the design phase, design patterns are used widely in both component and object oriented projects. Design patterns can be used as super components in component-based development. Software reuse, software components, design patterns, the use of design patterns in component-based development, and component architectures are studied in detail to address the need for the approach. COSE modeling activity starts with the decomposition of the target system into building blocks in a top-down order. Next, interfaces between these blocks are defined. If required, design patterns can be added to the model as super components.
APA, Harvard, Vancouver, ISO, and other styles
2

Althammer, Egbert. "Reflection patterns in the context of object and component technology /." [S.l. : s.n.], 2001. http://www.bsz-bw.de/cgi-bin/xvms.cgi?SWB9819092.

Full text
3

Price, Tobias E. C. "Multi-component complex hydrides for hydrogen storage." Thesis, University of Nottingham, 2010. http://eprints.nottingham.ac.uk/11988/.

Full text
Abstract:
Hydrogen as an energy vector offers great potential for mobile energy generation through fuel cell technology; however, this depends on safe, mobile and high density storage of hydrogen. The destabilised multi-component complex hydride system LiBH4 : MgH2 was investigated in order to characterise the destabilisation reactions which enable reduction of operating temperatures for this high capacity system (ca. 9.8 wt.%). In-situ neutron diffraction showed that, regardless of stoichiometry, similar reaction paths were followed, forming LiH and MgB2 when decomposed under H2, and Mg-Li alloys (Mg0.816Li0.184 and Mg0.70Li0.30) when under dynamic vacuum. Hydrogen isotherms of the 0.3LiBH4 : MgH2 system showed a dual plateau behaviour, with the lower plateau due to the destabilised LiBH4 reaction. Thermodynamic data calculated from the isotherm results showed a significant reduction in the T(1 bar) for LiBH4, to 322 °C (cf. 459 °C for LiBH4(l)). Cycling behaviour of the 0.3LiBH4 : MgH2 system decomposed under both reaction environments showed very fast kinetics on deuteriding at 400 °C and 100 bar D2, reaching 90 % conversion within 20 minutes. In contrast, 2LiBH4 : MgH2 samples had kinetics an order of magnitude slower, with conversions <50 % after 4 hours. These results demonstrate the strong influence of stoichiometry on the cycling kinetics compared to decomposition conditions. Investigation of catalysts found that dispersion of metal hydrides through long ball-milling times, or dispersion through reaction with metal halide additions, provided the greatest kinetic advantage, with pre-milled NbH providing the best kinetic improvement without reducing capacity through Li-halide formation. Finally, additions of LiAlH4 to the system formed an Al dispersion through the sample during decomposition, which acted both as a catalyst and as a destabilising agent on the MgH2 component, forming Mg-Al-Li alloys. Decomposition under H2 also showed a destabilisation effect for the LiBH4 component.
4

LUCZAJ, JEROME ERIC. "A FRAMEWORK FOR E-LEARNING TECHNOLOGY." University of Cincinnati / OhioLINK, 2003. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1054225415.

Full text
5

Brown, William Shaler. "Technology for Designing the Steering Subsystem Component of an Autonomous Vehicle." Thesis, Virginia Tech, 2007. http://hdl.handle.net/10919/34960.

Full text
Abstract:
Autonomous vehicles offer a means to complete unsafe military operations without endangering the lives of soldiers. Such solutions have fueled many efforts towards designing autonomous, or unmanned, systems. Military and academic research efforts alike continue to focus on developing these systems. While many different autonomous vehicles have been introduced, however, such complex systems have limited drive-by-wire operability. The complete process to up-fit a vehicle to fully autonomous operation involves the design, up-fit, testing and verification of many different subsystems. The objective of this thesis is to design and model an autonomous steering system requiring few modifications to an existing steering system. It is desirable to still operate the vehicle manually as well as to preserve the vehicle's visual appearance. Up-fit and implementation of the designed steering system and verification of its functionality have been documented as well. Utilization of the supplied controller and software has enabled the testing and characterization of the system. The proposed design offers a solution for a wide variety of wheeled vehicles steered via the traditional and common steering wheel method. In addition, modifications have been made to an existing simulation of an unmanned vehicle in a military testbed environment (Fort Benning). The simulation accounts for the control methodology as designed and tested, which offers the ability to analyze the dynamics of the unmanned system.
Master of Science
6

Bimpikas, Vasileios. "Automatic 3-axis component placement system with visual alignment for mixed technology PCBs." Thesis, Uppsala universitet, Institutionen för informationsteknologi, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-312752.

Full text
Abstract:
For the needs of the present master thesis, a Pick & Place machine for through-hole components operating in three axes was studied and implemented. This endeavour was motivated by the trend towards increasingly automated production lines in the electronics manufacturing industry. Certain through-hole elements require further modifications, such as the screwing of heatsinks onto them after placing and soldering. That implies that a certain distance from the board must be ensured when placing and soldering the components, which involves further manual labour to secure the components at the desired height until they are soldered, thus increasing the cost and lowering productivity. Therefore, the system that was developed places through-hole components at the desired height. For this, a stepper motor system, operating in open loop, was placed on a prototype mechanical table that provided motion in three axes using a belt-and-pulley approach, for the needs of testing and evaluation. For additional robustness, a vision system was integrated as well. By locating the fiducial markers of the board, it is possible to detect offset in the X and Y axes, as well as rotation of the board introduced during its placement. The C code that manipulates the motors was combined with the C++ code of the vision system, which uses OpenCV, in a GUI for ease of use and general user-friendliness. The developed system achieved a positioning accuracy of 0.7 mm, whereas the vision system counteracted the misalignments of sample boards with an accuracy of up to 0.4 mm. A soldering system operating in tandem with the developed placing system has been left as future work, to complete the automated placement of the discussed components at the desired height, which would ultimately eliminate the additional manual work during the PCB manufacturing process.
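The fiducial-based correction the abstract describes reduces to estimating a rigid transform from marker positions. A minimal sketch of that geometry, assuming two fiducials and assuming marker detection itself (e.g. via OpenCV template matching) has already been done; the function name, units and sample coordinates are illustrative, not taken from the thesis:

```python
import math

def board_alignment(nominal, measured):
    """Estimate the XY offset and rotation of a board from two fiducial
    markers. `nominal` and `measured` are pairs of (x, y) positions, the
    designed and the camera-detected coordinates (same units).
    Returns (dx, dy, theta) with theta in radians."""
    (nx1, ny1), (nx2, ny2) = nominal
    (mx1, my1), (mx2, my2) = measured
    # Rotation: angle difference of the fiducial-to-fiducial vectors.
    theta = (math.atan2(my2 - my1, mx2 - mx1)
             - math.atan2(ny2 - ny1, nx2 - nx1))
    # Offset: rotate the first nominal fiducial by theta, then compare
    # with where the camera actually saw it.
    rx = nx1 * math.cos(theta) - ny1 * math.sin(theta)
    ry = nx1 * math.sin(theta) + ny1 * math.cos(theta)
    return mx1 - rx, my1 - ry, theta

# A board shifted by (1.0, -0.5) mm with no rotation:
dx, dy, theta = board_alignment([(0, 0), (50, 0)], [(1.0, -0.5), (51.0, -0.5)])
```

The placement coordinates sent to the motors would then be corrected by applying this offset and rotation to every target position.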
7

Mohlin, Martin, and Mikael Hanneberg. "Chassis component made of composite material : An investigation of composites in the automotive industry and the redesign of a chassis component." Thesis, KTH, Lättkonstruktioner, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-203816.

Full text
Abstract:
The demands on fuel efficiency and environmental friendliness of cars have driven the automotive industry towards composite materials, which reduce weight compared to traditional aluminum and steel solutions. The purpose of this master thesis is to evaluate the possibility and feasibility of redesigning a high volume metal chassis part in composite materials. To accomplish this, the thesis work was divided into two parts. The first part consists of a composite study which explores the available composite technologies in the industry, such as implemented chassis components and available manufacturing methods. The composite study shows that almost no high volume chassis components on the market are made out of composites, with the exception of leaf springs. In the industry there are many different composite manufacturing methods, but in general those most ready for high volume production are injection molding, compression molding and RTM. A method was also explored to efficiently evaluate different materials and manufacturing methods against each other. By knowing the critical requirements, both materials and manufacturing methods can be evaluated separately against each other. The second part consists of a design phase where the knowledge from the composite study was used to choose and redesign a chassis component in composite. A motor mount was chosen and redesigned for injection molding. The new design shows that a weight decrease of at least 38% is possible without significant cost differences.
8

Ngosi, Theodora N. "Reconstruction of the international information technology standardisation process within a component-based design framework : a component based project development setting perspective." Thesis, City University London, 2007. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.446343.

Full text
9

Stedman, John. "Pulsed nuclear magnetic resonance studies of flour component doughs." Thesis, Oxford Brookes University, 1990. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.277917.

Full text
10

Barsch, Robert. "Web-based technology for storage and processing of multi-component data in seismology." Diss., lmu, 2009. http://nbn-resolving.de/urn:nbn:de:bvb:19-110434.

Full text
11

Kassiotis, Christophe. "Nonlinear fluid-structure interaction : a partitioned approach and its application through component technology." Phd thesis, Université Paris-Est, 2009. http://tel.archives-ouvertes.fr/tel-00453394.

Full text
Abstract:
A partitioned approach is studied to solve strongly coupled nonlinear fluid-structure interaction problems. The stability, convergence and performance of explicit and implicit coupling algorithms are explored. The partitioned approach allows existing codes to be re-used in a more general context. One purpose of this work is to be able to couple them as black boxes. To that end, the scientific software component framework CTL is considered. A fluid and a structure component based on existing software are therefore developed and coupled with a master-code approach. The computational performance of different remote calls and the parallel implementation of components are also depicted herein. The re-use of existing software allows advanced models developed for both sub-problems to be coupled. In this work, the structure part is solved by the Finite Element Method, with the possibility to use different non-linear and large-deformation behaviours. For the fluid part, examples modeled with an arbitrary Lagrangian-Eulerian formulation are considered, solved with a finite volume method. The models used are first transient incompressible flows described by the Navier-Stokes equations, then free surface flows. With the latter, the impact of sloshing and breaking waves on model structures can be computed.
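The implicit coupling algorithms mentioned above can be illustrated with a toy fixed-point loop: the two black-box solvers are called in a Gauss-Seidel sweep and the interface value is under-relaxed until successive iterates agree. The one-dimensional "solvers", relaxation factor and tolerance below are illustrative stand-ins, not the thesis's FEM/finite-volume codes:

```python
def coupled_solve(fluid, structure, x0, tol=1e-10, max_iter=200, omega=0.5):
    """Implicit partitioned coupling: treat the interface value as the
    fixed point of x = structure(fluid(x)) and iterate, under-relaxing
    each update, until two successive iterates agree to within tol."""
    x = x0
    for _ in range(max_iter):
        x_new = structure(fluid(x))   # one Gauss-Seidel sweep: fluid, then structure
        if abs(x_new - x) < tol:
            return x_new
        x += omega * (x_new - x)      # under-relaxed update for stability
    raise RuntimeError("coupling iterations did not converge")

# Toy black boxes: the coupled problem x = 0.5 * (x + 1) has fixed point x = 1.
fluid = lambda x: 0.5 * (x + 1.0)   # "fluid": interface load from a displacement
structure = lambda f: f             # "structure": displacement response to that load
x = coupled_solve(fluid, structure, x0=0.0)
```

In practice the relaxation factor is often updated dynamically (e.g. with Aitken's delta-squared method) rather than held constant, which is one of the convergence questions such a study examines.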
12

Charlton, Phillip. "The application of Zeeko polishing technology to freeform femoral knee replacement component manufacture." Thesis, University of Huddersfield, 2011. http://eprints.hud.ac.uk/id/eprint/14062/.

Full text
Abstract:
The purpose of this study was to develop an advanced 7-axis Computer Numerical Controlled (CNC) polishing machine from its successful original application in industrial optics manufacture into a process for the manufacture of femoral knee components, to improve wear characteristics and prolong component lifetimes. It was identified that the successful manufacture of optical components using a corrective polishing procedure to enhance their performance could be applied to femoral knee implant components. Current femoral knee implants mimic the natural shape of the joint and are freeform (no axis of symmetry) in nature; hence an advanced CNC polishing machine that can follow the contours associated with such shapes could improve the surface finish and conformity of replacement femoral knee bearing surfaces, leading to improved performance. The process involved generating machine parameters that would optimize the polishing procedure to minimize wear of materials used in femoral knee implant manufacture. The second step was the design of a Non-Uniform Rational B-Spline (NURBS) model for control of the polishing machine over the freeform contours of the femoral component. Completing the process involved development of a corrective polishing process that would improve form control of the components. Such developments would improve surface finish and conformity, which are well documented contributors to wear and hence to the lifetime of orthopaedic implants. By comparison of this technique with a conventional finishing technique using pin-on-plate disc testing, it was concluded that the performance of the CNC polished components was an improvement on that of the conventional technique. In the case of form control, there were slight indications, through small decreases in peak-to-valley (PV) error, that the process helped reduce form error and could increase the lifetime of femoral knee replacement components.
The overall study provided results indicating that the Zeeko process could be used for polishing hard-on-hard material combinations to improve form control without compromising surface finish, hence improving implant lifetimes. The results have their limitations in that wear test performance was only assessed on orthopaedic implant materials using a pin-on-plate wear test rig. Given the time limitations of the thesis, a more definitive answer would only be achieved through further analysis of correcting form without compromising surface finish on entire implant systems under full joint simulator testing, which would provide more realistic conditions.
13

Wang, Suogang. "Enhancing brain-computer interfacing through advanced independent component analysis techniques." Thesis, University of Southampton, 2009. https://eprints.soton.ac.uk/65897/.

Full text
Abstract:
A brain-computer interface (BCI) is a direct communication system between a brain and an external device in which messages or commands sent by an individual do not pass through the brain’s normal output pathways but are detected through brain signals. Some severe motor impairments, such as amyotrophic lateral sclerosis, head trauma, spinal injuries and other diseases, may cause patients to lose muscle control and become unable to communicate with the outside environment. Currently no effective cure or treatment has yet been found for these diseases. Therefore using a BCI system to rebuild the communication pathway becomes a possible alternative solution. Among different types of BCIs, the electroencephalogram (EEG) based BCI is becoming a popular system due to EEG’s fine temporal resolution, ease of use, portability and low set-up cost. However EEG’s susceptibility to noise is a major issue in developing a robust BCI. Signal processing techniques such as coherent averaging, filtering, FFT and AR modelling are used to reduce the noise and extract components of interest. However these methods process the data in the observed mixture domain, which mixes components of interest and noise. Such a limitation means that extracted EEG signals may still contain noise residue, or, conversely, that the removed noise also contains part of the EEG signal. Independent Component Analysis (ICA), a Blind Source Separation (BSS) technique, is able to extract relevant information within noisy signals and separate the fundamental sources into independent components (ICs). The most common assumption of the ICA method is that the source signals are unknown and statistically independent. Through this assumption, ICA is able to recover the source signals.
Since the ICA concepts appeared in the fields of neural networks and signal processing in the 1980s, many ICA applications in telecommunications, biomedical data analysis, feature extraction, speech separation, time-series analysis and data mining have been reported in the literature. In this thesis several ICA techniques are proposed to optimize two major issues for BCI applications: reducing the recording time needed, in order to speed up the signal processing, and reducing the number of recording channels, whilst improving the final classification performance or at least keeping it the same as the current performance. These will make BCI a more practical prospect for everyday use. This thesis first defines BCI and the diverse BCI models based on different control patterns. After the general idea of ICA is introduced along with some modifications to ICA, several new ICA approaches are proposed. The practical work in this thesis starts with preliminary analyses on the Southampton BCI pilot datasets, applying basic and then advanced signal processing techniques. The proposed ICA techniques are then presented using a multi-channel event related potential (ERP) based BCI. Next, the ICA algorithm is applied to a multi-channel spontaneous activity based BCI. The final ICA approach examines the possibility of using ICA based on just one or a few channel recordings on an ERP based BCI. The novel ICA approaches for BCI systems presented in this thesis show that ICA is able to accurately and repeatedly extract the relevant information buried within noisy signals, and the signal quality is enhanced so that even a simple classifier can achieve good classification accuracy. In the ERP based BCI application, after multi-channel ICA, data from just eight averages/epochs can achieve 83.9% classification accuracy, whilst data from coherent averaging reaches only 32.3% accuracy.
In the spontaneous activity based BCI, the multi-channel ICA algorithm can effectively extract discriminatory information from two types of single-trial EEG data. The classification accuracy is improved by about 25%, on average, compared to the performance on the unpreprocessed data. The single channel ICA technique on the ERP based BCI produces much better results than the lowpass filter, whilst an appropriate number of averages improves the signal-to-noise ratio of P300 activities, which helps to achieve better classification. These advantages will lead to a reliable and practical BCI for use outside of the clinical laboratory.
14

Moradi, Farshad. "A Framework for Component Based Modelling and Simulation using BOMs and Semantic Web Technology." Doctoral thesis, KTH, Elektronik- och datorsystem, ECS, 2008. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-4770.

Full text
Abstract:
Modelling and Simulation (M&S) is a multi-disciplinary field that is widely used in various domains. It provides a means to study complex systems before actual physical prototyping and helps lower, amongst others, manufacturing and training costs. However, as M&S gains more popularity, the demand for reducing the time and resource costs associated with the development and validation of simulation models has also increased. Composing simulation models out of reusable and validated simulation components is one approach to addressing the above demand. This approach, which is still an open research issue in M&S, requires a composition process that is able to support a modeller with the discovery and identification of components as well as giving feedback on the feasibility of a composition. Combining components in order to build new simulations raises the non-trivial issue of composability. Composability has been defined as the capability to select and assemble reusable simulation components in various combinations into simulation systems to meet user requirements. There are three main types of composability: syntactic, semantic and pragmatic. Syntactic composability is concerned with the compatibility of implementation details, such as parameter passing mechanisms, external data accesses, and timing mechanisms. It is the question of whether a set of components can be combined. Semantic composability, on the other hand, is concerned with the validity of the composition, and whether the composed simulation is meaningful. Pragmatic composability is concerned with the context of the simulation, and whether the composed simulation meets the intended purpose of the modeller. Of these three types, syntactic composability is the easiest to accomplish, and significant progress on this issue has been reported in the literature.
Semantic and pragmatic composability are much harder to achieve and have inspired many researchers to conduct both theoretical and experimental research. The Base Object Model (BOM) is a new concept identified within the M&S community as a potential facilitator for providing reusable model components for the rapid construction and modification of simulations. Although BOMs exhibit good capabilities for reuse and composability, they lack the semantic information required for semantic matching and composition. There is little support for defining concepts and terms in order to avoid ambiguity, and there is no method for matching the behaviour of conceptual models (i.e., the state machines of the components), which is required for reasoning about the validity of BOM compositions. In this work we have developed a framework for component-based model development that supports both syntactic and semantic composability of simulation models by extending the BOM concept using ontologies, Semantic Web and Web Services technologies, and developing a rule-based method for reasoning about BOM compositions. The issue of pragmatic composability has not been the focus of this work, and it has only been partly addressed. The framework utilises intelligent agents to perform discovery and composition of components according to the modeller’s needs. It includes a collaborative environment, a semantic distributed repository and an execution environment to support the model development and execution process. The basic assumption of this work is that semantic composability should be achieved at the conceptual level. Through precise definition and specification of components’ semantics and syntax, one can capture the basic requirements for matching and semantically meaningful composition of those components. This requires a common methodology for the specification of simulation components. The specification methodology consists of meta-models describing simulation components at different levels.
In order to enable automatic matching of meta-models, they are formalized and structured using Semantic Web technology in OWL (the Web Ontology Language). Hence, the models are based on ontologies to avoid misunderstanding and to provide unambiguous definitions as a basis for reasoning about the syntactic and semantic validity of compositions.
15

Zubeir, Abdulghani Ismail. "OAP: An efficient online principal component analysis algorithm for streaming EEG data." Thesis, Uppsala universitet, Institutionen för informationsteknologi, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-392403.

Full text
Abstract:
Data processing on streaming data poses computational as well as statistical challenges. Streaming data requires that data processing algorithms be able to process a new data point within microseconds. This is especially challenging for dimension reduction, where traditional methods such as Principal Component Analysis (PCA) require an eigenvector decomposition of a matrix based on the complete dataset. A proper online version of PCA should therefore avoid this computationally involved step in favour of a more efficient update rule. This is implemented by an algorithm named Online Angle Preservation (OAP), which is able to handle large dimensions within the required time limitations. This project presents an application of OAP to the case of electroencephalography (EEG). For this, an interface was coded from an openBCI EEG device, through a Java API, to a streaming environment called Stream Analyzer (sa.engine). The performance of this solution was compared to a standard windowed PCA solution, indicating competitive performance. This report details the setup and presents the results.
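The general idea of replacing batch eigendecomposition with a per-sample update rule can be illustrated with Oja's rule, the classic online estimator of the leading principal component. This is purely an illustration of the online-PCA principle, not the OAP algorithm, whose details are not given in the abstract:

```python
import numpy as np

def oja_update(w, x, lr=0.01):
    """One Oja's-rule step: nudge the weight vector w toward the leading
    principal component using a single new sample x, then renormalise.
    Cost is O(d) per sample -- no covariance eigendecomposition."""
    y = w @ x
    w = w + lr * y * (x - y * w)
    return w / np.linalg.norm(w)

rng = np.random.default_rng(0)
# Stream of 2-D samples whose variance is concentrated along (1, 1).
w = np.array([1.0, 0.0])
for _ in range(5000):
    t = rng.normal()
    x = t * np.array([1.0, 1.0]) + 0.1 * rng.normal(size=2)
    w = oja_update(w, x)
```

After processing the stream, w aligns (up to sign) with the dominant direction (1, 1)/√2, which is exactly what a batch eigendecomposition of the full covariance matrix would return.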
16

Mainwaring, David, and Jonathan Österberg. "Sound Pattern Recognition : Evaluation of Independent Component Analysis Algorithms for Separation of Voices." Thesis, KTH, Skolan för teknikvetenskap (SCI), 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-230746.

Full text
Abstract:
With computers being used for more applications where commands can be spoken, it is useful to find algorithms which can separate voices from each other so that software can turn spoken words into commands. In this paper our goal is to describe how Independent Component Analysis (ICA) can be used for separation of voices in cases where we have at least the same number of microphones, at different distances from the speakers, as speakers whose voices we wish to separate: the so-called "cocktail party problem". This is done by implementing ICA algorithms on voice recordings containing multiple persons and examining the results. Both ICA algorithms produce a clear separation of voices; the advantage of fastICA is that the computations take a fraction of the time needed for ML-ICA. Both algorithms can also successfully separate voices when recordings are made by more microphones than speakers. The algorithms were also able to separate some of the voices when there were fewer microphones than speakers, which was surprising, as the algorithms have no theoretical guarantee for this.
In this work we have investigated how independent component analysis (ICA) algorithms can be used for separation of voices where we have a varying number of voices and microphones placed at different positions in a room, better known as the "cocktail party problem". This is done by applying ICA algorithms to audio recordings in which several people talk over one another. We test the ICA algorithms Maximum Likelihood ICA (ML-ICA) and fastICA. Both algorithms give good results when there are at least as many microphones as speakers. The advantage of fastICA over ML-ICA is that the running time is much shorter. A surprising result from both algorithms is that they managed to separate out at least one of the voices when there were more speakers than microphones, as this was not an expected outcome.
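The separation experiment above can be sketched with scikit-learn's FastICA on synthetic data; the square wave and sinusoid stand in for the two voices, and the matrix A for the room's microphone mixing (all values here are illustrative, not taken from the thesis):

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(1)
t = np.linspace(0, 8, 2000)
# Two independent non-Gaussian sources standing in for the voices.
s1 = np.sign(np.sin(3 * t))        # "voice" 1: square wave
s2 = np.sin(5 * t)                 # "voice" 2: sinusoid
S = np.c_[s1, s2] + 0.02 * rng.normal(size=(2000, 2))
# Two "microphones", each hearing a different linear mix of the voices.
A = np.array([[1.0, 0.5],
              [0.4, 1.0]])
X = S @ A.T
# Recover the sources blindly from the mixtures alone.
S_est = FastICA(n_components=2, random_state=0).fit_transform(X)
```

ICA recovers sources only up to permutation, sign and scale, so each estimated component must be matched to a true voice, for example by correlation.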
17

Karlsson, Daniel. "Verification of Component-based Embedded System Designs." Doctoral thesis, Linköping : Department of Computer and Information Science, Linköping University, 2006. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-7473.

Full text
18

Antwi, Samuel. "Formative Research on Component Display Theory." Ohio University / OhioLINK, 2017. http://rave.ohiolink.edu/etdc/view?acc_num=ohiou1510679208927503.

Full text
19

SURYANARAYANAN, CHANDRAMOULI. "Functional analysis and modelling of a dynamic seal component for a reciprocating gas compressor." Thesis, KTH, Skolan för industriell teknik och management (ITM), 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-236034.

Full text
20

Suzuki, Hiroyuki 1965 Jan. "An analysis of managing the globally dispersed team : a case study of an auto component manufacturer." Thesis, Massachusetts Institute of Technology, 2000. http://hdl.handle.net/1721.1/9226.

Full text
Abstract:
Thesis (S.M.M.O.T.)--Massachusetts Institute of Technology, Sloan School of Management, Management of Technology Program, 2000.
Also available online at the MIT Theses Online homepage.
Includes bibliographical references (leaves 64-65).
Globalization has become one of the most important strategic issues for almost every business organization. Along with globalization, managers and employees are being required to work with people in geographically dispersed locations as well as in local organizations. For a new project, a global company typically creates a group with professionals located in multiple places. The basic principles for managing a co-located group are important and can be applied to managing a group of people in geographically dispersed locations, referred to as a "globally dispersed team". However, global dispersion involves additional complexities, such as cross-cultural and cross-organizational issues. Managing globally dispersed teams is a new challenge for managers and employees, especially those who are appointed as leaders of such teams. Through a review of the literature and existing management publications, as well as actual case studies of globally dispersed teams, this thesis explores the key issues and develops proposals for managers who must lead these globally dispersed teams.
by Hiroyuki Suzuki.
S.M.M.O.T.
21

Yates, James William. "MANAGING TELEMETRY AS AN ENTERPRISE SOLUTION LEVERAGING TECHNOLOGY INTO NEXT-GENERATION TELEMETRY ARCHITECTURES." International Foundation for Telemetering, 2000. http://hdl.handle.net/10150/607714.

Full text
Abstract:
International Telemetering Conference Proceedings / October 23-26, 2000 / Town & Country Hotel and Conference Center, San Diego, California
With today’s rapidly shifting technology in the areas of networking, the Web, platform-independent software, and component technology, a paradigm shift will allow us to treat our telemetry systems as enterprise-wide solutions in the future. These technologies will revolutionize how we support all phases of telemetry data acquisition, processing, archiving, distribution, and display. This paper will explain how these changes affect systems designers, operators, and users. Specific technical areas of discussion include: technology adoption cycles; object-oriented environments and component technologies; database interconnectivity; Web-enabling concepts and implementations; application servers; database replication; data warehousing; and embedded Web servers.
APA, Harvard, Vancouver, ISO, and other styles
22

Wales, Craig. "Multi-component crystallisation approaches to controlling crystalline forms of active pharmaceutical ingredients." Thesis, University of Glasgow, 2013. http://theses.gla.ac.uk/3941/.

Full text
Abstract:
Multi-component crystallisation is investigated as a route to controlling crystalline forms of selected materials that possess pharmaceutical properties. This includes investigating the use of co-crystallisation methodology to selectively crystallise metastable polymorphs and solvated forms of these materials. This differs from the conventional use of co-crystallisation, as the aim of this aspect of the investigation is not to obtain a molecular complex of the two components, but instead for them to crystallise independently, while one component perturbs the solution environment to direct the crystallisation of the second component towards a different, often metastable, polymorph (or solvate). This co-crystallisation methodology is used as a route to crystallising new or elusive polymorphs (or solvates) of the active pharmaceutical ingredients paracetamol, piroxicam, gallic acid monohydrate and piracetam. It is also demonstrated that the use of this method can lead to crystal forms with otherwise unobtainable structural features. Co-crystallisation is also investigated as a route to controlling the ionisation state of piroxicam in the formation of molecular complexes. Molecular complexes were formed with a number of mono-substituted benzoic acids as well as with nitrogen-heterocycles and strong acids. In the molecular complexes formed, piroxicam was found to adopt the non-ionised, zwitterionic, anionic or cationic form, depending on the co-former used. Attempts are made to rationalise the occurrence of each ionisation state by consideration of the relative pKa values of piroxicam and the co-formers. The hydrogen bonded supramolecular synthons in these molecular complexes are also investigated. Co-crystallisation is also used as a route to obtaining molecular complexes of paracetamol and its derivative, 4-acetamidobenzoic acid, with nitrogen-heterocycles as co-formers. 
Molecular complexes of the two, with similar co-formers, are compared in terms of their hydrogen bonded supramolecular synthons. Despite having otherwise similar structural features, the phenolic hydroxyl group in paracetamol and carboxylic acid group in 4-acetamidobenzoic acid result in the formation of very different synthons and in some cases different component ratios. The susceptibility of 4-acetamidobenzoic acid to deprotonation is found to play a major role in the differences observed. Molecular complexes of paracetamol with co-formers containing multiple carboxylic acid groups are also investigated, with a view towards further crystal engineering approaches for molecular complexes of paracetamol. Piracetam complexes with carboxylic acids are investigated in a similar manner. The potential for transfer of a range of these multi-component crystallisations into a non-evaporative environment, with a view to implementing continuous crystallisation approaches, is also investigated. This transfer is found to be challenging for the systems investigated.
APA, Harvard, Vancouver, ISO, and other styles
23

Loftus, John. "On the development of control systems technology for fermentation processes." Thesis, University of Manchester, 2017. https://www.research.manchester.ac.uk/portal/en/theses/on-the-development-of-control-systems-technology-for-fermentation-processes(61955790-a48b-4703-8942-bfe47a38a6c2).html.

Full text
Abstract:
Fermentation processes play an integral role in the manufacture of pharmaceutical products. The Quality by Design initiative, combined with Process Analytical Technologies, aims to facilitate the consistent production of high quality products in the most efficient and economical way. The ability to estimate and control product quality from these processes is essential in achieving this aim. Large historical datasets are commonplace in the pharmaceutical industry and multivariate methods based on PCA and PLS have been successfully used in a wide range of applications to extract useful information from such datasets. This thesis has focused on the development and application of novel multivariate methods to the estimation and control of product quality from a number of processes. The document is divided into four main categories. Firstly, the related literature and inherent mathematical techniques are summarised. Following this, the three main technical areas of work are presented. The first of these relates to the development of a novel method for estimating the quality of products from a proprietary process using PCA. The ability to estimate product quality is useful for identifying production steps that are potentially problematic and also increases process efficiency by ensuring that any defective products are detected before they undergo any further processing. The proposed method is simple and robust and has been applied to two separate case studies, the results of which demonstrate the efficacy of the technique. The second area of work concentrates on the development of a novel method of identifying the operational phases of batch fermentation processes and is based on PCA and associated statistics. Knowledge of the operational phases of a process can be beneficial from a monitoring and control perspective and allows a process to be divided into phases that can be approximated by a linear model. 
The devised methodology is applied to two separate fermentation processes and results show the capability of the proposed method. The third area of work focuses on undertaking a performance evaluation of two multivariate algorithms, PLS and EPLS, in controlling the end-point product yield of fermentation processes. Control of end-point product quality is of crucial importance in many manufacturing industries, such as the pharmaceutical industry. Developing a controller based on historical and identification process data is attractive due to the simplicity of modelling and the increasing availability of process data. The methodology is applied to two case studies and performance evaluated. From both a prediction and control perspective, it is seen that EPLS outperforms PLS, which is important if modelling data is limited.
APA, Harvard, Vancouver, ISO, and other styles
24

Gardner, Robert C. "System component innovation : observations of selection dynamics in a high-technology organization / by Robert Charles Gardner." Thesis, Massachusetts Institute of Technology, 1995. http://hdl.handle.net/1721.1/11360.

Full text
APA, Harvard, Vancouver, ISO, and other styles
25

Davey, Bradford Thomas. "The role of a facilitated online workspace component of a community of practice| Knowledge building and value creation for NASA." Thesis, Pepperdine University, 2013. http://pqdtopen.proquest.com/#viewpdf?dispub=3588584.

Full text
Abstract:

The purpose of this study was to examine the role of an online workspace component of a community in the work of a community of practice. Much has been studied revealing the importance of communities of practice to organizations, project success, and knowledge management, and some of these same successes hold true for virtual communities of practice. Study participants were 75 Education and Public Outreach community members of NASA's Science Mission Directorate Earth Forum. In this mixed methods study, online workspace metrics were used to track participation and a survey completed by 21 members was used to quantify participation. For a more detailed analysis, 15 community members (5 highly active users, 5 average users, and 5 infrequent users), selected based on survey responses, were interviewed. Finally, survey data was gathered from 7 online facilitators to understand their role in the community. Data collected from these 21 community members and 5 facilitating members suggest that highly active users (logging into the workspace daily) were more likely to have transformative experiences, co-create knowledge, feel ownership of community knowledge, have extended opportunities for community exchange, and find new forms of evaluation. Average users shared some similar characteristics with both the highly active members and infrequent users, representing a group in transition as they become more engaged and active in the online workspace. Infrequent users viewed the workspace as having little value, being difficult to navigate, being mainly for gaining basic information about events and community news, and as another demand on their time. Results show the online workspace component of the Earth Science Education and Outreach Forum is playing an important and emerging role for this community by supporting knowledge building and knowledge sharing, and growing in value for those who utilize it more frequently.
The evidence suggests that with increased participation or "usage" comes increased value to the participant and the organization. This research illustrates the possible change in mindset held by participating community members when it comes to the nature of co-location. Additionally, it may be of particular importance in exploring changes in the community members' feelings of connection and belonging.

APA, Harvard, Vancouver, ISO, and other styles
26

Xiao, Xinshu. "Principal component based system identification and its application to the study of cardiovascular regulation." Thesis, Massachusetts Institute of Technology, 2004. http://hdl.handle.net/1721.1/34202.

Full text
Abstract:
Thesis (Ph. D.)--Harvard-MIT Division of Health Sciences and Technology, 2004.
Includes bibliographical references (p. 197-212).
System identification is an effective approach for the quantitative study of physiologic systems. It deals with the problem of building mathematical models based on observed data and enables a dynamical characterization of the underlying physiologic mechanisms specific to the individual being studied. In this thesis, we develop and validate a new linear time-invariant system identification approach which is based on a weighted-principal component regression (WPCR) method. An important feature of this approach is its asymptotic frequency-selective property in solving time-domain parametric system identification problems. Owing to this property, data-specific candidate models can be built by considering the dominant frequency components inherent in the input (and output) signals, which is advantageous when the signals are colored, as are most physiologic signals. The efficacy of this method in modeling open-loop and closed-loop systems is demonstrated with respect to simulated and experimental data. In conjunction with the WPCR-based system identification approach, we propose new methods to noninvasively quantify cardiac autonomic control. Such quantification is important in understanding basic pathophysiological mechanisms or in patient monitoring, treatment design and follow-up.
Our methods analyze the coupling between instantaneous lung volume and heart rate and, subsequently, derive representative indices of parasympathetic and sympathetic control based on physiological and experimental findings. The validity of each method is evaluated via experimental data collected following interventions with known effect on the parasympathetic or sympathetic control. With the above techniques, this thesis explores an important topic in the field of space medicine: effects of simulated microgravity on cardiac autonomic control and orthostatic intolerance (OI). Experimental data from a prolonged bed rest study (simulation of microgravity condition) are analyzed and the conclusions are: 1) prolonged bed rest may impair autonomic control of heart rate; 2) orthostatic intolerance after bed rest is associated with impaired sympathetic responsiveness; 3) there may be a pre-bed rest predisposition to the development of OI after bed rest. These findings may have significance for studying Earth-bound orthostatic hypotension as well as for designing effective countermeasures to post-flight OI. In addition, they also indicate the efficacy of our proposed methods for autonomic function quantification.
by Xinshu Xiao.
Ph.D.
APA, Harvard, Vancouver, ISO, and other styles
27

Logan, Nandi. "CMOS design enhancement techniques for RF receivers : analysis, design and implementation of RF receivers with component enhancement and component reduction for improved sensitivity and reduced cost, using CMOS technology." Doctoral thesis, University of Bradford, 2010. http://hdl.handle.net/10454/5418.

Full text
Abstract:
Silicon CMOS Technology is now the preferred process for low power wireless communication devices, although currently much noisier and slower than comparable processes such as SiGe Bipolar and GaAs technologies. However, due to ever-reducing gate sizes and correspondingly higher speeds, higher Ft CMOS processes are increasingly competitive, especially in low power wireless systems such as Bluetooth, Wireless USB, Wimax, Zigbee and W-CDMA transceivers. With the current 32 nm gate sized devices, speeds of 100 GHz and beyond are on the horizon for CMOS technology, but at a reduced operational voltage, even with thicker gate oxides as compensation. This thesis investigates newer techniques, both from a systems point of view and at a circuit level, to implement an efficient transceiver design that will produce a more sensitive receiver, overcoming the noise disadvantage of using CMOS Silicon. As a starting point, the overall components and available SoC were investigated, together with their architecture. Two novel techniques were developed during this investigation. The first was a high compression point LNA design giving a lower overall systems noise figure for the receiver. The second was an innovative means of matching circuits with low Q components, which enabled the use of smaller inductors and reduced the attenuation loss of the components, the resulting smaller circuit die size leading to smaller and lower cost commercial radio equipment. Both these techniques have had patents filed by the University. Finally, the overall design was laid out for fabrication, taking into account package constraints, bond-wire effects and other parasitic EMC effects.
APA, Harvard, Vancouver, ISO, and other styles
28

Muncy, Jennifer V. "Predictive Failure Model for Flip Chip on Board Component Level Assemblies." Diss., Georgia Institute of Technology, 2004. http://hdl.handle.net/1853/5131.

Full text
Abstract:
Environmental stress tests, or accelerated life tests, apply stresses to electronic packages that exceed the stress levels experienced in the field. In theory, these elevated stress levels are used to generate the same failure mechanisms that are seen in the field, only at an accelerated rate. The methods of assessing reliability of electronic packages can be classified into two categories: a statistical failure based approach and a physics of failure based approach. This research uses a statistical based methodology to identify the critical factors in reliability performance of a flip chip on board component level assembly and a physics of failure based approach to develop a low cycle strain based fatigue equation for flip chip component level assemblies. The critical factors in determining reliability performance were established via experimental investigation and their influence quantified via regression analysis. This methodology differs from other strain based fatigue approaches because it is not an empirical fit to experimental data; it utilizes regression analysis and least squares to obtain correction factors, or correction functions, and constants for a strain based fatigue equation, where the total inelastic strain is determined analytically. The end product is a general flip chip on board equation rather than one that is specific to a certain test vehicle or material set.
APA, Harvard, Vancouver, ISO, and other styles
29

Greatorex, James Michael. "Continuous aerobic processing of piggery effluent : a new approach to quantifying the fate of the nitrogen component." Thesis, University of Birmingham, 1995. http://etheses.bham.ac.uk//id/eprint/1410/.

Full text
Abstract:
The primary objective was the preparation of a complete mass balance around an aerobic treatment system for pig slurry, to quantify the various forms of nitrogen entering and leaving under different conditions. The purpose of this was to assess the effect of such treatment conditions in terms of the amount of polluting forms of nitrogen generated from the slurry. A laboratory scale reactor (designed for this study) was operated under three separate residence times of 2, 4, and 8 days, and an aeration level indicated by a redox value in the range of E\(_{Ag/AgCl}\) = +100 to +200 mV; the latter two giving nitrifying conditions. Emissions of di-nitrogen gas are a major component of a nitrogen mass balance, yet one which has been often neglected because of difficulties in distinguishing it from that in the atmosphere. A novel technique was developed in which atmospheric N\(_2\) in the reactor headspace was removed by flushing the system with an 80/20 gas mixture of argon/oxygen. This left microbially derived N\(_2\) available for collection and analysis by mass spectrometry. Established methods were applied for the measurement of other gaseous nitrogen emissions (NH\(_3\), N\(_2\)O, NO) and other forms of nitrogen in the slurry (organic-N, NH\(_4^+\), NO\(_2^-\) and NO\(_3^-\)). The steam distillation technique for nitrite and nitrate was found to be unreliable; therefore, high-performance liquid chromatography was used as an alternative. The existence of the intermediate nitrified N form of hydroxylamine is postulated but was not quantified in this study. The presence of unidentified components in raw slurry was investigated using HPLC, but only chloride and acetate could be recognised with a high degree of confidence. Mean N\(_2\) concentrations measured were 774 mg l\(^{-1}\) in the 4 day treatment and 523 mg l\(^{-1}\) in the 8 day treatment.
Emissions of the environmentally damaging N\(_2\)O gas were quantified as being 514 mg l\(^{-1}\) in the 4 day treatment and 219 mg l\(^{-1}\) in the 8 day. The lower emissions from the 8 day treatment are attributed to improved contact between oxygen and slurry, reducing the prevalence of zones favourable for denitrification. In the final mass balance study, overall nitrogen leaving the system equalled 86 (±18) % of that entering in the 2 day treatment, 113 (±10) % in the 4 day treatment, and 104 (±21) % in the 8 day treatment. The variation in values was attributed to errors in the liquid phase analysis of slurry nitrogen compounds.
APA, Harvard, Vancouver, ISO, and other styles
30

TAKEDA, Kazuya, Takanori NISHINO, and Kenta NIWA. "Selective Listening Point Audio Based on Blind Signal Separation and Stereophonic Technology." Institute of Electronics, Information and Communication Engineers, 2009. http://hdl.handle.net/2237/15055.

Full text
APA, Harvard, Vancouver, ISO, and other styles
31

Jiminez, Gonzalez Aida. "Antenatal foetal monitoring through abdominal phonogram recordings : a single-channel independent component analysis approach." Thesis, University of Southampton, 2010. https://eprints.soton.ac.uk/190815/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
32

Wells, John Gaulden. "Establishment of a taxonometric structure for the study of biotechnology as a secondary school component of technology education." Diss., This resource online, 1992. http://scholar.lib.vt.edu/theses/available/etd-02052007-081241/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
33

Jansen, van Nieuwenhuizen Rudolph Johannes. "Development of an automated robot vision component handling system." Thesis, Bloemfontein : Central University of Technology, Free State, 2013. http://hdl.handle.net/11462/213.

Full text
Abstract:
Thesis (M. Tech. (Engineering: Electrical)) -- Central University of Technology, Free State, 2013
In the industry, automation is used to optimize production, improve product quality and increase profitability. By properly implementing automation systems, the risk of injury to workers can be minimized. Robots are used in many low-level tasks to perform repetitive, undesirable or dangerous work. Robots can perform a task with higher precision and accuracy to lower errors and waste of material. Machine Vision makes use of cameras, lighting and software to do visual inspections that a human would normally do. Machine Vision is useful in applications where repeatability, high speed and accuracy are important. This study concentrates on the development of a dedicated robot vision system to automatically place components exiting from a conveyor system onto Automatic Guided Vehicles (AGV). A personal computer (PC) controls the automated system. Software modules were developed to do image processing for the Machine Vision system as well as software to control a Cartesian robot. These modules were integrated to work in a real-time system. The vision system is used to determine the parts' position and orientation. The orientation data are used to rotate a gripper and the position data are used by the Cartesian robot to position the gripper over the part. Hardware for the control of the gripper, pneumatics and safety systems was developed. The automated system's hardware was integrated by the use of different communication protocols, namely DeviceNet (Cartesian robot), RS-232 (gripper) and Firewire (camera).
APA, Harvard, Vancouver, ISO, and other styles
34

Somasekaram, Premathas. "A Component-based Business Continuity and Disaster Recovery Framework." Thesis, Uppsala universitet, Institutionen för informationsteknologi, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-323989.

Full text
Abstract:
IT solutions must be protected so that the business can continue, even in the case of fatal failures associated with disasters. Business continuity in the context of disaster implies that business cannot continue in the current environment but instead must continue at an alternate site or data center. However, the BC/DR concept today is too fragmented, as many different frameworks and methodologies exist. Furthermore, many of the application-specific solutions are provided and promoted by software vendors, while hardware vendors provide solutions for their hardware environments. Nevertheless, there are concerns that BC/DR solutions often do not connect to the technical components that are in the lower layers, which function as the foundation for any such solutions; hence, it is equally important to connect and map the requirements accordingly. Moreover, a shift in the hardware environment, such as cloud computing, as well as changes in operations management, such as outsourcing, add complexity that must be captured by a BC/DR solution. Furthermore, the integrated nature of IT-based business solutions also presents new challenges, as it is no longer one IT solution that must be protected but also other IT solutions that are integrated to deliver an individual business process. Thus, it will be difficult to employ a current BC/DR approach. Hence, the purpose of this thesis project is to design, develop, and present a novel way of addressing the BC/DR gaps, while supporting the requirements of a dynamic IT environment. The solution reuses most elements from the existing standards and solutions. However, it also includes new elements to capture and present the technical solution; hence, the complete solution is designated as a framework. The new framework can support many IT solutions since it will have a modular approach, and it is flexible, scalable, and platform and application independent, while addressing the solution on a component level.
The new framework is applied to two application scenarios at the stakeholder site, and the results are studied and presented in this thesis.
APA, Harvard, Vancouver, ISO, and other styles
35

Lau, Chung Yin. "Computational stress analysis for ball grid array reliability and passive component reliability in board level assemblies /." View abstract or full-text, 2005. http://library.ust.hk/cgi/db/thesis.pl?MECH%202005%20LAU.

Full text
APA, Harvard, Vancouver, ISO, and other styles
36

Onuoha, Augustina Tina. "Strategies to Minimize the Bullwhip Effect in the Electronic Component Supply Chain." ScholarWorks, 2018. https://scholarworks.waldenu.edu/dissertations/6258.

Full text
Abstract:
Supply chain leaders in the information technology industry face challenges regarding their ability to mitigate amplified demand and supply variability in a supply chain network--the bullwhip effect--and reduce adverse implications on their component supply chain networks. The purpose of this multiple case study was to explore the strategies supply chain leaders in the United States used to reduce the bullwhip effect. Bullwhip effect theory served as the conceptual framework. Participants in the study were 5 purposefully selected supply chain leaders in the state of Texas who successfully implemented strategies to reduce the bullwhip effect on their networks. Data were collected from semistructured interviews and analysis of documents from the participants' websites. The data were analyzed using the 5 data analysis steps consistent with Yin's approach: collection, stratification, reassembly, interpretation, and conclusion. Four themes emerged from data analysis: (a) collaboration strategy, (b) communication strategy, (c) component shortage reduction strategy, and (d) resource management strategy. Supply chain leaders might use the findings of this study to reduce the bullwhip effect within their networks and improve their profitability. The implications for positive social change include the potential for leaders to improve environmental sustainability by using effective supply chain strategies to reduce the accumulation of excess inventories, reduce transportation fuel usage, and lessen the consumption of natural resources.
APA, Harvard, Vancouver, ISO, and other styles
37

Viljoen, Vernon. "Integration of a vision-guided robot into a reconfigurable component- handling platform." Thesis, [Bloemfontein?] : Central University of Technology, Free State, 2014. http://hdl.handle.net/11462/120.

Full text
Abstract:
Thesis (M. Tech.) -- Central University of Technology, Free State, 2010
The latest technological trend in manufacturing worldwide is automation. Reducing human labour by using robots to do the work is purely a business decision. The reasons for automating a plant include improving productivity, reducing labour and equipment costs, reducing product damage, monitoring system reliability, and improving plant safety. The use of robots in the automation sector adds value to the production line because of their versatility. They can be programmed to follow specific paths when moving material from one point to another and their biggest advantage is that they can operate for twenty-four hours a day while delivering consistent quality and accuracy. Vision-Guided Robots (VGRs) are developed for many different applications and therefore many different combinations of VGR systems are available. All VGRs are equipped with vision sensors which are used to locate and inspect various objects. In this study a robot and a vision system were combined for a pick-and-place application. Research was done on the design of a robot for locating, inspecting and picking selected components from a moving conveyor system.
APA, Harvard, Vancouver, ISO, and other styles
38

Raja, Visakha. "Sub-Modelling of a Jet Engine Component and Creation of Stiffness Interval Based on Cast Dimensional Variations." Thesis, Linköpings universitet, Hållfasthetslära, 2011. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-69479.

Full text
Abstract:
While designing jet-engine components, information about the loads that the component will be subjected to is critical. For this, a full system analysis of the engine is often performed with every component put together in a large finite element model, which is called the whole engine model (WEM). This model will mostly be composed of lower order shell elements with a few thousands of elements. At the design level of a component, the FE model is detailed with several hundred thousand higher order solid elements. The detailed model cannot be directly put into the whole engine model due to excessive run times. Therefore there must be a simpler representation of the component (a sub-model) with much fewer elements so that it can be assembled into the whole engine model. This simple model must have the same stiffness (load/displacement) in chosen directions and the same mass, and the basic mode shapes and frequencies should also be the same as in the detailed model. Four different structural optimisation schemes were studied to prepare a model: sizing optimisation, optimisation with material properties as design variables, combined sizing and material property optimisation, and free-size optimisation. Among these, free-size optimisation, where each element in the model has its own design variable (thickness), was found to be the most effective method. The stiffness could be matched to the detailed model as closely as 5%, and so also could the first two fundamental mode shapes and frequencies. Additionally, the initial sub-model prepared was used to do a preliminary study on how variations in casting dimensions would affect the stiffness of the component in a certain direction. This was done by creating a design of experiments (DoE) for the stiffness. A response surface for the stiffness was created in terms of the dimensions that have the most significant effect.
This was later used to predict an interval for the stiffness based on variations in the cast dimensions with a confidence level of 99.7%.
APA, Harvard, Vancouver, ISO, and other styles
39

Wennberg, Martin, and Richard Michelsson. "Work methodology and identification of uncertainties in quality assurance : A case study of internal component manufacturing in the automotive industry." Thesis, KTH, Industriell produktion, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-216376.

Full text
APA, Harvard, Vancouver, ISO, and other styles
40

Khan, Sardar Zaheer Ahmad. "Technology transfer effectiveness through international joint ventures (IJVs) to their component suppliers : a study of the automotive industry of Pakistan." Thesis, University of Birmingham, 2011. http://etheses.bham.ac.uk//id/eprint/3171/.

Full text
Abstract:
This thesis investigates the important topic of technology transfer effectiveness from international joint ventures (IJVs) established in the automotive industry of Pakistan to their local component suppliers; a relatively under-explored area and context. Using a hybrid methodology (qualitative interviews conducted with the 50 Pakistani first-tier suppliers, 3 of the major assemblers who control 95%-98% of the market, and with officials of the Ministry of Industries and Production, supplemented with a survey questionnaire), the study argues that IJVs in the automotive industry of Pakistan have transferred very limited low-medium complexity parts technology to their Pakistani component suppliers. The results also demonstrate that the assemblers have not, so far, transferred the whole package of technology to their suppliers. This whole package of technology is important for the resource-constrained and underdeveloped suppliers to move up in the global value chain. The results also point out that the willingness of the sender is an important aspect for any technology transfer to take place and, in the context of Pakistan, assemblers are willing to transfer components to component-based technology depending on the underlying complexity of that particular component. Inter-organisational dynamics in the form of trust and social ties play a considerable and vital facilitating role in the transfer and effectiveness of technology. The recipient's role, in terms of learning intention and absorptive capacity, is also highly relevant, along with the willingness of the sender, for the technology transfer to be effective. The study also shows that different governance mechanisms play an important role in technology transfer effectiveness, and the results demonstrate that only a few suppliers have developed exploitative/exploratory innovations and a depth/breadth of learning.
Finally, the study presents relevant contributions for managers, policy makers and researchers interested in the field of technology transfer and its effectiveness.
APA, Harvard, Vancouver, ISO, and other styles
41

Hilber, Patrik. "Component reliability importance indices for maintenance optimization of electrical networks." Licentiate thesis, Stockholm, 2005. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-274.

Full text
APA, Harvard, Vancouver, ISO, and other styles
42

Oskar, Andersson. "Building Blocks: Utilizing Component-Based Software Engineering in Developing Cross-Platform Mobile Applications." Thesis, KTH, Skolan för datavetenskap och kommunikation (CSC), 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-158206.

Full text
Abstract:
Contemporary approaches to cross-platform mobile application development, such as hybrid apps from PhoneGap and generated native apps from Xamarin, show promise in reducing development time for Android, iOS and other platforms. At the same time, studies show that these approaches suffer from various problems, including degraded user experiences and codebases that are difficult to maintain and test properly. In this thesis, a novel prototype framework called Building Blocks was developed with the purpose of investigating the feasibility of utilizing component-based software engineering in solving this problem. The prototype was developed for Android, along with a web interface that allowed users to assemble an Android app from software components. The report concludes that component-based software engineering can be – and already is – utilized successfully to improve cross-platform mobile app development, with special regard to user experience. Qualitative data indicate that Building Blocks as a concept is flexible and shows promise for mobile app development in which functionality is often reused, such as enterprise apps. Rapid prototyping using the web-based visual editing tool was another promising area. However, future use of Building Blocks would require further work on the prototype to improve its ease of use.
APA, Harvard, Vancouver, ISO, and other styles
43

Defernez, Marianne. "Methods based on principal component analysis of mid-infrared spectra : a new approach for the classification and authentication of fruit products." Thesis, University of East Anglia, 1996. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.309908.

Full text
APA, Harvard, Vancouver, ISO, and other styles
44

Lin, Haisheng. "The application of multivariate statistical analysis and batch process control in industrial processes." Thesis, University of Manchester, 2010. https://www.research.manchester.ac.uk/portal/en/theses/the-application-of-multivariate-statistical-analysis-and-batch-process-control-in-industrial-processes(a80fba25-82b1-4f55-a38e-c486262e18dd).html.

Full text
Abstract:
To manufacture safe, effective and affordable medicines with greater efficiency, process analytical technology (PAT) has been introduced by the Food and Drug Administration to encourage the pharmaceutical industry to develop and design well-understood processes. PAT requires chemical imaging techniques to be used to collect process variables for real-time process analysis. Multivariate statistical analysis tools and process control tools are important for implementing PAT in the development and manufacture of pharmaceuticals, as they enable information to be extracted from the PAT measurements. Multivariate statistical analysis methods such as principal component analysis (PCA) and independent component analysis (ICA) are applied in this thesis to extract information regarding a pharmaceutical tablet. ICA was found to outperform PCA and was able to identify the presence of five different materials and their spatial distribution around the tablet. Another important area for PAT is in improving the control of processes. In the pharmaceutical industry, many processes operate in a batch strategy, which introduces difficult control challenges. Near-infrared (NIR) spectroscopy is a non-destructive analytical technique that has been used extensively to extract chemical and physical information from a product sample based on the scattering effect of light. In this thesis, NIR measurements were incorporated as feedback information into several control strategies. Although these controllers performed reasonably well, they could only regulate the NIR spectrum at a number of wavenumbers, rather than over the full spectrum. In an attempt to regulate the entire NIR spectrum, a novel control algorithm was developed. This controller was found to be superior to the only comparable controller in regulating the full NIR spectrum. The benefits of the proposed controller were demonstrated using a benchmark simulation of a batch reactor.
APA, Harvard, Vancouver, ISO, and other styles
45

Sahd, Lize-Marie. "A structured approach to the identification of the significant risks related to enterprise mobile solutions at a mobile technology component level." Thesis, Stellenbosch : Stellenbosch University, 2015. http://hdl.handle.net/10019.1/96674.

Full text
Abstract:
Thesis (MComm)--Stellenbosch University, 2015.
ENGLISH ABSTRACT: The consumerisation of mobile technology is driving the mobile revolution, and enterprises are forced to incorporate mobile solutions into their business processes in order to remain competitive. While there are many benefits relating to the investment in and use of mobile technology, significant risks are also being introduced into the business. The fast pace of technological innovation and the rate of adoption of mobile technology by employees have, however, created an environment in which enterprises deploy mobile solutions on an ad hoc basis. Enterprises only address the risks once they have occurred and resulted in losses. The key contributing factor to this lack of governance and management is the fact that those charged with governance do not understand the underlying mobile technology components. The purpose of this research is to improve the understanding of the underlying components of mobile technology. The research further proposes to use this understanding to identify the significant risks related to mobile technology and to formulate appropriate internal controls to address these risks. The findings of the research identified the following underlying components of mobile technology: mobile devices; mobile infrastructure, data delivery mechanisms and enabling technologies; and mobile applications. Based on an understanding of the components and subcategories of mobile technology, a control framework was used to identify the significant risks related to each component and subcategory. The significant risks identified included both risks to the users (including interoperability, user experience, connectivity and IT support) and risks to the enterprise's strategies (including continuity, security, cost and data ownership). The research concludes by formulating internal controls that the enterprise can implement to mitigate the significant risks.
This resulted in two matrices that serve as quick-reference guides for enterprises in identifying significant risks at an enterprise-specific mobile technology component level, as well as the relevant internal controls to consider. The matrices also assist enterprises in determining the best mobile solutions to deploy in their business, given their strategies, risk evaluation and control environment.
AFRIKAANSE OPSOMMING: The mobile revolution is driven by the consumer of mobile technology and, in order to remain competitive, enterprises are forced to implement mobile technology in their business processes. While there are many benefits attached to the investment in and use of mobile technology, the business is also exposed to significant risks. The fast pace at which mobile technology develops and is adopted by employees has, however, created an environment in which enterprises deploy mobile technology on an ad hoc basis. Businesses only address the risks after they have already occurred and resulted in losses. The main contributing factor to the lack of governance and management of mobile technology is the fact that those responsible for governance do not understand the underlying mobile technology components. The purpose of this research is to improve the understanding of the underlying components of mobile technology. The research further aims to identify the significant risks attached to mobile technology and to formulate appropriate internal controls to address these risks. The findings of the research identified the following underlying components of mobile technology: mobile devices; mobile infrastructure, data delivery mechanisms, and enabling technologies; and mobile applications. Based on an understanding of the components and subcategories of mobile technology, a control framework was used to identify the significant risks attached to each component and subcategory of the technology. The significant risks include both risks to the user (including continuity, user experience, connectivity and IT support) and risks to the enterprise's strategies (including continuity, security, cost and data ownership).
The research concludes with the formulation of the controls that can be implemented to address the significant risks. This resulted in two tables that can be used by enterprises as quick-reference frameworks in identifying significant risks at an enterprise-specific technology component level, as well as in considering the relevant internal controls. The tables also help enterprises implement the best mobile technology for their business, based on their strategy, risk evaluation and control environment.
APA, Harvard, Vancouver, ISO, and other styles
46

Hansen, Hugo, and Oliver Öhrström. "Benchmarking and Analysis of Entity Referencing Within Open-Source Entity Component Systems." Thesis, Malmö universitet, Fakulteten för teknik och samhälle (TS), 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:mau:diva-20074.

Full text
Abstract:
Runtime performance is essential for real-time games; the faster a game can run, the more features designers can put into the game to accomplish their vision. A popular architecture for video games is the Entity Component System architecture, aimed at improving both object composition and performance. There are many tests of how this architecture performs under its optimal linear execution. This thesis presents a performance comparison of how several popular open-source Entity Component System libraries perform when fetching data from other entities during iteration. An object-oriented test is also done as a baseline, and to verify whether the known drawbacks of object orientation can still be seen within these test cases. Our results show that doing a random lookup during iteration can cause orders of magnitude worse performance for Entity Component Systems.
APA, Harvard, Vancouver, ISO, and other styles
47

Kanaparthi, Pradeep Kumar. "Detection and Recognition of U.S. Speed Signs from Grayscale Images for Intelligent Vehicles." University of Toledo / OhioLINK, 2012. http://rave.ohiolink.edu/etdc/view?acc_num=toledo1352934398.

Full text
APA, Harvard, Vancouver, ISO, and other styles
48

Hosek, Vicki Ann. "Locating the Critical Component in Technological Pedagogical and Content Knowledge (TPACK)| An Examination of How Graduate Students Recruit TPACK and Critical Digital Literacy into Classroom Practices." Thesis, Illinois State University, 2019. http://pqdtopen.proquest.com/#viewpdf?dispub=10978267.

Full text
Abstract:

The objectives of this study were to gain an understanding of how practicing teachers believe they are prepared to meaningfully and critically integrate technology into their classroom practices, and to understand how practicing teachers recruited those beliefs into their teaching practices. This included gaining an understanding of what they believed led to their engagement in the critical dimensions of technology use in their teaching practices. This mixed-methods study contained two phases. In Phase 1, 58 graduate students in a College of Education completed a newly developed Critical Technological Pedagogical Content Knowledge (C-TPACK) survey containing Likert-scale and open-ended questions. A subset of four graduate students, who were also practicing teachers, participated in Phase 2 of this study, where lesson plans, observations, and interviews were analyzed. The findings indicated that limited training in and exposure to C-TPACK during teacher education coursework and professional development (PD), uncertainty about students' critical digital literacies (CDL), the teachers' varying understandings of CDL, resource limitations, and restrictive school policies posed barriers to the teachers' recruitment of C-TPACK into their practices. These findings showed the importance of tying critical theory to technology in education coursework and PD programs. This study proposes the use of a theoretical framework that prioritizes critical theory, namely the C-TPACK framework, when analyzing teachers' technology integration practices. KEYWORDS: TPACK, C-TPACK, critical digital literacy, digital literacy, teacher education, professional development

APA, Harvard, Vancouver, ISO, and other styles
49

Elmqvist, Jonas. "Components, Safety Interfaces, and Compositional Analysis." Licentiate thesis, Linköping University, Linköping University, Department of Computer and Information Science, 2010. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-57490.

Full text
Abstract:

Component-based software development has emerged as a promising approach for developing complex software systems by composing smaller, independently developed components into larger component assemblies. This approach offers means to increase software reuse and to achieve higher flexibility and shorter time-to-market through the use of off-the-shelf components (COTS). However, the use of COTS in safety-critical systems is largely unexplored.

This thesis addresses the problems appearing in component-based development of safety-critical systems. We aim at efficient reasoning about safety at system level while adding or replacing components. For safety-related reasoning it does not suffice to consider only functioning components in their intended environments; the behaviour of components in the presence of single or multiple faults must also be considered. Our contribution is a formal component model that includes the notion of a safety interface. It describes how the component behaves with respect to the violation of a given system-level property in the presence of faults in its environment. This approach also provides a link between formal analysis of components in safety-critical systems and the traditional engineering processes supported by model-based development.

We also present an algorithm for deriving safety interfaces given a particular safety property and fault modes for the component. The safety interface is then used in a method proposed for compositional reasoning about component assemblies. Instead of reasoning about the effect of faults on the composed system, we suggest analysing fault tolerance through pairwise analysis based on safety interfaces.

The framework is demonstrated as a proof of concept in two case studies: a hydraulic system from the aerospace industry and an adaptive cruise controller from the automotive industry. The case studies have shown that a more efficient system-level safety analysis can be performed using the safety interfaces.

APA, Harvard, Vancouver, ISO, and other styles
50

Ackerstierna, Paula. "The Environmental Impact of an Automotive Plastic Component : A lifecycle approach of a deco panel scenario analysis of two different plastics." Thesis, Karlstads universitet, Fakulteten för hälsa, natur- och teknikvetenskap (from 2013), 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:kau:diva-73177.

Full text
Abstract:
The transport sector is a major end-user of energy, and consumers are becoming increasingly conscious of their environmental footprint, making the environmental footprint of automobile components one of the requirements in new product design and development. The purpose of the study is to identify the environmental impacts of a plastic panel. The main objective of the study is to perform an environmental life cycle assessment (E-LCA) of an existing panel across four scenarios with a non-bio-based plastic and a bio-based plastic. The first two scenarios have the same production and use phase, but different end-of-life treatments; the plastic in these scenarios is non-bio-based. The last two scenarios have the same bio-based plastic and use phase, but different end-of-life treatments. The first three scenarios have a surface material covering the plastic. The environmental impacts analyzed are global warming potential (GWP), acidification potential (AP), eutrophication potential (EP), photochemical ozone creation potential (POCP), primary energy demand (PED) and particulate matter (PM). The analysis is carried out according to ISO 14040/44 with the four steps of LCA: 1) goal and scope definition, 2) inventory analysis, 3) impact assessment, and 4) interpretation. The functional unit of the analysis is a plastic panel. The inventory was collected from the literature, the LCA software GaBi, and the commissioner of the study. The environmental impact assessment was conducted in GaBi 8 with the methods CML2015, Primary Energy, and IMPACT2002+. Dominance and contribution analyses were applied to identify the hotspots of the life cycle. The hotspot of the life cycle was identified to be the production phase. The main contributor within the scenarios was the plastic production, specifically the granulates and the fiber fillings. The bio-based plastic reduced the impacts compared to the non-bio-based plastic in five out of six cases; however, the photochemical ozone creation potential was the same for both plastics.
The bio-based plastic reduced GWP by 16%, AP by 1%, EP by less than 1%, and PED by 19%. If the surface cover in aluminum was removed, GWP was reduced by 46%, AP by 35%, EP by 29%, POCP by 36%, PED by 42% and PM by 40%. Transportation contributed most to the acidification potential, eutrophication potential, and particulate matter impacts, and its impacts were greater for the bio-based plastic than for the non-bio-based plastic. The granulates of the plastic, along with the injection molding, are the main contributors, due to the use of coal-based electricity for the injection molding and oil for the plastic production. The values used in the study are based on country averages, which may differ depending on geographic location and its development, as China is a country with a large area. The GWP is the highest value of the impacts analyzed, but even though the others are small fractions, they may cause great damage: they can irritate the eyes, damage the lungs and impair photosynthesis. By using recycled material for products instead of new materials, as done in the study, the impacts could be lowered. As some previous studies agree, the use of bio-plastics lowers the environmental impact by a few percent. The bio-based plastic is an environmentally sustainable alternative to the current plastic, as the location of the panel is not sensitive to excessive heat.
APA, Harvard, Vancouver, ISO, and other styles
