To see the other types of publications on this topic, follow the link: Computer engineering.

Dissertations / Theses on the topic 'Computer engineering'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the top 50 dissertations / theses for your research on the topic 'Computer engineering.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse dissertations / theses in a wide variety of disciplines and organise your bibliography correctly.

1

Kutter, Philipp W. "Montages : engineering of computer languages /." Zürich : TIK Institut für Technische Informatik und Kommunikationsnetze, ETH Eidgenössische Technische Hochschule Zürich, 2004. http://e-collection.ethbib.ethz.ch/show?type=diss&nr=15421.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Er, M. C. "Computer interpretation of engineering drawings." Thesis, University of Essex, 1985. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.373204.

3

Vieri, Carlin James. "Reversible computer engineering and architecture." Thesis, Massachusetts Institute of Technology, 1999. http://hdl.handle.net/1721.1/80144.

Abstract:
Thesis (Ph.D.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 1999.
Includes bibliographical references (p. 162-165).
by Carlin James Vieri.
Ph.D.
4

Pickrell, Nathan. "Efficiently managing the computer engineering and Computer Science labs." Thesis, California State University, Long Beach, 2013. http://pqdtopen.proquest.com/#viewpdf?dispub=1522647.

Abstract:

University lab environments are handled differently than corporate, government, and commercial Information Technology (IT) environments. While all environments have the common issues of scalability and cross-platform interoperability, educational lab environments must additionally handle student permissions, student files, student printing, and special education labs. The emphasis is on uniformity across lab machines for a uniform course curriculum.

This thesis explains how a specific set of Computer Science labs is maintained. It describes how documentation is maintained, how the lab infrastructure is set up, how the technicians managing the lab build master lab images, how all of the workstations in the lab are cloned, and how a portion of the maintenance is handled. Additionally, this paper describes some of the specialty labs provided for courses with functional topics.

5

Le, Gal Thierry. "Re-engineering software for integration using computer aided software engineering." Thesis, This resource online, 1992. http://scholar.lib.vt.edu/theses/available/etd-06232009-063016/.

6

Akbary-Safa, Mahnaz. "Computer aided inspection in engineering manufacture." Thesis, Imperial College London, 1992. http://hdl.handle.net/10044/1/8346.

7

Cook, Carl Leslie Raymond. "Towards Computer-Supported Collaborative Software Engineering." Thesis, University of Canterbury. Computer Science and Software Engineering, 2007. http://hdl.handle.net/10092/1140.

Abstract:
Software engineering is a fundamentally collaborative activity, yet most tools that support software engineers are designed only for single users. There are many foreseen benefits in using tools that support real time collaboration between software engineers, such as avoiding conflicting concurrent changes to source files and determining the impact of program changes immediately. Unfortunately, it is difficult to develop non-trivial tools that support real time Collaborative Software Engineering (CSE). Accordingly, the few CSE tools that do exist have restricted capabilities. Given the availability of powerful desktop workstations and recent advances in distributed computing technology, it is now possible to approach the challenges of CSE from a new perspective. The research goal in this thesis is to investigate mechanisms for supporting real time CSE, and to determine the potential gains for developers from the use of CSE tools. An infrastructure, CAISE, is presented which supports the rapid development of real time CSE tools that were previously unobtainable, based on patterns of collaboration evident within software engineering. In this thesis, I discuss important design aspects of CSE tools, including the identification of candidate patterns of collaboration. I describe the CAISE approach to supporting small teams of collaborating software engineers. This is by way of a shared semantic model of software, a protocol for tool communication, and Computer Supported Collaborative Work (CSCW) facilities. I then introduce new types of synchronous semantic model-based tools that support various patterns of CSE. Finally, I present empirical and heuristic evaluations of typical development scenarios. Given the CAISE infrastructure, it is envisaged that new aspects of collaborative work within software engineering can be explored, allowing the perceived benefits of CSE to be fully realised.
8

Sivaloganathan, Sangarappillai. "Sketching input for computer aided engineering." Thesis, City University London, 1991. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.292733.

9

Tran, Sang Cong. "Applications of formal methods in engineering." Thesis, University of Warwick, 1991. http://wrap.warwick.ac.uk/60452/.

Abstract:
The main idea presented in this thesis is to propose and justify a general framework for the development of safety-related systems based on a selection of criticality and the required level of integrity. We show that formal methods can be practically and consistently introduced into the system design lifecycle without incurring excessive development cost. An insight into the process of generating and validating a formal specification from an engineering point of view is illustrated, in conjunction with formal definitions of specification models, safety criteria and risk assessments. Engineering specifications are classified into two main classes of systems, memoryless and memory bearing systems. Heuristic approaches for specification generation and validation of these systems are presented and discussed with a brief summary of currently available formal systems and their supporting tools. It is further shown that to efficiently address different aspects of real-world problems, the concept of embedding one logic within another mechanised logic, in order to provide mechanical support for proofs and reasoning, is practical. A temporal logic framework, which is embedded in Higher Order Logic, is used to verify and validate the design of a real-time system. Formal definitions and properties of temporal operators are defined in HOL, and real-time concepts such as timing marker, interrupt and timeout are presented. A second major case study is presented on the specification of a solid model for mechanical parts. This work discusses the modelling theory with set theoretic topology and Boolean operations. The theory is used to specify the mechanical properties of large distribution transformers. Associated mechanical properties such as volumetric operations are also discussed.
10

Bäckström, Emil. "NIISim, a Simulator for Computer Engineering Education." Thesis, KTH, Skolan för informations- och kommunikationsteknik (ICT), 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-94184.

Abstract:
Students at KTH can take a course called IS1200 Computer Engineering. This course teaches some of the basic aspects of computer engineering. One important part of the course is the labs, which are carried out on an Altera DE2 Development and Education board. The labs utilize many of the buttons and LEDs on this board. Unfortunately, these boards are only available during the course lab sessions, meaning students have no way of fully testing their programs at home. Altera does provide a simulator, but it is not able to simulate the features on the board. NIISim aims to solve this problem. NIISim (Nios II Simulator) is a simulator that will be able to simulate all the functionality on the DE2 board that is necessary to complete all the IS1200 course labs. It comes with support for the Nios II CPU from Altera, several of Altera's I/O devices and many features on the DE2 board. With a simple graphical user interface, the user is able to quickly load the appropriate files and start the simulation. The user is also able to communicate with the simulated program using a console that supports both text input and output. Testing has shown that NIISim simulates the IS1200 course labs without problems. This is a great success. Furthermore, the simulation is performed at a much faster rate than with the simulator provided by Altera. The intention is now that NIISim will be used in the IS1200 course to help enhance students' learning experience, as they will have much more time to experiment with the DE2 board features. NIISim also makes a great starting platform for future master's thesis projects, such as implementing a cache simulator or multi-core simulation support.
11

Goekay, Mehmet Kemal. "Developing computer methodologies for rock engineering decisions." Thesis, Imperial College London, 1993. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.339439.

12

Race, Charles T. "Value engineering: an application to computer software." Thesis, Monterey, California. Naval Postgraduate School, 1995. http://hdl.handle.net/10945/7516.

Abstract:
The purpose of this thesis is to determine how and to what extent the Department of the Navy's Value Engineering Program can be utilized in the acquisition of computer software. A review of professional literature such as journals, perio…
13

Shahdad, Mir Abubakr. "Engineering innovation (TRIZ based computer aided innovation)." Thesis, University of Plymouth, 2015. http://hdl.handle.net/10026.1/3317.

Abstract:
This thesis describes the approach and results of research to create TRIZ-based computer-aided innovation tools (AEGIS and Design for Wow). The research has mainly been based around two tools created under it: AEGIS (Accelerated Evolutionary Graphics Interface System) and Design for Wow. Both of these tools are discussed in this thesis in detail, along with the test data, design methodology, test cases, and research. Design for Wow (http://www.designforwow.com) is an attempt to summarise successful inventions and designs from all over the world on a web portal with multiple capabilities. These designs/innovations are then linked to the TRIZ Principles in order to determine whether the innovative aspects of these successful innovations are fully covered by the forty TRIZ Principles. In Design for Wow, a framework is created which is implemented through a review tool. The Design for Wow website includes this tool, which has been used by the researcher, the users of the site and reviewers to analyse the uploaded data in terms of the strength of the TRIZ Principles linked to it. AEGIS is a software tool developed under this research to help graphic designers make innovative graphic designs. Again, it uses the forty TRIZ Principles as a set of guiding rules in the software. AEGIS creates graphic design prototypes according to user input and uses the TRIZ Principles framework as a guide to generate innovative graphic design samples. The AEGIS tool is based on a subset of the TRIZ Principles discussed in Chapter 3. In AEGIS, the TRIZ Principles are used to create innovative graphic design effects. The literature review on innovative graphic design (in Chapter 3) has been analysed for links with the TRIZ Principles, and the DNA of AEGIS has been built on the basis of this study.
Results from various surveys and questionnaires were used to collect the innovative graphic design samples, to which TRIZ was then mapped (see section 3.2). The TRIZ effects were mapped to the basic graphic design elements, and the anatomy of graphic design letters was studied to analyse the TRIZ effects in the collected samples. This study was used to build the TRIZ-based AEGIS tool. Hence, the AEGIS tool applies innovative effects using TRIZ to basic graphic design elements (as described in section 3.3). The working of AEGIS is based on Genetic Algorithms coded specifically to implement TRIZ Principles specialised for graphic design; Chapter 4 discusses the process followed to apply the TRIZ Principles to graphic design and to code them using Genetic Algorithms, resulting in the AEGIS tool. Similarly, in Design for Wow, the uploaded content has been analysed for its links with the TRIZ Principles (see section 3.1 for the TRIZ Principles). The tool created in Design for Wow is based on the framework of analysing the TRIZ links in the uploaded content. The 'Wow' concept discussed in sections 5.1 and 5.2 is the basis of the Design for Wow website, whereby users upload content they classify as 'Wow'. This content is then further analysed for the 'Wow factor' and mapped to the TRIZ Principles as the TRIZ tagging methodology is framed (section 5.5). From the results of the research, it appears that the TRIZ Principles are a comprehensive set of basic innovation building blocks. Some surveys suggest that, amongst other tools, the TRIZ Principles were the first choice and the most used. They thus have the potential of being used in other innovation domains, to help in their analysis, understanding and potential development.
14

CAMPANELLA, Davide. "COMPUTER AIDED ENGINEERING OF SOLID BONDING PHENOMENA." Doctoral thesis, Università degli Studi di Palermo, 2014. http://hdl.handle.net/10447/95507.

Abstract:
Joining is a fundamental technological process in manufacturing, used to create a single piece from two or more parts. Welding is still today one of the most popular joining techniques used in manufacturing, allowing a permanent junction. Traditional welding processes are based on the melting of the materials to be joined. In this way, several defects may arise because of solidification problems, joint deformation due to elevated residual stress, and the metallurgical integrity of the joints (intermetallics, porosity, etc.). As an example, some aluminum alloys present considerable problems when the junction is carried out by traditional fusion welding methods. During the melting process, in fact, the liquid material can react with the surrounding atmosphere, oxidizing and creating a weak joint. On the other hand, solid bonding based welding processes allow for defect-free joints with low residual stress and low distortion. However, these processes are usually characterized by complex mechanics due to peculiar material flow. Hence, the engineering and optimization of solid bonding processes is difficult and requires a large number of time- and cost-consuming test trials. In this context, proper numerical models are essential tools permitting effective process design. The aim of this research was the computer aided engineering of two different manufacturing processes taking advantage of the same metallurgical phenomenon, namely solid bonding. Linear Friction Welding (LFW), used to weld non-axisymmetric components, and Accumulative Roll Bonding (ARB), used to increase the mechanical properties of sheet metals, were selected. Experiments, both of LFW and ARB, were run with the aim of studying the effects of the process input parameters on the final product quality, defining proper process windows, and acquiring the data needed for the numerical models' set-up and validation.
In particular, as far as LFW is concerned, a dedicated experimental machine, able to produce LFWed joints with varying pressure, oscillation frequency and amplitude, was designed and built. Numerical models were set up, validated and used to design the process by studying the complex material behavior during the solid bonding of different aluminum alloys. As far as ARB is concerned, two different numerical models, using an explicit and an implicit approach respectively, were considered in order to study the process. An implicit approach was used for the LFW process, leading to an understanding of the influence of the main process variables on the field variables distribution and the occurrence of actual bonding. The simulation tools used in this work were DEFORM3D and ABAQUS CAE/6.9 (2D and 3D modes). The first prototype of the LFW machine was designed and developed during the first doctorate year. During the second doctorate year, the Accumulative Roll Bonding process was studied at the University of Erlangen-Nuremberg while, during the third year, the Linear Friction Welding process was analyzed at the University of Palermo.
15

Roberts, Stephen I. "Energy-aware performance engineering in high performance computing." Thesis, University of Warwick, 2017. http://wrap.warwick.ac.uk/107784/.

Abstract:
Advances in processor design have delivered performance improvements for decades. As physical limits are reached, however, refinements to the same basic technologies are beginning to yield diminishing returns. Unsustainable increases in energy consumption are forcing hardware manufacturers to prioritise energy efficiency in their designs. Research suggests that software modifications will be needed to exploit the resulting improvements in current and future hardware. New tools are required to capitalise on this new class of optimisation. This thesis investigates the field of energy-aware performance engineering. It begins by examining the current state of the art, which is characterised by ad-hoc techniques and a lack of standardised metrics. Work in this thesis addresses these deficiencies and lays stable foundations for others to build on. The first contribution made includes a set of criteria which define the properties that energy-aware optimisation metrics should exhibit. These criteria show that current metrics cannot meaningfully assess the utility of code or correctly guide its optimisation. New metrics are proposed to address these issues, and theoretical and empirical proofs of their advantages are given. This thesis then presents the Power Optimised Software Envelope (POSE) model, which allows developers to assess whether power optimisation is worth pursuing for their applications. POSE is used to study the optimisation characteristics of codes from the Mantevo mini-application suite running on a Haswell-based cluster. The results obtained show that of these codes TeaLeaf has the most scope for power optimisation while PathFinder has the least. Finally, POSE modelling techniques are extended to evaluate the system-wide scope for energy-aware performance optimisation. System Summary POSE allows developers to assess the scope a system has for energy-aware software optimisation independent of the code being run.
16

Abuseta, Yousef M. "AutoTaSC : model driven development for autonomic software engineering." Thesis, Liverpool John Moores University, 2009. http://researchonline.ljmu.ac.uk/5927/.

Abstract:
Whilst much research progress has been achieved towards the development of autonomic software engineering tools and techniques, including policy-based management, model-based development, service-oriented architecture and model driven architecture, these efforts have often focused on and started from chosen object-oriented models of required software behaviour, rather than a domain model including user intentions and/or software goals. Such an approach is often reported to lead to "misalignment" between the business process layer and its associated computational enabling systems. This is specifically noticeable in adaptive and evolving business systems and/or process settings. To address this long-standing problem, research has over the years investigated many avenues to close the gap between business process modelling and the generation of the enactment (computation) layer, which is responsive to business changes. Within this problem domain, this research sets out to study the extension of the Model Driven Development (MDD) paradigm to the business/domain model, that is, how to raise the abstraction level of model-driven software development to the domain level and provide model synchronisation to trace and analyse the impact of a given model change. The main contribution of this research is the development of an MDD-based design method for autonomic systems referred to as AutoTaSC. The latter consists of a series of related models, each of which represents the system under development at a given stage. The first and highest-level model is the abstract model, referred to as the Platform Independent Model (PIM). The next model encapsulates the PIM model for the autonomic system, where the autonomic capabilities and required components (such as monitor, sensor, actuator, analyser, policy, etc.) are added via some appropriate transformation rules.
Targeting a specific technology involves adding, also via transformation rules, specific information related to that platform, from which the Platform Specific Model (PSM) for the autonomic system is extracted. In the last stage, code can be generated for the specific platform or technology targeted in the previous stage, web services for instance. In addition, the AutoTaSC method provides a situated model synchronisation mechanism, which is designed following autonomic systems principles. For instance, to guarantee model synchronisation, each model from each AutoTaSC stage has an associated policy-based feedback control loop, which regulates its reaction to detected model changes. Thus, the AutoTaSC method uses a model transformation approach to drive model query, view and synchronisation. The AutoTaSC method was evaluated using a number of benchmark case studies to test the research hypothesis, including the effectiveness and generality of the AutoTaSC design method.
17

McKnight, Walter Lee. "A meta system for generating software engineering environments /." The Ohio State University, 1985. http://rave.ohiolink.edu/etdc/view?acc_num=osu1487260531958418.

18

Glisson, William Bradley. "The Web Engineering Security (WES) methodology." Thesis, University of Glasgow, 2008. http://theses.gla.ac.uk/186/.

Abstract:
The World Wide Web has had a significant impact on basic operational economical components in global information rich civilizations. This impact is forcing organizations to provide justification for security from a business case perspective and to focus on security from a web application development environment perspective. This increased focus on security was the basis of a business case discussion and led to the acquisition of empirical evidence gathered from a high level Web survey and more detailed industry surveys to analyse security in the Web application development environment. Along with this information, a collection of evidence from relevant literature was also gathered. Individual aspects of the data gathered in the previously mentioned activities contributed to the proposal of the Essential Elements (EE) and the Security Criteria for Web Application Development (SCWAD). The Essential Elements present the idea that there are essential, basic organizational elements that need to be identified, defined and addressed before examining security aspects of a Web Engineering Development process. The Security Criteria for Web Application Development identifies criteria that need to be addressed by a secure Web Engineering process. Both the EE and SCWAD are presented in detail along with relevant justification of these two elements to Web Engineering. SCWAD is utilized as a framework to evaluate the security of a representative selection of recognized software engineering processes used in Web Engineering application development. The software engineering processes appraised by SCWAD include: the Waterfall Model, the Unified Software Development Process (USD), Dynamic Systems Development Method (DSDM) and eXtreme Programming (XP). 
SCWAD is also used to assess existing security methodologies, comprising the Orion Strategy; Survivable / Viable IS approaches; the Comprehensive Lightweight Application Security Process (CLASP) and Microsoft's Trustworthy Computing Security Development Lifecycle. The synthesis of information provided by both the EE and SCWAD was used to develop the Web Engineering Security (WES) methodology. WES is a proactive, flexible, process-neutral security methodology with customizable components that is based on empirical evidence and used to explicitly integrate security throughout an organization's chosen application development process. In order to evaluate the practical application of the EE, SCWAD and the WES methodology, two case studies were conducted during the course of this research. The first case study describes the application of both the EE and SCWAD to the Hunterian Museum and Art Gallery's Online Photo Library (HOPL) Internet application project. The second case study presents the commercial implementation of the WES methodology within a Global Fortune 500 financial service sector organization. The assessment of the WES methodology within the organization consisted of an initial survey establishing current security practices, a follow-up survey after changes were implemented and an overall analysis of the security conditions assigned to projects throughout the life of the case study.
19

Pallotta, Vincenzo. "Cognitive language engineering towards robust human-computer interaction /." Lausanne, 2002. http://library.epfl.ch/theses/?display=detail&nr=2630.

20

Cevheri, Necmettin. "Computer Aided Engineering Of An Unmanned Underwater Vehicle." Master's thesis, METU, 2009. http://etd.lib.metu.edu.tr/upload/2/12610754/index.pdf.

Abstract:
Hydrodynamic and thermal analyses performed during the conceptual design of an unmanned underwater vehicle are presented in this study. The hull shape is determined by considering alternative shapes, and the dimensions are determined from the internal arrangement of components. Preliminary thermal analyses of the watertight section are performed with a commercial software package called FLUENT to check the risk of over-heating due to the heat dissipation of devices. The performance of the proposed hull design is also analyzed with FLUENT. Before simulations of the vehicle, validation studies are performed. Models 4159, 4158 and 4154 of Series 58 are chosen as the experimental reference, and their total resistance coefficients are compared with the results of the validation analyses. Mesh densities, turbulence models, near-wall modeling approaches and inlet turbulence intensities are varied to understand their effects on the accuracy of predictions. A suitable turbulence modeling approach is chosen to analyze the forward and vertical motions of the vehicle to check whether speed requirements are fulfilled. Hull configurations with and without appendages are used to observe their effects on total drag. It is observed that the proposed design satisfies the speed requirements of the vehicle and no overheating is expected in the watertight section.
21

Zullo, Luca Costantino. "Computer aided design of experiments : an engineering approach." Thesis, Imperial College London, 1991. http://hdl.handle.net/10044/1/8323.

22

Giles, David. "Computer-based modelling and analysis in engineering geology." Thesis, University of Portsmouth, 2014. https://researchportal.port.ac.uk/portal/en/theses/computerbased-modelling-and-analysis-in-engineering-geology(091c5104-4dbb-4e90-b897-aaf34702100a).html.

Abstract:
This body of work presents the research and publications undertaken under a general theme of computer-based modelling and analysis in engineering geology. Papers are presented on geotechnical data management, data interchange, Geographical Information Systems, surface modelling, geostatistical methods, risk-based modelling, knowledge-based systems, remote sensing in engineering geology and on the integration of computer applications into applied geoscience teaching. The work highlights my own personal contributions and publications under this theme as well as collaborations and output emanating from PhD co-supervisions, which have included the following projects: a geotechnical and geochemical characterisation of dry oil lake contaminated soil in Kuwait; dust dispersion monitoring and modelling; geotechnical properties of chalk putties; the application of airborne multispectral remote sensing and digital terrain modelling to the detection and delineation of landslides on clay-dominated slopes of the Cotswolds Escarpment; domestic property insurance risks associated with brickearth deposits; development of a knowledge-based system methodology for designing solid waste disposal sites in arid and semi-arid environments; GIS techniques as an aid to the assessment of earthquake-triggered landslide hazards; the application of GIS as a data integrator of pre-ground investigation desk studies for terrain evaluation and investigation planning; and the influence of clay mineralogy, pore water composition and pre-consolidation pressure on the magnitude of ground surface heave due to rises in groundwater level. My publication record comprises: pathfinder and seminal papers; papers from co-supervised PhD programmes; pedagogic contributions; encyclopaedia entries; international collaborations; technical authorship and support; other published contributions; confidential development and technical reports; and internal briefing papers.
23

Oltikar, Akhil Manohar. "Computer-Aided Engineering of Plywood Upholstered Furniture Frames." NCSU, 2001. http://www.lib.ncsu.edu/theses/available/etd-20001221-130641.

Abstract:

Until the early 1900s, furniture was built by hand, one piece at a time. The industrial revolution and modern manufacturing technology have changed all of that. Today, as the furniture industry moves firmly into the next century, computerized systems and automated manufacturing have become more common in the industry. This thesis represents an effort to analyze the current practices in computer-aided design of upholstered furniture, specifically plywood frame furniture, and to develop new procedures for reducing the lead-time in upholstery product development. Different 3-D modeling techniques for designing plywood furniture frames and their features have been developed and implemented. A plywood frame feature library has been created, and the geometric relations needed to fully constrain each feature type have been developed. This reduces modeling time and also increases consistency in the solid models. A new reverse engineering procedure, using an articulating arm, has been proposed, implemented, and tested for 3-D digitization of plywood frames. The proposed methodology eliminates some of the traditional processes currently followed in the industry, thus making product development faster and more streamlined. Further, an algorithm has been developed, implemented and tested for automatically mirroring plywood upholstery frame assemblies in a CAD system. The algorithm considerably reduces the modeling lead-time in the product development process. Finally, some future work that considers currently available 3-D CAD technologies has been recommended, which would help close the gap between upholstery designers and manufacturers.

APA, Harvard, Vancouver, ISO, and other styles
24

Altenhof, Jeffrey L. "Computer-aided concurrent engineering in refrigeration system design." Master's thesis, This resource online, 1990. http://scholar.lib.vt.edu/theses/available/etd-01262010-020010/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
25

Hatchell, Brian. "Data base design for integrated computer-aided engineering." Thesis, Georgia Institute of Technology, 1987. http://hdl.handle.net/1853/16744.

Full text
APA, Harvard, Vancouver, ISO, and other styles
26

Handa, Sunny. "Reverse engineering computer programs under Canadian copyright law." Thesis, McGill University, 1994. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=22693.

Full text
Abstract:
The field of copyright law has been especially active in recent times as a result of its application to computer programs. Copyright law, not originally designed to protect such works, has had to adapt to suit the special nature of computer programs. This paper addresses the applicability of copyright law to the reverse engineering of computer programs. Reverse engineering is a method by which programmers may uncover the ideas and processes used within an existing computer program, thereby allowing the construction of compatible computer programs. Reverse engineering may also be used to create works which are directly competitive with the original program, and to assist in the piracy of computer programs. The mere act of reverse engineering computer programs, regardless of its purpose, potentially infringes the copyright of the computer program in question, irrespective of whether the results of the process are used in an infringing manner.
Recently both the European Union countries and the United States have accepted reverse engineering as an exception to copyright infringement. The European Union has opted for a legislative solution, whereas in the United States several courts have construed the fair use exception contained in that country's Copyright Act as allowing reverse engineering.
In this paper, it is argued that Canada must also adopt a reverse engineering exception to copyright infringement. It is claimed that the implementation of such an exception is justified through examination of the underlying policy goals of copyright law in the context of an economic framework. Reverse engineering fosters the creation of standards which, it is argued, increase societal wealth. The existence of a reverse engineering exception is consistent with the balance between the economic rights of individual authors and societal technological progress, which copyright seeks to maintain. It is demonstrated that copyright exists as the only form of applicable intellectual property protection which can broadly limit the disclosure of concepts underlying computer programs.
It is suggested that an effective exception should be statutorily based. It is felt that the existing fair dealing exception contained in the Canadian Copyright Act is juridically under-developed and too uncertain to provide an effective solution to the reverse engineering problem. A legislative solution would send a clear message to the software industry as well as to the courts, and could prohibit contracting out of the Copyright Act which would potentially be allowed were a judicial solution sought. It is further suggested that the statutory exception should broadly allow the process of reverse engineering as opposed to limiting it to cases where compatibility is sought. Narrowing the exception creates conceptual difficulties in applying limits to reverse engineering. Allowing a broad exception would avoid these difficulties while continuing to provide copyright holders with protection if, after the reverse engineering process is concluded, their protectable expression is used within another's software product.
APA, Harvard, Vancouver, ISO, and other styles
27

Yung, Melody T. 1976. "Revamping EDICS : the Engineering-Design Instructional Computer System." Thesis, Massachusetts Institute of Technology, 2003. http://hdl.handle.net/1721.1/89931.

Full text
APA, Harvard, Vancouver, ISO, and other styles
28

Jacknis, Michael L. 1975. "Introductory educational laboratory experience for computer engineering undergraduates." Thesis, Massachusetts Institute of Technology, 1999. http://hdl.handle.net/1721.1/80643.

Full text
Abstract:
Thesis (M.Eng.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, June 2001.
Includes bibliographical references (leaf 68).
by Michael L. Jacknis.
M.Eng.
APA, Harvard, Vancouver, ISO, and other styles
29

Kargas, Abderrezak. "Computer interpretation of engineering drawings as solid models." Thesis, Aston University, 1988. http://publications.aston.ac.uk/11911/.

Full text
Abstract:
Much of the geometrical data relating to engineering components and assemblies is stored in the form of orthographic views, either on paper or computer files. For various engineering applications, however, it is necessary to describe objects in formal geometric modelling terms. The work reported in this thesis is concerned with the development and implementation of concepts and algorithms for the automatic interpretation of orthographic views as solid models. The various rules and conventions associated with engineering drawings are reviewed and several geometric modelling representations are briefly examined. A review of existing techniques for the automatic, and semi-automatic, interpretation of engineering drawings as solid models is given. A new theoretical approach is then presented and discussed. The author shows how the implementation of such an approach for uniform thickness objects may be extended to more general objects by introducing the concept of 'approximation models'. Means by which the quality of the transformations is monitored are also described. Detailed descriptions of the interpretation algorithms and the software package that were developed for this project are given. The process is then illustrated by a number of practical examples. Finally, the thesis concludes that, using the techniques developed, a substantial percentage of drawings of engineering components could be converted into geometric models with a specific degree of accuracy. This degree is indicative of the suitability of the model for a particular application. Further work on important details is required before a commercially acceptable package is produced.
APA, Harvard, Vancouver, ISO, and other styles
30

Davies, Daniel. "Representation of multiple engineering viewpoints in Computer Aided Design through computer-interpretable descriptive markup." Thesis, University of Bath, 2008. https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.488893.

Full text
Abstract:
The aim of this work was to find a way of representing multiple interpretations of a product design with the same CAD model, in a way that allowed the manual work of producing the viewpoint-specific models of the product to be reduced through automation. The approach presented is the recording of multiple viewpoint-interpretations of a product design with a CAD product model using descriptive, by-reference (stand-off) computer-interpretable markup of the model.
APA, Harvard, Vancouver, ISO, and other styles
31

Alrabghi, Leenah O. "QFD IN SOFTWARE ENGINEERING." Kent State University / OhioLINK, 2014. http://rave.ohiolink.edu/etdc/view?acc_num=kent1385046526.

Full text
APA, Harvard, Vancouver, ISO, and other styles
32

Zardari, Shehnila. "Cloud adoption : a goal-oriented requirements engineering approach." Thesis, University of Birmingham, 2016. http://etheses.bham.ac.uk//id/eprint/6567/.

Full text
Abstract:
The enormous potential of cloud computing for improved and cost-effective service has generated unprecedented interest in its adoption. However, a potential cloud user faces numerous risks regarding service requirements, cost implications of failure and uncertainty about cloud providers’ ability to meet service level agreements. These risks hinder the adoption of cloud computing. We motivate the need for a new requirements engineering methodology for systematically helping businesses and users to adopt cloud services and for mitigating risks in such a transition. The methodology is grounded in goal-oriented approaches for requirements engineering. We argue that Goal-Oriented Requirements Engineering (GORE) is a promising paradigm to adopt, since goals are generic and flexible statements of users’ requirements which can be refined, elaborated, negotiated, mitigated for risks and analysed for economic considerations. The methodology can be used by small- to large-scale organisations to inform crucial decisions related to cloud adoption. We propose a risk management framework based on the principles of GORE. In this approach, we liken risks to obstacles encountered while realising cloud user goals, and therefore propose cloud-specific obstacle resolution tactics for mitigating identified risks. The proposed framework shows benefits by providing a principled engineering approach to cloud adoption and empowering stakeholders with tactics for resolving risks when adopting the cloud. We extend the work on GORE and obstacles for informing the adoption process. We argue that obstacle prioritisation and resolution is core to mitigating risks in the adoption process. We propose a novel systematic method for prioritising obstacles and their resolution tactics using the Analytic Hierarchy Process (AHP). To assess the AHP choice of resolution tactics, we support the method with stability and sensitivity analysis.
APA, Harvard, Vancouver, ISO, and other styles
33

Jones, Sara. "Three-dimensional interactive connection diagrams for knowledge engineering." Thesis, City, University of London, 1993. http://openaccess.city.ac.uk/20156/.

Full text
Abstract:
This thesis describes research into human factors aspects of the use of 3-dimensional node and link diagrams, called Interactive Connection Diagrams (ICDs), in the human-computer interface of tools for knowledge engineering. This research was carried out in two main stages: the first concentrated on perceptual aspects of 3-d ICDs, and the second on more general aspects of their use in realistic situations. A final section looked briefly at the possibility of formally specifying 3-d ICD representations. The main aim of the first stage was to investigate whether users were able to make effective judgements about the relative depths of components in 3-d ICDs. Controlled experiments were carried out to determine the extent to which such judgements were supported by the use of a particular approach to creating the illusion of depth. The results of these experiments showed that users were able to make reasonably effective judgements about the relative depths of components in 3-d ICDs. 3-d ICDs produced using the approach of interest were therefore argued to be suitable for use in the second stage of the study. In the second stage, case studies were used to investigate the utility in more realistic knowledge engineering situations of tools supporting 3-d ICDs, and the usability of depth-related features of a prototype tool which permits 3-d ICDs to be viewed and edited. On the basis of the findings of these studies it is claimed that tools supporting 3-d ICDs will, in some situations, be more useful than those which employ only more conventional 2-d versions. It was found that depth-related features of the prototype tool were usable but should be improved upon in future implementations. The third and final section of work involved a preliminary investigation into the formal specification of the 3-d ICD representations of the kind used in the second set of studies. A scheme for specifying the range of 3-d ICD languages currently supported by the prototype tool was developed, and each of the particular 3-d ICD languages used in the case studies was specified. Implications of the results of this work are discussed and a number of suggestions regarding directions for future work are made. The overall conclusion is that 3-d ICDs have considerable potential as a medium in which to represent knowledge structures for use in knowledge engineering.
APA, Harvard, Vancouver, ISO, and other styles
34

Li, Chin-Hsiang. "Extensions to the attribute grammar form model to model meta software engineering environments /." The Ohio State University, 1985. http://rave.ohiolink.edu/etdc/view?acc_num=osu1487259580261289.

Full text
APA, Harvard, Vancouver, ISO, and other styles
35

Meegoda, Ranjana L. V. "Computer integrated monitoring." Thesis, Aston University, 1987. http://publications.aston.ac.uk/11851/.

Full text
Abstract:
Computer integrated monitoring is a very large area in engineering where on-line, real-time data acquisition with the aid of sensors is the solution to many problems in the manufacturing industry, as opposed to the old method of data logging followed by graphical analysis. The raw data which is collected this way, however, is useless in the absence of a proper computerized management system. The transfer of data between the management and the shop floor processes has been impossible in the past unless all the computers in the system were totally compatible with each other. This limits the efficiency of the systems because they are governed by the limitations of the computers. General Motors of the U.S.A. has recently started research on a new standard called the Manufacturing Automation Protocol (MAP) which is expected to allow data transfer between different types of computers. This is still in early development stages and is also currently very expensive. This research programme shows how such a shop floor data acquisition system and a complete management system on entirely different computers can be integrated together to form a single system by achieving data transfer communications using a cheaper but superior alternative to MAP. Standard communication character sets and hardware such as ASCII and UARTs have been used in this method, but the technique is so powerful that totally incompatible computers are shown to run different programs (in different languages) simultaneously and yet receive data from each other and process it in their own CPUs with no human intervention.
APA, Harvard, Vancouver, ISO, and other styles
36

Waters, Matthew. "Application of software engineering tools and techniques to PLC programming : innovation report." Thesis, University of Warwick, 2009. http://wrap.warwick.ac.uk/36897/.

Full text
Abstract:
The software engineering tools and techniques available for use in traditional information systems industries are far more advanced than in the manufacturing and production industries. Consequently there is a paucity of ladder logic programming support tools. These tools can be used to improve the way in which ladder logic programs are written, to increase the quality and robustness of the code produced and to minimise the risk of software-related downtime. To establish current practice and to ascertain the needs of industry, a literature review and a series of interviews with industrial automation professionals were conducted. Two opportunities for radical improvement were identified: a tool to measure software metrics for code written in ladder logic and a tool to detect cloned code within a ladder program. Software metrics quantify various aspects of code and can be used to assess code quality, measure programmer productivity, identify weak code and develop accurate costing models with respect to code. They are quicker, easier and cheaper than alternative code reviewing strategies such as peer review and allow organisations to make evidence-based decisions with respect to code. Code clones occur because reuse of copied and pasted code increases programmer productivity in the short term, but they make programs artificially large and can spread bugs. Cloned code can be removed with no loss of functionality, dramatically reducing the size of a program. To implement these tools, a compiler front end for ladder logic was first constructed. This included a lexer with 24 lexical modes, 71 macro definitions and 663 token definitions, as well as a parser with 729 grammar rules. The software metrics tool and clone detection tool perform analyses on an abstract syntax tree, the output from the compiler. The tools have been designed to be as user-friendly as possible.
Metrics results are compiled in XML reports that can be imported into spreadsheet applications, and the clone detector generates easily navigable HTML reports for each clone as well as an index file of all clones that contains hyperlinks to all clone reports. Both tools were demonstrated by analysing real factory code from a Jaguar Land Rover body in white line. The metrics tool analysed over 1.5 million lines of ladder logic code contained within 23 files and 8466 routines. The results identified those routines that are abnormally complex in addition to routines that are excessively large. These routines are a likely source of problems in future and action to improve them immediately is recommended. The clone detector analysed 59K lines from a manufacturing cell. The results of this analysis proved that the code could be reduced in volume by 43.9% and found previously undetected bugs. By removing clones for all factory code, the code would be reduced in size by so much that it could run on as much as 25% fewer PLCs, yielding a significant saving on hardware costs alone. De-cloned code is also easier to make modifications to, so this process goes some way towards future-proofing the code.
APA, Harvard, Vancouver, ISO, and other styles
37

Beyh, S. "Computer and communication engineering : internet protocol telephony in construction." Thesis, University of Salford, 2004. http://usir.salford.ac.uk/26582/.

Full text
Abstract:
A construction project traditionally involves intensive communication flows between the site operations (workers, gangers, engineers, foremen, etc.), the site office, the company and the Supply Chain. Typically on the jobsite, a temporary site office is set up in order to conduct the operations of the construction project phases. The site office is equipped with traditional telecommunication means such as phone, fax and Internet connection. The site personnel are provided with a multitude of mobile, satellite and wireless telecommunication devices where appropriate, such as PDA, GSM and satellite phones/fax, and walkie-talkies. Technically, these legacy systems, once put together, could provide adequate communication resources to the construction project teams. But one of the main issues emerging from the use of the above-mentioned traditional telecommunication systems is that their cost can in some cases be very high. On the other hand, failure to provide the necessary communication means available through the traditional telecommunication systems to personnel on the move, for whatever reason, could be very harmful and may negatively affect the execution of the construction works and the project lifecycle as a whole. This situation could be overcome if alternative solutions are put in place to reduce cost and improve communications. Therefore, this study has investigated a new communication paradigm known as IP (Internet Protocol) Telephony, which could possibly provide the site office, as well as the entire project team, with adequate, cheaper and more effective communication means at the jobsite. IP Telephony refers to communication services such as voice, video, facsimile, and/or voice-messaging applications that are transported via the Internet, rather than the Public Switched Telephone Network (PSTN).
The basic steps involved in originating an IP Telephony call are the conversion of the analogue voice signal into digital format and the compression/translation of the signal into IP packets for transmission over the Internet. This communication paradigm eliminates the need for separate infrastructures for voice and data networks as these services can be implemented over a single data infrastructure. Furthermore, while, from the technical point of view IP Telephony Technology could be ready to satisfy the business case in general, its development within the construction sector has not been observed due to several barriers that have been investigated in this work as being part of the development of an integrated framework that aimed at enabling the use of Internet Protocol Telephony in construction. This research aimed at developing a generic integrated framework for enabling the use of Internet Protocol (IP) Telephony in construction. The process involved in the development of this framework included the conduct of intensive literature around the traditional telecommunication systems used by construction firms in the United Kingdom as well as the investigation of the current situation of IP Telephony technology in terms of availability of commercial services and applications used by the construction industry. The field investigations were obtained through appropriate surveys and interviews conducted with construction firms, telecommunication operators and Internet Protocol (IP) Telephony equipment vendors respectively. The research further looked at the issues related to the transfer of such a technology into the construction industry and investigated the main barriers preventing its implementation in construction sites' environments. These investigations represented an important part in the development of the "Internet Protocol Telephony on Construction Sites (IPTCS) Framework" which represents the focus of this research. 
The various modes of communication are described under this common framework, which is expected primarily to benefit the construction industry by driving construction firms to look at IP Telephony technology as an adequate and cost-effective alternative to their communication means for empowering their mobile personnel on construction sites and in the office alike. It could also motivate telecommunication operators, IP Telephony application developers and equipment vendors to establish specific solutions suitable for construction site environments according to the industry's needs and requirements.
APA, Harvard, Vancouver, ISO, and other styles
38

Westwood, Chris. "Computer simulation of diffusional creep failure of engineering alloys." Thesis, University of Surrey, 2001. http://epubs.surrey.ac.uk/843127/.

Full text
Abstract:
A simplified model with only 2 degrees of freedom is developed for cavity growth along a grain-boundary by surface and grain-boundary diffusion following a similar model for a row of grains used by Sun et al, (1996). A variational principle for the coupled diffusion problem is used to follow the cavity growth. The approximate solution can be reduced to the well-established equilibrium cavity growth model at the fast surface diffusion extreme. By comparing the 2 degree of freedom model with the full finite element solution by Pan et al, (1997), a 'Validity Map' is constructed in terms of the relative diffusivity and applied stress relative to the capillarity stress. It is found that the simplified model accurately describes the evolution process, in terms of overall cavity profile and propagation rate for engineering alloys subject to normal levels of applied stresses. The 2 degree of freedom model for a single cavity was then extended to allow the modelling of multiple cavities. These cavities can be either pre-existing or nucleated during the lifetime of the system. The relative rotation between the grains is also considered. The initial 2 degrees of freedom were increased to six, and a cavity element has been derived. The cavity elements are assembled together using the classical finite element approach. This allows the evolution of multiple cavities and their interactions to be modelled under different applied loads and material parameters. This simplified multiple cavity finite element model was compared with a model for cavity evolution based on a 'smeared-out' approach. It was shown that the 'smeared-out' model does not accurately predict the creep damage for realistic engineering materials and conditions and results in an under prediction of creep lifetime. Using the simplified finite element model the effect of surface diffusion on the evolution of the creep damage was investigated. 
The evolution of a large pre-existing 'crack-like' cavity was modelled and the effects of nucleation, surface diffusion and loading were also investigated. It was shown that in the majority of cases as the surface diffusion was increased the rupture time was also increased. The results from the large 'crack-like' cavity simulations showed that there was very little crack propagation through the material and the smaller cavities tended to grow independently of the large 'crack-like' cavity.
APA, Harvard, Vancouver, ISO, and other styles
39

Tanga, Rajan M. "Computer aided software engineering tool for generating C code." Ohio : Ohio University, 1988. http://www.ohiolink.edu/etd/view.cgi?ohiou1182872759.

Full text
APA, Harvard, Vancouver, ISO, and other styles
40

Varsamidis, Thomas. "Object-oriented information modelling for computer-aided control engineering." Thesis, Bangor University, 1998. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.245177.

Full text
APA, Harvard, Vancouver, ISO, and other styles
41

Mills, Paul. "A knowledge based computer system for engineering design quotations." Thesis, University of Wolverhampton, 1989. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.233929.

Full text
Abstract:
The research examines the difficulties relating to the construction of a computer model for the automation of the design and cost estimation practices in engineering. The research has been carried out in an industrial context and the work has included both theoretical and pragmatic issues relating to the modelling of expertise for the production of commercially useful software in the field of combustion system design. The difficulties relating to the capture of ill-defined knowledge and subjective human decision-making with 'traditional' programming languages are examined and a study undertaken regarding the use of Artificial Intelligence techniques. A particular emphasis has been placed on decision-making under uncertainty. The work has resulted in the construction of a prototype expert system that can be used to produce design quotations relating to single-burner gas-fired combustion systems. An important aspect of the decision-making characteristic of engineering design, and particularly of the cost estimating stages of a project, is the manner in which engineers combine both cost and technical data in order to arrive at design solutions on the criterion of 'value for money'. The formal mathematics of probability theory and confirmation theory provide tools for modelling expertise of this kind and the research has developed and examined two parallel systems. The first is based on the use of Bayes' theorem and the second makes use of ideas from both confirmation and fuzzy set theory. The general approach, developed within the research, of combining knowledge types relating to 'fitness for purpose' and 'cost' into ordinal measures of 'value' is fundamental to many areas of decision-making and has many applications. The research also addresses the use of rule-based methods for application domains where the knowledge is continually changing and the expertise of users is variable.
APA, Harvard, Vancouver, ISO, and other styles
42

Hoyer, Markus. "Catalogue based computer aided engineering (CAE) of process models." Thesis, University of South Wales, 2007. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.441212.

Full text
APA, Harvard, Vancouver, ISO, and other styles
43

Sawada, Hiroyuki. "Constraint-based computer support for insightful multidisciplinary engineering design." Thesis, University of Strathclyde, 2001. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.366854.

Full text
APA, Harvard, Vancouver, ISO, and other styles
44

Gong, Zitong. "Calibration of expensive computer models using engineering reliability methods." Thesis, University of Liverpool, 2018. http://livrepository.liverpool.ac.uk/3028587/.

Full text
Abstract:
The prediction ability of complex computer models (also known as simulators) relies on how well they are calibrated to experimental data. History Matching (HM) is a form of model calibration for computationally expensive models. HM sequentially cuts down the input space to find the fitting input domain that provides a reasonable match between model output and experimental data. A considerable number of simulator runs are required for typical model calibration. Hence, HM involves Bayesian emulation to reduce the cost of running the original model. Despite this, the generation of samples from the reduced domain at every iteration has remained an open and complex problem: current research has shown that the fitting input domain can be disconnected, with nontrivial topology, or be orders of magnitude smaller than the original input space. Analogous to a failure set in the context of engineering reliability analysis, this work proposes to use Subset Simulation - a widely used technique in engineering reliability computations and rare event simulation - to generate samples on the reduced input domain. Unlike Direct Monte Carlo, Subset Simulation progressively decomposes a rare event, which has a very small probability of occurrence, into a sequence of less rare nested events. The original Subset Simulation uses a Modified Metropolis algorithm to generate the conditional samples that belong to intermediate less rare events. This work also considers different Markov Chain Monte Carlo algorithms and compares their performance in the context of expensive model calibration. Numerical examples are provided to show the potential of the embedded Subset Simulation sampling schemes for HM. The 'climb-cruise engine matching' case study illustrates that the proposed HM using Subset Simulation can be applied to realistic engineering problems.
Considering further improvements of the proposed method, a classification method is used to ensure that the emulation on each disconnected region gets updated. Uncertainty quantification of expert-estimated correlation matrices helps to identify a mathematically valid (positive semi-definite) correlation matrix between resulting inputs and observations. Further research is required to explicitly address the model discrepancy as well as to take the correlation between model outputs into account.
APA, Harvard, Vancouver, ISO, and other styles
45

Brownbridge, George Peter Edward. "Computer assisted model development applied to chemical engineering systems." Thesis, University of Cambridge, 2015. https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.708972.

Full text
APA, Harvard, Vancouver, ISO, and other styles
46

Chow, Yi-Mei Maria 1974. "Computer-aided engineering methodology for structural optimization and control." Thesis, Massachusetts Institute of Technology, 2000. http://hdl.handle.net/1721.1/80921.

Full text
APA, Harvard, Vancouver, ISO, and other styles
47

Savoie, Troy Brendon. "Human detection of computer simulation mistakes in engineering experiments." Thesis, Massachusetts Institute of Technology, 2010. http://hdl.handle.net/1721.1/61526.

Full text
Abstract:
Thesis (Ph. D.)--Massachusetts Institute of Technology, Dept. of Mechanical Engineering, 2010.
This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections.
Cataloged from student-submitted PDF version of thesis.
Includes bibliographical references (p. 97-104).
This thesis investigates the notion that the more complex the experimental plan, the less likely an engineer is to discover a simulation mistake in a computer-based experiment. The author used an in vitro methodology to conduct an experiment with 54 engineers completing a design task to find the optimal configuration for a device with seven two-level control factors. Participants worked individually using a prescribed design approach dependent upon the randomly assigned experimental condition -- an adaptive one-factor-at-a-time plan for the control group or a resolution III fractional factorial plan for the treatment group -- with a flawed computer simulation of the device. A domain knowledge score was measured by quiz, and success or failure in discovering the flaw was measured by questioning during debriefing. About half (14 of 17) of the participants using the one-factor-at-a-time plan discovered the flaw, while nearly none (1 of 27) using the fractional factorial plan did so. Logistic regression analysis of the dichotomous outcome on treatment condition and domain knowledge score showed that flaw detection ability improved with increased domain knowledge, but that an advantage of two standard deviations in domain knowledge was insufficient to overcome the disadvantage of using the fractional factorial plan. Participant reactions to simulation results were judged by two independent raters for surprise as an indicator of expectation violation. Contingency analysis of the surprise rating results showed that participants using the fractional factorial plan were significantly less likely (risk ratio ~ 0.57) to appear surprised when the anomaly was elicited, but there was no difference in tendency to display surprise otherwise. The observed phenomenon has ramifications beyond simulation mistake detection. 
Cognitive psychologists have shown that the most effective way to learn a new concept is to observe unexpected behavior, investigate the cause, then integrate the new concept into one's mental model. If using a complex experimental plan hinders an engineer's ability to recognize anomalous data, the engineer risks losing opportunities to develop expertise. Initial screening and sensitivity analysis are recommended as countermeasures when using complex experiments, but more study is needed for verification.
by Troy Brendon Savoie.
Ph.D.
APA, Harvard, Vancouver, ISO, and other styles
48

Grebner, Matthew. "A flexible integrated computer system for engineering geology education." Thesis, Massachusetts Institute of Technology, 1989. https://hdl.handle.net/1721.1/130497.

Full text
Abstract:
Thesis: M.S., Massachusetts Institute of Technology, Department of Civil Engineering, 1989
Includes bibliographical references (leaf 56).
by Matthew Grebner.
M.S.
APA, Harvard, Vancouver, ISO, and other styles
49

Pensulo, Emilius M. "Integrating computer aided engineering functions: the management of information." Thesis, Aston University, 1987. http://publications.aston.ac.uk/11853/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
50

Davis, Jonathan J. "Machine learning and feature engineering for computer network security." Thesis, Queensland University of Technology, 2017. https://eprints.qut.edu.au/106914/1/Jonathan_Davis_Thesis.pdf.

Full text
Abstract:
This thesis studies the application of machine learning to the field of Cyber security. Machine learning algorithms promise to enhance Cyber security by identifying malicious activity based only on provided examples. However, a major difficulty is the unsuitability of raw Cyber security data as input. In an attempt to address this problem, this thesis presents a framework for automatically constructing relevant features suitable for machine learning directly from network traffic. We then test the effectiveness of the framework by applying it to three Cyber security problems: HTTP tunnel detection, DNS tunnel detection, and traffic classification.
APA, Harvard, Vancouver, ISO, and other styles