
Dissertations / Theses on the topic 'Concrete construction – Data processing'


Consult the top 50 dissertations / theses for your research on the topic 'Concrete construction – Data processing.'

Next to every source in the list of references there is an 'Add to bibliography' button. Press it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses in a wide variety of disciplines and organise your bibliography correctly.

1

Navarro, Cota Juan Pedro Martin 1963. "DESIGN AND BEHAVIOR OF COMPOSITE SPACE TRUSSES." Thesis, The University of Arizona, 1987. http://hdl.handle.net/10150/276505.

Full text
Abstract:
A fully automated computer program is developed for the optimum design of steel space trusses acting compositely with a concrete slab placed on top. The program sizes the truss members to meet the requirements of the load and resistance factor design specification of the American Institute of Steel Construction using the load combinations of ANSI. Earthquake loading is not considered. The optimum design is based on minimum cost, accounting for both the cost of the members themselves and the welding required at the joints. The total cost is based on all steel work in the truss. Once the truss configuration has been defined, and it has been ensured that linear elastic behavior exists, the structure is analyzed for the construction process to make sure that no overstressing will take place in any structural element at any time during construction and service. The analysis and design principles are presented and an actual design case is solved. (Abstract shortened with permission of author.)
APA, Harvard, Vancouver, ISO, and other styles
2

Diaz, Zarate Gerardo Daniel. "A knowledge-based system for estimating the duration of cast in place concrete activities." FIU Digital Commons, 1992. http://digitalcommons.fiu.edu/etd/2806.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Nezamian, Abolghasem 1968. "Bond strength of concrete plugs embedded in tubular steel piles." Monash University, Dept. of Civil Engineering, 2003. http://arrow.monash.edu.au/hdl/1959.1/5601.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Hon, Alan 1976. "Compressive membrane action in reinforced concrete beam-and-slab bridge decks." Monash University, Dept. of Civil Engineering, 2003. http://arrow.monash.edu.au/hdl/1959.1/5629.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Wei, Xiangmin. "GPR data processing for reinforced concrete bridge decks." Diss., Georgia Institute of Technology, 2014. http://hdl.handle.net/1853/53066.

Full text
Abstract:
In this thesis, several aspects of GPR data processing for RC bridge decks are studied. First, autofocusing techniques are proposed to replace the previous expensive and unreliable human visual inspections during the iterative migration process for the estimation of the velocity/dielectric permittivity distribution from GPR data. Second, F-K filtering with dip relaxation is proposed for interference removal, which is important for both imaging and the performance of post-processing techniques, including the autofocusing techniques and CS-based migration studied in this thesis. The targeted interferences here are direct waves and cross rebar reflections. The introduced dip relaxation accommodates surface roughness and medium inhomogeneity. Third, the newly developed CS-based migration is modified and evaluated on GPR data from RC bridge decks. A more accurate model that accounts for impulse waveform distortion, and thus reduces modeling errors, is proposed. The impact of the selection of the regularization parameter on comparative amplitude preservation and imaging performance is also investigated, and an approach to preserve the comparative amplitude information while still maintaining a clear image is proposed. Moreover, the potential of initially sampling the time-spatial data with uniform sampling rates lower than that required by traditional migration methods is evaluated.
APA, Harvard, Vancouver, ISO, and other styles
6

Du, Toit André Johan. "Preslab - micro-computer analysis and design of prestressed concrete slabs." Master's thesis, University of Cape Town, 1988. http://hdl.handle.net/11427/17057.

Full text
Abstract:
Bibliography: pages 128-132. A micro-computer based package for the analysis and design of prestressed flat slabs is presented. The constant strain triangle and the discrete Kirchhoff plate bending triangle are combined to provide an efficient "shell" element. These triangles are used for the finite element analysis of prestressed flat slabs. An efficient out-of-core solver for sets of linear simultaneous equations is presented. This solver was developed especially for micro-computers. Subroutines for the design of prestressed flat slabs include the principal stresses in the top and bottom fibres of the plate, Wood/Armer moments and untensioned steel areas calculated according to Clark's recommendations. Extensive pre- and post-processing facilities are presented. Several plotting routines were developed to aid the user in understanding the behaviour of the structure under load and prestressing.
APA, Harvard, Vancouver, ISO, and other styles
7

Begum, Rushna. "Neural network processing of impact echo NDT data." Thesis, City University London, 2000. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.340456.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Fang, Yuan, and 方媛. "A cost-based model for optimising the construction logisticsschedules." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2011. http://hub.hku.hk/bib/B46080351.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Lewis, Tony. "Electronic data interchange in the construction industry." Thesis, Loughborough University, 1998. https://dspace.lboro.ac.uk/2134/11183.

Full text
Abstract:
The aim of this research is to improve the efficiency of the construction process through the application of electronic data interchange (EDI). This thesis describes the development and application of EDI messages. The messages described are targeted to provide a means for transferring construction specific information during the construction process. The definition of electronic data interchange and its technical issues are first described. The nature of EDI, replacing paper based communication with electronic messages, impacts on the way in which business is conducted, and also has far reaching legal implications due to the reliance of many legal systems on paper documents and signatures. The business and legal implications are therefore discussed in detail. The application of EDI in the construction industry is investigated by means of a literature review. This work is furthered by a longitudinal study of the construction industry's application of EDI, which consisted of two surveys at a five year interval. A model of the information flows within the traditional construction process is developed to assist in the identification of information flows suitable for EDI. A methodology for message development was produced. The methodology was then applied to develop a description data model that could be utilised in the existing bill of quantity and trading cycle messages. The bill of quantity message set was at a stage ready for trial. To determine the issues related to implementation specifically in the construction industry a trial implementation of this message set was undertaken. The official implementation undertaken by EDICON is described. Software was also developed to undertake the trial. This software was tested and proved the message set developed was suitable for the transfer of bill of quantity related information during a construction project. The factors causing the failure of the implementation of the bill of quantities message set are discussed. A number of these factors are considered valid for all construction project information flows. Finally, the use of shared project models to re-engineer construction information tasks is recommended as a means of achieving significant benefit from electronic data exchange in the construction process.
APA, Harvard, Vancouver, ISO, and other styles
10

Sadri, Saeid Lonbani. "An Integrated information system for building construction projects." Diss., Georgia Institute of Technology, 1993. http://hdl.handle.net/1853/19468.

Full text
APA, Harvard, Vancouver, ISO, and other styles
11

Bernold, Leonhard Emil. "Productivity transients in construction processes." Diss., Georgia Institute of Technology, 1985. http://hdl.handle.net/1853/20980.

Full text
APA, Harvard, Vancouver, ISO, and other styles
12

Zheng, Xinmin, and 鄭新敏. "A fuzzy genetic algorithms (GAs) model for time-cost optimization in construction." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2003. http://hub.hku.hk/bib/B27510839.

Full text
APA, Harvard, Vancouver, ISO, and other styles
13

Ray, Subhasis. "Multi-objective optimization of an interior permanent magnet motor." Thesis, McGill University, 2008. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=116021.

Full text
Abstract:
In recent years, due to growing environmental awareness regarding global warming, green cars, such as hybrid electric vehicles, have gained a lot of importance. With the decreasing cost of rare earth magnets, brushless permanent magnet motors, such as the Interior Permanent Magnet Motor, have found usage as part of the traction drive system in these types of vehicles. As a design issue, building a motor with a performance curve that suits both city and highway driving has been treated in this thesis as a multi-objective problem; matching specific points of the torque-speed curve to the desired performance output. Conventionally, this has been treated as separate problems or as a combination of several individual problems, but doing so gives little information about the trade-offs involved. As a means of identifying the compromising solutions, we have developed a stochastic optimizer for tackling electromagnetic device optimization and have also demonstrated a new innovative way of studying how different design parameters affect performance.
APA, Harvard, Vancouver, ISO, and other styles
14

Park, Man-Woo. "Automated 3D vision-based tracking of construction entities." Diss., Georgia Institute of Technology, 2012. http://hdl.handle.net/1853/45782.

Full text
Abstract:
In construction sites, tracking project-related entities such as construction equipment, materials, and personnel provides useful information for productivity measurement, progress monitoring, on-site safety enhancement, and activity sequence analysis. Radio frequency technologies such as Global Positioning Systems (GPS), Radio Frequency Identification (RFID) and Ultra Wide Band (UWB) are commonly used for this purpose. However, on large-scale congested sites, deploying, maintaining and removing such systems can be costly and time-consuming because radio frequency technologies require tagging each entity to be tracked. In addition, privacy issues can arise from tagging construction workers, which often limits the usability of these technologies on construction sites. A vision-based approach that can track moving objects in camera views can resolve these problems. The purpose of this research is to investigate a vision-based tracking system that holds promise to overcome the limitations of existing radio frequency technologies for large-scale, congested sites. The proposed method uses videos from static cameras. A stereo camera system is employed for tracking construction entities in 3D. Once the cameras are fixed on the site, intrinsic and extrinsic camera parameters are determined through camera calibration. The method automatically detects and tracks objects of interest such as workers and equipment in each camera view, which generates 2D pixel coordinates of tracked objects. The 2D pixel coordinates are converted to 3D real-world coordinates based on the calibration. The method proposed in this research was implemented in the .NET Framework 4.0 environment and tested on real videos of construction sites. The test results indicated that the method could locate construction entities with accuracy comparable to GPS.
APA, Harvard, Vancouver, ISO, and other styles
15

Choi, Ming-hang Edmund, and 蔡銘鏗. "Evaluation of the cost estimating systems." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2001. http://hub.hku.hk/bib/B31251651.

Full text
APA, Harvard, Vancouver, ISO, and other styles
16

Bharat, Krishna A. "Supporting the construction of distributed, interoperative, user interface applications." Diss., Georgia Institute of Technology, 1996. http://hdl.handle.net/1853/8135.

Full text
APA, Harvard, Vancouver, ISO, and other styles
17

Williams, Trefor P. "Knowledge-based productivity analysis of construction operations." Diss., Georgia Institute of Technology, 1987. http://hdl.handle.net/1853/20195.

Full text
APA, Harvard, Vancouver, ISO, and other styles
18

Qin, Jing, and 覃静. "Application of bioinformatics on gene regulation studies and regulatory network construction with omics data." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2013. http://hdl.handle.net/10722/205684.

Full text
Abstract:
Gene expression is a multi-step process that involves various regulators. From whole genome sequences to the complex gene regulatory system, high-throughput technologies have generated a large amount of omics data, but information on such a large scale is hard to interpret manually. Bioinformatics can help to process this huge body of biological information and infer biological insights using the merits of mathematics, statistics and computational techniques. In this study, we applied various bioinformatic techniques to gene regulation in several aspects. Multiple primary transcripts of a gene can be initiated at different promoters, termed alternative promoters (APs). Most human genes have multiple APs. However, whether the usage of APs is independent or not is still controversial. In this study, we analyze the roles of APs in gene regulation using various bioinformatics approaches. Chromosomal interactions between APs are found to be more frequent than interactions between different genes. By comparing the APs at the two ends of genes, we find that they are significantly different in terms of sequence content, conservation and motif frequency. The position and distance of two APs are important for their combined effects, which proves that their regulation is not independent and that one AP can affect the transcription of the other. With the aim of understanding the multi-level gene regulatory system in various biological processes, a mass of high-throughput omics data has been generated. However, each omics technology, measuring the molecular abundance or behavior at a single level, has a limited ability to depict the multi-level system. Integrating omics data can effectively capture the multi-level gene regulatory system and reduce false positives. In this study, two web servers, ChIP-Array and ProteoMirExpress, have been built to construct transcriptional and post-transcriptional regulatory networks by integrating omics data. ChIP-Array is a web server for biologists to construct a TF-centered network for their own data. A network library is further constructed by ChIP-Array from publicly available data. Given a series of mRNA expression profiles in a biological process, master regulators can be identified by matching the profiles with the networks in the library. To explore gene regulatory networks controlled by multiple TFs, least absolute shrinkage and selection operator (LASSO)-type regularization models are applied to multiple integrative data. Gold-standard-based evaluations demonstrate that the L0 and L1/2 regularization models are efficient and applicable to gene regulatory network inference in large genomes with a small number of samples. ProteoMirExpress integrates transcriptomic and proteomic data to infer miRNA-centered networks. It successfully infers the perturbed miRNA and those that co-express with it. The resulting network reports miRNA targets with uncorrelated mRNA and protein levels, which are usually ignored by tools that consider only mRNA abundance, even though some of them may be important downstream regulators. In summary, in this study we analyze gene regulation at multiple levels and develop several tools for gene network construction and regulator analysis with multiple omics data. These tools help researchers efficiently process high-throughput raw data and draw biological hypotheses and interpretations.
APA, Harvard, Vancouver, ISO, and other styles
19

Abounia, Omran Behzad. "Application of Data Mining and Big Data Analytics in the Construction Industry." The Ohio State University, 2016. http://rave.ohiolink.edu/etdc/view?acc_num=osu148069742849934.

Full text
APA, Harvard, Vancouver, ISO, and other styles
20

Appau, Kwaku Addae. "An integrated model of computer-aided cost estimating/scheduling in construction management." Diss., Georgia Institute of Technology, 1994. http://hdl.handle.net/1853/23130.

Full text
APA, Harvard, Vancouver, ISO, and other styles
21

Marshall, Dana T. "The exploitation of image construction data and temporal/image coherence in ray traced animation /." Full text (PDF) from UMI/Dissertation Abstracts International, 2001. http://wwwlib.umi.com/cr/utexas/fullcit?p3008386.

Full text
APA, Harvard, Vancouver, ISO, and other styles
22

Williams, Rhys E. "Modelling issues in repetitive construction and an approach to schedule updating." Thesis, University of British Columbia, 1985. http://hdl.handle.net/2429/25145.

Full text
Abstract:
Planning and control of time and other resources are crucial to the construction of large projects. Yet current computerized techniques are unable to model the work patterns by which construction personnel plan a project. Furthermore, these methods are not capable of reflecting the day-to-day changes which must be monitored to control the construction site. The purpose of this thesis is to promote the usability of computerized planning and scheduling by developing the heuristic manner in which construction personnel perceive the project. Site studies held in cooperation with Poole Construction Limited and Foundation Company of Canada were performed using a computer scheduling system at the University of British Columbia which contained a prototype model of repetitive work. They provided insight into the process of repetition and rhythm by which projects are planned and into the requirements of the updating process necessary to monitor, and hence control, the project. Two models evolved. The definition of the general repetitive structure was formulated to provide construction personnel with a tool with which to model the process of repetition. The definition of an updating process capable of monitoring daily progress on a construction site was also formulated. Work performed with these models has shown them to be realistic in their approach to construction management.
APA, Harvard, Vancouver, ISO, and other styles
23

Almaghrawi, Ahmed Almaamoun. "Collaborative design in electromagnetics." Thesis, McGill University, 2007. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=103363.

Full text
Abstract:
We present a system architecture and a set of control techniques that allow heterogeneous software design tools to collaborate intelligently and automatically. One of their distinguishing features is the ability to perform concurrent processing. Systems based on this architecture are able to effectively solve large electromagnetic analysis problems, particularly those that involve loose coupling between several areas of physics. The architecture can accept any existing software analysis tool, without requiring any modification or customization of the tool. This characteristic is produced in part by our use of a neutral virtual representation for storing problem data, including geometry and material definitions. We construct a system based on this architecture, using several circuit and finite-element analysis tools, and use it to perform electromagnetic analyses of several different devices. Our results show that our architecture and techniques do allow practical problems to be solved effectively by heterogeneous tools.
APA, Harvard, Vancouver, ISO, and other styles
24

吳蓬輝 and Fung Fai Ng. "A knowledge analysis model for knowledge engineering in the construction industry." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 1990. http://hub.hku.hk/bib/B31232358.

Full text
APA, Harvard, Vancouver, ISO, and other styles
25

Leung, Wing Pan. "Visual cryptography for color images : formal security analysis and new construction /." access full-text access abstract and table of contents, 2009. http://libweb.cityu.edu.hk/cgi-bin/ezdb/thesis.pl?mphil-cs-b23759100f.pdf.

Full text
Abstract:
Thesis (M.Phil.)--City University of Hong Kong, 2009. "Submitted to Department of Computer Science in partial fulfillment of the requirements for the degree of Master of Philosophy." Includes bibliographical references (leaves 103-108).
APA, Harvard, Vancouver, ISO, and other styles
26

Ozerkan, Nesibe Gozde. "Evaluation Of Air Void Parameters Of Fly Ash Incorporated Self Consolidating Concrete By Image Processing." Phd thesis, METU, 2009. http://etd.lib.metu.edu.tr/upload/3/12611195/index.pdf.

Full text
Abstract:
Self consolidating concrete (SCC) is defined as an innovative concrete that does not require vibration for placing and compaction; it is able to flow under its own weight, completely filling formwork and achieving full compaction. Although a significant amount of research has been carried out regarding the fresh properties, mix design, placing methods and strength of various SCC mixes, only a very limited amount of work has been done to assess the durability performance of SCC. Concretes in cold climates are subjected to freeze-thaw cycles, which are one of the major durability problems, and if the concrete is in a saturated or nearly saturated condition, those cycles lead to expansion of the water in the capillary pores of the concrete, causing great internal stresses. For a durable concrete subjected to freeze-thaw cycles, an adequate air void system is obtained by using air-entraining admixtures. The performance of the air void system is characterized by air void parameters that are determined using microscopical examination of the concrete microstructure. In this thesis a software tool, based on image analysis of the concrete surface, is developed to evaluate the air void parameters of concrete using both American and European standards. An experimental program is then conducted to evaluate the effect of freezing-thawing on self consolidating concretes that contain different percentages of fly ash (FA) and air entraining agents. For this purpose, a total of ten self consolidating concrete mixtures containing four different contents of fly ash and three different levels of air entrainment were prepared. During the casting operation, the workability properties of the SCCs were observed through slump flow time and diameter, air content, V-funnel flow time, L-box height ratio, and segregation ratio. Hardened properties were evaluated by compressive strength, permeability tests (water absorption, sorptivity and rapid chloride permeability), freezing-thawing, resonant frequency, and ultrasonic pulse velocity tests. The developed tool was used to characterize and evaluate the effects of the air void parameters of SCC on its resistance to freeze-thaw cycles. At the end of this experimental investigation, it was concluded that the addition of air entraining agent increased the flowability and that an increase in the fly ash content decreased the effect of the air entraining agent. On the other hand, during image processing it was observed that the surface preparation procedures have a crucial effect on processing quality. Moreover, the spacing factor, the most important air void characteristic used to determine freeze-thaw resistance, should not be restricted to 0.2 mm for SCC, since SCCs with spacing factors smaller than 0.4 mm could still exhibit good freeze-thaw resistance.
APA, Harvard, Vancouver, ISO, and other styles
27

Medek, Liza. "User participation in the housing design process through the use of computers : home builders' response." Thesis, McGill University, 1994. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=26242.

Full text
Abstract:
As a result of the diversity of home buyers within the current housing market, houses are increasingly designed with little user input into the overall design process. Home builders require the development of design tools which will accommodate user needs within specific housing markets. This thesis investigates the reaction of builders to the participation of home buyers in the design process through the use of computer-aided design (CAD). A review of the role of computers within the homebuilding industry is provided, including an historical overview of the use of CAD. The state of the art in CAD software applications is explored, with an assessment made of three low-cost software programs. Following a description of the existing design process in the homebuilding industry, a proposed system of user participation is outlined. A survey is taken of six builders in the Montreal and Ottawa regions to determine industry response to home buyer use of CAD as a design tool. An analysis of the survey results reveals that although the builders are not currently participants in such a process, they are tentatively positive in their receptiveness to the idea of user-related CAD. The builders express many reservations concerning the available CAD systems, and they declare little interest in adopting the process as it presently exists. The relevant computer programs are found wanting, revealing a need for further development of both the software applications and the procedure for implementing CAD at the user participation level. Suggestions are offered for improvements in the process to the benefit of builder, designer, and end user.
APA, Harvard, Vancouver, ISO, and other styles
28

Farah, Toni E. "Review of current estimating capabilities of the 3d building information model software to support design for production/construction." Link to electronic thesis, 2005. http://www.wpi.edu/Pubs/ETD/Available/etd-082305-165125/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
29

Hon, Wing-kai. "On the construction and application of compressed text indexes." Click to view the E-thesis via HKUTO, 2004. http://sunzi.lib.hku.hk/hkuto/record/B31059739.

Full text
APA, Harvard, Vancouver, ISO, and other styles
30

Hon, Wing-kai, and 韓永楷. "On the construction and application of compressed text indexes." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2004. http://hub.hku.hk/bib/B31059739.

Full text
APA, Harvard, Vancouver, ISO, and other styles
31

Rashidi, Abbas. "Evaluating the performance of machine-learning techniques for recognizing construction materials in digital images." Thesis, Georgia Institute of Technology, 2013. http://hdl.handle.net/1853/49122.

Full text
Abstract:
Digital images acquired at construction sites contain valuable information useful for various applications including as-built documentation of building elements, effective progress monitoring, structural damage assessment, and quality control of construction material. As a result, there is an increasing need for effective methods to recognize different building materials in digital images and videos. Pattern recognition is a mature field within the area of image processing; however, its application in the area of civil engineering and building construction is only recent. In order to develop any robust image recognition method, it is necessary to choose the optimal machine learning algorithm. To generate a robust color model for building material detection in an outdoor construction environment, a comparative analysis of three generative and discriminative machine learning algorithms, namely, multilayer perceptron (MLP), radial basis function (RBF), and support vector machines (SVMs), is conducted. The main focus of this study is on three classes of building materials: concrete, plywood, and brick. For training purposes, a large data set including hundreds of images is collected. The comparison study is conducted by implementing the necessary algorithms in MATLAB and testing over hundreds of construction-site images. To evaluate the performance of each technique, the results are compared with a manual classification of building materials. In order to better assess the performance of each technique, experiments are conducted by taking pictures under various realistic jobsite conditions, e.g., different ranges of image resolutions, different distances of the camera from the object, and different types of cameras.
APA, Harvard, Vancouver, ISO, and other styles
32

Jaradat, Ward. "On the construction of decentralised service-oriented orchestration systems." Thesis, University of St Andrews, 2016. http://hdl.handle.net/10023/8036.

Full text
Abstract:
Modern science relies on workflow technology to capture, process, and analyse data obtained from scientific instruments. Scientific workflows are precise descriptions of experiments in which multiple computational tasks are coordinated based on the dataflows between them. Orchestrating scientific workflows presents a significant research challenge: they are typically executed in a manner such that all data pass through a centralised computer server known as the engine, which causes unnecessary network traffic that leads to a performance bottleneck. These workflows are commonly composed of services that perform computation over geographically distributed resources, and involve the management of dataflows between them. Centralised orchestration is clearly not a scalable approach for coordinating services dispersed across distant geographical locations. This thesis presents a scalable decentralised service-oriented orchestration system that relies on a high-level data coordination language for the specification and execution of workflows. This system's architecture consists of distributed engines, each of which is responsible for executing part of the overall workflow. It exploits parallelism in the workflow by decomposing it into smaller sub-workflows, and determines the most appropriate engines to execute them using computation placement analysis. This permits the workflow logic to be distributed closer to the services providing the data for execution, which reduces the overall data transfer in the workflow and improves its execution time. This thesis provides an evaluation of the presented system which concludes that decentralised orchestration provides scalability benefits over centralised orchestration, and improves the overall performance of executing a service-oriented workflow.
APA, Harvard, Vancouver, ISO, and other styles
33

De, Brandt T. "Development of an intelligent printer sharer." Thesis, Cape Technikon, 1993. http://hdl.handle.net/20.500.11838/1134.

Full text
Abstract:
Thesis (M.Diploma in Technology)--Cape Technikon, Cape Town, 1993. This thesis describes the design, development and implementation of an intelligent printer sharer, capable of servicing ten personal computers and two printers.
APA, Harvard, Vancouver, ISO, and other styles
34

Arditi, Rocha Luis M. "Intelligent retrieval system for conditions of contract documents in construction." FIU Digital Commons, 1992. http://digitalcommons.fiu.edu/etd/1304.

Full text
Abstract:
The outcome of this research is an Intelligent Retrieval System for Conditions of Contract Documents. The objective of the research is to improve the method of retrieving data from a computer version of a construction Conditions of Contract document. SmartDoc, a prototype computer system, has been developed for this purpose. The system provides recommendations to aid the user in the process of retrieving clauses from the construction Conditions of Contract document. The prototype system integrates two computer technologies: hypermedia and expert systems. Hypermedia is utilized to provide a dynamic way of retrieving data from the document. Expert systems technology is utilized to build a set of rules that activate the recommendations to aid the user during the process of retrieval of clauses. The rules are based on expert knowledge. The prototype system helps the user retrieve related clauses that are not explicitly cross-referenced but, according to expert experience, are relevant to the topic that the user is interested in.
APA, Harvard, Vancouver, ISO, and other styles
35

鄒國棠 and Kwok-tong Chau. "Computer-aided modelling and design of switching DC-DC converters." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 1993. http://hub.hku.hk/bib/B3123298X.

Full text
APA, Harvard, Vancouver, ISO, and other styles
36

Matson, Darryl Douglas. "The design of steel structures : a second-order approach." Thesis, University of British Columbia, 1989. http://hdl.handle.net/2429/29756.

Full text
Abstract:
The widespread use of limit states design procedures in both the Canadian and American steel design codes has created a need for a better understanding of how structures behave. Current design practice, however, allows and often encourages engineers to use an approximate linear analysis to determine the member forces in a structure. This is then followed by an even more approximate amplification of forces through the use of several design equations. It is believed that this practice is no longer acceptable as more accurate second-order computer programs have become a very practical alternative. With this as motivation, this thesis will provide a comparison between a second-order computer program available at the University of British Columbia called ULA (Ultimate Load Analysis) and the Canadian and American building code designs, CAN3-S16.1-M84 and LRFD 1986 respectively. It was felt that ULA should be verified, even though the theory it is based on is well established. Thus, ULA was used to generate a load versus L/r curve for a pin-ended column (with the parameters modified slightly to allow direct comparison with the curves available in the codes). ULA was then used to predict load-deflection curves for two existing test frames. The resulting curves compared well with the test data. To ensure simplicity, the building codes make several approximations in the derivation of their design equations. This results in the equations being applicable to a very narrow range of structures. Specifically, the equations apply to rigidly connected frames in which all of the columns reach their critical buckling load simultaneously. Consequently, the results from ULA were compared to the codes for structures of this type. It was found that the codes were conservative for these structures in relation to the results from ULA, yet the amount of conservatism varied greatly between structures. That is, the codes are not consistent in how conservative they are. Results from ULA were then compared to the codes for structures that do not satisfy all of the code limitations. Although using the codes to design structures beyond the limit of applicability is not a recommended practice, engineers do use the codes to design all types of structures, with little appreciation for the applicability limits. Consequently, it was deemed appropriate to extend this study to such structures. Though only a few were investigated, it was found that the codes were unreliable, being highly conservative, very accurate, or in one case highly unconservative when compared to the results from ULA.
APA, Harvard, Vancouver, ISO, and other styles
37

Huang, Wanjun. "Temporary binding for dynamic middleware construction and web services composition." Phd thesis, [S.l.] : [s.n.], 2006. http://deposit.ddb.de/cgi-bin/dokserv?idn=980539242.

Full text
APA, Harvard, Vancouver, ISO, and other styles
38

Wegner, Alexander. "The construction of finite soluble factor groups of finitely presented groups and its application." Thesis, University of St Andrews, 1992. http://hdl.handle.net/10023/12600.

Full text
Abstract:
Computational group theory deals with the design, analysis and computer implementation of algorithms for solving computational problems involving groups, and with the applications of the programs produced to interesting questions in group theory, in other branches of mathematics, and in other areas of science. This thesis describes an implementation of a proposal for a Soluble Quotient Algorithm, i.e. a description of the algorithms used and a report on the findings of an empirical study of the behaviour of the programs, and gives an account of an application of the programs. The programs were used for the construction of soluble groups with interesting properties, e.g. for the construction of soluble groups of large derived length which seem to be candidates for groups having efficient presentations. New finite soluble groups of derived length six with trivial Schur multiplier and efficient presentations are described. The methods for finding efficient presentations proved to be only practicable for groups of moderate order. Therefore, for a given derived length soluble groups of small order are of interest. The minimal soluble groups of derived length less than or equal to six are classified.
APA, Harvard, Vancouver, ISO, and other styles
39

Bruneau, Phillippe Roger Paul, and Backstrom T. W. Von. "The design of a single rotor axial flow fan for a cooling tower application." Thesis, Stellenbosch : University of Stellenbosch, 1994. http://hdl.handle.net/10019.1/15528.

Full text
Abstract:
Thesis (MEng (Mechanical Engineering))--University of Stellenbosch, 1994. A design methodology for low pressure rise, rotor only, ducted axial flow fans is formulated, implemented and validated using the operating point specifications of a 1/6th scale model fan as a reference. Two experimental fans are designed by means of the design procedure and tested in accordance with British Standards 848, Type A. The design procedure makes use of the simple radial equilibrium equations, embodied in a suite of computer programs. The experimental fans have the same hub-tip ratio and vortex distribution, but differ in the profile section used. The first design utilises the well known Clark-Y aerofoil profile whilst the second takes advantage of the high lift characteristics of the more modern NASA LS series. The characteristics of the two designs are measured over the entire operating envelope and compared to the reference fan, from which the utility and accuracy of the design procedure is assessed. The performance of the experimental fans compares well with both the reference fan and the design intent.
APA, Harvard, Vancouver, ISO, and other styles
40

Kelly, John H. "Rule-based fuselage and spine and cross-section methods for computer aided design of aircraft components." Thesis, This resource online, 1993. http://scholar.lib.vt.edu/theses/available/etd-06232009-063138/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
41

Jayaram, Uma. "Extracting dimensional geometric parameters from B-spline surface models." Diss., Virginia Tech, 1991. http://hdl.handle.net/10919/37877.

Full text
Abstract:
In an integrated design environment, the common thread between the different design stages is usually the geometric model of the part. However, the requirements for the geometric definition of the design are usually different for each stage. The transformation of data between these different stages is essential for the success of the integrated design environment. For example, conceptual design systems usually deal with geometric dimensional parameters (e.g. length, radius, etc.) whereas preliminary design systems frequently require the geometry definition to be in the form of surface models. This dissertation presents the necessity and scope of creating and implementing methodologies to obtain dimensional geometric parameters from the surface description of an object. Since the study of geometric modeling and parametric surfaces is a new field, few classical methods are applicable. Methods and algorithms for the extraction of various geometry parameters are created. A few methods to pre-process and manipulate these surfaces before the parameter extraction methods can be applied are outlined. One of the most important applications of parameter extraction is in the field of aircraft design. There are two important aspects of geometry data conversion in the design cycle. The first is the conversion from conceptual CAD models to CFD compatible models. The second is the conversion from surface representations of CFD models to obtain component parameters (e.g. wing span, fuselage fineness ratio, moments of inertia, etc.). The methods created in this dissertation are used to extract geometric parameters of importance in aircraft design. This enables the design cycle to be complete and promotes integrated design. These methods have been implemented in the aircraft design software, ACSYNT. Examples of the conversion of data from B-spline surface models to dimensional geometric parameters using these methods are included. The emphasis of this dissertation is on non-uniform B-spline surfaces. Methods for obtaining geometric parameters from aircraft models described by characteristic points are also considered briefly.
APA, Harvard, Vancouver, ISO, and other styles
42

Hagerty, David Joseph. "Designing and Simulating a Multistage Sampling Rate Conversion System Using a Set of PC Programs." PDXScholar, 1993. https://pdxscholar.library.pdx.edu/open_access_etds/4697.

Full text
Abstract:
The thesis covers a series of PC programs that we have written that will enable users to easily design FIR linear phase lowpass digital filters and multistage sampling rate conversion systems. The first program is a rewrite of the McClellan-Parks computer program with some slight modifications. The second program uses an algorithm proposed by Rabiner that determines the length of a lowpass digital filter. Rabiner used a formula proposed by Herrmann et al. to initially estimate the filter length in his algorithm. The formula, however, assumes unity gain. We present a modification to the formula so that the gain of the filter is normalized to accommodate filters that have a gain greater than one (as in the case of a lowpass filter used in an interpolator). We have also changed the input specifications from digital to analog. Thus, the user supplies the sampling rate, passband frequency, stopband frequency, gain, and the respective maximum band errors. The program converts the specifications to digital. Then, the program iteratively estimates the filter length and interacts with the McClellan-Parks program to determine the actual filter length that minimizes the maximum band errors. Once the actual length is known, the filter is designed and the filter coefficients may be saved to a file. Another new finding that we present is the condition that determines when to add a lowpass filter to a multistage decimator in order to reduce the total number of filter taps required to implement the system. In a typical example, we achieved a 34% reduction in the total required number of filter taps. The third program is a new program that optimizes the design of a multistage sampling rate conversion system based upon the sum of weighted computational rates and storage requirements. It determines the optimum number of stages and the corresponding upsampling and downsampling factors of each stage of the design. It also determines the length of the required lowpass digital filters using the second program. Quantization of the filter coefficients may have a significant impact on the frequency response. Consequently, we have included a routine within our program that determines the effects of such quantization on the allowable error margins within the passband and stopband. Once the filter coefficients are calculated, they can be saved to files and used in an appropriate implementation. The only requirements of the user are the initial sampling rate, final sampling rate, passband frequency, stopband frequency, corresponding maximum errors for each band, and the weighting factors to determine the optimization factor. We also present another new program that implements a sampling rate conversion from CD (44.1 kHz) to DAT (48 kHz) for digital audio. Using the third program to design the filter coefficients, the fourth program converts an input sequence (either samples of a sine wave or a unit sample sequence) sampled at the lower rate to an output sequence sampled at the higher rate. The frequency response is then plotted and the output block may be saved to a file.
APA, Harvard, Vancouver, ISO, and other styles
43

Au, Siu-man Michael, and 區兆文. "Construction IT Centre." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2000. http://hub.hku.hk/bib/B31984988.

Full text
APA, Harvard, Vancouver, ISO, and other styles
44

Seth, Ernest L. "FASPEC, a program to determine group constants for up to 47 groups in a fast neutron spectrum." Thesis, Virginia Tech, 1985. http://hdl.handle.net/10919/45665.

Full text
Abstract:
In reactor core design, a gap exists between the manual calculation of few-group constants and the many-group calculation by large computer programs. A method is needed by which group constants may be calculated easily and quickly. The FASPEC program is designed to reduce the amount of manual calculation and to complement the large program by reducing the number of times the large program must be run to achieve desired results. The program calculates group constants from 940 microgroups, collapsing to any user-specified number of macrogroups up to 47. FASPEC is based on group-averaged flux calculations by a solution of the infinite medium neutron transport equation. Flux contributions from inelastic scatter are included while those from neutron up-scatter are not. The energy spectrum considered is from 10 MeV to 0.625 eV. Required input is the atomic number density of each isotope, the number of macrogroups desired and the upper and lower microgroup numbers of each macrogroup. Input is facilitated by prompting in each case. Cross section look-up tables were provided by the Very Improved Monte Carlo code (VIM) for a mid-range infinite hexagonal lattice. Self-shielding effects are included indirectly. A brief user's guide is provided. Group constants calculated and stored for either terminal display or printed output are the group number, lowest energy of the group, macroscopic removal cross section, macroscopic absorption cross section, diffusion coefficient, flux, macroscopic fission cross section, ν (the average number of neutrons emitted per fission), and νΣf.
APA, Harvard, Vancouver, ISO, and other styles
45

Hagerty, David Joseph. "Designing and Simulating a Multistage Sampling Rate Conversion System Using a Set of PC Programs." PDXScholar, 1993. https://pdxscholar.library.pdx.edu/open_access_etds/4762.

Full text
Abstract:
The thesis covers a series of PC programs that we have written that will enable users to easily design FIR linear phase lowpass digital filters and multistage sampling rate conversion systems. The first program is a rewrite of the McClellan-Parks computer program with some slight modifications. The second program uses an algorithm proposed by Rabiner that determines the length of a lowpass digital filter. Rabiner used a formula proposed by Herrmann et al. to initially estimate the filter length in his algorithm. The formula, however, assumes unity gain. We present a modification to the formula so that the gain of the filter is normalized to accommodate filters that have a gain greater than one (as in the case of a lowpass filter used in an interpolator). We have also changed the input specifications from digital to analog. Thus, the user supplies the sampling rate, passband frequency, stopband frequency, gain, and the respective maximum band errors. The program converts the specifications to digital. Then, the program iteratively estimates the filter length and interacts with the McClellan-Parks program to determine the actual filter length that minimizes the maximum band errors. Once the actual length is known, the filter is designed and the filter coefficients may be saved to a file. Another new finding that we present is the condition that determines when to add a lowpass filter to a multistage decimator in order to reduce the total number of filter taps required to implement the system. In a typical example, we achieved a 34% reduction in the total required number of filter taps. The third program is a new program that optimizes the design of a multistage sampling rate conversion system based upon the sum of weighted computational rates and storage requirements. It determines the optimum number of stages and the corresponding upsampling and downsampling factors of each stage of the design. It also determines the length of the required lowpass digital filters using the second program. Quantization of the filter coefficients may have a significant impact on the frequency response. Consequently, we have included a routine within our program that determines the effects of such quantization on the allowable error margins within the passband and stopband. Once the filter coefficients are calculated, they can be saved to files and used in an appropriate implementation. The only requirements of the user are the initial sampling rate, final sampling rate, passband frequency, stopband frequency, corresponding maximum errors for each band, and the weighting factors to determine the optimization factor. We also present another new program that implements a sampling rate conversion from CD (44.1 kHz) to DAT (48 kHz) for digital audio. Using the third program to design the filter coefficients, the fourth program converts an input sequence (either samples of a sine wave or a unit sample sequence) sampled at the lower rate to an output sequence sampled at the higher rate. The frequency response is then plotted and the output block may be saved to a file.
APA, Harvard, Vancouver, ISO, and other styles
46

馮潤開 and Yun-hoi Fung. "Linguistic fuzzy-logic control of autonomous vehicles." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 1998. http://hub.hku.hk/bib/B29812690.

Full text
APA, Harvard, Vancouver, ISO, and other styles
47

劉心雄 and Sum-hung Lau. "Adaptive FEM preprocessing for electro magnetic field analysis of electric machines." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 1995. http://hub.hku.hk/bib/B31212451.

Full text
APA, Harvard, Vancouver, ISO, and other styles
48

Shrinivas, Gorur N. "Three-dimensional design methods for turbomachinery applications." Thesis, University of Oxford, 1996. http://ora.ox.ac.uk/objects/uuid:8ace58b5-e251-491e-9753-ae8b236d6c3b.

Full text
Abstract:
This thesis studies the application of sensitivity analysis and optimization methods to the design of turbomachinery components. Basic design issues and a survey of current design trends are presented. The redesign of outlet guide vanes (OGV's) in an aircraft high bypass turbofan engine is attempted. The redesign is necessitated by the interaction of the pylon induced static pressure field with the OGV's and the fan, leading to reduced OGV efficiency and shortened fan life. The concept of cyclically varying camber is used to redesign the OGV row to achieve suppression of the downstream disturbance in the domain upstream of the OGV row. The redesign is performed using (a) a linear perturbation CFD analysis and (b) a minimisation of the pressure mismatch integral by using a Newton method. In method (a) the sensitivity of the upstream flow field to changes in blade geometry is acquired from the linear perturbation CFD analysis, while in method (b) it is calculated by perturbing the blade geometry and differencing the resulting flow fields. Method (a) leads to a reduction in the pylon induced pressure variation at the fan by more than 70% while method (b) achieves up to 86%. An OGV row with only 3 different blade shapes is designed using the above method and is found to suppress the pressure perturbation by more than 73%. Results from these calculations are presented and discussed. The quasi-Newton design method is also used to redesign a three dimensional OGV row and achieves considerable reduction of upstream pressure variation. A concluding discussion summarises the experiences and suggests possible avenues for further work.
APA, Harvard, Vancouver, ISO, and other styles
49

Tomoi, Masatoshi. "Reliability-based design for Japanese timber structures using Canadian S-P-F dimension lumber." Thesis, University of British Columbia, 1991. http://hdl.handle.net/2429/30106.

Full text
Abstract:
Reliability levels of Japanese 2x4 wood frame structures were evaluated using lumber property data derived from the evaluation of Canadian Spruce-Pine-Fir dimension lumber. The evaluations were made using the "Standard for Limit States Design of Steel Structures (Draft)", newly published by the LRFD Subcommittee of the Architectural Institute of Japan, and In-Grade Data obtained by a Canadian Wood Council research project. These analyses were implemented using the computer program "RELAN", developed by Dr. R.O. Foschi at UBC, and Monte Carlo simulations. Reliability levels of current Japanese 2x4 wood frame structures were also evaluated. Recommendations were made to encourage the application of limit states design in existing Japanese design methods.
APA, Harvard, Vancouver, ISO, and other styles
50

Rouhana, Khalil G. "Neural networks applications in estimating construction costs." Thesis, This resource online, 1994. http://scholar.lib.vt.edu/theses/available/etd-12302008-063358/.

Full text
APA, Harvard, Vancouver, ISO, and other styles