Dissertations / Theses on the topic 'Model builder'

Consult the top 50 dissertations / theses for your research on the topic 'Model builder.'


1

Lin, Chia-Yang. "Conceptual model builder." CSUSB ScholarWorks, 2004. https://scholarworks.lib.csusb.edu/etd-project/2708.

Full text
Abstract:
Whenever a new database system is designed, an Entity-Relationship (ER) diagram is needed to present its structure. A graphically well-arranged ER diagram makes it easy to understand the entities, attributes, domains, primary keys, foreign keys, constraints, and relationships inside a database. This data-modeling tool is an ideal choice for companies and developers.
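The database elements this abstract enumerates (entities, attributes, primary and foreign keys, constraints, relationships) map directly onto DDL. A minimal sketch using Python's built-in sqlite3; the table names and columns are hypothetical, not taken from the thesis:

```python
import sqlite3

# Hypothetical two-entity model: each Order references a Customer.
# Primary keys, a foreign key, and a NOT NULL constraint mirror the
# elements an ER diagram makes visible.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")
conn.execute("""
    CREATE TABLE customer (
        id   INTEGER PRIMARY KEY,
        name TEXT NOT NULL
    )""")
conn.execute("""
    CREATE TABLE "order" (
        id          INTEGER PRIMARY KEY,
        customer_id INTEGER NOT NULL REFERENCES customer(id)
    )""")
conn.execute("INSERT INTO customer (id, name) VALUES (1, 'Ada')")
conn.execute('INSERT INTO "order" (id, customer_id) VALUES (10, 1)')
rows = conn.execute(
    'SELECT c.name FROM "order" o JOIN customer c ON o.customer_id = c.id'
).fetchall()
print(rows)  # [('Ada',)]
```

The relationship line of an ER diagram becomes the REFERENCES clause; the join in the final query traverses it.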
APA, Harvard, Vancouver, ISO, and other styles
2

Vijay, Sony. "The LibX LibApp Builder." Thesis, Virginia Tech, 2014. http://hdl.handle.net/10919/24812.

Full text
Abstract:
LibX is a browser extension that provides direct access to library resources. LibX enables users to add features to a web page, such as placing a tutorial video on a digital library homepage. LibX achieves this ability to enhance web pages through library applications, called LibApps. A LibApp examines a web page, extracts and processes information from the page, and modifies the web content. It is possible to build an unlimited number of LibApps and enhance web pages in numerous ways. The developers of the LibX team cannot build all possible LibApps by themselves. Hence, we decided to create an environment that allows users to create and share LibApps, thereby creating an ecosystem of library applications. We developed the LibApp Builder, a cloud-based end-user programming tool that assists users in creating customized library applications with minimal effort. We designed an easy-to-understand meta-design language model with modularized, reusable components. The LibApp language model is designed to hide complex programming details from the target audience, mostly non-technical users, primarily librarians. The LibApp Builder is a web-based editor that allows users to build and test LibApps in an executable environment. A built-in publishing mechanism allows users to group LibApps into packages and publish them in AtomPub format. Any user can directly reuse or adapt published components as required. Two error-checking mechanisms, type checking and semantic checking, have been built into the LibApp Builder to enhance the user experience and reduce debugging effort. Additionally, the web interface displays help tooltips to guide users through the process of building a LibApp. We adhered to good software-engineering practices such as the agile development model and the model-view-controller design paradigm. The LibApp Builder is built with the ZK AJAX framework and provides a rich interactive user interface.
It is integrated with an optimized full-text, fuzzy search engine and facilitates faceted search by exploiting the BaseX XML database system and its XPath/XQuery processor. Users can locate and reuse existing language components through the search interface. To summarize, the LibApp Builder is a community platform for librarians to create, adapt and share LibApps.
Master of Science
3

Harmain, H. M. "Building object-oriented conceptual models using natural language processing techniques." Thesis, University of Sheffield, 2000. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.312740.

Full text
4

Roesch, Patric Karl. "The development of a model builder for a microcircuit substrate." Thesis, Monterey, California. Naval Postgraduate School, 1991. http://hdl.handle.net/10945/26479.

Full text
Abstract:
The Naval Postgraduate School is currently in possession of software designed to perform a thermal analysis of electronic components. This software package incorporates a model builder whose primary function is to generate a thermal model. In its present configuration, the model builder requires an inordinate amount of time for data input and model verification. This thesis describes the development of a model builder designed specifically to reduce the time required to model the substrate, epoxy, and carrier layers of a microcircuit assembly.
5

Glaser, Stephen J. "The development of a thermal analysis model builder for a printed circuit board." Thesis, Monterey, California. Naval Postgraduate School, 1991. http://hdl.handle.net/10945/28124.

Full text
6

Kasal, Ondřej. "Zjednodušený model axiálně chlazeného oblouku." Master's thesis, Vysoké učení technické v Brně. Fakulta elektrotechniky a komunikačních technologií, 2009. http://www.nusl.cz/ntk/nusl-218009.

Full text
Abstract:
Computer technology offers the advantages of fast, well-organized data processing. The required capacity and performance depend on the kind and size of the information being processed. Computers are indispensable in applications that work with large volumes of data and in applications that need many steps to complete a calculation, i.e., when iterative methods are used. Computer technology forms the core of the simulation, computation and visualization methods used in technical applications. Plasma is used in many branches of industry, from the technical processing of materials to applications in power electronics, and it therefore receives considerable attention. Mathematical and physical equations are used to describe this fourth state of matter and the processes inside plasma. These equations are hard to solve and require many computational steps, so computer technology is necessary to develop the various methods used to describe an electric arc.
7

Jiřík, Leoš. "Návrh trading strategie pro řízení volného finančního kapitálu firmy." Master's thesis, Vysoké učení technické v Brně. Fakulta podnikatelská, 2014. http://www.nusl.cz/ntk/nusl-224700.

Full text
Abstract:
This thesis deals with the design of trading strategies suitable for trading the currency markets. The design is carried out by means of artificial intelligence; the proposed strategies are then optimized and evaluated on previously unseen data. A secondary objective is to implement this process in an existing company with the aim of growing its capital. The consequences of this trading approach for the development of the company's capital are then studied from several perspectives: a schedule is outlined for its introduction into the chosen company, the expected costs and revenues are compared over the medium term, and finally the procedure is analyzed so that its risks can be identified and measures for limiting them proposed.
8

Damsgaard, Falck Hanna, Johanna Ring, and Erik Svensson. "Creating Bushing Core Geometries." Thesis, Uppsala universitet, Institutionen för materialvetenskap, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-444328.

Full text
Abstract:
Bushings are a necessary component of the transformers in the power grid. A bushing is used to control the strength and shape of the electric field and is also an insulator for high-voltage conductors; it enables a conductor to be safely brought through a grounded barrier. In this report, several methods for creating a 2D axisymmetric bushing-core geometry in COMSOL Multiphysics were developed. The geometry includes the conductor, the hollow area inside the conductor, the RIP, the mold and the aluminium foils. First, the base geometry was constructed, which includes all geometry parts except the foils. Afterward, two different approaches were used to construct the foils: the first builds a requested number of foils automatically, and the second creates the foils based on data from Excel sheets. The developed method should be able to create both full foils and partial foils. A total of four foil methods were developed. The first method used COMSOL's Model Builder to create a requested number of foils uniformly distributed within the base geometry. The second method used COMSOL's Application Builder to create a requested number of foils based on mathematical expressions. The third method reads data from an Excel sheet to create the foils in COMSOL. Method four is an improved version of method three that can create partial foils as well as the base geometry. Foil methods II, III and IV created every foil as a separate geometrical object, so an associated method that deletes the foils was also developed for each of them. The conclusion that the fourth method is the most realistic way of creating a bushing core could be drawn because, among other factors, it is the only method that can build partial foils.
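Foil method I above places a requested number of foils uniformly within the base geometry. A minimal stand-alone sketch of that placement rule in Python; the radii and foil count are hypothetical values for illustration, not dimensions from the thesis:

```python
# Illustrative stand-in for foil method I: place a requested number of
# aluminium-foil radii uniformly between the conductor surface and the
# outer edge of the insulation body. All dimensions (in mm) are
# hypothetical, not taken from the thesis.
def uniform_foil_radii(r_conductor, r_outer, n_foils):
    """Return n_foils radii strictly between r_conductor and r_outer."""
    if n_foils < 1:
        return []
    step = (r_outer - r_conductor) / (n_foils + 1)
    return [r_conductor + step * (i + 1) for i in range(n_foils)]

radii = uniform_foil_radii(30.0, 60.0, 5)
print(radii)  # [35.0, 40.0, 45.0, 50.0, 55.0]
```

In COMSOL each radius would become one foil object; creating them as separate objects is what makes the matching delete method (mentioned above for methods II-IV) necessary.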
9

Ryttberg, Mattias. "Introducing Lantmäteriet’s gravity data in ArcGIS with implementation of customized GIS functions." Thesis, Uppsala universitet, Institutionen för geovetenskaper, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-203137.

Full text
Abstract:
Gravity is measured and used by Lantmäteriet to calculate a model of the geoid, which provides accurate reference heights for positioning. Lantmäteriet is continuously measuring new gravity and height data across Sweden to complement, replace and add data points, mainly through field measurements at benchmark points. One major reason for continued measurements at, e.g., benchmark points is that measuring techniques keep advancing, which makes the measurements more accurate. More accurate gravity data lead to a more accurate calculation of the geoid, and a more accurate geoid in turn makes more precise positioning possible across Sweden thanks to more precise height values. Lantmäteriet is in the process of updating its entire database of gravity data and is also measuring at locations where measurements are sparse or missing. As a stage in the renewal of its database and other systems, the Geodesy department wishes to get an introduction to the ArcGIS environment. Customizing several ArcGIS functions will make Lantmäteriet's work with this extensive data easier and perhaps faster: customized tools will make adding and removing data points easier, and put cross-validation and several other functions only a click of a button away.
10

Lood, Olof. "Prediktering av grundvattennivåi område utan grundvattenrör : Modellering i ArcGIS Pro och undersökningav olika miljövariablers betydelse." Thesis, Uppsala universitet, Institutionen för geovetenskaper, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-448020.

Full text
Abstract:
The Geological Survey of Sweden (SGU) has a national responsibility for monitoring Sweden's groundwater levels. Since a fully covering network of measurement stations is not possible, the groundwater level must be computed at some locations. It is therefore of interest to investigate the relationship between the groundwater level and selected geographical information, so-called environmental variables. In the long term, machine learning may be used within SGU to compute groundwater levels, and a pilot study can then be of great help. The purpose of this thesis is to carry out such a pilot study by investigating which environmental variables are most important for the groundwater level and by mapping model uncertainties in groundwater prediction. The pilot study covers seven areas within SGU's groundwater network where the measurement stations lie in cluster-like groups. It uses supervised machine learning, which in this thesis means that median groundwater levels and the environmental variables are used to train the models. With the help of statistical output from the models, performance can be evaluated and adjustments made. The algorithm used is Random Forest, which builds classification and regression trees that teach the model to make decisions from the given input in a way resembling human decision-making. The models are set up in the ArcGIS Pro tool Forest-based Classification and Regression. Because of the geographical spread of the areas, several separate models are set up. The results show that it is possible to predict the groundwater level, but the importance of the individual environmental variables varies between the seven investigated areas, probably because of geographical differences. Most often, the absolute elevation and the slope direction of the ground are of very great importance. Differences in elevation and distance to soils of low and high permeability matter more than differences in elevation and distance to soils of medium permeability, and differences in elevation and distance to larger watercourses matter more than those to smaller watercourses.
The models' r²-values are somewhat low but within reasonable limits for hydrological models. The standard errors are mostly within reasonable limits. Uncertainty has been shown through a 90% confidence interval. The uncertainties increase with increasing distance to the measurement stations and are greatest at high altitude, probably because of too few input observations and too few observations at high elevation. Close to measurement stations, in built-up areas and in valleys, the uncertainties are in most cases within reasonable limits.
The Geological Survey of Sweden (SGU) has a national responsibility to oversee the groundwater levels. A national network of measurement stations has been established to facilitate this, but the density of measurement stations varies considerably. Since it will never be feasible to cover the entire country with measurement stations, the groundwater levels need to be computed in areas that are not in the near vicinity of a measurement station. For that reason, it is of interest to investigate the correlation between the groundwater levels and selected geographical information, so-called environmental variables. In the future, SGU may use machine learning to compute the groundwater levels. The focus of this master's thesis is to study the importance of the environmental variables and the model uncertainties in order to determine whether this is a feasible option for implementation on a national basis. The study uses data from seven areas of the groundwater network of SGU, where the measuring stations are in clusters. The pilot study uses a supervised machine learning method, which in this case means that the median groundwater levels and the environmental variables train the models. By evaluating the models' statistical output, the performance can gradually be improved. The algorithm used is called Random Forest; it uses classification and regression trees to learn how to make decisions throughout a network of nodes, branches and leaves based on the input data. The models are set up with the prediction tool Forest-based Classification and Regression in ArcGIS Pro. Because the areas are geographically spread out, eight unique models are set up. The results show that it is possible to predict groundwater levels with this method, but the importance of the environmental variables varies between the different areas used in this study, possibly due to geographical and topographical differences.
Most often, the absolute level above mean sea level and the slope direction are the most important variables. Planar and height distance differences to low- and high-permeability soils have medium-high importance, while distance differences to medium-permeability soils have lower importance. Planar and height distance differences to lakes and large watercourses are more important than those to small watercourses and ditches. The models' r²-values are slightly low but within reasonable limits for a hydrological model. The standard errors of the estimate are also in most cases within reasonable limits. The uncertainty is displayed by a 90% confidence interval. The uncertainties increase with increasing distance to the measuring stations and become greatest at high altitude, possibly because of too few observations, especially in areas at high altitude. The uncertainties are smaller close to the stations and in valleys.
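The Random Forest workflow the abstract describes, training on environmental variables and then reading off variable importance, can be sketched outside ArcGIS Pro. Below is a toy bagged-stump regressor on synthetic data in which groundwater level depends mainly on one hypothetical variable ("elevation"); it illustrates the technique only and is not the thesis's model:

```python
import random

# Minimal bagged-stump regressor: a toy stand-in for Random Forest.
# Each "tree" is a depth-1 regression tree fit on a bootstrap sample;
# the fraction of trees splitting on each feature gives a crude
# variable-importance measure. Data are synthetic: the target depends
# strongly on feature 0 ("elevation") and weakly on feature 1 ("slope").

def fit_stump(X, y):
    best = None
    for f in range(len(X[0])):
        vals = sorted(set(row[f] for row in X))
        for t in vals[:-1]:
            left = [yi for row, yi in zip(X, y) if row[f] <= t]
            right = [yi for row, yi in zip(X, y) if row[f] > t]
            ml, mr = sum(left) / len(left), sum(right) / len(right)
            sse = sum((yi - ml) ** 2 for yi in left) + \
                  sum((yi - mr) ** 2 for yi in right)
            if best is None or sse < best[0]:
                best = (sse, f, t, ml, mr)
    return best[1:]  # (feature, threshold, left_mean, right_mean)

def fit_forest(X, y, n_trees=50, seed=0):
    rng = random.Random(seed)
    trees = []
    for _ in range(n_trees):
        idx = [rng.randrange(len(X)) for _ in range(len(X))]
        trees.append(fit_stump([X[i] for i in idx], [y[i] for i in idx]))
    return trees

def importance(trees, n_features):
    counts = [0] * n_features
    for f, _, _, _ in trees:
        counts[f] += 1
    return [c / len(trees) for c in counts]

X = [[e, s] for e in range(10) for s in range(3)]
y = [5.0 * e + 0.1 * s for e, s in X]
imp = importance(fit_forest(X, y), 2)
print(imp)  # feature 0 ("elevation") dominates
```

A production Random Forest grows deep trees and averages their predictions; the importance idea, counting how much each feature contributes to the splits, is the same one the thesis uses to rank environmental variables.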
SGU's groundwater network
11

Pumprla, Ondřej. "Získávání znalostí z datových skladů." Master's thesis, Vysoké učení technické v Brně. Fakulta informačních technologií, 2009. http://www.nusl.cz/ntk/nusl-236715.

Full text
Abstract:
This Master's thesis deals with the principles of the data-mining process, especially the mining of association rules. It sets out the theoretical apparatus for describing and building data warehouses. On the basis of this theoretical knowledge, an application for association-rule mining was implemented. The application requires data in transactional form or multidimensional data organized in a star schema. The implemented algorithms for finding frequent patterns are Apriori and FP-tree. The system allows various parameter settings for the mining process. Validation tests and performance evaluations were also carried out. In terms of support for association-rule mining, the resulting application is more versatile and robust than the compared existing systems, SAS Miner and Oracle Data Miner.
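The Apriori algorithm mentioned above can be sketched compactly: frequent itemsets are grown level by level, and any candidate with an infrequent subset is pruned. A minimal frequent-itemset miner in Python; the transactions and support threshold are illustrative only:

```python
from itertools import combinations

# Minimal Apriori sketch: find all itemsets whose support (fraction of
# transactions containing the itemset) meets a threshold.
def apriori(transactions, min_support):
    tx = [frozenset(t) for t in transactions]
    n = len(tx)
    support = lambda items: sum(items <= t for t in tx) / n
    singletons = {frozenset([i]) for t in tx for i in t}
    frequent = {}
    level = {s for s in singletons if support(s) >= min_support}
    k = 1
    while level:
        frequent.update({s: support(s) for s in level})
        k += 1
        # Candidate generation: join frequent (k-1)-itemsets, then prune
        # candidates with an infrequent subset (the Apriori property).
        cands = {a | b for a in level for b in level if len(a | b) == k}
        level = {c for c in cands
                 if all(frozenset(s) in frequent
                        for s in combinations(c, k - 1))
                 and support(c) >= min_support}
    return frequent

freq = apriori([["beer", "chips"], ["beer", "chips", "salsa"],
                ["beer", "salsa"], ["chips"]], min_support=0.5)
print(sorted((sorted(s), v) for s, v in freq.items()))
```

FP-tree (the thesis's second algorithm) finds the same itemsets but avoids repeated candidate scans by compressing the transactions into a prefix tree.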
12

Labounek, René. "Fúze simultánních EEG-FMRI dat za pomoci zobecněných spektrálních vzorců." Doctoral thesis, Vysoké učení technické v Brně. Fakulta elektrotechniky a komunikačních technologií, 2018. http://www.nusl.cz/ntk/nusl-371799.

Full text
Abstract:
Many different fusion strategies have been developed over the past 15 years of simultaneous EEG-fMRI research. This dissertation summarizes the current state of the art in the fusion of simultaneous EEG-fMRI data and aims to improve the visualization of task-evoked brain networks through blind analysis directly from the recorded data. Two different models that should achieve this were proposed in the thesis: a generalized spectral heuristic model and a generalized spatial-frequency heuristic model. The generalized frequency-domain heuristic model uses fluctuations of relative EEG power in particular frequency bands, averaged over electrodes of interest, and compares them with delayed fluctuations of BOLD signals through a general linear model. The results show that the model visualizes several distinct, frequency-dependent task-evoked EEG-fMRI networks. It outperforms both the absolute-EEG-power approach and the classical (original) heuristic approach. Absolute power visualized a task-unrelated broadband EEG-fMRI component, and the classical heuristic approach was not sensitive enough to visualize the task-related visual network that was observed for the relative band in the visual oddball data. For the semantic-decision EEG-fMRI data, the frequency dependence was less evident in the final results, since all bands showed the visual network and none showed activations in speech centers; these results were probably corrupted by the eye-blink artifact in the EEG data. Mutual information coefficients between different EEG-fMRI statistical parametric maps showed that the similarities across frequency bands are comparable across tasks (visual oddball and semantic decision). Moreover, the coefficients demonstrated that averaging across different electrodes of interest brings no new information into the joint analysis, i.e., the signal at a single lead is a heavily blurred signal from the whole scalp.
For these reasons, lead-specific information needed to be incorporated better into the EEG-fMRI analysis, so we proposed the more general spatial-frequency heuristic model, together with a way to estimate it using spatial-frequency group independent component analysis of the relative EEG power spectrum. The results show that the spatial-frequency heuristic model visualizes the statistically most significant task-related brain networks (compared with the results of absolute-power spatial-frequency patterns and with the generalized frequency-domain heuristic model). The spatial-frequency heuristic model was the only one that captured task-related activations in speech centers in the semantic-decision data. Beyond fusing spatial-frequency patterns with fMRI data, we tested the stability of the spatial-frequency pattern estimates across paradigms (visual oddball, semantic decision and resting state) using the k-means clustering algorithm. We obtained 14 stable patterns for absolute EEG power and 12 stable patterns for relative EEG power. Although 10 of these patterns look similar across the two power types, the relative-power spatial-frequency patterns (i.e., the patterns of the spatial-frequency heuristic model) show stronger task evidence.
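The generalized spectral heuristic model described above, relative band power averaged over electrodes of interest, delayed, and regressed against BOLD via a general linear model, can be sketched on synthetic signals. The array sizes, the 2-sample delay standing in for hemodynamic lag, and the coupling strength are all assumptions for illustration:

```python
import numpy as np

# Schematic of the generalized spectral heuristic model: relative band
# power (band power divided by total power, averaged over electrodes of
# interest) is delayed and regressed against a BOLD time series with a
# general linear model. All signals here are synthetic.
rng = np.random.default_rng(0)
n_scans, n_electrodes = 120, 4
band = rng.random((n_scans, n_electrodes))           # power in band of interest
total = band + rng.random((n_scans, n_electrodes))   # total spectral power
rel_power = (band / total).mean(axis=1)              # relative-power regressor

delay = 2                                            # assumed hemodynamic lag
regressor = np.roll(rel_power, delay)[delay:]
bold = 3.0 * regressor + rng.normal(0, 0.1, n_scans - delay)

X = np.column_stack([regressor, np.ones_like(regressor)])  # GLM design matrix
beta, *_ = np.linalg.lstsq(X, bold, rcond=None)
print(beta[0])  # recovered EEG-BOLD coupling, close to the simulated 3.0
```

The spatial-frequency variant replaces the electrode average with group ICA over the full space-frequency relative-power matrix, so each component keeps its own scalp topography.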
13

Osmani, Laura. "Database relazionali e applicazioni gis e webgis per la gestione, l'analisi e la comunicazione dei dati territoriali di un'area protetta. Il Parco Regionale del Conero come caso applicativo." Doctoral thesis, Università degli studi di Trieste, 2010. http://hdl.handle.net/10077/3637.

Full text
Abstract:
2008/2009
The research rests on a multidisciplinary methodological framework: on one side, an introductory theoretical-geographical analysis of land and landscape governance in protected areas; on the other, a review (tied to a more strictly cartographic-digital field of inquiry) of the state of progress, at the European and national level, in building spatial data infrastructures and in publishing and sharing services connected with Geographical Information Systems, with attention to the communication of environmental elements (also found in the protected-areas context). Both overviews, enriched by a description of the multi-scalar regulatory framework of reference, are necessary to contextualize and set up the work, and they lead to a screening of how the Italian bodies managing protected areas, specifically national and regional parks, communicate territorial data in webgis environments. These theoretical, legislative and factual elements were then taken as reference during the applied phase of the research, guiding and supporting the steps that led to the development of applications dedicated to the Conero Park area, following a field survey, an organized collection of territorial data (base data and Park Plan data) and subsequent phases of spatial analysis. The aim is to support, thanks to the applications developed, the management, study and territorial-communication operations that a body responsible for a protected area must define and implement in the light of the themes considered in the theoretical section.
The tangible results take the form of an architecture that, starting from a relational database, passing through a geodatabase and arriving at dynamic, interactive webgis platforms, supports the coordination, analysis and dissemination of selected territorial elements of the Conero Park district and its main planning instrument (the Park Plan), thereby facilitating both more "aware" management and decision-making processes and structured information and participation paths. The final body of the work is divided into two parts, each of three chapters, to mark the stages of the study and make it easier to read. The first part, entitled "Land governance and the sharing of informational and cartographic data. Evolutionary scenarios towards the development of participatory dynamics", sets out the theoretical, regulatory and factual framework underlying the research. The first chapter briefly introduces the recent dynamics affecting the concepts, definitions and regulatory aspects of land and landscape governance in protected areas, in particular Italy's regional natural parks and the landscape forms to be protected. This excursus examines national and international geographical writings on the subject, bringing out heterogeneous, continuously evolving positions that are nonetheless in line with recent guidelines developed and approved at the European level and suitably reinterpreted at the national scale. All of this needs the support of an adequate cartographic-taxonomic representation of the various landscape and park typologies, units and categories.
Classification is indeed one of the fundamental lines of the international and national debate on the subject. The second chapter, through an approach linking the world of Geographical Information Systems and protected areas via the publication and sharing of spatial and environmental data, briefly outlines the state of the art in building dedicated infrastructures, in drawing up metadata for sets and series of territorial elements, and in the related services, looking at European directives, regulations and decisions and their transposition into the national context. The third chapter enters the part of the research with a more factual than theoretical-regulatory character: going beyond the conceptual framework, a screening of how the managing bodies of Italian national and regional parks communicate and disseminate their most relevant territorial data through webgis platforms shows what has been done in Italy to promote their dissemination and what the margins for future improvement may be. The analysis is accompanied by graphs and detailed tables with comments on the results of the survey, in both absolute and percentage terms. The chapter bridges the theoretical section of the work and the section devoted to the specific case study. The second part, "A territorial application: the Conero Park. From a contextual geographical analysis to a detailed one through gis-analyst tools.
Database Management System and Web Service Application for management and communication", draws on the theoretical-factual investigation and presents the case applied to the Conero protected area. In detail: the fourth chapter provides a territorial framing of the study area through analyses carried out with gis-analyst tools (ArcGis - ArcToolbox). This framing is enriched by a field survey of the trail network inside the Conero Park, for which the data-acquisition methods and the subsequent post-processing phases are described. The trail survey (made necessary by the fact that the network had only been digitized on paper) completed the analysis of pedestrian routes inside the park area, emphasizing not only their touristic and landscape usability but also integrating the collected data with the Park Plan data already available to the managing body, in order to build spatial-analysis models (ESRI Model Builder) that can be applied in later phases of territorial assessment or in planning targeted interventions on trail sections showing critical elements. These models are versatile and adaptable to any type of territory, protected or not, that is crossed by trails, routes and touristic-cultural or landscape-naturalistic itineraries. The chapter closes with a description of the aims and structure of the models themselves.
In the fifth chapter the alphanumeric data, the trail-survey data, the plan data and the bibliographic sources are integrated into an MS Access relational database designed to be consulted also by users who are not GIS experts. This database connects and interacts both with an ESRI personal geodatabase and with a PostgreSQL spatial database (PostGIS extension), in which the spatial data intended for GIS specialists were stored. The chapter goes on to describe the types of territorial datasets stored in them for archiving and updating. The sixth chapter, finally, is devoted to the testing and development (localhost) of a UMN MapServer webgis application with a dynamic P.Mapper front end containing a selection of the spatial data above. Its founding characteristics, query categories and the parameters of the information layers to be displayed are outlined, in the awareness that the web publication of a Territorial Information System finds its ultimate purpose not only in the mere passage from local use to shared multi-user access to the spatial data and database, but also in its self-identification as a tool able to support, foster and activate processes of information sharing and collective decision-making participation through dynamics that alternate between top-down and bottom-up.
After the concluding remarks, the work closes with the usual bibliographic and web references and three annexes containing: two synoptic tables on the screening of national and regional parks presented in the third chapter, an extract of some information layers included in the MapServer .map file, and a list of the abbreviations and acronyms encountered in the text.
XXII Cycle
14

Roumboutsos, Athena. "The application of deconvolution in well test analysis." Thesis, Heriot-Watt University, 1988. http://hdl.handle.net/10399/973.

Full text
15

Gander, Werner. "Buildup/Washoff Model for Dissolved Iron in Stormwater Runoff." ScholarWorks@UNO, 2007. http://scholarworks.uno.edu/td/531.

Full text
Abstract:
This research focused on the calibration of a buildup/washoff model for dissolved iron. The test site was located at the intersection of Interstate-10 and Interstate-610 in New Orleans, Louisiana. Stormwater runoff from the examined elevated roadway section was analyzed for 14 storm events. The model used a linear function of antecedent dry time for the buildup component, and the rating-curve assumption was selected to estimate the pollutant transport capacity. In a further step the two were combined. The derived buildup/washoff model was calibrated against the collected data and precipitation data from the New Orleans International Airport. The obtained power function for pollutant transport capacity is Y = 0.1028 · V^0.8212, where Y is the transport capacity [µg/l] and V the total runoff volume [l]. For the buildup/washoff model, the 2-year interval gave the most reliable results. A value of 12.13 mg/day was obtained for the buildup rate C.
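The fitted relationships reported above can be written out directly as code. The coefficients (0.1028, 0.8212, and the 12.13 mg/day buildup rate) come from the abstract itself; the example runoff volume and dry period are made-up inputs for illustration only.

```python
def transport_capacity_ug_per_l(total_runoff_volume_l):
    """Pollutant transport capacity Y [ug/l] as a power function of runoff volume V [l]."""
    return 0.1028 * total_runoff_volume_l ** 0.8212

def buildup_mg(antecedent_dry_days, rate_mg_per_day=12.13):
    """Linear buildup of dissolved iron with antecedent dry time."""
    return rate_mg_per_day * antecedent_dry_days

# Illustrative inputs: a 10 m^3 runoff event after 5 dry days.
print(transport_capacity_ug_per_l(10_000))
print(buildup_mg(5))  # 12.13 * 5 = 60.65 mg
```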
APA, Harvard, Vancouver, ISO, and other styles
16

Repka, Martin. "Investiční modely v prostředí finančních trhů." Master's thesis, Vysoké učení technické v Brně. Fakulta podnikatelská, 2013. http://www.nusl.cz/ntk/nusl-224014.

Full text
Abstract:
This thesis focuses on automated trading systems for financial markets. It describes the theoretical background of financial markets, different technical analysis approaches, and automated trading systems. The output of the thesis is a diversified portfolio comprising four investment models aimed at trading futures contracts on cocoa and gold. The portfolio, tested on market data from the first quarter of 2013, achieved a 46.74% increase on the initial equity. The systems were designed in the Adaptrade Builder software using genetic algorithms, subsequently tested in the MetaTrader trading platform, and finally optimized using sensitivity analysis.
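The core mechanism the abstract names — evolving trading rules with a genetic algorithm, as Adaptrade Builder does — can be sketched in miniature. Everything here (the random-walk price series, the moving-average crossover strategy, the GA parameters) is an illustrative assumption, not the thesis's actual setup.

```python
import random

random.seed(42)
prices = [100.0]
for _ in range(300):                      # synthetic random-walk price series
    prices.append(prices[-1] + random.uniform(-1.0, 1.0))

def backtest(fast, slow):
    """Profit of a toy moving-average crossover strategy (the fitness function)."""
    profit, position, entry = 0.0, 0, 0.0
    for t in range(slow, len(prices)):
        f = sum(prices[t - fast:t]) / fast
        s = sum(prices[t - slow:t]) / slow
        if f > s and position == 0:       # fast MA crosses above slow MA: buy
            position, entry = 1, prices[t]
        elif f < s and position == 1:     # crosses back below: sell
            profit += prices[t] - entry
            position = 0
    return profit

def evolve(generations=10, pop_size=12):
    """Evolve (fast, slow) window lengths by selection, crossover and mutation."""
    pop = [(random.randint(2, 20), random.randint(21, 60)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda g: backtest(*g), reverse=True)
        parents = pop[: pop_size // 2]    # truncation selection
        children = []
        while len(parents) + len(children) < pop_size:
            (f1, _), (_, s2) = random.sample(parents, 2)   # crossover
            fast = max(2, f1 + random.randint(-2, 2))      # mutation
            slow = max(fast + 1, s2 + random.randint(-5, 5))
            children.append((fast, slow))
        pop = parents + children
    return max(pop, key=lambda g: backtest(*g))

best = evolve()
print("best (fast, slow) windows:", best)
```

A real builder tool evolves whole rule trees rather than two window lengths, but the select-recombine-mutate loop is the same shape.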
APA, Harvard, Vancouver, ISO, and other styles
17

Zemirline, Nadjet. "Assisting in the reuse of existing materials to build adaptive hypermedia." Phd thesis, Université Paris Sud - Paris XI, 2011. http://tel.archives-ouvertes.fr/tel-00664996.

Full text
Abstract:
Nowadays, there is a growing demand for personalization, and the "one-size-fits-all" approach for hypermedia systems is no longer applicable. Adaptive hypermedia (AH) systems adapt their behavior to the needs of individual users. However, due to the complexity of their authoring process and the different skills required from authors, only few of them have been proposed. In recent years, numerous efforts have been put into assisting authors in creating their own AH. However, as explained in this thesis, some problems remain. In this thesis, we tackle two particular problems. The first concerns the integration of authors' materials (information and user profile) into the models of existing systems, thus allowing authors to directly reuse existing reasoning and execute it on their materials. We propose a semi-automatic merging/specialization process to integrate an author's model into a model of an existing system. Our objectives are twofold: to create support for defining mappings between elements in a model of an existing system and elements in the author's model, and to help create consistent and relevant models that integrate the two models and take the mappings between them into account. The second problem concerns the adaptation specification, which is famously the hardest part of the authoring process of adaptive web-based systems. We propose the EAP framework with three main contributions: a set of elementary adaptation patterns for adaptive navigation, a typology organizing the proposed elementary adaptation patterns, and a semi-automatic process to generate adaptation strategies based on the use and combination of patterns. Our objective is to allow adaptation strategies to be defined easily, at a high level, by combining simple ones.
Furthermore, we have studied the expressivity of some existing solutions for specifying adaptation against the EAP framework, discussing, based on this study, the pros and cons of various decisions in terms of the ideal way of defining an adaptation language. We propose a unified vision of adaptation and adaptation languages, based on the analysis of these solutions and our framework, as well as a study of adaptation expressivity and of the interoperability between them, resulting in an adaptation typology. The unified vision and adaptation typology are not limited to the solutions analysed, and can be used to compare and extend other approaches in the future. Besides these theoretical qualitative studies, this thesis also describes implementations and experimental evaluations of our contributions in an e-learning application.
APA, Harvard, Vancouver, ISO, and other styles
18

Myers, Lee A. "Novel build-to-rent strategies for single family homebuilders." Thesis, Georgia Institute of Technology, 2014. http://hdl.handle.net/1853/51873.

Full text
Abstract:
Following the recession of 2007-2009, conditions in the housing and finance industries favored an increase in renter-occupied homes relative to owner-occupied homes. With rental properties comprising an increasing share of the housing supply, the home building industry should consider housing products that meet the needs of renters. This thesis proposes a build-to-rent product for single family home builders, to be offered as a complement to the traditional built-for-sale product. The purpose of the research is to demonstrate that a build-to-rent product is financially feasible under ordinary market conditions. In order to determine the viability of a build-to-rent product under likely market conditions, a financial model has been developed for a single family build-to-rent product. The research involves reviewing the literature on similar investment product types in order to develop a business model for the proposed build-to-rent product. The proposed model utilizes financial parameters currently used in the industry for the analysis of homebuilding projects and rental property investments. Using the analytical methods applied to analogous investment classes, the author calculates a projected market range of input variables for the model. Sensitivity analysis of the model was then used to test the financial feasibility of a build-to-rent product. The analysis showed that the proposed product would be feasible under ordinary market conditions. Additional recommendations for future research have been explored based on the findings of this study.
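A hedged sketch of the kind of build-to-rent pro forma with sensitivity analysis the abstract describes: build, rent for a holding period, then sell at an exit cap rate. Every number below (cost, rent, operating ratio, rates) is a made-up placeholder, not a figure from the study.

```python
def btr_npv(build_cost, monthly_rent, operating_ratio, exit_cap_rate,
            discount_rate, hold_years=7):
    """Net present value of: build, rent for `hold_years`, then sell at a cap rate."""
    noi = monthly_rent * 12 * (1 - operating_ratio)      # net operating income
    npv = -build_cost
    for year in range(1, hold_years + 1):
        npv += noi / (1 + discount_rate) ** year         # discounted rental income
    sale_price = noi / exit_cap_rate                     # direct capitalization at exit
    npv += sale_price / (1 + discount_rate) ** hold_years
    return npv

# One-way sensitivity: sweep the exit cap rate, holding the other inputs fixed.
for cap in (0.05, 0.06, 0.07):
    print(cap, round(btr_npv(250_000, 1_800, 0.35, cap, 0.08), 2))
```

Sweeping one input at a time while fixing the rest is the simplest form of the sensitivity analysis the thesis applies; a fuller model would also vary rent growth, vacancy, and financing terms.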
APA, Harvard, Vancouver, ISO, and other styles
19

Salazar, Rodríguez Felipe Antonio. "Comparación Cuantitativa entre el Modelo Diseño-Licitación-Construcción (Design-Bid-Build) y el Modelo Diseño-Construcción (Design-Build)." Tesis, Universidad de Chile, 2010. http://www.repositorio.uchile.cl/handle/2250/103853.

Full text
Abstract:
Not authorized by the author for full-text publication
The general objective of this degree project is to compare two contract models, which approach a project from two very different angles with respect to risk management and information management. Accordingly, the aim was to determine which model leads to greater efficiency in terms of schedule and cost. Today, the Ministry of Public Works delivers the great majority of its projects under the Design-Bid-Build contract model. Waiting for detailed designs, produced in order to control construction costs, increases the total duration of projects. In some cases it would be possible to shorten schedules, without putting costs at risk, under another contracting model. Under this premise, a contractual universe was studied in which the most commonly used contract models were classified, and fact sheets were prepared for each model to be compared. To carry out the study, the market was analyzed using the benchmarking method, examining the experience of Chilean public works under the Design-Bid-Build model and of United States public works under the Design-Build model. This analysis was based on indices related to project schedules and costs. The analyses showed efficiency differences for the schedule-related indices, whereas for the cost indices the differences were not significant. The magnitudes involved do not support a definitive global conclusion; nevertheless, it can be inferred that, in terms of schedule, the Design-Build contract model is more efficient than the Design-Bid-Build model.
The comparisons are not only quantitative but also qualitative; that is, other evidence supports the greater efficiency of the Design-Build contract type. For example, there is worldwide experience backing the use of this type of contract in projects with certain characteristics. It is therefore to be hoped that Chile will take advantage of this experience in the public sector and open the way to developing projects under new contracting models that optimize processes and resources.
APA, Harvard, Vancouver, ISO, and other styles
20

Essa, Fagmie. "A regulatory assessment of the Build-Own-Operate model for New Nuclear Build in South Africa." Master's thesis, University of Cape Town, 2017. http://hdl.handle.net/11427/27375.

Full text
Abstract:
Governments pursue New Nuclear Build (NNB) projects for different strategic reasons. Many countries are not able to devise a funding model for the excessive costs of a NNB project, nor do they have the local skills to construct, operate and maintain a nuclear power plant (NPP). South Africa's Integrated Resource Plan 2016 indicates that the base case scenario includes a target date of 2037 for the first unit of a NNB programme; this date moves to 2026 when the carbon budget is included in the forecast. The National Nuclear Regulator (NNR) has limited experience in licensing NNB programmes, as the country's only commercially operating nuclear power plant was completed in 1984. The Build-Own-Operate (BOO) model is being promoted by Russia as an option to finance, design, supply most of the equipment for, construct and operate a NPP; the first project under this model is being executed at Akkuyu in Turkey. South Africa, and specifically the NNR, will face many challenges should it pursue the Build-Own-Operate model for its New Nuclear Build programme, given this model's novel and complex demands. This study has concluded that the Nuclear Energy Policy does not allow an entity other than Eskom (the South African electricity utility) to own and operate any new NPP in South Africa. The NNR does not presently have the resources to service a NNB programme, but should be able to adapt to its increased demands. The minister responsible for the promotion of nuclear energy also provides oversight of the NNR. This conflict of interest does not appear to have affected NNR decisions thus far; however, a NNB programme will exert undue pressure on all stakeholders, which could change the relationship between the NNR and its minister. The National Nuclear Regulator Act (NNRA) needs to be amended to change this reporting structure while also enhancing the independence of the NNR.
Should the Hinkley Point C model be followed in South Africa, the appointment of a design authority within the licensee, which effectively creates an additional layer of verification, should warrant strong consideration for the South African NNB model. The Akkuyu BOO model challenges the principles of the Intelligent Customer concept. Should the Akkuyu BOO model be followed, the NNR would require the licensee to show that independent verification is in place. Roles and responsibilities should be clearly described and understood by all stakeholders. The onus remains with the licensee to prove Intelligent Customer capability.
APA, Harvard, Vancouver, ISO, and other styles
21

Nguyen, Thi Mai. "A model driven engineering approach to build secure information systems." Thesis, Université Paris-Saclay (ComUE), 2017. http://www.theses.fr/2017SACLL001/document.

Full text
Abstract:
Nowadays, organizations rely more and more on information systems to collect, manipulate, and exchange their relevant and sensitive data. In these systems, security plays a vital role: any security breach may cause serious consequences, even destroy an organization's reputation. Hence, sufficient precautions should be taken. Moreover, it is well recognized that the earlier an error is discovered, the easier and cheaper it is to debug. The objective of this thesis is to define adequate security policies from the early development phases onward and to ensure their correct deployment on a given technological infrastructure. Our approach starts by specifying a set of security requirements, i.e. static and dynamic rules, along with the functional aspect of a system based on the Unified Modeling Language (UML). Fundamentally, the functional aspect is expressed using a UML class diagram, the static security requirements are modeled using SecureUML diagrams, and the dynamic rules are represented using secure activity diagrams. We then define translation rules to obtain B specifications from these graphical models. The translation aims at giving a precise semantics to these diagrams, thus proving the correctness of these models and verifying security policies with respect to the related functional model using the AtelierB prover and the ProB animator. The obtained B specification is successively refined to a database-like implementation based on the AOP paradigm. The B refinements are also proved to make sure that the implementation is correct with respect to the initial abstract specification. The resulting AspectJ-based program separates the security enforcement code from the rest of the application. This approach avoids scattering and tangling the application's code, making it easier to track and maintain.
Finally, we develop a tool that automates the generation of the B specification from the UML-based models and the derivation, from the B implementation, of an AspectJ program connected to a relational database management system. The tool helps disburden developers of these difficult and error-prone tasks and improves the productivity of the development process.
APA, Harvard, Vancouver, ISO, and other styles
22

Upalekar, Ruta Sunil. "Tools to help build models that predict student learning." Link to electronic thesis, 2006. http://www.wpi.edu/Pubs/ETD/Available/etd-050206-154628/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
23

Lynn, Charity M. "Accuracy models for SLA build style decision support." Thesis, Georgia Institute of Technology, 1998. http://hdl.handle.net/1853/16832.

Full text
APA, Harvard, Vancouver, ISO, and other styles
24

Tornqvist, Dominicus P. "Model-Master-Transfer: Formally Deconstructing Educational Games to Build a Quantitative Theory." Thesis, Griffith University, 2020. http://hdl.handle.net/10072/397038.

Full text
Abstract:
We live in an increasingly interconnected, complex world, where our collective decisions can have unanticipated indirect consequences on the world. What is today a hands-on job may soon be a job of managing variables at a computer screen. However, people have tremendous difficulty handling even relatively mild levels of complexity in experiments. Yet, in the context of many modern computer games, people eagerly teach themselves vastly complex systems, all without any external guidance or coercion. Educational video games research is a menagerie of different methodologies and paradigms. Meta analyses have found that mixed results in the literature are difficult to interpret due to the combination of different theoretical approaches, different data reporting conventions, and a general focus on proof-of-concept studies. They recommend a transition to narrower investigations of specific causal relationships between game properties and outcomes of engagement and learning, but this requires a way to incorporate them into the broader picture of educational and serious games research. In this thesis, I focus on educational games and propose the Model-Master-Transfer (MMT) framework to break down educational game usage into a set of formal subprocesses that can be studied in more depth individually, and specify how such narrow studies can then be assembled to build up a causal model of the underlying effects. The framework is illustrated using different educational examples. The conceptual study contributes a comprehensive framework to the ambiguous research on educational learning using games. MMT is then used in empirical experiments to address two sub-problems: 1) Why players sometimes choose to lose the game, completely derailing its intended purpose; 2) The design of inherently learnable systems in terms of how the complexity of a game relates to the player’s ability to master it - searching for forms of complexity that elicit curiosity to learn about that complexity. 
These experiments demonstrate the value, and the manner, of applying MMT to investigate specific psychological phenomena while retaining a logically coherent place in our understanding of how educational and serious games achieve positive outcomes, a place provided by the MMT framework. This body of work provides practical applications for game developers and educators alike, as well as interesting theoretical implications for cognition, curiosity, and complexity.
Thesis (PhD Doctorate)
Doctor of Philosophy (PhD)
School of Info & Comm Tech
Science, Environment, Engineering and Technology
Full Text
APA, Harvard, Vancouver, ISO, and other styles
25

Orenäs, Nissas Sebastian, and Nangi Rahimi. "Digitalized Construction Project : To Build after a Legally Binding BIM-model." Thesis, KTH, Fastigheter och byggande, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-279110.

Full text
Abstract:
Digitalization has become something of a buzzword in today's society and rightly so as it brings multiple benefits and opportunities. The AEC/FM industry has constantly lagged behind other industries in terms of change and development and is often regarded as conservative. Strongly associated with digitalization in construction are the concepts of Building Information Modeling (BIM) and Virtual Design and Construction (VDC), which partly include technology and models for integrated and model-based approaches to, for example, reduce fragmentation between project members who traditionally work independently of one another. In research, it is revealed that there are large gains with a successful implementation of BIM/VDC in projects and this is something that many companies in the industry are working with and seeking to develop. The purpose of the thesis is to investigate how the project team members have worked and how the working methods are perceived by them in a well-known construction project in Sweden, where they have taken a step further in digitalizing construction by building after a legally binding digital model instead of the traditional paper drawings. The subject is explored with a qualitative method, in the form of a case study, where a scientific literature study, interview study, and observations together form the basis of the study and these parts act as a basis for the discussion. The literature study covers previous research as well as concepts relevant for answering formulated research questions and concepts that emerged during the interview study that are important to understand for a qualitative discussion and, consequently, qualitative conclusions. In the interview study, 13 respondents were interviewed in so-called semi-structured interviews and all of them were involved in the case project. 
The findings indicate that the BIM-model can contribute to better communication, higher resource efficiency, better quality and, at the same time for a lower total cost of the project. Identified perceptions in designing a BIM-model and then building after the model instead of 2D drawings are predominantly positive. While advantages and opportunities are demonstrated by this way of working, new challenges and risks arise. This entails legal risks, technical risks and management risks. There are new types of errors that arise with a more detailed design.
APA, Harvard, Vancouver, ISO, and other styles
26

Zhu-Colas, Beiting. "Using knowledge-based MAS to build a reliable structural earth model." Paris 6, 2008. http://www.theses.fr/2008PA066529.

Full text
Abstract:
Reliable structural models are surface representations that closely approximate the explored subsurface. The structural interpretations underlying the modeling are validated and updated so as to conform to the well logs imported daily. Existing modeling tools offer an interpretation environment that makes it easy to integrate geologists' opinions; the interpretation work itself, however, remains manual. Given the complexity of the subsurface and the dynamics of drilling data, we propose a multi-agent system based on geological knowledge to assist interpretation. Drawing on the principle of Eco-resolution, our approach associates a cognitive agent with each geological surface in order to interpret it, and lets these agents self-organize locally so that a coherent model emerges at the global level of the system. We validate this system with relevant tests provided by geologists from IFP and ENSMP. The experiments show that the architectural choices are sound and that the implementation work was carried through to completion.
APA, Harvard, Vancouver, ISO, and other styles
27

Stepanova, Maria. "Using survival analysis methods to build credit scoring models." Thesis, University of Southampton, 2001. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.364729.

Full text
APA, Harvard, Vancouver, ISO, and other styles
28

Armes, Michael W. "New initiatives in public/private contracting under the build-operate-transfer model." Thesis, Monterey, California. Naval Postgraduate School, 1996. http://hdl.handle.net/10945/8736.

Full text
Abstract:
CIVINS
Internationally, the demand for infrastructure continues to grow as the repressed needs of developed countries and the new needs of emerging countries proliferate. This requirement for infrastructure provision puts great pressure on public expenditure. Member countries of the Organization for Economic Cooperation and Development (OECD), for example, devote on average nearly one-fifth of capital formation to infrastructure (1:3). The increased need for infrastructure, coupled with debt and tax limitations that restrict governments' ability to provide capital for infrastructure projects, has led to the involvement of the private sector in comprehensive contracting partnerships
APA, Harvard, Vancouver, ISO, and other styles
29

Thor, Nandan G. "Using Computer Vision to Build a Predictive Model of Fruit Shelf-life." DigitalCommons@CalPoly, 2017. https://digitalcommons.calpoly.edu/theses/1721.

Full text
Abstract:
Computer vision is becoming a ubiquitous technology in many industries on account of its speed, accuracy, and long-term cost efficacy. The ability of a computer vision system to quickly and efficiently make quality decisions has made it a popular technology on inspection lines. However, few companies in the agriculture industry use computer vision because of the non-uniformity of sellable produce. The small number of agriculture companies that do use it apply it to extract features for size sorting or for a binary grading system: if a piece of fruit has a certain color, shape, and size, it passes and is sold; if any of these criteria are not met, the fruit is discarded. This is a highly wasteful and relatively subjective process. This thesis proposes a process for using computer vision techniques to extract features of fruit and build a model that predicts shelf-life from the extracted features. Fundamentally, the existing agricultural processes that do use computer vision base their distribution decisions on current produce characteristics. The process proposed in this thesis uses current characteristics to predict future characteristics, which leads to better-informed distribution decisions. By modeling future characteristics, the proposed process allows fruit characterized as "unfit to sell" by existing standards to still be utilized (e.g., if the fruit is too ripe to ship across the country, it can still be sold locally), which decreases food waste and increases profit. The process described also removes the subjectivity present in current fruit grading systems. Further, better-informed distribution decisions will save money in storage costs and excess inventory. The proposed process consists of discrete steps. The first step is to choose a fruit of interest to model. Then, the first of two experiments is performed.
The sugar content of a large sample of fruit is destructively measured (using a refractometer) to correlate sugar content with a color range. This step is necessary to determine the end-point of data collection, because stages of ripeness are fundamentally subjective. The literature is consulted to determine the "ripe" sugar content of the fruit, and the first experiment is undertaken to find the color range that corresponds to this sugar content. This feature range serves as the end-point of the second experiment. The second experiment is large-scale data collection on the fruit of interest, with features recorded every day until the fruit reaches end-of-life as determined by the first experiment. Computer vision is then used to perform feature extraction, and features are recorded over each sample fruit's lifetime. The recorded data are then analyzed with regression and other techniques to build a model of the fruit's shelf-life. The model is finally validated. This thesis uses bananas as a proof of concept of the proposed process.
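The two-step pipeline above — extract a color feature, then regress shelf-life on it — can be sketched as follows. The synthetic images, the green-fraction feature, and the regression coefficients are all illustrative assumptions, not results from the thesis.

```python
import numpy as np

def mean_green_fraction(image_rgb):
    """Fraction of total pixel intensity in the green channel (a crude ripeness cue)."""
    totals = image_rgb.reshape(-1, 3).sum(axis=0).astype(float)
    return totals[1] / totals.sum()

def predict_shelf_life_days(green_fraction, slope=30.0, intercept=-4.0):
    """Hypothetical fitted linear model: greener bananas last longer."""
    return max(0.0, slope * green_fraction + intercept)

# Synthetic stand-ins for photos: one mostly green banana, one mostly yellow.
green_banana = np.zeros((32, 32, 3), dtype=np.uint8)
green_banana[..., 1] = 200
yellow_banana = np.zeros((32, 32, 3), dtype=np.uint8)
yellow_banana[..., 0] = 200
yellow_banana[..., 1] = 180

g1 = mean_green_fraction(green_banana)
g2 = mean_green_fraction(yellow_banana)
print(predict_shelf_life_days(g1), predict_shelf_life_days(g2))
```

A real system would segment the fruit from the background and fit the slope and intercept from the daily feature recordings the second experiment produces.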
APA, Harvard, Vancouver, ISO, and other styles
30

Rafferty, Martin James. "A hiring and training model to build a diverse government employee base." Online version, 2004. http://www.uwstout.edu/lib/thesis/2004/2004raffertym.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
31

Luciano, Cristiana da Costa. "Ação de detergentes e desinfetantes em biofilme tradicional e buildup no modelo MBEC." Universidade Federal de Goiás, 2016. http://repositorio.bc.ufg.br/tede/handle/tede/7057.

Full text
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior - CAPES
INTRODUCTION: Flexible Gastrointestinal Endoscopes (FGE) are used for diagnostic and therapeutic procedures and are classified as semicritical health products (HP), requiring High-Level Disinfection (HLD) processing between users. FGE designs are complex, making processing difficult and favoring the occurrence of failures that contribute to the accumulation of organic matter on the surface of the internal channels of the endoscopes, contributing to the formation of biofilm. OBJECTIVE: To develop a Biofilm Buildup (BBF) accumulation model, based on repeated exposure to a test soil containing Enterococcus faecalis and Pseudomonas aeruginosa through repeated cycles of fixation, and to evaluate the ability of detergents and disinfectants to destroy and remove bacteria in the Traditional Biofilm (TBF) and buildup biofilm. MATERIALS AND METHODS: TBF was developed on MBEC™ pegs without hydroxyapatite, and BBF with hydroxyapatite, over a period of eight days. For the development of both biofilms, E. faecalis and P. aeruginosa at 8 log10 colony-forming units per square centimeter (CFU/cm2) were used. Prolystica Enzymatic (D1), Prolystica Neutral (D2), Neodisher (D3) and Endozime (D4) were tested alone and in combination with Glutaraldehyde (GLUT), Ortho-phthalaldehyde (OPA) and Accelerated Hydrogen Peroxide (APH) to determine whether both biofilms could be removed. Removal of the traditional and buildup biofilms was evaluated using viable bacteria counts, quantification of protein and carbohydrates, and scanning electron microscopy (SEM). RESULTS: After eight days of BBF development, 6.14 log10 CFU/cm2 of E. faecalis and 7.71 log10 CFU/cm2 of P. aeruginosa were reached. None of the detergents and disinfectants was able to remove the traditional and buildup biofilms or reduce the level of bacteria.
The combinations of detergents and disinfectants tested on BBF provided a reduction of 3 to 5 log10 in viable bacteria, but no combination could provide the expected log10 reduction. Only Prolystica Enzymatic and Endozime removed both E. faecalis (3.90 log10 colony-forming units per milliliter (CFU/mL)) and P. aeruginosa (3.96 log10 CFU/mL) for bacteria in suspension. None of the detergents tested removed > 1 log10 CFU/cm2 of the bacteria within the traditional biofilm. No combination of high-level disinfectant and detergent reduced the level of both E. faecalis and P. aeruginosa within the traditional biofilm (3 to 5 log10 CFU/cm2). Although the combination of Endozime and Glutaraldehyde achieved a 6 log10 reduction, it did not eliminate both bacteria in the traditional biofilm. CONCLUSION: The data indicate that, if TBF and BBF accumulate in FGE channels during repeated processing cycles, neither detergents nor high-level disinfectants will provide the expected level of bacterial removal or destruction. Future research using the buildup model can help develop new cleaning and disinfection methods that can prevent or eliminate BBF within endoscope channels.
APA, Harvard, Vancouver, ISO, and other styles
32

Huo, Jin. "Build and evaluate state estimation Models using EKF and UKF." Thesis, KTH, Fordonsdynamik, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-140947.

Full text
Abstract:
In vehicle control practice, there are some variables, such as lateral tire force, body slip angle and yaw rate, that cannot be measured directly and accurately, or only with difficulty. A vehicle model, like the bicycle model, offers an alternative way to obtain them indirectly; however, due to the widespread simplification and inaccuracy of vehicle models, there are always biases and errors in their predictions. When developing advanced vehicle control functions, it is necessary and significant to know these variables with relatively high precision. The Kalman filter offers a way to estimate these variables accurately by combining measurable variables with the vehicle model. In this thesis, estimation models based on the Extended Kalman Filter (EKF) and the Unscented Kalman Filter (UKF) are built separately to estimate the lateral tire force, body slip angle and yaw rate of two typical passenger vehicles. The Matlab toolbox EKF/UKF developed by Simo Särkkä et al. is used to implement the estimation models. By comparing their principles, algorithms and results, the better one for vehicle state estimation will be chosen and justified. The thesis is organized in the following 4 parts: First, EKF and UKF are studied in terms of their theory and features. Second, the vehicle model used for prediction in the Kalman filter is built and justified. Third, the algorithms of EKF and UKF for this specific case are analysed; EKF and UKF are then implemented based on these algorithms with the help of the Matlab toolbox EKF/UKF. Finally, comparisons between EKF and UKF are presented and discussed.
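The EKF recursion this abstract applies to vehicle states can be illustrated minimally. The sketch below is a hypothetical scalar toy system in Python, not the thesis's bicycle model or its Matlab EKF/UKF toolbox; the process model, noise levels and seed are all assumptions for demonstration.

```python
import math
import random

random.seed(0)
a, q, r = 0.9, 0.05, 0.4          # model gain, process var, measurement var

def ekf_step(x_est, p_est, y):
    # Predict through the nonlinear model, linearising with df/dx.
    x_pred = a * math.sin(x_est)
    F = a * math.cos(x_est)        # Jacobian of the process model
    p_pred = F * p_est * F + q
    # Update with the measurement (H = 1: the state is observed directly).
    k = p_pred / (p_pred + r)      # Kalman gain
    x_new = x_pred + k * (y - x_pred)
    p_new = (1 - k) * p_pred
    return x_new, p_new

# Simulate a trajectory and filter its noisy measurements.
x_true, x_est, p_est = 1.0, 0.0, 1.0
err_raw = err_ekf = 0.0
for _ in range(200):
    x_true = a * math.sin(x_true) + random.gauss(0, math.sqrt(q))
    y = x_true + random.gauss(0, math.sqrt(r))
    x_est, p_est = ekf_step(x_est, p_est, y)
    err_raw += (y - x_true) ** 2
    err_ekf += (x_est - x_true) ** 2
print(err_ekf < err_raw)   # filtered estimate should beat raw measurements
```

The UKF replaces the Jacobian linearisation with sigma-point propagation through the same process model, which is the comparison the thesis carries out.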
APA, Harvard, Vancouver, ISO, and other styles
33

Salama, Haythem S. "The selection of participants in the build operate transfer (BOT) model in Libya." Thesis, University of Reading, 2008. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.446212.

Full text
APA, Harvard, Vancouver, ISO, and other styles
34

Nwosa, Obiajulu C. "Extending the Petrel Model Builder for Educational and Research Purposes." Thesis, 2013. http://hdl.handle.net/1969.1/149438.

Full text
Abstract:
Reservoir simulation is a very powerful tool used in the oil and gas industry to perform various functions including, but not limited to, predicting reservoir performance, conducting sensitivity analysis to quantify uncertainty, optimizing production, and overall reservoir management. Compared to reservoirs explored in the past, present-day reservoirs are more complex in extent and structure. As a result, reservoir simulators and the algorithms used to represent dynamic systems of flow in porous media have invariably become just as complex. In order to provide the best solutions for analyzing reservoir performance, there is a need to continuously develop reservoir simulators and simulation algorithms that best represent the performance of the reservoir without compromising efficiency and accuracy. There exist several commercial reservoir simulation packages in the market that have proven to be extremely resourceful, with functionality that covers a wide range of interests in reservoir simulation, yet there is a constant need to provide better and more efficient methods and algorithms to study and manage our reservoirs. This thesis aims at bridging the gap in the framework for developing these algorithms. To this end, this project has both an educational and a research component. It is educational because it leads to a strong understanding of the topic of reservoir simulation for students, which can be daunting, especially for those who require a more direct experience to fully comprehend the subject matter. It is research focused because it will serve as the foundation for developing a framework for integrating custom-built external simulators and algorithms with the workflow of the model builder of our reservoir simulation package of choice, i.e. Petrel with the Ocean programming environment, in a seamless manner for simulating large-scale multi-physics problems of flow in highly heterogeneous porous media.
Of particular interest are the areas of model order reduction and production optimization. In-house algorithms are being developed for these areas of interest, and with the completion of this project we hope to have developed a framework whereby we can take our algorithms, specifically developed for these areas, and add them to the workflow of the Petrel Model Builder. Currently, we have taken one of our in-house simulators, a two-dimensional, oil-water five-spot waterflood pattern, as a starting point and have been able to integrate it successfully into the “Define Simulation Case” process of Petrel as an additional choice of simulator for an end user. In the future, we will expand this simulator with updates to improve its performance and efficiency and extend its capabilities to incorporate areas of research interest.
APA, Harvard, Vancouver, ISO, and other styles
35

Lin, Yi-Chun, and 林怡君. "A Study of Risk Evaluation for Debris Flow by Model Builder Method." Thesis, 2011. http://ndltd.ncl.edu.tw/handle/22963019025568552103.

Full text
Abstract:
Master's thesis
Chinese Culture University
Department of Architecture and Urban Design
99
Taiwan's natural environment is prone to debris flows. As the economy develops, the need for land resources becomes more urgent, so hillside development has become an inevitable trend. In recent years, the occurrence of natural disasters, especially landslides in mountain areas, has increased. In order to reduce and prevent disasters caused by landslides, a debris flow analysis must be done to understand the locations at risk of debris flow and their level of danger. This study uses the literature review method, spatial statistical analysis, and the Model Builder method as its theoretical basis and, with the basic information of the study area managed through a geographic information system, creates a comprehensive archive to support vulnerability assessment of buildings, roads, and population exposed to potential debris flow disasters and risk factors. This research uses a Geographic Information System (GIS) for database management of the spatial data on debris flow risk factors and compiles the attribute data. With three types of Model Builder modules (environmental potential, physical environment, and socio-economic), the entire work process is presented through a flow chart and an automated operation process is established. Once the modules have run, the debris flow hazard risk and vulnerability analysis can be conducted. According to the analysis, for building vulnerability the maximum vulnerable building area is 1.67 ha, followed by a risk area of 1.38 ha, while the at-risk buildings in the Nikko area covered 1.28 ha. For road vulnerability, it can be seen that when debris flows occur, the Taipei-Ilan road serves as the emergency road, and together with the fire road used for disaster relief and the designated evacuation road sections, these sections face the risk of being blocked by debris flow landslides.
For socio-economic vulnerability, the vulnerable populations of young and elderly people number fewer than a thousand. It can therefore be inferred that the hillside study area has relatively high vulnerability compared with the urban area, and about forty-one Li (neighborhoods) in the urban area are without debris flow disaster risk. Presenting vulnerability through these three aspects can support different facets of debris flow disaster risk management, and the overall vulnerability assessment results, which reflect the current status of the urban physical environment, are of great reference value for disaster prevention and related development plans. In addition, comparing manual operation with the automated modules shows that Model Builder achieves a multiplier effect: it is less time-consuming, and manual operation costs more time and money than the Model Builder automation module.
APA, Harvard, Vancouver, ISO, and other styles
36

Tsai, Chen-Cheng, and 蔡振成. "The design and implementation of 3D model builder for 3D Electronic Story Book." Thesis, 1996. http://ndltd.ncl.edu.tw/handle/72342889981546684451.

Full text
Abstract:
Master's thesis
National Chiao Tung University
Department of Computer Science and Information Engineering
84
Unlike traditional story books, electronic story books can interact with users under program control to show their ability to support multimedia demonstration. Thus, the most significant feature provided by electronic story books is their interactive ability with users. With multimedia interactive demonstration, readers can simply use the mouse to change the effect of the story scenario (such as animation). With 3D animation, electronic story books can bring users into another spatial dimension and become more attractive to common users. Creating 3D animation is not an easy job; rather, it is a time-consuming process. In this thesis, our objective is to design a 3D model builder to ease 3D animation creation. Through this designed system, users can create the 3D animations they wish through a visual authoring environment. This tool also provides the ability to preview the result of the animation being edited. Allowing users to correct the animation immediately is another consideration for this designed tool. These important features for reducing 3D animation creation time are discussed and demonstrated in this designed tool.
APA, Harvard, Vancouver, ISO, and other styles
37

Bastos, João Nuno Pereira. "Emerging startup studios in Portugal : organizational characteristics of Portuguese startup studios." Master's thesis, 2019. http://hdl.handle.net/10400.14/26911.

Full text
Abstract:
There is an emerging trend in the entrepreneurial landscape which promises to become revolutionary: Startup Studios. This emerging incubation model is generally characterized by the development of internally generated ideas and the creation of its own cohort of startups, supported by internal resources and a multidisciplinary team. Despite the Startup Studio model already being widely implemented in bigger and more developed entrepreneurial ecosystems, the infancy of the underlying concept and the diverging ways each Startup Studio organizes itself, combined with the lack of academic research on this topic, leave the definition of the Startup Studio concept unclear. This thesis aims to clarify the definition of the Startup Studio concept and its characteristics. Moreover, it focuses on the implementation of the studio incubator model in smaller and emerging entrepreneurial ecosystems to understand whether it can be a viable alternative to foster innovation and prompt economic development. Thus, two case studies of Startup Studios in Portugal were conducted. The findings of this analysis revealed that elements like funding, type of founders and exit strategy are context specific, differing from Startup Studios in bigger economies. However, elements like the ideation process, equity distribution, operations and team set-up of Startup Studios in smaller and emerging entrepreneurial ecosystems are similar to those found in more developed entrepreneurial ecosystems. Based on this evidence, a definition of the Startup Studio concept is suggested, as well as propositions on the possible implications of the studio incubator model in smaller economies.
APA, Harvard, Vancouver, ISO, and other styles
38

Ha, Mi-Ae 1979. "Optimizing Feedstock Logistics and Assessment of Hydrologic Impacts for Sustainable Bio-Energy Production." Thesis, 2012. http://hdl.handle.net/1969.1/148247.

Full text
Abstract:
Rising world petroleum prices and global warming are contributing to interest in renewable energy sources, including energy produced from agricultural crops and waste sources of biomass. A network of small mobile pyrolysis units may be the most cost-effective system to convert biomass from agricultural feedstocks to bio-crude oil. Mobile pyrolysis units could be moved to the feedstock production fields, thereby greatly simplifying feedstock logistics. In the North Central (NC) region of the U.S., possible feedstocks are corn stover, energy sorghum, and switchgrass. A grid-based Geographic Information System (GIS) program was developed to identify optimum locations for mobile pyrolysis units based on feedstock availability in the NC region. Model Builder was used to automate the GIS analysis. Network analysis was used to find the best route to move the mobile pyrolysis units to new locations and to identify the closest refinery to transport the bio-crude oil. To produce bioenergy from feedstocks, the removal of biomass from agricultural fields will impact the hydrology and sediment transport in rural watersheds. Therefore, the hydrologic effects of removing corn stover from corn production fields in Illinois (IL) were evaluated using the Soil and Water Assessment Tool (SWAT). The SWAT model was calibrated and validated for streamflow and sediment yields in the Spoon River basin in IL using observed data from the USGS. The modeling results indicated that as residue removal rates increased, evapotranspiration (ET) and sediment yields increased, while streamflows decreased. Biochar is a carbon-based byproduct of pyrolysis. To ensure that the mobile pyrolysis system is economically and environmentally sustainable, the biochar must be land-applied to the feedstock production fields as a soil amendment.
An assessment of hydrologic changes due to the land application of biochar was made using the SWAT model in the Spoon River basin, with the changes in soil properties due to incorporation of biochar into the soil obtained from laboratory experiments by Cook et al. (2012). Model simulations indicated that a biochar application rate of 128 Mg/ha decreased water yield and sediment yield in surface runoff and increased soil moisture and ET.
APA, Harvard, Vancouver, ISO, and other styles
39

Hohls, Ronja. "Corporate innovation - do corporate company builder have the potential to build viable business models outside their parent companies core business activities?" Master's thesis, 2019. http://hdl.handle.net/10362/106989.

Full text
Abstract:
Corporate Innovation - Do Corporate Company Builders have the Potential to build viable Business Models outside their Parent Companies' Core Business Activities? This work project focuses on the corporate innovation strategy of corporate company building, a recent phenomenon in practice following successful independent company builders incubating new ventures in a factory-like manner. By analyzing data from the relevant literature, a framework of company-building differentiation dimensions was developed in order to draw conclusions on whether corporate company building entities have the potential to build viable business models outside their parent companies' core business activities. Expert interviews were used to validate the framework, with its dimensions of objective, program focus and program organization, and to answer the corresponding research question.
APA, Harvard, Vancouver, ISO, and other styles
40

Chao, Wei-Sheng, and 趙偉勝. "Using genetic algorithm integrated state space model to build stock forecasting models." Thesis, 2002. http://ndltd.ncl.edu.tw/handle/25683230279786288250.

Full text
Abstract:
Master's thesis
National Taipei University
Department of Business Administration
90
This research combined statistical and artificial intelligence techniques to determine whether the Taiwan Weighted Stock Index exhibits predictability. The purposes of this paper are listed below: 1. By integrating the search ability of the genetic algorithm (GA) into the State Space Model and building reasonable statistical frameworks, this research tried to find a nonlinear function of short-term stock behavior. 2. By comparing the models built in this research with the buy-and-hold strategy, we can determine which performs better. 3. To test the models of this research, we compared the performance of the model built here with that of a time series model. One of the most significant characteristics of the genetic algorithm is its massively parallel optimizing ability. The technical indexes were calculated from price and volume information and were chosen automatically by the GA. Then we used a two-step method to integrate the GA into the State Space Model. Two of the main results of this research are listed below: 1. The technical indexes used in this research comprise 163 varieties, and each chromosome selects ten of them. Thus the search space is C(163, 10), about 2.753064116×10^15. For practical purposes, the number of chromosomes and generations was chosen and restricted by a reasonable time frame and the data-processing capacity available in this research. The forecasting ability might improve further if a company or an organization has higher data-processing capacity. 2. Because of the characteristics of the GA, the forecasting models will not be the same every time. Through many rounds of in-sample and out-of-sample testing, this research can stably make profits over the long term, so the results of this research should be valid.
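The chromosome encoding described above (each chromosome selecting ten of the 163 technical indexes) can be sketched as a small genetic algorithm. This Python is a hypothetical illustration: the fitness function is a synthetic placeholder, whereas in the thesis it would come from the forecasting performance of the state-space model built on the selected indexes.

```python
import math
import random

random.seed(1)
N, K = 163, 10                      # 163 candidate indexes, choose 10
SPACE = math.comb(N, K)             # search space, ~2.75e15 combinations

def fitness(chromo):
    # Synthetic stand-in: reward chromosomes that pick low index numbers.
    return -sum(chromo)

def random_chromo():
    return sorted(random.sample(range(N), K))

def crossover(p1, p2):
    # Child draws its genes from the union of both parents' genes.
    pool = list(set(p1) | set(p2))
    return sorted(random.sample(pool, K))

def mutate(chromo, rate=0.1):
    chromo = list(chromo)
    for i in range(K):
        if random.random() < rate:
            replacement = random.randrange(N)
            if replacement not in chromo:
                chromo[i] = replacement
    return sorted(chromo)

pop = [random_chromo() for _ in range(30)]
initial_best = max(map(fitness, pop))
for _ in range(40):                 # generations
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]              # elitist truncation selection
    pop = parents + [mutate(crossover(*random.sample(parents, 2)))
                     for _ in range(20)]
best = max(pop, key=fitness)
```

Because the top parents are carried over each generation, the best fitness never decreases, mirroring the stability the abstract claims across repeated runs.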
APA, Harvard, Vancouver, ISO, and other styles
41

Dodd, Samuel Tommy. "Merchandising the postwar model house at the Parade of Homes." Thesis, 2009. http://hdl.handle.net/2152/ETD-UT-2009-08-345.

Full text
Abstract:
The Parade of Homes began in 1948 as a novel form of sales merchandising and publicity. The model house, on display at the Parade of Homes, was a powerful advertising tool employed by postwar merchant-builders to sell modern design to a new market of informed consumers and second-time homeowners. Using House & Home as a primary source, I contextualize the postwar housing industry and the merchandising efforts of builders. Then, through an examination of the 1955 Parade of Homes in Houston, Texas, I analyze the early Parade of Homes events and the language of domestic modernism that they showcased.
APA, Harvard, Vancouver, ISO, and other styles
42

Chi-Tung, Lai, and 賴啟東. "Development of Decision Model for Builders to Select Urban Renewal Projects." Thesis, 2013. http://ndltd.ncl.edu.tw/handle/6zrx82.

Full text
Abstract:
Master's thesis
National Taipei University of Technology
EMBA Program in Industrial Engineering and Management, College of Management
101
Taipei City is the capital of the Republic of China; the main government administration is established here, and most business practices and economic activities take place in this city as well. Taipei City provides abundant resources and advantages in many areas, such as city construction and comprehensive social welfare measures; in addition, health care, education and job opportunities have attracted the majority of the domestic population to settle here. Taipei has become the nation's most densely populated region, and housing demand is also the highest in the country. With a large immigrant population, the government and construction companies have deliberately and massively developed the city, so Taipei City has become a place of high land costs, with prices that are unattainable. It is difficult for ordinary people to own a house, and construction companies also face difficulty in obtaining land as costs continue to rise. Worse, much of the land is almost fully developed, and land values and house prices keep breaking records. Therefore, the only economical solution is urban renewal, such as old house demolition and reconstruction, to improve the living environment. This solution allows builders to avoid huge land costs and may therefore create a business opportunity with a win-win situation. The promotion of urban renewal by the government provides an opportunity for cooperation between construction companies and the owners of old apartments. However, few cases have ended in success. To successfully complete an urban renewal case, some elements are indispensable: government policy support, the involvement of a construction company, and the agreement of the original households. In this paper, semi-structured interviews and the SMART decision technique were applied from the builder's perspective to develop a model for the decision to invest in urban renewal projects.
APA, Harvard, Vancouver, ISO, and other styles
43

Liu, Mei-Chien, and 劉美倩. "To build the model of Information Asset Evaluation Model." Thesis, 2004. http://ndltd.ncl.edu.tw/handle/04963390921631948111.

Full text
Abstract:
Master's thesis
Chinese Culture University
In-service Master's Program, Graduate Institute of Information Management
92
Evaluating intangible assets involves four dimensions: intellectual capital, accounting, risk assessment of assets (BS7799), and IT baseline protection management. These are all related to standards for analyzing and evaluating assets, and each dimension contains many evaluation rules. This research surveys the related asset definitions and measurements to build an evaluation model. This information asset evaluation model can be used to evaluate asset values in the market. The model integrates the collected methodologies according to a standard evaluation process, taking into account security risk assessment factors, IT intangible asset protection values, evaluation methods, and knowledge values. This research proposes five key factors that influence the evaluation model: calculating asset incurrence, evaluating a suitable way to measure asset rental, risk assessment, how much can still be invested in the asset, and whether the asset should be replaced by a new one. Taking all these factors into consideration allows the true value of the asset to be evaluated.
APA, Harvard, Vancouver, ISO, and other styles
44

Chang, Chao-Kai, and 張朝凱. "Build a Taiwan macro-econometric model." Thesis, 2006. http://ndltd.ncl.edu.tw/handle/35330632456587609321.

Full text
Abstract:
Master's thesis
National Taiwan University
Graduate Institute of Economics
94
The purpose of this thesis is to build a macro-econometric model for the Taiwan economy, and then to use it to predict the future economy and to perform scenario analysis when unpredictable shocks happen. We generate predictions of the economy from the first quarter of 2005 to the fourth quarter of 2008. Moreover, we conduct scenario analyses concerning variations in international circumstances, for example fluctuations in the oil price per barrel and China's economic growth. In both cases, we also show their impacts on the government's fiscal budget. The GDP growth rate predicted by this macro-econometric model from the first quarter of 2005 to the fourth quarter of 2008 ranges between 2% and 6%, and the predicted unemployment rate ranges between 4.07% and 5.06%. The trends of other financial variables ascend gradually and smoothly. Generally speaking, the model's predictions perform well, without any sharp rises or falls. The results of the scenario analysis demonstrate that the oil price still has a large impact on the domestic economy: when the price per barrel rises to $80 permanently, most price indexes increase quickly and the domestic economy declines. Directions for future study and improvement are given at the end of this thesis.
APA, Harvard, Vancouver, ISO, and other styles
45

Wang, Chao-Hsin, and 王昭鑫. "Utilizing XML to Build EPCIS Communication Model." Thesis, 2006. http://ndltd.ncl.edu.tw/handle/84034207152859257666.

Full text
Abstract:
Master's thesis
Tatung University
Department of Information Management
94
Radio Frequency Identification (RFID) works efficiently for identification. RFID can reduce waste and negligence in production to achieve automated manufacturing, and the Electronic Product Code (EPC™) stored in the tags can be used to share information and extend to other areas such as manufacturing, transportation, and service. An RFID system cannot work with tags, readers, and middleware alone: without data storage and query functions, the information collected by an RFID system is nothing more than a pile of data in a database. Furthermore, a working RFID system needs to connect with information systems within and among businesses. Therefore, EPCIS (EPC Information Services) was established. Although the application of RFID within an organization is mature, its application between organizations is still in exploratory development; there is no complete model to refer to, so its real-world applicability remains uncertain. This study therefore focuses on the functions and operational model of EPCIS. We discuss the functions of EPCIS within an organization to understand how it communicates with internal systems via XML (eXtensible Markup Language), and we build a prototype system to prove the concept of the message-exchange procedure. The functions include tracing inventory, improving transaction procedures, and communication among heterogeneous systems, in order to set up an RFID reference for enterprises.
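The XML message exchange the abstract describes could be prototyped along the following lines. The element names loosely follow the EPCIS ObjectEvent vocabulary but are simplified and illustrative, not the thesis's actual schema; a real EPCIS payload carries namespaces and many more fields.

```python
import xml.etree.ElementTree as ET

def build_object_event(epc, biz_step):
    """Build a simplified EPCIS-style ObjectEvent message as an XML string."""
    event = ET.Element("ObjectEvent")
    epc_list = ET.SubElement(event, "epcList")
    ET.SubElement(epc_list, "epc").text = epc
    ET.SubElement(event, "action").text = "OBSERVE"
    ET.SubElement(event, "bizStep").text = biz_step
    return ET.tostring(event, encoding="unicode")

def parse_object_event(xml_text):
    """Extract the EPC and business step from a received message."""
    root = ET.fromstring(xml_text)
    return root.findtext("epcList/epc"), root.findtext("bizStep")

msg = build_object_event("urn:epc:id:sgtin:0614141.107346.2017", "shipping")
print(parse_object_event(msg))
```

Because both sides exchange plain XML text, heterogeneous systems only need to agree on the element vocabulary, which is the interoperability point the abstract emphasizes.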
APA, Harvard, Vancouver, ISO, and other styles
46

Guo, Feng-Yi, and 郭奉宜. "Schedule Planning Model for Design/Build Projects." Thesis, 2005. http://ndltd.ncl.edu.tw/handle/86627541331129247022.

Full text
APA, Harvard, Vancouver, ISO, and other styles
47

Chu, Jung-Te, and 朱容德. "Build Rings Upsetting Friction Model by RSM." Thesis, 2012. http://ndltd.ncl.edu.tw/handle/93041837564412005250.

Full text
Abstract:
Master's thesis
National Central University
Graduate Institute of Mechanical Engineering
100
In this thesis, we study the friction behavior in forging between die and material. We made specimens and dies with different surface morphologies and upset them under lubricated conditions. To optimize efficiency and reduce the number of experiments, we used response surface methodology (RSM) to design and analyze the results of the ring-upsetting experiments. The model we build combines the isotropic and anisotropic deformation of ring upsetting. We compare the performance of the different models and discuss their composition.
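The core of RSM is fitting a second-order polynomial response surface to experimental results by least squares. The sketch below is generic and uses synthetic data with illustrative variable names, not the thesis's actual friction model or measurements.

```python
import numpy as np

def fit_response_surface(x, y, z):
    """Fit z ≈ b0 + b1*x + b2*y + b3*x^2 + b4*y^2 + b5*x*y
    by ordinary least squares and return the six coefficients."""
    X = np.column_stack([np.ones_like(x), x, y, x**2, y**2, x * y])
    coef, *_ = np.linalg.lstsq(X, z, rcond=None)
    return coef

# Synthetic "calibration" data (purely illustrative):
# x might stand for surface roughness, y for reduction ratio,
# z for an observed friction response.
rng = np.random.default_rng(0)
x = rng.uniform(0, 1, 30)
y = rng.uniform(0, 1, 30)
z = 0.2 + 0.5 * x - 0.3 * y + 0.1 * x * y + rng.normal(0, 0.01, 30)

coef = fit_response_surface(x, y, z)
print(coef.round(3))
```

Once fitted, the surface can be evaluated cheaply anywhere in the design space, which is how RSM reduces the number of physical upsetting experiments required.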
APA, Harvard, Vancouver, ISO, and other styles
48

Kang, Chin-chih, and 康鈞誌. "Using model tree to build up diagnostic model of semiconductor probing time." Thesis, 2009. http://ndltd.ncl.edu.tw/handle/00040579823192547934.

Full text
Abstract:
Master's thesis
National Cheng Kung University
Department of Industrial and Information Management (in-service program)
97
We live in an age in which information flows very fast, so how to convert data into useful information has become an important issue. Model trees have a tree structure with leaves containing linear regression models; they can reveal the relationships between attributes and the information embedded in data. The data gathered in a wafer-testing factory contain attributes about operators, machines, materials, and processes, and an attribute can be either numeric or nominal. Model trees were originally designed for processing numeric attributes; M5' can deal with both attribute types, hence this study adopts that tool for building model trees. Before building a model tree, the data are preprocessed through integration, cleaning, transformation, and normalization. The data are then divided into normal and abnormal groups based on their wafer-testing times. The model trees built for the two groups are compared to distinguish their differences in internal nodes and leaf nodes. The attributes that are critical for increasing the test time of a wafer are extracted, and this information lets the related staff know the appropriate steps for dealing with an abnormal wafer.
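The "linear models in the leaves" idea behind model trees can be illustrated with a one-level sketch: choose the split threshold that minimizes the summed squared error of linear models fitted in the two resulting leaves. This is a simplified illustration, not the M5' algorithm itself, which also grows deeper trees, prunes, and smooths.

```python
import numpy as np

def fit_leaf(X, y):
    """Least-squares linear model (with intercept) for one leaf."""
    A = np.column_stack([np.ones(len(X)), X])
    w, *_ = np.linalg.lstsq(A, y, rcond=None)
    return w

def leaf_sse(X, y):
    """Sum of squared residuals of the leaf's linear model."""
    A = np.column_stack([np.ones(len(X)), X])
    resid = y - A @ fit_leaf(X, y)
    return float(resid @ resid)

def best_split(X, y, attr=0, min_leaf=5):
    """Threshold on `attr` minimizing total SSE of the two leaf models."""
    best_t, best_score = None, np.inf
    for t in np.unique(X[:, attr])[1:]:
        left = X[:, attr] < t
        if left.sum() < min_leaf or (~left).sum() < min_leaf:
            continue
        score = leaf_sse(X[left], y[left]) + leaf_sse(X[~left], y[~left])
        if score < best_score:
            best_t, best_score = t, score
    return best_t

# Demo on synthetic data with a regime change at 0.5.
rng = np.random.default_rng(1)
x = rng.uniform(0, 1, 200)
y = np.where(x < 0.5, 1 + 2 * x, 10 - 3 * x)
print(best_split(x.reshape(-1, 1), y))  # threshold close to 0.5
```

Recursing the same split search inside each leaf is what produces the full tree whose internal nodes the study compares between the normal and abnormal wafer groups.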
APA, Harvard, Vancouver, ISO, and other styles
49

Su, Yi-Jing, and 蘇怡靜. "Integrating ICA and DEA Models to Build and Analysis of customers from the Credit Card Behavior Scoring Model." Thesis, 2008. http://ndltd.ncl.edu.tw/handle/5pn7e5.

Full text
Abstract:
Master's thesis
National Taipei University of Technology
Graduate Institute of Commerce Automation and Management
96
Competition in the finance industry is more and more vigorous. For most banking organizations, the credit card business is considered highly important among their consumer businesses, and the effectiveness of the credit cards issued directly impacts the banks' profits. Therefore, operating customer relationship management to consolidate valuable customers has become one of the most important topics. The industry should take improving each customer's purchasing amount as the goal and raise the bank's interest income. This study estimates labels for customers from their credit-card usage behavior using DEA (Data Envelopment Analysis). We then analyze the DEA weights using ICA (Independent Component Analysis) to interpret the characteristics of each DMU. The results can be used to adjust credit lines, revolving charges, and processing fees.
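The DEA efficiency scores the abstract mentions can be computed from the standard CCR multiplier model as a linear program: maximize the weighted outputs of the evaluated DMU subject to its weighted inputs equaling 1 and no DMU exceeding efficiency 1. The sketch below uses SciPy and synthetic data and is a generic illustration, not the thesis's actual formulation.

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_efficiency(X, Y, k):
    """Input-oriented CCR efficiency of DMU k (multiplier form).

    X: (n_dmus, n_inputs), Y: (n_dmus, n_outputs).
    Maximize u.y_k  s.t.  v.x_k = 1  and  u.y_j - v.x_j <= 0 for all j.
    Decision variables are [u (output weights), v (input weights)].
    """
    n, m = X.shape
    _, s = Y.shape
    c = np.concatenate([-Y[k], np.zeros(m)])            # maximize u.y_k
    A_ub = np.hstack([Y, -X])                           # u.y_j - v.x_j <= 0
    b_ub = np.zeros(n)
    A_eq = np.concatenate([np.zeros(s), X[k]])[None]    # v.x_k = 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (s + m), method="highs")
    return -res.fun

# Three hypothetical cardholders as DMUs: one input, one output each.
X = np.array([[1.0], [2.0], [2.0]])
Y = np.array([[1.0], [2.0], [1.0]])
print([round(dea_ccr_efficiency(X, Y, k), 3) for k in range(3)])
```

The optimal weight vectors (u, v) recovered per DMU are exactly the kind of DEA weights that a second-stage analysis such as ICA could then examine across customers.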
APA, Harvard, Vancouver, ISO, and other styles
50

Tien, Shiao-Hua, and 田曉華. "A Study on build hospital’s material acceptance model." Thesis, 2004. http://ndltd.ncl.edu.tw/handle/05612226275165361795.

Full text
Abstract:
Master's thesis
National Yunlin University of Science and Technology
Graduate School of Industrial Engineering and Management
92
Medical materials are indispensable to hospitals, and purchasing medical materials that meet hospital standards is a part of the medical quality assurance plan that cannot be ignored. However, the acceptance procedure in most hospitals only checks quantities and gathers accounting data; the quality inspection of medical materials is hard to describe concretely, and acceptors seldom perform it. The medical-material acceptance procedure is therefore a subject that should not be neglected. This study helps hospitals build an effective acceptance system for purchased medical materials. It analyzes the problems hospitals encounter in using medical materials, gathers information on the properties of medical materials in Taiwan, and then examines whether the laws clearly define the acceptance procedure for medical materials. First, I interviewed 14 experts, including medical-material manufacturers, hospital users, and government experts, and used the Delphi method to gather their opinions. I collected tentative programs based on the experts' opinions on medical materials, separated into three components: the characteristics of medical materials, the reference standards for inspection, and the acceptance-sampling plan. Finally, the study verifies that the method can be applied to hospitals; the case-study objects, the Southern District Hospital Alliance of the Department of Health, Executive Yuan, were selected to build the acceptance model.
APA, Harvard, Vancouver, ISO, and other styles
