Dissertations / Theses on the topic 'Metric estimation'
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the top 50 dissertations / theses for your research on the topic 'Metric estimation.'
Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.
Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.
O'Loan, Caleb J. "Topics in estimation of quantum channels." Thesis, University of St Andrews, 2010. http://hdl.handle.net/10023/869.
Kazzazi, Seyedeh Mandan. "Dental metric standards for sex estimation in archaeological populations from Iran." Thesis, University of Edinburgh, 2018. http://hdl.handle.net/1842/31067.
Tabulo, Moti M. "Radio resource management and metric estimation for multi-carrier CDMA systems." Thesis, University of Edinburgh, 2005. http://hdl.handle.net/1842/13065.
Strobel, Matthias. "Estimation of minimum mean squared error with variable metric from censored observations." [S.l. : s.n.], 2008. http://nbn-resolving.de/urn:nbn:de:bsz:93-opus-35333.
Engström, Isak. "Automated Gait Analysis : Using Deep Metric Learning." Thesis, Linköpings universitet, Medie- och Informationsteknik, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-178139.
Full textExamensarbetet är utfört vid Institutionen för teknik och naturvetenskap (ITN) vid Tekniska fakulteten, Linköpings universitet
Winden, Matthew Wayne. "INTEGRATING STATED PREFERENCE CHOICE ANALYSIS AND MULTI-METRIC INDICATORS IN ENVIRONMENTAL VALUATION." The Ohio State University, 2012. http://rave.ohiolink.edu/etdc/view?acc_num=osu1343325594.
Schenkel, Flávio Schramm. "Studies on effects of parental selection on estimation of genetic parameters and breeding values of metric traits." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1999. http://www.collectionscanada.ca/obj/s4/f2/dsk2/ftp03/NQ35812.pdf.
Khodabandeloo, Babak, Dyan Melvin, and Hongki Jo. "Model-Based Heterogeneous Data Fusion for Reliable Force Estimation in Dynamic Structures under Uncertainties." MDPI AG, 2017. http://hdl.handle.net/10150/626477.
Rojas, Christian Andres. "Demand Estimation with Differentiated Products: An Application to Price Competition in the U.S. Brewing Industry." Diss., Virginia Tech, 2005. http://hdl.handle.net/10919/28916.
Full textPh. D.
Paditz, Ludwig. "On the error-bound in the nonuniform version of Esseen's inequality in the Lp-metric." Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2013. http://nbn-resolving.de/urn:nbn:de:bsz:14-qucosa-112888.
Full textDas Anliegen dieses Artikels besteht in der Untersuchung einer bekannten Variante der Esseen'schen Ungleichung in Form einer ungleichmäßigen Fehlerabschätzung in der Lp-Metrik mit dem Ziel, eine numerische Abschätzung für die auftretende absolute Konstante L zu erhalten. Längere Zeit erweckten die Ergebnisse, die von verschiedenen Autoren angegeben wurden, den Eindruck, dass die ungleichmäßige Fehlerabschätzung im interessantesten Fall δ=1 nicht möglich wäre, weil auf Grund der geführten Beweisschritte der Einfluss von δ auf L in der Form L=L(δ)=O(1/(1-δ)), δ->1-0, beobachtet wurde, wobei 2+δ, 0<δ<1, die Ordnung der vorausgesetzten Momente der betrachteten unabhängigen Zufallsgrößen X_k, k=1,2,...,n, angibt. Erneut wird die Methode der konjugierten Verteilungen angewendet und die gut bekannte Beweistechnik verbessert, um im interessantesten Fall δ=1 die Endlichkeit der absoluten Konstanten L nachzuweisen und um zu zeigen, dass L=L(1)=<127,74*7,31^(1/p), p>1, gilt. Im Fall 0<δ<1 wird nur die analytische Struktur von L herausgearbeitet, jedoch ohne numerische Berechnungen. Schließlich wird mit einem Beispiel zur Normalapproximation von Summen l_2-wertigen Zufallselementen die Anwendung der gewichteten Fehlerabschätzung im globalen zentralen Grenzwertsatz demonstriert
Muller, Jacob. "Higher order differential operators on graphs." Licentiate thesis, Stockholms universitet, Matematiska institutionen, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:su:diva-178070.
Full textCarbonera, Luvizon Diogo. "Apprentissage automatique pour la reconnaissance d'action humaine et l'estimation de pose à partir de l'information 3D." Thesis, Cergy-Pontoise, 2019. http://www.theses.fr/2019CERG1015.
3D human action recognition is a challenging task due to the complexity of human movements and to the variety of poses and actions performed by distinct subjects. Recent technologies based on depth sensors can provide 3D human skeletons with low computational cost, which is useful information for action recognition. However, such low-cost sensors are restricted to controlled environments and frequently output noisy data. Meanwhile, convolutional neural networks (CNN) have shown significant improvements on both action recognition and 3D human pose estimation from RGB images. Despite being closely related problems, the two tasks are frequently handled separately in the literature. In this work, we analyze the problem of 3D human action recognition in two scenarios: first, we explore spatial and temporal features from human skeletons, which are aggregated by a shallow metric learning approach. In the second scenario, we not only show that precise 3D poses are beneficial to action recognition, but also that both tasks can be efficiently performed by a single deep neural network that still achieves state-of-the-art results. Additionally, we demonstrate that end-to-end optimization using poses as an intermediate constraint leads to significantly higher accuracy on the action task than separate learning. Finally, we propose a new scalable architecture for real-time 3D pose estimation and action recognition simultaneously, which offers a range of performance-versus-speed trade-offs with a single multimodal and multitask training procedure.
Akcay, Koray. "Performance Metrics For Fundamental Estimation Filters." Master's thesis, METU, 2005. http://etd.lib.metu.edu.tr/upload/12606510/index.pdf.
This thesis compares the fundamental estimation filters (Alpha-Beta Filter, Alpha-Beta-Gamma Filter, Constant Velocity (CV) Kalman Filter, Constant Acceleration (CA) Kalman Filter, Extended Kalman Filter, 2-model Interacting Multiple Model (IMM) Filter and 3-model IMM) with respect to their resource requirements and performance. In the resource requirement part, the fundamental estimation filters are compared according to their CPU usage, memory needs and complexity. The fundamental estimation filter that needs the fewest resources is the Alpha-Beta Filter. In the performance evaluation part of this thesis, the performance metrics used are Root-Mean-Square Error (RMSE), Average Euclidean Error (AEE), Geometric Average Error (GAE) and normalized forms of these. The normalized form of the performance metrics makes the measure of error independent of range and of the length of the trajectory. The fundamental estimation filters and performance metrics are implemented in MATLAB. The Monte Carlo simulation method and 6 different air trajectories are used for testing. Test results show that the performance of the fundamental estimation filters varies according to the trajectory and the target dynamics used in constructing the filter. Consequently, filter performance is application-dependent. Therefore, before choosing an estimation filter, the most probable target dynamics, hardware resources and acceptable error level should be investigated. An estimation filter which matches these requirements will be 'the best estimation filter'.
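The three error metrics named in this abstract have standard definitions; a minimal sketch follows, assuming per-step position errors are held in an (n, d) NumPy array (variable names are illustrative, not from the thesis):

```python
import numpy as np

def tracking_error_metrics(estimates: np.ndarray, truths: np.ndarray) -> dict:
    """Compute RMSE, AEE and GAE over a trajectory of shape (n_steps, dim)."""
    norms = np.linalg.norm(estimates - truths, axis=1)  # per-step Euclidean error
    return {
        "RMSE": float(np.sqrt(np.mean(norms ** 2))),           # root-mean-square error
        "AEE": float(np.mean(norms)),                          # average Euclidean error
        "GAE": float(np.exp(np.mean(np.log(norms + 1e-12)))),  # geometric average error
    }

# Example: score a noisy estimate of a random-walk trajectory.
rng = np.random.default_rng(0)
truth = np.cumsum(rng.normal(size=(100, 2)), axis=0)
print(tracking_error_metrics(truth + rng.normal(0.5, 1.0, (100, 2)), truth))
```

The small epsilon inside the logarithm only guards against a zero error at some step; normalized variants would divide by range or trajectory length, as the abstract describes.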
Hwang, Sung Jun. "Communication over Doubly Selective Channels: Efficient Equalization and Max-Diversity Precoding." The Ohio State University, 2010. http://rave.ohiolink.edu/etdc/view?acc_num=osu1261506237.
Full textAndersson, Veronika, and Hanna Sjöstedt. "Improved effort estimation of software projects based on metrics." Thesis, Linköping University, Department of Electrical Engineering, 2005. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-5269.
Saab Ericsson Space AB develops products for space for a predetermined price. Since the price is fixed, it is crucial to have a reliable prediction model to estimate the effort needed to develop the product. Software effort estimation is difficult in general, and at the software department this is a problem.
By analyzing metrics collected from former projects, different prediction models are developed to estimate the number of person-hours a software project will require. Models for predicting the effort before a project begins are developed first. Only a few variables are known at this stage of a project. The models developed are compared to a current model used at the company. Linear regression models improve the estimation error by nine percentage points, and nonlinear regression models improve the result even more. The model used today is also calibrated to improve its predictions. A principal component regression model is developed as well, along with a model to improve the estimate during an ongoing project. This is a new approach, and comparison with the first estimate is the only evaluation.
The result is an improved prediction model. Several models perform better than the one used today. In the discussion, positive and negative aspects of the models are debated, leading to the choice of a model recommended for future use.
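As a generic illustration of the regression approach described here (not the company's actual model or variables, which the abstract does not disclose), a pre-project effort estimate from a few early metrics could be fit like this:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical early-project metrics for four past projects:
# [estimated requirements count, team size, reuse fraction]
X = np.array([[120, 5, 0.3], [80, 3, 0.5], [200, 8, 0.1], [150, 6, 0.4]])
y = np.array([3400, 1900, 6100, 4200])  # actual person-hours (made up)

model = LinearRegression().fit(X, y)
new_project = np.array([[100, 4, 0.35]])
print(f"Predicted effort: {model.predict(new_project)[0]:.0f} person-hours")
```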
Asif, Sajjad. "Investigating Web Size Metrics for Early Web Cost Estimation." Thesis, Blekinge Tekniska Högskola, Institutionen för programvaruteknik, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-16036.
Archibald, Colin J. "A software testing estimation and process control model." Thesis, Durham University, 1998. http://etheses.dur.ac.uk/4735/.
Full textNowak, James. "Integrated Population Models and Habitat Metrics for Wildlife Management." Doctoral thesis, Université Laval, 2015. http://hdl.handle.net/20.500.11794/26023.
Successful management of harvested species critically depends on an ability to predict the consequences of corrective actions. Ideally, managers would have comprehensive, quantitative and continuous knowledge of a managed system upon which to base decisions. In reality, wildlife managers rarely have comprehensive system knowledge. Despite imperfect knowledge and data deficiencies, a desire exists to manipulate populations and achieve objectives. To this end, manipulation of harvest regimes and of the habitat upon which species rely have become staples of wildlife management. Contemporary statistical tools have the potential to enhance both the estimation of population size and vital rates while making possible more proactive management. In chapter 1 we evaluate the efficacy of integrated population models (IPM) to fill knowledge voids under conditions of limited data and model misspecification. We show that IPMs maintain high accuracy and low bias over a wide range of realistic conditions. In recognition of the fact that many monitoring programs have focal data collection areas, we then fit a novel form of the IPM that employs random effects to effectively share information through space and time. We find that random effects dramatically improve the performance of optimization algorithms, produce reasonable estimates and make it possible to estimate parameters for populations with very limited data. We applied these random-effect models to 51 elk management units in Idaho, USA, to demonstrate the abilities of the models and the information gains. Many of the estimates are the first of their kind. Short-term forecasting is the focus of population models, but managers assess viability on longer time horizons through habitat. Modern approaches to understanding large ungulate habitat requirements largely depend on resource selection. An implicit assumption of the resource selection approach is that disproportionate use of the landscape directly reflects an individual's desire to meet life-history goals. However, we show that simple metrics of the habitat encountered better describe variations in elk survival. Comparing population-level variation through time to individual variation, we found that individual variation in habitat used was the most supported model relating habitat to a fitness component. Further, resource selection coefficients did not correlate with survival.
Miller, Jordan Mitchell. "Estimation of individual tree metrics using structure-from-motion photogrammetry." Thesis, University of Canterbury. Geography, 2015. http://hdl.handle.net/10092/11035.
Ellis, Kyle Kent Edward Schnell Thomas. "Eye tracking metrics for workload estimation in flight deck operations." Iowa City : University of Iowa, 2009. http://ir.uiowa.edu/etd/288.
Ellis, Kyle Kent Edward. "Eye tracking metrics for workload estimation in flight deck operations." Thesis, University of Iowa, 2009. https://ir.uiowa.edu/etd/288.
Shahidi, Parham. "Fuzzy Analysis of Speech Metrics to Estimate Crew Alertness." Diss., Virginia Tech, 2011. http://hdl.handle.net/10919/50436.
Ph. D.
Fonseca, Filho José Raimundo dos Santos. "ESTIMAÇÃO DE MÉTRICAS DE DESENVOLVIMENTO AUXILIADA POR REDES NEURAIS ARTIFICIAIS." Universidade Federal do Maranhão, 2003. http://tedebc.ufma.br:8080/jspui/handle/tede/324.
Several modeling approaches to the software engineering development process, capable of supporting decision making in project management, have been investigated. Software metrics, process models and estimation techniques have been considered independently, taking into account either the intrinsic characteristics of software or its constructive process. This research proposes a complete, simple and efficient model for representing the whole development process which, based on a set of features of the process and basic attributes of the software, yields good estimates of development metrics (time and effort) while still at the beginning of the process. The model relates constructive characteristics of the process to each type of organization in order to identify classes of homogeneous behavior, based on a Kohonen neural network. From this classification, according to the basic attributes of each software product being developed, metrics may be estimated with the support of feedforward neural networks. A prototype is specified in the Unified Modeling Language (UML) and implemented to estimate development metrics. Comparisons of the obtained results with those available in the literature are presented.
Marshall, Ian Mitchell. "Evaluating courseware development effort estimation measures and models." Thesis, University of Abertay Dundee, 1996. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.318946.
Vieira, Andrws Aires. "Uma abordagem para estimação prévia dos requisitos não funcionais em sistemas embarcados utilizando métricas de software." Biblioteca Digital de Teses e Dissertações da UFRGS, 2015. http://hdl.handle.net/10183/117766.
The increasing complexity of embedded systems demands the use of new approaches to accelerate their development, such as model-driven engineering. Such approaches aim at increasing the level of abstraction, using concepts such as object orientation and UML for modeling the embedded software. However, as the abstraction level increases, the embedded software developer loses controllability and predictability over important issues such as performance, power dissipation and memory usage on a specific embedded platform. Thus, new design estimation techniques that can be used in the early development stages become necessary. Such a strategy may help the designer make better decisions in the early stages of the project, thus ensuring that the final system meets both functional and non-functional requirements. In this work, we propose a technique for estimating non-functional requirements of embedded systems, based on data (metrics) extracted from early stages of the project. The proposed methodology makes it possible to better explore different design options in the early steps of the software development process and can therefore provide fast yet accurate feedback to the developer. Experimental results show the applicability of the approach, particularly for software evolution and maintenance, where a history of metrics from similar applications can be used as training data. In this scenario, the accuracy of the estimation is at least 98%. In a heterogeneous scenario, where the estimation is performed for a system that is different from the one used during training, the accuracy drops to 80%.
Dinh, Ngoc Thach. "Observateur par intervalles et observateur positif." Thesis, Paris 11, 2014. http://www.theses.fr/2014PA112335/document.
This thesis presents new results in the field of state estimation based on the theory of positive systems. It is composed of two separate parts. The first one studies the problem of positive observer design for positive systems. The second one, which deals with robust state estimation through the design of interval observers, is at the core of our work. We begin by proposing the design of a nonlinear positive observer for discrete-time positive time-varying linear systems, based on the use of generalized polar coordinates in the positive orthant. For positive systems, a natural requirement is that the observers should provide state estimates that are also non-negative, so they can be given a physical meaning at all times. The idea underlying the method is that, first, the direction of the true state is correctly estimated in the projective space thanks to the Hilbert metric, and then very mild assumptions on the output map allow the norm of the state to be reconstructed. The convergence rate can be controlled. The thesis then continues by studying so-called interval observers for different families of dynamic systems in continuous time, in discrete time and in a 'continuous-discrete' context (i.e. a class of continuous-time systems with discrete-time measurements). Interval observers are dynamic extensions giving estimates of the solution of a system in the presence of various types of disturbances, through two outputs giving an upper and a lower bound for the solution. Thanks to interval observers, one can construct control laws which stabilize the considered systems.
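For context, here is a minimal sketch of the interval-observer idea for a linear system with a bounded disturbance, a standard construction in this literature; the notation is generic, not taken from the thesis:

```latex
% Interval observer sketch for \dot{x} = A x + d(t), with known
% componentwise bounds \underline{d}(t) \le d(t) \le \overline{d}(t).
% If A is Metzler (all off-diagonal entries nonnegative), the pair
\[
  \dot{\overline{x}} = A\,\overline{x} + \overline{d}(t), \qquad
  \dot{\underline{x}} = A\,\underline{x} + \underline{d}(t),
\]
% initialized so that \underline{x}(0) \le x(0) \le \overline{x}(0), satisfies
\[
  \underline{x}(t) \;\le\; x(t) \;\le\; \overline{x}(t)
  \quad \text{for all } t \ge 0,
\]
% because the errors \overline{x} - x and x - \underline{x} obey
% cooperative (positive) dynamics driven by nonnegative inputs.
```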
González, Rojas Victor Manuel. "Análisis conjunto de múltiples tablas de datos mixtos mediante PLS." Doctoral thesis, Universitat Politècnica de Catalunya, 2014. http://hdl.handle.net/10803/284659.
The core of this thesis is the development of the GNM-NIPALS, GNM-PLS2 and GNM-RGCCA methods for quantifying qualitative variables from the first k components provided by the appropriate methods in the analysis of J mixed data matrices. These methods, called GNM-PLS (General Non-Metric Partial Least Squares), are an extension of the NM-PLS methods, which take only the first principal component in the quantification function. The transformation of the qualitative variables is carried out through optimization processes, generally maximizing covariance or correlation functions, taking advantage of the flexibility of the PLS algorithms and preserving the properties of group membership and order where they exist; likewise, the metric variables are kept in their original state except for standardization. GNM-NIPALS was created for the treatment of a single (J=1) mixed data matrix by quantifying the qualitative variables via PCA-type reconstitution from an aggregate function of k components. GNM-PLS2 relates two (J=2) mixed data sets Y~X through PLS regression, quantifying the qualitative variables of one space with the aggregate function of the first H PLS components of the other space, obtained by cross-validation under PLS2 regression. When the endogenous matrix Y contains only one response variable, the method is called GNM-PLS1. Finally, for the analysis of more than two blocks (J>2) of mixed data Y~X1+...+XJ through their latent variables (LV), the NM-RGCCA method is implemented, based on RGCCA (Regularized Generalized Canonical Correlation Analysis), which modifies the PLS-PM algorithm by implementing the new mode A and specifies the covariance or correlation maximization functions associated with the process. The quantification of the qualitative variables in each block Xj is performed through the inner function Zj of dimension J, due to the aggregation of the outer estimates Yj. Both Zj and Yj estimate the component ξj associated with the j-th block.
Hill, Terry. "Metrics and Test Procedures for Data Quality Estimation in the Aeronautical Telemetry Channel." International Foundation for Telemetering, 2015. http://hdl.handle.net/10150/596445.
There is great potential in using Best Source Selectors (BSS) to improve link availability in aeronautical telemetry applications. While the general notion that diverse data sources can be used to construct a consolidated stream of "better" data is well founded, there is no standardized means of determining the quality of the data streams being merged together. Absent this uniform quality data, the BSS has no analytically sound way of knowing which streams are better, or best. This problem is further exacerbated when one imagines that multiple vendors are developing data quality estimation schemes, with no standard definition of how to measure data quality. In this paper, we present measured performance for a specific Data Quality Metric (DQM) implementation, demonstrating that the signals present in the demodulator can be used to quickly and accurately measure the data quality, and we propose test methods for calibrating DQM over a wide variety of channel impairments. We also propose an efficient means of encapsulating this DQM information with the data, to simplify processing by the BSS. This work leads toward a potential standardization that would allow data quality estimators and best source selectors from multiple vendors to interoperate.
Chan, Joanne S. M. Massachusetts Institute of Technology. "Rail transit OD matrix estimation and journey time reliability metrics using automated fare data." Thesis, Massachusetts Institute of Technology, 2007. http://hdl.handle.net/1721.1/38955.
The availability of automatic fare collection (AFC) data greatly enhances a transit planner's ability to understand and characterize passenger travel demands, which have traditionally been estimated by manual surveys handed out to passengers at stations or on board vehicles. The AFC data also presents an unprecedentedly consistent source of information on passenger travel times in those transit networks which have both entry and exit fare gates. By taking the difference between entry and exit times, AFC transactions can be used to capture the bulk of a passenger's time spent in the system, including walking between gates and platforms, platform wait, in-train time, as well as interchange time for multi-vehicle trips. This research aims at demonstrating the potential value of AFC data in rail transit operations and planning. The applications developed in this thesis provide rail transit operators an easy-to-update management tool that evaluates several dimensions of rail service and demand in near real time. While the concepts of the applications can be adapted to other transit systems, the detailed configurations and unique characteristics of each transit system require the methodologies to be tailored to its needs.
The focus of this research is the London Underground network, which adopted the automatic fare collection system, known as the "Oyster Card", in 2003. The Oyster card is now used as the main form of public transport fare payment in all public transport modes within the Greater London area. The two applications developed for the London Underground using Oyster data are (1) estimation of an origin-destination flow matrix that reflects current demand and (2) rail service reliability metrics that capture both excess journey time and variation in journey times at the origin-destination, line segment or line levels. The Oyster dataset captures travel on more than three times the number of OD pairs in one 4-week AM peak period compared to those OD pairs evident in the RODS database: 57,407 vs. 17,421. The resulting Oyster-based OD matrix shows very similar travel patterns to the RODS matrix at the network and zonal levels. Station-level differences are significant at a number of central stations with respect to interchanges. At the OD level, the differences are the greatest, and a significant number of OD pairs in the RODS matrix seem to be erroneous or outdated. The proposed Excess Journey Time Metric and Journey Time Reliability Metric utilize large continuous streams of Oyster journey time data to support analyses during short time periods.
The comparison of the Excess Journey Time Metric and the official Underground Journey Time Metric shows significant differences in line-level results in terms of both the number of excess minutes and relative performance across lines. The differences are mainly due to differences in scheduled journey times and OD demand weightings used in the two methodologies. Considerable differences in excess journey time and reliability results exist between directions on the same line due to the large imbalance of directional demand in the AM peak.
by Joanne Chan.
S.M.
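To illustrate the reliability metrics described in this abstract, a sketch under assumed column names (not the thesis's actual implementation) shows how excess journey time per OD pair can be computed directly from tap-in/tap-out records:

```python
import pandas as pd

# Hypothetical AFC records: one row per completed rail journey.
trips = pd.DataFrame({
    "origin": ["A", "A", "A", "B"],
    "destination": ["B", "B", "B", "C"],
    "journey_min": [21.0, 24.5, 35.0, 12.0],   # tap-out minus tap-in
})
scheduled = {("A", "B"): 20.0, ("B", "C"): 11.0}  # assumed scheduled times

g = trips.groupby(["origin", "destination"])["journey_min"]
summary = g.agg(mean="mean", p95=lambda s: s.quantile(0.95))
summary["excess_min"] = [
    m - scheduled[od] for od, m in summary["mean"].items()
]  # excess journey time: observed mean minus scheduled
summary["reliability_buffer"] = summary["p95"] - summary["mean"]
print(summary)
```

The buffer between the 95th-percentile and mean journey times is one common way to express journey time variability; the thesis's exact weighting scheme may differ.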
Chen, Lein-Lein. "Study of the effectiveness of cost-estimation models and complexity metrics on small projects." FIU Digital Commons, 1986. http://digitalcommons.fiu.edu/etd/2134.
Honauer, Katrin [Verfasser], and Bernd [Akademischer Betreuer] Jähne. "Performance Metrics and Test Data Generation for Depth Estimation Algorithms / Katrin Honauer ; Betreuer: Bernd Jähne." Heidelberg : Universitätsbibliothek Heidelberg, 2019. http://d-nb.info/1177045168/34.
Andersen, Hans-Erik. "Estimation of critical forest structure metrics through the spatial analysis of airborne laser scanner data /." Thesis, Connect to this title online; UW restricted, 2003. http://hdl.handle.net/1773/5579.
Full textMacedo, Marcus Vinicius La Rocca. "Uma proposta de aplicação da metrica de pontos de função em aplicações de dispositivos portateis." [s.n.], 2003. http://repositorio.unicamp.br/jspui/handle/REPOSIP/276368.
Full textDissertação (mestrado) - Universidade Estadual de Campinas, Instituto de Computação
Abstract: This dissertation presents a proposal for function point counting for portable device (wireless phone) applications, based on standard graphical user interface (GUI) application counting techniques. Counting function points from the functional and layout specifications provided at the beginning of the software development project makes more precise cost (effort) and duration estimates possible, significantly improving project planning. To this end, some effort estimation techniques are presented. Software metrics are also presented and compared, with emphasis on the function point metric. Finally, the basic concepts of graphical wireless phone applications are presented, a parallel is drawn for function point counting, and this parallel is applied in a practical counting example.
Master's
Computer Engineering
Master in Computing
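Since several entries in this list revolve around function point analysis, a compact sketch of an unadjusted function point count may help. The component counts below are invented for illustration; the weights are the standard IFPUG values for average complexity:

```python
# Unadjusted function point (UFP) count with standard IFPUG
# average-complexity weights; the component counts are made up.
AVERAGE_WEIGHTS = {
    "external_inputs": 4,
    "external_outputs": 5,
    "external_inquiries": 4,
    "internal_logical_files": 10,
    "external_interface_files": 7,
}

counts = {
    "external_inputs": 12,
    "external_outputs": 8,
    "external_inquiries": 5,
    "internal_logical_files": 4,
    "external_interface_files": 2,
}

ufp = sum(AVERAGE_WEIGHTS[k] * counts[k] for k in counts)
print(f"Unadjusted function points: {ufp}")  # 48 + 40 + 20 + 40 + 14 = 162
```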
Nyman, Moa. "Estimating the energy consumption of a mobile music streaming application using proxy metrics." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-288532.
Among mobile music streaming services, energy consumption is a competitive factor. If a service consumes too much energy, users may uninstall the application. It is therefore important for app developers to minimize the energy their app uses. To reduce energy consumption, it must first be measured. In this study, proxy metrics, such as CPU usage and the number of bytes written to memory, are examined for their suitability as predictors of the energy consumption of a music streaming application on mobile devices. A literature review was conducted to find which metrics have the greatest impact on energy consumption, and to identify potential relationships between the metrics and energy consumption. A OnePlus 6T running Android 9 was rooted and its battery instrumented to collect data. The data was collected from three test cases within the mobile music streaming application. Memory and network statistics were collected using the strace tool, CPU statistics were collected using data from the proc filesystem, while energy consumption was measured using a power meter. Based on the results of the literature review, models were constructed to model the energy consumption. The results show that many of the metrics are highly collinear. From each pair of collinear variables, only one was retained. The final model showed a 32.5% better predictive ability than a non-optimized model. The best-performing model used network bytes sent, memory bytes read and written, and user CPU as predictor variables.
Gonçalves, André Miguel Augusto. "Estimating data divergence in cloud computing storage systems." Master's thesis, Faculdade de Ciências e Tecnologia, 2013. http://hdl.handle.net/10362/10852.
Many internet services are provided through cloud computing infrastructures that are composed of multiple data centers. To provide high availability and low latency, data is replicated on machines in different data centers, which introduces the complexity of guaranteeing that clients view data consistently. Data stores often opt for a relaxed approach to replication, guaranteeing only eventual consistency, since it improves the latency of operations. However, this may lead to replicas having different values for the same data. One solution to control the divergence of data in eventually consistent systems is the usage of metrics that measure how stale data is for a replica. In the past, several algorithms have been proposed to estimate the value of these metrics in a deterministic way. An alternative solution is to rely on probabilistic metrics that estimate divergence with a certain degree of certainty. This relaxes the need to contact all replicas while still providing a relatively accurate measurement. In this work we designed and implemented a solution to estimate the divergence of data in eventually consistent data stores that scales to many replicas by allowing client-side caching. Measuring the divergence when there is a large number of clients calls for the development of new algorithms that provide probabilistic guarantees. Additionally, unlike previous works, we intend to focus on measuring the divergence relative to a state that can lead to the violation of application invariants.
Partially funded by project PTDC/EIA EIA/108963/2008 and by an ERC Starting Grant, Agreement Number 307732
Koch, Stefan. "Effort Modeling and Programmer Participation in Open Source Software Projects." Department für Informationsverarbeitung und Prozessmanagement, WU Vienna University of Economics and Business, 2005. http://epub.wu.ac.at/1494/1/document.pdf.
Series: Working Papers on Information Systems, Information Business and Operations
Haufe, Maria Isabel. "Estimativa da produtividade no desenvolvimento de software." Biblioteca Digital de Teses e Dissertações da UFRGS, 2001. http://hdl.handle.net/10183/1651.
Drach, Marcos David. "Aplicabilidade de metricas por pontos de função em sistemas baseados em Web." [s.n.], 2005. http://repositorio.unicamp.br/jspui/handle/REPOSIP/276360.
Full textDissertação (mestrado profissional) - Universidade Estadual de Campinas, Instituto de Computação
Abstract: Software metrics are quantitative standards of measurement for many aspects of a software project or product, constituting a powerful management tool that contributes to more accurate delivery time and cost estimates and to the establishment of feasible goals, facilitating both the decision-making process itself and the subsequent collection of productivity and quality measurements. The Function Point Analysis (FPA) metric, created at the end of the 1970s to measure software size in terms of its functional specification, was considered an advance over the Source Lines Of Code (SLOC) counting method, the only size metric available at that time. Although many authors have since published various extensions and alternatives to the original method in order to adapt it to specific systems, its applicability to Web-based systems still requires a deeper and more critical examination. This work presents an analysis of the specific computational characteristics of the Web platform that allows developers and project managers to evaluate the adequacy of the FPA method for this environment and its contribution to requirements extraction and effort estimation.
Master's
Computer Engineering
Professional Master in Computing
Alomari, Hakam W. "Supporting Software Engineering Via Lightweight Forward Static Slicing." Kent State University / OhioLINK, 2012. http://rave.ohiolink.edu/etdc/view?acc_num=kent1341996135.
Rai, Ajit. "Estimation de la disponibilité par simulation, pour des systèmes incluant des contraintes logistiques." Thesis, Rennes 1, 2018. http://www.theses.fr/2018REN1S105/document.
RAM (Reliability, Availability and Maintainability) analysis forms an integral part of the estimation of Life Cycle Costs (LCC) of passenger rail systems. These systems are highly reliable and include complex logistics. Standard Monte Carlo simulations are rendered ineffective for efficient estimation of RAM metrics due to the issue of rare events. System failures of these complex passenger rail systems can include rare events and thus need efficient simulation techniques. Importance Sampling (IS) techniques are an advanced class of variance reduction techniques that can overcome the limitations of standard simulations. IS techniques can provide acceleration of simulations, meaning less variance in the estimation of RAM metrics within the same computational budget as a standard simulation. However, IS involves changing the probability laws (change of measure) that drive the mathematical models of the systems during simulations, and the optimal IS change of measure is usually unknown, even though theoretically a perfect one exists (the zero-variance IS change of measure). In this thesis, we focus on the use of IS techniques and their application to estimate two RAM metrics: reliability (for static networks) and steady-state availability (for dynamic systems). The thesis focuses on finding and/or approximating the optimal IS change of measure to efficiently estimate RAM metrics in a rare-events context. The contribution of the thesis is broadly divided into two main axes: first, we propose an adaptation of the approximate zero-variance IS method to estimate the reliability of static networks and show its application to real passenger rail systems; second, we propose a multi-level Cross-Entropy optimization scheme that can be used during pre-simulation to obtain CE-optimized IS rates for the transitions of Markovian Stochastic Petri Nets (SPNs) and use them in the main simulations to estimate the steady-state unavailability of highly reliable Markovian systems with complex logistics involved. Results from the methods show huge variance reduction and gain compared to MC simulations.
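To make the importance sampling idea concrete, here is a self-contained toy sketch (not the thesis's models): estimating a rare tail probability P(X > t) for X ~ Exp(1) by sampling from a tilted distribution Exp(λ) and reweighting with the likelihood ratio.

```python
import numpy as np

rng = np.random.default_rng(42)
t, n = 20.0, 100_000          # rare-event threshold and sample budget
lam = 1.0 / t                 # tilted rate, chosen so samples land near t

# Draw from the IS density q(x) = lam * exp(-lam * x).
x = rng.exponential(scale=1.0 / lam, size=n)
hits = x > t
# Likelihood ratio p(x)/q(x) for the true density p(x) = exp(-x):
weights = np.exp(-x) / (lam * np.exp(-lam * x))
estimate = np.mean(hits * weights)

print(f"IS estimate: {estimate:.3e}")
print(f"Exact value: {np.exp(-t):.3e}")  # P(X > t) = e^-t for Exp(1)
# A plain Monte Carlo run of the same size would almost never observe x > 20.
```

The cross-entropy scheme the thesis proposes automates the choice of the tilted parameters; here λ was simply set by hand.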
LARIZZATTI, FLAVIO E. "Determinacao de metais pesados e outros elementos de interesse por ativacao neutronica, em amostras de sedimentos da Laguna Mar Chiquita (Cordoba, Argentina)." Repositório Institucional do IPEN, 2001. http://repositorio.ipen.br:8080/xmlui/handle/123456789/10972.
Full textMade available in DSpace on 2014-10-09T14:01:38Z (GMT). No. of bitstreams: 1 07917.pdf: 6865830 bytes, checksum: 4b3f91ec8ea8157f98b981ad8985ffbb (MD5)
Master's dissertation
IPEN/D
Instituto de Pesquisas Energeticas e Nucleares - IPEN/CNEN-SP
França, Ana Beatriz Coelho. "Assinatura magnética e espectral na estimativa de elementos potencialmente tóxicos em solos tropicais /." Jaboticabal, 2019. http://hdl.handle.net/11449/183569.
Co-advisor: Livia Arantes Camargo
Examining committee: Luis Reynaldo Ferracciú Alleoni, Alan Rodrigo Panosso
Abstract: Contamination of soil caused by anthropic action is a worldwide concern for food safety. Thus, it is necessary to map potentially toxic elements (PTEs) such as Ba, Co, Cr, Cu, Ni, Pb and Cd in a quick and non-polluting way in order to mitigate the damaging action of these elements in the environment. In addition, the lack of information on these elements prevents risk assessments in contaminated areas, and their spatialization in large agricultural areas requires many samples. In this sense, magnetic susceptibility (MS) and diffuse reflectance spectroscopy (DRS) may be promising indirect techniques for the prediction of PTEs, because they are related to soil pedoindicator attributes such as mineralogy. Therefore, the objectives of this work were: (a) to evaluate the levels of PTEs in soils under sugar cane cultivation and to understand the anthropic influence on these elements in the soils, and (b) to estimate the content of PTEs (Ba, Co, Cr, Cu, Ni, Pb and Cd) in soils through MS and DRS. Soil samples were collected in a transition area of soils originating from basalt, Botucatu sandstone and an eluvial-colluvial deposit. Particle size, chemical, mineralogical and spectral analyses and MS measurements were performed. Data were analyzed by descriptive statistics, Pearson's correlation, multiple linear regression (MLR) and geostatistics, and pedotransfer functions were used in the estimation of PTE contents using MS and DRS. The prediction models of the PTEs were calibrated using MS and clay f... (complete abstract available via electronic access)
Master's
Zhang, Tianfang. "Direct optimization of dose-volume histogram metrics in intensity modulated radiation therapy treatment planning." Thesis, KTH, Skolan för teknikvetenskap (SCI), 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-231548.
In intensity-modulated radiation therapy treatment planning, dose-volume histogram (DVH) functions are often used as objective functions to minimize the distance to dose-volume criteria. However, neither DVH functions nor dose-volume criteria are ideal for gradient-based optimization: the former are not continuously differentiable, the latter are discontinuous functions of dose, and both are also nonconvex. In particular, DVH functions often perform poorly in constraints, since they are identically zero in feasible regions and have vanishing gradients on the boundary of feasibility. In this work, a general mathematical framework is presented that enables direct optimization of all DVH-based measures. By regarding voxel doses as sample outcomes of an auxiliary random variable and using nonparametric density estimation to obtain explicit formulas, the measures volume-at-dose and dose-at-volume can be formulated as infinitely differentiable functions of dose. This is extended to DVH functions and so-called volume-based DVH functions, as well as to min-dose and max-dose functions and mean-tail-dose functions. Explicit expressions for evaluating function values and the associated gradients are presented. The proposed framework has the advantages of depending on only one smoothness parameter, of negligible approximation errors relative to conventional counterparts in practical settings, and of a general consistency between derived functions. Numerical tests performed for illustrative purposes show that smooth dose-at-volume performs better than quadratic penalties in constraints and that smooth DVH functions in certain cases have a clear advantage over conventional ones. The results of this work have been successfully applied to lexicographic optimization in fluence-map optimization.
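The smoothing idea in this abstract can be sketched in a few lines: replace the indicator behind volume-at-dose with a smooth kernel so the metric becomes differentiable. This is an illustration of the general technique, with an assumed Gaussian-CDF kernel rather than the thesis's exact choice:

```python
import numpy as np
from scipy.stats import norm

def smooth_volume_at_dose(dose: np.ndarray, level: float, h: float) -> float:
    """Differentiable volume-at-dose: fraction of voxels with dose >= level.

    The hard indicator 1{d >= level} is replaced by the Gaussian CDF
    norm.cdf((d - level) / h); the parameter h controls the smoothness.
    """
    return float(np.mean(norm.cdf((dose - level) / h)))

doses = np.array([18.0, 19.5, 20.2, 21.0, 22.3])  # voxel doses in Gy (made up)
for h in (1.0, 0.1, 0.01):
    print(h, smooth_volume_at_dose(doses, level=20.0, h=h))
# As h -> 0 the value approaches the exact volume-at-dose, 3/5.
```

Because the smoothed metric is differentiable in every voxel dose, its gradient can be fed directly to a gradient-based treatment plan optimizer, which is the point of the framework described above.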
Schwieder, Marcel. "Landsat derived land surface phenology metrics for the characterization of natural vegetation in the Brazilian savanna." Doctoral thesis, Humboldt-Universität zu Berlin, 2018. http://dx.doi.org/10.18452/19368.
The Brazilian savanna, known as the Cerrado, covers around 24% of Brazil. It is characterized by a unique biodiversity and a strong gradient in vegetation structure. Land-use changes have led to almost half of the Cerrado being converted into cultivated land. The mapping of ecological processes is, therefore, an important prerequisite for supporting nature conservation policies based on spatially explicit information and for deepening our understanding of ecosystem dynamics. New sensors, freely available data, and advances in data processing allow the analysis of large data sets and thus, for the first time, make it possible to capture seasonal vegetation dynamics over large extents in high spatial detail. This thesis aimed to analyze the benefits of Landsat-based land surface phenology (LSP) metrics for the characterization of Cerrado vegetation, regarding its structural and phenological diversity, and to assess their relation to above-ground carbon (AGC). The results revealed that LSP metrics make it possible to capture the seasonal dynamics of photosynthetically active vegetation and are beneficial for the mapping of vegetation physiognomies. However, the results also revealed limitations of hard classification approaches for mapping vegetation gradients in complex ecosystems. Based on similarities in LSP metrics, which were derived for the first time for the whole extent of the Cerrado, LSP archetypes were proposed, which revealed the spatial patterns of LSP diversity at a 30 m spatial resolution and offer potential to enhance current mapping concepts. Further, LSP metrics facilitated the spatially explicit quantification of AGC in three study areas in the central Cerrado and should thus be considered a valuable variable for future carbon estimation. Overall, the insights highlight that Landsat-based LSP metrics are beneficial for ecosystem monitoring approaches, which are crucial to design sustainable land management strategies that maintain key ecosystem functions and services.
Görgens, Eric Bastos. "LiDAR technology applied to vegetation quantification and qualification." Universidade de São Paulo, 2014. http://www.teses.usp.br/teses/disponiveis/11/11150/tde-10042015-112503/.
The methodology for quantifying vegetation from LiDAR (Light Detection And Ranging) data is fairly well consolidated, but points remain on the scientific community's list that still need to be clarified. Four such aspects were studied in this thesis. In the first study, the influence of the reference heights (minimum height and break height) on the quality of the extracted set of metrics was investigated, aiming at estimating the volume of a eucalyptus plantation. The results indicated that higher reference-height values returned a better set of metrics. The effect of the reference heights was more evident in young stands and for the density metrics. In the second study, the stability of LiDAR metrics derived for the same area flown with different equipment and flight configurations was evaluated. This study showed how the selection of stable metrics can contribute to the generation of models compatible with different LiDAR data sets. According to the results, the height metrics were more stable than the density metrics, especially the percentiles above 50% and the mode. The third study evaluated the use of machine learning to estimate stand-level volume of eucalyptus plantations from LiDAR metrics. Instead of being limited to a small subset of metrics in an attempt to explain as much of the total variability of the data as possible, artificial intelligence techniques made it possible to explore the whole data set and detect patterns that estimate stand-level volume from the set of metrics. The fourth and final study focused on seven areas of different Brazilian forest types, studying their vertical canopy profiles. The study showed that it is possible to differentiate these forest types based on the vertical profile derived from LiDAR surveys. It was also observed that plot size has different levels of spatial dependence. Each forest type has specific characteristics that need to be taken into account in monitoring, inventory and mapping projects based on LiDAR surveys. The study showed that it is possible to determine the vertical canopy profile from coverage of 10% of the area, and for some forest types from only 2% of the area.
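For readers new to the vocabulary, the "height metrics" discussed here are simple summary statistics of return heights above a break height. A minimal sketch follows, with illustrative thresholds rather than the thesis's parameters:

```python
import numpy as np

def lidar_height_metrics(z: np.ndarray, break_height: float = 2.0) -> dict:
    """Summary metrics from LiDAR return heights (meters above ground).

    Returns above break_height are treated as canopy returns; the rest
    contribute only to the density (cover) metric.
    """
    canopy = z[z > break_height]
    return {
        "p50": float(np.percentile(canopy, 50)),  # median canopy height
        "p95": float(np.percentile(canopy, 95)),  # upper-canopy height
        "mean": float(canopy.mean()),
        "cover": float(canopy.size / z.size),     # density metric
    }

z = np.random.default_rng(1).uniform(0.0, 30.0, size=10_000)
print(lidar_height_metrics(z, break_height=2.0))
```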
Ferreira, Marcos Manoel. "Estimativa dos fluxos de Zn, Cd, Pb e Cu no Saco do Engenho, Baía de Sepetiba, RJ." Niterói, 2017. https://app.uff.br/riuff/handle/1/3048.
Full textMade available in DSpace on 2017-03-14T15:32:17Z (GMT). No. of bitstreams: 1 Dissertação - Marcos Ferreira.pdf: 4318874 bytes, checksum: 78e7b7840a168358d28a5f9dfa90c96c (MD5)
Coordenação de Aperfeiçoamento de Pessoal Nível Superior
Universidade Federal Fluminense. Instituto de Química. Programa de Pós-Graduação em Geociências- Geoquímica, Niterói, RJ
This study quantified the surface-water input of Zn, Cd, Pb and Cu to Sepetiba Bay from the Engenho Inlet, in order to estimate the annual pollution load, which originates largely from a major environmental liability at the site: the tailings of the old, bankrupt Cia. Ingá Mercantil, a former Zn and Cd processing company. The results show how worrying the load of metal contaminants still exported to Sepetiba Bay through the Engenho Inlet is, due to the combined leaching and erosion of industrial wastes rich in heavy metals. Zn and Cd were characterized as the main metals leached from the tailings by the action of the bay waters and local rains, and thus as the metals posing the greatest risks to the local aquatic environment, especially considering their high concentrations in the tailings. In the analyses performed, the total Zn concentration exceeded the maximum allowed for Class 2 waters under Brazilian environmental legislation in 71% of the samples. Comparing these results with other regions, the concentrations of Zn, Cd, Pb and Cu found in the waters of the Engenho Inlet are comparable to concentrations found in other areas highly impacted by industrial and port activities. The estimated fluxes of contaminants reaching Sepetiba Bay from the Engenho Inlet were 33 t.year-1 of Zn, 0.3 t.year-1 of Cd and 0.06 t.year-1 of Pb. Cu showed a flux of 0.51 t.year-1, but in the reverse direction. Zn and Cd showed annual flux values higher than, or at least of the same order of magnitude as, those found in the São Francisco Canal and the Guandu River, rivers highly contaminated by heavy metals, especially in their final stretches, due to the large concentration of industries with high pollution potential, and whose annual discharge is about 100 times greater than the highest discharge measured in the Engenho Inlet.
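Flux figures like those quoted above follow from the standard load calculation; as a worked illustration (the generic formula with round made-up numbers, not the study's measurements):

```latex
% Annual mass flux from concentration C and water discharge Q:
\[
  F \;=\; C \cdot Q \cdot T, \qquad
  [F] = \mathrm{t\,yr^{-1}}, \quad [C] = \mathrm{mg\,L^{-1}}, \quad
  [Q] = \mathrm{m^3\,s^{-1}},
\]
% with T = 3.1536 \times 10^{7} s/yr and 1 mg L^{-1} \cdot m^3 s^{-1} = 1 g s^{-1}:
\[
  C = 1~\mathrm{mg\,L^{-1}},\; Q = 1~\mathrm{m^3\,s^{-1}}
  \;\Rightarrow\;
  F = 3.1536 \times 10^{7}~\mathrm{g\,yr^{-1}} \approx 31.5~\mathrm{t\,yr^{-1}}.
\]
```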
Leandre, Fernet Renand. "Estimating Effects of Poverty on the Survival of HIV Patients on ART and Food Supplementation in Rural Haiti: A Comparative Evaluation of Socio-Economic Indicators." Thesis, Harvard University, 2014. http://nrs.harvard.edu/urn-3:HUL.InstRepos:13041360.
Zemánek, Ondřej. "Počítání vozidel v statickém obraze." Master's thesis, Vysoké učení technické v Brně. Fakulta informačních technologií, 2020. http://www.nusl.cz/ntk/nusl-417211.
Full textBellorio, Marcos Bruno. "Revisão sobre critérios de fadiga para cabos condutores de energia e uso de metodologia para estimativa de sua vida remanescente." reponame:Repositório Institucional da UnB, 2009. http://repositorio.unb.br/handle/10482/5951.
Full textSubmitted by Jaqueline Ferreira de Souza (jaquefs.braz@gmail.com) on 2010-11-11T15:24:36Z No. of bitstreams: 1 2009_MarcosBrunoBellorio.pdf: 1800293 bytes, checksum: a7c6b5c97e0b0799f6fb4bcf15ae8fc8 (MD5)
Approved for entry into archive by Daniel Ribeiro(daniel@bce.unb.br) on 2010-11-19T23:13:51Z (GMT) No. of bitstreams: 1 2009_MarcosBrunoBellorio.pdf: 1800293 bytes, checksum: a7c6b5c97e0b0799f6fb4bcf15ae8fc8 (MD5)
Made available in DSpace on 2010-11-19T23:13:51Z (GMT). No. of bitstreams: 1 2009_MarcosBrunoBellorio.pdf: 1800293 bytes, checksum: a7c6b5c97e0b0799f6fb4bcf15ae8fc8 (MD5)
The aim of this work is to conduct a critical review of the different existing methodologies for the design and maintenance of overhead conductors under fretting fatigue. Among these methodologies, the CIGRE method for remaining-life calculation proved to be the most consistent and useful. Once the critical review was concluded, those methodologies were applied to a set of data measured on an Eletronorte conductor in a 230 kV transmission line installed in the North region of Brazil, on the Vila do Conde - Guamá crossing. This analysis allows one to estimate the durability of the conductor, which is a very important tool for the maintenance sector of transmission line companies. Nowadays there is a strong demand from electrical transmission companies to raise the pre-stretch load (Every Day Stress - EDS) of the conductor. Besides reducing costs, this would ease operational difficulties associated, for instance, with the construction of very tall towers for the crossing of large Amazonian rivers. In this context, this work proposes an alternative way to calculate the conductor fatigue resistance curve in the presence of higher stretch loads. To the author's knowledge, this is a proposal without precedent, in the sense that, in the presence of higher stretch loads, it is common to correct only the values of the conductor's dynamic loading by correcting the stiffness factor in the Poffenberger-Swart formula. This is the first attempt to correct not only the loading but also the fatigue resistance curve of the conductor/suspension-clamp assembly. A discussion of the need for this measure is presented in detail.
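For reference, the Poffenberger-Swart formula mentioned in this abstract is commonly written as follows; this is the standard statement from the fatigue-of-conductors literature, quoted from memory rather than from this dissertation:

```latex
% Poffenberger-Swart: idealized bending stress at the last point of
% contact between conductor and suspension clamp, from the bending
% amplitude Y_b measured x = 89 mm from that point.
\[
  \sigma_a \;=\; K \, Y_b, \qquad
  K \;=\; \frac{E_a \, d \, p^{2}}{4\left(e^{-px} - 1 + px\right)}, \qquad
  p \;=\; \sqrt{\frac{T}{EI}},
\]
% where E_a and d are the Young's modulus and diameter of the outer-layer
% wires, T the conductor tension, and EI the conductor bending stiffness.
```

Raising the pre-stretch load (EDS) increases T and hence the stiffness factor K, which is why the abstract argues that the resistance curve, and not only the loading side, must be corrected.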
Chen, Li. "Quasi transformées de Riesz, espaces de Hardy et estimations sous-gaussiennes du noyau de la chaleur." Phd thesis, Université Paris Sud - Paris XI, 2014. http://tel.archives-ouvertes.fr/tel-01001868.